Sample records for multiple analysis approach

  1. Multiple Correlation versus Multiple Regression.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    2003-01-01

    Describes differences between multiple correlation analysis (MCA) and multiple regression analysis (MRA), showing how these approaches involve different research questions and study designs, different inferential approaches, different analysis strategies, and different reported information. (SLD)

  2. Integrative prescreening in analysis of multiple cancer genomic studies

    PubMed Central

    2012-01-01

    Background: In high-throughput cancer genomic studies, results from the analysis of single datasets often suffer from a lack of reproducibility because of small sample sizes. Integrative analysis can effectively pool and analyze multiple datasets and provides a cost-effective way to improve reproducibility. In integrative analysis, simultaneously analyzing all profiled genes may incur high computational cost. A computationally affordable remedy is prescreening, which fits marginal models, can be conducted in parallel, and has low computational cost. Results: An integrative prescreening approach is developed for the analysis of multiple cancer genomic datasets. Simulation shows that the proposed integrative prescreening outperforms alternatives, including prescreening with individual datasets, an intensity approach, and meta-analysis. We also analyze multiple microarray gene profiling studies of liver and pancreatic cancers using the proposed approach. Conclusions: The proposed integrative prescreening provides an effective way to reduce dimensionality in cancer genomic studies. It can be coupled with existing analysis methods to identify cancer markers. PMID:22799431
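    The marginal-model prescreening idea can be sketched as follows: score each gene's marginal association with the outcome in every dataset, combine the scores across datasets, and keep only the top-ranked genes. This is an illustrative simplification, not the paper's exact method; the scoring and combination rules here (absolute correlation, simple averaging) are assumptions.

```python
# Illustrative sketch of integrative prescreening (not the authors' exact
# method): compute a marginal association score per gene in every dataset,
# average the scores across datasets, keep the top-ranked genes.
from statistics import mean

def marginal_score(expression, outcome):
    """Absolute Pearson correlation between one gene's expression and the outcome."""
    mx, my = mean(expression), mean(outcome)
    sxy = sum((x - mx) * (y - my) for x, y in zip(expression, outcome))
    sxx = sum((x - mx) ** 2 for x in expression)
    syy = sum((y - my) ** 2 for y in outcome)
    return abs(sxy / (sxx * syy) ** 0.5) if sxx and syy else 0.0

def integrative_prescreen(datasets, keep):
    """datasets: list of (gene_matrix, outcome) pairs, where gene_matrix[g]
    is the expression vector of gene g.  Returns the indices of the `keep`
    genes with the highest combined (averaged) marginal score."""
    n_genes = len(datasets[0][0])
    combined = [mean(marginal_score(genes[g], y) for genes, y in datasets)
                for g in range(n_genes)]
    return sorted(range(n_genes), key=lambda g: combined[g], reverse=True)[:keep]
```

    Because each gene is scored by its own marginal model, the per-gene scores within a dataset are independent computations, which is the parallelizable, low-cost property the abstract highlights.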

  3. Propensity score analysis with partially observed covariates: How should multiple imputation be used?

    PubMed

    Leyrat, Clémence; Seaman, Shaun R; White, Ian R; Douglas, Ian; Smeeth, Liam; Kim, Joseph; Resche-Rigon, Matthieu; Carpenter, James R; Williamson, Elizabeth J

    2017-01-01

    Inverse probability of treatment weighting is a popular propensity score-based approach to estimate marginal treatment effects in observational studies at risk of confounding bias. A major issue when estimating the propensity score is the presence of partially observed covariates. Multiple imputation is a natural approach to handle missing data on covariates: covariates are imputed and a propensity score analysis is performed in each imputed dataset to estimate the treatment effect. The treatment effect estimates from each imputed dataset are then combined to obtain an overall estimate. We call this method MIte. However, an alternative approach has been proposed, in which the propensity scores are combined across the imputed datasets (MIps). Therefore, there are remaining uncertainties about how to implement multiple imputation for propensity score analysis: (a) should we apply Rubin's rules to the inverse probability of treatment weighting treatment effect estimates or to the propensity score estimates themselves? (b) does the outcome have to be included in the imputation model? (c) how should we estimate the variance of the inverse probability of treatment weighting estimator after multiple imputation? We studied the consistency and balancing properties of the MIte and MIps estimators and performed a simulation study to empirically assess their performance for the analysis of a binary outcome. We also compared the performance of these methods to complete case analysis and the missingness pattern approach, which uses a different propensity score model for each pattern of missingness, and a third multiple imputation approach in which the propensity score parameters are combined rather than the propensity scores themselves (MIpar). 
Under a missing at random mechanism, complete case and missingness pattern analyses were biased in most cases for estimating the marginal treatment effect, whereas multiple imputation approaches were approximately unbiased as long as the outcome was included in the imputation model. Only MIte was unbiased in all the studied scenarios and Rubin's rules provided good variance estimates for MIte. The propensity score estimated in the MIte approach showed good balancing properties. In conclusion, when using multiple imputation in the inverse probability of treatment weighting context, MIte with the outcome included in the imputation model is the preferred approach.
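    The pooling step of MIte can be sketched with Rubin's rules: given the m per-imputation treatment-effect estimates and their variances (estimation of the propensity scores and IPTW effects per imputed dataset is assumed to have been done elsewhere), the overall estimate is the mean of the per-imputation estimates, and the total variance adds the between-imputation variance to the average within-imputation variance.

```python
# Minimal sketch of the pooling step in MIte: Rubin's rules applied to the
# IPTW treatment-effect estimates (and their variances) from each imputed
# dataset.  Per-dataset propensity-score and effect estimation is assumed
# to have been performed already.
from statistics import mean, variance

def rubins_rules(estimates, variances):
    """Pool m per-imputation estimates with Rubin's rules.
    Returns (pooled_estimate, total_variance)."""
    m = len(estimates)
    q_bar = mean(estimates)              # pooled point estimate
    w_bar = mean(variances)              # average within-imputation variance
    b = variance(estimates)              # between-imputation (sample) variance
    total_var = w_bar + (1 + 1 / m) * b  # Rubin's total variance
    return q_bar, total_var
```

    The MIps alternative described above would instead apply this kind of averaging to the propensity scores themselves before a single weighted analysis.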

  4. Simultaneous Two-Way Clustering of Multiple Correspondence Analysis

    ERIC Educational Resources Information Center

    Hwang, Heungsun; Dillon, William R.

    2010-01-01

    A 2-way clustering approach to multiple correspondence analysis is proposed to account for cluster-level heterogeneity of both respondents and variable categories in multivariate categorical data. Specifically, in the proposed method, multiple correspondence analysis is combined with k-means in a unified framework in which "k"-means is…

  5. Simplification of multiple Fourier series - An example of algorithmic approach

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1981-01-01

    This paper describes an example of a multiple Fourier series that originates from a problem of spectral analysis of time series data. The example is worked here with an algorithmic approach that can be generalized to other series manipulations on a computer. The generalized approach is presently being pursued towards applications to a variety of multiple series and towards a general-purpose algorithm for computer algebra implementation.

  6. Singapore Primary Students' Pursuit of Multiple Achievement Goals: A Latent Profile Analysis

    ERIC Educational Resources Information Center

    Ning, Hoi Kwan

    2018-01-01

    Based on measures of approach and avoidance mastery and performance goals delineated in the 2 × 2 achievement goal framework, this study utilized a person-centered approach to examine Singapore primary students' (N = 819) multiple goals pursuit in the general school context. Latent profile analysis identified six types of students with distinct…

  7. Combined statistical analyses for long-term stability data with multiple storage conditions: a simulation study.

    PubMed

    Almalik, Osama; Nijhuis, Michiel B; van den Heuvel, Edwin R

    2014-01-01

    Shelf-life estimation usually requires that at least three registration batches are tested for stability at multiple storage conditions. The shelf-life estimates are often obtained by linear regression analysis per storage condition, an approach implicitly suggested by ICH guideline Q1E. A linear regression analysis combining all data from multiple storage conditions was recently proposed in the literature for the case where variances are homogeneous across storage conditions. The combined analysis is expected to perform better than the separate analysis per storage condition, since pooling data would lead to an improved estimate of the variation and a higher number of degrees of freedom, but this is not evident for shelf-life estimation. Indeed, the two approaches treat the observed initial batch results, the intercepts in the model, and the poolability of batches differently, which may eliminate or reduce the expected advantage of the combined approach over the separate approach. Therefore, a simulation study was performed to compare the distributions of simulated shelf-life estimates between the two approaches on several characteristics and to quantify the difference in shelf-life estimates. In general, the combined statistical analysis estimated the true shelf life more consistently and precisely than the analysis per storage condition, but it did not outperform the separate analysis in all circumstances.
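    The per-storage-condition logic can be sketched as: fit a least-squares line to assay versus time, then find the earliest time at which the one-sided 95% lower confidence bound for the mean response crosses the specification limit. This is a simplified illustration, not the ICH Q1E calculation itself; in particular it uses a normal quantile where the guideline's computation uses a t-quantile.

```python
# Simplified sketch of per-storage-condition shelf-life estimation: fit a
# least-squares line to assay (%) vs time (months), then scan for the
# earliest time at which the one-sided 95% lower confidence bound for the
# mean response drops below the specification limit.  A normal quantile is
# used instead of the t-quantile of ICH Q1E, so this is an approximation.
from statistics import NormalDist, mean

def shelf_life(times, assay, spec_limit, z=NormalDist().inv_cdf(0.95)):
    n = len(times)
    tbar, ybar = mean(times), mean(assay)
    sxx = sum((t - tbar) ** 2 for t in times)
    slope = sum((t - tbar) * (y - ybar) for t, y in zip(times, assay)) / sxx
    intercept = ybar - slope * tbar
    resid = [y - (intercept + slope * t) for t, y in zip(times, assay)]
    s = (sum(r * r for r in resid) / (n - 2)) ** 0.5  # residual std. dev.
    t = 0.0
    while t < 120:  # search up to 10 years in 0.1-month steps
        se = s * (1 / n + (t - tbar) ** 2 / sxx) ** 0.5
        if intercept + slope * t - z * se < spec_limit:
            return round(t, 1)
        t += 0.1
    return None
```

    The combined analysis discussed in the abstract would instead fit one model across storage conditions, pooling the residual variance; the sketch above covers only the separate analysis.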

  8. Cross-Population Joint Analysis of eQTLs: Fine Mapping and Functional Annotation

    PubMed Central

    Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Mapping expression quantitative trait loci (eQTLs) has been shown to be a powerful tool to uncover the genetic underpinnings of many complex traits at the molecular level. In this paper, we present an integrative analysis approach that leverages eQTL data collected from multiple population groups. In particular, our approach effectively identifies multiple independent cis-eQTL signals that are consistent across populations, accounting for population heterogeneity in allele frequencies and linkage disequilibrium patterns. Furthermore, by integrating genomic annotations, our analysis framework enables high-resolution functional analysis of eQTLs. We applied our statistical approach to analyze the GEUVADIS data consisting of samples from five population groups. From this analysis, we concluded that (i) joint analysis across population groups greatly improves the power of eQTL discovery and the resolution of fine mapping of causal eQTLs; (ii) many genes harbor multiple independent eQTLs in their cis regions; and (iii) genetic variants that disrupt transcription factor binding are significantly enriched in eQTLs (p-value = 4.93 × 10^-22). PMID:25906321

  9. The Health Action Process Approach as a Motivational Model of Dietary Self-Management for People with Multiple Sclerosis: A Path Analysis

    ERIC Educational Resources Information Center

    Chiu, Chung-Yi; Lynch, Ruth Torkelson; Chan, Fong; Rose, Lindsey

    2012-01-01

    The main objective of this study was to evaluate the health action process approach (HAPA) as a motivational model for dietary self-management for people with multiple sclerosis (MS). Quantitative descriptive research design using path analysis was used. Participants were 209 individuals with MS recruited from the National MS Society and a…

  10. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data

    PubMed Central

    Tian, Ting; McLachlan, Geoffrey J.; Dieters, Mark J.; Basford, Kaye E.

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations in these data arrays and developed a novel approach based on hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned real values from the nearest neighbour to the record with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher estimation accuracy than those using non-Bayesian analysis but were more time-consuming. However, the novel multiple agglomerative hierarchical clustering approach demonstrated the best overall performance. PMID:26689369
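    The nearest-neighbour idea behind the clustering-based imputation can be sketched as: for a record with a missing entry, pick a random subset of the observed attributes, find the nearest complete record on those attributes, and copy its value. This is a hypothetical simplification of the paper's multiple agglomerative hierarchical clustering procedure, for illustration only.

```python
# Toy sketch of nearest-neighbour imputation on randomly selected
# attributes (a hypothetical simplification of the paper's clustering
# approach).  `None` marks a missing entry in a record.
import random

def impute_nearest(records, target_idx, missing_col, n_attrs=2, rng=None):
    rng = rng or random.Random(0)
    target = records[target_idx]
    cols = [c for c in range(len(target))
            if c != missing_col and target[c] is not None]
    chosen = rng.sample(cols, min(n_attrs, len(cols)))  # random attribute subset
    best, best_d = None, float("inf")
    for i, rec in enumerate(records):
        if i == target_idx or rec[missing_col] is None:
            continue  # only complete donors for the missing column
        d = sum((target[c] - rec[c]) ** 2 for c in chosen)  # squared distance
        if d < best_d:
            best, best_d = rec, d
    return best[missing_col] if best else None
```

    Repeating the draw with different random attribute subsets yields the multiple imputations, which is what makes this a multiple- rather than single-imputation scheme.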

  11. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data.

    PubMed

    Tian, Ting; McLachlan, Geoffrey J; Dieters, Mark J; Basford, Kaye E

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations in these data arrays and developed a novel approach based on hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned real values from the nearest neighbour to the record with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher estimation accuracy than those using non-Bayesian analysis but were more time-consuming. However, the novel multiple agglomerative hierarchical clustering approach demonstrated the best overall performance.

  12. Multiple perspective vulnerability analysis of the power network

    NASA Astrophysics Data System (ADS)

    Wang, Shuliang; Zhang, Jianhua; Duan, Na

    2018-02-01

    To understand the vulnerability of the power network from multiple perspectives, multi-angle and multi-dimensional vulnerability analysis, as well as community-based vulnerability analysis, are proposed in this paper. Taking the central China power grid as an example, correlation analysis of different vulnerability models is discussed. Then, the vulnerabilities produced by different vulnerability metrics under the given vulnerability models and failure scenarios are analyzed. Finally, applying a community detection approach, critical areas of the central China power grid are identified, and vulnerable and robust communities are obtained and analyzed from both topological and functional perspectives. The approach introduced in this paper can be used to help decision makers develop optimal protection strategies. It will also be useful for multi-perspective vulnerability analysis of other infrastructure systems.
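    A minimal topological vulnerability metric of the kind used in such grid studies can be sketched as: remove one node and measure the relative drop in the size of the largest connected component. The paper's own metrics and failure models are richer; this is only an illustration of the topological perspective.

```python
# Minimal sketch of a topological vulnerability metric: the relative loss
# in largest-connected-component size when a single node is removed.  The
# paper's metrics and failure scenarios are richer; this is illustrative.
def largest_component(nodes, edges):
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:
            adj[a].add(b); adj[b].add(a)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:                      # depth-first traversal
            u = stack.pop(); size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v); stack.append(v)
        best = max(best, size)
    return best

def node_vulnerability(nodes, edges, node):
    """Relative loss in largest-component size when `node` is removed."""
    before = largest_component(nodes, edges)
    rest = [n for n in nodes if n != node]
    after = largest_component(rest, [e for e in edges if node not in e])
    return (before - after) / before
```

    Ranking nodes by this score is one simple way to flag critical areas of a grid; functional metrics would weight nodes by load or generation instead of pure connectivity.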

  13. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    PubMed Central

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the "large d, small n" characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. Most existing integrative analyses assume the homogeneity model, which postulates that different datasets share the same set of markers, and several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival; this model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach, which has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. Simulation shows that it outperforms the existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111

  14. A Bayesian Approach to a Multiple-Group Latent Class-Profile Analysis: The Timing of Drinking Onset and Subsequent Drinking Behaviors among U.S. Adolescents

    ERIC Educational Resources Information Center

    Chung, Hwan; Anthony, James C.

    2013-01-01

    This article presents a multiple-group latent class-profile analysis (LCPA) by taking a Bayesian approach in which a Markov chain Monte Carlo simulation is employed to achieve more robust estimates for latent growth patterns. This article describes and addresses a label-switching problem that involves the LCPA likelihood function, which has…

  15. Association analysis of multiple traits by an approach of combining P values.

    PubMed

    Chen, Lili; Wang, Yong; Zhou, Yajing

    2018-03-01

    Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most of them are suitable mainly for detecting common variants associated with multiple traits. Because of the low minor allele frequency of rare variants, however, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA) for a single trait to test the association between multiple traits and the rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. We then take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to different directions of effect of the causal variants.
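    The weighted-combination step can be sketched as an adaptive truncated statistic: only the per-variant P values below a truncation threshold contribute, each weighted, and the weights and threshold here are hypothetical placeholders rather than the paper's choices. In an ADA-type procedure, significance of the combined statistic would then be assessed by permutation.

```python
# Illustrative sketch of an adaptive combination of P values: combine only
# the per-variant P values below a truncation threshold tau, with weights
# (placeholder weights here; in practice they might depend on minor allele
# frequency).  Significance would be assessed by permutation, not shown.
import math

def combined_stat(p_values, weights, tau=0.1):
    """Weighted sum of -log(p) over variants whose P value is below tau."""
    return sum(w * -math.log(p)
               for p, w in zip(p_values, weights) if p < tau)
```

    Truncation at tau is what makes the statistic robust to a high proportion of neutral variants: variants with unremarkable P values simply drop out of the sum.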

  16. An Approach towards Ultrasound Kidney Cysts Detection using Vector Graphic Image Analysis

    NASA Astrophysics Data System (ADS)

    Mahmud, Wan Mahani Hafizah Wan; Supriyanto, Eko

    2017-08-01

    This study develops a new approach to the detection of cysts in kidney ultrasound images, covering both single and multiple cysts. Fifty single-cyst images and 25 multiple-cyst images were used to test the developed algorithm. The steps involved in the algorithm are vector graphic image formation and analysis, thresholding, binarization, filtering, and a roundness test. Performance evaluation on the 50 single-cyst images gave an accuracy of 92%, while on the 25 multiple-cyst images the accuracy was about 86.89%. This algorithm may be used in developing a computerized system, such as a computer-aided diagnosis system, to help medical experts diagnose kidney cysts.
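    The final roundness test can be sketched with the standard circularity measure 4·pi·area / perimeter^2, which equals 1 for a perfect circle and decreases for irregular shapes; a candidate region is accepted as a cyst when its roundness exceeds a threshold. The 0.8 threshold below is a hypothetical placeholder, not a value from the paper.

```python
# Minimal sketch of a roundness test for accepting candidate regions as
# cysts: circularity = 4*pi*area / perimeter**2 is 1.0 for a perfect
# circle and lower for irregular shapes.  The 0.8 acceptance threshold is
# a hypothetical placeholder, not taken from the paper.
import math

def roundness(area, perimeter):
    return 4 * math.pi * area / perimeter ** 2

def is_cyst_candidate(area, perimeter, threshold=0.8):
    return roundness(area, perimeter) >= threshold
```

    In a full pipeline, `area` and `perimeter` would come from the binarized, filtered regions produced by the earlier steps.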

  17. Exploring High-D Spaces with Multiform Matrices and Small Multiples

    PubMed Central

    MacEachren, Alan; Dai, Xiping; Hardisty, Frank; Guo, Diansheng; Lengerich, Gene

    2011-01-01

    We introduce an approach to visual analysis of multivariate data that integrates several methods from information visualization, exploratory data analysis (EDA), and geovisualization. The approach leverages the component-based architecture implemented in GeoVISTA Studio to construct a flexible, multiview, tightly (but generically) coordinated, EDA toolkit. This toolkit builds upon traditional ideas behind both small multiples and scatterplot matrices in three fundamental ways. First, we develop a general, MultiForm, Bivariate Matrix and a complementary MultiForm, Bivariate Small Multiple plot in which different bivariate representation forms can be used in combination. We demonstrate the flexibility of this approach with matrices and small multiples that depict multivariate data through combinations of: scatterplots, bivariate maps, and space-filling displays. Second, we apply a measure of conditional entropy to (a) identify variables from a high-dimensional data set that are likely to display interesting relationships and (b) generate a default order of these variables in the matrix or small multiple display. Third, we add conditioning, a kind of dynamic query/filtering in which supplementary (undisplayed) variables are used to constrain the view onto variables that are displayed. Conditioning allows the effects of one or more well understood variables to be removed from the analysis, making relationships among remaining variables easier to explore. We illustrate the individual and combined functionality enabled by this approach through application to analysis of cancer diagnosis and mortality data and their associated covariates and risk factors. PMID:21947129

  18. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red spectroscopy, scanning electron microscopy, X-ray fluorescence, atomic absorption, and density gradient procedures. Constituents and compositional features that often go undetected because of limitations in the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple-technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed, and the multiple-technique approach has been evaluated as a whole.

  19. Note on Professor Sizer's Paper.

    ERIC Educational Resources Information Center

    Balderston, Frederick E.

    1979-01-01

    Issues suggested by John Sizer's paper, an overview of the assessment of institutional performance, include: the efficient-frontier approach, multiple-criterion decision-making models, performance analysis approached as path analysis, and assessment of academic quality. (JMD)

  20. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    PubMed

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  1. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    PubMed

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  2. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case analysis and single imputation or substitution, suffer from inefficiency and bias: they make strong parametric assumptions or consider only limit-of-detection censoring. We employ multiple imputation, in conjunction with semiparametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations and compare its operating characteristics to those of the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.

  3. Voxelwise multivariate analysis of multimodality magnetic resonance imaging

    PubMed Central

    Naylor, Melissa G.; Cardenas, Valerie A.; Tosun, Duygu; Schuff, Norbert; Weiner, Michael; Schwartzman, Armin

    2015-01-01

    Most brain magnetic resonance imaging (MRI) studies concentrate on a single MRI contrast or modality, frequently structural MRI. By performing an integrated analysis of several modalities, such as structural, perfusion-weighted, and diffusion-weighted MRI, new insights may be attained to better understand the underlying processes of brain diseases. We compare two voxelwise approaches: (1) fitting multiple univariate models, one for each outcome and then adjusting for multiple comparisons among the outcomes and (2) fitting a multivariate model. In both cases, adjustment for multiple comparisons is performed over all voxels jointly to account for the search over the brain. The multivariate model is able to account for the multiple comparisons over outcomes without assuming independence because the covariance structure between modalities is estimated. Simulations show that the multivariate approach is more powerful when the outcomes are correlated and, even when the outcomes are independent, the multivariate approach is just as powerful or more powerful when at least two outcomes are dependent on predictors in the model. However, multiple univariate regressions with Bonferroni correction remains a desirable alternative in some circumstances. To illustrate the power of each approach, we analyze a case control study of Alzheimer's disease, in which data from three MRI modalities are available. PMID:23408378

  4. Methodological issues underlying multiple decrement life table analysis.

    PubMed

    Mode, C J; Avery, R C; Littman, G S; Potter, R G

    1977-02-01

    In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.

  5. An introduction to multiplicity issues in clinical trials: the what, why, when and how.

    PubMed

    Li, Guowei; Taljaard, Monica; Van den Heuvel, Edwin R; Levine, Mitchell Ah; Cook, Deborah J; Wells, George A; Devereaux, Philip J; Thabane, Lehana

    2017-04-01

    In clinical trials it is not uncommon to face a multiple testing problem, which can have an impact on both type I and type II error rates, leading to inappropriate interpretation of trial results. Multiplicity issues may need to be considered at the design, analysis and interpretation stages of a trial. The proportion of trial reports not adequately correcting for multiple testing remains substantial. The purpose of this article is to provide an introduction to multiple testing issues in clinical trials and to reduce confusion around the need for multiplicity adjustments. We use a tutorial, question-and-answer approach to address the key issues of why, when and how to consider multiplicity adjustments in trials. We summarize the relevant circumstances under which multiplicity adjustments ought to be considered, as well as options for carrying out multiplicity adjustments in terms of trial design factors including Population, Intervention/Comparison, Outcome, Time frame and Analysis (PICOTA). Results are presented in an easy-to-use table and flow diagrams. Confusion about multiplicity issues can be reduced or avoided by considering the potential impact of multiplicity on type I and II errors and, if necessary, pre-specifying statistical approaches to either avoid or adjust for multiplicity in the trial protocol or analysis plan. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
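    One standard answer to the "how" is the Holm step-down procedure, which controls the family-wise type I error rate under arbitrary dependence between the tests. A minimal sketch:

```python
# Minimal sketch of the Holm step-down multiplicity adjustment: sort the m
# P values, compare the k-th smallest (k = 0, 1, ...) against
# alpha / (m - k), and stop at the first non-rejection.  Holm controls the
# family-wise error rate with no assumptions on dependence between tests.
def holm_reject(p_values, alpha=0.05):
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for k, i in enumerate(order):
        if p_values[i] <= alpha / (m - k):
            reject[i] = True
        else:
            break  # once one test fails, all larger P values fail too
    return reject
```

    Holm is uniformly more powerful than the plain Bonferroni correction (which would compare every P value against alpha / m) while offering the same error-rate guarantee.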

  6. VAUD: A Visual Analysis Approach for Exploring Spatio-Temporal Urban Data.

    PubMed

    Chen, Wei; Huang, Zhaosong; Wu, Feiran; Zhu, Minfeng; Guan, Huihua; Maciejewski, Ross

    2017-10-02

    Urban data is massive, heterogeneous, and spatio-temporal, posing a substantial challenge for visualization and analysis. In this paper, we design and implement a novel visual analytics approach, the Visual Analyzer for Urban Data (VAUD), that supports the visualization, querying, and exploration of urban data. Our approach allows for cross-domain correlation from multiple data sources by leveraging spatio-temporal and social inter-connectedness features. Through our approach, the analyst is able to select, filter, and aggregate across multiple data sources and extract information that would be hidden to a single data subset. To illustrate the effectiveness of our approach, we provide case studies on a real urban dataset that contains the cyber-, physical-, and social information of 14 million citizens over 22 days.

  7. 3D fluid-structure modelling and vibration analysis for fault diagnosis of Francis turbine using multiple ANN and multiple ANFIS

    NASA Astrophysics Data System (ADS)

    Saeed, R. A.; Galybin, A. N.; Popov, V.

    2013-01-01

    This paper discusses condition monitoring and fault diagnosis in a Francis turbine based on the integration of numerical modelling with several different artificial intelligence (AI) techniques. In this study, a numerical approach for fluid-structure (turbine runner) analysis is presented. The results of the numerical analysis provide frequency response function (FRF) data sets along the x-, y- and z-directions under different operating loads and for different positions and sizes of faults in the structure. To extract features and reduce the dimensionality of the obtained FRF data, principal component analysis (PCA) was applied. Subsequently, the extracted features are formulated and fed into multiple artificial neural networks (ANN) and multiple adaptive neuro-fuzzy inference systems (ANFIS) in order to identify the size and position of the damage in the runner and estimate the turbine operating conditions. The results demonstrate the effectiveness of this approach, providing satisfactory accuracy even when the input data are corrupted with a certain level of noise.

  8. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function

    PubMed Central

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.

    2009-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575

  9. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function.

    PubMed

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D

    2008-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings.

  10. A Unified Approach to Functional Principal Component Analysis and Functional Multiple-Set Canonical Correlation.

    PubMed

    Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S

    2017-06-01

    Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.

  11. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas, and are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
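
    The key idea of preserving inter-area dependency when generating forecast-error realizations can be sketched as follows. This is a hedged illustration only: the error model and all numbers are synthetic stand-ins, and the full method layers ARIMA modeling and sequential Gaussian simulation on top of this covariance step.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical forecast errors (MW) for three geographically related
    # areas over 200 hours; the areas are deliberately correlated.
    base = rng.normal(size=(200, 1))
    errors = base @ np.array([[1.0, 0.8, 0.6]]) + 0.3 * rng.normal(size=(200, 3))

    # Estimate the cross-area covariance and draw new error realizations
    # through its Cholesky factor, preserving the inter-area dependency.
    cov = np.cov(errors, rowvar=False)
    L = np.linalg.cholesky(cov)
    realizations = rng.normal(size=(1000, 3)) @ L.T

    # The simulated errors reproduce the estimated correlation structure.
    print(np.round(np.corrcoef(realizations, rowvar=False), 1))
    ```

    Sampling independent errors per area would break exactly this correlation structure, which is why multi-area risk analyses need the joint draw.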

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pachuilo, Andrew R; Ragan, Eric; Goodall, John R

    Visualization tools can take advantage of multiple coordinated views to support analysis of large, multidimensional data sets. Effective design of such views and layouts can be challenging, but understanding users' analysis strategies can inform design improvements. We outline an approach for intelligent design configuration of visualization tools with multiple coordinated views, and we discuss a proposed software framework to support the approach. The proposed software framework could capture and learn from user interaction data to automate new compositions of views and widgets. Such a framework could reduce the time needed for meta-analysis of visualization use and lead to more effective visualization design.

  13. Better Crunching: Recommendations for Multivariate Data Analysis Approaches for Program Impact Evaluations

    ERIC Educational Resources Information Center

    Braverman, Marc T.

    2016-01-01

    Extension program evaluations often present opportunities to analyze data in multiple ways. This article suggests that program evaluations can involve more sophisticated data analysis approaches than are often used. On the basis of a hypothetical program scenario and corresponding data set, two approaches to testing for evidence of program impact…

  14. Integrative Exploratory Analysis of Two or More Genomic Datasets.

    PubMed

    Meng, Chen; Culhane, Aedin

    2016-01-01

    Exploratory analysis is an essential step in the analysis of high throughput data. Multivariate approaches such as correspondence analysis (CA), principal component analysis, and multidimensional scaling are widely used in the exploratory analysis of a single dataset. Modern biological studies often assay multiple types of biological molecules (e.g., mRNA, protein, phosphoproteins) on the same set of biological samples, thereby creating multiple different types of omics data or multiassay data. Integrative exploratory analysis of these multiple omics data is required to leverage the potential of multiple omics studies. In this chapter, we describe the application of co-inertia analysis (CIA; for analyzing two datasets) and multiple co-inertia analysis (MCIA; for three or more datasets) to address this problem. These methods are powerful yet simple multivariate approaches that represent samples using a lower number of variables, allowing easier identification of the correlated structure within and between multiple high-dimensional datasets. Graphical representations can be employed for this purpose. In addition, the methods simultaneously project samples and variables (genes, proteins) onto the same lower-dimensional space, so the most variant variables from each dataset can be selected and associated with samples, which can further facilitate biological interpretation and pathway analysis. We applied CIA to explore the concordance between mRNA and protein expression in a panel of 60 tumor cell lines from the National Cancer Institute. In the same 60 cell lines, we used MCIA to perform a cross-platform comparison of mRNA gene expression profiles obtained on four different microarray platforms. Lastly, as an example of integrative analysis of multiassay or multi-omics data, we analyzed transcriptomic, proteomic, and phosphoproteomic data from induced pluripotent stem (iPS) and embryonic stem (ES) cell lines.
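
    At its core, co-inertia analysis finds pairs of axes, one per table, whose sample scores covary maximally; with equal sample weights this reduces to an SVD of the cross-covariance between the two centred tables. A minimal sketch on synthetic data (the feature counts and the single shared latent signal are invented for illustration; real CIA also applies table-specific weighting):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Two hypothetical omics tables measured on the same 30 samples:
    # 50 mRNA features and 40 protein features sharing one latent signal.
    latent = rng.normal(size=(30, 1))
    mrna = latent @ rng.normal(size=(1, 50)) + 0.5 * rng.normal(size=(30, 50))
    prot = latent @ rng.normal(size=(1, 40)) + 0.5 * rng.normal(size=(30, 40))

    # Centre each table, then decompose the cross-product matrix: the
    # leading singular vectors give a pair of co-inertia axes whose sample
    # scores have maximal covariance across the two datasets.
    X = mrna - mrna.mean(axis=0)
    Y = prot - prot.mean(axis=0)
    U, s, Vt = np.linalg.svd(X.T @ Y, full_matrices=False)
    scores_x = X @ U[:, 0]   # sample scores from the mRNA table
    scores_y = Y @ Vt[0, :]  # sample scores from the protein table

    # Concordant samples line up across the two score vectors.
    print(round(abs(np.corrcoef(scores_x, scores_y)[0, 1]), 2))
    ```

    The entries of `U[:, 0]` and `Vt[0, :]` are the variable loadings, which is what lets the most variant genes and proteins be projected onto the same space as the samples.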

  15. Voxelwise multivariate analysis of multimodality magnetic resonance imaging.

    PubMed

    Naylor, Melissa G; Cardenas, Valerie A; Tosun, Duygu; Schuff, Norbert; Weiner, Michael; Schwartzman, Armin

    2014-03-01

    Most brain magnetic resonance imaging (MRI) studies concentrate on a single MRI contrast or modality, frequently structural MRI. By performing an integrated analysis of several modalities, such as structural, perfusion-weighted, and diffusion-weighted MRI, new insights may be attained to better understand the underlying processes of brain diseases. We compare two voxelwise approaches: (1) fitting multiple univariate models, one for each outcome and then adjusting for multiple comparisons among the outcomes and (2) fitting a multivariate model. In both cases, adjustment for multiple comparisons is performed over all voxels jointly to account for the search over the brain. The multivariate model is able to account for the multiple comparisons over outcomes without assuming independence because the covariance structure between modalities is estimated. Simulations show that the multivariate approach is more powerful when the outcomes are correlated and, even when the outcomes are independent, the multivariate approach is just as powerful or more powerful when at least two outcomes are dependent on predictors in the model. However, multiple univariate regressions with Bonferroni correction remain a desirable alternative in some circumstances. To illustrate the power of each approach, we analyze a case control study of Alzheimer's disease, in which data from three MRI modalities are available. Copyright © 2013 Wiley Periodicals, Inc.

  16. Cross-Sectional Time Series Designs: A General Transformation Approach.

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; McDonald, Roderick P.

    1991-01-01

    The general transformation approach to time series analysis is extended to the analysis of multiple unit data by the development of a patterned transformation matrix. The procedure includes alternatives for special cases and requires only minor revisions in existing computer software. (SLD)

  17. Combining information from multiple flood projections in a hierarchical Bayesian framework

    NASA Astrophysics Data System (ADS)

    Le Vine, Nataliya

    2016-04-01

    This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure an adequate "baseline" against which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value for how conclusions about changing exposure to flooding are drawn, and for flood frequency change attribution studies.

  18. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for the Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach over LHS: (1) It performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter. (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy for the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced substantially with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the GLUE framework, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
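
    The GLUE workflow that the ɛ-NSGAII sampler plugs into can be sketched in a few lines. In this hedged toy version, a power-law model stands in for the Xinanjiang model, plain Monte Carlo sampling stands in for ɛ-NSGAII, and the Nash-Sutcliffe efficiency serves as a single likelihood measure; all data and thresholds are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy "rainfall-runoff" model standing in for XAJ: flow = a * rain ** b.
    rain = np.linspace(1.0, 10.0, 50)
    observed = 2.0 * rain ** 1.5 + rng.normal(0.0, 1.0, 50)

    # GLUE step 1: sample the parameter space (the paper replaces plain
    # Monte Carlo with epsilon-NSGAII to reach behavioral sets faster).
    a = rng.uniform(0.5, 4.0, 5000)
    b = rng.uniform(1.0, 2.0, 5000)
    sims = a[:, None] * rain[None, :] ** b[:, None]

    # GLUE step 2: likelihood measure (Nash-Sutcliffe efficiency) plus a
    # behavioral threshold separating acceptable parameter sets.
    nse = 1.0 - ((sims - observed) ** 2).sum(axis=1) / (
        (observed - observed.mean()) ** 2).sum()
    behavioral = nse > 0.9

    # GLUE step 3: prediction bounds taken over the behavioral simulations.
    lower, upper = np.percentile(sims[behavioral], [5, 95], axis=0)
    coverage = ((observed >= lower) & (observed <= upper)).mean()
    print(behavioral.sum(), "behavioral sets; coverage", round(coverage, 2))
    ```

    The efficiency question the paper addresses is visible here: with blind sampling most of the 5000 draws are wasted on non-behavioral regions, which is what a targeted sampler aims to avoid.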

  19. An integrative system biology approach to unravel potential drug candidates for multiple age related disorders.

    PubMed

    Srivastava, Isha; Khurana, Pooja; Yadav, Mohini; Hasija, Yasha

    2017-12-01

    Aging, though an inevitable part of life, is becoming a worldwide social and economic problem. Healthy aging is usually marked by a low probability of age-related disorders (ARDs), yet effective therapeutic approaches for curing ARDs are still needed. The occurrence of more than one ARD in an individual underscores the need to discover target proteins that can affect multiple ARDs. Scientific and medical research technologies have advanced over the last three decades to the point where many of the key molecular determinants of human disorders can be examined thoroughly. In this study, we designed and executed an approach to prioritize drugs that may target multiple age-related disorders. Our methodology, focused on the analysis of biological pathways and protein-protein interaction (PPI) networks that may contribute to the pharmacology of age-related disorders, included various steps such as retrieval and analysis of data, protein-protein interaction network analysis, statistical and comparative analysis of topological coefficients, pathway and functional enrichment analysis, and identification of drug-target proteins. We propose that the identified molecular determinants may be prioritized for further screening as novel drug targets for curing multiple ARDs. Based on the analysis, an online tool named 'ARDnet' has been developed to construct and demonstrate ARD interactions at the level of PPIs, ARD-ARD protein interactions, ARD pathway interactions, and drug-target interactions. The tool is freely available at http://genomeinformatics.dtu.ac.in/ARDNet/Index.html. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.

    PubMed

    Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph

    2010-06-01

    We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.

  1. Meta-analysis of correlated traits via summary statistics from GWASs with an application in hypertension.

    PubMed

    Zhu, Xiaofeng; Feng, Tao; Tayo, Bamidele O; Liang, Jingjing; Young, J Hunter; Franceschini, Nora; Smith, Jennifer A; Yanek, Lisa R; Sun, Yan V; Edwards, Todd L; Chen, Wei; Nalls, Mike; Fox, Ervin; Sale, Michele; Bottinger, Erwin; Rotimi, Charles; Liu, Yongmei; McKnight, Barbara; Liu, Kiang; Arnett, Donna K; Chakravati, Aravinda; Cooper, Richard S; Redline, Susan

    2015-01-08

    Genome-wide association studies (GWASs) have identified many genetic variants underlying complex traits. Many detected genetic loci harbor variants that associate with multiple, even distinct, traits. Most current analysis approaches focus on single traits, even though the final results from multiple traits are evaluated together. Such approaches miss the opportunity to systematically integrate the phenome-wide data available for genetic association analysis. In this study, we propose a general approach that can integrate association evidence from summary statistics of multiple traits, either correlated, independent, continuous, or binary traits, which might come from the same or different studies. We allow for trait heterogeneity effects. Population structure and cryptic relatedness can also be controlled. Our simulations suggest that the proposed method has improved statistical power over single-trait analysis in most of the cases we studied. We applied our method to the Continental Origins and Genetic Epidemiology Network (COGENT) African ancestry samples for three blood pressure traits and identified four loci (CHIC2, HOXA-EVX1, IGFBP1/IGFBP3, and CDH17; p < 5.0 × 10⁻⁸) associated with hypertension-related traits that were missed by a single-trait analysis in the original report. Six additional loci with suggestive association evidence (p < 5.0 × 10⁻⁷) were also observed, including CACNA1D and WNT3. Our study strongly suggests that analyzing multiple phenotypes can improve statistical power and that such analysis can be executed with the summary statistics from GWASs. Our method also provides a way to study cross-phenotype (CP) associations by using summary statistics from GWASs of multiple phenotypes. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
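
    As background for combining summary statistics, the classical fixed-effect inverse-variance building block looks like this; the paper's method generalizes well beyond it (correlated traits, heterogeneity, relatedness), and the per-study effect sizes below are invented for illustration.

    ```python
    import math

    def inverse_variance_meta(betas, ses):
        """Fixed-effect inverse-variance meta-analysis of summary statistics.

        Each study contributes its effect estimate (beta) and standard
        error (se); studies are assumed independent. Returns the pooled
        effect estimate and its standard error.
        """
        weights = [1.0 / se ** 2 for se in ses]
        pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        return pooled, pooled_se

    # Hypothetical per-study effects of one SNP on a blood pressure trait:
    beta, se = inverse_variance_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.06])
    print(round(beta, 3), round(se, 3))  # -> 0.107 0.028
    ```

    The pooled standard error is smaller than any single study's, which is the basic power gain that motivates meta-analysis of GWAS summary statistics.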

  2. The Use of Multiple Regression and Trend Analysis to Understand Enrollment Fluctuations. AIR Forum 1979 Paper.

    ERIC Educational Resources Information Center

    Campbell, S. Duke; Greenberg, Barry

    The development of a predictive equation capable of explaining a significant percentage of enrollment variability at Florida International University is described. A model utilizing trend analysis and a multiple regression approach to enrollment forecasting was adapted to investigate enrollment dynamics at the university. Four independent…

  3. Applied immuno-epidemiological research: an approach for integrating existing knowledge into the statistical analysis of multiple immune markers.

    PubMed

    Genser, Bernd; Fischer, Joachim E; Figueiredo, Camila A; Alcântara-Neves, Neuza; Barreto, Mauricio L; Cooper, Philip J; Amorim, Leila D; Saemann, Marcus D; Weichhart, Thomas; Rodrigues, Laura C

    2016-05-20

    Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights from complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. This framework would allow the immunologists' hypotheses about the underlying biological mechanisms to be integrated. We present an analytical approach for statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers to immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases or asthma. We demonstrate how to aggregate systematically the information captured in multiple cytokine measurements to immunological summary scores aimed at reflecting the presumed underlying immunological mechanisms (Th1/Th2 balance and immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models with outcomes of immunological studies (e.g. specific IgE) and compare the results to those obtained by a traditional multivariate regression approach. 
The proposed analytical approach may be especially useful to quantify complex immune responses in immuno-epidemiological studies, where investigators examine the relationship among epidemiological patterns, immune response, and disease outcomes.

  4. Two-Way Regularized Fuzzy Clustering of Multiple Correspondence Analysis.

    PubMed

    Kim, Sunmee; Choi, Ji Yeh; Hwang, Heungsun

    2017-01-01

    Multiple correspondence analysis (MCA) is a useful tool for investigating the interrelationships among dummy-coded categorical variables. MCA has been combined with clustering methods to examine whether there exist heterogeneous subclusters of a population, which exhibit cluster-level heterogeneity. These combined approaches aim to classify either observations only (one-way clustering of MCA) or both observations and variable categories (two-way clustering of MCA). The latter approach is favored because its solutions are easier to interpret by providing explicitly which subgroup of observations is associated with which subset of variable categories. Nonetheless, the two-way approach has been built on hard classification that assumes observations and/or variable categories to belong to only one cluster. To relax this assumption, we propose two-way fuzzy clustering of MCA. Specifically, we combine MCA with fuzzy k-means simultaneously to classify a subgroup of observations and a subset of variable categories into a common cluster, while allowing both observations and variable categories to belong partially to multiple clusters. Importantly, we adopt regularized fuzzy k-means, thereby enabling us to decide the degree of fuzziness in cluster memberships automatically. We evaluate the performance of the proposed approach through the analysis of simulated and real data, in comparison with existing two-way clustering approaches.
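
    The fuzzy k-means ingredient, shown here without the MCA coupling or the regularization term that the paper adds, can be sketched as follows; the cluster count, the fuzzifier m, and the two synthetic blobs are illustrative assumptions.

    ```python
    import numpy as np

    def fuzzy_kmeans(X, k, m=2.0, iters=50, seed=0):
        """Plain fuzzy k-means: every point receives a membership in every
        cluster. The fuzzifier m > 1 controls softness (m -> 1 approaches
        hard k-means). Returns (memberships (n, k), centroids (k, d))."""
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(k), size=len(X))  # random soft memberships
        for _ in range(iters):
            W = U ** m
            # Centroids are membership-weighted means of the points.
            centroids = (W.T @ X) / W.sum(axis=0)[:, None]
            dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            # Standard update: membership decays with distance to centroid.
            inv = np.maximum(dist, 1e-12) ** (-2.0 / (m - 1.0))
            U = inv / inv.sum(axis=1, keepdims=True)
        return U, centroids

    # Two well-separated blobs: memberships settle near 0/1 for points deep
    # inside each blob, with genuinely fuzzy values only in between.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 0.5, (30, 2)), rng.normal(5.0, 0.5, (30, 2))])
    U, C = fuzzy_kmeans(X, k=2)
    print(np.round(U.sum(axis=1)[:3], 6))  # memberships of each point sum to 1
    ```

    Choosing m by hand is exactly the burden the paper's regularized variant removes, by estimating the degree of fuzziness automatically.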

  5. Conjoint analysis: a pragmatic approach for the accounting of multiple benefits in southern forest management

    Treesearch

    F. Christian Zinkhan; Thomas P. Holmes; D. Evan Mercer

    1994-01-01

    With conjoint analysis as its foundation, a practical approach for measuring the utility and dollar value of non-market outputs from southern forests is described and analyzed. The approach can be used in the process of evaluating alternative silvicultural and broader natural resource management plans when non-market as well as market outputs are recognized. When...

  6. Multiple products monitoring as a robust approach for peptide quantification.

    PubMed

    Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee

    2009-07-01

    Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because determining quantifiable product ions for either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous ion-selection process. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarities between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves the sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
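
    One standard ingredient of the similarity term in such a score is a normalized dot product between the query and reference spectra. The sketch below is a simplification, not the MpM score itself: the m/z keys, intensities, and exact-match peak binning are invented, and MpM additionally weighs absolute product-ion intensities.

    ```python
    import math

    def cosine_similarity(query, reference):
        """Normalized dot-product similarity between two MS2 spectra,
        each given as an {m/z: intensity} mapping with pre-binned peaks."""
        shared = set(query) & set(reference)
        dot = sum(query[mz] * reference[mz] for mz in shared)
        nq = math.sqrt(sum(v * v for v in query.values()))
        nr = math.sqrt(sum(v * v for v in reference.values()))
        return dot / (nq * nr)

    # Hypothetical product-ion intensities for a target peptide:
    reference = {200.1: 100.0, 300.2: 80.0, 400.3: 40.0}
    query = {200.1: 90.0, 300.2: 85.0, 500.4: 5.0}
    print(round(cosine_similarity(query, reference), 3))  # -> 0.951
    ```

    A score near 1 indicates the query spectrum matches the reference fragmentation pattern; unmatched peaks (here 400.3 and 500.4) pull it down through the norms.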

  7. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed Central

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group. PMID:28034175

  8. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group.
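
    The paper's central recommendation, impute separately by randomized group when the analysis model may involve a group interaction, can be illustrated with a deliberately simplified single regression imputation. All data are simulated, and a full multiple-imputation analysis would add noise draws, repeat the imputation several times, and pool with Rubin's rules.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 2000
    group = rng.integers(0, 2, n)            # randomized arm (0/1)
    x = rng.normal(size=n)                   # baseline covariate
    # True outcome includes a group-by-baseline interaction that an
    # imputation model pooled over arms would miss.
    y = 1.0 * group + 0.5 * x + 1.0 * group * x + rng.normal(0.0, 1.0, n)
    missing = rng.random(n) < 0.3 * (x > 0)  # MAR: depends on baseline x

    def regression_impute(y, x, fit_mask):
        """Single regression imputation of y from x (illustration only)."""
        slope, intercept = np.polyfit(x[fit_mask], y[fit_mask], 1)
        out = y.copy()
        out[~fit_mask] = intercept + slope * x[~fit_mask]
        return out

    obs = ~missing
    # Pooled imputation ignores the arm, flattening the interaction.
    y_pooled = regression_impute(y, x, obs)
    # Imputing separately by randomized group keeps arm-specific slopes.
    y_bygroup = y.copy()
    for g in (0, 1):
        sel = group == g
        y_bygroup[sel] = regression_impute(y[sel], x[sel], obs[sel])

    true_effect = y[group == 1].mean() - y[group == 0].mean()
    for label, yy in [("pooled", y_pooled), ("by group", y_bygroup)]:
        est = yy[group == 1].mean() - yy[group == 0].mean()
        print(f"{label}: treatment effect estimate {est:.2f}")
    ```

    Because the pooled model averages the two arm-specific slopes, it systematically shrinks the estimated arm difference among imputed observations, reproducing in miniature the bias the simulation study reports.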

  9. A Behavior Analysis Approach toward Chronic Food Refusal in Children with Gastrostomy-Tube Dependency.

    ERIC Educational Resources Information Center

    Luiselli, James K.; Luiselli, Tracy Evans

    1995-01-01

    This report describes a behavior analysis treatment approach to establishing oral feeding in children with multiple developmental disabilities and gastrostomy-tube dependency. Pretreatment screening, functional assessment, and treatment are reported as implemented within a behavioral consultation model. A case study illustrates the sequence and…

  10. Structured plant metabolomics for the simultaneous exploration of multiple factors.

    PubMed

    Vasilev, Nikolay; Boccard, Julien; Lang, Gerhard; Grömping, Ulrike; Fischer, Rainer; Goepfert, Simon; Rudaz, Serge; Schillberg, Stefan

    2016-11-17

    Multiple factors act simultaneously on plants to establish complex interaction networks involving nutrients, elicitors and metabolites. Metabolomics offers a better understanding of complex biological systems, but evaluating the simultaneous impact of different parameters on metabolic pathways that have many components is a challenging task. We therefore developed a novel approach that combines experimental design, untargeted metabolic profiling based on multiple chromatography systems and ionization modes, and multiblock data analysis, facilitating the systematic analysis of metabolic changes in plants caused by different factors acting at the same time. Using this method, target geraniol compounds produced in transgenic tobacco cell cultures were grouped into clusters based on their response to different factors. We hypothesized that our novel approach may provide more robust data for process optimization in plant cell cultures producing any target secondary metabolite, based on the simultaneous exploration of multiple factors rather than varying one factor each time. The suitability of our approach was verified by confirming several previously reported examples of elicitor-metabolite crosstalk. However, unravelling all factor-metabolite networks remains challenging because it requires the identification of all biochemically significant metabolites in the metabolomics dataset.

  11. On Holo-Hilbert Spectral Analysis: A Full Informational Spectral Representation for Nonlinear and Non-Stationary Data

    NASA Technical Reports Server (NTRS)

    Huang, Norden E.; Hu, Kun; Yang, Albert C. C.; Chang, Hsing-Chih; Jia, Deng; Liang, Wei-Kuang; Yeh, Jia Rong; Kao, Chu-Lan; Juan, Chi-Huang; Peng, Chung Kang

    2016-01-01

    The Holo-Hilbert spectral analysis (HHSA) method is introduced to cure the deficiencies of traditional spectral analysis and to give a full informational representation of nonlinear and non-stationary data. It uses a nested empirical mode decomposition and Hilbert-Huang transform (HHT) approach to identify intrinsic amplitude and frequency modulations often present in nonlinear systems. Comparisons are first made with traditional spectrum analysis, which usually achieved its results through convolutional integral transforms based on additive expansions of an a priori determined basis, mostly under linear and stationary assumptions. Thus, for non-stationary processes, the best one could do historically was to use the time-frequency representations, in which the amplitude (or energy density) variation is still represented in terms of time. For nonlinear processes, the data can have both amplitude and frequency modulations (intra-mode and inter-mode) generated by two different mechanisms: linear additive or nonlinear multiplicative processes. As all existing spectral analysis methods are based on additive expansions, either a priori or adaptive, none of them could possibly represent the multiplicative processes. While the earlier adaptive HHT spectral analysis approach could accommodate the intra-wave nonlinearity quite remarkably, it remained that any inter-wave nonlinear multiplicative mechanisms that include cross-scale coupling and phase-lock modulations were left untreated. To resolve the multiplicative processes issue, additional dimensions in the spectrum result are needed to account for the variations in both the amplitude and frequency modulations simultaneously. HHSA accommodates all the processes: additive and multiplicative, intra-mode and inter-mode, stationary and non-stationary, linear and nonlinear interactions. The Holo prefix in HHSA denotes a multiple dimensional representation with both additive and multiplicative capabilities.

  12. On Holo-Hilbert spectral analysis: a full informational spectral representation for nonlinear and non-stationary data

    PubMed Central

    Huang, Norden E.; Hu, Kun; Yang, Albert C. C.; Chang, Hsing-Chih; Jia, Deng; Liang, Wei-Kuang; Yeh, Jia Rong; Kao, Chu-Lan; Juan, Chi-Hung; Peng, Chung Kang; Meijer, Johanna H.; Wang, Yung-Hung; Long, Steven R.; Wu, Zhaohua

    2016-01-01

    The Holo-Hilbert spectral analysis (HHSA) method is introduced to cure the deficiencies of traditional spectral analysis and to give a full informational representation of nonlinear and non-stationary data. It uses a nested empirical mode decomposition and Hilbert–Huang transform (HHT) approach to identify intrinsic amplitude and frequency modulations often present in nonlinear systems. Comparisons are first made with traditional spectrum analysis, which usually achieved its results through convolutional integral transforms based on additive expansions of an a priori determined basis, mostly under linear and stationary assumptions. Thus, for non-stationary processes, the best one could do historically was to use the time–frequency representations, in which the amplitude (or energy density) variation is still represented in terms of time. For nonlinear processes, the data can have both amplitude and frequency modulations (intra-mode and inter-mode) generated by two different mechanisms: linear additive or nonlinear multiplicative processes. As all existing spectral analysis methods are based on additive expansions, either a priori or adaptive, none of them could possibly represent the multiplicative processes. While the earlier adaptive HHT spectral analysis approach could accommodate the intra-wave nonlinearity quite remarkably, it remained that any inter-wave nonlinear multiplicative mechanisms that include cross-scale coupling and phase-lock modulations were left untreated. To resolve the multiplicative processes issue, additional dimensions in the spectrum result are needed to account for the variations in both the amplitude and frequency modulations simultaneously. HHSA accommodates all the processes: additive and multiplicative, intra-mode and inter-mode, stationary and non-stationary, linear and nonlinear interactions. The Holo prefix in HHSA denotes a multiple dimensional representation with both additive and multiplicative capabilities. PMID:26953180

  13. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced using complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study in comparison with multiple imputation which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
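
The contrast the abstract draws between complete case analysis and multiple imputation can be sketched on simulated data. This is a deliberately simplified illustration (a single regression-based imputation model, estimates pooled by averaging, with invented variable names and effect sizes), not the study's actual NSQIP analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Simulated cohort: albumin rises slightly with age, and younger patients
# are more likely to be missing the lab value (missing at random given age).
age = rng.normal(65, 10, n)
albumin = 3.5 + 0.01 * (age - 65) + rng.normal(0, 0.3, n)
p_missing = 1 / (1 + np.exp((age - 60) / 5))      # younger -> more missing
missing = rng.random(n) < p_missing

# Complete case analysis: simply drop rows with a missing albumin value.
cc_mean = albumin[~missing].mean()

# Multiple imputation (simplified): fit albumin ~ age on complete cases,
# then impute M times with residual noise and pool by averaging.
obs = ~missing
slope, intercept = np.polyfit(age[obs], albumin[obs], 1)
resid_sd = np.std(albumin[obs] - (intercept + slope * age[obs]))
M = 20
mi_means = []
for _ in range(M):
    imputed = albumin.copy()
    imputed[missing] = (intercept + slope * age[missing]
                        + rng.normal(0, resid_sd, missing.sum()))
    mi_means.append(imputed.mean())
mi_mean = float(np.mean(mi_means))

true_mean = albumin.mean()   # the full-data target
```

Because the preferentially dropped patients are younger and have lower albumin, the complete-case mean is biased upward, while the pooled imputed estimate stays close to the full-data mean.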

  14. Multiple shooting shadowing for sensitivity analysis of chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Blonigan, Patrick J.; Wang, Qiqi

    2018-02-01

    Sensitivity analysis methods are important tools for research and design with simulations. Many important simulations exhibit chaotic dynamics, including scale-resolving turbulent fluid flow simulations. Unfortunately, conventional sensitivity analysis methods are unable to compute useful gradient information for long-time-averaged quantities in chaotic dynamical systems. Sensitivity analysis with least squares shadowing (LSS) can compute useful gradient information for a number of chaotic systems, including simulations of chaotic vortex shedding and homogeneous isotropic turbulence. However, this gradient information comes at a very high computational cost. This paper presents multiple shooting shadowing (MSS), a more computationally efficient shadowing approach than the original LSS approach. Through an analysis of the convergence rate of MSS, it is shown that MSS can have lower memory usage and run time than LSS.

  15. Effectively Identifying eQTLs from Multiple Tissues by Combining Mixed Model and Meta-analytic Approaches

    PubMed Central

    Choi, Ted; Eskin, Eleazar

    2013-01-01

    Gene expression data, in conjunction with information on genetic variants, have enabled studies to identify expression quantitative trait loci (eQTLs) or polymorphic locations in the genome that are associated with expression levels. Moreover, recent technological developments and cost decreases have further enabled studies to collect expression data in multiple tissues. One advantage of multiple tissue datasets is that studies can combine results from different tissues to identify eQTLs more accurately than examining each tissue separately. The idea of aggregating results of multiple tissues is closely related to the idea of meta-analysis which aggregates results of multiple genome-wide association studies to improve the power to detect associations. In principle, meta-analysis methods can be used to combine results from multiple tissues. However, eQTLs may have effects in only a single tissue, in all tissues, or in a subset of tissues with possibly different effect sizes. This heterogeneity in terms of effects across multiple tissues presents a key challenge to detect eQTLs. In this paper, we develop a framework that leverages two popular meta-analysis methods that address effect size heterogeneity to detect eQTLs across multiple tissues. We show by using simulations and multiple tissue data from mouse that our approach detects many eQTLs undetected by traditional eQTL methods. Additionally, our method provides an interpretation framework that accurately predicts whether an eQTL has an effect in a particular tissue. PMID:23785294
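
The core idea of borrowing strength across tissues can be illustrated with the simplest meta-analytic combination, a fixed-effects inverse-variance weighted average. The numbers below are invented, and the paper's actual framework additionally models effect-size heterogeneity across tissues, which this sketch omits:

```python
import numpy as np

# Hypothetical per-tissue eQTL effect estimates (beta) and standard errors.
beta = np.array([0.30, 0.25, 0.35, 0.28])
se = np.array([0.15, 0.14, 0.16, 0.15])

# Fixed-effects (inverse-variance weighted) combination across tissues.
w = 1.0 / se**2
beta_meta = np.sum(w * beta) / np.sum(w)
se_meta = np.sqrt(1.0 / np.sum(w))

# The combined z-score exceeds every single-tissue z-score because the
# concordant evidence from all four tissues is pooled.
z_single = beta / se
z_meta = beta_meta / se_meta
```

When the eQTL acts in only a subset of tissues, this fixed-effects combination loses power, which is exactly the heterogeneity problem the paper's framework is designed to address.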

  16. Validity and Reliability of Chemistry Systemic Multiple Choices Questions (CSMCQs)

    ERIC Educational Resources Information Center

    Priyambodo, Erfan; Marfuatun

    2016-01-01

    Nowadays, Rasch model analysis is used widely in social research, and even more so in educational research. In this research, the Rasch model is used to determine the validity and reliability of systemic multiple choice questions in chemistry teaching and learning. There were 30 multiple choice questions with a systemic approach for high school student…

  17. A retrospective likelihood approach for efficient integration of multiple omics factors in case-control association studies.

    PubMed

    Balliu, Brunilda; Tsonaka, Roula; Boehringer, Stefan; Houwing-Duistermaat, Jeanine

    2015-03-01

    Integrative omics, the joint analysis of outcome and multiple types of omics data, such as genomics, epigenomics, and transcriptomics data, constitutes a promising approach for powerful and biologically relevant association studies. These studies often employ a case-control design, and often include nonomics covariates, such as age and gender, that may modify the underlying omics risk factors. An open question is how to best integrate multiple omics and nonomics information to maximize statistical power in case-control studies that ascertain individuals based on the phenotype. Recent work on integrative omics has used prospective approaches, modeling case-control status conditional on omics and nonomics risk factors. Compared to univariate approaches, jointly analyzing multiple risk factors with a prospective approach increases power in nonascertained cohorts. However, these prospective approaches often lose power in case-control studies. In this article, we propose a novel statistical method for integrating multiple omics and nonomics factors in case-control association studies. Our method is based on a retrospective likelihood function that models the joint distribution of omics and nonomics factors conditional on case-control status. The new method provides accurate control of Type I error rate and has increased efficiency over prospective approaches in both simulated and real data. © 2015 Wiley Periodicals, Inc.

  18. Qualitative case study data analysis: an example from practice.

    PubMed

    Houghton, Catherine; Murphy, Kathy; Shaw, David; Casey, Dympna

    2015-05-01

    To illustrate an approach to data analysis in qualitative case study methodology. There is often little detail in case study research about how data were analysed. However, it is important that comprehensive analysis procedures are used because there are often large sets of data from multiple sources of evidence. Furthermore, the ability to describe in detail how the analysis was conducted ensures rigour in reporting qualitative research. The research example used is a multiple case study that explored the role of the clinical skills laboratory in preparing students for the real world of practice. Data analysis was conducted using a framework guided by the four stages of analysis outlined by Morse (1994): comprehending, synthesising, theorising and recontextualising. The specific strategies for analysis in these stages centred on the work of Miles and Huberman (1994), which has been successfully used in case study research. The data were managed using NVivo software. Literature examining qualitative data analysis was reviewed and strategies illustrated by the case study example provided. Each stage of the analysis framework is described with illustration from the research example for the purpose of highlighting the benefits of a systematic approach to handling large data sets from multiple sources. By providing an example of how each stage of the analysis was conducted, it is hoped that researchers will be able to consider the benefits of such an approach to their own case study analysis. This paper illustrates specific strategies that can be employed when conducting data analysis in case study research and other qualitative research designs.

  19. MANGO: a new approach to multiple sequence alignment.

    PubMed

    Zhang, Zefeng; Lin, Hao; Li, Ming

    2007-01-01

    Multiple sequence alignment is a classical and challenging task for biological sequence analysis. The problem is NP-hard. The full dynamic programming takes too much time. The progressive alignment heuristics adopted by most state-of-the-art multiple sequence alignment programs suffer from the 'once a gap, always a gap' phenomenon. Is there a radically new way to do multiple sequence alignment? This paper introduces a novel and orthogonal multiple sequence alignment method, using multiple optimized spaced seeds and new algorithms to handle these seeds efficiently. Our new algorithm processes information of all sequences as a whole, avoiding problems caused by the popular progressive approaches. Because the optimized spaced seeds are provably significantly more sensitive than the consecutive k-mers, the new approach promises to be more accurate and reliable. To validate our new approach, we have implemented MANGO: Multiple Alignment with N Gapped Oligos. Experiments were carried out on large 16S RNA benchmarks showing that MANGO compares favorably, in both accuracy and speed, against state-of-the-art multiple sequence alignment methods, including ClustalW 1.83, MUSCLE 3.6, MAFFT 5.861, ProbConsRNA 1.11, Dialign 2.2.1, DIALIGN-T 0.2.1, T-Coffee 4.85, POA 2.0 and Kalign 2.0.
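
A toy sketch of the spaced-seed idea behind MANGO: a spaced seed requires matches only at its '1' positions, so a substitution falling on a '0' (don't-care) position does not destroy the hit, whereas an exact consecutive k-mer match is lost. The sequences and seed patterns here are invented for illustration:

```python
# Spaced-seed matching: a seed such as '1110111' requires agreement only at
# the '1' positions; '1111111' is an ordinary consecutive 7-mer.

def seed_hits(s, t, seed):
    """All position pairs (i, j) where s and t agree at the seed's '1's."""
    care = [k for k, c in enumerate(seed) if c == "1"]
    span = len(seed)
    return [
        (i, j)
        for i in range(len(s) - span + 1)
        for j in range(len(t) - span + 1)
        if all(s[i + k] == t[j + k] for k in care)
    ]

s = "ACGTACGTAC"
t = "ACGAACGTAC"  # single substitution at index 3 (T -> A)

consecutive = seed_hits(s, t, "1111111")  # no exact 7-mer survives
spaced = seed_hits(s, t, "1110111")       # the hit at (0, 0) survives
```

The substitution sits under the seed's don't-care position, so the spaced seed still anchors the homologous region while the consecutive 7-mer finds nothing.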

  20. Mediation Analysis with Multiple Mediators

    PubMed Central

    VanderWeele, T.J.; Vansteelandt, S.

    2014-01-01

    Recent advances in the causal inference literature on mediation have extended traditional approaches to direct and indirect effects to settings that allow for interactions and non-linearities. In this paper, these approaches from causal inference are further extended to settings in which multiple mediators may be of interest. Two analytic approaches, one based on regression and one based on weighting are proposed to estimate the effect mediated through multiple mediators and the effects through other pathways. The approaches proposed here accommodate exposure-mediator interactions and, to a certain extent, mediator-mediator interactions as well. The methods handle binary or continuous mediators and binary, continuous or count outcomes. When the mediators affect one another, the strategy of trying to assess direct and indirect effects one mediator at a time will in general fail; the approach given in this paper can still be used. A characterization is moreover given as to when the sum of the mediated effects for multiple mediators considered separately will be equal to the mediated effect of all of the mediators considered jointly. The approach proposed in this paper is robust to unmeasured common causes of two or more mediators. PMID:25580377

  1. Mediation Analysis with Multiple Mediators.

    PubMed

    VanderWeele, T J; Vansteelandt, S

    2014-01-01

    Recent advances in the causal inference literature on mediation have extended traditional approaches to direct and indirect effects to settings that allow for interactions and non-linearities. In this paper, these approaches from causal inference are further extended to settings in which multiple mediators may be of interest. Two analytic approaches, one based on regression and one based on weighting are proposed to estimate the effect mediated through multiple mediators and the effects through other pathways. The approaches proposed here accommodate exposure-mediator interactions and, to a certain extent, mediator-mediator interactions as well. The methods handle binary or continuous mediators and binary, continuous or count outcomes. When the mediators affect one another, the strategy of trying to assess direct and indirect effects one mediator at a time will in general fail; the approach given in this paper can still be used. A characterization is moreover given as to when the sum of the mediated effects for multiple mediators considered separately will be equal to the mediated effect of all of the mediators considered jointly. The approach proposed in this paper is robust to unmeasured common causes of two or more mediators.
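
For linear models without interactions (a special case of the framework described above), the joint indirect effect through several mediators can be estimated by the difference method even when one mediator affects another, the situation in which assessing mediators one at a time fails. A minimal simulation with invented coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Linear structural model with two mediators; M2 is affected by M1, so a
# one-mediator-at-a-time analysis would be biased.
A = rng.binomial(1, 0.5, n).astype(float)        # exposure
M1 = 0.5 * A + rng.normal(0, 1, n)               # mediator 1
M2 = 0.3 * A + 0.2 * M1 + rng.normal(0, 1, n)    # mediator 2 (depends on M1)
Y = 0.4 * A + 0.6 * M1 + 0.8 * M2 + rng.normal(0, 1, n)

def ols_coef(cols, y):
    """OLS coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

total = ols_coef([A], Y)[1]             # total effect of A on Y
direct = ols_coef([A, M1, M2], Y)[1]    # direct effect, both mediators in
indirect = total - direct               # joint indirect effect through M1, M2
# Truth: total = 0.4 + 0.6*0.5 + 0.8*(0.3 + 0.2*0.5) = 1.02, indirect = 0.62
```

The key point mirrors the abstract: the mediators are adjusted for jointly, so the path from A through M1 into M2 is still counted as mediated even though M1 is a cause of M2.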

  2. Falcon: A Temporal Visual Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.

    2016-09-05

    Flexible visual exploration of long, high-resolution time series from multiple sensor streams is a challenge in several domains. Falcon is a visual analytics approach that helps researchers acquire a deep understanding of patterns in log and imagery data. Falcon allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations with multiple levels of detail. These capabilities are applicable to the analysis of any quantitative time series.

  3. Classification and Clustering Methods for Multiple Environmental Factors in Gene-Environment Interaction: Application to the Multi-Ethnic Study of Atherosclerosis.

    PubMed

    Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A; Kardia, Sharon L R; Allison, Matthew; Diez Roux, Ana V

    2016-11-01

    There has been an increased interest in identifying gene-environment interaction (G × E) in the context of multiple environmental exposures. Most G × E studies analyze one exposure at a time, but we are exposed to multiple exposures in reality. Efficient analysis strategies for complex G × E with multiple environmental factors in a single model are still lacking. Using the data from the Multi-Ethnic Study of Atherosclerosis, we illustrate a two-step approach for modeling G × E with multiple environmental factors. First, we utilize common clustering and classification strategies (e.g., k-means, latent class analysis, classification and regression trees, Bayesian clustering using Dirichlet Process) to define subgroups corresponding to distinct environmental exposure profiles. Second, we illustrate the use of an additive main effects and multiplicative interaction model, instead of the conventional saturated interaction model using product terms of factors, to study G × E with the data-driven exposure subgroups defined in the first step. We demonstrate useful analytical approaches to translate multiple environmental exposures into one summary class. These tools not only allow researchers to consider several environmental exposures in G × E analysis but also provide some insight into how genes modify the effect of a comprehensive exposure profile instead of examining effect modification for each exposure in isolation.
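
A toy version of the two-step approach: cluster subjects on several exposures at once to obtain a summary exposure profile, then fit a single G × profile interaction term. The clustering here is a tiny deterministic 2-means and all effect sizes are invented; the paper itself considers richer clustering methods and an additive main effects and multiplicative interaction model:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000

# Step 1: cluster subjects on three exposures at once to define a summary
# exposure profile (two latent profiles are simulated).
latent = rng.binomial(1, 0.5, n)                   # true, unobserved profile
E = rng.normal(0.0, 0.5, (n, 3)) + np.outer(latent, [2.0, 2.0, 2.0])

def two_means(X, iters=20):
    """Tiny deterministic 2-means: init with X[0] and its farthest point."""
    centers = np.stack([X[0], X[np.argmax(((X - X[0]) ** 2).sum(axis=1))]])
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

cluster = two_means(E)

# Step 2: a single G x profile interaction term replaces one interaction
# term per exposure.
G = rng.binomial(2, 0.3, n).astype(float)          # genotype coded 0/1/2
Y = 0.5 * G * latent + rng.normal(0, 1, n)         # true G x E interaction
X = np.column_stack([np.ones(n), G, cluster, G * cluster])
gxe = np.linalg.lstsq(X, Y, rcond=None)[0][3]      # |gxe| should be near 0.5
```

Note that cluster labels are only identified up to relabeling, so the interaction coefficient may come out with the opposite sign; its magnitude is what carries the G × E signal.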

  4. Citation Patterns of Engineering, Statistics, and Computer Science Researchers: An Internal and External Citation Analysis across Multiple Engineering Subfields

    ERIC Educational Resources Information Center

    Kelly, Madeline

    2015-01-01

    This study takes a multidimensional approach to citation analysis, examining citations in multiple subfields of engineering, from both scholarly journals and doctoral dissertations. The three major goals of the study are to determine whether there are differences between citations drawn from dissertations and those drawn from journal articles; to…

  5. Meta-Analysis with Complex Research Designs: Dealing with Dependence from Multiple Measures and Multiple Group Comparisons

    ERIC Educational Resources Information Center

    Scammacca, Nancy; Roberts, Greg; Stuebing, Karla K.

    2014-01-01

    Previous research has shown that treating dependent effect sizes as independent inflates the variance of the mean effect size and introduces bias by giving studies with more effect sizes more weight in the meta-analysis. This article summarizes the different approaches to handling dependence that have been advocated by methodologists, some of…

  6. A Quantitative and Combinatorial Approach to Non-Linear Meanings of Multiplication

    ERIC Educational Resources Information Center

    Tillema, Erik; Gatza, Andrew

    2016-01-01

    We provide a conceptual analysis of how combinatorics problems have the potential to support students to establish non-linear meanings of multiplication (NLMM). The problems we analyze have been used in a series of studies with 6th, 8th, and 10th grade students. We situate the analysis in prior work on students' quantitative and multiplicative…

  7. The Synthesis Approach to Analysing Educational Design Dataset: Application of Three Scaffolds to a Learning by Design Task for Postgraduate Education Students

    ERIC Educational Resources Information Center

    Thompson, Kate; Carvalho, Lucila; Aditomo, Anindito; Dimitriadis, Yannis; Dyke, Gregory; Evans, Michael A.; Khosronejad, Maryam; Martinez-Maldonado, Roberto; Reimann, Peter; Wardak, Dewa

    2015-01-01

    The aims of the Synthesis and Scaffolding Project were to understand: the role of specific scaffolds in relation to the activity of learners, and the activity of learners during a collaborative design task from multiple perspectives, through the collection and analysis of multiple streams of data and the adoption of a synthesis approach to the…

  8. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major uncertainty for data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects that are dominated by single scattering, where photons from the laser beam only scatter a single time with particles in the atmosphere before reaching the receiver, and a simple linear relationship between physical property and lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy and phytoplankton. While multiple scattering returns are clear signals, the lack of a fast-enough lidar multiple scattering computation tool forces us to treat them as unwanted "noise" and use simple multiple scattering correction schemes to remove them. Such multiple scattering treatments waste the multiple scattering signals and may cause orders of magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is done with Monte Carlo simulations. Monte Carlo simulations take minutes to hours and are too slow for interactive satellite data analysis processes and can only be used to help system / algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA based reconfigurable computing hardware. The approach is as follows: 1. Physics solution: Perform Laplace transform on the time and spatial dimensions and Fourier transform on the viewing azimuth dimension, and convert solving the radiative transfer differential equation into a fast matrix inversion problem. The majority of the radiative transfer computation goes to matrix inversion processes, FFT and inverse Laplace transforms. 2. Hardware solutions: Perform the well-defined matrix inversion, FFT and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.

  9. Statistical Significance for Hierarchical Clustering

    PubMed Central

    Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.

    2017-01-01

    Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990

  10. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  11. Stepped MS(All) Relied Transition (SMART): An approach to rapidly determine optimal multiple reaction monitoring mass spectrometry parameters for small molecules.

    PubMed

    Ye, Hui; Zhu, Lin; Wang, Lin; Liu, Huiying; Zhang, Jun; Wu, Mengqiu; Wang, Guangji; Hao, Haiping

    2016-02-11

    Multiple reaction monitoring (MRM) is a universal approach for quantitative analysis because of its high specificity and sensitivity. Nevertheless, optimization of MRM parameters remains a time- and labor-intensive task, particularly in multiplexed quantitative analysis of small molecules in complex mixtures. In this study, we have developed an approach named Stepped MS(All) Relied Transition (SMART) to predict the optimal MRM parameters of small molecules. SMART requires firstly a rapid and high-throughput analysis of samples using a Stepped MS(All) technique (sMS(All)) on a Q-TOF, which consists of serial MS(All) events acquired from low CE to gradually stepped-up CE values in a cycle. The optimal CE values can then be determined by comparing the extracted ion chromatograms for the ion pairs of interest among serial scans. The SMART-predicted parameters were found to agree well with the parameters optimized on a triple quadrupole from the same vendor using a mixture of standards. The parameters optimized on a triple quadrupole from a different vendor were also employed for comparison, and found to be linearly correlated with the SMART-predicted parameters, suggesting the potential applications of the SMART approach among different instrumental platforms. This approach was further validated by applying it to the simultaneous quantification of 31 herbal components in the plasma of rats treated with a herbal prescription. Because the sMS(All) acquisition can be accomplished in a single run for multiple components independent of standards, the SMART approach is expected to find wide application in the multiplexed quantitative analysis of complex mixtures. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Pleiotropy Analysis of Quantitative Traits at Gene Level by Multivariate Functional Linear Models

    PubMed Central

    Wang, Yifan; Liu, Aiyi; Mills, James L.; Boehnke, Michael; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao; Wu, Colin O.; Fan, Ruzong

    2015-01-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks’s Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. PMID:25809955
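
The Pillai-Bartlett trace for a joint test of one variant against several traits can be computed directly from the hypothesis and error SSCP matrices of an ordinary multivariate linear model. This sketch uses invented effect sizes and omits the functional (basis-expansion) part of the paper's models; for a 1-df hypothesis, Pillai's trace maps to an exact F statistic:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 500, 3

# One variant with invented effects on three correlated traits.
g = rng.binomial(2, 0.3, n).astype(float)
cov = 0.6 * np.eye(p) + 0.4 * np.ones((p, p))     # unit variance, corr 0.4
Y = (np.outer(g, [0.25, 0.20, 0.15])
     + rng.multivariate_normal(np.zeros(p), cov, n))

def resid_sscp(X, Y):
    """Residual sums-of-squares-and-cross-products matrix after OLS on X."""
    B = np.linalg.lstsq(X, Y, rcond=None)[0]
    R = Y - X @ B
    return R.T @ R

X_full = np.column_stack([np.ones(n), g])
E = resid_sscp(X_full, Y)                  # error SSCP under the full model
H = resid_sscp(X_full[:, :1], Y) - E       # hypothesis SSCP for the variant

pillai = np.trace(H @ np.linalg.inv(H + E))        # Pillai-Bartlett trace
# With a 1-df hypothesis this is an exact F(p, n - p - 1) test.
F_joint = (n - p - 1) / p * pillai / (1 - pillai)
```

Testing all three traits in one statistic pools the shared association signal, which is the power argument the abstract makes for joint over univariate analysis.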

  13. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    PubMed

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
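    All three classical multivariate statistics named above are functions of the eigenvalues of E⁻¹H, where H and E are the hypothesis and error sums-of-squares-and-cross-products (SSCP) matrices. A minimal numpy sketch of the statistics only (the paper's functional-model machinery and approximate F-distributions are not reproduced here):

```python
import numpy as np

def manova_statistics(H, E):
    """Classical multivariate test statistics computed from the
    hypothesis (H) and error (E) SSCP matrices; all three are
    functions of the eigenvalues of E^{-1} H."""
    eig = np.linalg.eigvals(np.linalg.solve(E, H)).real
    wilks = float(np.prod(1.0 / (1.0 + eig)))       # Wilks's Lambda
    pillai = float(np.sum(eig / (1.0 + eig)))       # Pillai-Bartlett trace
    hotelling = float(np.sum(eig))                  # Hotelling-Lawley trace
    return wilks, pillai, hotelling

# Toy check: with E = I and H = I (2x2), the eigenvalues are (1, 1), so
# Wilks = 1/4, Pillai = 1, Hotelling-Lawley = 2.
w, p, t = manova_statistics(np.eye(2), np.eye(2))
```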

  14. Scenario and multiple criteria decision analysis for energy and environmental security of military and industrial installations.

    PubMed

    Karvetski, Christopher W; Lambert, James H; Linkov, Igor

    2011-04-01

Military and industrial facilities need secure and reliable power generation. Grid outages can result in cascading infrastructure failures as well as security breaches and should be avoided. Adding redundancy and increasing reliability can require additional environmental, financial, logistical, and other considerations and resources. Uncertain scenarios consisting of emergent environmental conditions, regulatory changes, growth of regional energy demands, and other concerns complicate matters further. Decisions on selecting energy alternatives are often made on an ad hoc basis. The present work integrates scenario analysis and multiple criteria decision analysis (MCDA) to identify combinations of impactful emergent conditions and to perform a preliminary benefits analysis of energy and environmental security investments for industrial and military installations. Application of a traditional MCDA approach would require significant stakeholder elicitations under multiple uncertain scenarios. The approach proposed in this study develops and iteratively adjusts a scoring function for investment alternatives to find the scenarios with the most significant impacts on installation security. A robust prioritization of investment alternatives can be achieved by integrating stakeholder preferences and focusing modeling and decision-analytical tools on a few key emergent conditions and scenarios. The approach is described and demonstrated for a campus of several dozen interconnected industrial buildings within a major installation. Copyright © 2010 SETAC.

  15. Electrification Futures Study Modeling Approach | Energy Analysis | NREL

    Science.gov Websites

    Electrification Futures Study Modeling Approach. To quantitatively answer the research questions of the Electrification Futures Study, researchers will use multiple models, accounting for infrastructure inertia through stock turnover, and will model electricity loads.

  16. Multiple imputation of missing fMRI data in whole brain analysis

    PubMed Central

    Vaden, Kenneth I.; Gebregziabher, Mulugeta; Kuchinsky, Stefanie E.; Eckert, Mark A.

    2012-01-01

    Whole brain fMRI analyses rarely include the entire brain because of missing data that result from data acquisition limits and susceptibility artifact, in particular. This missing data problem is typically addressed by omitting voxels from analysis, which may exclude brain regions that are of theoretical interest and increase the potential for Type II error at cortical boundaries or Type I error when spatial thresholds are used to establish significance. Imputation could significantly expand statistical map coverage, increase power, and enhance interpretations of fMRI results. We examined multiple imputation for group level analyses of missing fMRI data using methods that leverage the spatial information in fMRI datasets for both real and simulated data. Available case analysis, neighbor replacement, and regression based imputation approaches were compared in a general linear model framework to determine the extent to which these methods quantitatively (effect size) and qualitatively (spatial coverage) increased the sensitivity of group analyses. In both real and simulated data analysis, multiple imputation provided 1) variance that was most similar to estimates for voxels with no missing data, 2) fewer false positive errors in comparison to mean replacement, and 3) fewer false negative errors in comparison to available case analysis. Compared to the standard analysis approach of omitting voxels with missing data, imputation methods increased brain coverage in this study by 35% (from 33,323 to 45,071 voxels). In addition, multiple imputation increased the size of significant clusters by 58% and number of significant clusters across statistical thresholds, compared to the standard voxel omission approach. While neighbor replacement produced similar results, we recommend multiple imputation because it uses an informed sampling distribution to deal with missing data across subjects that can include neighbor values and other predictors. 
Multiple imputation is anticipated to be particularly useful for 1) large fMRI data sets with inconsistent missing voxels across subjects and 2) addressing the problem of increased artifact at ultra-high field, which significantly limits the extent of whole brain coverage and the interpretation of results. PMID:22500925
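    Group-level estimates from the m completed datasets are conventionally combined with Rubin's rules; a minimal sketch of that pooling step (the imputation models themselves are not reproduced here):

```python
import numpy as np

def pool_rubin(estimates, variances):
    """Pool m completed-data estimates via Rubin's rules:
    total variance = within-imputation variance
                   + (1 + 1/m) * between-imputation variance."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    q_bar = float(estimates.mean())          # pooled point estimate
    w = float(variances.mean())              # within-imputation variance
    b = float(estimates.var(ddof=1))         # between-imputation variance
    total = w + (1.0 + 1.0 / m) * b
    return q_bar, total

# Three imputed-data estimates of one voxel's effect, each with variance 1:
q, tot = pool_rubin([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])
# q = 2.0; tot = 1 + (1 + 1/3) * 1 = 2.333...
```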

  17. A new spatial multiple discrete-continuous modeling approach to land use change analysis.

    DOT National Transportation Integrated Search

    2013-09-01

    This report formulates a multiple discrete-continuous probit (MDCP) land-use model within a : spatially explicit economic structural framework for land-use change decisions. The spatial : MDCP model is capable of predicting both the type and intensit...

  18. Multiframe video coding for improved performance over wireless channels.

    PubMed

    Budagavi, M; Gibson, J D

    2001-01-01

We propose and evaluate a multi-frame extension to block motion compensation (BMC) coding of videoconferencing-type video signals for wireless channels. The multi-frame BMC (MF-BMC) coder exploits the redundancy that exists across multiple frames in typical videoconferencing sequences to achieve additional compression over that obtained by the single-frame BMC (SF-BMC) approach, such as in the base-level H.263 codec. The MF-BMC approach also has an inherent ability to overcome some transmission errors and is thus more robust than the SF-BMC approach. We model the error propagation process in MF-BMC coding as a multiple Markov chain and use Markov chain analysis to infer that the use of multiple frames in motion compensation increases robustness. The Markov chain analysis is also used to devise a simple scheme that randomizes the selection of the frame (amongst the multiple previous frames) used in BMC to achieve additional robustness. The proposed MF-BMC coders are multi-frame extensions of the base-level H.263 coder and are found to be more robust than the base-level H.263 coder when subjected to simulated errors commonly encountered on wireless channels.
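    The flavor of the Markov chain argument can be sketched with a toy two-state channel model (clean vs. corrupted frames). The transition probabilities below are illustrative, not from the paper:

```python
import numpy as np

# Two-state model of decoded frames: state 0 = clean, state 1 = corrupted.
P = np.array([
    [0.95, 0.05],   # clean -> clean / corrupted
    [0.60, 0.40],   # corrupted frames tend to be followed by recovery
])

def state_distribution(p0, P, n):
    """Distribution over {clean, corrupted} after n frame transitions."""
    return p0 @ np.linalg.matrix_power(P, n)

p0 = np.array([1.0, 0.0])                # start from a clean frame
dist = state_distribution(p0, P, 50)     # effectively the long-run behaviour
# For a 2-state chain the stationary corrupted probability is
# P[0,1] / (P[0,1] + P[1,0]) = 0.05 / 0.65.
```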

  19. Considerations of multiple imputation approaches for handling missing data in clinical trials.

    PubMed

    Quan, Hui; Qi, Li; Luo, Xiaodong; Darchy, Loic

    2018-07-01

Missing data exist in all clinical trials, and they pose a serious problem for the interpretability of trial results. There is no universally applicable solution for all missing data problems. Methods for handling missing data depend on the circumstances, particularly the assumptions about the missing data mechanism. In recent years, when the missing at random mechanism cannot be assumed, conservative approaches such as the control-based and return-to-baseline multiple imputation approaches have been applied to deal with the missing data issues. In this paper, we focus on the variability in the data analysis of these approaches. As demonstrated by examples, the choice of the variability can impact the conclusion of the analysis. Besides the methods for continuous endpoints, we also discuss methods for binary and time-to-event endpoints as well as considerations for non-inferiority assessment. Copyright © 2018. Published by Elsevier Inc.
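    As a sketch of the return-to-baseline idea: a dropout's missing final value is drawn around their own baseline, so any treatment benefit is assumed lost after discontinuation. The residual standard deviation and all data below are assumed for illustration; a real analysis would draw from a fitted imputation model:

```python
import numpy as np

rng = np.random.default_rng(0)

def impute_return_to_baseline(baseline, final, m=20, sd=1.0):
    """Return-to-baseline multiple imputation (illustrative sketch):
    missing final values are drawn around the subject's own baseline."""
    baseline = np.asarray(baseline, dtype=float)
    final = np.asarray(final, dtype=float)
    missing = np.isnan(final)
    completed = []
    for _ in range(m):
        draw = final.copy()
        draw[missing] = baseline[missing] + rng.normal(0.0, sd, int(missing.sum()))
        completed.append(draw)
    return np.array(completed)          # m completed datasets

baseline = np.array([10.0, 12.0, 11.0, 9.0])
final = np.array([8.0, np.nan, 7.5, np.nan])   # two dropouts
datasets = impute_return_to_baseline(baseline, final)
```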

  20. A Statistical Method for Synthesizing Mediation Analyses Using the Product of Coefficient Approach Across Multiple Trials

    PubMed Central

    Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks

    2016-01-01

    Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
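    A simplified sketch of the core idea: combine the a-path and b-path estimates across trials, then form a Monte Carlo interval for their product. This uses plain inverse-variance weighting rather than the paper's marginal-likelihood random-effects estimator, and the trial inputs are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def combined_mediated_effect(a, se_a, b, se_b, n_draws=100_000):
    """Inverse-variance-weight the a and b paths across trials, then form
    a Monte Carlo confidence interval for the combined product a*b."""
    a, se_a = np.asarray(a, float), np.asarray(se_a, float)
    b, se_b = np.asarray(b, float), np.asarray(se_b, float)
    wa, wb = 1.0 / se_a**2, 1.0 / se_b**2
    a_bar, var_a = float(np.sum(wa * a) / wa.sum()), 1.0 / float(wa.sum())
    b_bar, var_b = float(np.sum(wb * b) / wb.sum()), 1.0 / float(wb.sum())
    draws = (rng.normal(a_bar, np.sqrt(var_a), n_draws)
             * rng.normal(b_bar, np.sqrt(var_b), n_draws))
    lo, hi = np.percentile(draws, [2.5, 97.5])
    return a_bar * b_bar, (float(lo), float(hi))

# Three hypothetical trials reporting (a, SE) and (b, SE):
ab, ci = combined_mediated_effect(
    a=[0.5, 0.6, 0.4], se_a=[0.1, 0.1, 0.1],
    b=[0.3, 0.2, 0.4], se_b=[0.1, 0.1, 0.1],
)
```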

  1. Global Profiling and Novel Structure Discovery Using Multiple Neutral Loss/Precursor Ion Scanning Combined with Substructure Recognition and Statistical Analysis (MNPSS): Characterization of Terpene-Conjugated Curcuminoids in Curcuma longa as a Case Study.

    PubMed

    Qiao, Xue; Lin, Xiong-hao; Ji, Shuai; Zhang, Zheng-xiang; Bo, Tao; Guo, De-an; Ye, Min

    2016-01-05

    To fully understand the chemical diversity of an herbal medicine is challenging. In this work, we describe a new approach to globally profile and discover novel compounds from an herbal extract using multiple neutral loss/precursor ion scanning combined with substructure recognition and statistical analysis. Turmeric (the rhizomes of Curcuma longa L.) was used as an example. This approach consists of three steps: (i) multiple neutral loss/precursor ion scanning to obtain substructure information; (ii) targeted identification of new compounds by extracted ion current and substructure recognition; and (iii) untargeted identification using total ion current and multivariate statistical analysis to discover novel structures. Using this approach, 846 terpecurcumins (terpene-conjugated curcuminoids) were discovered from turmeric, including a number of potentially novel compounds. Furthermore, two unprecedented compounds (terpecurcumins X and Y) were purified, and their structures were identified by NMR spectroscopy. This study extended the application of mass spectrometry to global profiling of natural products in herbal medicines and could help chemists to rapidly discover novel compounds from a complex matrix.

  2. Economic Analysis of a Multi-Site Prevention Program: Assessment of Program Costs and Characterizing Site-level Variability

    PubMed Central

    Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.

    2013-01-01

    Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559

  3. Economic analysis of a multi-site prevention program: assessment of program costs and characterizing site-level variability.

    PubMed

    Corso, Phaedra S; Ingels, Justin B; Kogan, Steven M; Foster, E Michael; Chen, Yi-Fu; Brody, Gene H

    2013-10-01

    Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95 % confidence interval) incremental difference was $2,149 ($397, $3,901). With the probabilistic sensitivity analysis approach, the incremental difference was $2,583 ($778, $4,346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention.
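    The probabilistic sensitivity analysis approach described in the two records above can be sketched as Monte Carlo propagation of input-cost distributions to a total-cost distribution. The cost categories, distributions, and parameters below are invented for illustration, not from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Each cost input gets a distribution fit to the collected data;
# right-skewed gamma distributions are a common choice for costs.
n = 10_000
staff_cost  = rng.gamma(shape=20.0, scale=60.0, size=n)
materials   = rng.gamma(shape=5.0,  scale=50.0, size=n)
facilitator = rng.normal(400.0, 50.0, size=n)

total = staff_cost + materials + facilitator
mean_cost = float(total.mean())
lo, hi = np.percentile(total, [2.5, 97.5])   # 95% uncertainty interval
```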

  4. Assessing the use of multiple sources in student essays.

    PubMed

    Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly

    2012-09-01

    The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
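    The LSA comparison can be sketched with a toy term-by-document matrix and a truncated SVD; the vocabulary and sentences below are invented:

```python
import numpy as np

# Term-document matrix (rows: terms, columns: sentences).
# d0 = "cells divide", d1 = "cells split", d2 = "planets orbit"
X = np.array([
    [1, 1, 0],   # cells
    [1, 0, 0],   # divide
    [0, 1, 0],   # split
    [0, 0, 1],   # planets
    [0, 0, 1],   # orbit
], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                   # latent dimensions kept
doc_vecs = Vt[:k].T * s[:k]             # documents in latent space

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_related = cos(doc_vecs[0], doc_vecs[1])     # share the term "cells"
sim_unrelated = cos(doc_vecs[0], doc_vecs[2])   # no shared vocabulary
```

    A real system would compare student sentences to source sentences in this latent space and threshold the cosine to decide whether the source content is present.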

  5. Shifting from Stewardship to Analytics of Massive Science Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.

    2015-12-01

Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving and analysis from future remote sensing missions, be it from earth science satellites, planetary robotic missions, or massive radio observatories, may not scale as more capable instruments stress existing architectural approaches and systems with more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc. are interrelated and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data, requiring new methodological approaches toward data analytics in which users can more effectively interact with the data and apply automated mechanisms for data reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.

  6. Testing Group Mean Differences of Latent Variables in Multilevel Data Using Multiple-Group Multilevel CFA and Multilevel MIMIC Modeling.

    PubMed

    Kim, Eun Sook; Cao, Chunhua

    2015-01-01

Considering that group comparisons are common in social science, we examined two latent group mean testing methods when groups of interest were either at the between or within level of multilevel data: multiple-group multilevel confirmatory factor analysis (MG ML CFA) and multilevel multiple-indicators multiple-causes modeling (ML MIMIC). The performance of these methods was investigated through three Monte Carlo studies. In Studies 1 and 2, either factor variances or residual variances were manipulated to be heterogeneous between groups. In Study 3, which focused on within-level multiple-group analysis, six different model specifications were considered depending on how to model the intra-class group correlation (i.e., correlation between random effect factors for groups within cluster). The results of simulations generally supported the adequacy of MG ML CFA and ML MIMIC for multiple-group analysis with multilevel data. The two methods did not show any notable difference in the latent group mean testing across three studies. Finally, a demonstration with real data and guidelines in selecting an appropriate approach to multilevel multiple-group analysis are provided.

  7. Motivational and Volitional Variables Associated with Stages of Change for Exercise in Multiple Sclerosis: A Multiple Discriminant Analysis

    ERIC Educational Resources Information Center

    Chiu, Chung-Yi; Fitzgerald, Sandra D.; Strand, David M.; Muller, Veronica; Brooks, Jessica; Chan, Fong

    2012-01-01

    The main objective of this study was to determine whether motivational and volitional variables identified in the health action process approach (HAPA) model can be used to successfully differentiate people with multiple sclerosis (MS) in different stages of change for exercise and physical activity. Ex-post-facto design using multiple…

  8. Identifying Opportunities for Decision Support Systems in Support of Regional Resource Use Planning: An Approach Through Soft Systems Methodology.

    PubMed

    Zhu; Dale

    2000-10-01

Regional resource use planning relies on key regional stakeholder groups using and having equitable access to appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use so that areas where DSSs could provide effective support can be identified, and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.

  9. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.

    PubMed

    Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet

    2018-01-01

Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.
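    A stylized example of putting benefit and risk on a common scale relative to a comparator. The endpoint gain, risk weight, and threshold scan below are hypothetical and are not the framework actually used in the paper:

```python
# Illustrative net-benefit calculation relative to a control arm.
def net_benefit(benefit_months, risk_rate_diff, months_per_risk_unit=12.0):
    """Benefit: median endpoint gain vs. comparator (months).
    Risk: excess serious-adverse-event rate vs. comparator, converted
    to a month-equivalent penalty so both sit on a common scale."""
    return benefit_months - risk_rate_diff * months_per_risk_unit

base = net_benefit(benefit_months=5.0, risk_rate_diff=0.10)   # 5 - 1.2 = 3.8

# Sensitivity analysis: scan the excess-risk input, holding benefit fixed,
# to find the smallest risk difference that flips the decision to negative.
flip = min(r / 100 for r in range(0, 101)
           if net_benefit(5.0, r / 100) <= 0.0)
```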

  10. Sensitivity Analysis of Multiple Informant Models When Data are Not Missing at Random

    PubMed Central

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae; Scaramella, Laura; Leve, Leslie; Reiss, David

    2014-01-01

Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups may be retained even if only one member of a group contributes data. Statistical inference is based on the assumption that data are missing completely at random or missing at random. Importantly, whether or not data are missing is assumed to be independent of the unobserved values themselves. A saturated correlates model that incorporates correlates of the missingness or the missing data into an analysis and multiple imputation that may also use such correlates offer advantages over the standard implementation of SEM when data are not missing at random because these approaches may result in a data analysis problem for which the missingness is ignorable. This paper considers these approaches in an analysis of family data to assess the sensitivity of parameter estimates to assumptions about missing data, a strategy that may be easily implemented using SEM software. PMID:25221420

  11. Deciphering Rashomon: an approach to verbal autopsies of maternal deaths.

    PubMed

    Iyer, Aditi; Sen, Gita; Sreevathsa, Anuradha

    2013-01-01

    The paper discusses an approach to verbal autopsies that engages with the Rashomon phenomenon affecting ex post facto constructions of death and responds to the call for maternal safety. This method differs from other verbal autopsies in its approach to data collection and its framework of analysis. In our approach, data collection entails working with and triangulating multiple narratives, and minimising power inequalities in the investigation process. The framework of analysis focuses on the missed opportunities for death prevention as an alternative to (or deepening of) the Three Delays Model. This framework assesses the behavioural responses of health providers, as well as community and family members at each opportunity for death prevention and categorises them into four groups: non-actions, inadequate actions, inappropriate actions and unavoidably delayed actions. We demonstrate the application of this approach to show how verbal autopsies can delve beneath multiple narratives and rigorously identify health system, behavioural and cultural factors that contribute to avoidable maternal mortality.

  12. Benefit-Risk Analysis for Decision-Making: An Approach.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB). © 2016 American Society for Clinical Pharmacology and Therapeutics.

  13. Allocating physicians' overhead costs to services: an econometric/accounting-activity based-approach.

    PubMed

    Peden, Al; Baker, Judith J

    2002-01-01

    Using the optimizing properties of econometric analysis, this study analyzes how physician overhead costs (OC) can be allocated to multiple activities to maximize precision in reimbursing the costs of services. Drawing on work by Leibenstein and Friedman, the analysis also shows that allocating OC to multiple activities unbiased by revenue requires controlling for revenue when making the estimates. Further econometric analysis shows that it is possible to save about 10 percent of OC by paying only for those that are necessary.
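    A sketch of the allocation idea: regress overhead on activity volumes while including revenue as a control, so the per-activity cost estimates are not confounded by revenue differences across practices. All data below are simulated, with true per-visit and per-procedure overhead of $20 and $80:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated practice-level data: overhead driven by two activity volumes.
n = 200
visits     = rng.uniform(500.0, 1500.0, n)     # activity 1
procedures = rng.uniform(50.0, 300.0, n)       # activity 2
revenue  = 120.0 * visits + 400.0 * procedures + rng.normal(0.0, 20_000.0, n)
overhead =  20.0 * visits +  80.0 * procedures + rng.normal(0.0, 2_000.0, n)

# OLS of overhead on activities with revenue as a control regressor.
X = np.column_stack([np.ones(n), visits, procedures, revenue])
coef, *_ = np.linalg.lstsq(X, overhead, rcond=None)
per_visit, per_procedure = float(coef[1]), float(coef[2])
```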

  14. A formal concept analysis approach to consensus clustering of multi-experiment expression data

    PubMed Central

    2014-01-01

    Background Presently, with the increasing number and complexity of available gene expression datasets, the combination of data from multiple microarray studies addressing a similar biological question is gaining importance. The analysis and integration of multiple datasets are expected to yield more reliable and robust results since they are based on a larger number of samples and the effects of the individual study-specific biases are diminished. This is supported by recent studies suggesting that important biological signals are often preserved or enhanced by multiple experiments. An approach to combining data from different experiments is the aggregation of their clusterings into a consensus or representative clustering solution which increases the confidence in the common features of all the datasets and reveals the important differences among them. Results We propose a novel generic consensus clustering technique that applies Formal Concept Analysis (FCA) approach for the consolidation and analysis of clustering solutions derived from several microarray datasets. These datasets are initially divided into groups of related experiments with respect to a predefined criterion. Subsequently, a consensus clustering algorithm is applied to each group resulting in a clustering solution per group. These solutions are pooled together and further analysed by employing FCA which allows extracting valuable insights from the data and generating a gene partition over all the experiments. In order to validate the FCA-enhanced approach, two consensus clustering algorithms are adapted to incorporate the FCA analysis. Their performance is evaluated on gene expression data from a multi-experiment study examining the global cell-cycle control of fission yeast. 
The FCA results derived from both methods demonstrate that, although both algorithms optimize different clustering characteristics, FCA is able to overcome and diminish these differences and preserve some relevant biological signals. Conclusions The proposed FCA-enhanced consensus clustering technique is a general approach to the combination of clustering algorithms with FCA for deriving clustering solutions from multiple gene expression matrices. The experimental results presented herein demonstrate that it is a robust data integration technique able to produce good quality clustering solution that is representative for the whole set of expression matrices. PMID:24885407
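    The FCA machinery itself is compact: a formal concept is a pair (extent, intent) closed under the two derivation maps. A toy sketch of enumerating formal concepts from a small binary gene-by-attribute context (the context is invented):

```python
from itertools import combinations

objects = ["g1", "g2", "g3"]          # e.g., genes
attributes = ["A", "B", "C"]          # e.g., cluster memberships
incidence = {                         # which attributes each object has
    "g1": {"A", "B"},
    "g2": {"B"},
    "g3": {"B", "C"},
}

def intent(objs):
    """Attributes shared by every object in objs (all attributes if empty)."""
    if not objs:
        return frozenset(attributes)
    return frozenset(set.intersection(*(incidence[o] for o in objs)))

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return frozenset(o for o in objects if attrs <= incidence[o])

# Taking the closure of every object subset finds every formal concept.
concepts = {(extent(intent(set(objs))), intent(set(objs)))
            for r in range(len(objects) + 1)
            for objs in combinations(objects, r)}
```

    This brute-force enumeration is exponential and only suitable for tiny contexts; real FCA implementations use dedicated algorithms such as NextClosure.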

  15. Categorical Variables in Multiple Regression: Some Cautions.

    ERIC Educational Resources Information Center

    O'Grady, Kevin E.; Medoff, Deborah R.

    1988-01-01

    Limitations of dummy coding and nonsense coding as methods of coding categorical variables for use as predictors in multiple regression analysis are discussed. The combination of these approaches often yields estimates and tests of significance that are not intended by researchers for inclusion in their models. (SLD)
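    The interpretation issue is easiest to see with an explicit design matrix. A small sketch of standard dummy coding with group C as the reference category (data invented):

```python
import numpy as np

# Three groups with known means 1, 3, and 5.
y = np.array([1.0, 1.0, 3.0, 3.0, 5.0, 5.0])
group = np.array(["A", "A", "B", "B", "C", "C"])

# Design matrix: intercept plus one indicator per non-reference group.
X = np.column_stack([
    np.ones(len(y)),
    (group == "A").astype(float),
    (group == "B").astype(float),
])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef[0] = mean of the reference group C (5.0);
# coef[1] = mean(A) - mean(C) = -4.0; coef[2] = mean(B) - mean(C) = -2.0,
# so each "effect" is a contrast with the reference group, not a group mean.
```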

  16. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples, such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified reliably and at high sensitivity in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest (proteotypic peptides) are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low-abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  17. Probing of multiple magnetic responses in magnetic inductors using atomic force microscopy.

    PubMed

    Park, Seongjae; Seo, Hosung; Seol, Daehee; Yoon, Young-Hwan; Kim, Mi Yang; Kim, Yunseok

    2016-02-08

    Even though nanoscale analysis of magnetic properties is of significant interest, probing methods remain relatively underdeveloped compared to the significance of the technique, which has multiple potential applications. Here, we demonstrate an approach for probing various magnetic properties associated with eddy current, coil current and magnetic domains in magnetic inductors (MIs) using multidimensional magnetic force microscopy (MMFM). The MMFM images provide combined magnetic responses from the three different origins; however, each contribution to the MMFM response can be differentiated through analysis based on the bias dependence of the response. In particular, the bias-dependent MMFM images show locally different eddy current behavior, with values dependent on the type of materials that comprise the MI. This approach for probing magnetic responses can be further extended to the analysis of local physical features.

  18. [Efficacy of racecadotril vs. smectite, probiotics or zinc as an integral part of treatment of acute diarrhea in children under five years: A meta-analysis of multiple treatments].

    PubMed

    Gutiérrez-Castrellón, Pedro; Ortíz-Hernández, Anna Alejandra; Llamosas-Gallardo, Beatriz; Acosta-Bastidas, Mario A; Jiménez-Gutiérrez, Carlos; Diaz-García, Luisa; Anzo-Osorio, Anahí; Estevez-Jiménez, Juliana; Jiménez-Escobar, Irma; Vidal-Vázquez, Rosa Patricia

    2015-01-01

    Despite major advances in treatment, acute diarrhea continues to be a public health problem in children under five years. There is no systematic approach to treatment, and most evidence is assembled comparing active treatment vs. placebo. We performed a systematic review of the evidence on the efficacy of adjuvants for the treatment of acute diarrhea through a network meta-analysis. A systematic search of multiple databases for clinical trials related to the use of racecadotril, smectite, Lactobacillus GG, Lactobacillus reuteri, Saccharomyces boulardii and zinc as adjuvants in acute diarrhea was performed. The primary endpoint was duration of diarrhea. Information is displayed through a network meta-analysis. The superiority of each adjuvant was analyzed by the SUCRA approach. The network meta-analysis showed racecadotril was better when compared with placebo and the other adjuvants. SUCRA analysis ranked racecadotril as the first option, followed by smectite and Lactobacillus reuteri. From a strategic decision-making perspective, network meta-analysis allows us to establish the therapeutic superiority of racecadotril as an adjunct for the comprehensive management of acute diarrhea in children aged less than five years.

  19. An Extension of Dominance Analysis to Canonical Correlation Analysis

    ERIC Educational Resources Information Center

    Huo, Yan; Budescu, David V.

    2009-01-01

    Dominance analysis (Budescu, 1993) offers a general framework for determination of relative importance of predictors in univariate and multivariate multiple regression models. This approach relies on pairwise comparisons of the contribution of predictors in all relevant subset models. In this article we extend dominance analysis to canonical…

  20. Advanced Multiple Processor Configuration Study. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    This summary of a study on multiple processor configurations includes the objectives, background, approach, and results of research undertaken to provide the Air Force with a generalized model of computer processor combinations for use in the evaluation of proposed flight training simulator computational designs. An analysis of a real-time flight…

  1. THE ANALYSIS OF MIXED DISCRETE AND CONTINUOUS OUTCOMES USING DESIRABILITY FUNCTIONS.

    EPA Science Inventory

    Multiple types of outcomes are sometimes measured on each animal in toxicology dose-response experiments, and multiple analyses may increase the overall type I error. One approach to analyzing these outcomes in an integrated way is through the use of a composite score. We int...

  2. A Bayesian Missing Data Framework for Generalized Multiple Outcome Mixed Treatment Comparisons

    ERIC Educational Resources Information Center

    Hong, Hwanhee; Chu, Haitao; Zhang, Jing; Carlin, Bradley P.

    2016-01-01

    Bayesian statistical approaches to mixed treatment comparisons (MTCs) are becoming more popular because of their flexibility and interpretability. Many randomized clinical trials report multiple outcomes with possible inherent correlations. Moreover, MTC data are typically sparse (although richer than standard meta-analysis, comparing only two…

  3. Building Regression Models: The Importance of Graphics.

    ERIC Educational Resources Information Center

    Dunn, Richard

    1989-01-01

    Points out reasons for using graphical methods to teach simple and multiple regression analysis. Argues that a graphically oriented approach has considerable pedagogic advantages in the exposition of simple and multiple regression. Shows that graphical methods may play a central role in the process of building regression models. (Author/LS)

  4. Analysis of Genome-Wide Association Studies with Multiple Outcomes Using Penalization

    PubMed Central

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2012-01-01

    Genome-wide association studies have been extensively conducted, searching for markers for biologically meaningful outcomes and phenotypes. Penalization methods have been adopted to analyze the joint effects of a large number of single nucleotide polymorphisms (SNPs) and to identify markers. This study is partly motivated by the analysis of a heterogeneous stock mice dataset, in which multiple correlated phenotypes and a large number of SNPs are available. Existing penalization methods designed to analyze a single response variable cannot accommodate the correlation among multiple response variables. With multiple response variables sharing the same set of markers, joint modeling is first employed to accommodate the correlation. The group Lasso approach is then adopted to select markers associated with all the outcome variables, and an efficient computational algorithm is developed. A simulation study and the analysis of the heterogeneous stock mice dataset show that the proposed method can outperform existing penalization methods. PMID:23272092
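The joint selection step described above, keeping or dropping a marker for all phenotypes simultaneously via a group penalty, can be sketched with scikit-learn's MultiTaskLasso, which applies an l2/l1 group penalty across outcomes. The data, dimensions and penalty weight below are invented for illustration, not taken from the study:

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n, p, k = 100, 50, 3  # samples, SNPs, correlated phenotypes

X = rng.standard_normal((n, p))             # toy continuous-coded SNP matrix
B = np.zeros((p, k))
B[:5] = rng.uniform(0.5, 1.5, size=(5, k))  # only the first 5 markers are active
Y = X @ B + 0.1 * rng.standard_normal((n, k))

# The multi-task (group) penalty zeroes entire rows of the coefficient
# matrix, so each marker is selected or excluded jointly for all phenotypes.
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
selected = np.flatnonzero(np.any(model.coef_ != 0, axis=0))  # coef_ is (k, p)
```

Because the penalty acts on whole coefficient rows, `selected` contains only markers with a common nonzero effect group across all phenotypes, which is the behaviour the abstract attributes to the group Lasso.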

  5. Multiple imputation for IPD meta-analysis: allowing for heterogeneity and studies with missing covariates.

    PubMed

    Quartagno, M; Carpenter, J R

    2016-07-30

    Recently, multiple imputation has been proposed as a tool for individual patient data meta-analysis with sporadically missing observations, and it has been suggested that within-study imputation is usually preferable. However, such within-study imputation cannot handle variables that are completely missing within studies. Further, if some of the contributing studies are relatively small, it may be appropriate to share information across studies when imputing. In this paper, we develop and evaluate a joint modelling approach to multiple imputation of individual patient data in meta-analysis, with an across-study probability distribution for the study-specific covariance matrices. This retains the flexibility to allow for between-study heterogeneity when imputing while allowing (i) sharing of information on the covariance matrix across studies when this is appropriate, and (ii) imputation of variables that are wholly missing from studies. Simulation results show both equivalent performance to the within-study imputation approach where this is valid, and good results in more general, practically relevant scenarios with studies of very different sizes, non-negligible between-study heterogeneity and wholly missing variables. We illustrate our approach using data from an individual patient data meta-analysis of hypertension trials. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  6. Meta-Analysis With Complex Research Designs: Dealing With Dependence From Multiple Measures and Multiple Group Comparisons

    PubMed Central

    Scammacca, Nancy; Roberts, Greg; Stuebing, Karla K.

    2013-01-01

    Previous research has shown that treating dependent effect sizes as independent inflates the variance of the mean effect size and introduces bias by giving studies with more effect sizes more weight in the meta-analysis. This article summarizes the different approaches to handling dependence that have been advocated by methodologists, some of which are more feasible to implement with education research studies than others. A case study using effect sizes from a recent meta-analysis of reading interventions is presented to compare the results obtained from different approaches to dealing with dependence. Overall, mean effect sizes and variance estimates were found to be similar, but estimates of indexes of heterogeneity varied. Meta-analysts are advised to explore the effect of the method of handling dependence on the heterogeneity estimates before conducting moderator analyses and to choose the approach to dependence that is best suited to their research question and their data set. PMID:25309002

  7. Analysis Commons, A Team Approach to Discovery in a Big-Data Environment for Genetic Epidemiology

    PubMed Central

    Brody, Jennifer A.; Morrison, Alanna C.; Bis, Joshua C.; O'Connell, Jeffrey R.; Brown, Michael R.; Huffman, Jennifer E.; Ames, Darren C.; Carroll, Andrew; Conomos, Matthew P.; Gabriel, Stacey; Gibbs, Richard A.; Gogarten, Stephanie M.; Gupta, Namrata; Jaquish, Cashell E.; Johnson, Andrew D.; Lewis, Joshua P.; Liu, Xiaoming; Manning, Alisa K.; Papanicolaou, George J.; Pitsillides, Achilleas N.; Rice, Kenneth M.; Salerno, William; Sitlani, Colleen M.; Smith, Nicholas L.; Heckbert, Susan R.; Laurie, Cathy C.; Mitchell, Braxton D.; Vasan, Ramachandran S.; Rich, Stephen S.; Rotter, Jerome I.; Wilson, James G.; Boerwinkle, Eric; Psaty, Bruce M.; Cupples, L. Adrienne

    2017-01-01

    The exploding volume of whole-genome sequence (WGS) and multi-omics data requires new approaches for analysis. As one solution, we have created a cloud-based Analysis Commons, which brings together genotype and phenotype data from multiple studies in a setting that is accessible by multiple investigators. This framework addresses many of the challenges of multi-center WGS analyses, including data sharing mechanisms, phenotype harmonization, integrated multi-omics analyses, annotation, and computational flexibility. In this setting, the computational pipeline facilitates a sequence-to-discovery analysis workflow illustrated here by an analysis of plasma fibrinogen levels in 3996 individuals from the National Heart, Lung, and Blood Institute (NHLBI) Trans-Omics for Precision Medicine (TOPMed) WGS program. The Analysis Commons represents a novel model for transforming WGS resources from a massive quantity of phenotypic and genomic data into knowledge of the determinants of health and disease risk in diverse human populations. PMID:29074945

  8. Assessing the validity of discourse analysis: transdisciplinary convergence

    NASA Astrophysics Data System (ADS)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings and interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  9. The receiver operational characteristic for binary classification with multiple indices and its application to the neuroimaging study of Alzheimer's disease.

    PubMed

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2013-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
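The logical combination rules the abstract describes ("AND," "OR," "at least n") are simple to emulate: threshold each index, count votes, and compute sensitivity and specificity of the combined call. The sketch below uses simulated indices, not the Alzheimer's data sets of the study:

```python
import numpy as np

def at_least_n(indices, thresholds, n):
    """Positive call when at least n index values exceed their thresholds.
    n = len(indices) gives the "AND" rule; n = 1 gives the "OR" rule."""
    votes = np.stack([x > t for x, t in zip(indices, thresholds)])
    return votes.sum(axis=0) >= n

# simulated ground truth and two noisy diagnostic indices (hypothetical)
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 200)
idx1 = labels + rng.normal(0.0, 0.8, 200)
idx2 = labels + rng.normal(0.0, 0.8, 200)

pred_and = at_least_n([idx1, idx2], thresholds=[0.5, 0.5], n=2)  # "AND"
pred_or = at_least_n([idx1, idx2], thresholds=[0.5, 0.5], n=1)   # "OR"

sens_and = np.mean(pred_and[labels == 1])
spec_and = np.mean(~pred_and[labels == 0])
sens_or = np.mean(pred_or[labels == 1])
```

Sweeping the thresholds traces out a multi-index ROC curve for a given rule; the "AND" rule trades sensitivity for specificity, while "OR" does the reverse.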

  10. The Receiver Operational Characteristic for Binary Classification with Multiple Indices and Its Application to the Neuroimaging Study of Alzheimer’s Disease

    PubMed Central

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2014-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis. PMID:23702553

  11. Meta-Analysis of Multiple Simulation-Based Experiments

    DTIC Science & Technology

    2013-06-01

    C2 Approaches differ on at least three major aspects: the allocation of decision rights (ADR), the patterns of interaction (PoI) among the entities, and the distribution of information (DoI) (Alberts et al., 2010). The results obtained from the meta-analysis support the hypothesis that more network-enabled C2 Approaches are more agile; consult Bernier, Chan et al. (2013) for details. [Figure 2: Mapping of all CiCs onto each axis of the C2 Approach Space (ADR, PoI, DoI).]

  12. A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants

    PubMed Central

    Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.

    2016-01-01

    Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286

  13. Learning Analysis of K-12 Students' Online Problem Solving: A Three-Stage Assessment Approach

    ERIC Educational Resources Information Center

    Hu, Yiling; Wu, Bian; Gu, Xiaoqing

    2017-01-01

    Problem solving is considered a fundamental human skill. However, large-scale assessment of problem solving in K-12 education remains a challenging task. Researchers have argued for the development of an enhanced assessment approach through joint effort from multiple disciplines. In this study, a three-stage approach based on an evidence-centered…

  14. Integrated data analysis for genome-wide research.

    PubMed

    Steinfath, Matthias; Repsilber, Dirk; Scholz, Matthias; Walther, Dirk; Selbig, Joachim

    2007-01-01

    Integrated data analysis is introduced as the intermediate level of a systems biology approach to analyse different 'omics' datasets, i.e., genome-wide measurements of transcripts, protein levels or protein-protein interactions, and metabolite levels aiming at generating a coherent understanding of biological function. In this chapter we focus on different methods of correlation analyses ranging from simple pairwise correlation to kernel canonical correlation which were recently applied in molecular biology. Several examples are presented to illustrate their application. The input data for this analysis frequently originate from different experimental platforms. Therefore, preprocessing steps such as data normalisation and missing value estimation are inherent to this approach. The corresponding procedures, potential pitfalls and biases, and available software solutions are reviewed. The multiplicity of observations obtained in omics-profiling experiments necessitates the application of multiple testing correction techniques.
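The simplest of the correlation analyses mentioned, pairwise Pearson correlation between two 'omics' matrices measured on the same samples, can be sketched as follows. The toy data, column counts and effect sizes are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples = 20

# hypothetical profiles on the same samples: transcripts and metabolites
transcripts = rng.standard_normal((n_samples, 5))
metabolites = 0.7 * transcripts[:, :3] + 0.3 * rng.standard_normal((n_samples, 3))

def pairwise_corr(A, B):
    """Pearson correlation between every column of A and every column of B."""
    A = (A - A.mean(axis=0)) / A.std(axis=0)
    B = (B - B.mean(axis=0)) / B.std(axis=0)
    return A.T @ B / len(A)

C = pairwise_corr(transcripts, metabolites)  # shape (5, 3)
```

As the abstract stresses, such cross-platform correlations are only meaningful after the preprocessing steps it lists: per-platform normalisation and missing value estimation must precede the correlation analysis.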

  15. Estimation of lung tumor position from multiple anatomical features on 4D-CT using multiple regression analysis.

    PubMed

    Ono, Tomohiro; Nakamura, Mitsuhiro; Hirose, Yoshinori; Kitsuda, Kenji; Ono, Yuka; Ishigaki, Takashi; Hiraoka, Masahiro

    2017-09-01

    To estimate the lung tumor position from multiple anatomical features on four-dimensional computed tomography (4D-CT) data sets using single regression analysis (SRA) and multiple regression analysis (MRA) approaches, and to evaluate the impact of the approach on the internal target volume (ITV) for stereotactic body radiotherapy (SBRT) of the lung. Eleven consecutive lung cancer patients (12 cases), whose three-dimensional (3D) lung tumor motion exceeded 5 mm, underwent 4D-CT scanning. The 3D tumor position and anatomical features, including lung volume and diaphragm, abdominal wall, and chest wall positions, were measured on 4D-CT images. The tumor position was estimated by SRA using each anatomical feature and by MRA using all anatomical features. The difference between the actual and estimated tumor positions was defined as the root-mean-square error (RMSE). A standard partial regression coefficient for the MRA was evaluated. The 3D lung tumor position showed a high correlation with the lung volume (R = 0.92 ± 0.10). Additionally, ITVs derived from the SRA and MRA approaches were compared with the ITV derived from contouring gross tumor volumes on all 10 phases of the 4D-CT (conventional ITV). The RMSE of the SRA was within 3.7 mm in all directions, whereas the RMSE of the MRA was within 1.6 mm in all directions. The standard partial regression coefficient for the lung volume was the largest and had the most influence on the estimated tumor position. Compared with the conventional ITV, the average percentage decreases in ITV were 31.9% and 38.3% using the SRA and MRA approaches, respectively. The estimation accuracy of the lung tumor position was improved by the MRA approach, which provided a smaller ITV than the conventional ITV. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
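The SRA-versus-MRA comparison reduces to regressing the tumor position on one surrogate versus all surrogates and comparing RMSEs. A minimal sketch with synthetic breathing-phase data; the signal shapes, amplitudes and noise levels are invented, not the paper's measurements:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n_phases = 10  # respiratory phases of the 4D-CT

# hypothetical surrogate signals over one breathing cycle
lung_vol = np.sin(np.linspace(0.0, 2.0 * np.pi, n_phases))
diaphragm = 0.8 * lung_vol + 0.05 * rng.standard_normal(n_phases)
abdominal = 0.6 * lung_vol + 0.10 * rng.standard_normal(n_phases)
tumor_si = 8.0 * lung_vol + 0.3 * rng.standard_normal(n_phases)  # SI position, mm

designs = {
    "SRA": lung_vol.reshape(-1, 1),                            # single feature
    "MRA": np.column_stack([lung_vol, diaphragm, abdominal]),  # all features
}
rmse = {}
for name, X in designs.items():
    pred = LinearRegression().fit(X, tumor_si).predict(X)
    rmse[name] = float(np.sqrt(np.mean((tumor_si - pred) ** 2)))
```

In-sample, the multi-feature fit can only lower the RMSE; the paper's finding is the stronger claim that the improvement carries over to estimating tumor positions not used in the fit.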

  16. Statistical strategies for averaging EC50 from multiple dose-response experiments.

    PubMed

    Jiang, Xiaoqi; Kopp-Schneider, Annette

    2015-11-01

    In most dose-response studies, repeated experiments are conducted to determine the EC50 value for a chemical, requiring EC50 estimates from a series of experiments to be averaged. Two statistical strategies, mixed-effects modeling and the meta-analysis approach, can be applied to estimate the average behavior of EC50 values over all experiments by considering the variabilities within and among experiments. We investigated these two strategies in two common cases of multiple dose-response experiments, in which complete and explicit dose-response relationships are observed (a) in all experiments or (b) only in a subset of experiments. In case (a), the meta-analysis strategy is a simple and robust method for averaging EC50 estimates. In case (b), all experimental data sets can first be screened using the dose-response screening plot, which allows visualization and comparison of multiple dose-response experimental results. As long as more than three experiments provide information about complete dose-response relationships, the experiments that cover incomplete relationships can be excluded from the meta-analysis strategy of averaging EC50 estimates. If there are only two experiments containing complete dose-response information, the mixed-effects model approach is suggested. We additionally provide a web application for non-statisticians to implement the proposed meta-analysis strategy of averaging EC50 estimates from multiple dose-response experiments.
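The meta-analysis strategy for averaging EC50 estimates amounts to an inverse-variance weighted mean with a between-experiment heterogeneity term. A minimal sketch, assuming the standard DerSimonian-Laird heterogeneity estimator and averaging on the log10-EC50 scale as is common practice; the estimates and variances below are invented:

```python
import numpy as np

def meta_mean(estimates, variances):
    """Random-effects (DerSimonian-Laird) average of per-experiment estimates.
    Returns the pooled mean and the between-experiment variance tau^2."""
    est = np.asarray(estimates, float)
    var = np.asarray(variances, float)
    w = 1.0 / var                              # inverse-variance weights
    fixed = np.sum(w * est) / np.sum(w)        # fixed-effect pooled mean
    q = np.sum(w * (est - fixed) ** 2)         # Cochran's Q heterogeneity statistic
    denom = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(est) - 1)) / denom)
    w_re = 1.0 / (var + tau2)                  # random-effects weights
    return np.sum(w_re * est) / np.sum(w_re), tau2

# hypothetical log10-EC50 estimates (and squared SEs) from four experiments
pooled, tau2 = meta_mean([-5.8, -6.1, -5.9, -6.0], [0.01, 0.02, 0.01, 0.03])
```

When the experiments disagree more than their within-experiment variances explain, tau2 grows and the weights flatten toward a simple average, which is the behaviour that distinguishes the meta-analysis strategy from naive pooling.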

  17. Estimating regional greenhouse gas fluxes: An uncertainty analysis of planetary boundary layer techniques and bottom-up inventories

    USDA-ARS?s Scientific Manuscript database

    Quantification of regional greenhouse gas (GHG) fluxes is essential for establishing mitigation strategies and evaluating their effectiveness. Here, we used multiple top-down approaches and multiple trace gas observations at a tall tower to estimate GHG regional fluxes and evaluate the GHG fluxes de...

  18. Broad-Based National Education in Globalisation: Conceptualisation, Multiple Functions and Management

    ERIC Educational Resources Information Center

    Cheng, Yin Cheong; Yuen, Timothy W. W.

    2017-01-01

    Purpose: The purpose of this paper is to contribute to the worldwide discussion of conceptualization, multiple functions and management of national education in an era of globalisation by proposing a new comprehensive framework for research, policy analysis and practical implementation. Design/Methodology/Approach: Based on a review of the…

  19. Mexican American Adolescents' Profiles of Risk and Mental Health: A Person-Centered Longitudinal Approach

    ERIC Educational Resources Information Center

    Zeiders, Katharine H.; Roosa, Mark W.; Knight, George P.; Gonzales, Nancy A.

    2013-01-01

    Although Mexican American adolescents experience multiple risk factors in their daily lives, most research examines the influences of risk factors on adjustment independently, ignoring the additive and interactive effects of multiple risk factors. Guided by a person-centered perspective and utilizing latent profile analysis, this study identified…

  20. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e. risk benefit analysis) we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate - ratio of thrombosis to bleeding. Results The analysis showed that compared to placebo ximelagatran was superior to other options but final results were influenced by type of surgery, since ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques we demonstrate a method that allows comparing multiple competing interventions in the absence of randomized trials with multiple arms by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
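The simulation step the authors describe, drawing event probabilities from beta distributions and weighing incremental risk against incremental benefit at an acceptability threshold, can be sketched as follows. The event counts, drug labels, and the threshold of 0.5 bleeds per thrombosis averted are all hypothetical, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 10_000

# hypothetical event counts (events, patients) -- not the paper's data
old = {"bleed": (12, 1000), "thromb": (30, 1000)}   # comparator regimen
new = {"bleed": (25, 1000), "thromb": (15, 1000)}   # candidate regimen

def draw(events, n):
    # Beta(events + 1, n - events + 1): event-rate posterior under a uniform prior
    return rng.beta(events + 1, n - events + 1, n_sim)

inc_risk = draw(*new["bleed"]) - draw(*old["bleed"])        # extra major bleeds
inc_benefit = draw(*old["thromb"]) - draw(*new["thromb"])   # thromboses averted

w = 0.5  # acceptability threshold: one major bleed "costs" half a thrombosis
net_clinical_benefit = inc_benefit - w * inc_risk
prob_new_preferred = float(np.mean(net_clinical_benefit > 0))
```

Re-evaluating the last two lines over a range of w values yields the acceptability curve the authors describe, with the reference range for w anchored by the case fatality rate ratio of thrombosis to bleeding.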

  1. Which empowerment, which Health Promotion? Conceptual convergences and divergences in preventive health practices.

    PubMed

    Ferreira, Marcos Santos; Castiel, Luis David

    2009-01-01

    Given its multiple meanings, 'empowerment' can be identified with either conservative or critical Health Promotion approaches. From a conservative approach, the concept is viewed as an essentially individual phenomenon, centered on the provision of information and the external transfer of power in the name of the collective good; the relationship between 'psychological' and 'community' empowerment is not considered. From a critical approach, the concept is viewed as a relational phenomenon that manifests itself in the dialectic conflict of interests between individuals, groups, and social classes; 'psychological' and 'community' empowerment are seen as micro and macro levels of analysis, and social transformations are the result of simultaneous changes at these levels. The use of the notion of empowerment without critical reflection or political analysis of power relations in society disseminates vague, romantic, and homogeneous views of 'community'. Therefore, to assume the relational nature of empowerment means to accept its interdependence with the notion of participation, without which there can be no social transformation. Thus, one should be vigilant about the multiple meanings that empowerment can be given in Health Promotion discourse.

  2. Maximum Marginal Likelihood Estimation of a Monotonic Polynomial Generalized Partial Credit Model with Applications to Multiple Group Analysis.

    PubMed

    Falk, Carl F; Cai, Li

    2016-06-01

    We present a semi-parametric approach to estimating item response functions (IRF) useful when the true IRF does not strictly follow commonly used functions. Our approach replaces the linear predictor of the generalized partial credit model with a monotonic polynomial. The model includes the regular generalized partial credit model at the lowest order polynomial. Our approach extends Liang's (A semi-parametric approach to estimate IRFs, Unpublished doctoral dissertation, 2007) method for dichotomous item responses to the case of polytomous data. Furthermore, item parameter estimation is implemented with maximum marginal likelihood using the Bock-Aitkin EM algorithm, thereby facilitating multiple group analyses useful in operational settings. Our approach is demonstrated on both educational and psychological data. We present simulation results comparing our approach to more standard IRF estimation approaches and other non-parametric and semi-parametric alternatives.

  3. An integrated modeling approach to support management decisions of coupled groundwater-agricultural systems under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens

    2015-04-01

    The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges. These complexities are further compounded by multiple actors, frequently with conflicting interests, and by multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement, and stochastic simulation within a general framework. The proposed approach is then applied to a water management problem in a water-scarce coastal arid region of northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer's sustainability, endangering the associated socio-economic conditions as well as the traditional social structure. Results from the developed method provide key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, the approach enables systematic quantification of both probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis within the developed tool shows that the decision makers' risk-averse or risk-taking attitude may yield different rankings of decision alternatives. The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.

  4. Multiple solution of linear algebraic systems by an iterative method with recomputed preconditioner in the analysis of microstrip structures

    NASA Astrophysics Data System (ADS)

    Ahunov, Roman R.; Kuksenko, Sergey P.; Gazizov, Talgat R.

    2016-06-01

    A multiple solution of linear algebraic systems with dense matrices by iterative methods is considered. To accelerate the process, recomputing of the preconditioning matrix is used. An a priori criterion for recomputing, based on the change of the arithmetic mean of the solution times during the multiple solution, is proposed. To confirm the effectiveness of the proposed approach, numerical experiments using the iterative methods BiCGStab and CGS for four different sets of matrices on two examples of microstrip structures are carried out. For the solution of 100 linear systems, an acceleration of up to 1.6 times compared to the approach without recomputing is obtained.
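
    The recompute heuristic can be sketched in a few lines. The following is an illustrative reconstruction, not the authors' code: a sequence of slightly drifting systems is solved with a simple Jacobi-preconditioned iteration, and the (diagonal) preconditioner is rebuilt whenever the latest solve costs more than the arithmetic mean of the previous solves, with iteration counts standing in for solution time.

```python
def jacobi_solve(A, b, d_inv, tol=1e-10, max_iter=10_000):
    """Jacobi-preconditioned Richardson iteration: x += D^{-1} (b - A x)."""
    n = len(b)
    x = [0.0] * n
    for it in range(1, max_iter + 1):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        if max(abs(ri) for ri in r) < tol:
            return x, it
        x = [x[i] + d_inv[i] * r[i] for i in range(n)]
    return x, max_iter

def solve_sequence(systems):
    """Solve (A_k, b_k) in turn, recomputing the preconditioner adaptively."""
    d_inv, costs, results = None, [], []
    for A, b in systems:
        if d_inv is None:
            d_inv = [1.0 / A[i][i] for i in range(len(b))]  # initial preconditioner
        x, it = jacobi_solve(A, b, d_inv)
        # Recompute when the latest solve was costlier than the mean so far.
        if costs and it > sum(costs) / len(costs):
            d_inv = [1.0 / A[i][i] for i in range(len(b))]
        costs.append(it)
        results.append(x)
    return results, costs
```

    In this sketch the preconditioner is just the inverse diagonal; the paper uses BiCGStab and CGS with a full preconditioning matrix, but the recompute decision follows the same pattern.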

  5. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis, including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  6. A risk-adapted approach is beneficial in the management of bilateral femoral shaft fractures in multiple trauma patients: an analysis based on the trauma registry of the German Trauma Society.

    PubMed

    Steinhausen, Eva; Lefering, Rolf; Tjardes, Thorsten; Neugebauer, Edmund A M; Bouillon, Bertil; Rixen, Dieter

    2014-05-01

    Today, there is a trend toward damage-control orthopedics (DCO) in the management of multiple trauma patients with long bone fractures. However, there is no widely accepted concept. A risk-adapted approach seems to result in low acute morbidity and mortality. Multiple trauma patients with bilateral femoral shaft fractures (FSFs) are considered to be more severely injured. The objective of this study was to validate the risk-adapted approach in the management of multiple trauma patients with bilateral FSF. Data analysis is based on the trauma registry of the German Trauma Society (1993-2008, n = 42,248). Multiple trauma patients with bilateral FSF were analyzed in subgroups according to the type of primary operative strategy. Outcome parameters were mortality and major complications such as (multiple) organ failure and sepsis. A total of 379 patients with bilateral FSF were divided into four groups as follows: (1) no operation (8.4%), (2) bilateral temporary external fixation (DCO) (50.9%), (3) bilateral primary definitive osteosynthesis (early total care [ETC]) (25.1%), and (4) primary definitive osteosynthesis of one FSF and DCO contralaterally (mixed) (15.6%). Compared with the ETC group, the DCO group was more severely injured. The incidence of (multiple) organ failure and the mortality rates were higher in the DCO group, but not significantly so. Adjusted for injury severity, there was no significant difference in mortality rates between DCO and ETC. Injury severity and mortality rates were significantly increased in the no-operation group. The mixed group was similar to the ETC group regarding injury severity and outcome. In Germany, both DCO and ETC are practiced so far in multiple trauma patients with bilateral FSF. The unstable or potentially unstable patient is reasonably treated with DCO. The clearly stable patient is reasonably treated with nailing. When in doubt, the patient is probably not totally stable, and the safest precaution may be to use DCO as a risk-adapted approach. Therapeutic study, level IV. Epidemiologic study, level III.

  7. Standardised Library Instruction Assessment: An Institution-Specific Approach

    ERIC Educational Resources Information Center

    Staley, Shannon M.; Branch, Nicole A.; Hewitt, Tom L.

    2010-01-01

    Introduction: We explore the use of a psychometric model for locally-relevant, information literacy assessment, using an online tool for standardised assessment of student learning during discipline-based library instruction sessions. Method: A quantitative approach to data collection and analysis was used, employing standardised multiple-choice…

  8. Tau-independent Phase Analysis: A Novel Method for Accurately Determining Phase Shifts.

    PubMed

    Tackenberg, Michael C; Jones, Jeff R; Page, Terry L; Hughey, Jacob J

    2018-06-01

    Estimations of period and phase are essential in circadian biology. While many techniques exist for estimating period, comparatively few methods are available for estimating phase. Current approaches to analyzing phase often vary between studies and are sensitive to coincident changes in period and the stage of the circadian cycle at which the stimulus occurs. Here we propose a new technique, tau-independent phase analysis (TIPA), for quantifying phase shifts in multiple types of circadian time-course data. Through comprehensive simulations, we show that TIPA is both more accurate and more precise than the standard actogram approach. TIPA is computationally simple and therefore will enable accurate and reproducible quantification of phase shifts across multiple subfields of chronobiology.

  9. Cytogenetics in the management of multiple myeloma: an update by the Groupe francophone de cytogénétique hématologique (GFCH).

    PubMed

    Daudignon, Agnès; Quilichini, Benoît; Ameye, Geneviève; Poirel, Hélène; Bastard, Christian; Terré, Christine

    2016-10-01

    Cytogenetics of multiple myeloma has evolved in recent years with the emergence of interphase fluorescence in situ hybridization (FISH) performed on sorted plasma cells, which detects abnormalities independently of proliferative and infiltrative indices. Cytogenetic analysis plays a major part in risk stratification at myeloma diagnosis, due to the prognostic impact of various cytogenetic abnormalities and their association with emerging therapeutic approaches in MM. Thus, practice guidelines now recommend interphase FISH, or alternative molecular techniques, as the initial analysis for multiple myeloma. The Groupe francophone de cytogénétique hématologique (GFCH) proposes in this issue an update on the management of multiple myeloma cytogenetics.

  10. Multiple Interactive Pollutants in Water Quality Trading

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Lence, Barbara J.; Shamsai, Abolfazl

    2008-10-01

    Efficient environmental management calls for the consideration of multiple pollutants, for which two main types of transferable discharge permit (TDP) program have been described: separate permits that manage each pollutant individually in separate markets, with each permit based on the quantity of the pollutant or its environmental effects, and weighted-sum permits that aggregate several pollutants as a single commodity to be traded in a single market. In this paper, we perform a mathematical analysis of TDP programs for multiple pollutants that jointly affect the environment (i.e., interactive pollutants) and demonstrate the practicality of this approach for cost-efficient maintenance of river water quality. For interactive pollutants, the relative weighting factors are functions of the water quality impacts, marginal damage function, and marginal treatment costs at optimality. We derive the optimal set of weighting factors required by this approach for important scenarios for multiple interactive pollutants and propose using an analytical elasticity of substitution function to estimate damage functions for these scenarios. We evaluate the applicability of this approach using a hypothetical example that considers two interactive pollutants. We compare the weighted-sum permit approach for interactive pollutants with individual permit systems and TDP programs for multiple additive pollutants. We conclude by discussing practical considerations and implementation issues that result from the application of weighted-sum permit programs.
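
    The weighted-sum aggregation is simple to illustrate. The sketch below uses invented weights and discharge figures (none of these numbers come from the paper): each discharger's holdings of the two pollutants are mapped to a single tradable permit quantity via the weighting factors.

```python
def weighted_permits(discharges, weights):
    """Map per-pollutant discharges (tons) to a single weighted-sum permit quantity."""
    return {firm: sum(w * q for w, q in zip(weights, qs))
            for firm, qs in discharges.items()}

weights = (1.0, 2.5)          # hypothetical relative marginal-damage weights
discharges = {
    "plant_A": (10.0, 4.0),   # (pollutant 1, pollutant 2)
    "plant_B": (6.0, 8.0),
}
permits = weighted_permits(discharges, weights)
# plant_A: 1.0*10 + 2.5*4 = 20.0 ; plant_B: 1.0*6 + 2.5*8 = 26.0
```

    Trading then takes place in the single market for the aggregated commodity: a swap of the two pollutants leaves a discharger's position unchanged when the weighted quantities exchanged are equal.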

  11. Deriving percentage study weights in multi-parameter meta-analysis models: with application to meta-regression, network meta-analysis and one-stage individual participant data models.

    PubMed

    Riley, Richard D; Ensor, Joie; Jackson, Dan; Burke, Danielle L

    2017-01-01

    Many meta-analysis models contain multiple parameters, for example due to multiple outcomes, multiple treatments or multiple regression coefficients. In particular, meta-regression models may contain multiple study-level covariates, and one-stage individual participant data meta-analysis models may contain multiple patient-level covariates and interactions. Here, we propose how to derive percentage study weights for such situations, in order to reveal the (otherwise hidden) contribution of each study toward the parameter estimates of interest. We assume that studies are independent, and utilise a decomposition of Fisher's information matrix to decompose the total variance matrix of parameter estimates into study-specific contributions, from which percentage weights are derived. This approach generalises how percentage weights are calculated in a traditional, single parameter meta-analysis model. Application is made to one- and two-stage individual participant data meta-analyses, meta-regression and network (multivariate) meta-analysis of multiple treatments. These reveal percentage study weights toward clinically important estimates, such as summary treatment effects and treatment-covariate interactions, and are especially useful when some studies are potential outliers or at high risk of bias. We also derive percentage study weights toward methodologically interesting measures, such as the magnitude of ecological bias (difference between within-study and across-study associations) and the amount of inconsistency (difference between direct and indirect evidence in a network meta-analysis).
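
    In the traditional single-parameter model that this approach generalises, study weights are inverse-variance weights and a study's percentage weight is its share of the total. A minimal sketch with illustrative variances:

```python
def percentage_weights(variances):
    """Percentage study weights in a single-parameter inverse-variance meta-analysis."""
    inv = [1.0 / v for v in variances]   # weight of each study = 1 / variance
    total = sum(inv)
    return [100.0 * w / total for w in inv]

# Three studies with standard errors 0.1, 0.2, 0.4 -> variances 0.01, 0.04, 0.16
pct = percentage_weights([0.01, 0.04, 0.16])
```

    The paper's contribution is the multi-parameter analogue: decomposing the Fisher information matrix so each study's contribution to every parameter estimate (not just one pooled effect) can be expressed as a percentage.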

  12. Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds

    PubMed Central

    Deeks, J.J.; Martin, E.C.; Riley, R.D.

    2017-01-01

    Introduction For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta‐analysis at each threshold. A standard meta‐analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between‐study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta‐analysis of test accuracy studies. PMID:29052347
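
    Rubin's rules, used here to combine the multiple pooled results, are standard and easy to state in code. A minimal sketch for one parameter, with invented estimates and within-imputation variances:

```python
import math

def rubins_rules(estimates, variances):
    """Pool m imputed-analysis results (estimates q_i, within variances u_i)."""
    m = len(estimates)
    q_bar = sum(estimates) / m                               # pooled point estimate
    u_bar = sum(variances) / m                               # within-imputation variance
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)   # between-imputation variance
    t = u_bar + (1 + 1 / m) * b                              # total variance
    return q_bar, t, math.sqrt(t)

est, var, se = rubins_rules([0.52, 0.48, 0.55], [0.010, 0.012, 0.011])
```

    The between-imputation term b is what drives the slightly larger standard errors (and hence better coverage) that the abstract reports for MIDC relative to single imputation.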

  13. Radiogenomics: a systems biology approach to understanding genetic risk factors for radiotherapy toxicity?

    PubMed Central

    Herskind, Carsten; Talbot, Christopher J.; Kerns, Sarah L.; Veldwijk, Marlon R.; Rosenstein, Barry S.; West, Catharine M. L.

    2016-01-01

    Adverse reactions in normal tissue after radiotherapy (RT) limit the dose that can be given to tumour cells. Since 80% of individual variation in clinical response is estimated to be caused by patient-related factors, identifying these factors might allow prediction of patients with increased risk of developing severe reactions. While inactivation of cell renewal is considered a major cause of toxicity in early-reacting normal tissues, complex interactions involving multiple cell types, cytokines, and hypoxia seem important for late reactions. Here, we review ‘omics’ approaches such as screening of genetic polymorphisms or gene expression analysis, and assess the potential of epigenetic factors, posttranslational modification, signal transduction, and metabolism. Furthermore, functional assays have suggested possible associations with clinical risk of adverse reaction. Pathway analysis incorporating different ‘omics’ approaches may be more efficient in identifying critical pathways than pathway analysis based on single ‘omics’ data sets. Integrating these pathways with functional assays may be powerful in identifying multiple subgroups of RT patients characterized by different mechanisms. Thus ‘omics’ and functional approaches may synergize if they are integrated into radiogenomics ‘systems biology’ to facilitate the goal of individualised radiotherapy. PMID:26944314

  14. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding it as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions that are lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.

  15. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    NASA Astrophysics Data System (ADS)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in technical and financial performance are analysed by comparing five North African railways over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the software ARGOS. These methods provide complementary detailed information: Malmquist discriminates between technological and management progress, while PROMETHEE captures two dimensions of performance, service to the community and enterprise performance, which are often in conflict.

  16. Testing and analysis of flat and curved panels with multiple cracks

    NASA Technical Reports Server (NTRS)

    Broek, David; Jeong, David Y.; Thomson, Douglas

    1994-01-01

    An experimental and analytical investigation of multiple cracking in various types of test specimens is described in this paper. The testing phase comprised a flat unstiffened panel series and curved stiffened and unstiffened panel series. The test specimens contained various configurations of initial damage. Static loading was applied to these specimens until ultimate failure, while loads and crack propagation were recorded. These data provide the basis for developing and validating methodologies for predicting linkup of multiple cracks, progression to failure, and overall residual strength. The results from twelve flat coupon and ten full-scale curved panel tests are presented. In addition, an engineering analysis procedure was developed to predict multiple crack linkup. Reasonable agreement was found between predictions and actual test results for linkup and residual strength for both flat and curved panels. The results indicate that an engineering analysis approach has the potential to quantitatively assess the effect of multiple cracks on the arrest capability of an aircraft fuselage structure.

  17. A Participatory Action Research Approach To Evaluating Inclusive School Programs.

    ERIC Educational Resources Information Center

    Dymond, Stacy K.

    2001-01-01

    This article proposes a model for evaluating inclusive schools. Key elements of the model are inclusion of stakeholders in the evaluation process through a participatory action research approach, analysis of program processes and outcomes, use of multiple methods and measures, and obtaining perceptions from diverse stakeholder groups. (Contains…

  18. Multiple Approaches to the Evaluation of Educational Reform: From Cost-Benefit to Power-Benefit Analysis.

    ERIC Educational Resources Information Center

    Paulston, Rolland G.

    Theories or explanations of educational evaluation are discussed and categorized under two broad methodological headings, the objectivist and subjectivist epistemological orientations. They can be seen as potentially complementary empirical approaches that offer evaluators two methodological orientations to assess educational-reform outcomes.…

  19. Selecting Personal Computers.

    ERIC Educational Resources Information Center

    Djang, Philipp A.

    1993-01-01

    Describes a Multiple Criteria Decision Analysis Approach for the selection of personal computers that combines the capabilities of the Analytic Hierarchy Process and Integer Goal Programming. An example of how decision makers can use this approach to determine what kind of personal computers and how many of each type to purchase is given. (nine…

  20. DIF Detection Using Multiple-Group Categorical CFA with Minimum Free Baseline Approach

    ERIC Educational Resources Information Center

    Chang, Yu-Wei; Huang, Wei-Kang; Tsai, Rung-Ching

    2015-01-01

    The aim of this study is to assess the efficiency of using the multiple-group categorical confirmatory factor analysis (MCCFA) and the robust chi-square difference test in differential item functioning (DIF) detection for polytomous items under the minimum free baseline strategy. While testing for DIF items, despite the strong assumption that all…

  1. What Is Successful Writing? An Investigation into the Multiple Ways Writers Can Write Successful Essays

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Roscoe, Rod; McNamara, Danielle S.

    2014-01-01

    This study identifies multiple profiles of successful essays via a cluster analysis approach using linguistic features reported by a variety of natural language processing tools. The findings from the study indicate that there are four profiles of successful writers for the samples analyzed. These four profiles are linguistically distinct from one…

  2. Using a Multiple Perspectives Framework: A Methodological Approach to Analyse Complex and Contradictory Interview Data

    ERIC Educational Resources Information Center

    Santoro, Ninetta

    2014-01-01

    In this article I describe how a multiple perspectives framework drawn from the field of social work informed my analysis of interview data obtained from Australian preservice teachers who had gone on an international study trip. One incident recounted differently by three separate interviewees meant that the sometimes-similar and…

  3. Parallel processing considerations for image recognition tasks

    NASA Astrophysics Data System (ADS)

    Simske, Steven J.

    2011-01-01

    Many image recognition tasks are well-suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images. From this standpoint, then, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing that will be considered in this paper: parallel processing (1) by task; (2) by image region; and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows, as diverse as optical character recognition (OCR), document classification, and barcode reading, to parallel pipelines. This can substantially decrease the time to completion for the document tasks. In this approach, each parallel pipeline generally performs a different task. Parallel processing by image region allows a larger imaging task to be subdivided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach. Examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
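
    Parallel processing by image region can be sketched with the standard library. In this illustration the "image" is a plain 2-D list, the per-band analysis is a stand-in dark-pixel count (a real task would be skew detection or face detection), and a thread pool plays the role of the parallel pipelines in the map-reduce pattern described above.

```python
from concurrent.futures import ThreadPoolExecutor

def count_dark(band, threshold=64):
    """Stand-in regional analysis: count pixels darker than the threshold."""
    return sum(1 for row in band for px in row if px < threshold)

def analyse_by_region(image, n_bands=4):
    """Split the image into horizontal bands, analyse each in parallel, reduce."""
    rows = len(image)
    step = (rows + n_bands - 1) // n_bands
    bands = [image[i:i + step] for i in range(0, rows, step)]
    with ThreadPoolExecutor(max_workers=n_bands) as pool:
        partials = list(pool.map(count_dark, bands))   # map step
    return sum(partials)                               # reduce step
```

    A process pool (or a distributed map-reduce framework) would replace the thread pool for CPU-bound pixel work; the decomposition logic is the same.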

  4. An efficient approach to understanding and predicting the effects of multiple task characteristics on performance.

    PubMed

    Richardson, Miles

    2017-04-01

    In ergonomics there is often a need to identify and predict the separate effects of multiple factors on performance. A cost-effective fractional factorial approach to understanding the relationship between task characteristics and task performance is presented. The method has been shown to provide sufficient independent variability to reveal and predict the effects of task characteristics on performance in two domains. The five steps outlined are: selection of performance measure, task characteristic identification, task design for user trials, data collection, regression model development and task characteristic analysis. The approach can be used for furthering knowledge of task performance, theoretical understanding, experimental control and prediction of task performance. Practitioner Summary: A cost-effective method to identify and predict the separate effects of multiple factors on performance is presented. The five steps allow a better understanding of task factors during the design process.
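
    The effect-estimation step of such a design can be illustrated with a two-level factorial design and contrast-based main effects; the design and performance figures below are invented for illustration and are not from the paper.

```python
def main_effects(design, y):
    """design: rows of +/-1 factor settings; y: observed performance per run."""
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [yi for row, yi in zip(design, y) if row[j] == +1]
        lo = [yi for row, yi in zip(design, y) if row[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))   # contrast for factor j
    return effects

# 2^2 design over two task characteristics, with hypothetical completion times
design = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]
y = [10.0, 14.0, 12.0, 16.0]
effects = main_effects(design, y)   # factor 1 adds 4 time units, factor 2 adds 2
```

    A fractional design would use a subset of the runs chosen so the main-effect contrasts stay estimable, which is what makes the approach cost-effective with many task characteristics.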

  5. A realistic evaluation: the case of protocol-based care

    PubMed Central

    2010-01-01

    Background 'Protocol based care' was envisioned by policy makers as a mechanism for delivering on the service improvement agenda in England. Realistic evaluation is an increasingly popular approach, but few published examples exist, particularly in implementation research. To fill this gap, within this paper we describe the application of a realistic evaluation approach to the study of protocol-based care, whilst sharing findings of relevance about standardising care through the use of protocols, guidelines, and pathways. Methods Situated between positivism and relativism, realistic evaluation is concerned with the identification of underlying causal mechanisms, how they work, and under what conditions. Fundamentally it focuses attention on finding out what works, for whom, how, and in what circumstances. Results In this research, we were interested in understanding the relationships between the type and nature of particular approaches to protocol-based care (mechanisms), within different clinical settings (context), and what impacts this resulted in (outcomes). An evidence review using the principles of realist synthesis resulted in a number of propositions, i.e., context, mechanism, and outcome threads (CMOs). These propositions were then 'tested' through multiple case studies, using multiple methods including non-participant observation, interviews, and document analysis through an iterative analysis process. The initial propositions (conjectured CMOs) only partially corresponded to the findings that emerged during analysis. From the iterative analysis process of scrutinising mechanisms, context, and outcomes we were able to draw out some theoretically generalisable features about what works, for whom, how, and what circumstances in relation to the use of standardised care approaches (refined CMOs). 
Conclusions As one of the first studies to apply realistic evaluation in implementation research, it was a good fit, particularly given the growing emphasis on understanding how context influences evidence-based practice. The strengths and limitations of the approach are considered, including how to operationalise it and some of the challenges. This approach provided a useful interpretive framework with which to make sense of the multiple factors that were simultaneously at play and being observed through various data sources, and for developing explanatory theory about using standardised care approaches in practice. PMID:20504293

  6. Missing continuous outcomes under covariate dependent missingness in cluster randomised trials

    PubMed Central

    Diaz-Ordaz, Karla; Bartlett, Jonathan W

    2016-01-01

    Attrition is a common occurrence in cluster randomised trials which leads to missing outcome data. Two approaches for analysing such trials are cluster-level analysis and individual-level analysis. This paper compares the performance of unadjusted cluster-level analysis, baseline covariate adjusted cluster-level analysis and linear mixed model analysis, under baseline covariate dependent missingness in continuous outcomes, in terms of bias, average estimated standard error and coverage probability. The methods of complete records analysis and multiple imputation are used to handle the missing outcome data. We considered four scenarios, with the missingness mechanism and baseline covariate effect on outcome either the same or different between intervention groups. We show that both unadjusted cluster-level analysis and baseline covariate adjusted cluster-level analysis give unbiased estimates of the intervention effect only if both intervention groups have the same missingness mechanisms and there is no interaction between baseline covariate and intervention group. Linear mixed model and multiple imputation give unbiased estimates under all four considered scenarios, provided that an interaction of intervention and baseline covariate is included in the model when appropriate. Cluster mean imputation has been proposed as a valid approach for handling missing outcomes in cluster randomised trials. We show that cluster mean imputation only gives unbiased estimates when missingness mechanism is the same between the intervention groups and there is no interaction between baseline covariate and intervention group. Multiple imputation shows overcoverage for small number of clusters in each intervention group. PMID:27177885

  7. Missing continuous outcomes under covariate dependent missingness in cluster randomised trials.

    PubMed

    Hossain, Anower; Diaz-Ordaz, Karla; Bartlett, Jonathan W

    2017-06-01

    Attrition is a common occurrence in cluster randomised trials which leads to missing outcome data. Two approaches for analysing such trials are cluster-level analysis and individual-level analysis. This paper compares the performance of unadjusted cluster-level analysis, baseline covariate adjusted cluster-level analysis and linear mixed model analysis, under baseline covariate dependent missingness in continuous outcomes, in terms of bias, average estimated standard error and coverage probability. The methods of complete records analysis and multiple imputation are used to handle the missing outcome data. We considered four scenarios, with the missingness mechanism and baseline covariate effect on outcome either the same or different between intervention groups. We show that both unadjusted cluster-level analysis and baseline covariate adjusted cluster-level analysis give unbiased estimates of the intervention effect only if both intervention groups have the same missingness mechanisms and there is no interaction between baseline covariate and intervention group. Linear mixed model and multiple imputation give unbiased estimates under all four considered scenarios, provided that an interaction of intervention and baseline covariate is included in the model when appropriate. Cluster mean imputation has been proposed as a valid approach for handling missing outcomes in cluster randomised trials. We show that cluster mean imputation only gives unbiased estimates when missingness mechanism is the same between the intervention groups and there is no interaction between baseline covariate and intervention group. Multiple imputation shows overcoverage for small number of clusters in each intervention group.
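
    The unadjusted cluster-level analysis described here is easy to sketch: collapse each cluster to the mean of its complete records, then compare arm-level averages of the cluster means. The data below are invented; a full analysis would add a t-test on the cluster means.

```python
def cluster_level_estimate(arms):
    """arms: {arm_name: list of per-cluster outcome lists}; None marks a missing outcome."""
    arm_means = {}
    for arm, clusters in arms.items():
        means = []
        for outcomes in clusters:
            obs = [y for y in outcomes if y is not None]   # complete-records analysis
            means.append(sum(obs) / len(obs))              # cluster summarised by its mean
        arm_means[arm] = sum(means) / len(means)           # unweighted average of cluster means
    return arm_means["intervention"] - arm_means["control"]
```

    As the abstract notes, this estimate is unbiased only when both arms share the same missingness mechanism and there is no baseline-covariate-by-intervention interaction; otherwise mixed models or multiple imputation are needed.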

  8. Improving homology modeling of G-protein coupled receptors through multiple-template derived conserved inter-residue interactions

    NASA Astrophysics Data System (ADS)

    Chaudhari, Rajan; Heim, Andrew J.; Li, Zhijun

    2015-05-01

    Evidenced by the three rounds of G-protein coupled receptor (GPCR) Dock competitions, improving homology modeling methods for helical transmembrane proteins including the GPCRs, based on templates of low sequence identity, remains a prominent challenge. Current approaches addressing this challenge adopt the philosophy of "modeling first, refinement next". In the present work, we developed an alternative modeling approach through the novel application of available multiple templates. First, conserved inter-residue interactions are derived from each additional template through conservation analysis of each template-target pairwise alignment. Then, these interactions are converted into distance restraints and incorporated in the homology modeling process. This approach was applied to modeling of the human β2 adrenergic receptor using the bovine rhodopsin and the human protease-activated receptor 1 as templates, and improved model quality was demonstrated compared to homology models generated by standard single-template and multiple-template methods. This method of "refined restraints first, modeling next" provides a fast and complementary alternative to current modeling approaches. It allows rational identification and implementation of additional conserved distance restraints extracted from multiple templates and/or experimental data, and has the potential to be applicable to modeling of all helical transmembrane proteins.

  9. Single-Molecule Studies of Actin Assembly and Disassembly Factors

    PubMed Central

    Smith, Benjamin A.; Gelles, Jeff; Goode, Bruce L.

    2014-01-01

    The actin cytoskeleton is very dynamic and highly regulated by multiple associated proteins in vivo. Understanding how this system of proteins functions in the processes of actin network assembly and disassembly requires methods to dissect the mechanisms of activity of individual factors and of multiple factors acting in concert. The advent of single-filament and single-molecule fluorescence imaging methods has provided a powerful new approach to discovering actin-regulatory activities and obtaining direct, quantitative insights into the pathways of molecular interactions that regulate actin network architecture and dynamics. Here we describe techniques for acquisition and analysis of single-molecule data, applied to the novel challenges of studying the filament assembly and disassembly activities of actin-associated proteins in vitro. We discuss the advantages of single-molecule analysis in directly visualizing the order of molecular events, measuring the kinetic rates of filament binding and dissociation, and studying the coordination among multiple factors. The methods described here complement traditional biochemical approaches in elucidating actin-regulatory mechanisms in reconstituted filamentous networks. PMID:24630103

  10. Simultaneous estimation of multiple phases in digital holographic interferometry using state space analysis

    NASA Astrophysics Data System (ADS)

    Kulkarni, Rishikesh; Rastogi, Pramod

    2018-05-01

    A new approach is proposed for multiple phase estimation from a multicomponent exponential phase signal recorded in multi-beam digital holographic interferometry. It is capable of providing multidimensional measurements in a simultaneous manner from a single recording of the exponential phase signal encoding multiple phases. Each phase within a small window around each pixel is approximated with a first-order polynomial function of spatial coordinates. The problem of accurate estimation of the polynomial coefficients, and in turn the unwrapped phases, is formulated as a state space analysis wherein the coefficients and signal amplitudes are set as the elements of a state vector. The state estimation is performed using the extended Kalman filter. An amplitude discrimination criterion is utilized in order to unambiguously estimate the coefficients associated with the individual signal components. The performance of the proposed method is stable over a wide range of the ratio of signal amplitudes. The pixelwise phase estimation approach of the proposed method allows it to handle fringe patterns that may contain invalid regions.

  11. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.

  12. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.

  13. Visualization-based analysis of multiple response survey data

    NASA Astrophysics Data System (ADS)

    Timofeeva, Anastasiia

    2017-11-01

    During the survey, the respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because of the need to process multiple response variables. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups it is suggested to use similarity and association matrices. Some aggregate indicators of dissimilarity (similarity) are proposed based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is well illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources. This information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are of interest. To identify such differences, hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested. It is based on partition of a consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between the population proportions. It turned out to be more suitable for the real problem being solved.
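The association and similarity matrices described in this record can be computed directly from the respondent-by-option indicator matrix. A minimal sketch with hypothetical survey data (Jaccard similarity chosen as one plausible similarity measure; the paper's exact indicators may differ):

```python
import numpy as np

# Rows = respondents, columns = answer options (1 = option ticked).
X = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
])

counts = X.T @ X             # association matrix: co-occurrence counts
sizes = np.diag(counts)      # respondents who ticked each option

# Jaccard similarity between options: |A ∩ B| / |A ∪ B|
union = sizes[:, None] + sizes[None, :] - counts
similarity = counts / np.where(union == 0, 1, union)
```

Off-diagonal entries of `counts` are exactly the overlap regions of a Venn diagram of the options, so the matrix view scales to more overlapping groups than the diagram itself can display.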

  14. Merging information from multi-model flood projections in a hierarchical Bayesian framework

    NASA Astrophysics Data System (ADS)

    Le Vine, Nataliya

    2016-04-01

    Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions against which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.

  15. Approach to proliferation risk assessment based on multiple objective analysis framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrianov, A.; Kuptsov, I.; Studgorodok 1, Obninsk, Kaluga region, 249030

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  16. Seeking unique and common biological themes in multiple gene lists or datasets: pathway pattern extraction pipeline for pathway-level comparative analysis.

    PubMed

    Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M

    2009-06-29

    One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
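The "statistical enrichment measurements" underlying pipelines of this kind are commonly hypergeometric tests of overlap between a gene list and a pathway. A minimal sketch (hypothetical gene counts; this is the generic test, not PPEP's specific implementation):

```python
from scipy.stats import hypergeom

def enrichment_p(n_genome, n_pathway, n_list, n_overlap):
    """One-sided enrichment p-value: probability of at least n_overlap
    pathway genes appearing in a list of n_list genes drawn at random
    from a genome of n_genome genes, n_pathway of which are in the pathway."""
    return hypergeom.sf(n_overlap - 1, n_genome, n_pathway, n_list)

# Hypothetical example: 10 of 200 selected genes fall in a 100-gene pathway
# (about 1 expected by chance in a 20,000-gene genome).
p = enrichment_p(n_genome=20000, n_pathway=100, n_list=200, n_overlap=10)
```

Applying this test per gene list and per pathway yields a pathways-by-lists matrix of p-values, whose row patterns can then be compared across lists, in the spirit of the pathway-level pattern extraction described above.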

  17. Transitioning to multiple imputation : a new method to impute missing blood alcohol concentration (BAC) values in FARS

    DOT National Transportation Integrated Search

    2002-01-01

    The National Center for Statistics and Analysis (NCSA) of the National Highway Traffic Safety : Administration (NHTSA) has undertaken several approaches to remedy the problem of missing blood alcohol : test results in the Fatality Analysis Reporting ...

  18. Fast alignment-free sequence comparison using spaced-word frequencies.

    PubMed

    Leimeister, Chris-Andre; Boden, Marcus; Horwege, Sebastian; Lindner, Sebastian; Morgenstern, Burkhard

    2014-07-15

    Alignment-free methods for sequence comparison are increasingly used for genome analysis and phylogeny reconstruction; they circumvent various difficulties of traditional alignment-based approaches. In particular, alignment-free methods are much faster than pairwise or multiple alignments. They are, however, less accurate than methods based on sequence alignment. Most alignment-free approaches work by comparing the word composition of sequences. A well-known problem with these methods is that neighbouring word matches are far from independent. To reduce the statistical dependency between adjacent word matches, we propose to use 'spaced words', defined by patterns of 'match' and 'don't care' positions, for alignment-free sequence comparison. We describe a fast implementation of this approach using recursive hashing and bit operations, and we show that further improvements can be achieved by using multiple patterns instead of single patterns. To evaluate our approach, we use spaced-word frequencies as a basis for fast phylogeny reconstruction. Using real-world and simulated sequence data, we demonstrate that our multiple-pattern approach produces better phylogenies than approaches relying on contiguous words. Our program is freely available at http://spaced.gobics.de/. © The Author 2014. Published by Oxford University Press.
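The spaced-word idea described above is straightforward to prototype: a binary pattern marks match ('1') and don't-care ('0') positions, and only the match positions contribute to each word. A minimal sketch (toy sequences and a Euclidean count distance; the published tool uses its own patterns and distance measures):

```python
from collections import Counter
import math

def spaced_word_counts(seq, pattern):
    """Count spaced words defined by a match/don't-care pattern, e.g. '1101'
    extracts characters at offsets 0, 1 and 3 of each window."""
    match_pos = [i for i, c in enumerate(pattern) if c == '1']
    span = len(pattern)
    return Counter(
        ''.join(seq[i + j] for j in match_pos)
        for i in range(len(seq) - span + 1)
    )

def euclidean_distance(c1, c2):
    """Euclidean distance between two spaced-word count vectors."""
    keys = set(c1) | set(c2)
    return math.sqrt(sum((c1[k] - c2[k]) ** 2 for k in keys))

d_close = euclidean_distance(
    spaced_word_counts("ACGTACGTACGT", "1101"),
    spaced_word_counts("ACGTACGAACGT", "1101"),  # one substitution
)
d_far = euclidean_distance(
    spaced_word_counts("ACGTACGTACGT", "1101"),
    spaced_word_counts("TTTTTTTTTTTT", "1101"),  # unrelated sequence
)
```

The don't-care positions are what reduce the statistical dependence between adjacent word matches; the multiple-pattern variant simply concatenates count vectors from several patterns before computing distances.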

  19. Who do we think we are? Analysing the content and form of identity work in the English National Health Service.

    PubMed

    McDermott, Imelda; Checkland, Kath; Harrison, Stephen; Snow, Stephanie; Coleman, Anna

    2013-01-01

    The language used by National Health Service (NHS) "commissioning" managers when discussing their roles and responsibilities can be seen as a manifestation of "identity work", defined as a process of identifying. This paper aims to offer a novel approach to analysing "identity work" by triangulation of multiple analytical methods, combining analysis of the content of text with analysis of its form. Fairclough's discourse analytic methodology is used as a framework. Following Fairclough, the authors use analytical methods associated with Halliday's systemic functional linguistics. While analysis of the content of interviews provides some information about NHS Commissioners' perceptions of their roles and responsibilities, analysis of the form of discourse that they use provides a more detailed and nuanced view. Overall, the authors found that commissioning managers have a higher level of certainty about what commissioning is not rather than what commissioning is; GP managers have a high level of certainty of their identity as a GP rather than as a manager; and both GP managers and non-GP managers oscillate between multiple identities depending on the different situations they are in. This paper offers a novel approach to triangulation, based not on the usual comparison of multiple data sources, but rather based on the application of multiple analytical methods to a single source of data. This paper also shows the latent uncertainty about the nature of commissioning enterprise in the English NHS.

  20. Meta-analysis with missing study-level sample variance data.

    PubMed

    Chowdhry, Amit K; Dworkin, Robert H; McDermott, Michael P

    2016-07-30

    We consider a study-level meta-analysis with a normally distributed outcome variable and possibly unequal study-level variances, where the object of inference is the difference in means between a treatment and control group. A common complication in such an analysis is missing sample variances for some studies. A frequently used approach is to impute the weighted (by sample size) mean of the observed variances (mean imputation). Another approach is to include only those studies with variances reported (complete case analysis). Both mean imputation and complete case analysis are only valid under the missing-completely-at-random assumption, and even then the inverse variance weights produced are not necessarily optimal. We propose a multiple imputation method employing gamma meta-regression to impute the missing sample variances. Our method takes advantage of study-level covariates that may be used to provide information about the missing data. Through simulation studies, we show that multiple imputation, when the imputation model is correctly specified, is superior to competing methods in terms of confidence interval coverage probability and type I error probability when testing a specified group difference. Finally, we describe a similar approach to handling missing variances in cross-over studies. Copyright © 2016 John Wiley & Sons, Ltd.
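The mean-imputation baseline that this record critiques (not the proposed gamma meta-regression method) is easy to state concretely: fill each missing variance with the sample-size-weighted mean of the observed variances, then pool by inverse variance. A minimal sketch with hypothetical study data:

```python
import numpy as np

def pooled_effect(effects, variances, sample_sizes):
    """Fixed-effect inverse-variance meta-analysis; studies with a missing
    sample variance (np.nan) receive the sample-size-weighted mean of the
    observed variances ('mean imputation')."""
    effects = np.asarray(effects, float)
    variances = np.asarray(variances, float)
    n = np.asarray(sample_sizes, float)
    observed = ~np.isnan(variances)
    fill = np.average(variances[observed], weights=n[observed])
    w = 1.0 / np.where(observed, variances, fill)
    estimate = np.sum(w * effects) / np.sum(w)
    return float(estimate), float(np.sqrt(1.0 / np.sum(w)))

# Hypothetical studies; the second study did not report its variance.
est, se = pooled_effect([0.5, 0.3, 0.7], [0.04, np.nan, 0.09], [100, 80, 50])
```

As the abstract notes, this single-fill approach ignores imputation uncertainty, which is precisely what multiple imputation is designed to propagate.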

  1. Modelling multiple sources of dissemination bias in meta-analysis.

    PubMed

    Bowden, Jack; Jackson, Dan; Thompson, Simon G

    2010-03-30

    Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
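A standard single-mechanism diagnostic for the funnel-plot asymmetry discussed above is an Egger-type regression of the standardized effect on precision; the selection models in this record generalize beyond such a single mechanism. A minimal sketch with deterministic toy data:

```python
import numpy as np

def egger_intercept(effects, std_errors):
    """Egger-style asymmetry check: regress effect/SE on 1/SE.
    An intercept far from zero signals funnel-plot asymmetry."""
    se = np.asarray(std_errors, float)
    y = np.asarray(effects, float) / se
    slope, intercept = np.polyfit(1.0 / se, y, 1)
    return intercept

se = np.linspace(0.1, 0.5, 9)        # nine studies of varying precision
symmetric = np.full(9, 0.2)          # every study observes the true effect 0.2
inflated = 0.2 + se                  # less precise studies report larger effects
```

With no selection the intercept is zero; when small-study effects are inflated in proportion to their standard error, the intercept recovers that inflation exactly in this noise-free construction.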

  2. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    PubMed

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
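The weighted least squares regression proposed in this record can be illustrated in miniature: regress cluster-level mean outcomes on an intervention indicator and a baseline covariate, weighting clusters by size. A minimal sketch with noise-free hypothetical data (the paper's estimator additionally handles sequential randomization weights):

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: solve (X'WX) beta = X'W y."""
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)

# Six clusters: columns = intercept, intervention indicator, baseline covariate.
treat = np.array([0, 0, 0, 1, 1, 1], float)
baseline = np.array([1.0, 2.0, 3.0, 1.5, 2.5, 3.5])
sizes = np.array([10, 20, 15, 12, 18, 25], float)   # weights = cluster sizes
y = 2.0 + 1.5 * treat + 0.5 * baseline              # noise-free cluster means
X = np.column_stack([np.ones(6), treat, baseline])
beta = wls(X, y, sizes)
```

The baseline covariate column is the point emphasized in the abstract: adjusting for it at the cluster level is often critical for precision in cluster-randomized designs.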

  3. A new breast cancer risk analysis approach using features extracted from multiple sub-regions on bilateral mammograms

    NASA Astrophysics Data System (ADS)

    Sun, Wenqing; Tseng, Tzu-Liang B.; Zheng, Bin; Zhang, Jianying; Qian, Wei

    2015-03-01

    A novel breast cancer risk analysis approach is proposed for enhancing performance of computerized breast cancer risk analysis using bilateral mammograms. Based on the intensity of breast area, five different sub-regions were acquired from one mammogram, and bilateral features were extracted from every sub-region. Our dataset includes 180 bilateral mammograms from 180 women who underwent routine screening examinations, all interpreted as negative and not recalled by the radiologists during the original screening procedures. A computerized breast cancer risk analysis scheme using four image processing modules, including sub-region segmentation, bilateral feature extraction, feature selection, and classification was designed to detect and compute image feature asymmetry between the left and right breasts imaged on the mammograms. The highest computed area under the curve (AUC) is 0.763 ± 0.021 when applying the multiple sub-region features to our testing dataset. The positive predictive value and the negative predictive value were 0.60 and 0.73, respectively. The study demonstrates that (1) features extracted from multiple sub-regions can improve the performance of our scheme compared to using features from whole breast area only; (2) a classifier using asymmetry bilateral features can effectively predict breast cancer risk; (3) incorporating texture and morphological features with density features can boost the classification accuracy.

  4. Performance-Approach Goal Effects Depend on How They Are Defined: Meta-Analytic Evidence from Multiple Educational Outcomes

    ERIC Educational Resources Information Center

    Senko, Corwin; Dawson, Blair

    2017-01-01

    Achievement goal theory originally defined performance-approach goals as striving to demonstrate competence to outsiders by outperforming peers. The research, however, has operationalized the goals inconsistently, emphasizing the competence demonstration element in some cases and the peer comparison element in others. A meta-analysis by Hulleman…

  5. Do It Right! Requiring Multiple Submissions of Math and NMR Analysis Assignments in the Laboratory

    ERIC Educational Resources Information Center

    Slade, David J.

    2017-01-01

    The first-semester introductory organic chemistry laboratory has been adapted to include mini postlab assignments that students must complete correctly, through as many attempts as prove to be necessary. The use of multiple drafts of writing assignments is a standard approach to improving writing, so the system was designed to require drafts for…

  6. FIA: An Open Forensic Integration Architecture for Composing Digital Evidence

    NASA Astrophysics Data System (ADS)

    Raghavan, Sriram; Clark, Andrew; Mohay, George

    The analysis and value of digital evidence in an investigation has been the domain of discourse in the digital forensic community for several years. While many works have considered different approaches to model digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cyber crimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. In this paper, we present the forensic integration architecture (FIA) which provides a framework for abstracting the evidence source and storage format information from digital evidence and explores the concept of integrating evidence information from multiple sources. The FIA architecture identifies evidence information from multiple sources that enables an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology independent approach. FIA is also open and extensible making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value it brings into the field.

  7. Linear mixed-effects modeling approach to FMRI group analysis

    PubMed Central

    Chen, Gang; Saad, Ziad S.; Britton, Jennifer C.; Pine, Daniel S.; Cox, Robert W.

    2013-01-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance–covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would be otherwise either difficult or unfeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance–covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity for activation detection. The importance of hypothesis formulation is also illustrated in the simulations. Comparisons with alternative group analysis approaches and the limitations of LME are discussed in detail. PMID:23376789
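The ICC mentioned in this record can be illustrated without a full LME fit by the classical one-way random-effects ANOVA estimator on balanced data (a much simpler stand-in for the crossed-random-effects LME approach the paper describes; data below are simulated and hypothetical):

```python
import numpy as np

def icc_oneway(data):
    """ICC(1) via one-way random-effects ANOVA on a balanced array
    (rows = subjects, columns = repeated measurements)."""
    data = np.asarray(data, float)
    k, n = data.shape
    row_means = data.mean(axis=1)
    msb = n * np.sum((row_means - data.mean()) ** 2) / (k - 1)   # between-subject MS
    msw = np.sum((data - row_means[:, None]) ** 2) / (k * (n - 1))  # within-subject MS
    return (msb - msw) / (msb + (n - 1) * msw)

# Simulated data: subject-effect SD 1 and residual SD 1, so true ICC = 0.5.
rng = np.random.default_rng(2)
data = rng.normal(0, 1, (200, 1)) + rng.normal(0, 1, (200, 5))
icc = icc_oneway(data)
```

The LME formulation generalizes this variance-components ratio to unbalanced designs and to models with confounding fixed effects, which the simple ANOVA estimator cannot accommodate.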

  8. Linear mixed-effects modeling approach to FMRI group analysis.

    PubMed

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with a Student-type t-test, regression, or standard AN(C)OVA, in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at the group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) modeling continuous explanatory variables (covariates) in the presence of a within-subject (repeated-measures) factor, multiple subject-grouping (between-subjects) factors, or a mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with the hemodynamic response (HDR) function estimated through multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to handle many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be either difficult or infeasible under traditional frameworks such as AN(C)OVA and the general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. Intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that LME modeling keeps a balance between control of false positives and sensitivity for activation detection. The importance of hypothesis formulation is also illustrated in the simulations. Comparisons with alternative group analysis approaches and the limitations of LME are discussed in detail. Published by Elsevier Inc.
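A generic illustration of the intraclass correlation mentioned above, computed here from a one-way random-effects ANOVA decomposition rather than the crossed-random-effects LME estimator of the paper; the function name and data are made up for the sketch.

```python
# One-way random-effects ICC from between- and within-subject mean squares:
# ICC(1) = (MSB - MSW) / (MSB + (k - 1) * MSW) for a balanced design with
# n subjects and k repeated measurements each. Toy numbers, not fMRI estimates.
from statistics import mean

def icc_oneway(groups):
    k = len(groups[0])                      # measurements per subject
    n = len(groups)                         # number of subjects
    grand = mean(x for g in groups for x in g)
    msb = k * sum((mean(g) - grand) ** 2 for g in groups) / (n - 1)
    msw = sum((x - mean(g)) ** 2 for g in groups for x in g) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Two sessions per subject; sessions agree closely, so the ICC is near 1.
data = [(10.1, 9.9), (15.2, 14.8), (20.0, 20.4), (24.9, 25.1)]
print(round(icc_oneway(data), 3))  # → 0.999
```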

  9. Comparing multiple imputation methods for systematically missing subject-level data.

    PubMed

    Kline, David; Andridge, Rebecca; Kaizar, Eloise

    2017-06-01

    When conducting a research synthesis, the studies to be combined often do not measure the same set of variables, which creates missing data. When the studies are longitudinal, missing data can occur at the observation level (time-varying) or the subject level (non-time-varying). Traditionally, the focus of missing data methods for longitudinal data has been on missing observation-level variables. In this paper, we focus on missing subject-level variables and compare two multiple imputation approaches: a joint modeling approach and a sequential conditional modeling approach. We find the joint modeling approach preferable to the sequential conditional approach, except when the covariance structure of the repeated outcome for each individual has homogeneous variance and exchangeable correlation. Specifically, the regression coefficient estimates from an analysis incorporating imputed values based on the sequential conditional method are attenuated and less efficient than those from the joint method. Remarkably, the estimates from the sequential conditional method are often less efficient than a complete case analysis, which, in the context of research synthesis, implies that we lose efficiency by combining studies. Copyright © 2015 John Wiley & Sons, Ltd.
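Both imputation approaches compared above feed into the same final step; as a generic sketch of that pooling step (Rubin's combining rules, with invented numbers rather than values from the cited study):

```python
# Rubin's rules for pooling m completed-data analyses: the pooled estimate
# is the mean of the per-imputation estimates, and the total variance adds
# a between-imputation penalty to the average within-imputation variance.
from statistics import mean, variance

def pool(estimates, variances):
    m = len(estimates)
    w = mean(variances)                    # within-imputation variance
    b = variance(estimates)                # between-imputation variance
    return mean(estimates), w + (1 + 1 / m) * b

# Five imputations of a regression coefficient (made-up values).
est, total_var = pool([1.02, 0.97, 1.05, 0.99, 1.01],
                      [0.04, 0.05, 0.04, 0.05, 0.04])
print(round(est, 3), round(total_var, 4))  # → 1.008 0.0451
```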

  10. Targeted versus statistical approaches to selecting parameters for modelling sediment provenance

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick

    2017-04-01

    One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps in this approach is selecting the appropriate suite of parameters, or fingerprints, used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. To be effective tracers of sediment, properties must discriminate between sources whilst behaving conservatively: their values should remain constant through sediment detachment, transportation and deposition, or at the very least vary in a predictable and measurable way. One approach to selecting conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination statistics (e.g. Kruskal-Wallis H-test, Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward, and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling.
Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
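As a toy illustration of the un-mixing step that such fingerprinting models perform (a sketch under simplified assumptions, not any published model): with exactly two sources, each conservative tracer gives a source-A fraction from c_mix = w*c_A + (1 - w)*c_B, and the per-tracer estimates can be averaged.

```python
# Hypothetical two-source mixing calculation: for each tracer, solve
# w = (c_mix - c_B) / (c_A - c_B), clip to [0, 1], and average over tracers.
from statistics import mean

def source_fraction(mix, src_a, src_b):
    ws = []
    for cm, ca, cb in zip(mix, src_a, src_b):
        w = (cm - cb) / (ca - cb)
        ws.append(min(1.0, max(0.0, w)))
    return mean(ws)

# Three tracer concentrations for surface (A) and subsurface (B) soils and
# the downstream sediment mixture (all values invented for the example).
surface    = [120.0, 8.0, 0.50]
subsurface = [40.0,  2.0, 0.10]
mixture    = [80.0,  5.0, 0.30]
print(round(source_fraction(mixture, surface, subsurface), 2))  # → 0.5
```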

  11. Whole-of-society approach for public health policymaking: a case study of polycentric governance from Quebec, Canada.

    PubMed

    Addy, Nii A; Poirier, Alain; Blouin, Chantal; Drager, Nick; Dubé, Laurette

    2014-12-01

    In adopting a whole-of-society (WoS) approach that engages multiple stakeholders in public health policies across contexts, the authors propose that effective governance presents a challenge. The purpose of this paper is to highlight a case for how polycentric governance underlying the WoS approach is already functioning, while outlining an agenda to enable adaptive learning for improving such governance processes. Drawing upon a case study from Quebec, Canada, we employ empirically developed concepts from extensive, decades-long work of the 2009 Nobel laureate Elinor Ostrom in the governance of policy in nonhealth domains to analyze early efforts at polycentric governance in policies around overnutrition, highlighting interactions between international, domestic, state and nonstate actors and processes. Using information from primary and secondary sources, we analyze the emergence of the broader policy context of Quebec's public health system in the 20th century. We present a microsituational analysis of the WoS approach for Quebec's 21st century policies on healthy lifestyles, emphasizing the role of governance at the community level. We argue for rethinking prescriptive policy analysis of the 20th century, proposing an agenda for diagnostic policy analysis, which explicates the multiple sets of actors and interacting variables shaping polycentric governance for operationalizing the WoS approach to policymaking in specific contexts. © 2014 New York Academy of Sciences.

  12. Addressing small sample size bias in multiple-biomarker trials: Inclusion of biomarker-negative patients and Firth correction.

    PubMed

    Habermehl, Christina; Benner, Axel; Kopp-Schneider, Annette

    2018-03-01

    In recent years, numerous approaches for biomarker-based clinical trials have been developed. One of these developments is the multiple-biomarker trial, which aims to investigate multiple biomarkers simultaneously in independent subtrials. For low-prevalence biomarkers, small sample sizes within the subtrials have to be expected, as well as many biomarker-negative patients at the screening stage. The small sample sizes may make it infeasible to analyze the subtrials individually. This imposes the need to develop new approaches for the analysis of such trials. With an expected large group of biomarker-negative patients, it seems reasonable to explore options to benefit from including them in such trials. We consider advantages and disadvantages of the inclusion of biomarker-negative patients in a multiple-biomarker trial with a survival endpoint. We discuss design options that include biomarker-negative patients in the study and address the issue of small sample size bias in such trials. We carry out a simulation study for a design where biomarker-negative patients are kept in the study and are treated with standard of care. We compare three different analysis approaches based on the Cox model to examine whether the inclusion of biomarker-negative patients can provide a benefit with respect to bias and variance of the treatment effect estimates. We apply the Firth correction to reduce the small sample size bias. The results of the simulation study suggest that for small sample situations, the Firth correction should be applied to adjust for the small sample size bias. In addition to the Firth penalty, the inclusion of biomarker-negative patients in the analysis can lead to further, but small, improvements in bias and standard deviation of the estimates. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
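A minimal sketch of the idea behind the Firth correction, reduced to the simplest possible case (an intercept-only logistic model, not the Cox models of the study above): penalizing the likelihood by the Jeffreys prior amounts to adding half an event and half a non-event, which keeps small-sample estimates away from the 0/1 boundary.

```python
# Intercept-only logistic model: the maximum-likelihood event rate is y/n,
# while the Firth-penalized estimate works out to (y + 0.5) / (n + 1).
def mle_rate(events, n):
    return events / n

def firth_rate(events, n):
    return (events + 0.5) / (n + 1)

# 0 responders among 10 biomarker-positive patients: the MLE collapses to
# the boundary, while the penalized estimate stays strictly positive.
print(mle_rate(0, 10), round(firth_rate(0, 10), 3))  # → 0.0 0.045
```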

  13. A Quantile Regression Approach to Understanding the Relations Between Morphological Awareness, Vocabulary, and Reading Comprehension in Adult Basic Education Students

    PubMed Central

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2015-01-01

    The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in Adult Basic Education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. PMID:25351773
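The idea behind the quantile regression used above can be sketched in miniature: minimizing the "check" (pinball) loss at level tau over a constant recovers the tau-th sample quantile, where squared loss would recover the mean. A hypothetical intercept-only example, not the study's models:

```python
# Check (pinball) loss at level tau; its minimizer over a constant is the
# tau-th sample quantile, so different tau values probe different parts of
# the outcome distribution.
def check_loss(q, data, tau):
    return sum(tau * (y - q) if y >= q else (1 - tau) * (q - y) for y in data)

def fit_quantile(data, tau):
    # Intercept-only model: a minimizer always lies at one of the data points.
    return min(data, key=lambda q: check_loss(q, data, tau))

scores = [3, 5, 7, 9, 11, 13, 15, 17, 19]
print(fit_quantile(scores, 0.1), fit_quantile(scores, 0.5),
      fit_quantile(scores, 0.9))  # → 3 11 19
```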

  14. Instantiating the multiple levels of analysis perspective in a program of study on externalizing behavior

    PubMed Central

    Beauchaine, Theodore P.; Gatzke-Kopp, Lisa M.

    2014-01-01

    During the last quarter century, developmental psychopathology has become increasingly inclusive and now spans disciplines ranging from psychiatric genetics to primary prevention. As a result, developmental psychopathologists have extended traditional diathesis–stress and transactional models to include causal processes at and across all relevant levels of analysis. Such research is embodied in what is known as the multiple levels of analysis perspective. We describe how multiple levels of analysis research has informed our current thinking about antisocial and borderline personality development among trait impulsive and therefore vulnerable individuals. Our approach extends the multiple levels of analysis perspective beyond simple Biology × Environment interactions by evaluating impulsivity across physiological systems (genetic, autonomic, hormonal, neural), psychological constructs (social, affective, motivational), developmental epochs (preschool, middle childhood, adolescence, adulthood), sexes (male, female), and methods of inquiry (self-report, informant report, treatment outcome, cardiovascular, electrophysiological, neuroimaging). By conducting our research using any and all available methods across these levels of analysis, we have arrived at a developmental model of trait impulsivity that we believe confers a greater understanding of this highly heritable trait and captures at least some heterogeneity in key behavioral outcomes, including delinquency and suicide. PMID:22781868

  15. Analysis of Advanced Modular Power Systems (AMPS) for Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard; Soeder, James F.; Beach, Ray

    2014-01-01

    The Advanced Modular Power Systems (AMPS) project is developing a modular approach to spacecraft power systems for exploration beyond Earth orbit. AMPS is intended to meet the need to reduce the cost of design, development, test, and integration, and also to reduce the operational logistics cost of supporting exploration missions. AMPS seeks to establish modular power building blocks with standardized electrical, mechanical, thermal, and data interfaces that can be applied across multiple exploration vehicles. The presentation discusses the results of a cost analysis that compares the cost of the modular approach against a traditional non-modular approach.

  16. Resolving the double tension: Toward a new approach to measurement modeling in cross-national research

    NASA Astrophysics Data System (ADS)

    Medina, Tait Runnfeldt

    The increasing global reach of survey research provides sociologists with new opportunities to pursue theory building and refinement through comparative analysis. However, comparison across a broad array of diverse contexts introduces methodological complexities related to the development of constructs (i.e., measurement modeling) that if not adequately recognized and properly addressed undermine the quality of research findings and cast doubt on the validity of substantive conclusions. The motivation for this dissertation arises from a concern that the availability of cross-national survey data has outpaced sociologists' ability to appropriately analyze and draw meaningful conclusions from such data. I examine the implicit assumptions and detail the limitations of three commonly used measurement models in cross-national analysis---summative scale, pooled factor model, and multiple-group factor model with measurement invariance. Using the orienting lens of the double tension I argue that a new approach to measurement modeling that incorporates important cross-national differences into the measurement process is needed. Two such measurement models---multiple-group factor model with partial measurement invariance (Byrne, Shavelson and Muthen 1989) and the alignment method (Asparouhov and Muthen 2014; Muthen and Asparouhov 2014)---are discussed in detail and illustrated using a sociologically relevant substantive example. I demonstrate that the former approach is vulnerable to an identification problem that arbitrarily impacts substantive conclusions. I conclude that the alignment method is built on model assumptions that are consistent with theoretical understandings of cross-national comparability and provides an approach to measurement modeling and construct development that is uniquely suited for cross-national research. 
The dissertation makes three major contributions: First, it provides theoretical justification for a new cross-national measurement model and explicates a link between theoretical conceptions of cross-national comparability and a statistical method. Second, it provides a clear and detailed discussion of model identification in multiple-group confirmatory factor analysis that is missing from the literature. This discussion sets the stage for the introduction of the identification problem within multiple-group confirmatory factor analysis with partial measurement invariance and the alternative approach to model identification employed by the alignment method. Third, it offers the first pedagogical presentation of the alignment method using a sociologically relevant example.

  17. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  18. A Bifactor Approach to Model Multifaceted Constructs in Statistical Mediation Analysis

    ERIC Educational Resources Information Center

    Gonzalez, Oscar; MacKinnon, David P.

    2018-01-01

    Statistical mediation analysis allows researchers to identify the most important mediating constructs in the causal process studied. Identifying specific mediators is especially relevant when the hypothesized mediating construct consists of multiple related facets. The general definition of the construct and its facets might relate differently to…

  19. Multiscale Measurement of Extreme Response Style

    ERIC Educational Resources Information Center

    Bolt, Daniel M.; Newton, Joseph R.

    2011-01-01

    This article extends a methodological approach considered by Bolt and Johnson for the measurement and control of extreme response style (ERS) to the analysis of rating data from multiple scales. Specifically, it is shown how the simultaneous analysis of item responses across scales allows for more accurate identification of ERS, and more effective…

  20. Precision Efficacy Analysis for Regression.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.

    When multiple linear regression is used to develop a prediction model, sample size must be large enough to ensure stable coefficients. If the derivation sample size is inadequate, the model may not predict well for future subjects. The precision efficacy analysis for regression (PEAR) method uses a cross- validity approach to select sample sizes…

  1. Meta-Analysis of Scale Reliability Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2013-01-01

    A latent variable modeling approach is outlined that can be used for meta-analysis of reliability coefficients of multicomponent measuring instruments. Important limitations of efforts to combine composite reliability findings across multiple studies are initially pointed out. A reliability synthesis procedure is discussed that is based on…

  2. Apportioning Sources of Riverine Nitrogen at Multiple Watershed Scales

    NASA Astrophysics Data System (ADS)

    Boyer, E. W.; Alexander, R. B.; Sebestyen, S. D.

    2005-05-01

    Loadings of reactive nitrogen (N) entering terrestrial landscapes have increased in recent decades due to anthropogenic activities associated with food and energy production. In the northeastern USA, this enhanced supply of N has been linked to many environmental concerns in both terrestrial and aquatic ecosystems, such as forest decline, lake and stream acidification, human respiratory problems, and coastal eutrophication. Thus N is a priority pollutant with regard to a whole host of air, land, and water quality issues, highlighting the need for methods to identify and quantify various N sources. Further, understanding precursor sources of N is critical to current and proposed public policies targeted at the reduction of N inputs to the terrestrial landscape and receiving waters. We present results from published and ongoing studies using multiple approaches to fingerprint sources of N in the northeastern USA, at watershed scales ranging from the headwaters to the coastal zone. The approaches include: 1) a mass balance model with a nitrogen-budgeting approach for analyses of large watersheds; 2) a spatially-referenced regression model with an empirical modeling approach for analyses of water quality at regional scales; and 3) a meta-analysis of monitoring data with a chemical tracer approach, utilizing concentrations of multiple elements and isotopic composition of N from water samples collected in the streams and rivers. We discuss the successes and limitations of these various approaches for apportioning contributions of N from multiple sources to receiving waters at regional scales.

  3. Comparison of the phenolic composition of fruit juices by single step gradient HPLC analysis of multiple components versus multiple chromatographic runs optimised for individual families.

    PubMed

    Bremner, P D; Blacklock, C J; Paganga, G; Mullen, W; Rice-Evans, C A; Crozier, A

    2000-06-01

    After minimal sample preparation, two different HPLC methodologies, one based on a single gradient reversed-phase HPLC step, the other on multiple HPLC runs each optimised for specific components, were used to investigate the composition of flavonoids and phenolic acids in apple and tomato juices. The principal components in apple juice were identified as chlorogenic acid, phloridzin, caffeic acid and p-coumaric acid. Tomato juice was found to contain chlorogenic acid, caffeic acid, p-coumaric acid, naringenin and rutin. The quantitative estimates of the levels of these compounds, obtained with the two HPLC procedures, were very similar, demonstrating that either method can be used to accurately analyse the phenolic components of apple and tomato juices. Chlorogenic acid in tomato juice was the only component not fully resolved in the single-run study and the multiple-run analysis prior to enzyme treatment. The single-run system of analysis is recommended for the initial investigation of plant phenolics, and the multiple-run approach for analyses where chromatographic resolution requires improvement.

  4. Taking Multiple Exposure Into Account Can Improve Assessment of Chemical Risks.

    PubMed

    Clerc, Frédéric; Bertrand, Nicolas Jean Hyacinthe; La Rocca, Bénédicte

    2017-12-15

    During work, operators may be exposed to several chemicals simultaneously. Most exposure assessment approaches only determine exposure levels for each substance individually. However, such individual-substance approaches may not correctly estimate the toxicity of 'cocktails' of chemicals, as the toxicity of a cocktail may differ from the toxicity of the substances on their own. This study presents an approach that can better take multiple exposure into account when assessing chemical risks. Almost 30,000 work situations, monitored between 2005 and 2014 and recorded in two French databases, were analysed using MiXie software. The algorithms employed in MiXie can identify toxicological classes associated with several substances, based on the additivity of the selected effects of each substance. The results of our retrospective analysis show that MiXie was able to identify almost 20% more potentially hazardous situations than the single-substance approach. It therefore appears essential to review the ways in which multiple exposure is taken into account during risk assessment. © The Author(s) 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
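A hedged sketch of the additivity rule that MiXie-style tools apply (the substances and limit values here are hypothetical): within one toxicological class, the hazard index is the sum of each exposure divided by its limit value, and a combined index above 1 flags the situation even when every single substance is below its own limit.

```python
# Additivity rule for a single toxicological class: HI = sum(C_i / OEL_i).
def hazard_index(exposures, limits):
    return sum(c / l for c, l in zip(exposures, limits))

# Two solvents, each at 60% of its own (hypothetical) occupational exposure
# limit: neither exceeds its limit alone, but the combined index does.
measured = [30.0, 120.0]   # measured concentrations, mg/m3
limits   = [50.0, 200.0]   # limit values, mg/m3
hi = hazard_index(measured, limits)
print(round(hi, 2), "flagged" if hi > 1 else "ok")  # → 1.2 flagged
```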

  5. Analytic Methods for Evaluating Patterns of Multiple Congenital Anomalies in Birth Defect Registries.

    PubMed

    Agopian, A J; Evans, Jane A; Lupo, Philip J

    2018-01-15

    It is estimated that 20 to 30% of infants with birth defects have two or more birth defects. Among these infants with multiple congenital anomalies (MCA), co-occurring anomalies may represent either chance (i.e., unrelated etiologies) or pathogenically associated patterns of anomalies. While some MCA patterns have been recognized and described (e.g., known syndromes), others have not been identified or characterized. Elucidating these patterns may result in a better understanding of the etiologies of these MCAs. This article reviews the literature with regard to analytic methods that have been used to evaluate patterns of MCAs, in particular those using birth defect registry data. A popular method for MCA assessment involves a comparison of the observed to expected ratio for a given combination of MCAs, or one of several modified versions of this comparison. Other methods include the use of numerical taxonomy or other clustering techniques, multiple regression analysis, and log-linear analysis. Advantages and disadvantages of these approaches, as well as specific applications, are outlined; thus, the selection of an analytic approach may depend on several considerations. Despite the availability of multiple analytic approaches, relatively few MCA combinations have been assessed. The availability of large birth defects registries, and of computing resources that allow for automated, big-data strategies for prioritizing MCA patterns, may provide new avenues for better understanding the co-occurrence of birth defects. Birth Defects Research 110:5-11, 2018. © 2017 Wiley Periodicals, Inc.
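A minimal sketch of the observed-to-expected comparison described above, with made-up counts: under independence, the expected number of infants with both defect A and defect B is n * P(A) * P(B), and an O/E ratio well above 1 suggests the pair co-occurs more often than chance.

```python
# Observed/expected ratio for a pair of co-occurring anomalies, assuming
# independence for the expected count. Counts below are invented.
def oe_ratio(n_total, n_a, n_b, n_both):
    expected = n_total * (n_a / n_total) * (n_b / n_total)
    return n_both / expected

# 100,000 births; defect A in 200, defect B in 500, both defects in 10.
print(round(oe_ratio(100_000, 200, 500, 10), 1))  # → 10.0
```

In practice such ratios are screened with a multiple-testing correction, since a registry yields one comparison per defect pair.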

  6. Subject order-independent group ICA (SOI-GICA) for functional MRI data analysis.

    PubMed

    Zhang, Han; Zuo, Xi-Nian; Ma, Shuang-Ye; Zang, Yu-Feng; Milham, Michael P; Zhu, Chao-Zhe

    2010-07-15

    Independent component analysis (ICA) is a data-driven approach to studying functional magnetic resonance imaging (fMRI) data. In particular, for group analysis of multiple subjects, temporal concatenation group ICA (TC-GICA) is intensively used. However, due to usually limited computational capability, data reduction with principal component analysis (PCA; a standard preprocessing step of ICA decomposition) is difficult to achieve for a large dataset. To overcome this, TC-GICA employs multiple-stage PCA data reduction. Such multiple-stage PCA data reduction, however, leads to variable outputs under different subject concatenation orders. Consequently, the ICA algorithm uses the variable multiple-stage PCA outputs and generates variable decompositions. In this study, a rigorous theoretical analysis was conducted to prove the existence of such variability. Simulated and real fMRI experiments were used to demonstrate the subject-order-induced variability of TC-GICA results under multiple PCA data reductions. To solve this problem, we propose a new subject order-independent group ICA (SOI-GICA). Both simulated and real fMRI data experiments demonstrated the high robustness and accuracy of SOI-GICA results compared to those of traditional TC-GICA. Accordingly, we recommend SOI-GICA for group ICA-based fMRI studies, especially those with large data sets. Copyright 2010 Elsevier Inc. All rights reserved.

  7. Antiwindup analysis and design approaches for MIMO systems

    NASA Technical Reports Server (NTRS)

    Marcopoli, Vincent R.; Phillips, Stephen M.

    1994-01-01

    Performance degradation of multiple-input multiple-output (MIMO) control systems having limited actuators is often handled by augmenting the controller with an antiwindup mechanism, which attempts to maintain system performance when limits are encountered. The goals of this paper are: (1) To develop a method to analyze antiwindup systems to determine precisely what stability and performance degradation is incurred under limited conditions. It is shown that by reformulating limited actuator commands as resulting from multiplicative perturbations to the corresponding controller requests, mu-analysis tools can be utilized to obtain quantitative measures of stability and performance degradation. (2) To propose a linear, time invariant (LTI) criterion on which to base the antiwindup design. These analysis and design methods are illustrated through the evaluation of two competing antiwindup schemes augmenting the controller of a Short Take-Off and Vertical Landing (STOVL) aircraft in transition flight.

  8. Antiwindup analysis and design approaches for MIMO systems

    NASA Technical Reports Server (NTRS)

    Marcopoli, Vincent R.; Phillips, Stephen M.

    1993-01-01

    Performance degradation of multiple-input multiple-output (MIMO) control systems having limited actuators is often handled by augmenting the controller with an antiwindup mechanism, which attempts to maintain system performance when limits are encountered. The goals of this paper are: 1) to develop a method to analyze antiwindup systems to determine precisely what stability and performance degradation is incurred under limited conditions. It is shown that by reformulating limited actuator commands as resulting from multiplicative perturbations to the corresponding controller requests, mu-analysis tools can be utilized to obtain quantitative measures of stability and performance degradation. 2) To propose a linear, time invariant (LTI) criterion on which to base the antiwindup design. These analysis and design methods are illustrated through the evaluation of two competing antiwindup schemes augmenting the controller of a Short Take-Off and Vertical Landing (STOVL) aircraft in transition flight.

  9. Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2000-01-01

    A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.

  10. A Simple Approach to Inference in Covariance Structure Modeling with Missing Data: Bayesian Analysis. Project 2.4, Quantitative Models To Monitor the Status and Progress of Learning and Performance and Their Antecedents.

    ERIC Educational Resources Information Center

    Muthen, Bengt

    This paper investigates methods that avoid using multiple groups to represent the missing data patterns in covariance structure modeling, attempting instead to do a single-group analysis where the only action the analyst has to take is to indicate that data is missing. A new covariance structure approach developed by B. Muthen and G. Arminger is…

  11. A Network Pharmacology Approach to Determine the Active Components and Potential Targets of Curculigo Orchioides in the Treatment of Osteoporosis.

    PubMed

    Wang, Nani; Zhao, Guizhi; Zhang, Yang; Wang, Xuping; Zhao, Lisha; Xu, Pingcui; Shou, Dan

    2017-10-27

    BACKGROUND Osteoporosis is a complex bone disorder with a genetic predisposition, and is a cause of health problems worldwide. In China, Curculigo orchioides (CO) has been widely used as a herbal medicine in the prevention and treatment of osteoporosis. However, research on the mechanism of action of CO is still lacking. The aim of this study was to identify the absorbable components, potential targets, and associated treatment pathways of CO using a network pharmacology approach. MATERIAL AND METHODS We explored the chemical components of CO and used the five main principles of drug absorption to identify absorbable components. Targets for the therapeutic actions of CO were obtained from the PharmMapper server database. Pathway enrichment analysis was performed using the Comparative Toxicogenomics Database (CTD). Cytoscape was used to visualize the multiple components-multiple targets-multiple pathways-multiple diseases network for CO. RESULTS We identified 77 chemical components of CO, of which 32 components could be absorbed in the blood. These potential active components of CO regulated 83 targets and affected 58 pathways. Data analysis showed that the genes for estrogen receptor alpha (ESR1) and beta (ESR2), and the gene for 11 beta-hydroxysteroid dehydrogenase type 1, or cortisone reductase (HSD11B1), were the main targets of CO. Endocrine regulatory factors and factors regulating calcium reabsorption, steroid hormone biosynthesis, and metabolic pathways were related to these main targets and to ten corresponding compounds. CONCLUSIONS The network pharmacology approach used in our study has attempted to explain the mechanisms for the effects of CO in the prevention and treatment of osteoporosis, and provides an alternative approach to the investigation of the effects of this complex compound.

  12. Validated Metrics of Quick Flow Improve Assessments of Streamflow Generation Processes at the Long-Term Sleepers River Research Site

    NASA Astrophysics Data System (ADS)

    Sebestyen, S. D.; Shanley, J. B.

    2015-12-01

    There are multiple approaches to quantifying the quick flow components of streamflow. Physical hydrograph separations of quick flow using recession analysis (RA) are objective, reproducible, and easily calculated for long-duration streamflow records (years to decades). However, this approach has rarely been validated as having a physical basis for interpretation. In contrast, isotopic hydrograph separation (IHS) and end member mixing analysis using multiple solutes (EMMA) have been used to identify flow components and flowpath routing through catchment soils. Nonetheless, these two approaches are limited to the isolated periods (hours to weeks) during which more-intensive grab samples were analyzed, which often makes IHS and EMMA difficult to generalize beyond brief windows of time. At the Sleepers River Research Watershed (SRRW) in northern Vermont, USA, we have data from multiple snowmelt events over a two-decade period and from multiple nested catchments to assess relationships among RA, IHS, and EMMA. Quick flow separations were highly correlated among the three techniques, which shows links among metrics of quick flow, water sources, and flowpath routing in a small (41 ha), forested catchment (W-9). The similarity in responses validates a physical interpretation for a particular RA approach (the Eckhardt recursive RA filter). This validation provides a new tool to estimate new water inputs and flowpath routing for more and longer periods when chemical or isotopic tracers may not have been measured. At three other SRRW catchments, we found similarly strong correlations among the three techniques. Consistent responses across four catchments provide evidence to support other research at the SRRW showing that runoff generation mechanisms are similar despite differences in catchment sizes and land covers.
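A sketch of a recursive baseflow filter of the kind mentioned above (commonly attributed to Eckhardt); the flow series and parameter values are illustrative, not SRRW data. Quick flow is streamflow minus the filtered baseflow, with recession constant a and maximum baseflow index BFImax.

```python
# Recursive two-parameter baseflow filter:
#   b_t = ((1 - BFImax) * a * b_{t-1} + (1 - a) * BFImax * y_t) / (1 - a * BFImax)
# with the physical constraint that baseflow cannot exceed streamflow.
def eckhardt_baseflow(q, a=0.98, bfi_max=0.8):
    b = [q[0]]                               # initialize at the first flow value
    for y in q[1:]:
        bt = ((1 - bfi_max) * a * b[-1] + (1 - a) * bfi_max * y) / (1 - a * bfi_max)
        b.append(min(bt, y))
    return b

flow = [1.0, 1.2, 5.0, 3.0, 2.0, 1.5, 1.2, 1.1]   # illustrative hydrograph
base = eckhardt_baseflow(flow)
quick = [y - b for y, b in zip(flow, base)]
print([round(x, 2) for x in quick])  # → [0.0, 0.2, 3.73, 1.62, 0.6, 0.12, 0.0, 0.0]
```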

  13. Spinal focal lesion detection in multiple myeloma using multimodal image features

    NASA Astrophysics Data System (ADS)

    Fränzle, Andrea; Hillengass, Jens; Bendl, Rolf

    2015-03-01

Multiple myeloma is a tumor disease of the bone marrow that affects the skeleton systemically, i.e. multiple lesions can occur at different sites in the skeleton. Quantifying overall tumor mass, to determine the degree of disease and to analyze therapy response, requires volumetry of all lesions. Because the large number of lesions in one patient precludes manual segmentation of every lesion, quantification of overall tumor volume has not been possible to date; automatic lesion detection and segmentation methods are therefore needed. Since focal tumors in multiple myeloma show different characteristics in different modalities (changes in bone structure in CT images, hypointensity in T1-weighted MR images and hyperintensity in T2-weighted MR images), multimodal image analysis is necessary for the detection of focal tumors. In this paper a pattern recognition approach is presented that identifies focal lesions in lumbar vertebrae based on features from T1- and T2-weighted MR images. Image voxels within bone are classified using random forests based on plain intensities and intensity-derived features (maximum, minimum, mean, median) in a 5 × 5 neighborhood around each voxel from both T1- and T2-weighted MR images. A test data sample of lesions in 8 lumbar vertebrae from 4 multiple myeloma patients is classified at an accuracy of 95% (using a leave-one-patient-out test). The approach provides a reasonable delineation of the example lesions and is an important step towards automatic tumor volume quantification in multiple myeloma.
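The per-voxel feature scheme described (plain intensity plus maximum, minimum, mean and median over a 5 × 5 neighborhood, from both modalities) can be sketched as follows. Array shapes and the random data are stand-ins for registered T1/T2 slices; the resulting matrix would be fed to a random forest classifier:

```python
import numpy as np

# Illustrative sketch of the neighborhood feature extraction described above.
# Data and shapes are assumptions for the example, not the paper's images.
def neighborhood_features(img, size=5):
    """Stack intensity, max, min, mean, median of a size x size window."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    # collect all size*size shifted copies -> shape (size*size, h, w)
    windows = np.stack([padded[i:i + h, j:j + w]
                        for i in range(size) for j in range(size)])
    return np.stack([img,
                     windows.max(axis=0),
                     windows.min(axis=0),
                     windows.mean(axis=0),
                     np.median(windows, axis=0)])

t1 = np.random.rand(32, 32)   # stand-ins for registered T1/T2 slices
t2 = np.random.rand(32, 32)
# 10 features per voxel (5 per modality), flattened to a design matrix:
X = np.concatenate([neighborhood_features(t1),
                    neighborhood_features(t2)]).reshape(10, -1).T
# X (one row per voxel) can now be passed to a random forest classifier.
```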

  14. Bayesian multiple-source localization in an uncertain ocean environment.

    PubMed

    Dosso, Stan E; Wilmut, Michael J

    2011-06-01

    This paper considers simultaneous localization of multiple acoustic sources when properties of the ocean environment (water column and seabed) are poorly known. A Bayesian formulation is developed in which the environmental parameters, noise statistics, and locations and complex strengths (amplitudes and phases) of multiple sources are considered to be unknown random variables constrained by acoustic data and prior information. Two approaches are considered for estimating source parameters. Focalization maximizes the posterior probability density (PPD) over all parameters using adaptive hybrid optimization. Marginalization integrates the PPD using efficient Markov-chain Monte Carlo methods to produce joint marginal probability distributions for source ranges and depths, from which source locations are obtained. This approach also provides quantitative uncertainty analysis for all parameters, which can aid in understanding of the inverse problem and may be of practical interest (e.g., source-strength probability distributions). In both approaches, closed-form maximum-likelihood expressions for source strengths and noise variance at each frequency allow these parameters to be sampled implicitly, substantially reducing the dimensionality and difficulty of the inversion. Examples are presented of both approaches applied to single- and multi-frequency localization of multiple sources in an uncertain shallow-water environment, and a Monte Carlo performance evaluation study is carried out. © 2011 Acoustical Society of America

  15. Elastic-net regularization approaches for genome-wide association studies of rheumatoid arthritis.

    PubMed

    Cho, Seoae; Kim, Haseong; Oh, Sohee; Kim, Kyunga; Park, Taesung

    2009-12-15

The current trend in genome-wide association studies is to identify regions where the true disease-causing genes may lie by evaluating thousands of single-nucleotide polymorphisms (SNPs) across the whole genome. However, many challenges exist in detecting disease-causing genes among the thousands of SNPs, including multicollinearity and multiple testing issues, especially when a large number of correlated SNPs are tested simultaneously. Multicollinearity often occurs when predictor variables in a multiple regression model are highly correlated, and can cause imprecise estimation of association. In this study, we propose a simple stepwise procedure that identifies disease-causing SNPs simultaneously by employing elastic-net regularization, a variable selection method that addresses multicollinearity. At Step 1, single-marker association analysis was conducted to screen SNPs. At Step 2, multiple-marker association was scanned based on the elastic-net regularization. The proposed approach was applied to the rheumatoid arthritis (RA) case-control data set of Genetic Analysis Workshop 16. While the SNPs selected at the screening step are located mostly on chromosome 6, the elastic-net approach identified putative RA-related SNPs on other chromosomes in an increased proportion. For some of these putative RA-related SNPs, we identified interactions with sex, a well-known factor affecting RA susceptibility.
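The grouping effect that motivates the elastic net for correlated SNPs can be seen in a small sketch. This is a naive coordinate-descent implementation on a linear-regression analogue with invented data (a case-control analysis would use a logistic model); the point is that two strongly correlated "SNP" predictors are both retained, where a pure LASSO would tend to keep only one:

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator for the L1 part of the penalty."""
    return np.sign(x) * max(abs(x) - t, 0.0)

def elastic_net(X, y, lam=0.1, alpha=0.5, iters=200):
    """Naive coordinate-descent elastic net (illustrative sketch).

    Minimizes (1/2n)||y - Xb||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||_2^2).
    """
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]      # partial residual
            rho = X[:, j] @ r / n
            z = (X[:, j] @ X[:, j]) / n
            b[j] = soft(rho, lam * alpha) / (z + lam * (1 - alpha))
    return b

rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.standard_normal(n)   # two highly correlated "SNPs"
y = 2.0 * X[:, 0] + 2.0 * X[:, 1] + rng.standard_normal(n)
b = elastic_net(X, y)
# grouping effect: both correlated predictors get similar nonzero weights
```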

  16. A Factor Analytic and Regression Approach to Functional Age: Potential Effects of Race.

    ERIC Educational Resources Information Center

    Colquitt, Alan L.; And Others

    Factor analysis and multiple regression are two major approaches used to look at functional age, which takes account of the extensive variation in the rate of physiological and psychological maturation throughout life. To examine the role of racial or cultural influences on the measurement of functional age, a battery of 12 tests concentrating on…

  17. Multiple constraint analysis of regional land-surface carbon flux

    Treesearch

    D.P. Turner; M. Göckede; B.E. Law; W.D. Ritts; W.B. Cohen; Z. Yang; T. Hudiburg; R. Kennedy; M. Duane

    2011-01-01

We applied and compared bottom-up (process model-based) and top-down (atmospheric inversion-based) scaling approaches to evaluate the spatial and temporal patterns of net ecosystem production (NEP) over a 2.5 × 10⁵ km² area (the state of Oregon) in the western United States. Both approaches indicated a carbon sink over this...

  18. Comparing compound-specific and bulk stable nitrogen isotope trophic discrimination factors across multiple freshwater fish species and diets

    USDA-ARS?s Scientific Manuscript database

    The use of nitrogen stable isotopes for estimation of animal trophic position has become an indispensable approach in food web ecology. Compound-specific isotope analysis of amino acids is a new approach for estimating trophic position that may overcome key issues associated with nitrogen stable iso...

  19. What Makes the Difference? An Analysis of a Reading Intervention Programme Implemented in Rural Schools in Cambodia

    ERIC Educational Resources Information Center

    Courtney, Jane; Gravelle, Maggie

    2014-01-01

    This article compares the existing single-strategy approach towards the teaching of early literacy in schools in rural Cambodia with a multiple-strategy approach introduced as part of a reading intervention programme. Classroom observations, questionnaires and in-depth interviews with teachers were used to explore teachers' practices and…

  20. Application of stable isotope ratio analysis for biodegradation monitoring in groundwater

    USGS Publications Warehouse

    Hatzinger, Paul B.; Böhlke, John Karl; Sturchio, Neil C.

    2013-01-01

    Stable isotope ratio analysis is increasingly being applied as a tool to detect, understand, and quantify biodegradation of organic and inorganic contaminants in groundwater. An important feature of this approach is that it allows degradative losses of contaminants to be distinguished from those caused by non-destructive processes such as dilution, dispersion, and sorption. Recent advances in analytical techniques, and new approaches for interpreting stable isotope data, have expanded the utility of this method while also exposing complications and ambiguities that must be considered in data interpretations. Isotopic analyses of multiple elements in a compound, and multiple compounds in the environment, are being used to distinguish biodegradative pathways by their characteristic isotope effects. Numerical models of contaminant transport, degradation pathways, and isotopic composition are improving quantitative estimates of in situ contaminant degradation rates under realistic environmental conditions.
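In practice, the isotope-based estimate of degradative loss that distinguishes biodegradation from dilution typically comes from the Rayleigh model. A minimal sketch with illustrative numbers (not taken from this paper):

```python
import math

# Rayleigh-model sketch: delta_t ~ delta_0 + epsilon * ln(f), where f is the
# fraction of contaminant remaining. All numbers below are illustrative.
def fraction_remaining(delta_t, delta_0, epsilon):
    """Invert the Rayleigh equation for f (delta values and epsilon in permil)."""
    return math.exp((delta_t - delta_0) / epsilon)

delta_0 = -27.0    # isotope ratio of the contaminant source (permil)
delta_t = -22.0    # downgradient sample, enriched in the heavy isotope
epsilon = -5.0     # enrichment factor for the degradation pathway (permil)

f = fraction_remaining(delta_t, delta_0, epsilon)
biodegraded = 1.0 - f   # extent of degradative loss, unaffected by dilution
```

Because dilution and dispersion do not fractionate isotopes, `biodegraded` reflects destructive loss only, which is the key advantage the abstract describes.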

  1. Customisation of the exome data analysis pipeline using a combinatorial approach.

    PubMed

    Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay

    2012-01-01

The advent of next-generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, this approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives, and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stages, suggesting the most suitable combination determined by a simple framework of pre-existing metrics to create significant datasets.

  2. Accounting for heterogeneity in meta-analysis using a multiplicative model-an empirical study.

    PubMed

    Mawdsley, David; Higgins, Julian P T; Sutton, Alex J; Abrams, Keith R

    2017-03-01

In meta-analysis, the random-effects model is often used to account for heterogeneity. The model assumes that heterogeneity has an additive effect on the variance of effect sizes. An alternative model, which assumes multiplicative heterogeneity, has been little used in the medical statistics community, but is widely used by particle physicists. In this paper, we compare the two models using a random sample of 448 meta-analyses drawn from the Cochrane Database of Systematic Reviews. In general, differences in goodness of fit are modest. The multiplicative model tends to give results that are closer to the null, with a narrower confidence interval. Both approaches make different assumptions about the outcome of the meta-analysis. In our opinion, the selection of the more appropriate model will often be guided by whether the multiplicative model's assumption of a single effect size is plausible. Copyright © 2016 John Wiley & Sons, Ltd.
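The contrast between the two heterogeneity models can be made concrete on toy data: the multiplicative model keeps the fixed-effect point estimate and inflates its variance by a dispersion factor estimated as Q/(k-1), whereas the additive model (here the DerSimonian-Laird estimator) adds a between-study variance tau² into the weights. All numbers are invented for illustration:

```python
# Toy meta-analysis: k = 5 study effect estimates y with within-study
# variances v (invented numbers).
y = [0.30, -0.10, 0.60, 0.05, 0.70]
v = [0.02, 0.03, 0.04, 0.02, 0.05]
k = len(y)

w = [1 / vi for vi in v]
mu = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)      # fixed-effect mean
Q = sum(wi * (yi - mu) ** 2 for wi, yi in zip(w, y))    # Cochran's Q

# Multiplicative model: same point estimate, variance inflated by phi
phi = max(Q / (k - 1), 1.0)
var_mult = phi / sum(w)

# Additive model (DerSimonian-Laird): between-study variance tau2 in weights
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max((Q - (k - 1)) / c, 0.0)
w_re = [1 / (vi + tau2) for vi in v]
mu_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
var_re = 1 / sum(w_re)
```

Note how the additive model shifts the point estimate toward equal study weighting, while the multiplicative model leaves it at the inverse-variance-weighted mean, consistent with the "closer to the null" tendency reported above.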

  3. Laser scanning cytometry for automation of the micronucleus assay

    PubMed Central

    Darzynkiewicz, Zbigniew; Smolewski, Piotr; Holden, Elena; Luther, Ed; Henriksen, Mel; François, Maxime; Leifert, Wayne; Fenech, Michael

    2011-01-01

    Laser scanning cytometry (LSC) provides a novel approach for automated scoring of micronuclei (MN) in different types of mammalian cells, serving as a biomarker of genotoxicity and mutagenicity. In this review, we discuss the advances to date in measuring MN in cell lines, buccal cells and erythrocytes, describe the advantages and outline potential challenges of this distinctive approach of analysis of nuclear anomalies. The use of multiple laser wavelengths in LSC and the high dynamic range of fluorescence and absorption detection allow simultaneous measurement of multiple cellular and nuclear features such as cytoplasmic area, nuclear area, DNA content and density of nuclei and MN, protein content and density of cytoplasm as well as other features using molecular probes. This high-content analysis approach allows the cells of interest to be identified (e.g. binucleated cells in cytokinesis-blocked cultures) and MN scored specifically in them. MN assays in cell lines (e.g. the CHO cell MN assay) using LSC are increasingly used in routine toxicology screening. More high-content MN assays and the expansion of MN analysis by LSC to other models (i.e. exfoliated cells, dermal cell models, etc.) hold great promise for robust and exciting developments in MN assay automation as a high-content high-throughput analysis procedure. PMID:21164197

  4. Conjoint Analysis: A Preference-Based Approach for the Accounting of Multiple Benefits in Southern Forest Management

    Treesearch

    F. Christian Zinkhan; Thomas P. Holmes; D. Evan Mercer

    1997-01-01

    Conjoint analysis, which enables a manager to measure the relative importance of a forest's multidimensional attributes, is critically reviewed and assessed. Special attention is given to the feasibility of using conjoint analysis for measuring the utility of market and nonmarket outputs from southern forests. Also, an application to a case of designing a nature...

  5. When is hub gene selection better than standard meta-analysis?

    PubMed

    Langfelder, Peter; Mischel, Paul S; Horvath, Steve

    2013-01-01

Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to gene expression data and presents novel R functions for carrying out consensus network analysis, network-based screening, and meta-analysis.

  6. One-step generation of complete gene knockout mice and monkeys by CRISPR/Cas9-mediated gene editing with multiple sgRNAs.

    PubMed

    Zuo, Erwei; Cai, Yi-Jun; Li, Kui; Wei, Yu; Wang, Bang-An; Sun, Yidi; Liu, Zhen; Liu, Jiwei; Hu, Xinde; Wei, Wei; Huo, Xiaona; Shi, Linyu; Tang, Cheng; Liang, Dan; Wang, Yan; Nie, Yan-Hong; Zhang, Chen-Chen; Yao, Xuan; Wang, Xing; Zhou, Changyang; Ying, Wenqin; Wang, Qifang; Chen, Ren-Chao; Shen, Qi; Xu, Guo-Liang; Li, Jinsong; Sun, Qiang; Xiong, Zhi-Qi; Yang, Hui

    2017-07-01

    The CRISPR/Cas9 system is an efficient gene-editing method, but the majority of gene-edited animals showed mosaicism, with editing occurring only in a portion of cells. Here we show that single gene or multiple genes can be completely knocked out in mouse and monkey embryos by zygotic injection of Cas9 mRNA and multiple adjacent single-guide RNAs (spaced 10-200 bp apart) that target only a single key exon of each gene. Phenotypic analysis of F0 mice following targeted deletion of eight genes on the Y chromosome individually demonstrated the robustness of this approach in generating knockout mice. Importantly, this approach delivers complete gene knockout at high efficiencies (100% on Arntl and 91% on Prrt2) in monkey embryos. Finally, we could generate a complete Prrt2 knockout monkey in a single step, demonstrating the usefulness of this approach in rapidly establishing gene-edited monkey models.

  7. Mark-recapture with multiple, non-invasive marks.

    PubMed

    Bonner, Simon J; Holmberg, Jason

    2013-09-01

    Non-invasive marks, including pigmentation patterns, acquired scars, and genetic markers, are often used to identify individuals in mark-recapture experiments. If animals in a population can be identified from multiple, non-invasive marks then some individuals may be counted twice in the observed data. Analyzing the observed histories without accounting for these errors will provide incorrect inference about the population dynamics. Previous approaches to this problem include modeling data from only one mark and combining estimators obtained from each mark separately assuming that they are independent. Motivated by the analysis of data from the ECOCEAN online whale shark (Rhincodon typus) catalog, we describe a Bayesian method to analyze data from multiple, non-invasive marks that is based on the latent-multinomial model of Link et al. (2010, Biometrics 66, 178-185). Further to this, we describe a simplification of the Markov chain Monte Carlo algorithm of Link et al. (2010, Biometrics 66, 178-185) that leads to more efficient computation. We present results from the analysis of the ECOCEAN whale shark data and from simulation studies comparing our method with the previous approaches. © 2013, The International Biometric Society.

  8. The multiple decrement life table: a unifying framework for cause-of-death analysis in ecology.

    PubMed

    Carey, James R

    1989-01-01

    The multiple decrement life table is used widely in the human actuarial literature and provides statistical expressions for mortality in three different forms: i) the life table from all causes-of-death combined; ii) the life table disaggregated into selected cause-of-death categories; and iii) the life table with particular causes and combinations of causes eliminated. The purpose of this paper is to introduce the multiple decrement life table to the ecological literature by applying the methods to published death-by-cause information on Rhagoletis pomonella. Interrelations between the current approach and conventional tools used in basic and applied ecology are discussed including the conventional life table, Key Factor Analysis and Abbott's Correction used in toxicological bioassay.
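The three statistical forms listed above can be illustrated on a toy cohort. The sketch below uses invented death counts split into one example cause (parasitism) versus all other causes, and applies the standard constant-force approximation for the cause-deleted table; it is a schematic, not the Rhagoletis pomonella data:

```python
# Toy multiple decrement cohort (numbers invented): deaths per age interval
# for an example cause (parasitism) and all other causes combined.
deaths_parasitism = [100, 150, 200, 100]
deaths_other = [50, 100, 100, 100]

lx_all = [1000]          # survivors with all causes operating (form i)
lx_no_par = [1000.0]     # survivors with parasitism eliminated (form iii)
for d_par, d_oth in zip(deaths_parasitism, deaths_other):
    d = d_par + d_oth
    qx = d / lx_all[-1]                  # all-cause probability of death
    # constant-force approximation for the cause-deleted probability:
    # q_x^(-i) = 1 - (1 - q_x)^(d_other / d_all)
    q_del = 1 - (1 - qx) ** (d_oth / d)
    lx_no_par.append(lx_no_par[-1] * (1 - q_del))
    lx_all.append(lx_all[-1] - d)
```

The cause-disaggregated form (ii) is simply the `deaths_parasitism` / `deaths_other` split itself; eliminating a cause always raises survivorship relative to the all-cause table.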

  9. Regularized rare variant enrichment analysis for case-control exome sequencing data.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2014-02-01

    Rare variants have recently garnered an immense amount of attention in genetic association analysis. However, unlike methods traditionally used for single marker analysis in GWAS, rare variant analysis often requires some method of aggregation, since single marker approaches are poorly powered for typical sequencing study sample sizes. Advancements in sequencing technologies have rendered next-generation sequencing platforms a realistic alternative to traditional genotyping arrays. Exome sequencing in particular not only provides base-level resolution of genetic coding regions, but also a natural paradigm for aggregation via genes and exons. Here, we propose the use of penalized regression in combination with variant aggregation measures to identify rare variant enrichment in exome sequencing data. In contrast to marginal gene-level testing, we simultaneously evaluate the effects of rare variants in multiple genes, focusing on gene-based least absolute shrinkage and selection operator (LASSO) and exon-based sparse group LASSO models. By using gene membership as a grouping variable, the sparse group LASSO can be used as a gene-centric analysis of rare variants while also providing a penalized approach toward identifying specific regions of interest. We apply extensive simulations to evaluate the performance of these approaches with respect to specificity and sensitivity, comparing these results to multiple competing marginal testing methods. Finally, we discuss our findings and outline future research. © 2013 WILEY PERIODICALS, INC.
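The aggregation step that precedes the penalized regression can be sketched simply: per-variant genotype counts are collapsed into one burden score per gene, and the gene assignments then double as the grouping variable for the sparse group LASSO. Genotypes and gene memberships below are invented:

```python
import numpy as np

# Sketch of gene-level burden aggregation for rare variants (invented data).
rng = np.random.default_rng(1)
n_subjects, n_variants = 100, 12
G = rng.binomial(2, 0.01, size=(n_subjects, n_variants))  # rare allele counts
gene_of = np.array([0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2, 2])  # variant -> gene map

burden = np.zeros((n_subjects, 3))
for g in range(3):
    # simple burden score: total rare alleles carried within the gene
    burden[:, g] = G[:, gene_of == g].sum(axis=1)
# 'burden' is the design matrix for a gene-based LASSO; with gene_of as the
# grouping variable, a sparse group LASSO instead penalizes members within
# each gene group jointly.
```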

  10. Mission Analysis for Multiple Rendezvous of Near-Earth Asteroids Using Earth Gravity Assist

    DTIC Science & Technology

    2010-03-01

    devices. Finding solutions with this approach leads to a quicker timeline for possible missions since one does not have to wait for the propulsion...in this research. The discussion focuses on their approach to the problem and the applicability to this research. The headings are the titles of... approach the problem utilizing conventional impulsive thrust propulsion systems and utilize data presented from the JPL website for locating the

  11. Challenges and Support When Teaching Science Through an Integrated Inquiry and Literacy Approach

    NASA Astrophysics Data System (ADS)

    Ødegaard, Marianne; Haug, Berit; Mork, Sonja M.; Ove Sørvik, Gard

    2014-12-01

    In the Budding Science and Literacy project, we explored how working with an integrated inquiry-based science and literacy approach may challenge and support the teaching and learning of science at the classroom level. By studying the inter-relationship between multiple learning modalities and phases of inquiry, we wished to illuminate possible dynamics between science inquiry and literacy in an integrated science approach. Six teachers and their students were recruited from a professional development course for the current classroom study. The teachers were to try out the Budding Science teaching model. This paper presents an overall video analysis of our material demonstrating variations and patterns of inquiry-based science and literacy activities. Our analysis revealed that multiple learning modalities (read it, write it, do it, and talk it) are used in the integrated approach; oral activities dominate. The inquiry phases shifted throughout the students' investigations, but the consolidating phases of discussion and communication were given less space. The data phase of inquiry seems essential as a driving force for engaging in science learning in consolidating situations. The multiple learning modalities were integrated in all inquiry phases, but to a greater extent in preparation and data. Our results indicate that literacy activities embedded in science inquiry provide support for teaching and learning science; however, the greatest challenge for teachers is to find the time and courage to exploit the discussion and communication phases to consolidate the students' conceptual learning.

  12. Modeling Choice Under Uncertainty in Military Systems Analysis

    DTIC Science & Technology

    1991-11-01

operators rather than fuzzy operators. This is suggested for further research. 4.3 ANALYTIC HIERARCHICAL PROCESS (AHP) In AHP, objectives, functions and...14 4.1 IMPRECISELY SPECIFIED MULTIPLE ATTRIBUTE UTILITY THEORY... 14 4.2 FUZZY DECISION ANALYSIS... 14 4.3 ANALYTIC HIERARCHICAL PROCESS (AHP)... 14 4.4 SUBJECTIVE TRANSFER FUNCTION APPROACH

  13. Mass Spectrometry Theatre: A Model for Big-Screen Instrumental Analysis

    ERIC Educational Resources Information Center

    Allison, John

    2008-01-01

    Teaching lecture or lab courses in instrumental analysis can be a source of frustration since one can only crowd a small number of students around a single instrument, typically leading to round-robin approaches. Round-robin labs can spread students into multiple labs and limit instructor-student interactions. We discuss "Mass Spectrometry…

  14. Geopolitical E-Analysis Based on E-Learning Content

    ERIC Educational Resources Information Center

    Dinicu, Anca; Oancea, Romana

    2017-01-01

    In a world of great complexity, understanding the manner states act and react becomes more and more an intriguing quest due to the multiple relations of dependence and interdependence that characterize "the global puzzle". Within this context, an analysis based on a geopolitical approach becomes a very useful means used to determine not…

  15. Composite scores in comparative effectiveness research: counterbalancing parsimony and dimensionality in patient-reported outcomes.

    PubMed

    Schwartz, Carolyn E; Patrick, Donald L

    2014-07-01

    When planning a comparative effectiveness study comparing disease-modifying treatments, competing demands influence choice of outcomes. Current practice emphasizes parsimony, although understanding multidimensional treatment impact can help to personalize medical decision-making. We discuss both sides of this 'tug of war'. We discuss the assumptions, advantages and drawbacks of composite scores and multidimensional outcomes. We describe possible solutions to the multiple comparison problem, including conceptual hierarchy distinctions, statistical approaches, 'real-world' benchmarks of effectiveness and subgroup analysis. We conclude that comparative effectiveness research should consider multiple outcome dimensions and compare different approaches that fit the individual context of study objectives.

  16. Multiple stage MS in analysis of plasma, serum, urine and in vitro samples relevant to clinical and forensic toxicology.

    PubMed

    Meyer, Golo M; Maurer, Hans H; Meyer, Markus R

    2016-01-01

    This paper reviews MS approaches applied to metabolism studies, structure elucidation and qualitative or quantitative screening of drugs (of abuse) and/or their metabolites. Applications in clinical and forensic toxicology were included using blood plasma or serum, urine, in vitro samples, liquids, solids or plant material. Techniques covered are liquid chromatography coupled to low-resolution and high-resolution multiple stage mass analyzers. Only PubMed listed studies published in English between January 2008 and January 2015 were considered. Approaches are discussed focusing on sample preparation and mass spectral settings. Comments on advantages and limitations of these techniques complete the review.

  17. An efficient Bayesian meta-analysis approach for studying cross-phenotype genetic associations

    PubMed Central

    Majumdar, Arunabha; Haldar, Tanushree; Bhattacharya, Sourabh; Witte, John S.

    2018-01-01

    Simultaneous analysis of genetic associations with multiple phenotypes may reveal shared genetic susceptibility across traits (pleiotropy). For a locus exhibiting overall pleiotropy, it is important to identify which specific traits underlie this association. We propose a Bayesian meta-analysis approach (termed CPBayes) that uses summary-level data across multiple phenotypes to simultaneously measure the evidence of aggregate-level pleiotropic association and estimate an optimal subset of traits associated with the risk locus. This method uses a unified Bayesian statistical framework based on a spike and slab prior. CPBayes performs a fully Bayesian analysis by employing the Markov Chain Monte Carlo (MCMC) technique Gibbs sampling. It takes into account heterogeneity in the size and direction of the genetic effects across traits. It can be applied to both cohort data and separate studies of multiple traits having overlapping or non-overlapping subjects. Simulations show that CPBayes can produce higher accuracy in the selection of associated traits underlying a pleiotropic signal than the subset-based meta-analysis ASSET. We used CPBayes to undertake a genome-wide pleiotropic association study of 22 traits in the large Kaiser GERA cohort and detected six independent pleiotropic loci associated with at least two phenotypes. This includes a locus at chromosomal region 1q24.2 which exhibits an association simultaneously with the risk of five different diseases: Dermatophytosis, Hemorrhoids, Iron Deficiency, Osteoporosis and Peripheral Vascular Disease. We provide an R-package ‘CPBayes’ implementing the proposed method. PMID:29432419

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Robert M; Potok, Thomas E

Assessing the potential property and social impacts of an event, such as a tornado or wildfire, continues to be a challenging research area. From financial markets to disaster management to epidemiology, the importance of understanding the impacts that events create cannot be overstated. Our work describes an approach to fuse information from multiple sources and then to analyze the information cycles to identify prior temporal patterns related to the impact of an event. This approach is then applied to the analysis of news reports from multiple news sources pertaining to several different natural disasters. Results show that our approach can project the severity of the impacts of certain natural disasters, such as heat waves on droughts and wildfires. In addition, results show that specific types of disaster consistently produce similar impacts each time they occur.

  19. Fusion of magnetometer and gradiometer sensors of MEG in the presence of multiplicative error.

    PubMed

    Mohseni, Hamid R; Woolrich, Mark W; Kringelbach, Morten L; Luckhoo, Henry; Smith, Penny Probert; Aziz, Tipu Z

    2012-07-01

    Novel neuroimaging techniques have provided unprecedented information on the structure and function of the living human brain. Multimodal fusion of data from different sensors promises to radically improve this understanding, yet optimal methods have not been developed. Here, we demonstrate a novel method for combining multichannel signals. We show how this method can be used to fuse signals from the magnetometer and gradiometer sensors used in magnetoencephalography (MEG), and through extensive experiments using simulation, head phantom and real MEG data, show that it is both robust and accurate. This new approach works by assuming that the lead fields have multiplicative error. The criterion to estimate the error is given within a spatial filter framework such that the estimated power is minimized in the worst case scenario. The method is compared to, and found better than, existing approaches. The closed-form solution and the conditions under which the multiplicative error can be optimally estimated are provided. This novel approach can also be employed for multimodal fusion of other multichannel signals such as MEG and EEG. Although the multiplicative error is estimated based on beamforming, other methods for source analysis can equally be used after the lead-field modification.

  20. A rapid, automated approach to optimisation of multiple reaction monitoring conditions for quantitative bioanalytical mass spectrometry.

    PubMed

    Higton, D M

    2001-01-01

    An improvement to the procedure for the rapid optimisation of mass spectrometry (PROMS), for the development of multiple reaction monitoring (MRM) methods for quantitative bioanalytical liquid chromatography/tandem mass spectrometry (LC/MS/MS), is presented. PROMS is an automated protocol that uses flow-injection analysis (FIA) and AppleScripts to create methods and acquire the data for optimisation. The protocol determines the optimum orifice potential, the MRM conditions for each compound, and finally creates the MRM methods needed for sample analysis. The sensitivities of the MRM methods created by PROMS approach those created manually. MRM method development using PROMS currently takes less than three minutes per compound compared to at least fifteen minutes manually. To further enhance throughput, approaches to MRM optimisation using one injection per compound, two injections per pool of five compounds and one injection per pool of five compounds have been investigated. No significant difference in the optimised instrumental parameters for MRM methods was found between the original PROMS approach and these new methods, which are up to ten times faster. The time taken for an AppleScript to determine the optimum conditions and build the MRM methods is the same with all approaches. Copyright 2001 John Wiley & Sons, Ltd.

  1. Multiple criteria decision analysis for health technology assessment.

    PubMed

    Thokala, Praveen; Duenas, Alejandra

    2012-12-01

    Multicriteria decision analysis (MCDA) has been suggested by some researchers as a method to capture the benefits beyond quality adjusted life-years in a transparent and consistent manner. The objectives of this article were to analyze the possible application of MCDA approaches in health technology assessment and to describe their relative advantages and disadvantages. This article begins with an introduction to the most common types of MCDA models and a critical review of state-of-the-art methods for incorporating multiple criteria in health technology assessment. An overview of MCDA is provided and is compared against the current UK National Institute for Health and Clinical Excellence health technology appraisal process. A generic MCDA modeling approach is described, and the different MCDA modeling approaches are applied to a hypothetical case study. A comparison of the different MCDA approaches is provided, and the generic issues that need consideration before the application of MCDA in health technology assessment are examined. There are general practical issues that might arise from using an MCDA approach, and it is suggested that appropriate care be taken to ensure the success of MCDA techniques in the appraisal process. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
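
    A generic MCDA model of the kind discussed above typically reduces, at its core, to a weighted sum of criterion scores. The sketch below illustrates only that scoring step; the criteria, weights, and technology scores are invented for illustration and are not taken from the article.

```python
def mcda_score(scores, weights):
    """Linear additive MCDA: weighted sum of criterion scores."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("criterion weights must sum to 1")
    return sum(weights[c] * scores[c] for c in weights)

# illustrative (assumed) weights and normalized 0-1 criterion scores
weights = {"effectiveness": 0.5, "safety": 0.3, "cost": 0.2}
tech_a = {"effectiveness": 0.8, "safety": 0.6, "cost": 0.4}
tech_b = {"effectiveness": 0.6, "safety": 0.9, "cost": 0.7}

print(round(mcda_score(tech_a, weights), 2))  # 0.66
print(round(mcda_score(tech_b, weights), 2))  # 0.71
```

    A full appraisal would add score normalization, weight elicitation, and sensitivity analysis on top of this scoring step.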

  2. FIRE: an SPSS program for variable selection in multiple linear regression analysis via the relative importance of predictors.

    PubMed

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2011-03-01

    We provide an SPSS program that implements currently recommended techniques and recent developments for selecting variables in multiple linear regression analysis via the relative importance of predictors. The approach consists of: (1) optimally splitting the data for cross-validation, (2) selecting the final set of predictors to be retained in the equation regression, and (3) assessing the behavior of the chosen model using standard indices and procedures. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
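
    The three-step workflow described above (split, select by relative importance, validate) can be sketched outside SPSS as well. The following is a simplified stand-in, not the FIRE program itself: it uses squared zero-order correlation on a training split as a crude importance proxy, and the 0.1 retention threshold is an assumption.

```python
import random

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

random.seed(1)
n = 200
x1 = [random.gauss(0, 1) for _ in range(n)]          # strong predictor
x2 = [random.gauss(0, 1) for _ in range(n)]          # weak predictor
y = [2 * a + 0.2 * b + random.gauss(0, 1) for a, b in zip(x1, x2)]

train = slice(0, n // 2)                             # (1) split the data
importance = {p: pearson(v[train], y[train]) ** 2    # crude importance proxy
              for p, v in [("x1", x1), ("x2", x2)]}
retained = [p for p, r2 in importance.items() if r2 > 0.1]  # (2) select
print(retained)
```

    Step (3), assessing the chosen model on the held-out half, would follow the same pattern with standard fit indices.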

  3. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
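
    The "learning mode" / "detection mode" idea can be sketched in minimal form: learn an empirical background distribution from past measurements, then flag new measurements that exceed a percentile-based decision threshold. This is an illustration, not the authors' algorithm; the simulated count rates and the 99.9th-percentile threshold are assumptions.

```python
import random

random.seed(42)

# "learning mode": record many background count-rate measurements
# (Gaussian here purely for illustration of the empirical-distribution idea)
background = sorted(random.gauss(100, 10) for _ in range(10000))

def threshold(samples, quantile=0.999):
    """Decision threshold taken from the empirical background distribution."""
    return samples[int(quantile * (len(samples) - 1))]

t = threshold(background)

# "detection mode": compare a new measurement against the learned threshold
def is_source(measurement):
    return measurement > t

print(is_source(105), is_source(150))
```

    Because the threshold comes from the measured background itself, it automatically adapts to site- and detector-specific variations, which is the point the abstract emphasizes.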

  4. Limitations in Using Multiple Imputation to Harmonize Individual Participant Data for Meta-Analysis.

    PubMed

    Siddique, Juned; de Chavez, Peter J; Howe, George; Cruden, Gracelyn; Brown, C Hendricks

    2018-02-01

    Individual participant data (IPD) meta-analysis is a meta-analysis in which the individual-level data for each study are obtained and used for synthesis. A common challenge in IPD meta-analysis is when variables of interest are measured differently in different studies. The term harmonization has been coined to describe the procedure of placing variables on the same scale in order to permit pooling of data from a large number of studies. Using data from an IPD meta-analysis of 19 adolescent depression trials, we describe a multiple imputation approach for harmonizing 10 depression measures across the 19 trials by treating those depression measures that were not used in a study as missing data. We then apply diagnostics to address the fit of our imputation model. Even after reducing the scale of our application, we were still unable to produce accurate imputations of the missing values. We describe those features of the data that made it difficult to harmonize the depression measures and provide some guidelines for using multiple imputation for harmonization in IPD meta-analysis.
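
    The basic mechanics of imputation-based harmonization can be sketched as follows: one study measured only scale A, another measured both A and B, and the A-to-B relationship observed in the second study is used to impute the missing B scores several times with residual noise. All numbers are invented, and a real analysis would use a principled multiple-imputation model rather than this toy regression.

```python
import random

random.seed(0)

# study 2 measured both depression scales; study 1 measured only scale A
study2_A = [10, 14, 18, 22, 26]
study2_B = [21, 29, 38, 45, 52]        # roughly B = 2*A + 1

def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

a, b = linear_fit(study2_A, study2_B)

study1_A = [12, 20, 25]
m = 20                                  # number of imputed data sets
imputations = [[a * v + b + random.gauss(0, 1) for v in study1_A]
               for _ in range(m)]

# pool (average) the m imputed scale-B scores per study-1 participant
pooled = [sum(col) / m for col in zip(*imputations)]
print([round(p, 1) for p in pooled])
```

    The paper's point is precisely that this step can fail in practice: when studies differ systematically, the imputation model fitted in one study may not transfer to another.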

  5. Using real options analysis to support strategic management decisions

    NASA Astrophysics Data System (ADS)

    Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan

    2013-12-01

    Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision-making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numeric results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvement effects on management decisions.

  6. Neophyte experiences of football (soccer) match analysis: a multiple case study approach.

    PubMed

    McKenna, Mark; Cowan, Daryl Thomas; Stevenson, David; Baker, Julien Steven

    2018-03-05

    Performance analysis is extensively used in sport, but its pedagogical application is little understood. Given its expanding role across football, this study explored the experiences of neophyte performance analysts. Experiences of six analysis interns, across three professional football clubs, were investigated as multiple cases of new match analysis. Each intern was interviewed after their first season, with archival data providing background information. Four themes emerged from qualitative analysis: (1) "building of relationships" was important, along with trust and role clarity; (2) "establishing an analysis system" was difficult due to tacit coach knowledge, but analysis was established; (3) the quality of the "feedback process" hinged on coaching styles, with balance of feedback and athlete engagement considered essential; (4) "establishing effect" was complex with no statistical effects reported; yet enhanced relationships, role clarity, and improved performances were reported. Other emic accounts are required to further understand occupational culture within performance analysis.

  7. Multiple approaches to characterize the microbial community in a thermophilic anaerobic digester running on swine manure: a case study.

    PubMed

    Tuan, Nguyen Ngoc; Chang, Yi-Chia; Yu, Chang-Ping; Huang, Shir-Ly

    2014-01-01

    In this study, the first survey of the microbial community in a thermophilic anaerobic digester using swine manure as the sole feedstock was performed by multiple approaches, including denaturing gradient gel electrophoresis (DGGE), clone library and pyrosequencing techniques. The integrated analysis of 21 DGGE bands, 126 clones and 8506 pyrosequencing read sequences revealed that Clostridia from the phylum Firmicutes account for the most dominant Bacteria. In addition, our analysis identified additional taxa that were missed by previous research, including members of the bacterial phyla Synergistetes, Planctomycetes, Armatimonadetes, Chloroflexi and Nitrospira, which might also play a role in the thermophilic anaerobic digester. Most archaeal 16S rRNA sequences could be assigned to the order Methanobacteriales rather than Methanomicrobiales, in contrast to previous studies. In addition, this study is the first to report a member of the genus Methanothermobacter in a thermophilic anaerobic digester. Copyright © 2014 Elsevier GmbH. All rights reserved.

  8. Pathway-based analyses.

    PubMed

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  9. A Quantile Regression Approach to Understanding the Relations Among Morphological Awareness, Vocabulary, and Reading Comprehension in Adult Basic Education Students.

    PubMed

    Tighe, Elizabeth L; Schatschneider, Christopher

    2016-07-01

    The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82%-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. © Hammill Institute on Disabilities 2014.
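
    Quantile regression minimizes the "check" (pinball) loss rather than squared error. For an intercept-only model the minimizer is simply the tau-th sample quantile, which this brute-force sketch on invented data illustrates; a real analysis would of course include predictors such as morphological awareness and vocabulary.

```python
def pinball(residual, tau):
    """Check loss: residuals above the fit are weighted tau, below 1 - tau."""
    return tau * residual if residual >= 0 else (tau - 1) * residual

def quantile_fit(y, tau, grid):
    """Brute-force search for the value minimizing total pinball loss."""
    return min(grid, key=lambda q: sum(pinball(v - q, tau) for v in y))

y = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
grid = [i / 10 for i in range(0, 110)]
print(quantile_fit(y, 0.5, grid))   # near the median (any value in [5, 6] minimizes)
print(quantile_fit(y, 0.9, grid))   # near the 90th percentile
```

    Fitting the same model at several values of tau is what lets the study compare predictor effects at the low and high ends of the reading-comprehension distribution.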

  10. Fast Quantitative Analysis Of Museum Objects Using Laser-Induced Breakdown Spectroscopy And Multiple Regression Algorithms

    NASA Astrophysics Data System (ADS)

    Lorenzetti, G.; Foresta, A.; Palleschi, V.; Legnaioli, S.

    2009-09-01

    The recent development of mobile instrumentation, specifically devoted to in situ analysis and study of museum objects, allows the acquisition of many LIBS spectra in a very short time. However, such a large amount of data calls for new analytical approaches which would guarantee a prompt analysis of the results obtained. In this communication, we will present and discuss the advantages of statistical analytical methods, such as Partial Least Squares Multiple Regression algorithms, vs. the classical calibration curve approach. PLS algorithms allow one to obtain in real time the information on the composition of the objects under study; this feature of the method, compared to the traditional off-line analysis of the data, is extremely useful for the optimization of the measurement times and the number of points associated with the analysis. In fact, the real-time availability of the compositional information gives the possibility of concentrating the attention on the most `interesting' parts of the object, without over-sampling the zones which would not provide useful information for the scholars or the conservators. Some examples of the applications of this method will be presented, including the studies recently performed by the researchers of the Applied Laser Spectroscopy Laboratory on museum bronze objects.
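
    For contrast with PLS, the classical calibration-curve approach amounts to a least-squares fit of line intensity against concentration in reference standards, inverted to predict concentration from a measured intensity. The standards below are invented illustration values, not LIBS data from the study.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

concentrations = [0.0, 2.0, 4.0, 6.0, 8.0]   # known analyte % in standards
intensities    = [0.1, 2.1, 3.9, 6.2, 8.0]   # measured emission-line intensity

a, b = linear_fit(concentrations, intensities)

# invert the calibration curve: intensity -> predicted concentration
measured = 5.0
print(round((measured - b) / a, 2))
```

    PLS generalizes this idea to whole spectra (many intensities at once), which is why it suits the multivariate LIBS data discussed above.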

  11. A multiple indicator solution approach to endogeneity in discrete-choice models for environmental valuation.

    PubMed

    Mariel, Petr; Hoyos, David; Artabe, Alaitz; Guevara, C Angelo

    2018-08-15

    Endogeneity is an often neglected issue in empirical applications of discrete choice modelling despite its severe consequences in terms of inconsistent parameter estimation and biased welfare measures. This article analyses the performance of the multiple indicator solution method to deal with endogeneity arising from omitted explanatory variables in discrete choice models for environmental valuation. We also propose and illustrate a factor analysis procedure for the selection of the indicators in practice. Additionally, the performance of this method is compared with the recently proposed hybrid choice modelling framework. In an empirical application we find that the multiple indicator solution method and the hybrid model approach provide similar results in terms of welfare estimates, although the multiple indicator solution method is more parsimonious and notably easier to implement. The empirical results open a path to explore the performance of this method when endogeneity is thought to have a different cause or under a different set of indicators. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Hardware-Independent Proofs of Numerical Programs

    NASA Technical Reports Server (NTRS)

    Boldo, Sylvie; Nguyen, Thi Minh Tuyen

    2010-01-01

    On recent architectures, a numerical program may give different answers depending on the execution hardware and the compilation. Our goal is to formally prove properties about numerical programs that are true for multiple architectures and compilers. We propose an approach that states the rounding error of each floating-point computation whatever the environment. This approach is implemented in the Frama-C platform for static analysis of C code. Small case studies using this approach are entirely and automatically proved.

  13. ADDING REALISM TO NUCLEAR MATERIAL DISSOLVING ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, B.

    2011-08-15

    Two new criticality modeling approaches have greatly increased the efficiency of dissolver operations in H-Canyon. The first new approach takes credit for the linear, physical distribution of the mass throughout the entire length of the fuel assembly. This distribution of mass is referred to as the linear density. Crediting the linear density of the fuel bundles results in using lower fissile concentrations, which allows higher masses to be charged to the dissolver. Also, this approach takes credit for the fact that only part of the fissile mass is wetted at a time. There are multiple assemblies stacked on top of each other in a bundle. On average, only 50-75% of the mass (the bottom two or three assemblies) is wetted at a time. This means that only 50-75% (depending on operating level) of the mass is moderated and is contributing to the reactivity of the system. The second new approach takes credit for the progression of the dissolving process. Previously, dissolving analysis looked at a snapshot in time where the same fissile material existed both in the wells and in the bulk solution at the same time. The second new approach models multiple consecutive phases that simulate the fissile material moving from a high concentration in the wells to a low concentration in the bulk solution. This approach is more realistic and allows higher fissile masses to be charged to the dissolver.

  14. Pooling Data from Multiple Longitudinal Studies: The Role of Item Response Theory in Integrative Data Analysis

    PubMed Central

    Curran, Patrick J.; Hussong, Andrea M.; Cai, Li; Huang, Wenjing; Chassin, Laurie; Sher, Kenneth J.; Zucker, Robert A.

    2010-01-01

    There are a number of significant challenges encountered when studying development over an extended period of time including subject attrition, changing measurement structures across group and developmental period, and the need to invest substantial time and money. Integrative data analysis is an emerging set of methodologies that overcomes many of the challenges of single sample designs through the pooling of data drawn from multiple existing developmental studies. This approach is characterized by a host of advantages, but this also introduces several new complexities that must be addressed prior to broad adoption by developmental researchers. In this paper we focus on methods for fitting measurement models and creating scale scores using data drawn from multiple longitudinal studies. We present findings from the analysis of repeated measures of internalizing symptomatology that were pooled from three existing developmental studies. We describe and demonstrate each step in the analysis and we conclude with a discussion of potential limitations and directions for future research. PMID:18331129

  15. Multivariate meta-analysis using individual participant data

    PubMed Central

    Riley, R. D.; Price, M. J.; Jackson, D.; Wardle, M.; Gueyffier, F.; Wang, J.; Staessen, J. A.; White, I. R.

    2016-01-01

    When combining results across related studies, a multivariate meta-analysis allows the joint synthesis of correlated effect estimates from multiple outcomes. Joint synthesis can improve efficiency over separate univariate syntheses, may reduce selective outcome reporting biases, and enables joint inferences across the outcomes. A common issue is that within-study correlations needed to fit the multivariate model are unknown from published reports. However, provision of individual participant data (IPD) allows them to be calculated directly. Here, we illustrate how to use IPD to estimate within-study correlations, using a joint linear regression for multiple continuous outcomes and bootstrapping methods for binary, survival and mixed outcomes. In a meta-analysis of 10 hypertension trials, we then show how these methods enable multivariate meta-analysis to address novel clinical questions about continuous, survival and binary outcomes; treatment–covariate interactions; adjusted risk/prognostic factor effects; longitudinal data; prognostic and multiparameter models; and multiple treatment comparisons. Both frequentist and Bayesian approaches are applied, with example software code provided to derive within-study correlations and to fit the models. PMID:26099484
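
    The bootstrap route to within-study correlation can be sketched as follows: resample a study's participants, re-estimate both outcome effects in each resample (here simply the two outcome means), and correlate the resampled estimates. The participant-level data are simulated for illustration; the paper's actual application uses treatment-effect estimates from hypertension trials.

```python
import random

random.seed(7)
n = 300
# simulated participant-level data for one study: two correlated outcomes
y1 = [random.gauss(0, 1) for _ in range(n)]
y2 = [0.6 * a + random.gauss(0, 0.8) for a in y1]

def pearson(x, y):
    m = len(x)
    mx, my = sum(x) / m, sum(y) / m
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# bootstrap: resample participants, re-estimate both outcome effects
boot = []
for _ in range(500):
    idx = [random.randrange(n) for _ in range(n)]
    boot.append((sum(y1[i] for i in idx) / n, sum(y2[i] for i in idx) / n))

m1, m2 = zip(*boot)
r = pearson(m1, m2)   # within-study correlation of the two effect estimates
print(round(r, 2))
```

    These per-study correlations are exactly the quantities that published reports usually omit and that IPD makes computable.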

  16. Correcting bias due to missing stage data in the non-parametric estimation of stage-specific net survival for colorectal cancer using multiple imputation.

    PubMed

    Falcaro, Milena; Carpenter, James R

    2017-06-01

    Population-based net survival by tumour stage at diagnosis is a key measure in cancer surveillance. Unfortunately, data on tumour stage are often missing for a non-negligible proportion of patients and the mechanism giving rise to the missingness is usually anything but completely at random. In this setting, restricting analysis to the subset of complete records gives typically biased results. Multiple imputation is a promising practical approach to the issues raised by the missing data, but its use in conjunction with the Pohar-Perme method for estimating net survival has not been formally evaluated. We performed a resampling study using colorectal cancer population-based registry data to evaluate the ability of multiple imputation, used along with the Pohar-Perme method, to deliver unbiased estimates of stage-specific net survival and recover missing stage information. We created 1000 independent data sets, each containing 5000 patients. Stage data were then made missing at random under two scenarios (30% and 50% missingness). Complete records analysis showed substantial bias and poor confidence interval coverage. Across both scenarios our multiple imputation strategy virtually eliminated the bias and greatly improved confidence interval coverage. In the presence of missing stage data complete records analysis often gives severely biased results. We showed that combining multiple imputation with the Pohar-Perme estimator provides a valid practical approach for the estimation of stage-specific colorectal cancer net survival. As usual, when the percentage of missing data is high the results should be interpreted cautiously and sensitivity analyses are recommended. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Systems and precision medicine approaches to diabetes heterogeneity: a Big Data perspective.

    PubMed

    Capobianco, Enrico

    2017-12-01

    Big Data, and in particular Electronic Health Records, provide the medical community with a great opportunity to analyze multiple pathological conditions at an unprecedented depth for many complex diseases, including diabetes. How can we infer on diabetes from large heterogeneous datasets? A possible solution is provided by invoking next-generation computational methods and data analytics tools within systems medicine approaches. By deciphering the multi-faceted complexity of biological systems, the potential of emerging diagnostic tools and therapeutic functions can be ultimately revealed. In diabetes, a multidimensional approach to data analysis is needed to better understand the disease conditions, trajectories and the associated comorbidities. Elucidation of multidimensionality comes from the analysis of factors such as disease phenotypes, marker types, and biological motifs while seeking to make use of multiple levels of information including genetics, omics, clinical data, and environmental and lifestyle factors. Examining the synergy between multiple dimensions represents a challenge. In such regard, the role of Big Data fuels the rise of Precision Medicine by allowing an increasing number of descriptions to be captured from individuals. Thus, data curations and analyses should be designed to deliver highly accurate predicted risk profiles and treatment recommendations. It is important to establish linkages between systems and precision medicine in order to translate their principles into clinical practice. Equivalently, to realize their full potential, the involved multiple dimensions must be able to process information ensuring inter-exchange, reducing ambiguities and redundancies, and ultimately improving health care solutions by introducing clinical decision support systems focused on reclassified phenotypes (or digital biomarkers) and community-driven patient stratifications.

  18. Analytical multiple scattering correction to the Mie theory: Application to the analysis of the lidar signal

    NASA Technical Reports Server (NTRS)

    Flesia, C.; Schwendimann, P.

    1992-01-01

    The contribution of multiple scattering to the lidar signal depends on the optical depth tau. Therefore, the lidar analysis, based on the assumption that multiple scattering can be neglected, is limited to cases characterized by low values of the optical depth (tau less than or equal to 0.1), and hence it excludes scattering from most clouds. Moreover, all inversion methods relating the lidar signal to number densities and particle sizes must be modified, since multiple scattering affects the direct analysis. The essential requirements of a realistic model for lidar measurements which includes multiple scattering and which can be applied to practical situations follow. (1) What is required is not merely a correction term or a rough approximation describing the results of a certain experiment, but a general theory of multiple scattering tying together the relevant physical parameters we seek to measure. (2) An analytical generalization of the lidar equation which can be applied in the case of a realistic aerosol is required. A purely analytical formulation is important in order to avoid the convergence and stability problems which, in the case of a numerical approach, are due to the large number of events that have to be taken into account in the presence of large optical depth and/or strong experimental noise.

  19. Meta-analysis of multiple outcomes: a multilevel approach.

    PubMed

    Van den Noortgate, Wim; López-López, José Antonio; Marín-Martínez, Fulgencio; Sánchez-Meca, Julio

    2015-12-01

    In meta-analysis, dependent effect sizes are very common. An example is where in one or more studies the effect of an intervention is evaluated on multiple outcome variables for the same sample of participants. In this paper, we evaluate a three-level meta-analytic model to account for this kind of dependence, extending the simulation results of Van den Noortgate, López-López, Marín-Martínez, and Sánchez-Meca Behavior Research Methods, 45, 576-594 (2013) by allowing for a variation in the number of effect sizes per study, in the between-study variance, in the correlations between pairs of outcomes, and in the sample size of the studies. At the same time, we explore the performance of the approach if the outcomes used in a study can be regarded as a random sample from a population of outcomes. We conclude that although this approach is relatively simple and does not require prior estimates of the sampling covariances between effect sizes, it gives appropriate mean effect size estimates, standard error estimates, and confidence interval coverage proportions in a variety of realistic situations.
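
    As background, the inverse-variance pooling that any meta-analytic model builds on can be sketched in fixed-effect form; the three-level model discussed above extends this with outcomes-within-study and between-study random effects. The effect sizes and sampling variances below are invented illustration values.

```python
def pooled_effect(effects, variances):
    """Fixed-effect meta-analysis: inverse-variance weighted mean and its SE."""
    weights = [1 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1 / sum(weights)) ** 0.5
    return est, se

effects   = [0.30, 0.45, 0.20, 0.55]   # one effect size per study
variances = [0.02, 0.03, 0.05, 0.04]   # their sampling variances

est, se = pooled_effect(effects, variances)
print(round(est, 3), round(se, 3))
```

    The appeal of the three-level approach noted in the abstract is that, unlike this fixed-effect sketch, it handles multiple (dependent) effect sizes per study without requiring their sampling covariances.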

  20. Effects of physiotherapy interventions on balance in multiple sclerosis: a systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Paltamaa, Jaana; Sjögren, Tuulikki; Peurala, Sinikka H; Heinonen, Ari

    2012-10-01

    To determine the effects of physiotherapy interventions on balance in people with multiple sclerosis. A systematic literature search was conducted in Medline, Cinahl, Embase, PEDro, both electronically and by manual search up to March 2011. Randomized controlled trials of physiotherapy interventions in people with multiple sclerosis, with an outcome measure linked to the International Classification of Functioning, Disability and Health (ICF) category of "Changing and maintaining body position", were included. The quality of studies was determined by the van Tulder criteria. Meta-analyses were performed in subgroups according to the intervention. After screening 233 full-text papers, 11 studies were included in a qualitative analysis and 7 in a meta-analysis. The methodological quality of the studies ranged from poor to moderate. Low evidence was found for the efficacy of specific balance exercises, physical therapy based on an individualized problem-solving approach, and resistance and aerobic exercises on improving balance among ambulatory people with multiple sclerosis. These findings indicate small, but significant, effects of physiotherapy on balance in people with multiple sclerosis who have a mild to moderate level of disability. However, evidence for severely disabled people is lacking, and further research is needed.

  1. Interactions between cadmium and decabrominated diphenyl ether on blood cells count in rats-Multiple factorial regression analysis.

    PubMed

    Curcic, Marijana; Buha, Aleksandra; Stankovic, Sanja; Milovanovic, Vesna; Bulat, Zorica; Đukić-Ćosić, Danijela; Antonijević, Evica; Vučinić, Slavica; Matović, Vesna; Antonijevic, Biljana

    2017-02-01

    The objective of this study was to assess the toxicity of a Cd and BDE-209 mixture on haematological parameters in subacutely exposed rats and to determine the presence and type of interactions between these two chemicals using multiple factorial regression analysis. Furthermore, for the assessment of interaction type, an isobologram-based methodology was applied and compared with multiple factorial regression analysis. Chemicals were given by oral gavage to male Wistar rats weighing 200-240 g for 28 days. Animals were divided into 16 groups (8/group): a vehicle control group; three groups treated with 2.5, 7.5 or 15 mg Cd/kg/day, doses chosen on the basis of literature data to reflect relatively high environmental Cd exposure; three groups treated with 1000, 2000 or 4000 mg BDE-209/kg bw/day, doses proven to induce toxic effects in rats; and nine groups treated with mixtures containing the doses of Cd and BDE-209 stated above. Blood samples were taken at the end of the experiment, and red blood cell (RBC), white blood cell (WBC) and platelet (PLT) counts were determined. For interaction assessment, multiple factorial regression analysis and a fitted isobologram approach were used. In this study, we focused on multiple factorial regression analysis as a method for interaction assessment. We also investigated the interactions between Cd and BDE-209 using the derived model for the description of the obtained fitted isobologram curves. The current study indicated that co-exposure to Cd and BDE-209 can result in a significant decrease in RBC count, an increase in WBC count and a decrease in PLT count compared with controls. Multiple factorial regression analysis used for the assessment of the interaction type between Cd and BDE-209 indicated synergism for the effect on RBC count and no interaction (i.e. additivity) for the effects on WBC and PLT counts. On the other hand, the isobologram-based approach showed slight antagonism for the effects on RBC and WBC counts, while no interaction was proved for the joint effect on PLT count. These results confirm that the assessment of interactions between chemicals in a mixture greatly depends on the concept or method used for the evaluation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
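    The interaction-type assessment described in this abstract can be illustrated with a dose-response regression carrying a product term. Below is a minimal sketch on synthetic data (all coefficients, doses and the response model are hypothetical, not the study's values): for a decreasing endpoint such as RBC count, a significantly negative interaction coefficient is read as synergism, while a coefficient near zero indicates additivity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
cd = rng.choice([0.0, 2.5, 7.5, 15.0], size=n)           # Cd dose (mg/kg/day)
bde = rng.choice([0.0, 1000.0, 2000.0, 4000.0], size=n)  # BDE-209 dose (mg/kg bw/day)
# Synthetic RBC response with a negative (synergistic) interaction term.
rbc = 8.0 - 0.05 * cd - 0.0004 * bde - 0.00002 * cd * bde + rng.normal(0, 0.1, n)

# Design matrix: intercept, main effects, and the Cd x BDE-209 product term.
X = np.column_stack([np.ones(n), cd, bde, cd * bde])
beta, *_ = np.linalg.lstsq(X, rbc, rcond=None)
b0, b_cd, b_bde, b_int = beta
# A negative interaction coefficient: joint effect stronger than additive.
print(f"interaction coefficient: {b_int:.6f}")
```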

  2. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high-bandwidth signal into multiple lower-bandwidth (rate) signals with an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).

  3. A counterfactual p-value approach for benefit-risk assessment in clinical trials.

    PubMed

    Zeng, Donglin; Chen, Ming-Hui; Ibrahim, Joseph G; Wei, Rachel; Ding, Beiying; Ke, Chunlei; Jiang, Qi

    2015-01-01

    Clinical trials generally allow various efficacy and safety outcomes to be collected for health interventions. Benefit-risk assessment is an important issue when evaluating a new drug. Currently, there is a lack of standardized and validated benefit-risk assessment approaches in drug development due to various challenges. To quantify benefits and risks, we propose a counterfactual p-value (CP) approach. Our approach considers a spectrum of weights for weighting benefit-risk values and computes the extreme probabilities of observing the weighted benefit-risk value in one treatment group as if patients were treated in the other treatment group. The proposed approach is applicable to the assessment of a single benefit and a single risk outcome as well as multiple benefit and risk outcomes. In addition, prior information on the importance of outcomes can be incorporated into the weight schemes. The proposed CP plot is intuitive, with a visualized weight pattern. The average area under the CP and the preferred probability over time are used for overall treatment comparison, and a bootstrap approach is applied for statistical inference. We assess the proposed approach using simulated data with multiple efficacy and safety endpoints and compare its performance with a stochastic multi-criteria acceptability analysis approach.

  4. POHCS AND PICS SCREENING PROTOCOL

    EPA Science Inventory

    The report describes risk-driven analysis strategies and a tiered survey approach of analyses that should be useful for building data bases related to other waste combustion processes. NOTE: The need to characterize hazardous waste incinerator emissions for multiple organic compo...

  5. 3D measurement using combined Gray code and dual-frequency phase-shifting approach

    NASA Astrophysics Data System (ADS)

    Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Liu, Xin

    2018-04-01

    The combined Gray code and phase-shifting approach is a commonly used 3D measurement technique. In this technique, an error that equals integer multiples of the phase-shifted fringe period, i.e. period jump error, often exists in the absolute analog code, which can lead to gross measurement errors. To overcome this problem, the present paper proposes 3D measurement using a combined Gray code and dual-frequency phase-shifting approach. Based on 3D measurement using the combined Gray code and phase-shifting approach, one set of low-frequency phase-shifted fringe patterns with an odd-numbered multiple of the original phase-shifted fringe period is added. Thus, the absolute analog code measured value can be obtained by the combined Gray code and phase-shifting approach, and the low-frequency absolute analog code measured value can also be obtained by adding low-frequency phase-shifted fringe patterns. Then, the corrected absolute analog code measured value can be obtained by correcting the former by the latter, and the period jump errors can be eliminated, resulting in reliable analog code unwrapping. For the proposed approach, we established its measurement model, analyzed its measurement principle, expounded the mechanism of eliminating period jump errors by error analysis, and determined its applicable conditions. Theoretical analysis and experimental results show that the proposed approach can effectively eliminate period jump errors, reliably perform analog code unwrapping, and improve the measurement accuracy.
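    The period-jump correction this abstract describes can be sketched numerically in one dimension. The following is an illustrative sketch of the principle only, not the authors' implementation (the frequency ratio, jump locations and noise level are hypothetical): the high-frequency absolute phase carries jumps of integer multiples of 2π, and the jump-free low-frequency phase, scaled by the odd frequency ratio, supplies the integer correction.

```python
import numpy as np

TWO_PI = 2.0 * np.pi
r = 3  # low-frequency fringe period is an odd multiple (here 3x) of the high-frequency one

# Ground-truth absolute phase of the high-frequency fringes along one scan line.
truth = np.linspace(0.0, 12.0 * TWO_PI, 400)

# Simulated measurements: the high-frequency absolute phase carries period jump
# errors (integer multiples of 2*pi) at a few pixels; the low-frequency phase
# (truth / r) is jump-free but noisier.
high = truth.copy()
high[120:140] += TWO_PI      # +1 period jump
high[300:310] -= 2 * TWO_PI  # -2 period jump
low = truth / r + np.random.default_rng(1).normal(0.0, 0.05, truth.size)

# Correct the high-frequency phase with the low-frequency reference:
# the rounding absorbs the low-frequency noise and recovers the integer jump.
k = np.round((low * r - high) / TWO_PI)
corrected = high + TWO_PI * k

print("max abs error after correction:", np.max(np.abs(corrected - truth)))
```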

  6. Unsupervised multiple kernel learning for heterogeneous data integration.

    PubMed

    Mariette, Jérôme; Villa-Vialaneix, Nathalie

    2018-03-15

    Recent high-throughput sequencing advances have expanded the breadth of available omics datasets, and the integrated analysis of multiple datasets obtained on the same samples has yielded important insights in a wide range of applications. However, the integration of various sources of information remains a challenge for systems biology, since the produced datasets are often of heterogeneous types, requiring generic methods that take their different specificities into account. We propose a multiple kernel framework that allows multiple datasets of various types to be integrated into a single exploratory analysis. Several solutions are provided to learn either a consensus meta-kernel or a meta-kernel that preserves the original topology of the datasets. We applied our framework to analyse two public multi-omics datasets. First, the multiple metagenomic datasets collected during the TARA Oceans expedition were explored to demonstrate that our method is able to retrieve previous findings in a single kernel PCA as well as to provide a new image of the sample structures when a larger number of datasets are included in the analysis. To perform this analysis, a generic procedure is also proposed to improve the interpretability of the kernel PCA with respect to the original data. Second, the multi-omics breast cancer datasets provided by The Cancer Genome Atlas are analysed using kernel Self-Organizing Maps with both single- and multi-omics strategies. The comparison of these two approaches demonstrates the benefit of our integration method in improving the representation of the studied biological system. The proposed methods are available in the R package mixKernel, released on CRAN. It is fully compatible with the mixOmics package, and a tutorial describing the approach can be found on the mixOmics web site http://mixomics.org/mixkernel/. jerome.mariette@inra.fr or nathalie.villa-vialaneix@inra.fr. Supplementary data are available at Bioinformatics online.
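    The consensus meta-kernel idea can be sketched in a few lines. This is an illustrative sketch, not mixKernel's actual implementation (which is in R and offers weighted and topology-preserving variants): each dataset's kernel is cosine-normalized so no single data type dominates, the normalized kernels are averaged, and the meta-kernel feeds a kernel PCA via eigendecomposition of the double-centered Gram matrix. The toy datasets below are random placeholders for two omics tables on the same samples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
# Two toy "omics" datasets observed on the same n samples (hypothetical inputs).
X1 = rng.normal(size=(n, 10))
X2 = rng.normal(size=(n, 25))

def linear_kernel(X):
    return X @ X.T

def cosine_normalize(K):
    # Scale so every sample has unit self-similarity.
    d = np.sqrt(np.diag(K))
    return K / np.outer(d, d)

# Consensus meta-kernel: a plain average of the normalized kernels.
K = (cosine_normalize(linear_kernel(X1)) + cosine_normalize(linear_kernel(X2))) / 2.0

# Kernel PCA: double-center K, then eigendecompose.
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J
eigval, eigvec = np.linalg.eigh(Kc)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]
# Sample scores on the first two principal axes.
scores = eigvec[:, :2] * np.sqrt(np.maximum(eigval[:2], 0.0))
print(scores.shape)
```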

  7. Application of several physical techniques in the total analysis of a canine urinary calculus.

    PubMed

    Rodgers, A L; Mezzabotta, M; Mulder, K J; Nassimbeni, L R

    1981-06-01

    A single calculus from the bladder of a Beagle bitch has been analyzed by a multiple technique approach employing x-ray diffraction, infrared spectroscopy, scanning electron microscopy, x-ray fluorescence spectrometry, atomic absorption spectrophotometry and density gradient fractionation. The qualitative and quantitative data obtained showed excellent agreement, lending confidence to such an approach for the evaluation and understanding of stone disease.

  8. SCOPA and META-SCOPA: software for the analysis and aggregation of genome-wide association studies of multiple correlated phenotypes.

    PubMed

    Mägi, Reedik; Suleimanov, Yury V; Clarke, Geraldine M; Kaakinen, Marika; Fischer, Krista; Prokopenko, Inga; Morris, Andrew P

    2017-01-11

    Genome-wide association studies (GWAS) of single nucleotide polymorphisms (SNPs) have been successful in identifying loci contributing genetic effects to a wide range of complex human diseases and quantitative traits. The traditional approach to GWAS analysis is to consider each phenotype separately, despite the fact that many diseases and quantitative traits are correlated with each other, and often measured in the same sample of individuals. Multivariate analyses of correlated phenotypes have been demonstrated, by simulation, to increase power to detect association with SNPs, and thus may enable improved detection of novel loci contributing to diseases and quantitative traits. We have developed the SCOPA software to enable GWAS analysis of multiple correlated phenotypes. The software implements "reverse regression" methodology, which treats the genotype of an individual at a SNP as the outcome and the phenotypes as predictors in a general linear model. SCOPA can be applied to quantitative traits and categorical phenotypes, and can accommodate imputed genotypes under a dosage model. The accompanying META-SCOPA software enables meta-analysis of association summary statistics from SCOPA across GWAS. Application of SCOPA to two GWAS of high- and low-density lipoprotein cholesterol, triglycerides and body mass index, and subsequent meta-analysis with META-SCOPA, highlighted stronger association signals than univariate phenotype analysis at established lipid and obesity loci. The META-SCOPA meta-analysis also revealed a novel signal of association at genome-wide significance for triglycerides mapping to GPC5 (lead SNP rs71427535, p = 1.1 x 10^-8), which has not been reported in previous large-scale GWAS of lipid traits. The SCOPA and META-SCOPA software enable discovery and dissection of multiple phenotype association signals through implementation of a powerful reverse regression approach.
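    The "reverse regression" idea can be sketched with ordinary least squares. This is a simplified illustration on synthetic data, not SCOPA's implementation (SCOPA fits a general linear model with model selection over phenotype subsets): the imputed SNP dosage is the outcome and the correlated phenotypes are the predictors, so a joint test of the phenotype coefficients captures multi-phenotype association. All effect sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
# Two correlated phenotypes (assumed standardized).
p1 = rng.normal(size=n)
p2 = 0.6 * p1 + 0.8 * rng.normal(size=n)
# Simulate an imputed SNP dosage in [0, 2], weakly driven by phenotype 1 only.
dosage = np.clip(1.0 + 0.3 * p1 + rng.normal(0, 0.5, n), 0.0, 2.0)

# "Reverse regression": dosage is the outcome, phenotypes are the predictors.
X = np.column_stack([np.ones(n), p1, p2])
beta, *_ = np.linalg.lstsq(X, dosage, rcond=None)
fitted = X @ beta
r2 = 1.0 - np.sum((dosage - fitted) ** 2) / np.sum((dosage - dosage.mean()) ** 2)
print(f"beta(p1)={beta[1]:.3f} beta(p2)={beta[2]:.3f} R^2={r2:.3f}")
```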

  9. Isolating and Examining Sources of Suppression and Multicollinearity in Multiple Linear Regression.

    PubMed

    Beckstead, Jason W

    2012-03-30

    The presence of suppression (and multicollinearity) in multiple regression analysis complicates interpretation of predictor-criterion relationships. The mathematical conditions that produce suppression in regression analysis have received considerable attention in the methodological literature but until now nothing in the way of an analytic strategy to isolate, examine, and remove suppression effects has been offered. In this article such an approach, rooted in confirmatory factor analysis theory and employing matrix algebra, is developed. Suppression is viewed as the result of criterion-irrelevant variance operating among predictors. Decomposition of predictor variables into criterion-relevant and criterion-irrelevant components using structural equation modeling permits derivation of regression weights with the effects of criterion-irrelevant variance omitted. Three examples with data from applied research are used to illustrate the approach: the first assesses child and parent characteristics to explain why some parents of children with obsessive-compulsive disorder accommodate their child's compulsions more so than do others, the second examines various dimensions of personal health to explain individual differences in global quality of life among patients following heart surgery, and the third deals with quantifying the relative importance of various aptitudes for explaining academic performance in a sample of nursing students. The approach is offered as an analytic tool for investigators interested in understanding predictor-criterion relationships when complex patterns of intercorrelation among predictors are present and is shown to augment dominance analysis.
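    The suppression phenomenon the abstract analyzes can be demonstrated with a classical textbook construction (this sketch is not the article's SEM-based decomposition): a suppressor x2 is nearly uncorrelated with the criterion y yet shares criterion-irrelevant variance with the predictor x1, so adding it to the model raises R^2 by removing that irrelevant variance.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
t = rng.normal(size=n)        # criterion-relevant component
e = rng.normal(size=n)        # criterion-irrelevant component
x1 = t + e                    # predictor contaminated with irrelevant variance
x2 = e                        # classic suppressor: correlates with x1, not with y
y = t + rng.normal(0, 0.5, n)

def r2(X, y):
    # R^2 of an OLS fit with intercept.
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

print(f"r(x2, y) = {np.corrcoef(x2, y)[0, 1]:+.3f}")  # near zero
print(f"R^2 with x1 alone:  {r2(x1, y):.3f}")
print(f"R^2 with x1 and x2: {r2(np.column_stack([x1, x2]), y):.3f}")
```

With both predictors the model can reconstruct the criterion-relevant component (here t = x1 - x2), so R^2 roughly doubles even though x2 alone predicts nothing.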

  10. The multiple complex exponential model and its application to EEG analysis

    NASA Astrophysics Data System (ADS)

    Chen, Dao-Mu; Petzold, J.

    The paper presents a novel approach to the analysis of the EEG signal, which is based on a multiple complex exponential (MCE) model. Parameters of the model are estimated using a nonharmonic Fourier expansion algorithm. The central idea of the algorithm is outlined, and the results, estimated on the basis of simulated data, are presented and compared with those obtained by the conventional methods of signal analysis. Preliminary work on various application possibilities of the MCE model in EEG data analysis is described. It is shown that the parameters of the MCE model reflect the essential information contained in an EEG segment. These parameters characterize the EEG signal in a more objective way because they are closer to the recent supposition of the nonlinear character of the brain's dynamic behavior.

  11. A novel approach for evaluating the risk of health care failure modes.

    PubMed

    Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang

    2012-12-01

    Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of health care failure modes and taking priority action for safety improvement. The purpose of this paper is to propose a novel approach to data analysis: integrating FMEA with a mathematical tool, data envelopment analysis (DEA) with the slack-based measure (SBM). The risk indexes of FMEA (severity, occurrence, and detection) are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach is illustrated by a health care case. As a systematic approach for improving the service quality of health care, the approach can offer quantitative corrective information on the risk indexes that thereafter reduces failure possibility. For safety improvement, these new targets of the risk indexes could be used for management by objectives. FMEA alone cannot provide such quantitative corrective information, so the novel approach overcomes this chief shortcoming of FMEA. By combining the DEA SBM model with FMEA, the two goals of increased patient safety and reduced medical cost can be achieved together.

  12. Green supplier selection: a new genetic/immune strategy with industrial application

    NASA Astrophysics Data System (ADS)

    Kumar, Amit; Jain, Vipul; Kumar, Sameer; Chandra, Charu

    2016-10-01

    With the onset of the 'climate change movement', organisations are striving to include environmental criteria into the supplier selection process. This article hybridises a Green Data Envelopment Analysis (GDEA)-based approach with a new Genetic/Immune Strategy for Data Envelopment Analysis (GIS-DEA). A GIS-DEA approach provides a different view to solving multi-criteria decision making problems using data envelopment analysis (DEA) by considering DEA as a multi-objective optimisation problem with efficiency as one objective and proximity of solution to decision makers' preferences as the other objective. The hybrid approach called GIS-GDEA is applied here to a well-known automobile spare parts manufacturer in India and the results presented. User validation developed based on specific set of criteria suggests that the supplier selection process with GIS-GDEA is more practical than other approaches in a current industrial scenario with multiple decision makers.

  13. Maltreatment histories of foster youth exiting out-of-home care through emancipation: a latent class analysis.

    PubMed

    Havlicek, Judy

    2014-01-01

    Little is known about maltreatment among foster youth transitioning to adulthood. Multiple entries into out-of-home care and unsuccessful attempts at reunification may nevertheless reflect extended exposure to chronic maltreatment and multiple types of victimization. This study used administrative data from the Illinois Department of Children and Family Services to identify all unduplicated allegations of maltreatment in a cohort of 801 foster youth transitioning to adulthood in the state of Illinois. A latent variable modeling approach generated profiles of maltreatment based on substantiated and unsubstantiated reports of maltreatment taken from state administrative data. Four indicators of maltreatment were included in the latent class analysis: multiple types of maltreatment, predominant type of maltreatment, chronicity, and number of different perpetrators. The analysis identified four subpopulations of foster youth in relation to maltreatment. Study findings highlight the heterogeneity of maltreatment in the lives of foster youth transitioning to adulthood and draw attention to a need to raise awareness among service providers to screen for chronic maltreatment and multiple types of victimization. © The Author(s) 2014.

  14. Analytical approach for modeling and performance analysis of microring resonators as optical filters with multiple output bus waveguides

    NASA Astrophysics Data System (ADS)

    Lakra, Suchita; Mandal, Sanjoy

    2017-06-01

    A quadruple micro-optical ring resonator (QMORR) with multiple output bus waveguides is mathematically modeled and analyzed by making use of the delay-line signal processing approach in Z-domain and Mason's gain formula. The performances of QMORR with two output bus waveguides with vertical coupling are analyzed. This proposed structure is capable of providing wider free spectral response from both the output buses with appreciable cross talk. Thus, this configuration could provide increased capacity to insert a large number of communication channels. The simulated frequency response characteristic and its dispersion and group delay characteristics are graphically presented using the MATLAB environment.

  15. A Systems Biology Approach for Identifying Hepatotoxicant Groups Based on Similarity in Mechanisms of Action and Chemical Structure.

    PubMed

    Hebels, Dennie G A J; Rasche, Axel; Herwig, Ralf; van Westen, Gerard J P; Jennen, Danyel G J; Kleinjans, Jos C S

    2016-01-01

    When evaluating compound similarity, addressing multiple sources of information to reach conclusions about common pharmaceutical and/or toxicological mechanisms of action is a crucial strategy. In this chapter, we describe a systems biology approach that incorporates analyses of hepatotoxicant data for 33 compounds from three different sources: a chemical structure similarity analysis based on the 3D Tanimoto coefficient, a chemical structure-based protein target prediction analysis, and a cross-study/cross-platform meta-analysis of in vitro and in vivo human and rat transcriptomics data derived from public resources (i.e., the diXa data warehouse). Hierarchical clustering of the outcome scores of the separate analyses did not result in a satisfactory grouping of compounds considering their known toxic mechanism as described in literature. However, a combined analysis of multiple data types may hypothetically compensate for missing or unreliable information in any of the single data types. We therefore performed an integrated clustering analysis of all three data sets using the R-based tool iClusterPlus. This indeed improved the grouping results. The compound clusters that were formed by means of iClusterPlus represent groups that show similar gene expression while simultaneously integrating a similarity in structure and protein targets, which corresponds much better with the known mechanism of action of these toxicants. Using an integrative systems biology approach may thus overcome the limitations of the separate analyses when grouping liver toxicants sharing a similar mechanism of toxicity.

  16. Weakly nonparallel and curvature effects on stationary crossflow instability: Comparison of results from multiple-scales analysis and parabolized stability equations

    NASA Technical Reports Server (NTRS)

    Singer, Bart A.; Choudhari, Meelan; Li, Fei

    1995-01-01

    A multiple-scales approach is used to approximate the effects of nonparallelism and streamwise surface curvature on the growth of stationary crossflow vortices in incompressible, three-dimensional boundary layers. The results agree with results predicted by solving the parabolized stability equations in regions where the nonparallelism is sufficiently weak. As the nonparallelism increases, the agreement between the two approaches worsens. An attempt has been made to quantify the effect of nonparallelism on flow stability in terms of a nondimensional number that describes the rate of change of the mean flow relative to the disturbance wavelength. We find that this nondimensional number provides useful information about the adequacy of the multiple-scales approximation for different disturbances for a given flow geometry, but the number does not collapse data for different flow geometries onto a single curve.

  17. Motion compensation via redundant-wavelet multihypothesis.

    PubMed

    Fowler, James E; Cui, Suxia; Wang, Yonghui

    2006-10-01

    Multihypothesis motion compensation has been widely used in video coding with previous attention focused on techniques employing predictions that are diverse spatially or temporally. In this paper, the multihypothesis concept is extended into the transform domain by using a redundant wavelet transform to produce multiple predictions that are diverse in transform phase. The corresponding multiple-phase inverse transform implicitly combines the phase-diverse predictions into a single spatial-domain prediction for motion compensation. The performance advantage of this redundant-wavelet-multihypothesis approach is investigated analytically, invoking the fact that the multiple-phase inverse involves a projection that significantly reduces the power of a dense-motion residual modeled as additive noise. The analysis shows that redundant-wavelet multihypothesis is capable of up to a 7-dB reduction in prediction-residual variance over an equivalent single-phase, single-hypothesis approach. Experimental results substantiate the performance advantage for a block-based implementation.

  18. deltaGseg: macrostate estimation via molecular dynamics simulations and multiscale time series analysis.

    PubMed

    Low, Diana H P; Motakis, Efthymios

    2013-10-01

    Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.

  19. Equivalent Linearization Analysis of Geometrically Nonlinear Random Vibrations Using Commercial Finite Element Codes

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2002-01-01

    Two new equivalent linearization implementations for geometrically nonlinear random vibrations are presented. Both implementations are based upon a novel approach for evaluating the nonlinear stiffness within commercial finite element codes and are suitable for use with any finite element code having geometrically nonlinear static analysis capabilities. The formulation includes a traditional force-error minimization approach and a relatively new version of a potential energy-error minimization approach, which has been generalized for multiple degree-of-freedom systems. Results for a simply supported plate under random acoustic excitation are presented and comparisons of the displacement root-mean-square values and power spectral densities are made with results from a nonlinear time domain numerical simulation.

  20. A Pragmatic Cognitive System Engineering Approach to Model Dynamic Human Decision-Making Activities in Intelligent and Automated Systems

    DTIC Science & Technology

    2003-10-01

    Among the procedures developed to identify cognitive processes, there are the Cognitive Task Analysis (CTA) and the Cognitive Work Analysis (CWA...of Cognitive Task Design. [11] Potter, S.S., Roth, E.M., Woods, D.D., and Elm, W.C. (2000). Cognitive Task Analysis as Bootstrapping Multiple...Converging Techniques, In Schraagen, Chipman, and Shalin (Eds.). Cognitive Task Analysis . Mahwah, NJ: Lawrence Erlbaum Associates. [12] Roth, E.M

  1. A multiple ion counter total evaporation (MICTE) method for precise analysis of plutonium by thermal ionization mass spectrometry

    DOE PAGES

    Inglis, Jeremy D.; Maassen, Joel; Kara, Azim; ...

    2017-04-28

    This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.

  2. A multiple ion counter total evaporation (MICTE) method for precise analysis of plutonium by thermal ionization mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inglis, Jeremy D.; Maassen, Joel; Kara, Azim

    This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.

  3. Comparison of procedures for correction of matrix interferences in the analysis of soils by ICP-OES with CCD detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadler, D.A.; Sun, F.; Littlejohn, D.

    1995-12-31

    ICP-OES is a useful technique for multi-element analysis of soils. However, as a number of elements are present in relatively high concentrations, matrix interferences can occur and examples have been widely reported. The availability of CCD detectors has increased the opportunities for rapid multi-element, multi-wavelength determination of elemental concentrations in soils and other environmental samples. As the composition of soils from industrial sites can vary considerably, especially when taken from different pit horizons, procedures are required to assess the extent of interferences and correct their effects on a simultaneous multi-element basis. In single-element analysis, plasma operating conditions can sometimes be varied to minimize or even remove multiplicative interferences. In simultaneous multi-element analysis, the scope for this approach may be limited, depending on the spectrochemical characteristics of the emitting analyte species. Matrix matching, by addition of major sample components to the analyte calibrant solutions, can be used to minimize inaccuracies. However, there are also limitations to this procedure when the sample composition varies significantly. Multiplicative interference effects can also be assessed by a "single standard addition" of each analyte to the sample solution, and the information obtained may be used to correct the analyte concentrations determined directly. Each of these approaches has been evaluated to ascertain the best procedure for multi-element analysis of industrial soils by ICP-OES with CCD detection at multiple wavelengths. Standard reference materials and field samples have been analyzed to illustrate the efficacy of each procedure.
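    The "single standard addition" correction mentioned above follows from a linear response model: spiking the sample with a known analyte amount calibrates the signal in the presence of multiplicative matrix effects. A minimal sketch (the signal values and spike concentration below are hypothetical, not from this work):

```python
def standard_addition_conc(signal_sample, signal_spiked, spike_conc):
    """Concentration from one standard addition, assuming a linear response:
    signal = k * conc, so conc = signal_sample * spike_conc / (signal_spiked - signal_sample)."""
    return signal_sample * spike_conc / (signal_spiked - signal_sample)

# Hypothetical numbers: a 2.0 mg/L spike raises the emission signal from 150 to 250.
conc = standard_addition_conc(150.0, 250.0, 2.0)
print(f"{conc:.1f} mg/L")  # 150 * 2.0 / 100 = 3.0 mg/L
```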

  4. Advanced interferometric synthetic aperture radar (InSAR) time series analysis using interferograms of multiple-orbit tracks: A case study on Miyake-jima

    NASA Astrophysics Data System (ADS)

    Ozawa, Taku; Ueda, Hideki

    2011-12-01

    InSAR time series analysis is an effective tool for detecting spatially and temporally complicated volcanic deformation. To obtain details of such deformation, we developed an advanced InSAR time series analysis using interferograms of multiple-orbit tracks. Considering only right- (or only left-) looking SAR observations, incidence directions for different orbit tracks are mostly included in a common plane. Therefore, slant-range changes in their interferograms can be expressed by two components in the plane. This approach estimates the time series of their components from interferograms of multiple-orbit tracks by the least squares analysis, and higher accuracy is obtained if many interferograms of different orbit tracks are available. Additionally, this analysis can combine interferograms for different incidence angles. In a case study on Miyake-jima, we obtained a deformation time series corresponding to GPS observations from PALSAR interferograms of six orbit tracks. The obtained accuracy was better than that with the SBAS approach, demonstrating its effectiveness. Furthermore, it is expected that higher accuracy would be obtained if SAR observations were carried out more frequently in all orbit tracks. The deformation obtained in the case study indicates uplift along the west coast and subsidence with contraction around the caldera. The speed of the uplift was almost constant, but the subsidence around the caldera decelerated from 2009. A flat deformation source was estimated near sea level under the caldera, implying that deceleration of subsidence was related to interaction between volcanic thermal activity and the aquifer.

  5. Detecting multiple outliers in linear functional relationship model for circular variables using clustering technique

    NASA Astrophysics Data System (ADS)

    Mokhtar, Nurkhairany Amyra; Zubairi, Yong Zulina; Hussin, Abdul Ghapor

    2017-05-01

    Outlier detection has been used extensively in data analysis to detect anomalous observations and has important applications in fraud detection and robust analysis. In this paper, we propose a method for detecting multiple outliers for circular variables in the linear functional relationship model. Using the residual values of the Caires and Wyatt model, we apply a hierarchical clustering procedure. Using a tree diagram, we illustrate the graphical approach to outlier detection. A simulation study is done to verify the accuracy of the proposed method, and an illustration with a real data set is given to show its practical applicability.
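    The clustering-on-residuals idea can be sketched in one dimension, where single-linkage hierarchical clustering cut at height h is equivalent to splitting the sorted residuals wherever adjacent gaps exceed h. This is an illustrative sketch on synthetic circular residuals, not the authors' procedure (the residual model, cut height and outlier values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
# Circular residuals (radians) from a fitted model: a tight bulk near 0 plus two outliers.
resid = np.concatenate([rng.vonmises(0.0, 100.0, 40), [2.8, -2.9]])

# In 1-D, single-linkage clustering cut at height h == splitting sorted values at gaps > h.
h = 1.0
order = np.argsort(resid)
gaps = np.diff(resid[order])
labels = np.empty(resid.size, dtype=int)
labels[order] = np.concatenate([[0], np.cumsum(gaps > h)])

# Flag points outside the largest cluster as outliers.
main = np.bincount(labels).argmax()
outliers = np.where(labels != main)[0]
print(outliers)
```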

  6. Informed actions: where to cost effectively manage multiple threats to species to maximize return on investment.

    PubMed

    Auerbach, Nancy A; Tulloch, Ayesha I T; Possingham, Hugh P

    Conservation practitioners, faced with managing multiple threats to biodiversity and limited funding, must prioritize investment in different management actions. From an economic perspective, it is routine practice to invest where the highest rate of return is expected. This return-on-investment (ROI) thinking can also benefit species conservation, and researchers are developing sophisticated approaches to support decision-making for cost-effective conservation. However, applied use of these approaches is limited. Managers may be wary of “black-box” algorithms or complex methods that are difficult to explain to funding agencies. As an alternative, we demonstrate the use of a basic ROI analysis for determining where to invest in cost-effective management to address threats to species. This method can be applied using basic geographic information system and spreadsheet calculations. We illustrate the approach in a management action prioritization for a biodiverse region of eastern Australia. We use ROI to prioritize management actions for two threats to a suite of threatened species: habitat degradation by cattle grazing, and predation by invasive red foxes (Vulpes vulpes). We show how decisions based on cost-effective threat management depend upon how expected benefits to species are defined and how benefits and costs co-vary. By considering a combination of species richness, restricted habitats, species vulnerability, and costs of management actions, small investments can result in greater expected benefit compared with management decisions that consider only species richness. Furthermore, a landscape management strategy that implements multiple actions is more efficient than managing only for one threat, or more traditional approaches that do not consider ROI. Our approach provides transparent and logical decision support for prioritizing different actions intended to abate threats associated with multiple species; it is of use when managers need a justifiable and repeatable approach to investment.
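The basic ROI calculation the authors advocate (expected benefit divided by cost, computed in a spreadsheet) reduces to a simple ranking. The site names and numbers below are invented for illustration:

```python
# Hypothetical sites with an expected species benefit score and a
# management cost; rank by return on investment (benefit per dollar),
# exactly as one would in a spreadsheet column.
sites = {
    "site_A": {"benefit": 12.0, "cost": 3000.0},
    "site_B": {"benefit": 20.0, "cost": 10000.0},
    "site_C": {"benefit": 5.0, "cost": 500.0},
}
roi = {name: v["benefit"] / v["cost"] for name, v in sites.items()}
ranked = sorted(roi, key=roi.get, reverse=True)
# ranked: ["site_C", "site_A", "site_B"]
```

Note that the cheap, modest-benefit site ranks first: this is the paper's point that small investments can yield the greatest expected benefit per dollar.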

  7. Using cognitive task analysis to develop simulation-based training for medical tasks.

    PubMed

    Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette

    2013-10-01

    Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  8. Scalable Open Science Approach for Mutation Calling of Tumor Exomes Using Multiple Genomic Pipelines.

    PubMed

    Ellrott, Kyle; Bailey, Matthew H; Saksena, Gordon; Covington, Kyle R; Kandoth, Cyriac; Stewart, Chip; Hess, Julian; Ma, Singer; Chiotti, Kami E; McLellan, Michael; Sofia, Heidi J; Hutter, Carolyn; Getz, Gad; Wheeler, David; Ding, Li

    2018-03-28

    The Cancer Genome Atlas (TCGA) cancer genomics dataset includes over 10,000 tumor-normal exome pairs across 33 different cancer types, in total >400 TB of raw data files requiring analysis. Here we describe the Multi-Center Mutation Calling in Multiple Cancers project, our effort to generate a comprehensive encyclopedia of somatic mutation calls for the TCGA data to enable robust cross-tumor-type analyses. Our approach accounts for variance and batch effects introduced by the rapid advancement of DNA extraction, hybridization-capture, sequencing, and analysis methods over time. We present best practices for applying an ensemble of seven mutation-calling algorithms with scoring and artifact filtering. The dataset created by this analysis includes 3.5 million somatic variants and forms the basis for PanCan Atlas papers. The results have been made available to the research community along with the methods used to generate them. This project is the result of collaboration from a number of institutes and demonstrates how team science drives extremely large genomics projects. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
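The ensemble idea behind the project (keep variants supported by several independent callers) can be illustrated with a minimal voting scheme. The actual pipeline applies an ensemble of seven callers with scoring and artifact filtering on top of this; the calls and threshold below are invented:

```python
from collections import Counter

# Each caller reports a set of (chrom, pos, alt) variant calls; a simple
# ensemble keeps variants supported by at least `min_votes` callers.
def ensemble_calls(calls_per_caller, min_votes=2):
    votes = Counter(v for calls in calls_per_caller for v in set(calls))
    return {v for v, n in votes.items() if n >= min_votes}

caller1 = {("chr1", 100, "T"), ("chr2", 55, "G")}
caller2 = {("chr1", 100, "T")}
caller3 = {("chr1", 100, "T"), ("chr3", 7, "A")}
consensus = ensemble_calls([caller1, caller2, caller3], min_votes=2)
# consensus contains only ("chr1", 100, "T")
```

Singleton calls made by only one caller are dropped, which is the basic mechanism by which ensembling suppresses caller-specific artifacts.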

  9. A wavelet analysis for the X-ray absorption spectra of molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Penfold, T. J.; Ecole polytechnique Federale de Lausanne, Laboratoire de chimie et biochimie computationnelles, ISIC, FSB-BCH, CH-1015 Lausanne; SwissFEL, Paul Scherrer Inst, CH-5232 Villigen

    2013-01-07

    We present a wavelet transform analysis for the X-ray absorption spectra of molecules. In contrast to the traditionally used Fourier transform approach, this analysis yields a 2D correlation plot in both R- and k-space. As a consequence, it is possible to distinguish between different scattering pathways at the same distance from the absorbing atom and between the contributions of single and multiple scattering events, making an unambiguous assignment of the fine structure oscillations for complex systems possible. We apply this to two previously studied transition metal complexes, namely iron hexacyanide in both its ferric and ferrous form, and a rhenium diimine complex, [ReX(CO)3(bpy)], where X = Br, Cl, or ethyl pyridine (Etpy). Our results demonstrate the potential advantages of using this approach and they highlight the importance of multiple scattering, and specifically the focusing phenomenon, to the extended X-ray absorption fine structure (EXAFS) spectra of these complexes. We also shed light on the low sensitivity of the EXAFS spectrum to the Re-X scattering pathway.

  10. Clustering Educational Digital Library Usage Data: A Comparison of Latent Class Analysis and K-Means Algorithms

    ERIC Educational Resources Information Center

    Xu, Beijie; Recker, Mimi; Qi, Xiaojun; Flann, Nicholas; Ye, Lei

    2013-01-01

    This article examines clustering as an educational data mining method. In particular, two clustering algorithms, the widely used K-means and the model-based Latent Class Analysis, are compared, using usage data from an educational digital library service, the Instructional Architect (IA.usu.edu). Using a multi-faceted approach and multiple data…
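A minimal sketch of the clustering side of such a comparison, using k-means on synthetic two-group "usage" features (the model-based latent class side would instead fit a finite mixture; the data and cluster count here are assumptions for illustration):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)
# Synthetic usage features for two kinds of users, e.g. casual vs frequent
# visitors of a digital library (means and spreads are invented).
casual = rng.normal([2.0, 1.0], 0.3, size=(30, 2))
frequent = rng.normal([8.0, 6.0], 0.3, size=(30, 2))
X = np.vstack([casual, frequent])

# k-means with k-means++ initialisation; labels assign each user a cluster.
centroids, labels = kmeans2(X, k=2, seed=1, minit="++")
```

With well-separated groups, all users in each synthetic group receive the same label; the interesting comparisons in the article arise where the groups overlap and the two algorithms disagree.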

  11. Estimating forest ecosystem evapotranspiration at multiple temporal scales with a dimension analysis approach

    Treesearch

    Guoyi Zhou; Ge Sun; Xu Wang; Chuanyan Zhou; Steven G. McNulty; James M. Vose; Devendra M. Amatya

    2008-01-01

    It is critical that evapotranspiration (ET) be quantified accurately so that scientists can evaluate the effects of land management and global change on water availability, streamflow, nutrient and sediment loading, and ecosystem productivity in watersheds. The objective of this study was to derive a new semi-empirical ET model using a dimension analysis method that...

  12. Maximizing the Information and Validity of a Linear Composite in the Factor Analysis Model for Continuous Item Responses

    ERIC Educational Resources Information Center

    Ferrando, Pere J.

    2008-01-01

    This paper develops results and procedures for obtaining linear composites of factor scores that maximize: (a) test information, and (b) validity with respect to external variables in the multiple factor analysis (FA) model. I treat FA as a multidimensional item response theory model, and use Ackerman's multidimensional information approach based…

  13. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    PubMed

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and we then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg Equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.

  14. The impact of multiple endpoint dependency on Q and I(2) in meta-analysis.

    PubMed

    Thompson, Christopher Glen; Becker, Betsy Jane

    2014-09-01

    A common assumption in meta-analysis is that effect sizes are independent. When correlated effect sizes are analyzed using traditional univariate techniques, this assumption is violated. This research assesses the impact of dependence arising from treatment-control studies with multiple endpoints on homogeneity measures Q and I(2) in scenarios using the unbiased standardized-mean-difference effect size. Univariate and multivariate meta-analysis methods are examined. Conditions included different overall outcome effects, study sample sizes, numbers of studies, between-outcomes correlations, dependency structures, and ways of computing the correlation. The univariate approach used typical fixed-effects analyses whereas the multivariate approach used generalized least-squares (GLS) estimates of a fixed-effects model, weighted by the inverse variance-covariance matrix. Increased dependence among effect sizes led to increased Type I error rates from univariate models. When effect sizes were strongly dependent, error rates were drastically higher than nominal levels regardless of study sample size and number of studies. In contrast, using GLS estimation to account for multiple-endpoint dependency maintained error rates within nominal levels. Conversely, mean I(2) values were not greatly affected by increased amounts of dependency. Last, we point out that the between-outcomes correlation should be estimated as a pooled within-groups correlation rather than using a full-sample estimator that does not consider treatment/control group membership. Copyright © 2014 John Wiley & Sons, Ltd.
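The GLS fixed-effects estimate described above (weighting the stacked effect sizes by the inverse of their covariance matrix) can be sketched with two hypothetical two-endpoint studies. All numbers are invented for illustration:

```python
import numpy as np

# Effect sizes from two studies, each reporting two correlated endpoints,
# stacked into one vector; V is the block-diagonal covariance of the effects.
y = np.array([0.30, 0.25, 0.40, 0.35])
v = np.array([0.04, 0.04, 0.05, 0.05])   # sampling variances
r = 0.6                                  # between-outcomes correlation
V = np.zeros((4, 4))
for i in (0, 2):                         # one 2x2 block per study
    V[i, i], V[i + 1, i + 1] = v[i], v[i + 1]
    V[i, i + 1] = V[i + 1, i] = r * np.sqrt(v[i] * v[i + 1])

X = np.tile(np.eye(2), (2, 1))           # design: one pooled mean per endpoint
W = np.linalg.inv(V)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # GLS pooled endpoint means
```

The inverse variance-covariance weighting is what keeps Type I error rates at nominal levels in the dependent-endpoints scenarios studied; a univariate analysis would weight by 1/v only and ignore the off-diagonal terms.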

  15. ATDRS payload technology R & D

    NASA Technical Reports Server (NTRS)

    Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.

    1990-01-01

    Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple-Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.

  16. ATDRS payload technology research and development

    NASA Technical Reports Server (NTRS)

    Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.

    1990-01-01

    Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.

  17. ATDRS payload technology R & D

    NASA Astrophysics Data System (ADS)

    Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.

    Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple-Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.

  18. A cost-effective high-throughput metabarcoding approach powerful enough to genotype ~44 000 year-old rodent remains from Northern Africa.

    PubMed

    Guimaraes, S; Pruvost, M; Daligault, J; Stoetzel, E; Bennett, E A; Côté, N M-L; Nicolas, V; Lalis, A; Denys, C; Geigl, E-M; Grange, T

    2017-05-01

    We present a cost-effective metabarcoding approach, aMPlex Torrent, which relies on an improved multiplex PCR adapted to highly degraded DNA, combining barcoding and next-generation sequencing to simultaneously analyse many heterogeneous samples. We demonstrate the strength of these improvements by generating a phylochronology through the genotyping of ancient rodent remains from a Moroccan cave whose stratigraphy covers the last 120 000 years. Rodents are important for epidemiology, agronomy and ecological investigations and can act as bioindicators for human- and/or climate-induced environmental changes. Efficient and reliable genotyping of ancient rodent remains has the potential to deliver valuable phylogenetic and paleoecological information. The analysis of multiple ancient skeletal remains of very small size with poor DNA preservation, however, requires a sensitive high-throughput method to generate sufficient data. We show that this approach is particularly well suited to accessing this otherwise difficult taxonomic and genetic resource. As a highly scalable, lower cost and less labour-intensive alternative to targeted sequence capture approaches, we propose the aMPlex Torrent strategy as a useful tool for the genetic analysis of multiple degraded samples in studies involving ecology, archaeology, conservation and evolutionary biology. © 2016 John Wiley & Sons Ltd.

  19. Combining results of multiple search engines in proteomics.

    PubMed

    Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W

    2013-09-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.
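One simple way to see why combining engines helps: if each engine assigns a probability that a given peptide-spectrum match is correct and their evidence is treated as independent (a strong simplifying assumption; the combiners reviewed here model inter-engine dependence more carefully), agreement boosts the combined probability:

```python
# Hypothetical per-engine probabilities that a spectrum's top-ranked
# peptide match is correct; under an independence assumption the chance
# that ALL engines are wrong is the product of (1 - p_i).
def combine_probabilities(engine_probs):
    p_all_wrong = 1.0
    for p in engine_probs:
        p_all_wrong *= (1.0 - p)
    return 1.0 - p_all_wrong

combined = combine_probabilities([0.90, 0.80, 0.50])
# 1 - 0.1 * 0.2 * 0.5 = 0.99
```

Three moderately confident engines agreeing yield a higher combined confidence than any single engine, which is the intuition behind the unique correct identifications each engine contributes on top of the core correlative set.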

  20. Combining Results of Multiple Search Engines in Proteomics*

    PubMed Central

    Shteynberg, David; Nesvizhskii, Alexey I.; Moritz, Robert L.; Deutsch, Eric W.

    2013-01-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques. PMID:23720762

  1. A review of multi-risk methodologies for natural hazards: Consequences and challenges for a climate change impact assessment.

    PubMed

    Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio

    2016-03-01

    This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects, providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires selecting and aggregating suitable hazard and vulnerability metrics to synthesize information about multiple climate impacts, along with the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different experts (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. A comparison of multiple imputation methods for incomplete longitudinal binary data.

    PubMed

    Yamaguchi, Yusuke; Misumi, Toshihiro; Maruo, Kazushi

    2018-01-01

    Longitudinal binary data are commonly encountered in clinical trials. Multiple imputation is an approach for obtaining valid estimates of treatment effects under an assumption of a missing-at-random mechanism. Although there are a variety of multiple imputation methods for longitudinal binary data, few studies have reported on the relative performance of these methods. Moreover, when focusing on the treatment effect throughout a period, an endpoint often used in clinical evaluations of specific disease areas, no definitive investigations comparing the methods have been available. We conducted an extensive simulation study to examine the comparative performance of six multiple imputation methods available in the SAS MI procedure for longitudinal binary data, assessing two endpoints: responder rates at a specified time point and throughout a period. The simulation study suggested that results from the naive approaches of single imputation with non-responders and complete case analysis could be very sensitive to missing data. The multiple imputation methods using a monotone method and a full conditional specification with a logistic regression imputation model were recommended for obtaining unbiased and robust estimates of the treatment effect. The methods are illustrated with data from a mental health study.
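Whichever imputation model is used, the m completed-data analyses are combined with Rubin's rules. A minimal sketch follows; the per-imputation estimates and variances are invented, not output of the SAS MI procedure:

```python
import numpy as np

# Rubin's rules: pool a point estimate and its variance across m
# imputed datasets.
def rubin_pool(estimates, variances):
    estimates = np.asarray(estimates)
    variances = np.asarray(variances)
    m = len(estimates)
    q_bar = estimates.mean()          # pooled point estimate
    u_bar = variances.mean()          # within-imputation variance
    b = estimates.var(ddof=1)         # between-imputation variance
    t = u_bar + (1 + 1 / m) * b       # total variance
    return q_bar, t

q, t = rubin_pool([0.12, 0.15, 0.10], [0.004, 0.005, 0.004])
```

The between-imputation component b is what propagates the uncertainty due to missing data into the final standard error; the naive single-imputation approaches criticized above omit it.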

  3. A Comprehensive Strategy for Accurate Mutation Detection of the Highly Homologous PMS2.

    PubMed

    Li, Jianli; Dai, Hongzheng; Feng, Yanming; Tang, Jia; Chen, Stella; Tian, Xia; Gorman, Elizabeth; Schmitt, Eric S; Hansen, Terah A A; Wang, Jing; Plon, Sharon E; Zhang, Victor Wei; Wong, Lee-Jun C

    2015-09-01

    Germline mutations in the DNA mismatch repair gene PMS2 underlie the cancer susceptibility syndrome, Lynch syndrome. However, accurate molecular testing of PMS2 is complicated by a large number of highly homologous sequences. To establish a comprehensive approach for mutation detection of PMS2, we have designed a strategy combining targeted capture next-generation sequencing (NGS), multiplex ligation-dependent probe amplification, and long-range PCR followed by NGS to simultaneously detect point mutations and copy number changes of PMS2. Exonic deletions (E2 to E9, E5 to E9, E8, E10, E14, and E1 to E15), duplications (E11 to E12), and a nonsense mutation, p.S22*, were identified. Traditional multiplex ligation-dependent probe amplification and Sanger sequencing approaches cannot differentiate the origin of the exonic deletions in the 3' region when PMS2 and PMS2CL share identical sequences as a result of gene conversion. Our approach allows unambiguous identification of mutations in the active gene with a straightforward long-range-PCR/NGS method. Breakpoint analysis of multiple samples revealed that recurrent exon 14 deletions are mediated by homologous Alu sequences. Our comprehensive approach provides a reliable tool for accurate molecular analysis of genes containing multiple copies of highly homologous sequences and should improve PMS2 molecular analysis for patients with Lynch syndrome. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  4. [Concept analysis of a participatory approach to occupational safety and health].

    PubMed

    Yoshikawa, Etsuko

    2013-01-01

    The purpose of this study was to analyze a participatory approach to occupational safety and health, and to examine the possibility of applying the concept to the practice and research of occupational safety and health. According to Rodger's method, descriptive data concerning antecedents, attributes and consequences were qualitatively analyzed. A total of 39 articles were selected for analysis. Attributes with a participatory approach were: "active involvement of both workers and employers", "focusing on action-oriented low-cost and multiple area improvements based on good practices", "the process of emphasis on consensus building", and "utilization of a local network". Antecedents of the participatory approach were classified as: "existing risks at the workplace", "difficulty of occupational safety and health activities", "characteristics of the workplace and workers", and "needs for the workplace". The derived consequences were: "promoting occupational safety and health activities", "emphasis of self-management", "creation of safety and healthy workplace", and "contributing to promotion of quality of life and productivity". A participatory approach in occupational safety and health is defined as, the process of emphasis on consensus building to promote occupational safety and health activities with emphasis on self-management, which focuses on action-oriented low-cost and multiple area improvements based on good practices with active involvement of both workers and employers through utilization of local networks. We recommend that the role of the occupational health professional be clarified and an evaluation framework be established for the participatory approach to promote occupational safety and health activities by involving both workers and employers.

  5. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach applied to the investigation records of pilot error-related U.S. air carrier jet aircraft accidents successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  6. The Propagation of Movement Variability in Time: A Methodological Approach for Discrete Movements with Multiple Degrees of Freedom.

    PubMed

    Krüger, Melanie; Straube, Andreas; Eggert, Thomas

    2017-01-01

    In recent years, theory-building in motor neuroscience and our understanding of the synergistic control of the redundant human motor system have significantly profited from the emergence of a range of different mathematical approaches to analyze the structure of movement variability. Approaches such as the Uncontrolled Manifold method or the Noise-Tolerance-Covariance decomposition method allow to detect and interpret changes in movement coordination due to, e.g., learning, external task constraints or disease, by analyzing the structure of within-subject, inter-trial movement variability. Whereas for cyclical movements (e.g., locomotion) mathematical approaches exist to investigate the propagation of movement variability in time (e.g., time series analysis), similar approaches are missing for discrete, goal-directed movements, such as reaching. Here, we propose canonical correlation analysis as a suitable method to analyze the propagation of within-subject variability across different time points during the execution of discrete movements. While similar analyses have already been applied for discrete movements with only one degree of freedom (DoF; e.g., Pearson's product-moment correlation), canonical correlation analysis allows to evaluate the coupling of inter-trial variability across different time points along the movement trajectory for multiple-DoF effector systems, such as the arm. The theoretical analysis is illustrated by empirical data from a study on reaching movements under normal and disturbed proprioception. The results show increased movement duration, decreased movement amplitude, as well as altered movement coordination under ischemia, which results in a reduced complexity of movement control. Movement endpoint variability is not increased under ischemia. This suggests that healthy adults are able to immediately and efficiently adjust the control of complex reaching movements to compensate for the loss of proprioceptive information. Further, it is shown that, by using canonical correlation analysis, alterations in movement coordination that indicate changes in the control strategy concerning the use of motor redundancy can be detected, which represents an important methodological advance in the context of neuromechanics.
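The first canonical correlation between two multi-DoF measurement blocks can be computed from the singular values of the product of their orthonormalized data matrices (the Björck-Golub formulation). The data below are synthetic, with a single shared latent component standing in for inter-trial coupling between two time points; the dimensions and noise level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
# Joint configurations at an early time point (X) and a later time point (Y)
# across 100 trials; a shared latent component couples the two blocks.
latent = rng.normal(size=(100, 1))
X = latent @ rng.normal(size=(1, 3)) + 0.1 * rng.normal(size=(100, 3))
Y = latent @ rng.normal(size=(1, 3)) + 0.1 * rng.normal(size=(100, 3))

def first_canonical_correlation(X, Y):
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    # Orthonormalize each block; the singular values of Qx' Qy are the
    # canonical correlations between the two blocks.
    qx, _ = np.linalg.qr(X)
    qy, _ = np.linalg.qr(Y)
    return np.linalg.svd(qx.T @ qy, compute_uv=False)[0]

rho = first_canonical_correlation(X, Y)
```

With a strong shared component and weak noise, rho approaches 1; uncoupled blocks would give values near the chance level for the given sample size.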

  7. Multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi-ViewPoint Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.

  8. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine whether the approach can reduce uncertainty and increase precision. Five source group classifications were used: three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a-priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
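At its core, source apportionment with a composite fingerprint solves a constrained un-mixing problem: find non-negative source proportions whose mixed tracer signature best matches the sediment sample. A minimal sketch with invented tracer signatures follows (SIFT itself layers source-group selection and uncertainty analysis on top of this):

```python
import numpy as np
from scipy.optimize import nnls

# Tracer signatures (rows: tracers, columns: sources) and a downstream
# mixture sample; all values are illustrative.
A = np.array([[10.0, 2.0, 5.0],
              [1.0, 8.0, 3.0],
              [4.0, 4.0, 9.0]])
true_p = np.array([0.5, 0.3, 0.2])
mixture = A @ true_p          # noise-free mixture for the sketch

# Non-negative least squares recovers the source proportions; normalise
# so they sum to one.
p, _ = nnls(A, mixture)
p /= p.sum()
```

With noise-free data and discriminating tracers, the true proportions are recovered exactly; the virtual-mixture tests described above probe how far real configurations fall short of this ideal.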

  9. A novel nano-immunoassay method for quantification of proteins from CD138-purified myeloma cells: biological and clinical utility

    PubMed Central

    Misiewicz-Krzeminska, Irena; Corchete, Luis Antonio; Rojas, Elizabeta A.; Martínez-López, Joaquín; García-Sanz, Ramón; Oriol, Albert; Bladé, Joan; Lahuerta, Juan-José; Miguel, Jesús San; Mateos, María-Victoria; Gutiérrez, Norma C.

    2018-01-01

    Protein analysis in bone marrow samples from patients with multiple myeloma has been limited by the low concentration of proteins obtained after CD138+ cell selection. A novel approach based on capillary nano-immunoassay could make it possible to quantify dozens of proteins from each myeloma sample in an automated manner. Here we present a method for the accurate and robust quantification of the expression of multiple proteins extracted from CD138-purified multiple myeloma samples frozen in RLT Plus buffer, which is commonly used for nucleic acid preservation and isolation. Additionally, the biological and clinical value of this analysis for a panel of 12 proteins essential to the pathogenesis of multiple myeloma was evaluated in 63 patients with newly diagnosed multiple myeloma. The analysis of the prognostic impact of CRBN/Cereblon and IKZF1/Ikaros mRNA/protein showed that only the protein levels were able to predict progression-free survival of patients; mRNA levels were not associated with prognosis. Interestingly, high levels of Cereblon and Ikaros proteins were associated with longer progression-free survival only in patients who received immunomodulatory drugs and not in those treated with other drugs. In conclusion, the capillary nano-immunoassay platform provides a novel opportunity for automated quantification of the expression of more than 20 proteins in CD138+ primary multiple myeloma samples. PMID:29545347

  10. A novel nano-immunoassay method for quantification of proteins from CD138-purified myeloma cells: biological and clinical utility.

    PubMed

    Misiewicz-Krzeminska, Irena; Corchete, Luis Antonio; Rojas, Elizabeta A; Martínez-López, Joaquín; García-Sanz, Ramón; Oriol, Albert; Bladé, Joan; Lahuerta, Juan-José; Miguel, Jesús San; Mateos, María-Victoria; Gutiérrez, Norma C

    2018-05-01

    Protein analysis in bone marrow samples from patients with multiple myeloma has been limited by the low concentration of proteins obtained after CD138+ cell selection. A novel approach based on capillary nano-immunoassay could make it possible to quantify dozens of proteins from each myeloma sample in an automated manner. Here we present a method for the accurate and robust quantification of the expression of multiple proteins extracted from CD138-purified multiple myeloma samples frozen in RLT Plus buffer, which is commonly used for nucleic acid preservation and isolation. Additionally, the biological and clinical value of this analysis for a panel of 12 proteins essential to the pathogenesis of multiple myeloma was evaluated in 63 patients with newly diagnosed multiple myeloma. The analysis of the prognostic impact of CRBN/Cereblon and IKZF1/Ikaros mRNA/protein showed that only the protein levels were able to predict progression-free survival of patients; mRNA levels were not associated with prognosis. Interestingly, high levels of Cereblon and Ikaros proteins were associated with longer progression-free survival only in patients who received immunomodulatory drugs and not in those treated with other drugs. In conclusion, the capillary nano-immunoassay platform provides a novel opportunity for automated quantification of the expression of more than 20 proteins in CD138+ primary multiple myeloma samples. Copyright © 2018 Ferrata Storti Foundation.

  11. Two approaches to incorporate clinical data uncertainty into multiple criteria decision analysis for benefit-risk assessment of medicinal products.

    PubMed

    Wen, Shihua; Zhang, Lanju; Yang, Bo

    2014-07-01

    The Problem formulation, Objectives, Alternatives, Consequences, Trade-offs, Uncertainties, Risk attitude, and Linked decisions (PrOACT-URL) framework and multiple criteria decision analysis (MCDA) have been recommended by the European Medicines Agency for structured benefit-risk assessment of medicinal products undergoing regulatory review. The objective of this article was to provide solutions to incorporate the uncertainty from clinical data into the MCDA model when evaluating the overall benefit-risk profiles among different treatment options. Two statistical approaches, the δ-method approach and the Monte-Carlo approach, were proposed to construct the confidence interval of the overall benefit-risk score from the MCDA model as well as other probabilistic measures for comparing the benefit-risk profiles between treatment options. Both approaches can incorporate the correlation structure between clinical parameters (criteria) in the MCDA model and are straightforward to implement. The two proposed approaches were applied to a case study to evaluate the benefit-risk profile of an add-on therapy for rheumatoid arthritis (drug X) relative to placebo. It demonstrated a straightforward way to quantify the impact of the uncertainty from clinical data to the benefit-risk assessment and enabled statistical inference on evaluating the overall benefit-risk profiles among different treatment options. The δ-method approach provides a closed form to quantify the variability of the overall benefit-risk score in the MCDA model, whereas the Monte-Carlo approach is more computationally intensive but can yield its true sampling distribution for statistical inference. The obtained confidence intervals and other probabilistic measures from the two approaches enhance the benefit-risk decision making of medicinal products. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
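    The Monte-Carlo approach described in this record can be sketched as follows. The criteria names, weights, and clinical estimates below are fabricated for illustration, and criteria are sampled independently for brevity (the paper's approach also accommodates the correlation structure between criteria):

```python
import random

# Hypothetical MCDA setup: one benefit and two risk criteria, each with a
# normalized value score (mean) and a standard error from clinical data.
# All names and numbers are illustrative, not taken from the cited study.
criteria = {
    "ACR20 response":    {"weight": 0.5, "mean": 0.62, "se": 0.04},
    "serious infection": {"weight": 0.3, "mean": 0.88, "se": 0.03},
    "GI intolerance":    {"weight": 0.2, "mean": 0.75, "se": 0.05},
}

def simulate_overall_score(rng):
    """One Monte-Carlo draw of the linear additive MCDA score: each
    criterion's value score is sampled from its sampling distribution."""
    return sum(c["weight"] * rng.gauss(c["mean"], c["se"])
               for c in criteria.values())

rng = random.Random(42)
draws = sorted(simulate_overall_score(rng) for _ in range(10_000))

point_estimate = sum(c["weight"] * c["mean"] for c in criteria.values())
ci_low, ci_high = draws[249], draws[9_749]   # empirical 95% interval
print(f"overall score {point_estimate:.3f}, 95% CI ({ci_low:.3f}, {ci_high:.3f})")
```

    The δ-method alternative would instead propagate the standard errors analytically through the same linear score, trading the sampling loop for a closed-form variance.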

  12. Regression-based pediatric norms for the brief visuospatial memory test: revised and the symbol digit modalities test.

    PubMed

    Smerbeck, A M; Parrish, J; Yeh, E A; Hoogs, M; Krupp, Lauren B; Weinstock-Guttman, B; Benedict, R H B

    2011-04-01

    The Brief Visuospatial Memory Test - Revised (BVMTR) and the Symbol Digit Modalities Test (SDMT) oral-only administration are known to be sensitive to cerebral disease in adult samples, but pediatric norms are not available. A demographically balanced sample of healthy control children (N = 92) ages 6-17 was tested with the BVMTR and SDMT. Multiple regression analysis (MRA) was used to develop demographically controlled normative equations. This analysis provided equations that were then used to construct demographically adjusted z-scores for the BVMTR Trial 1, Trial 2, Trial 3, Total Learning, and Delayed Recall indices, as well as the SDMT total correct score. To demonstrate the utility of this approach, a comparison group of children with acute disseminated encephalomyelitis (ADEM) or multiple sclerosis (MS) were also assessed. We find that these visual processing tests discriminate neurological patients from controls. As the tests are validated in adult multiple sclerosis, they are likely to be useful in monitoring pediatric onset multiple sclerosis patients as they transition into adulthood.

  13. Molecular profiling of multiple myeloma: from gene expression analysis to next-generation sequencing.

    PubMed

    Agnelli, Luca; Tassone, Pierfrancesco; Neri, Antonino

    2013-06-01

    Multiple myeloma is a fatal malignant proliferation of clonal bone marrow Ig-secreting plasma cells, characterized by wide clinical, biological, and molecular heterogeneity. Herein, global gene and microRNA expression, genome-wide DNA profiling, and next-generation sequencing technology used to investigate the genomic alterations underlying the bio-clinical heterogeneity in multiple myeloma are discussed. High-throughput technologies have undoubtedly allowed a better comprehension of the molecular basis of the disease, a fine stratification, and early identification of high-risk patients, and have provided insights toward targeted therapy studies. However, such technologies are at risk of being affected by laboratory- or cohort-specific biases, and are moreover influenced by a high number of expected false positives. This aspect has a major weight in myeloma, which is characterized by large molecular heterogeneity. Therefore, meta-analysis as well as multiple approaches are desirable if not mandatory to validate the results obtained, in line with commonly accepted recommendations for tumor diagnostic/prognostic biomarker studies.

  14. Comprehensive metabolic modeling of multiple 13C-isotopomer data sets to study metabolism in perfused working hearts.

    PubMed

    Crown, Scott B; Kelleher, Joanne K; Rouf, Rosanne; Muoio, Deborah M; Antoniewicz, Maciek R

    2016-10-01

    In many forms of cardiomyopathy, alterations in energy substrate metabolism play a key role in disease pathogenesis. Stable isotope tracing in rodent heart perfusion systems can be used to determine cardiac metabolic fluxes, namely those relative fluxes that contribute to pyruvate, the acetyl-CoA pool, and pyruvate anaplerosis, which are critical to cardiac homeostasis. Methods have previously been developed to interrogate these relative fluxes using isotopomer enrichments of measured metabolites and algebraic equations to determine a predefined metabolic flux model. However, this approach is exquisitely sensitive to measurement error, thus precluding accurate relative flux parameter determination. In this study, we applied a novel mathematical approach to determine relative cardiac metabolic fluxes using 13C-metabolic flux analysis (13C-MFA) aided by multiple tracer experiments and integrated data analysis. Using 13C-MFA, we validated a metabolic network model to explain myocardial energy substrate metabolism. Four different 13C-labeled substrates were queried (i.e., glucose, lactate, pyruvate, and oleate) based on a previously published study. We integrated the analysis of the complete set of isotopomer data gathered from these mouse heart perfusion experiments into a single comprehensive network model that delineates substrate contributions to both pyruvate and acetyl-CoA pools at a greater resolution than that offered by traditional methods using algebraic equations. To our knowledge, this is the first rigorous application of 13C-MFA to interrogate data from multiple tracer experiments in the perfused heart. We anticipate that this approach can be used widely to study energy substrate metabolism in this and other similar biological systems. Copyright © 2016 the American Physiological Society.

  15. Comprehensive metabolic modeling of multiple 13C-isotopomer data sets to study metabolism in perfused working hearts

    PubMed Central

    Kelleher, Joanne K.; Rouf, Rosanne; Muoio, Deborah M.; Antoniewicz, Maciek R.

    2016-01-01

    In many forms of cardiomyopathy, alterations in energy substrate metabolism play a key role in disease pathogenesis. Stable isotope tracing in rodent heart perfusion systems can be used to determine cardiac metabolic fluxes, namely those relative fluxes that contribute to pyruvate, the acetyl-CoA pool, and pyruvate anaplerosis, which are critical to cardiac homeostasis. Methods have previously been developed to interrogate these relative fluxes using isotopomer enrichments of measured metabolites and algebraic equations to determine a predefined metabolic flux model. However, this approach is exquisitely sensitive to measurement error, thus precluding accurate relative flux parameter determination. In this study, we applied a novel mathematical approach to determine relative cardiac metabolic fluxes using 13C-metabolic flux analysis (13C-MFA) aided by multiple tracer experiments and integrated data analysis. Using 13C-MFA, we validated a metabolic network model to explain myocardial energy substrate metabolism. Four different 13C-labeled substrates were queried (i.e., glucose, lactate, pyruvate, and oleate) based on a previously published study. We integrated the analysis of the complete set of isotopomer data gathered from these mouse heart perfusion experiments into a single comprehensive network model that delineates substrate contributions to both pyruvate and acetyl-CoA pools at a greater resolution than that offered by traditional methods using algebraic equations. To our knowledge, this is the first rigorous application of 13C-MFA to interrogate data from multiple tracer experiments in the perfused heart. We anticipate that this approach can be used widely to study energy substrate metabolism in this and other similar biological systems. PMID:27496880

  16. Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang

    2016-12-01

    Centrifugal compressors often suffer various defects such as impeller cracking, resulting in forced outage of the total plant. Damage diagnostics and condition monitoring of such turbomachinery systems have become an increasingly important and powerful tool to prevent potential failure in components and reduce unplanned forced outages and further maintenance costs, while improving the reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients. These coefficients represent multiple time-frequency resolutions of a signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficients to remove possible imperfections. The Bayesian posterior-odds-ratio approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density estimated by the Welch method is utilized to evaluate the effectiveness of the Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce the dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework are demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration signals and principal components.
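    The wavelet decomposition step can be illustrated with the simplest wavelet, the Haar transform. This is a minimal sketch on a fabricated eight-sample signal; the record's method uses the wavelet *packet* transform, which also decomposes the detail branch, whereas this shows only the standard pyramid:

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform: split a signal into
    approximation (low-pass) and detail (high-pass) coefficients."""
    approx = [(a + b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

# Illustrative 8-sample vibration snippet (fabricated values).
x = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0]
levels = []
current = x
for _ in range(2):                     # two decomposition levels
    current, detail = haar_step(current)
    levels.append(detail)              # detail coefficients per level
print([round(v, 3) for v in current],
      [[round(v, 3) for v in d] for d in levels])
```

    The transform is orthonormal, so signal energy is preserved across the approximation and detail coefficients; denoising (here, Bayesian hypothesis testing per level) operates on the detail coefficients before reconstruction.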

  17. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
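    The core intuition of the INN, that terms are related when they appear in similar published contexts, can be sketched with simple context vectors and cosine similarity. The three one-line "abstracts" below are fabricated; the actual method applies latent semantic analysis (SVD over a large term-context matrix), which this sketch omits:

```python
import math
from collections import Counter

# Fabricated context snippets for three candidate terms.
docs = {
    "transition": "patient moves from acute care to self management of illness",
    "adaptation": "patient adjusts to self management of chronic illness over time",
    "readiness":  "measure of willingness to begin a new treatment program",
}

def context_vector(text):
    """Bag-of-words context vector (word -> count)."""
    return Counter(text.split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in u)     # Counter returns 0 for missing keys
    norm = lambda c: math.sqrt(sum(n * n for n in c.values()))
    return dot / (norm(u) * norm(v))

v = {term: context_vector(text) for term, text in docs.items()}
# "adaptation" shares more context with "transition" than "readiness" does.
print(cosine(v["transition"], v["adaptation"]) >
      cosine(v["transition"], v["readiness"]))
```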

  18. Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals

    PubMed Central

    Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.

    2016-01-01

    Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise-corrupted EEG signals have lower information content and, therefore, reduced complexity compared with their noise-free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
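    Multiscale entropy has two ingredients: coarse-graining the signal at increasing scales, and computing sample entropy (SampEn) at each scale. The sketch below is a simplified SampEn (tolerance taken relative to the SD of each coarse-grained series, whereas the standard algorithm fixes it on the original series), run on synthetic white noise rather than EEG:

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """Simplified SampEn: -ln of the ratio of (m+1)-point template matches
    to m-point template matches, tolerance r times the series SD."""
    mean = sum(x) / len(x)
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))
    tol = r * sd
    def count(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        return sum(all(abs(a - b) <= tol for a, b in zip(t1, t2))
                   for i, t1 in enumerate(templates)
                   for t2 in templates[i + 1:])      # excludes self-matches
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a and b else float("inf")

def coarse_grain(x, scale):
    """MSE step 1: average non-overlapping windows of length `scale`."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

rng = random.Random(0)
noise = [rng.gauss(0, 1) for _ in range(500)]
mse_curve = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2, 3)]
print([round(v, 2) for v in mse_curve])
```

    For uncorrelated noise the MSE curve falls with scale; for complex physiological signals it stays high, which is what makes the curve useful for separating artifact-laden from clean segments.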

  19. Multifractal characteristics of multiparticle production in heavy-ion collisions at SPS energies

    NASA Astrophysics Data System (ADS)

    Khan, Shaista; Ahmad, Shakeel

    Entropy, dimensions and other multifractal characteristics of multiplicity distributions of relativistic charged hadrons produced in ion-ion collisions at SPS energies are investigated. The analysis of the experimental data is carried out in terms of the phase-space bin-size dependence of multiplicity distributions following Takagi's approach. A second method is also followed to study the multifractality; it is not related to the bin width and (or) the detector resolution, but rather involves the multiplicity distribution of charged particles in full phase space in terms of information entropy and its generalization, Rényi's order-q information entropy. The findings reveal the presence of multifractal structure, a remarkable property of the fluctuations. Nearly constant values of the multifractal specific heat "c" estimated by the two different methods of analysis indicate that the parameter "c" may be used as a universal characteristic of particle production in high energy collisions. The results obtained from the analysis of the experimental data agree well with the predictions of the Monte Carlo model AMPT.
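    The Rényi order-q entropy the second method relies on is a one-line generalization of Shannon entropy. A minimal sketch, using a fabricated toy multiplicity distribution rather than SPS data:

```python
import math

def renyi_entropy(p, q):
    """Order-q Rényi entropy H_q = ln(sum_i p_i^q) / (1 - q).
    The q -> 1 limit recovers the Shannon information entropy."""
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1 - q)

# Toy multiplicity distribution P(n) (illustrative values only).
p = [0.4, 0.3, 0.2, 0.1]
for q in (1, 2, 3):
    print(q, round(renyi_entropy(p, q), 3))
```

    H_q is non-increasing in q for any distribution; the way it decreases with q is what carries the multifractal information (e.g., the specific-heat parameter "c" is extracted from this q-dependence).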

  20. A Multicriteria Decision Making Approach for Estimating the Number of Clusters in a Data Set

    PubMed Central

    Peng, Yi; Zhang, Yong; Kou, Gang; Shi, Yong

    2012-01-01

    Determining the number of clusters in a data set is an essential yet difficult step in cluster analysis. Since this task involves more than one criterion, it can be modeled as a multiple criteria decision making (MCDM) problem. This paper proposes an MCDM-based approach to estimate the number of clusters for a given data set. In this approach, MCDM methods treat different numbers of clusters as alternatives and the outputs of any clustering algorithm on validity measures as criteria. The proposed method is examined in an experimental study using three MCDM methods, the well-known k-means clustering algorithm, ten relative measures, and fifteen public-domain UCI machine learning data sets. The results show that MCDM methods work fairly well in estimating the number of clusters in the data and outperform the ten relative measures considered in the study. PMID:22870181
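    The alternatives-versus-criteria framing can be sketched with the simplest MCDM method, simple additive weighting (not one of the three methods the study evaluates). The validity scores below are fabricated; in practice each row would come from running a clustering algorithm at that k:

```python
# Fabricated validity scores for candidate cluster counts k = 2..5.
# Criteria: silhouette (benefit), Calinski-Harabasz (benefit),
# Davies-Bouldin (cost: lower is better).
alternatives = {2: (0.52, 310.0, 0.90),
                3: (0.61, 405.0, 0.70),
                4: (0.55, 380.0, 0.85),
                5: (0.41, 290.0, 1.10)}
benefit = (True, True, False)
weights = (0.4, 0.3, 0.3)

def normalize(column, is_benefit):
    """Min-max normalize a criterion column, flipping cost criteria."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) if is_benefit else (hi - v) / (hi - lo)
            for v in column]

cols = list(zip(*alternatives.values()))
norm = [normalize(c, b) for c, b in zip(cols, benefit)]
scores = {k: sum(w * norm[j][i] for j, w in enumerate(weights))
          for i, k in enumerate(alternatives)}
best_k = max(scores, key=scores.get)
print(best_k)
```

    More elaborate MCDM methods (e.g., TOPSIS or outranking methods) differ in how they aggregate the normalized criteria, but the alternatives/criteria matrix is set up the same way.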

  1. Multivariate longitudinal data analysis with mixed effects hidden Markov models.

    PubMed

    Raffa, Jesse D; Dubin, Joel A

    2015-09-01

    Multiple longitudinal responses are often collected as a means to capture relevant features of the true outcome of interest, which is often hidden and not directly measurable. We outline an approach which models these multivariate longitudinal responses as generated from a hidden disease process. We propose a class of models which uses a hidden Markov model with separate but correlated random effects between multiple longitudinal responses. This approach was motivated by a smoking cessation clinical trial, where a bivariate longitudinal response involving both a continuous and a binomial response was collected for each participant to monitor smoking behavior. A Bayesian method using Markov chain Monte Carlo is used. Comparison of separate univariate response models to the bivariate response models was undertaken. Our methods are demonstrated on the smoking cessation clinical trial dataset, and properties of our approach are examined through extensive simulation studies. © 2015, The International Biometric Society.
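    The hidden-process idea rests on standard HMM machinery. As a minimal illustration, the forward algorithm below computes the likelihood of an observation sequence under a tiny two-state HMM with fabricated parameters; the record's actual model adds correlated random effects per response and is fit by MCMC, neither of which is shown here:

```python
# Two hidden states (e.g., "abstinent" = 0, "smoking" = 1); all parameters
# are fabricated for illustration. Observations are binary (0 or 1).
states = (0, 1)
init = (0.6, 0.4)                      # initial state distribution
trans = ((0.8, 0.2), (0.3, 0.7))       # trans[r][s] = P(s at t+1 | r at t)
emit = ((0.9, 0.1), (0.2, 0.8))        # emit[s][o] = P(observe o | state s)

def forward_likelihood(obs):
    """Forward algorithm: P(obs sequence), marginalized over hidden paths."""
    alpha = [init[s] * emit[s][obs[0]] for s in states]
    for o in obs[1:]:
        alpha = [emit[s][o] * sum(alpha[r] * trans[r][s] for r in states)
                 for s in states]
    return sum(alpha)

print(round(forward_likelihood([0, 0, 1]), 6))
```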

  2. A Critical Analysis and Applied Intersectionality Framework with Intercultural Queer Couples.

    PubMed

    Chan, Christian D; Erby, Adrienne N

    2018-01-01

    Intercultural queer couples are growing at an extensive rate in the United States, exemplifying diversity across multiple dimensions (e.g., race, ethnicity, sexuality, affectional identity, gender identity) while experiencing multiple converging forms of oppression (e.g., racism, heterosexism, genderism). Given the dearth of conceptual and empirical literature that unifies both dimensions related to intercultural and queer, applied practices and research contend with a unilateral approach focusing exclusively on either intercultural or queer couples. Intersectionality theory has revolutionized critical scholarship to determine overlapping forms of oppression, decenter hegemonic structures of power relations and social contexts, and enact a social justice agenda. This article addresses the following aims: (1) an overview of the gaps eliciting unilateral approaches to intercultural queer couples; (2) an illustration of intersectionality's theoretical underpinnings as a critical approach; and (3) applications for insights in practices and research with intercultural queer couples.

  3. Multivariate meta-analysis using individual participant data.

    PubMed

    Riley, R D; Price, M J; Jackson, D; Wardle, M; Gueyffier, F; Wang, J; Staessen, J A; White, I R

    2015-06-01

    When combining results across related studies, a multivariate meta-analysis allows the joint synthesis of correlated effect estimates from multiple outcomes. Joint synthesis can improve efficiency over separate univariate syntheses, may reduce selective outcome reporting biases, and enables joint inferences across the outcomes. A common issue is that within-study correlations needed to fit the multivariate model are unknown from published reports. However, provision of individual participant data (IPD) allows them to be calculated directly. Here, we illustrate how to use IPD to estimate within-study correlations, using a joint linear regression for multiple continuous outcomes and bootstrapping methods for binary, survival and mixed outcomes. In a meta-analysis of 10 hypertension trials, we then show how these methods enable multivariate meta-analysis to address novel clinical questions about continuous, survival and binary outcomes; treatment-covariate interactions; adjusted risk/prognostic factor effects; longitudinal data; prognostic and multiparameter models; and multiple treatment comparisons. Both frequentist and Bayesian approaches are applied, with example software code provided to derive within-study correlations and to fit the models. © 2014 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd.

  4. Selected reaction monitoring mass spectrometry: a methodology overview.

    PubMed

    Ebhardt, H Alexander

    2014-01-01

    Moving past the discovery phase of proteomics, the term targeted proteomics encompasses multiple approaches investigating a certain set of proteins in more detail. One such targeted proteomics approach is the combination of liquid chromatography and selected or multiple reaction monitoring mass spectrometry (SRM, MRM). SRM-MS requires prior knowledge of the fragmentation pattern of peptides, as the presence of the analyte in a sample is determined by measuring the m/z values of predefined precursor and fragment ions. Using scheduled SRM-MS, many analytes can robustly be monitored, allowing for high-throughput sample analysis of the same set of proteins over many conditions. In this chapter, the fundamentals of SRM-MS are explained, along with an optimized SRM pipeline from assay generation to data analysis.

  5. Evaluation of DNA mixtures from database search.

    PubMed

    Chung, Yuk-Ka; Hu, Yue-Qing; Fung, Wing K

    2010-03-01

    With the aim of bridging the gap between DNA mixture analysis and DNA database search, a novel approach is proposed to evaluate the forensic evidence of DNA mixtures when the suspect is identified by the search of a database of DNA profiles. General formulae are developed for the calculation of the likelihood ratio for a two-person mixture under general situations including multiple matches and imperfect evidence. The influence of the prior probabilities on the weight of evidence under the scenario of multiple matches is demonstrated by a numerical example based on Hong Kong data. Our approach is shown to be capable of presenting the forensic evidence of DNA mixtures in a comprehensive way when the suspect is identified through database search.
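    The likelihood-ratio building block behind this work can be shown for the classic single-locus, two-person case; this textbook example is not the paper's general formulae (which additionally handle database search, multiple matches, and imperfect evidence). Allele frequencies are illustrative:

```python
def two_person_mixture_lr(p_c, p_d):
    """Textbook LR: the mixture shows alleles {A, B, C, D}, the victim is
    (A, B), the suspect is (C, D). Under Hp (victim + suspect) the evidence
    is certain; under Hd (victim + unknown) an unknown heterozygous C/D
    contributor is required, with genotype probability 2 * pC * pD."""
    return 1.0 / (2 * p_c * p_d)

# Illustrative population allele frequencies for C and D.
lr = two_person_mixture_lr(0.10, 0.05)
print(lr)
```

    In the database-search setting, this per-locus LR is then combined with prior probabilities over the database members, which is where the multiple-match scenario the authors analyze arises.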

  6. Mass Spectrometry: A Technique of Many Faces

    PubMed Central

    Olshina, Maya A.; Sharon, Michal

    2016-01-01

    Protein complexes form the critical foundation for a wide range of biological process, however understanding the intricate details of their activities is often challenging. In this review we describe how mass spectrometry plays a key role in the analysis of protein assemblies and the cellular pathways which they are involved in. Specifically, we discuss how the versatility of mass spectrometric approaches provides unprecedented information on multiple levels. We demonstrate this on the ubiquitin-proteasome proteolytic pathway, a process that is responsible for protein turnover. We follow the various steps of this degradation route and illustrate the different mass spectrometry workflows that were applied for elucidating molecular information. Overall, this review aims to stimulate the integrated use of multiple mass spectrometry approaches for analyzing complex biological systems. PMID:28100928

  7. Sit less and move more: perspectives of adults with multiple sclerosis.

    PubMed

    Aminian, Saeideh; Ezeugwu, Victor E; Motl, Robert W; Manns, Patricia J

    2017-12-20

    Multiple sclerosis is a chronic neurological disease, and its prevalence is highest in Canada. Replacing sedentary behavior with light activities may be a feasible approach to manage multiple sclerosis symptoms. This study explored the perspectives of adults with multiple sclerosis about sedentary behavior, physical activity and ways to change behavior. Fifteen adults with multiple sclerosis (age 43 ± 13 years; mean ± standard deviation), recruited through the multiple sclerosis clinic at the University of Alberta, Edmonton, Canada, participated in semi-structured interviews. Interview audio was transcribed verbatim and coded. NVivo software was used to facilitate the inductive process of thematic analysis. Balancing competing priorities between sitting and moving was the primary theme. Participants were aware of the benefits of physical activity to their overall health, and in the management of fatigue and muscle stiffness. Due to fatigue, they often chose sitting to get their energy back. Further, some barriers included perceived fear of losing balance or embarrassment while walking. Activity monitoring, accountability, and educational and individualized programs were suggested strategies to motivate more movement. Adults with multiple sclerosis were open to the idea of replacing sitting with light activities. Motivational and educational programs are required to help them replace sedentary behavior with more movement. IMPLICATIONS FOR REHABILITATION One of the most challenging and common difficulties of multiple sclerosis is walking impairment, which worsens with disease progression and is a common target in the rehabilitation of people with multiple sclerosis. The deterioration in walking abilities is related to lower levels of physical activity and more sedentary behavior, such that adults with multiple sclerosis spend 8 to 10.5 h per day sitting. Replacing prolonged sedentary behavior with light physical activities, and incorporating education, encouragement, and self-monitoring strategies, are feasible approaches to manage the symptoms of multiple sclerosis.

  8. An MPI-CUDA approach for hypersonic flows with detailed state-to-state air kinetics using a GPU cluster

    NASA Astrophysics Data System (ADS)

    Bonelli, Francesco; Tuttafesta, Michele; Colonna, Gianpiero; Cutrone, Luigi; Pascazio, Giuseppe

    2017-10-01

    This paper describes the most advanced results obtained in the context of fluid dynamic simulations of high-enthalpy flows using detailed state-to-state air kinetics. Thermochemical non-equilibrium, typical of supersonic and hypersonic flows, was modeled by using both the accurate state-to-state approach and the multi-temperature model proposed by Park. The accuracy of the two thermochemical non-equilibrium models was assessed by comparing the results with experimental findings, showing better predictions provided by the state-to-state approach. To overcome the huge computational cost of the state-to-state model, a multiple-node GPU implementation, based on an MPI-CUDA approach, was employed, and a comprehensive code performance analysis is presented. Both the pure MPI-CPU and the MPI-CUDA implementations exhibit excellent scalability. GPUs outperform CPUs especially when the state-to-state approach is employed, showing speed-ups of the single GPU with respect to the single-core CPU larger than 100, both with one MPI process and with multiple MPI processes.

  9. Comparing approaches for using climate projections in assessing water resources investments for systems with multiple stakeholder groups

    NASA Astrophysics Data System (ADS)

    Hurford, Anthony; Harou, Julien

    2015-04-01

    Climate change has challenged conventional methods of planning water resources infrastructure investment, which rely on stationarity of time-series data, and it is not clear how best to use projections of future climatic conditions. Many-objective simulation-optimisation and trade-off analysis using evolutionary algorithms has been proposed as an approach to addressing complex planning problems with multiple conflicting objectives. The search for promising assets and policies can be carried out across a range of climate projections, to identify the configurations of infrastructure investment shown by model simulation to be robust under diverse future conditions. Climate projections can be used in different ways within a simulation model to represent the range of possible future conditions and to understand how optimal investments vary according to the different hydrological conditions. We compare two approaches: optimising over an ensemble of different 20-year flow and PET time-series projections, and optimising separately for individual future scenarios built synthetically from the original ensemble. Comparing trade-off curves and surfaces generated by the two approaches helps understand the limits and benefits of optimising under different sets of conditions. The comparison is made for the Tana Basin in Kenya, where climate change combined with multiple conflicting objectives of water management and infrastructure investment makes decision-making particularly challenging.

  10. Peptidomic analysis of endogenous plasma peptides from patients with pancreatic neuroendocrine tumours.

    PubMed

    Kay, Richard G; Challis, Benjamin G; Casey, Ruth T; Roberts, Geoffrey P; Meek, Claire L; Reimann, Frank; Gribble, Fiona M

    2018-06-01

    Diagnosis of pancreatic neuroendocrine tumours requires the study of patient plasma with multiple immunoassays, using multiple aliquots of plasma. The application of mass spectrometry based techniques could reduce the cost and amount of plasma required for diagnosis. Plasma samples from two patients with pancreatic neuroendocrine tumours were extracted using an established acetonitrile based plasma peptide enrichment strategy. The circulating peptidome was characterised using nano and high flow rate LC/MS analyses. To assess the diagnostic potential of the analytical approach, a large sample batch (68 plasmas) from control subjects, and aliquots from subjects harbouring two different types of pancreatic neuroendocrine tumour (insulinoma and glucagonoma) were analysed using a 10-minute LC/MS peptide screen. The untargeted plasma peptidomics approach identified peptides derived from the glucagon prohormone, chromogranin A, chromogranin B and other peptide hormones and proteins related to control of peptide secretion. The glucagon prohormone derived peptides that were detected were compared against putative peptides that were identified using multiple antibody pairs against glucagon peptides. Comparison of the plasma samples for relative levels of selected peptides showed clear separation between the glucagonoma and the insulinoma and control samples. The combination of the organic solvent extraction methodology with high flow rate analysis could potentially be used to aid diagnosis and monitor treatment of patients with functioning pancreatic neuroendocrine tumours. However, significant validation will be required before this approach can be clinically applied. This article is protected by copyright. All rights reserved.

  11. Surrogate 239Pu(n, fxn) and 241Pu(n, fxn) average fission-neutron-multiplicity measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, J. T.; Alan, B. S.; Akindele, O. A.

    2017-09-26

    We have constructed a new neutron-charged-particle detector array called NeutronSTARS. It has been described extensively in LLNL-TR-703909 [1] and Akindele et al. [2]. We have used this new neutron-charged-particle array to measure the 241Pu and 239Pu fission-neutron multiplicity as a function of equivalent incident-neutron energy from 100 keV to 20 MeV. The experimental approach, detector array, data analysis, and results are summarized in the following sections.

  12. Multiple electron processes of He and Ne by proton impact

    NASA Astrophysics Data System (ADS)

    Terekhin, Pavel Nikolaevich; Montenegro, Pablo; Quinto, Michele; Monti, Juan; Fojon, Omar; Rivarola, Roberto

    2016-05-01

    A detailed investigation of multiple electron processes (single and multiple ionization, single capture, transfer-ionization) of He and Ne is presented for proton impact at intermediate and high collision energies. Exclusive absolute cross sections for these processes have been obtained by calculation of transition probabilities in the independent electron and independent event models as a function of impact parameter in the framework of the continuum distorted wave-eikonal initial state theory. A binomial analysis is employed to calculate exclusive probabilities. The comparison with available theoretical and experimental results shows that exclusive probabilities are needed for a reliable description of the experimental data. The developed approach can be used for obtaining the input database for modeling multiple electron processes of charged particles passing through the matter.
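
    The binomial analysis of exclusive probabilities in the independent electron model can be sketched as follows; the single-electron probability value (0.1) and the helper name are illustrative assumptions, not values from the paper:

```python
from math import comb

def exclusive_prob(p_single, n_electrons, k):
    # Independent electron model: probability that exactly k of
    # n equivalent electrons undergo the process (binomial analysis)
    return comb(n_electrons, k) * p_single**k * (1 - p_single)**(n_electrons - k)

# Hypothetical single-electron ionization probability for He (2 electrons)
p = 0.1
p_single_ion = exclusive_prob(p, 2, 1)  # exactly one electron ionized
p_double_ion = exclusive_prob(p, 2, 2)  # both electrons ionized
```

    In a full calculation, the single-electron probability would itself depend on the impact parameter and be integrated over it to obtain cross sections.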

  13. Estimating the mass variance in neutron multiplicity counting-A comparison of approaches

    NASA Astrophysics Data System (ADS)

    Dubi, C.; Croft, S.; Favalli, A.; Ocherashvili, A.; Pedersen, B.

    2017-12-01

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α, n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
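
    Of the three uncertainty methods compared, the bootstrap is the simplest to sketch. The toy example below resamples synthetic per-cycle counts through a placeholder mass estimator; the Poisson rate and the calibration constant are invented, standing in for the real mapping from factorial moments to mass:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-cycle neutron counts (a real analysis would use the
# measured PSMC cycle data and the point-model moment equations)
cycles = rng.poisson(lam=12.0, size=5000)

def estimate_mass(counts):
    # Placeholder estimator: stands in for the mapping from the first
    # three factorial moments of the count distribution to mass
    return counts.mean() * 0.37   # hypothetical calibration constant

mass = estimate_mass(cycles)

# Bootstrap: resample cycles with replacement, re-estimate, and take
# the spread of the re-estimates as the statistical uncertainty
boot = np.array([estimate_mass(rng.choice(cycles, cycles.size, replace=True))
                 for _ in range(1000)])
mass_sd = boot.std(ddof=1)
```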

  14. Estimating the mass variance in neutron multiplicity counting - A comparison of approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubi, C.; Croft, S.; Favalli, A.

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.

  15. Estimating the mass variance in neutron multiplicity counting - A comparison of approaches

    DOE PAGES

    Dubi, C.; Croft, S.; Favalli, A.; ...

    2017-09-14

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.

  16. Taking an intersectional approach to define latent classes of socioeconomic status, ethnicity and migration status for psychiatric epidemiological research.

    PubMed

    Goodwin, L; Gazard, B; Aschan, L; MacCrimmon, S; Hotopf, M; Hatch, S L

    2017-04-09

    Inequalities in mental health are well documented using individual social statuses such as socioeconomic status (SES), ethnicity and migration status. However, few studies have taken an intersectional approach to investigate inequalities in mental health using latent class analysis (LCA). This study will examine the association between multiple indicator classes of social identity with common mental disorder (CMD). Data on CMD symptoms were assessed in a diverse inner London sample of 1052 participants in the second wave of the South East London Community Health study. LCA was used to define classes of social identity using multiple indicators of SES, ethnicity and migration status. Adjusted associations between CMD and both individual indicators and multiple indicators of social identity are presented. LCA identified six groups that were differentiated by varying levels of privilege and disadvantage based on multiple SES indicators. This intersectional approach highlighted nuanced differences in odds of CMD, with the economically inactive group with multiple levels of disadvantage most likely to have a CMD. Adding ethnicity and migration status further differentiated between groups. The migrant, economically inactive and White British, economically inactive classes both had increased odds of CMD. This is the first study to examine the intersections of SES, ethnicity and migration status with CMD using LCA. Results showed that both the migrant, economically inactive and the White British, economically inactive classes had a similarly high prevalence of CMD. Findings suggest that LCA is a useful methodology for investigating health inequalities by intersectional identities.
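
    As a rough illustration of latent class analysis on binary indicators, the sketch below fits a two-class model by expectation-maximization on synthetic data. The indicator set, class count, and all probabilities are hypothetical, not the SELCoH data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic binary indicators (e.g. employment, housing tenure,
# benefits receipt, education) for 1000 respondents -- illustrative
n, d, k = 1000, 4, 2
true_p = np.array([[0.9, 0.8, 0.2, 0.1],    # "privileged" class
                   [0.2, 0.3, 0.8, 0.9]])   # "disadvantaged" class
z = rng.integers(0, k, size=n)
X = (rng.random((n, d)) < true_p[z]).astype(float)

# EM for a latent class model (conditional independence within class)
pi = np.full(k, 1.0 / k)                    # class prevalences
p = rng.uniform(0.25, 0.75, size=(k, d))    # item-response probabilities
for _ in range(200):
    # E-step: posterior class-membership probabilities
    log_post = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T + np.log(pi)
    log_post -= log_post.max(axis=1, keepdims=True)
    resp = np.exp(log_post)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update prevalences and item probabilities
    nk = resp.sum(axis=0)
    pi = nk / n
    p = np.clip((resp.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
```

    Dedicated LCA software additionally handles model selection (choosing the number of classes) and standard errors, which this sketch omits.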

  17. A National Approach for Mapping and Quantifying Habitat-based Biodiversity Metrics Across Multiple Spatial Scales

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national inte...

  18. Methods for forecasting freight in uncertainty : time series analysis of multiple factors.

    DOT National Transportation Integrated Search

    2011-01-31

    The main goal of this research was to analyze and more accurately model freight movement in : Alabama. Ultimately, the goal of this project was to provide an overall approach to the : integration of accurate freight models into transportation plans a...

  19. A Flexible Approach for the Statistical Visualization of Ensemble Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, K.; Wilson, A.; Bremer, P.

    2009-09-29

    Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.

  20. Aerodynamic analysis for aircraft with nacelles, pylons, and winglets at transonic speeds

    NASA Technical Reports Server (NTRS)

    Boppe, Charles W.

    1987-01-01

    A computational method has been developed to provide an analysis for complex realistic aircraft configurations at transonic speeds. Wing-fuselage configurations with various combinations of pods, pylons, nacelles, and winglets can be analyzed along with simpler shapes such as airfoils, isolated wings, and isolated bodies. The flexibility required for the treatment of such diverse geometries is obtained by using a multiple nested grid approach in the finite-difference relaxation scheme. Aircraft components (and their grid systems) can be added or removed as required. As a result, the computational method can be used in the same manner as a wind tunnel to study high-speed aerodynamic interference effects. The multiple grid approach also provides high boundary point density/cost ratio. High resolution pressure distributions can be obtained. Computed results are correlated with wind tunnel and flight data using four different transport configurations. Experimental/computational component interference effects are included for cases where data are available. The computer code used for these comparisons is described in the appendices.

  1. Multiple Criteria Decision Analysis for Health Care Decision Making--An Introduction: Report 1 of the ISPOR MCDA Emerging Good Practices Task Force.

    PubMed

    Thokala, Praveen; Devlin, Nancy; Marsh, Kevin; Baltussen, Rob; Boysen, Meindert; Kalo, Zoltan; Longrenn, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Ijzerman, Maarten

    2016-01-01

    Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting, objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making, and a set of techniques, known under the collective heading of multiple criteria decision analysis (MCDA), is useful for this purpose. MCDA methods are widely used in other sectors, and recently there has been an increase in health care applications. In 2014, ISPOR established an MCDA Emerging Good Practices Task Force. It was charged with establishing a common definition for MCDA in health care decision making and developing good practice guidelines for conducting MCDA to aid health care decision making. This initial ISPOR MCDA task force report provides an introduction to MCDA - it defines MCDA; provides examples of its use in different kinds of decision making in health care (including benefit-risk analysis, health technology assessment, resource allocation, portfolio decision analysis, shared patient-clinician decision making and prioritizing patients' access to services); provides an overview of the principal methods of MCDA; and describes the key steps involved. Upon reviewing this report, readers should have a solid overview of MCDA methods and their potential for supporting health care decision making. Copyright © 2016. Published by Elsevier Inc.
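
    At its simplest, one of the MCDA techniques the report surveys (weighted-sum scoring) can be illustrated as below; the options, criteria, scores and weights are invented for illustration:

```python
# Hypothetical performance matrix: three options scored 0-100 against
# three criteria (effectiveness, safety, cost-convenience; higher is
# better), with criterion weights summing to 1 -- all numbers invented
options = {"A": [80, 60, 40],
           "B": [60, 80, 70],
           "C": [50, 90, 90]}
weights = [0.5, 0.3, 0.2]

# Weighted-sum score per option; the highest total score "wins"
scores = {name: sum(w * s for w, s in zip(weights, vals))
          for name, vals in options.items()}
best = max(scores, key=scores.get)
```

    Real MCDA applications spend most of their effort on the steps before this arithmetic: structuring the criteria, scoring performance, and eliciting defensible weights.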

  2. Effect of slice thickness on brain magnetic resonance image texture analysis

    PubMed Central

    2010-01-01

    Background The accuracy of texture analysis in clinical evaluation of magnetic resonance images depends considerably on imaging arrangements and various image quality parameters. In this paper, we study the effect of slice thickness on brain tissue texture analysis using a statistical approach and classification of T1-weighted images of clinically confirmed multiple sclerosis patients. Methods We averaged the intensities of three consecutive 1-mm slices to simulate 3-mm slices. Two hundred sixty-four texture parameters were calculated for both the original and the averaged slices. Wilcoxon's signed ranks test was used to find differences between the regions of interest representing white matter and multiple sclerosis plaques. Linear and nonlinear discriminant analyses were applied with several separate training and test sets to determine the actual classification accuracy. Results Only moderate differences in the distributions of the texture parameter values for 1-mm and simulated 3-mm-thick slices were found. Our study also showed that white matter areas are well separable from multiple sclerosis plaques even if the slice thickness differs between training and test sets. Conclusions Three-millimeter-thick magnetic resonance image slices acquired with a 1.5 T clinical magnetic resonance scanner seem to be sufficient for texture analysis of multiple sclerosis plaques and white matter tissue. PMID:20955567
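
    The two core operations described in the Methods (averaging consecutive 1-mm slices and Wilcoxon's signed ranks test on paired ROI values) can be sketched as below; the image stack and texture-parameter values are synthetic, not the study's data:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)

# Average three consecutive 1-mm slices to simulate 3-mm slices
stack = rng.normal(100.0, 5.0, size=(9, 64, 64))    # nine 1-mm slices
thick = stack.reshape(3, 3, 64, 64).mean(axis=1)    # three simulated 3-mm slices

# Paired texture-parameter values (hypothetical) from white matter
# and MS plaque ROIs in the same subjects
wm = rng.normal(1.00, 0.05, size=30)
plaque = wm + rng.normal(0.08, 0.05, size=30)       # systematic shift
stat, pval = wilcoxon(wm, plaque)                   # Wilcoxon signed ranks test
```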

  3. Identification and analysis of chemical constituents and rat serum metabolites in Suan-Zao-Ren granule using ultra high performance liquid chromatography quadrupole time-of-flight mass spectrometry combined with multiple data processing approaches.

    PubMed

    Du, Yiyang; He, Bosai; Li, Qing; He, Jiao; Wang, Di; Bi, Kaishun

    2017-07-01

    Suan-Zao-Ren granule is widely used to treat insomnia in China. However, because of the complexity and diversity of the chemical compositions in traditional Chinese medicine formulas, the comprehensive analysis of constituents in vitro and in vivo is rather difficult. In our study, a method based on ultra high performance liquid chromatography with quadrupole time-of-flight mass spectrometry and the PeakView® software, using multiple data processing approaches including product ion filtering, neutral loss filtering, and mass defect filtering, was developed to characterize the ingredients and rat serum metabolites in Suan-Zao-Ren granule. A total of 101 constituents were detected in vitro. Under the same analysis conditions, 68 constituents were characterized in rat serum, including 35 prototype components and 33 metabolites. The metabolic pathways of the main components were also illustrated. Among them, the metabolic pathways of timosaponin AI were revealed for the first time. The bioactive compounds mainly underwent phase I metabolic pathways including hydroxylation, oxidation, and hydrolysis, and phase II metabolic pathways including sulfate conjugation, glucuronide conjugation, cysteine conjugation, acetylcysteine conjugation, and glutathione conjugation. In conclusion, our results showed that this analysis approach is extremely useful for in-depth pharmacological research on Suan-Zao-Ren granule and provides a chemical basis for its rational use. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. A reproducible approach to high-throughput biological data acquisition and integration

    PubMed Central

    Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A.; Miropolsky, Larissa; Sweeney, Christopher

    2015-01-01

    Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa. PMID:26157642

  5. Congruence analysis of geodetic networks - hypothesis tests versus model selection by information criteria

    NASA Astrophysics Data System (ADS)

    Lehmann, Rüdiger; Lösler, Michael

    2017-12-01

    Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the usage of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution for the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: A synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well-established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criterions like AIC.
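
    The information-criterion alternative to hypothesis testing can be sketched on a toy deformation problem; the heights, the 12 mm shift and the noise level below are invented, and the AIC form assumes a Gaussian error model with common unknown variance:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic two-epoch levelling heights; point 2 is deformed by 12 mm
h0 = np.array([10.000, 12.000, 9.500, 11.200])
h1 = h0 + np.array([0.000, 0.012, 0.001, -0.001]) + rng.normal(0, 0.002, 4)
d = h1 - h0                                  # observed displacements

def aic(residuals, n_par):
    # AIC for a Gaussian model with unknown common variance
    n = residuals.size
    return n * np.log(np.sum(residuals**2) / n) + 2 * n_par

# Null model: no deformation anywhere (no extra parameters)
aic_null = aic(d, 0)

# Alternative model: point 2 moved by an estimated amount (1 parameter)
d_alt = d.copy()
d_alt[1] = 0.0                               # residual after estimating the shift
aic_alt = aic(d_alt, 1)

selected = "deformation at point 2" if aic_alt < aic_null else "null model"
```

    Unlike a hypothesis test, no decision error rate has to be chosen: the penalized fit alone selects among the candidate deformation patterns.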

  6. Analysis of a Near Field MIMO Wireless Channel Using 5.6 GHz Dipole Antennas

    NASA Astrophysics Data System (ADS)

    Maricar, Mohamed Ismaeel; Gradoni, Gabriele; Greedy, Steve; Ivrlac, Michel T.; Nossek, Josef A.; Phang, Sendy; Creagh, Stephen C.; Tanner, Gregor; Thomas, David W. P.

    2016-05-01

    Understanding the impact of interference upon the performance of a multiple input multiple output (MIMO) based device is of paramount importance in ensuring a design is both resilient and robust. In this work the effect of element-element interference in the creation of multiple channels of a wireless link approaching the near-field regime is studied. The elements of the 2-antenna transmit- and receive-arrays are chosen to be identical folded dipole antennas operating at 5.6 GHz. We find that two equally strong channels can be created even if the antennas interact at sub-wavelength distances, thus confirming previous theoretical predictions.

  7. On the Interpretation and Use of Mediation: Multiple Perspectives on Mediation Analysis.

    PubMed

    Agler, Robert; De Boeck, Paul

    2017-01-01

    Mediation analysis has become a very popular approach in psychology, and it is one that is associated with multiple perspectives that are often at odds, often implicitly. Explicitly discussing these perspectives and their motivations, advantages, and disadvantages can help to provide clarity to conversations and research regarding the use and refinement of mediation models. We discuss five such pairs of perspectives on mediation analysis, their associated advantages and disadvantages, and their implications: with vs. without a mediation hypothesis, specific effects vs. a global model, directness vs. indirectness of causation, effect size vs. null hypothesis testing, and hypothesized vs. alternative explanations. Discussion of the perspectives is facilitated by a small simulation study. Some philosophical and linguistic considerations are briefly discussed, as well as some other perspectives we do not develop here.
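
    The basic mediation quantities common to all of these perspectives (the a and b paths and their product, the indirect effect) can be sketched on synthetic data; the variable names and effect sizes are illustrative, not from the article's simulation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic X -> M -> Y data (illustrative only)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # true a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # true b = 0.4, direct effect 0.2

# Path a: regress M on X (with intercept)
A1 = np.column_stack([np.ones(n), x])
a = np.linalg.lstsq(A1, m, rcond=None)[0][1]

# Path b: regress Y on M controlling for X
A2 = np.column_stack([np.ones(n), m, x])
coef = np.linalg.lstsq(A2, y, rcond=None)[0]
b, c_direct = coef[1], coef[2]

indirect = a * b                             # mediated (indirect) effect
```

    In practice the indirect effect is usually accompanied by a bootstrap confidence interval rather than a point estimate alone.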

  8. Multivariate meta-analysis for non-linear and other multi-parameter associations

    PubMed Central

    Gasparrini, A; Armstrong, B; Kenward, M G

    2012-01-01

    In this paper, we formalize the application of multivariate meta-analysis and meta-regression to synthesize estimates of multi-parameter associations obtained from different studies. This modelling approach extends the standard two-stage analysis used to combine results across different sub-groups or populations. The most straightforward application is for the meta-analysis of non-linear relationships, described for example by regression coefficients of splines or other functions, but the methodology easily generalizes to any setting where complex associations are described by multiple correlated parameters. The modelling framework of multivariate meta-analysis is implemented in the package mvmeta within the statistical environment R. As an illustrative example, we propose a two-stage analysis for investigating the non-linear exposure–response relationship between temperature and non-accidental mortality using time-series data from multiple cities. Multivariate meta-analysis represents a useful analytical tool for studying complex associations through a two-stage procedure. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22807043
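
    The paper implements the full random-effects machinery in the R package mvmeta; as a rough Python sketch of only the fixed-effect core of the second stage, multi-parameter estimates from several studies can be pooled with inverse-covariance weights. All numbers below are invented:

```python
import numpy as np

# Hypothetical two-parameter (e.g. spline coefficient) estimates and
# within-study covariance matrices from three cities -- invented numbers
thetas = [np.array([0.10, 0.05]),
          np.array([0.14, 0.02]),
          np.array([0.08, 0.06])]
covs = [np.diag([0.010, 0.020]),
        np.diag([0.020, 0.010]),
        np.diag([0.015, 0.015])]

# Fixed-effect multivariate pooling: weight each coefficient vector
# by the inverse of its covariance matrix
W = [np.linalg.inv(S) for S in covs]
V_pooled = np.linalg.inv(sum(W))
theta_pooled = V_pooled @ sum(w @ t for w, t in zip(W, thetas))
```

    A random-effects analysis, as in mvmeta, would add a between-study covariance component to each weight matrix.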

  9. Sedimentation equilibrium analysis of protein interactions with global implicit mass conservation constraints and systematic noise decomposition.

    PubMed

    Vistica, Jennifer; Dam, Julie; Balbo, Andrea; Yikilmaz, Emine; Mariuzza, Roy A; Rouault, Tracey A; Schuck, Peter

    2004-03-15

    Sedimentation equilibrium is a powerful tool for the characterization of protein self-association and heterogeneous protein interactions. Frequently, it is applied in a configuration with relatively long solution columns and with equilibrium profiles being acquired sequentially at several rotor speeds. The present study proposes computational tools, implemented in the software SEDPHAT, for the global analysis of equilibrium data at multiple rotor speeds with multiple concentrations and multiple optical detection methods. The detailed global modeling of such equilibrium data can be a nontrivial computational problem. It was shown previously that mass conservation constraints can significantly improve and extend the analysis of heterogeneous protein interactions. Here, a method for using conservation of mass constraints for the macromolecular redistribution is proposed in which the effective loading concentrations are calculated from the sedimentation equilibrium profiles. The approach is similar to that described by Roark (Biophys. Chem. 5 (1976) 185-196), but its utility is extended by determining the bottom position of the solution columns from the macromolecular redistribution. For analyzing heterogeneous associations at multiple protein concentrations, additional constraints that relate the effective loading concentrations of the different components or their molar ratio in the global analysis are introduced. Equilibrium profiles at multiple rotor speeds also permit the algebraic determination of radial-dependent baseline profiles, which can govern interference optical ultracentrifugation data, but usually also occur, to a smaller extent, in absorbance optical data. Finally, the global analysis of equilibrium profiles at multiple rotor speeds with implicit mass conservation and computation of the bottom of the solution column provides an unbiased scale for determining molar mass distributions of noninteracting species. The properties of these tools are studied with theoretical and experimental data sets.

  10. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present real challenges in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
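
    The classical combination rules the abstract contrasts (Fisher's, Stouffer's, and the additive method) are easy to sketch side by side; the five p-values below are invented, with one deliberate outlier:

```python
import numpy as np
from math import factorial
from scipy import stats

# P-values for one pathway from five hypothetical independent
# studies; the last one is an outlier
pvals = np.array([0.01, 0.02, 0.03, 0.04, 0.85])
k = pvals.size

# Fisher: -2*sum(log p) ~ chi-square with 2k degrees of freedom
fisher_p = stats.chi2.sf(-2 * np.log(pvals).sum(), df=2 * k)

# Stouffer: average of the normal z-scores
z = stats.norm.isf(pvals)
stouffer_p = stats.norm.sf(z.sum() / np.sqrt(k))

# Additive: left tail of the sum of k uniform(0,1) variables
# (Irwin-Hall tail, valid here because the sum is <= 1)
s = pvals.sum()
additive_p = s**k / factorial(k)
```

    Replacing the outlier 0.85 with a small p-value changes Fisher's and Stouffer's results sharply, while the additive result moves much less, which is the robustness property the bi-level framework builds on.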

  11. Limited Agreement of Independent RNAi Screens for Virus-Required Host Genes Owes More to False-Negative than False-Positive Factors

    PubMed Central

    Wang, Zhishi; Craven, Mark; Newton, Michael A.; Ahlquist, Paul

    2013-01-01

    Systematic, genome-wide RNA interference (RNAi) analysis is a powerful approach to identify gene functions that support or modulate selected biological processes. An emerging challenge shared with some other genome-wide approaches is that independent RNAi studies often show limited agreement in their lists of implicated genes. To better understand this, we analyzed four genome-wide RNAi studies that identified host genes involved in influenza virus replication. These studies collectively identified and validated the roles of 614 cell genes, but pair-wise overlap among the four gene lists was only 3% to 15% (average 6.7%). However, a number of functional categories were overrepresented in multiple studies. The pair-wise overlap of these enriched-category lists was high, ∼19%, implying more agreement among studies than apparent at the gene level. Probing this further, we found that the gene lists implicated by independent studies were highly connected in interacting networks by independent functional measures such as protein-protein interactions, at rates significantly higher than predicted by chance. We also developed a general, model-based approach to gauge the effects of false-positive and false-negative factors and to estimate, from a limited number of studies, the total number of genes involved in a process. For influenza virus replication, this novel statistical approach estimates the total number of cell genes involved to be ∼2,800. This and multiple other aspects of our experimental and computational results imply that, when following good quality control practices, the low overlap between studies is primarily due to false negatives rather than false-positive gene identifications. These results and methods have implications for and applications to multiple forms of genome-wide analysis. PMID:24068911
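
    Estimating a total gene count from the overlap of two screens is, in spirit, a capture-recapture calculation; as a sketch, the Lincoln-Petersen estimator with invented counts (the paper's actual model-based approach is more elaborate and uses all four studies):

```python
# Two hypothetical screens implicate n1 and n2 host genes, with m genes
# in common (invented counts, not the influenza studies' numbers).
# If each screen independently "captures" genes from one true set, the
# Lincoln-Petersen estimator gives the total number of involved genes.
n1, n2, m = 130, 150, 10
total_est = n1 * n2 / m           # estimated total number of genes
sensitivity_1 = n1 / total_est    # implied per-screen hit rate

# A low per-screen sensitivity (a high false-negative rate) reproduces
# the small pair-wise overlap even when both screens report few
# false positives: the expected overlap equals m by construction.
overlap_expected = total_est * sensitivity_1 * (n2 / total_est)
```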

  12. MALDI imaging mass spectrometry analysis-A new approach for protein mapping in multiple sclerosis brain lesions.

    PubMed

    Maccarrone, Giuseppina; Nischwitz, Sandra; Deininger, Sören-Oliver; Hornung, Joachim; König, Fatima Barbara; Stadelmann, Christine; Turck, Christoph W; Weber, Frank

    2017-03-15

    Multiple sclerosis is a disease of the central nervous system characterized by recurrent inflammatory demyelinating lesions in the early disease stage. Lesion formation and mechanisms leading to lesion remyelination are not fully understood. Matrix Assisted Laser Desorption Ionisation Mass Spectrometry imaging (MALDI-IMS) is a technology which analyses proteins and peptides in tissue, preserves their spatial localization, and generates molecular maps within the tissue section. In a pilot study we employed MALDI imaging mass spectrometry to profile and identify peptides and proteins expressed in normal-appearing white matter, grey matter and multiple sclerosis brain lesions with different extents of remyelination. The unsupervised clustering analysis of the mass spectra generated images which reflected the tissue section morphology in luxol fast blue stain and in myelin basic protein immunohistochemistry. Lesions with low remyelination extent were defined by compounds with molecular weight smaller than 5300Da, while more completely remyelinated lesions showed compounds with molecular weights greater than 15,200Da. An in-depth analysis of the mass spectra enabled the detection of cortical lesions which were not seen by routine luxol fast blue histology. An ion mass, mainly distributed at the rim of multiple sclerosis lesions, was identified by liquid chromatography and tandem mass spectrometry as thymosin beta-4, a protein known to be involved in cell migration and in restorative processes. The ion mass of thymosin beta-4 was profiled by MALDI imaging mass spectrometry in brain slides of 12 multiple sclerosis patients and validated by immunohistochemical analysis. In summary, our results demonstrate the ability of the MALDI-IMS technology to map proteins within the brain parenchyma and multiple sclerosis lesions and to identify potential markers involved in multiple sclerosis pathogenesis and/or remyelination. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Performance evaluation of 2D and 3D deep learning approaches for automatic segmentation of multiple organs on CT images

    NASA Astrophysics Data System (ADS)

    Zhou, Xiangrong; Yamada, Kazuma; Kojima, Takuya; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2018-02-01

    The purpose of this study is to evaluate and compare the performance of modern deep learning techniques for automatically recognizing and segmenting multiple organ regions on 3D CT images. CT image segmentation is one of the important tasks in medical image analysis and is still very challenging. Deep learning approaches have demonstrated the capability of scene recognition and semantic segmentation on natural images and have been used to address segmentation problems of medical images. Although several works showed promising results of CT image segmentation by using deep learning approaches, there is no comprehensive evaluation of the segmentation performance of deep learning on segmenting multiple organs on different portions of CT scans. In this paper, we evaluated and compared the segmentation performance of two different deep learning approaches that used 2D and 3D deep convolutional neural networks (CNNs), with and without a pre-processing step. A conventional approach that represents the state-of-the-art performance of CT image segmentation without deep learning was also used for comparison. A dataset that includes 240 CT images scanned on different portions of human bodies was used for performance evaluation. Up to 17 types of organ regions in each CT scan were segmented automatically and compared to the human annotations using the ratio of intersection over union (IU) as the criterion. The experimental results demonstrated that the segmentation results had mean IU values of 79% and 67%, averaged over the 17 organ types, for the 3D and 2D deep CNNs, respectively. All the results of the deep learning approaches showed better accuracy and robustness than the conventional segmentation method that used probabilistic atlas and graph-cut methods. The effectiveness and usefulness of deep learning approaches were demonstrated for solving the multiple-organ segmentation problem on 3D CT images.
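
    The evaluation criterion, the ratio of intersection over union, can be computed on binary masks as below; the toy masks are illustrative:

```python
import numpy as np

# Toy binary masks: predicted vs. ground-truth organ region
pred = np.zeros((8, 8), dtype=bool)
truth = np.zeros((8, 8), dtype=bool)
pred[2:6, 2:6] = True     # 4x4 predicted region
truth[3:7, 3:7] = True    # 4x4 annotated region, offset by one pixel

inter = np.logical_and(pred, truth).sum()   # 3x3 overlap = 9 pixels
union = np.logical_or(pred, truth).sum()    # 16 + 16 - 9 = 23 pixels
iou = inter / union
```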

  14. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. 
Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
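After fitting the logistic regression separately on each imputed dataset, results are combined with Rubin's rules. A minimal sketch with made-up estimates (not the study's actual regression output):

```python
import numpy as np

def rubin_pool(estimates, variances):
    """Combine point estimates and squared standard errors from analyses
    of m imputed datasets into one pooled estimate and its total
    variance (Rubin's rules)."""
    m = len(estimates)
    q_bar = float(np.mean(estimates))       # pooled point estimate
    u_bar = float(np.mean(variances))       # within-imputation variance
    b = float(np.var(estimates, ddof=1))    # between-imputation variance
    t = u_bar + (1.0 + 1.0 / m) * b         # total variance
    return q_bar, t

# Hypothetical log-odds estimates from m = 5 imputed copies of a dataset
est = [0.52, 0.48, 0.55, 0.50, 0.47]
var = [0.010, 0.011, 0.009, 0.010, 0.012]
q, t = rubin_pool(est, var)
```

The between-imputation term is what distinguishes multiple imputation from single imputation: it propagates the uncertainty about the missing values into the pooled standard error.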

  15. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
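The idea of an effective number of tests can be illustrated with a generic eigenvalue-based estimator in the style of Nyholt (2004); this is for illustration only and is not the specific estimator derived in the paper:

```python
import numpy as np

def effective_tests(corr):
    """Effective number of tests from the eigenvalue spread of a
    correlation matrix of gene-set test statistics: the more the
    eigenvalues deviate from 1, the fewer effectively independent
    tests there are."""
    lam = np.linalg.eigvalsh(corr)
    m = corr.shape[0]
    return 1.0 + (m - 1) * (1.0 - np.var(lam) / m)

m = 5
m_indep = effective_tests(np.eye(m))        # independent sets: M_eff = M
m_corr = effective_tests(np.ones((m, m)))   # total genic overlap: M_eff shrinks
alpha_fwer = 0.05 / m_corr                  # Bonferroni at M_eff tests
```

Family-wise error control then divides the nominal alpha by the effective, rather than nominal, number of gene sets.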

  16. A MAD-Bayes Algorithm for State-Space Inference and Clustering with Application to Querying Large Collections of ChIP-Seq Data Sets.

    PubMed

    Zuo, Chandler; Chen, Kailei; Keleş, Sündüz

    2017-06-01

Current analytic approaches for querying large collections of chromatin immunoprecipitation followed by sequencing (ChIP-seq) data from multiple cell types rely on individual analysis of each data set (i.e., peak calling) independently. This approach discards the fact that functional elements are frequently shared among related cell types and leads to overestimation of the extent of divergence between different ChIP-seq samples. Methods geared toward multisample investigations have limited applicability in settings that aim to integrate 100s to 1000s of ChIP-seq data sets for query loci (e.g., thousands of genomic loci with a specific binding site). Recently, Zuo et al. developed a hierarchical framework for state-space matrix inference and clustering, named MBASIC, to enable joint analysis of user-specified loci across multiple ChIP-seq data sets. Although this versatile framework both estimates the underlying state-space (e.g., bound vs. unbound) and groups loci with similar patterns together, its Expectation-Maximization-based estimation structure hinders its applicability to large numbers of loci and samples. We address this limitation by developing a MAP-based asymptotic derivations from Bayes (MAD-Bayes) framework for MBASIC. This results in a K-means-like optimization algorithm that converges rapidly and hence enables exploring multiple initialization schemes and flexibility in tuning. Comparison with MBASIC indicates that this speed comes at a relatively insignificant loss in estimation accuracy. Although MAD-Bayes MBASIC is specifically designed for the analysis of user-specified loci, it is able to capture overall patterns of histone marks from multiple ChIP-seq data sets similar to those identified by genome-wide segmentation methods such as ChromHMM and Spectacle.
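Small-variance (MAD-Bayes) asymptotics classically turn a Dirichlet-process mixture into the K-means-like DP-means algorithm, in which a new cluster is opened whenever a point lies farther than a penalty threshold from every center. The sketch below illustrates that style of algorithm, not the MBASIC-specific derivation:

```python
import numpy as np

def dp_means(X, lam, iters=25):
    """DP-means: K-means-like clustering from small-variance asymptotics.
    A point farther than sqrt(lam) from all centers opens a new cluster."""
    centers = [X.mean(axis=0)]
    assign = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        for i, x in enumerate(X):
            d2 = np.array([np.sum((x - c) ** 2) for c in centers])
            if d2.min() > lam:
                centers.append(x.copy())       # open a new cluster
                assign[i] = len(centers) - 1
            else:
                assign[i] = int(d2.argmin())
        centers = [X[assign == j].mean(axis=0) if np.any(assign == j)
                   else centers[j] for j in range(len(centers))]
    used = np.unique(assign)                   # drop empty clusters
    return np.array(centers)[used], np.searchsorted(used, assign)

# Two well-separated toy groups of loci in a 2D feature space
X = np.array([[0, 0], [0.5, 0], [0, 0.5],
              [10, 10], [10.5, 10], [10, 10.5]], dtype=float)
centers, assign = dp_means(X, lam=4.0)
```

Because each sweep is a simple assignment/update pass, it converges far faster than an EM iteration over a full probabilistic model.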

  17. Robust biological parametric mapping: an improved technique for multimodal brain image analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.

    2011-03-01

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
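The core idea of swapping ordinary least squares for a robust estimator inside a general linear model can be sketched with Huber-weighted iteratively reweighted least squares; this is a generic illustration with synthetic data, not the paper's exact estimator or inference procedure:

```python
import numpy as np

def huber_irls(X, y, k=1.345, iters=30):
    """Huber-weighted IRLS: downweights large residuals so that a few
    outlying observations (e.g., mis-registered subjects) do not drive
    the fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        s = s if s > 0 else 1.0
        w = np.minimum(1.0, k * s / np.maximum(np.abs(r), 1e-12))
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

# A linear structure-function trend with one gross outlier
x = np.arange(20.0)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
y[-1] += 100.0                                    # the outlier
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]   # pulled by the outlier
beta_rob = huber_irls(X, y)                       # close to (1, 2)
```

On clean data the robust fit matches OLS almost exactly, which is why the loss of power is small.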

  18. How does spatial extent of fMRI datasets affect independent component analysis decomposition?

    PubMed

    Aragri, Adriana; Scarabino, Tommaso; Seifritz, Erich; Comani, Silvia; Cirillo, Sossio; Tedeschi, Gioacchino; Esposito, Fabrizio; Di Salle, Francesco

    2006-09-01

    Spatial independent component analysis (sICA) of functional magnetic resonance imaging (fMRI) time series can generate meaningful activation maps and associated descriptive signals, which are useful to evaluate datasets of the entire brain or selected portions of it. Besides computational implications, variations in the input dataset combined with the multivariate nature of ICA may lead to different spatial or temporal readouts of brain activation phenomena. By reducing and increasing a volume of interest (VOI), we applied sICA to different datasets from real activation experiments with multislice acquisition and single or multiple sensory-motor task-induced blood oxygenation level-dependent (BOLD) signal sources with different spatial and temporal structure. Using receiver operating characteristics (ROC) methodology for accuracy evaluation and multiple regression analysis as benchmark, we compared sICA decompositions of reduced and increased VOI fMRI time-series containing auditory, motor and hemifield visual activation occurring separately or simultaneously in time. Both approaches yielded valid results; however, the results of the increased VOI approach were spatially more accurate compared to the results of the decreased VOI approach. This is consistent with the capability of sICA to take advantage of extended samples of statistical observations and suggests that sICA is more powerful with extended rather than reduced VOI datasets to delineate brain activity. (c) 2006 Wiley-Liss, Inc.
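The ROC methodology used for accuracy evaluation reduces, for a single component map, to scoring voxels against a ground-truth activation mask; the area under the ROC curve can be computed directly via the rank-sum identity. A minimal sketch with toy numbers (not the study's fMRI data):

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability that a randomly chosen active voxel scores higher than
    a randomly chosen inactive one (ties count half)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy component map: active voxels tend to score higher
scores = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.2])
labels = np.array([1, 1, 0, 1, 0, 0])
auc = roc_auc(scores, labels)   # 8 / 9
```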

  19. Integrative Data Analysis of Multi-Platform Cancer Data with a Multimodal Deep Learning Approach.

    PubMed

    Liang, Muxuan; Li, Zhizhong; Chen, Ting; Zeng, Jianyang

    2015-01-01

    Identification of cancer subtypes plays an important role in revealing useful insights into disease pathogenesis and advancing personalized therapy. The recent development of high-throughput sequencing technologies has enabled the rapid collection of multi-platform genomic data (e.g., gene expression, miRNA expression, and DNA methylation) for the same set of tumor samples. Although numerous integrative clustering approaches have been developed to analyze cancer data, few of them are particularly designed to exploit both deep intrinsic statistical properties of each input modality and complex cross-modality correlations among multi-platform input data. In this paper, we propose a new machine learning model, called multimodal deep belief network (DBN), to cluster cancer patients from multi-platform observation data. In our integrative clustering framework, relationships among inherent features of each single modality are first encoded into multiple layers of hidden variables, and then a joint latent model is employed to fuse common features derived from multiple input modalities. A practical learning algorithm, called contrastive divergence (CD), is applied to infer the parameters of our multimodal DBN model in an unsupervised manner. Tests on two available cancer datasets show that our integrative data analysis approach can effectively extract a unified representation of latent features to capture both intra- and cross-modality correlations, and identify meaningful disease subtypes from multi-platform cancer data. In addition, our approach can identify key genes and miRNAs that may play distinct roles in the pathogenesis of different cancer subtypes. Among those key miRNAs, we found that the expression level of miR-29a is highly correlated with survival time in ovarian cancer patients. 
These results indicate that our multimodal DBN based data analysis approach may have practical applications in cancer pathogenesis studies and provide useful guidelines for personalized cancer therapy.
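Each layer of a DBN is a restricted Boltzmann machine trained with contrastive divergence. Below is a minimal CD-1 update on toy binary patterns; it shows the generic RBM learning rule, not the paper's multimodal architecture or its data:

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_epoch(V, W, b, c, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM, the
    building block trained layer by layer in a deep belief network.
    Returns the reconstructed visibles before the update."""
    ph = sigmoid(V @ W + c)                    # positive phase: P(h=1|v)
    h = (rng.random(ph.shape) < ph) * 1.0      # sample hidden units
    pv = sigmoid(h @ W.T + b)                  # reconstruct visibles
    ph2 = sigmoid(pv @ W + c)                  # negative phase
    n = len(V)
    W += lr * (V.T @ ph - pv.T @ ph2) / n
    b += lr * (V - pv).mean(axis=0)
    c += lr * (ph - ph2).mean(axis=0)
    return pv

# Toy binary patterns standing in for one input modality
V = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 10, dtype=float)
W = rng.normal(0, 0.1, (4, 2))
b, c = np.zeros(4), np.zeros(2)
err0 = np.mean((V - cd1_epoch(V, W, b, c)) ** 2)   # before training
for _ in range(500):
    pv = cd1_epoch(V, W, b, c)
err = np.mean((V - pv) ** 2)   # reconstruction error drops with training
```

In a multimodal DBN, per-modality stacks of such layers are learned first, and a joint layer is then trained on their concatenated hidden representations.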

  20. Supraorbital Versus Endoscopic Endonasal Approaches for Olfactory Groove Meningiomas: A Cost-Minimization Study.

    PubMed

    Gandhoke, Gurpreet S; Pease, Matthew; Smith, Kenneth J; Sekula, Raymond F

    2017-09-01

    To perform a cost-minimization study comparing the supraorbital and endoscopic endonasal (EEA) approach with or without craniotomy for the resection of olfactory groove meningiomas (OGMs). We built a decision tree using probabilities of gross total resection (GTR) and cerebrospinal fluid (CSF) leak rates with the supraorbital approach versus EEA with and without additional craniotomy. The cost (not charge or reimbursement) at each "stem" of this decision tree for both surgical options was obtained from our hospital's finance department. After a base case calculation, we applied plausible ranges to all parameters and carried out multiple 1-way sensitivity analyses. Probabilistic sensitivity analyses confirmed our results. The probabilities of GTR (0.8) and CSF leak (0.2) for the supraorbital craniotomy were obtained from our series of 5 patients who underwent a supraorbital approach for the resection of an OGM. The mean tumor volume was 54.6 cm³ (range, 17-94.2 cm³). Literature-reported rates of GTR (0.6) and CSF leak (0.3) with EEA were applied to our economic analysis. Supraorbital craniotomy was the preferred strategy, with an expected value of $29,423, compared with an EEA cost of $83,838. On multiple 1-way sensitivity analyses, supraorbital craniotomy remained the preferred strategy, with a minimum cost savings of $46,000 and a maximum savings of $64,000. Probabilistic sensitivity analysis found the lowest cost difference between the 2 surgical options to be $37,431. Compared with EEA, supraorbital craniotomy provides substantial cost savings in the treatment of OGMs. Given the potential differences in effectiveness between approaches, a cost-effectiveness analysis should be undertaken. Copyright © 2017 Elsevier Inc. All rights reserved.
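The expected value of a decision-tree arm is the probability-weighted sum of its stem costs. A minimal sketch with hypothetical stem costs (the paper's figures come from the hospital finance department and are not reproduced here):

```python
def expected_cost(p_gtr, p_leak, cost):
    """Expected value of one decision-tree arm: each combination of
    gross total resection (GTR) and CSF-leak outcome contributes its
    stem cost weighted by its probability (outcomes treated as
    independent in this toy example)."""
    ev = 0.0
    for gtr, p_g in ((True, p_gtr), (False, 1.0 - p_gtr)):
        for leak, p_l in ((True, p_leak), (False, 1.0 - p_leak)):
            ev += p_g * p_l * cost[(gtr, leak)]
    return ev

# Hypothetical stem costs, keyed by (GTR achieved, CSF leak occurred)
cost = {(True, False): 20_000, (True, True): 35_000,
        (False, False): 30_000, (False, True): 45_000}
# Supraorbital arm with the paper's probabilities (GTR 0.8, leak 0.2)
ev_supraorbital = expected_cost(0.8, 0.2, cost)
```

One-way sensitivity analysis then re-evaluates this expected value while sweeping a single probability or cost over its plausible range.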

  1. Isolation with Migration Models for More Than Two Populations

    PubMed Central

    Hey, Jody

    2010-01-01

    A method for studying the divergence of multiple closely related populations is described and assessed. The approach of Hey and Nielsen (2007, Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics. Proc Natl Acad Sci USA. 104:2785–2790) for fitting an isolation-with-migration model was extended to the case of multiple populations with a known phylogeny. Analysis of simulated data sets reveals the kinds of history that are accessible with a multipopulation analysis. Necessarily, processes associated with older time periods in a phylogeny are more difficult to estimate; and histories with high levels of gene flow are particularly difficult with more than two populations. However, for histories with modest levels of gene flow, or for very large data sets, it is possible to study large complex divergence problems that involve multiple closely related populations or species. PMID:19955477

  2. Isolation with migration models for more than two populations.

    PubMed

    Hey, Jody

    2010-04-01

    A method for studying the divergence of multiple closely related populations is described and assessed. The approach of Hey and Nielsen (2007, Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics. Proc Natl Acad Sci USA. 104:2785-2790) for fitting an isolation-with-migration model was extended to the case of multiple populations with a known phylogeny. Analysis of simulated data sets reveals the kinds of history that are accessible with a multipopulation analysis. Necessarily, processes associated with older time periods in a phylogeny are more difficult to estimate; and histories with high levels of gene flow are particularly difficult with more than two populations. However, for histories with modest levels of gene flow, or for very large data sets, it is possible to study large complex divergence problems that involve multiple closely related populations or species.

  3. Approach for validating actinide and fission product compositions for burnup credit criticality safety analyses

    DOE PAGES

    Radulescu, Georgeta; Gauld, Ian C.; Ilas, Germina; ...

    2014-11-01

    This paper describes a depletion code validation approach for criticality safety analysis using burnup credit for actinide and fission product nuclides in spent nuclear fuel (SNF) compositions. The technical basis for determining the uncertainties in the calculated nuclide concentrations is comparison of calculations to available measurements obtained from destructive radiochemical assay of SNF samples. Probability distributions developed for the uncertainties in the calculated nuclide concentrations were applied to the SNF compositions of a criticality safety analysis model by the use of a Monte Carlo uncertainty sampling method to determine bias and bias uncertainty in effective neutron multiplication factor. Application of the Monte Carlo uncertainty sampling approach is demonstrated for representative criticality safety analysis models of pressurized water reactor spent fuel pool storage racks and transportation packages using burnup-dependent nuclide concentrations calculated with SCALE 6.1 and the ENDF/B-VII nuclear data. Furthermore, the validation approach and results support a recent revision of the U.S. Nuclear Regulatory Commission Interim Staff Guidance 8.
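The Monte Carlo uncertainty sampling idea can be caricatured as repeatedly perturbing the nuclide concentrations according to their uncertainty distributions and observing the spread of the resulting multiplication factor. The sketch below uses invented uncertainty summaries and a toy linear keff sensitivity model; the real analysis uses SCALE-computed compositions and full criticality calculations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented (mean, sd) summaries for calculated-to-measured nuclide
# concentration ratios, and toy keff sensitivities -- illustrative only
ratio = {"U-235": (1.00, 0.02), "Pu-239": (0.98, 0.03), "Sm-149": (1.05, 0.08)}
sens = {"U-235": 0.35, "Pu-239": 0.20, "Sm-149": -0.04}   # d(keff)/d(ln N)
k_nominal = 0.94

samples = np.empty(2000)
for s in range(samples.size):
    k = k_nominal
    for nuc, (mu, sd) in ratio.items():
        k += sens[nuc] * np.log(rng.normal(mu, sd))  # sampled composition
    samples[s] = k

bias = samples.mean() - k_nominal   # composition-driven bias in keff
k95 = np.percentile(samples, 95)    # upper percentile used for margin
```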

  4. Addressing Gender Equity in Nonfaculty Salaries.

    ERIC Educational Resources Information Center

    Toutkoushian, Robert K.

    2000-01-01

    Discusses methodology of gender equity studies on noninstructional employees of colleges and universities, including variable selection in the multiple regression model and alternative approaches for measuring wage gaps. Analysis of staff data at one institution finds that experience and market differences account for 80 percent of gender pay…

  5. Analysis of norovirus contamination of seafood

    USDA-ARS?s Scientific Manuscript database

    The study of human norovirus (NoVs) replication in vitro would be a highly useful tool to virologists and immunologists. For this reason, we have searched for new approaches to determine viability of noroviruses in food samples (especially sea food). Our research team has multiple years of experie...

  6. Action and Organizational Learning in an Elevator Company

    ERIC Educational Resources Information Center

    De Loo, Ivo

    2006-01-01

    Purpose: To highlight the relevance of management control in action learning programs that aim to foster organizational learning. Design/methodology/approach: Literature review plus case study. The latter consists of archival analysis and multiple interviews. Findings: When action learning programs are built around singular learning experiences,…

  7. An Introduction to Modern Missing Data Analyses

    ERIC Educational Resources Information Center

    Baraldi, Amanda N.; Enders, Craig K.

    2010-01-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous to traditional techniques (e.g. deletion and mean imputation techniques) because they require less stringent assumptions and mitigate the pitfalls of traditional…

  8. Multiple Hypnotizabilities: Differentiating the Building Blocks of Hypnotic Response

    ERIC Educational Resources Information Center

    Woody, Erik Z.; Barnier, Amanda J.; McConkey, Kevin M.

    2005-01-01

    Although hypnotizability can be conceptualized as involving component subskills, standard measures do not differentiate them from a more general unitary trait, partly because the measures include limited sets of dichotomous items. To overcome this, the authors applied full-information factor analysis, a sophisticated analytic approach for…

  9. Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2013-01-01

    This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…

  10. Psychobiological operationalization of RDoC constructs: Methodological and conceptual opportunities and challenges.

    PubMed

    MacNamara, Annmarie; Phan, K Luan

    2016-03-01

    NIMH's Research Domain Criteria (RDoC) project seeks to advance the diagnosis, prevention, and treatment of mental disorders by promoting psychobiological research on dimensional constructs that might cut across traditional diagnostic boundaries (Kozak & Cuthbert, ). At the core of this approach is the notion that these dimensional constructs can be assessed across different units of analysis (e.g., genes, physiology, behavior), enriching the constructs and providing more complete explanations of clinical problems. While the conceptual aspects of RDoC have been discussed in several prior papers, its methodological aspects have received comparatively less attention. For example, how to integrate data from different units of analysis has been relatively unclear. Here, we discuss one means of psychobiologically operationalizing RDoC constructs across different units of analysis (the psychoneurometric approach; Yancey et al., ), highlighting ways in which this approach might be refined in future iterations. We conclude that there is much to be learned from this technique; however, greater attention to scale-development methods and to psychometrics will likely benefit this and other methodological approaches to combining measurements across multiple units of analysis. © 2016 Society for Psychophysiological Research.

  11. Method for Identifying Probable Archaeological Sites from Remotely Sensed Data

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Comer, Douglas C.; Priebe, Carey E.; Sussman, Daniel

    2011-01-01

    Archaeological sites are being compromised or destroyed at a catastrophic rate in most regions of the world. The best solution to this problem is for archaeologists to find and study these sites before they are compromised or destroyed. One way to facilitate the necessary rapid, wide area surveys needed to find these archaeological sites is through the generation of maps of probable archaeological sites from remotely sensed data. We describe an approach for identifying probable locations of archaeological sites over a wide area based on detecting subtle anomalies in vegetative cover through a statistically based analysis of remotely sensed data from multiple sources. We further developed this approach under a recent NASA ROSES Space Archaeology Program project. Under this project we refined and elaborated this statistical analysis to compensate for potential slight mis-registrations between the remote sensing data sources and the archaeological site location data. We also explored data quantization approaches (required by the statistical analysis approach), and we identified a superior data quantization approach based on a unique image segmentation method. In our presentation we will summarize our refined approach and demonstrate the effectiveness of the overall approach with test data from Santa Catalina Island off the southern California coast. Finally, we discuss our future plans for further improving our approach.

  12. A multiple-point geostatistical approach to quantifying uncertainty for flow and transport simulation in geologically complex environments

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.

    2011-12-01

    In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. 
The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
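The exceedance-probability indicator used to characterize the plume boundary is simply the cell-by-cell fraction of realizations above the threshold. A minimal sketch with toy random fields standing in for transport-simulation outputs:

```python
import numpy as np

def exceedance_probability(realizations, threshold):
    """Cell-by-cell fraction of equally probable realizations whose
    simulated concentration exceeds the threshold; values near 1 mark
    the likely plume interior, intermediate values an uncertain
    boundary."""
    stack = np.stack(realizations)   # shape (n_realizations, ny, nx)
    return (stack > threshold).mean(axis=0)

rng = np.random.default_rng(0)
# Toy concentration fields, one per stochastic geologic realization
reals = [rng.random((10, 10)) for _ in range(100)]
p_exceed = exceedance_probability(reals, 0.5)
```

Each realization is generated from the training image (e.g., by SNESIM), run through the flow and transport simulator, and contributes one vote per cell.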

  13. A novel approach to analyzing lung cancer mortality disparities: Using the exposome and a graph-theoretical toolchain

    PubMed Central

    Juarez, Paul D; Hood, Darryl B; Rogers, Gary L; Baktash, Suzanne H; Saxton, Arnold M; Matthews-Juarez, Patricia; Im, Wansoo; Cifuentes, Myriam Patricia; Phillips, Charles A; Lichtveld, Maureen Y; Langston, Michael A

    2017-01-01

    Objectives: The aim is to identify exposures associated with lung cancer mortality and mortality disparities by race and gender using an exposome database coupled to a graph theoretical toolchain. Methods: Graph theoretical algorithms were employed to extract paracliques from correlation graphs using associations between 2162 environmental exposures and lung cancer mortality rates in 2067 counties, with clique doubling applied to compute an absolute threshold of significance. Factor analysis and multiple linear regressions then were used to analyze differences in exposures associated with lung cancer mortality and mortality disparities by race and gender. Results: While cigarette consumption was highly correlated with rates of lung cancer mortality for both white men and women, previously unidentified novel exposures were more closely associated with lung cancer mortality and mortality disparities for blacks, particularly black women. Conclusions: Exposures beyond smoking moderate lung cancer mortality and mortality disparities by race and gender. Policy Implications: An exposome approach and database coupled with scalable combinatorial analytics provides a powerful new approach for analyzing relationships between multiple environmental exposures, pathways and health outcomes. An assessment of multiple exposures is needed to appropriately translate research findings into environmental public health practice and policy. PMID:29152601
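The first step of such a toolchain is building the correlation graph on which clique and paraclique extraction operate. A minimal sketch with synthetic columns (hypothetical data, not the 2162-exposure database):

```python
import numpy as np

def correlation_graph(data, threshold):
    """Adjacency matrix of the correlation graph: columns of `data` are
    vertices (exposures, mortality rates); an edge joins two vertices
    whose Pearson correlation magnitude reaches the threshold."""
    corr = np.corrcoef(data, rowvar=False)
    adj = np.abs(corr) >= threshold
    np.fill_diagonal(adj, False)   # no self-loops
    return adj

rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
# Column 1 is a linear function of column 0; column 2 is independent
data = np.column_stack([x0, 2.0 * x0 + 0.1, rng.normal(size=200)])
adj = correlation_graph(data, 0.8)   # only the first two columns connect
```

Dense subgraphs (paracliques) of this graph then correspond to tightly co-varying groups of exposures and outcomes.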

  14. Epigenomics of Hypertension

    PubMed Central

    Liang, Mingyu; Cowley, Allen W.; Mattson, David L.; Kotchen, Theodore A.; Liu, Yong

    2013-01-01

    Multiple genes and pathways are involved in the pathogenesis of hypertension. Epigenomic studies of hypertension are beginning to emerge and hold great promise of providing novel insights into the mechanisms underlying hypertension. Epigenetic marks or mediators including DNA methylation, histone modifications, and non-coding RNA can be studied at a genome or near-genome scale using epigenomic approaches. At the single gene level, several studies have identified changes in epigenetic modifications in genes expressed in the kidney that correlate with the development of hypertension. Systematic analysis and integration of epigenetic marks at the genome scale, demonstration of cellular and physiological roles of specific epigenetic modifications, and investigation of inheritance are among the major challenges and opportunities for future epigenomic and epigenetic studies of hypertension. Essential hypertension is a multifactorial disease involving multiple genetic and environmental factors and mediated by alterations in multiple biological pathways. Because the non-genetic mechanisms may involve epigenetic modifications, epigenomics is one of the latest concepts and approaches brought to bear on hypertension research. In this article, we summarize briefly the concepts and techniques for epigenomics, discuss the rationale for applying epigenomic approaches to study hypertension, and review the current state of this research area. PMID:24011581

  15. Modelling consequences of change in biodiversity and ...

    EPA Pesticide Factsheets

    This chapter offers an assessment of the rapidly changing landscape of methods for assessing and forecasting the benefits that people receive from nature and how these benefits are shaped by institutions and various anthropogenic assets. There has been an explosion of activity in understanding and modeling the benefits that people receive from nature, and this explosion has provided a diversity of approaches that are both complementary and contradictory. However, there remain major gaps in what current models can do. They are not well suited to estimate most types of benefits at national, regional, or global scales. They are focused on decision analysis but have not addressed implementation, learning, or dialogue. This gap in particular means that current models are not well suited to bridging among multiple knowledge systems; however, initial efforts have been made toward this goal. Furthermore, while participatory social-ecological scenarios are able to bridge multiple knowledge systems in their assessment and analysis of multiple ecosystem services, the social-ecological scenarios community is fragmented and not well connected. Consequently, IPBES has an excellent knowledge base to build upon, but a real investment in building a more integrated modeling and scenarios community of practice is needed to produce a more complete and useful toolbox of approaches to meet the needs of IPBES assessments and other assessments of nature's benefits. This chapter describes

  16. Parametric optimization of multiple quality characteristics in laser cutting of Inconel-718 by using hybrid approach of multiple regression analysis and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Shrivastava, Prashant Kumar; Pandey, Arun Kumar

    2018-06-01

    Inconel-718 is in high demand in different industries due to its superior mechanical properties. Traditional cutting methods face difficulties in cutting these alloys due to their low thermal conductivity, low elasticity, and high chemical reactivity at elevated temperatures. The challenges of machining and/or finishing unusual shapes and/or sizes in these materials are also faced by traditional machining. Laser beam cutting may be applied for miniaturization and ultra-precision cutting and/or finishing by appropriate control of the different process parameters. This paper presents multi-objective optimization of the kerf deviation, kerf width, and kerf taper in the laser cutting of Inconel-718 sheet. Second-order regression models have been developed for the different quality characteristics from the experimental data. The regression models have been used as objective functions for multi-objective optimization based on the hybrid approach of multiple regression analysis and genetic algorithm. The comparison of optimization results to experimental results shows an improvement of 88%, 10.63% and 42.15% in kerf deviation, kerf width and kerf taper, respectively. Finally, the effects of the different process parameters on the quality characteristics are also discussed.
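    The hybrid strategy described here, fitting second-order regression models and then searching them with a genetic algorithm, can be sketched as follows. The quadratic surrogate coefficients, parameter bounds, and GA settings below are illustrative assumptions, not values from the paper, and a single surrogate stands in for the combination of kerf responses:

    ```python
    import random

    random.seed(0)

    # Hypothetical second-order (quadratic) regression surrogate for one quality
    # characteristic (e.g., kerf taper) in two coded process parameters x1, x2.
    # Coefficients are illustrative only, not fitted values from the paper.
    def surrogate(x):
        x1, x2 = x
        return 1.0 + 0.8*x1 - 0.5*x2 + 0.3*x1*x2 + 0.6*x1**2 + 0.4*x2**2

    def genetic_minimize(f, bounds, pop_size=40, generations=60,
                         mutation_sigma=0.1, elite=2):
        """Minimize f over the box `bounds` with a simple real-coded GA."""
        lo, hi = bounds
        pop = [[random.uniform(lo, hi), random.uniform(lo, hi)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=f)
            next_pop = pop[:elite]                      # elitism
            while len(next_pop) < pop_size:
                # tournament selection of two parents
                p1 = min(random.sample(pop, 3), key=f)
                p2 = min(random.sample(pop, 3), key=f)
                # arithmetic crossover plus Gaussian mutation, clipped to bounds
                child = [min(hi, max(lo,
                         0.5*(a + b) + random.gauss(0, mutation_sigma)))
                         for a, b in zip(p1, p2)]
                next_pop.append(child)
            pop = next_pop
        return min(pop, key=f)

    best = genetic_minimize(surrogate, bounds=(-1.0, 1.0))
    print(best, surrogate(best))
    ```

    In the multi-objective setting, the same search would run over a weighted or Pareto-based combination of the kerf deviation, kerf width, and kerf taper surrogates.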

  17. A Scalable Approach for Discovering Conserved Active Subnetworks across Species

    PubMed Central

    Verfaillie, Catherine M.; Hu, Wei-Shou; Myers, Chad L.

    2010-01-01

    Overlaying differential changes in gene expression on protein interaction networks has proven to be a useful approach to interpreting the cell's dynamic response to a changing environment. Despite successes in finding active subnetworks in the context of a single species, the idea of overlaying lists of differentially expressed genes on networks has not yet been extended to support the analysis of multiple species' interaction networks. To address this problem, we designed a scalable, cross-species network search algorithm, neXus (Network - cross(X)-species - Search), that discovers conserved, active subnetworks based on parallel differential expression studies in multiple species. Our approach leverages functional linkage networks, which provide more comprehensive coverage of functional relationships than physical interaction networks by combining heterogeneous types of genomic data. We applied our cross-species approach to identify conserved modules that are differentially active in stem cells relative to differentiated cells based on parallel gene expression studies and functional linkage networks from mouse and human. We find hundreds of conserved active subnetworks enriched for stem cell-associated functions such as cell cycle, DNA repair, and chromatin modification processes. Using a variation of this approach, we also find a number of species-specific networks, which likely reflect mechanisms of stem cell function that have diverged between mouse and human. We assess the statistical significance of the subnetworks by comparing them with subnetworks discovered on random permutations of the differential expression data. We also describe several case examples that illustrate the utility of comparative analysis of active subnetworks. PMID:21170309

  18. Occupant traffic estimation through structural vibration sensing

    NASA Astrophysics Data System (ADS)

    Pan, Shijia; Mirshekari, Mostafa; Zhang, Pei; Noh, Hae Young

    2016-04-01

    The number of people passing through different indoor areas is useful in various smart structure applications, including occupancy-based building energy/space management, marketing research, security, etc. Existing approaches to estimating occupant traffic include vision-, sound-, and radio-based (mobile) sensing methods, which have placement limitations (e.g., requiring line-of-sight, a quiet environment, or carrying a device at all times). Such limitations make these direct sensing approaches difficult to deploy and maintain. An indirect approach using geophones to measure floor vibration induced by footsteps can be utilized instead. The main challenge, however, lies in distinguishing multiple simultaneous walkers: developing features that effectively represent the number of mixed signals and characterizing those features under different traffic conditions. This paper presents a method for monitoring multiple walkers. Once the vibration signals are obtained, features are extracted to describe the overlapping vibration signals induced by multiple footsteps and are used for occupant traffic estimation. In particular, we focus on analyzing the efficiency and limitations of the four selected key features when used for estimating various traffic conditions. We characterize these features with signals collected from controlled impulse load tests as well as from multiple people walking through a real-world sensing area. In our experiments, the system achieves a mean estimation error of +/-0.2 people for different occupant traffic conditions (from one to four people) using a k-nearest neighbor classifier.
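    The final estimation step, classifying extracted vibration features with a k-nearest neighbor classifier, can be illustrated with a minimal sketch. The feature vectors (here, a hypothetical signal energy and peak count per window) and their walker-count labels are invented for illustration, not data from the study:

    ```python
    from collections import Counter
    import math

    # Hypothetical training data: each row pairs a feature vector extracted from
    # an overlapping floor-vibration window (e.g., [signal energy, peak count])
    # with the number of simultaneous walkers. Values are illustrative only.
    train = [
        ([1.0, 3],  1), ([1.2, 4],  1),
        ([2.1, 6],  2), ([2.3, 7],  2),
        ([3.2, 9],  3), ([3.0, 10], 3),
        ([4.1, 13], 4), ([4.3, 12], 4),
    ]

    def knn_predict(x, train, k=3):
        """Classify feature vector x by majority vote of its k nearest neighbors."""
        nearest = sorted(train, key=lambda row: math.dist(x, row[0]))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    print(knn_predict([2.2, 6.5], train))
    ```

    With real data, each window's feature vector would come from the four key features the paper analyzes, and k would be tuned on held-out traffic conditions.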

  19. Proteomic analysis of formalin-fixed paraffin embedded tissue by MALDI imaging mass spectrometry

    PubMed Central

    Casadonte, Rita; Caprioli, Richard M

    2012-01-01

    Archived formalin-fixed paraffin-embedded (FFPE) tissue collections represent a valuable informational resource for proteomic studies. Multiple FFPE core biopsies can be assembled in a single block to form tissue microarrays (TMAs). We describe a protocol for analyzing protein in FFPE-TMAs using matrix-assisted laser desorption/ionization (MALDI) imaging mass spectrometry (IMS). The workflow incorporates an antigen retrieval step following deparaffinization, in situ trypsin digestion, matrix application, and then mass spectrometry signal acquisition. IMS allows direct analysis of multiple FFPE-TMA tissue samples in a single experiment without extraction and purification of proteins. The advantages of high speed and throughput, easy sample handling and excellent reproducibility make this technology a favorable approach for the proteomic analysis of clinical research cohorts with large sample numbers. For example, TMA analysis of 300 FFPE cores would typically require 6 h of total time through data acquisition, not including data analysis. PMID:22011652

  20. Analysis of bHLH coding genes using gene co-expression network approach.

    PubMed

    Srivastava, Swati; Sanchita; Singh, Garima; Singh, Noopur; Srivastava, Gaurava; Sharma, Ashok

    2016-07-01

    Network analysis provides a powerful framework for the interpretation of data. It uses novel reference-network-based metrics for module evolution, which can be used to identify modules of highly connected genes showing variation in a co-expression network. In this study, a co-expression network-based approach was used to analyze genes from microarray data. Our approach consists of a simple but robust rank-based network construction. The publicly available gene expression data of Solanum tuberosum under cold and heat stresses were used to create and analyze a gene co-expression network. The analysis provided highly co-expressed modules of bHLH-coding genes based on correlation values. Our approach was to analyze the variation of gene expression across the time periods of stress through the co-expression network. As a result, seed genes were identified showing multiple connections with other genes in the same cluster. The seed genes were found to vary across the different time periods of stress. These seed genes may be further utilized as marker genes for developing stress-tolerant plant species.
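    A rank-based co-expression network of the kind described can be sketched in a few lines: compute rank (Spearman) correlations between expression profiles, connect highly correlated gene pairs, and report the most connected ("seed") genes. The gene names, expression values, and correlation threshold below are invented for illustration:

    ```python
    # Made-up expression profiles (rows: genes, columns: time points under stress).
    expr = {
        "bHLH1": [2.0, 3.1, 4.2, 5.0, 6.1],
        "bHLH2": [1.0, 2.2, 3.0, 4.1, 5.2],   # co-expressed with bHLH1
        "bHLH3": [6.0, 5.1, 4.0, 3.2, 2.1],   # anti-correlated
        "bHLH4": [2.1, 3.0, 4.4, 4.9, 6.0],   # co-expressed with bHLH1
    }

    def ranks(values):
        """Rank of each value within its profile (no ties assumed)."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order):
            r[i] = rank
        return r

    def spearman(x, y):
        """Spearman correlation computed as Pearson correlation on ranks."""
        rx, ry = ranks(x), ranks(y)
        n = len(rx)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        vx = sum((a - mx) ** 2 for a in rx) ** 0.5
        vy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (vx * vy)

    # Build the network: connect gene pairs whose rank correlation exceeds 0.9.
    genes = list(expr)
    edges = [(g, h) for i, g in enumerate(genes) for h in genes[i + 1:]
             if spearman(expr[g], expr[h]) > 0.9]

    # "Seed" genes are the most highly connected nodes in the resulting module.
    degree = {g: sum(g in e for e in edges) for g in genes}
    seed = max(degree, key=degree.get)
    print(edges, seed)
    ```

    Repeating this construction per stress time period would show how module membership and seed genes shift over time, as the abstract describes.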

  1. Multiple diffraction in an icosahedral Al-Cu-Fe quasicrystal

    NASA Astrophysics Data System (ADS)

    Fan, C. Z.; Weber, Th.; Deloudi, S.; Steurer, W.

    2011-07-01

    In order to reveal its influence on quasicrystal structure analysis, multiple diffraction (MD) effects in an icosahedral Al-Cu-Fe quasicrystal have been investigated in-house on an Oxford Diffraction four-circle diffractometer equipped with an Onyx™ CCD area detector and MoKα radiation. For that purpose, an automated approach for Renninger scans (ψ-scans) has been developed. Two weak reflections were chosen as the main reflections (called P) in the present measurements. As is well known for periodic crystals, it is also observed for this quasicrystal that the intensity of the main reflection may significantly increase if the simultaneous (H) and the coupling (P-H) reflections are both strong, while there is no obvious MD effect if one of them is weak. The occurrence of MD events during ψ-scans has been studied based on an ideal structure model and the kinematical MD theory. The reliability of the approach is revealed by the good agreement between simulation and experiment. It shows that the multiple diffraction effect is quite significant.

  2. Bayesian Estimation of Pneumonia Etiology: Epidemiologic Considerations and Applications to the Pneumonia Etiology Research for Child Health Study

    PubMed Central

    Fu, Wei; Shi, Qiyuan; Prosperi, Christine; Wu, Zhenke; Hammitt, Laura L.; Feikin, Daniel R.; Baggett, Henry C.; Howie, Stephen R.C.; Scott, J. Anthony G.; Murdoch, David R.; Madhi, Shabir A.; Thea, Donald M.; Brooks, W. Abdullah; Kotloff, Karen L.; Li, Mengying; Park, Daniel E.; Lin, Wenyi; Levine, Orin S.; O’Brien, Katherine L.; Zeger, Scott L.

    2017-01-01

    Abstract In pneumonia, specimens are rarely obtained directly from the infection site, the lung, so the pathogen causing infection is determined indirectly from multiple tests on peripheral clinical specimens, which may have imperfect and uncertain sensitivity and specificity; inference about the cause is therefore complex. Analytic approaches have included expert review of case-only results, case–control logistic regression, latent class analysis, and attributable fraction, but each has serious limitations and none naturally integrates multiple test results. The Pneumonia Etiology Research for Child Health (PERCH) study required an analytic solution appropriate for a case–control design that could incorporate evidence from multiple specimens from cases and controls and that accounted for measurement error. We describe a Bayesian integrated approach we developed that combined and extended elements of attributable fraction and latent class analyses to meet some of these challenges, and we illustrate the advantage it confers regarding the challenges identified for other methods. PMID:28575370

  3. Non-invasive prenatal diagnosis of achondroplasia and thanatophoric dysplasia: next-generation sequencing allows for a safer, more accurate, and comprehensive approach

    PubMed Central

    Chitty, Lyn S; Mason, Sarah; Barrett, Angela N; McKay, Fiona; Lench, Nicholas; Daley, Rebecca; Jenkins, Lucy A

    2015-01-01

    Abstract Objective Accurate prenatal diagnosis of genetic conditions can be challenging and usually requires invasive testing. Here, we demonstrate the potential of next-generation sequencing (NGS) for the analysis of cell-free DNA in maternal blood to transform prenatal diagnosis of monogenic disorders. Methods Analysis of cell-free DNA using a PCR and restriction enzyme digest (PCR–RED) was compared with a novel NGS assay in pregnancies at risk of achondroplasia and thanatophoric dysplasia. Results PCR–RED was performed in 72 cases and was correct in 88.6%, inconclusive in 7% with one false negative. NGS was performed in 47 cases and was accurate in 96.2% with no inconclusives. Both approaches were used in 27 cases, with NGS giving the correct result in the two cases inconclusive with PCR–RED. Conclusion NGS provides an accurate, flexible approach to non-invasive prenatal diagnosis of de novo and paternally inherited mutations. It is more sensitive than PCR–RED and is ideal when screening a gene with multiple potential pathogenic mutations. These findings highlight the value of NGS in the development of non-invasive prenatal diagnosis for other monogenic disorders. © 2015 The Authors. Prenatal Diagnosis published by John Wiley & Sons, Ltd. What's already known about this topic? Non-invasive prenatal diagnosis (NIPD) using PCR-based methods has been reported for the detection or exclusion of individual paternally inherited or de novo alleles in maternal plasma. What does this study add? NIPD using next generation sequencing provides an accurate, more sensitive approach which can be used to detect multiple mutations in a single assay and so is ideal when screening a gene with multiple potential pathogenic mutations. Next generation sequencing thus provides a flexible approach to non-invasive prenatal diagnosis ideal for use in a busy service laboratory. PMID:25728633

  4. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    PubMed

    Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N; Meng, Fande

    2015-01-01

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales remains an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario for decreasing P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales, from field to farm to watershed to region. Further, model estimates showed that targeting at multiple scales is necessary to optimize program efficiency. The integrated modeling approach described here, which selects and places BMPs at varying levels of implementation, provides a new theoretical basis and technical guidance for diffuse pollution management in agricultural watersheds.

  5. Stability analysis of nonlinear autonomous systems - General theory and application to flutter

    NASA Technical Reports Server (NTRS)

    Smith, L. L.; Morino, L.

    1975-01-01

    The analysis makes use of a singular perturbation method, the multiple time scaling. Concepts of stable and unstable limit cycles are introduced. The solution is obtained in the form of an asymptotic expansion. Numerical results are presented for the nonlinear flutter of panels and airfoils in supersonic flow. The approach used is an extension of a method for analyzing nonlinear panel flutter reported by Morino (1969).

  6. Gene Therapy for Fracture Repair

    DTIC Science & Technology

    2007-05-01

    Methods: We have adopted the Agilent rat oligomer chip to analyze our fracture RNA in our microarray analysis. This chip has 20,046 unique gene...signal during fluorescent labeling of the cDNA. This approach is highly advantageous for reducing the RNA input into the system, minimizing the numbers...perform the analysis on these extremely limited samples without pooling the RNA from multiple individuals. We are therefore able to analyze the

  7. Qualitative and quantitative analysis of heparin and low molecular weight heparins using size exclusion chromatography with multiple angle laser scattering/refractive index and inductively coupled plasma/mass spectrometry detectors.

    PubMed

    Ouyang, Yilan; Zeng, Yangyang; Yi, Lin; Tang, Hong; Li, Duxin; Linhardt, Robert J; Zhang, Zhenqing

    2017-11-03

    Heparin, a highly sulfated glycosaminoglycan, has been used as a clinical anticoagulant for over 80 years. Low molecular weight heparins (LMWHs), heparins partially depolymerized using different processes, are widely used as clinical anticoagulants. Qualitative molecular weight (MW) analysis and quantitative mass content analysis are two important factors that contribute to LMWH quality control. Size exclusion chromatography (SEC), relying on multiple angle laser scattering (MALS)/refractive index (RI) detectors, has been developed for accurate analysis of heparin MW in the absence of standards. However, the cations, which ion-pair with the anionic polysaccharide chains of heparin and LMWHs, had not been considered in previous reports. In this study, SEC with MALS/RI and inductively coupled plasma/mass spectrometry detectors was used in a comprehensive analytical approach that accounts for both the anionic polysaccharide chains and the ion-paired cations in heparin products. This approach was also applied to quantitative analysis of heparin and LMWHs. Full profiles of MWs and mass recoveries for three commercial heparin/LMWH products, heparin sodium, enoxaparin sodium and nadroparin calcium, were obtained, and all showed higher MWs than previously reported. This important improvement more precisely characterizes the MW properties of heparin/LMWHs and potentially many other anionic polysaccharides. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique for identification of single and combinatorial histone modifications. MS has now overtaken antibody-based strategies due to its automation, high resolution, and accurate quantitation. Moreover, multiple analytical approaches have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS, with a focus on the most recent improvements. We speculate that the workflow for histone analysis at its state of the art is highly reliable in terms of identification and quantitation accuracy, and it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets. © 2017 Elsevier Inc. All rights reserved.

  9. Systems-level mechanisms of action of Panax ginseng: a network pharmacological approach.

    PubMed

    Park, Sa-Yoon; Park, Ji-Hun; Kim, Hyo-Su; Lee, Choong-Yeol; Lee, Hae-Jeung; Kang, Ki Sung; Kim, Chang-Eop

    2018-01-01

    Panax ginseng has been used since ancient times based on traditional Asian medicine theory and clinical experience, and it is currently one of the most popular herbs in the world. To date, most studies concerning P. ginseng have focused on specific mechanisms of action of individual constituents. However, in spite of many studies on the molecular mechanisms of P. ginseng, it remains unclear how the multiple active ingredients of P. ginseng interact with multiple targets simultaneously, giving multidimensional effects on various conditions and diseases. In order to decipher the systems-level mechanism of the multiple ingredients of P. ginseng, a novel approach is needed beyond conventional reductive analysis. We aim to review the systems-level mechanism of P. ginseng by adopting a novel analytical framework, network pharmacology. Here, we constructed a compound-target network of P. ginseng using experimentally validated and machine learning-based prediction results. The targets of the network were analyzed in terms of related biological processes, pathways, and diseases. The majority of targets were found to be related to primary metabolic process, signal transduction, nitrogen compound metabolic process, blood circulation, immune system process, cell-cell signaling, biosynthetic process, and neurological system process. In pathway enrichment analysis of the targets, terms related to neural activity showed significant enrichment and formed a cluster. Finally, relative degree analysis of the target-disease associations of P. ginseng revealed several categories of related diseases, including respiratory, psychiatric, and cardiovascular diseases.

  10. Resources for health promotion: rhetoric, research and reality.

    PubMed

    Minke, Sharlene Wolbeck; Raine, Kim D; Plotnikoff, Ronald C; Anderson, Donna; Khalema, Ernest; Smith, Cynthia

    2007-01-01

    Canadian political discourse supports the importance of health promotion and advocates the allocation of health resources to health promotion. Furthermore, the current literature frequently identifies financial and human resources as important elements of organizational capacity for health promotion. In the Alberta Heart Health Project (AHHP), we sought to learn if the allocation of health resources in a regionalized health system was congruent with the espoused support for health promotion in Alberta, Canada. The AHHP used a mixed method approach in a time series design. Participants were drawn from multiple organizational levels (i.e., service providers, managers, board members) across all Regional Health Authorities (RHAs). Data were triangulated through multiple collection methods, primarily an organizational capacity survey, analysis of organizational documents, focus groups, and personal interviews. Analysis techniques were drawn from quantitative (i.e., frequency distributions, ANOVAs) and qualitative (i.e., content and thematic analysis) approaches. In most cases, small amounts (<5%) of financial resources were allocated to health promotion in RHAs' core budgets. Respondents reported seeking multiple sources of public health financing to support their health promotion initiatives. Human resources for health promotion were characterized by fragmented responsibilities and short-term work. Furthermore, valuable human resources were consumed in ongoing searches for funding that typically covered short time periods. Resource allocations to health promotion in Alberta RHAs are inconsistent with the current emphasis on health promotion as an organizational priority. Inadequate and unstable funding erodes the RHAs' capacity for health promotion. Sustainable health promotion calls for the assured allocation of adequate, sustainable financial resources.

  11. The economics of project analysis: Optimal investment criteria and methods of study

    NASA Technical Reports Server (NTRS)

    Scriven, M. C.

    1979-01-01

    Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential and its components. This involves a critique of economic investment criteria viewed in relation to requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.

  12. Behavioral Modeling of Adversaries with Multiple Objectives in Counterterrorism.

    PubMed

    Mazicioglu, Dogucan; Merrick, Jason R W

    2018-05-01

    Attacker/defender models have primarily assumed that each decisionmaker optimizes the cost of the damage inflicted and its economic repercussions from their own perspective. Two streams of recent research have sought to extend such models. One stream suggests that it is more realistic to consider attackers with multiple objectives, but this research has not included the adaption of the terrorist with multiple objectives to defender actions. The other stream builds off experimental studies that show that decisionmakers deviate from optimal rational behavior. In this article, we extend attacker/defender models to incorporate multiple objectives that a terrorist might consider in planning an attack. This includes the tradeoffs that a terrorist might consider and their adaption to defender actions. However, we must also consider experimental evidence of deviations from the rationality assumed in the commonly used expected utility model in determining such adaption. Thus, we model the attacker's behavior using multiattribute prospect theory to account for the attacker's multiple objectives and deviations from rationality. We evaluate our approach by considering an attacker with multiple objectives who wishes to smuggle radioactive material into the United States and a defender who has the option to implement a screening process to hinder the attacker. We discuss the problems with implementing such an approach, but argue that research in this area must continue to avoid misrepresenting terrorist behavior in determining optimal defensive actions. © 2017 Society for Risk Analysis.
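    A multiattribute prospect-theory evaluation of the kind used to model the attacker can be sketched with the standard Tversky-Kahneman value function. The attribute set, weights, and reference points below are illustrative assumptions, not the article's elicited values; only the curvature and loss-aversion parameters are the commonly cited empirical estimates:

    ```python
    # Illustrative multiattribute prospect-theory value: each attribute outcome
    # is evaluated against a reference point with the Tversky-Kahneman value
    # function, then combined with (assumed) additive attribute weights.
    ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25  # typical empirical estimates

    def pt_value(x, reference=0.0):
        """Value of a single-attribute outcome relative to a reference point."""
        gain = x - reference
        if gain >= 0:
            return gain ** ALPHA
        return -LAMBDA * ((-gain) ** BETA)    # losses loom larger than gains

    def multiattribute_value(outcomes, weights, references):
        """Weighted sum of per-attribute prospect-theory values."""
        return sum(w * pt_value(x, r)
                   for x, w, r in zip(outcomes, weights, references))

    # Hypothetical attacker objectives for one attack option: casualties
    # inflicted, media attention gained, and resources spent (a loss relative
    # to its reference point). Numbers are invented for illustration.
    v = multiattribute_value(outcomes=[10.0, 5.0, -3.0],
                             weights=[0.5, 0.3, 0.2],
                             references=[0.0, 0.0, 0.0])
    print(v)
    ```

    In an attacker/defender analysis, the defender's screening options would shift the attacker's outcomes and reference points, and the attacker would be modeled as choosing the option with the highest such value.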

  13. Experiences with Probabilistic Analysis Applied to Controlled Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Giesy, Daniel P.

    2004-01-01

    This paper presents a semi-analytic method for computing frequency dependent means, variances, and failure probabilities for arbitrarily large-order closed-loop dynamical systems possessing a single uncertain parameter or with multiple highly correlated uncertain parameters. The approach will be shown to not suffer from the same computational challenges associated with computing failure probabilities using conventional FORM/SORM techniques. The approach is demonstrated by computing the probabilistic frequency domain performance of an optimal feed-forward disturbance rejection scheme.

  14. Quantitative glycomics.

    PubMed

    Orlando, Ron

    2010-01-01

    The ability to quantitatively determine changes is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished. These include label-free approaches and strategies where an isotopic label is incorporated into the glycans prior to analysis. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  15. Quantitative analysis of glycoprotein glycans.

    PubMed

    Orlando, Ron

    2013-01-01

    The ability to quantitatively determine changes in N- and O-linked glycans is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished, including both label-free approaches and isotopic labeling strategies. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  16. A Bayesian Approach to Integrated Ecological and Human Health Risk Assessment for the South River, Virginia Mercury-Contaminated Site.

    PubMed

    Harris, Meagan J; Stinson, Jonah; Landis, Wayne G

    2017-07-01

    We conducted a regional-scale integrated ecological and human health risk assessment by applying the relative risk model with Bayesian networks (BN-RRM) to a case study of the South River, Virginia mercury-contaminated site. Risk to four ecological services of the South River (human health, water quality, recreation, and the recreational fishery) was evaluated using a multiple stressor-multiple endpoint approach. These four ecological services were selected as endpoints based on stakeholder feedback and prioritized management goals for the river. The BN-RRM approach allowed for the calculation of relative risk to 14 biotic, human health, recreation, and water quality endpoints from chemical and ecological stressors in five risk regions of the South River. Results indicated that water quality and the recreational fishery were the ecological services at highest risk in the South River. Human health risk for users of the South River was low relative to the risk to other endpoints. Risk to recreation in the South River was moderate with little spatial variability among the five risk regions. Sensitivity and uncertainty analysis identified stressors and other parameters that influence risk for each endpoint in each risk region. This research demonstrates a probabilistic approach to integrated ecological and human health risk assessment that considers the effects of chemical and ecological stressors across the landscape. © 2017 Society for Risk Analysis.

  17. Reduced rank models for travel time estimation of low order mode pulses.

    PubMed

    Chandrayadula, Tarun K; Wage, Kathleen E; Worcester, Peter F; Dzieciuch, Matthew A; Mercer, James A; Andrew, Rex K; Howe, Bruce M

    2013-10-01

    Mode travel time estimation in the presence of internal waves (IWs) is a challenging problem. IWs perturb the sound speed, which results in travel time wander and mode scattering. A standard approach to travel time estimation is to pulse compress the broadband signal, pick the peak of the compressed time series, and average the peak time over multiple receptions to reduce variance. The peak-picking approach implicitly assumes there is a single strong arrival and does not perform well when there are multiple arrivals due to scattering. This article presents a statistical model for the scattered mode arrivals and uses the model to design improved travel time estimators. The model is based on an Empirical Orthogonal Function (EOF) analysis of the mode time series. Range-dependent simulations and data from the Long-range Ocean Acoustic Propagation Experiment (LOAPEX) indicate that the modes are represented by a small number of EOFs. The reduced-rank EOF model is used to construct a travel time estimator based on the Matched Subspace Detector (MSD). Analysis of simulation and experimental data show that the MSDs are more robust to IW scattering than peak picking. The simulation analysis also highlights how IWs affect the mode excitation by the source.
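    The reduced-rank EOF model and matched-subspace idea can be sketched with synthetic data: the EOF basis comes from an SVD of stacked receptions, and a subspace-energy statistic measures how much of a new time series falls in that basis. The arrival patterns, noise levels, and rank below are illustrative assumptions, not LOAPEX values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated broadband mode time series: each row is one reception whose
    # arrival structure lies in the span of two patterns (a rank-2 process)
    # plus noise, mimicking a main arrival and a scattered arrival.
    t = np.linspace(-1.0, 1.0, 200)
    pattern1 = np.exp(-(t / 0.1) ** 2)            # unperturbed arrival
    pattern2 = np.exp(-((t - 0.3) / 0.1) ** 2)    # internal-wave-scattered arrival
    coeffs = rng.normal(1.0, 0.3, size=(50, 2))
    receptions = coeffs @ np.vstack([pattern1, pattern2])
    receptions += 0.01 * rng.standard_normal(receptions.shape)

    # Empirical Orthogonal Functions: the right singular vectors of the data
    # matrix, ordered by explained variance; keep a reduced-rank basis.
    _, s, vt = np.linalg.svd(receptions, full_matrices=False)
    rank = 2
    eofs = vt[:rank]                               # orthonormal EOF basis (2 x 200)

    def msd_statistic(y, basis):
        """Fraction of the energy of y captured by the EOF subspace, in [0, 1]."""
        proj = basis.T @ (basis @ y)               # orthogonal projection onto span
        return float(proj @ proj) / float(y @ y)

    # A scattered arrival concentrates in the subspace; pure noise does not.
    arrival = 0.9 * pattern1 + 1.1 * pattern2 + 0.01 * rng.standard_normal(t.size)
    noise = 0.5 * rng.standard_normal(t.size)
    print(msd_statistic(arrival, eofs), msd_statistic(noise, eofs))
    ```

    A travel time estimator built on this statistic would slide the EOF basis along the received time series and locate the lag that maximizes the captured energy, rather than picking a single peak.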

  18. A Multiple Streams analysis of the decisions to fund gender-neutral HPV vaccination in Canada.

    PubMed

    Shapiro, Gilla K; Guichon, Juliet; Prue, Gillian; Perez, Samara; Rosberger, Zeev

    2017-07-01

    In Canada, the human papillomavirus (HPV) vaccine is licensed and recommended for females and males. Although all Canadian jurisdictions fund school-based HPV vaccine programs for girls, only six jurisdictions fund school-based HPV vaccination for boys. The research aimed to analyze the factors that underpin government decisions to fund the HPV vaccine for boys using a theoretical policy model, Kingdon's Multiple Streams framework. This approach assesses policy development by examining three concurrent, but independent, streams that guide analysis: the Problem Stream, Policy Stream, and Politics Stream. Analysis from the Problem Stream highlights that males are affected by HPV-related diseases and are involved in transmitting HPV infection to their sexual partners. Policy Stream analysis makes clear that while the inclusion of males in HPV vaccine programs is suitable, equitable, and acceptable, there is debate regarding cost-effectiveness. Politics Stream analysis identifies the perspectives of six different stakeholder groups and highlights the contribution of government officials at the provincial and territorial level. Kingdon's Multiple Streams framework helps clarify the opportunities and barriers for HPV vaccine policy change. This analysis identified that the interpretation of cost-effectiveness models and the advocacy of stakeholders such as citizen-advocates and HPV-affected politicians have been particularly important in galvanizing policy change. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Multi-sensor analysis of urban ecosystems

    USGS Publications Warehouse

    Gallo, Kevin P.; Ji, Lei

    2004-01-01

    This study examines the synthesis of multiple space-based sensors to characterize the urban environment. Single-scene data (e.g., ASTER visible and near-IR surface reflectance, and land surface temperature data), multi-temporal data (e.g., one year of 16-day MODIS and AVHRR vegetation index data), and DMSP-OLS nighttime light data acquired in the early 1990s and 2000 were evaluated for urban ecosystem analysis. The advantages of a multi-sensor approach for the analysis of urban ecosystem processes are discussed.

  20. Combining real-time monitoring and knowledge-based analysis in MARVEL

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.

    1993-01-01

    Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.

  1. Primary care clinicians' experiences with treatment decision making for older persons with multiple conditions.

    PubMed

    Fried, Terri R; Tinetti, Mary E; Iannone, Lynne

    2011-01-10

    Clinicians are caring for an increasing number of older patients with multiple diseases in the face of uncertainty concerning the benefits and harms associated with guideline-directed interventions. Understanding how primary care clinicians approach treatment decision making for these patients is critical to the design of interventions to improve the decision-making process. Focus groups were conducted with 40 primary care clinicians (physicians, nurse practitioners, and physician assistants) in academic, community, and Veterans Affairs-affiliated primary care practices. Participants were given open-ended questions about their approach to treatment decision making for older persons with multiple medical conditions. Responses were organized into themes using qualitative content analysis. The participants were concerned about their patients' ability to adhere to complex regimens derived from guideline-directed care. There was variability in beliefs regarding, and approaches to balancing, the benefits and harms of guideline-directed care. There was also variability regarding how the participants involved patients in the process of decision making, with clinicians describing conflicts between their own and their patients' goals. The participants listed a number of barriers to making good treatment decisions, including the lack of outcome data, the role of specialists, patient and family expectations, and insufficient time and reimbursement. The experiences of practicing clinicians suggest that they struggle with the uncertainties of applying disease-specific guidelines to their older patients with multiple conditions. To improve decision making, they need more data, alternative guidelines, approaches to reconciling their own and their patients' priorities, the support of their subspecialist colleagues, and an altered reimbursement system.

  2. Fused-data transrectal EIT for prostate cancer imaging.

    PubMed

    Murphy, Ethan K; Wu, Xiaotian; Halter, Ryan J

    2018-05-25

    Prostate cancer is a significant problem affecting 1 in 7 men. Unfortunately, the diagnostic gold-standard of ultrasound-guided biopsy misses 10%-30% of all cancers. The objective of this study was to develop an electrical impedance tomography (EIT) approach that has the potential to image the entire prostate using multiple impedance measurements recorded between electrodes integrated onto an end-fired transrectal ultrasound (TRUS) device and a biopsy probe (BP). Simulations and sensitivity analyses were used to investigate the best combination of electrodes, and measured tank experiments were used to evaluate a fused-data transrectal EIT (fd-TREIT) and BP approach. Simulations and sensitivity analysis revealed that (1) TREIT measurements are not sufficiently sensitive to image the whole prostate, (2) the combination of TREIT + BP measurements increases the sensitive region of TREIT-only measurements by 12×, and (3) the fusion of multiple TREIT + BP measurements collected during a routine or customized 12-core biopsy procedure can cover up to 76.1% or 94.1% of a nominal 50 cm³ prostate, respectively. Three measured tank experiments of the fd-TREIT + BP approach successfully and accurately recovered the positions of 2-3 metal or plastic inclusions. The measured tank experiments represent important steps in the development of an algorithm that can combine EIT from multiple locations and from multiple probes, data that could be collected during a routine TRUS-guided 12-core biopsy. Overall, this result is a step towards a clinically deployable impedance imaging approach to scanning the entire prostate, which could significantly help to improve prostate cancer diagnosis.

  3. Predicting failure to return to work.

    PubMed

    Mills, R

    2012-08-01

    The research question is: is it possible to predict, at the time of workers' compensation claim lodgement, which workers will have a prolonged return to work (RTW) outcome? This paper illustrates how a traditional analytic approach to the analysis of an existing large database can be insufficient to answer the research question, and suggests an alternative data management and analysis approach. This paper retrospectively analyses 9018 workers' compensation claims from two different workers' compensation jurisdictions in Australia (two data sets) over a 4-month period in 2007. De-identified data, submitted at the time of claim lodgement, were compared with RTW outcomes for up to 3 months. Analysis consisted of descriptive, parametric (analysis of variance and multiple regression), survival (proportional hazards) and data mining (partitioning) analysis. No significant associations were found on parametric analysis. Multiple associations were found between the predictor variables and RTW outcome on survival analysis, with marked differences being found between some sub-groups on partitioning, where diagnosis was found to be the strongest discriminator (particularly neck and shoulder injuries). There was a consistent trend for female gender to be associated with a prolonged RTW outcome. The supplied data were not sufficient to enable the development of a predictive model. If we want to predict early who will have a prolonged RTW in Australia, workers' compensation claim forms should be redesigned, data management improved and specialised analytic techniques used. © 2011 The Author. Internal Medicine Journal © 2011 Royal Australasian College of Physicians.
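The survival-analysis side of such a study can be sketched with a product-limit (Kaplan-Meier) estimate of the probability of still being off work, here on entirely hypothetical RTW durations right-censored at the 3-month follow-up boundary:

```python
import numpy as np

# Hypothetical RTW durations in days (time until return to work), with
# right-censoring at 90 days to echo the 3-month follow-up window.
time = np.array([5, 12, 12, 20, 33, 45, 60, 90, 90, 90])
event = np.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0])   # 1 = returned, 0 = censored

def kaplan_meier(time, event):
    """Product-limit estimate of S(t): probability of still being off work."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, s = [], 1.0
    for t in np.unique(time[event == 1]):
        at_risk = np.sum(time >= t)
        d = np.sum((time == t) & (event == 1))
        s *= 1.0 - d / at_risk
        surv.append((t, s))
    return surv

for t, s in kaplan_meier(time, event):
    print(f"day {t:3d}: S(t) = {s:.3f}")
```

A proportional hazards model, as used in the paper, would then relate these survival curves to predictor variables such as diagnosis and gender.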

  4. Repeater Analysis for Combining Information from Different Assessments

    ERIC Educational Resources Information Center

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  5. A self-consistent global emissions inventory spanning 1850-2050 – why we need one and why we do not have one

    EPA Science Inventory

    While emissions inventory development has advanced significantly in recent years, the scientific community still lacks a global inventory utilizing consistent estimation approaches spanning multiple centuries. In this analysis, we investigate the strengths and weaknesses of cur...

  6. New Method for Analysis of Multiple Anthelmintic Residues in Animal Tissue

    USDA-ARS?s Scientific Manuscript database

    For the first time, 39 of the major anthelmintics can be detected in one rapid and sensitive LC-MS/MS method, including the flukicides, which have been generally overlooked in surveillance programs. Utilizing the QuEChERS approach, residues were extracted from liver and milk using acetonitrile, sod...

  7. Novel Platform Technologies for Analysis of Norovirus Contamination of Sea Food

    USDA-ARS?s Scientific Manuscript database

    The study of human norovirus (NoVs) replication in vitro would be a highly useful tool to virologists and immunologists. For this reason, we have searched for new approaches to determine viability of noroviruses in food samples (especially seafood). Our research team has multiple years of experien...

  8. Broadband Structural Dynamics: Understanding the Impulse-Response of Structures Across Multiple Length and Time Scales

    DTIC Science & Technology

    2010-08-18

    Spectral domain response calculated; time domain response obtained through inverse transform. Approach 4: WASABI (Wavelet Analysis of Structural Anomalies), in which a spectral-domain transfer function is applied to the transformed time function before inverse transformation.

  9. Factors Associated with Sexual Behavior among Adolescents: A Multivariate Analysis.

    ERIC Educational Resources Information Center

    Harvey, S. Marie; Spigner, Clarence

    1995-01-01

    A self-administered survey examining multiple factors associated with engaging in sexual intercourse was completed by 1,026 high school students in a classroom setting. Findings suggest that effective interventions to address teenage pregnancy need to utilize a multifaceted approach to the prevention of high-risk behaviors. (JPS)

  10. The Effects of Mobile Collaborative Activities in a Second Language Course

    ERIC Educational Resources Information Center

    Ilic, Peter

    2015-01-01

    This research is designed to explore the areas of collaborative learning and the use of smartphones as a support for collaborative learning through a year-long exploratory multiple case study approach integrating both qualitative and quantitative data analysis. Qualitative exploratory interviews are combined with Multidimensional Scaling Analysis…

  11. Weighting and Aggregation in Composite Indicator Construction: A Multiplicative Optimization Approach

    ERIC Educational Resources Information Center

    Zhou, P.; Ang, B. W.; Zhou, D. Q.

    2010-01-01

    Composite indicators (CIs) have increasingly been accepted as a useful tool for benchmarking, performance comparisons, policy analysis and public communication in many different fields. Several recent studies show that as a data aggregation technique in CI construction the weighted product (WP) method has some desirable properties. However, a…

  12. Individual Differences in Achievement Goals: A Longitudinal Study of Cognitive, Emotional, and Achievement Outcomes

    ERIC Educational Resources Information Center

    Daniels, Lia M.; Haynes, Tara L.; Stupnisky, Robert H.; Perry, Raymond P.; Newall, Nancy E.; Pekrun, Reinhard

    2008-01-01

    Within achievement goal theory debate remains regarding the adaptiveness of certain combinations of goals. Assuming a multiple-goals perspective, we used cluster analysis to classify 1002 undergraduate students according to their mastery and performance-approach goals. Four clusters emerged, representing different goal combinations: high…

  13. Measuring students' self-regulated learning in professional education: bridging the gap between event and aptitude measurements.

    PubMed

    Endedijk, Maaike D; Brekelmans, Mieke; Sleegers, Peter; Vermunt, Jan D

    Self-regulated learning has benefits for students' academic performance in school, but also for expertise development during their professional career. This study examined the validity of an instrument to measure student teachers' regulation of their learning to teach across multiple and different kinds of learning events in the context of a postgraduate professional teacher education programme. Based on an analysis of the literature, we developed a log with structured questions that could be used as a multiple-event instrument to determine the quality of student teachers' regulation of learning by combining data from multiple learning experiences. The findings showed that this structured version of the instrument measured student teachers' regulation of their learning in a valid and reliable way. Furthermore, with the aid of the Structured Learning Report individual differences in student teachers' regulation of learning could be discerned. Together the findings indicate that a multiple-event instrument can be used to measure regulation of learning in multiple contexts for various learning experiences at the same time, without the necessity of relying on students' ability to rate themselves across all these different experiences. In this way, this instrument can make an important contribution to bridging the gap between two dominant approaches to measure SRL, the traditional aptitude and event measurement approach.

  14. Modeling Zone-3 Protection with Generic Relay Models for Dynamic Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vyakaranam, Bharat GNVSR; Diao, Ruisheng

    This paper presents a cohesive approach for calculating and coordinating the settings of multiple zone-3 protections for dynamic contingency analysis. The zone-3 protections are represented by generic distance relay models. A two-step approach for determining zone-3 relay settings is proposed. The first step is to calculate the settings, particularly the reach, of each zone-3 relay individually by iteratively running line open-end fault short circuit analysis; the blinder is also employed and properly set to meet the industry standard under extreme loading conditions. The second step is to systematically coordinate the protection settings of the zone-3 relays. The main objective of this coordination step is to address the over-reaching issues. We have developed a tool to automate the proposed approach and generate the settings of all distance relays in a PSS/E dyr format file. The calculated zone-3 settings have been tested on a modified IEEE 300 system using a dynamic contingency analysis tool (DCAT).
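The operating test behind a distance relay's reach setting can be sketched with a standard mho characteristic: the relay operates when the measured apparent impedance falls inside a circle through the origin whose diameter is the reach along the line angle. The geometry and the numbers below are illustrative textbook assumptions, not the paper's two-step coordination procedure:

```python
import cmath

# Illustrative mho-characteristic check for a distance relay: the operating
# zone is a circle through the origin with its diameter along the line angle,
# scaled by the zone reach in ohms.
def in_mho_zone(z_measured, reach_ohms, line_angle_deg=75.0):
    """True if the measured apparent impedance lies inside the mho circle."""
    zr = cmath.rect(reach_ohms, cmath.pi * line_angle_deg / 180.0)
    return abs(z_measured - zr / 2) <= abs(zr) / 2

# A remote fault near the end of the reach vs. a heavy-load operating point:
fault = cmath.rect(35.0, cmath.pi * 75.0 / 180.0)   # impedance on the line angle
load = complex(60.0, 10.0)                           # large, mostly resistive
print(in_mho_zone(fault, reach_ohms=40.0))   # fault is inside the zone
print(in_mho_zone(load, reach_ohms=40.0))    # load point stays outside
```

Load encroachment of the kind the blinder guards against corresponds to the load impedance shrinking toward the circle under extreme loading.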

  15. Application of model predictive control for optimal operation of wind turbines

    NASA Astrophysics Data System (ADS)

    Yuan, Yuan; Cao, Pei; Tang, J.

    2017-04-01

    For large-scale wind turbines, reducing maintenance cost is a major challenge. Model predictive control (MPC) is a promising approach for dealing with multiple conflicting objectives via the weighted sum approach. In this research, the model predictive control method is applied to a wind turbine to find an optimal balance between multiple objectives, such as energy capture, loads on turbine components, and pitch actuator usage. The actuator constraints are integrated into the objective function at the control design stage. The analysis is carried out in both the partial load region and the full load region, and the performances are compared with those of a baseline gain-scheduling PID controller. The application of this strategy achieves an enhanced balance of component loads, average power, and actuator usage in the partial load region.
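The weighted-sum scalarization of competing objectives can be sketched for a single receding-horizon step. The linear pitch-to-power map, the weights, and the grid search below are all hypothetical stand-ins for the paper's turbine model and optimizer:

```python
import numpy as np

# Toy scalarization of the multi-objective trade-off (power tracking vs.
# actuator usage); the "model" is a hypothetical linear map from pitch angle
# to captured power, not an aeroelastic turbine model.
def predicted_power(pitch_deg):
    return 5.0 - 0.08 * pitch_deg            # MW, illustrative only

def weighted_cost(pitch, prev_pitch, p_ref, w_power=1.0, w_actuator=0.5):
    tracking = (predicted_power(pitch) - p_ref) ** 2
    usage = (pitch - prev_pitch) ** 2        # penalizes pitch rate / actuator wear
    return w_power * tracking + w_actuator * usage

# One receding-horizon step: pick the pitch command minimizing the weighted sum.
candidates = np.linspace(0.0, 25.0, 251)     # 0.1-degree grid of pitch commands
prev_pitch, p_ref = 10.0, 4.0
costs = [weighted_cost(p, prev_pitch, p_ref) for p in candidates]
best = candidates[int(np.argmin(costs))]
print(best)
```

Shifting `w_actuator` upward trades tighter power tracking for less actuator movement, which is exactly the balance the weighted-sum objective exposes.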

  16. Comparison of Statistical Approaches for Dealing With Immortal Time Bias in Drug Effectiveness Studies

    PubMed Central

    Karim, Mohammad Ehsanul; Gustafson, Paul; Petkau, John; Tremlett, Helen

    2016-01-01

    In time-to-event analyses of observational studies of drug effectiveness, incorrect handling of the period between cohort entry and first treatment exposure during follow-up may result in immortal time bias. This bias can be eliminated by acknowledging a change in treatment exposure status with time-dependent analyses, such as fitting a time-dependent Cox model. The prescription time-distribution matching (PTDM) method has been proposed as a simpler approach for controlling immortal time bias. Using simulation studies and theoretical quantification of bias, we compared the performance of the PTDM approach with that of the time-dependent Cox model in the presence of immortal time. Both assessments revealed that the PTDM approach did not adequately address immortal time bias. Based on our simulation results, another recently proposed observational data analysis technique, the sequential Cox approach, was found to be more useful than the PTDM approach (Cox: bias = −0.002, mean squared error = 0.025; PTDM: bias = −1.411, mean squared error = 2.011). We applied these approaches to investigate the association of β-interferon treatment with delaying disability progression in a multiple sclerosis cohort in British Columbia, Canada (Long-Term Benefits and Adverse Effects of Beta-Interferon for Multiple Sclerosis (BeAMS) Study, 1995–2008). PMID:27455963
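The mechanism of immortal time bias can be shown with deterministic arithmetic: if treated patients must survive a waiting period before their first exposure, a naive "ever-treated" analysis credits that guaranteed-alive time to the treated group. The counts below are invented for illustration and are not the paper's simulation:

```python
# Deterministic sketch of immortal time bias (illustrative numbers only).
# Treated patients survive a 1-year wait before the first prescription; a
# naive "ever-treated" analysis credits that immortal year to exposed time.
wait = 1.0                                  # years from cohort entry to first exposure
n_treated = n_untreated = 100
followup = 5.0                              # years of follow-up per patient
events_treated = events_untreated = 20      # identical true event counts

# Naive: all follow-up of ever-treated patients counted as exposed person-time.
naive_rr = (events_treated / (n_treated * followup)) / (
    events_untreated / (n_untreated * followup))

# Time-dependent: the wait year is unexposed person-time (all treated-group
# events here occur after treatment starts, so the numerators are unchanged).
exposed_rate = events_treated / (n_treated * (followup - wait))
unexposed_rate = events_untreated / (n_untreated * followup + n_treated * wait)
td_rr = exposed_rate / unexposed_rate

print(naive_rr, td_rr)   # 1.0 vs 1.5: the naive analysis masks the elevated rate
```

Fitting a time-dependent Cox model generalizes this person-time reallocation to regression-adjusted hazard ratios.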

  17. A powerful approach reveals numerous expression quantitative trait haplotypes in multiple tissues.

    PubMed

    Ying, Dingge; Li, Mulin Jun; Sham, Pak Chung; Li, Miaoxin

    2018-04-26

    Recently, many studies have shown that single nucleotide polymorphisms (SNPs) affect gene expression and contribute to the development of complex traits/diseases in a tissue context-dependent manner. However, little is known about haplotypes' influence on gene expression and complex traits, which reflects the interaction effect between SNPs. In the present study, we first propose a regulatory-region-guided eQTL haplotype association analysis approach and then use it to systematically investigate expression quantitative trait loci (eQTL) haplotypes in 20 different tissues. The approach reduces the computational burden through the use of regulatory predictions for candidate SNP selection and multiple testing corrections on non-independent haplotypes. The application results in multiple tissues showed that haplotype-based eQTLs not only increased the number of eQTL genes in a tissue-specific manner, but were also enriched in loci associated with complex traits in a tissue-matched manner. In addition, we found that tag SNPs of eQTL haplotypes from whole blood were selectively enriched in certain combinations of regulatory elements (e.g. promoters and enhancers) according to predicted chromatin states. In summary, this eQTL haplotype detection approach, together with the application results, provides insight into the synergistic effect of sequence variants on gene expression and their contribution to complex diseases. The executable application "eHaplo" is implemented in Java and is publicly available at http://grass.cgs.hku.hk/limx/ehaplo/. jonsonfox@gmail.com, limiaoxin@mail.sysu.edu.cn. Supplementary data are available at Bioinformatics online.

  18. Risk assessment of pesticides and other stressors in bees: Principles, data gaps and perspectives from the European Food Safety Authority.

    PubMed

    Rortais, Agnès; Arnold, Gérard; Dorne, Jean-Lou; More, Simon J; Sperandio, Giorgio; Streissl, Franz; Szentes, Csaba; Verdonck, Frank

    2017-06-01

    Current approaches to risk assessment in bees do not take into account co-exposures from multiple stressors. The European Food Safety Authority (EFSA) is deploying resources and efforts to move towards a holistic risk assessment approach of multiple stressors in bees. This paper describes the general principles of pesticide risk assessment in bees, including recent developments at EFSA dealing with risk assessment of single and multiple pesticide residues and biological hazards. The EFSA Guidance Document on the risk assessment of plant protection products in bees highlights the need for the inclusion of an uncertainty analysis, other routes of exposures and multiple stressors such as chemical mixtures and biological agents. The EFSA risk assessment on the survival, spread and establishment of the small hive beetle, Aethina tumida, an invasive alien species, is provided with potential insights for other bee pests such as the Asian hornet, Vespa velutina. Furthermore, data gaps are identified at each step of the risk assessment, and recommendations are made for future research that could be supported under the framework of Horizon 2020. Finally, the recent work conducted at EFSA is presented, under the overarching MUST-B project ("EU efforts towards the development of a holistic approach for the risk assessment on MUltiple STressors in Bees") comprising a toolbox for harmonised data collection under field conditions and a mechanistic model to assess effects from pesticides and other stressors such as biological agents and beekeeping management practices, at the colony level and in a spatially complex landscape. Future perspectives at EFSA include the development of a data model to collate high quality data to calibrate and validate the model to be used as a regulatory tool. Finally, the evidence collected within the framework of MUST-B will support EFSA's activities on the development of a holistic approach to the risk assessment of multiple stressors in bees. 
In conclusion, EFSA calls for collaborative action at the EU level to establish a common and open access database to serve multiple purposes and different stakeholders. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Quantitative analysis of single- vs. multiple-set programs in resistance training.

    PubMed

    Wolfe, Brian L; LeMura, Linda M; Cole, Phillip J

    2004-02-01

    The purpose of this study was to examine the existing research on single-set vs. multiple-set resistance training programs. Using the meta-analytic approach, we included studies that met the following criteria in our analysis: (a) at least 6 subjects per group; (b) subject groups consisting of single-set vs. multiple-set resistance training programs; (c) pretest and posttest strength measures; (d) training programs of 6 weeks or more; (e) apparently "healthy" individuals free from orthopedic limitations; and (f) published studies in English-language journals only. Sixteen studies generated 103 effect sizes (ESs) based on a total of 621 subjects, ranging in age from 15-71 years. Across all designs, intervention strategies, and categories, the pretest to posttest ES in muscular strength was significant (mean ES = 1.4 ± 1.4; 95% confidence interval, 0.41-3.8; p < 0.001). The results of a 2 × 2 analysis of variance revealed simple main effects for age, training status (trained vs. untrained), and research design (p < 0.001). No significant main effects were found for sex, program duration, and set end point. Significant interactions were found for training status and program duration (6-16 weeks vs. 17-40 weeks) and number of sets performed (single vs. multiple). The data indicated that trained individuals performing multiple sets generated significantly greater increases in strength (p < 0.001). For programs with an extended duration, multiple sets were superior to single sets (p < 0.05). This quantitative review indicates that single-set programs for an initial short training period in untrained individuals result in similar strength gains as multiple-set programs. However, as progression occurs and higher gains are desired, multiple-set programs are more effective.
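The effect-size arithmetic that such a meta-analysis aggregates can be sketched directly; the pre/post means, SDs, and sample sizes below are hypothetical, not values from the reviewed studies:

```python
import numpy as np

# Illustrative pretest-to-posttest effect size, ES = (post - pre) / SD_pre,
# the kind of standardized gain pooled in this sort of meta-analysis.
def effect_size(pre_mean, post_mean, pre_sd):
    return (post_mean - pre_mean) / pre_sd

# Hypothetical 1RM strength gains for a single-set and a multiple-set group.
single_set = effect_size(pre_mean=100.0, post_mean=112.0, pre_sd=15.0)
multiple_set = effect_size(pre_mean=100.0, post_mean=121.0, pre_sd=15.0)
print(single_set, multiple_set)   # 0.8 vs 1.4

# Sample-size-weighted mean ES across hypothetical studies.
es = np.array([single_set, multiple_set, 1.1])
n = np.array([30, 45, 25])
print(np.average(es, weights=n))
```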

  20. Multiple imputation for handling missing outcome data when estimating the relative risk.

    PubMed

    Sullivan, Thomas R; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-09-06

    Multiple imputation is a popular approach to handling missing data in medical research, yet little is known about its applicability for estimating the relative risk. Standard methods for imputing incomplete binary outcomes involve logistic regression or an assumption of multivariate normality, whereas relative risks are typically estimated using log binomial models. It is unclear whether misspecification of the imputation model in this setting could lead to biased parameter estimates. Using simulated data, we evaluated the performance of multiple imputation for handling missing data prior to estimating adjusted relative risks from a correctly specified multivariable log binomial model. We considered an arbitrary pattern of missing data in both outcome and exposure variables, with missing data induced under missing at random mechanisms. Focusing on standard model-based methods of multiple imputation, missing data were imputed using multivariate normal imputation or fully conditional specification with a logistic imputation model for the outcome. Multivariate normal imputation performed poorly in the simulation study, consistently producing estimates of the relative risk that were biased towards the null. Despite outperforming multivariate normal imputation, fully conditional specification also produced somewhat biased estimates, with greater bias observed for higher outcome prevalences and larger relative risks. Deleting imputed outcomes from analysis datasets did not improve the performance of fully conditional specification. Both multivariate normal imputation and fully conditional specification produced biased estimates of the relative risk, presumably since both use a misspecified imputation model. Based on simulation results, we recommend researchers use fully conditional specification rather than multivariate normal imputation and retain imputed outcomes in the analysis when estimating relative risks. However, fully conditional specification is not without its shortcomings, and so further research is needed to identify optimal approaches for relative risk estimation within the multiple imputation framework.
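The simulation setup can be sketched in miniature: a binary exposure, a binary outcome with a known relative risk, outcomes set missing at random, imputations drawn from an outcome model fitted to the observed data, and the log relative risks averaged across imputations. The single-covariate imputation model, sample size, and true RR of 2.0 below are hypothetical simplifications of the paper's design:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulation: binary exposure, binary outcome with true RR = 2.0,
# 30% of outcomes missing completely at random, imputed from exposure-group
# outcome probabilities (a one-covariate stand-in for a logistic FCS model).
n = 20000
x = rng.integers(0, 2, n)                   # exposure
p = np.where(x == 1, 0.2, 0.1)              # true risks: relative risk = 2.0
y = (rng.random(n) < p).astype(float)
miss = rng.random(n) < 0.3                  # missingness indicator
y_obs = y.copy()
y_obs[miss] = np.nan

def impute_once():
    yi = y_obs.copy()
    for g in (0, 1):                        # fit P(Y=1 | X=g) on observed rows
        obs = (~miss) & (x == g)
        p_hat = yi[obs].mean()
        draw = rng.random(n) < p_hat        # draw imputations from the model
        yi[miss & (x == g)] = draw[miss & (x == g)].astype(float)
    return yi

# M = 10 imputations; estimate log RR in each and pool the point estimates.
log_rrs = []
for _ in range(10):
    yi = impute_once()
    rr = yi[x == 1].mean() / yi[x == 0].mean()
    log_rrs.append(np.log(rr))
rr_pooled = np.exp(np.mean(log_rrs))
print(rr_pooled)                            # close to the true RR of 2.0
```

Rubin's rules would additionally combine within- and between-imputation variances for interval estimation, which this sketch omits.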

  1. A multi-scale convolutional neural network for phenotyping high-content cellular images.

    PubMed

    Godinez, William J; Hossain, Imtiaz; Lazic, Stanley E; Davies, John W; Zhang, Xian

    2017-07-01

    Identifying phenotypes based on high-content cellular images is challenging. Conventional image analysis pipelines for phenotype identification comprise multiple independent steps, with each step requiring method customization and adjustment of multiple parameters. Here, we present an approach based on a multi-scale convolutional neural network (M-CNN) that classifies, in a single cohesive step, cellular images into phenotypes by using directly and solely the images' pixel intensity values. The only parameters in the approach are the weights of the neural network, which are automatically optimized based on training images. The approach requires no a priori knowledge or manual customization, and is applicable to single- or multi-channel images displaying single or multiple cells. We evaluated the classification performance of the approach on eight diverse benchmark datasets. The approach yielded overall a higher classification accuracy compared with state-of-the-art results, including those of other deep CNN architectures. In addition to using the network to simply obtain a yes-or-no prediction for a given phenotype, we use the probability outputs calculated by the network to quantitatively describe the phenotypes. This study shows that these probability values correlate with chemical treatment concentrations. This finding further validates our approach and enables chemical treatment potency estimation via CNNs. The network specifications and solver definitions are provided in Supplementary Software 1. william_jose.godinez_navarro@novartis.com or xian-1.zhang@novartis.com. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  2. On the Interpretation and Use of Mediation: Multiple Perspectives on Mediation Analysis

    PubMed Central

    Agler, Robert; De Boeck, Paul

    2017-01-01

    Mediation analysis has become a very popular approach in psychology, and it is one that is associated with multiple perspectives that are often at odds, often implicitly. Explicitly discussing these perspectives and their motivations, advantages, and disadvantages can help to provide clarity to conversations and research regarding the use and refinement of mediation models. We discuss five such pairs of perspectives on mediation analysis, their associated advantages and disadvantages, and their implications: with vs. without a mediation hypothesis, specific effects vs. a global model, directness vs. indirectness of causation, effect size vs. null hypothesis testing, and hypothesized vs. alternative explanations. Discussion of the perspectives is facilitated by a small simulation study. Some philosophical and linguistic considerations are briefly discussed, as well as some other perspectives we do not develop here. PMID:29187828
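The core estimand the perspectives above disagree about can be made concrete: in the standard two-regression formulation, the indirect effect is the product of the X-to-M path (a) and the M-to-Y path controlling for X (b). The simulated paths a = 0.5, b = 0.7, and direct effect c' = 0.2 below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mediation model X -> M -> Y with true a = 0.5, b = 0.7, c' = 0.2;
# the indirect effect a*b is estimated from two ordinary least squares fits.
n = 5000
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)            # a-path
y = 0.7 * m + 0.2 * x + rng.normal(size=n)  # b-path plus direct effect c'

def ols(design, target):
    coef, *_ = np.linalg.lstsq(design, target, rcond=None)
    return coef

a = ols(np.column_stack([np.ones(n), x]), m)[1]
b, c_prime = ols(np.column_stack([np.ones(n), m, x]), y)[1:]
print(a * b)        # indirect effect estimate, approx 0.35
print(c_prime)      # direct effect estimate, approx 0.2
```

The effect-size vs. null-hypothesis-testing perspective discussed above turns on whether one reports `a * b` itself or a bootstrap test of its difference from zero.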

  3. A Hybrid One-Way ANOVA Approach for the Robust and Efficient Estimation of Differential Gene Expression with Multiple Patterns

    PubMed Central

    Mollah, Mohammad Manir Hossain; Jamal, Rahman; Mokhtar, Norfilza Mohd; Harun, Roslan; Mollah, Md. Nurul Haque

    2015-01-01

    Background Identifying genes that are differentially expressed (DE) between two or more conditions with multiple patterns of expression is one of the primary objectives of gene expression data analysis. Several statistical approaches, including one-way analysis of variance (ANOVA), are used to identify DE genes. However, most of these methods provide misleading results for two or more conditions with multiple patterns of expression in the presence of outlying genes. In this paper, an attempt is made to develop a hybrid one-way ANOVA approach that unifies the robustness and efficiency of estimation using the minimum β-divergence method to overcome some problems that arise in the existing robust methods for both small- and large-sample cases with multiple patterns of expression. Results The proposed method relies on a β-weight function, which produces values between 0 and 1. The β-weight function with β = 0.2 is used as a measure of outlier detection. It assigns smaller weights (≥ 0) to outlying expressions and larger weights (≤ 1) to typical expressions. The distribution of the β-weights is used to calculate the cut-off point, which is compared to the observed β-weight of an expression to determine whether that gene expression is an outlier. This weight function plays a key role in unifying the robustness and efficiency of estimation in one-way ANOVA. Conclusion Analyses of simulated gene expression profiles revealed that all eight methods (ANOVA, SAM, LIMMA, EBarrays, eLNN, KW, robust BetaEB and proposed) perform almost identically for m = 2 conditions in the absence of outliers. However, the robust BetaEB method and the proposed method exhibited considerably better performance than the other six methods in the presence of outliers. 
In this case, the BetaEB method exhibited slightly better performance than the proposed method for the small-sample cases, but the proposed method exhibited much better performance than the BetaEB method for both the small- and large-sample cases in the presence of more than 50% outlying genes. The proposed method also exhibited better performance than the other methods for m > 2 conditions with multiple patterns of expression, for which the BetaEB method has not been extended. Therefore, the proposed approach would be more suitable and reliable on average for the identification of DE genes between two or more conditions with multiple patterns of expression. PMID:26413858
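The β-weight described above can be sketched for the univariate case as follows. This is a minimal illustration assuming a Gaussian model with known mean and variance; the function name and data are hypothetical, and the actual method estimates the parameters iteratively via the minimum β-divergence procedure:

```python
import numpy as np

def beta_weight(x, mu, sigma2, beta=0.2):
    # Hypothetical sketch of the beta-weight used in minimum beta-divergence
    # estimation: values lie in (0, 1], near 1 for typical expressions and
    # near 0 for outlying expressions.
    return np.exp(-0.5 * beta * (x - mu) ** 2 / sigma2)

x = np.array([0.1, -0.2, 0.0, 8.0])   # last value is an outlying expression
w = beta_weight(x, mu=0.0, sigma2=1.0)
```

Expressions whose weight falls below a cut-off derived from the weight distribution would then be flagged as outliers and down-weighted in the ANOVA estimation.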

  4. Why is CDMA the solution for mobile satellite communication

    NASA Technical Reports Server (NTRS)

    Gilhousen, Klein S.; Jacobs, Irwin M.; Padovani, Roberto; Weaver, Lindsay A.

    1989-01-01

It is demonstrated that spread spectrum Code Division Multiple Access (CDMA) systems provide an economically superior solution to satellite mobile communications by increasing the maximum system capacity with respect to single channel per carrier Frequency Division Multiple Access (FDMA) systems. Following the comparative analysis of CDMA and FDMA systems, the design of a model developed to test the feasibility of the approach and the performance of a spread spectrum system in a mobile environment is described. Results of extensive computer simulations as well as laboratory and field test results are presented.
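The capacity argument can be illustrated with a back-of-the-envelope calculation in the spirit of the classic spread-spectrum capacity equation; all parameter values below are assumptions for illustration, not figures from the paper:

```python
# Rough CDMA user-capacity estimate: processing gain divided by the required
# Eb/N0, credited for voice activity and discounted for other-cell
# interference. Every number here is an illustrative assumption.
W = 1.25e6                  # spread bandwidth, Hz
R = 9600.0                  # information bit rate, bit/s
eb_n0 = 10 ** (7.0 / 10)    # required Eb/N0 of 7 dB, as a linear ratio
d = 0.4                     # voice activity factor
f = 0.6                     # other-cell-to-own-cell interference ratio

users = (W / R) / eb_n0 / d / (1.0 + f)
```

The same bandwidth carved into single-channel-per-carrier FDMA slots yields a fixed channel count, whereas the CDMA estimate above scales with interference-reducing factors such as voice activity, which is the crux of the comparison.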

  5. Application of homomorphism to secure image sharing

    NASA Astrophysics Data System (ADS)

    Islam, Naveed; Puech, William; Hayat, Khizar; Brouzet, Robert

    2011-09-01

In this paper, we present a new approach for sharing images between l players by exploiting the additive and multiplicative homomorphic properties of two well-known public key cryptosystems, i.e., RSA and Paillier. Contrary to traditional schemes, the proposed approach employs secret sharing in a way that limits the influence of the dealer over the protocol and allows each player to participate with the help of his key-image. With the proposed approach, during the encryption step, each player encrypts his own key-image using the dealer's public key. The dealer encrypts the secret-to-be-shared image with the same public key, and then the l encrypted key-images plus the encrypted to-be-shared image are multiplied homomorphically to get another encrypted image. After this step, the dealer can safely obtain a scrambled image corresponding to the addition or multiplication of the l + 1 original images (l key-images plus the secret image), owing to the additive homomorphic property of the Paillier algorithm or the multiplicative homomorphic property of the RSA algorithm. When the l players want to extract the secret image, they do not need to use keys, and the dealer has no role. Indeed, with our approach, to extract the secret image, the l players need only subtract their own key-images, in no specific order, from the scrambled image. Thus, the proposed approach provides an opportunity to use operators like multiplication on encrypted images for the development of a secure privacy-preserving protocol in the image domain. We show that it is still possible to extract a visible version of the secret image with only l-1 key-images (when one key-image is missing) or when the l key-images used for the extraction differ from the l original key-images due to lossy compression, for example. Experimental results and security analysis verify and prove that the proposed approach is secure from a cryptographic viewpoint.
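The multiplicative homomorphic property of RSA that the scheme relies on can be demonstrated with a toy example (textbook RSA with deliberately tiny parameters; a real deployment needs large keys and secure padding):

```python
# Toy demonstration that E(m1) * E(m2) mod n == E(m1 * m2 mod n) for
# "textbook" RSA, i.e. multiplying ciphertexts multiplies the plaintexts.
p, q = 61, 53
n = p * q                    # modulus (3233); far too small for real use
phi = (p - 1) * (q - 1)
e = 17                       # public exponent, coprime with phi

def encrypt(m):
    return pow(m, e, n)      # modular exponentiation

m1, m2 = 42, 99
product_of_ciphertexts = (encrypt(m1) * encrypt(m2)) % n
ciphertext_of_product = encrypt((m1 * m2) % n)
```

In the paper's setting, each pixel-wise multiplication of encrypted key-images with the encrypted secret image composes the plaintexts in exactly this way (or additively, under Paillier).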

  6. On analyzing free-response data on location level

    NASA Astrophysics Data System (ADS)

    Bandos, Andriy I.; Obuchowski, Nancy A.

    2017-03-01

Free-response ROC (FROC) data are typically collected when the primary question of interest focuses on the proportions of correct detection-localization of known targets and the frequencies of false positive responses, which can be multiple per subject (image). These studies are particularly relevant for CAD and related applications. The fundamental tool of location-level FROC analysis is the FROC curve. Although there are many methods of FROC analysis, as we describe in this work, some of the standard and popular approaches, while important, are not suitable for analyzing specifically the location-level FROC performance as summarized by the FROC curve. Analysis of the FROC curve, on the other hand, might not be straightforward. Recently we developed an approach for the location-level analysis of FROC data using the well-known tools for clustered ROC analysis. In the current work, based on previously developed concepts, and using specific examples, we demonstrate the key reasons why location-level FROC performance cannot be fully addressed by the common approaches, and we illustrate the proposed solution. Specifically, we consider the two most salient FROC approaches, namely JAFROC and the area under the exponentially transformed FROC curve (AFE), and show that clearly superior FROC curves can have lower values for these indices. We describe the specific features that make these approaches inconsistent with FROC curves. This work illustrates some caveats of using the common approaches for location-level FROC analysis and provides guidelines for the appropriate assessment or comparison of FROC systems.

  7. Multielevation calibration of frequency-domain electromagnetic data

    USGS Publications Warehouse

    Minsley, Burke J.; Kass, M. Andy; Hodges, Greg; Smith, Bruce D.

    2014-01-01

Systematic calibration errors must be taken into account because they can substantially impact the accuracy of inverted subsurface resistivity models derived from frequency-domain electromagnetic data, resulting in potentially misleading interpretations. We have developed an approach that uses data acquired at multiple elevations over the same location to assess calibration errors. A significant advantage is that this method does not require prior knowledge of subsurface properties from borehole or ground geophysical data (though these can be readily incorporated if available), and is, therefore, well suited to remote areas. The multielevation data were used to solve for calibration parameters and a single subsurface resistivity model that are self-consistent over all elevations. The deterministic and Bayesian formulations of the multielevation approach illustrate parameter sensitivity and uncertainty using synthetic- and field-data examples. Multiplicative calibration errors (gain and phase) were found to be better resolved at high frequencies and when data were acquired over a relatively conductive area, whereas additive errors (bias) were reasonably resolved over conductive and resistive areas at all frequencies. The Bayesian approach outperformed the deterministic approach when estimating calibration parameters using multielevation data at a single location; however, joint analysis of multielevation data at multiple locations using the deterministic algorithm yielded the most accurate estimates of calibration parameters. Inversion results using calibration-corrected data revealed marked improvement in misfit, lending added confidence to the interpretation of these models.

  8. Robust Pedestrian Tracking and Recognition from FLIR Video: A Unified Approach via Sparse Coding

    PubMed Central

    Li, Xin; Guo, Rui; Chen, Chao

    2014-01-01

Sparse coding is an emerging method that has been successfully applied to both robust object tracking and recognition in the vision literature. In this paper, we propose a sparse coding-based approach toward joint object tracking-and-recognition and explore its potential in the analysis of forward-looking infrared (FLIR) video to support nighttime machine vision systems. A key technical contribution of this work is to unify existing sparse coding-based approaches toward tracking and recognition under the same framework, so that they can benefit from each other in a closed loop. On the one hand, tracking the same object through temporal frames allows us to achieve improved recognition performance through dynamic updating of the template/dictionary and combining multiple recognition results; on the other hand, the recognition of individual objects facilitates the tracking of multiple objects (i.e., walking pedestrians), especially in the presence of occlusion within a crowded environment. We report experimental results on both the CASIA Pedestrian Database and our own collected FLIR video database to demonstrate the effectiveness of the proposed joint tracking-and-recognition approach. PMID:24961216

  9. Satellite Remote Sensing of Harmful Algal Blooms (HABs) and a Potential Synthesized Framework

    PubMed Central

    Shen, Li; Xu, Huiping; Guo, Xulin

    2012-01-01

Harmful algal blooms (HABs) are severe ecological disasters threatening aquatic systems throughout the world, which necessitates scientific efforts in detecting and monitoring them. Compared with traditional in situ point observations, satellite remote sensing is considered as a promising technique for studying HABs due to its advantages of large-scale, real-time, and long-term monitoring. The present review summarizes the suitability of current satellite data sources and different algorithms for detecting HABs. It also discusses the spatial scale issue of HABs. Based on the major problems identified from previous literature, including the unsystematic understanding of HABs, the insufficient incorporation of satellite remote sensing, and a lack of multiple oceanographic explanations of the mechanisms causing HABs, this review also attempts to provide a comprehensive understanding of the complicated mechanism of HABs impacted by multiple oceanographic factors. A potential synthesized framework can be established by combining multiple accessible satellite remote sensing approaches including visual interpretation, spectra analysis, parameter retrieval and spatial-temporal pattern analysis. This framework aims to lead to a systematic and comprehensive monitoring of HABs based on satellite remote sensing from multiple oceanographic perspectives. PMID:22969372

  10. Multiple imputation for cure rate quantile regression with censored data.

    PubMed

    Wu, Yuanshan; Yin, Guosheng

    2017-03-01

    The main challenge in the context of cure rate analysis is that one never knows whether censored subjects are cured or uncured, or whether they are susceptible or insusceptible to the event of interest. Considering the susceptible indicator as missing data, we propose a multiple imputation approach to cure rate quantile regression for censored data with a survival fraction. We develop an iterative algorithm to estimate the conditionally uncured probability for each subject. By utilizing this estimated probability and Bernoulli sample imputation, we can classify each subject as cured or uncured, and then employ the locally weighted method to estimate the quantile regression coefficients with only the uncured subjects. Repeating the imputation procedure multiple times and taking an average over the resultant estimators, we obtain consistent estimators for the quantile regression coefficients. Our approach relaxes the usual global linearity assumption, so that we can apply quantile regression to any particular quantile of interest. We establish asymptotic properties for the proposed estimators, including both consistency and asymptotic normality. We conduct simulation studies to assess the finite-sample performance of the proposed multiple imputation method and apply it to a lung cancer study as an illustration. © 2016, The International Biometric Society.
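The Bernoulli-imputation-and-average idea can be sketched as follows. This is a toy illustration: the uncured probabilities are taken as given (the actual method estimates them iteratively), a simple mean stands in for the locally weighted quantile-regression fit, and all data are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: observed times, censoring flags, and an (assumed known here)
# probability that each censored subject is uncured.
time = np.array([1.2, 0.7, 2.5, 3.1, 0.9, 4.0])
censored = np.array([False, False, True, True, False, True])
p_uncured = np.array([1.0, 1.0, 0.6, 0.3, 1.0, 0.1])

M = 200
estimates = []
for _ in range(M):
    # Bernoulli imputation of the latent uncured indicator; uncensored
    # subjects who experienced the event are uncured by definition.
    uncured = np.where(censored, rng.random(len(time)) < p_uncured, True)
    # Placeholder analysis step: in the paper this is the locally weighted
    # quantile-regression fit restricted to the imputed-uncured subjects.
    estimates.append(time[uncured].mean())

estimate = np.mean(estimates)   # average over the M imputations
```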

  11. A Decentralized Adaptive Approach to Fault Tolerant Flight Control

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva; Nikulin, Vladimir; Heimes, Felix; Shormin, Victor

    2000-01-01

This paper briefly reports some results of our study on the application of a decentralized adaptive control approach to a 6 DOF nonlinear aircraft model. The simulation results showed the potential of using this approach to achieve fault tolerant control. Based on this observation and some analysis, the paper proposes a multiple channel adaptive control scheme that makes use of the functionally redundant actuating and sensing capabilities in the model, and explains how to implement the scheme to tolerate actuator and sensor failures. The conditions under which the scheme is applicable are stated in the paper.

  12. Single Marker and Haplotype-Based Association Analysis of Semolina and Pasta Colour in Elite Durum Wheat Breeding Lines Using a High-Density Consensus Map.

    PubMed

    N'Diaye, Amidou; Haile, Jemanesh K; Cory, Aron T; Clarke, Fran R; Clarke, John M; Knox, Ron E; Pozniak, Curtis J

    2017-01-01

    Association mapping is usually performed by testing the correlation between a single marker and phenotypes. However, because patterns of variation within genomes are inherited as blocks, clustering markers into haplotypes for genome-wide scans could be a worthwhile approach to improve statistical power to detect associations. The availability of high-density molecular data allows the possibility to assess the potential of both approaches to identify marker-trait associations in durum wheat. In the present study, we used single marker- and haplotype-based approaches to identify loci associated with semolina and pasta colour in durum wheat, the main objective being to evaluate the potential benefits of haplotype-based analysis for identifying quantitative trait loci. One hundred sixty-nine durum lines were genotyped using the Illumina 90K Infinium iSelect assay, and 12,234 polymorphic single nucleotide polymorphism (SNP) markers were generated and used to assess the population structure and the linkage disequilibrium (LD) patterns. A total of 8,581 SNPs previously localized to a high-density consensus map were clustered into 406 haplotype blocks based on the average LD distance of 5.3 cM. Combining multiple SNPs into haplotype blocks increased the average polymorphism information content (PIC) from 0.27 per SNP to 0.50 per haplotype. The haplotype-based analysis identified 12 loci associated with grain pigment colour traits, including the five loci identified by the single marker-based analysis. Furthermore, the haplotype-based analysis resulted in an increase of the phenotypic variance explained (50.4% on average) and the allelic effect (33.7% on average) when compared to single marker analysis. The presence of multiple allelic combinations within each haplotype locus offers potential for screening the most favorable haplotype series and may facilitate marker-assisted selection of grain pigment colour in durum wheat. 
These results suggest a benefit of haplotype-based analysis over single marker analysis to detect loci associated with colour traits in durum wheat.
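The gain in polymorphism information content (PIC) from pooling SNPs into haplotype blocks can be illustrated with the standard PIC formula (Botstein-style formulation; the allele and haplotype frequencies below are hypothetical):

```python
def pic(freqs):
    # Polymorphism information content for a marker with the given allele
    # frequencies: 1 - sum(p_i^2) - sum over pairs of 2 * p_i^2 * p_j^2.
    s = sum(p * p for p in freqs)
    cross = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
                for i in range(len(freqs))
                for j in range(i + 1, len(freqs)))
    return 1.0 - s - cross

snp = pic([0.5, 0.5])            # biallelic SNP: capped at 0.375
hap = pic([0.4, 0.3, 0.2, 0.1])  # multi-allele haplotype block
```

Because a biallelic SNP can never exceed PIC = 0.375 while a multi-allelic haplotype block can, clustering markers into haplotypes raises the average information content, consistent with the increase from 0.27 to 0.50 reported above.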

  13. A study and experiment plan for digital mobile communication via satellite

    NASA Technical Reports Server (NTRS)

    Jones, J. J.; Craighill, E. J.; Evans, R. G.; Vincze, A. D.; Tom, N. N.

    1978-01-01

The viability of mobile communications is examined within the context of a frequency division multiple access, single channel per carrier satellite system emphasizing digital techniques to serve a large population of users. The intent is to provide the mobile users with a grade of service consistent with the requirements for remote, rural (perhaps emergency) voice communications, but which approaches toll quality speech. A traffic model is derived on which to base the determination of the required maximum number of satellite channels to provide the anticipated level of service. Various voice digitization and digital modulation schemes are reviewed along with a general link analysis of the mobile system. Demand assignment multiple access considerations and analysis tradeoffs are presented. Finally, a complete configuration is described.

  14. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  15. Unified Database for Rejected Image Analysis Across Multiple Vendors in Radiography.

    PubMed

    Little, Kevin J; Reiser, Ingrid; Liu, Lili; Kinsey, Tiffany; Sánchez, Adrian A; Haas, Kateland; Mallory, Florence; Froman, Carmen; Lu, Zheng Feng

    2017-02-01

    Reject rate analysis has been part of radiography departments' quality control since the days of screen-film radiography. In the era of digital radiography, one might expect that reject rate analysis is easily facilitated because of readily available information produced by the modality during the examination procedure. Unfortunately, this is not always the case. The lack of an industry standard and the wide variety of system log entries and formats have made it difficult to implement a robust multivendor reject analysis program, and logs do not always include all relevant information. The increased use of digital detectors exacerbates this problem because of higher reject rates associated with digital radiography compared with computed radiography. In this article, the authors report on the development of a unified database for vendor-neutral reject analysis across multiple sites within an academic institution and share their experience from a team-based approach to reduce reject rates. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  16. Chemical purity using quantitative 1H-nuclear magnetic resonance: a hierarchical Bayesian approach for traceable calibrations

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Lippa, Katrice A.

    2016-10-01

    Chemical purity assessment using quantitative 1H-nuclear magnetic resonance spectroscopy is a method based on ratio references of mass and signal intensity of the analyte species to that of chemical standards of known purity. As such, it is an example of a calculation using a known measurement equation with multiple inputs. Though multiple samples are often analyzed during purity evaluations in order to assess measurement repeatability, the uncertainty evaluation must also account for contributions from inputs to the measurement equation. Furthermore, there may be other uncertainty components inherent in the experimental design, such as independent implementation of multiple calibration standards. As such, the uncertainty evaluation is not purely bottom up (based on the measurement equation) or top down (based on the experimental design), but inherently contains elements of both. This hybrid form of uncertainty analysis is readily implemented with Bayesian statistical analysis. In this article we describe this type of analysis in detail and illustrate it using data from an evaluation of chemical purity and its uncertainty for a folic acid material.
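As a simplified illustration of propagating input uncertainty through the qNMR measurement equation, consider a plain Monte Carlo sketch. The paper's actual analysis is hierarchical Bayesian; all input values and uncertainties here are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# qNMR measurement equation (schematically):
#   P_a = (I_a/I_s) * (N_s/N_a) * (M_a/M_s) * (m_s/m_a) * P_s
# Each input is sampled from a distribution reflecting its uncertainty.
I_ratio = rng.normal(0.9980, 0.0030, n)    # signal intensity ratio I_a/I_s
m_ratio = rng.normal(1.0010, 0.0010, n)    # mass ratio m_s/m_a
MN_ratio = 1.0                             # molar-mass and proton-count
                                           # ratios, taken as exact here
P_s = rng.normal(0.99990, 0.00010, n)      # purity of the standard

P_a = I_ratio * MN_ratio * m_ratio * P_s
purity, u = P_a.mean(), P_a.std()          # estimate and its uncertainty
```

A hierarchical Bayesian treatment additionally pools information across replicate samples and multiple calibration standards, which this one-level sketch cannot capture.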

  17. Learning in Earth and space science: a review of conceptual change instructional approaches

    NASA Astrophysics Data System (ADS)

    Mills, Reece; Tomas, Louisa; Lewthwaite, Brian

    2016-03-01

    In response to calls for research into effective instruction in the Earth and space sciences, and to identify directions for future research, this systematic review of the literature explores research into instructional approaches designed to facilitate conceptual change. In total, 52 studies were identified and analyzed. Analysis focused on the general characteristics of the research, the conceptual change instructional approaches that were used, and the methods employed to evaluate the effectiveness of these approaches. The findings of this review support four assertions about the existing research: (1) astronomical phenomena have received greater attention than geological phenomena; (2) most studies have viewed conceptual change from a cognitive perspective only; (3) data about conceptual change were generated pre- and post-intervention only; and (4) the interventions reviewed presented limited opportunities to involve students in the construction and manipulation of multiple representations of the phenomenon being investigated. Based upon these assertions, the authors recommend that new research in the Earth and space science disciplines challenges traditional notions of conceptual change by exploring the role of affective variables on learning, focuses on the learning of geological phenomena through the construction of multiple representations, and employs qualitative data collection throughout the implementation of an instructional approach.

  18. Biochemometrics for Natural Products Research: Comparison of Data Analysis Approaches and Application to Identification of Bioactive Compounds.

    PubMed

    Kellogg, Joshua J; Todd, Daniel A; Egan, Joseph M; Raja, Huzefa A; Oberlies, Nicholas H; Kvalheim, Olav M; Cech, Nadja B

    2016-02-26

A central challenge of natural products research is identifying the bioactive compounds in complex mixtures. The gold standard approach to address this challenge, bioassay-guided fractionation, is often biased toward abundant, rather than bioactive, mixture components. This study evaluated the combination of bioassay-guided fractionation with untargeted metabolite profiling to improve active component identification early in the fractionation process. Key to this methodology was statistical modeling of the integrated biological and chemical data sets (biochemometric analysis). Three data analysis approaches for biochemometric analysis were compared, namely, partial least-squares loading vectors, S-plots, and the selectivity ratio. Extracts from the endophytic fungi Alternaria sp. and Pyrenochaeta sp. with antimicrobial activity against Staphylococcus aureus served as test cases. Biochemometric analysis incorporating the selectivity ratio performed best in identifying bioactive ions from these extracts early in the fractionation process, yielding altersetin (3, MIC 0.23 μg/mL) and macrosphelide A (4, MIC 75 μg/mL) as antibacterial constituents from Alternaria sp. and Pyrenochaeta sp., respectively. This study demonstrates the potential of biochemometrics coupled with bioassay-guided fractionation to identify bioactive mixture components. A benefit of this approach is the ability to integrate multiple stages of fractionation and bioassay data into a single analysis.

  19. Use Hierarchical Storage and Analysis to Exploit Intrinsic Parallelism

    NASA Astrophysics Data System (ADS)

    Zender, C. S.; Wang, W.; Vicente, P.

    2013-12-01

Big Data is an ugly name for the scientific opportunities and challenges created by the growing wealth of geoscience data. How to weave large, disparate datasets together to best reveal their underlying properties, to exploit their strengths and minimize their weaknesses, to continually aggregate more information than the world knew yesterday and less than we will learn tomorrow? Data analytics techniques (statistics, data mining, machine learning, etc.) can accelerate pattern recognition and discovery. However, often researchers must, prior to analysis, organize multiple related datasets into a coherent framework. Hierarchical organization permits entire datasets to be stored in nested groups that reflect their intrinsic relationships and similarities. Hierarchical data can be simpler and faster to analyze because operators can be coded to automatically parallelize processes over isomorphic storage units, i.e., groups. The newest generation of netCDF Operators (NCO) embody this hierarchical approach, while still supporting traditional analysis approaches. We will use NCO to demonstrate the trade-offs involved in processing a prototypical Big Data application (analysis of CMIP5 datasets) using hierarchical and traditional analysis approaches.

  20. Integrated mixed methods policy analysis for sustainable food systems: trends, challenges and future research.

    PubMed

    Cuevas, Soledad

Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated with deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempt from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.

  1. VIBRA: An interactive computer program for steady-state vibration response analysis of linear damped structures

    NASA Technical Reports Server (NTRS)

    Bowman, L. M.

    1984-01-01

An interactive steady-state frequency response computer program with graphics is documented. Single or multiple forces may be applied to the structure using a modal superposition approach to calculate response. The method can be applied to linear, proportionally damped structures in which the damping may be viscous or structural. The theoretical approach and program organization are described. Example problems, user instructions, and a sample interactive session are given to demonstrate the program's capability in solving a variety of problems.

  2. The Treatment Efficacy of Multiple Opposition Phonological Approach via Telepractice for Two Children with Severe Phonological Disorders in Rural Areas of West Texas in the USA

    ERIC Educational Resources Information Center

    Lee, Sue Ann S.

    2018-01-01

    The goals of the present study were to (1) examine the effects of the multiple opposition phonological approach on improving phoneme production accuracy in children with severe phonological disorders and (2) explore whether the multiple opposition approach is feasible for the telepractice service delivery model. A multiple-baseline,…

  3. A structured framework for assessing sensitivity to missing data assumptions in longitudinal clinical trials.

    PubMed

    Mallinckrodt, C H; Lin, Q; Molenberghs, M

    2013-01-01

The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result to those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern mixture models and shared parameter models) and to MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that after dropout the trajectory of drug-treated patients was that of placebo-treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was - 2.79 (p = .013). In placebo multiple imputation, the result was - 2.17. Results from the other sensitivity analyses ranged from - 2.21 to - 3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from missing not at random data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework of using a worst reasonable case result based on a controlled imputation approach with transparent and debatable assumptions, supplemented by a series of plausible alternative models under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework. Copyright © 2012 John Wiley & Sons, Ltd.
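Pooling results across imputed datasets, as in a multiple-imputation sensitivity analysis, is conventionally done with Rubin's rules; a minimal sketch with made-up per-imputation estimates and variances:

```python
import numpy as np

# Hypothetical per-imputation treatment contrasts and their variances
# (M = 5 imputed datasets; all numbers invented for illustration).
theta = np.array([-2.1, -2.3, -2.0, -2.4, -2.2])   # point estimates
var = np.array([0.50, 0.55, 0.48, 0.52, 0.50])     # within-imputation vars

M = len(theta)
pooled = theta.mean()                        # Rubin's pooled estimate
within = var.mean()                          # average within-imputation var
between = theta.var(ddof=1)                  # between-imputation variance
total_var = within + (1 + 1 / M) * between   # Rubin's total variance
```

The total variance inflates the within-imputation variance by the between-imputation spread, so uncertainty about the missing data is carried into the final inference.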

  4. A note on the use of the generalized odds ratio in meta-analysis of association studies involving bi- and tri-allelic polymorphisms.

    PubMed

    Pereira, Tiago V; Mingroni-Netto, Regina C

    2011-06-06

    The generalized odds ratio (GOR) was recently suggested as a genetic model-free measure for association studies. However, its properties were not extensively investigated. We used Monte Carlo simulations to investigate type-I error rates, power and bias in both effect size and between-study variance estimates of meta-analyses using the GOR as a summary effect, and compared these results to those obtained by usual approaches of model specification. We further applied the GOR in a real meta-analysis of three genome-wide association studies in Alzheimer's disease. For bi-allelic polymorphisms, the GOR performs virtually identical to a standard multiplicative model of analysis (e.g. per-allele odds ratio) for variants acting multiplicatively, but augments slightly the power to detect variants with a dominant mode of action, while reducing the probability to detect recessive variants. Although there were differences among the GOR and usual approaches in terms of bias and type-I error rates, both simulation- and real data-based results provided little indication that these differences will be substantial in practice for meta-analyses involving bi-allelic polymorphisms. However, the use of the GOR may be slightly more powerful for the synthesis of data from tri-allelic variants, particularly when susceptibility alleles are less common in the populations (≤10%). This gain in power may depend on knowledge of the direction of the effects. For the synthesis of data from bi-allelic variants, the GOR may be regarded as a multiplicative-like model of analysis. The use of the GOR may be slightly more powerful in the tri-allelic case, particularly when susceptibility alleles are less common in the populations.
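For an ordered genotype classification, the GOR can be computed directly from case and control counts; a minimal sketch with illustrative counts (the probability-of-superiority formulation here is one common way to express the GOR):

```python
def generalized_odds_ratio(cases, controls):
    # GOR for ordered genotype counts (e.g. 0/1/2 copies of a risk allele):
    # the probability that a random case carries more risk alleles than a
    # random control, divided by the reverse probability; ties are ignored.
    conc = sum(cases[i] * controls[j]
               for i in range(len(cases)) for j in range(len(controls))
               if i > j)
    disc = sum(cases[i] * controls[j]
               for i in range(len(cases)) for j in range(len(controls))
               if i < j)
    return conc / disc

# Hypothetical counts for genotypes with 0, 1 and 2 copies of the risk allele
gor = generalized_odds_ratio(cases=[30, 50, 20], controls=[50, 40, 10])
```

Because no genetic model (dominant, recessive, multiplicative) is imposed, the same statistic extends unchanged to tri-allelic variants by ordering the genotype categories.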

  5. Velocity landscape correlation resolves multiple flowing protein populations from fluorescence image time series.

    PubMed

    Pandžić, Elvis; Abu-Arish, Asmahan; Whan, Renee M; Hanrahan, John W; Wiseman, Paul W

    2018-02-16

    Molecular, vesicular and organellar flows are of fundamental importance for the delivery of nutrients and essential components used in cellular functions such as motility and division. With recent advances in fluorescence/super-resolution microscopy modalities we can resolve the movements of these objects at higher spatio-temporal resolutions and with better sensitivity. Previously, spatio-temporal image correlation spectroscopy has been applied to map molecular flows by correlation analysis of fluorescence fluctuations in image series. However, an underlying assumption of this approach is that the sampled time windows contain one dominant flowing component. Although this was true for most of the cases analyzed earlier, in some situations two or more different flowing populations can be present in the same spatio-temporal window. We introduce an approach, termed velocity landscape correlation (VLC), which detects and extracts multiple flow components present in a sampled image region via an extension of the correlation analysis of fluorescence intensity fluctuations. First, we demonstrate theoretically how this approach works and test the performance of the method with a range of computer-simulated image series with varying flow dynamics. Finally, we apply VLC to study variable fluxing of STIM1 proteins on microtubules connected to the plasma membrane of Cystic Fibrosis Bronchial Epithelial (CFBE) cells. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Pathways between Socioeconomic Disadvantage and Childhood Growth in the Scottish Longitudinal Study, 1991-2001.

    PubMed

    Silverwood, Richard J; Williamson, Lee; Grundy, Emily M; De Stavola, Bianca L

    2016-01-01

    Socioeconomically disadvantaged children are more likely to be of shorter stature and overweight, leading to greater risk of obesity in adulthood. Disentangling the mediatory pathways between socioeconomic disadvantage and childhood size may help in the development of appropriate policies aimed at reducing these health inequalities. We aimed to elucidate the putative mediatory role of birth weight using a representative sample of the Scottish population born 1991-2001 (n = 16,628). Estimated height and overweight/obesity at age 4.5 years were related to three measures of socioeconomic disadvantage (mother's education, Scottish Index of Multiple Deprivation, synthetic weekly income). Mediation was examined using two approaches: a 'traditional' mediation analysis and a counterfactual-based mediation analysis. Both analyses identified a negative effect of each measure of socioeconomic disadvantage on height, mediated to some extent by birth weight, and a positive 'direct effect' of mother's education and Scottish Index of Multiple Deprivation on overweight/obesity, which was partly counterbalanced by a negative 'indirect effect'. The extent of mediation estimated when adopting the traditional approach was greater than when adopting the counterfactual-based approach because of inappropriate handling of intermediate confounding in the former. Our findings suggest that higher birth weight in more disadvantaged groups is associated with reduced social inequalities in height but also with increased inequalities in overweight/obesity.

  7. Seismic joint analysis for non-destructive testing of asphalt and concrete slabs

    USGS Publications Warehouse

    Ryden, N.; Park, C.B.

    2005-01-01

    A seismic approach is used to estimate the thickness and elastic stiffness constants of asphalt or concrete slabs. The overall concept of the approach utilizes the robustness of the multichannel seismic method. A multichannel-equivalent data set is compiled from multiple time series recorded from multiple hammer impacts at progressively different offsets from a fixed receiver. This multichannel simulation with one receiver (MSOR) replaces true multichannel recording in a cost-effective and convenient manner. A recorded data set is first processed to evaluate the shear wave velocity through a wave field transformation, normally used in the multichannel analysis of surface waves (MASW) method, followed by a Lamb-wave inversion. Then, the same data set is used to evaluate compression wave velocity from a combined processing of first-arrival picking and a linear regression. Finally, the amplitude spectra of the time series are used to evaluate the thickness by following the concepts utilized in the Impact Echo (IE) method. Due to the powerful signal extraction capabilities ensured by the multichannel processing schemes used, the entire procedure for all three evaluations can be fully automated and results can be obtained directly in the field. A field data set is used to demonstrate the proposed approach.

  8. A fully automatic three-step liver segmentation method on LDA-based probability maps for multiple contrast MR images.

    PubMed

    Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf

    2010-07-01

    Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. There exist numerous approaches for automatic 3D liver segmentation on computed tomography data sets that have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information of different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps that are then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region growing approach and a further threshold technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied to both normal and fat-accumulated liver tissue properties. Copyright 2010 Elsevier Inc. All rights reserved.

  9. Power calculation for comparing diagnostic accuracies in a multi-reader, multi-test design.

    PubMed

    Kim, Eunhee; Zhang, Zheng; Wang, Youdan; Zeng, Donglin

    2014-12-01

    Receiver operating characteristic (ROC) analysis is widely used to evaluate the performance of diagnostic tests with continuous or ordinal responses. A popular study design for assessing the accuracy of diagnostic tests involves multiple readers interpreting multiple diagnostic test results, called the multi-reader, multi-test design. Although several different approaches to analyzing data from this design exist, few have addressed sample size and power issues. In this article, we develop a power formula to compare the correlated areas under the ROC curves (AUCs) in a multi-reader, multi-test design. We present a nonparametric approach to estimate and compare the correlated AUCs by extending DeLong et al.'s (1988, Biometrics 44, 837-845) approach. A power formula is derived based on the asymptotic distribution of the nonparametric AUCs. Simulation studies are conducted to demonstrate the performance of the proposed power formula, and an example is provided to illustrate the proposed procedure. © 2014, The International Biometric Society.
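
    The nonparametric AUC at the heart of the DeLong-style comparison is the Mann-Whitney two-sample statistic. A minimal sketch with toy data (illustrative only; the paper's power formula additionally needs the asymptotic covariance of such estimates, which is omitted here):

```python
def auc_mann_whitney(diseased, healthy):
    """Nonparametric AUC estimate in its Mann-Whitney form: the fraction
    of (diseased, healthy) score pairs in which the diseased subject
    scores higher, with ties counted as 1/2."""
    total = 0.0
    for x in diseased:
        for y in healthy:
            if x > y:
                total += 1.0
            elif x == y:
                total += 0.5
    return total / (len(diseased) * len(healthy))

# hypothetical reader scores for diseased and healthy subjects
diseased = [3.1, 2.8, 3.6, 2.2, 3.0]
healthy = [2.0, 2.8, 1.9, 2.5, 1.4]
print(auc_mann_whitney(diseased, healthy))
```

    An AUC of 0.5 corresponds to a non-informative test; comparing two tests read by the same readers requires accounting for the correlation between their AUC estimates, which is what the extended DeLong machinery provides.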

  10. Spectral decompositions of multiple time series: a Bayesian non-parametric approach.

    PubMed

    Macaro, Christian; Prado, Raquel

    2014-01-01

    We consider spectral decompositions of multiple time series that arise in studies where the interest lies in assessing the influence of two or more factors. We write the spectral density of each time series as a sum of the spectral densities associated to the different levels of the factors. We then use Whittle's approximation to the likelihood function and follow a Bayesian non-parametric approach to obtain posterior inference on the spectral densities based on Bernstein-Dirichlet prior distributions. The prior is strategically important as it carries identifiability conditions for the models and allows us to quantify our degree of confidence in such conditions. A Markov chain Monte Carlo (MCMC) algorithm for posterior inference within this class of frequency-domain models is presented. We illustrate the approach by analyzing simulated and real data via spectral one-way and two-way models. In particular, we present an analysis of functional magnetic resonance imaging (fMRI) brain responses measured in individuals who participated in a designed experiment to study pain perception in humans.
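
    Whittle's approximation rests on the fact that, for large n, the raw periodogram ordinates at the Fourier frequencies are approximately independent exponential variables whose means equal the spectral density. A minimal sketch of the periodogram itself (illustrative only; the Bernstein-Dirichlet prior machinery is well beyond this snippet):

```python
import cmath
import math

def periodogram(x):
    """Raw periodogram I(f_k) = |DFT(x)_k|^2 / (2*pi*n) of a demeaned
    series at Fourier frequencies f_k = k/n, k = 1..n//2. The Whittle
    likelihood treats these ordinates as approximately independent
    Exponential variables with mean equal to the spectral density."""
    n = len(x)
    xbar = sum(x) / n
    out = []
    for k in range(1, n // 2 + 1):
        d = sum((x[t] - xbar) * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        out.append(abs(d) ** 2 / (2 * math.pi * n))
    return out

# a pure cosine at Fourier frequency 2/8 concentrates all power
# in the single ordinate k = 2
x = [math.cos(2 * math.pi * 2 * t / 8) for t in range(8)]
print([round(v, 4) for v in periodogram(x)])
```

    In a one-way spectral model, each series' spectral density would be written as a sum of component densities, and the likelihood of the observed periodogram ordinates ties the data to those components.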

  11. Local Geometry and Evolutionary Conservation of Protein Surfaces Reveal the Multiple Recognition Patches in Protein-Protein Interactions

    PubMed Central

    Laine, Elodie; Carbone, Alessandra

    2015-01-01

    Protein-protein interactions (PPIs) are essential to all biological processes and they represent increasingly important therapeutic targets. Here, we present a new method for accurately predicting protein-protein interfaces and understanding their properties, origins and binding to multiple partners. In contrast to machine learning approaches, our method combines, in a rational and very straightforward way, three sequence- and structure-based descriptors of protein residues: evolutionary conservation, physico-chemical properties and local geometry. The implemented strategy yields very precise predictions for a wide range of protein-protein interfaces and discriminates them from small-molecule binding sites. Beyond its predictive power, the approach permits dissection of interaction surfaces and unraveling of their complexity. We show how the analysis of the predicted patches can foster new strategies for PPI modulation and interaction surface redesign. The approach is implemented in JET2, an automated tool based on the Joint Evolutionary Trees (JET) method for sequence-based protein interface prediction. JET2 is freely available at www.lcqb.upmc.fr/JET2. PMID:26690684

  12. Gene features selection for three-class disease classification via multiple orthogonal partial least square discriminant analysis and S-plot using microarray data.

    PubMed

    Yang, Mingxing; Li, Xiumin; Li, Zhibin; Ou, Zhimin; Liu, Ming; Liu, Suhuan; Li, Xuejun; Yang, Shuyu

    2013-01-01

    DNA microarray analysis is characterized by obtaining a large number of gene variables from a small number of observations. Cluster analysis is widely used to analyze DNA microarray data to make classification and diagnosis of disease. Because there are so many irrelevant and insignificant genes in a dataset, a feature selection approach must be employed in data analysis. The performance of cluster analysis of this high-throughput data depends on whether the feature selection approach chooses the most relevant genes associated with disease classes. Here we proposed a new method using multiple Orthogonal Partial Least Squares-Discriminant Analysis (mOPLS-DA) models and S-plots to select the most relevant genes to conduct three-class disease classification and prediction. We tested our method using Golub's leukemia microarray data. For three classes with subtypes, we proposed hierarchical orthogonal partial least squares-discriminant analysis (OPLS-DA) models and S-plots to select features for two main classes and their subtypes. For three classes in parallel, we employed three OPLS-DA models and S-plots to choose marker genes for each class. The power of feature selection to classify and predict three-class disease was evaluated using cluster analysis. Further, the general performance of our method was tested using four public datasets and compared with those of four other feature selection methods. The results revealed that our method effectively selected the most relevant features for disease classification and prediction, and its performance was better than that of the other methods.

  13. Multiple Learning Approaches in the Professional Development of School Leaders -- Theoretical Perspectives and Empirical Findings on Self-assessment and Feedback

    ERIC Educational Resources Information Center

    Huber, Stephan Gerhard

    2013-01-01

    This article investigates the use of multiple learning approaches and different modes and types of learning in the (continuous) professional development (PD) of school leaders, particularly the use of self-assessment and feedback. First, formats and multiple approaches to professional learning are described. Second, a possible approach to…

  14. Histogram analysis parameters identify multiple associations between DWI and DCE MRI in head and neck squamous cell carcinoma.

    PubMed

    Meyer, Hans Jonas; Leifels, Leonard; Schob, Stefan; Garnov, Nikita; Surov, Alexey

    2018-01-01

    Nowadays, multiparametric investigations of head and neck squamous cell carcinoma (HNSCC) are well established. These approaches can better characterize tumor biology and behavior. Diffusion-weighted imaging (DWI) can quantitatively characterize different tissue compartments by means of the apparent diffusion coefficient (ADC). Dynamic contrast-enhanced magnetic resonance imaging (DCE MRI) reflects perfusion and vascularization of tissues. Recently, histogram analysis of different images has emerged as a diagnostic approach that can provide more information about tissue heterogeneity. The purpose of this study was to analyze possible associations between DWI and DCE parameters derived from histogram analysis in patients with HNSCC. Overall, 34 patients (9 women and 25 men; mean age 56.7 ± 10.2 years) with different HNSCC were involved in the study. DWI was obtained using an axial echo planar imaging sequence with b-values of 0 and 800 s/mm². A dynamic T1w DCE sequence after intravenous application of contrast medium was performed for estimation of the following perfusion parameters: volume transfer constant (Ktrans), volume of the extravascular extracellular leakage space (Ve), and diffusion of contrast medium from the extravascular extracellular leakage space back to the plasma (Kep). Both ADC and perfusion parameter maps were processed offline in DICOM format with a custom-made Matlab-based application. Thereafter, polygonal ROIs were manually drawn on the transferred maps on each slice. For every parameter, the mean, maximal, minimal, and median values, the 10th, 25th, 75th, and 90th percentiles, and kurtosis, skewness, and entropy were estimated. Correlation analysis identified multiple statistically significant correlations between the investigated parameters. Ve-related parameters correlated well with different ADC values. In particular, the 10th and 75th percentiles, mode, and median values showed stronger correlations than the other parameters, with correlation coefficients ranging from 0.62 to 0.69. Furthermore, Ktrans-related parameters showed multiple slight to moderate significant correlations with different ADC values. The strongest correlations were identified between ADC P75 and Ktrans min (ρ = 0.58, P = 0.0007), and between ADC P75 and Ktrans P10 (ρ = 0.56, P = 0.001). Only four Kep-related parameters correlated statistically significantly with ADC fractions. The strongest correlation was found between Kep max and ADC mode (ρ = -0.47, P = 0.008). Multiple statistically significant correlations between DWI and DCE MRI parameters derived from histogram analysis were identified in HNSCC. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Combining evidence from multiple electronic health care databases: performances of one-stage and two-stage meta-analysis in matched case-control studies.

    PubMed

    La Gamba, Fabiola; Corrao, Giovanni; Romio, Silvana; Sturkenboom, Miriam; Trifirò, Gianluca; Schink, Tania; de Ridder, Maria

    2017-10-01

    Clustering of patients in databases is usually ignored in one-stage meta-analysis of multi-database studies using matched case-control data. The aim of this study was to compare bias and efficiency of such a one-stage meta-analysis with a two-stage meta-analysis. First, we compared the approaches by generating matched case-control data under 5 simulated scenarios, built by varying: (1) the exposure-outcome association; (2) its variability among databases; (3) the confounding strength of one covariate on this association; (4) its variability; and (5) the (heterogeneous) confounding strength of two covariates. Second, we made the same comparison using empirical data from the ARITMO project, a multiple database study investigating the risk of ventricular arrhythmia following the use of medications with arrhythmogenic potential. In our study, we specifically investigated the effect of current use of promethazine. Bias increased for one-stage meta-analysis with increasing (1) between-database variance of exposure effect and (2) heterogeneous confounding generated by two covariates. The efficiency of one-stage meta-analysis was slightly lower than that of two-stage meta-analysis for the majority of investigated scenarios. Based on ARITMO data, there were no evident differences between one-stage (OR = 1.50, CI = [1.08; 2.08]) and two-stage (OR = 1.55, CI = [1.12; 2.16]) approaches. When the effect of interest is heterogeneous, a one-stage meta-analysis ignoring clustering gives biased estimates. Two-stage meta-analysis generates estimates at least as accurate and precise as one-stage meta-analysis. However, in a study using small databases and rare exposures and/or outcomes, a correct one-stage meta-analysis becomes essential. Copyright © 2017 John Wiley & Sons, Ltd.
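
    The second (pooling) stage of a two-stage meta-analysis can be sketched with standard inverse-variance weighting of per-database log odds ratios. The numbers below are hypothetical, and a fixed-effect model is assumed for brevity (the heterogeneous scenarios in the study would call for a random-effects variant):

```python
import math

def pool_fixed_effect(odds_ratios, standard_errors):
    """Second stage of a two-stage meta-analysis: combine per-database
    log odds ratios with inverse-variance weights (fixed-effect model)
    and return the pooled OR with a 95% confidence interval."""
    logs = [math.log(o) for o in odds_ratios]
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    lo = pooled - 1.96 * se_pooled
    hi = pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

# hypothetical per-database ORs and standard errors of the log OR
or_pooled, ci = pool_fixed_effect([1.4, 1.7, 1.2], [0.20, 0.25, 0.30])
print(round(or_pooled, 2), tuple(round(v, 2) for v in ci))
```

    A one-stage analysis would instead fit a single model to the stacked individual-level data; as the abstract notes, ignoring database clustering in that model biases the estimate when the exposure effect is heterogeneous.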

  16. Effects of Rhythmic Auditory Cueing in Gait Rehabilitation for Multiple Sclerosis: A Mini Systematic Review and Meta-Analysis

    PubMed Central

    Ghai, Shashank; Ghai, Ishan

    2018-01-01

    Rhythmic auditory cueing has been shown to enhance gait performance in several movement disorders. The "entrainment effect" generated by the stimulations can enhance auditory-motor coupling and instigate plasticity. However, a consensus regarding its influence on gait training in patients with multiple sclerosis is still lacking. A systematic review and meta-analysis was carried out to analyze the effects of rhythmic auditory cueing on gait performance in patients with multiple sclerosis. This systematic identification of published literature was performed according to PRISMA guidelines, from inception until Dec 2017, on the online databases Web of Science, PEDro, EBSCO, MEDLINE, Cochrane, EMBASE, and PROQUEST. Studies were critically appraised using the PEDro scale. Of 602 records, five studies (PEDro score: 5.7 ± 1.3) involving 188 participants (144 females/40 males) met our inclusion criteria. The meta-analysis revealed enhancements in spatiotemporal parameters of gait, i.e. velocity (Hedges' g: 0.67), stride length (0.70), and cadence (1.0), and a reduction in the timed 25-foot walking test (-0.17). Underlying neurophysiological mechanisms and clinical implications are discussed. This review bridges gaps in the literature by suggesting the application of rhythmic auditory cueing in conventional rehabilitation approaches to enhance gait performance in the multiple sclerosis community. PMID:29942278
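
    The pooled effect sizes above are Hedges' g values, i.e. standardized mean differences with a small-sample bias correction applied to Cohen's d. A minimal sketch with hypothetical gait-velocity data (not data from the reviewed trials):

```python
import math

def hedges_g(group1, group2):
    """Hedges' g: standardized mean difference (Cohen's d on the pooled
    standard deviation) multiplied by the small-sample correction
    J = 1 - 3 / (4*df - 1), where df = n1 + n2 - 2."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    return d * (1 - 3 / (4 * (n1 + n2 - 2) - 1))

# hypothetical gait velocities (m/s), cued vs. uncued walking
cued = [1.10, 1.22, 1.05, 1.18, 1.30, 1.12]
uncued = [0.98, 1.05, 0.92, 1.10, 1.01, 0.95]
print(round(hedges_g(cued, uncued), 2))
```

    By convention, g around 0.2 is considered small, 0.5 medium, and 0.8 large, so the pooled cadence effect of 1.0 reported above is a large effect.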

  17. Early Parallel Activation of Semantics and Phonology in Picture Naming: Evidence from a Multiple Linear Regression MEG Study

    PubMed Central

    Miozzo, Michele; Pulvermüller, Friedemann; Hauk, Olaf

    2015-01-01

    The time course of brain activation during word production has become an area of increasingly intense investigation in cognitive neuroscience. The predominant view has been that semantic and phonological processes are activated sequentially, at about 150 and 200–400 ms after picture onset. Although evidence from prior studies has been interpreted as supporting this view, these studies were arguably not ideally suited to detect early brain activation of semantic and phonological processes. We here used a multiple linear regression approach to magnetoencephalography (MEG) analysis of picture naming in order to investigate early effects of variables specifically related to visual, semantic, and phonological processing. This was combined with distributed minimum-norm source estimation and region-of-interest analysis. Brain activation associated with visual image complexity appeared in occipital cortex at about 100 ms after picture presentation onset. At about 150 ms, semantic variables became physiologically manifest in left frontotemporal regions. In the same latency range, we found an effect of phonological variables in the left middle temporal gyrus. Our results demonstrate that multiple linear regression analysis is sensitive to early effects of multiple psycholinguistic variables in picture naming. Crucially, our results suggest that access to phonological information might begin in parallel with semantic processing around 150 ms after picture onset. PMID:25005037

  18. Handling missing rows in multi-omics data integration: multiple imputation in multiple factor analysis framework.

    PubMed

    Voillet, Valentin; Besse, Philippe; Liaubet, Laurence; San Cristobal, Magali; González, Ignacio

    2016-10-03

    In omics data integration studies, it is common, for a variety of reasons, for some individuals to not be present in all data tables. Missing row values are challenging to deal with because most statistical methods cannot be directly applied to incomplete datasets. To overcome this issue, we propose a multiple imputation (MI) approach in a multivariate framework. In this study, we focus on multiple factor analysis (MFA) as a tool to compare and integrate multiple layers of information. MI involves filling the missing rows with plausible values, resulting in M completed datasets. MFA is then applied to each completed dataset to produce M different configurations (the matrices of coordinates of individuals). Finally, the M configurations are combined to yield a single consensus solution. We assessed the performance of our method, named MI-MFA, on two real omics datasets. Incomplete artificial datasets with different patterns of missingness were created from these data. The MI-MFA results were compared with two other approaches, i.e., regularized iterative MFA (RI-MFA) and mean variable imputation (MVI-MFA). For each configuration resulting from these three strategies, the suitability of the solution was determined against the true MFA configuration obtained from the original data, and a comprehensive graphical comparison showing how the MI-, RI- or MVI-MFA configurations diverge from the true configuration was produced. Two approaches, i.e., confidence ellipses and convex hulls, to visualize and assess the uncertainty due to missing values were also described. We showed how the areas of the ellipses and convex hulls increased with the number of missing individuals. Free and easy-to-use code was provided to implement the MI-MFA method in the R statistical environment. We believe that MI-MFA provides a useful and attractive method for estimating the coordinates of individuals on the first MFA components despite missing rows. MI-MFA configurations were close to the true configuration even when many individuals were missing in several data tables. The method takes into account the uncertainty of MI-MFA configurations induced by the missing rows, thereby allowing the reliability of the results to be evaluated.

  19. The concurrent multiplicative-additive approach for gauge-radar/satellite multisensor precipitation estimates

    NASA Astrophysics Data System (ADS)

    Garcia-Pintado, J.; Barberá, G. G.; Erena Arrabal, M.; Castillo, V. M.

    2010-12-01

    Objective analysis schemes (OAS), also called "successive correction methods" or "observation nudging", have been proposed for multisensor precipitation estimation combining remote sensing data (meteorological radar or satellite) with data from ground-based raingauge networks. However, in contrast to the more complex geostatistical approaches, the OAS techniques for this use are not optimized. On the other hand, geostatistical techniques ideally require, at a minimum, modelling the covariance from the rain gauge data at every time step evaluated, which commonly cannot be soundly done. Here, we propose a new procedure (concurrent multiplicative-additive objective analysis scheme [CMA-OAS]) for operational rainfall estimation using rain gauges and meteorological radar, which does not require explicit modelling of spatial covariances. On the basis of a concurrent multiplicative-additive (CMA) decomposition of the spatially nonuniform radar bias, within-storm variability of rainfall and fractional coverage of rainfall are taken into account. Thus both spatially nonuniform radar bias, given that rainfall is detected, and bias in radar detection of rainfall are handled. The interpolation procedure of CMA-OAS is built on the OAS, whose purpose is to estimate a filtered spatial field of the variable of interest through successive correction of residuals resulting from a Gaussian kernel smoother applied to spatial samples. The CMA-OAS first poses an optimization problem at each gauge-radar support point to obtain both a local multiplicative-additive radar bias decomposition and a regionalization parameter. Second, local biases and regionalization parameters are integrated into an OAS to estimate the multisensor rainfall at the ground level. The approach treats radar estimates as background a priori information (first guess), so that nudging to observations (gauges) may be relaxed smoothly towards the first guess, with the relaxation shape obtained from the sequential optimization. The procedure is suited to relatively sparse rain gauge networks. To demonstrate the procedure, six storms are analyzed at hourly steps over 10,663 km². Results generally indicated improved quality with respect to the other methods evaluated: a standard mean-field bias adjustment, an OAS spatially variable adjustment with multiplicative factors, ordinary cokriging, and kriging with external drift. In principle, the method is equally applicable to gauge-satellite estimates and other hydrometeorological variables.
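
    The successive-correction idea underlying an OAS can be sketched in one dimension: start from a background field (the radar first guess), then repeatedly add a Gaussian-kernel-smoothed field of observation residuals, tightening the kernel at each pass. This is a bare illustration, not CMA-OAS itself (no multiplicative-additive bias decomposition, and nearest-node lookup stands in for interpolation):

```python
import math

def successive_correction(grid_x, first_guess, obs_x, obs_y, lengths):
    """1-D objective analysis sketch: starting from a background field,
    successively smooth the observation residuals with Gaussian kernels
    of decreasing length scale and add the correction to the analysis."""
    analysis = list(first_guess)
    for L in lengths:  # successively tighter kernels
        # residuals of the observations against the current analysis,
        # read off at the nearest grid node (stand-in for interpolation)
        resid = []
        for xo, yo in zip(obs_x, obs_y):
            i = min(range(len(grid_x)), key=lambda k: abs(grid_x[k] - xo))
            resid.append(yo - analysis[i])
        # kernel-weighted correction at every grid node
        for i, xg in enumerate(grid_x):
            w = [math.exp(-((xg - xo) / L) ** 2) for xo in obs_x]
            if sum(w) > 1e-12:
                analysis[i] += sum(wi * r for wi, r in zip(w, resid)) / sum(w)
    return analysis

grid = [0.0, 1.0, 2.0, 3.0, 4.0]
background = [0.0] * 5                 # e.g. a zero radar first guess
gauges_x, gauges_y = [1.0, 3.0], [2.0, 4.0]
field = successive_correction(grid, background, gauges_x, gauges_y,
                              [2.0, 1.0, 0.5])
print([round(v, 2) for v in field])
```

    After a few passes the analysis matches the gauges at their locations while relaxing smoothly towards the background in between, which is exactly the nudging behavior the abstract describes.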

  20. Measuring the Gas Constant "R": Propagation of Uncertainty and Statistics

    ERIC Educational Resources Information Center

    Olsen, Robert J.; Sattar, Simeen

    2013-01-01

    Determining the gas constant "R" by measuring the properties of hydrogen gas collected in a gas buret is well suited for comparing two approaches to uncertainty analysis using a single data set. The brevity of the experiment permits multiple determinations, allowing for statistical evaluation of the standard uncertainty u[subscript…

  1. Design Optimization through M.A.S.H. Analysis

    ERIC Educational Resources Information Center

    Ringholz, David

    2005-01-01

    In the classroom, it is often challenging to find new ways to approach and present complex material. This is particularly true in design education, where innovation is highly valued and often required. A student developing a design for a new product has to successfully resolve multiple variables simultaneously while refining his/her own…

  2. Girls in Primary School Science Classrooms: Theorising beyond Dominant Discourses of Gender

    ERIC Educational Resources Information Center

    Cervoni, Cleti; Ivinson, Gabrielle

    2011-01-01

    The paper explores the ways girls appropriate gender through actions, gesture and talk to achieve things in primary school science classrooms. It draws on socio-cultural approaches to show that when everyday classroom practices are viewed from multiple planes of analysis, historical, institutional and in the micro dynamics of classroom…

  3. Using Formative Student Feedback: A Continuous Quality Improvement Approach for Online Course Development

    ERIC Educational Resources Information Center

    Bloxham, Kristy Taylor

    2010-01-01

    The objective of this study was to examine the use of frequent, anonymous student course surveys as a tool in supporting continuous quality improvement (CQI) principles in online instruction. The study used a qualitative, multiple-case design involving four separate online courses. Analysis methods included pattern matching/explanation building,…

  4. NPV Sensitivity Analysis: A Dynamic Excel Approach

    ERIC Educational Resources Information Center

    Mangiero, George A.; Kraten, Michael

    2017-01-01

    Financial analysts generally create static formulas for the computation of NPV. When they do so, however, it is not readily apparent how sensitive the value of NPV is to changes in multiple interdependent and interrelated variables. It is the aim of this paper to analyze this variability by employing a dynamic, visually graphic presentation using…

  5. A Meta-Analysis of Trials Evaluating Patient Education and Counseling for Three Groups of Preventive Health Behaviors.

    ERIC Educational Resources Information Center

    Mullen, Patricia Dolan; Simons-Morton, Denise G.; Ramirez, Gilbert; Frankowski, Ralph F.; Green, Lawrence W.; Mains, Douglas A.

    1997-01-01

    The overall effectiveness of patient education and counseling on preventive health behaviors was examined across published clinical trials, 1971-1994. The effectiveness of various approaches for modifying specific types of behaviors among patients without diagnosed disease was assessed. Multiple regression models indicated differences among…

  6. Modelling Student Satisfaction and Motivation in the Integrated Educational Environment: An Empirical study

    ERIC Educational Resources Information Center

    Stukalina, Yulia

    2016-01-01

    Purpose: The purpose of this paper is to explore some issues related to enhancing the quality of educational services provided by a university in the agenda of integrating quality assurance activities and strategic management procedures. Design/methodology/approach: Employing multiple regression analysis the author has examined some factors that…

  7. Exploring General versus Task-Specific Assessments of Metacognition in University Chemistry Students: A Multitrait-Multimethod Analysis

    ERIC Educational Resources Information Center

    Wang, Chia-Yu

    2015-01-01

    The purpose of this study was to use multiple assessments to investigate the general versus task-specific characteristics of metacognition in dissimilar chemistry topics. This mixed-method approach investigated the nature of undergraduate general chemistry students' metacognition using four assessments: a self-report questionnaire, assessment of…

  8. Corruption in Higher Education: Conceptual Approaches and Measurement Techniques

    ERIC Educational Resources Information Center

    Osipian, Ararat L.

    2007-01-01

    Corruption is a complex and multifaceted phenomenon. Forms of corruption are multiple. Measuring corruption is necessary not only for getting ideas about the scale and scope of the problem, but for making simple comparisons between the countries and conducting comparative analysis of corruption. While the total impact of corruption is indeed…

  9. Engineer's Needs for Scientific and Technical Information.

    ERIC Educational Resources Information Center

    David, A., Ed.; And Others

    This study has as its main object the formulation of an approach, as global and comprehensive as possible, to the multiple aspects of the engineer's needs for scientific and technical information. The basis of the study is an analysis of the engineer's role, its characteristics, different specialties, levels of training, and categories of…

  10. The quandaries and promise of risk management: a scientist's perspective on integration of science and management.

    Treesearch

    B.G. Marcot

    2007-01-01

    This paper briefly lists constraints and problems of traditional approaches to natural resource risk analysis and risk management. Such problems include disparate definitions of risk, multiple and conflicting objectives and decisions, conflicting interpretations of uncertainty, and failure to articulate decision criteria, risk attitudes, modeling assumptions, and...

  11. Visualization of Sedentary Behavior Using an Event-Based Approach

    ERIC Educational Resources Information Center

    Loudon, David; Granat, Malcolm H.

    2015-01-01

    Visualization is commonly used in the interpretation of physical behavior (PB) data, either in conjunction with or as precursor to formal analysis. Effective representations of the data can enable the identification of patterns of behavior, and how they relate to the temporal context in a single day, or across multiple days. An understanding of…

  12. A Comparison of Four Approaches to Account for Method Effects in Latent State-Trait Analyses

    ERIC Educational Resources Information Center

    Geiser, Christian; Lockhart, Ginger

    2012-01-01

    Latent state-trait (LST) analysis is frequently applied in psychological research to determine the degree to which observed scores reflect stable person-specific effects, effects of situations and/or person-situation interactions, and random measurement error. Most LST applications use multiple repeatedly measured observed variables as indicators…

  13. Physician performance assessment using a composite quality index.

    PubMed

    Liu, Kaibo; Jain, Shabnam; Shi, Jianjun

    2013-07-10

    Assessing physician performance is important for the purposes of measuring and improving quality of service and reducing healthcare delivery costs. In recent years, physician performance scorecards have been used to provide feedback on individual measures; however, one key challenge is how to develop a composite quality index that combines multiple measures for overall physician performance evaluation. A key controversy, not easily resolved, concerns establishing appropriate weights for combining indicators across multiple dimensions. In this study, we proposed a generic unsupervised learning approach to develop a single composite index for physician performance assessment by using non-negative principal component analysis. We developed a new algorithm named iterative quadratic programming to solve the numerical issue in the non-negative principal component analysis approach. We conducted real case studies to demonstrate the performance of the proposed method. We provided interpretations from both statistical and clinical perspectives to evaluate the developed composite ranking score in practice. In addition, we implemented the root cause assessment techniques to explain physician performance for improvement purposes. Copyright © 2012 John Wiley & Sons, Ltd.
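    The record above combines multiple quality measures into a single composite index using non-negative principal component analysis. As a rough illustration of the idea (not the authors' iterative quadratic programming algorithm), the first principal component can be approximated under a non-negativity constraint by projected power iteration; the data and all variable names below are invented:

    ```python
    import numpy as np

    def nonneg_first_pc(X, n_iter=500):
        """Approximate the first principal component with non-negative
        loadings via projected power iteration on the covariance matrix.
        A simple heuristic, not the paper's iterative quadratic programming."""
        Xc = X - X.mean(axis=0)            # center each quality measure
        C = Xc.T @ Xc / (len(X) - 1)       # sample covariance matrix
        w = np.full(C.shape[1], 1.0 / np.sqrt(C.shape[1]))
        for _ in range(n_iter):
            w = C @ w                      # power-iteration step
            w = np.clip(w, 0.0, None)      # project onto the non-negative orthant
            w /= np.linalg.norm(w)         # renormalize the weight vector
        return w

    rng = np.random.default_rng(0)
    # toy data: 50 physicians scored on 4 positively correlated quality measures
    base = rng.normal(size=(50, 1))
    X = base + 0.5 * rng.normal(size=(50, 4))
    w = nonneg_first_pc(X)                 # non-negative weights per measure
    score = (X - X.mean(axis=0)) @ w       # composite ranking score per physician
    ```

    Because the weights are constrained to be non-negative, every measure contributes in the same direction, which makes the composite score easier to interpret as an overall performance index.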

  14. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis.

    PubMed

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.

  15. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis

    PubMed Central

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented. PMID:18971242

  16. Clustering and group selection of multiple criteria alternatives with application to space-based networks.

    PubMed

    Malakooti, Behnam; Yang, Ziyong

    2004-02-01

    In many real-world problems, the ranges of consequences of different alternatives differ considerably. In addition, selection of a group of alternatives (instead of only one best alternative) is sometimes necessary. Traditional decision-making approaches treat the set of alternatives with the same method of analysis and selection. In this paper, we propose clustering alternatives into different groups so that different methods of analysis, selection, and implementation for each group can be applied. As an example, consider the selection of a group of functions (or tasks) to be processed by a group of processors. The set of tasks can be grouped according to their similar criteria, so that each cluster of tasks can be processed by a single processor. The selection of the best alternative for each clustered group can be performed using existing methods; however, the process of selecting groups is different from the process of selecting alternatives within a group. We develop theories and procedures for clustering discrete multiple criteria alternatives. We also demonstrate how the set of alternatives is clustered into mutually exclusive groups based on 1) similar features among alternatives; 2) ideal (or most representative) alternatives given by the decision maker; and 3) other preferential information of the decision maker. The clustering of multiple criteria alternatives also has the following advantages. 1) It decreases the set of alternatives to be considered by the decision maker (for example, different decision makers are assigned to different groups of alternatives). 2) It decreases the number of criteria. 3) It may provide a different approach for analyzing problems with multiple decision makers. Each decision maker may cluster alternatives differently, and hence, clustering of alternatives may provide a basis for negotiation. The developed approach is applicable for solving a class of telecommunication networks problems where a set of objects (such as routers, processors, or intelligent autonomous vehicles) are to be clustered into similar groups. Objects are clustered based on several criteria and the decision maker's preferences.
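    The clustering idea in the record above can be illustrated with a plain k-means grouping of alternatives by their criteria scores. This is only a sketch under invented data; the paper's own procedures additionally incorporate ideal alternatives and other preference information from the decision maker:

    ```python
    import numpy as np

    def farthest_point_init(X, k):
        """Deterministic seeding: start at the first alternative, then
        repeatedly add the alternative farthest from all chosen centers."""
        centers = [X[0]]
        for _ in range(k - 1):
            d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
            centers.append(X[d.argmax()])
        return np.array(centers)

    def kmeans(X, k, n_iter=100):
        """Plain k-means on the alternatives-by-criteria matrix X."""
        centers = farthest_point_init(X, k)
        for _ in range(n_iter):
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)            # nearest center per alternative
            new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
            if np.allclose(new, centers):
                break                            # assignments have stabilized
            centers = new
        return labels, centers

    # toy example: 6 alternatives scored on 2 criteria, forming two clear groups
    X = np.array([[1.0, 1.1], [0.9, 1.0], [1.1, 0.9],
                  [5.0, 5.2], [5.1, 4.9], [4.9, 5.0]])
    labels, centers = kmeans(X, k=2)
    ```

    Each resulting group can then be analyzed and implemented with a method suited to its own range of consequences, as the paper proposes.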

  17. Genetic variation influences glutamate concentrations in brains of patients with multiple sclerosis.

    PubMed

    Baranzini, Sergio E; Srinivasan, Radhika; Khankhanian, Pouya; Okuda, Darin T; Nelson, Sarah J; Matthews, Paul M; Hauser, Stephen L; Oksenberg, Jorge R; Pelletier, Daniel

    2010-09-01

    Glutamate is the main excitatory neurotransmitter in the mammalian brain. Appropriate transmission of nerve impulses through glutamatergic synapses is required throughout the brain and forms the basis of many processes including learning and memory. However, abnormally high levels of extracellular brain glutamate can lead to neuroaxonal cell death. We have previously reported elevated glutamate levels in the brains of patients suffering from multiple sclerosis. Here, two complementary analyses were used to assess the extent of genomic control over glutamate levels. First, a genome-wide association analysis using brain glutamate concentration as a quantitative trait was conducted in 382 patients with multiple sclerosis. In a second approach, a protein interaction network was used to find associated genes within the same pathway. The top associated marker was rs794185 (P < 6.44 × 10⁻⁷), a non-coding single nucleotide polymorphism within the gene sulphatase modifying factor 1. Our pathway approach identified a module composed of 70 genes with high relevance to glutamate biology. Individuals carrying a higher number of associated alleles from genes in this module showed the highest levels of glutamate. These individuals also showed greater decreases in N-acetylaspartate and in brain volume over 1 year of follow-up. Patients were then stratified by the amount of annual brain volume loss and the same approach was performed in the 'high' (n = 250) and 'low' (n = 132) neurodegeneration groups. The association with rs794185 was highly significant in the group with high neurodegeneration. Further, results from the network-based pathway analysis remained largely unchanged even after stratification. Results from these analyses indicated that variance in the activity of neurochemical pathways implicated in neurodegeneration is explained, at least in part, by the inheritance of common genetic polymorphisms. Spectroscopy-based imaging provides a novel quantitative endophenotype for genetic association studies directed towards identifying new factors that contribute to the heterogeneity of clinical expression of multiple sclerosis.

  18. Future Directions in Vulnerability to Depression among Youth: Integrating Risk Factors and Processes across Multiple Levels of Analysis

    PubMed Central

    Hankin, Benjamin L.

    2014-01-01

    Depression is a developmental phenomenon. Considerable progress has been made in describing the syndrome, establishing its prevalence and features, providing clues as to its etiology, and developing evidence-based treatment and prevention options. Despite considerable headway in distinct lines of vulnerability research, there is an explanatory gap in the field's ability to more comprehensively explain and predict who is likely to become depressed, when, and why. Still, despite clear success in predicting moderate variance for future depression, especially with empirically rigorous methods and designs, the heterogeneous and multi-determined nature of depression suggests that additional etiologies need to be included to advance knowledge on developmental pathways to depression. This paper advocates for a multiple levels of analysis approach to investigating vulnerability to depression across the lifespan and providing a more comprehensive understanding of its etiology. One example of a multiple levels of analysis model of vulnerabilities to depression is provided that integrates the most accessible, observable factors (e.g., cognitive and temperament risks), intermediate processes and endophenotypes (e.g., information processing biases, biological stress physiology, and neural activation and connectivity), and genetic influences (e.g., candidate genes and epigenetics). Evidence for each of these factors as well as their cross-level integration is provided. Methodological and conceptual considerations important for conducting integrative, multiple levels of depression vulnerability research are discussed. Finally, translational implications for how a multiple levels of analysis perspective may confer additional leverage to reduce the global burden of depression and improve care are considered. PMID:22900513

  19. Role of diversity in ICA and IVA: theory and applications

    NASA Astrophysics Data System (ADS)

    Adalı, Tülay

    2016-05-01

    Independent component analysis (ICA) has been the most popular approach for solving the blind source separation problem. Starting from a simple linear mixing model and the assumption of statistical independence, ICA can recover a set of linearly-mixed sources to within a scaling and permutation ambiguity. It has been successfully applied to numerous data analysis problems in areas as diverse as biomedicine, communications, finance, geophysics, and remote sensing. ICA can be achieved using different types of diversity, that is, different statistical properties, and can be posed to simultaneously account for multiple types of diversity such as higher-order statistics, sample dependence, non-circularity, and nonstationarity. A recent generalization of ICA, independent vector analysis (IVA), generalizes ICA to multiple data sets and adds the use of one more type of diversity, statistical dependence across the data sets, for jointly achieving independent decomposition of multiple data sets. With the addition of each new diversity type, identification of a broader class of signals becomes possible, and in the case of IVA, this includes sources that are independent and identically distributed Gaussians. We review the fundamentals and properties of ICA and IVA when multiple types of diversity are taken into account, and then ask the question whether diversity plays an important role in practical applications as well. Examples from various domains are presented to demonstrate that in many scenarios it might be worthwhile to jointly account for multiple statistical properties. This paper is submitted in conjunction with the talk delivered for the "Unsupervised Learning and ICA Pioneer Award" at the 2016 SPIE Conference on Sensing and Analysis Technologies for Biomedical and Cognitive Applications.
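    A minimal ICA example using a single diversity type (non-Gaussianity, via scikit-learn's FastICA) illustrates the blind source separation setting described above. The sources and mixing matrix below are invented, and this sketch does not demonstrate IVA or the other diversity types:

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 8, 2000)
    s1 = np.sin(2 * t)                         # source 1: sinusoid
    s2 = np.sign(np.sin(3 * t))                # source 2: square wave
    S = np.c_[s1, s2]                          # true sources (2000 x 2)
    A = np.array([[1.0, 0.5], [0.4, 1.0]])     # "unknown" mixing matrix
    X = S @ A.T                                # observed linear mixtures

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)               # recovered sources
    ```

    As the abstract notes, the sources are recovered only up to a scaling and permutation ambiguity, so each estimated component must be matched to a true source by correlation rather than by position or amplitude.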

  20. Joint independent component analysis for simultaneous EEG-fMRI: principle and simulation.

    PubMed

    Moosmann, Matthias; Eichele, Tom; Nordby, Helge; Hugdahl, Kenneth; Calhoun, Vince D

    2008-03-01

    An optimized scheme for the fusion of electroencephalography and event related potentials with functional magnetic resonance imaging (BOLD-fMRI) data should simultaneously assess all available electrophysiologic and hemodynamic information in a common data space. In doing so, it should be possible to identify features of latent neural sources whose trial-to-trial dynamics are jointly reflected in both modalities. We present a joint independent component analysis (jICA) model for analysis of simultaneous single trial EEG-fMRI measurements from multiple subjects. We outline the general idea underlying the jICA approach and present results from simulated data under realistic noise conditions. Our results indicate that this approach is a feasible and physiologically plausible data-driven way to achieve spatiotemporal mapping of event related responses in the human brain.

  1. Multiple mutant clones in blood rarely coexist

    NASA Astrophysics Data System (ADS)

    Dingli, David; Pacheco, Jorge M.; Traulsen, Arne

    2008-02-01

    Leukemias arise due to mutations in the genome of hematopoietic (blood) cells. Hematopoiesis has a multicompartment architecture, with cells exhibiting different rates of replication and differentiation. At the root of this process, one finds a small number of stem cells, and hence the description of the mutation-selection dynamics of blood cells calls for a stochastic approach. We use stochastic dynamics to investigate to which extent acquired hematopoietic disorders are associated with mutations of single or multiple genes within developing blood cells. Our analysis considers the appearance of mutations both in the stem cell compartment as well as in more committed compartments. We conclude that in the absence of genomic instability, acquired hematopoietic disorders due to mutations in multiple genes are most likely very rare events, as multiple mutations typically require much longer development times compared to those associated with a single mutation.
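    The qualitative claim above, that disorders requiring multiple mutations take far longer to develop than single-mutation disorders, can be illustrated with a deliberately crude simulation. This is not the paper's multicompartment stochastic model (there is no compartment hierarchy, selection, or cell death here), and all parameter values are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, u, T = 1000, 1e-4, 200_000   # cells, mutation prob per cell per step, horizon
    muts = np.zeros(N, dtype=int)   # mutation count carried by each cell
    t_first = t_double = None
    for t in range(1, T + 1):
        muts += rng.random(N) < u           # each cell may gain one mutation
        if t_first is None and muts.max() >= 1:
            t_first = t                     # first single-mutant cell appears
        if muts.max() >= 2:
            t_double = t                    # first double-mutant cell appears
            break
    ```

    Even in this toy setting the first single mutant appears after roughly 1/(N·u) steps, while a double mutant requires two hits in the same lineage and so appears orders of magnitude later, consistent with the record's conclusion that multi-gene acquired disorders are rare absent genomic instability.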

  2. An Improved Wake Vortex Tracking Algorithm for Multiple Aircraft

    NASA Technical Reports Server (NTRS)

    Switzer, George F.; Proctor, Fred H.; Ahmad, Nashat N.; LimonDuparcmeur, Fanny M.

    2010-01-01

    The accurate tracking of vortex evolution from Large Eddy Simulation (LES) data is a complex and computationally intensive problem. The vortex tracking requires the analysis of very large three-dimensional and time-varying datasets. The complexity of the problem is further compounded by the fact that these vortices are embedded in a background turbulence field, and they may interact with the ground surface. Another level of complication can arise if vortices from multiple aircraft are simulated. This paper presents a new technique for post-processing LES data to obtain wake vortex tracks and wake intensities. The new approach isolates vortices by defining "regions of interest" (ROI) around each vortex and has the ability to identify vortex pairs from multiple aircraft. The paper describes the new methodology for tracking wake vortices and presents application of the technique for single and multiple aircraft.

  3. Estimation of Causal Mediation Effects for a Dichotomous Outcome in Multiple-Mediator Models using the Mediation Formula

    PubMed Central

    Nelson, Suchitra; Albert, Jeffrey M.

    2013-01-01

    Mediators are intermediate variables in the causal pathway between an exposure and an outcome. Mediation analysis investigates the extent to which exposure effects occur through these variables, thus revealing causal mechanisms. In this paper, we consider the estimation of the mediation effect when the outcome is binary and multiple mediators of different types exist. We give a precise definition of the total mediation effect as well as decomposed mediation effects through individual or sets of mediators using the potential outcomes framework. We formulate a model of joint distribution (probit-normal) using continuous latent variables for any binary mediators to account for correlations among multiple mediators. A mediation formula approach is proposed to estimate the total mediation effect and decomposed mediation effects based on this parametric model. Estimation of mediation effects through individual or subsets of mediators requires an assumption involving the joint distribution of multiple counterfactuals. We conduct a simulation study that demonstrates low bias of mediation effect estimators for two-mediator models with various combinations of mediator types. The results also show that the power to detect a non-zero total mediation effect increases as the correlation coefficient between two mediators increases, while power for individual mediation effects reaches a maximum when the mediators are uncorrelated. We illustrate our approach by applying it to a retrospective cohort study of dental caries in adolescents with low and high socioeconomic status. Sensitivity analysis is performed to assess the robustness of conclusions regarding mediation effects when the assumption of no unmeasured mediator-outcome confounders is violated. PMID:23650048
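    The mediation-formula idea in the record above can be sketched for the simplest case of one normally distributed mediator and a binary outcome. The paper itself uses a probit-normal model for multiple mediators of mixed types; the logistic/normal toy model and all coefficients below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy structural model (illustrative, not the paper's probit-normal model):
    #   M | X=x        ~ Normal(a0 + a1*x, 1)
    #   P(Y=1 | x, m)  = sigmoid(b0 + b1*x + b2*m)
    a0, a1 = 0.0, 1.0
    b0, b1, b2 = -1.0, 0.5, 0.8
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    def mean_outcome(x_direct, x_mediator, n=200_000):
        """Monte Carlo version of the mediation formula: draw the mediator
        under exposure x_mediator, evaluate the outcome model under exposure
        x_direct, and average -> E[Y(x_direct, M(x_mediator))]."""
        m = rng.normal(a0 + a1 * x_mediator, 1.0, size=n)
        return sigmoid(b0 + b1 * x_direct + b2 * m).mean()

    m11, m10, m00 = mean_outcome(1, 1), mean_outcome(1, 0), mean_outcome(0, 0)
    total = m11 - m00        # total effect of the exposure
    nie = m11 - m10          # natural indirect effect (through the mediator)
    nde = m10 - m00          # natural direct effect
    ```

    By construction the decomposition total = nde + nie holds exactly here; with parameters estimated from data, each term would also carry sampling uncertainty, and decomposing effects through individual mediators would require the cross-world counterfactual assumption the abstract mentions.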

  4. Estimation of causal mediation effects for a dichotomous outcome in multiple-mediator models using the mediation formula.

    PubMed

    Wang, Wei; Nelson, Suchitra; Albert, Jeffrey M

    2013-10-30

    Mediators are intermediate variables in the causal pathway between an exposure and an outcome. Mediation analysis investigates the extent to which exposure effects occur through these variables, thus revealing causal mechanisms. In this paper, we consider the estimation of the mediation effect when the outcome is binary and multiple mediators of different types exist. We give a precise definition of the total mediation effect as well as decomposed mediation effects through individual or sets of mediators using the potential outcomes framework. We formulate a model of joint distribution (probit-normal) using continuous latent variables for any binary mediators to account for correlations among multiple mediators. A mediation formula approach is proposed to estimate the total mediation effect and decomposed mediation effects based on this parametric model. Estimation of mediation effects through individual or subsets of mediators requires an assumption involving the joint distribution of multiple counterfactuals. We conduct a simulation study that demonstrates low bias of mediation effect estimators for two-mediator models with various combinations of mediator types. The results also show that the power to detect a nonzero total mediation effect increases as the correlation coefficient between two mediators increases, whereas power for individual mediation effects reaches a maximum when the mediators are uncorrelated. We illustrate our approach by applying it to a retrospective cohort study of dental caries in adolescents with low and high socioeconomic status. Sensitivity analysis is performed to assess the robustness of conclusions regarding mediation effects when the assumption of no unmeasured mediator-outcome confounders is violated. Copyright © 2013 John Wiley & Sons, Ltd.

  5. Non-adiabatic excited state molecular dynamics of phenylene ethynylene dendrimer using a multiconfigurational Ehrenfest approach

    DOE PAGES

    Fernandez-Alberti, Sebastian; Makhov, Dmitry V.; Tretiak, Sergei; ...

    2016-03-10

    Photoinduced dynamics of electronic and vibrational unidirectional energy transfer between meta-linked building blocks in a phenylene ethynylene dendrimer is simulated using a multiconfigurational Ehrenfest in time-dependent diabatic basis (MCE-TDDB) method, a new variant of the MCE approach developed by us for dynamics involving multiple electronic states with numerous abrupt crossings. Excited-state energies, gradients and non-adiabatic coupling terms needed for dynamics simulation are calculated on-the-fly using the Collective Electron Oscillator (CEO) approach. In conclusion, a comparative analysis of our results obtained using MCE-TDDB, the conventional Ehrenfest method and the surface-hopping approach with and without decoherence corrections is presented.

  6. Annual Research Review: Discovery science strategies in studies of the pathophysiology of child and adolescent psychiatric disorders--promises and limitations.

    PubMed

    Zhao, Yihong; Castellanos, F Xavier

    2016-03-01

    Psychiatric science remains descriptive, with a categorical nosology intended to enhance interobserver reliability. Increased awareness of the mismatch between categorical classifications and the complexity of biological systems drives the search for novel frameworks including discovery science in Big Data. In this review, we provide an overview of incipient approaches, primarily focused on classically categorical diagnoses such as schizophrenia (SZ), autism spectrum disorder (ASD), and attention-deficit/hyperactivity disorder (ADHD), but also reference convincing, if focal, advances in cancer biology, to describe the challenges of Big Data and discovery science, and outline approaches being formulated to overcome existing obstacles. A paradigm shift from categorical diagnoses to a domain/structure-based nosology and from linear causal chains to complex causal network models of brain-behavior relationship is ongoing. This (r)evolution involves appreciating the complexity, dimensionality, and heterogeneity of neuropsychiatric data collected from multiple sources ('broad' data) along with data obtained at multiple levels of analysis, ranging from genes to molecules, cells, circuits, and behaviors ('deep' data). Both of these types of Big Data landscapes require the use and development of robust and powerful informatics and statistical approaches. Thus, we describe Big Data analysis pipelines and the promise and potential limitations in using Big Data approaches to study psychiatric disorders. We highlight key resources available for psychopathological studies and call for the application and development of Big Data approaches to dissect the causes and mechanisms of neuropsychiatric disorders and identify corresponding biomarkers for early diagnosis. © 2016 Association for Child and Adolescent Mental Health.

  7. Annual Research Review: Discovery science strategies in studies of the pathophysiology of child and adolescent psychiatric disorders: promises and limitations

    PubMed Central

    Zhao, Yihong; Castellanos, F. Xavier

    2015-01-01

    Background and Scope Psychiatric science remains descriptive, with a categorical nosology intended to enhance inter-observer reliability. Increased awareness of the mismatch between categorical classifications and the complexity of biological systems drives the search for novel frameworks including discovery science in Big Data. In this review, we provide an overview of incipient approaches, primarily focused on classically categorical diagnoses such as schizophrenia (SZ), autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD), but also reference convincing, if focal, advances in cancer biology, to describe the challenges of Big Data and discovery science, and outline approaches being formulated to overcome existing obstacles. Findings A paradigm shift from categorical diagnoses to a domain/structure-based nosology and from linear causal chains to complex causal network models of brain-behavior relationship is ongoing. This (r)evolution involves appreciating the complexity, dimensionality and heterogeneity of neuropsychiatric data collected from multiple sources (“broad” data) along with data obtained at multiple levels of analysis, ranging from genes to molecules, cells, circuits and behaviors (“deep” data). Both of these types of Big Data landscapes require the use and development of robust and powerful informatics and statistical approaches. Thus, we describe Big Data analysis pipelines and the promise and potential limitations in using Big Data approaches to study psychiatric disorders. Conclusion We highlight key resources available for psychopathological studies and call for the application and development of Big Data approaches to dissect the causes and mechanisms of neuropsychiatric disorders and identify corresponding biomarkers for early diagnosis. PMID:26732133

  8. Identifying apparent local stable isotope equilibrium in a complex non-equilibrium system.

    PubMed

    He, Yuyang; Cao, Xiaobin; Wang, Jianwei; Bao, Huiming

    2018-02-28

    Although being out of equilibrium, biomolecules in organisms have the potential to approach isotope equilibrium locally because enzymatic reactions are intrinsically reversible. A rigorous approach that can describe isotope distribution among biomolecules and their apparent deviation from equilibrium state is lacking, however. Applying the concept of distance matrix in graph theory, we propose that apparent local isotope equilibrium among a subset of biomolecules can be assessed using an apparent fractionation difference (|Δα|) matrix, in which the differences between the observed isotope composition (δ') and the calculated equilibrium fractionation factor (1000lnβ) can be more rigorously evaluated than by using a previous approach for multiple biomolecules. We tested our |Δα| matrix approach by re-analyzing published data of different amino acids (AAs) in potato and in green alga. Our re-analysis shows that biosynthesis pathways could be the reason for an apparently close-to-equilibrium relationship inside AA families in potato leaves. Different biosynthesis/degradation pathways in tubers may have led to the observed isotope distribution difference between potato leaves and tubers. The analysis of data from green algae does not support the conclusion that AAs are further from equilibrium in glucose-cultured green algae than in the autotrophic ones. Application of the |Δα| matrix can help us to locate potential reversible reactions or reaction networks in a complex system such as a metabolic system. The same approach can be broadly applied to all complex systems that have multiple components, e.g. geochemical or atmospheric systems of early Earth or other planets. Copyright © 2017 John Wiley & Sons, Ltd.
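    Under one plausible reading of the record above, the apparent fractionation difference matrix compares observed pairwise differences in δ′ with the calculated equilibrium differences in 1000lnβ; small entries flag compound pairs close to apparent local equilibrium. All numbers and compound names below are invented:

    ```python
    import numpy as np

    # Illustrative inputs (made-up values): observed delta-prime compositions
    # and calculated equilibrium 1000*ln(beta) factors for four compounds.
    names = ["Ala", "Gly", "Ser", "Glu"]
    delta_prime = np.array([10.2, 8.7, 12.1, 9.5])   # observed compositions
    ln_beta = np.array([60.0, 58.8, 61.7, 59.1])     # equilibrium factors

    # Pairwise apparent fractionation difference:
    #   |Delta_alpha|_ij = |(d'_i - d'_j) - (1000 ln b_i - 1000 ln b_j)|
    # A small entry suggests pair (i, j) is near apparent isotope equilibrium.
    D = np.abs((delta_prime[:, None] - delta_prime[None, :])
               - (ln_beta[:, None] - ln_beta[None, :]))
    ```

    Scanning the matrix for blocks of small entries would then locate subsets of compounds, such as an amino-acid family, that appear to be in local equilibrium despite the system as a whole being out of equilibrium.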

  9. Femoral head necrosis: A finite element analysis of common and novel surgical techniques.

    PubMed

    Cilla, Myriam; Checa, Sara; Preininger, Bernd; Winkler, Tobias; Perka, Carsten; Duda, Georg N; Pumberger, Matthias

    2017-10-01

    Femoral head necrosis is a common cause of secondary osteoarthritis. At the early stages, treatment strategies are normally based on core decompression techniques, where the number, location and diameter of the drilling holes varies depending on the selected approach. The purpose of this study was to investigate the risk of femoral head, neck and subtrochanteric fracture following six different core decompression techniques. Five common techniques and a newly proposed one were analyzed with respect to their biomechanical consequences using finite element analysis. The geometry of a femur was reconstructed from computed-tomography images. Thereafter, the drilling configurations were simulated. The strains in the intact and drilled femurs were determined under physiological, patient-specific, muscle and joint contact forces. The following results were observed: i) collapse and fracture risk of the femoral head increases with disease progression; ii) for a single-hole approach at the subtrochanteric region, fracture risk increases with drilling diameter; iii) the highest fracture risks occur for an 8 mm single-hole drilling at the subtrochanteric region and for approaches with multiple drillings at various entry points; iv) the proposed novel approach resulted in the most physiological strains (closest to those experienced by healthy bone). Our results suggest that all common core decompression methods have a significant impact on the biomechanical competence of the proximal femur and impact its mechanical potential. Fracture risk increases with drilling diameter, and with multiple drillings even at smaller diameters. We recommend the anterior approach due to its reduced soft tissue trauma and its biomechanical performance. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Global analysis of approaches for deriving total water storage changes from GRACE satellites and implications for groundwater storage change estimation

    NASA Astrophysics Data System (ADS)

    Long, D.; Scanlon, B. R.; Longuevergne, L.; Chen, X.

    2015-12-01

    Increasing interest in use of GRACE satellites and a variety of new products to monitor changes in total water storage (TWS) underscores the need to assess the reliability of output from different products. The objective of this study was to assess skills and uncertainties of different approaches for processing GRACE data to restore signal losses caused by spatial filtering based on analysis of 1°×1° grid scale data and basin scale data in 60 river basins globally. Results indicate that scaling factors from six land surface models (LSMs), including four models from GLDAS-1 (Noah 2.7, Mosaic, VIC, and CLM 2.0), CLM 4.0, and WGHM, are similar over most humid, sub-humid, and high-latitude regions but can differ by up to 100% over arid and semi-arid basins and areas with intensive irrigation. Large differences in TWS anomalies from three processing approaches (scaling factor, additive, and multiplicative corrections) were found in arid and semi-arid regions, areas with intensive irrigation, and relatively small basins (e.g., ≤ 200,000 km²). Furthermore, TWS anomaly products from gridded data with CLM4.0 scaling factors and the additive correction approach more closely agree with WGHM output than the multiplicative correction approach. Estimation of groundwater storage changes using GRACE satellites requires caution in selecting an appropriate approach for restoring TWS changes. A priori ground-based data used in forward modeling can provide a powerful tool for explaining the distribution of signal gains or losses caused by low-pass filtering in specific regions of interest and should be very useful for more reliable estimation of groundwater storage changes using GRACE satellites.
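    The three signal-restoration approaches named in the record above can be sketched in numpy with a synthetic model TWS series and a toy "filter" that damps and offsets the signal. The filter, amplitudes, and the amplitude-ratio form used for the multiplicative correction are all invented simplifications, not the study's actual processing chain:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(120)                              # 10 years of monthly data
    true_model = 10 * np.sin(2 * np.pi * t / 12)    # model TWS anomaly (cm)
    filt = lambda s: 0.6 * s - 0.5                  # toy spatial-filter damping
    model_filt = filt(true_model)                   # filtered model output
    grace_filt = filt(true_model + rng.normal(0, 1, t.size))  # "observed" GRACE

    # 1) scaling factor: gain k fitted on the model, applied to the observation
    k = (model_filt @ true_model) / (model_filt @ model_filt)
    restored_scale = k * grace_filt
    # 2) additive: add back the model's signal loss from filtering
    restored_add = grace_filt + (true_model - model_filt)
    # 3) multiplicative: rescale by the model's amplitude ratio
    r = np.std(true_model) / np.std(model_filt)
    restored_mult = r * grace_filt
    ```

    Because all three corrections borrow the scaling or loss pattern from a land surface model, their disagreement grows wherever the model poorly represents reality, which is consistent with the large differences the study reports over arid, irrigated, and small basins.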

  11. Analysis of navigation and guidance requirements for commercial VTOL operations

    NASA Technical Reports Server (NTRS)

    Hoffman, W. C.; Zvara, J.; Hollister, W. M.

    1975-01-01

    The paper presents some results of a program undertaken to define navigation and guidance requirements for commercial VTOL operations in the takeoff, cruise, terminal and landing phases of flight in weather conditions up to and including Category III. Quantitative navigation requirements are given for the parameters range, coverage, operation near obstacles, horizontal accuracy, multiple landing aircraft, multiple pad requirements, inertial/radio-inertial requirements, reliability/redundancy, update rate, and data link requirements in all flight phases. A multi-configuration straw-man navigation and guidance system for commercial VTOL operations is presented. Operation of the system is keyed to a fully automatic approach for navigation, guidance and control, with the pilot as monitor-manager. The system is a hybrid navigator using a relatively low-cost inertial sensor with DME updates and MLS in the approach/departure phases.

  12. Magnetite-doped polydimethylsiloxane (PDMS) for phosphopeptide enrichment.

    PubMed

    Sandison, Mairi E; Jensen, K Tveen; Gesellchen, F; Cooper, J M; Pitt, A R

    2014-10-07

    Reversible phosphorylation plays a key role in numerous biological processes. Mass spectrometry-based approaches are commonly used to analyze protein phosphorylation, but such analysis is challenging, largely due to the low phosphorylation stoichiometry. Hence, a number of phosphopeptide enrichment strategies have been developed, including metal oxide affinity chromatography (MOAC). Here, we describe a new material for performing MOAC that employs a magnetite-doped polydimethylsiloxane (PDMS) suitable for the creation of microwell array and microfluidic systems, enabling low-volume, high-throughput analysis. Incubation time and sample loading were explored and optimized, demonstrating that the embedded magnetite is able to enrich phosphopeptides. This substrate-based approach is rapid, straightforward and suitable for simultaneously performing multiple low-volume enrichments.

  13. A New All Solid State Approach to Gaseous Pollutant Detection

    NASA Technical Reports Server (NTRS)

    Brown, V.; Tamstorf, K.

    1971-01-01

    Recent efforts in our laboratories have concentrated on the development of an all solid state gas sensor, by combining solid electrolyte (ion exchange membrane) technology with advanced thin film deposition processes. With the proper bias magnitude and polarity, these miniature electro-chemical cells show remarkable current responses for many common pollution gases. Current activity is now focused on complementing a multiple array (matrix) of these solid state sensors with a digital electronic scanner device possessing "scan-compare-identify-alarm" capability. This innovative approach to multi-component pollutant gas analysis may indeed be the advanced prototype for the "third generation" class of pollution analysis instrumentation so urgently needed in the decade ahead.

  14. Grounding a new information technology implementation framework in behavioral science: a systematic analysis of the literature on IT use.

    PubMed

    Kukafka, Rita; Johnson, Stephen B; Linfante, Allison; Allegrante, John P

    2003-06-01

    Many interventions to improve the success of information technology (IT) implementations are grounded in behavioral science, using theories and models to identify conditions and determinants of successful use. However, each model in the IT literature has evolved to address specific theoretical problems of particular disciplinary concern, and each has been tested and refined using, in most cases, a more or less restricted set of IT implementation procedures. Functionally, this limits the perspective for taking into account the multiple factors at the individual, group, and organizational levels that influence use behavior. While a rich body of literature has emerged, employing prominent models such as the Technology Adoption Model, Social-Cognitive Theory, and Diffusion of Innovation Theory, the complexity of defining a suitable multi-level intervention has largely been overlooked. A gap exists between the implementation of IT and the integration of theories and models that can be utilized to develop multi-level approaches to identify factors that impede usage behavior. We present a novel framework that is intended to guide synthesis of more than one theoretical perspective for the purpose of planning multi-level interventions to enhance IT use. This integrative framework is adapted from PRECEDE/PROCEED, a conceptual framework used by health planners in hundreds of published studies to direct interventions that account for the multiple determinants of behavior. Since we claim that the literature on IT use behavior does not currently include a multi-level approach, we undertook a systematic literature analysis to confirm this assertion. Our framework facilitated organizing this literature synthesis, and our analysis was aimed at determining whether the IT implementation approaches in the published literature considered at least two levels of IT usage determinants. We found that while 61% of studies mentioned or referred to theory, none considered two or more levels. In other words, although the researchers employ behavioral theory, they omit two fundamental propositions: (1) IT usage is influenced by multiple factors and (2) interventions must be multi-dimensional. Our literature synthesis may provide additional insight into the reason for high failure rates associated with underutilized systems, and underscores the need to move beyond the current dominant approach that employs a single model to guide IT implementation plans that aim to address factors associated with IT acceptance and subsequent positive use behavior.

  15. Using the Logarithm of Odds to Define a Vector Space on Probabilistic Atlases

    PubMed Central

    Pohl, Kilian M.; Fisher, John; Bouix, Sylvain; Shenton, Martha; McCarley, Robert W.; Grimson, W. Eric L.; Kikinis, Ron; Wells, William M.

    2007-01-01

    The Logarithm of the Odds ratio (LogOdds) is frequently used in areas such as artificial neural networks, economics, and biology, as an alternative representation of probabilities. Here, we use LogOdds to place probabilistic atlases in a linear vector space. This representation has several useful properties for medical imaging. For example, it not only encodes the shape of multiple anatomical structures but also captures some information concerning uncertainty. We demonstrate that the resulting vector space operations of addition and scalar multiplication have natural probabilistic interpretations. We discuss several examples for placing label maps into the space of LogOdds. First, we relate signed distance maps, a widely used implicit shape representation, to LogOdds and compare it to an alternative that is based on smoothing by spatial Gaussians. We find that the LogOdds approach better preserves shapes in a complex multiple object setting. In the second example, we capture the uncertainty of boundary locations by mapping multiple label maps of the same object into the LogOdds space. Third, we define a framework for non-convex interpolations among atlases that capture different time points in the aging process of a population. We evaluate the accuracy of our representation by generating a deformable shape atlas that captures the variations of anatomical shapes across a population. The deformable atlas is the result of a principal component analysis within the LogOdds space. This atlas is integrated into an existing segmentation approach for MR images. We compare the performance of the resulting implementation in segmenting 20 test cases to a similar approach that uses a more standard shape model that is based on signed distance maps. On this data set, the Bayesian classification model with our new representation outperformed the other approaches in segmenting subcortical structures. PMID:17698403
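
    The LogOdds map itself is simple to state: a probability p in (0, 1) goes to log(p / (1 - p)), where vector-space addition and scalar multiplication are well defined, and the inverse (logistic) map returns a valid probability. A minimal sketch, with illustrative values, of the transform and of averaging two probabilities in LogOdds space:

```python
import math

def logodds(p):
    """Map a probability in (0, 1) to the real line."""
    return math.log(p / (1.0 - p))

def logistic(t):
    """Inverse map: real line back to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-t))

# Averaging two voxel probabilities in LogOdds space (illustrative values):
p_a, p_b = 0.9, 0.6
avg = logistic(0.5 * (logodds(p_a) + logodds(p_b)))
```

    Operations such as this average stay inside the space of valid probabilities, which is the property the paper exploits for atlas interpolation and principal component analysis.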

  16. Master of Puppets: Cooperative Multitasking for In Situ Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-01-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
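
    The coroutine design described above can be imitated, loosely, with Python generators. This is a hypothetical analogue for intuition only: Henson itself couples compiled, position-independent executables, not Python functions.

```python
# Cooperative multitasking sketch: the simulation yields after each step,
# handing control to an analysis coroutine while the data is "in memory".

def simulation(n_steps):
    state = 0.0
    for step in range(n_steps):
        state += 1.0                     # advance the simulation
        yield ("step", step, state)      # cooperatively yield control

def analysis():
    seen = []
    while True:
        data = yield                     # receive one step's data
        if data is None:
            break
        seen.append(data[2])
    yield ("mean", sum(seen) / len(seen))  # final summary

def run_in_situ(n_steps):
    ana = analysis()
    next(ana)                            # prime the analysis coroutine
    for event in simulation(n_steps):
        ana.send(event)                  # analyze each step as it appears
    return ana.send(None)                # flush and collect the summary

result = run_in_situ(4)
```

    The essential point is that neither side runs to completion before the other starts; control ping-pongs between them, so no time step needs to be written to disk.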

  17. Henson v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-04-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. Their design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.

  18. Protein mass analysis of histones.

    PubMed

    Galasinski, Scott C; Resing, Katheryn A; Ahn, Natalie G

    2003-09-01

    Posttranslational modification of chromatin-associated proteins, including histones and high-mobility-group (HMG) proteins, provides an important mechanism to control gene expression, genome integrity, and epigenetic inheritance. Protein mass analysis provides a rapid and unbiased approach to monitor multiple chemical modifications on individual molecules. This review describes methods for acid extraction of histones and HMG proteins, followed by separation by reverse-phase chromatography coupled to electrospray ionization mass spectrometry (LC/ESI-MS). Posttranslational modifications are detected by analysis of full-length protein masses. Confirmation of protein identity and modification state is obtained through enzymatic digestion and peptide sequencing by MS/MS. For differentially modified forms of each protein, the measured intensities are semiquantitative and allow determination of relative abundance and stoichiometry. The method simultaneously detects covalent modifications on multiple proteins and provides a facile assay for comparing chromatin modification states between different cell types and/or cellular responses.

  19. Schistosomiasis Breeding Environment Situation Analysis in Dongting Lake Area

    NASA Astrophysics Data System (ADS)

    Li, Chuanrong; Jia, Yuanyuan; Ma, Lingling; Liu, Zhaoyan; Qian, Yonggang

    2013-01-01

    Monitoring the environmental characteristics, such as vegetation and soil moisture, that shape the spatial/temporal distribution of Oncomelania hupensis (O. hupensis) is of vital importance to schistosomiasis prevention and control. In this study, the relationship between environmental factors derived from remotely sensed data and the density of O. hupensis was first analyzed by a multiple linear regression model. Secondly, spatial analysis of the regression residual was investigated by the semi-variogram method. Thirdly, the spatial analysis of the regression residual and the multiple linear regression model were both employed to estimate the spatial variation of O. hupensis density. Finally, the approach was used to monitor and predict the spatial and temporal variations of O. hupensis in the Dongting Lake region, China. The areas of potential O. hupensis habitats were predicted, and the influence of the Three Gorges Dam (TGD) project on the density of O. hupensis was analyzed.
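
    The first stage above, a multiple linear regression of snail density on remotely sensed covariates whose residuals then feed the semi-variogram analysis, can be sketched as ordinary least squares via the normal equations. The NDVI and soil-moisture values below are hypothetical, not the study's data.

```python
def ols(X, y):
    """Solve the normal equations X'X b = X'y by Gaussian elimination."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                       # forward elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * ai for a, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    coef = [0.0] * k
    for i in reversed(range(k)):             # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, k))) / A[i][i]
    return coef

# rows: [intercept, NDVI, soil moisture]; y: observed snail density
X = [[1, 0.2, 0.30], [1, 0.4, 0.35], [1, 0.5, 0.50],
     [1, 0.7, 0.55], [1, 0.8, 0.70]]
y = [2.0, 3.1, 4.2, 5.0, 6.1]

coef = ols(X, y)
residuals = [yi - sum(c * xi for c, xi in zip(coef, row))
             for row, yi in zip(X, y)]
```

    In the paper's workflow these residuals, which the regression cannot explain, are the quantity whose spatial autocorrelation the semi-variogram then models.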

  20. The Aeronautical Data Link: Decision Framework for Architecture Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Goode, Plesent W.

    2003-01-01

    A decision analytic approach that develops optimal data link architecture configuration and behavior to meet multiple conflicting objectives of concurrent and different airspace operations functions has previously been developed. The approach, premised on a formal taxonomic classification that correlates data link performance with operations requirements, information requirements, and implementing technologies, provides a coherent methodology for data link architectural analysis from top-down and bottom-up perspectives. This paper follows the previous research by providing more specific approaches for mapping and transitioning between the lower levels of the decision framework. The goal of the architectural analysis methodology is to assess the impact of specific architecture configurations and behaviors on the efficiency, capacity, and safety of operations. This necessarily involves understanding the various capabilities, system level performance issues and performance and interface concepts related to the conceptual purpose of the architecture and to the underlying data link technologies. Efficient and goal-directed data link architectural network configuration is conditioned on quantifying the risks and uncertainties associated with complex structural interface decisions. Deterministic and stochastic optimal design approaches will be discussed that maximize the effectiveness of architectural designs.

  1. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In recent decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics and model selection to be performed. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
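
    As a stripped-down illustration of the data the bivariate model works on, the sketch below logit-transforms each study's (sensitivity, specificity) pair and back-transforms the mean, a crude point summary that the full Bayesian model replaces with predictive posterior distributions and a between-study covariance. The study values are invented.

```python
import math

def logit(p):
    return math.log(p / (1 - p))

# Hypothetical per-study (sensitivity, specificity) pairs
studies = [(0.90, 0.85), (0.82, 0.91), (0.88, 0.80), (0.75, 0.95)]

# The bivariate model works on these paired logits jointly,
# capturing their between-study correlation.
pairs = [(logit(se), logit(sp)) for se, sp in studies]

mean_logit_se = sum(x for x, _ in pairs) / len(pairs)
mean_logit_sp = sum(y for _, y in pairs) / len(pairs)
pooled_se = 1 / (1 + math.exp(-mean_logit_se))
pooled_sp = 1 / (1 + math.exp(-mean_logit_sp))
```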

  2. Multivariate and repeated measures (MRM): A new toolbox for dependent and multimodal group-level neuroimaging data

    PubMed Central

    McFarquhar, Martyn; McKie, Shane; Emsley, Richard; Suckling, John; Elliott, Rebecca; Williams, Stephen

    2016-01-01

    Repeated measurements and multimodal data are common in neuroimaging research. Despite this, conventional approaches to group level analysis ignore these repeated measurements in favour of multiple between-subject models using contrasts of interest. This approach has a number of drawbacks as certain designs and comparisons of interest are either not possible or complex to implement. Unfortunately, even when attempting to analyse group level data within a repeated-measures framework, the methods implemented in popular software packages make potentially unrealistic assumptions about the covariance structure across the brain. In this paper, we describe how this issue can be addressed in a simple and efficient manner using the multivariate form of the familiar general linear model (GLM), as implemented in a new MATLAB toolbox. This multivariate framework is discussed, paying particular attention to methods of inference by permutation. Comparisons with existing approaches and software packages for dependent group-level neuroimaging data are made. We also demonstrate how this method is easily adapted for dependency at the group level when multiple modalities of imaging are collected from the same individuals. Follow-up of these multimodal models using linear discriminant functions (LDA) is also discussed, with applications to future studies wishing to integrate multiple scanning techniques into investigating populations of interest. PMID:26921716
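
    Inference by permutation, the method the toolbox pays particular attention to, can be illustrated with a scalar statistic in place of the toolbox's multivariate one: group labels are shuffled many times to build a null distribution for the observed statistic. The data here are invented.

```python
import random

random.seed(0)
group_a = [2.1, 2.5, 1.9, 2.8, 2.4]
group_b = [1.2, 1.5, 1.1, 1.6, 1.4]

def mean_diff(a, b):
    return sum(a) / len(a) - sum(b) / len(b)

observed = mean_diff(group_a, group_b)
pooled = group_a + group_b
n_a = len(group_a)

count = 0
n_perm = 2000
for _ in range(n_perm):
    random.shuffle(pooled)               # break any group structure
    if mean_diff(pooled[:n_a], pooled[n_a:]) >= observed:
        count += 1

# +1 correction: the observed labelling is itself one permutation
p_value = (count + 1) / (n_perm + 1)
```

    The appeal for neuroimaging is that the same recipe applies to statistics whose null distribution has no closed form, such as the multivariate test statistics the toolbox computes at every voxel.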

  3. Quantitative perceptual differences among over-the-counter vaginal products using a standardized methodology: implications for microbicide development

    PubMed Central

    Mahan, Ellen D.; Morrow, Kathleen M.; Hayes, John E.

    2015-01-01

    Background Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Study Design Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey’s honestly significant difference test. Results Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Conclusions Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. PMID:21757061
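
    The per-attribute two-way ANOVA described above (product × panelist) can be sketched by partitioning sums of squares by hand. The stickiness ratings below are invented, not the study's data, and the Tukey HSD follow-up is omitted.

```python
# ratings[product] = one rating per panelist (hypothetical values)
ratings = {
    "gel_A": [6.0, 6.5, 5.8, 6.2],
    "gel_B": [3.1, 3.4, 2.9, 3.2],
    "gel_C": [6.1, 6.4, 5.9, 6.0],
}
prods = list(ratings)
n_prod, n_pan = len(prods), 4
grand = sum(sum(v) for v in ratings.values()) / (n_prod * n_pan)

# partition total variation into product, panelist and residual parts
ss_product = n_pan * sum((sum(ratings[p]) / n_pan - grand) ** 2
                         for p in prods)
ss_panelist = n_prod * sum(
    (sum(ratings[p][j] for p in prods) / n_prod - grand) ** 2
    for j in range(n_pan))
ss_total = sum((x - grand) ** 2 for p in prods for x in ratings[p])
ss_residual = ss_total - ss_product - ss_panelist

f_product = (ss_product / (n_prod - 1)) / \
            (ss_residual / ((n_prod - 1) * (n_pan - 1)))
```

    A large F for the product factor, relative to the residual mean square, is what licenses the pairwise Tukey comparisons reported in the study.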

  4. Quantitative perceptual differences among over-the-counter vaginal products using a standardized methodology: implications for microbicide development.

    PubMed

    Mahan, Ellen D; Morrow, Kathleen M; Hayes, John E

    2011-08-01

    Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey's honestly significant difference test. Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. TaxI: a software tool for DNA barcoding using distance methods

    PubMed Central

    Steinke, Dirk; Vences, Miguel; Salzburger, Walter; Meyer, Axel

    2005-01-01

    DNA barcoding is a promising approach to the diagnosis of biological diversity in which DNA sequences serve as the primary key for information retrieval. Most existing software for evolutionary analysis of DNA sequences was designed for phylogenetic analyses and, hence, those algorithms do not offer appropriate solutions for the rapid, but precise analyses needed for DNA barcoding, and are also unable to process the often large comparative datasets. We developed a flexible software tool for DNA taxonomy, named TaxI. This program calculates sequence divergences between a query sequence (taxon to be barcoded) and each sequence of a dataset of reference sequences defined by the user. Because the analysis is based on separate pairwise alignments, this software is also able to work with sequences characterized by multiple insertions and deletions that are difficult to align in large sequence sets (i.e. thousands of sequences) by multiple alignment algorithms because of computational restrictions. Here, we demonstrate the utility of this approach with two datasets of fish larvae and juveniles from Lake Constance and juvenile land snails under different models of sequence evolution. Sets of ribosomal 16S rRNA sequences, characterized by multiple indels, performed as well as or better than cox1 sequence sets in assigning sequences to species, demonstrating the suitability of rRNA genes for DNA barcoding. PMID:16214755
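
    The distance computation at the core of this approach can be sketched as an uncorrected pairwise p-distance with gap positions ignored, followed by nearest-reference assignment. The sequences and species names below are hypothetical, and TaxI itself supports richer models of sequence evolution.

```python
def p_distance(a, b):
    """Proportion of differing sites over aligned, ungapped positions."""
    pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    return sum(x != y for x, y in pairs) / len(pairs)

# Hypothetical pairwise-aligned reference barcodes
references = {
    "species_A": "ACGTACGTACGT",
    "species_B": "ACGT-CGAACCT",   # contains an indel (gap)
}
query = "ACGTACGAACGT"

# Assign the query to the reference with the smallest divergence
best = min(references, key=lambda sp: p_distance(query, references[sp]))
```

    Because each comparison is a separate pairwise alignment, indel-rich markers such as 16S rRNA pose no special problem, which is the point the abstract makes.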

  6. Linked independent component analysis for multimodal data fusion.

    PubMed

    Groves, Adrian R; Beckmann, Christian F; Smith, Steve M; Woolrich, Mark W

    2011-02-01

    In recent years, neuroimaging studies have increasingly been acquiring multiple modalities of data and searching for task- or disease-related changes in each modality separately. A major challenge in analysis is to find systematic approaches for fusing these differing data types together to automatically find patterns of related changes across multiple modalities, when they exist. Independent Component Analysis (ICA) is a popular unsupervised learning method that can be used to find the modes of variation in neuroimaging data across a group of subjects. When multimodal data is acquired for the subjects, ICA is typically performed separately on each modality, leading to incompatible decompositions across modalities. Using a modular Bayesian framework, we develop a novel "Linked ICA" model for simultaneously modelling and discovering common features across multiple modalities, which can potentially have completely different units, signal- and contrast-to-noise ratios, voxel counts, spatial smoothnesses and intensity distributions. Furthermore, this general model can be configured to allow tensor ICA or spatially-concatenated ICA decompositions, or a combination of both at the same time. Linked ICA automatically determines the optimal weighting of each modality, and also can detect single-modality structured components when present. This is a fully probabilistic approach, implemented using Variational Bayes. We evaluate the method on simulated multimodal data sets, as well as on a real data set of Alzheimer's patients and age-matched controls that combines two very different types of structural MRI data: morphological data (grey matter density) and diffusion data (fractional anisotropy, mean diffusivity, and tensor mode). Copyright © 2010 Elsevier Inc. All rights reserved.

  7. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure mode issues can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
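
    The FWGM RPN construction can be illustrated with triangular fuzzy numbers for severity, occurrence and detection, combined by a component-wise weighted geometric mean and defuzzified by a centroid. The scores and weights below are hypothetical, and the MCS/Copula stages of the method are omitted.

```python
import math

def fwgm(factors, weights):
    """Component-wise weighted geometric mean of triangular fuzzy numbers.

    Each factor is a (low, mode, high) triple; weights sum to 1.
    """
    assert abs(sum(weights) - 1.0) < 1e-9
    return tuple(
        math.prod(f[i] ** w for f, w in zip(factors, weights))
        for i in range(3)
    )

def centroid(tri):
    """Simple centroid defuzzification of a triangular fuzzy number."""
    return sum(tri) / 3.0

# Hypothetical fuzzy risk factor scores on a 1-10 scale
severity   = (6.0, 7.0, 8.0)
occurrence = (3.0, 4.0, 5.0)
detection  = (4.0, 5.0, 6.0)

frpn = fwgm([severity, occurrence, detection], [0.4, 0.3, 0.3])
risk_score = centroid(frpn)
```

    The fuzzy triple keeps the epistemic uncertainty of the expert scores visible through the aggregation; defuzzification is deferred until a crisp ranking is needed.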

  8. Integrated Safety Analysis Tiers

    NASA Technical Reports Server (NTRS)

    Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon

    2009-01-01

    Commercial partnerships and organizational constraints, combined with complex systems, may lead to the division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface sufficiently to comprehend integrated hazards. This paper will discuss various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort will be utilized to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.

  9. Joint Identification of Genetic Variants for Physical Activity in Korean Population

    PubMed Central

    Kim, Jayoun; Kim, Jaehee; Min, Haesook; Oh, Sohee; Kim, Yeonjung; Lee, Andy H.; Park, Taesung

    2014-01-01

    There has been limited research on genome-wide association with physical activity (PA). This study ascertained genetic associations between PA and 344,893 single nucleotide polymorphism (SNP) markers in 8842 Korean samples. PA data were obtained from a validated questionnaire that included information on PA intensity and duration. Metabolic equivalent of task (MET) values were calculated to estimate the total daily PA level for each individual. In addition to single- and multiple-SNP association tests, a pathway enrichment analysis was performed to identify the biological significance of SNP markers. Although no significant SNP was found at the genome-wide significance level via single-SNP association tests, 59 genetic variants mapped to 76 genes were identified via a multiple-SNP approach using a bootstrap selection stability measure. Pathway analysis for these 59 variants showed that maturity onset diabetes of the young (MODY) was enriched. Joint identification of SNPs could enable the identification of multiple SNPs with good predictive power for PA and a pathway enriched for PA. PMID:25026172
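
    The bootstrap selection stability idea mentioned above can be sketched as follows: predictors are re-selected on many bootstrap resamples (here by a crude correlation screen standing in for the paper's multiple-SNP model) and their selection frequency is recorded. All data below are simulated, with one truly associated variant and one noise variant.

```python
import random

random.seed(42)
n = 200
snp1 = [random.choice([0, 1, 2]) for _ in range(n)]   # associated variant
snp2 = [random.choice([0, 1, 2]) for _ in range(n)]   # noise variant
pa = [0.5 * g + random.gauss(0, 0.5) for g in snp1]   # simulated PA level

def corr(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

hits = {"snp1": 0, "snp2": 0}
n_boot = 100
for _ in range(n_boot):
    idx = [random.randrange(n) for _ in range(n)]      # bootstrap resample
    for name, g in (("snp1", snp1), ("snp2", snp2)):
        if abs(corr([g[i] for i in idx], [pa[i] for i in idx])) > 0.3:
            hits[name] += 1

# fraction of resamples in which each variant was selected
stability = {k: v / n_boot for k, v in hits.items()}
```

    Variants selected in a large fraction of resamples are the stable ones carried forward, which is the spirit of the measure the study uses.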

  10. Estimating Interaction Effects With Incomplete Predictor Variables

    PubMed Central

    Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining

    2014-01-01

    The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955

  11. Variant-aware saturating mutagenesis using multiple Cas9 nucleases identifies regulatory elements at trait-associated loci.

    PubMed

    Canver, Matthew C; Lessard, Samuel; Pinello, Luca; Wu, Yuxuan; Ilboudo, Yann; Stern, Emily N; Needleman, Austen J; Galactéros, Frédéric; Brugnara, Carlo; Kutlar, Abdullah; McKenzie, Colin; Reid, Marvin; Chen, Diane D; Das, Partha Pratim; Cole, Mitchel A; Zeng, Jing; Kurita, Ryo; Nakamura, Yukio; Yuan, Guo-Cheng; Lettre, Guillaume; Bauer, Daniel E; Orkin, Stuart H

    2017-04-01

    Cas9-mediated, high-throughput, saturating in situ mutagenesis permits fine-mapping of function across genomic segments. Disease- and trait-associated variants identified in genome-wide association studies largely cluster at regulatory loci. Here we demonstrate the use of multiple designer nucleases and variant-aware library design to interrogate trait-associated regulatory DNA at high resolution. We developed a computational tool for the creation of saturating-mutagenesis libraries with single or multiple nucleases with incorporation of variants. We applied this methodology to the HBS1L-MYB intergenic region, which is associated with red-blood-cell traits, including fetal hemoglobin levels. This approach identified putative regulatory elements that control MYB expression. Analysis of genomic copy number highlighted potential false-positive regions, thus emphasizing the importance of off-target analysis in the design of saturating-mutagenesis experiments. Together, these data establish a widely applicable high-throughput and high-resolution methodology to identify minimal functional sequences within large disease- and trait-associated regions.

  12. A control system for a powered prosthesis using positional and myoelectric inputs from the shoulder complex.

    PubMed

    Losier, Y; Englehart, K; Hudgins, B

    2007-01-01

    The integration of multiple input sources within a control strategy for powered upper-limb prostheses could provide smoother, more intuitive multi-joint reaching movements based on the user's intended motion. This paper presents the results of using myoelectric signals (MES) from the shoulder area, in combination with shoulder position, as input sources to multiple linear discriminant analysis classifiers. Such an approach may provide users with control signals capable of controlling three degrees of freedom (DOF). This work is another important step in the development of hybrid systems that will enable simultaneous control of the multiple degrees of freedom used for reaching tasks with a prosthetic limb.
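
    The classifier family named above can be sketched in a few lines. Below is a minimal two-class Fisher linear discriminant on simulated data; the four features merely stand in for MES amplitudes and shoulder position, and nothing here is the authors' actual pipeline or data:

```python
import numpy as np

# Two simulated classes (e.g. two intended shoulder motions),
# 4 features standing in for MES amplitudes + shoulder position.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=0.0, scale=1.0, size=(100, 4))
X1 = rng.normal(loc=2.0, scale=1.0, size=(100, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Fisher LDA with a pooled within-class covariance matrix.
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
S = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
w = np.linalg.solve(S, mu1 - mu0)       # discriminant direction
threshold = w @ (mu0 + mu1) / 2         # midpoint decision rule

pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(round(accuracy, 2))
```

    Multi-class control, as in the paper, would use one discriminant per motion class; the pooled-covariance assumption is what makes the decision boundary linear.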

  13. Experimental design matters for statistical analysis: how to handle blocking.

    PubMed

    Jensen, Signe M; Schaarschmidt, Frank; Onofri, Andrea; Ritz, Christian

    2018-03-01

    Nowadays, evaluation of the effects of pesticides often relies on experimental designs that involve multiple concentrations of the pesticide of interest or multiple pesticides at specific comparable concentrations and, possibly, secondary factors of interest. Unfortunately, the experimental design is often more or less neglected when analysing data. Two data examples were analysed using different modelling strategies. First, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Second, translocation of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. It was shown that results from suboptimal approaches (two-sample t-tests and ordinary ANOVA assuming independent observations) may be both quantitatively and qualitatively different from the results obtained using an appropriate linear mixed model. The simulations demonstrated that the different approaches may lead to differences in coverage percentages of confidence intervals and type I error rates, confirming that misleading conclusions can easily arise when an inappropriate statistical approach is chosen. To ensure that experimental data are summarized appropriately and misleading conclusions avoided, the experimental design should be duly reflected in the choice of statistical approaches and models. We recommend that author guidelines explicitly require authors to indicate how the statistical analysis reflects the experimental design. © 2017 Society of Chemical Industry.
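
    The point about blocking can be illustrated with a toy simulation: when block-to-block variation is large, an analysis that ignores blocks inflates the standard error, while differencing within blocks (which is what a mixed model effectively exploits) removes the block effect. All numbers below are invented for illustration:

```python
import numpy as np

# A simulated randomized complete block design with a large block effect.
rng = np.random.default_rng(1)
n_blocks = 20
block_effect = rng.normal(0, 5, n_blocks)   # big block-to-block variation
treat_effect = 1.0

control = block_effect + rng.normal(0, 1, n_blocks)
treated = block_effect + treat_effect + rng.normal(0, 1, n_blocks)

# Naive two-sample approach: ignores the pairing within blocks.
naive_se = np.sqrt(control.var(ddof=1) / n_blocks +
                   treated.var(ddof=1) / n_blocks)

# Block-aware (paired) approach: difference within each block first,
# which cancels the block effect.
diffs = treated - control
paired_se = diffs.std(ddof=1) / np.sqrt(n_blocks)

print(paired_se < naive_se)  # True: respecting blocks shrinks the SE
```

    The mirror image also holds: with hierarchical sampling, ignoring the grouping makes the naive SE too *small*, which is the type I error inflation the abstract warns about.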

  14. Improving prediction of metal uptake by Chinese cabbage (Brassica pekinensis L.) based on a soil-plant stepwise analysis.

    PubMed

    Zhang, Sha; Song, Jing; Gao, Hui; Zhang, Qiang; Lv, Ming-Chao; Wang, Shuang; Liu, Gan; Pan, Yun-Yu; Christie, Peter; Sun, Wenjie

    2016-11-01

    It is crucial to develop predictive soil-plant transfer (SPT) models to derive threshold values for toxic metals in contaminated arable soils. The present study was designed to examine the heavy-metal uptake pattern and to improve the prediction of metal uptake by Chinese cabbage grown in agricultural soils co-contaminated with Cd, Cu, Ni, Pb, and Zn. Pot experiments were performed with 25 historically contaminated soils to determine metal accumulation in different parts of Chinese cabbage. Bioavailable soil metal fractions were determined using different extractants (0.43 M HNO3, 0.01 M CaCl2, 0.005 M DTPA, and 0.01 M LMWOAs), soil moisture samplers, and diffusive gradients in thin films (DGT), and the fractions were compared with shoot metal uptake using both direct and stepwise multiple regression analysis. The stepwise approach significantly improved the prediction of metal uptake by cabbage over the direct approach. Strongly pH-dependent or nonlinear relationships were found for adsorption onto root surfaces and in root-to-shoot uptake processes. Metals were linearly translocated from the root surface to the root. Therefore, the nonlinearity of the uptake pattern is an important explanation for the inadequacy of the direct approach in some cases. The stepwise approach offers an alternative and robust method for studying the pattern of metal uptake by Chinese cabbage (Brassica pekinensis L.). Copyright © 2016. Published by Elsevier B.V.
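
    Forward stepwise selection, the generic technique behind the stepwise regressions above, can be sketched as follows. The extractant names are borrowed from the abstract, but the data, the response model, and the R² entry threshold are entirely hypothetical:

```python
import numpy as np

# Candidate predictors (names only echo the abstract; values simulated).
rng = np.random.default_rng(2)
n = 80
cand = {
    "HNO3":  rng.normal(size=n),
    "CaCl2": rng.normal(size=n),
    "DTPA":  rng.normal(size=n),
    "pH":    rng.normal(size=n),
}
# Simulated shoot uptake driven by two of the candidate predictors.
y = 2.0 * cand["CaCl2"] + 1.0 * cand["pH"] + rng.normal(0, 0.5, n)

def r2(cols, y):
    # R^2 of an ordinary least-squares fit with intercept.
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - (y - X @ beta).var() / y.var()

selected, remaining = [], set(cand)
while remaining:
    base = r2([cand[k] for k in selected], y)
    best, best_gain = None, 0.05        # minimum R^2 gain to enter
    for k in sorted(remaining):
        gain = r2([cand[k] for k in selected + [k]], y) - base
        if gain > best_gain:
            best, best_gain = k, gain
    if best is None:
        break
    selected.append(best)
    remaining.remove(best)

print(selected)
```

    With these simulated data the two informative predictors carry most of the variance, so they enter the model; real stepwise procedures usually gate entry on an F-test or AIC rather than a fixed R² gain.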

  15. Erythropoietin-Stimulating Agents and Survival in End-Stage Renal Disease: Comparison of Payment Policy Analysis, Instrumental Variables, and Multiple Imputation of Potential Outcomes

    PubMed Central

    Dore, David D.; Swaminathan, Shailender; Gutman, Roee; Trivedi, Amal N.; Mor, Vincent

    2013-01-01

    Objective To compare the assumptions and estimands across three approaches to estimating the effect of erythropoietin-stimulating agents (ESAs) on mortality. Study Design and Setting Using data from the Renal Management Information System, we conducted two analyses utilizing a change to bundled payment that we hypothesized mimicked random assignment to ESA (pre-post, difference-in-difference, and instrumental variable analyses). A third analysis was based on multiply imputing potential outcomes using propensity scores. Results There were 311,087 recipients of ESAs and 13,095 non-recipients. In the pre-post comparison, we identified no clear relationship between bundled payment (measured by calendar time) and the incidence of death within six months (risk difference -1.5%; 95% CI -7.0% to 4.0%). In the instrumental variable analysis, the risk of mortality was similar among ESA recipients (risk difference -0.9%; 95% CI -2.1% to 0.3%). In the multiple imputation analysis, we observed a 4.2% (95% CI 3.4% to 4.9%) absolute reduction in mortality risk with use of ESAs, with estimates closer to the null for patients with baseline hematocrit >36%. Conclusion Methods emanating from different disciplines often rely on different assumptions, but can be informative about a similar causal contrast. The implications of these distinct approaches are discussed. PMID:23849152
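
    For readers unfamiliar with the risk-difference scale used throughout the abstract, here is a minimal risk difference with a Wald 95% confidence interval, computed from made-up counts rather than the study's data:

```python
import math

# Risk difference with a Wald 95% CI (hypothetical counts).
d1, n1 = 120, 1000   # deaths / patients, ESA recipients (invented)
d0, n0 = 150, 1000   # deaths / patients, non-recipients (invented)

p1, p0 = d1 / n1, d0 / n0
rd = p1 - p0                                   # absolute risk difference
se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
lo, hi = rd - 1.96 * se, rd + 1.96 * se        # Wald 95% CI

print(round(rd, 3), round(lo, 3), round(hi, 3))
```

    The instrumental-variable and imputation analyses in the paper target the same contrast but estimate it under different identifying assumptions, which is the abstract's central point.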

  16. Demonstrating microbial co-occurrence pattern analyses within and between ecosystems

    PubMed Central

    Williams, Ryan J.; Howe, Adina; Hofmockel, Kirsten S.

    2014-01-01

    Co-occurrence patterns are used in ecology to explore interactions between organisms and environmental effects on coexistence within biological communities. Analysis of co-occurrence patterns among microbial communities has ranged from simple pairwise comparisons between all community members to direct hypothesis testing between focal species. However, co-occurrence patterns are rarely studied across multiple ecosystems or multiple scales of biological organization within the same study. Here we outline an approach to produce co-occurrence analyses that are focused at three different scales: co-occurrence patterns between ecosystems at the community scale, modules of co-occurring microorganisms within communities, and co-occurring pairs within modules that are nested within microbial communities. To demonstrate our co-occurrence analysis approach, we gathered publicly available 16S rRNA amplicon datasets to compare and contrast microbial co-occurrence at different taxonomic levels across different ecosystems. We found differences in community composition and co-occurrence that reflect environmental filtering at the community scale and consistent pairwise occurrences that may be used to infer ecological traits about poorly understood microbial taxa. However, we also found that conclusions derived from applying network statistics to microbial relationships can vary depending on the taxonomic level chosen and criteria used to build co-occurrence networks. We present our statistical analysis and code for public use in analysis of co-occurrence patterns across microbial communities. PMID:25101065
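
    The simplest layer of the analysis above, pairwise co-occurrence, can be sketched as a correlation network: correlate taxon abundance profiles across samples and keep strongly correlated pairs as edges. The taxa, abundances, and threshold below are simulated and arbitrary, not the authors' pipeline:

```python
import numpy as np

# Simulated abundance table: rows are samples, columns are taxa.
rng = np.random.default_rng(3)
n_samples, n_taxa = 30, 6
abund = rng.poisson(10, size=(n_samples, n_taxa)).astype(float)
abund[:, 1] = abund[:, 0] + rng.normal(0, 1, n_samples)  # engineer a co-occurring pair

# Pairwise correlation across samples; strong positive pairs become edges.
corr = np.corrcoef(abund, rowvar=False)
threshold = 0.7
edges = [(i, j) for i in range(n_taxa) for j in range(i + 1, n_taxa)
         if corr[i, j] > threshold]

print(edges)
```

    The engineered pair (taxa 0 and 1) shows up as an edge; the module-level and community-level scales in the paper are then built by clustering and comparing such networks. As the abstract notes, the edge set is sensitive to the threshold and the taxonomic level chosen.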

  17. Exhaled Breath Markers for Nonimaging and Noninvasive Measures for Detection of Multiple Sclerosis.

    PubMed

    Broza, Yoav Y; Har-Shai, Lior; Jeries, Raneen; Cancilla, John C; Glass-Marmor, Lea; Lejbkowicz, Izabella; Torrecilla, José S; Yao, Xuelin; Feng, Xinliang; Narita, Akimitsu; Müllen, Klaus; Miller, Ariel; Haick, Hossam

    2017-11-15

    Multiple sclerosis (MS) is the most common chronic neurological disease affecting young adults. MS diagnosis is based on clinical characteristics and confirmed by examination of the cerebrospinal fluid (CSF) or by magnetic resonance imaging (MRI) of the brain, spinal cord, or both. However, neither of the current diagnostic procedures is adequate as a routine tool for determining disease state. Thus, diagnostic biomarkers are needed. The current study presents a novel approach that could meet this need, based on noninvasive analysis of volatile organic compounds (VOCs) in breath. Exhaled breath was collected from 204 participants: 146 MS patients and 58 healthy controls. Analysis was performed by gas chromatography-mass spectrometry (GC-MS) and a nanomaterial-based sensor array. Predictive models were derived from the sensor data using artificial neural networks (ANNs). GC-MS analysis revealed significant differences in VOC abundance between MS patients and controls. Sensor data analysis on training sets discriminated in binary comparisons between MS patients and controls with accuracies up to 90%. Blinded sets showed 95% positive predictive value (PPV) between MS in remission and control, 100% sensitivity with 100% negative predictive value (NPV) between MS not-treated (NT) and control, and 86% NPV between relapse and control. Possible links between VOC biomarkers and MS pathogenesis were established. These preliminary results suggest the applicability of a new nanotechnology-based method for MS diagnostics.
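
    The predictive-value metrics quoted above (PPV, NPV, sensitivity) all come directly from a confusion matrix. A sketch with hypothetical counts, not the study's data:

```python
# Confusion-matrix counts (invented for illustration).
tp, fp = 40, 2    # true / false positives
tn, fn = 50, 4    # true / false negatives

ppv = tp / (tp + fp)            # how reliable a positive call is
npv = tn / (tn + fn)            # how reliable a negative call is
sensitivity = tp / (tp + fn)    # fraction of true cases detected

print(round(ppv, 2), round(npv, 2), round(sensitivity, 2))
```

    Note that PPV and NPV, unlike sensitivity, depend on how common the disease is in the tested group, which is why blinded-set composition matters when reading the figures above.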

  18. Assessment of BTEX-induced health risk under multiple uncertainties at a petroleum-contaminated site: An integrated fuzzy stochastic approach

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Huang, Guo H.

    2011-12-01

    Groundwater pollution has attracted increasing attention in the past decades. An assessment of groundwater contamination risk is needed to provide a sound basis for risk-based management decisions. The objective of this study is therefore to develop an integrated fuzzy stochastic approach to evaluate the risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and a fuzzy-sets approach to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered uncertain with known fuzzy membership functions, and intrinsic permeability is considered an interval number with unknown distribution information. A factorial design is conducted to evaluate the interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, represented as fuzzy, stochastic, and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site in western Canada. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to provide support in identifying appropriate remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling the various forms of uncertainty associated with simulation and risk assessment.
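
    One small ingredient of the hybrid-uncertainty machinery described above is interval arithmetic for a parameter known only as a range (like the intrinsic permeability here). A minimal sketch; the quantity being propagated and its values are hypothetical, not taken from the study:

```python
# Interval multiplication: the product of two intervals is bounded by
# the extreme pairwise products of their endpoints.
def interval_mul(a, b):
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

permeability = (1.0e-12, 5.0e-12)   # parameter known only as an interval
gradient = (0.8, 1.2)               # hypothetical multiplicative term

flux = interval_mul(permeability, gradient)
print(flux)  # roughly (8e-13, 6e-12)
```

    The paper's IIFMS layers fuzzy membership functions and factorial design on top of this kind of bound propagation; the sketch shows only the interval part.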

  19. Multiple scales approach to weakly nonparallel and curvature effects: Details for the novice

    NASA Technical Reports Server (NTRS)

    Singer, Bart A.; Choudhari, Meelan

    1995-01-01

    A multiple scales approach is used to approximate the effects of nonparallelism and streamwise curvature on the stability of three-dimensional disturbances in incompressible flow. The multiple scales approach is implemented with the full second-order system of equations. A detailed exposition of the source of all terms is provided.

  20. In Australia: Multiple Intelligences in Multiple Settings.

    ERIC Educational Resources Information Center

    Vialle, Wilma

    1997-01-01

    In Australia, Gardner's multiple-intelligences theory has strongly influenced primary, preschool, and special education. A survey of 30 schools revealed that teachers use two basic approaches: teaching to, and teaching through, multiple intelligences. The first approach might develop children's music skills via playing an instrument. The second…
