Genomic-based multiple-trait evaluation in Eucalyptus grandis using dominant DArT markers.
Cappa, Eduardo P; El-Kassaby, Yousry A; Muñoz, Facundo; Garcia, Martín N; Villalba, Pamela V; Klápště, Jaroslav; Marcucci Poltri, Susana N
2018-06-01
We investigated the impact of combining the pedigree- and genomic-based relationship matrices in a multiple-trait individual-tree mixed model (a.k.a., multiple-trait combined approach) on the estimates of heritability and on the genomic correlations between growth and stem straightness in an open-pollinated Eucalyptus grandis population. Additionally, the added advantage of incorporating genomic information on the theoretical accuracies of parents and offspring breeding values was evaluated. Our results suggested that the use of the combined approach for estimating heritabilities and additive genetic correlations in multiple-trait evaluations is advantageous and including genomic information increases the expected accuracy of breeding values. Furthermore, the multiple-trait combined approach was proven to be superior to the single-trait combined approach in predicting breeding values, in particular for low-heritability traits. Finally, our results advocate the use of the combined approach in forest tree progeny testing trials, specifically when a multiple-trait individual-tree mixed model is considered. Copyright © 2018 Elsevier B.V. All rights reserved.
Lima, Estevao; Rolanda, Carla; Correia-Pinto, Jorge
2009-05-01
An isolated transgastric port raises serious limitations in performing complex natural orifice translumenal endoscopic surgery (NOTES) procedures in the urology field. In an attempt to overcome these limitations, several solutions have been advanced, such as the hybrid approach (adding a single abdominal port access) or the pure NOTES combined approach (joining multiple natural orifice ports). To review the current state of experimental and clinical results of multiple ports in NOTES, a literature search of PubMed was performed, seeking publications on NOTES from January 2002 to 2008. In addition, we looked at pertinent abstracts of the 2007 annual meetings of the American Urological Association, the European Association of Urology, and the World Congress of Endourology. Multiple ports of entry seem to be necessary, mainly for moderately complex procedures. Thus, we could find studies using the hybrid approach (combination of transgastric or transvaginal access with a single transabdominal port) or the pure NOTES combined approach (transgastric and transvesical, transvaginal and transcolonic, or transgastric and transvaginal). There is still limited experience in humans with these approaches, and no comparative studies exist to date. It is predictable that moderately complex procedures will need multiple ports; the transvaginal-transabdominal (hybrid) approach is therefore the most appealing, whereas from a pure NOTES perspective, the transgastric-transvesical approach seems to be preferred. We await new equipment and instruments that are more appropriate for these novel techniques.
A data fusion approach for track monitoring from multiple in-service trains
NASA Astrophysics Data System (ADS)
Lederman, George; Chen, Siheng; Garrett, James H.; Kovačević, Jelena; Noh, Hae Young; Bielak, Jacobo
2017-10-01
We present a data fusion approach for enabling data-driven rail-infrastructure monitoring from multiple in-service trains. A number of researchers have proposed using vibration data collected from in-service trains as a low-cost method to monitor track geometry. The majority of this work has focused on developing novel features to extract information about the tracks from data produced by individual sensors on individual trains. We extend this work by presenting a technique to combine extracted features from multiple passes over the tracks from multiple sensors aboard multiple vehicles. There are a number of challenges in combining multiple data sources, such as differing relative position coordinates depending on the location of the sensor within the train. Furthermore, as the number of sensors increases, the likelihood that some will malfunction also increases. We use a two-step approach that first minimizes position offset errors through data alignment, then fuses the data with a novel adaptive Kalman filter that weights data according to its estimated reliability. We show the efficacy of this approach both through simulations and on a dataset collected from two instrumented trains operating over a one-year period. Combining data from numerous in-service trains allows for more continuous and more reliable data-driven monitoring than analyzing data from any one train alone; as the number of instrumented trains increases, the proposed fusion approach could facilitate track monitoring of entire rail networks.
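The second step of the two-step approach, reliability-weighted fusion, can be sketched compactly. Below is a minimal, hypothetical Python sketch: a one-dimensional Kalman filter fuses already-aligned passes, with each sensor's measurement noise estimated from its deviation about a median trace so that malfunctioning sensors are down-weighted. The function name, the noise heuristic, and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_track_passes(passes, q=1e-3):
    """Fuse aligned track-feature passes with a 1-D adaptive Kalman filter.

    passes : array of shape (n_sensors, n_positions); NaN marks missing data.
    q      : process-noise variance (the track changes slowly along the line).
    Each sensor's measurement variance is estimated from its residuals,
    so unreliable (noisy or malfunctioning) sensors get down-weighted.
    """
    n_sensors, n_pos = passes.shape
    # crude per-sensor noise estimate: variance of deviations from the median trace
    ref = np.nanmedian(passes, axis=0)
    r = np.array([np.nanvar(p - ref) + 1e-9 for p in passes])

    x, P = ref[0], 1.0          # state estimate and its variance
    fused = np.empty(n_pos)
    for k in range(n_pos):
        P += q                  # predict: random-walk model along the track
        for s in range(n_sensors):
            z = passes[s, k]
            if np.isnan(z):
                continue
            K = P / (P + r[s])  # Kalman gain: reliable sensors get larger K
            x += K * (z - x)
            P *= (1 - K)
        fused[k] = x
    return fused

# toy usage: three noisy passes over the same stretch of track
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 4 * np.pi, 200))
passes = truth + rng.normal(0, [[0.1], [0.2], [1.0]], (3, 200))  # sensor 3 is bad
print(np.round(np.corrcoef(fuse_track_passes(passes), truth)[0, 1], 3))
```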
Almalik, Osama; Nijhuis, Michiel B; van den Heuvel, Edwin R
2014-01-01
Shelf-life estimation usually requires that at least three registration batches are tested for stability at multiple storage conditions. The shelf-life estimates are often obtained by linear regression analysis per storage condition, an approach implicitly suggested by ICH guideline Q1E. A linear regression analysis combining all data from multiple storage conditions was recently proposed in the literature for the case where variances are homogeneous across storage conditions. The combined analysis is expected to perform better than the separate analysis per storage condition, since pooling data would lead to an improved estimate of the variation and a higher number of degrees of freedom, but this advantage is not self-evident for shelf-life estimation. Indeed, the two approaches treat the observed initial batch results, the intercepts in the model, and poolability of batches differently, which may eliminate or reduce the expected advantage of the combined approach with respect to the separate approach. Therefore, a simulation study was performed to compare, on several characteristics, the distribution of simulated shelf-life estimates between the two approaches and to quantify the difference in shelf-life estimates. In general, the combined statistical analysis does estimate the true shelf life more consistently and precisely than the analysis per storage condition, but it did not outperform the separate analysis in all circumstances.
Tomlinson, Alan; Hair, Mario; McFadyen, Angus
2013-10-01
Dry eye is a multifactorial disease that requires a broad spectrum of test measures for its diagnosis and for monitoring its treatment. However, studies have typically reported improvements in individual measures with treatment. Alternative approaches involve multiple, combined outcomes assessed by different statistical analyses. In order to assess the effect of various statistical approaches to the use of single and combined test measures in dry eye, this review reanalyzed measures from two previous studies (osmolarity, evaporation, tear turnover rate, and lipid film quality). These analyses assessed the measures as single variables within groups, pre- and post-intervention with a lubricant supplement, by creating combinations of these variables, and by validating these combinations with the combined sample of data from all groups of dry eye subjects. The effectiveness of single measures and combinations in the diagnosis of dry eye was also considered. Copyright © 2013. Published by Elsevier Inc.
Combined mining: discovering informative knowledge in complex data.
Cao, Longbing; Zhang, Huaifeng; Zhao, Yanchang; Luo, Dan; Zhang, Chengqi
2011-06-01
Enterprise data mining applications often involve complex data such as multiple large heterogeneous data sources, user preferences, and business impact. In such situations, a single method or one-step mining is often limited in discovering informative knowledge. It would also be very time- and space-consuming, if not impossible, to join relevant large data sources for mining patterns consisting of multiple aspects of information. It is crucial to develop effective approaches for mining patterns that combine necessary information from multiple relevant business lines, catering for real business settings and decision-making actions rather than just providing a single line of patterns. Recent years have seen increasing efforts on mining more informative patterns, e.g., integrating frequent pattern mining with classification to generate frequent-pattern-based classifiers. Rather than presenting a specific algorithm, this paper builds on our existing works and proposes combined mining as a general approach to mining for informative patterns combining components from either multiple data sets or multiple features or by multiple methods on demand. We summarize general frameworks, paradigms, and basic processes for multifeature combined mining, multisource combined mining, and multimethod combined mining. Novel types of combined patterns, such as incremental cluster patterns, can result from such frameworks, and cannot be directly produced by existing methods. A set of real-world case studies has been conducted to test the frameworks, with some of them briefed in this paper. They identify combined patterns for informing government debt prevention and improving government service objectives, which show the flexibility and instantiation capability of combined mining in discovering informative knowledge in complex data.
An improved Multimodel Approach for Global Sea Surface Temperature Forecasts
NASA Astrophysics Data System (ADS)
Khan, M. Z. K.; Mehrotra, R.; Sharma, A.
2014-12-01
The concept of ensemble combinations for formulating improved climate forecasts has gained popularity in recent years. However, many climate models share similar physics or modeling processes, which may lead to similar (or strongly correlated) forecasts. Recent approaches for combining forecasts that take into consideration differences in model accuracy over space and time have either ignored the similarity of forecasts among the models or followed a pairwise dynamic combination approach. Here we present a basis for combining model predictions, illustrating the improvements that can be achieved if procedures for factoring in inter-model dependence are utilised. The utility of the approach is demonstrated by combining sea surface temperature (SST) forecasts from five climate models over the period 1960-2005. The variable of interest, the monthly global sea surface temperature anomaly (SSTA) on a 5° × 5° latitude-longitude grid, is predicted three months in advance to demonstrate the utility of the proposed algorithm. Results indicate that the proposed approach offers consistent and significant improvements for the majority of grid points compared to the case where the dependence among the models is ignored. Therefore, the proposed approach of combining multiple models by taking into account their interdependence provides an attractive alternative for obtaining improved climate forecasts. In addition, an approach to combine seasonal forecasts from multiple climate models with varying periods of availability is also demonstrated.
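The key idea, penalizing models that share the same mistakes when forming combination weights, can be illustrated with minimum-error-variance weights w ∝ Σ⁻¹1, where Σ is the inter-model error covariance at a grid point. The sketch below is a hypothetical illustration, not the authors' algorithm; setting Σ diagonal recovers the skill-only weighting that ignores dependence.

```python
import numpy as np

def combination_weights(errors, ignore_dependence=False):
    """Minimum-error-variance weights for combining model forecasts.

    errors : (n_times, n_models) hindcast errors at one grid point.
    With ignore_dependence=True the error covariance is treated as
    diagonal (skill-only weighting); otherwise the full covariance is
    used, which penalises models that share the same errors.
    """
    S = np.cov(errors, rowvar=False)
    if ignore_dependence:
        S = np.diag(np.diag(S))
    w = np.linalg.solve(S, np.ones(S.shape[0]))
    return w / w.sum()

# toy example: models 1 and 2 share physics (correlated errors); model 3 is independent
rng = np.random.default_rng(1)
shared = rng.normal(0, 1.0, 500)
errs = np.column_stack([shared + rng.normal(0, 0.3, 500),
                        shared + rng.normal(0, 0.3, 500),
                        rng.normal(0, 1.0, 500)])
print(combination_weights(errs, ignore_dependence=True).round(2))  # over-trusts the twins
print(combination_weights(errs).round(2))                          # down-weights them
```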
Palmprint authentication using multiple classifiers
NASA Astrophysics Data System (ADS)
Kumar, Ajay; Zhang, David
2004-08-01
This paper investigates the performance improvement for palmprint authentication using multiple classifiers. The proposed methods on personal authentication using palmprints can be divided into three categories: appearance-, line-, and texture-based. A combination of these approaches can be used to achieve higher performance. We propose to simultaneously extract palmprint features from PCA, line detectors, and Gabor filters and combine their corresponding matching scores. This paper also investigates the comparative performance of simple combination rules and the hybrid fusion strategy to achieve performance improvement. Our experimental results on a database of 100 users demonstrate the usefulness of such an approach over those based on individual classifiers.
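Matching-score fusion with simple combination rules can be sketched as follows. The min-max normalization, the rule set, and the toy scores are illustrative assumptions; the paper's hybrid fusion strategy is not reproduced here.

```python
import numpy as np

def fuse_scores(score_lists, rule="sum"):
    """Combine per-classifier matching scores (higher = more genuine).

    score_lists : (n_classifiers, n_samples) array of match scores,
    min-max normalised per classifier before fusion so that PCA-,
    line- and Gabor-based scores live on a common [0, 1] scale.
    """
    s = np.asarray(score_lists, float)
    s = (s - s.min(axis=1, keepdims=True)) / (np.ptp(s, axis=1, keepdims=True) + 1e-12)
    rules = {"sum": s.mean(axis=0), "product": s.prod(axis=0),
             "max": s.max(axis=0), "min": s.min(axis=0)}
    return rules[rule]

# toy usage: three classifiers scoring four claimed identities
scores = [[0.9, 0.2, 0.6, 0.1],   # appearance (PCA) matcher
          [0.8, 0.3, 0.7, 0.2],   # line-feature matcher
          [0.7, 0.1, 0.9, 0.3]]   # Gabor texture matcher
decisions = fuse_scores(scores, "sum") > 0.5   # threshold is application-specific
print(decisions)
```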
ERIC Educational Resources Information Center
Bottomley, Steven; Denny, Paul
2011-01-01
A participatory learning approach, combined with both a traditional and a competitive assessment, was used to motivate students and promote a deep approach to learning biochemistry. Students were challenged to research, author, and explain their own multiple-choice questions (MCQs). They were also required to answer, evaluate, and discuss MCQs…
NASA Astrophysics Data System (ADS)
Joshi, Aditya; Lindsey, Brooks D.; Dayton, Paul A.; Pinton, Gianmarco; Muller, Marie
2017-05-01
Ultrasound contrast agents (UCA), such as microbubbles, enhance the scattering properties of blood, which is otherwise hypoechoic. The multiple scattering interactions of the acoustic field with UCA are poorly understood due to the complexity of the multiple scattering theories and the nonlinear microbubble response. The majority of bubble models describe the behavior of UCA as single, isolated microbubbles suspended in an infinite medium. Multiple scattering models such as the independent scattering approximation can approximate phase velocity and attenuation for low scatterer volume fractions. However, all current models and simulation approaches only describe multiple scattering and nonlinear bubble dynamics separately. Here we present an approach that combines two existing models: (1) a full-wave model that describes nonlinear propagation and scattering interactions in a heterogeneous attenuating medium and (2) a Paul-Sarkar model that describes the nonlinear interactions between an acoustic field and microbubbles. These two models were solved numerically and combined with an iterative approach. The convergence of this combined model was explored in silico for 0.5 × 10⁶ microbubbles ml⁻¹, and 1% and 2% bubble concentration by volume. The backscattering predicted by our modeling approach was verified experimentally with water tank measurements performed with a 128-element linear array transducer. An excellent agreement in terms of the fundamental and harmonic acoustic fields is shown. Additionally, our model correctly predicts the phase velocity and attenuation measured using through-transmission and predicted by the independent scattering approximation.
Yang, Zhihao; Lin, Yuan; Wu, Jiajin; Tang, Nan; Lin, Hongfei; Li, Yanpeng
2011-10-01
Knowledge about protein-protein interactions (PPIs) unveils the molecular mechanisms of biological processes. However, the volume and content of published biomedical literature on protein interactions is expanding rapidly, making it increasingly difficult for interaction database curators to detect and curate protein interaction information manually. We present a multiple kernel learning-based approach for automatic PPI extraction from biomedical literature. The approach combines feature-based, tree, and graph kernels and aggregates their output with a Ranking support vector machine (SVM). Experimental evaluations show that the features in the individual kernels are complementary and that the combined kernel with the Ranking SVM achieves better performance than the individual kernels, an equal-weight combination, and an optimal-weight combination. Our approach achieves state-of-the-art performance with respect to comparable evaluations, with a 64.88% F-score and 88.02% AUC on the AImed corpus. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
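A convex combination of kernels is the core operation in multiple kernel learning. The sketch below combines precomputed Gram matrices and feeds them to a standard SVM as a stand-in for the paper's Ranking SVM; the random features, the weights, and the labels are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC

def combine_kernels(kernels, weights):
    """Weighted sum of precomputed kernel matrices (feature, tree, graph...).

    kernels : list of (n, n) Gram matrices, one per information source.
    weights : non-negative weights; a convex combination of valid kernels
    is itself a valid kernel, so the result can be fed to any kernel machine.
    """
    w = np.asarray(weights, float)
    w = w / w.sum()
    return sum(wi * K for wi, K in zip(w, kernels))

# toy usage with two random linear kernels standing in for the
# feature-based and graph kernels of the paper
rng = np.random.default_rng(2)
X1, X2 = rng.normal(size=(40, 5)), rng.normal(size=(40, 8))
y = (X1[:, 0] + X2[:, 0] > 0).astype(int)
K = combine_kernels([X1 @ X1.T, X2 @ X2.T], weights=[0.6, 0.4])
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))  # training accuracy on the toy data
```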
NASA Astrophysics Data System (ADS)
Bharti, P. K.; Khan, M. I.; Singh, Harbinder
2010-10-01
Off-line quality control is considered to be an effective approach to improve product quality at a relatively low cost. The Taguchi method is one of the conventional approaches for this purpose. Through this approach, engineers can determine a feasible combination of design parameters such that the variability of a product's response is reduced and the mean is close to the desired target. The traditional Taguchi method focused on ensuring good performance at the parameter design stage for one quality characteristic, but most products and processes have multiple quality characteristics. The optimal parameter design minimizes the total quality loss for multiple quality characteristics. Several studies have presented approaches addressing multiple quality characteristics, most of them concerned with maximizing the signal-to-noise (SN) ratios of the parameter combination. The results reveal two advantages of this approach: the optimal parameter design is the same as in the traditional Taguchi method for a single quality characteristic, and the optimal design maximizes the reduction of total quality loss for multiple quality characteristics. This paper presents a literature review on solving multi-response problems in the Taguchi method and its successful implementation in various industries.
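The S/N ratios maximized in the Taguchi method have standard closed forms; a minimal sketch of the three textbook variants (the sample data are hypothetical):

```python
import numpy as np

def sn_ratio(y, kind="larger"):
    """Taguchi signal-to-noise ratio for replicated responses y (in dB).

    kind: 'larger'  -> larger-the-better  : -10 log10(mean(1/y^2))
          'smaller' -> smaller-the-better : -10 log10(mean(y^2))
          'nominal' -> nominal-the-best   :  10 log10(mean^2 / var)
    """
    y = np.asarray(y, float)
    if kind == "larger":
        return -10 * np.log10(np.mean(1.0 / y**2))
    if kind == "smaller":
        return -10 * np.log10(np.mean(y**2))
    return 10 * np.log10(y.mean()**2 / y.var(ddof=1))

# replicated strength measurements for one row of an orthogonal array
print(round(sn_ratio([41.2, 42.5, 40.8], kind="larger"), 2))
```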
Association analysis of multiple traits by an approach of combining P values.
Chen, Lili; Wang, Yong; Zhou, Yajing
2018-03-01
Increasing evidence shows that one variant can affect multiple traits, which is a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most of these methods are usually suitable for detecting common variants associated with multiple traits. However, because of the low minor allele frequency of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA), originally for a single trait, to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. We then take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to differing directions of effects of causal variants.
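A simplified sketch of the flavor of statistic involved: per-variant P values (from, e.g., a reverse regression of genotype on the traits) are truncated at a threshold and combined with weights. This is an illustrative reduction of an ADA-style statistic, not the authors' exact procedure; the weights and data are hypothetical, and significance must be judged by permutation because the null distribution depends on the truncation and the weights.

```python
import numpy as np

def weighted_p_combination(pvals, weights, truncation=0.05):
    """Weighted sum of -log(p) over the variants whose single-variant
    P value passes the truncation threshold (ADA-style, simplified)."""
    pvals, weights = np.asarray(pvals), np.asarray(weights)
    keep = pvals <= truncation
    if not keep.any():
        return 0.0
    return float(np.sum(weights[keep] * -np.log(pvals[keep])))

# toy region: 8 rare variants, two with small single-variant P values
p = np.array([0.004, 0.41, 0.02, 0.77, 0.55, 0.09, 0.63, 0.30])
w = 1.0 / np.sqrt(np.arange(1, 9))      # e.g. up-weight rarer variants
print(round(weighted_p_combination(p, w), 2))
```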
Leyrat, Clémence; Seaman, Shaun R; White, Ian R; Douglas, Ian; Smeeth, Liam; Kim, Joseph; Resche-Rigon, Matthieu; Carpenter, James R; Williamson, Elizabeth J
2017-01-01
Inverse probability of treatment weighting is a popular propensity score-based approach to estimate marginal treatment effects in observational studies at risk of confounding bias. A major issue when estimating the propensity score is the presence of partially observed covariates. Multiple imputation is a natural approach to handle missing data on covariates: covariates are imputed and a propensity score analysis is performed in each imputed dataset to estimate the treatment effect. The treatment effect estimates from each imputed dataset are then combined to obtain an overall estimate. We call this method MIte. However, an alternative approach has been proposed, in which the propensity scores are combined across the imputed datasets (MIps). Therefore, there are remaining uncertainties about how to implement multiple imputation for propensity score analysis: (a) should we apply Rubin's rules to the inverse probability of treatment weighting treatment effect estimates or to the propensity score estimates themselves? (b) does the outcome have to be included in the imputation model? (c) how should we estimate the variance of the inverse probability of treatment weighting estimator after multiple imputation? We studied the consistency and balancing properties of the MIte and MIps estimators and performed a simulation study to empirically assess their performance for the analysis of a binary outcome. We also compared the performance of these methods to complete case analysis and the missingness pattern approach, which uses a different propensity score model for each pattern of missingness, and a third multiple imputation approach in which the propensity score parameters are combined rather than the propensity scores themselves (MIpar). Under a missing at random mechanism, complete case and missingness pattern analyses were biased in most cases for estimating the marginal treatment effect, whereas multiple imputation approaches were approximately unbiased as long as the outcome was included in the imputation model. Only MIte was unbiased in all the studied scenarios and Rubin's rules provided good variance estimates for MIte. The propensity score estimated in the MIte approach showed good balancing properties. In conclusion, when using multiple imputation in the inverse probability of treatment weighting context, MIte with the outcome included in the imputation model is the preferred approach.
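The MIte recipe (estimate the IPTW treatment effect in each imputed dataset, then pool with Rubin's rules) can be sketched as follows. The propensity model, the crude within-dataset variance formula, and the data generator are illustrative assumptions; in practice the within-dataset variance would come from a sandwich estimator or bootstrap, and the datasets would be genuine imputations of one study rather than independent draws.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def mite(imputed_datasets):
    """MIte-style pooling: one IPTW estimate per imputed dataset, then
    Rubin's rules. Each dataset is a tuple (X, t, y) with X the fully
    imputed covariates, t the binary treatment, y the outcome."""
    est, var = [], []
    for X, t, y in imputed_datasets:
        ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
        w = t / ps + (1 - t) / (1 - ps)                 # inverse-probability weights
        mu1 = np.average(y[t == 1], weights=w[t == 1])
        mu0 = np.average(y[t == 0], weights=w[t == 0])
        est.append(mu1 - mu0)
        # crude within-dataset variance; replace with sandwich/bootstrap in practice
        var.append(y[t == 1].var() / (t == 1).sum() + y[t == 0].var() / (t == 0).sum())
    est, var = np.array(est), np.array(var)
    m = len(est)
    pooled = est.mean()
    total_var = var.mean() + (1 + 1 / m) * est.var(ddof=1)  # Rubin's rules
    return pooled, np.sqrt(total_var)

# stand-in for 5 imputed datasets with a true marginal effect of about 1.0
rng = np.random.default_rng(3)
def make_dataset():
    X = rng.normal(size=(500, 2))
    t = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
    y = 1.0 * t + X @ [0.5, -0.3] + rng.normal(0, 1, 500)
    return X, t, y
print(mite([make_dataset() for _ in range(5)]))
```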
ERIC Educational Resources Information Center
Martínez, José Felipe; Schweig, Jonathan; Goldschmidt, Pete
2016-01-01
A key question facing teacher evaluation systems is how to combine multiple measures of complex constructs into composite indicators of performance. We use data from the Measures of Effective Teaching (MET) study to investigate the measurement properties of composite indicators obtained under various conjunctive, disjunctive (or complementary),…
Exploring High-D Spaces with Multiform Matrices and Small Multiples
MacEachren, Alan; Dai, Xiping; Hardisty, Frank; Guo, Diansheng; Lengerich, Gene
2011-01-01
We introduce an approach to visual analysis of multivariate data that integrates several methods from information visualization, exploratory data analysis (EDA), and geovisualization. The approach leverages the component-based architecture implemented in GeoVISTA Studio to construct a flexible, multiview, tightly (but generically) coordinated, EDA toolkit. This toolkit builds upon traditional ideas behind both small multiples and scatterplot matrices in three fundamental ways. First, we develop a general, MultiForm, Bivariate Matrix and a complementary MultiForm, Bivariate Small Multiple plot in which different bivariate representation forms can be used in combination. We demonstrate the flexibility of this approach with matrices and small multiples that depict multivariate data through combinations of: scatterplots, bivariate maps, and space-filling displays. Second, we apply a measure of conditional entropy to (a) identify variables from a high-dimensional data set that are likely to display interesting relationships and (b) generate a default order of these variables in the matrix or small multiple display. Third, we add conditioning, a kind of dynamic query/filtering in which supplementary (undisplayed) variables are used to constrain the view onto variables that are displayed. Conditioning allows the effects of one or more well understood variables to be removed from the analysis, making relationships among remaining variables easier to explore. We illustrate the individual and combined functionality enabled by this approach through application to analysis of cancer diagnosis and mortality data and their associated covariates and risk factors. PMID:21947129
ERIC Educational Resources Information Center
Durston, Sarah; Konrad, Kerstin
2007-01-01
This paper aims to illustrate how combining multiple approaches can inform us about the neurobiology of ADHD. Converging evidence from genetic, psychopharmacological and functional neuroimaging studies has implicated dopaminergic fronto-striatal circuitry in ADHD. However, while the observation of converging evidence from multiple vantage points…
Gabb, Henry A; Blake, Catherine
2016-08-01
Simultaneous or sequential exposure to multiple environmental stressors can affect chemical toxicity. Cumulative risk assessments consider multiple stressors but it is impractical to test every chemical combination to which people are exposed. New methods are needed to prioritize chemical combinations based on their prevalence and possible health impacts. We introduce an informatics approach that uses publicly available data to identify chemicals that co-occur in consumer products, which account for a significant proportion of overall chemical load. Fifty-five asthma-associated and endocrine disrupting chemicals (target chemicals) were selected. A database of 38,975 distinct consumer products and 32,231 distinct ingredient names was created from online sources, and PubChem and the Unified Medical Language System were used to resolve synonymous ingredient names. Synonymous ingredient names are different names for the same chemical (e.g., vitamin E and tocopherol). Nearly one-third of the products (11,688 products, 30%) contained ≥ 1 target chemical and 5,229 products (13%) contained > 1. Of the 55 target chemicals, 31 (56%) appear in ≥ 1 product and 19 (35%) appear under more than one name. The most frequent three-way chemical combination (2-phenoxyethanol, methyl paraben, and ethyl paraben) appears in 1,059 products. Further work is needed to assess combined chemical exposures related to the use of multiple products. The informatics approach increased the number of products considered in a traditional analysis by two orders of magnitude, but missing/incomplete product labels can limit the effectiveness of this approach. Such an approach must resolve synonymy to ensure that chemicals of interest are not missed. Commonly occurring chemical combinations can be used to prioritize cumulative toxicology risk assessments. Gabb HA, Blake C. 2016. An informatics approach to evaluating combined chemical exposures from consumer products: a case study of asthma-associated chemicals and potential endocrine disruptors. Environ Health Perspect 124:1155-1165; http://dx.doi.org/10.1289/ehp.1510529.
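The counting core of the informatics approach can be sketched directly: canonicalize ingredient names through a synonym map (standing in for the PubChem/UMLS resolution step), intersect with the target list, and tally k-way combinations across products. All data and the synonym map below are hypothetical.

```python
from collections import Counter
from itertools import combinations

# hypothetical synonym map standing in for PubChem/UMLS name resolution
SYNONYMS = {"tocopherol": "vitamin e", "methylparaben": "methyl paraben"}

def combo_counts(products, targets, k=2):
    """Count k-way target-chemical combinations across product ingredient
    lists, resolving synonymous names to one canonical form first."""
    counts = Counter()
    for ingredients in products:
        canon = {SYNONYMS.get(i.lower(), i.lower()) for i in ingredients}
        hits = sorted(canon & targets)
        counts.update(combinations(hits, k))
    return counts

products = [["Water", "Methyl Paraben", "Ethyl Paraben", "2-Phenoxyethanol"],
            ["Tocopherol", "Methylparaben", "Ethyl Paraben"],
            ["Water", "Glycerin"]]
targets = {"methyl paraben", "ethyl paraben", "2-phenoxyethanol", "vitamin e"}
print(combo_counts(products, targets, k=2).most_common(3))
```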
Two-Way Regularized Fuzzy Clustering of Multiple Correspondence Analysis.
Kim, Sunmee; Choi, Ji Yeh; Hwang, Heungsun
2017-01-01
Multiple correspondence analysis (MCA) is a useful tool for investigating the interrelationships among dummy-coded categorical variables. MCA has been combined with clustering methods to examine whether there exist heterogeneous subclusters of a population, which exhibit cluster-level heterogeneity. These combined approaches aim to classify either observations only (one-way clustering of MCA) or both observations and variable categories (two-way clustering of MCA). The latter approach is favored because its solutions are easier to interpret by providing explicitly which subgroup of observations is associated with which subset of variable categories. Nonetheless, the two-way approach has been built on hard classification that assumes observations and/or variable categories to belong to only one cluster. To relax this assumption, we propose two-way fuzzy clustering of MCA. Specifically, we combine MCA with fuzzy k-means simultaneously to classify a subgroup of observations and a subset of variable categories into a common cluster, while allowing both observations and variable categories to belong partially to multiple clusters. Importantly, we adopt regularized fuzzy k-means, thereby enabling us to decide the degree of fuzziness in cluster memberships automatically. We evaluate the performance of the proposed approach through the analysis of simulated and real data, in comparison with existing two-way clustering approaches.
Describing and understanding behavioral responses to multiple stressors and multiple stimuli.
Hale, Robin; Piggott, Jeremy J; Swearer, Stephen E
2017-01-01
Understanding the effects of environmental change on natural ecosystems is a major challenge, particularly when multiple stressors interact to produce unexpected "ecological surprises" in the form of complex, nonadditive effects that can amplify or reduce their individual effects. Animals often respond behaviorally to environmental change, and multiple stressors can have both population-level and community-level effects. However, the individual, not combined, effects of stressors on animal behavior are commonly studied. There is a need to understand how animals respond to the more complex combinations of stressors that occur in nature, which requires a systematic and rigorous approach to quantify the various potential behavioral responses to the independent and interactive effects of stressors. We illustrate a robust, systematic approach for understanding behavioral responses to multiple stressors based on integrating schemes used to quantitatively classify interactions in multiple-stressor research and to qualitatively view interactions between multiple stimuli in behavioral experiments. We introduce and unify the two frameworks, highlighting their conceptual and methodological similarities, and use four case studies to demonstrate how this unification could improve our interpretation of interactions in behavioral experiments and guide efforts to manage the effects of multiple stressors. Our unified approach: (1) provides behavioral ecologists with a more rigorous and systematic way to quantify how animals respond to interactions between multiple stimuli, an important theoretical advance, (2) helps us better understand how animals behave when they encounter multiple, potentially interacting stressors, and (3) contributes more generally to the understanding of "ecological surprises" in multiple stressors research.
Godinez, William J; Rohr, Karl
2015-02-01
Tracking subcellular structures as well as viral structures displayed as 'particles' in fluorescence microscopy images yields quantitative information on the underlying dynamical processes. We have developed an approach for tracking multiple fluorescent particles based on probabilistic data association. The approach combines a localization scheme that uses a bottom-up strategy based on the spot-enhancing filter as well as a top-down strategy based on an ellipsoidal sampling scheme that uses the Gaussian probability distributions computed by a Kalman filter. The localization scheme yields multiple measurements that are incorporated into the Kalman filter via a combined innovation, where the association probabilities are interpreted as weights calculated using an image likelihood. To track objects in close proximity, we compute the support of each image position relative to the neighboring objects of a tracked object and use this support to recalculate the weights. To cope with multiple motion models, we integrated the interacting multiple model algorithm. The approach has been successfully applied to synthetic 2-D and 3-D images as well as to real 2-D and 3-D microscopy images, and the performance has been quantified. In addition, the approach was successfully applied to the 2-D and 3-D image data of the recent Particle Tracking Challenge at the IEEE International Symposium on Biomedical Imaging (ISBI) 2012.
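The combined-innovation update at the heart of probabilistic data association can be sketched as below. The Gaussian measurement likelihood here stands in for the paper's image likelihood, and the covariance update is simplified (the spread-of-innovations term is omitted); all names and values are illustrative.

```python
import numpy as np

def pda_update(x, P, measurements, H, R, beta0=0.05):
    """Probabilistic-data-association update with a combined innovation.

    measurements : (m, d) candidate detections near the predicted position.
    Association probabilities are proportional to each measurement's
    likelihood under the predicted Gaussian; beta0 reserves probability
    mass for the event that all detections are clutter.
    """
    S = H @ P @ H.T + R                       # innovation covariance
    Sinv = np.linalg.inv(S)
    nu = measurements - (H @ x)               # per-measurement innovations
    lik = np.exp(-0.5 * np.einsum('ij,jk,ik->i', nu, Sinv, nu))
    beta = (1 - beta0) * lik / lik.sum()      # association probabilities
    nu_comb = beta @ nu                       # combined innovation
    K = P @ H.T @ Sinv
    x_new = x + K @ nu_comb
    P_new = P - (1 - beta0) * K @ S @ K.T     # simplified covariance update
    return x_new, P_new

# toy usage: 2-D position state with two nearby detections
x, P = np.zeros(2), np.eye(2)
H, R = np.eye(2), 0.1 * np.eye(2)
z = np.array([[0.2, -0.1], [1.5, 1.4]])       # the second is likely clutter
print(pda_update(x, P, z, H, R)[0].round(3))
```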
Simultaneous Two-Way Clustering of Multiple Correspondence Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Dillon, William R.
2010-01-01
A 2-way clustering approach to multiple correspondence analysis is proposed to account for cluster-level heterogeneity of both respondents and variable categories in multivariate categorical data. Specifically, in the proposed method, multiple correspondence analysis is combined with k-means in a unified framework in which "k"-means is…
3D measurement using combined Gray code and dual-frequency phase-shifting approach
NASA Astrophysics Data System (ADS)
Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Liu, Xin
2018-04-01
The combined Gray code and phase-shifting approach is a commonly used 3D measurement technique. In this technique, an error equal to an integer multiple of the phase-shifted fringe period, i.e. a period jump error, often exists in the absolute analog code, which can lead to gross measurement errors. To overcome this problem, the present paper proposes 3D measurement using a combined Gray code and dual-frequency phase-shifting approach. Building on 3D measurement with the combined Gray code and phase-shifting approach, one set of low-frequency phase-shifted fringe patterns, with a period that is an odd-numbered multiple of the original phase-shifted fringe period, is added. Thus, the absolute analog code is measured by the combined Gray code and phase-shifting approach, while a low-frequency absolute analog code is also obtained from the added low-frequency phase-shifted fringe patterns. The former is then corrected by the latter, eliminating period jump errors and yielding reliable analog code unwrapping. For the proposed approach, we established the measurement model, analyzed the measurement principle, expounded the mechanism of eliminating period jump errors through error analysis, and determined the applicable conditions. Theoretical analysis and experimental results show that the proposed approach can effectively eliminate period jump errors, reliably perform analog code unwrapping, and improve measurement accuracy.
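The period-jump correction admits a compact sketch: the low-frequency pattern yields a coarse but jump-free absolute phase, from which the fringe order is re-derived, replacing the (possibly off-by-one) Gray-code order. The function name and the toy pixel are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def correct_period_jumps(phase_high, phase_low, ratio):
    """Correct period-jump errors in a Gray-code + phase-shifting measurement.

    phase_high : wrapped high-frequency phase in [0, 2*pi).
    phase_low  : wrapped phase of the added low-frequency pattern, whose
                 period is `ratio` (an odd integer) times longer.
    The Gray-code order can be off by +/-1 near period boundaries; the
    coarse absolute phase from the low-frequency pattern re-derives the
    order without such jumps.
    """
    coarse = phase_low * ratio                           # coarse absolute phase
    k = np.round((coarse - phase_high) / (2 * np.pi))    # corrected fringe order
    return phase_high + 2 * np.pi * k

# toy pixel: true fringe order 5, low-frequency period 7x longer
ph = 0.3
pl = (0.3 + 2 * np.pi * 5) / 7
print(correct_period_jumps(ph, pl, 7) / (2 * np.pi))     # ~5.05 periods, as expected
```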
Choi, Ted; Eskin, Eleazar
2013-01-01
Gene expression data, in conjunction with information on genetic variants, have enabled studies to identify expression quantitative trait loci (eQTLs) or polymorphic locations in the genome that are associated with expression levels. Moreover, recent technological developments and cost decreases have further enabled studies to collect expression data in multiple tissues. One advantage of multiple tissue datasets is that studies can combine results from different tissues to identify eQTLs more accurately than examining each tissue separately. The idea of aggregating results of multiple tissues is closely related to the idea of meta-analysis which aggregates results of multiple genome-wide association studies to improve the power to detect associations. In principle, meta-analysis methods can be used to combine results from multiple tissues. However, eQTLs may have effects in only a single tissue, in all tissues, or in a subset of tissues with possibly different effect sizes. This heterogeneity in terms of effects across multiple tissues presents a key challenge to detect eQTLs. In this paper, we develop a framework that leverages two popular meta-analysis methods that address effect size heterogeneity to detect eQTLs across multiple tissues. We show by using simulations and multiple tissue data from mouse that our approach detects many eQTLs undetected by traditional eQTL methods. Additionally, our method provides an interpretation framework that accurately predicts whether an eQTL has an effect in a particular tissue. PMID:23785294
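One of the two popular meta-analysis ingredients alluded to, the fixed-effects model, combines per-tissue effect estimates by inverse-variance weighting; a minimal sketch follows (the random-effects treatment of heterogeneity is not shown, and the per-tissue estimates are hypothetical).

```python
import numpy as np
from scipy import stats

def fixed_effects_meta(betas, ses):
    """Inverse-variance fixed-effects meta-analysis of per-tissue eQTL
    effects. Appropriate when the eQTL acts in all tissues with a shared
    effect size; heterogeneous effects call for a random-effects model."""
    w = 1.0 / np.square(ses)
    beta = np.sum(w * betas) / w.sum()
    se = np.sqrt(1.0 / w.sum())
    z = beta / se
    return beta, se, 2 * stats.norm.sf(abs(z))

# per-tissue estimates for one SNP-gene pair (three tissues)
print(fixed_effects_meta(np.array([0.30, 0.25, 0.05]),
                         np.array([0.10, 0.12, 0.15])))
```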
Global Search Capabilities of Indirect Methods for Impulsive Transfers
NASA Astrophysics Data System (ADS)
Shen, Hong-Xin; Casalino, Lorenzo; Luo, Ya-Zhong
2015-09-01
An optimization method that combines an indirect method with a homotopic approach is proposed and applied to impulsive trajectories. Minimum-fuel, multiple-impulse solutions, with either fixed or open time, are obtained. The homotopic approach at hand is relatively straightforward to implement and does not require an initial guess of the adjoints, unlike previous adjoint-estimation methods. A multiple-revolution Lambert solver is used to find multiple starting solutions for the homotopic procedure; this approach guarantees that multiple local solutions are obtained without relying on the user's intuition, thus efficiently exploring the solution space to find the global optimum. The indirect/homotopic approach proves to be quite effective and efficient in finding optimal solutions, and outperforms the joint use of evolutionary algorithms and deterministic methods in the test cases.
A hadoop-based method to predict potential effective drug combination.
Sun, Yifan; Xiong, Yi; Xu, Qian; Wei, Dongqing
2014-01-01
Combination drugs that impact multiple targets simultaneously are promising candidates for combating complex diseases due to their improved efficacy and reduced side effects. However, exhaustive screening of all possible drug combinations is extremely time-consuming and impractical. Here, we present a novel Hadoop-based approach to predict drug combinations by taking advantage of the MapReduce programming model, which improves the scalability of the prediction algorithm. By integrating the gene expression data of multiple drugs, we constructed the data preprocessing and the support vector machine and naïve Bayesian classifiers on Hadoop for prediction of drug combinations. The experimental results suggest that our Hadoop-based model achieves much higher efficiency in the big-data processing steps with satisfactory performance. We believe that our proposed approach can help accelerate the prediction of potentially effective drug combinations as the number of possible combinations grows exponentially. The source code and datasets are available upon request.
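The MapReduce decomposition can be illustrated in plain Python: a map step scores each candidate pair independently, and a reduce step aggregates the results. The signature-product score below is a hypothetical stand-in for the paper's SVM and naïve Bayesian classifiers, and a real deployment would run these functions as Hadoop jobs; all data are invented for illustration.

```python
from functools import reduce
from itertools import combinations

# hypothetical per-drug expression signatures (gene -> fold change)
signatures = {"drugA": {"g1": 1.2, "g2": -0.8, "g3": 0.1},
              "drugB": {"g1": -1.1, "g2": 0.9, "g3": 0.0},
              "drugC": {"g1": 1.0, "g2": -0.7, "g3": 0.2}}

def mapper(pair):
    """Map step: score one candidate pair independently, here by signature
    similarity -- a stand-in for the paper's trained classifiers."""
    a, b = (signatures[d] for d in pair)
    genes = a.keys() & b.keys()
    score = sum(a[g] * b[g] for g in genes)
    return (pair, score)

def reducer(best, item):
    """Reduce step: aggregate by keeping the highest-scoring combination."""
    return item if item[1] > best[1] else best

scored = map(mapper, combinations(sorted(signatures), 2))
print(reduce(reducer, scored, (None, float("-inf"))))
```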
Combining module based on coherent polarization beam combining.
Yang, Yan; Geng, Chao; Li, Feng; Li, Xinyang
2017-03-01
A multiaperture receiver with a phased array is an effective approach to overcome the effect of the random optical disturbance in coherent free-space laser communications, in which one of the key technologies is how to efficiently combine the multiple laser beams received by the phased array antenna. A combining module based on coherent polarization beam combining (CPBC), which can combine multiple laser beams to one laser beam with high combining efficiency and output a linearly polarized beam, is proposed in this paper. The principle of the combining module is introduced, the coherent polarization combining efficiency of CPBC is analyzed, and the performance of the combining module is evaluated. Moreover, the feasibility and the expansibility of the proposed combining module are validated in experiments of CPBC based on active phase-locking.
A Survey of Insider Attack Detection Research
2008-08-25
modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through… forms of attack that have been reported. For example: • Unauthorized extraction, duplication, or exfiltration… network level. Schultz pointed out that no single approach will work, but solutions need to be based on multiple sensors to be able to find any combination
Coherent combining pulse bursts in time domain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galvanauskas, Almantas
A beam combining and pulse stacking technique is provided that enhances laser pulse energy by coherently stacking pulse bursts (i.e., non-periodic pulsed signals) in the time domain. This energy enhancement is achieved by using various configurations of Fabry-Perot, Gires-Tournois, and other types of resonant cavities, so that a multiple-pulse burst incident at either a single input or multiple inputs of the system produces an output with a solitary pulse, which contains the summed energy of the incident multiple pulses from all beams. This disclosure provides a substantial improvement over conventional coherent-combining methods in that it achieves very high pulse energies using a relatively small number of combined laser systems, providing orders-of-magnitude reductions in system size, complexity, and cost compared to current combining approaches.
Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks
2016-01-01
Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
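A simplified sketch of the pooling-plus-Monte-Carlo-confidence-interval idea follows, using fixed-effects inverse-variance pooling in place of the paper's random-effects marginal likelihood; the trial coefficients are hypothetical.

```python
import numpy as np

def combined_mediated_effect(a, se_a, b, se_b, n_mc=100_000, seed=0):
    """Combine per-trial a and b paths (inverse-variance weights) and form
    a Monte Carlo confidence interval for the mediated effect a*b.
    A fixed-effects stand-in for the random-effects model of the paper."""
    a, se_a, b, se_b = map(np.asarray, (a, se_a, b, se_b))
    wa, wb = 1 / se_a**2, 1 / se_b**2
    a_hat, a_se = np.sum(wa * a) / wa.sum(), np.sqrt(1 / wa.sum())
    b_hat, b_se = np.sum(wb * b) / wb.sum(), np.sqrt(1 / wb.sum())
    # Monte Carlo CI: sample the product of the two (approximately normal) paths
    rng = np.random.default_rng(seed)
    ab = rng.normal(a_hat, a_se, n_mc) * rng.normal(b_hat, b_se, n_mc)
    lo, hi = np.percentile(ab, [2.5, 97.5])
    return a_hat * b_hat, (lo, hi)

# three hypothetical trials' path coefficients and standard errors
print(combined_mediated_effect(a=[0.40, 0.35, 0.50], se_a=[0.10, 0.12, 0.15],
                               b=[0.25, 0.30, 0.20], se_b=[0.08, 0.10, 0.09]))
```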
NASA Astrophysics Data System (ADS)
Xiong, Wenli; Wang, Pan; Hu, Jianmin; Jia, Yali; Wu, Lijie; Chen, Xiyang; Liu, Quanhong; Wang, Xiaobing
2015-12-01
Sonodynamic therapy (SDT) was developed as a promising noninvasive approach. The present study investigated the antitumor effect of a new sensitizer (sinoporphyrin sodium, referred to as DVDMS) combined with multiple ultrasound treatments on sarcoma 180 both in vitro and in vivo. The combined treatment significantly suppressed cell viability, potentiated apoptosis, and markedly inhibited angiogenesis in vivo. In vivo, the tumor weight inhibition ratio reached 89.82% fifteen days after three sonication treatments plus DVDMS. This effect was stronger than one ultrasound alone (32.56%) and than one round of sonication plus DVDMS (59.33%). DVDMS combined with multiple focused ultrasound treatments initiated tumor tissue destruction, induced cancer cell apoptosis, inhibited tumor angiogenesis, suppressed cancer cell proliferation, and decreased VEGF and PCNA expression levels. Moreover, the treatment did not show obvious signs of side effects or induce a drop in body weight. These results indicated that DVDMS combined with multiple focused ultrasounds may be a promising strategy against solid tumor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, S.; Barua, A.; Zhou, M., E-mail: min.zhou@me.gatech.edu
2014-05-07
Accounting for the combined effect of multiple sources of stochasticity in material attributes, we develop an approach that computationally predicts the probability of ignition of polymer-bonded explosives (PBXs) under impact loading. The probabilistic nature of the specific ignition processes is assumed to arise from two sources of stochasticity. The first source involves random variations in material microstructural morphology; the second source involves random fluctuations in grain-binder interfacial bonding strength. The effect of the first source of stochasticity is analyzed with multiple sets of statistically similar microstructures and constant interfacial bonding strength. Subsequently, each of the microstructures in the multiple sets is assigned multiple instantiations of randomly varying grain-binder interfacial strengths to analyze the effect of the second source of stochasticity. Critical hotspot size-temperature states reaching the threshold for ignition are calculated through finite element simulations that explicitly account for microstructure and for bulk and interfacial dissipation, quantifying the time to criticality (t_c) of individual samples and allowing the probability distribution of the time to criticality that results from each source of stochastic variation to be analyzed. Two probability superposition models are considered to combine the effects of the multiple sources of stochasticity. The first is a parallel and series combination model, and the second is a nested probability function model. Results show that the nested Weibull distribution provides an accurate description of the combined ignition probability. The approach developed here represents a general framework for analyzing the stochasticity in material behavior that arises out of multiple types of uncertainty associated with the structure, design, synthesis, and processing of materials.
Fused-data transrectal EIT for prostate cancer imaging.
Murphy, Ethan K; Wu, Xiaotian; Halter, Ryan J
2018-05-25
Prostate cancer is a significant problem affecting 1 in 7 men. Unfortunately, the diagnostic gold-standard of ultrasound-guided biopsy misses 10%-30% of all cancers. The objective of this study was to develop an electrical impedance tomography (EIT) approach that has the potential to image the entire prostate using multiple impedance measurements recorded between electrodes integrated onto an end-fired transrectal ultrasound (TRUS) device and a biopsy probe (BP). Simulations and sensitivity analyses were used to investigate the best combination of electrodes, and measured tank experiments were used to evaluate a fused-data transrectal EIT (fd-TREIT) and BP approach. Simulations and sensitivity analysis revealed that (1) TREIT measurements are not sufficiently sensitive to image the whole prostate, (2) the combination of TREIT + BP measurements increases the sensitive region of TREIT-only measurements by 12×, and (3) the fusion of multiple TREIT + BP measurements collected during a routine or customized 12-core biopsy procedure can cover up to 76.1% or 94.1% of a nominal 50 cm³ prostate, respectively. Three measured tank experiments of the fd-TREIT + BP approach successfully and accurately recovered the positions of 2-3 metal or plastic inclusions. The measured tank experiments represent important steps in the development of an algorithm that can combine EIT from multiple locations and from multiple probes—data that could be collected during a routine TRUS-guided 12-core biopsy. Overall, this result is a step towards a clinically deployable impedance imaging approach to scanning the entire prostate, which could significantly help to improve prostate cancer diagnosis.
Comparing multiple imputation methods for systematically missing subject-level data.
Kline, David; Andridge, Rebecca; Kaizar, Eloise
2017-06-01
When conducting research synthesis, the collection of studies that will be combined often do not measure the same set of variables, which creates missing data. When the studies to combine are longitudinal, missing data can occur on the observation-level (time-varying) or the subject-level (non-time-varying). Traditionally, the focus of missing data methods for longitudinal data has been on missing observation-level variables. In this paper, we focus on missing subject-level variables and compare two multiple imputation approaches: a joint modeling approach and a sequential conditional modeling approach. We find the joint modeling approach to be preferable to the sequential conditional approach, except when the covariance structure of the repeated outcome for each individual has homogenous variance and exchangeable correlation. Specifically, the regression coefficient estimates from an analysis incorporating imputed values based on the sequential conditional method are attenuated and less efficient than those from the joint method. Remarkably, the estimates from the sequential conditional method are often less efficient than a complete case analysis, which, in the context of research synthesis, implies that we lose efficiency by combining studies. Copyright © 2015 John Wiley & Sons, Ltd.
An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.
Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph
2010-06-01
We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and performing multiple tests with individual single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified that reach overall significance and that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
Combining information from multiple flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
Uniportal anatomic combined unusual segmentectomies.
González-Rivas, Diego; Lirio, Francisco; Sesma, Julio
2017-01-01
Nowadays, sublobar anatomic resections are gaining momentum as a valid alternative for early-stage lung cancer. Despite being technically demanding, anatomic segmentectomies can be performed by a uniportal video-assisted thoracic surgery (VATS) approach to combine the benefits of minimal invasiveness with maximum lung sparing. The procedure is even more complex when a combined resection of multiple segments from different lobes has to be done. Here we report five cases of combined and unusual segmentectomies performed by the same experienced surgeon in high-volume institutions, to show that uniportal VATS is a feasible approach for these complex resections and to share an excellent educational resource.
Combining multiple decisions: applications to bioinformatics
NASA Astrophysics Data System (ADS)
Yukinawa, N.; Takenouchi, T.; Oba, S.; Ishii, S.
2008-01-01
Multi-class classification is one of the fundamental tasks in bioinformatics and typically arises in cancer diagnosis studies by gene expression profiling. This article reviews two recent approaches to multi-class classification by combining multiple binary classifiers, which are formulated based on a unified framework of error-correcting output coding (ECOC). The first approach is to construct a multi-class classifier in which each binary classifier to be aggregated has a weight value to be optimally tuned based on the observed data. In the second approach, misclassification of each binary classifier is formulated as a bit inversion error with a probabilistic model by making an analogy to the context of information transmission theory. Experimental studies using various real-world datasets including cancer classification problems reveal that both of the new methods are superior or comparable to other multi-class classification methods.
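ECOC decoding reduces to nearest-code-word search over the binary classifiers' outputs; a minimal sketch with an assumed toy codebook (the code words and outputs are invented for illustration):

```python
import numpy as np

def ecoc_predict(binary_outputs, codebook):
    """Decode a multi-class label from binary-classifier outputs.

    codebook       : (n_classes, n_dichotomies) matrix of +/-1 code words.
    binary_outputs : (n_dichotomies,) signed outputs of the trained
                     binary classifiers.
    The predicted class has the code word nearest (in Hamming distance)
    to the output signs -- disagreeing classifiers are outvoted, which is
    the 'error-correcting' part of ECOC.
    """
    distances = np.sum(codebook != np.sign(binary_outputs), axis=1)
    return int(np.argmin(distances))

# toy codebook for 4 classes and 6 dichotomies (minimum Hamming distance 4)
codebook = np.array([[+1, +1, +1, -1, -1, -1],
                     [+1, -1, -1, +1, +1, -1],
                     [-1, +1, -1, +1, -1, +1],
                     [-1, -1, +1, -1, +1, +1]])
outputs = np.array([0.9, -0.8, -0.2, 0.7, 0.4, 0.6])  # last sign flipped vs class 1
print(ecoc_predict(outputs, codebook))                # still decodes to class 1
```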
Categorical Variables in Multiple Regression: Some Cautions.
ERIC Educational Resources Information Center
O'Grady, Kevin E.; Medoff, Deborah R.
1988-01-01
Limitations of dummy coding and nonsense coding as methods of coding categorical variables for use as predictors in multiple regression analysis are discussed. The combination of these approaches often yields estimates and tests of significance that are not intended by researchers for inclusion in their models. (SLD)
Tremblay-LeMay, Rosemarie; Rastgoo, Nasrin; Chang, Hong
2018-03-27
Even with recent advances in therapy regimens, multiple myeloma patients commonly develop drug resistance and relapse. The relevance of targeting the PD-1/PD-L1 axis has been demonstrated in preclinical models. Monotherapy with PD-1 inhibitors produced disappointing results, but combinations with other drugs used in the treatment of multiple myeloma seemed promising, and clinical trials are ongoing. However, there have recently been concerns about the safety of PD-1 and PD-L1 inhibitors combined with immunomodulators in the treatment of multiple myeloma, and several trials have been suspended. There is therefore a need for alternative drug combinations or different approaches to target this pathway. Protein expression of PD-L1 on cancer cells, including in multiple myeloma, has been associated with intrinsically aggressive features independent of immune evasion mechanisms, providing a rationale for new strategies that directly target PD-L1 protein expression. Drugs modulating the transcriptional and post-transcriptional regulation of PD-L1 could represent new therapeutic strategies for the treatment of multiple myeloma, help potentiate the action of other drugs, or be combined with PD-1/PD-L1 inhibitors in order to avoid the potentially problematic combination with immunomodulators. This review focuses on the pathophysiology of PD-L1 expression in multiple myeloma and on drugs that have been shown to modulate this expression.
Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei
2013-01-01
Given a single index, receiver operating characteristic (ROC) curve analysis is routinely utilized for characterizing performance in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, clinical ratings, and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
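The following toy sketch (invented data and error rates; not the authors' AD datasets) illustrates the logical-combination idea behind multiV-ROC by computing the sensitivity and specificity of an "at least n of k indices" rule.

```python
# Sensitivity/specificity of an "at least n of k indices positive" rule.
import numpy as np

def at_least_n_rule(index_calls: np.ndarray, n: int) -> np.ndarray:
    """index_calls: (subjects, k) boolean matrix of per-index positive calls."""
    return index_calls.sum(axis=1) >= n

rng = np.random.default_rng(0)
truth = rng.random(200) < 0.4                       # True = diseased
calls = np.column_stack([                           # three noisy binary indices
    truth ^ (rng.random(200) < err) for err in (0.15, 0.25, 0.30)
])

pred = at_least_n_rule(calls, n=2)                  # "at least 2 of 3" rule
sens = (pred & truth).sum() / truth.sum()
spec = (~pred & ~truth).sum() / (~truth).sum()
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```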
ERIC Educational Resources Information Center
McMillan, D. E.; Wessinger, William D.; Li, Mi
2009-01-01
Drugs with multiple actions can have complex discriminative-stimulus properties. An approach to studying such drugs is to train subjects to discriminate among drug combinations and individual drugs in the combination so that all of the complex discriminative stimuli are present during training. In the current experiments, a four-choice procedure…
Struchen, R; Vial, F; Andersson, M G
2017-04-26
Delayed reporting of health data may hamper the early detection of infectious diseases in surveillance systems. Furthermore, combining multiple data streams, e.g. aiming at improving a system's sensitivity, can be challenging. In this study, we used a Bayesian framework where the result is presented as the value of evidence, i.e. the likelihood ratio for the evidence under outbreak versus baseline conditions. Based on a historical data set of routinely collected cattle mortality events, we evaluated outbreak detection performance (sensitivity, time to detection, in-control run length) under the Bayesian approach among three scenarios: presence of delayed data reporting, but not accounting for it; presence of delayed data reporting accounted for; and absence of delayed data reporting (i.e. an ideal system). Performance on larger and smaller outbreaks was compared with a classical approach, considering syndromes separately or combined. We found that the Bayesian approach performed better than the classical approach, especially for the smaller outbreaks. Furthermore, the Bayesian approach performed similarly well in the scenario where delayed reporting was accounted for to the scenario where it was absent. We argue that the value of evidence framework may be suitable for surveillance systems with multiple syndromes and delayed reporting of data.
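A minimal sketch of the value-of-evidence idea, assuming Poisson-distributed daily counts and independent streams (an assumption made here for illustration; the paper's full Bayesian framework also accounts for reporting delays, which is omitted):

```python
# Value of evidence: likelihood ratio of observed counts under outbreak vs.
# baseline conditions, multiplied across independent syndromic streams.
from scipy.stats import poisson

def value_of_evidence(counts, baseline_rates, outbreak_rates):
    v = 1.0
    for y, lam0, lam1 in zip(counts, baseline_rates, outbreak_rates):
        v *= poisson.pmf(y, lam1) / poisson.pmf(y, lam0)
    return v

# Two hypothetical streams, e.g. on-farm deaths and fallen-stock collections.
print(value_of_evidence(counts=[9, 14],
                        baseline_rates=[5, 10],
                        outbreak_rates=[8, 16]))
```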
Multiple transport systems mediate virus-induced acquired resistance to oxidative stress
USDA-ARS?s Scientific Manuscript database
In this paper, we report the phenomenon of acquired cross-tolerance to oxidative (UV-C and H2O2) stress in Nicotiana benthamiana plants infected with Potato virus X (PVX) and investigate the functional expression of transport systems in mediating this phenomenon. By combining multiple approaches, we...
Advanced Multiple Processor Configuration Study. Final Report.
ERIC Educational Resources Information Center
Clymer, S. J.
This summary of a study on multiple processor configurations includes the objectives, background, approach, and results of research undertaken to provide the Air Force with a generalized model of computer processor combinations for use in the evaluation of proposed flight training simulator computational designs. An analysis of a real-time flight…
Due to the complexity of the processes contributing to beach bacteria concentrations, many researchers rely on statistical modeling, among which multiple linear regression (MLR) modeling is most widely used. Despite its ease of use and interpretation, there may be time dependence...
Sexton, Ken
2012-01-01
Systematic evaluation of cumulative health risks from the combined effects of multiple environmental stressors is becoming a vital component of risk-based decisions aimed at protecting human populations and communities. This article briefly examines the historical development of cumulative risk assessment as an analytical tool and discusses current approaches for evaluating cumulative health effects from exposure to both chemical mixtures and combinations of chemical and nonchemical stressors. A comparison of stressor-based and effects-based assessment methods is presented, and the potential value of focusing on viable risk management options to limit the scope of cumulative evaluations is discussed. The ultimate goal of cumulative risk assessment is to provide answers to decision-relevant questions based on organized scientific analysis, even if the answers, at least for the time being, are inexact and uncertain. PMID:22470298
Density estimation in tiger populations: combining information for strong inference
Gopalaswamy, Arjun M.; Royle, J. Andrew; Delampady, Mohan; Nichols, James D.; Karanth, K. Ullas; Macdonald, David W.
2012-01-01
A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture–recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km2 [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km2 and fecal DNA, 6.65 ± 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
Mavrotas, George; Ziomas, Ioannis C; Diakouaki, Danae
2006-07-01
This article presents a methodological approach for the formulation of control strategies capable of reducing atmospheric pollution at the standards set by European legislation. The approach was implemented in the greater area of Thessaloniki and was part of a project aiming at the compliance with air quality standards in five major cities in Greece. The methodological approach comprises two stages: in the first stage, the availability of several measures contributing to a certain extent to reducing atmospheric pollution indicates a combinatorial problem and favors the use of Integer Programming. More specifically, Multiple Objective Integer Programming is used in order to generate alternative efficient combinations of the available policy measures on the basis of two conflicting objectives: public expenditure minimization and social acceptance maximization. In the second stage, these combinations of control measures (i.e., the control strategies) are then comparatively evaluated with respect to a wider set of criteria, using tools from Multiple Criteria Decision Analysis, namely, the well-known PROMETHEE method. The whole procedure is based on the active involvement of local and central authorities in order to incorporate their concerns and preferences, as well as to secure the adoption and implementation of the resulting solution.
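The first stage can be caricatured in a few lines: with a handful of hypothetical measures, the efficient cost/acceptance combinations meeting a reduction target can be found by brute-force enumeration (a real application would use a Multiple Objective Integer Programming solver rather than enumeration; all numbers below are invented).

```python
# Toy first stage: enumerate combinations of control measures and keep the
# efficient (non-dominated) strategies for cost vs. social acceptance.
from itertools import combinations

# (name, emission reduction, cost, acceptance score) -- hypothetical values
MEASURES = [
    ("bus_fleet_renewal", 12, 30, 7),
    ("industrial_filters", 20, 50, 5),
    ("traffic_restrictions", 15, 10, 2),
    ("fuel_standards", 10, 20, 8),
]
TARGET_REDUCTION = 25

feasible = []
for r in range(1, len(MEASURES) + 1):
    for combo in combinations(MEASURES, r):
        if sum(m[1] for m in combo) >= TARGET_REDUCTION:
            cost = sum(m[2] for m in combo)
            accept = sum(m[3] for m in combo)
            feasible.append((cost, accept, [m[0] for m in combo]))

# Keep non-dominated strategies: none is both cheaper and more accepted.
efficient = [s for s in feasible
             if not any(o[0] <= s[0] and o[1] >= s[1] and o != s
                        for o in feasible)]
for cost, accept, names in sorted(efficient):
    print(cost, accept, names)
```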
An Institutional Perspective on Accountable Care Organizations.
Goodrick, Elizabeth; Reay, Trish
2016-12-01
We employ aspects of institutional theory to explore how Accountable Care Organizations (ACOs) can effectively manage the multiplicity of ideas and pressures within which they are embedded and consequently better serve patients and their communities. More specifically, we draw on the concept of institutional logics to highlight the importance of understanding the conflicting principles upon which ACOs were founded. Based on previous research conducted both inside and outside health care settings, we argue that ACOs can combine attention to these principles (or institutional logics) in different ways; the options fall on a continuum from (a) segregating the effects of multiple logics from each other by compartmentalizing responses to multiple logics to (b) fully hybridizing the different logics. We suggest that the most productive path for ACOs is to situate their approach between the two extremes of "segregating" and "fully hybridizing." This strategic approach allows ACOs to develop effective responses that combine logics without fully integrating them. We identify three ways that ACOs can embrace institutional complexity short of fully hybridizing disparate logics: (1) reinterpreting practices to make them compatible with other logics; (2) engaging in strategies that take advantage of existing synergy between conflicting logics; (3) creating opportunities for people at the front line to develop innovative ways of working that combine multiple logics. © The Author(s) 2016.
Mingo, Janire; Erramuzpe, Asier; Luna, Sandra; Aurtenetxe, Olaia; Amo, Laura; Diez, Ibai; Schepens, Jan T. G.; Hendriks, Wiljan J. A. J.; Cortés, Jesús M.; Pulido, Rafael
2016-01-01
Site-directed mutagenesis (SDM) is a powerful tool to create defined collections of protein variants for experimental and clinical purposes, but effectiveness is compromised when a large number of mutations is required. We present here a one-tube-only standardized SDM approach that generates comprehensive collections of amino acid substitution variants, including scanning- and single site-multiple mutations. The approach combines unified mutagenic primer design with the mixing of multiple distinct primer pairs and/or plasmid templates to increase the yield of a single inverse-PCR mutagenesis reaction. Also, a user-friendly program for automatic design of standardized primers for Ala-scanning mutagenesis is made available. Experimental results were compared with a modeling approach together with stochastic simulation data. For single site-multiple mutagenesis purposes and for simultaneous mutagenesis in different plasmid backgrounds, combination of primer sets and/or plasmid templates in a single reaction tube yielded the distinct mutations in a stochastic fashion. For scanning mutagenesis, we found that a combination of overlapping primer sets in a single PCR reaction allowed the yield of different individual mutations, although this yield did not necessarily follow a stochastic trend. Double mutants were generated when the overlap of primer pairs was below 60%. Our results illustrate that one-tube-only SDM effectively reduces the number of reactions required in large-scale mutagenesis strategies, facilitating the generation of comprehensive collections of protein variants suitable for functional analysis. PMID:27548698
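A toy simulation of the mixing idea, under the simplifying assumption that all primer pairs amplify with equal efficiency so that each picked clone is an approximately uniform draw (the variant names are hypothetical):

```python
# Stochastic sketch of one-tube mutagenesis with multiple primer pairs:
# which variant each picked clone carries is modeled as a uniform draw.
import random
from collections import Counter

random.seed(1)
primer_pairs = ["D92A", "C124S", "R130G", "G129E"]  # hypothetical variants
clones_picked = 24

picks = Counter(random.choices(primer_pairs, k=clones_picked))
print(picks)  # expected yield: each variant recovered several times
```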
Selected reaction monitoring mass spectrometry: a methodology overview.
Ebhardt, H Alexander
2014-01-01
Moving past the discovery phase of proteomics, the term targeted proteomics combines multiple approaches investigating a certain set of proteins in more detail. One such targeted proteomics approach is the combination of liquid chromatography and selected or multiple reaction monitoring mass spectrometry (SRM, MRM). SRM-MS requires prior knowledge of the fragmentation pattern of peptides, as the presence of the analyte in a sample is determined by measuring the m/z values of predefined precursor and fragment ions. Using scheduled SRM-MS, many analytes can be robustly monitored, allowing for high-throughput sample analysis of the same set of proteins over many conditions. In this chapter, the fundamentals of SRM-MS are explained, as well as an optimized SRM pipeline from assay generation to data analysis.
Protein fold recognition using geometric kernel data fusion.
Zakeri, Pooya; Jeuris, Ben; Vandebril, Raf; Moreau, Yves
2014-07-01
Various approaches based on features extracted from protein sequences and often machine learning methods have been used in the prediction of protein folds. Finding an efficient technique for integrating these different protein features has received increasing attention. In particular, kernel methods are an interesting class of techniques for integrating heterogeneous data. Various methods have been proposed to fuse multiple kernels. Most techniques for multiple kernel learning focus on learning a convex linear combination of base kernels. In addition to the limitation of linear combinations, working with such approaches could cause a loss of potentially useful information. We design several techniques to combine kernel matrices by taking more involved, geometry-inspired means of these matrices instead of convex linear combinations. We consider various sequence-based protein features, including information extracted directly from position-specific scoring matrices and local sequence alignment. We evaluate our methods for classification on the SCOP PDB-40D benchmark dataset for protein fold recognition. The best overall accuracy on the protein fold recognition test set obtained by our methods is ∼86.7%. This is an improvement over the results of the best existing approach. Moreover, our computational model has been developed by incorporating the functional domain composition of proteins through a hybridization model. It is observed that by using our proposed hybridization model, the protein fold recognition accuracy is further improved to 89.30%. Furthermore, we investigate the performance of our approach on the protein remote homology detection problem by fusing multiple string kernels. The MATLAB code used for our proposed geometric kernel fusion frameworks is publicly available at http://people.cs.kuleuven.be/∼raf.vandebril/homepage/software/geomean.php?menu=5/. © The Author 2014. Published by Oxford University Press.
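One geometry-inspired matrix mean in the spirit of the paper is sketched below: the log-Euclidean mean of symmetric positive-definite kernel matrices (an illustration of the concept, not the authors' exact fusion pipeline; the tiny Gram matrices are invented).

```python
# Log-Euclidean mean of SPD kernel matrices: average in the matrix-log domain.
import numpy as np
from scipy.linalg import expm, logm

def log_euclidean_mean(kernels):
    """Mean of SPD matrices taken in the matrix-logarithm domain."""
    logs = [np.real(logm(k)) for k in kernels]  # logm of SPD is real
    return expm(sum(logs) / len(logs))

rng = np.random.default_rng(0)
a, b = rng.random((3, 5)), rng.random((3, 5))
k1 = a @ a.T + np.eye(3)   # two tiny SPD "kernels" (Gram matrices + ridge)
k2 = b @ b.T + np.eye(3)
print(log_euclidean_mean([k1, k2]).round(2))
```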
Hughes, James Alexander; Houghten, Sheridan; Ashlock, Daniel
2016-12-01
DNA fragment assembly - an NP-hard problem - is one of the major steps in DNA sequencing. Multiple strategies have been used for this problem, including greedy graph-based algorithms, deBruijn graphs, and the overlap-layout-consensus approach. This study focuses on the overlap-layout-consensus approach. Heuristics and computational intelligence methods are combined to exploit their respective benefits. These algorithm combinations were able to produce high-quality results, surpassing the best results obtained by a number of competitive algorithms specially designed and tuned for this problem on thirteen of sixteen popular benchmarks. This work also reinforces the necessity of using multiple search strategies, as it is clearly observed that algorithm performance is dependent on the problem instance; without a deeper look into many searches, top solutions could be missed entirely. Copyright © 2016. Published by Elsevier Ireland Ltd.
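For orientation, a bare-bones greedy overlap-layout-consensus sketch is given below; it merges the fragment pair with the longest suffix/prefix overlap until one contig remains, without any of the heuristics or computational-intelligence searches the study layers on top.

```python
# Minimal greedy OLC assembly sketch (illustrative only).
def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(frags):
    frags = list(frags)
    while len(frags) > 1:
        # Find the fragment pair with the longest overlap and merge it.
        n, i, j = max((overlap(a, b), i, j)
                      for i, a in enumerate(frags)
                      for j, b in enumerate(frags) if i != j)
        merged = frags[i] + frags[j][n:]
        frags = [f for k, f in enumerate(frags) if k not in (i, j)] + [merged]
    return frags[0]

print(greedy_assemble(["ATTAGACCTG", "CCTGCCGGAA", "AGACCTGCCG"]))
# -> ATTAGACCTGCCGGAA
```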
Bouwhuis, Stef; Geuskens, Goedele A; Boot, Cécile R L; Bongers, Paulien M; van der Beek, Allard J
2017-08-01
To construct prediction models for transitions to combination multiple job holding (MJH) (multiple jobs as an employee) and hybrid MJH (being an employee and self-employed), among employees aged 45-64. A total of 5187 employees in the Netherlands completed online questionnaires annually between 2010 and 2013. We applied logistic regression analyses with a backward elimination strategy to construct prediction models. Transitions to combination MJH and hybrid MJH were best predicted by a combination of factors including: demographics, health and mastery, work characteristics, work history, skills and knowledge, social factors, and financial factors. Not having a permanent contract and a poor household financial situation predicted both transitions. Some predictors only predicted combination MJH, e.g., working part-time, or hybrid MJH, e.g., work-home interference. A wide variety of factors predict combination MJH and/or hybrid MJH. The prediction model approach allowed for the identification of predictors that have not been previously studied. © 2017 Wiley Periodicals, Inc.
Repeater Analysis for Combining Information from Different Assessments
ERIC Educational Resources Information Center
Haberman, Shelby; Yao, Lili
2015-01-01
Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…
ERIC Educational Resources Information Center
Daniels, Lia M.; Haynes, Tara L.; Stupnisky, Robert H.; Perry, Raymond P.; Newall, Nancy E.; Pekrun, Reinhard
2008-01-01
Within achievement goal theory debate remains regarding the adaptiveness of certain combinations of goals. Assuming a multiple-goals perspective, we used cluster analysis to classify 1002 undergraduate students according to their mastery and performance-approach goals. Four clusters emerged, representing different goal combinations: high…
Multi-pinhole SPECT Imaging with Silicon Strip Detectors
Peterson, Todd E.; Shokouhi, Sepideh; Furenlid, Lars R.; Wilson, Donald W.
2010-01-01
Silicon double-sided strip detectors offer outstanding intrinsic spatial resolution with reasonable detection efficiency for iodine-125 emissions. This spatial resolution allows for multiple-pinhole imaging at low magnification, minimizing the problem of multiplexing. We have conducted imaging studies using a prototype system that utilizes a detector of 300-micrometer thickness and 50-micrometer strip pitch together with a 23-pinhole collimator. These studies include an investigation of the synthetic-collimator imaging approach, which combines multiple-pinhole projections acquired at multiple magnifications to obtain tomographic reconstructions from limited-angle data using the ML-EM algorithm. Sub-millimeter spatial resolution was obtained, demonstrating the basic validity of this approach. PMID:20953300
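The ML-EM update underlying such reconstructions is compact enough to sketch; the system matrix below is random filler purely to make the snippet self-contained.

```python
# Bare-bones ML-EM iteration for emission tomography:
# x <- x * A^T(y / (A x)) / (A^T 1)
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((40, 16))            # system matrix: detector bins x voxels
x_true = rng.random(16)
y = rng.poisson(A @ x_true * 50)    # noisy projection data

x = np.ones(16)                     # uniform initial estimate
sens = A.sum(axis=0)                # sensitivity image A^T 1
for _ in range(50):
    x *= (A.T @ (y / (A @ x))) / sens

print(np.round(x / x.sum(), 3))     # normalized reconstruction
```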
Jang, Jinsil; Jeong, Soo-Jin; Kwon, Hee-Young; Jung, Ji Hoon; Sohn, Eun Jung; Lee, Hyo-Jung; Kim, Ji-Hyun; Kim, Sun-Hee; Kim, Jin Hyoung; Kim, Sung-Hoon
2013-01-01
Background. Combination cancer therapy is one of the attractive approaches to overcome drug resistance of cancer cells. In the present study, we investigated the synergistic effect of decursin from Angelica gigas and doxorubicin on the induction of apoptosis in three human multiple myeloma cell lines. Methodology/Principal Findings. Combined treatment with decursin and doxorubicin exerted significantly greater cytotoxicity than doxorubicin or decursin alone in U266, RPMI8226, and MM.1S cells. Furthermore, the combination treatment enhanced the activation of caspase-9 and -3, the cleavage of PARP, and the sub-G1 population compared to either drug alone in the three multiple myeloma cell lines. In addition, the combined treatment downregulated the phosphorylation of mTOR and its downstream S6K1 and activated the phosphorylation of ERK in the three cell lines. Furthermore, the combined treatment reduced mitochondrial membrane potential, suppressed the phosphorylation of JAK2, STAT3, and Src, activated SHP-2, and attenuated the expression of cyclin D1 and survivin in U266 cells. Conversely, the tyrosine phosphatase inhibitor pervanadate reversed STAT3 inactivation as well as the PARP cleavage and caspase-3 activation induced by combined treatment with doxorubicin and decursin in U266 cells. Conclusions/Significance. Overall, the combination treatment of decursin and doxorubicin can enhance apoptotic activity via the mTOR and/or STAT3 signaling pathways in multiple myeloma cells.
Combining results of multiple search engines in proteomics.
Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W
2013-09-01
A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.
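As a deliberately naive illustration of result combination (real tools use probabilistic models of peptide-spectrum-match correctness), the sketch below pools hypothetical per-spectrum identifications from two engines, reports the best-scoring one, and flags whether the engines agree.

```python
# Toy per-spectrum combination of search-engine results.
from collections import defaultdict

# engine -> list of (spectrum_id, peptide, score); all values hypothetical.
results = {
    "engineA": [("s1", "PEPTIDEK", 0.91), ("s2", "LLSMVNK", 0.40)],
    "engineB": [("s1", "PEPTIDEK", 0.85), ("s2", "LLSVMNK", 0.77)],
}

combined = defaultdict(list)
for engine, psms in results.items():
    for spectrum, peptide, score in psms:
        combined[spectrum].append((score, peptide, engine))

for spectrum, candidates in sorted(combined.items()):
    score, peptide, engine = max(candidates)          # best-scoring PSM
    agree = len({p for _, p, _ in candidates}) == 1   # engine consensus?
    print(spectrum, peptide, f"best={engine}", f"engines_agree={agree}")
```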
ERIC Educational Resources Information Center
Djang, Philipp A.
1993-01-01
Describes a Multiple Criteria Decision Analysis Approach for the selection of personal computers that combines the capabilities of Analytic Hierarchy Process and Integer Goal Programing. An example of how decision makers can use this approach to determine what kind of personal computers and how many of each type to purchase is given. (nine…
Finegan, Donal P; Scheel, Mario; Robinson, James B; Tjaden, Bernhard; Di Michiel, Marco; Hinds, Gareth; Brett, Dan J L; Shearing, Paul R
2016-11-16
Catastrophic failure of lithium-ion batteries occurs across multiple length scales and over very short time periods. A combination of high-speed operando tomography, thermal imaging and electrochemical measurements is used to probe the degradation mechanisms leading up to overcharge-induced thermal runaway of a LiCoO2 pouch cell, through its interrelated dynamic structural, thermal and electrical responses. Failure mechanisms across multiple length scales are explored using a post-mortem multi-scale tomography approach, revealing significant morphological and phase changes in the LiCoO2 electrode microstructure and location-dependent degradation. This combined operando and multi-scale X-ray computed tomography (CT) technique is demonstrated as a comprehensive approach to understanding battery degradation and failure.
Mixture-based gatekeeping procedures in adaptive clinical trials.
Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji
2018-01-01
Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
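A minimal sketch of a standard combination function at a decision point, the inverse-normal rule with pre-specified weights (the p-values and equal weights here are invented):

```python
# Inverse-normal combination of stage-wise p-values in a two-stage design.
from math import sqrt
from scipy.stats import norm

def inverse_normal_combination(p1, p2, w1=sqrt(0.5), w2=sqrt(0.5)):
    z = w1 * norm.isf(p1) + w2 * norm.isf(p2)  # weights satisfy w1^2 + w2^2 = 1
    return norm.sf(z)                          # combined p-value

print(inverse_normal_combination(0.04, 0.03))  # ~0.005
```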
A. Weiskittel; D. Maguire; R. Monserud
2007-01-01
Hybrid models offer the opportunity to improve future growth projections by combining advantages of both empirical and process-based modeling approaches. Hybrid models have been constructed in several regions and their performance relative to a purely empirical approach has varied. A hybrid model was constructed for intensively managed Douglas-fir plantations in the...
Perthold, Jan Walther; Oostenbrink, Chris
2018-05-17
Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.
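The core EDS construction can be written down in a few lines: a reference energy E_R = -(1/(s*beta)) ln sum_i exp(-s*beta*(E_i - E_i_offset)) built from the end-state energies, a smoothing parameter s, and per-state offsets. The numbers below are illustrative; the paper's contribution, how s and the offsets are constructed and optimized, is not reproduced.

```python
# EDS-style reference-state energy from end-state energies (illustrative).
import numpy as np
from scipy.special import logsumexp

def eds_reference_energy(energies, offsets, s=0.2, beta=1.0 / 2.494):
    # beta in mol/kJ (~300 K); s < 1 smooths barriers between end-states.
    e = np.asarray(energies) - np.asarray(offsets)
    return -logsumexp(-s * beta * e) / (s * beta)

print(eds_reference_energy(energies=[12.0, 35.0], offsets=[0.0, 20.0]))
```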
USDA-ARS?s Scientific Manuscript database
A continuous monitoring of daily evapotranspiration (ET) at field scale can be achieved by combining thermal infrared remote sensing data information from multiple satellite platforms. Here, an integrated approach to field scale ET mapping is described, combining multi-scale surface energy balance e...
Wang, Shuangquan; Sun, Huiyong; Liu, Hui; Li, Dan; Li, Youyong; Hou, Tingjun
2016-08-01
Blockade of human ether-à-go-go related gene (hERG) channel by compounds may lead to drug-induced QT prolongation, arrhythmia, and Torsades de Pointes (TdP), and therefore reliable prediction of hERG liability in the early stages of drug design is quite important to reduce the risk of cardiotoxicity-related attritions in the later development stages. In this study, pharmacophore modeling and machine learning approaches were combined to construct classification models to distinguish hERG active from inactive compounds based on a diverse data set. First, an optimal ensemble of pharmacophore hypotheses that had good capability to differentiate hERG active from inactive compounds was identified by the recursive partitioning (RP) approach. Then, the naive Bayesian classification (NBC) and support vector machine (SVM) approaches were employed to construct classification models by integrating multiple important pharmacophore hypotheses. The integrated classification models showed improved predictive capability over any single pharmacophore hypothesis, suggesting that the broad binding polyspecificity of hERG can only be well characterized by multiple pharmacophores. The best SVM model achieved the prediction accuracies of 84.7% for the training set and 82.1% for the external test set. Notably, the accuracies for the hERG blockers and nonblockers in the test set reached 83.6% and 78.2%, respectively. Analysis of significant pharmacophores helps to understand the multimechanisms of action of hERG blockers. We believe that the combination of pharmacophore modeling and SVM is a powerful strategy to develop reliable theoretical models for the prediction of potential hERG liability.
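A sketch of the second stage alone, assuming the pharmacophore-hypothesis match fingerprints have already been computed (random features and labels stand in for them here; the SVM settings are generic, not the paper's tuned model):

```python
# SVM classification on pharmacophore-match features (stand-in data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((300, 12))                    # 12 hypothesis-match features
y = (X[:, :4].sum(axis=1) > 2).astype(int)   # stand-in blocker/non-blocker label

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(model, X, y, cv=5).mean())
```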
Qiao, Xue; Lin, Xiong-hao; Ji, Shuai; Zhang, Zheng-xiang; Bo, Tao; Guo, De-an; Ye, Min
2016-01-05
To fully understand the chemical diversity of an herbal medicine is challenging. In this work, we describe a new approach to globally profile and discover novel compounds from an herbal extract using multiple neutral loss/precursor ion scanning combined with substructure recognition and statistical analysis. Turmeric (the rhizomes of Curcuma longa L.) was used as an example. This approach consists of three steps: (i) multiple neutral loss/precursor ion scanning to obtain substructure information; (ii) targeted identification of new compounds by extracted ion current and substructure recognition; and (iii) untargeted identification using total ion current and multivariate statistical analysis to discover novel structures. Using this approach, 846 terpecurcumins (terpene-conjugated curcuminoids) were discovered from turmeric, including a number of potentially novel compounds. Furthermore, two unprecedented compounds (terpecurcumins X and Y) were purified, and their structures were identified by NMR spectroscopy. This study extended the application of mass spectrometry to global profiling of natural products in herbal medicines and could help chemists to rapidly discover novel compounds from a complex matrix.
Quantitative consensus of supervised learners for diffuse lung parenchymal HRCT patterns
NASA Astrophysics Data System (ADS)
Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.
2013-03-01
Automated lung parenchymal classification usually relies on supervised learning of expert-chosen regions representative of the visually differentiable HRCT patterns specific to different pathologies (e.g., emphysema, ground glass, honeycombing, reticular, and normal). Considering the elusiveness of a single most discriminating similarity measure, a plurality of weak learners can be combined to improve the machine learnability. Though a number of quantitative combination strategies exist, their efficacy is data and domain dependent. In this paper, we investigate multiple (N=12) quantitative consensus approaches to combine the clusters obtained with multiple (n=33) probability density-based similarity measures. Our study shows that hypergraph-based meta-clustering and probabilistic clustering provide optimal expert-metric agreement.
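One common evidence-accumulation flavor of consensus clustering can be sketched briefly: build a co-association matrix from several base clusterings, then cluster that matrix (the hypergraph-based and probabilistic variants evaluated in the paper are not reproduced; the data and base learners below are invented).

```python
# Co-association consensus over several base clusterings.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 4)), rng.normal(2, 0.3, (20, 4))])

# Several base clusterings (different seeds stand in for different metrics).
labelings = [KMeans(n_clusters=2, n_init=5, random_state=s).fit_predict(X)
             for s in range(12)]

# coassoc[i, j] = fraction of base clusterings placing i and j together.
coassoc = np.mean([np.equal.outer(l, l) for l in labelings], axis=0)
condensed = 1 - coassoc[np.triu_indices(len(X), k=1)]  # distances
consensus = fcluster(linkage(condensed, method="average"),
                     t=2, criterion="maxclust")
print(consensus)
```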
van Breen, Jolien A; Spears, Russell; Kuppens, Toon; de Lemus, Soledad
2017-01-01
Across four studies, we examine multiple identities in the context of gender and propose that women's attitudes toward gender group membership are governed by two largely orthogonal dimensions of gender identity: identification with women and identification with feminists. We argue that identification with women reflects attitudes toward the content society gives to group membership: what does it mean to be a woman in terms of group characteristics, interests and values? Identification with feminists, on the other hand, is a politicized identity dimension reflecting attitudes toward the social position of the group: what does it mean to be a woman in terms of disadvantage, inequality, and relative status? We examine the utility of this multiple identity approach in four studies. Study 1 showed that identification with women reflects attitudes toward group characteristics, such as femininity and self-stereotyping, while identification with feminists reflects attitudes toward the group's social position, such as perceived sexism. The two dimensions are shown to be largely independent, and as such provide support for the multiple identity approach. In Studies 2-4, we examine the utility of this multiple identity approach in predicting qualitative differences in gender attitudes. Results show that specific combinations of identification with women and feminists predicted attitudes toward collective action and gender stereotypes. Higher identification with feminists led to endorsement of radical collective action (Study 2) and critical attitudes toward gender stereotypes (Studies 3-4), especially at lower levels of identification with women. The different combinations of high vs. low identification with women and feminists can be thought of as reflecting four theoretical identity "types." A woman can be (1) strongly identified with neither women nor feminists ("low identifier"), (2) strongly identified with women but less so with feminists ("traditional identifier"), (3) strongly identified with both women and feminists ("dual identifier"), or (4) strongly identified with feminists but less so with women ("distinctive feminist"). In sum, by considering identification with women and identification with feminists as multiple identities we aim to show how the multiple identity approach predicts distinct attitudes to gender issues and offer a new perspective on gender identity.
A Facile and General Approach to Recoverable High-Strain Multishape Shape Memory Polymers.
Li, Xingjian; Pan, Yi; Zheng, Zhaohui; Ding, Xiaobin
2018-03-01
Fabricating a single polymer network that achieves an ideal combination of tunable high-strain multiple-shape memory effects and highly recoverable shape memory properties, without the need to design complex structures, is a great challenge for the real-world application of advanced shape memory devices. Here, a facile and general approach to recoverable high-strain multishape shape memory polymers is presented via random copolymerization of acrylate monomers and a chain-extended multiblock copolymer crosslinker. As-prepared shape memory networks show a large width at the half-peak height of the glass transition, far wider than current classical multishape shape memory polymers. A combination of a tunable high-strain multishape memory effect and as much as 1000% recoverable strain in a single chemically crosslinked network can be obtained. To the best of our knowledge, this is the first thermosetting material combining highly recoverable strain with tunable high-strain multiple-shape memory effects. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Multiple trauma in children: critical care overview.
Wetzel, Randall C; Burns, R Cartland
2002-11-01
Multiple trauma is more than the sum of the injuries. Management not only of the physiologic injury but also of the pathophysiologic responses, along with integration of the child's emotional and developmental needs and the child's family, forms the basis of trauma care. Multiple trauma in children also elicits profound psychological responses from the healthcare providers involved with these children. This overview will address the pathophysiology of multiple trauma in children and the general principles of trauma management by an integrated trauma team. Trauma is a systemic disease. Multiple trauma stimulates the release of multiple inflammatory mediators. A lethal triad of hypothermia, acidosis, and coagulopathy is the direct result of trauma and secondary injury from the systemic response to trauma. Controlling and responding to the secondary pathophysiologic sequelae of trauma is the cornerstone of trauma management in the multiply injured, critically ill child. Damage control surgery is a new, rational approach to the child with multiple trauma. The selection of children for damage control surgery depends on the severity of injury. Major abdominal vascular injuries and multiple visceral injuries are best considered for this approach. The effective management of childhood multiple trauma requires a combined team approach, consideration of the child and family, an organized trauma system, and an effective quality assurance and improvement mechanism.
NASA Astrophysics Data System (ADS)
Michon, Frédéric; Aarts, Arno; Holzhammer, Tobias; Ruther, Patrick; Borghs, Gustaaf; McNaughton, Bruce; Kloosterman, Fabian
2016-08-01
Objective. Understanding how neuronal assemblies underlie cognitive function is a fundamental question in system neuroscience. It poses the technical challenge to monitor the activity of populations of neurons, potentially widely separated, in relation to behaviour. In this paper, we present a new system which aims at simultaneously recording from a large population of neurons from multiple separated brain regions in freely behaving animals. Approach. The concept of the new device is to combine the benefits of two existing electrophysiological techniques, i.e. the flexibility and modularity of micro-drive arrays and the high sampling ability of electrode-dense silicon probes. Main results. Newly engineered long bendable silicon probes were integrated into a micro-drive array. The resulting device can carry up to 16 independently movable silicon probes, each carrying 16 recording sites. Populations of neurons were recorded simultaneously in multiple cortical and/or hippocampal sites in two freely behaving implanted rats. Significance. Current approaches to monitor neuronal activity either allow to flexibly record from multiple widely separated brain regions (micro-drive arrays) but with a limited sampling density or to provide denser sampling at the expense of a flexible placement in multiple brain regions (neural probes). By combining these two approaches and their benefits, we present an alternative solution for flexible and simultaneous recordings from widely distributed populations of neurons in freely behaving rats.
Multiple-Symbol Noncoherent Decoding of Uncoded and Convolutionally Coded Continuous Phase Modulation
NASA Technical Reports Server (NTRS)
Divsalar, D.; Raphaeli, D.
2000-01-01
Recently, a method for combined noncoherent detection and decoding of trellis-codes (noncoherent coded modulation) has been proposed, which can practically approach the performance of coherent detection.
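The essence of multiple-symbol noncoherent detection is the metric |sum_k r_k * conj(s_k)| maximized over candidate symbol sequences in an observation window, which requires no carrier-phase estimate; a toy BPSK illustration follows (signal model and noise level invented, simpler than the CPM setting of the paper).

```python
# Multiple-symbol noncoherent detection over a 5-symbol window (toy BPSK).
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
sent = np.array([1, -1, -1, 1, 1])
phase = np.exp(1j * rng.uniform(0, 2 * np.pi))  # unknown carrier phase
r = sent * phase + 0.3 * (rng.normal(size=5) + 1j * rng.normal(size=5))

# Pick the candidate sequence maximizing |sum_k r_k conj(s_k)|.
best = max(product([1, -1], repeat=5),
           key=lambda s: abs(np.vdot(np.array(s), r)))  # vdot conjugates s
print(best)  # recovers `sent` up to a global sign ambiguity
```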
Procedural and Conceptual Changes in Young Children's Problem Solving
ERIC Educational Resources Information Center
Voutsina, Chronoula
2012-01-01
This study analysed the different types of arithmetic knowledge that young children utilise when solving a multiple-step addition task. The focus of the research was on the procedural and conceptual changes that occur as children develop their overall problem solving approach. Combining qualitative case study with a micro-genetic approach,…
Hsin, Kun-Yi; Ghosh, Samik; Kitano, Hiroaki
2013-01-01
Increased availability of bioinformatics resources is creating opportunities for the application of network pharmacology to predict drug effects and toxicity resulting from multi-target interactions. Here we present a high-precision computational prediction approach that combines two elaborately built machine learning systems and multiple molecular docking tools to assess binding potentials of a test compound against proteins involved in a complex molecular network. One of the two machine learning systems is a re-scoring function to evaluate binding modes generated by docking tools. The second is a binding mode selection function to identify the most predictive binding mode. Results from a series of benchmark validations and a case study show that this approach surpasses the prediction reliability of other techniques and that it also identifies either primary or off-targets of kinase inhibitors. Integrating this approach with molecular network maps makes it possible to address drug safety issues by comprehensively investigating network-dependent effects of a drug or drug candidate. PMID:24391846
Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds
Deeks, J.J.; Martin, E.C.; Riley, R.D.
2017-01-01
Introduction For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta‐analysis at each threshold. A standard meta‐analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between‐study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta‐analysis of test accuracy studies. PMID:29052347
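For concreteness, the pooling step of the imputation methods uses Rubin's rules, sketched below with invented per-imputation estimates and variances:

```python
# Rubin's rules: pool a parameter estimate (e.g., logit-sensitivity at one
# threshold) across m imputed datasets.
import numpy as np

def rubins_rules(estimates, variances):
    m = len(estimates)
    q_bar = np.mean(estimates)        # pooled estimate
    u_bar = np.mean(variances)        # within-imputation variance
    b = np.var(estimates, ddof=1)     # between-imputation variance
    t = u_bar + (1 + 1 / m) * b       # total variance
    return q_bar, np.sqrt(t)

est, se = rubins_rules([1.52, 1.41, 1.60, 1.47, 1.55],
                       [0.04, 0.05, 0.04, 0.06, 0.05])
print(f"pooled={est:.3f} se={se:.3f}")
```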
The multiple decrement life table: a unifying framework for cause-of-death analysis in ecology.
Carey, James R
1989-01-01
The multiple decrement life table is used widely in the human actuarial literature and provides statistical expressions for mortality in three different forms: i) the life table from all causes-of-death combined; ii) the life table disaggregated into selected cause-of-death categories; and iii) the life table with particular causes and combinations of causes eliminated. The purpose of this paper is to introduce the multiple decrement life table to the ecological literature by applying the methods to published death-by-cause information on Rhagoletis pomonella. Interrelations between the current approach and conventional tools used in basic and applied ecology are discussed including the conventional life table, Key Factor Analysis and Abbott's Correction used in toxicological bioassay.
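The disaggregation and elimination forms rest on simple bookkeeping; the sketch below, with an invented cohort, computes all-cause survivorship and the textbook cause-elimination approximation p_x^(-i) = p_x^((d_x - d_x,i) / d_x).

```python
# Toy multiple-decrement life table: all-cause survivorship and survivorship
# with one cause (here "predation") eliminated; cohort numbers hypothetical.
import numpy as np

lx0 = 1000.0
deaths = np.array([[60, 25],     # rows: age classes
                   [90, 40],     # cols: deaths by predation, by disease
                   [150, 70]])

lx, lx_nopred = [lx0], [lx0]
for d in deaths:
    px = 1 - d.sum() / lx[-1]             # all-cause survival probability
    frac = (d.sum() - d[0]) / d.sum()     # share of deaths not from predation
    lx.append(lx[-1] * px)
    lx_nopred.append(lx_nopred[-1] * px ** frac)  # cause-elimination approx.

print(np.round(lx, 1), np.round(lx_nopred, 1))
```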
[HIV prevention program for young people--the WYSH Project as a model of "combination prevention"].
Ono-Kihara, Masako
2010-03-01
In the face of a still-growing HIV pandemic, unsuccessful efforts to develop biomedical control measures, and the failure of the cognitive-behavioral approach to show sustained effectiveness at the social level, behavioral strategy is now expected to evolve into structural prevention ("combination prevention") that involves multiple behavioral goals and multilevel approaches. The WYSH Project is a combination prevention project for youth developed through a socio-epidemiological approach that integrates epidemiology with social sciences such as social marketing and mixed methods. The WYSH Project includes mass education programs for youth in schools and programs for out-of-school youth through cyber networks and peer communication. Started in 2002, it expanded nationwide with support from related ministries and parent-teacher associations and has grown into the single largest youth prevention project in Japan.
Galinski, Sabrina; Wichert, Sven P; Rossner, Moritz J; Wehr, Michael C
2018-05-25
G protein-coupled receptors (GPCRs) are the largest class of cell surface receptors and are implicated in the physiological regulation of many biological processes. The high diversity of GPCRs and their physiological functions make them primary targets for therapeutic drugs. For the generation of novel compounds, however, selectivity towards a given target is a critical issue in drug development, as structural similarities exist between members of GPCR subfamilies. Therefore, the activities of multiple GPCRs, both closely and distantly related, need to be tested simultaneously to assess compound selectivity. Here, we present a cell-based multiplexed GPCR activity assay, termed GPCRprofiler, which uses a β-arrestin recruitment strategy and combines split TEV protein-protein interaction and EXT-based barcode technologies. This approach enables simultaneous measurements of receptor activities of multiple GPCR-ligand combinations by applying massively parallelized reporter assays. In proof-of-principle experiments covering 19 different GPCRs, both the specificity of endogenous agonists and the polypharmacological effects of two known antipsychotics on GPCR activities were demonstrated. Technically, normalization of barcode reporters across individual assays allows quantitative pharmacological assays in a parallelized manner. In summary, the GPCRprofiler technique constitutes a flexible and scalable approach, which enables simultaneous profiling of compound actions on multiple receptor activities in living cells.
A formal concept analysis approach to consensus clustering of multi-experiment expression data
2014-01-01
Background Presently, with the increasing number and complexity of available gene expression datasets, the combination of data from multiple microarray studies addressing a similar biological question is gaining importance. The analysis and integration of multiple datasets are expected to yield more reliable and robust results since they are based on a larger number of samples and the effects of the individual study-specific biases are diminished. This is supported by recent studies suggesting that important biological signals are often preserved or enhanced by multiple experiments. An approach to combining data from different experiments is the aggregation of their clusterings into a consensus or representative clustering solution which increases the confidence in the common features of all the datasets and reveals the important differences among them. Results We propose a novel generic consensus clustering technique that applies Formal Concept Analysis (FCA) approach for the consolidation and analysis of clustering solutions derived from several microarray datasets. These datasets are initially divided into groups of related experiments with respect to a predefined criterion. Subsequently, a consensus clustering algorithm is applied to each group resulting in a clustering solution per group. These solutions are pooled together and further analysed by employing FCA which allows extracting valuable insights from the data and generating a gene partition over all the experiments. In order to validate the FCA-enhanced approach two consensus clustering algorithms are adapted to incorporate the FCA analysis. Their performance is evaluated on gene expression data from multi-experiment study examining the global cell-cycle control of fission yeast. The FCA results derived from both methods demonstrate that, although both algorithms optimize different clustering characteristics, FCA is able to overcome and diminish these differences and preserve some relevant biological signals. Conclusions The proposed FCA-enhanced consensus clustering technique is a general approach to the combination of clustering algorithms with FCA for deriving clustering solutions from multiple gene expression matrices. The experimental results presented herein demonstrate that it is a robust data integration technique able to produce good quality clustering solution that is representative for the whole set of expression matrices. PMID:24885407
Azmi, Asfar S.; Wang, Zhiwei; Philip, Philip A.; Mohammad, Ramzi M.; Sarkar, Fazlul H.
2010-01-01
Cancer therapies that target key molecules have not fulfilled expected promises for most common malignancies. Major challenges include the incomplete understanding and validation of these targets in patients, the multiplicity and complexity of genetic and epigenetic changes in the majority of cancers, and the redundancies and cross-talk found in key signaling pathways. Collectively, single-pathway targeted approaches are not effective therapies for human malignancies. To overcome these barriers, it is important to understand the molecular cross-talk among key signaling pathways and how they may be altered by targeted agents. This requires innovative approaches, such as understanding the global physiological environment of target proteins and the effects of modifying them without losing key molecular details. Such strategies will aid the design of novel therapeutics and their combinations against multifaceted diseases, where efficacious combination therapies will focus on altering multiple pathways rather than single proteins. Integrated network modeling and systems biology have emerged as powerful tools benefiting our understanding of drug mechanisms of action in real time. This mini-review highlights the significance of the network and systems biology-based strategy and presents a “proof-of-concept” recently validated in our laboratory using the example of a combination treatment of oxaliplatin and the MDM2 inhibitor MI-219 in genetically complex and incurable pancreatic adenocarcinoma. PMID:21041384
Fitzpatrick, Paul F.
2014-01-01
Oxidation of alcohols and amines is catalyzed by multiple families of flavin- and pyridine nucleotide-dependent enzymes. Measurement of solvent isotope effects provides a unique mechanistic probe of the timing of the cleavage of the OH and NH bonds, information necessary for a complete description of the catalytic mechanism. The inherent ambiguities in the interpretation of solvent isotope effects can be significantly decreased if isotope effects arising from isotopically labeled substrates are measured in combination with solvent isotope effects. The application of combined solvent and substrate (mainly deuterium) isotope effects to multiple enzymes is described here to illustrate the range of mechanistic insights that such an approach can provide. PMID:25448013
Bioinformatics approaches to predict target genes from transcription factor binding data.
Essebier, Alexandra; Lamprecht, Marnie; Piper, Michael; Bodén, Mikael
2017-12-01
Transcription factors regulate gene expression and play an essential role in development by maintaining proliferative states, driving cellular differentiation and determining cell fate. Transcription factors are capable of regulating multiple genes over potentially long distances making target gene identification challenging. Currently available experimental approaches to detect distal interactions have multiple weaknesses that have motivated the development of computational approaches. Although an improvement over experimental approaches, existing computational approaches are still limited in their application, with different weaknesses depending on the approach. Here, we review computational approaches with a focus on data dependency, cell type specificity and usability. With the aim of identifying transcription factor target genes, we apply available approaches to typical transcription factor experimental datasets. We show that approaches are not always capable of annotating all transcription factor binding sites; binding sites should be treated disparately; and a combination of approaches can increase the biological relevance of the set of genes identified as targets. Copyright © 2017 Elsevier Inc. All rights reserved.
Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian
2016-03-04
Proteomics data integration has become a broad field, with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, unified approaches have emerged only recently; workflows that, for example, aim to optimize search parameters or employ cascaded-style searches can only be made accessible if data analysis becomes not only unified but also, most importantly, scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed in the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.
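A minimal sketch of what a scripted Ursgal workflow might look like; the engine identifiers, parameter keys, and file names are placeholders (Ursgal's real engine names are versioned strings), so they should be checked against the Ursgal documentation rather than taken as the package's exact API.

```python
# Hypothetical Ursgal workflow: search with several engines, validate,
# then combine the post-processed results. All names are illustrative.
import ursgal

uc = ursgal.UController(
    params={"database": "target_decoy.fasta"},  # assumed parameter key
    profile="LTQ XL low res",                   # assumed instrument profile
)

validated_results = []
for engine in ["xtandem", "omssa", "msgfplus"]:     # placeholder engine names
    raw_result = uc.search(input_file="sample.mzML", engine=engine)
    validated = uc.validate(input_file=raw_result, engine="percolator")
    validated_results.append(validated)

# Merge statistically post-processed outputs from all engines
combined = uc.combine_search_results(
    input_files=validated_results,
    engine="combine_pep",                           # placeholder combiner name
)
```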
Integrated presentation of ecological risk from multiple stressors
NASA Astrophysics Data System (ADS)
Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman
2016-10-01
Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.
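As a rough illustration of a prevalence-based risk summary, the sketch below samples environmental scenarios and a chemical exposure, applies an invented toy effect model, and reports the prevalence of an adverse outcome; every distribution and coefficient is an assumption for the example, not part of the framework itself.

```python
# Toy probabilistic ERA: prevalence of population decline across scenarios.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
temperature = rng.normal(15, 4, n)        # degrees C, regional variability
food = rng.uniform(0.3, 1.0, n)           # relative food availability
exposure = rng.lognormal(-1.0, 0.8, n)    # chemical concentration (ug/L)

# Invented endpoint: growth rate reduced by chemical stress and modulated
# by ecological factors, i.e. a combination of stressors.
growth = 1.0 + 0.02 * (temperature - 15) + 0.5 * food - 0.6 * exposure

prevalence = np.mean(growth < 1.0)        # fraction of adverse scenarios
print(f"Prevalence of population decline: {prevalence:.1%}")
```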
The Need for Integrated Approaches in Metabolic Engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lechner, Anna; Brunk, Elizabeth; Keasling, Jay D.
This review highlights state-of-the-art procedures for heterologous small-molecule biosynthesis, the associated bottlenecks, and new strategies that have the potential to accelerate future accomplishments in metabolic engineering. We emphasize that a combination of different approaches over multiple time and size scales must be considered for successful pathway engineering in a heterologous host. We have classified these optimization procedures based on the "system" that is being manipulated: transcriptome, translatome, proteome, or reactome. By bridging multiple disciplines, including molecular biology, biochemistry, biophysics, and computational sciences, we can create an integral framework for the discovery and implementation of novel biosynthetic production routes.
Combining formal and functional approaches to topic structure.
Zellers, Margaret; Post, Brechtje
2012-03-01
Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to adopt the insights of both PTI's qualitative analysis and EP's quantitative analysis and combine them into a multiple-methods approach. One realm in which it is possible to combine these frameworks is in the analysis of discourse topic structure and the prosodic cues relevant to it. By combining a quantitative and a qualitative approach to discourse topic structure, it is possible to give a better account of the observed variation in prosody, for example in the case of fundamental frequency (F0) peak timing, which can be explained in terms of pitch accent distribution over different topic structure categories. Similarly, local and global patterns in speech rate variation can be better explained and motivated by adopting insights from both PTI and EP in the study of topic structure. Combining PTI and EP can provide better accounts of speech data as well as opening up new avenues of investigation which would not have been possible in either approach alone.
Low thrust chemical orbit to orbit propulsion system propellant management study
NASA Technical Reports Server (NTRS)
Dergance, R. H.; Hamlyn, K. M.; Tegart, J. R.
1981-01-01
Low thrust chemical propulsion systems were sized for transfer of large space systems from LEO to GEO. The influence of propellant combination, tankage and insulation requirements, and propellant management techniques on the LTPS mass and volume were studied. Liquid oxygen combined with hydrogen, methane, or kerosene were the propellant combinations. Thrust levels of 445, 2230, and 4450 N were combined with 1, 4 and 8 perigee burn strategies. This matrix of systems was evaluated using multilayer insulation and spray-on-foam insulation systems. Various combinations of toroidal, cylindrical with ellipsoidal domes, and ellipsoidal tank shapes were investigated. Results indicate that the low thrust (445 N) and single perigee burn approaches are considerably less efficient than the higher thrust levels and multiple burn strategies. A modified propellant settling approach minimized propellant residuals and decreased system complexity; in addition, the toroid/ellipsoidal tank combination was predicted to be the shortest.
Agopian, A J; Evans, Jane A; Lupo, Philip J
2018-01-15
It is estimated that 20 to 30% of infants with birth defects have two or more birth defects. Among these infants with multiple congenital anomalies (MCA), co-occurring anomalies may represent either chance (i.e., unrelated etiologies) or pathogenically associated patterns of anomalies. While some MCA patterns have been recognized and described (e.g., known syndromes), others have not been identified or characterized. Elucidating these patterns may result in a better understanding of the etiologies of these MCAs. This article reviews the literature with regard to analytic methods that have been used to evaluate patterns of MCAs, in particular those using birth defect registry data. A popular method for MCA assessment involves a comparison of the observed to expected ratio for a given combination of MCAs, or one of several modified versions of this comparison. Other methods include the use of numerical taxonomy or other clustering techniques, multiple regression analysis, and log-linear analysis. Advantages and disadvantages of these approaches, as well as specific applications, are outlined. Despite the availability of multiple analytic approaches, relatively few MCA combinations have been assessed. The availability of large birth defects registries and computing resources that allow for automated, big data strategies for prioritizing MCA patterns may provide new avenues for better understanding co-occurrence of birth defects. Thus, the selection of an analytic approach may depend on several considerations. Birth Defects Research 110:5-11, 2018. © 2017 Wiley Periodicals, Inc.
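To illustrate the observed-to-expected comparison mentioned above, this sketch computes an O/E ratio for one hypothetical pair of anomalies under an independence null; the counts are invented, and a real registry analysis would add confidence intervals and multiple-testing control.

```python
# Observed-to-expected ratio for co-occurrence of two birth defects.
# All counts are hypothetical illustration values.
n_infants = 100_000
n_defect_a = 400          # infants with defect A
n_defect_b = 250          # infants with defect B
n_both_observed = 12      # infants with both A and B

# Expected count if A and B occur independently
expected_both = n_defect_a * n_defect_b / n_infants   # = 1.0 here
oe_ratio = n_both_observed / expected_both
print(f"O/E = {oe_ratio:.1f}")  # values well above 1 suggest a pattern
```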
Moghadam, Samira; Erfanmanesh, Maryam; Esmaeilzadeh, Abdolreza
2017-11-01
Multiple Sclerosis, an autoimmune demyelinating disease of the Central Nervous System, is a chronic inflammatory condition that mostly affects young adults. Affected individuals face functional loss and severe pain. Most current MS treatments focus on suppressing the immune response. Approved drugs suppress the inflammatory process, but in fact there is no definite cure for Multiple Sclerosis. Recent work has demonstrated that gene and cell therapy are promising approaches in tissue regeneration. The authors propose a novel combined immune gene therapy for Multiple Sclerosis treatment that exploits the anti-inflammatory and remyelinating properties of Interleukin-35 and Hepatocyte Growth Factor, respectively. In this hypothesis, Interleukin-35 and Hepatocyte Growth Factor are introduced into Mesenchymal Stem Cells of the EAE mouse model via an adenovirus-based vector. It is expected that the Interleukin-35 and Hepatocyte Growth Factor genes expressed from MSCs could perform effectively in immunotherapy of Multiple Sclerosis. Copyright © 2017. Published by Elsevier Ltd.
The Role of Multimodel Combination in Improving Streamflow Prediction
NASA Astrophysics Data System (ADS)
Arumugam, S.; Li, W.
2008-12-01
Model errors are an inevitable part of any prediction exercise. One approach currently gaining attention is to reduce model errors by optimally combining multiple models to develop improved predictions. The rationale behind this approach lies primarily in the premise that optimal weights can be derived for each model so that the resulting multimodel predictions have improved predictability. In this study, we present a new approach to combining multiple hydrological models by evaluating their predictability contingent on the predictor state. We combine two hydrological models, the 'abcd' model and the Variable Infiltration Capacity (VIC) model, with each model's parameters estimated by two different objective functions, to develop multimodel streamflow predictions. The performance of the multimodel predictions is compared with individual model predictions using correlation, root mean square error, and the Nash-Sutcliffe coefficient. To quantify precisely under what conditions the multimodel predictions result in improved predictions, we evaluate the proposed algorithm by testing it against streamflow generated from a known model (the 'abcd' model or the VIC model) with errors being homoscedastic or heteroscedastic. Results from the study show that streamflow simulated from individual models performed better than multimodels under almost no model error. Under increased model error, the multimodel consistently performed better than the single-model prediction in terms of all performance measures. The study also evaluates the proposed algorithm for streamflow predictions in two humid river basins in North Carolina as well as in two arid basins in Arizona. Through detailed validation at these four sites, the study shows that the multimodel approach better predicts the observed streamflow in comparison to the single-model predictions.
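A small sketch of one of the skill measures named above, the Nash-Sutcliffe efficiency, applied to a placeholder equal-weight two-model blend; the data and the 50/50 weighting are invented for illustration (the study derives weights contingent on the predictor state).

```python
# Nash-Sutcliffe efficiency for single-model vs. blended predictions.
import numpy as np

def nash_sutcliffe(obs, sim):
    """1.0 is a perfect fit; 0.0 is no better than the mean of the observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([3.1, 4.2, 2.8, 5.0, 3.7])          # invented streamflows
model_a = np.array([2.9, 4.6, 2.5, 4.4, 3.9])
model_b = np.array([3.3, 4.0, 3.0, 5.2, 3.5])
blend = 0.5 * model_a + 0.5 * model_b               # naive equal weights

print(nash_sutcliffe(obs, model_a), nash_sutcliffe(obs, blend))
```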
Feature-based fusion of medical imaging data.
Calhoun, Vince D; Adali, Tülay
2009-09-01
The acquisition of multiple brain imaging types for a given study is a very common practice. There have been a number of approaches proposed for combining or fusing multitask or multimodal information. These can be roughly divided into those that attempt to study convergence of multimodal imaging, for example, how function and structure are related in the same region of the brain, and those that attempt to study the complementary nature of modalities, for example, utilizing temporal EEG information and spatial functional magnetic resonance imaging information. Within each of these categories, one can attempt data integration (the use of one imaging modality to improve the results of another) or true data fusion (in which multiple modalities are utilized to inform one another). We review both approaches and present a recent computational approach that first preprocesses the data to compute features of interest. The features are then analyzed in a multivariate manner using independent component analysis. We describe the approach in detail and provide examples of how it has been used for different fusion tasks. We also propose a method for selecting which combination of modalities provides the greatest value in discriminating groups. Finally, we summarize and describe future research topics.
Hu, Jinsong; Van Valckenborgh, Els; Xu, Dehui; Menu, Eline; De Raeve, Hendrik; De Bruyne, Elke; Xu, Song; Van Camp, Ben; Handisides, Damian; Hart, Charles P; Vanderkerken, Karin
2013-09-01
Recently, we showed that hypoxia is a critical microenvironmental factor in multiple myeloma, and that the hypoxia-activated prodrug TH-302 selectively targets hypoxic multiple myeloma cells and improves multiple disease parameters in vivo. To explore approaches for sensitizing multiple myeloma cells to TH-302, we evaluated in this study the antitumor effect of TH-302 in combination with the clinically used proteasome inhibitor bortezomib. First, we show that TH-302 and bortezomib synergistically induce apoptosis in multiple myeloma cell lines in vitro. Second, we confirm that this synergism is related to the activation of caspase cascades and is mediated by changes of Bcl-2 family proteins. The combination treatment induces enhanced cleavage of caspase-3/8/9 and PARP, and therefore triggers apoptosis and enhances the cleavage of proapoptotic BH3-only protein BAD and BID as well as the antiapoptotic protein Mcl-1. In particular, TH-302 can abrogate the accumulation of antiapoptotic Mcl-1 induced by bortezomib, and decreases the expression of the prosurvival proteins Bcl-2 and Bcl-xL. Furthermore, we found that the induction of the proapoptotic BH3-only proteins PUMA (p53-upregulated modulator of apoptosis) and NOXA is associated with this synergism. In response to the genotoxic and endoplasmic reticulum stresses by TH-302 and bortezomib, the expression of PUMA and NOXA were upregulated in p53-dependent and -independent manners. Finally, in the murine 5T33MMvv model, we showed that the combination of TH-302 and bortezomib can improve multiple disease parameters and significantly prolong the survival of diseased mice. In conclusion, our studies provide a rationale for clinical evaluation of the combination of TH-302 and bortezomib in patients with multiple myeloma.
Recent Advances in Delivery of Drug-Nucleic Acid Combinations for Cancer Treatment
Li, Jing; Wang, Yan; Zhu, Yu; Oupický, David
2013-01-01
Cancer treatment that uses a combination of approaches with the ability to affect multiple disease pathways has been proven highly effective in the treatment of many cancers. Combination therapy can include multiple chemotherapeutics or combinations of chemotherapeutics with other treatment modalities like surgery or radiation. However, despite the widespread clinical use of combination therapies, relatively little attention has been given to the potential of modern nanocarrier delivery methods, like liposomes, micelles, and nanoparticles, to enhance the efficacy of combination treatments. This lack of knowledge is particularly notable in the limited success of vectors for the delivery of combinations of nucleic acids with traditional small molecule drugs. The delivery of drug-nucleic acid combinations is particularly challenging due to differences in the physicochemical properties of the two types of agents. This review discusses recent advances in the development of delivery methods using combinations of small molecule drugs and nucleic acid therapeutics to treat cancer. This review primarily focuses on the rationale used for selecting appropriate drug-nucleic acid combinations as well as progress in the development of nanocarriers suitable for simultaneous delivery of drug-nucleic acid combinations. PMID:23624358
ICPL: Intelligent Cooperative Planning and Learning for Multi-agent Systems
2012-02-29
objective was to develop a new planning approach for teams of multiple UAVs that tightly integrates learning and cooperative control algorithms at multiple levels of the planning architecture. The research results enabled a team of mobile agents to learn to adapt and react to uncertainty in... expressive representation that incorporates feature conjunctions. Our algorithm is simple to implement, fast to execute, and can be combined with any...
NASA Astrophysics Data System (ADS)
Yoshida, Kenichiro; Nishidate, Izumi; Ojima, Nobutoshi; Iwata, Kayoko
2014-01-01
To quantitatively evaluate skin chromophores over a wide region of curved skin surface, we propose an approach that suppresses the effect of shading-derived error in the reflectance on the estimation of chromophore concentrations, without sacrificing the accuracy of that estimation. In our method, we use multiple regression analysis, with the absorbance spectrum as the response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as the predictor variables. The concentrations of melanin and total hemoglobin are determined from the multiple regression coefficients using compensation formulae (CF) based on diffuse reflectance spectra derived from a Monte Carlo simulation. To suppress the shading-derived error, we investigated three different combinations of multiple regression coefficients for the CF. In vivo measurements of the forearm skin demonstrated that the proposed approach can reduce estimation errors that are due to shading-derived errors in the reflectance. With the best combination of multiple regression coefficients, we estimated the ratio of the error to the chromophore concentrations to be about 10%. The proposed method does not require any measurements or assumptions about the shape of the subjects; this is an advantage over other studies related to the reduction of shading-derived errors.
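A simplified sketch of the regression step described above: a synthetic absorbance spectrum is regressed on assumed extinction-coefficient spectra, with an intercept absorbing a constant shading-like offset. The spectra, concentrations, and offset are invented, and the paper's Monte Carlo-based compensation formulae are not reproduced here.

```python
# Multiple regression of absorbance on chromophore extinction spectra.
import numpy as np

wavelengths = np.linspace(500, 600, 50)
eps = np.column_stack([                                  # assumed spectra
    np.exp(-(wavelengths - 500) / 80.0),                 # melanin-like decay
    np.exp(-((wavelengths - 542) / 12.0) ** 2),          # HbO2-like peak
    np.exp(-((wavelengths - 556) / 12.0) ** 2),          # Hb-like peak
])
true_conc = np.array([0.8, 0.5, 0.3])
absorbance = eps @ true_conc + 0.1                       # +0.1 mimics shading

# A constant column lets the intercept absorb the shading offset instead of
# biasing the chromophore coefficients.
X = np.column_stack([eps, np.ones_like(wavelengths)])
coef, *_ = np.linalg.lstsq(X, absorbance, rcond=None)
print(coef[:3])   # recovers approximately [0.8, 0.5, 0.3]
```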
Multiple kernel learning in protein-protein interaction extraction from biomedical literature.
Yang, Zhihao; Tang, Nan; Zhang, Xiao; Lin, Hongfei; Li, Yanpeng; Yang, Zhiwei
2011-03-01
Knowledge about protein-protein interactions (PPIs) unveils the molecular mechanisms of biological processes. The volume and content of published biomedical literature on protein interactions is expanding rapidly, making it increasingly difficult for interaction database administrators, who are responsible for content input and maintenance, to detect and manually update protein interaction information. The objective of this work is to develop an effective approach to automatic extraction of PPI information from biomedical literature. We present a weighted multiple kernel learning-based approach for automatic PPI extraction from biomedical literature. The approach combines the following kernels: feature-based, tree, graph, and part-of-speech (POS) path. In particular, we extend the shortest path-enclosed tree (SPT) and dependency path tree to capture richer contextual information. Our experimental results show that the combination of the SPT and dependency path tree extensions contributes to a performance improvement of almost 0.7 percentage units in F-score and 2 percentage units in area under the receiver operating characteristics curve (AUC). Combining two or more appropriately weighted individual kernels further improves the performance. In both individual-corpus and cross-corpus evaluations, our combined kernel achieves state-of-the-art performance with respect to comparable evaluations, with 64.41% F-score and 88.46% AUC on the AImed corpus. As different kernels calculate the similarity between two sentences from different aspects, our combined kernel can reduce the risk of missing important features. More specifically, we use a weighted linear combination of individual kernels instead of assigning the same weight to each individual kernel, thus allowing each kernel to contribute incrementally to the performance improvement. In addition, the SPT and dependency path tree extensions can improve the performance by including richer context information. Copyright © 2010 Elsevier B.V. All rights reserved.
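The weighted linear combination of kernels can be sketched as below: two toy kernels are mixed with fixed weights and passed to an SVM with a precomputed Gram matrix. The kernels, weights, and synthetic data are placeholders; the paper learns the weights and builds its kernels from parse structures such as SPT and dependency path trees.

```python
# Weighted combination of precomputed kernels for SVM classification.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))                 # stand-in feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # synthetic labels

def linear_kernel(A, B):
    return A @ B.T

def rbf_kernel(A, B, gamma=0.5):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

weights = [0.7, 0.3]                         # placeholder; MKL would learn these
K = weights[0] * linear_kernel(X, X) + weights[1] * rbf_kernel(X, X)

clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))                       # training accuracy on the toy data
```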
Taravat, Alireza; Oppelt, Natascha
2014-01-01
Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse coupled neural networks. The presented approach removes the need to manually set the non-adaptive WMM filter parameters by developing an adaptive WMM model, which is a step towards fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images which contained dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective in cases where the non-adaptive WMM & pulse coupled neural network (PCNN) model generates poor accuracies. PMID:25474376
Kurita, Takashi; Sueda, Keiichi; Tsubakimoto, Koji; Miyanaga, Noriaki
2010-07-05
We experimentally demonstrated coherent beam combining using optical parametric amplification with a nonlinear crystal pumped by a random-phased multiple-beam array of the second harmonic of a Nd:YAG laser at a 10-Hz repetition rate. In the proof-of-principle experiment, the phase jump between two pump beams was precisely controlled by a motorized actuator. For the demonstration of multiple-beam combining, a random phase plate was used to create random-phased beamlets as a pump pulse. Far-field patterns of the pump, the signal, and the idler indicated that spatially coherent signal beams were obtained in both cases. This approach allows scaling of the intensity of optical parametric chirped pulse amplification up to the exa-watt level while maintaining diffraction-limited beam quality.
Fernando, Julian W; Kashima, Yoshihisa; Laham, Simon M
2014-08-01
Although a great deal of research has investigated the relationship between emotions and action orientations, most studies to date have used variable-centered techniques to identify the best emotion predictor(s) of a particular action. Given that people frequently report multiple or blended emotions, a profitable area of research may be to adopt person-centered approaches to examine the action orientations elicited by a particular combination of emotions or "emotion profile." In two studies, across instances of intergroup inequality in Australia and Canada, we examined participants' experiences of six intergroup emotions: sympathy, anger directed at three targets, shame, and pride. In both studies, five groups of participants with similar emotion profiles were identified by cluster analysis and their action orientations were compared; clusters indicated that the majority of participants experienced multiple emotions. Each action orientation was also regressed on the six emotions. There were a number of differences in the results obtained from the person-centered and variable-centered approaches. This was most apparent for sympathy: the group of participants experiencing only sympathy showed little inclination to perform prosocial actions, yet sympathy was a significant predictor of numerous action orientations in regression analyses. These results imply that sympathy may only prompt a desire for action when experienced in combination with other emotions. We suggest that the use of person-centered and variable-centered approaches as complementary analytic strategies may enrich research into not only the affective predictors of action, but emotion research in general.
A heuristic approach using multiple criteria for environmentally benign 3PLs selection
NASA Astrophysics Data System (ADS)
Kongar, Elif
2005-11-01
Maintaining competitiveness in an environment where price and quality differences between competing products are disappearing depends on the company's ability to reduce costs and supply time. Timely responses to rapidly changing market conditions require efficient Supply Chain Management (SCM). Outsourcing logistics to third-party logistics service providers (3PLs) is one commonly used way of increasing the efficiency of logistics operations, while creating a more "core competency focused" business environment. However, this alone may not be sufficient. Due to recent environmental regulations and growing public awareness regarding environmental issues, 3PLs need to be not only efficient but also environmentally benign to maintain companies' competitiveness. Even though an efficient and environmentally benign combination of 3PLs can theoretically be obtained using exhaustive search algorithms, heuristic approaches to the selection process may be superior in terms of computational complexity. In this paper, a hybrid approach that combines a multiple criteria Genetic Algorithm (GA) with the Linear Physical Weighting Algorithm (LPPW) for selecting efficient and environmentally benign 3PLs is proposed. A numerical example is also provided to illustrate the method and the analyses.
Tuning Cell and Tissue Development by Combining Multiple Mechanical Signals.
Sinha, Ravi; Verdonschot, Nico; Koopman, Bart; Rouwkema, Jeroen
2017-10-01
Mechanical signals offer a promising way to control cell and tissue development. It has been established that cells constantly probe their mechanical microenvironment and employ force feedback mechanisms to modify themselves and, when possible, their environment, to reach a homeostatic state. Thus, a correct mechanical microenvironment (external forces and the mechanical properties and shapes of cellular surroundings) is necessary for the proper functioning of cells. In vitro, or in the case of nonbiological implants in vivo, where cells are in an artificial environment, addition of adequate mechanical signals can therefore enable the cells to function normally as they would in vivo. Hence, a wide variety of approaches have been developed to apply mechanical stimuli (such as substrate stretch, flow-induced shear stress, substrate stiffness, topography, and modulation of attachment area) to cells in vitro. These approaches have not just revealed the effects of mechanical signals on cells but have also provided ways of probing cellular molecules and structures that can yield a mechanistic understanding of those effects. However, they remain lower in complexity compared with in vivo conditions, where the cellular mechanical microenvironment is the result of a combination of multiple mechanical signals. Therefore, combinations of mechanical stimuli have also been applied to cells in vitro. These studies have had varying focus: developing novel platforms to apply complex combinations of mechanical stimuli, observing the co-operation/competition between stimuli, combining the benefits of multiple stimuli toward an application, or uncovering the underlying mechanisms of their action. In general, they provided new insights that could not have been predicted from previous knowledge. We present here a review of several such studies and the insights gained from them, thereby making a case for such studies to be continued and further developed.
Experimental Design for Multi-drug Combination Studies Using Signaling Networks
Huang, Hengzhen; Fang, Hong-Bin; Tan, Ming T.
2017-01-01
Summary: Combinations of multiple drugs are an important approach to maximize the chance of therapeutic success by inhibiting multiple pathways/targets. Analytic methods for studying drug combinations have received increasing attention because major advances in biomedical research have made available a large number of potential agents for testing. Preclinical experiments on multi-drug combinations play a key role in (especially cancer) drug development because of the complex nature of the disease and the need to reduce development time and costs. Despite recent progress in statistical methods for assessing drug interaction, there is an acute lack of methods for designing experiments on multi-drug combinations. The number of combinations grows exponentially with the number of drugs and dose levels and quickly precludes laboratory testing. In this paper, utilizing experimental dose-response data of single drugs and a few combinations, along with pathway/network information, to obtain an estimate of the functional structure of the dose-response relationship in silico, we propose an optimal design that allows exploration of the dose-effect surface with the smallest possible sample size. The simulation studies show that our proposed methods perform well. PMID:28960231
Multimodal Neuroimaging: Basic Concepts and Classification of Neuropsychiatric Diseases.
Tulay, Emine Elif; Metin, Barış; Tarhan, Nevzat; Arıkan, Mehmet Kemal
2018-06-01
Neuroimaging techniques are widely used in neuroscience to visualize neural activity, to improve our understanding of brain mechanisms, and to identify biomarkers, especially for psychiatric diseases; however, each neuroimaging technique has several limitations. These limitations led to the development of multimodal neuroimaging (MN), which combines data obtained from multiple neuroimaging techniques, such as electroencephalography and functional magnetic resonance imaging, and yields more detailed information about brain dynamics. There are several types of MN, including visual inspection, data integration, and data fusion. This literature review aimed to provide a brief summary and basic information about MN techniques (data fusion approaches in particular) and classification approaches. Data fusion approaches are generally categorized as asymmetric and symmetric. The present review focused exclusively on studies based on symmetric data fusion methods (data-driven methods), such as independent component analysis and principal component analysis. Machine learning techniques have recently been introduced for use in identifying diseases and biomarkers of disease. The machine learning technique most widely used by neuroscientists is classification, especially support vector machine classification. Several studies differentiated patients with psychiatric diseases from healthy controls using combined datasets. The common conclusion among these studies is that the prediction of diseases improves when data are combined via MN techniques; however, there remain a few challenges associated with MN, such as sample size. Perhaps in the future N-way fusion can be used to combine multiple neuroimaging techniques or nonimaging predictors (e.g., cognitive ability) to overcome the limitations of MN.
Validation and calibration of structural models that combine information from multiple sources.
Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A
2017-02-01
Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens
2016-01-01
Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of bacterial populations they target. It is relevant to understand if effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. Using this bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning of the additivity terms. The choice of the additivity term is essential to determine synergy or antagonism of antimicrobials. This article is part of the themed issue ‘Evolutionary ecology of arthropod antimicrobial peptides’. PMID:27160596
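For reference, the two additivity baselines discussed above are commonly written as follows (standard textbook forms; the notation is ours, not the paper's). E_A and E_B are the fractional effects of each drug alone, d_A and d_B the doses used in the mixture, and D_A and D_B the single-drug doses that would each produce the mixture's effect.

```latex
\begin{align}
  E_{AB} &= E_A + E_B - E_A E_B
    && \text{(Bliss independence)} \\
  \frac{d_A}{D_A} + \frac{d_B}{D_B} &= 1
    && \text{(Loewe additivity)}
\end{align}
```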
Structured plant metabolomics for the simultaneous exploration of multiple factors.
Vasilev, Nikolay; Boccard, Julien; Lang, Gerhard; Grömping, Ulrike; Fischer, Rainer; Goepfert, Simon; Rudaz, Serge; Schillberg, Stefan
2016-11-17
Multiple factors act simultaneously on plants to establish complex interaction networks involving nutrients, elicitors and metabolites. Metabolomics offers a better understanding of complex biological systems, but evaluating the simultaneous impact of different parameters on metabolic pathways that have many components is a challenging task. We therefore developed a novel approach that combines experimental design, untargeted metabolic profiling based on multiple chromatography systems and ionization modes, and multiblock data analysis, facilitating the systematic analysis of metabolic changes in plants caused by different factors acting at the same time. Using this method, target geraniol compounds produced in transgenic tobacco cell cultures were grouped into clusters based on their response to different factors. We hypothesized that our novel approach may provide more robust data for process optimization in plant cell cultures producing any target secondary metabolite, based on the simultaneous exploration of multiple factors rather than varying one factor each time. The suitability of our approach was verified by confirming several previously reported examples of elicitor-metabolite crosstalk. However, unravelling all factor-metabolite networks remains challenging because it requires the identification of all biochemically significant metabolites in the metabolomics dataset.
Wafer hotspot prevention using etch aware OPC correction
NASA Astrophysics Data System (ADS)
Hamouda, Ayman; Power, Dave; Salama, Mohamed; Chen, Ao
2016-03-01
As technology development advances into deep-sub-wavelength nodes, multiple patterning is becoming more essential to achieving the technology shrink requirements. Recently, simultaneous correction of multiple mask-patterns has been proposed in Optical Proximity Correction (OPC) technology to enable multiple-patterning awareness during OPC correction. This is essential to prevent inter-layer hot-spots during the final pattern transfer. In the state-of-the-art literature, multi-layer awareness is achieved using simultaneous resist-contour simulations to predict and correct hot-spots during mask generation. However, this approach assumes a uniform etch shrink response for all patterns independent of their proximity, which is not sufficient for the full prevention of inter-exposure hot-spots, for example different color space violations post etch or via coverage/enclosure post etch. In this paper, we explain the need to include the etch component during multiple-patterning OPC. We also introduce a novel approach for etch-aware simultaneous multiple-patterning OPC, where we calibrate and verify a lumped model that includes the combined resist and etch responses. Adding this extra simulation condition during OPC is suitable for full-chip processing from a computational-intensity point of view. Also, using this model during OPC to predict and correct inter-exposure hot-spots is similar to previously proposed multiple-patterning OPC, yet our proposed approach more accurately corrects post-etch defects too.
Simultaneous nano-tracking of multiple motor proteins via spectral discrimination of quantum dots.
Kakizuka, Taishi; Ikezaki, Keigo; Kaneshiro, Junichi; Fujita, Hideaki; Watanabe, Tomonobu M; Ichimura, Taro
2016-07-01
Simultaneous nanometric tracking of multiple motor proteins was achieved by combining multicolor fluorescent labeling of target proteins and imaging spectroscopy, revealing dynamic behaviors of multiple motor proteins at the sub-diffraction-limit scale. Using quantum dot probes of distinct colors, we experimentally verified the localization precision to be a few nanometers at temporal resolution of 30 ms or faster. One-dimensional processive movement of two heads of a single myosin molecule and multiple myosin molecules was successfully traced. Furthermore, the system was modified for two-dimensional measurement and applied to tracking of multiple myosin molecules. Our approach is useful for investigating cooperative movement of proteins in supramolecular nanomachinery.
Estimating Characteristics of a Maneuvering Reentry Vehicle Observed by Multiple Sensors
2010-03-01
instead of as one large data set. This method allowed the filter to respond to changing dynamics. Jackson and Farbman's approach could be of... portion of the entire acceleration was due to drag. Lee and Liu adopted a more hybrid approach, combining least squares and Kalman filters [9]... grows again as the window approaches the end of the available data. Three values for minimum window size, window size, and maximum window size are
When counting cattle is not enough: multiple perspectives in agricultural and veterinary research
2011-01-01
A traditional approach in agricultural and veterinary research is to focus on the biological perspective, where large cattle databases are used to analyse the dairy herd. This approach has yielded valuable insights. However, recent research indicates that this knowledge base can be further increased by examining agricultural and veterinary challenges from other perspectives. In this paper we suggest three perspectives that may supplement the biological perspective in agricultural and veterinary research: the economic, the managerial, and the social perspective. We review recent studies applying or combining these perspectives and discuss how multiple perspectives may improve our understanding of, and ability to handle, cattle-health challenges. PMID:21999487
Implementation of a VLSI Level Zero Processing system utilizing the functional component approach
NASA Technical Reports Server (NTRS)
Shi, Jianfei; Horner, Ward P.; Grebowsky, Gerald J.; Chesney, James R.
1991-01-01
A high rate Level Zero Processing system is currently being prototyped at NASA/Goddard Space Flight Center (GSFC). Based on state-of-the-art VLSI technology and the functional component approach, the new system promises capabilities of handling multiple Virtual Channels and Applications with a combined data rate of up to 20 Megabits per second (Mbps) at low cost.
ERIC Educational Resources Information Center
Derguy, C.; M'Bailara, K.; Michel, G.; Roux, S.; Bouvard, M.
2016-01-01
This study aimed to identify parental stress predictors in ASD by considering individual and environmental factors in an ecological approach. Participants were 115 parents of children with ASD aged from 3 to 10 years. Multiple regression analyses were conducted to determine the best predictors of parental stress among child-related, parent-related…
Monitoring and Management of a Sensitive Resource: A Landscape-level Approach with Amphibians
2001-03-01
Results show that each technique is effective for a portion of the amphibian community and that the use of multiple techniques is essential to any... combinations of species. These results show that multiple techniques are needed for a full assessment of amphibian populations and communities at... against which future assessments of amphibian populations and communities on each installation can be evaluated. The standardized techniques used in FY
Boost OCR accuracy using iVector based system combination approach
NASA Astrophysics Data System (ADS)
Peng, Xujun; Cao, Huaigu; Natarajan, Prem
2015-01-01
Optical character recognition (OCR) is a challenging task because most existing preprocessing approaches are sensitive to writing style, writing material, noise, and image resolution. Thus, a single recognition system cannot address all factors of real document images. In this paper, we describe an approach to combining diverse recognition systems by using iVector-based features, a method recently developed in the field of speaker verification. Prior to system combination, document images are preprocessed and text line images are extracted with different approaches for each system; an iVector is transformed from a high-dimensional supervector of each text line and is used to predict the accuracy of OCR. We merge hypotheses from multiple recognition systems according to the overlap ratio and the predicted OCR score of text line images. We present evaluation results on an Arabic document database, where the proposed method is compared against the single best OCR system using the word error rate (WER) metric.
Kang, Pengde; Pei, Fuxing; Shen, Bin; Zhou, Zongke; Yang, Jing
2012-01-01
The treatment of osteonecrosis of the femoral head (ONFH) remains controversial. A recently proposed treatment is multiple drilling core decompression combined with systemic alendronate as a femoral head-preserving procedure for ONFH. However, it is not known whether alendronate enhances the risk of collapse. We wondered whether the combined procedure could delay or prevent progression of ONFH compared to multiple drilling alone. Patients with early-stage ONFH were randomly assigned to be treated with either multiple drilling combined with alendronate (47 patients, 67 hips) or multiple drilling alone (46 patients, 60 hips). We defined failure as the need for THA or a Harris score less than 70. The minimum follow-up was 48 months for the 77 patients completing the protocol. After a minimum 4-year follow-up, 91% (40/44) of patients with Stage II disease and 62% (8/13) of patients with Stage III disease had not required THA in the alendronate group, compared to 79% (31/39) of patients with Stage II disease and 46% (6/13) of patients with Stage III disease in the control group (P=0.12, P=0.047, respectively). Small or medium and central lesions had a higher success rate in both groups. Risk factors did not seem to affect the clinical success rate of this procedure. Multiple small-diameter drilling core decompression combined with systemic alendronate administration can reduce pain and delay progression of early-stage ONFH. Even in Ficat IIA and III hips, some benefit was obtained from this approach, at least a delay in the need for THA. Copyright © 2011. Published by Elsevier SAS.
Integrated model of multiple kernel learning and differential evolution for EUR/USD trading.
Deng, Shangkun; Sakurai, Akito
2014-01-01
Currency trading is an important area for individual investors, government policy decisions, and organization investments. In this study, we propose a hybrid approach referred to as MKL-DE, which combines multiple kernel learning (MKL) with differential evolution (DE) for trading a currency pair. MKL is used to learn a model that predicts changes in the target currency pair, whereas DE is used to generate the buy and sell signals for the target currency pair based on the relative strength index (RSI), which is also combined with the MKL prediction as a trading signal. The new hybrid implementation is applied to EUR/USD trading, which is the most traded foreign exchange (FX) currency pair. MKL is essential for utilizing information from multiple information sources, and DE is essential for formulating a trading rule based on a mixture of discrete structures and continuous parameters. Initially, the prediction model optimized by MKL predicts the returns based on a technical indicator called moving average convergence divergence (MACD). Next, a combined trading signal is optimized by DE using the inputs from the prediction model and the technical indicator RSI obtained from multiple timeframes. The experimental results showed that trading using the prediction learned by MKL yielded consistent profits.
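Because the RSI drives the DE-evolved trading rule, here is a compact sketch of a simple moving-average variant of the indicator; the 14-period window and the random-walk price series are conventional placeholders rather than settings taken from the study.

```python
# Relative strength index (simple moving-average variant).
import numpy as np

def rsi(prices, period=14):
    deltas = np.diff(prices)
    gains = np.where(deltas > 0, deltas, 0.0)
    losses = np.where(deltas < 0, -deltas, 0.0)
    kernel = np.ones(period) / period
    avg_gain = np.convolve(gains, kernel, mode="valid")
    avg_loss = np.convolve(losses, kernel, mode="valid")
    rs = avg_gain / np.where(avg_loss == 0, np.nan, avg_loss)
    return 100.0 - 100.0 / (1.0 + rs)

prices = 1.10 + np.cumsum(np.random.default_rng(0).normal(0, 0.001, 100))
print(rsi(prices)[-1])   # below 30 / above 70 are classic oversold/overbought zones
```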
A new adaptive multiple modelling approach for non-linear and non-stationary systems
NASA Astrophysics Data System (ADS)
Chen, Hao; Gong, Yu; Hong, Xia
2016-07-01
This paper proposes a novel adaptive multiple modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, which are all linear. With data available in an online fashion, the performance of all candidate sub-models is monitored based on the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error based on a recent data window and apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever is better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
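The closed-form combination step can be written compactly: with E the window-by-M matrix of sub-model prediction errors and C = E'E, minimising the window MSE w'Cw subject to the weights summing to one gives w = C^{-1}1 / (1'C^{-1}1). A minimal numerical sketch follows; the small ridge term is our safeguard assumption, not part of the paper.

```python
# Sum-to-one, minimum-MSE combination weights over a recent data window.
import numpy as np

def combination_weights(errors, ridge=1e-8):
    """errors: (window_length, M) sub-model prediction errors."""
    C = errors.T @ errors
    C += ridge * np.eye(C.shape[0])      # guard against a singular C (assumption)
    w = np.linalg.solve(C, np.ones(C.shape[0]))
    return w / w.sum()                   # normalise so the weights sum to one

window_errors = np.random.default_rng(0).normal(size=(50, 3))  # M = 3 sub-models
w = combination_weights(window_errors)
print(w, w.sum())
```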
Multilocus lod scores in large pedigrees: combination of exact and approximate calculations.
Tong, Liping; Thompson, Elizabeth
2008-01-01
To detect the positions of disease loci, lod scores are calculated at multiple chromosomal positions given trait and marker data on members of pedigrees. Exact lod score calculations are often impossible when the size of the pedigree and the number of markers are both large. In this case, a Markov Chain Monte Carlo (MCMC) approach provides an approximation. However, to provide accurate results, mixing performance is always a key issue in these MCMC methods. In this paper, we propose two methods to improve MCMC sampling and hence obtain more accurate lod score estimates in shorter computation time. The first improvement generalizes the block-Gibbs meiosis (M) sampler to a multiple meiosis (MM) sampler in which multiple meioses are updated jointly, across all loci. The second divides the computations on a large pedigree into several parts by conditioning on the haplotypes of some 'key' individuals. We perform exact calculations for the descendant parts, where more data are often available, and combine this information with sampling of the hidden variables in the ancestral parts. Our approaches are expected to be most useful for data on a large pedigree with substantial missing data. (c) 2007 S. Karger AG, Basel
Integrated presentation of ecological risk from multiple stressors
Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman
2016-01-01
Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic. PMID:27782171
Bennetts, Victor Hernandez; Schaffernicht, Erik; Pomareda, Victor; Lilienthal, Achim J; Marco, Santiago; Trincavelli, Marco
2014-09-17
In this paper, we address the task of gas distribution modeling in scenarios where multiple heterogeneous compounds are present. Gas distribution modeling is particularly useful in emission monitoring applications, where spatial representations of the gaseous patches can be used to identify emission hot spots. In realistic environments, the presence of multiple chemicals is expected and, therefore, gas discrimination has to be incorporated in the modeling process. The approach presented in this work addresses the task of gas distribution modeling by combining different non-selective gas sensors. Gas discrimination is addressed with an open sampling system composed of an array of metal oxide sensors and a probabilistic algorithm tailored to uncontrolled environments. For each of the identified compounds, the mapping algorithm generates a calibrated gas distribution model using the classification uncertainty and the concentration readings acquired with a photoionization detector. The meta-parameters of the proposed modeling algorithm are automatically learned from the data. The approach was validated with a gas-sensitive robot patrolling outdoor and indoor scenarios where two different chemicals were released simultaneously. The experimental results show that the generated multi-compound maps can be used to accurately predict the location of emitting gas sources.
Theory of chromatic noise masking applied to testing linearity of S-cone detection mechanisms.
Giulianini, Franco; Eskew, Rhea T
2007-09-01
A method for testing the linearity of cone combination of chromatic detection mechanisms is applied to S-cone detection. This approach uses the concept of mechanism noise, the noise as seen by a postreceptoral neural mechanism, to represent the effects of superposing chromatic noise components in elevating thresholds and leads to a parameter-free prediction for a linear mechanism. The method also provides a test for the presence of multiple linear detectors and off-axis looking. No evidence for multiple linear mechanisms was found when using either S-cone increment or decrement tests. The results for both S-cone test polarities demonstrate that these mechanisms combine their cone inputs nonlinearly.
Lee, Y; Tien, J M
2001-01-01
We present mathematical models that determine the optimal parameters for strategically routing multidestination traffic in an end-to-end network setting. Multidestination traffic refers to a traffic type that can be routed to any one of multiple destinations. A growing number of communication services are based on multidestination routing. In this parameter-driven approach, a multidestination call is routed to one of the candidate destination nodes in accordance with predetermined decision parameters associated with each candidate node. We present three different approaches: (1) a link utilization (LU) approach, (2) a network cost (NC) approach, and (3) a combined parametric (CP) approach. The LU approach provides the solution that results in optimally balanced link utilization, whereas the NC approach provides the least expensive way to route traffic to destinations. The CP approach, on the other hand, provides multiple solutions that help balance link utilization and cost. The LU approach has in fact been implemented by a long-distance carrier, resulting in a considerable efficiency improvement in its international direct services.
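A toy sketch of the parameter-driven routing decision, under the assumption that each candidate destination carries a precomputed utilization and cost parameter; `alpha` interpolates between an LU-style and an NC-style choice (the CP flavor). Node names and numbers are invented for illustration.

```python
# Sketch: route a multidestination call by a score trading off link
# utilization against unit cost; alpha=1 is LU-like, alpha=0 is NC-like.
def route_call(candidates, alpha=0.5):
    """candidates: list of (node, utilization, unit_cost) tuples."""
    return min(candidates,
               key=lambda c: alpha * c[1] + (1 - alpha) * c[2])[0]

candidates = [("A", 0.82, 0.30), ("B", 0.55, 0.45), ("C", 0.60, 0.35)]
print(route_call(candidates, alpha=0.7))  # leans toward balancing utilization
```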
Post-Stall Aerodynamic Modeling and Gain-Scheduled Control Design
NASA Technical Reports Server (NTRS)
Wu, Fen; Gopalarathnam, Ashok; Kim, Sungwan
2005-01-01
A multidisciplinary research effort that combines aerodynamic modeling and gain-scheduled control design for aircraft flight at post-stall conditions is described. The aerodynamic modeling uses a decambering approach for rapid prediction of post-stall aerodynamic characteristics of multiple-wing configurations using known section data. The approach is successful in bringing to light multiple solutions at post-stall angles of attack during the iteration process. The predictions agree fairly well with experimental results from wind tunnel tests. The control research was focused on actuator saturation and flight transition between low and high angle-of-attack regions for near- and post-stall aircraft using advanced LPV control techniques. The new control approaches maintain adequate control capability to handle high angle-of-attack aircraft control with stability and performance guarantees.
NASA Astrophysics Data System (ADS)
Yang, Jinping; Li, Peizhen; Yang, Youfa; Xu, Dian
2018-04-01
Empirical mode decomposition (EMD) is a highly adaptable signal processing method. However, the EMD approach has certain drawbacks, including distortions from end effects and mode mixing. In the present study, these two problems are addressed using an end extension method based on the support vector regression machine (SVRM) and a modal decomposition method based on the characteristics of the Hilbert transform. The algorithm includes two steps: using the SVRM, the time series data are extended at both endpoints to reduce the end effects, and then, a modified EMD method using the characteristics of the Hilbert transform is performed on the resulting signal to reduce mode mixing. A new combined static-dynamic method for identifying structural damage is presented. This method combines the static and dynamic information in an equilibrium equation that can be solved using the Moore-Penrose generalized matrix inverse. The combination method uses the differences in displacements of the structure with and without damage and variations in the modal force vector. Tests on a four-story, steel-frame structure were conducted to obtain static and dynamic responses of the structure. The modal parameters are identified using data from the dynamic tests and improved EMD method. The new method is shown to be more accurate and effective than the traditional EMD method. Through tests with a shear-type test frame, the higher performance of the proposed static-dynamic damage detection approach, which can detect both single and multiple damage locations and the degree of the damage, is demonstrated. For structures with multiple damage, the combined approach is more effective than either the static or dynamic method. The proposed EMD method and static-dynamic damage detection method offer improved modal identification and damage detection, respectively, in structures.
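As a rough illustration of the static-dynamic solve described above, the sketch below stacks hypothetical static and modal sensitivity blocks into one system and inverts it with the Moore-Penrose pseudoinverse; the matrices are random placeholders, not the paper's assembled equilibrium equation.

```python
# Sketch: combine static displacement differences and modal-force variations
# into one linear system A * delta = b, where delta holds elementwise
# stiffness reductions, then solve with the pseudoinverse.
import numpy as np

A_static = np.random.randn(8, 4)    # static sensitivity block (illustrative)
A_modal = np.random.randn(6, 4)     # modal-force sensitivity block
b_static = np.random.randn(8)       # measured static response differences
b_modal = np.random.randn(6)        # measured modal force variations

A = np.vstack([A_static, A_modal])  # combined static-dynamic system
b = np.concatenate([b_static, b_modal])
delta = np.linalg.pinv(A) @ b       # least-squares damage estimate
print(delta)
```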
Reducing hydrologic model uncertainty in monthly streamflow predictions using multimodel combination
NASA Astrophysics Data System (ADS)
Li, Weihua; Sankarasubramanian, A.
2012-12-01
Model errors are inevitable in any prediction exercise. One approach that is currently gaining attention for reducing model errors is combining multiple models to develop improved predictions. The rationale behind this approach primarily lies on the premise that optimal weights could be derived for each model so that the resulting multimodel predictions are improved. A new dynamic approach (MM-1) to combine multiple hydrological models by evaluating their performance/skill contingent on the predictor state is proposed. We combine two hydrological models, the "abcd" model and the variable infiltration capacity (VIC) model, to develop multimodel streamflow predictions. To quantify precisely under what conditions the multimodel combination results in improved predictions, we compare the multimodel scheme MM-1 with an optimal model combination scheme (MM-O) by employing them in predicting the streamflow generated from a known hydrologic model (abcd model or VIC model) with heteroscedastic error variance, as well as from a hydrologic model that exhibits a different structure from that of the candidate models (i.e., the "abcd" or VIC model). Results from the study show that streamflow estimated from single models performed better than multimodels under almost no measurement error. However, under increased measurement errors and model structural misspecification, both multimodel schemes (MM-1 and MM-O) consistently performed better than the single-model predictions. Overall, MM-1 performs better than MM-O in predicting the monthly flow values as well as in predicting extreme monthly flows. Comparison of the weights obtained from each candidate model reveals that, as measurement errors increase, MM-1 assigns weights equally across all the models, whereas MM-O always assigns higher weights to the candidate model that performed best in the calibration period. Applying the multimodel algorithms to predict streamflows at four different sites revealed that MM-1 performs better than all single models and the optimal model combination scheme, MM-O, in predicting the monthly flows as well as the flows during wetter months.
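A simplified sketch of state-contingent weighting in the spirit of MM-1: bin the predictor state and weight each model by its inverse mean squared error within the bin. The quantile binning rule and the inverse-MSE weighting are assumptions for illustration, not the study's exact scheme.

```python
# Sketch: per-state-bin model weights from inverse MSE on synthetic data.
import numpy as np

def mm1_weights(pred, obs, state, n_bins=5):
    """pred: (T, M) model predictions; obs: (T,); state: (T,) predictor."""
    edges = np.quantile(state, np.linspace(0, 1, n_bins + 1))
    bin_id = np.clip(np.searchsorted(edges, state, side="right") - 1,
                     0, n_bins - 1)
    W = np.zeros((n_bins, pred.shape[1]))
    for b in range(n_bins):
        mse = ((pred[bin_id == b] - obs[bin_id == b, None]) ** 2).mean(axis=0)
        W[b] = (1 / mse) / (1 / mse).sum()   # inverse-MSE weights per state bin
    return edges, W

T, M = 300, 2
pred = np.random.randn(T, M)
obs = pred.mean(axis=1) + 0.1 * np.random.randn(T)
state = np.random.randn(T)                   # stand-in predictor state
edges, W = mm1_weights(pred, obs, state)
print(W)                                     # one weight row per state bin
```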
A Nonparametric, Multiple Imputation-Based Method for the Retrospective Integration of Data Sets.
Carrig, Madeline M; Manrique-Vallier, Daniel; Ranby, Krista W; Reiter, Jerome P; Hoyle, Rick H
2015-01-01
Complex research questions often cannot be addressed adequately with a single data set. One sensible alternative to the high cost and effort associated with the creation of large new data sets is to combine existing data sets containing variables related to the constructs of interest. The goal of the present research was to develop a flexible, broadly applicable approach to the integration of disparate data sets that is based on nonparametric multiple imputation and the collection of data from a convenient, de novo calibration sample. We demonstrate proof of concept for the approach by integrating three existing data sets containing items related to the extent of problematic alcohol use and associations with deviant peers. We discuss both necessary conditions for the approach to work well and potential strengths and weaknesses of the method compared to other data set integration approaches.
Bakas, Spyridon; Zeng, Ke; Sotiras, Aristeidis; Rathore, Saima; Akbari, Hamed; Gaonkar, Bilwaj; Rozycki, Martin; Pati, Sarthak; Davatzikos, Christos
2016-01-01
We present an approach for segmenting low- and high-grade gliomas in multimodal magnetic resonance imaging volumes. The proposed approach is based on a hybrid generative-discriminative model. Firstly, a generative approach based on an Expectation-Maximization framework that incorporates a glioma growth model is used to segment the brain scans into tumor, as well as healthy tissue labels. Secondly, a gradient boosting multi-class classification scheme is used to refine tumor labels based on information from multiple patients. Lastly, a probabilistic Bayesian strategy is employed to further refine and finalize the tumor segmentation based on patient-specific intensity statistics from the multiple modalities. We evaluated our approach in 186 cases during the training phase of the BRAin Tumor Segmentation (BRATS) 2015 challenge and report promising results. During the testing phase, the algorithm was additionally evaluated in 53 unseen cases, achieving the best performance among the competing methods.
Chowdhury, Nilotpal; Sapru, Shantanu
2015-01-01
Microarray analysis has revolutionized the role of genomic prognostication in breast cancer. However, most studies are single-series studies and suffer from methodological problems. We sought to use a meta-analytic approach, combining multiple publicly available datasets while correcting for batch effects, to reach a more robust oncogenomic analysis. The aim of the present study was to find gene sets associated with distant metastasis-free survival (DMFS) in systemically untreated, node-negative breast cancer patients, from publicly available genomic microarray datasets. Four microarray series (totaling 742 patients) were selected after a systematic search and combined. Cox regression for each gene was done for the combined dataset (univariate, as well as multivariate, adjusted for expression of cell cycle-related genes) and for the 4 major molecular subtypes. The centre and microarray batch effects were adjusted for by including them as random-effects variables. The Cox regression coefficients for each analysis were then ranked and subjected to a Gene Set Enrichment Analysis (GSEA). Gene sets representing protein translation were independently negatively associated with metastasis in the Luminal A and Luminal B subtypes, but positively associated with metastasis in Basal tumors. Proteinaceous extracellular matrix (ECM) gene set expression was positively associated with metastasis, after adjustment for expression of cell cycle-related genes on the combined dataset. Finally, the positive association of the proliferation-related genes with metastases was confirmed. To the best of our knowledge, the results depicting mixed prognostic significance of protein translation in breast cancer subtypes are being reported for the first time. We attribute this to our study combining multiple series and performing a more robust meta-analytic Cox regression modeling on the combined dataset, thus discovering 'hidden' associations. This methodology seems to yield new and interesting results and may be used as a tool to guide new research.
Generating Concise Natural Language Summaries.
ERIC Educational Resources Information Center
McKeown, Kathleen; And Others
1995-01-01
Presents an approach to summarization that combines information from multiple facts into a single sentence using linguistic constructions. Describes two applications: one produces summaries of basketball games, and the other contains summaries of telephone network planning activity. Both summarize input data as opposed to full text. Discusses…
Evaluating the application of multipollutant exposure metrics in air pollution health studies
Background: Health effects associated with air pollution are typically evaluated using a single-pollutant approach, yet people are exposed to mixtures consisting of multiple pollutants that may have independent or combined effects on human health. Development of metrics that re...
USDA-ARS?s Scientific Manuscript database
Most hosts are concurrently or sequentially infected with multiple parasites, thus fully understanding interactions between individual parasite species and their hosts depends on accurate characterization of the parasite community. For parasitic nematodes, non-invasive methods for obtaining quantita...
Tsuda, Sachiko; Kee, Michelle Z.L.; Cunha, Catarina; Kim, Jinsook; Yan, Ping; Loew, Leslie M.; Augustine, George J.
2013-01-01
Recent advances in our understanding of brain function have come from using light to either control or image neuronal activity. Here we describe an approach that combines both techniques: a micromirror array is used to photostimulate populations of presynaptic neurons expressing channelrhodopsin-2, while a red-shifted voltage-sensitive dye allows optical detection of resulting postsynaptic activity. Such technology allowed us to control the activity of cerebellar interneurons while simultaneously recording inhibitory responses in multiple Purkinje neurons, their postsynaptic targets. This approach should substantially accelerate our understanding of information processing by populations of neurons within brain circuits. PMID:23254260
The Predicted Cross Value for Genetic Introgression of Multiple Alleles
Han, Ye; Cameron, John N.; Wang, Lizhi; Beavis, William D.
2017-01-01
We consider the plant genetic improvement challenge of introgressing multiple alleles from a homozygous donor to a recipient. First, we frame the project as an algorithmic process that can be mathematically formulated. We then introduce a novel metric for selecting breeding parents that we refer to as the predicted cross value (PCV). Unlike estimated breeding values, which represent predictions of general combining ability, the PCV predicts specific combining ability. The PCV takes estimates of recombination frequencies as an input vector and calculates the probability that a pair of parents will produce a gamete with desirable alleles at all specified loci. We compared the PCV approach with existing estimated-breeding-value approaches in two simulation experiments, in which 7 and 20 desirable alleles were to be introgressed from a donor line into a recipient line. Results suggest that the PCV is more efficient and effective for multi-allelic trait introgression. We also discuss how operations research can be used for other crop genetic improvement projects and suggest several future research directions. PMID:28122824
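A minimal sketch of the core computation, under a simplifying assumption: for one parent with known phased haplotypes, the probability that a gamete carries the desired allele at every locus can be computed by a dynamic program over the homolog of origin, with transitions given by recombination frequencies. The full PCV combines the contributions of both parents; this single-parent version is illustrative only.

```python
# Sketch: probability a gamete matches the desired allele at all loci.
import numpy as np

def gamete_prob(hap1, hap2, desired, rec):
    """hap1/hap2: the parent's phased haplotypes; rec[i]: recombination
    frequency between loci i and i+1. DP state = homolog of origin."""
    p = np.array([0.5 * (hap1[0] == desired[0]),
                  0.5 * (hap2[0] == desired[0])], float)
    for i in range(1, len(desired)):
        r = rec[i - 1]
        p = np.array([(p[0] * (1 - r) + p[1] * r) * (hap1[i] == desired[i]),
                      (p[1] * (1 - r) + p[0] * r) * (hap2[i] == desired[i])])
    return p.sum()

# toy example: three loci, desired allele 1 everywhere
print(gamete_prob([1, 0, 1], [0, 1, 1], [1, 1, 1], rec=[0.1, 0.2]))
```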
A Flexible Approach for the Statistical Visualization of Ensemble Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, K.; Wilson, A.; Bremer, P.
2009-09-29
Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.
Focusing attention on objects of interest using multiple matched filters.
Stough, T M; Brodley, C E
2001-01-01
In order to be of use to scientists, large image databases need to be analyzed to create a catalog of the objects of interest. One approach is to apply a multiple-tiered search algorithm that uses reduction techniques of increasing computational complexity to select the desired objects from the database. The first tier of this type of algorithm, often called a focus of attention (FOA) algorithm, selects candidate regions from the image data and passes them to the next tier of the algorithm. In this paper we present a new approach to FOA that employs multiple matched filters (MMF), one for each object prototype, to detect the regions of interest. The MMFs are formed using k-means clustering on a set of image patches identified by domain experts as positive examples of objects of interest. An innovation of the approach is to radically reduce the dimensionality of the feature space used by the k-means algorithm by taking block averages of ('spoiling') the sample image patches. The process of spoiling is analyzed and its applicability to other domains is discussed. The combination of the outputs of the MMFs is achieved through the projection of the detections back into an empty image followed by thresholding. This research was motivated by the need to detect small volcanoes in the Magellan probe data from Venus. An empirical evaluation of the approach illustrates that a combination of the MMF plus the average filter results in a higher likelihood of 100% detection of the objects of interest at a lower false positive rate than a single matched filter alone.
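A condensed sketch of the prototype-building step, assuming square patches and an illustrative block size and cluster count: block-average ('spoil') the expert-labeled patches and run k-means; the cluster centers act as the multiple matched-filter prototypes.

```python
# Sketch: spoiling reduces a 16x16 patch to a 4x4 block-average before
# k-means; the data are random stand-ins for expert-labeled patches.
import numpy as np
from sklearn.cluster import KMeans

def spoil(patch, block=4):
    h, w = patch.shape
    return patch[:h - h % block, :w - w % block] \
        .reshape(h // block, block, w // block, block).mean(axis=(1, 3))

patches = np.random.rand(200, 16, 16)            # stand-in positive examples
features = np.array([spoil(p).ravel() for p in patches])
prototypes = KMeans(n_clusters=5, n_init=10).fit(features).cluster_centers_
print(prototypes.shape)   # one matched-filter prototype per cluster
```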
Spinning a stem cell ethics web.
McDonald, Michael; Longstaff, Holly
2013-01-01
The goal of this study was to provide an ethics education resource for trainees and researchers in the Canadian Stem Cell Network that would address the multiple ethical challenges in stem cell research, including accountability in and for research across its multiple dimensions. The website was built using a bottom-up approach based on an ethics needs assessment, in combination with a top-down, expert-driven component. There have been 3,615 visitors to the website since it was launched in July 2011. The ongoing rate of returning visitors (20%) indicates that the website is becoming a valuable tool used multiple times.
High brightness KW-class direct diode laser
NASA Astrophysics Data System (ADS)
Xu, Dan; Guo, Zhijie; Ma, Di; Zhang, Tujia; Guo, Weirong; Wang, Baohua; Xu, Ray; Chen, Xiaohua
2018-02-01
With a given emitter beam quality and the BPP allowed by the fiber, we have derived a spatial beam combination structure that approaches the BPP limit of the fiber. Using the spatial beam combination structure and polarization beam combination, BWT has achieved 1.1 kW output from a fiber (one end coated) with NA 0.22 and a core diameter of 200 μm. The electro-optical efficiency is nearly 47%. Multiple emitters with a wavelength of 976 nm are packaged in a module with a size of 600 × 350 × 80 mm³.
NASA Astrophysics Data System (ADS)
Makrakis, Dimitrios; Mathiopoulos, P. Takis
A maximum likelihood sequential decoder for the reception of digitally modulated signals with single or multiamplitude constellations transmitted over a multiplicative, nonselective fading channel is derived. It is shown that its structure consists of a combination of envelope, multiple differential, and coherent detectors. The outputs of each of these detectors are jointly processed by means of an algorithm. This algorithm is presented in a recursive form. The derivation of the new receiver is general enough to accommodate uncoded as well as coded (e.g., trellis-coded) schemes. Performance evaluation results for a reduced-complexity trellis-coded QPSK system have demonstrated that the proposed receiver dramatically reduces the error floors caused by fading. At Eb/N0 = 20 dB the new receiver structure results in bit-error-rate reductions of more than three orders of magnitude compared to a conventional Viterbi receiver, while being reasonably simple to implement.
Mondy, Cédric P; Muñoz, Isabel; Dolédec, Sylvain
2016-12-01
Multiple stressors constitute a serious threat to aquatic ecosystems, particularly in the Mediterranean region where water scarcity is likely to interact with other anthropogenic stressors. Biological traits potentially allow the unravelling of the effects of multiple stressors. However, thus far, trait-based approaches have failed to fully deliver on their promise and still lack strong predictive power when multiple stressors are present. We aimed to quantify specific community tolerances against six anthropogenic stressors and investigate the responses of the underlying macroinvertebrate biological traits and their combinations. We built and calibrated boosted regression tree models to predict community tolerances using multiple biological traits with a priori hypotheses regarding their individual responses to specific stressors. We analysed the combinations of traits underlying community tolerance and the effect of trait association on this tolerance. Our results validated the following three hypotheses: (i) the community tolerance models efficiently and robustly related trait combinations to stressor intensities and, to a lesser extent, to stressors related to the presence of dams and insecticides; (ii) the effects of traits on community tolerance not only depended on trait identity but also on the trait associations emerging at the community level from the co-occurrence of different traits in species; and (iii) the community tolerances and the underlying trait combinations were specific to the different stressors. This study takes a further step towards predictive tools in community ecology that consider combinations and associations of traits as the basis of stressor tolerance. Additionally, the community tolerance concept has potential application to help stream managers in the decision process regarding management options. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
A physics based method for combining multiple anatomy models with application to medical simulation.
Zhu, Yanong; Magee, Derek; Ratnalingam, Rishya; Kessel, David
2009-01-01
We present a physics based approach to the construction of anatomy models by combining components from different sources; different image modalities, protocols, and patients. Given an initial anatomy, a mass-spring model is generated which mimics the physical properties of the solid anatomy components. This helps maintain valid spatial relationships between the components, as well as the validity of their shapes. Combination can be either replacing/modifying an existing component, or inserting a new component. The external forces that deform the model components to fit the new shape are estimated from Gradient Vector Flow and Distance Transform maps. We demonstrate the applicability and validity of the described approach in the area of medical simulation, by showing the processes of non-rigid surface alignment, component replacement, and component insertion.
The Need for Integrated Approaches in Metabolic Engineering.
Lechner, Anna; Brunk, Elizabeth; Keasling, Jay D
2016-11-01
This review highlights state-of-the-art procedures for heterologous small-molecule biosynthesis, the associated bottlenecks, and new strategies that have the potential to accelerate future accomplishments in metabolic engineering. We emphasize that a combination of different approaches over multiple time and size scales must be considered for successful pathway engineering in a heterologous host. We have classified these optimization procedures based on the "system" that is being manipulated: transcriptome, translatome, proteome, or reactome. By bridging multiple disciplines, including molecular biology, biochemistry, biophysics, and computational sciences, we can create an integral framework for the discovery and implementation of novel biosynthetic production routes. Copyright © 2016 Cold Spring Harbor Laboratory Press; all rights reserved.
Leonard, Ryan J; McArthur, Clare; Hochuli, Dieter F
2016-08-01
Plants are routinely subjected to multiple environmental stressors, and the ability to respond to these stressors determines species survival and ecological breadth. Despite stressors such as wind and dust significantly influencing plant development, morphology, and chemistry, the combined influence of these factors is yet to be investigated. We used a manipulative glasshouse approach to compare the morphological, physiological, and biomechanical responses of Eucalyptus tereticornis to the independent and combined effects of wind and dust. Wind decreased both E. tereticornis height and stem flexural stiffness. Additionally, wind had no effect on leaf physiology, nor did dust have any significant effect on any of the traits measured. Our results suggest that wind and dust in combination may have an additive effect on several plant traits and provide new insight into the effects and importance of studying wind, dust, and different stress combinations. © 2016 Botanical Society of America.
Wang, Xiaotong; Liu, Jing; Yang, Xiaomei; Zhang, Qian; Zhang, Yiwen; Li, Qing; Bi, Kaishun
2018-03-30
To rapidly identify and classify the complicated components and metabolites of traditional Chinese medicines, a liquid chromatography-quadrupole time-of-flight mass spectrometry method combined with multiple data-processing approaches was established. In this process, Kai-Xin-San, a widely used classic traditional Chinese medicine preparation, was chosen as a model prescription. Initially, the fragmentation patterns, diagnostic product ions, and neutral losses of each category of compounds were summarized by collision-induced dissociation analysis of representative standards. In vitro, the multiple product-ion filtering technique was utilized to identify the chemical constituents, globally covering trace components. With this strategy, 108 constituents were identified, and a compound database was successfully established. In vivo, the prototype compounds were extracted based on the established database, and the neutral-loss filtering technique combined with drug metabolism reaction rules was employed to identify metabolites. Overall, 69 constituents, including prototypes and metabolites, were characterized in rat plasma, and nine constituents were characterized for the first time in rat brain; these may be the potential active constituents producing curative effects through synergistic interaction. In conclusion, this study provides a generally applicable strategy for global metabolite identification of complicated components in a complex matrix and a chemical basis for further pharmacological research on Kai-Xin-San. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Genomic approaches for the elucidation of genes and gene networks underlying cardiovascular traits.
Adriaens, M E; Bezzina, C R
2018-06-22
Genome-wide association studies have shed light on the association between natural genetic variation and cardiovascular traits. However, linking a cardiovascular trait associated locus to a candidate gene or set of candidate genes for prioritization for follow-up mechanistic studies is all but straightforward. Genomic technologies based on next-generation sequencing technology nowadays offer multiple opportunities to dissect gene regulatory networks underlying genetic cardiovascular trait associations, thereby aiding in the identification of candidate genes at unprecedented scale. RNA sequencing in particular becomes a powerful tool when combined with genotyping to identify loci that modulate transcript abundance, known as expression quantitative trait loci (eQTL), or loci modulating transcript splicing known as splicing quantitative trait loci (sQTL). Additionally, the allele-specific resolution of RNA-sequencing technology enables estimation of allelic imbalance, a state where the two alleles of a gene are expressed at a ratio differing from the expected 1:1 ratio. When multiple high-throughput approaches are combined with deep phenotyping in a single study, a comprehensive elucidation of the relationship between genotype and phenotype comes into view, an approach known as systems genetics. In this review, we cover key applications of systems genetics in the broad cardiovascular field.
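As a small illustration of the allelic-imbalance idea, assuming allele-specific read counts at a heterozygous site are available: under the expected 1:1 ratio the reference-allele count is Binomial(n, 0.5), so a two-sided binomial test flags imbalance. SciPy 1.7 or later provides binomtest; the counts below are invented.

```python
# Sketch: test a heterozygous site for allelic imbalance from RNA-seq counts.
from scipy.stats import binomtest

ref_reads, alt_reads = 72, 40            # illustrative allele-specific counts
result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)
print(result.pvalue)                     # small p-value suggests imbalance
```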
New approach based on tetrahedral-mesh geometry for accurate 4D Monte Carlo patient-dose calculation
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Yeom, Yeon Soo; Kim, Chan Hyeong; Kim, Seonghoon; Sohn, Jason W.
2015-02-01
In the present study, to achieve accurate 4D Monte Carlo dose calculation in radiation therapy, we devised a new approach that combines (1) modeling of the patient body using tetrahedral-mesh geometry based on the patient's 4D CT data, (2) continuous movement/deformation of the tetrahedral patient model by interpolation of deformation vector fields acquired through deformable image registration, and (3) direct transport of radiation particles during the movement and deformation of the tetrahedral patient model. The results of our feasibility study show that it is certainly possible to construct 4D patient models (i.e., phantoms) with sufficient accuracy using tetrahedral-mesh geometry and to directly transport radiation particles during continuous movement and deformation of the tetrahedral patient model. This new approach not only produces a more accurate dose distribution in the patient but also replaces the current practice of using multiple 3D voxel phantoms and combining multiple dose distributions after Monte Carlo simulations. For routine clinical application of our new approach, the use of fast automatic segmentation algorithms is a must. In order to achieve both dose accuracy and computation speed simultaneously, the number of tetrahedrons for the lungs should be optimized. Although the current computation speed of our new 4D Monte Carlo simulation approach is slow (i.e. ~40 times slower than that of the conventional dose accumulation approach), this problem is resolvable by developing, in Geant4, a dedicated navigation class optimized for particle transport in tetrahedral-mesh geometry.
Designing Crowdcritique Systems for Formative Feedback
ERIC Educational Resources Information Center
Easterday, Matthew W.; Rees Lewis, Daniel; Gerber, Elizabeth M.
2017-01-01
Intelligent tutors based on expert systems often struggle to provide formative feedback on complex, ill-defined problems where answers are unknown. Hybrid crowdsourcing systems that combine the intelligence of multiple novices in face-to-face settings might provide an alternate approach for providing intelligent formative feedback. The purpose of…
ERIC Educational Resources Information Center
Viertel, David C.; Burns, Diane M.
2012-01-01
Unique integrative learning approaches represent a fundamental opportunity for undergraduate students and faculty alike to combine interdisciplinary methods with applied spatial research. Geography and geoscience-related disciplines are particularly well-suited to adapt multiple methods within a holistic and reflective mentored research paradigm.…
Approaches to Reading with Multiple Lenses of Interpretation
ERIC Educational Resources Information Center
Troise, Melissa
2007-01-01
High school teacher Melissa Troise challenges students to recognize the relationships that exist between literary theories, such as Marxism, feminism, and postcolonialism, and urges students to expand their contexts for reading texts by accessing and combining theories. Troise believes theory provides students with the potential to better…
A regenerative approach to the treatment of multiple sclerosis.
Deshmukh, Vishal A; Tardif, Virginie; Lyssiotis, Costas A; Green, Chelsea C; Kerman, Bilal; Kim, Hyung Joon; Padmanabhan, Krishnan; Swoboda, Jonathan G; Ahmad, Insha; Kondo, Toru; Gage, Fred H; Theofilopoulos, Argyrios N; Lawson, Brian R; Schultz, Peter G; Lairson, Luke L
2013-10-17
Progressive phases of multiple sclerosis are associated with inhibited differentiation of the progenitor cell population that generates the mature oligodendrocytes required for remyelination and disease remission. To identify selective inducers of oligodendrocyte differentiation, we performed an image-based screen for myelin basic protein (MBP) expression using primary rat optic-nerve-derived progenitor cells. Here we show that among the most effective compounds identified was benztropine, which significantly decreases clinical severity in the experimental autoimmune encephalomyelitis (EAE) model of relapsing-remitting multiple sclerosis when administered alone or in combination with approved immunosuppressive treatments for multiple sclerosis. Evidence from a cuprizone-induced model of demyelination, in vitro and in vivo T-cell assays and EAE adoptive transfer experiments indicated that the observed efficacy of this drug results directly from an enhancement of remyelination rather than immune suppression. Pharmacological studies indicate that benztropine functions by a mechanism that involves direct antagonism of M1 and/or M3 muscarinic receptors. These studies should facilitate the development of effective new therapies for the treatment of multiple sclerosis that complement established immunosuppressive approaches.
Kim, Sungjin; Jinich, Adrián; Aspuru-Guzik, Alán
2017-04-24
We propose a multiple descriptor multiple kernel (MultiDK) method for efficient molecular discovery using machine learning. We show that the MultiDK method improves both the speed and accuracy of molecular property prediction. We apply the method to the discovery of electrolyte molecules for aqueous redox flow batteries. Using multiple-type, as opposed to single-type, descriptors, we obtain more relevant features for machine learning. Following the principle of the "wisdom of the crowds", the combination of multiple-type descriptors significantly boosts prediction performance. Moreover, by employing multiple kernels (more than one kernel function for a set of input descriptors), MultiDK exploits nonlinear relations between molecular structure and properties better than a linear regression approach. The multiple kernels consist of a Tanimoto similarity kernel and a linear kernel for a set of binary descriptors and a set of non-binary descriptors, respectively. Using MultiDK, we achieve an average performance of r² = 0.92 on a test set of molecules for solubility prediction. We also extend MultiDK to predict pH-dependent solubility and apply it to a set of quinone molecules with different ionizable functional groups to assess their performance as flow battery electrolytes.
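A rough sketch of the two-kernel combination, with invented data and an equal-weight kernel sum standing in for whatever weighting the authors use: a Tanimoto kernel on binary descriptors plus a linear kernel on non-binary descriptors, fed to a precomputed-kernel ridge regressor.

```python
# Sketch: Tanimoto kernel (binary fingerprints) + linear kernel (continuous
# descriptors) combined for kernel ridge regression on synthetic data.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def tanimoto_kernel(A, B):
    """Tanimoto similarity between rows of binary matrices A and B."""
    inner = A @ B.T
    return inner / (A.sum(1)[:, None] + B.sum(1)[None, :] - inner)

rng = np.random.default_rng(0)
X_bin = rng.integers(0, 2, (100, 64))   # binary descriptors (e.g. fingerprints)
X_num = rng.normal(size=(100, 8))       # non-binary descriptors
y = rng.normal(size=100)                # stand-in property, e.g. solubility

K = tanimoto_kernel(X_bin, X_bin) + X_num @ X_num.T   # combined kernel
model = KernelRidge(alpha=1.0, kernel="precomputed").fit(K, y)
print(model.predict(K)[:3])             # in-sample predictions
```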
Fusion of magnetometer and gradiometer sensors of MEG in the presence of multiplicative error.
Mohseni, Hamid R; Woolrich, Mark W; Kringelbach, Morten L; Luckhoo, Henry; Smith, Penny Probert; Aziz, Tipu Z
2012-07-01
Novel neuroimaging techniques have provided unprecedented information on the structure and function of the living human brain. Multimodal fusion of data from different sensors promises to radically improve this understanding, yet optimal methods have not been developed. Here, we demonstrate a novel method for combining multichannel signals. We show how this method can be used to fuse signals from the magnetometer and gradiometer sensors used in magnetoencephalography (MEG), and through extensive experiments using simulation, head phantom and real MEG data, show that it is both robust and accurate. This new approach works by assuming that the lead fields have multiplicative error. The criterion to estimate the error is given within a spatial filter framework such that the estimated power is minimized in the worst case scenario. The method is compared to, and found better than, existing approaches. The closed-form solution and the conditions under which the multiplicative error can be optimally estimated are provided. This novel approach can also be employed for multimodal fusion of other multichannel signals such as MEG and EEG. Although the multiplicative error is estimated based on beamforming, other methods for source analysis can equally be used after the lead-field modification.
Cluster ensemble based on Random Forests for genetic data.
Alhusain, Luluah; Hafez, Alaaeldin M
2017-01-01
Clustering plays a crucial role in several application domains, such as bioinformatics. In bioinformatics, clustering has been extensively used as an approach for detecting interesting patterns in genetic data. One application is population structure analysis, which aims to group individuals into subpopulations based on shared genetic variations, such as single nucleotide polymorphisms. Advances in DNA sequencing technology have facilitated the obtainment of genetic datasets of exceptional size. Genetic data usually contain hundreds of thousands of genetic markers genotyped for thousands of individuals, making an efficient means of handling such data desirable. Random Forests (RFs) has emerged as an efficient algorithm capable of handling high-dimensional data. RFs provides a proximity measure that can capture different levels of co-occurring relationships between variables. RFs has been widely considered a supervised learning method, although it can be converted into an unsupervised learning method. Therefore, an RF-derived proximity measure combined with a clustering technique may be well suited for determining the underlying structure of unlabeled data. This paper proposes RFcluE, a cluster ensemble approach for determining the underlying structure of genetic data based on RFs. The approach comprises a cluster ensemble framework that combines multiple runs of RF clustering. Experiments were conducted on a high-dimensional, real genetic dataset to evaluate the proposed approach. The experiments included an examination of the impact of parameter changes, a comparison of RFcluE performance against other clustering methods, and an assessment of the relationship between the diversity and quality of the ensemble and its effect on RFcluE performance. The paper demonstrates the effectiveness of RFcluE for population structure analysis and illustrates that applying a cluster ensemble approach, combining multiple RF clusterings, produces more robust and higher-quality results as a consequence of feeding the ensemble with diverse views of the high-dimensional genetic data obtained through bagging and random subspace selection, the two key features of the RF algorithm.
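A condensed sketch of one RF-style clustering run, using a tree-embedding proximity as a stand-in for the RF proximity measure: the fraction of trees in which two samples share a leaf defines a similarity, whose complement is clustered hierarchically. An ensemble in the spirit of RFcluE would repeat this over bagged/subspace views and combine the labelings; the data and parameters here are illustrative.

```python
# Sketch: proximity from shared leaves of an unsupervised tree ensemble,
# then average-linkage clustering on the derived distance.
import numpy as np
from sklearn.ensemble import RandomTreesEmbedding
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

X = np.random.rand(60, 100)                      # stand-in genotype matrix
leaves = RandomTreesEmbedding(n_estimators=200, random_state=0).fit_transform(X)
proximity = (leaves @ leaves.T).toarray() / 200  # fraction of shared leaves
distance = 1.0 - proximity
np.fill_diagonal(distance, 0.0)
labels = fcluster(linkage(squareform(distance), method="average"),
                  t=3, criterion="maxclust")     # cut into 3 subpopulations
print(labels)
```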
Tanpitukpongse, Teerath P.; Mazurowski, Maciej A.; Ikhena, John; Petrella, Jeffrey R.
2016-01-01
Background and Purpose To assess prognostic efficacy of individual versus combined regional volumetrics in two commercially-available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer's disease. Materials and Methods Data was obtained through the Alzheimer's Disease Neuroimaging Initiative. 192 subjects (mean age 74.8 years, 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1WI MRI sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant® and Neuroreader™. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated using a univariable approach employing individual regional brain volumes, as well as two multivariable approaches (multiple regression and random forest), combining multiple volumes. Results On univariable analysis of 11 NeuroQuant® and 11 Neuroreader™ regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69 NeuroQuant®, 0.68 Neuroreader™), and was not significantly different (p > 0.05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63 logistic regression, 0.60 random forest NeuroQuant®; 0.65 logistic regression, 0.62 random forest Neuroreader™). Conclusion Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer's disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in MCI, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. PMID:28057634
Multi-objective optimization for generating a weighted multi-model ensemble
NASA Astrophysics Data System (ADS)
Lee, H.
2017-12-01
Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide reliable future projections.
Non-Mutually Exclusive Deep Neural Network Classifier for Combined Modes of Bearing Fault Diagnosis.
Duong, Bach Phi; Kim, Jong-Myon
2018-04-07
The simultaneous occurrence of various types of defects in bearings makes their diagnosis more challenging owing to the resultant complexity of the constituent parts of the acoustic emission (AE) signals. To address this issue, a new approach is proposed in this paper for the detection of multiple combined faults in bearings. The proposed methodology uses a deep neural network (DNN) architecture to effectively diagnose the combined defects. The DNN structure is based on the stacked denoising autoencoder non-mutually exclusive classifier (NMEC) method for combined modes. The NMEC-DNN is trained using data for a single fault and it classifies both single faults and multiple combined faults. The results of experiments conducted on AE data collected through an experimental test-bed demonstrate that the DNN achieves good classification performance with a maximum accuracy of 95%. The proposed method is compared with a multi-class classifier based on support vector machines (SVMs). The NMEC-DNN yields better diagnostic performance in comparison to the multi-class classifier based on SVM. The NMEC-DNN reduces the number of necessary data collections and improves the bearing fault diagnosis performance.
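The non-mutually-exclusive idea can be sketched with independent sigmoid outputs, one per fault type, in place of a softmax: a signal can then activate several labels at once, so combined faults can emerge from single-fault training labels. The toy PyTorch architecture below is an assumption for illustration, not the paper's stacked denoising autoencoder.

```python
# Sketch: multi-label (non-mutually-exclusive) fault classifier head.
import torch
import torch.nn as nn

n_features, n_faults = 128, 4
model = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                      nn.Linear(64, n_faults))        # logits per fault type
loss_fn = nn.BCEWithLogitsLoss()                      # independent per-label loss

x = torch.randn(32, n_features)                       # stand-in AE features
y = torch.randint(0, 2, (32, n_faults)).float()       # multi-hot fault labels
loss = loss_fn(model(x), y)
loss.backward()
predicted = (torch.sigmoid(model(x)) > 0.5).int()     # possibly several faults
```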
Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method
NASA Astrophysics Data System (ADS)
Lee, G.; Jun, K. S.; Chung, E.-S.
2015-04-01
This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with data fuzzification to quantify spatial flood vulnerability across multiple criteria. In general, the GDM method is an effective tool for formulating a compromise solution that involves various decision makers, since various stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-based method can be used to obtain a nearly ideal solution according to all established criteria. The proposed approach can effectively produce compromise decisions by combining the GDM method with the fuzzy VIKOR method. The spatial flood vulnerability of the southern Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with the spatial flood vulnerability obtained using general MCDM methods, such as fuzzy TOPSIS and classical GDM methods (i.e., Borda, Condorcet, and Copeland). As a result, the proposed fuzzy GDM approach can reduce the uncertainty in the data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
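For reference, a crisp VIKOR sketch (the fuzzy variant fuzzifies the inputs and defuzzifies before or during this step): compute the group utility S, individual regret R, and compromise index Q, then rank by Q. The decision matrix, weights, and v = 0.5 are illustrative.

```python
# Sketch: crisp VIKOR ranking of alternatives on larger-is-better criteria.
import numpy as np

def vikor(X, w, v=0.5):
    """X: (alternatives x criteria), larger-is-better; w: criterion weights."""
    f_best, f_worst = X.max(0), X.min(0)
    D = w * (f_best - X) / (f_best - f_worst)   # weighted normalized distances
    S, R = D.sum(1), D.max(1)                   # group utility, max regret
    Q = v * (S - S.min()) / (S.max() - S.min()) \
        + (1 - v) * (R - R.min()) / (R.max() - R.min())
    return S, R, Q

X = np.array([[0.7, 0.4, 0.9], [0.5, 0.8, 0.6], [0.9, 0.3, 0.4]])
w = np.array([0.5, 0.3, 0.2])
S, R, Q = vikor(X, w)
print(np.argsort(Q))   # alternatives ranked by compromise index Q
```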
Automatic Prediction of Protein 3D Structures by Probabilistic Multi-template Homology Modeling.
Meier, Armin; Söding, Johannes
2015-10-01
Homology modeling predicts the 3D structure of a query protein based on the sequence alignment with one or more template proteins of known structure. Its great importance for biological research is owed to its speed, simplicity, reliability and wide applicability, covering more than half of the residues in protein sequence space. Although multiple templates have been shown to generally increase model quality over single templates, the information from multiple templates has so far been combined using empirically motivated, heuristic approaches. We present here a rigorous statistical framework for multi-template homology modeling. First, we find that the query proteins' atomic distance restraints can be accurately described by two-component Gaussian mixtures. This insight allowed us to apply the standard laws of probability theory to combine restraints from multiple templates. Second, we derive theoretically optimal weights to correct for the redundancy among related templates. Third, a heuristic template selection strategy is proposed. We improve the average GDT-ha model quality score by 11% over single template modeling and by 6.5% over a conventional multi-template approach on a set of 1000 query proteins. Robustness with respect to wrong constraints is likewise improved. We have integrated our multi-template modeling approach with the popular MODELLER homology modeling software in our free HHpred server http://toolkit.tuebingen.mpg.de/hhpred and also offer open source software for running MODELLER with the new restraints at https://bitbucket.org/soedinglab/hh-suite.
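A toy sketch of the framework's core ingredient, with invented distances and uniform weights standing in for the derived redundancy-corrected ones: each template contributes a two-component Gaussian mixture around its observed distance, and templates are combined as a weighted sum of densities.

```python
# Sketch: combine per-template two-component Gaussian restraint densities.
import numpy as np

def mixture_pdf(d, mu, sigmas=(0.5, 2.0), alpha=0.7):
    """Two-component Gaussian mixture centered on the template distance mu."""
    def gauss(s):
        return np.exp(-0.5 * ((d - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return alpha * gauss(sigmas[0]) + (1 - alpha) * gauss(sigmas[1])

template_distances = [8.2, 8.9, 10.1]         # same atom pair, three templates
weights = np.ones(3) / 3                      # placeholder redundancy weights
d_grid = np.linspace(5, 13, 200)
combined = sum(w * mixture_pdf(d_grid, mu)
               for w, mu in zip(weights, template_distances))
print(d_grid[np.argmax(combined)])            # most probable restraint distance
```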
NASA Technical Reports Server (NTRS)
Kim, Hakil; Swain, Philip H.
1990-01-01
An axiomatic approach to interval-valued (IV) probabilities is presented, where an IV probability is defined by a pair of set-theoretic functions satisfying pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for representing and combining evidential information, they make the decision process more complicated and entail more intelligent decision strategies. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case, a set of multiple sources is obtained by dividing the dimensionally large data into smaller, more manageable pieces based on global statistical correlation information. Through this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
Motion compensation via redundant-wavelet multihypothesis.
Fowler, James E; Cui, Suxia; Wang, Yonghui
2006-10-01
Multihypothesis motion compensation has been widely used in video coding with previous attention focused on techniques employing predictions that are diverse spatially or temporally. In this paper, the multihypothesis concept is extended into the transform domain by using a redundant wavelet transform to produce multiple predictions that are diverse in transform phase. The corresponding multiple-phase inverse transform implicitly combines the phase-diverse predictions into a single spatial-domain prediction for motion compensation. The performance advantage of this redundant-wavelet-multihypothesis approach is investigated analytically, invoking the fact that the multiple-phase inverse involves a projection that significantly reduces the power of a dense-motion residual modeled as additive noise. The analysis shows that redundant-wavelet multihypothesis is capable of up to a 7-dB reduction in prediction-residual variance over an equivalent single-phase, single-hypothesis approach. Experimental results substantiate the performance advantage for a block-based implementation.
Distribution of model uncertainty across multiple data streams
NASA Astrophysics Data System (ADS)
Wutzler, Thomas
2014-05-01
When confronting biogeochemical models with a diversity of observational data streams, we face the problem of weighting the data streams. Without weighting, or without multiple blocked cost functions, model uncertainty is allocated to the sparse data streams, and possible bias in processes that are strongly constrained is exported to processes that are constrained only by sparse data streams. In this study we propose an approach that treats model uncertainty as a factor of observation uncertainty that is constant across all data streams. Further, we propose an implementation based on Markov chain Monte Carlo sampling combined with simulated annealing that is able to determine this variance factor. The method is exemplified with very simple models and artificial data, and with an inversion of the DALEC ecosystem carbon model against multiple observations of Howland forest. We argue that the presented approach can help mitigate, and perhaps resolve, the problem of bias export to sparse data streams.
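As a toy illustration of the proposed implementation, the sketch below runs a Metropolis sampler with a simulated-annealing temperature schedule over one model parameter and a single variance factor that inflates every stream's observation uncertainty equally. The model, data, and schedule are invented for illustration, not taken from the study.

```python
import numpy as np
rng = np.random.default_rng(0)

# Toy setup: one model parameter theta, two data streams of very different
# size and noise level (dense vs sparse), given as (observations, sigma).
streams = [(rng.normal(2.0, 0.5, size=500), 0.5),   # dense stream
           (rng.normal(2.3, 1.0, size=10), 1.0)]    # sparse stream

def log_lik(theta, c):
    """Model uncertainty enters as a single variance factor c that inflates
    every stream's observation standard deviation by the same amount."""
    ll = 0.0
    for y, sigma in streams:
        s2 = (c * sigma) ** 2
        ll += -0.5 * np.sum((y - theta) ** 2) / s2 - y.size * np.log(c * sigma)
    return ll

theta, c = 0.0, 1.0
for step in range(20000):
    T = max(1.0, 10.0 * (1 - step / 5000))          # annealing temperature
    th_new = theta + rng.normal(0, 0.05)
    c_new = max(c + rng.normal(0, 0.02), 1e-3)      # keep factor positive
    if np.log(rng.uniform()) < (log_lik(th_new, c_new) - log_lik(theta, c)) / T:
        theta, c = th_new, c_new
print(theta, c)   # final sample of parameter and variance factor
```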
A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography
Aganj, Iman; Lenglet, Christophe; Jahanshad, Neda; Yacoub, Essa; Harel, Noam; Thompson, Paul M.; Sapiro, Guillermo
2011-01-01
A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in this work. The proposed framework tests candidate 3D curves in the volume, assigning to each one a score computed from the diffusion images, and then selects the curves with the highest scores as the potential anatomical connections. The algorithm avoids local minima by performing an exhaustive search at the desired resolution. The technique is easily extended to multiple subjects, considering a single representative volume where the registered high-angular resolution diffusion images (HARDI) from all the subjects are non-linearly combined, thereby obtaining population-representative tracts. The tractography algorithm is run only once for the multiple subjects, and no tract alignment is necessary. We present experimental results on HARDI volumes, ranging from simulated and 1.5T physical phantoms to 7T and 4T human brain and 7T monkey brain datasets. PMID:21376655
Cheong, Fook Chiong; Wong, Chui Ching; Gao, YunFeng; Nai, Mui Hoon; Cui, Yidan; Park, Sungsu; Kenney, Linda J.; Lim, Chwee Teck
2015-01-01
Tracking fast-swimming bacteria in three dimensions can be extremely challenging with current optical techniques and a microscopic approach that can rapidly acquire volumetric information is required. Here, we introduce phase-contrast holographic video microscopy as a solution for the simultaneous tracking of multiple fast moving cells in three dimensions. This technique uses interference patterns formed between the scattered and the incident field to infer the three-dimensional (3D) position and size of bacteria. Using this optical approach, motility dynamics of multiple bacteria in three dimensions, such as speed and turn angles, can be obtained within minutes. We demonstrated the feasibility of this method by effectively tracking multiple bacteria species, including Escherichia coli, Agrobacterium tumefaciens, and Pseudomonas aeruginosa. In addition, we combined our fast 3D imaging technique with a microfluidic device to present an example of a drug/chemical assay to study effects on bacterial motility. PMID:25762336
How to retrieve additional information from the multiplicity distributions
NASA Astrophysics Data System (ADS)
Wilk, Grzegorz; Włodarczyk, Zbigniew
2017-01-01
Multiplicity distributions (MDs) P(N) measured in multiparticle production processes are most frequently described by the negative binomial distribution (NBD). However, with increasing collision energy some systematic discrepancies have become more and more apparent. They are usually attributed to the possible multi-source structure of the production process and described using a multi-NBD form of the MD. We investigate the possibility of keeping a single NBD but with its parameters depending on the multiplicity N. This is done by modifying the widely known clan model of particle production leading to the NBD form of P(N). This is then confronted with the approach based on the so-called cascade-stochastic formalism which is based on different types of recurrence relations defining P(N). We demonstrate that a combination of both approaches allows the retrieval of additional valuable information from the MDs, namely the oscillatory behavior of the counting statistics apparently visible in the high energy data.
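For reference, a standard parameterization of the NBD and the recurrence defining the modified combinants C_j discussed here can be written as follows (a conventional formulation; the notation may differ slightly from the authors'):

```latex
% NBD with mean m and shape parameter k, and the recurrence relation
% defining the modified combinants C_j:
P(N) = \frac{\Gamma(N+k)}{\Gamma(N+1)\,\Gamma(k)}
       \left(\frac{m}{m+k}\right)^{N}\left(\frac{k}{m+k}\right)^{k},
\qquad
(N+1)\,P(N+1) = \langle N\rangle \sum_{j=0}^{N} C_j\,P(N-j).
```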
Park, Sung Jin; Ogunseitan, Oladele A; Lejano, Raul P
2014-01-01
Regulatory agencies often face a dilemma when regulating chemicals in consumer products, namely, that of making decisions in the face of multiple, and sometimes conflicting, lines of evidence. We present an integrative approach for dealing with uncertainty and multiple pieces of evidence in toxics regulation. The integrative risk analytic framework is grounded in the Dempster-Shafer (D-S) theory that allows the analyst to combine multiple pieces of evidence and judgments from independent sources of information. We apply the integrative approach to the comparative risk assessment of bisphenol-A (BPA)-based polycarbonate and the functionally equivalent alternative, Eastman Tritan copolyester (ETC). Our results show that according to cumulative empirical evidence, the estimated probability of toxicity of BPA is 0.034, whereas the toxicity probability for ETC is 0.097. However, when we combine extant evidence with strength of confidence in the source (or expert judgment), we are guided by a richer interval measure, (Bel(t), Pl(t)). With the D-S derived measure, we arrive at various intervals, with the low-range estimate at (0.034, 0.250) for BPA and (0.097, 0.688) for ETC. These new measures allow a reasonable basis for comparison and a justifiable procedure for decision making that takes advantage of multiple sources of evidence. Through the application of D-S theory to toxicity risk assessment, we show how a multiplicity of scientific evidence can be converted into a unified risk estimate, and how this information can be effectively used for comparative assessments to select potentially less toxic alternative chemicals. © 2013 SETAC.
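A minimal sketch of the Dempster-Shafer combination step that produces such (Bel(t), Pl(t)) intervals is given below. The frame, focal elements, and mass values are invented for illustration and are not the paper's BPA/ETC numbers.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: intersect focal elements and renormalize by 1 - K,
    where K is the mass assigned to conflicting (empty) intersections."""
    out, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {k: v / (1.0 - conflict) for k, v in out.items()}

T, N = frozenset({"toxic"}), frozenset({"benign"})
TN = T | N  # total ignorance
m_evidence = {T: 0.30, N: 0.50, TN: 0.20}  # one line of evidence
m_expert = {T: 0.10, N: 0.60, TN: 0.30}    # expert judgment
m = combine(m_evidence, m_expert)
bel_t = m.get(T, 0.0)
pl_t = bel_t + m.get(TN, 0.0)   # plausibility adds the mass left 'unknown'
print(f"(Bel(t), Pl(t)) = ({bel_t:.3f}, {pl_t:.3f})")
```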
Multifactorial antimicrobial wood protectants
Robert D. Coleman; Carol A. Clausen
2008-01-01
It is unlikely that a single antimicrobial compound, whether synthetic or natural, will provide the "magic bullet" for eliminating multiple biological agents affecting wood products. Development of synergistic combinations of selected compounds, especially those derived from natural sources, is recognized as a promising approach to improved wood protection. Recent...
A comparative study of cellulose nanofibrils disintegrated via multiple processing approaches
Yan Qing; Ronald Sabo; J.Y. Zhu; Umesh Agarwal; Zhiyong Cai; Yiqiang Wu
2013-01-01
Various cellulose nanofibrils (CNFs) created by refining and microfluidization, in combination with enzymatic or 2,2,6,6-tetramethylpiperidine-1-oxyl (TEMPO) oxidized pretreatment, were compared. The morphological properties, degree of polymerization, and crystallinity for the obtained nanofibrils, as well as physical and mechanical properties of the corresponding films...
Applying Quality Function Deployment in Industrial Design Curriculum Planning
ERIC Educational Resources Information Center
Liu, Shuo-Fang; Lee, Yann-Long; Lin, Yi-Zhi; Tseng, Chien-Feng
2013-01-01
Industrial design is a discipline that combines multiple professional fields. Enterprise demands for industrial design competencies also change over time; thus, the curriculum of industrial design education should be compatible with the current demands of the industry. However, scientific approaches have not been previously employed to plan…
The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...
School Climate, Principal Support and Collaboration among Portuguese Teachers
ERIC Educational Resources Information Center
Castro Silva, José; Amante, Lúcia; Morgado, José
2017-01-01
This article analyses the relationship between school principal support and teacher collaboration among Portuguese teachers. Data were collected from a random sample of 234 teachers in middle and secondary schools. A combined approach using linear and multiple regression tests showed that school principal support, through the…
ERIC Educational Resources Information Center
Hooper, Janice; And Others
1987-01-01
A multimodal intervention program designed for a nine-year-old with severe communication problems (secondary to cerebral palsy, receptive dysphasia, and auditory agnosia) combined manual signs and graphic symbols to help her communicate. The intensive, highly structured program had significant positive results. (Author/CB)
The Effects of Mobile Collaborative Activities in a Second Language Course
ERIC Educational Resources Information Center
Ilic, Peter
2015-01-01
This research is designed to explore the areas of collaborative learning and the use of smartphones as a support for collaborative learning through a year-long exploratory multiple case study approach integrating both qualitative and quantitative data analysis. Qualitative exploratory interviews are combined with Multidimensional Scaling Analysis…
USDA-ARS?s Scientific Manuscript database
Hyperspectral scattering is a promising technique for rapid and noninvasive measurement of multiple quality attributes of apple fruit. A hierarchical evolutionary algorithm (HEA) approach, in combination with subspace decomposition and partial least squares (PLS) regression, was proposed to select o...
Meta-Analysis of Scale Reliability Using Latent Variable Modeling
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2013-01-01
A latent variable modeling approach is outlined that can be used for meta-analysis of reliability coefficients of multicomponent measuring instruments. Important limitations of efforts to combine composite reliability findings across multiple studies are initially pointed out. A reliability synthesis procedure is discussed that is based on…
Multi-jet Merging with NLO Matrix Elements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siegert, Frank; /Freiburg U.; Hoche, Stefan
2011-08-18
In the algorithm presented here, the ME+PS approach, which merges samples of tree-level matrix elements into inclusive event samples, is combined with the POWHEG method, which includes exact next-to-leading order matrix elements in the parton shower. The advantages of the method are discussed and the quality of its implementation in SHERPA is exemplified by results for e+e- annihilation into hadrons at LEP, for deep-inelastic lepton-nucleon scattering at HERA, for Drell-Yan lepton-pair production at the Tevatron, and for W+W- production at LHC energies. The simulation of hard QCD radiation in parton-shower Monte Carlos has seen tremendous progress over the last years. It was largely stimulated by the need for more precise predictions at LHC energies, where the large available phase space allows additional hard QCD radiation alongside known Standard Model processes or even signals from new physics. Two types of algorithms have been developed which make it possible to improve upon the soft-collinear approximations made in the parton shower, such that hard radiation is simulated according to exact matrix elements. In the ME+PS approach [1], higher-order tree-level matrix elements for different final-state jet multiplicities are merged with each other and with subsequent parton-shower emissions to generate an inclusive sample. Such a prescription is invaluable for analyses that are sensitive to final states with a large jet multiplicity. The only remaining deficiency of such tree-level calculations is the large uncertainty stemming from scale variations. The POWHEG method [2] solves this problem for the lowest-multiplicity subprocess by combining full NLO matrix elements with the parton shower. While this leads to NLO accuracy in the inclusive cross section and the exact radiation pattern for the first emission, it fails to describe higher-order emissions with improved accuracy, and is thus not sufficient if final states with high jet multiplicities are considered. With the complementary advantages of these two approaches, the question arises naturally whether it would be possible to combine them into an even more powerful one. Such a combined algorithm was independently developed in [5] and [6]. Here a summary of the algorithm is given and predictions from the corresponding Monte Carlo implementation are presented.
Integrated Model of Multiple Kernel Learning and Differential Evolution for EUR/USD Trading
Deng, Shangkun; Sakurai, Akito
2014-01-01
Currency trading is an important area for individual investors, government policy decisions, and organization investments. In this study, we propose a hybrid approach referred to as MKL-DE, which combines multiple kernel learning (MKL) with differential evolution (DE) for trading a currency pair. MKL is used to learn a model that predicts changes in the target currency pair, whereas DE is used to generate buy and sell signals for the target currency pair based on the relative strength index (RSI) combined with the MKL prediction as a trading signal. The new hybrid implementation is applied to EUR/USD trading, the most traded foreign exchange (FX) currency pair. MKL is essential for utilizing information from multiple information sources, and DE is essential for formulating a trading rule based on a mixture of discrete structures and continuous parameters. Initially, the prediction model optimized by MKL predicts returns based on a technical indicator called the moving average convergence and divergence (MACD). Next, a combined trading signal is optimized by DE using the inputs from the prediction model and the RSI indicator obtained from multiple timeframes. The experimental results showed that trading using the prediction learned by MKL yielded consistent profits. PMID:25097891
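As an illustration of one of the ingredients, the sketch below computes a plain RSI from a price series. The window length, the smoothing choice (simple averaging rather than Wilder's exponential smoothing), and the data are illustrative assumptions.

```python
import numpy as np

def rsi(prices, period=14):
    """Relative strength index over a fixed window: 100 - 100/(1 + RS),
    where RS is the ratio of average gain to average loss."""
    deltas = np.diff(prices)
    gains = np.where(deltas > 0, deltas, 0.0)
    losses = np.where(deltas < 0, -deltas, 0.0)
    avg_gain = np.convolve(gains, np.ones(period) / period, mode="valid")
    avg_loss = np.convolve(losses, np.ones(period) / period, mode="valid")
    rs = avg_gain / np.where(avg_loss == 0, np.nan, avg_loss)
    return 100.0 - 100.0 / (1.0 + rs)

# Synthetic random-walk prices; common heuristics read RSI > 70 as
# overbought and RSI < 30 as oversold.
prices = np.cumsum(np.random.default_rng(1).normal(0, 1, 200)) + 100
print(rsi(prices)[-1])
```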
A Bayesian trans-dimensional approach for the fusion of multiple geophysical datasets
NASA Astrophysics Data System (ADS)
JafarGandomi, Arash; Binley, Andrew
2013-09-01
We propose a Bayesian fusion approach to integrate multiple geophysical datasets with different coverage and sensitivity. The fusion strategy is based on the capability of various geophysical methods to provide enough resolution to identify either subsurface material parameters or subsurface structure, or both. We focus on electrical resistivity as the target material parameter and electrical resistivity tomography (ERT), electromagnetic induction (EMI), and ground penetrating radar (GPR) as the set of geophysical methods. However, extending the approach to different sets of geophysical parameters and methods is straightforward. The different geophysical datasets are entered into a trans-dimensional Markov chain Monte Carlo (McMC) search-based joint inversion algorithm. The trans-dimensional property of the McMC algorithm allows dynamic parameterisation of the model space, which in turn helps to avoid biasing the post-inversion results towards a particular model. Given that we are attempting to develop an approach that has practical potential, we discretize the subsurface into an array of one-dimensional earth models. Accordingly, the ERT data, which are collected using a two-dimensional acquisition geometry, are re-cast as a set of equivalent vertical electric soundings. The different data are inverted either individually or jointly to estimate one-dimensional subsurface models at discrete locations. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical datasets. Information from multiple methods is brought together by introducing a joint likelihood function and/or constraining the prior information. A Bayesian maximum entropy approach is used for spatial fusion of the spatially dispersed estimated one-dimensional models and for mapping of the target parameter. We illustrate the approach with a synthetic dataset and then apply it to a field dataset. We show that the proposed fusion strategy is successful not only in enhancing the subsurface information but also as a survey design tool, identifying the appropriate combination of geophysical tools and showing whether application of an individual method for further investigation of a specific site is beneficial.
Mayhew, Terry M; Lucocq, John M
2011-03-01
Various methods for quantifying cellular immunogold labelling on transmission electron microscope thin sections are currently available. All rely on sound random sampling principles and are applicable to single immunolabelling across compartments within a given cell type or between different experimental groups of cells. Although methods are also available to test for colocalization in double/triple immunogold labelling studies, so far, these have relied on making multiple measurements of gold particle densities in defined areas or of inter-particle nearest neighbour distances. Here, we present alternative two-step approaches to codistribution and colocalization assessment that merely require raw counts of gold particles in distinct cellular compartments. For assessing codistribution over aggregate compartments, initial statistical evaluation involves combining contingency table and chi-squared analyses to provide predicted gold particle distributions. The observed and predicted distributions allow testing of the appropriate null hypothesis, namely, that there is no difference in the distribution patterns of proteins labelled by different sizes of gold particle. In short, the null hypothesis is that of colocalization. The approach for assessing colabelling recognises that, on thin sections, a compartment is made up of a set of sectional images (profiles) of cognate structures. The approach involves identifying two groups of compartmental profiles that are unlabelled and labelled for one gold marker size. The proportions in each group that are also labelled for the second gold marker size are then compared. Statistical analysis now uses a 2 × 2 contingency table combined with the Fisher exact probability test. Having identified double labelling, the profiles can be analysed further in order to identify characteristic features that might account for the double labelling. In each case, the approach is illustrated using synthetic and/or experimental datasets and can be refined to correct observed labelling patterns to specific labelling patterns. These simple and efficient approaches should be of more immediate utility to those interested in codistribution and colocalization in multiple immunogold labelling investigations.
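The colabelling test described above reduces to a 2 × 2 contingency table (profiles labelled or unlabelled by the first marker, split by presence of the second marker), to which Fisher's exact test applies directly. The sketch below uses SciPy with invented counts.

```python
from scipy.stats import fisher_exact

# Rows: profiles labelled / unlabelled by marker A.
# Columns: with / without marker B. Counts are invented for illustration.
table = [[37, 13],
         [11, 39]]
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(odds_ratio, p)  # small p -> reject independence, i.e. evidence of colabelling
```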
Multiple curved descending approaches and the air traffic control problem
NASA Technical Reports Server (NTRS)
Hart, S. G.; Mcpherson, D.; Kreifeldt, J.; Wemple, T. E.
1977-01-01
A terminal-area air traffic control simulation was designed to study ways of accommodating increased air traffic density. The concepts investigated assumed the availability of the microwave landing system and data link and included: (1) multiple curved descending final approaches; (2) parallel runways certified for independent and simultaneous operation under IFR conditions; (3) closer spacing between successive aircraft; and (4) a distributed management system between the air and ground. Three groups, each consisting of three pilots and two air traffic controllers, flew a combined total of 350 approaches. Piloted simulators were supplied with computer-generated traffic situation displays and flight instruments. The controllers were supplied with a terminal-area map and digital status information. Pilots and controllers reported that the distributed management procedure was somewhat safer and more orderly than the centralized management procedure. Flying precision increased as the amount of turn required to intersect the outer marker decreased. Pilots reported that they preferred multiple curved descending approaches with wider spacing between aircraft to closer spacing on single, straight-in finals, while controllers preferred the latter option. Both pilots and controllers felt that parallel runways are an acceptable way to accommodate increased traffic density safely and expeditiously.
Bayesian networks improve causal environmental ...
Rule-based weight-of-evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, conventional weight-of-evidence approaches ignore sources of uncertainty that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improves the ability of models to incorporate the strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables given the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative weight-of-evidence approaches in capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
Information Retrieval and Graph Analysis Approaches for Book Recommendation.
Benkoussas, Chahinez; Bellot, Patrice
2015-01-01
A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and language models, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network comprising social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
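As a minimal illustration of the graph-analysis ingredient, the sketch below runs damped PageRank by power iteration over a tiny directed document graph. The edges and damping factor are illustrative, and this is not the DGD construction itself.

```python
import numpy as np

# Tiny directed graph of documents linked by social information
# (edge lists are invented for illustration).
edges = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85

A = np.zeros((n, n))
for src, dsts in edges.items():
    for dst in dsts:
        A[dst, src] = 1.0 / len(dsts)     # column-stochastic link matrix

r = np.full(n, 1.0 / n)
for _ in range(100):
    r = (1 - d) / n + d * A @ r           # damped random-surfer update
print(r / r.sum())                        # stationary importance scores
```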
A statistical approach to combining multisource information in one-class classifiers
Simonson, Katherine M.; Derek West, R.; Hansen, Ross L.; ...
2017-06-08
A new method is introduced in this paper for combining information from multiple sources to support one-class classification. The contributing sources may represent measurements taken by different sensors of the same physical entity, repeated measurements by a single sensor, or numerous features computed from a single measured image or signal. The approach utilizes the theory of statistical hypothesis testing, and applies Fisher's technique for combining p-values, modified to handle nonindependent sources. Classifier outputs take the form of fused p-values, which may be used to gauge the consistency of unknown entities with one or more class hypotheses. The approach enables rigorous assessment of classification uncertainties, and allows for traceability of classifier decisions back to the constituent sources, both of which are important for high-consequence decision support. Application of the technique is illustrated in two challenge problems, one for skin segmentation and the other for terrain labeling. Finally, the method is seen to be particularly effective for relatively small training samples.
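For the independent-source baseline, Fisher's technique combines k p-values via X = -2 Σ ln p_i, which follows a chi-squared distribution with 2k degrees of freedom under the null; the paper's modification for nonindependent sources is not reproduced in this sketch.

```python
import numpy as np
from scipy.stats import chi2, combine_pvalues

p = np.array([0.08, 0.21, 0.04, 0.33])        # per-source p-values (invented)
stat = -2.0 * np.log(p).sum()                 # Fisher's statistic
print(chi2.sf(stat, df=2 * p.size))           # fused p-value, independence case
print(combine_pvalues(p, method="fisher"))    # same computation via SciPy
```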
Issues in evaluation: evaluating assessments of elderly people using a combination of methods.
McEwan, R T
1989-02-01
In evaluating a health service, individuals will give differing accounts of its performance, according to their experiences of the service, and the evaluative perspective they adopt. The value of a service may also change through time, and according to the particular part of the service studied. Traditional health care evaluations have generally not accounted for this variability because of the approaches used. Studies evaluating screening or assessment programmes for the elderly have focused on programme effectiveness and efficiency, using relatively inflexible quantitative methods. Evaluative approaches must reflect the complexity of health service provision, and methods must vary to suit the particular research objective. Under these circumstances, this paper presents the case for the use of multiple triangulation in evaluative research, where differing methods and perspectives are combined in one study. Emphasis is placed on the applications and benefits of subjectivist approaches in evaluation. An example of combined methods is provided in the form of an evaluation of the Newcastle Care Plan for the Elderly.
Tsuda, Sachiko; Kee, Michelle Z L; Cunha, Catarina; Kim, Jinsook; Yan, Ping; Loew, Leslie M; Augustine, George J
2013-01-01
Recent advances in our understanding of brain function have come from using light to either control or image neuronal activity. Here we describe an approach that combines both techniques: a micromirror array is used to photostimulate populations of presynaptic neurons expressing channelrhodopsin-2, while a red-shifted voltage-sensitive dye allows optical detection of resulting postsynaptic activity. Such technology allowed us to control the activity of cerebellar interneurons while simultaneously recording inhibitory responses in multiple Purkinje neurons, their postsynaptic targets. This approach should substantially accelerate our understanding of information processing by populations of neurons within brain circuits. Copyright © 2013 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.
Multiple Chronic Condition Combinations and Depression in Community-Dwelling Older Adults.
Pruchno, Rachel A; Wilson-Genderson, Maureen; Heid, Allison R
2016-07-01
The U.S. Department of Health and Human Services recently called for a paradigm shift from the study of individual chronic conditions to multiple chronic conditions (MCCs). We identify the most common combinations of chronic diseases experienced by a sample of community-dwelling older people and assess whether depression is differentially associated with combinations of illnesses. Self-reports of diagnosed chronic conditions and depressive symptoms were provided by 5,688 people participating in the ORANJ BOWL(SM) research panel. Each respondent was categorized as belonging to one of 32 groups. ANOVA examined the association between depressive symptoms and combinations of illnesses. People with more health conditions experienced higher levels of depression than people with fewer health conditions. People with some illness combinations had higher levels of depressive symptoms than people with other illness combinations. Findings confirm extensive variability in the combinations of illnesses experienced by older adults and demonstrate the complex associations of specific illness combinations with depressive symptoms. Results highlight the need to expand our conceptualization of research and treatment around MCCs and call for a person-centered approach that addresses the unique needs of individuals with MCCs. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Kurth, R. E.; Ho, H.
1991-01-01
The objective of this program is to develop generic load models, with multiple levels of progressive sophistication, to simulate the composite (combined) load spectra induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, liquid oxygen posts, and system ducting. The first approach consists of using state-of-the-art probabilistic methods to describe the individual loading conditions and combinations of these loading conditions to synthesize the composite load spectra simulation. The second approach consists of developing coupled models for composite load spectra simulation that combine deterministic models for dynamic, acoustic, high-pressure, and high-rotational-speed load simulation using statistically varying coefficients. These coefficients will then be determined using advanced probabilistic simulation methods, with and without strategically selected experimental data.
An integrative formal model of motivation and decision making: The MGPM*.
Ballard, Timothy; Yeo, Gillian; Loft, Shayne; Vancouver, Jeffrey B; Neal, Andrew
2016-09-01
We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Management of Fatigue in Persons with Multiple Sclerosis
Khan, Fary; Amatya, Bhasker; Galea, Mary
2014-01-01
Fatigue is one of the most common symptoms of multiple sclerosis (MS). Despite advances in pharmacological and non-pharmacological treatment, fatigue continues to be a disabling symptom in persons with MS (pwMS), affecting almost 80% of pwMS. In current practice, pharmacological and non-pharmacological interventions are used in combination, encompassing a multi-disciplinary approach. The body of research investigating the effect of these interventions is growing. This review systematically evaluated the existing evidence on the effectiveness and safety of the different interventions currently applied for the management of fatigue in persons with multiple sclerosis, with respect to improving patient outcomes, to guide treating clinicians. PMID:25309504
Rescuing mutant CFTR: a multi-task approach to a better outcome in treating cystic fibrosis.
Amaral, Margarida D; Farinha, Carlos M
2013-01-01
Correcting the multiple defects of mutant CFTR with small-molecule compounds has been the goal of an increasing number of recent Cystic Fibrosis (CF) drug discovery programmes. However, the mechanism of action (MoA) by which these molecules restore mutant CFTR is still poorly understood, particularly for CFTR correctors, i.e., compounds that rescue to the cell surface the most prevalent mutant in CF patients, F508del-CFTR. There is increasing evidence that, to fully restore the multiple defects associated with F508del-CFTR, different small molecules with distinct corrective properties may be required. Towards this goal, better insight into the MoA of correctors is needed and several constraints should be addressed. The methodological approaches to achieve this include: 1) testing the combined effect of compounds with that of other (non-pharmacological) rescuing strategies (e.g., revertants or low temperature); 2) assessing effects in multiple cellular models (non-epithelial vs epithelial, non-human vs human, immortalized vs primary cultures, polarized vs non-polarized, cells vs tissues); 3) assessing compound effects on isolated CFTR domains (e.g., compound binding by surface plasmon resonance, assessing effects on domain folding and aggregation); and finally 4) assessing compound specificity in rescuing different CFTR mutants and other mutant proteins. These topics are reviewed and discussed here so as to provide a state-of-the-art review on how to combine multiple ways of rescuing mutant CFTR to the ultimate benefit of CF patients.
Design of clinical trials involving multiple hypothesis tests with a common control.
Schou, I Manjula; Marschner, Ian C
2017-07-01
Randomized clinical trials comparing several treatments to a common control are often reported in the medical literature. For example, multiple experimental treatments may be compared with placebo, or in combination therapy trials, a combination therapy may be compared with each of its constituent monotherapies. Such trials are typically designed using a balanced approach in which equal numbers of individuals are randomized to each arm; however, this can result in an inefficient use of resources. We provide a unified framework and new theoretical results for the optimal design of such single-control multiple-comparator studies. We consider variance-optimal designs based on D-, A-, and E-optimality criteria, using a general model that allows for heteroscedasticity and a range of effect measures that include both continuous and binary outcomes. We demonstrate the sensitivity of these designs to the type of optimality criterion by showing that the optimal allocation ratios are systematically ordered according to the optimality criterion. Given this sensitivity, we argue that power optimality is a more suitable approach when designing clinical trials where testing is the objective. Weighted variance-optimal designs are also discussed, which, like power-optimal designs, allow the treatment difference to play a major role in determining allocation ratios. We illustrate our methods using two real clinical trial examples taken from the medical literature. Some recommendations on the use of optimal designs in single-control multiple-comparator trials are also provided. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
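As a point of contrast with balanced designs, a classical special case (equal variances, k treatments versus one shared control, minimizing the variance of each treatment-control contrast at fixed total sample size) allocates sqrt(k) times as many subjects to the control arm as to each treatment arm. The sketch below implements only this textbook rule, not the paper's general heteroscedastic framework.

```python
import numpy as np

def sqrt_k_allocation(n_total, k):
    """sqrt(k) rule for k treatments vs one shared control under equal
    variances: the control gets sqrt(k) times each treatment arm's share."""
    unit = n_total / (k + np.sqrt(k))
    return {"control": np.sqrt(k) * unit, "each_treatment": unit}

print(sqrt_k_allocation(600, 4))  # control = 200, each of 4 arms = 100
```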
Some intriguing aspects of multiparticle production processes
NASA Astrophysics Data System (ADS)
Wilk, Grzegorz; Włodarczyk, Zbigniew
2018-04-01
Multiparticle production processes provide valuable information about the mechanism of the conversion of the initial energy of projectiles into a number of secondaries by measuring their multiplicity distributions and their distributions in phase space. They therefore serve as a reference point for more involved measurements. Distributions in phase space are usually investigated using the statistical approach, very successful in general but failing in cases of small colliding systems, small multiplicities, and at the edges of the allowed phase space, in which cases the underlying dynamical effects competing with the statistical distributions take over. We discuss an alternative approach, which applies to the whole phase space without detailed knowledge of dynamics. It is based on a modification of the usual statistics by generalizing it to a superstatistical form. We stress particularly the scaling and self-similar properties of such an approach manifesting themselves as the phenomena of the log-periodic oscillations and oscillations of temperature caused by sound waves in hadronic matter. Concerning the multiplicity distributions we discuss in detail the phenomenon of the oscillatory behavior of the modified combinants apparently observed in experimental data.
Scialla, Julia J
2018-07-01
Treatment of mineral metabolism is a mainstay of dialysis care including some of its most widely used and costly pharmaceuticals. Although many mineral metabolites are associated with increased risk of mortality, cardiovascular disease, and other morbidities, few clinical trials are available to guide therapy and most focus on single drug approaches. In practice, providers manage many aspects of mineral metabolism simultaneously in integrated treatment approaches that incorporate multiple agents and changes in the dialysis prescription. The present review discusses the rationale and existing evidence for evaluating integrated, as opposed to single drug, approaches in mineral metabolism. Drugs used to treat mineral metabolism have numerous, and sometimes, opposing effects on biochemical risk factors, such as fibroblast growth factor 23 (FGF23), calcium, and phosphorus. Although vitamin D sterols raise these risk markers when lowering parathyroid hormone (PTH), calcimimetics lower them. Trials demonstrate that combined approaches best 'normalize' the mineral metabolism axis in end-stage renal disease (ESRD). Observations embedded within major trials of calcimimetics reveal that adjustment of calcium-based binders and dialysate calcium is a common approach to adverse effects of these drugs with some initial, but inconclusive, evidence that these co-interventions may impact outcomes. The multiple, and often opposing, biochemical effects of many mineral metabolism drugs provides a strong rationale for studying integrated management strategies that consider combinations of drugs and co-interventions as a whole. This remains a current gap in the field with opportunities for clinical trials.
Love, Tanzy; Carriquiry, Alicia
2009-01-01
We analyze data collected in a somatic embryogenesis experiment carried out on Zea mays at Iowa State University. The main objective of the study was to identify the set of genes in maize that actively participate in embryo development. Embryo tissue was sampled and analyzed at various time periods and under different mediums and light conditions. As is the case in many microarray experiments, the operator scanned each slide multiple times to find the slide-specific ‘optimal’ laser and sensor settings. The multiple readings of each slide are repeated measurements on different scales with differing censoring; they cannot be considered replicate measurements in the traditional sense. Yet it has been shown that the choice of reading can have an impact on genetic inference. We propose a hierarchical modeling approach to estimating gene expression that combines all available readings on each spot and accounts for censoring in the observed values. We assess the statistical properties of the proposed expression estimates using a simulation experiment. As expected, combining all available scans using an approach with good statistical properties results in expression estimates with noticeably lower bias and root mean squared error relative to other approaches that have been proposed in the literature. Inferences drawn from the somatic embryogenesis experiment, which motivated this work, changed drastically depending on whether the data were analyzed using the standard approaches or using the methodology we propose. PMID:19960120
Multi-layer plastic/glass microfluidic systems containing electrical and mechanical functionality.
Han, Arum; Wang, Olivia; Graff, Mason; Mohanty, Swomitra K; Edwards, Thayne L; Han, Ki-Ho; Bruno Frazier, A
2003-08-01
This paper describes an approach for fabricating multi-layer microfluidic systems from a combination of glass and plastic materials. Methods and characterization results for the microfabrication technologies underlying the process flow are presented. The approach is used to fabricate and characterize multi-layer plastic/glass microfluidic systems containing electrical and mechanical functionality. Hot embossing, heat staking of plastics, injection molding, microstenciling of electrodes, and stereolithography were combined with conventional MEMS fabrication techniques to realize the multi-layer systems. The approach enabled the integration of multiple plastic/glass materials into a single monolithic system, provided a solution for the integration of electrical functionality throughout the system, provided a mechanism for the inclusion of microactuators such as micropumps/valves, and provided an interconnect technology for interfacing fluids and electrical components between the micro system and the macro world.
Field Science Ethnography: Methods For Systematic Observation on an Expedition
NASA Technical Reports Server (NTRS)
Clancey, William J.; Clancy, Daniel (Technical Monitor)
2001-01-01
The Haughton-Mars expedition is a multidisciplinary project, exploring an impact crater in an extreme environment to determine how people might live and work on Mars. The expedition seeks to understand and field test Mars facilities, crew roles, operations, and computer tools. I combine an ethnographic approach, to establish a baseline understanding of how scientists prefer to live and work when relatively unencumbered, with a participatory design approach of experimenting with procedures and tools in the context of use. This paper focuses on field methods for systematically recording and analyzing the expedition's activities. Systematic photography and time-lapse video are combined with concept mapping to organize and present information. This hybrid approach is generally applicable to the study of modern field expeditions having a dozen or more multidisciplinary participants, spread over a large terrain during multiple field seasons.
Objectively combining AR5 instrumental period and paleoclimate climate sensitivity evidence
NASA Astrophysics Data System (ADS)
Lewis, Nicholas; Grünwald, Peter
2018-03-01
Combining instrumental period evidence regarding equilibrium climate sensitivity with largely independent paleoclimate proxy evidence should enable a more constrained sensitivity estimate to be obtained. Previous, subjective Bayesian approaches involved selection of a prior probability distribution reflecting the investigators' beliefs about climate sensitivity. Here a recently developed approach employing two different statistical methods—objective Bayesian and frequentist likelihood-ratio—is used to combine instrumental period and paleoclimate evidence based on data presented and assessments made in the IPCC Fifth Assessment Report. Probabilistic estimates from each source of evidence are represented by posterior probability density functions (PDFs) of physically-appropriate form that can be uniquely factored into a likelihood function and a noninformative prior distribution. The three-parameter form is shown accurately to fit a wide range of estimated climate sensitivity PDFs. The likelihood functions relating to the probabilistic estimates from the two sources are multiplicatively combined and a prior is derived that is noninformative for inference from the combined evidence. A posterior PDF that incorporates the evidence from both sources is produced using a single-step approach, which avoids the order-dependency that would arise if Bayesian updating were used. Results are compared with an alternative approach using the frequentist signed root likelihood ratio method. Results from these two methods are effectively identical, and provide a 5-95% range for climate sensitivity of 1.1-4.05 K (median 1.87 K).
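A minimal numerical sketch of the combination step, multiplying gridded likelihoods from the two evidence sources under an independence assumption, is shown below. The likelihood shapes, the flat placeholder prior (the paper instead derives a noninformative prior), and all numbers are invented.

```python
import numpy as np

# Grid over climate sensitivity S; the skewed shapes below are invented
# stand-ins for the AR5-based likelihoods, not the actual estimates.
S = np.linspace(0.5, 8.0, 2000)

def lognorm_like(S, mu, sig):
    """Unnormalized lognormal density: a typical right-skewed shape."""
    return np.exp(-(np.log(S) - mu) ** 2 / (2 * sig ** 2)) / S

L_inst = lognorm_like(S, np.log(2.0), 0.45)   # instrumental-period evidence
L_paleo = lognorm_like(S, np.log(2.5), 0.60)  # paleoclimate evidence

L_comb = L_inst * L_paleo                     # independence assumption
prior = np.ones_like(S)                       # flat placeholder prior only
post = L_comb * prior
post /= np.trapz(post, S)                     # normalize the posterior

cdf = np.cumsum(post) * (S[1] - S[0])
print(np.interp([0.05, 0.5, 0.95], cdf, S))   # 5-95% range and median
```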
Detecting bursts in the EEG of very and extremely premature infants using a multi-feature approach.
O'Toole, John M; Boylan, Geraldine B; Lloyd, Rhodri O; Goulding, Robert M; Vanhatalo, Sampsa; Stevenson, Nathan J
2017-07-01
To develop a method that segments preterm EEG into bursts and inter-bursts by extracting and combining multiple EEG features. Two EEG experts annotated bursts in individual EEG channels for 36 preterm infants with gestational age < 30 weeks. The feature set included spectral, amplitude, and frequency-weighted energy features. Using a consensus annotation, feature selection removed redundant features and a support vector machine combined the remaining features. Area under the receiver operating characteristic curve (AUC) and Cohen's kappa (κ) evaluated performance within a cross-validation procedure. The proposed channel-independent method improves AUC by 4-5% over existing methods (p < 0.001, n=36), with median (95% confidence interval) AUC of 0.989 (0.973-0.997) and sensitivity/specificity of 95.8%/94.4%. Agreement rates between the detector and the experts' annotations, κ=0.72 (0.36-0.83) and κ=0.65 (0.32-0.81), are comparable to inter-rater agreement, κ=0.60 (0.21-0.74). Automating the visual identification of bursts in preterm EEG is achievable with a high level of accuracy. Multiple features, combined using a data-driven approach, improve on existing single-feature methods. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
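A rough sketch of the final stage, combining several per-epoch EEG features with a support vector machine and scoring by AUC under cross-validation, is given below. The features and labels are synthetic stand-ins, and the feature-selection step is omitted.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([                     # stand-ins for spectral power,
    rng.normal(0, 1, n),                  # amplitude, and frequency-weighted
    rng.normal(0, 1, n),                  # energy features per EEG epoch
    rng.normal(0, 1, n)])
y = (X.sum(axis=1) + rng.normal(0, 1, n) > 0).astype(int)  # burst labels

# Standardize features, then combine them with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```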
Feature and Score Fusion Based Multiple Classifier Selection for Iris Recognition
Islam, Md. Rabiul
2014-01-01
The aim of this work is to propose a new feature- and score-fusion based iris recognition approach in which a voting method on the Multiple Classifier Selection technique is applied. The outputs of four Discrete Hidden Markov Model classifiers, that is, a left-iris-based unimodal system, a right-iris-based unimodal system, a left-right iris feature-fusion-based multimodal system, and a left-right iris likelihood-ratio score-fusion-based multimodal system, are combined using the voting method to achieve the final recognition result. The CASIA-IrisV4 database has been used to measure the performance of the proposed system with various dimensions. Experimental results show the versatility of the proposed system of four different classifiers with various dimensions. Finally, the recognition accuracy of the proposed system has been compared with the existing Hamming distance score fusion approach proposed by Ma et al., the log-likelihood ratio score fusion approach proposed by Schmid et al., and the single-level feature fusion approach proposed by Hollingsworth et al. PMID:25114676
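The voting step itself is simple majority rule over the four classifier decisions; a minimal sketch with placeholder identity labels:

```python
from collections import Counter

def vote(decisions):
    """Return the label predicted by most classifiers (ties -> first seen)."""
    return Counter(decisions).most_common(1)[0][0]

# Placeholder labels standing in for enrolled identities.
decisions = ["subject_17",   # left-iris unimodal HMM
             "subject_17",   # right-iris unimodal HMM
             "subject_04",   # feature-fusion multimodal HMM
             "subject_17"]   # score-fusion multimodal HMM
print(vote(decisions))       # -> subject_17
```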
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as in several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of a programming model is just one building block in a comprehensive approach to large-scale collaborative development, which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques we have adopted to address these challenges, and highlighting the advantages that the CCA approach offers for collaborative development.
Instructor Perspectives of Multiple-Choice Questions in Summative Assessment for Novice Programmers
ERIC Educational Resources Information Center
Shuhidan, Shuhaida; Hamilton, Margaret; D'Souza, Daryl
2010-01-01
Learning to program is known to be difficult for novices. High attrition and high failure rates in foundation-level programming courses undertaken at tertiary level in Computer Science programs, are commonly reported. A common approach to evaluating novice programming ability is through a combination of formative and summative assessments, with…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... intramural and extramural research efforts that address the combined health effects of multiple environmental... construed as a funding opportunity or grant program. Input from all interested parties is welcome including... program priorities and recommends funding levels to assure maximum utilization of available resources in...
Integrating Science and Management to Assess Forest Ecosystem Vulnerability to Climate Change
Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; Maria K. Janowiak; P. Danielle Shannon; Christopher W. Swanston
2017-01-01
We developed the ecosystem vulnerability assessment approach (EVAA) to help inform potential adaptation actions in response to a changing climate. EVAA combines multiple quantitative models and expert elicitation from scientists and land managers. In each of eight assessment areas, a panel of local experts determined potential vulnerability of forest ecosystems to...
Pathways to Childhood Depressive Symptoms: The Role of Social, Cognitive, and Genetic Risk Factors
ERIC Educational Resources Information Center
Lau, Jennifer Y. F.; Rijsdijk, Fruhling; Gregory, Alice M.; McGuffin, Peter; Eley, Thalia C.
2007-01-01
Childhood depressive conditions have been explored from multiple theoretical approaches but with few empirical attempts to address the interrelationships among these different domains and their combined effects. In the present study, the authors examined different pathways through which social, cognitive, and genetic risk factors may be expressed…
Thomas W. Bonnot; Frank R. Thompson; Joshua J. Millspaugh; D. Todd Jones-Farland
2013-01-01
Efforts to conserve regional biodiversity in the face of global climate change, habitat loss and fragmentation will depend on approaches that consider population processes at multiple scales. By combining habitat and demographic modeling, landscape-based population viability models effectively relate small-scale habitat and landscape patterns to regional population...
Physician performance assessment using a composite quality index.
Liu, Kaibo; Jain, Shabnam; Shi, Jianjun
2013-07-10
Assessing physician performance is important for the purposes of measuring and improving quality of service and reducing healthcare delivery costs. In recent years, physician performance scorecards have been used to provide feedback on individual measures; however, one key challenge is how to develop a composite quality index that combines multiple measures for overall physician performance evaluation. A controversy arises over establishing appropriate weights to combine indicators in multiple dimensions, and it cannot be easily resolved. In this study, we proposed a generic unsupervised learning approach to develop a single composite index for physician performance assessment by using non-negative principal component analysis. We developed a new algorithm named iterative quadratic programming to solve the numerical issue in the non-negative principal component analysis approach. We conducted real case studies to demonstrate the performance of the proposed method. We provided interpretations from both statistical and clinical perspectives to evaluate the developed composite ranking score in practice. In addition, we implemented root cause assessment techniques to explain physician performance for improvement purposes. Copyright © 2012 John Wiley & Sons, Ltd.
PD-1/PD-L1 inhibitors in multiple myeloma: The present and the future
Jelinek, T.; Hajek, R.
2016-01-01
The introduction of PD-1/PD-L1 pathway inhibitors has marked a significant milestone in the treatment of various types of solid tumors. The current situation in multiple myeloma (MM) is rather unclear, as distinct research groups have reported discordant results. This discrepancy dominantly concerns the expression of PD-1/PD-L1 molecules as well as the identification of the responsible immune effector cell population. The results of monotherapy with PD-1/PD-L1 inhibitors have been unsatisfactory in MM, suggesting that a combination approach is needed. The most logical partners are immunomodulatory agents as they possess many synergistic effects. We are also proposing other rational and promising combinations (e.g., daratumumab, ibrutinib, anti-CD137) that warrant further investigation. PMID:28123899
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-01-01
Background: Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Methods: Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. Results: A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions: When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. PMID:28062603
Combining p-values in replicated single-case experiments with multivariate outcome.
Solmi, Francesca; Onghena, Patrick
2014-01-01
Interest in combining probabilities has a long history in the global statistical community. The first steps in this direction were taken by Ronald Fisher, who introduced the idea of combining p-values of independent tests to provide a global decision rule when multiple aspects of a given problem were of interest. An interesting approach to this idea of combining p-values is the one based on permutation theory. The methods belonging to this particular approach exploit the permutation distributions of the tests to be combined, and use a simple function to combine probabilities. Combining p-values finds a very interesting application in the analysis of replicated single-case experiments. In this field the focus, while comparing different treatments effects, is more articulated than when just looking at the means of the different populations. Moreover, it is often of interest to combine the results obtained on the single patients in order to get more global information about the phenomenon under study. This paper gives an overview of how the concept of combining p-values was conceived, and how it can be easily handled via permutation techniques. Finally, the method of combining p-values is applied to a simulated replicated single-case experiment, and a numerical illustration is presented.
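The core arithmetic of Fisher's combining rule is compact enough to sketch. Below is a minimal Python illustration of the classical rule, together with a permutation-based per-case p-value as described above; the two-sample setup, the test statistic, and all variable names are illustrative assumptions rather than the paper's own simulation design.

```python
import numpy as np
from scipy import stats

def fisher_combine(p_values):
    """Fisher's rule: X = -2 * sum(log p_i) follows a chi-square
    distribution with 2k df under the global null (independent tests)."""
    p = np.asarray(p_values, dtype=float)
    statistic = -2.0 * np.sum(np.log(p))
    return stats.chi2.sf(statistic, df=2 * p.size)

def permutation_combine(samples_a, samples_b, n_perm=10000, seed=0):
    """Permutation analogue: one two-sample comparison per replicated case,
    each yielding a permutation p-value, combined with Fisher's rule."""
    rng = np.random.default_rng(seed)
    p_values = []
    for a, b in zip(samples_a, samples_b):
        pooled = np.concatenate([a, b])
        observed = abs(a.mean() - b.mean())
        count = 0
        for _ in range(n_perm):
            perm = rng.permutation(pooled)
            count += abs(perm[:a.size].mean() - perm[a.size:].mean()) >= observed
        p_values.append((count + 1) / (n_perm + 1))
    return fisher_combine(p_values)
```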
Narrow band imaging combined with water immersion technique in the diagnosis of celiac disease.
Valitutti, Francesco; Oliva, Salvatore; Iorfida, Donatella; Aloi, Marina; Gatti, Silvia; Trovato, Chiara Maria; Montuori, Monica; Tiberti, Antonio; Cucchiara, Salvatore; Di Nardo, Giovanni
2014-12-01
The "multiple-biopsy" approach both in duodenum and bulb is the best strategy to confirm the diagnosis of celiac disease; however, this increases the invasiveness of the procedure itself and is time-consuming. To evaluate the diagnostic yield of a single biopsy guided by narrow-band imaging combined with water immersion technique in paediatric patients. Prospective assessment of the diagnostic accuracy of narrow-band imaging/water immersion technique-driven biopsy approach versus standard protocol in suspected celiac disease. The experimental approach correctly diagnosed 35/40 children with celiac disease, with an overall diagnostic sensitivity of 87.5% (95% CI: 77.3-97.7). An altered pattern of narrow-band imaging/water immersion technique endoscopic visualization was significantly associated with villous atrophy at guided biopsy (Spearman Rho 0.637, p<0.001). Concordance of narrow-band imaging/water immersion technique endoscopic assessments was high between two operators (K: 0.884). The experimental protocol was highly timesaving compared to the standard protocol. An altered narrow-band imaging/water immersion technique pattern coupled with high anti-transglutaminase antibodies could allow a single guided biopsy to diagnose celiac disease. When no altered mucosal pattern is visible even by narrow-band imaging/water immersion technique, multiple bulbar and duodenal biopsies should be obtained. Copyright © 2014. Published by Elsevier Ltd.
Fisz, Jacek J
2006-12-07
The optimization approach based on the genetic algorithm (GA) combined with the multiple linear regression (MLR) method is discussed. The GA-MLR optimizer is designed for nonlinear least-squares problems in which the model functions are linear combinations of nonlinear functions. GA optimizes the nonlinear parameters, and the linear parameters are calculated from MLR. GA-MLR is an intuitive optimization approach and it exploits all advantages of the genetic algorithm technique. This optimization method results from an appropriate combination of two well-known optimization methods. The MLR method is embedded in the GA optimizer and linear and nonlinear model parameters are optimized in parallel. The MLR method is the only strictly mathematical "tool" involved in GA-MLR. The GA-MLR approach simplifies and accelerates the optimization process considerably because the linear parameters are not the fitted ones. Its properties are exemplified by the analysis of the kinetic biexponential fluorescence decay surface corresponding to a two-excited-state interconversion process. A short discussion of the variable projection (VP) algorithm, designed for the same class of optimization problems, is presented. VP is a very advanced mathematical formalism that involves the methods of nonlinear functionals, the algebra of linear projectors, and the formalism of Fréchet derivatives and pseudo-inverses. Additional explanatory comments are added on the application of the recently introduced GA-NR optimizer to the simultaneous recovery of linear and weakly nonlinear parameters occurring in the same optimization problem together with nonlinear parameters. The GA-NR optimizer combines the GA method with the NR method, in which the minimum-value condition for the quadratic approximation to chi(2), obtained from the Taylor series expansion of chi(2), is recovered by means of the Newton-Raphson algorithm. The application of the GA-NR optimizer to model functions which are multi-linear combinations of nonlinear functions is indicated. The VP algorithm does not distinguish the weakly nonlinear parameters from the nonlinear ones and it does not apply to model functions which are multi-linear combinations of nonlinear functions.
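The separable structure GA-MLR exploits can be sketched compactly: the linear amplitudes are recovered by least squares inside the objective, so the evolutionary search only sees the nonlinear decay rates. The sketch below uses SciPy's differential evolution as a stand-in for the genetic algorithm, and a synthetic biexponential decay in place of the paper's fluorescence surface; all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Biexponential model: y(t) = a1*exp(-k1*t) + a2*exp(-k2*t).
# Amplitudes (a1, a2) are linear and recovered by least squares;
# only the decay rates (k1, k2) are searched by the evolutionary step.
t = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(1)
y_obs = (2.0 * np.exp(-0.4 * t) + 1.0 * np.exp(-2.5 * t)
         + 0.02 * rng.standard_normal(t.size))

def chi2(rates):
    k1, k2 = rates
    basis = np.column_stack([np.exp(-k1 * t), np.exp(-k2 * t)])
    amps = np.linalg.lstsq(basis, y_obs, rcond=None)[0]  # the "MLR" step
    return np.sum((basis @ amps - y_obs) ** 2)

result = differential_evolution(chi2, bounds=[(0.01, 5.0), (0.01, 5.0)], seed=1)
k1, k2 = result.x
basis = np.column_stack([np.exp(-k1 * t), np.exp(-k2 * t)])
a1, a2 = np.linalg.lstsq(basis, y_obs, rcond=None)[0]
print(f"rates: {k1:.3f}, {k2:.3f}; amplitudes: {a1:.3f}, {a2:.3f}")
```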
Gene Therapy for Infectious Diseases
Bunnell, Bruce A.; Morgan, Richard A.
1998-01-01
Gene therapy is being investigated as an alternative treatment for a wide range of infectious diseases that are not amenable to standard clinical management. Approaches to gene therapy for infectious diseases can be divided into three broad categories: (i) gene therapies based on nucleic acid moieties, including antisense DNA or RNA, RNA decoys, and catalytic RNA moieties (ribozymes); (ii) protein approaches such as transdominant negative proteins and single-chain antibodies; and (iii) immunotherapeutic approaches involving genetic vaccines or pathogen-specific lymphocytes. It is further possible that combinations of the aforementioned approaches will be used simultaneously to inhibit multiple stages of the life cycle of the infectious agent. PMID:9457428
Zhang, Guangzhao; Lv, Lei; Deng, Yonghong; Wang, Chaoyang
2017-06-01
Self-healing hydrogels have been studied by many researchers via multiple cross-linking approaches, including physical and chemical interactions. Cross-linking a water-soluble polymer matrix by combining ionic coordination with multiple hydrogen bonds to fabricate self-healing hydrogels with injectable properties is an interesting direction in multifunctional hydrogel exploration. This study introduces a general procedure for preparing hydrogels (termed gelatin-UPy-Fe) cross-linked both by ionic coordination between Fe3+ and carboxyl groups from the gelatin and by the quadruple hydrogen bonding interaction from the ureido-pyrimidinone (UPy) dimers. The gelatin-UPy-Fe hydrogels possess an excellent self-healing property. The effects of the ionic coordination of Fe3+ and the quadruple hydrogen bonding of UPy on the formation and mechanical behavior of the prepared hydrogels are investigated. In vitro drug release from the gelatin-UPy-Fe hydrogels is also observed, giving an intriguing glimpse into possible biological applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Xu, Banglao; Du, Yan; Lin, Jinqiong; Qi, Mingyue; Shu, Bowen; Wen, Xiaoxia; Liang, Guangtie; Chen, Bin; Liu, Dayu
2016-12-06
A microfluidic chip was developed for one-step identification and antimicrobial susceptibility testing (AST) of multiple uropathogens. The polydimethylsiloxane (PDMS) microchip used had features of cell culture chamber arrays connected through a sample introduction channel. At the bottom of each chamber, a paper substrate preloaded with chromogenic media and antimicrobial agents was embedded. By integrating a hydrophobic membrane valve on the microchip, the urine sample can be equally distributed into and confined in individual chambers. The identification and AST assays on multiple uropathogens were performed by combining the spatial resolution of the cell culture arrays and the color resolution from the chromogenic reaction. The composite microbial testing assay was based on dynamic changes in color in a series of chambers. The bacterial antimicrobial susceptibility was determined by the lowest concentration of an antimicrobial agent capable of inhibiting the chromogenic reaction. Using three common uropathogenic bacteria as test models, the developed microfluidic approach was demonstrated to be able to complete the multiple colorimetric assays in 15 h. The accuracy of the microchip method, in comparison with that of the conventional approach, showed a coincidence of 94.1%. Our data suggest this microfluidic approach will be a promising tool for simple and fast uropathogen testing in resource-limited settings.
En-face Flying Spot OCT/Ophthalmoscope
NASA Astrophysics Data System (ADS)
Rosen, Richard B.; Garcia, Patricia; Podoleanu, Adrian Gh.; Cucu, Radu; Dobre, George; Trifanov, Irina; van Velthoven, Mirjam E. J.; de Smet, Marc D.; Rogers, John A.; Hathaway, Mark; Pedro, Justin; Weitz, Rishard
This is a review of a technique for high-resolution imaging of the eye that allows multiple sample sectioning perspectives with different axial resolutions. The technique involves the flying spot approach employed in confocal scanning laser ophthalmoscopy which is extended to OCT imaging via time domain en face fast lateral scanning. The ability of imaging with multiple axial resolutions stimulated the development of the dual en face OCT-confocal imaging technology. Dual imaging also allows various other imaging combinations, such as OCT with confocal microscopy for imaging the eye anterior segment and OCT with fluorescence angiography imaging.
He, Jianjun; Gu, Hong; Liu, Wenqi
2012-01-01
It is well known that an important step toward understanding the functions of a protein is to determine its subcellular location. Although numerous prediction algorithms have been developed, most of them typically focus on proteins with only one location. In recent years, researchers have begun to pay attention to subcellular localization prediction for proteins with multiple sites. However, almost all the existing approaches have failed to take into account the correlations among the locations caused by proteins with multiple sites, which may be important information for improving the prediction accuracy for such proteins. In this paper, a new algorithm which can effectively exploit the correlations among the locations is proposed using a Gaussian process model. In addition, the algorithm can realize an optimal linear combination of various feature extraction technologies and is robust to imbalanced data sets. Experimental results on a human protein data set show that the proposed algorithm is valid and can achieve better performance than the existing approaches.
Multiple quantum coherence spectroscopy.
Mathew, Nathan A; Yurs, Lena A; Block, Stephen B; Pakoulev, Andrei V; Kornau, Kathryn M; Wright, John C
2009-08-20
Multiple quantum coherences provide a powerful approach for studies of complex systems because increasing the number of quantum states in a quantum mechanical superposition state increases the selectivity of a spectroscopic measurement. We show that frequency domain multiple quantum coherence multidimensional spectroscopy can create these superposition states using different frequency excitation pulses. The superposition state is created using two excitation frequencies to excite the symmetric and asymmetric stretch modes in a rhodium dicarbonyl chelate and the dynamic Stark effect to climb the vibrational ladders involving different overtone and combination band states. A monochromator resolves the free induction decay of different coherences comprising the superposition state. The three spectral dimensions provide the selectivity required to observe 19 different spectral features associated with fully coherent nonlinear processes involving up to 11 interactions with the excitation fields. The different features act as spectroscopic probes of the diagonal and off-diagonal parts of the molecular potential energy hypersurface. This approach can be considered as a coherent pump-probe spectroscopy where the pump is a series of excitation pulses that prepares a multiple quantum coherence and the probe is another series of pulses that creates the output coherence.
Advances in chemical labeling of proteins in living cells.
Yan, Qi; Bruchez, Marcel P
2015-04-01
The pursuit of quantitative biological information via imaging requires robust labeling approaches that can be used in multiple applications and with a variety of detectable colors and properties. In addition to conventional fluorescent proteins, chemists and biologists have come together to provide a range of approaches that combine dye chemistry with the convenience of genetic targeting. This hybrid-tagging approach amalgamates the rational design of properties available through synthetic dye chemistry with the robust biological targeting available with genetic encoding. In this review, we discuss the current range of approaches that have been exploited for dye targeting or for targeting and activation and some of the recent applications that are uniquely permitted by these hybrid-tagging approaches.
Nuclear cycler: An incremental approach to the deflection of asteroids
NASA Astrophysics Data System (ADS)
Vasile, Massimiliano; Thiry, Nicolas
2016-04-01
This paper introduces a novel deflection approach based on nuclear explosions: the nuclear cycler. The idea is to combine the effectiveness of nuclear explosions with the controllability and redundancy offered by slow push methods within an incremental deflection strategy. The paper will present an extended model for single nuclear stand-off explosions in the proximity of elongated ellipsoidal asteroids, and a family of natural formation orbits that allows the spacecraft to deploy multiple bombs while being shielded by the asteroid during the detonation.
Linear combination methods to improve diagnostic/prognostic accuracy on future observations
Kang, Le; Liu, Aiyi; Tian, Lili
2014-01-01
Multiple diagnostic tests or biomarkers can be combined to improve diagnostic accuracy. The problem of finding the optimal linear combinations of biomarkers to maximise the area under the receiver operating characteristic curve has been extensively addressed in the literature. The purpose of this article is threefold: (1) to provide an extensive review of the existing methods for biomarker combination; (2) to propose a new combination method, namely, the nonparametric stepwise approach; (3) to use the leave-one-pair-out cross-validation method, instead of the re-substitution method, which is overoptimistic and hence might lead to wrong conclusions, to empirically evaluate and compare the performance of different linear combination methods in yielding the largest area under the receiver operating characteristic curve. A data set of Duchenne muscular dystrophy was analysed to illustrate the applications of the discussed combination methods. PMID:23592714
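As a hedged illustration of the evaluation scheme discussed above, the sketch below scores a linear biomarker combination with leave-one-pair-out cross-validation: each (case, control) pair is held out, the combination is refit on the remaining data, and the pair contributes 1, 0.5, or 0 according to whether the case scores higher. Logistic regression is used here as one standard way to obtain linear combination weights; it is not the article's nonparametric stepwise method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def loo_pair_auc(X, y):
    """Leave-one-pair-out CV AUC: hold out one (case, control) pair,
    fit the linear combination on the rest, and check whether the
    held-out case receives the higher combined score."""
    cases = np.flatnonzero(y == 1)
    controls = np.flatnonzero(y == 0)
    wins, total = 0.0, 0
    for i in cases:
        for j in controls:
            keep = np.ones(len(y), dtype=bool)
            keep[[i, j]] = False
            model = LogisticRegression(max_iter=1000).fit(X[keep], y[keep])
            s_case = model.decision_function(X[[i]])[0]
            s_ctrl = model.decision_function(X[[j]])[0]
            wins += 1.0 if s_case > s_ctrl else (0.5 if s_case == s_ctrl else 0.0)
            total += 1
    return wins / total  # empirical AUC, free of re-substitution optimism
```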
Grant, Steven
2018-06-01
Venetoclax (ABT-199) is a Bcl-2-specific BH3-mimetic that has shown significant promise in certain subtypes of CLL as well as in several other hematologic malignancies. As in the case of essentially all targeted agents, intrinsic or acquired resistance to this agent generally occurs, prompting the search for new strategies capable of circumventing this problem. A logical approach to this challenge involves rational combination strategies designed to disable preexisting or induced compensatory survival pathways. Many of these strategies involve downregulation of Mcl-1, a pro-survival Bcl-2 family member that is not targeted by venetoclax, and which often confers resistance to this agent. Given encouraging clinical results involving venetoclax in both lymphoid and myeloid malignancies, it is likely that such combination approaches will be incorporated into the therapeutic armamentarium for multiple hematologic malignancies in the near future.
Maulidiani; Rudiyanto; Abas, Faridah; Ismail, Intan Safinar; Lajis, Nordin H
2018-06-01
Optimization is an important aspect of natural product extraction. Herein, an alternative approach to optimization in extraction is proposed, namely, Generalized Likelihood Uncertainty Estimation (GLUE). The approach combines Latin hypercube sampling, the feasible ranges of the independent variables, Monte Carlo simulation, and threshold criteria for the response variables. The GLUE method is tested on three different techniques, including ultrasound-, microwave-, and supercritical CO2-assisted extraction, utilizing data from previously published reports. The study found that this method can: provide more information on the combined effects of the independent variables on the response variables in the dotty plots; deal with an unlimited number of independent and response variables; consider combined multiple threshold criteria, which are subjective depending on the target of the investigation for the response variables; and provide a range of values with their distribution for the optimization. Copyright © 2018 Elsevier Ltd. All rights reserved.
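A minimal sketch of the GLUE recipe described above, assuming a made-up response surface in place of a fitted extraction model: draw a Latin hypercube sample over the feasible ranges, evaluate the response by Monte Carlo, and retain every parameter set meeting the (subjective) threshold. Variable names, ranges, and the threshold are illustrative assumptions.

```python
import numpy as np
from scipy.stats import qmc

# Feasible ranges for three illustrative extraction variables:
# temperature (deg C), time (min), solvent-to-solid ratio (mL/g).
bounds_lo = np.array([30.0, 5.0, 10.0])
bounds_hi = np.array([70.0, 60.0, 40.0])

def yield_model(x):
    """Placeholder response surface standing in for a fitted
    extraction model; the real study fits published data."""
    t, m, r = x.T
    return 50 - 0.01*(t - 55)**2 - 0.005*(m - 40)**2 - 0.02*(r - 25)**2

sampler = qmc.LatinHypercube(d=3, seed=42)
unit = sampler.random(n=5000)                   # Latin hypercube sample
params = qmc.scale(unit, bounds_lo, bounds_hi)  # map to feasible ranges
response = yield_model(params)                  # Monte Carlo evaluation

threshold = 49.0                                # subjective acceptance criterion
behavioural = params[response >= threshold]     # GLUE keeps all acceptable sets
print("optimal ranges:", behavioural.min(axis=0), behavioural.max(axis=0))
```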
Non-Mutually Exclusive Deep Neural Network Classifier for Combined Modes of Bearing Fault Diagnosis
Kim, Jong-Myon
2018-01-01
The simultaneous occurrence of various types of defects in bearings makes their diagnosis more challenging owing to the resultant complexity of the constituent parts of the acoustic emission (AE) signals. To address this issue, a new approach is proposed in this paper for the detection of multiple combined faults in bearings. The proposed methodology uses a deep neural network (DNN) architecture to effectively diagnose the combined defects. The DNN structure is based on the stacked denoising autoencoder non-mutually exclusive classifier (NMEC) method for combined modes. The NMEC-DNN is trained using data for a single fault and it classifies both single faults and multiple combined faults. The results of experiments conducted on AE data collected through an experimental test-bed demonstrate that the DNN achieves good classification performance with a maximum accuracy of 95%. The proposed method is compared with a multi-class classifier based on support vector machines (SVMs). The NMEC-DNN yields better diagnostic performance in comparison to the multi-class classifier based on SVM. The NMEC-DNN reduces the number of necessary data collections and improves the bearing fault diagnosis performance. PMID:29642466
Reich, M R
2000-03-17
Global inequities in access to pharmaceutical products exist between rich and poor countries because of market and government failures as well as huge income differences. Multiple policies are required to address this global drug gap for three categories of pharmaceutical products: essential drugs, new drugs, and yet-to-be-developed drugs. Policies should combine "push" approaches of subsidies to support targeted drug development, "pull" approaches of financial incentives such as market guarantees, and "process" approaches aimed at improved institutional capacity. Constructive solutions are needed that can both protect the incentives for research and development and reduce the inequities of access.
Discovering Synergistic Drug Combination from a Computational Perspective.
Ding, Pingjian; Luo, Jiawei; Liang, Cheng; Xiao, Qiu; Cao, Buwen; Li, Guanghui
2018-03-30
Synergistic drug combinations play an important role in the treatment of complex diseases. The identification of effective drug combinations is vital to further reduce side effects and improve therapeutic efficiency. In previous years, the in vitro method has been the main route to discovering synergistic drug combinations. However, the in vitro method carries many limitations in time and resource consumption. Therefore, with the rapid development of computational models and the explosive growth of large-scale phenotypic data, computational methods for discovering synergistic drug combinations are an efficient and promising tool and contribute to precision medicine. How the computational model is constructed is the key question for these methods; different computational strategies yield different performance. In this review, recent advancements in computational methods for predicting effective drug combinations are summarized from multiple aspects. First, the various datasets utilized to discover synergistic drug combinations are summarized. Second, we discuss feature-based approaches, partitioning these methods into two classes: feature-based methods in terms of similarity measures, and feature-based methods in terms of machine learning. Third, we discuss network-based approaches for uncovering synergistic drug combinations. Finally, we analyze and offer prospects for computational methods for predicting effective drug combinations. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Assessing Risk Prediction Models Using Individual Participant Data From Multiple Studies
Pennells, Lisa; Kaptoge, Stephen; White, Ian R.; Thompson, Simon G.; Wood, Angela M.; Tipping, Robert W.; Folsom, Aaron R.; Couper, David J.; Ballantyne, Christie M.; Coresh, Josef; Goya Wannamethee, S.; Morris, Richard W.; Kiechl, Stefan; Willeit, Johann; Willeit, Peter; Schett, Georg; Ebrahim, Shah; Lawlor, Debbie A.; Yarnell, John W.; Gallacher, John; Cushman, Mary; Psaty, Bruce M.; Tracy, Russ; Tybjærg-Hansen, Anne; Price, Jackie F.; Lee, Amanda J.; McLachlan, Stela; Khaw, Kay-Tee; Wareham, Nicholas J.; Brenner, Hermann; Schöttker, Ben; Müller, Heiko; Jansson, Jan-Håkan; Wennberg, Patrik; Salomaa, Veikko; Harald, Kennet; Jousilahti, Pekka; Vartiainen, Erkki; Woodward, Mark; D'Agostino, Ralph B.; Bladbjerg, Else-Marie; Jørgensen, Torben; Kiyohara, Yutaka; Arima, Hisatomi; Doi, Yasufumi; Ninomiya, Toshiharu; Dekker, Jacqueline M.; Nijpels, Giel; Stehouwer, Coen D. A.; Kauhanen, Jussi; Salonen, Jukka T.; Meade, Tom W.; Cooper, Jackie A.; Cushman, Mary; Folsom, Aaron R.; Psaty, Bruce M.; Shea, Steven; Döring, Angela; Kuller, Lewis H.; Grandits, Greg; Gillum, Richard F.; Mussolino, Michael; Rimm, Eric B.; Hankinson, Sue E.; Manson, JoAnn E.; Pai, Jennifer K.; Kirkland, Susan; Shaffer, Jonathan A.; Shimbo, Daichi; Bakker, Stephan J. L.; Gansevoort, Ron T.; Hillege, Hans L.; Amouyel, Philippe; Arveiler, Dominique; Evans, Alun; Ferrières, Jean; Sattar, Naveed; Westendorp, Rudi G.; Buckley, Brendan M.; Cantin, Bernard; Lamarche, Benoît; Barrett-Connor, Elizabeth; Wingard, Deborah L.; Bettencourt, Richele; Gudnason, Vilmundur; Aspelund, Thor; Sigurdsson, Gunnar; Thorsson, Bolli; Kavousi, Maryam; Witteman, Jacqueline C.; Hofman, Albert; Franco, Oscar H.; Howard, Barbara V.; Zhang, Ying; Best, Lyle; Umans, Jason G.; Onat, Altan; Sundström, Johan; Michael Gaziano, J.; Stampfer, Meir; Ridker, Paul M.; Michael Gaziano, J.; Ridker, Paul M.; Marmot, Michael; Clarke, Robert; Collins, Rory; Fletcher, Astrid; Brunner, Eric; Shipley, Martin; Kivimäki, Mika; Ridker, Paul M.; Buring, Julie; Cook, Nancy; Ford, Ian; Shepherd, James; Cobbe, Stuart M.; Robertson, Michele; Walker, Matthew; Watson, Sarah; Alexander, Myriam; Butterworth, Adam S.; Angelantonio, Emanuele Di; Gao, Pei; Haycock, Philip; Kaptoge, Stephen; Pennells, Lisa; Thompson, Simon G.; Walker, Matthew; Watson, Sarah; White, Ian R.; Wood, Angela M.; Wormser, David; Danesh, John
2014-01-01
Individual participant time-to-event data from multiple prospective epidemiologic studies enable detailed investigation into the predictive ability of risk models. Here we address the challenges in appropriately combining such information across studies. Methods are exemplified by analyses of log C-reactive protein and conventional risk factors for coronary heart disease in the Emerging Risk Factors Collaboration, a collation of individual data from multiple prospective studies with an average follow-up duration of 9.8 years (dates varied). We derive risk prediction models using Cox proportional hazards regression analysis stratified by study and obtain estimates of risk discrimination, Harrell's concordance index, and Royston's discrimination measure within each study; we then combine the estimates across studies using a weighted meta-analysis. Various weighting approaches are compared and lead us to recommend using the number of events in each study. We also discuss the calculation of measures of reclassification for multiple studies. We further show that comparison of differences in predictive ability across subgroups should be based only on within-study information and that combining measures of risk discrimination from case-control studies and prospective studies is problematic. The concordance index and discrimination measure gave qualitatively similar results throughout. While the concordance index was very heterogeneous between studies, principally because of differing age ranges, the increments in the concordance index from adding log C-reactive protein to conventional risk factors were more homogeneous. PMID:24366051
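The recommended pooling step is simple enough to show directly. The sketch below combines per-study concordance estimates in a weighted meta-analysis with weights proportional to the number of events, as the abstract recommends; all numbers are invented for illustration and are not from the Emerging Risk Factors Collaboration.

```python
import numpy as np

# Per-study Harrell's C estimates, their standard errors, and event
# counts (all values illustrative placeholders).
c_index = np.array([0.72, 0.68, 0.75, 0.70])
se      = np.array([0.015, 0.020, 0.012, 0.018])
events  = np.array([350, 120, 510, 220])

weights = events / events.sum()            # recommended: weight by events
c_pooled = np.sum(weights * c_index)
se_pooled = np.sqrt(np.sum((weights * se) ** 2))  # assumes independent studies
print(f"pooled C = {c_pooled:.3f} +/- {1.96 * se_pooled:.3f} (95% CI half-width)")
```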
Lombardi, Federica; Golla, Kalyan; Fitzpatrick, Darren J.; Casey, Fergal P.; Moran, Niamh; Shields, Denis C.
2015-01-01
Identifying effective therapeutic drug combinations that modulate complex signaling pathways in platelets is central to the advancement of effective anti-thrombotic therapies. However, there is no systems model of the platelet that predicts responses to different inhibitor combinations. We developed an approach which goes beyond current inhibitor-inhibitor combination screening to efficiently consider other signaling aspects that may give insights into the behaviour of the platelet as a system. We investigated combinations of platelet inhibitors and activators. We evaluated three distinct strands of information, namely: activator-inhibitor combination screens (testing a panel of inhibitors against a panel of activators); inhibitor-inhibitor synergy screens; and activator-activator synergy screens. We demonstrated how these analyses may be efficiently performed, both experimentally and computationally, to identify particular combinations of most interest. Robust tests of activator-activator synergy and of inhibitor-inhibitor synergy required combinations to show significant excesses over the double doses of each component. Modeling identified multiple effects of an inhibitor of the P2Y12 ADP receptor, and complementarity between inhibitor-inhibitor synergy effects and activator-inhibitor combination effects. This approach accelerates the mapping of combination effects of compounds to develop combinations that may be therapeutically beneficial. We integrated the three information sources into a unified model that predicted the benefits of a triple drug combination targeting ADP, thromboxane and thrombin signaling. PMID:25875950
Probing of multiple magnetic responses in magnetic inductors using atomic force microscopy.
Park, Seongjae; Seo, Hosung; Seol, Daehee; Yoon, Young-Hwan; Kim, Mi Yang; Kim, Yunseok
2016-02-08
Even though nanoscale analysis of magnetic properties is of significant interest, probing methods remain relatively underdeveloped given the significance of the technique and its multiple potential applications. Here, we demonstrate an approach for probing various magnetic properties associated with eddy current, coil current and magnetic domains in magnetic inductors (MIs) using multidimensional magnetic force microscopy (MMFM). The MMFM images provide combined magnetic responses from the three different origins; however, each contribution to the MMFM response can be differentiated through analysis based on the bias dependence of the response. In particular, the bias dependent MMFM images show locally different eddy current behavior, with values dependent on the type of materials that comprise the MI. This approach for probing magnetic responses can be further extended to the analysis of local physical features.
Srinivasa, Narayan; Zhang, Deying; Grigorian, Beayna
2014-03-01
This paper describes a novel architecture for enabling robust and efficient neuromorphic communication. The architecture combines two concepts: 1) synaptic time multiplexing (STM) that trades space for speed of processing to create an intragroup communication approach that is firing rate independent and offers more flexibility in connectivity than cross-bar architectures and 2) a wired multiple input multiple output (MIMO) communication with orthogonal frequency division multiplexing (OFDM) techniques to enable a robust and efficient intergroup communication for neuromorphic systems. The MIMO-OFDM concept for the proposed architecture was analyzed by simulating large-scale spiking neural network architecture. Analysis shows that the neuromorphic system with MIMO-OFDM exhibits robust and efficient communication while operating in real time with a high bit rate. Through combining STM with MIMO-OFDM techniques, the resulting system offers a flexible and scalable connectivity as well as a power and area efficient solution for the implementation of very large-scale spiking neural architectures in hardware.
A Framework for Distributed Problem Solving
NASA Astrophysics Data System (ADS)
Leone, Joseph; Shin, Don G.
1989-03-01
This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process that primarily relies on the ability to make associations between vast amounts of related concepts, sort out the combined results, and promote the most plausible ones. The amplification process is discussed in detail. An implementation of the amplification process is presented. The process is illustrated by an example.
NASA Astrophysics Data System (ADS)
Kim, M. G.; Lin, J. C.; Huang, L.; Edwards, T. W.; Jones, J. P.; Polavarapu, S.; Nassar, R.
2012-12-01
Reducing uncertainties in the projections of atmospheric CO2 concentration levels relies on increasing our scientific understanding of the exchange processes between the atmosphere and land at regional scales, which are highly dependent on climate, ecosystem processes, and anthropogenic disturbances. A combined framework that mutually addresses these independent variables to account for each process is invaluable for reducing such uncertainties. In this research, an example of a top-down inversion modeling approach combined with stable isotope measurement data is presented. The potential of the proposed analysis framework is demonstrated using Stochastic Time-Inverted Lagrangian Transport (STILT) model runs combined with high-precision CO2 concentration data measured at a Canadian greenhouse gas monitoring site, as well as multiple tracers: stable isotopes and combustion-related species. This framework yields a unique regional-scale constraint that can be used to relate the measured changes of tracer concentrations to processes in their upwind source regions. The inversion approach both reproduces source areas in a spatially explicit way through sophisticated Lagrangian transport modeling and infers emission processes that leave imprints on atmospheric tracers. The understanding gained through the combined approach can also be used to verify reported emissions as part of regulatory regimes. The results indicate that changes in CO2 concentration are strongly influenced by regional sources, including significant fossil fuel emissions, and that the combined approach can be used to test reported emissions of the greenhouse gas from oil sands developments. Also, methods to further reduce uncertainties in the retrieved emissions by incorporating additional constraints, including tracer-to-tracer correlations and satellite measurements, are discussed briefly.
Auerbach, Nancy A; Tulloch, Ayesha I T; Possingham, Hugh P
Conservation practitioners, faced with managing multiple threats to biodiversity and limited funding, must prioritize investment in different management actions. From an economic perspective, it is routine practice to invest where the highest rate of return is expected. This return-on-investment (ROI) thinking can also benefit species conservation, and researchers are developing sophisticated approaches to support decision-making for cost-effective conservation. However, applied use of these approaches is limited. Managers may be wary of “black-box” algorithms or complex methods that are difficult to explain to funding agencies. As an alternative, we demonstrate the use of a basic ROI analysis for determining where to invest in cost-effective management to address threats to species. This method can be applied using basic geographic information system and spreadsheet calculations. We illustrate the approach in a management action prioritization for a biodiverse region of eastern Australia. We use ROI to prioritize management actions for two threats to a suite of threatened species: habitat degradation by cattle grazing, and predation by invasive red foxes (Vulpes vulpes). We show how decisions based on cost-effective threat management depend upon how expected benefits to species are defined and how benefits and costs co-vary. By considering a combination of species richness, restricted habitats, species vulnerability, and costs of management actions, small investments can result in greater expected benefit compared with management decisions that consider only species richness. Furthermore, a landscape management strategy that implements multiple actions is more efficient than managing only for one threat, or more traditional approaches that don't consider ROI. Our approach provides transparent and logical decision support for prioritizing different actions intended to abate threats associated with multiple species; it is of use when managers need a justifiable and repeatable approach to investment.
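Since the abstract emphasizes that the ROI ranking needs nothing beyond spreadsheet-level arithmetic, a few lines suffice to illustrate it: divide each action's expected species benefit by its cost and rank. The actions, benefits, and costs below are placeholders, not values from the eastern Australia case study.

```python
# Basic return-on-investment ranking of threat-management actions:
# expected benefit divided by cost. All figures are illustrative.
actions = [
    # (name, expected benefit to threatened species, cost in $k)
    ("Fence grazing exclosure, site A", 12.0, 40.0),
    ("Fox baiting, site A",              9.0, 15.0),
    ("Fence grazing exclosure, site B",  7.0, 35.0),
    ("Fox baiting + fencing, site C",   18.0, 45.0),
]
ranked = sorted(actions, key=lambda a: a[1] / a[2], reverse=True)
for name, benefit, cost in ranked:
    print(f"{name}: ROI = {benefit / cost:.3f} benefit per $k")
```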
Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S
2017-06-01
Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
Quantifying similarity in reliability surfaces using the probability of agreement
Stevens, Nathaniel T.; Anderson-Cook, Christine Michaela
2017-03-30
When separate populations exhibit similar reliability as a function of multiple explanatory variables, combining them into a single population is tempting. This can simplify future predictions and reduce uncertainty associated with estimation. However, combining these populations may introduce bias if the underlying relationships are in fact different. The probability of agreement formally and intuitively quantifies the similarity of estimated reliability surfaces across a two-factor input space. An example from the reliability literature demonstrates the utility of the approach when deciding whether to combine two populations or to keep them as distinct. As a result, new graphical summaries provide strategies for visualizing the results.
NASA Astrophysics Data System (ADS)
Hurford, Anthony; Harou, Julien
2015-04-01
Climate change has challenged conventional methods of planning water resources infrastructure investment, which rely on the stationarity of time-series data. It is not clear how best to use projections of future climatic conditions. Many-objective simulation-optimisation and trade-off analysis using evolutionary algorithms has been proposed as an approach to addressing complex planning problems with multiple conflicting objectives. The search for promising assets and policies can be carried out across a range of climate projections, to identify the configurations of infrastructure investment shown by model simulation to be robust under diverse future conditions. Climate projections can be used in different ways within a simulation model to represent the range of possible future conditions and to understand how optimal investments vary according to the different hydrological conditions. We compare two approaches: optimising over an ensemble of different 20-year flow and PET time-series projections, and optimising separately for individual future scenarios built synthetically from the original ensemble. Comparing the trade-off curves and surfaces generated by the two approaches helps understand the limits and benefits of optimising under different sets of conditions. The comparison is made for the Tana Basin in Kenya, where climate change combined with multiple conflicting objectives of water management and infrastructure investment makes decision-making particularly challenging.
Multi-scale mechanics of granular solids from grain-resolved X-ray measurements
NASA Astrophysics Data System (ADS)
Hurley, R. C.; Hall, S. A.; Wright, J. P.
2017-11-01
This work discusses an experimental technique for studying the mechanics of three-dimensional (3D) granular solids. The approach combines 3D X-ray diffraction and X-ray computed tomography to measure grain-resolved strains, kinematics and contact fabric in the bulk of a granular solid, from which continuum strains, grain stresses, interparticle forces and coarse-grained elasto-plastic moduli can be determined. We demonstrate the experimental approach and analysis of selected results on a sample of 1099 stiff, frictional grains undergoing multiple uniaxial compression cycles. We investigate the inter-particle force network, elasto-plastic moduli and associated length scales, reversibility of mechanical responses during cyclic loading, the statistics of microscopic responses and microstructure-property relationships. This work serves to highlight both the fundamental insight into granular mechanics that is furnished by combined X-ray measurements and describes future directions in the field of granular materials that can be pursued with such approaches.
Fieuws, Steffen; Willems, Guy; Larsen-Tangmose, Sara; Lynnerup, Niels; Boldsen, Jesper; Thevissen, Patrick
2016-03-01
When an estimate of age is needed, typically multiple indicators are present, as found in skeletal or dental information. There exists a vast literature on approaches to estimating age from such multivariate data. Application of Bayes' rule has been proposed to overcome drawbacks of classical regression models but becomes less trivial as soon as the number of indicators increases. Each of the age indicators can lead to a different point estimate ("the most plausible value for age") and a prediction interval ("the range of possible values"). The major challenge in the combination of multiple indicators is not the calculation of a combined point estimate for age but the construction of an appropriate prediction interval. Ignoring the correlation between the age indicators results in intervals being too small. Boldsen et al. (2002) presented an ad-hoc procedure to construct an approximate confidence interval without the need to model the multivariate correlation structure between the indicators. The aim of the present paper is to bring this pragmatic approach under attention and to evaluate its performance in a practical setting. This is all the more needed since recent publications ignore the need for interval estimation. To illustrate and evaluate the method, the third molar scores of Köhler et al. (1995) are used to estimate age in a dataset of 3200 male subjects in the juvenile age range.
NASA Astrophysics Data System (ADS)
Croce, Pierpaolo; Zappasodi, Filippo; Merla, Arcangelo; Chiarelli, Antonio Maria
2017-08-01
Objective. Electrical and hemodynamic brain activity are linked through the neurovascular coupling process and they can be simultaneously measured through integration of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Thanks to the lack of electro-optical interference, the two procedures can be easily combined and, whereas EEG provides electrophysiological information, fNIRS can provide measurements of two hemodynamic variables, such as oxygenated and deoxygenated hemoglobin. A Bayesian sequential Monte Carlo approach (particle filter, PF) was applied to simulated recordings of electrical and neurovascular mediated hemodynamic activity, and the advantages of a unified framework were shown. Approach. Multiple neural activities and hemodynamic responses were simulated in the primary motor cortex of a subject brain. EEG and fNIRS recordings were obtained by means of forward models of volume conduction and light propagation through the head. A state space model of combined EEG and fNIRS data was built and its dynamic evolution was estimated through a Bayesian sequential Monte Carlo approach (PF). Main results. We showed the feasibility of the procedure and the improvements in both electrical and hemodynamic brain activity reconstruction when using the PF on combined EEG and fNIRS measurements. Significance. The investigated procedure allows one to combine the information provided by the two methodologies, and, by taking advantage of a physical model of the coupling between electrical and hemodynamic response, to obtain a better estimate of brain activity evolution. Despite the high computational demand, application of such an approach to in vivo recordings could fully exploit the advantages of this combined brain imaging technology.
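A generic bootstrap particle filter, the sequential Monte Carlo machinery the study applies, can be sketched briefly. The dynamics, observation function, noise levels, and the one-dimensional state below are placeholders for the combined neural/hemodynamic state-space model, which the paper builds from forward models of volume conduction and light propagation.

```python
import numpy as np

def particle_filter(observations, f, h, q_std, r_std, n_particles=1000, seed=0):
    """Bootstrap particle filter: propagate particles through the state
    transition f, weight by the Gaussian likelihood of each observation
    under the measurement model h, estimate the posterior mean, resample."""
    rng = np.random.default_rng(seed)
    particles = rng.standard_normal(n_particles)
    estimates = []
    for y in observations:
        particles = f(particles) + q_std * rng.standard_normal(n_particles)
        log_w = -0.5 * ((y - h(particles)) / r_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(np.sum(w * particles))         # posterior mean
        idx = rng.choice(n_particles, n_particles, p=w)  # multinomial resampling
        particles = particles[idx]
    return np.array(estimates)

# Toy usage: latent AR(1) state observed through a nonlinearity.
rng = np.random.default_rng(1)
x, ys = 0.0, []
for _ in range(100):
    x = 0.9 * x + 0.3 * rng.standard_normal()
    ys.append(np.tanh(x) + 0.1 * rng.standard_normal())
xhat = particle_filter(np.array(ys), lambda s: 0.9 * s, np.tanh, 0.3, 0.1)
```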
Elhassadi, Ezzat; Murphy, Maurice; Hacking, Dayle; Farrell, Michael
2018-04-01
CNS myelomatous involvement is a rare complication of multiple myeloma with a dismal outcome. The optimal treatment of this disease remains unclear. A combined approach of systemic therapy, radiotherapy, and intrathecal chemotherapy should be considered, and autologous stem cell transplant consolidation should be offered to eligible patients. The role of Daratumumab in this disease deserves further evaluation.
Disc Activities in Physical Education: A Comprehensive Approach
ERIC Educational Resources Information Center
Cramer, Stanley J.
2017-01-01
Nearly everyone who throws a disc associates the activity with fun. Over the years, multiple disc games and activities have been invented, combining fun and learning. These are games that many individuals are likely to continue playing long after they have left school and are worthy of being included in a contemporary physical education program.…
ERIC Educational Resources Information Center
Wang, Tzu-Hua
2010-01-01
This research combines the idea of the cake format dynamic assessment defined by Sternberg and Grigorenko (2001) and the "graduated prompt approach" proposed by Campione and Brown (1985, 1987) to develop a multiple-choice Web-based dynamic assessment system. This research adopts a quasi-experimental design to…
ERIC Educational Resources Information Center
Hegde, Balasubrahmanya; Meera, B. N.
2012-01-01
A perceived difficulty is associated with physics problem solving from a learner's viewpoint, arising out of a multitude of reasons. In this paper, we have examined the microstructure of students' thought processes during physics problem solving by combining the analysis of responses to multiple-choice questions and semistructured student…
gPhysics--Using Smart Glasses for Head-Centered, Context-Aware Learning in Physics Experiments
ERIC Educational Resources Information Center
Kuhn, Jochen; Lukowicz, Paul; Hirth, Michael; Poxrucker, Andreas; Weppner, Jens; Younas, Junaid
2016-01-01
Smart Glasses such as Google Glass are mobile computers combining classical Head-Mounted Displays (HMD) with several sensors. Therefore, contact-free, sensor-based experiments can be linked with relating, near-eye presented multiple representations. We will present a first approach on how Smart Glasses can be used as an experimental tool for…
Much Needed Structure: Structured Decision-Making with DMRCS (Define-Measure-Reduce-Combine-Select)
Anderson-Cook, Christine M.; Lu, Lu
2015-10-01
We have described a new DMRCS process for structured decision making, which mirrors the DMAIC process that has become so popular within Lean Six Sigma. By dividing a complex, often unstructured process into distinct steps, we hope to have made the task of balancing multiple competing objectives less daunting.
NASA Astrophysics Data System (ADS)
Huang, C. L.; Hsu, N. S.; Yeh, W. W. G.; Hsieh, I. H.
2017-12-01
This study develops an innovative calibration method for regional groundwater modeling that uses multi-class empirical orthogonal functions (EOFs). The developed method is an iterative approach. Prior to carrying out the iterative procedures, the groundwater storage hydrographs associated with the observation wells are calculated. The combined multi-class EOF amplitudes and EOF expansion coefficients of the storage hydrographs are then used to compute the initial guess of the temporal and spatial pattern of multiple recharges. The initial guesses of the hydrogeological parameters are also assigned according to in-situ pumping experiments. The recharges include net rainfall recharge and boundary recharge, and the hydrogeological parameters are riverbed leakage conductivity, horizontal hydraulic conductivity, vertical hydraulic conductivity, storage coefficient, and specific yield. The first step of the iterative algorithm is to run the numerical model (i.e., MODFLOW) with the initial guesses or adjusted values of the recharges and parameters. Second, in order to determine the best EOF combination of the error storage hydrographs for determining the correction vectors, the objective function is devised as minimizing the root mean square error (RMSE) of the simulated storage hydrographs. The error storage hydrographs are the differences between the storage hydrographs computed from observed and simulated groundwater level fluctuations. Third, the values of the recharges and parameters are adjusted and the iterative procedures repeated until the stopping criterion is reached. The established methodology was applied to the groundwater system of Ming-Chu Basin, Taiwan. The study period is from January 1 to December 2, 2012. Results showed that the optimal EOF combination for the multiple recharges and hydrogeological parameters can decrease the RMSE of the simulated storage hydrographs dramatically within three calibration iterations, indicating that the iterative approach using EOF techniques can capture the groundwater flow tendency and detect the correction vectors for the sources of simulation error. Hence, the established EOF-based methodology can effectively and accurately identify the multiple recharges and hydrogeological parameters.
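The EOF decomposition at the heart of the method can be sketched with a singular value decomposition of the storage hydrograph matrix. The synthetic well-by-time array, the 95% variance cut-off, and all names below are illustrative assumptions, not the study's data.

```python
import numpy as np

# storage: (n_wells, n_times) groundwater-storage hydrographs,
# one row per observation well (synthetic stand-in data).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 365)
storage = (np.outer(rng.normal(size=20), np.sin(2 * np.pi * t))
           + np.outer(rng.normal(size=20), t)
           + 0.05 * rng.normal(size=(20, 365)))

anomaly = storage - storage.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)

eof_amplitudes = U * s             # spatial pattern of each EOF over wells
expansion_coeffs = Vt              # temporal expansion coefficients
explained = s**2 / np.sum(s**2)    # variance fraction per EOF
k = np.searchsorted(np.cumsum(explained), 0.95) + 1
print(f"{k} EOFs capture 95% of the storage-hydrograph variability")
```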
Local classifier weighting by quadratic programming.
Cevikalp, Hakan; Polikar, Robi
2008-10-01
It has been widely accepted that classification accuracy can be improved by combining outputs of multiple classifiers. However, how to combine multiple classifiers with various (potentially conflicting) decisions is still an open problem. A rich collection of classifier combination procedures, many of which are heuristic in nature, has been developed for this goal. In this brief, we describe a dynamic approach to combining classifiers that have expertise in different regions of the input space. To this end, we use local classifier accuracy estimates to weight classifier outputs. Specifically, we estimate local recognition accuracies of classifiers near a query sample by utilizing its nearest neighbors, and then use these estimates to find the best weights of classifiers to label the query. The problem is formulated as a convex quadratic optimization problem, which returns optimal nonnegative classifier weights with respect to the chosen objective function, and the weights ensure that locally most accurate classifiers are weighted more heavily when labeling the query sample. Experimental results on several data sets indicate that the proposed weighting scheme outperforms other popular classifier combination schemes, particularly on problems with complex decision boundaries. Hence, the results indicate that local classification-accuracy-based combination techniques are well suited for decision making when the classifiers are trained by focusing on different regions of the input space.
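As an illustration of the weighting step, the sketch below solves a small convex QP over nonnegative, sum-to-one weights; the objective is illustrative (reward locally accurate classifiers with an L2 penalty), not the brief's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def local_weights(local_acc, lam=0.1):
    """Nonnegative classifier weights from local accuracy estimates via a convex QP."""
    local_acc = np.asarray(local_acc, dtype=float)
    k = len(local_acc)
    objective = lambda w: -local_acc @ w + lam * w @ w   # convex quadratic
    simplex = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
    res = minimize(objective, np.full(k, 1.0 / k),
                   bounds=[(0, None)] * k, constraints=simplex, method='SLSQP')
    return res.x

# accuracies estimated from the query's nearest neighbors (illustrative numbers)
w = local_weights([0.9, 0.6, 0.75])   # locally most accurate classifier weighted most
```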
Reducing uncertainties in decadal variability of the global carbon budget with multiple datasets
Li, Wei; Ciais, Philippe; Wang, Yilong; Peng, Shushi; Broquet, Grégoire; Ballantyne, Ashley P.; Canadell, Josep G.; Cooper, Leila; Friedlingstein, Pierre; Le Quéré, Corinne; Myneni, Ranga B.; Peters, Glen P.; Piao, Shilong; Pongratz, Julia
2016-01-01
Conventional calculations of the global carbon budget infer the land sink as a residual between emissions, atmospheric accumulation, and the ocean sink. Thus, the land sink accumulates the errors from the other flux terms and bears the largest uncertainty. Here, we present a Bayesian fusion approach that combines multiple observations in different carbon reservoirs to optimize the land (B) and ocean (O) carbon sinks, land use change emissions (L), and indirectly fossil fuel emissions (F) from 1980 to 2014. Compared with the conventional approach, Bayesian optimization decreases the uncertainties in B by 41% and in O by 46%. The L uncertainty decreases by 47%, whereas the F uncertainty is marginally improved through the knowledge of natural fluxes. Both ocean and net land uptake (B + L) rates have positive trends of 29 ± 8 and 37 ± 17 Tg C·yr⁻² since 1980, respectively. Our Bayesian fusion of multiple observations reduces uncertainties, thereby allowing us to isolate important variability in global carbon cycle processes. PMID:27799533
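The reported uncertainty reductions follow the general logic of precision-weighted Bayesian fusion: combining independent constraints yields a posterior variance smaller than any single input's. A toy sketch with illustrative numbers, not the paper's data:

```python
import numpy as np

def fuse_gaussian(estimates, sigmas):
    """Precision-weighted fusion of independent Gaussian estimates."""
    w = 1.0 / np.asarray(sigmas) ** 2                  # precisions
    mean = np.sum(w * np.asarray(estimates)) / np.sum(w)
    sigma = np.sqrt(1.0 / np.sum(w))                   # always below min(sigmas)
    return mean, sigma

# two hypothetical independent constraints on the land sink (Pg C/yr)
mean, sigma = fuse_gaussian([2.9, 3.4], [0.8, 0.9])
```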
A Hough transform global probabilistic approach to multiple-subject diffusion MRI tractography.
Aganj, Iman; Lenglet, Christophe; Jahanshad, Neda; Yacoub, Essa; Harel, Noam; Thompson, Paul M; Sapiro, Guillermo
2011-08-01
A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in this work. The proposed framework tests candidate 3D curves in the volume, assigning to each one a score computed from the diffusion images, and then selects the curves with the highest scores as the potential anatomical connections. The algorithm avoids local minima by performing an exhaustive search at the desired resolution. The technique is easily extended to multiple subjects, considering a single representative volume where the registered high-angular resolution diffusion images (HARDI) from all the subjects are non-linearly combined, thereby obtaining population-representative tracts. The tractography algorithm is run only once for the multiple subjects, and no tract alignment is necessary. We present experimental results on HARDI volumes, ranging from simulated and 1.5T physical phantoms to 7T and 4T human brain and 7T monkey brain datasets. Copyright © 2011 Elsevier B.V. All rights reserved.
Fu, Wei; Shi, Qiyuan; Prosperi, Christine; Wu, Zhenke; Hammitt, Laura L.; Feikin, Daniel R.; Baggett, Henry C.; Howie, Stephen R.C.; Scott, J. Anthony G.; Murdoch, David R.; Madhi, Shabir A.; Thea, Donald M.; Brooks, W. Abdullah; Kotloff, Karen L.; Li, Mengying; Park, Daniel E.; Lin, Wenyi; Levine, Orin S.; O’Brien, Katherine L.; Zeger, Scott L.
2017-01-01
In pneumonia, specimens are rarely obtained directly from the infection site, the lung. The pathogen causing infection is therefore determined indirectly from multiple tests on peripheral clinical specimens, which may have imperfect and uncertain sensitivity and specificity, making inference about the cause complex. Analytic approaches have included expert review of case-only results, case–control logistic regression, latent class analysis, and attributable fraction, but each has serious limitations and none naturally integrates multiple test results. The Pneumonia Etiology Research for Child Health (PERCH) study required an analytic solution appropriate for a case–control design that could incorporate evidence from multiple specimens from cases and controls and that accounted for measurement error. We describe a Bayesian integrated approach we developed that combines and extends elements of attributable fraction and latent class analyses to meet some of these challenges, and we illustrate the advantage it confers regarding the challenges identified for other methods. PMID:28575370
Testini, Mario; Piccinni, Giuseppe; Pedote, Pasquale; Lissidini, Germana; Gurrado, Angela; Lardo, Domenica; Greco, Luigi; Marzaioli, Rinaldo
2008-09-02
Shotgun injuries are the cause of increasing surgical problems related to the proliferation of firearms. Gunshot pancreaticoduodenal traumas are unusual in urban trauma units. Their management remains complex because of the absence of standardized, universal guidelines for treatment and the high incidence of associated lesions of major vessels as well as of other gastrointestinal structures. Surgical treatment is still controversial, and the possibilities offered by safe and effective mini-invasive techniques seem to open new, articulated perspectives for the treatment of pancreaticoduodenal injury complications. We present the case of a 27-year-old man with multiple penetrating gunshot trauma evolving into acute necrotizing pancreatitis, treated by combining a surgical with a mini-invasive approach. At admission, he presented a Glasgow Coma Score of 4 due to severe hemorrhagic shock. First, surgical hemostasis, duodenogastric resection, multiple intestinal resections, and peripancreatic and thoracic drainage were carried out as emergency procedures. On the 12th postoperative day, the patient underwent reoperation with toilette, external duodenal drainage with a Foley tube, and repositioning of the peripancreatic drainage as a result of a duodenal perforation due to acute necrotizing pancreatitis. Eight days later, following the accidental removal of the peripancreatic drains, a CT scan was done showing a considerable collection of fluid in the retrocavity of the epiploon. Percutaneous CT-guided drainage was performed by inserting an 8.5 Fr pigtail catheter, thus avoiding further reoperation. The patient was successfully discharged on the 80th postoperative day. The treatment of multiple pancreaticoduodenal penetrating gunshot traumas should focus on multidisciplinary surgical and minimally invasive treatment to optimize organ recovery.
2015-01-01
Background: Computer-aided drug design has a long history of being applied to discover new molecules to treat various cancers, but it has always been focused on single targets. The development of systems biology has let scientists reveal more hidden mechanisms of cancers, but attempts to apply systems biology to cancer therapies remain at preliminary stages. Our lab has successfully developed various systems biology models for several cancers. Based on these achievements, we present the first attempt to combine multiple-target therapy with systems biology. Methods: In our previous study, we identified 28 significant proteins (common core network markers) of four types of cancers as house-keeping proteins of these cancers. In this study, we ranked these proteins by summing their carcinogenesis relevance values (CRVs) across the four cancers, and then performed docking and pharmacophore modeling to carry out virtual screening of the NCI database for anti-cancer drugs. We also performed pathway analysis on these proteins using Panther and MetaCore to reveal more mechanisms of these cancer house-keeping proteins. Results: We designed several approaches to discover targets for multiple-target cocktail therapies. In the first, we identified the top 20 drugs for each of the 28 cancer house-keeping proteins and analyzed the docking poses to further understand the interaction mechanisms of these drugs. After screening for duplicates, we found that 13 of these drugs could target 11 proteins simultaneously. In the second approach, we chose the top 5 proteins with the highest summed CRVs and used them as drug targets. We built a pharmacophore and applied it to virtual screening against the Life-Chemical library for anti-cancer drugs. Based on these results, wet-lab bioscientists could freely investigate combinations of these drugs for multiple-target therapy for cancers, in contrast to traditional single-target therapy. Conclusions: Combining systems biology with computer-aided drug design could help us develop novel drug cocktails with multiple targets. We believe this will enhance the efficiency of therapeutic practice and lead to new directions for cancer therapy. PMID:26680552
Gastrointestinal Toxicities With Combined Antiangiogenic and Stereotactic Body Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pollom, Erqi L.; Deng, Lei; Pai, Reetesh K.
2015-07-01
Combining the latest targeted biologic agents with the most advanced radiation technologies has been an exciting development in the treatment of cancer patients. Stereotactic body radiation therapy (SBRT) is an ablative radiation approach that has become established for the treatment of a variety of malignancies, and it has been increasingly used in combination with biologic agents, including those targeting angiogenesis-specific pathways. Multiple reports have emerged describing unanticipated toxicities arising from the combination of SBRT and angiogenesis-targeting agents, particularly of late luminal gastrointestinal toxicities. In this review, we summarize the literature describing these toxicities, explore the biological mechanism of action of toxicity with the combined use of antiangiogenic therapies, and discuss areas of future research, so that this combination of treatment modalities can continue to be used in broader clinical contexts.
Tanpitukpongse, T P; Mazurowski, M A; Ikhena, J; Petrella, J R
2017-03-01
Alzheimer disease is a prevalent neurodegenerative disease. Computer assessment of brain atrophy patterns can help predict conversion to Alzheimer disease. Our aim was to assess the prognostic efficacy of individual versus combined regional volumetrics in 2 commercially available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer disease. Data were obtained through the Alzheimer's Disease Neuroimaging Initiative. One hundred ninety-two subjects (mean age, 74.8 years; 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1-weighted MR imaging sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant and Neuroreader. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated using a univariable approach based on individual regional brain volumes and 2 multivariable approaches (multiple regression and random forest) combining multiple volumes. On univariable analysis of 11 NeuroQuant and 11 Neuroreader regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69, NeuroQuant; 0.68, Neuroreader) and was not significantly different (P > .05) between packages. Multivariable analysis did not increase the area under the curve for either package (NeuroQuant: 0.63 with logistic regression, 0.60 with random forest; Neuroreader: 0.65 with logistic regression, 0.62 with random forest). Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in mild cognitive impairment, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. © 2017 by American Journal of Neuroradiology.
Pirpinia, Kleopatra; Bosman, Peter A N; Loo, Claudette E; Winter-Warnars, Gonneke; Janssen, Natasja N Y; Scholten, Astrid N; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja
2017-06-23
Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. Here, patient-specific, multi-objective weight optimization is needed, obtaining a mean TRE of 13.6 mm without guidance information reduced to 7.3 mm with guidance information, but also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.
Integrating vector control across diseases.
Golding, Nick; Wilson, Anne L; Moyes, Catherine L; Cano, Jorge; Pigott, David M; Velayudhan, Raman; Brooker, Simon J; Smith, David L; Hay, Simon I; Lindsay, Steve W
2015-10-01
Vector-borne diseases cause a significant proportion of the overall burden of disease across the globe, accounting for over 10% of the burden of infectious diseases. Despite the availability of effective interventions for many of these diseases, a lack of resources prevents their effective control. Many existing vector control interventions are known to be effective against multiple diseases, so combining vector control programmes to simultaneously tackle several diseases could offer more cost-effective and therefore sustainable disease reductions. The highly successful cross-disease integration of vaccine and mass drug administration programmes in low-resource settings acts as a precedent for cross-disease vector control. Whilst deliberate implementation of vector control programmes across multiple diseases has yet to be trialled on a large scale, a number of examples of 'accidental' cross-disease vector control suggest the potential of such an approach. Combining contemporary high-resolution global maps of the major vector-borne pathogens enables us to quantify overlap in their distributions and to estimate the populations jointly at risk of multiple diseases. Such an analysis shows that over 80% of the global population live in regions of the world at risk from one vector-borne disease, and more than half the world's population live in areas where at least two different vector-borne diseases pose a threat to health. Combining information on co-endemicity with an assessment of the overlap of vector control methods effective against these diseases allows us to highlight opportunities for such integration. Malaria, leishmaniasis, lymphatic filariasis, and dengue are prime candidates for combined vector control. All four of these diseases overlap considerably in their distributions, and there is a growing body of evidence for the effectiveness of insecticide-treated nets, screens, and curtains for controlling all of their vectors. The real-world effectiveness of cross-disease vector control programmes can only be evaluated by large-scale trials, but there is clear evidence of the potential of such an approach to enable greater overall health benefit using the limited funds available.
Application of MSCTA combined with VRT in the operation of cervical dumbbell tumors
Wang, Wan; Lin, Jia; Knosp, Engelbert; Zhao, Yuanzheng; Xiu, Dianhui; Guo, Yongchuan
2015-01-01
Cervical dumbbell tumors pose great difficulties for neurosurgical treatment and incur a remarkable local recurrence rate, a formidable problem for neurosurgery. As routine preoperative evaluation schemes, MRI and CT fail to reveal the mutual three-dimensional relationships between the tumor and adjacent structures. Here, we report the clinical application of MSCTA and VRT in three-dimensional reconstruction of cervical dumbbell tumors. From January 2012 to July 2014, 24 patients diagnosed with cervical dumbbell tumor were retrospectively analyzed. All enrolled patients underwent preoperative MSCTA/VRT image reconstruction to explore the three-dimensional stereoscopic anatomical relationships among the neuroma, spinal cord, and vertebral artery, so that the optimal surgical approach could be selected from multiple configurations before surgical practice. The three-dimensional mutual anatomical relationships among tumor, adjacent vessels, and vertebrae were vividly reconstructed by MSCTA/VRT in all patients, in accordance with intraoperative findings. Selecting the optimal surgical approach from multiple configurations contributed to total resection of the tumor, minimal damage to vessels and nerves, and maximal maintenance of cervical spine stability. Preoperative MSCTA/VRT thus supports reconstruction of the three-dimensional stereoscopic anatomical relationships between a cervical dumbbell tumor and adjacent structures, selection of the optimal surgical approach among multiple configurations, and reduction of intraoperative damage and postoperative complications. PMID:26550385
Automatic parameter selection for feature-based multi-sensor image registration
NASA Astrophysics Data System (ADS)
DelMarco, Stephen; Tom, Victor; Webb, Helen; Chao, Alan
2006-05-01
Accurate image registration is critical for applications such as precision targeting, geo-location, change-detection, surveillance, and remote sensing. However, the increasing volume of image data is exceeding the current capacity of human analysts to perform manual registration. This image data glut necessitates the development of automated approaches to image registration, including algorithm parameter value selection. Proper parameter value selection is crucial to the success of registration techniques. The appropriate algorithm parameters can be highly scene and sensor dependent. Therefore, robust algorithm parameter value selection approaches are a critical component of an end-to-end image registration algorithm. In previous work, we developed a general framework for multisensor image registration which includes feature-based registration approaches. In this work we examine the problem of automated parameter selection. We apply the automated parameter selection approach of Yitzhaky and Peli to select parameters for feature-based registration of multisensor image data. The approach consists of generating multiple feature-detected images by sweeping over parameter combinations and using these images to generate estimated ground truth. The feature-detected images are compared to the estimated ground truth images to generate ROC points associated with each parameter combination. We develop a strategy for selecting the optimal parameter set by choosing the parameter combination corresponding to the optimal ROC point. We present numerical results showing the effectiveness of the approach using registration of collected SAR data to reference EO data.
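A minimal sketch of the sweep-and-select idea follows; detect (the feature detector under one parameter setting) and estimated_truth (the ground truth estimated from the swept detections) are hypothetical helpers, and optimality is taken here as distance to the ideal ROC corner (0, 1).

```python
import itertools
import numpy as np

def best_parameters(param_grid, detect, estimated_truth):
    """Sweep parameter combinations, score each by its ROC point, keep the best."""
    best, best_dist = None, np.inf
    for combo in itertools.product(*param_grid.values()):
        params = dict(zip(param_grid.keys(), combo))
        mask = detect(params)                                   # boolean feature map
        tpr = (mask & estimated_truth).sum() / estimated_truth.sum()
        fpr = (mask & ~estimated_truth).sum() / (~estimated_truth).sum()
        dist = np.hypot(fpr, 1.0 - tpr)                         # distance to (0, 1)
        if dist < best_dist:
            best, best_dist = params, dist
    return best
```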
Consensus Classification Using Non-Optimized Classifiers.
Brownfield, Brett; Lemos, Tony; Kalivas, John H
2018-04-03
Classifying samples into categories is a common problem in analytical chemistry and other fields. Classification is usually based on only one method, but numerous classifiers are available, some complex, such as neural networks, and others simple, such as k nearest neighbors. Regardless, most classification schemes require optimization of one or more tuning parameters for best classification accuracy, sensitivity, and specificity. A process not requiring exact selection of tuning parameter values would be useful. To improve classification, several ensemble approaches have been used in past work to combine classification results from multiple optimized single classifiers. The collection of classifications for a particular sample is then combined by a fusion process such as majority vote to form the final classification. Presented in this Article is a method to classify a sample by combining multiple classification methods without specifically classifying the sample by each method; that is, the classification methods are not optimized. The approach is demonstrated on three analytical data sets. The first is a beer authentication set with samples measured on five instruments, allowing fusion of multiple instruments in three ways. The second data set is composed of textile samples from three classes based on Raman spectra. This data set is used to demonstrate the ability to classify simultaneously with different data preprocessing strategies, thereby reducing the need to determine the ideal preprocessing method, a common prerequisite for accurate classification. The third data set contains three wine cultivars for three classes measured on 13 unique chemical and physical variables. In all cases, fusion of non-optimized classifiers improves classification. Also presented are atypical uses of Procrustes analysis and extended inverted signal correction (EISC) for distinguishing sample similarities to the respective classes.
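For the fusion step, a plain majority vote across all collected classifications of one sample might look like the following sketch (labels illustrative):

```python
from collections import Counter

def fused_label(predictions):
    """Majority-vote fusion over classifications from many (classifier, tuning) pairs."""
    return Counter(predictions).most_common(1)[0][0]

votes = ['brand_A', 'brand_A', 'brand_B', 'brand_A']   # hypothetical beer labels
label = fused_label(votes)                              # -> 'brand_A'
```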
Zhou, Jin J.; Cho, Michael H.; Lange, Christoph; Lutz, Sharon; Silverman, Edwin K.; Laird, Nan M.
2015-01-01
Many correlated disease variables are analyzed jointly in genetic studies in the hope of increasing power to detect causal genetic variants. One approach involves assessing the relationship between each phenotype and each single nucleotide polymorphism (SNP) individually and using a Bonferroni correction for the effective number of tests conducted. Alternatively, one can apply multivariate regression or a dimension reduction technique, such as principal component analysis (PCA), and test for association with the principal components (PCs) of the phenotypes rather than the individual phenotypes. Inspired by previous approaches that combine phenotypes to maximize heritability at individual SNPs, in this paper we propose to construct a maximally heritable phenotype (MaxH) by taking advantage of the estimated total heritability and co-heritability. The heritability and co-heritability need to be estimated only once, so our method is applicable to genome-wide scans. The MaxH phenotype is a linear combination of the individual phenotypes with increased heritability and power over the phenotypes being combined. Simulations show that the heritability and power achieved agree well with theory for large samples and two phenotypes. We compare our approach with commonly used methods and assess both the heritability and the power of the MaxH phenotype. Moreover, we provide suggestions for how to choose the phenotypes for combination. An application of our approach to a COPD genome-wide association study shows its practical relevance. PMID:26111731
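Under the usual quantitative-genetics identity h2(w) = w'Gw / w'Pw, with G the genetic and P the phenotypic covariance built from the estimated heritabilities and co-heritabilities, the maximizing weights are the leading generalized eigenvector. The sketch below uses illustrative matrices; the paper's exact estimator may differ.

```python
import numpy as np
from scipy.linalg import eigh

def maxh_weights(G, P):
    """Weights maximizing h2(w) = w'Gw / w'Pw via a generalized eigenproblem."""
    vals, vecs = eigh(G, P)            # eigenvalues returned in ascending order
    return vecs[:, -1], vals[-1]       # leading eigenvector and its heritability

G = np.array([[0.4, 0.1], [0.1, 0.3]])   # illustrative genetic covariance
P = np.array([[1.0, 0.3], [0.3, 1.0]])   # illustrative phenotypic covariance
w, h2 = maxh_weights(G, P)
```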
NASA Astrophysics Data System (ADS)
Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens
2015-04-01
The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges. The complexities are further compounded by multiple actors, frequently with conflicting interests, and multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystem management. It integrates physical process-based models, fuzzy logic, expert involvement, and stochastic simulation within a general framework. The proposed approach is applied to a water-scarce coastal arid region water management problem in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected aquifer sustainability, endangering the associated socio-economic conditions as well as the traditional social structure. Results from the developed method provide key decision alternatives that can serve as a platform for negotiation and further exploration. In addition, the approach enables systematic quantification of both probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis within the developed tool shows that the decision makers' risk-averse or risk-taking attitude may yield different rankings of decision alternatives. The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.
Expression proteomics study to determine metallodrug targets and optimal drug combinations.
Lee, Ronald F S; Chernobrovkin, Alexey; Rutishauser, Dorothea; Allardyce, Claire S; Hacker, David; Johnsson, Kai; Zubarev, Roman A; Dyson, Paul J
2017-05-08
The emerging technique termed functional identification of target by expression proteomics (FITExP) has been shown to identify the key protein targets of anti-cancer drugs. Here, we use this approach to elucidate the proteins involved in the mechanism of action of two ruthenium(II)-based anti-cancer compounds, RAPTA-T and RAPTA-EA, in breast cancer cells, revealing significant differences in the proteins upregulated. RAPTA-T causes upregulation of multiple proteins, suggesting a broad mechanism of action involving suppression of both metastasis and tumorigenicity. RAPTA-EA, bearing a GST-inhibiting ethacrynic acid moiety, causes upregulation of mainly oxidative stress-related proteins. The approach used in this work could be applied to the prediction of effective drug combinations to test in cancer chemotherapy clinical trials.
MICA: Multiple interval-based curve alignment
NASA Astrophysics Data System (ADS)
Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf
2018-01-01
MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA was already successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analyses pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA
Robust Pedestrian Tracking and Recognition from FLIR Video: A Unified Approach via Sparse Coding
Li, Xin; Guo, Rui; Chen, Chao
2014-01-01
Sparse coding is an emerging method that has been successfully applied to both robust object tracking and recognition in the vision literature. In this paper, we propose a sparse coding-based approach toward joint object tracking-and-recognition and explore its potential in the analysis of forward-looking infrared (FLIR) video to support nighttime machine vision systems. A key technical contribution of this work is to unify existing sparse coding-based approaches to tracking and recognition under the same framework, so that they can benefit from each other in a closed loop. On the one hand, tracking the same object through temporal frames allows us to achieve improved recognition performance through dynamic updating of the template/dictionary and combining of multiple recognition results; on the other hand, the recognition of individual objects facilitates the tracking of multiple objects (i.e., walking pedestrians), especially in the presence of occlusion within a crowded environment. We report experimental results on both the CASIA Pedestrian Database and our own collected FLIR video database to demonstrate the effectiveness of the proposed joint tracking-and-recognition approach. PMID:24961216
Oguz, Ozgur S; Zhou, Zhehua; Glasauer, Stefan; Wollherr, Dirk
2018-04-03
Human motor control is highly efficient in generating accurate and appropriate motor behavior for a multitude of tasks. This paper examines how kinematic and dynamic properties of the musculoskeletal system are controlled to achieve such efficiency. Even though recent studies have shown that human motor control relies on multiple models, how the central nervous system (CNS) controls this combination is not fully addressed. In this study, we utilize an Inverse Optimal Control (IOC) framework to find the combination of those internal models and how this combination changes for different reaching tasks. We conducted an experiment in which participants executed a comprehensive set of free-space reaching motions. The results show that there is a trade-off between kinematics- and dynamics-based controllers depending on the reaching task. In addition, this trade-off depends on the initial and final arm configurations, which in turn affect the musculoskeletal load to be controlled. Given this insight, we further provide a discomfort metric to demonstrate its influence on the contribution of different inverse internal models. This formulation, together with our analysis, not only supports the multiple internal models (MIMs) hypothesis but also suggests a hierarchical framework for the control of human reaching motions by the CNS.
Objective consensus from decision trees.
Putora, Paul Martin; Panje, Cedric M; Papachristofilou, Alexandros; Dal Pra, Alan; Hundsberger, Thomas; Plasswilm, Ludwig
2014-12-05
Consensus-based approaches provide an alternative to evidence-based decision making, especially in situations where high-level evidence is limited. Our aim was to demonstrate a novel source of information: objective consensus based on recommendations in decision tree format from multiple sources. Based on nine sample recommendations in decision tree format, a representative analysis was performed. The most common (mode) recommendations for each eventuality (each permutation of parameters) were determined. The same procedure was applied to real clinical recommendations for primary radiotherapy for prostate cancer. Data were collected from 16 radiation oncology centres, converted into decision tree format, and analyzed in order to determine the objective consensus. Based on information from multiple sources in decision tree format, treatment recommendations can be assessed for every parameter combination. An objective consensus can be determined by means of mode recommendations without compromise or confrontation among the parties. In the clinical example involving prostate cancer therapy, three parameters were used with two cut-off values each (Gleason score, PSA, T-stage); two cut-offs yield three intervals per parameter, resulting in a total of 3³ = 27 possible combinations per decision tree. Despite significant variations among the recommendations, a mode recommendation could be found for specific combinations of parameters. Recommendations represented as decision trees can serve as a basis for objective consensus among multiple parties.
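A minimal sketch of deriving the objective consensus, assuming each centre's decision tree is encoded as a function from a parameter setting to a recommendation (a hypothetical interface):

```python
from collections import Counter
from itertools import product

# two cut-offs per parameter give three intervals each: 3**3 = 27 combinations
levels = {'gleason': ('low', 'mid', 'high'),
          'psa': ('low', 'mid', 'high'),
          't_stage': ('low', 'mid', 'high')}

def objective_consensus(trees):
    """Mode recommendation for every parameter combination across all centres."""
    consensus = {}
    for combo in product(*levels.values()):
        case = dict(zip(levels, combo))
        recommendations = [tree(case) for tree in trees]
        consensus[combo] = Counter(recommendations).most_common(1)[0][0]
    return consensus
```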
Endedijk, Maaike D; Brekelmans, Mieke; Sleegers, Peter; Vermunt, Jan D
Self-regulated learning has benefits not only for students' academic performance in school but also for expertise development during their professional career. This study examined the validity of an instrument to measure student teachers' regulation of their learning to teach across multiple and different kinds of learning events in the context of a postgraduate professional teacher education programme. Based on an analysis of the literature, we developed a log with structured questions that could be used as a multiple-event instrument to determine the quality of student teachers' regulation of learning by combining data from multiple learning experiences. The findings showed that this structured version of the instrument measured student teachers' regulation of their learning in a valid and reliable way. Furthermore, with the aid of the Structured Learning Report, individual differences in student teachers' regulation of learning could be discerned. Together, the findings indicate that a multiple-event instrument can be used to measure regulation of learning in multiple contexts for various learning experiences at the same time, without relying on students' ability to rate themselves across all these different experiences. In this way, the instrument can make an important contribution to bridging the gap between the two dominant approaches to measuring SRL: the traditional aptitude approach and the event measurement approach.
Allelic-based gene-gene interaction associated with quantitative traits.
Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M
2009-05-01
Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level which contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power, and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to test for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB), and the Na-K-2Cl cotransporter gene (SLC12A1) that contributes to variation in diastolic blood pressure.
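As a toy illustration of scoring an allelic combination and testing its association with a quantitative trait (the paper's actual scoring rule may differ; the data here are simulated):

```python
import numpy as np
import statsmodels.api as sm

def allelic_score(geno_a, geno_b):
    """Toy score for a two-locus allelic combination from minor-allele counts (0/1/2)."""
    return geno_a * geno_b

rng = np.random.default_rng(0)
g1, g2 = rng.integers(0, 3, 200), rng.integers(0, 3, 200)
trait = 0.3 * allelic_score(g1, g2) + rng.normal(size=200)   # simulated phenotype

fit = sm.OLS(trait, sm.add_constant(allelic_score(g1, g2))).fit()
print(fit.pvalues[1])   # association of the allelic score with the trait
```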
Novakovic, Dunja; Saarinen, Jukka; Rojalin, Tatu; Antikainen, Osmo; Fraser-Miller, Sara J; Laaksonen, Timo; Peltonen, Leena; Isomäki, Antti; Strachan, Clare J
2017-11-07
Two nonlinear imaging modalities, coherent anti-Stokes Raman scattering (CARS) and sum-frequency generation (SFG), were successfully combined for sensitive multimodal imaging of multiple solid-state forms and their changes on drug tablet surfaces. Two imaging approaches were used and compared: (i) hyperspectral CARS combined with principal component analysis (PCA) and SFG imaging and (ii) simultaneous narrowband CARS and SFG imaging. Three different solid-state forms of indomethacin (the crystalline gamma and alpha forms, as well as the amorphous form) were clearly distinguished using both approaches. Simultaneous narrowband CARS and SFG imaging was faster, but hyperspectral CARS and SFG imaging has the potential to be applied to a wider variety of more complex samples. These methodologies were further used to follow crystallization of indomethacin on tablet surfaces under two storage conditions: 30 °C/23% RH and 30 °C/75% RH. Imaging with (sub)micron resolution showed that the approach allowed detection of very early stage surface crystallization. The surfaces progressively crystallized to predominantly (but not exclusively) the gamma form at lower humidity and the alpha form at higher humidity. Overall, this study suggests that multimodal nonlinear imaging is a highly sensitive, solid-state (and chemically) specific, rapid, and versatile imaging technique for understanding and hence controlling (surface) solid-state forms and their complex changes in pharmaceuticals.
Vascularized Bone Tissue Engineering: Approaches for Potential Improvement
Nguyen, Lonnissa H.; Annabi, Nasim; Nikkhah, Mehdi; Bae, Hojae; Binan, Loïc; Park, Sangwon; Kang, Yunqing
2012-01-01
Significant advances have been made in bone tissue engineering (TE) in the past decade. However, classical bone TE strategies have been hampered mainly due to the lack of vascularization within the engineered bone constructs, resulting in poor implant survival and integration. In an effort toward clinical success of engineered constructs, new TE concepts have arisen to develop bone substitutes that potentially mimic native bone tissue structure and function. Large tissue replacements have failed in the past due to the slow penetration of the host vasculature, leading to necrosis at the central region of the engineered tissues. For this reason, multiple microscale strategies have been developed to induce and incorporate vascular networks within engineered bone constructs before implantation in order to achieve successful integration with the host tissue. Previous attempts to engineer vascularized bone tissue only focused on the effect of a single component among the three main components of TE (scaffold, cells, or signaling cues) and have only achieved limited success. However, with efforts to improve the engineered bone tissue substitutes, bone TE approaches have become more complex by combining multiple strategies simultaneously. The driving force behind combining various TE strategies is to produce bone replacements that more closely recapitulate human physiology. Here, we review and discuss the limitations of current bone TE approaches and possible strategies to improve vascularization in bone tissue substitutes. PMID:22765012
Mondal, Suchismita; Rutkoski, Jessica E.; Velu, Govindan; Singh, Pawan K.; Crespo-Herrera, Leonardo A.; Guzmán, Carlos; Bhavani, Sridhar; Lan, Caixia; He, Xinyao; Singh, Ravi P.
2016-01-01
Current trends in population growth and consumption patterns continue to increase the demand for wheat, a key cereal for global food security. Further, multiple abiotic challenges due to climate change and evolving pathogens and pests pose a major concern for increasing wheat production globally. Triticeae species comprising primary, secondary, and tertiary gene pools represent a rich source of genetic diversity in wheat. The conventional breeding strategies of direct hybridization, backcrossing, and selection have successfully introgressed a number of desirable traits associated with grain yield, adaptation to abiotic stresses, disease resistance, and bio-fortification of wheat varieties. However, it is time consuming to incorporate genes conferring tolerance/resistance to multiple stresses in a single wheat variety by conventional approaches, due to limitations in screening methods and the lower probability of combining desirable alleles. Efforts to develop innovative breeding strategies and novel tools, and to utilize genetic diversity for new genes/alleles, are essential to improve productivity, reduce vulnerability to diseases and pests, and enhance nutritional quality. New technologies of high-throughput phenotyping, genome sequencing, and genomic selection are promising approaches to maximize progeny screening and selection and so accelerate the genetic gains in breeding more productive varieties. Use of cisgenic techniques to transfer beneficial alleles and their combinations within related species also offers great promise, especially for achieving durable rust resistance. PMID:27458472
Townsley, Michael; Bernasco, Wim; Ruiter, Stijn; Johnson, Shane D.; White, Gentry; Baum, Scott
2015-01-01
Objectives: This study builds on research undertaken by Bernasco and Nieuwbeerta and explores the generalizability of a theoretically derived offender target selection model in three cross-national study regions. Methods: Taking a discrete spatial choice approach, we estimate the impact of both environment- and offender-level factors on residential burglary placement in the Netherlands, the United Kingdom, and Australia. Combining cleared burglary data from all study regions in a single statistical model, we make statistical comparisons between environments. Results: In all three study regions, the likelihood an offender selects an area for burglary is positively influenced by proximity to their home, the proportion of easily accessible targets, and the total number of targets available. Furthermore, in two of the three study regions, juvenile offenders under the legal driving age are significantly more influenced by target proximity than adult offenders. Post hoc tests indicate the magnitudes of these impacts vary significantly between study regions. Conclusions: While burglary target selection strategies are consistent with opportunity-based explanations of offending, the impact of environmental context is significant. As such, the approach undertaken in combining observations from multiple study regions may aid criminology scholars in assessing the generalizability of observed findings across multiple environments. PMID:25866418
Underwater Light Regimes in Rivers from Multiple Measurement Approaches
NASA Astrophysics Data System (ADS)
Gardner, J.; Ensign, S.; Houser, J.; Doyle, M.
2017-12-01
Underwater light regimes are complex over space and time. Light in rivers is less understood than in other aquatic systems, yet light is often the limiting resource and a fundamental control on many biological and physical processes in riverine systems. We combined multiple measurement approaches (fixed-site and flowpath) to understand underwater light regimes. We measured vertical light profiles over time (fixed-site) with stationary buoys and over space and time (flowpath) with Lagrangian neutrally buoyant sensors in two large US rivers: the Upper Mississippi River in Wisconsin, USA, and the Neuse River in North Carolina, USA. Fixed-site data showed that light extinction coefficients, and therefore the depth of the euphotic zone, varied up to three-fold within a day. Flowpath data revealed the stochastic nature of light regimes from the perspective of a neutrally buoyant particle as it moves throughout the water column. On average, particles were in the euphotic zone 15-50% of the time. Combining flowpath and fixed-site data allowed spatial disaggregation of a river reach to determine whether changes in the light regime were due to space or time, as well as development of a conceptual model of the dynamic euphotic zone of rivers.
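The three-fold variation in extinction maps directly onto euphotic-zone depth under Beer-Lambert decay, I(z) = I0·exp(-kd·z), with the conventional 1%-of-surface-light definition of the euphotic zone; a small sketch:

```python
import numpy as np

def euphotic_depth(k_d):
    """Depth where light falls to 1% of surface irradiance: z_eu = ln(100) / k_d."""
    return np.log(100.0) / k_d

# a three-fold within-day change in k_d shrinks the euphotic zone three-fold
print(euphotic_depth(0.5), euphotic_depth(1.5))   # ~9.2 m vs ~3.1 m
```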
Losier, Y; Englehart, K; Hudgins, B
2007-01-01
The integration of multiple input sources within a control strategy for powered upper limb prostheses could provide smoother, more intuitive multi-joint reaching movements based on the user's intended motion. This paper presents the results of using myoelectric signals (MES) from the shoulder area in combination with the position of the shoulder as input sources to multiple linear discriminant analysis classifiers. Such an approach may provide users with control signals capable of controlling three degrees of freedom (DOF). This work is another important step in the development of hybrid systems that will enable simultaneous control of the multiple degrees of freedom used for reaching tasks in a prosthetic limb.
Directly solar-pumped iodine laser for beamed power transmission in space
NASA Technical Reports Server (NTRS)
Choi, S. H.; Meador, W. E.; Lee, J. H.
1992-01-01
A new approach to the development of a 50-kW directly solar-pumped iodine laser (DSPIL) system as a space-based power station was taken using a confocal unstable resonator (CUR). The CUR-based DSPIL has advantages such as performance enhancement, reduction of total mass, and a simplicity that alleviates the complexities inherent in the previous master oscillator/power amplifier (MOPA) configurations. In this design, a single CUR-based DSPIL with 50 kW output power was defined and compared to the MOPA-based DSPIL. Integration of multiple modules for power requirements greater than 50 kW is a physically and structurally sound approach compared to building a single large system. An integrated system of multiple modules can respond to various mission power requirements by combining and aiming the coherent beams at the user's receiver.
Katchman, Benjamin A.; Smith, Joseph T.; Obahiagbon, Uwadiae; Kesiraju, Sailaja; Lee, Yong-Kyun; O’Brien, Barry; Kaftanoglu, Korhan; Blain Christen, Jennifer; Anderson, Karen S.
2016-01-01
Point-of-care molecular diagnostics can provide efficient and cost-effective medical care, and they have the potential to fundamentally change our approach to global health. However, most existing approaches are not scalable to include multiple biomarkers. As a solution, we have combined commercial flat panel OLED display technology with protein microarray technology to enable high-density fluorescent, programmable, multiplexed biorecognition in a compact and disposable configuration with clinical-level sensitivity. Our approach leverages advances in commercial display technology to reduce pre-functionalized biosensor substrate costs to pennies per cm2. Here, we demonstrate quantitative detection of IgG antibodies to multiple viral antigens in patient serum samples with detection limits for human IgG in the 10 pg/mL range. We also demonstrate multiplexed detection of antibodies to the HPV16 proteins E2, E6, and E7, which are circulating biomarkers for cervical as well as head and neck cancers. PMID:27374875
Shape Optimization of Supersonic Turbines Using Response Surface and Neural Network Methods
NASA Technical Reports Server (NTRS)
Papila, Nilay; Shyy, Wei; Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
Turbine performance directly affects engine specific impulse, thrust-to-weight ratio, and cost in a rocket propulsion system. A global optimization framework combining the radial basis neural network (RBNN) and the polynomial-based response surface method (RSM) is constructed for shape optimization of a supersonic turbine. Based on the optimized preliminary design, shape optimization is performed for the first vane and blade of a 2-stage supersonic turbine, involving O(10) design variables. The design of experiment approach is adopted to reduce the data size needed by the optimization task. It is demonstrated that a major merit of the global optimization approach is that it enables one to adaptively revise the design space to perform multiple optimization cycles. This benefit is realized when an optimal design approaches the boundary of a pre-defined design space. Furthermore, by inspecting the influence of each design variable, one can also gain insight into the existence of multiple design choices and select the optimum design based on other factors such as stress and materials considerations.
Laine, Elodie; Carbone, Alessandra
2015-01-01
Protein-protein interactions (PPIs) are essential to all biological processes, and they represent increasingly important therapeutic targets. Here, we present a new method for accurately predicting protein-protein interfaces and understanding their properties, origins, and binding to multiple partners. In contrast to machine learning approaches, our method combines, in a rational and very straightforward way, three sequence- and structure-based descriptors of protein residues: evolutionary conservation, physico-chemical properties, and local geometry. The implemented strategy yields very precise predictions for a wide range of protein-protein interfaces and discriminates them from small-molecule binding sites. Beyond its predictive power, the approach permits dissecting interaction surfaces and unraveling their complexity. We show how the analysis of the predicted patches can foster new strategies for PPI modulation and interaction surface redesign. The approach is implemented in JET2, an automated tool based on the Joint Evolutionary Trees (JET) method for sequence-based protein interface prediction. JET2 is freely available at www.lcqb.upmc.fr/JET2. PMID:26690684
Bioengineering Strategies for Designing Targeted Cancer Therapies
Wen, Xuejun
2014-01-01
The goals of bioengineering strategies for targeted cancer therapies are (1) to deliver a high dose of an anticancer drug directly to a cancer tumor, (2) to enhance drug uptake by malignant cells, and (3) to minimize drug uptake by nonmalignant cells. Effective cancer-targeting therapies will require both passive- and active targeting strategies and a thorough understanding of physiologic barriers to targeted drug delivery. Designing a targeted therapy includes the selection and optimization of a nanoparticle delivery vehicle for passive accumulation in tumors, a targeting moiety for active receptor-mediated uptake, and stimuli-responsive polymers for control of drug release. The future direction of cancer targeting is a combinatorial approach, in which targeting therapies are designed to use multiple targeting strategies. The combinatorial approach will enable combination therapy for delivery of multiple drugs and dual ligand targeting to improve targeting specificity. Targeted cancer treatments in development and the new combinatorial approaches show promise for improving targeted anticancer drug delivery and improving treatment outcomes. PMID:23768509
Multiplex biomarker approach to cardiovascular diseases.
Adamcova, Michaela; Šimko, Fedor
2018-04-12
Personalized medicine is partly based on biomarker-guided diagnostics, therapy and prognosis, which is becoming an unavoidable concept in modern cardiology. However, the clinical significance of single-biomarker studies is rather limited. A promising novel approach involves combining multiple markers into a multiplex panel, which could refine the management of a particular patient with cardiovascular pathology. Two principally different assay formats have been developed to facilitate simultaneous quantification of multiple antigens: planar array assays and microbead assays. These approaches may help to better evaluate the complexity and dynamic nature of pathologic processes and offer substantial cost and sample savings compared with traditional enzyme-linked immunosorbent assay (ELISA) measurements. However, a multiplex multimarker approach cannot become a generally disseminated method until its analytical problems are solved and further studies confirming improved clinical outcomes are completed. These drawbacks explain why, to date, only a limited number of systematic studies have addressed the multiplex biomarker approach in cardiovascular medicine. Our perspective underscores the significant potential of the multiplex approach in a wider conceptual framework, under the close cooperation of clinical and experimental cardiologists, pathophysiologists and biochemists, so that a personalized approach based on standardized multimarker testing may improve the management of various cardiovascular pathologies and become a ubiquitous partner of population-derived evidence-based medicine.
NASA Technical Reports Server (NTRS)
DeCarvalho, N. V.; Chen, B. Y.; Pinho, S. T.; Baiz, P. M.; Ratcliffe, J. G.; Tay, T. E.
2013-01-01
A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.
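For readers unfamiliar with VCCT, the standard mode-I formula it rests on computes the energy release rate from the crack-tip nodal force and the opening displacement one element behind the tip. The sketch below works a single illustrative case with invented numbers, not values from the study.

```python
# A minimal worked example of the standard mode-I VCCT formula for a 2D
# crack modeled with four-node elements; the numbers are illustrative,
# not taken from the study.
F_y = 12.5    # nodal force at the crack-tip node pair, N
dv  = 2.0e-4  # relative opening displacement one element behind the tip, m
da  = 1.0e-3  # element length ahead of the tip (virtual crack advance), m
b   = 1.0     # unit thickness, m

G_I = (F_y * dv) / (2.0 * b * da)   # energy release rate, J/m^2
print(f"G_I = {G_I:.2f} J/m^2")     # crack advances if G_I exceeds G_Ic
```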
Errard, Audrey; Ulrichs, Christian; Kühne, Stefan; Mewis, Inga; Drungowski, Mario; Schreiner, Monika; Baldermann, Susanne
2015-11-25
Tomato is susceptible to pest infestations by both spider mites and aphids. The effects of each individual pest on plants are known, whereas multiple-pest infestations have received little attention. We studied the effects of single- versus multiple-pest infestation by Tetranychus urticae and Myzus persicae on tomato (Solanum lycopersicum) biochemistry by combining a metabolomic approach with analyses of carotenoids (UHPLC-ToF-MS) and volatiles (GC-MS). Plants responded differently to aphids and mites after 3 weeks of infestation, and multiple infestation induced a specific metabolite composition in plants. In addition, we showed that volatile emissions differed between the adaxial and abaxial leaf epidermes and identified compounds emitted particularly in response to multiple infestation (cyclohexadecane, dodecane, aromadendrene, and β-elemene). Finally, the carotenoid concentrations in leaves and stems were more affected by multiple than single infestations. Our study highlights and discusses the interplay of biotic stressors within terpenoid metabolism.
Bakrania, Anita K; Variya, Bhavesh C; Rathod, Lalaji V; Patel, Snehal S
2018-01-01
Research on triple-negative breast cancer (TNBC) has identified a plethora of therapeutic targets, making it apparent that a single target for its treatment is unlikely to suffice and creating a need for robust combination drug therapies. Paclitaxel, hailed as one of the most significant advances in chemotherapy, is limited by its low solubility and permeability. Advancing research has demonstrated the role of interferons in cancer. DEAE-Dextran, an emerging molecule with evidence of interferon induction, was utilized in the present study to develop a nanoformulation conjugated with paclitaxel, aiming to target multiple therapeutic pathways, diminish paclitaxel's adverse effects and create a specifically targeted nanosystem. The DEAE-Dextran-coated nanoformulation displayed significant synergistic cytotoxicity across the cell lines tested. Moreover, counteracting the activation of ROS by paclitaxel, the combination therapy more effectively inhibited ROS through β-interferon induction. The nanoformulation was further conjugated to FITC for internalization studies, which indicated maximum cellular uptake at 60 min post treatment, demonstrated by green FITC fluorescence lighting up the nuclear membrane. The mechanistic behavior of the nuclear-targeted nanoformulation was evaluated in in vivo xenograft studies, which showed a synergistic release of β-interferon at the target organ. Moreover, the combination nanoformulation engaged multiple mechanisms through VEGF and NOTCH1 inhibition along with dual β- and γ-interferon overexpression. Overall, the combination therapy may be a promising multifunctional nanomaterial for intranuclear drug delivery in TNBC. Copyright © 2017 Elsevier B.V. All rights reserved.
Zou, Shanmei; Fei, Cong; Wang, Chun; Gao, Zhan; Bao, Yachao; He, Meilin; Wang, Changhai
2016-01-01
Microalgae identification is extremely difficult. The efficiency of DNA barcoding for microalgae identification depends on the gene markers and analytical approaches employed, which are still under investigation. Although Scenedesmus has received much research attention for lipid production, its identification remains difficult. Here we present a comprehensive coalescent, distance- and character-based DNA barcoding analysis of 118 Scenedesmus strains based on rbcL, tufA, ITS and 16S. The four genes, and their combined data sets rbcL + tufA + ITS + 16S, rbcL + tufA and ITS + 16S, were each analyzed with GMYC, P ID, PTP, ABGD, and character-based barcoding. The three combined data sets showed a higher proportion of resolution success than the single genes. In comparison, the GMYC and PTP analyses produced more taxonomic lineages. ABGD generated varying resolution in discrimination among the single and combined data sets. Character-based barcoding proved to be the most effective approach for species discrimination in both single and combined data sets, producing consistent species identification. The integrated results recovered 11 species, five of which were revealed as potential cryptic species. We suggest that character-based DNA barcoding, together with other approaches based on multiple genes and their combined data, could be more effective in revealing microalgae diversity. PMID:27827440
Combining fungal biopesticides and insecticide-treated bednets to enhance malaria control.
Hancock, Penelope A
2009-10-01
In developing strategies to control malaria vectors, there is increased interest in biological methods that do not cause instant vector mortality, but have sublethal and lethal effects at different ages and stages in the mosquito life cycle. These techniques, particularly if integrated with other vector control interventions, may produce substantial reductions in malaria transmission due to the total effect of alterations to multiple life history parameters at relevant points in the life cycle and transmission cycle of the vector. To quantify this effect, an analytically tractable gonotrophic cycle model of mosquito-malaria interactions is developed that unites existing continuous and discrete feeding cycle approaches. As a case study, the combined use of fungal biopesticides and insecticide-treated bednets (ITNs) is considered. Low values of the equilibrium EIR and human prevalence were obtained when fungal biopesticides and ITNs were combined, even for scenarios where each intervention acting alone had relatively little impact. The effect of the combined interventions on the equilibrium EIR was at least as strong as the multiplicative effect of both interventions. For scenarios representing difficult conditions for malaria control, due to high transmission intensity and widespread insecticide resistance, the effect of the combined interventions on the equilibrium EIR was greater than the multiplicative effect, as a result of synergistic interactions between the interventions. Fungal biopesticide application was found to be most effective when ITN coverage was high, producing significant reductions in equilibrium prevalence for low levels of biopesticide coverage. By incorporating biological mechanisms relevant to vectorial capacity, continuous-time vector population models can increase their applicability to integrated vector management.
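As a numeric illustration of the multiplicative benchmark used above, the sketch below compares a hypothetical combined-intervention EIR against the product of the single-intervention effects; all values are invented for illustration and are not parameters from the model in the paper.

```python
# Illustrative check of the "at least multiplicative" combination effect on
# the equilibrium EIR; the relative reductions below are hypothetical.
eir_baseline = 100.0        # infectious bites per person per year
rel_itn      = 0.40         # EIR ratio under ITNs alone (hypothetical)
rel_fungus   = 0.55         # EIR ratio under biopesticide alone (hypothetical)

multiplicative = eir_baseline * rel_itn * rel_fungus   # null expectation
combined_model = 18.0       # hypothetical model output for both together

print(f"multiplicative expectation: {multiplicative:.1f}")
print(f"combined model output:      {combined_model:.1f}")
# combined < multiplicative indicates synergy between the interventions
print("synergistic" if combined_model < multiplicative else "sub-multiplicative")
```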
Li, Miao; Li, Jun; Zhou, Yiyu
2015-12-08
The problem of jointly detecting and tracking multiple targets from the raw observations of an infrared focal plane array is a challenging task, especially in the case of uncertain target dynamics. In this paper a multi-model labeled multi-Bernoulli (MM-LMB) track-before-detect method is proposed within the labeled random finite sets (RFS) framework. The proposed track-before-detect method consists of two parts: the MM-LMB filter and the MM-LMB smoother. For the MM-LMB filter, the original LMB filter is applied to track-before-detect based on target and measurement models, and is integrated with the interacting multiple models (IMM) approach to accommodate the uncertainty of target dynamics. For the MM-LMB smoother, taking advantage of the track labels and posterior model transition probability, the single-model single-target smoother is extended to a multi-model multi-target smoother. A Sequential Monte Carlo approach is also presented to implement the proposed method. Simulation results show the proposed method can effectively achieve tracking continuity for multiple maneuvering targets. In addition, compared with forward filtering alone, our method is more robust due to its combination of forward filtering and backward smoothing.
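To make the IMM ingredient concrete, the sketch below shows the standard model-probability mixing and update steps that let a filter hedge between motion models; the transition matrix and likelihood values are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of the interacting-multiple-models (IMM) probability
# update; the transition matrix and likelihoods are illustrative.
import numpy as np

p_trans = np.array([[0.95, 0.05],    # Markov model-transition probabilities
                    [0.10, 0.90]])   # (constant velocity <-> maneuver)
mu = np.array([0.8, 0.2])            # current model probabilities

# Mixing (prediction) step: propagate model probabilities through the chain
mu_pred = p_trans.T @ mu

# Update step: weight by each model's measurement likelihood
likelihood = np.array([0.02, 0.30])  # maneuver model explains the data better
mu_post = mu_pred * likelihood
mu_post /= mu_post.sum()

print(mu_post.round(3))   # probability mass shifts to the maneuver model
```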
FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.
Bednar, David; Beerens, Koen; Sebestova, Eva; Bendl, Jaroslav; Khare, Sagar; Chaloupkova, Radka; Prokop, Zbynek; Brezovsky, Jan; Baker, David; Damborsky, Jiri
2015-11-01
There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability were demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that the thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.
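The general shape of such energy-and-evolution filtering can be sketched as follows: retain mutations that are both predicted to stabilize and tolerated by the sequence family, then combine mutations at well-separated positions to reduce antagonism. All field names, thresholds and the distance heuristic below are assumptions for illustration, not FireProt's actual pipeline.

```python
# A rough sketch of energy-and-evolution filtering for multiple-point
# mutant design; thresholds and the additivity heuristic are assumptions.
candidates = [
    # (mutation, position, predicted ddG kcal/mol, evolutionary score)
    ("A112L", 112, -1.8, 0.7),
    ("G45P",   45, -0.9, 0.9),
    ("K210R", 210, -1.2, 0.2),   # evolutionarily disfavored -> filtered out
    ("T114V", 114, -1.5, 0.8),   # too close to A112L -> filtered out below
]

# Keep mutations that are both stabilizing and tolerated by the family
stable = [c for c in candidates if c[2] <= -0.5 and c[3] >= 0.5]

combined, used_positions = [], []
for mut in sorted(stable, key=lambda c: c[2]):             # most stabilizing first
    if all(abs(mut[1] - p) > 10 for p in used_positions):  # crude additivity filter
        combined.append(mut[0])
        used_positions.append(mut[1])

print(combined)   # multiple-point mutant to construct: ['A112L', 'G45P']
```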
Statistical inference from multiple iTRAQ experiments without using common reference standards.
Herbrich, Shelley M; Cole, Robert N; West, Keith P; Schulze, Kerry; Yager, James D; Groopman, John D; Christian, Parul; Wu, Lee; O'Meally, Robert N; May, Damon H; McIntosh, Martin W; Ruczinski, Ingo
2013-02-01
Isobaric tags for relative and absolute quantitation (iTRAQ) is a prominent mass spectrometry technology for protein identification and quantification that is capable of analyzing multiple samples in a single experiment. Frequently, iTRAQ experiments are carried out using an aliquot from a pool of all samples, or "masterpool", in one of the channels as a reference sample standard to estimate protein relative abundances in the biological samples and to combine abundance estimates from multiple experiments. In this manuscript, we show that using a masterpool is counterproductive. We obtain more precise estimates of protein relative abundance by using the available biological data instead of the masterpool and do not need to occupy a channel that could otherwise be used for another biological sample. In addition, we introduce a simple statistical method to associate proteomic data from multiple iTRAQ experiments with a numeric response and show that this approach is more powerful than the conventionally employed masterpool-based approach. We illustrate our methods using data from four replicate iTRAQ experiments on aliquots of the same pool of plasma samples and from a 406-sample project designed to identify plasma proteins that covary with nutrient concentrations in chronically undernourished children from South Asia.
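A minimal sketch of the reference-free idea in the preceding abstract follows: rather than normalizing to a masterpool channel, center each protein's log-intensities within each run to absorb run effects, then relate the centered values to the numeric response across runs. The simulation and estimator are illustrative, not the authors' exact statistical method.

```python
# Reference-free normalization across simulated iTRAQ runs: within-run
# centering replaces division by a masterpool channel.
import numpy as np

rng = np.random.default_rng(1)
n_runs, n_chan = 4, 4
response = rng.normal(size=(n_runs, n_chan))          # e.g. nutrient level
log_abund = 0.5 * response + rng.normal(scale=0.3, size=(n_runs, n_chan))
log_abund += rng.normal(scale=1.0, size=(n_runs, 1))  # run-level offsets

centered = log_abund - log_abund.mean(axis=1, keepdims=True)  # removes run effect
x = (response - response.mean(axis=1, keepdims=True)).ravel()
y = centered.ravel()

slope = (x @ y) / (x @ x)     # least-squares association estimate
print(f"estimated association: {slope:.2f} (true 0.5)")
```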
NASA Astrophysics Data System (ADS)
Wolff, J.; Jankov, I.; Beck, J.; Carson, L.; Frimel, J.; Harrold, M.; Jiang, H.
2016-12-01
It is well known that global and regional numerical weather prediction ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system for addressing the deficiencies in ensemble modeling is the use of stochastic physics to represent model-related uncertainty. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), Stochastic Perturbation of Physics Tendencies (SPPT), or some combination of all three. The focus of this study is to assess the model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) when using stochastic approaches. For this purpose, the test utilized a single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model, with ensemble members produced by employing stochastic methods. Parameter perturbations were employed in the Rapid Update Cycle (RUC) land surface model and Mellor-Yamada-Nakanishi-Niino (MYNN) planetary boundary layer scheme. Results will be presented in terms of bias, error, spread, skill, accuracy, reliability, and sharpness using the Model Evaluation Tools (MET) verification package. Due to the high level of complexity of running a frequently updating (hourly), high spatial resolution (3 km), large domain (CONUS) ensemble system, extensive high performance computing (HPC) resources were needed to meet this objective. Supercomputing resources were provided through the National Center for Atmospheric Research (NCAR) Strategic Capability (NSC) project support, allowing for a more extensive set of tests over multiple seasons, consequently leading to more robust results. Through the use of these stochastic innovations and powerful supercomputing at NCAR, further insights and advancements in ensemble forecasting at convection-permitting scales will be possible.
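For concreteness, the sketch below shows the core of an SPPT-style perturbation: each member multiplies its physics tendency by one plus a smooth, spatially correlated random pattern. The amplitude, length scale and clipping are illustrative choices, not the settings used in these experiments.

```python
# A minimal sketch of stochastically perturbed physics tendencies (SPPT);
# the pattern amplitude and correlation scale are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
nx = ny = 128
tendency = np.ones((ny, nx))             # placeholder physics tendency field

r = gaussian_filter(rng.normal(size=(ny, nx)), sigma=16)  # correlated pattern
r *= 0.3 / r.std()                       # scale to ~30% perturbation amplitude
r = np.clip(r, -0.8, 0.8)                # bound the multiplier away from -1

perturbed = tendency * (1.0 + r)         # one ensemble member's tendency
print(perturbed.min().round(2), perturbed.max().round(2))
```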
Díaz, Tania; Rodríguez, Vanina; Lozano, Ester; Mena, Mari-Pau; Calderón, Marcos; Rosiñol, Laura; Martínez, Antonio; Tovar, Natalia; Pérez-Galán, Patricia; Bladé, Joan; Roué, Gaël; de Larrea, Carlos Fernández
2017-01-01
Most patients with multiple myeloma treated with current therapies, including immunomodulatory drugs, eventually develop relapsed/refractory disease. The clinical activity of lenalidomide relies on degradation of Ikaros and the consequent reduction in IRF4 expression, both required for myeloma cell survival and involved in the regulation of MYC transcription. Thus, we sought to determine the combinational effect of an MYC-interfering therapy with lenalidomide/dexamethasone. We analyzed the potential therapeutic effect of combining the BET bromodomain inhibitor CPI203 with the lenalidomide/dexamethasone regimen in myeloma cell lines. CPI203 exerted dose-dependent cell growth inhibition in cell lines, including lenalidomide/dexamethasone-resistant cells (median response at 0.5 μM: 65.4%), characterized by G1 cell cycle blockade and a concomitant inhibition of MYC and Ikaros signaling. These effects were potentiated by the addition of lenalidomide/dexamethasone. Results were validated in primary plasma cells from patients with multiple myeloma co-cultured with the mesenchymal stromal cell line stromaNKtert. Consistently, the drug combination evoked a 50% reduction in cell proliferation, which correlated with basal Ikaros mRNA expression levels (P=0.04). Finally, in a SCID mouse xenotransplant model of myeloma, addition of CPI203 to lenalidomide/dexamethasone decreased tumor burden, evidenced by lower glucose uptake and an increase in the growth arrest marker GADD45B, with simultaneous downregulation of key transcription factors such as MYC, Ikaros and IRF4. Taken together, our data show that the combination of a BET bromodomain inhibitor with a lenalidomide-based regimen may represent a therapeutic approach to improve the response in relapsed/refractory patients with multiple myeloma, even in cases with suboptimal prior response to immunomodulatory drugs. PMID:28751557
Frano, Kristen A; Mayhew, Hannah E; Svoboda, Shelley A; Wustholz, Kristin L
2014-12-21
The analysis of paint cross-sections can reveal a remarkable amount of information about the layers and materials in a painting without visibly altering the artwork. Although a variety of analytical approaches are used to detect inorganic pigments as well as organic binders, proteins, and lipids in cross-sections, they do not provide for the unambiguous identification of natural organic colorants. Here, we develop a novel combined surface-enhanced Raman scattering (SERS), light microscopy, and normal Raman scattering (NRS) approach for the identification of red organic and inorganic pigments in paint cross-sections obtained from historic 18th- and 19th-century oil paintings. In particular, Ag nanoparticles are applied directly to localized areas of paint cross-sections mounted in polyester resin for SERS analysis of the organic pigments. This combined extractionless, non-hydrolysis SERS and NRS approach provides definitive identification of carmine lake, madder lake, and vermilion in multiple paint layers. To our knowledge, this study represents the first in situ identification of natural organic pigments within paint cross-sections from oil paintings. Furthermore, the combination of SERS and normal Raman with light microscopy provides conservators with a more comprehensive understanding of a painting from a single sample and without the need for sample pretreatment.
NASA Astrophysics Data System (ADS)
Young, M. E.; Alakomi, H.-L.; Fortune, I.; Gorbushina, A. A.; Krumbein, W. E.; Maxwell, I.; McCullagh, C.; Robertson, P.; Saarela, M.; Valero, J.; Vendrell, M.
2008-12-01
Existing chemical treatments to prevent biological damage to monuments often involve considerable amounts of potentially dangerous and even poisonous biocides. The scientific approach described in this paper aims at a drastic reduction in biocide concentrations through a polyphasic approach combining biocides with cell permeabilisers, polysaccharide and pigment inhibitors, and a photodynamic treatment. A variety of potential agents were screened to determine the most effective combination. Promising compounds were tested under laboratory conditions with cultures of rock-deteriorating bacteria, algae, cyanobacteria and fungi. A subsequent field trial involved two sandstone types with natural biofilms. These were treated with multiple combinations of chemicals and exposed to three different climatic conditions. Although treatments proved successful in the laboratory, field trials were inconclusive and further testing will be required to determine the most effective treatment regime. While the most effective combination of chemicals and their application methodology is still being optimised, results to date indicate that this is a promising and effective treatment for the control of a wide variety of potentially damaging organisms colonising stone substrates.
Matthews, J B; Staeva, T P; Bernstein, P L; Peakman, M; von Herrath, M
2010-01-01
Like many other complex human disorders of unknown aetiology, autoimmune-mediated type 1 diabetes may ultimately be controlled via a therapeutic approach that combines multiple agents, each with differing modes of action. The numerous advantages of such a strategy include the ability to minimize toxicities and realize synergies to enhance and prolong efficacy. The recognition that combinations might offer far-reaching benefits, at a time when few single agents have yet proved themselves in well-powered trials, represents a significant challenge to our ability to conceive and implement rational treatment designs. As a first step in this process, the Immune Tolerance Network, in collaboration with the Juvenile Diabetes Research Foundation, convened a Type 1 Diabetes Combination Therapy Assessment Group, the recommendations of which are discussed in this Perspective paper. PMID:20629979
Sinclair, Karen; Kinable, Els; Grosch, Kai; Wang, Jixian
2016-05-01
In current industry practice, it is difficult to assess QT effects at potential therapeutic doses based on Phase I dose-escalation trials in oncology due to data scarcity, particularly in combination trials. In this paper, we propose to use dose-concentration and concentration-QT models jointly to model the exposures and effects of multiple drugs in combination. The fitted models can then be used to make early predictions of QT prolongation to aid in choosing recommended dose combinations for further investigation. The models consider potential correlation between concentrations of the test drugs and potential drug-drug interactions at the PK and QT levels. In addition, this approach allows for the assessment of the probability of QT prolongation exceeding given thresholds of clinical significance. The performance of this approach was examined via simulation under practical scenarios for dose-escalation trials of a combination of two drugs. The simulation results show that invaluable information about QT effects at therapeutic dose combinations can be gained by the proposed approaches. Early detection of dose combinations with substantial QT prolongation is evaluated effectively through the CIs of the predicted peak QT prolongation at each dose combination. Furthermore, the probability of QT prolongation exceeding a certain threshold is also computed to support early detection of safety signals while accounting for uncertainty associated with data from Phase I studies. While the prediction of QT effects is sensitive to the dose-escalation process, this sensitivity and the limited sample size should be considered when supporting decisions on further development of particular dose combinations. Copyright © 2016 John Wiley & Sons, Ltd.
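The joint-model idea can be illustrated with a small Monte Carlo sketch: sample correlated exposures at a candidate dose pair, map them to QT prolongation through a linear concentration-QT model with an interaction term, and estimate the probability of exceeding a clinical threshold. All model forms and parameter values below are invented for illustration, not the models fitted in the paper.

```python
# A minimal sketch of joint PK/QT prediction for a two-drug combination;
# all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
dose_a, dose_b = 100.0, 50.0              # mg, a candidate dose combination

# Log-normal peak concentrations with correlation between the two drugs
z = rng.multivariate_normal([0, 0], [[0.09, 0.03], [0.03, 0.09]], size=n)
c_a = 0.02 * dose_a * np.exp(z[:, 0])     # dose-proportional exposure
c_b = 0.05 * dose_b * np.exp(z[:, 1])

# Linear concentration-QT model with a small PD interaction term
dqt = 2.0 * c_a + 1.5 * c_b + 0.3 * c_a * c_b + rng.normal(0, 2.0, size=n)

prob = (dqt > 10.0).mean()
print(f"P(dQT > 10 ms) at ({dose_a}, {dose_b}) mg: {prob:.2f}")
```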
Fabbris, G.; Hücker, M.; Gu, G. D.; ...
2016-07-14
Some of the most exotic material properties derive from electronic states with short correlation length (~10-500 Å), suggesting that the local structural symmetry may play a relevant role in their behavior. In this study, we discuss the combined use of polarized x-ray absorption fine structure and x-ray diffraction at high pressure as a powerful method to tune and probe structural and electronic orders at multiple length scales. Besides addressing some of the technical challenges associated with such experiments, we illustrate this approach with results obtained in the cuprate La1.875Ba0.125CuO4, in which the response of electronic order to pressure can only be understood by probing the structure at the relevant length scales.
Motl, Robert W; Sandroff, Brian M; DeLuca, John
2016-07-01
The current review develops a rationale and framework for examining the independent and combined effects of exercise training and cognitive rehabilitation on walking and cognitive functions in persons with multiple sclerosis (MS). To do so, we first review evidence for improvements in walking and cognitive outcomes with exercise training and cognitive rehabilitation in MS. We then review evidence regarding cognitive-motor coupling and possible cross-modality transfer effects of exercise training and cognitive rehabilitation. We lastly present a macro-level framework for considering mechanisms that might explain improvements in walking and cognitive dysfunction with exercise and cognitive rehabilitation individually and combined in MS. We conclude that researchers should consider examining the effects of exercise training and cognitive rehabilitation on walking, cognition, and cognitive-motor interactions in MS and the possible physiological and central mechanisms for improving these functions. © The Author(s) 2015.
The role of insulin pump therapy for type 2 diabetes mellitus.
Landau, Zohar; Raz, Itamar; Wainstein, Julio; Bar-Dayan, Yosefa; Cahn, Avivit
2017-01-01
Many patients with type 2 diabetes fail to achieve adequate glucose control despite escalation of treatment and combinations of multiple therapies including insulin. Patients with long-standing type 2 diabetes often suffer from the combination of severe insulin deficiency in addition to insulin resistance, thereby requiring high doses of insulin delivered in multiple injections to attain adequate glycemic control. Insulin-pump therapy was first introduced in the 1970s as an approach to mimic physiological insulin delivery and attain normal glucose in patients with type 1 diabetes. The recent years have seen an increase in the use of this technology for patients with type 2 diabetes. This article summarizes the clinical studies evaluating insulin pump use in patients with type 2 diabetes and discusses the benefits and shortcomings of pump therapy in this population. Copyright © 2016 John Wiley & Sons, Ltd.
Parikh, Nidhi; Hayatnagarkar, Harshal G; Beckman, Richard J; Marathe, Madhav V; Swarup, Samarth
2016-11-01
We describe a large-scale simulation of the aftermath of a hypothetical 10 kT improvised nuclear detonation at ground level, near the White House in Washington DC. We take a synthetic information approach, in which multiple data sets are combined to construct a synthesized representation of the population of the region with accurate demographics, as well as four infrastructures: transportation, healthcare, communication, and power. In this article, we focus on the model of agents and their behavior, which is represented using the options framework. Six different behavioral options are modeled: household reconstitution, evacuation, healthcare-seeking, worry, shelter-seeking, and aiding & assisting others. Agent decision-making takes into account their health status, information about family members, information about the event, and their local environment. We combine these behavioral options into five behavior models of increasing complexity and run a number of simulations to compare the models.
Array-based sensing using nanoparticles: an alternative approach for cancer diagnostics.
Le, Ngoc D B; Yazdani, Mahdieh; Rotello, Vincent M
2014-07-01
Array-based sensing using nanoparticles (NPs) provides an attractive alternative to specific biomarker-focused strategies for cancer diagnosis. The physical and chemical properties of NPs provide both the recognition and transduction capabilities required for biosensing. Array-based sensors utilize a combined response from the interactions between sensors and analytes to generate a distinct pattern (fingerprint) for each analyte. These interactions can be the result of either the combination of multiple specific biomarker recognition (specific binding) or multiple selective binding responses, known as chemical nose sensing. The versatility of the latter array-based sensing using NPs can facilitate the development of new personalized diagnostic methodologies in cancer diagnostics, a necessary evolution in the current healthcare system to better provide personalized treatments. This review will describe the basic principle of array-based sensors, along with providing examples of both invasive and noninvasive samples used in cancer diagnosis.
A powerful approach reveals numerous expression quantitative trait haplotypes in multiple tissues.
Ying, Dingge; Li, Mulin Jun; Sham, Pak Chung; Li, Miaoxin
2018-04-26
Recently, many studies have shown that single nucleotide polymorphisms (SNPs) affect gene expression and contribute to the development of complex traits/diseases in a tissue context-dependent manner. However, little is known about the influence of haplotypes on gene expression and complex traits, which reflects the interaction effect between SNPs. In the present study, we first proposed a regulatory region guided eQTL haplotype association analysis approach, and then systematically investigated expression quantitative trait loci (eQTL) haplotypes in 20 different tissues with it. The approach reduces the computational burden by using regulatory predictions to select candidate SNPs and by correcting for multiple testing over non-independent haplotypes. The application results in multiple tissues showed that haplotype-based eQTLs not only increased the number of eQTL genes in a tissue-specific manner, but were also enriched in loci associated with complex traits in a tissue-matched manner. In addition, we found that tag SNPs of eQTL haplotypes from whole blood were selectively enriched in certain combinations of regulatory elements (e.g. promoters and enhancers) according to predicted chromatin states. In summary, this eQTL haplotype detection approach, together with the application results, sheds light on the synergistic effect of sequence variants on gene expression and their contribution to complex diseases. The executable application "eHaplo" is implemented in Java and is publicly available at http://grass.cgs.hku.hk/limx/ehaplo/. Supplementary data are available at Bioinformatics online.
A graph-based approach for designing extensible pipelines
2012-01-01
In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding, because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development, such as workflow management systems, focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism in pipeline composition is necessary when each execution requires a different combination of steps. We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software tools is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline. The system has been tested on Linux and Windows platforms. Our graph-based approach enables the automatic creation of pipelines by compiling a specialised set of tools on demand, depending on the functionality required. It also allows the implementation of extensible and low-maintenance pipelines and contributes towards consolidating openness and collaboration in bioinformatics systems. It is targeted at pipeline developers and is suited for implementing applications with sequential execution steps and combined functionalities. In the format conversion application, the automatic combination of conversion tools increased both the number of possible conversions available to the user and the extensibility of the system to allow for future updates with new file formats. PMID:22788675
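The path-composition idea lends itself to a compact illustration: formats as nodes, tools as directed edges, and a breadth-first search that compiles the shortest tool chain on demand. The tool and format names below are invented placeholders, not identifiers from the project's code.

```python
# A minimal sketch of the graph idea: formats are nodes, tools are directed
# edges, and a pipeline is a path found on demand.
from collections import deque

tools = [                      # (tool name, input format, output format)
    ("ped2vcf",   "PED",   "VCF"),
    ("vcf2plink", "VCF",   "PLINK"),
    ("plink2csv", "PLINK", "CSV"),
    ("ped2csv",   "PED",   "CSV"),
]

def compose_pipeline(src, dst):
    """Breadth-first search for the shortest tool chain from src to dst."""
    queue, seen = deque([(src, [])]), {src}
    while queue:
        fmt, path = queue.popleft()
        if fmt == dst:
            return path
        for name, fin, fout in tools:
            if fin == fmt and fout not in seen:
                seen.add(fout)
                queue.append((fout, path + [name]))
    return None                # no chain of tools connects the two formats

print(compose_pipeline("PED", "CSV"))   # -> ['ped2csv'] (shortest chain wins)
```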
Park, Sung Woo; Oh, Tae Suk; Eom, Jin Sup; Sun, Yoon Chi; Suh, Hyun Suk; Hong, Joon Pio
2015-05-01
Reconstruction of the posterior trunk remains a challenge, as defects can be extensive, with deep dead space and exposed fixation devices. Our goal was to achieve tension-free closure for complex defects of the posterior trunk. From August 2006 to May 2013, 18 cases were reconstructed with multiple flaps combining perforator and local skin flaps. The reconstructions were performed using a freestyle approach, starting with propeller flap(s) in single or multilobed design and proceeding sequentially with adjacent random-pattern flaps, like fitting a puzzle. All defects achieved tensionless primary closure, and the final appearance resembled a jigsaw puzzle. The average size of the defects was 139.6 cm(2) (range, 36-345 cm(2)). A total of 26 perforator flaps were used in addition to 19 random-pattern flaps across the 18 cases. In all cases, a single perforator was used for each propeller flap. Both the defect and the donor site achieved tension-free closure. Reconstruction was 100% successful without flap loss. One case of late infection was noted 12 months after surgery. Using multilobed propeller flaps in conjunction with random-pattern flaps in a freestyle approach, resembling putting a jigsaw puzzle together, we can achieve tension-free closure by distributing tension across multiple flaps, supplying sufficient volume to obliterate dead space, with reliable vascularity because the flaps do not need to be oversized. This can be a viable approach for reconstructing extensive defects of the posterior trunk.
Wurmb, T E; Quaisser, C; Balling, H; Kredel, M; Muellenbach, R; Kenn, W; Roewer, N; Brederlau, J
2011-04-01
Whole-body multislice helical CT is becoming increasingly important as a diagnostic tool in patients with multiple injuries. Time gained in multiple-trauma patients who require emergency surgery might improve outcome. The authors hypothesised that whole-body multislice computed tomography (MSCT trauma protocol) as the initial diagnostic tool reduces the interval to the start of emergency surgery (tOR) compared with conventional radiography combined with abdominal ultrasound and organ-focused CT (conventional trauma protocol). The second goal of the study was to investigate whether the diagnostic approach chosen has an impact on outcome. The authors' level 1 trauma centre uses whole-body MSCT for the initial radiological diagnostic work-up of patients with suspected multiple trauma; before the introduction of MSCT in 2004, a conventional approach was used. Group I comprised trauma patients treated with the conventional trauma protocol from 2001 to 2003, and group II trauma patients treated with the whole-body MSCT trauma protocol from 2004 to 2006. tOR was 120 (90-150) min (median and IQR) in group I (n=155) and 105 (85-133) min (median and IQR) in group II (n=163) (p<0.05). Patients in group II had significantly more serious injuries. No difference in outcome data was found: 14 patients died within the first 30 days in each group, five of them within the first 24 h. A whole-body MSCT-based diagnostic approach to multiple trauma shortens the time interval to the start of emergency surgery in patients with multiple injuries. Mortality remained unchanged in both groups. Patients in group II were more seriously injured; an improvement in outcome might therefore be inferred.
NASA Astrophysics Data System (ADS)
Kruse, Fred A.
2015-05-01
Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and spatially coincident Hyperspectral Thermal Emission Spectrometer (HyTES) data were used to map geology and alteration for a site in northern Death Valley, California and Nevada, USA. AVIRIS data, with 224 bands at 10 nm spectral resolution over the range 0.4-2.5 μm and 3-meter spatial resolution, were converted to reflectance using an atmospheric model. HyTES data, with 256 bands at approximately 17 nm spectral resolution covering the 8-12 μm range at 4-meter spatial resolution, were converted to emissivity using a longwave infrared (LWIR) radiative transfer atmospheric compensation model and a normalized temperature-emissivity separation approach. Key spectral endmembers were separately extracted for each wavelength region and identified, and the predominant material at each pixel was mapped for each range using Mixture-Tuned Matched Filtering (MTMF), a partial unmixing approach. AVIRIS mapped iron oxides, clays, mica, and silicification (hydrothermal alteration), and the difference between calcite and dolomite. HyTES separated and mapped several igneous phases (not possible using AVIRIS), mapped silicification, and validated the separation of calcite from dolomite. Comparison of the material maps from the different modes, however, reveals complex overlap, indicating that multiple materials/processes exist in many areas. Combined and integrated analyses were performed to compare individual results and more completely characterize occurrences of multiple materials. Three approaches were used: 1) integrated full-range analysis, 2) combined multimode classification, and 3) directed combined analysis in geologic context. The results illustrate that, together, these two datasets provide an improved picture of the distribution of geologic units and subsequent alteration.
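The matched-filter core of MTMF-style partial unmixing can be sketched compactly: each pixel is projected onto the target-minus-background direction, whitened by the background covariance, giving a score near 0 for background and near 1 for a pure target pixel. The spectra below are random placeholders, not AVIRIS or HyTES endmembers, and the infeasibility component of full MTMF is omitted.

```python
# A minimal sketch of the matched-filter score underlying MTMF-style
# partial unmixing; spectra are random placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_pix, n_bands = 5000, 50
background = rng.normal(size=(n_pix, n_bands))
target = rng.normal(size=n_bands)

mu = background.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(background, rowvar=False))

d = target - mu
w = cov_inv @ d / (d @ cov_inv @ d)      # matched-filter weights

# Score ~0 for background pixels, ~abundance for target-bearing pixels
scores = (background - mu) @ w
mixed_pixel = mu + 0.6 * d               # pixel with 60% target abundance
print(scores.std().round(3), float((mixed_pixel - mu) @ w))  # second value ~0.6
```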
Psychosocial Intervention for Young Children With Chronic Tics
2018-06-18
Tourette's Syndrome; Tourette's Disorder; Tourette's Disease; Tourette Disorder; Tourette Disease; Tic Disorder, Combined Vocal and Multiple Motor; Multiple Motor and Vocal Tic Disorder, Combined; Gilles de La Tourette's Disease; Gilles de la Tourette Syndrome; Gilles De La Tourette's Syndrome; Combined Vocal and Multiple Motor Tic Disorder; Combined Multiple Motor and Vocal Tic Disorder; Chronic Motor and Vocal Tic Disorder
Whiteaker, Jeffrey R; Zhang, Heidi; Zhao, Lei; Wang, Pei; Kelly-Spratt, Karen S; Ivey, Richard G; Piening, Brian D; Feng, Li-Chia; Kasarda, Erik; Gurley, Kay E; Eng, Jimmy K; Chodosh, Lewis A; Kemp, Christopher J; McIntosh, Martin W; Paulovich, Amanda G
2007-10-01
Despite their potential to impact diagnosis and treatment of cancer, few protein biomarkers are in clinical use. Biomarker discovery is plagued with difficulties ranging from technological (inability to globally interrogate proteomes) to biological (genetic and environmental differences among patients and their tumors). We urgently need new paradigms for biomarker discovery. To minimize biological variation and facilitate testing of proteomic approaches, we employed a mouse model of breast cancer. Specifically, we performed LC-MS/MS of tumor and normal mammary tissue from a conditional HER2/Neu-driven mouse model of breast cancer, identifying 6758 peptides representing >700 proteins. We developed a novel statistical approach (SASPECT) for prioritizing proteins differentially represented in LC-MS/MS datasets and identified proteins over- or under-represented in tumors. Using a combination of antibody-based approaches and multiple reaction monitoring-mass spectrometry (MRM-MS), we confirmed the overproduction of multiple proteins at the tissue level, identified fibulin-2 as a plasma biomarker, and extensively characterized osteopontin as a plasma biomarker capable of early disease detection in the mouse. Our results show that a staged pipeline employing shotgun-based comparative proteomics for biomarker discovery and multiple reaction monitoring for confirmation of biomarker candidates is capable of finding novel tissue and plasma biomarkers in a mouse model of breast cancer. Furthermore, the approach can be extended to find biomarkers relevant to human disease.
Multiview echocardiography fusion using an electromagnetic tracking system.
Punithakumar, Kumaradevan; Hareendranathan, Abhilash R; Paakkanen, Riitta; Khan, Nehan; Noga, Michelle; Boulanger, Pierre; Becher, Harald
2016-08-01
Three-dimensional ultrasound is an emerging modality for the assessment of complex cardiac anatomy and function. The advantages of this modality include lack of ionizing radiation, portability, low cost, and high temporal resolution. Major limitations include a limited field of view, reliance on frequently limited acoustic windows, and poor signal-to-noise ratio. This study proposes a novel approach to combine multiple views into a single image using an electromagnetic tracking system in order to enlarge the field of view. The novel method has several advantages: 1) it does not rely on image information for alignment, and therefore does not require image overlap; 2) its alignment accuracy is not affected by poor image quality, as is the case for image-registration-based approaches; 3) in contrast to previous optical-tracking-based systems, the proposed approach does not suffer from line-of-sight limitations; and 4) it does not require any initial calibration. In this pilot project we showed, using a heart phantom, that our method can fuse multiple echocardiographic images and improve the field of view. Quantitative evaluations showed that the proposed method yielded a nearly optimal alignment of image data sets in three-dimensional space. The proposed method demonstrates that the electromagnetic system can be used for the fusion of multiple echocardiography images with seamless integration of sensors into the transducer.
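The tracker-based alignment idea reduces to rigid geometry: each probe pose reported by the electromagnetic tracker gives a rigid transform into a common frame, so voxels from different views can be fused without image registration. The transforms and points in the sketch below are illustrative assumptions, not the system's actual calibration data.

```python
# A minimal sketch of tracker-based view alignment via rigid transforms.
import numpy as np

def rigid_transform(rotation_z_deg, translation):
    """4x4 homogeneous transform: rotation about z plus translation."""
    a = np.radians(rotation_z_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation
    return T

# Poses reported by the tracker for two acquisitions (assumed values);
# view 1 is the identity and defines the common frame.
T_view1 = rigid_transform(0.0,  [0.0, 0.0, 0.0])
T_view2 = rigid_transform(30.0, [25.0, -5.0, 2.0])

pts_view2 = np.array([[10.0, 0.0, 0.0, 1.0]]).T     # homogeneous voxel coords
pts_common = T_view2 @ pts_view2                     # map into the common frame
print(pts_common[:3].ravel().round(2))
# Overlapping voxels from both views can then be averaged (compounded)
```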
Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.
Carriger, John F; Barron, Mace G; Newman, Michael C
2016-12-20
Rule-based weight-of-evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Conventional weight-of-evidence approaches too often ignore sources of uncertainty that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improves the ability of models to incorporate the strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables given the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative weight-of-evidence approaches in capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.
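A small numeric illustration of this probabilistic combination follows: two lines of evidence, assumed conditionally independent given the cause, are propagated through Bayes' rule to a posterior. All probabilities are invented for illustration, not values from the paper's scenarios.

```python
# Combining two lines of evidence with Bayes' rule, assuming conditional
# independence given the cause; probabilities are illustrative.
p_cause = 0.20                      # prior: stressor is the cause
p_e1 = {True: 0.70, False: 0.10}    # P(lab toxicity test positive | cause status)
p_e2 = {True: 0.60, False: 0.20}    # P(field gradient observed | cause status)

# Joint likelihoods for observing both lines of evidence
like_cause     = p_e1[True] * p_e2[True]
like_not_cause = p_e1[False] * p_e2[False]

posterior = (like_cause * p_cause) / (
    like_cause * p_cause + like_not_cause * (1 - p_cause))
print(f"P(cause | both lines of evidence) = {posterior:.2f}")   # 0.84
```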
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-08-01
Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors, but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD), whose complex care pathway spans multiple sectors. Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted, along with data visualisation designed to inform service improvement within the context of limited resources. A 'Rich Picture' was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. Published by the BMJ Publishing Group Limited.
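To show what the CART ingredient looks like in practice, the sketch below fits a shallow classification tree to synthetic data and prints the resulting risk-group rules. The features, event rates and data are invented for illustration; they are not the national audit variables used in the study.

```python
# A minimal sketch of CART-based risk grouping on synthetic audit-like data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)
n = 1000
X = np.column_stack([
    rng.normal(3.3, 0.6, n),     # weight at discharge, kg (hypothetical)
    rng.integers(0, 2, n),       # comorbidity flag (hypothetical)
])
risk = 0.05 + 0.25 * (X[:, 0] < 3.0) + 0.20 * X[:, 1]
y = rng.random(n) < risk         # simulated adverse outcome after discharge

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=50).fit(X, y)
print(export_text(tree, feature_names=["weight_kg", "comorbidity"]))
# Leaves with high event rates identify groups to target with interventions
```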
Growth factor transgenes interactively regulate articular chondrocytes.
Shi, Shuiliang; Mercer, Scott; Eckert, George J; Trippel, Stephen B
2013-04-01
Adult articular chondrocytes lack an effective repair response to correct damage from injury or osteoarthritis. Polypeptide growth factors that stimulate articular chondrocyte proliferation and cartilage matrix synthesis may augment this response. Gene transfer is a promising approach to delivering such factors. Multiple growth factor genes regulate these cell functions, but multiple growth factor gene transfer remains unexplored. We tested the hypothesis that multiple growth factor gene transfer selectively modulates articular chondrocyte proliferation and matrix synthesis. We tested the hypothesis by delivering combinations of the transgenes encoding insulin-like growth factor I (IGF-I), fibroblast growth factor-2 (FGF-2), transforming growth factor beta1 (TGF-β1), bone morphogenetic protein-2 (BMP-2), and bone morphogenetic protein-7 (BMP-7) to articular chondrocytes and measured changes in the production of DNA, glycosaminoglycan, and collagen. The transgenes differentially regulated all these chondrocyte activities. In concert, the transgenes interacted to generate widely divergent responses from the cells. These interactions ranged from inhibitory to synergistic. The transgene pair encoding IGF-I and FGF-2 maximized cell proliferation. The three-transgene group encoding IGF-I, BMP-2, and BMP-7 maximized matrix production and also optimized the balance between cell proliferation and matrix production. These data demonstrate an approach to articular chondrocyte regulation that may be tailored to stimulate specific cell functions, and suggest that certain growth factor gene combinations have potential value for cell-based articular cartilage repair. Copyright © 2012 Wiley Periodicals, Inc.
Examination of Association to Autism of Common Genetic Variation in Genes Related to Dopamine
Anderson, B.M.; Schnetz-Boutaud, N.; Bartlett, J.; Wright, H.H.; Abramson, R.K.; Cuccaro, M.L.; Gilbert, J.R.; Pericak-Vance, M.A.; Haines, J.L.
2010-01-01
Autism is a severe neurodevelopmental disorder characterized by a triad of complications. Autistic individuals display significant disturbances in language and reciprocal social interactions, combined with repetitive and stereotypic behaviors. Prevalence studies suggest that autism is more common than originally believed, with recent estimates citing a rate of one in 150. Although genome-wide screening has yielded multiple suggestive regions, a specific risk locus has yet to be identified and widely confirmed. Because many etiologies have been suggested for this complex syndrome, we hypothesize that one of the difficulties in identifying autism genes is that multiple genetic variants may be required to significantly increase the risk of developing autism. We therefore took the alternative approach of examining 14 prominent dopamine pathway candidate genes in detail by genotyping 28 SNPs. Although we did observe a nominally significant association for rs2239535 (p=.008) on chromosome 20, single-locus analysis did not reveal any result that remained significant after correction for multiple comparisons. No significant interaction was identified when Multifactor Dimensionality Reduction (MDR) was employed to test specifically for multilocus effects. Although genome-wide linkage scans in autism have provided support for linkage to various loci along the dopamine pathway, our study does not provide strong evidence of linkage or association to any specific gene or combination of genes within the pathway. These results demonstrate that common genetic variation within the tested genes of this pathway plays at most a minor to moderate role in overall autism pathogenesis. PMID:19360691
González-Domínguez, Raúl; Santos, Hugo Miguel; Bebianno, Maria João; García-Barrera, Tamara; Gómez-Ariza, José Luis; Capelo, José Luis
2016-12-15
Estuaries are very important ecosystems with great ecological and economic value, but they are usually highly impacted by anthropogenic pressure. The assessment of pollution levels in these habitats is therefore critical for evaluating their environmental quality. In this work, we combined complementary metallomic and proteomic approaches with the aim of monitoring the effects of environmental pollution on Scrobicularia plana clams captured in three estuarine systems on the south coast of Portugal: the Arade estuary, Ria Formosa and the Guadiana estuary. Multi-elemental profiling of digestive glands was carried out to evaluate the differential pollution levels in the three study areas. Proteomic analysis by two-dimensional gel electrophoresis and mass spectrometry then revealed twenty-one differential proteins, which could be associated with multiple toxicological mechanisms induced in environmentally stressed organisms. Accordingly, it can be concluded that the combination of different omic approaches has great potential in environmental research. Copyright © 2016 Elsevier Ltd. All rights reserved.
Modular Knowledge Representation and Reasoning in the Semantic Web
NASA Astrophysics Data System (ADS)
Serafini, Luciano; Homola, Martin
Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of the ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desirable to perform reasoning by combining multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on the expressivity and distinctive modeling features of each framework. We also discuss the reasoning capabilities of each framework.
In Situ Optical Mapping of Voltage and Calcium in the Heart
Ewart, Paul; Ashley, Euan A.; Loew, Leslie M.; Kohl, Peter; Bollensdorff, Christian; Woods, Christopher E.
2012-01-01
Electroanatomic mapping, the interrelation of intracardiac electrical activation with anatomic locations, has become an important tool for clinical assessment of complex arrhythmias. Optical mapping of cardiac electrophysiology combines high spatiotemporal resolution of anatomy and physiological function with fast and simultaneous data acquisition. If applied to the clinical setting, this could improve both the diagnostic potential and the therapeutic efficacy of clinical arrhythmia interventions. The aim of this study was to explore this utility in vivo using a rat model. To this end, we present a single-camera imaging and multiple light-emitting-diode illumination system that reduces the economic and technical implementation hurdles of cardiac optical mapping. Combined with a red-shifted calcium dye and a new near-infrared voltage-sensitive dye, both suitable for use in blood-perfused tissue, we demonstrate the feasibility of in vivo multi-parametric imaging of the mammalian heart. Our approach combines recording of electrophysiologically-relevant parameters with observation of structural substrates and is adaptable, in principle, to trans-catheter percutaneous approaches. PMID:22876327
Diagnosis and Treatment of Bone Disease in Multiple Myeloma: Spotlight on Spinal Involvement
Tosi, Patrizia
2013-01-01
Bone disease is observed in almost 80% of newly diagnosed symptomatic multiple myeloma patients, and the spine is the bone site most frequently affected by myeloma-induced osteoporosis, osteolyses, or compression fractures. In almost 20% of cases, spinal cord compression may occur; diagnosis and treatment must be carried out rapidly in order to avoid a permanent sensory or motor deficit. Although whole-body skeletal X-ray is considered mandatory for multiple myeloma staging, magnetic resonance imaging is presently considered the most appropriate diagnostic technique for the evaluation of vertebral alterations, as it allows detection not only of the exact morphology of the lesions, but also of the pattern of bone marrow infiltration by the disease. Multiple treatment modalities can be used to manage multiple myeloma-related vertebral lesions. Surgery or radiotherapy is mainly employed in cases of spinal cord compression, impending fractures, or intractable pain. Percutaneous vertebroplasty or balloon kyphoplasty can reduce local pain in a significant fraction of treated patients, without interfering with subsequent therapeutic programs. Systemic antimyeloma therapy with conventional chemotherapy or, more appropriately, with combinations of conventional chemotherapy and compounds acting on both neoplastic plasma cells and the bone marrow microenvironment must be initiated promptly in order to reduce bone resorption and, possibly, promote bone formation. Bisphosphonates should also be used in combination with antimyeloma therapy, as they reduce bone resorption and prolong patient survival. A multidisciplinary approach is thus needed to properly manage spinal involvement in multiple myeloma. PMID:24381787
NASA Astrophysics Data System (ADS)
Gopinath, T.; Veglia, Gianluigi
2016-06-01
Conventional multidimensional magic angle spinning (MAS) solid-state NMR (ssNMR) experiments detect the signal arising from the decay of a single coherence transfer pathway (FID), resulting in one spectrum per acquisition time. Recently, we introduced two new strategies, namely DUMAS (DUal acquisition Magic Angle Spinning) and MEIOSIS (Multiple ExperIments via Orphan SpIn operatorS), that enable the simultaneous acquisition of multidimensional ssNMR experiments using multiple coherence transfer pathways. Here, we combined the main elements of DUMAS and MEIOSIS to harness both orphan spin operators and residual polarization and increase the number of simultaneous acquisitions. We show that it is possible to acquire up to eight two-dimensional experiments using four acquisition periods per scan. This new suite of pulse sequences, called MAeSTOSO for Multiple Acquisitions via Sequential Transfer of Orphan Spin pOlarization, relies on residual polarization of both 13C and 15N pathways and combines low- and high-sensitivity experiments into a single pulse sequence using one receiver and commercial ssNMR probes. The acquisition of multiple experiments does not affect the sensitivity of the main experiment; rather it recovers the lost coherences that are discarded, resulting in a significant gain in experimental time. Both merits and limitations of this approach are discussed.
Meta-analysis identifies gene-by-environment interactions as demonstrated in a study of 4,965 mice.
Kang, Eun Yong; Han, Buhm; Furlotte, Nicholas; Joo, Jong Wha J; Shih, Diana; Davis, Richard C; Lusis, Aldons J; Eskin, Eleazar
2014-01-01
Identifying environmentally-specific genetic effects is a key challenge in understanding the structure of complex traits. Model organisms play a crucial role in the identification of such gene-by-environment interactions, as a result of the unique ability to observe genetically similar individuals across multiple distinct environments. Many model organism studies examine the same traits but under varying environmental conditions. For example, knock-out or diet-controlled studies are often used to examine cholesterol in mice. These studies, when examined in aggregate, provide an opportunity to identify genomic loci exhibiting environmentally-dependent effects. However, the straightforward application of traditional methodologies to aggregate separate studies suffers from several problems. First, environmental conditions are often variable and do not fit the standard univariate model for interactions. Additionally, applying a multivariate model results in increased degrees of freedom and low statistical power. In this paper, we jointly analyze multiple studies with varying environmental conditions using a meta-analytic approach based on a random effects model to identify loci involved in gene-by-environment interactions. Our approach is motivated by the observation that methods for discovering gene-by-environment interactions are closely related to random effects models for meta-analysis. We show that interactions can be interpreted as heterogeneity and can be detected without utilizing the traditional uni- or multi-variate approaches for discovery of gene-by-environment interactions. We apply our new method to combine 17 mouse studies containing in aggregate 4,965 distinct animals. We identify 26 significant loci involved in high-density lipoprotein (HDL) cholesterol, many of which are consistent with previous findings. Several of these loci show significant evidence of involvement in gene-by-environment interactions. An additional advantage of our meta-analysis approach is that our combined study has significantly higher power and improved resolution compared to any single study, thus explaining the large number of loci discovered in the combined study.
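As an illustration of the core statistical idea (not the authors' implementation), the following Python sketch treats a gene-by-environment interaction at one locus as between-study heterogeneity of effect sizes and tests it with Cochran's Q under a fixed-effects null; all effect sizes below are hypothetical.

    import numpy as np
    from scipy import stats

    def heterogeneity_test(beta, se):
        """Cochran's Q test for between-study heterogeneity of effects.
        Under the no-heterogeneity null, Q ~ chi2(k - 1)."""
        w = 1.0 / se**2                          # inverse-variance weights
        beta_fe = np.sum(w * beta) / np.sum(w)   # fixed-effects pooled estimate
        q = np.sum(w * (beta - beta_fe)**2)      # weighted squared deviations
        p = stats.chi2.sf(q, df=len(beta) - 1)
        return beta_fe, q, p

    # Hypothetical effects of one SNP on HDL in 5 studies with different
    # diets/knock-outs: strong heterogeneity suggests a GxE interaction.
    beta = np.array([0.10, 0.12, 0.55, 0.48, 0.09])
    se = np.array([0.05, 0.06, 0.07, 0.06, 0.05])
    print(heterogeneity_test(beta, se))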
Zepeda-Mendoza, Marie Lisandra; Bohmann, Kristine; Carmona Baez, Aldo; Gilbert, M Thomas P
2016-05-03
DNA metabarcoding is an approach for identifying multiple taxa in an environmental sample using specific genetic loci and taxa-specific primers. When combined with high-throughput sequencing it enables the taxonomic characterization of large numbers of samples in a relatively time- and cost-efficient manner. One recent laboratory development is the addition of 5'-nucleotide tags to both primers producing double-tagged amplicons and the use of multiple PCR replicates to filter erroneous sequences. However, there is currently no available toolkit for the straightforward analysis of datasets produced in this way. We present DAMe, a toolkit for the processing of datasets generated by double-tagged amplicons from multiple PCR replicates derived from an unlimited number of samples. Specifically, DAMe can be used to (i) sort amplicons by tag combination, (ii) evaluate dissimilarity among PCR replicates, and (iii) filter sequences derived from sequencing/PCR errors, chimeras, and contamination. This is attained by calculating the following parameters: (i) sequence content similarity between the PCR replicates from each sample, (ii) reproducibility of each unique sequence across the PCR replicates, and (iii) copy number of the unique sequences in each PCR replicate. We showcase the insights that can be obtained using DAMe prior to taxonomic assignment by applying it to two real datasets that vary in complexity with regard to the number of samples, sequencing libraries, PCR replicates, and tag combinations used. Finally, we use a third mock dataset to demonstrate the impact and importance of filtering the sequences with DAMe. DAMe allows the user-friendly manipulation of amplicons derived from multiple samples with PCR replicates built into a single sequencing library or spread across multiple libraries. It allows the user to: (i) collapse amplicons into unique sequences and sort them by tag combination while retaining the sample identifier and copy number information, (ii) identify sequences carrying unused tag combinations, (iii) evaluate the comparability of PCR replicates of the same sample, and (iv) filter tagged amplicons from a number of PCR replicates using parameters of minimum length, copy number, and reproducibility across the PCR replicates. This enables an efficient analysis of complex datasets and ultimately increases the ease of handling datasets from large-scale studies.
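The filtering criteria can be pictured with a short Python sketch (a hypothetical re-implementation of the stated logic, not DAMe's actual code): a unique sequence is kept only if it reaches a minimum copy number in a minimum number of PCR replicates and satisfies a length threshold.

    from collections import Counter

    def filter_amplicons(replicates, min_copies=2, min_reps=2, min_len=100):
        """Keep unique sequences seen at >= min_copies copies in at least
        min_reps PCR replicates of one sample; returns total copy numbers."""
        counts = [Counter(seqs) for seqs in replicates]
        kept = {}
        for seq in set().union(*counts):
            if len(seq) < min_len:
                continue
            reproducible = sum(1 for c in counts if c[seq] >= min_copies)
            if reproducible >= min_reps:
                kept[seq] = sum(c[seq] for c in counts)
        return kept

    # Three PCR replicates of one sample; the singleton error sequence is dropped.
    reps = [["AAAC" * 30, "AAAC" * 30, "ACGT" * 30],
            ["AAAC" * 30, "AAAC" * 30],
            ["AAAC" * 30]]
    print(filter_amplicons(reps))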
Kovalchuk, Sergey V; Funkner, Anastasia A; Metsker, Oleg G; Yakovlev, Aleksey N
2018-06-01
An approach to building a hybrid simulation of patient flow is introduced with a combination of data-driven methods for automation of model identification. The approach is described with a conceptual framework and basic methods for the combination of different techniques. The implementation of the proposed approach for simulation of the acute coronary syndrome (ACS) was developed and used in an experimental study. A combination of data, text, and process mining techniques and machine learning approaches for the analysis of electronic health records (EHRs) with discrete-event simulation (DES) and queueing theory for the simulation of patient flow was proposed. The performed analysis of EHRs for ACS patients enabled the identification of several classes of clinical pathways (CPs) which were used to implement a more realistic simulation of the patient flow. The developed solution was implemented using Python libraries (SimPy, SciPy, and others). The proposed approach enables a more realistic and detailed simulation of the patient flow within a group of related departments. An experimental study shows an improved simulation of patient length of stay for the ACS patient flow obtained from EHRs in the Almazov National Medical Research Centre in Saint Petersburg, Russia. The proposed approach, methods, and solutions provide a conceptual, methodological, and programming framework for the implementation of a simulation of complex and diverse scenarios within a flow of patients for different purposes: decision making, training, management optimization, and others. Copyright © 2018 Elsevier Inc. All rights reserved.
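Since the solution was built on SimPy, a generic discrete-event skeleton of such a patient flow looks roughly as follows (a minimal sketch with hypothetical pathway classes, rates, and capacities, not the published model):

    import random
    import simpy

    LOS = {"fast_track": 2.0, "complex": 6.0}    # hypothetical mean stays (days)

    def patient(env, ward, pathway, log):
        arrived = env.now
        with ward.request() as bed:              # queue for a free bed
            yield bed
            yield env.timeout(random.expovariate(1.0 / LOS[pathway]))
        log.append((pathway, env.now - arrived))

    def arrivals(env, ward, log):
        while True:
            yield env.timeout(random.expovariate(1.0))      # ~1 arrival/day
            pathway = random.choices(["fast_track", "complex"], [0.7, 0.3])[0]
            env.process(patient(env, ward, pathway, log))

    random.seed(1)
    env = simpy.Environment()
    ward = simpy.Resource(env, capacity=10)      # 10 beds
    log = []
    env.process(arrivals(env, ward, log))
    env.run(until=365)
    print(sum(t for _, t in log) / len(log), "mean time in system (days)")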
A Combined IRT and SEM Approach for Individual-Level Assessment in Test-Retest Studies
ERIC Educational Resources Information Center
Ferrando, Pere J.
2015-01-01
The standard two-wave multiple-indicator model (2WMIM) commonly used to analyze test-retest data provides information at both the group and item level. Furthermore, when applied to binary and graded item responses, it is related to well-known item response theory (IRT) models. In this article the IRT-2WMIM relations are used to obtain additional…
Isgut, Monica; Rao, Mukkavilli; Yang, Chunhua; Subrahmanyam, Vangala; Rida, Padmashree C G; Aneja, Ritu
2018-03-01
Modern drug discovery efforts have had mediocre success rates with increasing developmental costs, and this has encouraged pharmaceutical scientists to seek innovative approaches. Recently with the rise of the fields of systems biology and metabolomics, network pharmacology (NP) has begun to emerge as a new paradigm in drug discovery, with a focus on multiple targets and drug combinations for treating disease. Studies on the benefits of drug combinations lay the groundwork for a renewed focus on natural products in drug discovery. Natural products consist of a multitude of constituents that can act on a variety of targets in the body to induce pharmacodynamic responses that may together culminate in an additive or synergistic therapeutic effect. Although natural products cannot be patented, they can be used as starting points in the discovery of potent combination therapeutics. The optimal mix of bioactive ingredients in natural products can be determined via phenotypic screening. The targets and molecular mechanisms of action of these active ingredients can then be determined using chemical proteomics, and by implementing a reverse pharmacokinetics approach. This review article provides evidence supporting the potential benefits of natural product-based combination drugs, and summarizes drug discovery methods that can be applied to this class of drugs. © 2017 Wiley Periodicals, Inc.
An In Situ One-Pot Synthetic Approach towards Multivariate Zirconium MOFs.
Sun, Yujia; Sun, Lixian; Feng, Dawei; Zhou, Hong-Cai
2016-05-23
Chemically highly stable MOFs incorporating multiple functionalities are of great interest for applications under harsh environments. Herein, we present a facile one-pot synthetic strategy to incorporate multiple functionalities into stable Zr-MOFs from mixed ligands of different geometry and connectivity. Via our strategy, tetratopic tetrakis(4-carboxyphenyl)porphyrin (TCPP) ligands were successfully integrated into UiO-66 while maintaining the crystal structure, morphology, and ultrahigh chemical stability of UiO-66. The amount of incorporated TCPP is controllable. Through various combinations of BDC derivatives and TCPP, 49 MOFs with multiple functionalities were obtained. Among them, MOFs modified with FeTCPPCl were demonstrated to be catalytically active for the oxidation of ABTS. We anticipate that our strategy will provide a facile route to introduce multiple functionalities into stable Zr-MOFs for a wide variety of potential applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Guinn, Emily J.; Jagannathan, Bharat; Marqusee, Susan
2015-04-01
A fundamental question in protein folding is whether proteins fold through one or multiple trajectories. While most experiments indicate a single pathway, simulations suggest proteins can fold through many parallel pathways. Here, we use a combination of chemical denaturant, mechanical force and site-directed mutations to demonstrate the presence of multiple unfolding pathways in a simple, two-state folding protein. We show that these multiple pathways have structurally different transition states, and that seemingly small changes in protein sequence and environment can strongly modulate the flux between the pathways. These results suggest that in vivo, the crowded cellular environment could strongly influence the mechanisms of protein folding and unfolding. Our study resolves the apparent dichotomy between experimental and theoretical studies, and highlights the advantage of using a multipronged approach to reveal the complexities of a protein's free-energy landscape.
Lee, Barrett A.; Reardon, Sean F.; Firebaugh, Glenn; Farrell, Chad R.; Matthews, Stephen A.; O'Sullivan, David
2014-01-01
The census tract-based residential segregation literature rests on problematic assumptions about geographic scale and proximity. We pursue a new tract-free approach that combines explicitly spatial concepts and methods to examine racial segregation across egocentric local environments of varying size. Using 2000 census data for the 100 largest U.S. metropolitan areas, we compute a spatially modified version of the information theory index H to describe patterns of black-white, Hispanic-white, Asian-white, and multi-group segregation at different scales. The metropolitan structural characteristics that best distinguish micro-segregation from macro-segregation for each group combination are identified, and their effects are decomposed into portions due to racial variation occurring over short and long distances. A comparison of our results to those from tract-based analyses confirms the value of the new approach. PMID:25324575
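For reference, the aspatial multi-group information theory index H that the paper modifies can be computed as below (minimal numpy sketch; the published measure replaces tract counts with egocentric local environments of varying radius):

    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def theil_h(counts):
        """Multi-group information theory index H over (units x groups)
        population counts; 0 = no segregation, 1 = complete segregation."""
        counts = np.asarray(counts, dtype=float)
        t, T = counts.sum(axis=1), counts.sum()
        E = entropy(counts.sum(axis=0) / T)               # metro-wide entropy
        Ej = np.array([entropy(row / row.sum()) for row in counts])
        return np.sum(t * (E - Ej)) / (T * E)

    print(theil_h([[100, 0], [0, 100]]))   # 1.0: perfectly segregated units
    print(theil_h([[50, 50], [50, 50]]))   # 0.0: identical composition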
Stroet, Martin; Koziara, Katarzyna B; Malde, Alpeshkumar K; Mark, Alan E
2017-12-12
A general method for parametrizing atomic interaction functions is presented. The method is based on an analysis of surfaces corresponding to the difference between calculated and target data as a function of alternative combinations of parameters (parameter space mapping). The consideration of surfaces in parameter space, as opposed to local values or gradients, leads to a better understanding of the relationships between the parameters being optimized and a given set of target data. This in turn enables a range of target data from multiple molecules to be combined in a robust manner and the optimal region of parameter space to be trivially identified. The effectiveness of the approach is illustrated by using the method to refine the chlorine 6-12 Lennard-Jones parameters against experimental solvation free enthalpies in water and hexane as well as the density and heat of vaporization of the liquid at atmospheric pressure for a set of 10 aromatic-chloro compounds simultaneously. Single-step perturbation is used to efficiently calculate solvation free enthalpies for a wide range of parameter combinations. The capacity of this approach to parametrize accurate and transferable force fields is discussed.
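The surface-based idea can be sketched in a few lines of Python with toy target values (computed here from an arbitrarily chosen "true" parameter pair, not the paper's data): scan an (epsilon, sigma) grid, build one |calculated - target| surface per observable, and intersect the regions where every surface is below tolerance.

    import numpy as np

    def lj_energy(r, eps, sigma):
        """6-12 Lennard-Jones pair energy at separation r."""
        sr6 = (sigma / r) ** 6
        return 4.0 * eps * (sr6**2 - sr6)

    eps, sig = np.meshgrid(np.linspace(0.1, 1.0, 50),
                           np.linspace(3.0, 4.0, 50), indexing="ij")

    # Toy targets consistent with eps = 0.5, sigma = 3.4:
    targets = [(3.5, -0.268), (4.0, -0.470)]    # (r, reference energy)

    ok = np.ones_like(eps, dtype=bool)
    for r, e_ref in targets:
        ok &= np.abs(lj_energy(r, eps, sig) - e_ref) < 0.05   # one surface each

    print(np.count_nonzero(ok), "parameter combinations satisfy all targets")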
Mark-recapture with multiple, non-invasive marks.
Bonner, Simon J; Holmberg, Jason
2013-09-01
Non-invasive marks, including pigmentation patterns, acquired scars, and genetic markers, are often used to identify individuals in mark-recapture experiments. If animals in a population can be identified from multiple, non-invasive marks then some individuals may be counted twice in the observed data. Analyzing the observed histories without accounting for these errors will provide incorrect inference about the population dynamics. Previous approaches to this problem include modeling data from only one mark and combining estimators obtained from each mark separately assuming that they are independent. Motivated by the analysis of data from the ECOCEAN online whale shark (Rhincodon typus) catalog, we describe a Bayesian method to analyze data from multiple, non-invasive marks that is based on the latent-multinomial model of Link et al. (2010, Biometrics 66, 178-185). Further to this, we describe a simplification of the Markov chain Monte Carlo algorithm of Link et al. (2010, Biometrics 66, 178-185) that leads to more efficient computation. We present results from the analysis of the ECOCEAN whale shark data and from simulation studies comparing our method with the previous approaches. © 2013, The International Biometric Society.
Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik
2017-01-01
This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web-based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud- and GPU-based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the workflow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions. PMID:28208684
Cacha, L A; Parida, S; Dehuri, S; Cho, S-B; Poznanski, R R
2016-12-01
The huge number of voxels in fMRI over time poses a major challenge for effective analysis. Fast, accurate, and reliable classifiers are required for estimating the decoding accuracy of brain activities. Although machine-learning classifiers seem promising, individual classifiers have their own limitations. To address this limitation, the present paper proposes a method based on an ensemble of neural networks to analyze fMRI data for cognitive state classification across multiple subjects. Similarly, the fuzzy integral (FI) approach has been employed as an efficient tool for combining different classifiers. The FI approach led to the development of a classifier-ensemble technique that performs better than any single classifier by reducing misclassification, bias, and variance. The proposed method successfully classified the different cognitive states for multiple subjects with high classification accuracy. Comparison of the performance improvement of the ensemble of neural networks with that of the individual neural networks strongly points toward the usefulness of the proposed method.
Deloria Knoll, Maria; Fu, Wei; Shi, Qiyuan; Prosperi, Christine; Wu, Zhenke; Hammitt, Laura L; Feikin, Daniel R; Baggett, Henry C; Howie, Stephen R C; Scott, J Anthony G; Murdoch, David R; Madhi, Shabir A; Thea, Donald M; Brooks, W Abdullah; Kotloff, Karen L; Li, Mengying; Park, Daniel E; Lin, Wenyi; Levine, Orin S; O'Brien, Katherine L; Zeger, Scott L
2017-06-15
In pneumonia, specimens are rarely obtained directly from the infection site, the lung, so the pathogen causing infection is determined indirectly from multiple tests on peripheral clinical specimens, which may have imperfect and uncertain sensitivity and specificity; inference about the cause is therefore complex. Analytic approaches have included expert review of case-only results, case-control logistic regression, latent class analysis, and attributable fraction, but each has serious limitations and none naturally integrates multiple test results. The Pneumonia Etiology Research for Child Health (PERCH) study required an analytic solution appropriate for a case-control design that could incorporate evidence from multiple specimens from cases and controls and that accounted for measurement error. We describe a Bayesian integrated approach we developed that combined and extended elements of attributable fraction and latent class analyses to meet some of these challenges, and we illustrate the advantage it confers regarding the challenges identified for other methods. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
Predicting the synergy of multiple stress effects
NASA Astrophysics Data System (ADS)
Liess, Matthias; Foit, Kaarina; Knillmann, Saskia; Schäfer, Ralf B.; Liess, Hans-Dieter
2016-09-01
Toxicants and other non-chemical environmental stressors contribute to the global biodiversity crisis. Examples include the loss of bees and the reduction of aquatic biodiversity. Although non-compliance with regulations might be contributing, the widespread existence of these impacts suggests that, for example, the current approach to pesticide risk assessment fails to protect biodiversity when multiple stressors concurrently affect organisms. To quantify such multiple stress effects, we analysed all applicable aquatic studies and found that the presence of environmental stressors increases individual sensitivity to toxicants (pesticides, trace metals) by a factor of up to 100. To predict this dependence, we developed the “Stress Addition Model” (SAM). With the SAM, we assume that each individual has a general stress capacity towards all types of specific stress that should not be exhausted. Experimental stress levels are transferred into general stress levels of the SAM using the stress-related mortality as a common link. These general stress levels of independent stressors are additive, with the sum determining the total stress exerted on a population. With this approach, we provide a tool that quantitatively predicts the highly synergistic direct effects of independent stressor combinations.
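The additive logic can be illustrated with a small Python sketch; the Beta(2, 2) tolerance distribution below is purely an assumption for illustration, whereas the published SAM calibrates its stress-tolerance curve from survival experiments.

    from scipy.stats import beta

    TOLERANCE = beta(2, 2)       # illustrative stress-tolerance distribution

    def to_general_stress(mortality):
        """Transfer observed stress-related mortality onto the general
        stress scale (inverse of the tolerance CDF)."""
        return TOLERANCE.ppf(mortality)

    def combined_mortality(*mortalities):
        """General stress levels of independent stressors add; the sum is
        mapped back to a predicted total mortality."""
        s_total = sum(to_general_stress(p) for p in mortalities)
        return TOLERANCE.cdf(min(s_total, 1.0))

    # Two stressors that each cause 30% mortality alone:
    print(combined_mortality(0.3, 0.3))   # ~0.82, synergistic
    print(1 - (1 - 0.3) * (1 - 0.3))      # 0.51 under independent action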
Knowledge Discovery/A Collaborative Approach, an Innovative Solution
NASA Technical Reports Server (NTRS)
Fitts, Mary A.
2009-01-01
Collaboration between Medical Informatics and Healthcare Systems (MIHCS) at NASA/Johnson Space Center (JSC) and the Texas Medical Center (TMC) Library was established to investigate technologies for facilitating knowledge discovery across multiple life sciences research disciplines in multiple repositories. After reviewing 14 potential Enterprise Search System (ESS) solutions, Collexis was determined to best meet the expressed needs. A three-month pilot evaluation of Collexis produced positive reports from multiple scientists across 12 research disciplines. The joint venture and a pilot-phased approach achieved the desired results without the high cost of purchasing software, hardware or additional resources to conduct the task. Medical research is highly compartmentalized by discipline, e.g. cardiology, immunology, neurology. The medical research community at large, as well as at JSC, recognizes the need for cross-referencing relevant information to generate best evidence. Cross-discipline collaboration at JSC is specifically required to close knowledge gaps affecting space exploration. To facilitate knowledge discovery across these communities, MIHCS combined expertise with the TMC Library and found Collexis to best fit the needs of our researchers.
Karvetski, Christopher W; Lambert, James H; Linkov, Igor
2011-04-01
Military and industrial facilities need secure and reliable power generation. Grid outages can result in cascading infrastructure failures as well as security breaches and should be avoided. Adding redundancy and increasing reliability can require additional environmental, financial, logistical, and other considerations and resources. Uncertain scenarios consisting of emergent environmental conditions, regulatory changes, growth of regional energy demands, and other concerns result in further complications. Decisions on selecting energy alternatives are made on an ad hoc basis. The present work integrates scenario analysis and multiple criteria decision analysis (MCDA) to identify combinations of impactful emergent conditions and to perform a preliminary benefits analysis of energy and environmental security investments for industrial and military installations. Application of a traditional MCDA approach would require significant stakeholder elicitations under multiple uncertain scenarios. The approach proposed in this study develops and iteratively adjusts a scoring function for investment alternatives to find the scenarios with the most significant impacts on installation security. A robust prioritization of investment alternatives can be achieved by integrating stakeholder preferences and focusing modeling and decision-analytical tools on a few key emergent conditions and scenarios. The approach is described and demonstrated for a campus of several dozen interconnected industrial buildings within a major installation. Copyright © 2010 SETAC.
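A stripped-down version of the scenario-by-criteria scoring loop might look as follows (all alternatives, criteria, and weights are hypothetical); rank reversals across scenarios flag the emergent conditions with the greatest impact on installation security.

    import numpy as np

    # Rows: investment alternatives; columns: criteria (higher = better).
    alternatives = ["diesel backup", "microgrid + PV", "grid upgrade"]
    scores = np.array([[0.8, 0.3, 0.6],
                       [0.5, 0.9, 0.7],
                       [0.6, 0.5, 0.9]])

    # Emergent conditions shift stakeholder weights; each set is a scenario.
    scenarios = {
        "baseline":        np.array([0.5, 0.3, 0.2]),
        "fuel disruption": np.array([0.2, 0.6, 0.2]),
        "demand growth":   np.array([0.3, 0.2, 0.5]),
    }

    for name, w in scenarios.items():
        value = scores @ (w / w.sum())            # additive value score
        ranking = [alternatives[i] for i in np.argsort(-value)]
        print(f"{name:15s} -> {ranking}")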
Anwar, Shafkat; Rockefeller, Toby; Raptis, Demetrios A; Woodard, Pamela K; Eghtesady, Pirooz
2018-02-03
Patients with tetralogy of Fallot, pulmonary atresia, and multiple aortopulmonary collateral arteries (Tet PA MAPCAs) have a wide spectrum of anatomy and disease severity. Management of these patients can be challenging and often requires multiple high-risk surgical and interventional catheterization procedures. These interventions are made challenging by complex anatomy that requires the proceduralist to mentally reconstruct three-dimensional anatomic relationships from two-dimensional images. Three-dimensional (3D) printing is an emerging medical technology that provides added benefits in the management of patients with Tet PA MAPCAs. When used in combination with current diagnostic modalities and procedures, 3D printing provides a precise approach to the management of these challenging, high-risk patients. Specifically, 3D printing enables detailed surgical and interventional planning prior to the procedure, which may improve procedural outcomes, decrease complications, and reduce procedure-related radiation dose and contrast load.
Multiple Spectral-Spatial Classification Approach for Hyperspectral Data
NASA Technical Reports Server (NTRS)
Tarabalka, Yuliya; Benediktsson, Jon Atli; Chanussot, Jocelyn; Tilton, James C.
2010-01-01
A new multiple-classifier approach for spectral-spatial classification of hyperspectral images is proposed. Several classifiers are used independently to classify an image. For every pixel, if all the classifiers have assigned this pixel to the same class, the pixel is kept as a marker, i.e., a seed of the spatial region, with the corresponding class label. We propose to use spectral-spatial classifiers at the preliminary step of the marker selection procedure, each of them combining the results of a pixel-wise classification and a segmentation map. Different segmentation methods based on dissimilar principles lead to different classification results. Furthermore, a minimum spanning forest is built, where each tree is rooted on a classification-driven marker and forms a region in the spectral-spatial classification map. Experimental results are presented for two hyperspectral airborne images. The proposed method significantly improves classification accuracies compared to previously proposed classification techniques.
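The unanimous-vote marker selection step is easy to sketch in numpy (an illustrative fragment, not the authors' code); pixels on which the classifiers disagree are left unlabeled for the minimum-spanning-forest stage to resolve.

    import numpy as np

    def select_markers(label_maps):
        """A pixel becomes a marker (region seed) only if every classifier
        assigned it the same class; disagreements are marked -1."""
        stack = np.stack(label_maps)                 # (n_classifiers, H, W)
        agree = np.all(stack == stack[0], axis=0)    # unanimous pixels
        return np.where(agree, stack[0], -1)

    # Three toy 2x3 classification maps; only unanimous pixels survive.
    m1 = np.array([[1, 1, 2], [2, 3, 3]])
    m2 = np.array([[1, 2, 2], [2, 3, 1]])
    m3 = np.array([[1, 1, 2], [2, 3, 3]])
    print(select_markers([m1, m2, m3]))   # [[ 1 -1  2] [ 2  3 -1]]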
D’Amico, Emanuele; Patti, Francesco; Zanghì, Aurora; Zappia, Mario
2016-01-01
Using the term progressive multiple sclerosis (PMS), we consider a combined population of persons with secondary progressive MS (SPMS) and primary progressive MS (PPMS). These forms of MS are not effectively addressed by currently licensed therapies. In recent years, several measures of risk estimation were developed for predicting the clinical course of MS, but none is specific to the PMS forms. Personalized medicine is a therapeutic approach based on identifying what might be the best therapy for an individual patient, taking the risk profile into account. We need to achieve more accurate estimates of useful predictors in PMS, including unconventional and qualitative markers that are not yet available or practicable in routine diagnostics. The evaluation of an individual patient is based on the profile of disease activity. Within neurology, PMS is one of the fastest-moving fields. PMID:27763513
Three-dimensional spatiotemporal focusing of holographic patterns
Hernandez, Oscar; Papagiakoumou, Eirini; Tanese, Dimitrii; Fidelin, Kevin; Wyart, Claire; Emiliani, Valentina
2016-01-01
Two-photon excitation with temporally focused pulses can be combined with phase-modulation approaches, such as computer-generated holography and generalized phase contrast, to efficiently distribute light into two-dimensional, axially confined, user-defined shapes. Adding lens-phase modulations to 2D-phase holograms enables remote axial pattern displacement as well as simultaneous pattern generation in multiple distinct planes. However, the axial confinement linearly degrades with lateral shape area in previous reports where axially shifted holographic shapes were not temporally focused. Here we report an optical system using two spatial light modulators to independently control transverse- and axial-target light distribution. This approach enables simultaneous axial translation of single or multiple spatiotemporally focused patterns across the sample volume while achieving the axial confinement of temporal focusing. We use the system's capability to photoconvert tens of Kaede-expressing neurons with single-cell resolution in live zebrafish larvae. PMID:27306044
Pancreatic trauma: demographics, diagnosis, and management.
Stawicki, Stanislaw Peter; Schwab, C William
2008-12-01
Pancreatic injuries are rare, with penetrating mechanisms being causative in the majority of cases. They can create major diagnostic and therapeutic challenges and require multiple diagnostic modalities, including multislice high-definition computed tomography, magnetic resonance cholangiopancreatography, endoscopic retrograde cholangiopancreatography, ultrasonography, and at times, surgery and direct visualization of the pancreas. Pancreatic trauma is frequently associated with duodenal and other severe vascular and visceral injuries. Mortality is high and usually related to the concomitant vascular injury. Surgical management of pancreatic and pancreatic-duodenal trauma is challenging, and multiple surgical approaches and techniques have been described, up to and including pancreatic damage control and later resection and reconstruction. Wide surgical drainage is key to any surgical trauma technique, and access for enteral nutrition, or occasionally parenteral nutrition, is an important adjunct. Morbidity associated with pancreatic trauma is high and can be quite severe. Treatment of pancreatic trauma-related complications often requires a combination of interventional, endoscopic, and surgical approaches.
Optimal Discrete Spatial Compression for Beamspace Massive MIMO Signals
NASA Astrophysics Data System (ADS)
Jiang, Zhiyuan; Zhou, Sheng; Niu, Zhisheng
2018-05-01
Deploying a massive number of antennas at the base station side can boost cellular system performance dramatically. However, it involves significant additional radio-frequency (RF) front-end complexity, hardware cost and power consumption. To address this issue, the beamspace-multiple-input-multiple-output (beamspace-MIMO) based approach is considered a promising solution. In this paper, we first show that the traditional beamspace-MIMO suffers from spatial power leakage and imperfect channel statistics estimation. A beam combination module is hence proposed, which consists of a small number (compared with the number of antenna elements) of low-resolution (possibly one-bit) digital (discrete) phase shifters after the beamspace transformation to further compress the beamspace signal dimensionality, such that the number of RF chains can be reduced beyond beamspace transformation and beam selection. The optimum discrete beam combination weights for the uplink are obtained with a branch-and-bound (BB) approach. The key to the BB-based solution is solving the embedded sub-problem, whose solution is derived in closed form. Based on this solution, a sequential greedy beam combination scheme with linear complexity (w.r.t. the number of beams in the beamspace) is proposed. Link-level simulation results based on realistic channel models and long-term-evolution (LTE) parameters show that the proposed schemes can reduce the number of RF chains by up to 25% with a one-bit digital phase-shifter network.
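The flavor of the linear-complexity greedy scheme can be conveyed with a short numpy sketch (a simplified stand-in written for illustration, not the paper's algorithm): beams are merged one at a time, each receiving the one-bit (+/-1) weight that maximizes the energy of the running combined output.

    import numpy as np

    def greedy_onebit_combining(H):
        """H: (n_beams, n_snapshots) beamspace snapshots. Returns the
        +/-1 weights chosen sequentially and the combined output."""
        combined = H[0].copy()
        weights = [1.0]
        for i in range(1, H.shape[0]):
            energy, s = max((np.sum(np.abs(combined + s * H[i]) ** 2), s)
                            for s in (+1.0, -1.0))
            combined += s * H[i]
            weights.append(s)
        return np.array(weights), combined

    rng = np.random.default_rng(0)
    H = rng.standard_normal((8, 100)) + 1j * rng.standard_normal((8, 100))
    w, y = greedy_onebit_combining(H)
    print(w, np.sum(np.abs(y) ** 2))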
Modelling multiple sources of dissemination bias in meta-analysis.
Bowden, Jack; Jackson, Dan; Thompson, Simon G
2010-03-30
Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
Power-efficient method for IM-DD optical transmission of multiple OFDM signals.
Effenberger, Frank; Liu, Xiang
2015-05-18
We propose a power-efficient method for transmitting multiple frequency-division multiplexed (FDM) orthogonal frequency-division multiplexing (OFDM) signals in intensity-modulation direct-detection (IM-DD) optical systems. This method is based on quadratic soft clipping in combination with odd-only channel mapping. We show, both analytically and experimentally, that the proposed approach is capable of improving the power efficiency by about 3 dB as compared to conventional FDM OFDM signals under practical bias conditions, making it a viable solution in applications such as optical fiber-wireless integrated systems where both IM-DD optical transmission and OFDM signaling are important.
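The odd-only mapping principle is easy to sketch (illustrative numpy fragment; band count, spacing, and contents are assumptions, and the quadratic soft-clipping stage is omitted): with every band centred on an odd multiple of the channel spacing, second-order intermodulation products, which fall at sums and differences of band frequencies, land only on even, unused channels.

    import numpy as np

    n_fft, n_bands, bw = 1024, 4, 16
    spectrum = np.zeros(n_fft, dtype=complex)

    rng = np.random.default_rng(0)
    for k in range(n_bands):
        center = (2 * k + 1) * 64     # odd multiples of a 64-bin spacing
        qam = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=bw)
        spectrum[center - bw // 2: center + bw // 2] = qam

    # Crude stand-in for building a real-valued IM-DD drive signal; a
    # Hermitian-symmetric mapping would be used in practice.
    waveform = np.fft.ifft(spectrum).real
    print(np.flatnonzero(np.abs(spectrum))[:4])   # occupied (odd) band starts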
Miranda, E; Arroyo, A; Ronda, J M; Muñoz, J L; Alonso, C; Martínez-Peñuelas, F; Martí-Viaño, J L
2007-01-01
Blunt abdominal trauma can damage the intestinal vasculature and may occasionally lead to delayed intestinal perforation, associated with a combined rate of morbidity and mortality of 25%. The diagnosis of such complications is hindered by sedation in critically ill patients, however, and morbidity and mortality in this population are therefore higher. We report the case of a man with multiple injuries admitted to the intensive care unit, where delayed perforations of the sigmoid colon and cecum were diagnosed. The management of blunt abdominal trauma is reviewed and the possible causes, diagnostic approaches, and treatment options for colon injuries are discussed.
2012-01-01
Background: Many marine meiofaunal species are reported to have wide distributions, which creates a paradox considering their hypothesized low dispersal abilities. Correlated with this paradox is an especially high taxonomic deficit for meiofauna, partly related to a lower taxonomic effort and partly to a high number of putative cryptic species. Molecular-based species delineation and barcoding approaches have been advocated for meiofaunal biodiversity assessments to speed up description processes and uncover cryptic lineages. However, these approaches show sensitivity to sampling coverage (taxonomic and geographic) and the success rate has never been explored on mesopsammic Mollusca.
Results: We collected the meiofaunal sea-slug Pontohedyle (Acochlidia, Heterobranchia) from 28 localities worldwide. With a traditional morphological approach, all specimens fall into two morphospecies. However, with a multi-marker genetic approach, we reveal multiple lineages that are reciprocally monophyletic on single and concatenated gene trees in phylogenetic analyses. These lineages are largely concordant with geographical and oceanographic parameters, leading to our primary species hypothesis (PSH). In parallel, we apply four independent methods of molecular-based species delineation: General Mixed Yule Coalescent model (GMYC), statistical parsimony, Bayesian Species Delineation (BPP) and Automatic Barcode Gap Discovery (ABGD). The secondary species hypothesis (SSH) is gained by relying only on uncontradicted results of the different approaches (‘minimum consensus approach’), resulting in the discovery of a radiation of (at least) 12 mainly cryptic species, 9 of them new to science, some sympatric and some allopatric with respect to ocean boundaries. However, the meiofaunal paradox still persists in some Pontohedyle species identified here with wide coastal and trans-archipelago distributions.
Conclusions: Our study confirms extensive, morphologically cryptic diversity among meiofauna and accentuates the taxonomic deficit that characterizes meiofauna research. We observe for Pontohedyle slugs a high degree of morphological simplicity and uniformity, which we expect might be a general rule for meiofauna. To tackle cryptic diversity in little explored and hard-to-sample invertebrate taxa, at present, a combined approach seems most promising, such as multi-marker barcoding (i.e., molecular systematics using mitochondrial and nuclear markers and the criterion of reciprocal monophyly) combined with a minimum consensus approach across independent methods of molecular species delineation to define candidate species. PMID:23244441
A time-parallel approach to strong-constraint four-dimensional variational data assimilation
NASA Astrophysics Data System (ADS)
Rao, Vishwas; Sandu, Adrian
2016-05-01
A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple sub-intervals that allows parallelization of cost function and gradient computations. The solutions to the continuity equations across interval boundaries are added as constraints. The augmented Lagrangian approach leads to a different formulation of the variational data assimilation problem than the weakly constrained 4D-Var. A combination of serial and parallel 4D-Vars to increase performance is also explored. The methodology is illustrated on data assimilation problems involving the Lorenz-96 and the shallow water models.
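In the notation sketched below (chosen here for illustration; the paper's symbols may differ), splitting the window into sub-intervals with states x_i and solution operators M_i turns the continuity conditions into constraints inside an augmented Lagrangian:

    \[
      \mathcal{L}_{\rho}(x,\lambda) \;=\; J(x)
      \;+\; \sum_{i} \lambda_i^{\mathsf{T}} \bigl( x_{i+1} - \mathcal{M}_i(x_i) \bigr)
      \;+\; \frac{\rho}{2} \sum_{i} \bigl\| x_{i+1} - \mathcal{M}_i(x_i) \bigr\|^2 ,
    \]

where J(x) is the usual strong-constraint 4D-Var cost (background plus observation misfit), the lambda_i are multipliers, and rho is the penalty parameter; both J and its gradient then decompose over sub-intervals and can be evaluated in parallel.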
Targeted versus statistical approaches to selecting parameters for modelling sediment provenance
NASA Astrophysics Data System (ADS)
Laceby, J. Patrick
2017-04-01
One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps for this approach is selecting the appropriate suite of parameters or fingerprints used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties, where the properties of sediment sources remain constant, or at the very least, any variation in these properties should occur in a predictable and measurable way. Therefore, properties selected for sediment source fingerprinting should remain constant through sediment detachment, transportation and deposition processes, or vary in a predictable and measurable way. One approach to select conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of a discrimination (e.g. Kruskal Wallis H-test, Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principle Component Analysis). The challenge is that modelling sediment provenance is often not straightforward and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling. Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
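A minimal Python version of the statistical route described above (hypothetical data layout; plain LDA reclassification stands in for stepwise DFA) could look like:

    import numpy as np
    from scipy.stats import kruskal
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def select_tracers(source_data, alpha=0.05):
        """Screen each property with a Kruskal-Wallis H-test for source
        discrimination, then report LDA reclassification accuracy of the
        retained set. source_data: name -> (n_samples, n_properties)."""
        groups = list(source_data.values())
        retained = [j for j in range(groups[0].shape[1])
                    if kruskal(*[g[:, j] for g in groups]).pvalue < alpha]
        X = np.vstack([g[:, retained] for g in groups])
        y = np.concatenate([[name] * len(g)
                            for name, g in source_data.items()])
        lda = LinearDiscriminantAnalysis().fit(X, y)
        return retained, lda.score(X, y)

    rng = np.random.default_rng(1)
    data = {"surface":    rng.normal(0, 1, (20, 6)) + [3, 0, 0, 0, 0, 0],
            "subsurface": rng.normal(0, 1, (20, 6)),
            "channel":    rng.normal(0, 1, (20, 6)) + [0, 2, 0, 0, 0, 0]}
    print(select_tracers(data))   # retained property indices + accuracy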
Systemic Problems: A perspective on stem cell aging and rejuvenation.
Conboy, Irina M; Conboy, Michael J; Rebo, Justin
2015-10-01
This review provides a balanced analysis of the advances in systemic regulation of young and old tissue stem cells and suggests strategies for accelerating the development of therapies to broadly combat age-related tissue degenerative pathologies. Many recent high-profile reports on systemic tissue rejuvenation combine parabiosis with a "silver bullet" putatively responsible for the positive effects. Attempts to unify these papers reflect the excitement about this experimental approach and add value in reproducing previous work. At the same time, defined molecular approaches that go "beyond parabiosis" for the rejuvenation of multiple old organs represent progress toward attenuating or even reversing human tissue aging.
Dudbridge, Frank; Koeleman, Bobby P C
2004-09-01
Large exploratory studies, including candidate-gene-association testing, genomewide linkage-disequilibrium scans, and array-expression experiments, are becoming increasingly common. A serious problem for such studies is that statistical power is compromised by the need to control the false-positive rate for a large family of tests. Because multiple true associations are anticipated, methods have been proposed that combine evidence from the most significant tests, as a more powerful alternative to individually adjusted tests. The practical application of these methods is currently limited by a reliance on permutation testing to account for the correlated nature of single-nucleotide polymorphism (SNP)-association data. On a genomewide scale, this is both very time-consuming and impractical for repeated explorations with standard marker panels. Here, we alleviate these problems by fitting analytic distributions to the empirical distribution of combined evidence. We fit extreme-value distributions for fixed lengths of combined evidence and a beta distribution for the most significant length. An initial phase of permutation sampling is required to fit these distributions, but it can be completed more quickly than a simple permutation test and need be done only once for each panel of tests, after which the fitted parameters give a reusable calibration of the panel. Our approach is also a more efficient alternative to a standard permutation test. We demonstrate the accuracy of our approach and compare its efficiency with that of permutation tests on genomewide SNP data released by the International HapMap Consortium. The estimation of analytic distributions for combined evidence will allow these powerful methods to be applied more widely in large exploratory studies.
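The calibration idea, run permutations once, fit an analytic null to the combined evidence, then reuse the fitted parameters, can be sketched as follows (illustrative Python; a generalized extreme-value fit stands in for the extreme-value and beta fits described, and the fixed truncation length k is hypothetical):

    import numpy as np
    from scipy import stats

    def calibrate_null(perm_pvalues, k=10):
        """Combine evidence per permutation as the sum of the k largest
        -log10 p-values, then fit a parametric null for reuse."""
        combined = np.sort(-np.log10(perm_pvalues), axis=1)[:, -k:].sum(axis=1)
        return stats.genextreme.fit(combined)       # (shape, loc, scale)

    def calibrated_pvalue(pvalues, params, k=10):
        """Tail probability of the observed combined statistic under the
        fitted null; no further permutation needed."""
        s = np.sort(-np.log10(pvalues))[-k:].sum()
        return stats.genextreme.sf(s, *params)

    rng = np.random.default_rng(0)
    null = rng.uniform(size=(2000, 500))        # permutation-phase p-values
    params = calibrate_null(null)
    obs = rng.uniform(size=500)
    obs[:3] = 1e-6                              # a few strong associations
    print(calibrated_pvalue(obs, params))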
A vector space model approach to identify genetically related diseases.
Sarkar, Indra Neil
2012-01-01
The relationship between diseases and their causative genes can be complex, especially in the case of polygenic diseases. Further exacerbating the challenges in their study is that many genes may be causally related to multiple diseases. This study explored the relationship between diseases through the adaptation of an approach pioneered in the context of information retrieval: vector space models. A vector space model approach was developed that bridges gene-disease knowledge inferred across three knowledge bases: Online Mendelian Inheritance in Man, GenBank, and Medline. The approach was then used to identify potentially related diseases for two target diseases: Alzheimer disease and Prader-Willi syndrome. In the case of both Alzheimer disease and Prader-Willi syndrome, a set of plausible diseases was identified that may warrant further exploration. This study furthers seminal work by Swanson et al., which demonstrated the potential for mining literature for putative correlations. Using a vector space modeling approach, information from both biomedical literature and genomic resources (like GenBank) can be combined towards the identification of putative correlations of interest. To this end, the relevance of the diseases predicted in this study using the vector space modeling approach was validated based on supporting literature. The results of this study suggest that a vector space model approach may be a useful means to identify potential relationships between complex diseases, and thereby enable the coordination of gene-based findings across multiple complex diseases.
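Mechanically, the approach reduces to representing each disease as a weighted vector over a shared gene axis and ranking disease pairs by cosine similarity; the toy profiles below are invented purely for illustration.

    import numpy as np

    def cosine(u, v):
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

    # Hypothetical disease -> gene-evidence weights over a shared gene axis
    # (in the study, weights would be derived from OMIM, GenBank, Medline).
    profiles = {
        "Alzheimer disease":   np.array([0.9, 0.8, 0.7, 0.0, 0.0]),
        "Prader-Willi":        np.array([0.0, 0.0, 0.1, 0.9, 0.8]),
        "candidate disease X": np.array([0.6, 0.5, 0.4, 0.0, 0.1]),
    }

    target = profiles["Alzheimer disease"]
    for name, vec in profiles.items():
        if name != "Alzheimer disease":
            print(f"{name}: {cosine(target, vec):.2f}")   # rank candidates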
Akkermans, Simen; Noriega Fernandez, Estefanía; Logist, Filip; Van Impe, Jan F
2017-01-02
Efficient modelling of the microbial growth rate can be performed by combining the effects of individual conditions in a multiplicative way, known as the gamma concept. However, several studies have illustrated that interactions between different effects should be taken into account under stressing environmental conditions to achieve a more accurate description of the growth rate. In this research, a novel approach for modelling the interactions between the effects of environmental conditions on the microbial growth rate is introduced. As a case study, the effect of temperature and pH on the growth rate of Escherichia coli K12 is modelled, based on a set of computer-controlled bioreactor experiments performed under static environmental conditions. The models compared in this case study are the gamma model, the model of Augustin and Carlier (2000), the model of Le Marc et al. (2002) and the novel multiplicative interaction model developed in this paper. This novel model enables the separate identification of interactions between the effects of two (or more) environmental conditions. The comparison of these models focuses on accuracy, interpretability and compatibility with efficient modelling approaches. Moreover, for the separate effects of temperature and pH, new cardinal parameter model structures are proposed. The novel interaction model contributes to a generic modelling approach, resulting in predictive models that are (i) accurate, (ii) easily identifiable with a limited workload, (iii) modular, and (iv) biologically interpretable. Copyright © 2016. Published by Elsevier B.V.
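As background, the interaction-free gamma concept multiplies one dimensionless factor per condition into the optimal rate; a minimal Python sketch with Rosso-type cardinal models is below (cardinal values and mu_opt are illustrative, not the fitted E. coli K12 parameters).

    def gamma_T(T, Tmin, Topt, Tmax):
        """Cardinal temperature model with inflection (Rosso-type)."""
        if T <= Tmin or T >= Tmax:
            return 0.0
        num = (T - Tmax) * (T - Tmin) ** 2
        den = (Topt - Tmin) * ((Topt - Tmin) * (T - Topt)
                               - (Topt - Tmax) * (Topt + Tmin - 2 * T))
        return num / den

    def gamma_pH(pH, pHmin, pHopt, pHmax):
        """Cardinal pH model (Rosso-type)."""
        if pH <= pHmin or pH >= pHmax:
            return 0.0
        num = (pH - pHmin) * (pH - pHmax)
        return num / (num - (pH - pHopt) ** 2)

    def mu_max(T, pH, mu_opt=2.3):
        """Multiplicative (interaction-free) gamma-concept growth rate."""
        return mu_opt * gamma_T(T, 10, 40, 47) * gamma_pH(pH, 4.0, 7.0, 9.5)

    print(mu_max(37, 7.0))   # near-optimal conditions
    print(mu_max(15, 5.0))   # combined mild stresses shrink the rate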
Animation control of surface motion capture.
Tejera, Margara; Casas, Dan; Hilton, Adrian
2013-12-01
Surface motion capture (SurfCap) of actor performance from multiple view video provides reconstruction of the natural nonrigid deformation of skin and clothing. This paper introduces techniques for interactive animation control of SurfCap sequences which allow the flexibility in editing and interactive manipulation associated with existing tools for animation from skeletal motion capture (MoCap). Laplacian mesh editing is extended using a basis model learned from SurfCap sequences to constrain the surface shape to reproduce natural deformation. Three novel approaches for animation control of SurfCap sequences, which exploit the constrained Laplacian mesh editing, are introduced: 1) space–time editing for interactive sequence manipulation; 2) skeleton-driven animation to achieve natural nonrigid surface deformation; and 3) hybrid combination of skeletal MoCap driven and SurfCap sequence to extend the range of movement. These approaches are combined with high-level parametric control of SurfCap sequences in a hybrid surface and skeleton-driven animation control framework to achieve natural surface deformation with an extended range of movement by exploiting existing MoCap archives. Evaluation of each approach and the integrated animation framework are presented on real SurfCap sequences for actors performing multiple motions with a variety of clothing styles. Results demonstrate that these techniques enable flexible control for interactive animation with the natural nonrigid surface dynamics of the captured performance and provide a powerful tool to extend current SurfCap databases by incorporating new motions from MoCap sequences.
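For context, the differential representation at the heart of Laplacian mesh editing stores each vertex relative to the centroid of its one-ring neighbours; editing then solves for new vertex positions that preserve these deltas under user constraints. A minimal numpy sketch of the representation (illustrative, not the paper's learned-basis solver):

    import numpy as np

    def delta_coordinates(vertices, faces):
        """Uniform Laplacian (delta) coordinates: vertex minus the
        centroid of its one-ring neighbours."""
        neighbours = [set() for _ in range(len(vertices))]
        for a, b, c in faces:
            neighbours[a].update((b, c))
            neighbours[b].update((a, c))
            neighbours[c].update((a, b))
        delta = np.empty_like(vertices)
        for i, nb in enumerate(neighbours):
            delta[i] = vertices[i] - vertices[list(nb)].mean(axis=0)
        return delta

    V = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])  # tetrahedron
    F = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])
    print(delta_coordinates(V, F))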
Gilleen, J; Michalopoulou, P G; Reichenberg, A; Drake, R; Wykes, T; Lewis, S W; Kapur, S
2014-04-01
Improving cognition in people with neuropsychiatric disorders remains a major clinical target. By themselves, pharmacological and non-pharmacological approaches have shown only modest effects in improving cognition. In the present study we tested a recently proposed methodology for combining cognitive training (CT) with a 'cognitive-enhancing' drug to improve cognitive test scores, and expanded on previous approaches by delivering drug and CT in combination over a long intervention of repeated sessions and by using multiple tasks to reveal the cognitive processes being enhanced. We also aimed to determine whether gains from this combination approach generalised to untrained tests. In this proof-of-principle randomised controlled trial, thirty-three healthy volunteers were randomised to receive either modafinil or placebo combined with daily cognitive training over two weeks. Volunteers were trained on tasks of new-language learning, working memory and verbal learning following 200 mg modafinil or placebo for ten days. Improvements in trained and untrained tasks were measured. The rate of new-language learning was significantly enhanced with modafinil, and effects were greatest over the first five sessions. Modafinil improved within-day learning rather than between-day retention. No enhancement of gains with modafinil was observed in working memory or in the rate of verbal learning. Gains in all tasks were retained post drug-administration, but transfer effects to broad cognitive abilities were not seen. This study shows that combining CT with modafinil specifically elevates learning over early training sessions compared to CT with placebo, and provides a proof-of-principle experimental paradigm for pharmacological enhancement of cognitive remediation. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.
Varrassi, Giustino; Hanna, Magdi; Macheras, Giorgos; Montero, Antonio; Montes Perez, Antonio; Meissner, Winfried; Perrot, Serge; Scarpignato, Carmelo
2017-06-01
Untreated and under-treated pain represents one of the most pervasive health problems, and it is worsening as the population ages and accrues risk for pain. Multiple treatment options are available, most of which have a single mechanism of action and cannot be prescribed at unlimited doses due to ceilings of efficacy and/or safety concerns. Another limitation of single-agent analgesia is that, in general, pain is due to multiple causes. Combining drugs from different classes, with different and complementary mechanism(s) of action, provides a better opportunity for effective analgesia at reduced doses of the individual agents, with a potential reduction of the adverse events that are often dose-related. Analgesic combinations are recommended by several organizations and are used in clinical practice. Provided the two agents are combined in a fixed-dose ratio, the resulting medication may offer advantages over extemporaneous combinations. Dexketoprofen/tramadol (25 mg/75 mg) is a new oral fixed-dose combination offering a comprehensive multimodal approach to moderate-to-severe acute pain that encompasses central analgesic action, peripheral analgesic effect and anti-inflammatory activity, together with a good tolerability profile. The analgesic efficacy of the dexketoprofen/tramadol combination is complemented by a favorable pharmacokinetic and pharmacodynamic profile, characterized by rapid onset and long duration of action. This has been well documented in both somatic- and visceral-pain human models. This review discusses the available clinical evidence and the possible future applications of the dexketoprofen/tramadol fixed-dose combination, which may play an important role in the management of moderate-to-severe acute pain.
A feature-based approach to combine functional MRI, structural MRI and EEG brain imaging data.
Calhoun, V; Adali, T; Liu, J
2006-01-01
The acquisition of multiple brain imaging types for a given study is a very common practice. However, these data are typically examined in separate analyses rather than in a combined model. We propose a novel methodology to perform joint independent component analysis across image modalities, including structural MRI data, functional MRI activation data and EEG data, and to visualize the results via a joint histogram visualization technique. Evaluation of which combination of fused data is most useful is determined using the Kullback-Leibler divergence. We demonstrate our method on a data set composed of functional MRI data from two tasks, structural MRI data, and EEG data collected on patients with schizophrenia and healthy controls. We show that combining data types can improve our ability to distinguish differences between groups.
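The essence of joint ICA is to normalize each modality, concatenate features subject-wise into one matrix, and run a single decomposition so that all modalities share one mixing matrix. Below is a minimal sketch on simulated data; the dimensions, noise levels, and three-component ground truth are invented, and scikit-learn's FastICA stands in for the estimation algorithm.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_subjects, n_fmri, n_eeg = 40, 500, 200  # invented sizes

# Simulated subject-by-feature matrices for two modalities sharing mixing weights.
mixing = rng.normal(size=(n_subjects, 3))
fmri = mixing @ rng.normal(size=(3, n_fmri)) + 0.1 * rng.normal(size=(n_subjects, n_fmri))
eeg  = mixing @ rng.normal(size=(3, n_eeg))  + 0.1 * rng.normal(size=(n_subjects, n_eeg))

# Joint ICA: z-score each modality, concatenate along features, decompose once.
joint = np.hstack([(m - m.mean(0)) / m.std(0) for m in (fmri, eeg)])
ica = FastICA(n_components=3, random_state=0)
subject_loadings = ica.fit_transform(joint)   # shared mixing weights per subject
fmri_maps = ica.components_[:, :n_fmri]       # modality-specific source maps
eeg_maps  = ica.components_[:, n_fmri:]
print(subject_loadings.shape, fmri_maps.shape, eeg_maps.shape)
```

Group differences can then be tested on the shared subject loadings, which is what links, say, an fMRI activation pattern to a co-varying EEG feature.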
Object-Oriented Approach to Integrating Database Semantics. Volume 4.
1987-12-01
schemata for: 1. Object Classification Schema -- Entities 2. Object Structure and Relationship Schema -- Relations 3. Operation Classification and... relationships are represented in a database is non-intuitive for naive users. It is difficult to access and combine information in multiple databases. In this... from the CURRENT-CLASSES table. Choosing a selected item de-selects it. Choose 0 to exit. 1. STUDENTS 2. CURRENT-CLASSES 3. MANAGEMENT-CLASS
ERIC Educational Resources Information Center
Pan-Skadden, Jennifer; Wilder, David A.; Sparling, Jessica; Severtson, Erica; Donaldson, Jeanne; Postma, Nicki; Beavers, Gracie; Neidert, Pamela
2009-01-01
Behavioral skills training (BST) was combined with in-situ training to teach young children to solicit help when they become lost from a caregiver at a store. Three children were taught to approach a cashier, tell the cashier their name, and inform the cashier that they are lost. A multiple baseline design across participants was used to evaluate…
Leiner, Claude; Nemitz, Wolfgang; Schweitzer, Susanne; Kuna, Ladislav; Wenzl, Franz P; Hartmann, Paul; Satzinger, Valentin; Sommer, Christian
2016-03-20
We show that, with an appropriate combination of two optical simulation techniques (classical ray tracing and the finite-difference time-domain method), an optical device containing multiple diffractive and refractive optical elements can be accurately simulated in an iterative simulation approach. We compare the simulation results with experimental measurements of the device to discuss the applicability and accuracy of our iterative simulation procedure.
Multimodal biometric system using rank-level fusion approach.
Monwar, Md Maruf; Gavrilova, Marina L
2009-08-01
In many real-world applications, unimodal biometric systems often face significant limitations due to sensitivity to noise, intraclass variability, data quality, nonuniversality, and other factors. Attempting to improve the performance of individual matchers in such situations may not prove to be highly effective. Multibiometric systems seek to alleviate some of these problems by providing multiple pieces of evidence of the same identity. These systems help achieve an increase in performance that may not be possible using a single-biometric indicator. This paper presents an effective fusion scheme that combines information presented by multiple domain experts based on the rank-level fusion integration method. The developed multimodal biometric system possesses a number of unique qualities: it uses principal component analysis and Fisher's linear discriminant methods for identity authentication by the individual matchers (face, ear, and signature), and it uses a novel rank-level fusion method to consolidate the results obtained from the different biometric matchers. The ranks of the individual matchers are combined using the highest rank, Borda count, and logistic regression approaches. The results indicate that fusion of individual modalities can improve the overall performance of the biometric system, even in the presence of low-quality data. Insights on multibiometric design using rank-level fusion and its performance on a variety of biometric databases are discussed in the concluding section.
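Of the three fusion rules named, the Borda count is the simplest to make concrete: each identity earns points according to how high each matcher ranks it, and the fused ranking sorts by total points. The sketch below uses invented matcher outputs; the per-matcher weighting of the logistic regression variant is not shown.

```python
# Rank-level fusion sketch: combine per-matcher identity rankings with Borda count.
# Matcher outputs are invented; position 0 is the best match.
rankings = {
    "face":      ["id_3", "id_1", "id_2", "id_4"],
    "ear":       ["id_1", "id_3", "id_4", "id_2"],
    "signature": ["id_3", "id_2", "id_1", "id_4"],
}

def borda_fuse(rankings):
    scores = {}
    for ranked in rankings.values():
        n = len(ranked)
        for pos, identity in enumerate(ranked):
            # Top rank earns n-1 points, bottom rank earns 0.
            scores[identity] = scores.get(identity, 0) + (n - 1 - pos)
    return sorted(scores, key=scores.get, reverse=True)

print(borda_fuse(rankings))  # consensus ranking; id_3 comes out first here
```

The highest-rank rule would instead keep, for each identity, the single best rank any matcher assigned it, which is more robust when one matcher is strong and the others are weak.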
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Huang, Guo H.
2011-12-01
Groundwater pollution has attracted increasing attention in recent decades. Assessments of groundwater contamination risk are desired to provide sound bases for supporting risk-based management decisions. Therefore, the objective of this study is to develop an integrated fuzzy stochastic approach to evaluate risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and a fuzzy sets approach to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate the interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site within a western Canada context. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to provide support for identifying proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainties associated with simulation and risk assessment efforts.
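One way to picture the hybrid fuzzy/interval propagation such a system performs: represent each fuzzy parameter by alpha-cuts (intervals that narrow as the membership level rises), keep the interval-valued parameter as a plain interval, and bound the model output by evaluating at the interval corners. The sketch below does this for an invented, monotone stand-in response function; the parameter shapes and values are placeholders, not the IIFMS transport model.

```python
from itertools import product

# Invented stand-in for the subsurface model output (monotone in each argument,
# so corner evaluation gives exact bounds).
def response(dispersivity, porosity, permeability):
    return permeability * dispersivity / porosity

def tri_cut(lo, mode, hi, alpha):
    # Alpha-cut of a triangular fuzzy number: an interval that narrows as alpha -> 1.
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

perm_interval = (1e-12, 5e-12)  # interval-valued intrinsic permeability
for alpha in (0.0, 0.5, 1.0):
    cuts = [tri_cut(5.0, 10.0, 20.0, alpha),   # longitudinal dispersivity (fuzzy)
            tri_cut(0.25, 0.30, 0.40, alpha),  # porosity (fuzzy)
            perm_interval]                     # permeability (interval, no shape)
    outputs = [response(d, p, k) for d, p, k in product(*cuts)]
    print(f"alpha={alpha:.1f}: output in [{min(outputs):.3e}, {max(outputs):.3e}]")
```

The factorial-design element of the IIFMS plays a similar role at full scale, mapping how the three uncertain inputs jointly move the predicted concentrations.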
NASA Astrophysics Data System (ADS)
Yan, Zhixiang; Lin, Ge; Ye, Yang; Wang, Yitao; Yan, Ru
2014-06-01
Flavonoids are one of the largest classes of plant secondary metabolites, serving a variety of functions in plants and associated with a number of health benefits for humans. Typically, they are co-identified with many other secondary metabolites using untargeted metabolomics. The limited data quality of the untargeted workflow calls for a shift from a breadth-first to a depth-first screening strategy when a specific biosynthetic pathway is the focus. Here we introduce a generic multiple reaction monitoring (MRM)-based approach for flavonoid profiling in plants using a hybrid triple quadrupole linear ion trap (QTrap) mass spectrometer. The approach includes four steps: (1) preliminary profiling of major aglycones by multiple ion monitoring-triggered enhanced product ion scan (MIM-EPI); (2) glycone profiling by precursor ion-triggered EPI scan (PI-EPI) of major aglycones; (3) comprehensive aglycone profiling by combining MIM-EPI and neutral loss-triggered EPI scan (NL-EPI) of major glycones; (4) in-depth flavonoid profiling by MRM-EPI with elaborated MRM transitions. In particular, incorporation of the NH3 loss and sugar elimination proved to be very informative and confirmative for flavonoid screening. This approach was applied to profiling flavonoids in Astragali radix (Huangqi), a famous herb widely used for medicinal and nutritional purposes in China. In total, 421 flavonoids were tentatively characterized, of which fewer than 40 had been previously reported in this medicinal plant. This MRM-based approach provides the versatility and sensitivity required for flavonoid profiling in plants and serves as a useful tool for plant metabolomics.
Patrick, Megan E; Blair, Clancy; Maggs, Jennifer L
2008-05-01
Relations among executive function, behavioral approach sensitivity, emotional decision making, and risk behaviors (alcohol use, drug use, and delinquent behavior) were examined in single female college students (N = 72). Hierarchical multiple regressions indicated a significant Approach Sensitivity x Working Memory interaction in which higher levels of alcohol use were associated with the combination of greater approach tendency and better working memory. This Approach Sensitivity x Working Memory interaction was also marginally significant for drug use and delinquency. Poor emotional decision making, as measured by a gambling task, was also associated with higher levels of alcohol use, but only for individuals low in inhibitory control. Findings point to the complexity of relations among aspects of self-regulation and personality and provide much needed data on neuropsychological correlates of risk behaviors in a nonclinical population.
Charlesworth, Jac C; Peralta, Juan M; Drigalenko, Eugene; Göring, Harald Hh; Almasy, Laura; Dyer, Thomas D; Blangero, John
2009-12-15
Gene identification using linkage, association, or genome-wide expression is often underpowered. We propose that formal combination of information from multiple gene-identification approaches may lead to the identification of novel loci that are missed when only one form of information is available. Firstly, we analyze the Genetic Analysis Workshop 16 Framingham Heart Study Problem 2 genome-wide association data for HDL-cholesterol using a "gene-centric" approach. Then we formally combine the association test results with genome-wide transcriptional profiling data for high-density lipoprotein cholesterol (HDL-C), from the San Antonio Family Heart Study, using a Z-transform test (Stouffer's method). We identified 39 genes by the joint test at a conservative 1% false-discovery rate, including 9 from the significant gene-based association test and 23 whose expression was significantly correlated with HDL-C. Seven genes identified as significant in the joint test were not independently identified by either the association or expression tests. This combined approach has increased power and leads to the direct nomination of novel candidate genes likely to be involved in the determination of HDL-C levels. Such information can then be used as justification for a more exhaustive search for functional sequence variation within the nominated genes. We anticipate that this type of analysis will improve our speed of identification of regulatory genes causally involved in disease risk.
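The Stouffer Z-transform combination used here is essentially a one-liner: convert each test's p-value to a Z-score, sum, and scale by the square root of the number of tests. A minimal sketch with SciPy follows; the two p-values are invented stand-ins for a gene's association test and its expression-correlation test, and only the unweighted form is shown.

```python
import numpy as np
from scipy.stats import norm

def stouffer(p_values):
    # Stouffer's method: Z = sum(z_i) / sqrt(k) for one-sided p-values,
    # then convert the combined Z back to a p-value.
    z = norm.isf(np.asarray(p_values))  # inverse survival function: p -> Z
    z_comb = z.sum() / np.sqrt(len(z))
    return z_comb, norm.sf(z_comb)

# Invented example: gene-based association p-value and transcript-correlation
# p-value for the same gene.
z, p = stouffer([0.01, 0.004])
print(f"combined Z = {z:.2f}, combined p = {p:.2e}")
```

Because evidence accumulates across sources, a gene that is merely suggestive in each individual test can clear a conservative joint false-discovery threshold, which is how the seven novel genes in this study emerged.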
Wang, Jia-Bo; Cui, He-Rong; Wang, Rui-Lin; Zhang, Cong-En; Niu, Ming; Bai, Zhao-Fang; Xu, Gen-Hua; Li, Peng-Yan; Jiang, Wen-Yan; Han, Jing-Jing; Ma, Xiao; Cai, Guang-Ming; Li, Rui-Sheng; Zhang, Li-Ping; Xiao, Xiao-He
2018-04-04
The multiple components of traditional Chinese medicine (TCM) formulae mean that they target multiple diseases, as opposed to a particular disease. However, discovering the unexplored therapeutic potential of a TCM formula remains challenging and costly. Inspired by the drug repositioning methodology, we propose an integrated strategy to feasibly identify new therapeutic uses for a formula composed of six herbs, Liuweiwuling. First, we developed a comprehensive systems approach to enrich drug compound-liver disease networks to analyse the major predicted diseases of Liuweiwuling and discover its potential effect on liver failure. The underlying mechanisms were subsequently predicted to be mainly attributable to a blockade of hepatocyte apoptosis via a synergistic combination of multiple effects. Next, a classical pharmacology experiment was designed to validate the effects of Liuweiwuling on different models of fulminant liver failure induced by D-galactosamine/lipopolysaccharide (GalN/LPS) or thioacetamide (TAA). The results indicated that pretreatment with Liuweiwuling restored liver function and reduced lethality induced by GalN/LPS or TAA in a dose-dependent manner, which was partially attributable to the abrogation of hepatocyte apoptosis by multiple synergistic effects. In summary, the integrated strategy discussed in this paper may provide a new approach for the more efficient discovery of new therapeutic uses for TCM formulae.
Integrating Multiple Social Statuses in Health Disparities Research: The Case of Lung Cancer
Williams, David R; Kontos, Emily Z; Viswanath, K; Haas, Jennifer S; Lathan, Christopher S; MacConaill, Laura E; Chen, Jarvis; Ayanian, John Z
2012-01-01
Objective To illustrate the complex patterns that emerge when race/ethnicity, socioeconomic status (SES), and gender are considered simultaneously in health care disparities research and to outline the needed research to understand them by using disparities in lung cancer risks, treatment, and outcomes as an example. Principal Findings SES, gender, and race/ethnicity are social categories that are robust predictors of variations in health and health services utilization. These are usually considered separately, but intersectionality theory indicates that the impact of each depends on the others. Each reflects historically and culturally contingent variations in social, economic, and political status. Distinct patterns of risk and resilience emerge at the intersections of multiple social categories and shape the experience of health, health care access, utilization, quality, and outcomes where these categories intersect. Intersectional approaches call for greater attention to understand social processes at multiple levels of society and require the collection of relevant data and utilization of appropriate analytic approaches to understand how multiple risk factors and resources combine to affect the distribution of disease and its management. Conclusions Understanding how race/ethnicity, gender, and SES are interactive, interdependent, and social identities can provide new knowledge to enhance our efforts to effectively address health disparities. PMID:22568674
Gaspard, L; Tombal, B; Castille, Y; Opsomer, R-J; Detrembleur, C
2014-03-01
To assess the effectiveness of conservative therapeutic approaches in a multiple sclerosis population, a review was performed in PubMed, PEDro, Scopus and the Cochrane Library using combinations of the following keywords: multiple sclerosis; bladder dysfunction; overactive bladder; detrusor hyperreflexia; urge incontinence; urgency; stress incontinence; pelvic floor muscle; biofeedback; PTNS; tibial nerve; bladder training; physical therapy; physiotherapy; conservative treatment and behavioral therapy. Six randomized articles including 289 patients were selected. Four papers exhibited strong scores on the methodological quality assessment. The parameters that were always significantly improved were: the number of incontinence episodes (decreased by 64% to 86% after treatment versus before treatment), quality of life (P≤0.001), severity of irritative symptoms (decreased by more than 50% after treatment versus before treatment), and nocturia (P=0.035 to P<0.001). Activities and participation, maximum flow rate, mean voided volume and daytime frequency were not significantly improved in all trials. Physical therapy techniques could be effective for the treatment of urinary disorders in multiple sclerosis populations with mild disability. However, the analyses are based on six studies, of which only four showed good methodological quality. No strong conclusions regarding treatment approaches can be drawn from this review. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Nash, David T; McNamara, Michael S
2009-01-01
The morbidity and mortality benefits of lowering blood pressure (BP) in hypertensive patients are well established, with most individuals requiring multiple agents to achieve BP control. Considering the important role of the renin-angiotensin-aldosterone system (RAAS) in the pathophysiology of hypertension, a key component of combination therapy should include a RAAS inhibitor. Angiotensin receptor blockers (ARBs) lower BP, reduce cardiovascular risk, provide organ protection, and are among the best tolerated class of antihypertensive therapy. In this article, we discuss two ARB combinations (valsartan/hydrochlorothiazide [HCTZ] and amlodipine/valsartan), both of which are indicated for the treatment of hypertension in patients not adequately controlled on monotherapy and as initial therapy in patients likely to need multiple drugs to achieve BP goals. Randomized, double-blind studies that have assessed the antihypertensive efficacy and safety of these combinations in the first-line treatment of hypertensive patients are reviewed. Both valsartan/HCTZ and amlodipine/valsartan effectively lower BP and are well tolerated in a broad range of patients with hypertension, including difficult-to-treat populations such as those with severe BP elevations, prediabetes and diabetes, patients with the cardiometabolic syndrome, and individuals who are obese, elderly, or black. Also discussed herein are patient-focused perspectives related to the use of valsartan/HCTZ and amlodipine/valsartan, and the rationale for use of single-pill combinations as one approach to enhance patient compliance with antihypertensive therapy. PMID:21949614
PDE5 Inhibitors Enhance Celecoxib Killing in Multiple Tumor Types
BOOTH, LAURENCE; ROBERTS, JANE L.; CRUICKSHANKS, NICHOLA; TAVALLAI, SEYEDMEHRAD; WEBB, TIMOTHY; SAMUEL, PETER; CONLEY, ADAM; BINION, BRITTANY; YOUNG, HAROLD F.; POKLEPOVIC, ANDREW; SPIEGEL, SARAH; DENT, PAUL
2015-01-01
The present studies determined whether clinically relevant phosphodiesterase 5 (PDE5) inhibitors interacted with a clinically relevant NSAID, celecoxib, to kill tumor cells. Celecoxib and PDE5 inhibitors interacted in a greater than additive fashion to kill multiple tumor cell types. Celecoxib and sildenafil killed ex vivo primary human glioma cells as well as their associated activated microglia. Knock down of PDE5 recapitulated the effects of PDE5 inhibitor treatment; the nitric oxide synthase inhibitor L-NAME suppressed drug combination toxicity. The effects of celecoxib were COX2 independent. Over-expression of c-FLIP-s or knock down of CD95/FADD significantly reduced killing by the drug combination. CD95 activation was dependent on nitric oxide and ceramide signaling. CD95 signaling activated the JNK pathway and inhibition of JNK suppressed cell killing. The drug combination inactivated mTOR and increased the levels of autophagy and knock down of Beclin1 or ATG5 strongly suppressed killing by the drug combination. The drug combination caused an ER stress response; knock down of IRE1α/XBP1 enhanced killing whereas knock down of eIF2α/ATF4/CHOP suppressed killing. Sildenafil and celecoxib treatment suppressed the growth of mammary tumors in vivo. Collectively our data demonstrate that clinically achievable concentrations of celecoxib and sildenafil have the potential to be a new therapeutic approach for cancer. PMID:25303541
Gene delivery strategies for the treatment of mucopolysaccharidoses.
Baldo, Guilherme; Giugliani, Roberto; Matte, Ursula
2014-03-01
Mucopolysaccharidosis (MPS) disorders are genetic diseases caused by deficiencies in the lysosomal enzymes responsible for the degradation of glycosaminoglycans. Current treatments are not able to correct all disease symptoms and are not available for all MPS types, which makes gene therapy especially relevant. Multiple gene therapy approaches have been tested for different types of MPS, and our aim in this study is to critically analyze each of them. In this review, we have included the major studies that describe the use of adeno-associated, retroviral and lentiviral vectors, as well as relevant non-viral approaches, for MPS disorders. Some protocols, such as the use of adeno-associated vectors and lentiviral vectors, are approaching the clinic for these disorders and, along with combined approaches, seem to be the future of gene therapy for MPS.
[Research on the methods for multi-class kernel CSP-based feature extraction].
Wang, Jinjia; Zhang, Lingzhi; Hu, Bei
2012-04-01
To relax the presumption of strictly linear patterns in common spatial patterns (CSP), we studied kernel CSP (KCSP). A new multi-class KCSP (MKCSP) approach is proposed in this paper, which combines the kernel approach with the multi-class CSP technique. In this approach, we used kernel spatial patterns for each class against all others, and extracted signal components specific to one condition from EEG data sets of multiple conditions. We then performed classification using a logistic linear classifier. Data set IIIa from Brain-Computer Interface (BCI) Competition III was used in the experiment. The experiment demonstrates that this approach can decompose raw EEG signals into spatial patterns extracted from multiple classes of single-trial EEG, and can obtain good classification results.
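For orientation, the linear one-versus-rest CSP that MKCSP generalizes fits in a few lines: average the normalized covariance per class, solve a generalized eigenproblem against the pooled covariance, and take log-variance features of the spatially filtered trials. The sketch below uses random stand-in EEG, and the kernelization step of KCSP is not shown.

```python
import numpy as np
from scipy.linalg import eigh

def csp_one_vs_rest(trials, labels, target, n_filters=2):
    # Linear CSP in a one-versus-rest arrangement.
    # trials has shape (n_trials, n_channels, n_samples).
    def mean_cov(X):
        return np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)

    S_t = mean_cov(trials[labels == target])
    S_r = mean_cov(trials[labels != target])
    # Generalized eigenproblem S_t w = lambda (S_t + S_r) w; eigh returns
    # eigenvalues in ascending order, so filters come from both ends.
    vals, vecs = eigh(S_t, S_t + S_r)
    W = np.hstack([vecs[:, :n_filters], vecs[:, -n_filters:]])
    # Log-variance features of the spatially filtered trials.
    return np.array([np.log(np.var(W.T @ x, axis=1)) for x in trials])

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 8, 256))   # invented EEG: 30 trials, 8 channels
y = rng.integers(0, 4, size=30)     # four motor-imagery classes
print(csp_one_vs_rest(X, y, target=0).shape)  # (30, 4) feature matrix
```

KCSP replaces the channel covariances with kernel matrices so that nonlinear spatial patterns can be captured; the eigen-decomposition structure stays the same.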
Figueroa, Melania; Upadhyaya, Narayana M; Sperschneider, Jana; Park, Robert F; Szabo, Les J; Steffenson, Brian; Ellis, Jeff G; Dodds, Peter N
2016-01-01
The recent resurgence of wheat stem rust caused by new virulent races of Puccinia graminis f. sp. tritici (Pgt) poses a threat to food security. These concerns have catalyzed an extensive global effort toward controlling this disease. Substantial research and breeding programs target the identification and introduction of new stem rust resistance (Sr) genes in cultivars for genetic protection against the disease. Such resistance genes typically encode immune receptor proteins that recognize specific components of the pathogen, known as avirulence (Avr) proteins. A significant drawback to deploying cultivars with single Sr genes is that they are often overcome by evolution of the pathogen to escape recognition through alterations in Avr genes. Thus, a key element in achieving durable rust control is the deployment of multiple effective Sr genes in combination, either through conventional breeding or transgenic approaches, to minimize the risk of resistance breakdown. In this situation, evolution of pathogen virulence would require changes in multiple Avr genes in order to bypass recognition. However, choosing the optimal Sr gene combinations to deploy is a challenge that requires detailed knowledge of the pathogen Avr genes with which they interact and the virulence phenotypes of Pgt existing in nature. Identifying specific Avr genes from Pgt will provide screening tools to enhance pathogen virulence monitoring, assess heterozygosity and propensity for mutation in pathogen populations, and confirm individual Sr gene functions in crop varieties carrying multiple effective resistance genes. Toward this goal, much progress has been made in assembling a high quality reference genome sequence for Pgt, as well as a Pan-genome encompassing variation between multiple field isolates with diverse virulence spectra. In turn this has allowed prediction of Pgt effector gene candidates based on known features of Avr genes in other plant pathogens, including the related flax rust fungus. Upregulation of gene expression in haustoria and evidence for diversifying selection are two useful parameters to identify candidate Avr genes. Recently, we have also applied machine learning approaches to agnostically predict candidate effectors. Here, we review progress in stem rust pathogenomics and approaches currently underway to identify Avr genes recognized by wheat Sr genes.
Preventing alcohol-related traffic injury: a health promotion approach.
Howat, Peter; Sleet, David; Elder, Randy; Maycock, Bruce
2004-09-01
The conditions that give rise to drinking and driving are complex, with multiple and interrelated causes. Prevention efforts benefit from an approach that relies on the combination of multiple interventions. Health promotion provides a useful framework for conceptualizing and implementing actions to reduce drinking and driving, since it involves a combination of educational, behavioral, environmental, and policy approaches. This review draws on data from a range of settings to characterize the effectiveness of various interventions embedded within the health promotion approach. Interventions considered part of the health promotion approach include: (1) economic interventions, (2) organizational interventions, (3) policy interventions, and (4) health education interventions, including the use of media, school and community education, and public awareness programs. Effective health promotion strengthens the skills and capabilities of individuals to take action, and the capacity of groups or communities to act collectively to exert control over the determinants of alcohol-impaired driving. There is strong evidence for the effectiveness of some components of health promotion, including economic and retailer interventions, alcohol taxation, reducing alcohol availability, legal and legislative strategies, and strategies addressing the servers of alcohol. There is also evidence for the effectiveness of sobriety checkpoints, lower BAC laws, minimum legal drinking age laws, and supportive media promotion programs. Other interventions with moderate evidence of effectiveness include restricting alcohol advertising and promotion, and actions involving counter-advertising. Health education interventions that alone have insufficient evidence of effectiveness include passive server training programs, school drug and alcohol education programs, community mobilization efforts, and health warnings. Because each intervention builds on the strengths of every other one, ecological approaches to reducing alcohol-impaired driving using all four components of the health promotion model are likely to be the most effective. Settings such as schools, workplaces, cities, and communities offer practical opportunities to implement alcohol-impaired driving prevention programs within this framework.
Tyrrell, Pascal N; Corey, Paul N; Feldman, Brian M; Silverman, Earl D
2013-06-01
Physicians often assess the effectiveness of treatments on a small number of patients. Multiple-baseline designs (MBDs), based on the Wampold-Worsham (WW) method of randomization and applied to four subjects, have relatively low power. Our objective was to propose another approach with greater power that does not suffer from the time requirements of the WW method applied to a greater number of subjects. The power of a design that involves the combination of two four-subject MBDs was estimated using computer simulation and compared with the four- and eight-subject designs. The effect of a delayed linear response to treatment on the power of the test was also investigated. Power was found to be adequate (>80%) for a standardized mean difference (SMD) greater than 0.8. The effect size associated with 80% power from combined tests was smaller than that of the single four-subject MBD (SMD=1.3) and comparable with the eight-subject MBD (SMD=0.6). A delayed linear response to the treatment resulted in important reductions in power (20-35%). By combining two four-subject MBD tests, an investigator can detect better effect sizes (SMD=0.8) and be able to complete a comparatively timelier and feasible study. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
So, Sung-Sau; Karplus, Martin
2001-07-01
Glycogen phosphorylase (GP) is an important enzyme that regulates blood glucose level and a key therapeutic target for the treatment of type II diabetes. In this study, a number of potential GP inhibitors are designed with a variety of computational approaches. They include the applications of MCSS, LUDI and CoMFA to identify additional fragments that can be attached to existing lead molecules; the use of 2D and 3D similarity-based QSAR models (HQSAR and SMGNN) and of the LUDI program to identify novel molecules that may bind to the glucose binding site. The designed ligands are evaluated by a multiple screening method, which is a combination of commercial and in-house ligand-receptor binding affinity prediction programs used in a previous study (So and Karplus, J. Comp.-Aid. Mol. Des., 13 (1999), 243-258). Each method is used at an appropriate point in the screening, as determined by both the accuracy of the calculations and the computational cost. A comparison of the strengths and weaknesses of the ligand design approaches is made.
NASA Astrophysics Data System (ADS)
Yakovenko, Oleksandr; Jones, Steven J. M.
2018-01-01
We report the implementation of molecular modeling approaches developed as a part of the 2016 Grand Challenge 2, the blinded competition of computer aided drug design technologies held by the D3R Drug Design Data Resource (https://drugdesigndata.org/). The challenge was focused on the ligands of the farnesoid X receptor (FXR), a highly flexible nuclear receptor of the cholesterol derivative chenodeoxycholic acid. FXR is considered an important therapeutic target for metabolic, inflammatory, bowel and obesity related diseases (Expert Opin Drug Metab Toxicol 4:523-532, 2015), but in the context of this competition it is also interesting due to the significant ligand-induced conformational changes displayed by the protein. To deal with these conformational changes we employed multiple simulations of molecular dynamics (MD). Our MD-based protocols were top-ranked in estimating the free energy of binding of the ligands and FXR protein. Our approach was ranked second in the prediction of the binding poses where we also combined MD with molecular docking and artificial neural networks. Our approach showed mediocre results for high-throughput scoring of interactions.
NASA Astrophysics Data System (ADS)
Neves, Marco A. C.; Simões, Sérgio; Sá e Melo, M. Luisa
2010-12-01
CXCR4 is a G-protein coupled receptor for CXCL12 that plays an important role in human immunodeficiency virus infection, cancer growth and metastasization, immune cell trafficking and WHIM syndrome. In the absence of an X-ray crystal structure, theoretical modeling of the CXCR4 receptor remains an important tool for structure-function analysis and to guide the discovery of new antagonists with potential clinical use. In this study, the combination of experimental data and molecular modeling approaches allowed the development of optimized ligand-receptor models useful for elucidation of the molecular determinants of small molecule binding and functional antagonism. The ligand-guided homology modeling approach used in this study explicitly re-shaped the CXCR4 binding pocket in order to improve discrimination between known CXCR4 antagonists and random decoys. Refinement based on multiple test-sets with small compounds from single chemotypes provided the best early enrichment performance. These results provide an important tool for structure-based drug design and virtual ligand screening of new CXCR4 antagonists.
Bottomley, Steven; Denny, Paul
2011-01-01
A participatory learning approach, combined with both a traditional and a competitive assessment, was used to motivate students and promote a deep approach to learning biochemistry. Students were challenged to research, author, and explain their own multiple-choice questions (MCQs). They were also required to answer, evaluate, and discuss MCQs written by their peers. The technology used to support this activity was PeerWise--a freely available, innovative web-based system that supports students in the creation of an annotated question repository. In this case study, we describe students' contributions to, and perceptions of, the PeerWise system for a cohort of 107 second-year biomedical science students from three degree streams studying a core biochemistry subject. Our study suggests that the students are eager participants and produce a large repository of relevant, good quality MCQs. In addition, they rate the PeerWise system highly and use higher order thinking skills while taking an active role in their learning. We also discuss potential issues and future work using PeerWise for biomedical students. Copyright © 2011 Wiley Periodicals, Inc.
Detection of epistatic effects with logic regression and a classical linear regression model.
Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata
2014-02-01
To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes that cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of genotypes of several QTLs, the Cockerham approach is often not capable of identifying them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though a larger number of models has to be considered with the logic regression approach (requiring more stringent multiple testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase in power to detect such interactions compared to the Cockerham approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.
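The distinction the authors draw can be demonstrated numerically: when a trait responds to a Boolean rule over genotypes, a single logic predictor explains far more variance than additive codings plus a product interaction term. The simulation below is an invented toy along those lines, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
snp1 = rng.integers(0, 3, n)  # genotypes coded 0/1/2
snp2 = rng.integers(0, 3, n)

# Trait driven by a Boolean combination: the phenotype shifts only when
# (snp1 carries at least one allele) AND (snp2 is homozygous reference).
logic_term = ((snp1 >= 1) & (snp2 == 0)).astype(float)
y = 2.0 * logic_term + rng.normal(size=n)

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Cockerham-style model: additive codings plus their product as "interaction".
X_cockerham = np.column_stack([snp1, snp2, snp1 * snp2])
print(f"additive + product R^2: {r_squared(X_cockerham, y):.3f}")
# Logic-regression-style model: the Boolean predictor itself.
print(f"Boolean predictor  R^2: {r_squared(logic_term.reshape(-1, 1), y):.3f}")
```

Logic regression searches over such Boolean expressions directly, which is why it recovers this class of signal with fewer, more interpretable terms.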
Time series sightability modeling of animal populations.
ArchMiller, Althea A; Dorazio, Robert M; St Clair, Katherine; Fieberg, John R
2018-01-01
Logistic regression models-or "sightability models"-fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it resulted in similar estimates as the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits from model-based approaches that allow information to be shared across years.
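The modified Horvitz-Thompson step both models feed can be sketched directly: a logistic sightability model fitted to the marked-animal detection/non-detection data supplies a detection probability for each group seen in a later detection-only survey, and each group contributes its size inflated by 1/p. All coefficients and covariates below are invented, and the sampling-fraction correction for unsurveyed plots is omitted.

```python
import numpy as np

# Invented sightability-model coefficients: intercept, group size, visual cover.
beta = np.array([-0.5, 0.08, -0.9])

def p_detect(group_size, cover):
    # Logistic sightability model fitted (elsewhere) to marked-animal data.
    eta = beta @ np.array([1.0, group_size, cover])
    return 1.0 / (1.0 + np.exp(-eta))

# Groups counted in a detection-only survey: (animals in group, cover index).
observed = [(3, 0.2), (1, 0.8), (5, 0.5), (2, 0.1)]

# Modified Horvitz-Thompson: weight each observed group by 1 / detection prob.
N_hat = sum(size / p_detect(size, cover) for size, cover in observed)
print(f"abundance estimate within surveyed plots: {N_hat:.1f}")
```

The hierarchical alternative developed in the paper replaces this plug-in step with a full model, letting the temporally smoothed (TS) variant borrow strength across survey years.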
Cowan, Dallas M; Maskrey, Joshua R; Fung, Ernest S; Woods, Tyler A; Stabryla, Lisa M; Scott, Paul K; Finley, Brent L
2016-07-01
Alcohol concentrations in biological matrices offer information regarding an individual's intoxication level at a given time. In forensic cases, the alcohol concentration in the blood (BAC) at the time of death is sometimes used interchangeably with the BAC measured post-mortem, without consideration of alcohol concentration changes in the body after death. However, post-mortem factors must be taken into account for accurate forensic determination of the BAC prior to death, to avoid incorrect conclusions. The main objective of this work was to describe best practices for relating ante-mortem and post-mortem alcohol concentrations, using a combination of modeling, empirical data and other qualitative considerations. The Widmark modeling approach is a best-practice method for superimposing multiple alcohol doses ingested at various times, with alcohol elimination rate adjustments based on individual body factors. We combined the selected ante-mortem model with a suggested approach for roughly estimating changes in BAC post-mortem, and then analyzed the available data on post-mortem alcohol production in human bodies and potential markers for alcohol production through decomposition and putrefaction. Hypothetical cases provide best-practice approaches as examples for determining alcohol concentrations in biological matrices ante-mortem, as well as potential issues encountered with quantitative post-mortem approaches. This study provides information for standardizing BAC determination in forensic toxicology, while minimizing real-world case uncertainties. Copyright © 2016 Elsevier Inc. All rights reserved.
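The Widmark superposition idea is simple enough to sketch: each dose raises BAC by A/(rW) and the body eliminates at a roughly constant rate beta, so the concentration at any time is the sum of the surviving dose contributions. The values below are illustrative, and the sketch deliberately simplifies by treating absorption as instantaneous and clamping each dose's contribution at zero.

```python
# Widmark-style superposition sketch. Parameters (r, beta, body weight, doses)
# are illustrative placeholders, not case data.
def bac_permille(doses, t, weight_kg=80.0, r=0.68, beta=0.15):
    """doses: list of (time_h, grams_ethanol); returns BAC in g/kg at time t.

    Simplification: absorption is instantaneous and elimination of each dose
    starts at ingestion, which overstates early peaks.
    """
    bac = 0.0
    for t_dose, grams in doses:
        if t >= t_dose:
            bac += max(0.0, grams / (r * weight_kg) - beta * (t - t_dose))
    return bac

drinks = [(0.0, 14.0), (1.0, 14.0), (2.5, 28.0)]  # roughly one, one, two standard drinks
for t in (1.0, 3.0, 6.0):
    print(f"t = {t:.1f} h: BAC ~= {bac_permille(drinks, t):.2f} g/kg")
```

The paper's point is that this ante-mortem curve must then be reconciled with post-mortem production and redistribution before any BAC measured after death can be interpreted.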
Combination therapies - the next logical step for the treatment of synucleinopathies?
Valera, E.; Masliah, E.
2015-01-01
Currently there are no disease-modifying alternatives for the treatment of most neurodegenerative disorders. The available therapies for diseases such as Parkinson’s disease (PD), PD dementia (PDD), dementia with Lewy bodies (DLB) and multiple system atrophy (MSA), in which the protein alpha-synuclein (α-syn) accumulates within neurons and glial cells with toxic consequences, are focused on managing the disease symptoms. However, utilizing strategic drug combinations and/or multi-target drugs might increase the treatment efficiency when compared to monotherapies. Synucleinopathies are complex disorders that progress through several stages, and toxic α-syn aggregates exhibit prion-like behavior, spreading from cell to cell. Therefore, it follows that these neurodegenerative disorders might require equally complex therapeutic approaches in order to obtain significant and long-lasting results. Hypothetically, therapies aimed at reducing α-syn accumulation and cell-to-cell transfer, such as immunotherapy against α-syn, could be combined with agents that reduce neuroinflammation, with potential synergistic outcomes. Here we review the current evidence supporting this type of approach, suggesting that such rational therapy combinations, together with the use of multi-target drugs, may hold promise as the next logical step for the treatment of synucleinopathies. PMID:26388203
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Vivas, A. Katherina
2017-12-01
Ongoing and future surveys with repeat imaging in multiple bands are producing (or will produce) time-spaced measurements of brightness, resulting in the identification of large numbers of variable sources in the sky. A large fraction of these are periodic variables: compilations of these are of scientific interest for a variety of purposes. Unavoidably, the data sets from many such surveys not only have sparse sampling, but also have embedded frequencies in the observing cadence that beat against the natural periodicities of any object under investigation. Such limitations can make period determination ambiguous and uncertain. For multiband data sets with asynchronous measurements in multiple passbands, we wish to maximally use the information on periodicity in a manner that is agnostic of differences in the light-curve shapes across the different channels. Given large volumes of data, computational efficiency is also at a premium. This paper develops and presents a computationally economic method for determining periodicity that combines the results from two different classes of period-determination algorithms. The underlying principles are illustrated through examples. The effectiveness of this approach for combining asynchronously sampled measurements in multiple observables that share an underlying fundamental frequency is also demonstrated.
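A shape-agnostic way to pool periodicity evidence across asynchronous bands, in the spirit described here, is to compute a periodogram per band and combine the normalized powers on a common period grid. The sketch below averages two Lomb-Scargle periodograms from SciPy on simulated two-band data; the cadences, amplitudes, and true period are invented, and the paper's specific pairing of two different period-determination algorithms is not reproduced.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)
true_period = 0.73  # days, invented
t_g = np.sort(rng.uniform(0, 120, 60))  # asynchronous sampling per band
t_r = np.sort(rng.uniform(0, 120, 45))

# Two bands share a period but not a light-curve shape (different amplitude/phase).
y_g = np.sin(2 * np.pi * t_g / true_period) + 0.2 * rng.normal(size=60)
y_r = 0.6 * np.cos(2 * np.pi * t_r / true_period + 1.1) + 0.2 * rng.normal(size=45)

periods = np.linspace(0.2, 2.0, 20000)
omega = 2 * np.pi / periods  # lombscargle expects angular frequencies

def norm_power(t, y):
    power = lombscargle(t, y - y.mean(), omega)
    return power / power.max()

# Shape-agnostic combination: average the per-band normalized periodograms.
combined = (norm_power(t_g, y_g) + norm_power(t_r, y_r)) / 2
best = periods[np.argmax(combined)]
print(f"recovered period: {best:.3f} d (true {true_period} d)")
```

Peaks caused by a single band's cadence tend to be diluted in the average, while a frequency shared by all bands reinforces, which is the intuition behind multiband combination.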
Combining multiple thresholding binarization values to improve OCR output
NASA Astrophysics Data System (ADS)
Lund, William B.; Kennard, Douglas J.; Ringger, Eric K.
2013-01-01
For noisy, historical documents, a high optical character recognition (OCR) word error rate (WER) can render the OCR text unusable. Since image binarization is often the method used to identify foreground pixels, a body of research seeks to improve image-wide binarization directly. Instead of relying on any one imperfect binarization technique, our method incorporates information from multiple simple thresholding binarizations of the same image to improve text output. Using a new corpus of 19th century newspaper grayscale images for which the text transcription is known, we observe WERs of 13.8% and higher using current binarization techniques and a state-of-the-art OCR engine. Our novel approach combines the OCR outputs from multiple thresholded images by aligning the text output and producing a lattice of word alternatives from which a lattice word error rate (LWER) is calculated. Our results show an LWER of 7.6% when aligning two threshold images and an LWER of 6.8% when aligning five. From the word lattice we commit to one hypothesis by applying the methods of Lund et al. (2011), achieving an improvement over the original OCR output and an 8.41% WER result on this data set.
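The alignment-and-lattice idea can be miniaturized: align the token streams from two binarizations, keep both words wherever they disagree, and count a position as correct if any alternative matches the truth (the LWER notion). The sketch below uses Python's difflib on invented OCR strings; the final commit-to-one-hypothesis step of Lund et al. (2011) is not reproduced.

```python
from difflib import SequenceMatcher

# Invented OCR outputs from two binarization thresholds, plus the ground truth.
ocr_a = "the quick brovvn fox jumped ovor the lazy dog".split()
ocr_b = "tlie quick brown fox jumped over the 1azy dog".split()
truth = "the quick brown fox jumped over the lazy dog".split()

# Align the two token streams and build a lattice of word alternatives.
lattice = []
sm = SequenceMatcher(a=ocr_a, b=ocr_b, autojunk=False)
for op, a0, a1, b0, b1 in sm.get_opcodes():
    if op == "equal":
        lattice.extend({w} for w in ocr_a[a0:a1])
    else:  # disagreement: carry alternatives from both hypotheses
        span = max(a1 - a0, b1 - b0)
        for i in range(span):
            alts = set()
            if a0 + i < a1: alts.add(ocr_a[a0 + i])
            if b0 + i < b1: alts.add(ocr_b[b0 + i])
            lattice.append(alts)

# Lattice WER: a position is correct if ANY alternative matches the truth.
errors = sum(1 for alts, w in zip(lattice, truth) if w not in alts)
print(f"lattice WER: {errors / len(truth):.1%}")  # lower than either output alone
```

Here the two OCR streams make errors on different words, so the lattice contains the correct word at every position, which is exactly the complementarity the multi-threshold approach exploits.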
Intelligent power management in a vehicular system with multiple power sources
NASA Astrophysics Data System (ADS)
Murphey, Yi L.; Chen, ZhiHang; Kiliaris, Leonidas; Masrur, M. Abul
This paper presents an optimal online power management strategy for a vehicular power system that contains multiple power sources and deals with highly fluctuating load requests. The strategy is developed using machine learning and fuzzy logic. A machine learning algorithm has been developed to learn how to minimize power loss in a Multiple Power Sources and Loads (M_PS&LD) system. The algorithm exploits the fact that different power sources used to deliver a load request incur different power losses under different vehicle states. The machine learning algorithm is used to train an intelligent power controller, an online fuzzy power controller FPC_MPS, that is capable of finding combinations of power sources that minimize power losses while satisfying a given set of system and component constraints during a drive cycle. The FPC_MPS was implemented in two simulated systems, a power system with four power sources and a vehicle system with three power sources. Experimental results show that the proposed machine learning approach combined with fuzzy control is a promising technology for intelligent vehicle power management in a M_PS&LD power system.
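The optimization the controller learns to approximate can be caricatured as a subset search: for a given load, choose the set of active sources that minimizes total loss subject to the constraints. The enumeration below uses an invented loss model (a per-source activation overhead plus I^2 R conduction loss with an equal load split); real source models and vehicle-state constraints would replace it.

```python
from itertools import combinations

# Invented source models: (internal resistance in ohms, activation overhead in W).
sources = {"battery": (0.05, 5.0), "ultracap": (0.02, 2.0), "generator": (0.08, 10.0)}
V = 48.0  # nominal bus voltage

def loss_w(subset, load_w):
    # Equal load split across the chosen sources; overhead + I^2 R per source.
    current = (load_w / len(subset)) / V
    return sum(r * current**2 + overhead for r, overhead in (sources[s] for s in subset))

def best_combination(load_w):
    candidates = [c for k in range(1, len(sources) + 1)
                  for c in combinations(sources, k)]
    return min(candidates, key=lambda c: loss_w(c, load_w))

for load in (500.0, 5000.0):
    combo = best_combination(load)
    print(f"{load:.0f} W -> {combo}, loss {loss_w(combo, load):.1f} W")
```

With this toy model a light load is served most cheaply by the low-overhead ultracapacitor alone, while a heavy load favors paralleling sources; the fuzzy controller's role in the paper is to make such decisions online without an exhaustive search.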
Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong
2017-06-19
A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Unlike existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of the base kernels are regarded as external parameters of the single-hidden-layer feedforward neural networks (SLFNs). The combination coefficients of the base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results demonstrate that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision but also in efficiency, for gas classification.
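The composite-kernel KELM algebra at the center of this method is compact: form a weighted sum of base kernel matrices, then solve the regularized linear system for the output weights. The sketch below fixes the kernel weights and parameters by hand to show only that step; in the paper they are tuned by QPSO, and the data here are invented stand-ins for e-nose features.

```python
import numpy as np

def rbf(X, Z, gamma=0.5):
    # Gaussian (RBF) base kernel.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly(X, Z, degree=2, c0=1.0):
    # Polynomial base kernel.
    return (X @ Z.T + c0) ** degree

def composite(X, Z, w=(0.7, 0.3)):
    # Weighted multiple kernel: a convex combination of base kernels.
    return w[0] * rbf(X, Z) + w[1] * poly(X, Z)

rng = np.random.default_rng(4)
X = rng.normal(size=(80, 6))                  # invented e-nose feature vectors
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)  # invented gas-class labels
T = np.eye(2)[y]                              # one-hot targets

C = 10.0  # regularization parameter (QPSO-tuned in the paper)
K = composite(X, X)
alpha = np.linalg.solve(K + np.eye(len(X)) / C, T)  # KELM output weights

X_test = rng.normal(size=(5, 6))
pred = np.argmax(composite(X_test, X) @ alpha, axis=1)
print(pred)
```

Because the solve is a single closed-form linear system, each QPSO candidate (kernel weights, kernel parameters, C) can be evaluated cheaply, which is what makes the swarm search over the composite kernel practical.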
Vallotton, Nathalie; Price, Paul S
2016-05-17
This paper uses the maximum cumulative ratio (MCR) as part of a tiered approach to evaluate and prioritize the risk of acute ecological effects from combined exposures to the plant protection products (PPPs) measured in 3,099 surface water samples taken from across the United States. Assessments of the reported mixtures performed on a substance-by-substance basis and using a Tier One cumulative assessment based on the lowest acute ecotoxicity benchmark gave the same findings for 92.3% of the mixtures. These mixtures either did not indicate a potential risk for acute effects or included one or more individual PPPs with concentrations in excess of their benchmarks. A Tier Two assessment using a trophic-level approach was applied to evaluate the remaining 7.7% of the mixtures. This assessment reduced the number of mixtures of concern by eliminating combinations of endpoints from multiple trophic levels, identified invertebrates and nonvascular plants as the most susceptible nontarget organisms, and indicated that only a very limited number of PPPs drove the potential concerns. The combination of the measures of cumulative risk and the MCR enabled the identification of a small subset of mixtures where a potential risk would be missed in substance-by-substance assessments.
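The MCR itself is a one-line statistic: the hazard index (the sum of per-substance hazard quotients) divided by the largest single hazard quotient, so it measures how much of the mixture risk a substance-by-substance review would miss. A minimal sketch follows; the concentrations and benchmarks are invented, not values from the monitoring data set.

```python
# Maximum cumulative ratio sketch for one water sample.
# Each entry: substance -> (measured concentration, acute ecotoxicity benchmark),
# all values invented for illustration (same units, e.g. ug/L).
sample = {
    "atrazine":     (1.2, 10.0),
    "chlorpyrifos": (0.05, 0.083),
    "diuron":       (0.4, 2.4),
}

hq = {s: conc / benchmark for s, (conc, benchmark) in sample.items()}
hi = sum(hq.values())        # hazard index of the mixture
mcr = hi / max(hq.values())  # MCR = HI / max individual hazard quotient
print(f"HI = {hi:.2f}, MCR = {mcr:.2f}, driver = {max(hq, key=hq.get)}")
# MCR near 1 -> one substance dominates the mixture risk;
# MCR near the number of substances -> risk is genuinely cumulative.
```

In the tiered scheme, samples whose HI exceeds 1 while every individual HQ stays below 1 are exactly those flagged by a high MCR, and the Tier Two trophic-level refinement then checks whether the summed endpoints actually belong together.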
Bioinformatics Knowledge Map for Analysis of Beta-Catenin Function in Cancer
Arighi, Cecilia N.; Wu, Cathy H.
2015-01-01
Given the wealth of bioinformatics resources and the growing complexity of biological information, it is valuable to integrate data from disparate sources to gain insight into the role of genes/proteins in health and disease. We have developed a bioinformatics framework that combines literature mining with information from biomedical ontologies and curated databases to create knowledge “maps” of genes/proteins of interest. We applied this approach to the study of beta-catenin, a cell adhesion molecule and transcriptional regulator implicated in cancer. The knowledge map includes post-translational modifications (PTMs), protein-protein interactions, disease-associated mutations, and transcription factors co-activated by beta-catenin and their targets and captures the major processes in which beta-catenin is known to participate. Using the map, we generated testable hypotheses about beta-catenin biology in normal and cancer cells. By focusing on proteins participating in multiple relation types, we identified proteins that may participate in feedback loops regulating beta-catenin transcriptional activity. By combining multiple network relations with PTM proteoform-specific functional information, we proposed a mechanism to explain the observation that the cyclin dependent kinase CDK5 positively regulates beta-catenin co-activator activity. Finally, by overlaying cancer-associated mutation data with sequence features, we observed mutation patterns in several beta-catenin PTM sites and PTM enzyme binding sites that varied by tissue type, suggesting multiple mechanisms by which beta-catenin mutations can contribute to cancer. The approach described, which captures rich information for molecular species from genes and proteins to PTM proteoforms, is extensible to other proteins and their involvement in disease. PMID:26509276
Herrgård, Markus J.
2014-01-01
High-cell-density fermentation for industrial production of chemicals can impose numerous stresses on cells due to high substrate, product, and by-product concentrations; high osmolarity; reactive oxygen species; and elevated temperatures. There is a need to develop platform strains of industrial microorganisms that are more tolerant toward these typical processing conditions. In this study, the growth of six industrially relevant strains of Escherichia coli was characterized under eight stress conditions representative of fed-batch fermentation, and strains W and BL21(DE3) were selected as platforms for transposon (Tn) mutagenesis due to favorable resistance characteristics. Selection experiments, followed by either targeted or genome-wide next-generation-sequencing-based Tn insertion site determination, were performed to identify mutants with improved growth properties under a subset of three stress conditions and two combinations of individual stresses. A subset of the identified loss-of-function mutants were selected for a combinatorial approach, where strains with combinations of two and three gene deletions were systematically constructed and tested for single and multistress resistance. These approaches allowed identification of (i) strain-background-specific stress resistance phenotypes, (ii) novel gene deletion mutants in E. coli that confer single and multistress resistance in a strain-background-dependent manner, and (iii) synergistic effects of multiple gene deletions that confer improved resistance over single deletions. The results of this study underscore the suboptimality and strain-specific variability of the genetic network regulating growth under stressful conditions and suggest that further exploration of the combinatorial gene deletion space in multiple strain backgrounds is needed for optimizing strains for microbial bioprocessing applications. PMID:25085490
Linked independent component analysis for multimodal data fusion.
Groves, Adrian R; Beckmann, Christian F; Smith, Steve M; Woolrich, Mark W
2011-02-01
In recent years, neuroimaging studies have increasingly been acquiring multiple modalities of data and searching for task- or disease-related changes in each modality separately. A major challenge in analysis is to find systematic approaches for fusing these differing data types together to automatically find patterns of related changes across multiple modalities, when they exist. Independent Component Analysis (ICA) is a popular unsupervised learning method that can be used to find the modes of variation in neuroimaging data across a group of subjects. When multimodal data is acquired for the subjects, ICA is typically performed separately on each modality, leading to incompatible decompositions across modalities. Using a modular Bayesian framework, we develop a novel "Linked ICA" model for simultaneously modelling and discovering common features across multiple modalities, which can potentially have completely different units, signal- and contrast-to-noise ratios, voxel counts, spatial smoothnesses and intensity distributions. Furthermore, this general model can be configured to allow tensor ICA or spatially-concatenated ICA decompositions, or a combination of both at the same time. Linked ICA automatically determines the optimal weighting of each modality, and also can detect single-modality structured components when present. This is a fully probabilistic approach, implemented using Variational Bayes. We evaluate the method on simulated multimodal data sets, as well as on a real data set of Alzheimer's patients and age-matched controls that combines two very different types of structural MRI data: morphological data (grey matter density) and diffusion data (fractional anisotropy, mean diffusivity, and tensor mode). Copyright © 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sokolov, Vladimir V.; Filonenko, E. V.; Telegina, L. V.; Boulgakova, N. N.; Smirnov, V. V.
2002-11-01
The results of comparative studies of autofluorescence and 5-ALA-induced fluorescence of protoporphyrin IX, used in the diagnosis of early cancer of the larynx and bronchi, are presented. The autofluorescence and 5-ALA-induced fluorescence images of laryngeal and bronchial tissues are analysed during the endoscopic study. The method of local spectrophotometry is used to verify findings obtained from fluorescence images. It is shown that such a combined approach can be efficiently used to improve the diagnosis of precancer and early cancer, to detect primary multiple tumours, and to diagnose a residual tumour or an early recurrence after endoscopic, surgical, or X-ray treatment. The developed approach allows one to minimise the number of false-positive results and to reduce the number of biopsies that are commonly needed in the white-light bronchoscopy search for occult cancerous loci.
Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model.
Musekiwa, Alfred; Manda, Samuel O M; Mwambi, Henry G; Chen, Ding-Geng
2016-01-01
Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes and contrast different covariance structures for the dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect sizes and illustrate them with a practical example involving a meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results.
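For readers who want the mechanics, here is a minimal numerical sketch of the multivariate alternative to separate univariate meta-analyses: a fixed-effect generalized-least-squares pooling of longitudinal effect sizes with known within-study covariance matrices. The two studies, effect sizes, and AR(1)-style serial correlation below are illustrative assumptions only; the paper's models (random effects, competing covariance structures) go further.

```python
import numpy as np

def mv_meta_fixed(effects, within_cov):
    """Fixed-effect multivariate meta-analysis of longitudinal effect sizes.

    effects:    list of length-T arrays, one per study (e.g. log hazard ratios
                at 6, 12, 18 and 24 months)
    within_cov: list of TxT within-study covariance matrices; off-diagonal
                terms encode the serial correlation between time points
    """
    T = len(effects[0])
    sum_w = np.zeros((T, T))
    sum_wy = np.zeros(T)
    for y, V in zip(effects, within_cov):
        W = np.linalg.inv(V)           # precision weight for this study
        sum_w += W
        sum_wy += W @ y
    cov = np.linalg.inv(sum_w)         # covariance of the pooled estimates
    beta = cov @ sum_wy                # pooled effect at each time point
    return beta, np.sqrt(np.diag(cov))

# Toy example: two studies, four time points, AR(1)-like within-study correlation
rho, se = 0.6, 0.2
V = (se ** 2) * rho ** np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
beta, beta_se = mv_meta_fixed([np.array([-0.1, -0.2, -0.2, -0.3]),
                               np.array([-0.2, -0.1, -0.3, -0.4])], [V, V])
print(beta, beta_se)
```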
Orthology detection combining clustering and synteny for very large datasets.
Lechner, Marcus; Hernandez-Rosales, Maribel; Doerr, Daniel; Wieseke, Nicolas; Thévenin, Annelyse; Stoye, Jens; Hartmann, Roland K; Prohaska, Sonja J; Stadler, Peter F
2014-01-01
The elucidation of orthology relationships is an important step both in gene function prediction as well as towards understanding patterns of sequence evolution. Orthology assignments are usually derived directly from sequence similarities for large datasets because more exact approaches are computationally too costly. Here we present PoFF, an extension for the standalone tool Proteinortho, which enhances orthology detection by combining clustering, sequence similarity, and synteny. In the course of this work, FFAdj-MCS, a heuristic that assesses pairwise gene order using adjacencies (a similarity measure related to the breakpoint distance), was adapted to support multiple linear chromosomes and extended to detect duplicated regions. PoFF largely reduces the number of false positives and enables more fine-grained predictions than purely similarity-based approaches. The extension maintains the low memory requirements and the efficient concurrency options of its basis Proteinortho, making the software applicable to very large datasets. PMID:25137074
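As a rough illustration of the adjacency idea behind FFAdj-MCS, the sketch below scores two gene orders by the Jaccard overlap of their adjacencies; it is an assumption-laden stand-in, not the paper's heuristic, which additionally handles multiple chromosomes and duplicated regions.

```python
def adjacency_similarity(order_a, order_b):
    """Jaccard overlap of gene adjacencies between two gene orders.

    A rough stand-in for adjacency-based gene-order measures related to the
    breakpoint distance."""
    adjacencies = lambda order: {frozenset(pair) for pair in zip(order, order[1:])}
    a, b = adjacencies(order_a), adjacencies(order_b)
    return len(a & b) / max(len(a | b), 1)

# Two hypothetical genomes sharing 3 of their 5 distinct adjacencies -> 0.6
print(adjacency_similarity(["g1", "g2", "g3", "g4", "g5"],
                           ["g1", "g2", "g3", "g5", "g4"]))
```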
A community computational challenge to predict the activity of pairs of compounds.
Bansal, Mukesh; Yang, Jichen; Karan, Charles; Menden, Michael P; Costello, James C; Tang, Hao; Xiao, Guanghua; Li, Yajuan; Allen, Jeffrey; Zhong, Rui; Chen, Beibei; Kim, Minsoo; Wang, Tao; Heiser, Laura M; Realubit, Ronald; Mattioli, Michela; Alvarez, Mariano J; Shen, Yao; Gallahan, Daniel; Singer, Dinah; Saez-Rodriguez, Julio; Xie, Yang; Stolovitzky, Gustavo; Califano, Andrea
2014-12-01
Recent therapeutic successes have renewed interest in drug combinations, but experimental screening approaches are costly and often identify only small numbers of synergistic combinations. The DREAM consortium launched an open challenge to foster the development of in silico methods to computationally rank 91 compound pairs, from the most synergistic to the most antagonistic, based on gene-expression profiles of human B cells treated with individual compounds at multiple time points and concentrations. Using scoring metrics based on experimental dose-response curves, we assessed 32 methods (31 community-generated approaches and SynGen), four of which performed significantly better than random guessing. We highlight similarities between the methods. Although the accuracy of predictions was not optimal, we find that computational prediction of compound-pair activity is possible, and that community challenges can be useful to advance the field of in silico compound-synergy prediction.
Mauk, Michael G.; Song, Jinzhao; Liu, Changchun; Bau, Haim H.
2018-01-01
Designs and applications of microfluidics-based devices for molecular diagnostics (nucleic acid amplification tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types (fluorescence, colorimetric dyes, and bioluminescence) are compared, along with a new paradigm for endpoint detection based on a diffusion-reaction column. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed. PMID:29495424
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, Junenette L., E-mail: petersj@bu.edu; Patricia Fabian, M., E-mail: pfabian@bu.edu; Levy, Jonathan I., E-mail: jonlevy@bu.edu
High blood pressure is associated with exposure to multiple chemical and non-chemical risk factors, but epidemiological analyses to date have not assessed the combined effects of both chemical and non-chemical stressors on human populations in the context of cumulative risk assessment. We developed a novel modeling approach to evaluate the combined impact of lead, cadmium, polychlorinated biphenyls (PCBs), and multiple non-chemical risk factors on four blood pressure measures using data for adults aged ≥20 years from the National Health and Nutrition Examination Survey (1999–2008). We developed predictive models for chemical and other stressors. Structural equation models were applied to account for complex associations among predictors of stressors as well as blood pressure. Models showed that blood lead, serum PCBs, and established non-chemical stressors were significantly associated with blood pressure. Lead was the chemical stressor most predictive of diastolic blood pressure and mean arterial pressure, while PCBs had a greater influence on systolic blood pressure and pulse pressure, and blood cadmium was not a significant predictor of blood pressure. The simultaneously fit exposure models explained 34%, 43% and 52% of the variance for lead, cadmium and PCBs, respectively. The structural equation models were developed using predictors available from public data streams (e.g., U.S. Census), which would allow the models to be applied to any U.S. population exposed to these multiple stressors in order to identify high risk subpopulations, direct intervention strategies, and inform public policy.
Highlights:
• We evaluated joint impact of chemical and non-chemical stressors on blood pressure.
• We built predictive models for lead, cadmium and polychlorinated biphenyls (PCBs).
• Our approach allows joint evaluation of predictors from population-specific data.
• Lead, PCBs and established non-chemical stressors were related to blood pressure.
• Framework allows cumulative risk assessment in specific geographic settings.
Sophocleous, M.
2000-01-01
A practical methodology for recharge characterization was developed based on several years of field-oriented research at 10 sites in the Great Bend Prairie of south-central Kansas. This methodology combines the soil-water budget on a storm-by-storm, year-round basis with the resulting watertable rises. The estimated 1985-1992 average annual recharge was less than 50 mm/year, with a range from 15 mm/year (during the 1988 drought) to 178 mm/year (during the 1993 flood year). Most of this recharge occurs during the spring months. To regionalize these site-specific estimates, an additional methodology based on multiple (forward) regression analysis combined with classification and GIS overlay analyses was developed and implemented. The multiple regression analysis showed that the most influential variables were, in order of decreasing importance, total annual precipitation, average maximum springtime soil-profile water storage, average shallowest springtime depth to watertable, and average springtime precipitation rate. Therefore, four GIS (ARC/INFO) data "layers" or coverages were constructed for the study region based on these four variables, and each such coverage was classified into the same number of data classes to avoid biasing the results. The normalized regression coefficients were employed to weigh the class rankings of each recharge-affecting variable. This approach resulted in recharge zonations that agreed well with the site recharge estimates. During the "Great Flood of 1993," when rainfall totals exceeded normal levels by ~200% in the northern portion of the study region, the developed regionalization methodology was tested against such extreme conditions, and proved to be both practical, based on readily available or easily measurable data, and robust. It was concluded that the combination of multiple regression and GIS overlay analyses is a powerful and practical approach to regionalizing small samples of recharge estimates.
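The regionalization step can be made concrete with a small sketch: class ranks of each recharge-affecting GIS layer are combined by a weighted sum, with the normalized regression coefficients as weights. The layers, ranks, and weights below are hypothetical.

```python
import numpy as np

def recharge_zonation(class_ranks, reg_coefs):
    """Combine classed GIS layers into a recharge zonation score.

    class_ranks: dict of 2-D arrays, one per recharge-affecting variable,
                 each cell holding that variable's class rank (e.g. 1..5)
    reg_coefs:   normalized multiple-regression coefficients used as weights
    """
    score = np.zeros_like(next(iter(class_ranks.values())), dtype=float)
    for name, ranks in class_ranks.items():
        score += reg_coefs[name] * ranks
    return score

# Hypothetical 2x2 region with two of the four variables from the study
layers = {"annual_precip": np.array([[5, 4], [2, 1]]),
          "spring_soil_storage": np.array([[3, 3], [2, 4]])}
weights = {"annual_precip": 0.6, "spring_soil_storage": 0.4}
print(recharge_zonation(layers, weights))
```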
New methods for image collection and analysis in scanning Auger microscopy
NASA Technical Reports Server (NTRS)
Browning, R.
1985-01-01
While scanning Auger micrographs are used extensively for illustrating the stoichiometry of complex surfaces and for indicating areas of interest for fine point Auger spectroscopy, there are many problems in the quantification and analysis of Auger images. These problems include multiple contrast mechanisms and the lack of meaningful relationships with other Auger data. Collection of multielemental Auger images allows some new approaches to image analysis and presentation. Information about the distribution and quantity of elemental combinations at a surface is retrievable, and particular combinations of elements can be imaged, such as alloy phases. Results from the precipitation-hardened alloy Al-2124 illustrate multispectral Auger imaging.
Evaluation of satellites and remote sensors for atmospheric pollution measurements
NASA Technical Reports Server (NTRS)
Carmichael, J.; Eldridge, R.; Friedman, E.; Keitz, E.
1976-01-01
An approach to the development of a prioritized list of scientific goals in atmospheric research is provided. The results of the analysis are used to estimate the contribution of various spacecraft/remote sensor combinations for each of several important constituents of the stratosphere. The evaluation of the combinations includes both single-instrument and multiple-instrument payloads. Attention was turned to the physical and chemical features of the atmosphere as well as the performance capability of a number of atmospheric remote sensors. In addition, various orbit considerations were reviewed along with detailed information on stratospheric aerosols and the impact of spacecraft environment on the operation of the sensors.
NASA Astrophysics Data System (ADS)
Chen, Zhangqi; Liu, Zi-Kui; Zhao, Ji-Cheng
2018-05-01
Diffusion coefficients of seven binary systems (Ti-Mo, Ti-Nb, Ti-Ta, Ti-Zr, Zr-Mo, Zr-Nb, and Zr-Ta) at 1200 °C, 1000 °C, and 800 °C were experimentally determined using three Ti-Mo-Nb-Ta-Zr diffusion multiples. Electron probe microanalysis (EPMA) was performed to collect concentration profiles in the binary diffusion regions. Forward simulation analysis (FSA) was then applied to extract both impurity and interdiffusion coefficients in the Ti-rich and Zr-rich parts of the bcc phase. Excellent agreement between our results and most of the literature data validates the high-throughput approach combining FSA with diffusion multiples to obtain a large amount of systematic diffusion data, which will help establish the diffusion (mobility) databases for the design and development of biomedical and structural Ti alloys.
NASA Astrophysics Data System (ADS)
Müller, H.; Haberlandt, U.
2018-01-01
Rainfall time series of high temporal resolution and spatial density are crucial for urban hydrology. The multiplicative random cascade model can be used for temporal rainfall disaggregation of daily data to generate such time series. Here, the uniform splitting approach with a branching number of 3 in the first disaggregation step is applied. To achieve a final resolution of 5 min, subsequent steps after disaggregation are necessary. Three modifications at different disaggregation levels are tested in this investigation (uniform splitting at Δt = 15 min, linear interpolation at Δt = 7.5 min and Δt = 3.75 min). Results are compared both with observations and with an often-used approach based on the assumption that time steps of Δt = 5.625 min, which result if a branching number of 2 is applied throughout, can be replaced with Δt = 5 min (called the 1280 min approach). Spatial consistence is implemented in the disaggregated time series using a resampling algorithm. In total, 24 recording stations in Lower Saxony, Northern Germany, with a 5 min resolution have been used for the validation of the disaggregation procedure. The urban-hydrological suitability is tested with an artificial combined sewer system of about 170 hectares. The results show that all three variations outperform the 1280 min approach regarding reproduction of wet spell duration, average intensity, fraction of dry intervals and lag-1 autocorrelation. Extreme values with durations of 5 min are also better represented. For durations of 1 h, all approaches show only slight deviations from the observed extremes. The applied resampling algorithm is capable of achieving sufficient spatial consistence. The effects on the urban hydrological simulations are significant. Without spatial consistence, flood volumes of manholes and combined sewer overflow are strongly underestimated. After resampling, results using disaggregated time series as input are in the range of those using observed time series. The best overall performance regarding rainfall statistics is obtained by the method in which the disaggregation process ends at time steps of 7.5 min duration, deriving the 5 min time steps by linear interpolation. With subsequent resampling, this method leads to a good representation of manhole flooding and combined sewer overflow volume in hydrological simulations and outperforms the 1280 min approach.
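A minimal sketch of the best-performing variant (cascade down to 7.5 min, then linear interpolation to 5 min) is given below. The Dirichlet splitting weights and the mass-conserving rescale are simplifying assumptions; the actual model estimates its cascade weight probabilities from observations.

```python
import numpy as np

rng = np.random.default_rng(0)

def split(values, branches):
    # One cascade step: split each interval into `branches` parts with random
    # weights summing to 1, so rainfall mass is conserved (microcanonical).
    out = []
    for v in values:
        w = rng.dirichlet(np.ones(branches))   # assumed weight model
        out.extend(v * w)
    return out

def disaggregate_day(daily_mm):
    # Daily total -> 5-min series: one b=3 step (1440 -> 480 min), six b=2
    # steps (480 -> 7.5 min), then linear interpolation onto the 5-min grid.
    series = [daily_mm]
    series = split(series, 3)
    for _ in range(6):
        series = split(series, 2)
    t_75 = np.arange(len(series)) * 7.5        # 192 interval start times (min)
    t_5 = np.arange(288) * 5.0
    five_min = np.interp(t_5, t_75, series)
    return five_min * daily_mm / five_min.sum()  # rescale to keep the daily total

print(disaggregate_day(20.0).sum())  # 20.0
```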
Dinh, Duy; Tamine, Lynda; Boubekeur, Fatiha
2013-02-01
The aim of this work is to evaluate a set of indexing and retrieval strategies based on the integration of several biomedical terminologies on the available TREC Genomics collections for an ad hoc information retrieval (IR) task. We propose a multi-terminology based concept extraction approach for selecting the best concepts from free text by means of voting techniques. We instantiate this general approach on four terminologies (MeSH, SNOMED, ICD-10 and GO). We particularly focus on the effect of integrating terminologies into a biomedical IR process, and on the utility of voting techniques for combining the concepts extracted from each document into a list of unique concepts. Experimental studies conducted on the TREC Genomics collections show that our multi-terminology IR approach based on voting techniques yields statistically significant improvements over the baseline. For example, tested on the 2005 TREC Genomics collection, our multi-terminology based IR approach provides an improvement rate of +6.98% in terms of MAP (mean average precision) (p<0.05) compared to the baseline. In addition, our experimental results show that document expansion using preferred terms, in combination with query expansion using terms from top-ranked expanded documents, improves biomedical IR effectiveness. We have evaluated several voting models for combining concepts issued from multiple terminologies. Through this study, we present several factors affecting the effectiveness of biomedical IR systems, including term weighting, query expansion, and document expansion models. The appropriate combination of these factors can improve IR performance. Copyright © 2012 Elsevier B.V. All rights reserved.
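As an illustration of the voting step, the sketch below merges per-terminology concept lists with a Borda-style count; the concept IDs and weights are hypothetical, and the paper evaluates several voting models rather than this single one.

```python
from collections import defaultdict

def combine_concepts(extractions, weights=None):
    """Merge per-terminology concept lists into one ranked list by voting.

    extractions: dict terminology -> ranked list of concept IDs, best first.
    Borda-style: a concept scores more the earlier it appears in a list.
    """
    weights = weights or {t: 1.0 for t in extractions}
    score = defaultdict(float)
    for term, ranked in extractions.items():
        n = len(ranked)
        for i, concept in enumerate(ranked):
            score[concept] += weights[term] * (n - i)
    return sorted(score, key=score.get, reverse=True)

# Hypothetical concept IDs, assumed already mapped to a shared concept space
votes = combine_concepts({
    "MeSH":   ["C0001", "C0042"],
    "SNOMED": ["C0090", "C0001"],
    "GO":     ["C0001"],
})
print(votes)  # 'C0001' wins: nominated by all three terminologies
```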
Finney, Andrew; Healey, Emma; Jordan, Joanne L; Ryan, Sarah; Dziedzic, Krysia S
2016-07-08
The National Institute for Health and Care Excellence's Osteoarthritis (OA) guidelines recommended that future research should consider the benefits of combination therapies in people with OA across multiple joint sites. However, the clinical effectiveness of such approaches to OA management is unknown. This systematic review therefore aimed to identify the clinical and cost effectiveness of multidisciplinary approaches targeting multiple joint sites for OA in primary care. A systematic review of randomised controlled trials. Computerised bibliographic databases were searched (MEDLINE, EMBASE, CINAHL, PsycINFO, BNI, HBE, HMIC, AMED, Web of Science and Cochrane). Studies were included if they met the following criteria: a randomised controlled trial (RCT), a primary care population with OA across at least two different peripheral joint sites (multiple joint sites), and interventions undertaken by at least two different health disciplines (multidisciplinary). The Cochrane 'Risk of Bias' tool and PEDro were used for quality assessment of eligible studies. Clinical and cost effectiveness was determined by extracting and examining self-reported outcomes for pain, function, quality of life (QoL) and health care utilisation. The date range for the search was from database inception until August 2015. The search identified 1148 individual titles, of which four were included in the review. A narrative review was conducted due to the heterogeneity of the included trials. Each of the four trials used either educational or exercise interventions facilitated by a range of different health disciplines. Moderate clinical benefits on pain, function and QoL were reported across the studies. The beneficial effects of exercise generally decreased over time within all studies. Two studies were able to show a reduction in healthcare utilisation due to a reduction in visits to a physiotherapist or a reduction in x-rays and orthopaedic referrals. The intervention that showed the most promise used educational interventions delivered by GPs with reinforcement by practice nurses. There are currently very few studies that target multidisciplinary approaches suitable for OA across multiple joint sites in primary care. A more consistent approach to outcome measurement in future studies of this nature should be considered to allow for better comparison.
Computational Precision of Mental Inference as Critical Source of Human Choice Suboptimality.
Drugowitsch, Jan; Wyart, Valentin; Devauchelle, Anne-Dominique; Koechlin, Etienne
2016-12-21
Making decisions in uncertain environments often requires combining multiple pieces of ambiguous information from external cues. In such conditions, human choices resemble optimal Bayesian inference, but typically show a large suboptimal variability whose origin remains poorly understood. In particular, this choice suboptimality might arise from imperfections in mental inference rather than in peripheral stages, such as sensory processing and response selection. Here, we dissociate these three sources of suboptimality in human choices based on combining multiple ambiguous cues. Using a novel quantitative approach for identifying the origin and structure of choice variability, we show that imperfections in inference alone cause a dominant fraction of suboptimal choices. Furthermore, two-thirds of this suboptimality appear to derive from the limited precision of neural computations implementing inference rather than from systematic deviations from Bayes-optimal inference. These findings set an upper bound on the accuracy and ultimate predictability of human choices in uncertain environments. Copyright © 2016 Elsevier Inc. All rights reserved.
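The dissociation logic can be illustrated with a toy simulation in which noise is injected either at the sensory stage, during the combination of cue log-likelihood ratios (inference), or at response selection, and choices are compared against the Bayes-optimal read-out of the same percept. All distributions and noise levels below are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_choices(n_trials=10_000, n_cues=4,
                     sensory_sd=0.0, inference_sd=1.0, selection_sd=0.0):
    """Toy dissociation of three suboptimality sources in cue combination."""
    cat = rng.choice([-1.0, 1.0], n_trials)                    # hidden category
    llr = cat[:, None] + rng.normal(0, 1, (n_trials, n_cues))  # informative cues
    llr += rng.normal(0, sensory_sd, llr.shape)                # sensory stage
    dv = (llr + rng.normal(0, inference_sd, llr.shape)).sum(1) # noisy inference
    dv += rng.normal(0, selection_sd, n_trials)                # response stage
    optimal = np.sign(llr.sum(1))          # Bayes-optimal given the same percept
    return np.mean(np.sign(dv) != optimal) # fraction of suboptimal choices

print(simulate_choices(inference_sd=1.0))                   # inference noise alone
print(simulate_choices(inference_sd=0.0, selection_sd=2.0)) # late noise alone
```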
Metabolite identification through multiple kernel learning on fragmentation trees.
Shen, Huibin; Dührkop, Kai; Böcker, Sebastian; Rousu, Juho
2014-06-15
Metabolite identification from tandem mass spectrometric data is a key task in metabolomics. Various computational methods have been proposed for the identification of metabolites from tandem mass spectra. Fragmentation tree methods explore the space of possible ways in which the metabolite can fragment, and base the metabolite identification on scoring of these fragmentation trees. Machine learning methods have been used to map mass spectra to molecular fingerprints; predicted fingerprints, in turn, can be used to score candidate molecular structures. Here, we combine fragmentation tree computations with kernel-based machine learning to predict molecular fingerprints and identify molecular structures. We introduce a family of kernels capturing the similarity of fragmentation trees, and combine these kernels using recently proposed multiple kernel learning approaches. Experiments on two large reference datasets show that the new methods significantly improve molecular fingerprint prediction accuracy. These improvements result in better metabolite identification, doubling the number of metabolites ranked at the top position of the candidates list. © The Author 2014. Published by Oxford University Press.
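A minimal sketch of the kernel-combination step: several precomputed Gram matrices are cosine-normalized and averaged into a single kernel that any kernel method can consume. Uniform weights are the simplest MKL baseline; the paper learns the weights instead.

```python
import numpy as np

def combine_kernels(kernels, weights=None):
    """Combine precomputed kernel (Gram) matrices into one.

    Each kernel is cosine-normalized to unit diagonal so that kernels on
    different scales contribute comparably before weighting.
    """
    kernels = [K / np.sqrt(np.outer(np.diag(K), np.diag(K))) for K in kernels]
    if weights is None:
        weights = np.ones(len(kernels)) / len(kernels)  # uniform MKL baseline
    return sum(w * K for w, K in zip(weights, kernels))

# Two toy fragmentation-tree kernels over 3 spectra (symmetric PSD matrices)
K1 = np.array([[2.0, 1.0, 0.2], [1.0, 2.0, 0.5], [0.2, 0.5, 1.0]])
K2 = np.eye(3)
K = combine_kernels([K1, K2])
# K can be passed to any kernel method, e.g. sklearn's SVC(kernel="precomputed")
```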
Jäckel, David; Bakkum, Douglas J; Russell, Thomas L; Müller, Jan; Radivojevic, Milos; Frey, Urs; Franke, Felix; Hierlemann, Andreas
2017-04-20
We present a novel, all-electric approach to record and to precisely control the activity of tens of individual presynaptic neurons. The method allows for parallel mapping of the efficacy of multiple synapses and of the resulting dynamics of postsynaptic neurons in a cortical culture. For the measurements, we combine an extracellular high-density microelectrode array, featuring 11,000 electrodes for extracellular recording and stimulation, with intracellular patch-clamp recording. We are able to identify the contributions of individual presynaptic neurons - including inhibitory and excitatory synaptic inputs - to postsynaptic potentials, which enables us to study dendritic integration. Since the electrical stimuli can be controlled at microsecond resolution, our method makes it possible to evoke action potentials at tens of presynaptic cells in precisely orchestrated sequences of high reliability and minimum jitter. We demonstrate the potential of this method by evoking short- and long-term synaptic plasticity through manipulation of multiple synaptic inputs to a specific neuron.
Insights from Classifying Visual Concepts with Multiple Kernel Learning
Binder, Alexander; Nakajima, Shinichi; Kloft, Marius; Müller, Christina; Samek, Wojciech; Brefeld, Ulf; Müller, Klaus-Robert; Kawanabe, Motoaki
2012-01-01
Combining information from various image features has become a standard technique in concept recognition tasks. However, the optimal way of fusing the resulting kernel functions is usually unknown in practical applications. Multiple kernel learning (MKL) techniques make it possible to determine an optimal linear combination of such similarity matrices. Classical approaches to MKL promote sparse mixtures. Unfortunately, 1-norm regularized MKL variants are often observed to be outperformed by an unweighted sum kernel. The main contributions of this paper are the following: we apply a recently developed non-sparse MKL variant to state-of-the-art concept recognition tasks from the application domain of computer vision. We provide insights on the benefits and limits of non-sparse MKL and compare it against its direct competitors, the sum-kernel SVM and sparse MKL. We report empirical results for the PASCAL VOC 2009 Classification and ImageCLEF2010 Photo Annotation challenge data sets. Data sets (kernel matrices) as well as further information are available at http://doc.ml.tu-berlin.de/image_mkl/ (accessed 25 June 2012). PMID:22936970
The importance of the vaginal delivery route for antiretrovirals in HIV prevention
Ferguson, Lindsay M; Rohan, Lisa Cencia
2012-01-01
The HIV/AIDS pandemic continues to be a global health priority, with high rates of new HIV-1 infections persisting in young women. One HIV prevention strategy is topical pre-exposure prophylactics or microbicides, which are applied vaginally or rectally to protect the user from HIV and possibly other sexually transmitted infections. Vaginal microbicide delivery will be the focus of this review. Multiple nonspecific and specific antiretroviral microbicide products have been clinically evaluated, and many are in preclinical development. The events of HIV mucosal transmission and dynamics of the cervicovaginal environment should be considered for successful vaginal microbicide delivery. Beyond conventional vaginal formulations, intravaginal rings, tablets and films are employed as platforms in the hope to increase the likelihood of microbicide use. Furthermore, combining multiple antiretrovirals within a given formulation, combining a microbicide product with a vaginal device and integrating novel drug-delivery strategies within a microbicide product are approaches to successful vaginal-microbicide delivery. PMID:22468220
2017-02-01
Reports an error in "An integrative formal model of motivation and decision making: The MGPM*" by Timothy Ballard, Gillian Yeo, Shayne Loft, Jeffrey B. Vancouver and Andrew Neal (Journal of Applied Psychology, 2016[Sep], Vol 101[9], 1240-1265). Equation A3 contained an error; the corrected equation is provided in the erratum. (The following abstract of the original article appeared in record 2016-28692-001.) We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Confocal and dermoscopic features of basal cell carcinoma in Gorlin-Goltz syndrome: A case report.
Casari, Alice; Argenziano, Giuseppe; Moscarella, Elvira; Lallas, Aimilios; Longo, Caterina
2017-05-01
Gorlin-Goltz syndrome (GS) is an autosomal dominant disease linked to a mutation in the PTCH gene. Major criteria include the onset of multiple basal cell carcinomas (BCCs), keratocystic odontogenic tumours of the jaws, and bifid ribs. Dermoscopy and reflectance confocal microscopy (RCM) are imaging tools that can increase the diagnostic accuracy of skin cancer in a totally noninvasive manner, without the need for punch biopsies. Here we present the case of a young woman in whom the combined approach of dermoscopy and RCM led to the identification of multiple small inconspicuous lesions as BCC and thus to the diagnosis of GS. © 2016 The Australasian College of Dermatologists.
[Therapy of multiple myeloma: indications and options].
Peest, D; Ganser, A
2007-12-01
Multiple myeloma (MM) has an incidence of 3-4/100,000 in the Caucasian population. MM has to be distinguished from smouldering MM and monoclonal gammopathy of undetermined significance (MGUS). In younger patients (<65 years) a good long-term remission is the aim of therapy, while in elderly patients with comorbidities the aim is a good partial remission with good quality of life. In the elderly, this can be achieved with a combination of melphalan and prednisone. High-dose chemotherapy, often as a tandem transplantation, is part of the standard therapy for MM patients <65 years. However, allogeneic stem cell transplantation is the only curative approach. New substances approved for the treatment of relapsed MM include bortezomib, thalidomide, and lenalidomide.
Bairi, Partha; Minami, Kosuke; Hill, Jonathan P; Nakanishi, Waka; Shrestha, Lok Kumar; Liu, Chao; Harano, Koji; Nakamura, Eiichi; Ariga, Katsuhiko
2016-09-27
Supramolecular assembly can be used to construct a wide variety of ordered structures by exploiting the cumulative effects of multiple noncovalent interactions. However, the construction of anisotropic nanostructures remains subject to some limitations. Here, we demonstrate the preparation of anisotropic fullerene-based nanostructures by supramolecular differentiation, which is the programmed control of multiple assembly strategies. We have carefully combined interfacial assembly and local phase separation phenomena. Two fullerene derivatives, PhH and C12H, were together formed into self-assembled anisotropic nanostructures by using this approach. This technique is applicable for the construction of anisotropic nanostructures without requiring complex molecular design or complicated methodology.
History of Cognitive-Behavioral Therapy (CBT) in Youth
Benjamin, Courtney L.; Puleo, Connor M.; Settipani, Cara A.; Brodman, Douglas M.; Edmunds, Julie M.; Cummings, Colleen M.
2011-01-01
CBT represents a combination of behavioral and cognitive theories of human behavior and psychopathology, and a melding of emotional, familial, and peer influences. The numerous intervention strategies that comprise CBT reflect its complex and integrative nature and include such topics as extinction, habituation, modeling, cognitive restructuring, problem-solving, and the development of coping strategies, mastery, and a sense of self-control. CBT targets multiple areas of potential vulnerability (e.g., cognitive, behavioral, affective) with developmentally-guided strategies and traverses multiple intervention pathways. Although CBT is often considered the “first line treatment” for many psychological disorders in youth, additional work is necessary to address treatment non-responders and to facilitate the dissemination of efficacious CBT approaches. PMID:21440849
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuckelberger, Michael; West, Bradley; Nietzold, Tara
In situ and operando measurement techniques combined with nanoscale resolution have proven invaluable in multiple fields of study. We argue that evaluating device performance as well as material behavior by correlative X-ray microscopy with <100 nm resolution can radically change the approach for optimizing absorbers, interfaces and full devices in solar cell research. Here, we thoroughly discuss the measurement technique of X-ray beam induced current and point out fundamental differences between measurements of wafer-based silicon and thin-film solar cells. Based on reports of the last years, we showcase the potential that X-ray microscopy measurements have in combination with in situ and operando approaches throughout the solar cell lifecycle: from the growth of individual layers to the performance under operating conditions and degradation mechanisms. Enabled by new developments in synchrotron beamlines, the combination of high spatial resolution with high brilliance and a safe working distance allows for the insertion of measurement equipment that can pave the way for a new class of experiments. When applied to photovoltaics research, we highlight today’s opportunities and challenges in the field of nanoscale X-ray microscopy, and give an outlook on future developments.
NASA Astrophysics Data System (ADS)
Omar, M. A.; Parvataneni, R.; Zhou, Y.
2010-09-01
The proposed manuscript describes the implementation of a two-step processing procedure composed of self-referencing and Principal Component Thermography (PCT). The combined approach enables the processing of thermograms from transient (flash), steady (halogen) and selective (induction) thermal perturbations. First, the research discusses the three basic processing schemes typically applied in thermography: mathematical-transformation-based processing, curve-fitting processing, and direct contrast-based calculations. The proposed algorithm uses the self-referencing scheme to create a sub-sequence that contains the maximum contrast information and to compute the anomalies' depth values, while Principal Component Thermography operates on the sub-sequence frames by re-arranging their data content (pixel values) spatially and temporally and then highlighting the data variance. The PCT is mainly used as a mathematical means to enhance the defects' contrast, thus enabling retrieval of their shape and size. The results show that the proposed combined scheme is effective in processing multiple-size defects in a sandwich steel structure in real time (<30 Hz) and with full spatial coverage, without the need for a priori defect-free areas.
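The PCT stage can be sketched in a few lines: the sub-sequence is flattened, each pixel's time history is standardized, and an SVD yields spatial components whose low orders enhance defect contrast. This follows the common Rajic-style formulation and is an assumption about the exact variant used here.

```python
import numpy as np

def pct(sequence, n_components=3):
    """Principal Component Thermography on a thermogram sub-sequence.

    sequence: array (n_frames, height, width). Each pixel's time history is
    standardized, and an SVD highlights the data variance; the low-order
    spatial patterns enhance defect contrast.
    """
    n, h, w = sequence.shape
    A = sequence.reshape(n, h * w).astype(float)
    A = (A - A.mean(axis=0)) / (A.std(axis=0) + 1e-12)  # standardize pixels
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:n_components].reshape(n_components, h, w)  # spatial patterns

frames = np.random.rand(30, 64, 64)  # stand-in for a self-referenced sub-sequence
eofs = pct(frames)                   # inspect eofs[1], eofs[2] for defect shapes
```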
Yu, Huanzhou; Shimakawa, Ann; Hines, Catherine D. G.; McKenzie, Charles A.; Hamilton, Gavin; Sirlin, Claude B.; Brittain, Jean H.; Reeder, Scott B.
2011-01-01
Multipoint water–fat separation techniques rely on different water–fat phase shifts generated at multiple echo times to decompose water and fat. These methods therefore require complex source images and allow unambiguous separation of water and fat signals. However, complex-based water–fat separation methods are sensitive to phase errors in the source images, which may lead to clinically important errors. An alternative approach to quantifying fat is through "magnitude-based" methods that acquire multiecho magnitude images. Magnitude-based methods are insensitive to phase errors, but cannot estimate fat-fractions greater than 50%. In this work, we introduce a water–fat separation approach that combines the strengths of both complex and magnitude reconstruction algorithms. A magnitude-based reconstruction is applied after complex-based water–fat separation to remove the effect of phase errors. The results from the two reconstructions are then combined. We demonstrate that using this hybrid method, 0–100% fat-fraction can be estimated with improved accuracy at low fat-fractions. PMID:21695724
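A loose sketch of the hybrid idea, not the authors' exact algorithm: a linear complex fit provides initial water/fat estimates (and resolves the water/fat dominance ambiguity), and a magnitude-based refinement removes sensitivity to phase errors. The echo times, the single-peak fat model, and the omission of field-map and R2* terms are all simplifying assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

F_FAT_HZ = -420.0  # assumed single-peak fat shift (~ -3.4 ppm at 3 T)

def hybrid_fat_fraction(signal, te):
    """Hedged sketch of a hybrid complex+magnitude water-fat estimate."""
    c = np.exp(2j * np.pi * F_FAT_HZ * te)            # fat phasor at each echo
    # Complex step: linear least squares for (W, F), field map ignored here
    A = np.stack([np.ones_like(c), c], axis=1)
    wf, *_ = np.linalg.lstsq(A, signal, rcond=None)
    W0, F0 = np.abs(wf)
    # Magnitude step: refine against |signal|, which is immune to phase errors
    def resid(x):
        w, f = x
        return np.abs(w + f * c) - np.abs(signal)
    w, f = least_squares(resid, x0=[W0, F0], bounds=(0, np.inf)).x
    # Magnitude data alone cannot tell (w, f) from (f, w); use the complex fit
    if (F0 > W0) != (f > w):
        w, f = f, w
    return f / (w + f)  # fat fraction in [0, 1]

te = np.arange(6) * 1.2e-3                            # hypothetical echo times (s)
sig = (0.7 + 0.3 * np.exp(2j * np.pi * F_FAT_HZ * te)) * np.exp(1j * 0.5)
print(hybrid_fat_fraction(sig, te))                   # ~0.3 despite the phase term
```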
Orthodontic treatment in periodontal patients: a case report with 7 years follow-up.
Derton, Nicola; Derton, Roberto; Perini, Alessandro; Gracco, Antonio; Fornaciari, Paolo Andrea
2011-03-01
Tooth flaring of the anterior segment is often unesthetic and therefore a primary reason for combined orthodontic and periodontal treatment in adult patients with periodontal disease. Thus, a multidisciplinary approach is frequently chosen for these patients by a qualified dental team. A clinical case of an adult patient suffering from chronic periodontitis with horizontal bone loss in the anterior segment and consequent flaring of the anterior teeth is described. A combined approach was chosen: initially to improve and stabilize the periodontal situation via multiple scaling and root planing sessions with additional pharmacological therapy and, finally, to resolve the malocclusion by orthodontic treatment. At the end of treatment, bone resorption was stabilized, the vertical bone defect was improved and incisor flaring was absent. Follow-up at 7 years post-treatment confirmed the stability of the orthodontic and esthetic results. The correct combination of orthodontic and periodontal treatment may contribute efficaciously to eliminating the effects of chronic periodontitis in adult patients, as well as improving esthetic parameters. Copyright © 2011 CEO. Published by Elsevier Masson SAS. All rights reserved.
Vogt, Martin; Bajorath, Jürgen
2008-01-01
Bayesian classifiers are increasingly being used to distinguish active from inactive compounds and search large databases for novel active molecules. We introduce an approach to directly combine the contributions of property descriptors and molecular fingerprints in the search for active compounds that is based on a Bayesian framework. Conventionally, property descriptors and fingerprints are used as alternative features for virtual screening methods. Following the approach introduced here, probability distributions of descriptor values and fingerprint bit settings are calculated for active and database molecules and the divergence between the resulting combined distributions is determined as a measure of biological activity. In test calculations on a large number of compound activity classes, this methodology was found to consistently perform better than similarity searching using fingerprints and multiple reference compounds or Bayesian screening calculations using probability distributions calculated only from property descriptors. These findings demonstrate that there is considerable synergy between different types of property descriptors and fingerprints in recognizing diverse structure-activity relationships, at least in the context of Bayesian modeling.
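A naive-Bayes-style sketch of the combined scoring idea: per-descriptor Gaussian and per-bit Bernoulli log-likelihood ratios between actives and database compounds are summed into one score for a candidate molecule. The paper's formulation works with divergences between combined distributions; this is only a hedged approximation of the flavor.

```python
import numpy as np

def _norm_logpdf(x, mu, sd):
    return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))

def screen_score(x_desc, x_bits, actives, database):
    """Log-odds score combining property descriptors and fingerprint bits.

    actives/database: dicts with "desc" (n x d floats) and "bits" (n x b, 0/1).
    """
    mu_a, sd_a = actives["desc"].mean(0), actives["desc"].std(0) + 1e-9
    mu_d, sd_d = database["desc"].mean(0), database["desc"].std(0) + 1e-9
    score = (_norm_logpdf(x_desc, mu_a, sd_a) - _norm_logpdf(x_desc, mu_d, sd_d)).sum()
    pa = (actives["bits"].sum(0) + 1) / (len(actives["bits"]) + 2)    # Laplace
    pd = (database["bits"].sum(0) + 1) / (len(database["bits"]) + 2)
    score += (x_bits * np.log(pa / pd) + (1 - x_bits) * np.log((1 - pa) / (1 - pd))).sum()
    return score

rng = np.random.default_rng(0)  # synthetic data purely for illustration
act = {"desc": rng.normal(1, 1, (50, 3)), "bits": rng.integers(0, 2, (50, 16))}
db = {"desc": rng.normal(0, 1, (500, 3)), "bits": rng.integers(0, 2, (500, 16))}
print(screen_score(act["desc"][0], act["bits"][0], act, db))
```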
Phenome-driven disease genetics prediction toward drug discovery.
Chen, Yang; Li, Li; Zhang, Guo-Qiang; Xu, Rong
2015-06-15
Discerning genetic contributions to diseases not only enhances our understanding of disease mechanisms, but also leads to translational opportunities for drug discovery. Recent computational approaches incorporate disease phenotypic similarities to improve the prediction power of disease gene discovery. However, most current studies used only one data source of human disease phenotype. We present an innovative and generic strategy for combining multiple different data sources of human disease phenotype and predicting disease-associated genes from integrated phenotypic and genomic data. To demonstrate our approach, we explored a new phenotype database from biomedical ontologies and constructed Disease Manifestation Network (DMN). We combined DMN with mimMiner, which was a widely used phenotype database in disease gene prediction studies. Our approach achieved significantly improved performance over a baseline method, which used only one phenotype data source. In the leave-one-out cross-validation and de novo gene prediction analysis, our approach achieved the area under the curves of 90.7% and 90.3%, which are significantly higher than 84.2% (P < 10^-4) and 81.3% (P < 10^-12) for the baseline approach. We further demonstrated that our predicted genes have the translational potential in drug discovery. We used Crohn's disease as an example and ranked the candidate drugs based on the rank of drug targets. Our gene prediction approach prioritized druggable genes that are likely to be associated with Crohn's disease pathogenesis, and our rank of candidate drugs successfully prioritized the Food and Drug Administration-approved drugs for Crohn's disease. We also found literature evidence to support a number of drugs among the top 200 candidates. In summary, we demonstrated that a novel strategy combining unique disease phenotype data with system approaches can lead to rapid drug discovery. nlp. edu/public/data/DMN © The Author 2015. Published by Oxford University Press.
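The drug-ranking step described above can be sketched simply: each candidate drug inherits the best rank achieved by any of its target genes in the predicted disease-gene list. Gene and drug names below are purely illustrative.

```python
def rank_drugs(ranked_genes, drug_targets):
    """Rank candidate drugs by the best (smallest) rank of any target gene.

    ranked_genes: predicted disease genes, best first.
    drug_targets: dict drug -> set of target genes.
    """
    position = {g: i for i, g in enumerate(ranked_genes)}
    worst = len(ranked_genes)  # genes outside the list get the worst rank
    best_rank = {d: min((position.get(g, worst) for g in targets), default=worst)
                 for d, targets in drug_targets.items()}
    return sorted(best_rank, key=best_rank.get)

print(rank_drugs(["NOD2", "IL23R", "TNF"],
                 {"infliximab": {"TNF"}, "drugX": {"GENE9"}}))
# -> ['infliximab', 'drugX']
```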
Hook, Ch D; Samsonov, V V; Ublinskaya, A A; Kuvaeva, T M; Andreeva, E V; Gorbacheva, L Yu; Stoynova, N V
2016-11-01
Despite the abundance of genetic manipulation approaches, particularly for Escherichia coli, new techniques and increased flexibility in the application of existing techniques are required to address novel aims. The most widely used approaches for chromosome editing are based on bacteriophage site-specific and λRed/RecET-mediated homologous recombination. In the present study, these techniques were combined to develop a novel approach for in vivo cloning and targeted long-length chromosomal insertion. This approach permits direct λRed-mediated cloning of DNA fragments with lengths of 10 kb or greater from the E. coli chromosome into the plasmid vector pGL2, which carries the ori of pSC101, the ϕ80-attP site of phage ϕ80, and an excisable CmR marker bracketed by λ-attL/attR sites. In pGL2-based recombinant plasmids, the origin of replication can be eliminated in vitro via hydrolysis by SceI endonuclease and recircularization by DNA ligase. The resulting ori-less circular recombinant DNA can be used for targeted insertion of the cloned sequence into the chromosome at a selected site via ϕ80 phage-specific integrase-mediated recombination using the Dual-In/Out approach (Minaeva et al., 2008). At the final stage of chromosomal editing, the CmR marker can be excised from the chromosome through expression of the λint/xis genes. Notably, the desired fragment can be inserted as multiple copies in the chromosome by combining insertions at different sites in one strain using the P1 general transduction technique (Moore, 2011). The developed approach is useful for the construction of plasmidless, markerless recombinant strains for fundamental and industrial purposes. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Bersini, Simone; Gilardi, Mara; Arrigoni, Chiara; Talò, Giuseppe; Zamai, Moreno; Zagra, Luigi; Caiolfa, Valeria; Moretti, Matteo
2016-01-01
The generation of functional, vascularized tissues is a key challenge for both tissue engineering applications and the development of advanced in vitro models analyzing interactions among circulating cells, endothelium and organ-specific microenvironments. Since vascularization is a complex process guided by multiple synergic factors, it is critical to analyze the specific role that different experimental parameters play in the generation of physiological tissues. Our goals were to design a novel meso-scale model bridging the gap between microfluidic and macro-scale studies, and to screen, in a high-throughput fashion, the effects of multiple variables on the vascularization of bone-mimicking tissues. We investigated the influence of endothelial cell (EC) density (3-5 Mcells/ml), cell ratio among ECs, mesenchymal stem cells (MSCs) and osteo-differentiated MSCs (1:1:0, 10:1:0, 10:1:1), culture medium (endothelial, endothelial + angiopoietin-1, 1:1 endothelial/osteo), hydrogel type (100% fibrin, 60% fibrin + 40% collagen), and tissue geometry (2 × 2 × 2, 2 × 2 × 5 mm³). We optimized the geometry and oxygen gradient inside hydrogels through computational simulations, and we analyzed microvascular network features including total network length/area and vascular branch number/length. In particular, we employed the "Design of Experiment" statistical approach to identify key differences among experimental conditions. We combined the generation of 3D functional tissue units with fine control over the local microenvironment (e.g. oxygen gradients), and developed an effective strategy to enable the high-throughput screening of multiple experimental parameters. Our approach allowed us to identify synergic correlations among critical parameters driving microvascular network development within a bone-mimicking environment and could be translated to any vascularized tissue. Copyright © 2015 Elsevier Ltd. All rights reserved.
A multi-disciplinary approach for the integrated assessment of multiple risks in delta areas.
NASA Astrophysics Data System (ADS)
Sperotto, Anna; Torresan, Silvia; Critto, Andrea; Marcomini, Antonio
2016-04-01
The assessment of climate-change-related risks is notoriously difficult due to the complex and uncertain combinations of hazardous events that might happen, the multiplicity of physical processes involved, and the continuous changes and interactions of environmental and socio-economic systems. One important challenge lies in predicting and modelling cascades of natural and man-made hazard events which can be triggered by climate change, encompassing different spatial and temporal scales. Another regards the potentially difficult integration of environmental, social and economic disciplines in the multi-risk concept. Finally, effective interaction between scientists and stakeholders is essential to ensure that multi-risk knowledge is translated into efficient adaptation and management strategies. The assessment is even more complex at the scale of deltaic systems, which are particularly vulnerable to global environmental changes due to the fragile equilibrium between the presence of valuable natural ecosystems and relevant economic activities. Improving our capacity to assess the combined effects of multiple hazards (e.g. sea-level rise, storm surges, reduction in sediment load, local subsidence, saltwater intrusion) is therefore essential to identify timely opportunities for adaptation. A holistic multi-risk approach is here proposed to integrate terminology, metrics and methodologies from different research fields (i.e. environmental, social and economic sciences), thus creating shared knowledge areas to advance multi-risk assessment and management in delta regions. A first test of the approach, including the application of Bayesian network analysis for the assessment of impacts of climate change on key natural systems (e.g. wetlands, protected areas, beaches) and socio-economic activities (e.g. agriculture, tourism), is carried out in the Po river delta in Northern Italy. The approach is based on a bottom-up process involving local stakeholders early in different stages of the multi-risk assessment process (i.e. identification of objectives, collection of data, definition of risk thresholds and indicators). The results of the assessment will allow the development of multi-risk scenarios enabling the evaluation and prioritization of risk management and adaptation options under changing climate conditions.
Combining Search Engines for Comparative Proteomics
Tabb, David
2012-01-01
Many proteomics laboratories have found spectral counting to be an ideal way to recognize biomarkers that differentiate cohorts of samples. This approach assumes that proteins that differ in quantity between samples will generate different numbers of identifiable tandem mass spectra. Increasingly, researchers are employing multiple search engines to maximize the identifications generated from data collections. This talk evaluates four strategies to combine information from multiple search engines in comparative proteomics. The “Count Sum” model pools the spectra across search engines. The “Vote Counting” model combines the judgments from each search engine by protein. Two other models employ parametric and non-parametric analyses of protein-specific p-values from different search engines. We evaluated the four strategies in two different data sets. The ABRF iPRG 2009 study generated five LC-MS/MS analyses of “red” E. coli and five analyses of “yellow” E. coli. NCI CPTAC Study 6 generated five concentrations of Sigma UPS1 spiked into a yeast background. All data were identified with X!Tandem, Sequest, MyriMatch, and TagRecon. For both sample types, “Vote Counting” appeared to manage the diverse identification sets most effectively, yielding heightened discrimination as more search engines were added.
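A minimal sketch of the 'Vote Counting' strategy for one protein: each search engine votes on the direction of the spectral-count change between cohorts, and the votes are summed. The counts below are made up; the real evaluation also has to handle missing identifications and significance.

```python
def vote_counting(spectral_counts):
    """'Vote Counting' across search engines for one protein.

    spectral_counts: dict engine -> dict cohort ("A"/"B") -> spectral count.
    An engine votes +1 if cohort A has more spectra, -1 if fewer, 0 on ties;
    the summed vote indicates the direction and consistency of the change.
    """
    votes = 0
    for engine, counts in spectral_counts.items():
        if counts["A"] > counts["B"]:
            votes += 1
        elif counts["A"] < counts["B"]:
            votes -= 1
    return votes

counts = {"X!Tandem":  {"A": 12, "B": 4},
          "Sequest":   {"A": 9,  "B": 5},
          "MyriMatch": {"A": 10, "B": 10},
          "TagRecon":  {"A": 7,  "B": 3}}
print(vote_counting(counts))  # 3 of 4 engines vote "up"
```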
Digital equalization of time-delay array receivers on coherent laser communications.
Belmonte, Aniceto
2017-01-15
Field conjugation arrays use adaptive combining techniques on multi-aperture receivers to improve the performance of coherent laser communication links by mitigating the consequences of atmospheric turbulence on the down-converted coherent power. However, this requires complex receivers, as the optical signals collected by different apertures must be adaptively processed, co-phased, and scaled before they are combined. Here, we show that multiple apertures, coupled with optical delay lines, combine retarded versions of a signal at a single coherent receiver, which uses digital equalization to obtain diversity gain against atmospheric fading. Our analysis shows that, compared with field conjugation arrays, digital equalization of time-delay multi-aperture receivers is a simpler and more versatile approach to reducing atmospheric fading.
Kellman, Peter; Dyke, Christopher K.; Aletras, Anthony H.; McVeigh, Elliot R.; Arai, Andrew E.
2007-01-01
Regions of the body with long T1, such as cerebrospinal fluid (CSF), may create ghost artifacts on gadolinium-hyperenhanced images of myocardial infarction when inversion recovery (IR) sequences are used with a segmented acquisition. Oscillations in the transient approach to steady state for regions with long T1 may cause ghosts, with the number of ghosts being equal to the number of segments. B1-weighted phased-array combining provides an inherent degree of ghost artifact suppression because the ghost artifact is weighted less than the desired signal intensity by the coil sensitivity profiles. Example images are shown that illustrate the suppression of CSF ghost artifacts by the use of B1-weighted phased-array combining of multiple receiver coils. PMID:14755669
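The combining rule itself is compact; a sketch assuming known complex coil sensitivity maps:

```python
import numpy as np

def b1_weighted_combine(coil_images, coil_sens):
    """B1-weighted phased-array combination of multi-coil images.

    coil_images: complex array (n_coils, h, w) of per-coil images.
    coil_sens:   complex array (n_coils, h, w) of B1 sensitivity maps.
    Ghosts landing where the true signal is absent are down-weighted because
    the coil sensitivities are small there.
    """
    num = np.sum(np.conj(coil_sens) * coil_images, axis=0)
    den = np.sum(np.abs(coil_sens) ** 2, axis=0)
    return num / np.maximum(den, 1e-12)

rng = np.random.default_rng(0)  # synthetic stand-in data
images = rng.normal(size=(4, 128, 128)) + 1j * rng.normal(size=(4, 128, 128))
sens = rng.normal(size=(4, 128, 128)) + 1j * rng.normal(size=(4, 128, 128))
combined = b1_weighted_combine(images, sens)
```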
A hybrid structured-unstructured grid method for unsteady turbomachinery flow computations
NASA Technical Reports Server (NTRS)
Mathur, Sanjay R.; Madavan, Nateri K.; Rajagopalan, R. G.
1993-01-01
A hybrid grid technique for the solution of 2D, unsteady flows is developed. This technique is capable of handling complex, multiple component geometries in relative motion, such as those encountered in turbomachinery. The numerical approach utilizes a mixed structured-unstructured zonal grid topology along with modeling equations and solution methods that are most appropriate in the individual domains, therefore combining the advantages of both structured and unstructured grid techniques.
SC3 - consensus clustering of single-cell RNA-Seq data
Kiselev, Vladimir Yu.; Kirschner, Kristina; Schaub, Michael T.; Andrews, Tallulah; Yiu, Andrew; Chandra, Tamir; Natarajan, Kedar N; Reik, Wolf; Barahona, Mauricio; Green, Anthony R; Hemberg, Martin
2017-01-01
Single-cell RNA-seq (scRNA-seq) enables a quantitative cell-type characterisation based on global transcriptome profiles. We present Single-Cell Consensus Clustering (SC3), a user-friendly tool for unsupervised clustering which achieves high accuracy and robustness by combining multiple clustering solutions through a consensus approach. We demonstrate that SC3 is capable of identifying subclones based on the transcriptomes from neoplastic cells collected from patients. PMID:28346451
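A simplified sketch of the consensus idea: many k-means runs vote on whether two cells co-cluster, and the resulting consensus matrix is cut by hierarchical clustering. SC3 itself additionally varies distance measures, data transformations, and dimensionality ranges.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def consensus_cluster(X, k, n_runs=50, seed=0):
    """Repeated k-means -> co-clustering frequencies -> hierarchical cut."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    consensus = np.zeros((n, n))
    for _ in range(n_runs):
        centers = X[rng.choice(n, size=k, replace=False)]
        for _ in range(10):  # plain Lloyd iterations from a random start
            labels = ((X[:, None] - centers) ** 2).sum(axis=2).argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        consensus += labels[:, None] == labels[None, :]
    consensus /= n_runs
    # Average-linkage clustering of the consensus "distance" (1 - frequency)
    condensed = 1.0 - consensus[np.triu_indices(n, k=1)]
    return fcluster(linkage(condensed, method="average"), k, criterion="maxclust")

labels = consensus_cluster(np.random.rand(60, 20), k=3)
```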
Multimodal Approach to Testing the Acute Effects of Mild Traumatic Brain Injury (mTBI)
2015-03-01
included several key staff changes, a major instrument acquisition, and repairs and upgrades to the MEG, combined with substantial progress with patient... patients to non-head trauma controls in the first days after injury. Multiple modalities of behavioral, electrophysiological, and most strikingly, MEG changes were found. The MEG of all mTBI patients had delta activity in the frontal lobes that was absent in all controls. A scientific abstract on
Platelet lysate-based pro-angiogenic nanocoatings.
Oliveira, Sara M; Pirraco, Rogério P; Marques, Alexandra P; Santo, Vítor E; Gomes, Manuela E; Reis, Rui L; Mano, João F
2016-03-01
Human platelet lysate (PL) is a cost-effective and human source of autologous multiple and potent pro-angiogenic factors, such as vascular endothelial growth factor A (VEGF A), fibroblast growth factor b (FGF b) and angiopoietin-1. Nanocoatings previously characterized were prepared by layer-by-layer assembling incorporating PL with marine-origin polysaccharides and were shown to activate human umbilical vein endothelial cells (HUVECs). Within 20 h of incubation, the more sulfated coatings induced the HUVECs to form tube-like structures, accompanied by an increased expression of angiogenesis-associated genes, such as angiopoietin-1 and VEGF A. This may be a cost-effective approach to modify 2D/3D constructs to instruct angiogenic cells towards the formation of neo-vascularization, driven by multiple and synergistic stimulations from the PL combined with sulfated polysaccharides. The presence, or fast induction, of a stable and mature vasculature inside 3D constructs is crucial for new tissue formation and its viability. This has been one of the major tissue engineering challenges, limiting the dimensions of efficient tissue constructs. Many approaches based on cells, growth factors, 3D bioprinting and channel incorporation have been proposed. Herein, we explored a versatile technique, layer-by-layer assembling in combination with platelet lysate (PL), that is a cost-effective source of many potent pro-angiogenic proteins and growth factors. Results suggest that the combination of PL with sulfated polyelectrolytes might be used to introduce interfaces onto 2D/3D constructs with potential to induce the formation of cell-based tubular structures. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Estimating demographic parameters using a combination of known-fate and open N-mixture models
Schmidt, Joshua H.; Johnson, Devin S.; Lindberg, Mark S.; Adams, Layne G.
2015-01-01
Accurate estimates of demographic parameters are required to infer appropriate ecological relationships and inform management actions. Known-fate data from marked individuals are commonly used to estimate survival rates, whereas N-mixture models use count data from unmarked individuals to estimate multiple demographic parameters. However, a joint approach combining the strengths of both analytical tools has not been developed. Here we develop an integrated model combining known-fate and open N-mixture models, allowing the estimation of detection probability and recruitment, as well as the joint estimation of survival. We demonstrate our approach through both simulations and an applied example using four years of known-fate and pack count data for wolves (Canis lupus). Simulation results indicated that the integrated model reliably recovered parameters with no evidence of bias, and survival estimates were more precise under the joint model. Results from the applied example indicated that the marked sample of wolves was biased toward individuals with higher apparent survival rates than the unmarked pack mates, suggesting that joint estimates may be more representative of the overall population. Our integrated model is a practical approach for reducing bias while increasing precision and the amount of information gained from mark-resight data sets. We provide implementations in both the BUGS language and an R package.
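Schematically (the authors' exact parameterization may differ), the joint likelihood multiplies a binomial known-fate term for the marked individuals by an open N-mixture term for the pack counts, with the survival probability $S$ shared between the two components:

```latex
L(S, \gamma, p) =
\underbrace{\prod_{i=1}^{m} S^{\,x_i} (1 - S)^{\,1 - x_i}}_{\text{known fate (marked)}}
\times
\underbrace{\prod_{t} \sum_{N_t} \Pr(y_t \mid N_t, p)\, \Pr(N_t \mid N_{t-1}, S, \gamma)}_{\text{open } N\text{-mixture (counts)}},
```

with latent abundance evolving as $N_t = \mathrm{Binomial}(N_{t-1}, S) + \mathrm{Poisson}(\gamma)$, where $\gamma$ is recruitment and $p$ is detection probability.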
Technologies and Approaches to Elucidate and Model the Virulence Program of Salmonella.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDermott, Jason E.; Yoon, Hyunjin; Nakayasu, Ernesto S.
Salmonella is a primary cause of enteric diseases in a variety of animals. During its evolution into a pathogenic bacterium, Salmonella acquired an elaborate regulatory network that responds to multiple environmental stimuli within host animals and integrates them, resulting in fine regulation of the virulence program. The coordinated action by this regulatory network involves numerous virulence regulators, necessitating genome-wide profiling analysis to assess and combine efforts from multiple regulons. In this review we discuss recent high-throughput analytic approaches used to understand the regulatory network of Salmonella that controls virulence processes. Application of high-throughput analyses has generated a large amount of data and driven the development of computational approaches required for data integration. Therefore, we also cover computer-aided network analyses to infer regulatory networks, and demonstrate how genome-scale data can be used to construct regulatory and metabolic systems models of Salmonella pathogenesis. Genes that are coordinately controlled by multiple virulence regulators under infectious conditions are more likely to be important for pathogenesis. Thus, reconstructing the global regulatory network during infection or, at the very least, under conditions that mimic the host cellular environment not only provides a bird's eye view of Salmonella survival strategy in response to hostile host environments but also serves as an efficient means to identify novel virulence factors that are essential for Salmonella to accomplish systemic infection in the host.
Sevick, Mary Ann; Woolf, Kathleen; Mattoo, Aditya; Katz, Stuart D; Li, Huilin; St-Jules, David E; Jagannathan, Ram; Hu, Lu; Pompeii, Mary Lou; Ganguzza, Lisa; Li, Zhi; Sierra, Alex; Williams, Stephen K; Goldfarb, David S
2018-01-01
Patients with complex chronic diseases usually must make multiple lifestyle changes to limit and manage their conditions. Numerous studies have shown that education alone is insufficient for engaging people in lifestyle behavior change, and that theory-based behavioral approaches also are necessary. However, even the most motivated individual may have difficulty with making lifestyle changes because of the information complexity associated with multiple behavior changes. The goal of the current Healthy Hearts and Kidneys study was to evaluate different mobile health (mHealth)-delivered intervention approaches for engaging individuals with type 2 diabetes (T2D) and concurrent chronic kidney disease (CKD) in behavior changes. Participants were randomized to 1 of 4 groups, receiving: (1) behavioral counseling, (2) technology-based self-monitoring to reduce information complexity, (3) combined behavioral counseling and technology-based self-monitoring, or (4) baseline advice. We will determine the impact of randomization assignment on weight loss success and 24-hour urinary excretion of sodium and phosphorus. With this report we describe the study design, methods, and approaches used to assure information security for this ongoing clinical trial. ClinicalTrials.gov Identifier: NCT02276742. Copyright © 2017. Published by Elsevier Inc.
Merging information from multi-model flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions against which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
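One hedged way to write down the core idea (the paper's exact specification may differ): each model's flood estimate is treated as an exchangeable draw from a hypothetical population of models, so a shared discrepancy and an across-model spread are estimated jointly with the quantity of interest,

```latex
\hat{q}_m \sim \mathcal{N}\!\left(q + \delta_m,\; \sigma_m^2\right), \qquad
\delta_m \sim \mathcal{N}\!\left(\delta_0,\; \tau^2\right), \qquad m = 1, \dots, M,
```

where $q$ is the true flood magnitude, $\sigma_m^2$ the within-model estimation variance, $\delta_0$ the shared multi-model discrepancy, and $\tau^2$ the spread of the hypothetical complete set of models.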
Kim, Ki Hwan; Park, Sung-Hong
2017-04-01
The balanced steady-state free precession (bSSFP) MR sequence is frequently used in clinics, but is sensitive to off-resonance effects, which can cause banding artifacts. Often multiple bSSFP datasets are acquired at different phase cycling (PC) angles and then combined in a special way for banding artifact suppression. Many strategies of combining the datasets have been suggested for banding artifact suppression, but there are still limitations in their performance, especially when the number of phase-cycled bSSFP datasets is small. The purpose of this study is to develop a learning-based model to combine the multiple phase-cycled bSSFP datasets for better banding artifact suppression. The multilayer perceptron (MLP) is a feedforward artificial neural network consisting of three layers: input, hidden, and output. MLP models were trained on bSSFP datasets acquired from human brain and knee at 3T, separately for two and four PC angles. Banding-free bSSFP images were generated by maximum-intensity projection (MIP) of 8 or 12 phase-cycled datasets and were used as targets for training the output layer. The trained MLP models were applied to additional brain and knee datasets acquired with different scan parameters and also to multiple phase-cycled bSSFP functional MRI datasets acquired on rat brain at 9.4T, in comparison with the conventional MIP method. Simulations were also performed to validate the MLP approach. Both the simulations and human experiments demonstrated that MLP suppressed banding artifacts significantly, superior to MIP in both banding artifact suppression and SNR efficiency. MLP demonstrated superior performance over MIP for the 9.4T fMRI data as well, which was not used for training the models, while visually preserving the fMRI maps very well. Artificial neural networks are a promising technique for combining multiple phase-cycled bSSFP datasets for banding artifact suppression. Copyright © 2016 Elsevier Inc. All rights reserved.
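A rough sketch of this training setup (array shapes, library, and hyperparameters are assumptions, not the paper's): each voxel's phase-cycled magnitudes form one input sample, and the MIP of a larger set of cycles serves as the regression target.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical shapes: `cycles` holds 4 phase-cycled bSSFP magnitude
# images, shape (4, H, W); `target` is the 12-cycle MIP image (H, W).
def train_combiner(cycles, target):
    X = cycles.reshape(cycles.shape[0], -1).T   # one sample per voxel
    y = target.ravel()
    mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500)
    return mlp.fit(X, y)

def combine(mlp, cycles):
    """Apply the trained combiner to a new set of phase-cycled images."""
    X = cycles.reshape(cycles.shape[0], -1).T
    return mlp.predict(X).reshape(cycles.shape[1:])
```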
Interaction Analysis of Longevity Interventions Using Survival Curves.
Nowak, Stefan; Neidhart, Johannes; Szendro, Ivan G; Rzezonka, Jonas; Marathe, Rahul; Krug, Joachim
2018-01-06
A long-standing problem in ageing research is to understand how different factors contributing to longevity should be expected to act in combination under the assumption that they are independent. Standard interaction analysis compares the extension of mean lifespan achieved by a combination of interventions to the prediction under an additive or multiplicative null model, but neither model is fundamentally justified. Moreover, the target of longevity interventions is not mean life span but the entire survival curve. Here we formulate a mathematical approach for predicting the survival curve resulting from a combination of two independent interventions based on the survival curves of the individual treatments, and quantify interaction between interventions as the deviation from this prediction. We test the method on a published data set comprising survival curves for all combinations of four different longevity interventions in Caenorhabditis elegans. We find that interactions are generally weak even when the standard analysis indicates otherwise.
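The two null models the abstract refers to can be written for mean lifespans $\bar{T}$, with $0$ the control, $1$ and $2$ the single interventions, and $12$ their combination:

```latex
\text{additive:}\quad \bar{T}_{12} - \bar{T}_0 = (\bar{T}_1 - \bar{T}_0) + (\bar{T}_2 - \bar{T}_0),
\qquad
\text{multiplicative:}\quad \frac{\bar{T}_{12}}{\bar{T}_0} = \frac{\bar{T}_1}{\bar{T}_0} \cdot \frac{\bar{T}_2}{\bar{T}_0}.
```

The paper replaces these single-number predictions with a prediction for the entire combined survival curve.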
Yang, Yan; Geng, Chao; Li, Feng; Huang, Guan; Li, Xinyang
2017-10-30
A multi-aperture receiver with an optical combining architecture is an effective approach to overcome the effect of the turbulent atmosphere on the performance of free-space optical (FSO) communications, in which efficiently combining the multiple laser beams received by the sub-apertures is one of the key technologies. In this paper, we focus on the combining module based on fiber couplers, and propose all-fiber coherent beam combining (CBC) with two architectures using active phase locking. To validate the feasibility of the proposed combining module, corresponding experiments and simulations on the CBC of four laser beams are carried out. The experimental results show that the phase differences among the input beams can be compensated and the combining efficiency can be stably promoted by active phase locking in CBC with both of the two architectures. The simulation results show that the combining efficiency fluctuates when turbulent atmosphere is considered, and the effectiveness of the combining module decreases as the turbulence increases. We believe that the combining module proposed in this paper has great potential, and the results can provide useful guidance for researchers when building such a multi-aperture receiver with an optical combining architecture for FSO communication systems.
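To illustrate why active phase locking matters, a generic textbook relation (not the authors' model): the coherent combining efficiency of N equal-amplitude beams degrades quickly as residual piston phase errors grow.

```python
import numpy as np

rng = np.random.default_rng(0)

def combining_efficiency(phases):
    """eta = |<exp(i*phi)>|^2 over the beams; eta = 1 when all residual
    piston phases are equal, and degrades as they spread."""
    return np.abs(np.exp(1j * phases).mean(axis=-1)) ** 2

for sigma in (0.0, 0.2, 0.5, 1.0):                   # RMS residual phase error (rad)
    phis = rng.normal(0.0, sigma, size=(10000, 4))   # 4 combined beams
    print(f"sigma={sigma:.1f} rad -> mean eta={combining_efficiency(phis).mean():.3f}")
```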
Ensemble Clustering using Semidefinite Programming with Applications
Singh, Vikas; Mukherjee, Lopamudra; Peng, Jiming; Xu, Jinhui
2011-01-01
In this paper, we study the ensemble clustering problem, where the input is in the form of multiple clustering solutions. The goal of ensemble clustering algorithms is to aggregate the solutions into one solution that maximizes the agreement in the input ensemble. We obtain several new results for this problem. Specifically, we show that the notion of agreement under such circumstances can be better captured using a 2D string encoding rather than a voting strategy, which is common among existing approaches. Our optimization proceeds by first constructing a non-linear objective function which is then transformed into a 0–1 Semidefinite program (SDP) using novel convexification techniques. This model can be subsequently relaxed to a polynomial time solvable SDP. In addition to the theoretical contributions, our experimental results on standard machine learning and synthetic datasets show that this approach leads to improvements not only in terms of the proposed agreement measure but also the existing agreement measures based on voting strategies. In addition, we identify several new application scenarios for this problem. These include combining multiple image segmentations and generating tissue maps from multiple-channel Diffusion Tensor brain images to identify the underlying structure of the brain. PMID:21927539
Terrain modeling for microwave landing system
NASA Technical Reports Server (NTRS)
Poulose, M. M.
1991-01-01
A powerful analytical approach for evaluating the terrain effects on a microwave landing system (MLS) is presented. The approach combines a multiplate model with a powerful and exhaustive ray tracing technique and an accurate formulation for estimating the electromagnetic fields due to the antenna array in the presence of terrain. Both uniform theory of diffraction (UTD) and impedance UTD techniques have been employed to evaluate these fields. Innovative techniques are introduced at each stage to make the model versatile to handle most general terrain contours and also to reduce the computational requirement to a minimum. The model is applied to several terrain geometries, and the results are discussed.
The predictive information obtained by testing multiple software versions
NASA Technical Reports Server (NTRS)
Lee, Larry D.
1987-01-01
Multiversion programming is a redundancy approach to developing highly reliable software. In applications of this method, two or more versions of a program are developed independently by different programmers and the versions are combined to form a redundant system. One variation of this approach consists of developing a set of n program versions and testing the versions to predict the failure probability of a particular program or a system formed from a subset of the programs. The precision that might be obtained, and also the effect of programmer variability if predictions are made over repetitions of the process of generating different program versions, are examined.
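Under the idealized assumption of independent version failures, the failure probability of a majority-voting n-version system follows directly from the binomial distribution. A minimal sketch follows; the paper's estimation problem is more subtle precisely because real versions do not fail independently (programmer variability induces common faults).

```python
from math import comb

def majority_failure_prob(q, n=3):
    """P(majority of n independent versions fail on an input), each
    failing with probability q. Independence is an idealization:
    common faults across versions raise this figure in practice."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * q**k * (1 - q)**(n - k)
               for k in range(k_min, n + 1))

print(majority_failure_prob(0.01, n=3))   # ~3.0e-4 for a 2-of-3 system
```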
Enhanced disease characterization through multi network functional normalization in fMRI.
Çetin, Mustafa S; Khullar, Siddharth; Damaraju, Eswar; Michael, Andrew M; Baum, Stefi A; Calhoun, Vince D
2015-01-01
Conventionally, structural topology is used for spatial normalization during the pre-processing of fMRI. The co-existence of multiple intrinsic networks which can be detected in the resting brain is well-studied. Also, these networks exhibit temporal and spatial modulation during cognitive task vs. rest, which shows the existence of common spatial excitation patterns between these identified networks. Previous work (Khullar et al., 2011) has shown that structural and functional data may not have direct one-to-one correspondence, and functional activation patterns in a well-defined structural region can vary across subjects even for a well-defined functional task. The results of this study and the existence of neural activity patterns in multiple networks motivate us to investigate multiple resting-state networks as a single fusion template for functional normalization across multiple groups of subjects. We extend the previous approach (Khullar et al., 2011) by co-registering multiple groups of subjects (healthy controls and schizophrenia patients) and by utilizing multiple resting-state networks (instead of just one) as a single fusion template for functional normalization. In this paper we describe the initial steps toward using multiple resting-state networks as a single fusion template for functional normalization. A simple wavelet-based image fusion approach is presented in order to evaluate the feasibility of combining multiple functional networks. Our results showed improvements in both the significance of group statistics (healthy controls vs. schizophrenia patients) and the spatial extent of activation when multiple resting-state networks were applied as a single fusion template for functional normalization after the conventional structural normalization. Also, our results provided evidence that the improvement in the significance of group statistics led to better accuracy in classifying healthy controls and schizophrenia patients.
Biocultural approaches to well-being and sustainability indicators across scales.
Sterling, Eleanor J; Filardi, Christopher; Toomey, Anne; Sigouin, Amanda; Betley, Erin; Gazit, Nadav; Newell, Jennifer; Albert, Simon; Alvira, Diana; Bergamini, Nadia; Blair, Mary; Boseto, David; Burrows, Kate; Bynum, Nora; Caillon, Sophie; Caselle, Jennifer E; Claudet, Joachim; Cullman, Georgina; Dacks, Rachel; Eyzaguirre, Pablo B; Gray, Steven; Herrera, James; Kenilorea, Peter; Kinney, Kealohanuiopuna; Kurashima, Natalie; Macey, Suzanne; Malone, Cynthia; Mauli, Senoveva; McCarter, Joe; McMillen, Heather; Pascua, Pua'ala; Pikacha, Patrick; Porzecanski, Ana L; de Robert, Pascale; Salpeteur, Matthieu; Sirikolo, Myknee; Stege, Mark H; Stege, Kristina; Ticktin, Tamara; Vave, Ron; Wali, Alaka; West, Paige; Winter, Kawika B; Jupiter, Stacy D
2017-12-01
Monitoring and evaluation are central to ensuring that innovative, multi-scale, and interdisciplinary approaches to sustainability are effective. The development of relevant indicators for local sustainable management outcomes, and the ability to link these to broader national and international policy targets, are key challenges for resource managers, policymakers, and scientists. Sets of indicators that capture both ecological and social-cultural factors, and the feedbacks between them, can underpin cross-scale linkages that help bridge local and global scale initiatives to increase resilience of both humans and ecosystems. Here we argue that biocultural approaches, in combination with methods for synthesizing across evidence from multiple sources, are critical to developing metrics that facilitate linkages across scales and dimensions. Biocultural approaches explicitly start with and build on local cultural perspectives - encompassing values, knowledges, and needs - and recognize feedbacks between ecosystems and human well-being. Adoption of these approaches can encourage exchange between local and global actors, and facilitate identification of crucial problems and solutions that are missing from many regional and international framings of sustainability. Resource managers, scientists, and policymakers need to be thoughtful about not only what kinds of indicators are measured, but also how indicators are designed, implemented, measured, and ultimately combined to evaluate resource use and well-being. We conclude by providing suggestions for translating between local and global indicator efforts.
Data-collection strategy for challenging native SAD phasing.
Olieric, Vincent; Weinert, Tobias; Finke, Aaron D; Anders, Carolin; Li, Dianfan; Olieric, Natacha; Borca, Camelia N; Steinmetz, Michel O; Caffrey, Martin; Jinek, Martin; Wang, Meitian
2016-03-01
Recent improvements in data-collection strategies have pushed the limits of native SAD (single-wavelength anomalous diffraction) phasing, a method that uses the weak anomalous signal of light elements naturally present in macromolecules. These involve the merging of multiple data sets from either multiple crystals or from a single crystal collected in multiple orientations at a low X-ray dose. Both approaches yield data of high multiplicity while minimizing radiation damage and systematic error, thus ensuring accurate measurements of the anomalous differences. Here, the combined use of these two strategies is described to solve cases of native SAD phasing that were particular challenges: the integral membrane diacylglycerol kinase (DgkA) with a low Bijvoet ratio of 1% and the large 200 kDa complex of the CRISPR-associated endonuclease (Cas9) bound to guide RNA and target DNA crystallized in the low-symmetry space group C2. The optimal native SAD data-collection strategy based on systematic measurements performed on the 266 kDa multiprotein/multiligand tubulin complex is discussed.
Attention Modulates Spatial Precision in Multiple-Object Tracking.
Srivastava, Nisheeth; Vul, Ed
2016-01-01
We present a computational model of multiple-object tracking that makes trial-level predictions about the allocation of visual attention and the effect of this allocation on observers' ability to track multiple objects simultaneously. This model follows the intuition that increased attention to a location increases the spatial resolution of its internal representation. Using a combination of empirical and computational experiments, we demonstrate the existence of a tight coupling between cognitive and perceptual resources in this task: Low-level tracking of objects generates bottom-up predictions of error likelihood, and high-level attention allocation selectively reduces error probabilities in attended locations while increasing them at non-attended locations. Whereas earlier models of multiple-object tracking have predicted the big-picture relationship between stimulus complexity and response accuracy, our approach makes accurate predictions of both the macro-scale effect of target number and velocity on tracking difficulty and micro-scale variations in difficulty across individual trials and targets arising from the idiosyncratic within-trial interactions of targets and distractors. Copyright © 2016 Cognitive Science Society, Inc.
Estimation of gross land-use change and its uncertainty using a Bayesian data assimilation approach
NASA Astrophysics Data System (ADS)
Levy, Peter; van Oijen, Marcel; Buys, Gwen; Tomlinson, Sam
2018-03-01
We present a method for estimating land-use change using a Bayesian data assimilation approach. The approach provides a general framework for combining multiple disparate data sources with a simple model. This allows us to constrain estimates of gross land-use change with reliable national-scale census data, whilst retaining the detailed information available from several other sources. Eight different data sources, with three different data structures, were combined in our posterior estimate of land use and land-use change, and other data sources could easily be added in future. The tendency for observations to underestimate gross land-use change is accounted for by allowing for a skewed distribution in the likelihood function. The data structure produced has high temporal and spatial resolution, and is appropriate for dynamic process-based modelling. Uncertainty is propagated appropriately into the output, so we have a full posterior distribution of output and parameters. The data are available in the widely used netCDF file format from http://eidc.ceh.ac.uk/.
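A hedged formalization of the skewness device (the paper's exact likelihood is not given here): if observations $U^{\mathrm{obs}}$ of gross land-use change tend to underestimate the true change $U$, the observation error can be given a skew-normal form with a negative shape parameter,

```latex
U^{\mathrm{obs}} \sim \mathrm{SkewNormal}(\xi = U,\ \omega,\ \alpha), \qquad \alpha < 0,
```

so that most of the error mass lies below the truth while the location parameter still tracks $U$.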
Exploring Mouse Protein Function via Multiple Approaches.
Huang, Guohua; Chu, Chen; Huang, Tao; Kong, Xiangyin; Zhang, Yunhua; Zhang, Ning; Cai, Yu-Dong
2016-01-01
Although the number of available protein sequences is growing exponentially, functional protein annotations lag far behind. Therefore, accurate identification of protein functions remains one of the major challenges in molecular biology. In this study, we presented a novel approach to predict mouse protein functions. The approach was a sequential combination of a similarity-based approach, an interaction-based approach and a pseudo amino acid composition-based approach. The method achieved an accuracy of about 0.8450 for the 1st-order predictions in the leave-one-out and ten-fold cross-validations. For the results yielded by the leave-one-out cross-validation, although the similarity-based approach alone achieved an accuracy of 0.8756, it was unable to predict the functions of proteins with no homologues. Comparatively, the pseudo amino acid composition-based approach alone reached an accuracy of 0.6786. Although the accuracy was lower than that of the previous approach, it could predict the functions of almost all proteins, even proteins with no homologues. Therefore, the combined method balanced the advantages and disadvantages of both approaches to achieve efficient performance. Furthermore, the results yielded by the ten-fold cross-validation indicate that the combined method remains effective and stable when no close homologs are available. However, the accuracy of the predicted functions can only be determined according to known protein functions based on current knowledge. Many protein functions remain unknown. By exploring the functions of proteins for which the 1st-order predicted functions are wrong but the 2nd-order predicted functions are correct, the 1st-order wrongly predicted functions were shown to be closely associated with the genes encoding the proteins. The so-called wrongly predicted functions could also potentially be correct upon future experimental verification. Therefore, the accuracy of the presented method may be much higher in reality.
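The sequential combination can be pictured as a fallback chain. The interface below is hypothetical; the three predictors stand in for the similarity-, interaction-, and pseudo amino acid composition-based components described in the abstract:

```python
def predict_function(protein, predictors):
    """Hypothetical sequential combination: try the similarity-based
    predictor first, then the interaction-based one, then the PseAAC-
    based one. Each predictor returns None when it is inapplicable
    (e.g. no homologue found, no known interactions), which triggers
    the next fallback."""
    for predict in predictors:   # ordered: [similarity, interaction, pseaac]
        result = predict(protein)
        if result is not None:
            return result
    return None
```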
Bell, Christopher; Puttick, Simon; Rose, Stephen; Smith, Jye; Thomas, Paul; Dowson, Nicholas
2017-06-21
Imaging using more than one biological process using PET could be of great utility, but despite previously proposed approaches to dual-tracer imaging, it is seldom performed. The alternative of performing multiple scans is often infeasible for clinical practice or even in research studies. Dual-tracer PET scanning allows for multiple PET radiotracers to be imaged within the same imaging session. In this paper we describe our approach to utilise the basis pursuit method to aid in the design of dual-tracer PET imaging experiments, and later in separation of the signals. The advantage of this approach is that it does not require a compartment model architecture to be specified or even that both signals are distinguishable in all cases. This means the method for separating dual-tracer signals can be used for many feasible and useful combinations of biology or radiotracer, once an appropriate scanning protocol has been decided upon. Following a demonstration in separating the signals from two consecutively injected radionuclides in a controlled experiment, phantom and list-mode mouse experiments demonstrated the ability to test the feasibility of dual-tracer imaging protocols for multiple injection delays. Increases in variances predicted for kinetic macro-parameters V_D and K_I in brain and tumoral tissue were obtained when separating the synthetically combined data. These experiments confirmed previous work using other approaches that injection delays of 10-20 min ensured increases in variance were kept minimal for the test tracers used. On this basis, an actual dual-tracer experiment using a 20 min delay was performed using these radiotracers, with the kinetic parameters (V_D and K_I) extracted for each tracer in agreement with the literature. This study supports previous work that dual-tracer PET imaging can be accomplished provided certain constraints are adhered to. The utilisation of basis pursuit techniques, which removes the need to specify a model architecture, allows the feasibility of a range of imaging protocols to be investigated via simulation in a straightforward manner for a wide range of possible scenarios. The hope is that the ease of utilising this approach during feasibility studies and in practice removes any perceived technical barrier to performing dual-tracer imaging.
Multiple point statistical simulation using uncertain (soft) conditional data
NASA Astrophysics Data System (ADS)
Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou
2018-05-01
Geostatistical simulation methods have been used to quantify spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from covariance-based two-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, so that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited preferentially to less informed ones. The second approach involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integrating information about a reservoir model.
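One way to realize the preferential path idea (an illustrative sketch, not the authors' implementation): rank grid nodes by the entropy of their soft probabilities, so the best-informed nodes are visited first.

```python
import numpy as np

def preferential_path(soft_probs, rng=None):
    """Visit nodes in order of increasing entropy of their soft data,
    so better-informed nodes are simulated first. `soft_probs` has
    shape (n_nodes, n_categories); each row is a probability vector."""
    p = np.clip(soft_probs, 1e-12, 1.0)
    entropy = -(p * np.log(p)).sum(axis=1)
    rng = rng or np.random.default_rng()
    jitter = rng.uniform(0.0, 1e-9, entropy.shape)  # random tie-breaking
    return np.argsort(entropy + jitter)
```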
NASA Astrophysics Data System (ADS)
Diaz-Cano, Andres
Boron carbide (B4C) is the third hardest material after diamond and cubic boron nitride. Its unique combination of properties makes B4C a highly valuable material. With hardness values around 35 GPa, a high melting point of 2450°C, a density of 2.52 g/cm3, and high chemical inertness, boron carbide is used in severe-wear components such as cutting tools and sandblasting nozzles, in nuclear reactor control rods, and, in its most common application, armor. Producing complex-shaped ceramic components is challenging. The present research presents a new and novel approach to produce complex-shaped B4C components. The proposed approach allows forming to be done at room temperature and under very low forming pressures. Additive and binder concentrations are kept as low as possible, around 5 vol%, while ceramic loadings are maximized above 50 vol%. Given that the proposed approach uses water as the main solvent, drying of the pieces is simple and environmentally safe. The optimized formulation allows rheological properties to be tailored and adjusted to multiple processing approaches, including injection molding, casting, and additive manufacturing. Boron carbide samples were then pressureless sintered. Due to the highly covalent character of boron carbide, multiple sintering aids and techniques have been proposed in order to achieve high levels of densification. However, it is not possible to define a clear sintering methodology based on the literature. Thus, the present research developed a comprehensive study of the effect of multiple sintering aids on the densification of pressureless-sintered boron carbide. Relative densities above 90% were achieved, with hardness values above 30 GPa. The current research extends the uses and applications of boron carbide, and other ceramic systems, by providing a new approach to produce complex-shaped components with competitive properties.
Mi, Zhibao; Novitzky, Dimitri; Collins, Joseph F; Cooper, David KC
2015-01-01
The management of brain-dead organ donors is complex. The use of inotropic agents and replacement of depleted hormones (hormonal replacement therapy) is crucial for successful multiple organ procurement, yet the optimal hormonal replacement has not been identified, and the statistical adjustment to determine the best selection is not trivial. Traditional pair-wise comparisons between every pair of treatments, and multiple comparisons to all (MCA), are statistically conservative. Hsu's multiple comparisons with the best (MCB), adapted from Dunnett's multiple comparisons with control (MCC), has been used for selecting the best treatment based on continuous variables. We selected the best hormonal replacement modality for successful multiple organ procurement using a two-step approach. First, we estimated the predicted margins by constructing generalized linear models (GLM) or generalized linear mixed models (GLMM), and then we applied the multiple comparison methods to identify the best hormonal replacement modality, given that the testing of hormonal replacement modalities is independent. Based on 10-year data from the United Network for Organ Sharing (UNOS), among 16 hormonal replacement modalities, and using the 95% simultaneous confidence intervals, we found that the combination of thyroid hormone, a corticosteroid, antidiuretic hormone, and insulin was the best modality for multiple organ procurement for transplantation. PMID:25565890
NASA Astrophysics Data System (ADS)
Xu, Jie; Stockli, Daniel F.; Snedden, John W.
2017-10-01
Detrital zircon U-Pb analysis is an effective approach for investigating sediment provenance by relating crystallization age to potential crystalline source terranes. Studies of large passive margin basins, such as the Gulf of Mexico Basin, that have received sediment from multiple terranes with non-unique crystallization ages or sedimentary strata, benefit from additional constraints to better elucidate provenance interpretation. In this study, U-Pb and (U-Th)/He double dating analyses on single zircons from the lower Miocene sandstones in the northern Gulf of Mexico Basin reveal a detailed history of sediment source evolution. U-Pb age data indicate that most zircon originated from five major crystalline provinces, including the Western Cordillera Arc (<250 Ma), the Appalachian-Ouachita orogen (500-260 Ma), the Grenville (1300-950 Ma) orogen, the Mid-Continent Granite-Rhyolite (1500-1300 Ma), and the Yavapai-Mazatzal (1800-1600 Ma) terranes as well as sparse Pan-African (700-500 Ma) and Canadian Shield (>1800 Ma) terranes. Zircon (U-Th)/He ages record tectonic cooling and exhumation in the U.S. since the Mesoproterozoic related to the Grenville to Laramide Orogenies. The combined crystallization and cooling information from single zircon double dating can differentiate volcanic and plutonic zircons. Importantly, the U-Pb-He double dating approach allows for the differentiation between multiple possible crystallization-age sources on the basis of their subsequent tectonic evolution. In particular, for Grenville zircons that are present in all of the lower Miocene samples, four distinct zircon U-Pb-He age combinations are recognizable that can be traced back to four different possible sources. The integrated U-Pb and (U-Th)/He data eliminate some ambiguities and improve the provenance interpretation for the lower Miocene strata in the northern Gulf of Mexico Basin, and illustrate the applicability of this approach for other large-scale basins to reconstruct sediment provenance and dispersal patterns.
NASA Astrophysics Data System (ADS)
Hartung, Christine; Spraul, Raphael; Schuchert, Tobias
2017-10-01
Wide area motion imagery (WAMI) acquired by an airborne multicamera sensor enables continuous monitoring of large urban areas. Each image can cover regions of several square kilometers and contain thousands of vehicles. Reliable vehicle tracking in this imagery is an important prerequisite for surveillance tasks, but remains challenging due to low frame rate and small object size. Most WAMI tracking approaches rely on moving object detections generated by frame differencing or background subtraction. These detection methods fail when objects slow down or stop. Recent approaches for persistent tracking compensate for missing motion detections by combining a detection-based tracker with a second tracker based on appearance or local context. In order to avoid the additional complexity introduced by combining two trackers, we employ an alternative single tracker framework that is based on multiple hypothesis tracking and recovers missing motion detections with a classifier-based detector. We integrate an appearance-based similarity measure, merge handling, vehicle-collision tests, and clutter handling to adapt the approach to the specific context of WAMI tracking. We apply the tracking framework on a region of interest of the publicly available WPAFB 2009 dataset for quantitative evaluation; a comparison to other persistent WAMI trackers demonstrates state-of-the-art performance of the proposed approach. Furthermore, we analyze in detail the impact of different object detection methods and detector settings on the quality of the output tracking results. For this purpose, we choose four different motion-based detection methods that vary in detection performance and computation time to generate the input detections. As detector parameters can be adjusted to achieve different precision and recall performance, we combine each detection method with different detector settings that yield (1) high precision and low recall, (2) high recall and low precision, and (3) best f-score. Comparing the tracking performance achieved with all generated sets of input detections allows us to quantify the sensitivity of the tracker to different types of detector errors and to derive recommendations for detector and parameter choice.
Dose-finding design for multi-drug combinations
Wages, Nolan A; Conaway, Mark R; O'Quigley, John
2012-01-01
Background Most of the current designs used for Phase I dose finding trials in oncology will either involve only a single cytotoxic agent or will impose some implicit ordering among the doses. The goal of the studies is to estimate the maximum tolerated dose (MTD), the highest dose that can be administered with an acceptable level of toxicity. A key working assumption of these methods is the monotonicity of the dose–toxicity curve. Purpose Here we consider situations in which the monotonicity assumption may fail. These studies are becoming increasingly common in practice, most notably, in phase I trials that involve combinations of agents. Our focus is on studies where there exist pairs of treatment combinations for which the ordering of the probabilities of a dose-limiting toxicity cannot be known a priori. Methods We describe a new dose-finding design which can be used for multiple-drug trials and can be applied to this kind of problem. Our methods proceed by laying out all possible orderings of toxicity probabilities that are consistent with the known orderings among treatment combinations and allowing the continual reassessment method (CRM) to provide efficient estimates of the MTD within these orders. The design can be seen to simplify to the CRM when the full ordering is known. Results We study the properties of the design via simulations that provide comparisons to the Bayesian approach to partial orders (POCRM) of Wages, Conaway, and O'Quigley. The POCRM was shown to perform well when compared to other suggested methods for partial orders. Therefore, we compare our approach to it in order to assess the performance of the new design. Limitations A limitation concerns the number of possible orders. There are dose-finding studies with combinations of agents that can lead to a large number of possible orders. In this case, it may not be feasible to work with all possible orders. Conclusions The proposed design demonstrates the ability to effectively estimate MTD combinations in partially ordered dose-finding studies. Because it relaxes the monotonicity assumption, it can be considered a multivariate generalization of the CRM. Hence, it can serve as a link between single- and multiple-agent dose-finding trials. PMID:21652689
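For context, the CRM that the design builds on commonly uses a one-parameter power ('empiric') dose-toxicity model, and the partial-order extension runs this model within each candidate ordering $m$, weighting orders by their posterior plausibility. A schematic sketch (notation mine; details may differ from the POCRM):

```latex
\psi_m(d_i, a) = p_{m(i)}^{\exp(a)}, \qquad
\pi(m \mid \mathcal{D}) \;\propto\; \pi(m) \int L_m(a \mid \mathcal{D})\, g(a)\, da,
```

where $p_{m(i)}$ is the skeleton toxicity probability assigned to dose combination $d_i$ under ordering $m$, $a$ is the model parameter with prior $g$, and the MTD estimate is taken under the most plausible ordering(s).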
Top-pair production at the LHC through NNLO QCD and NLO EW
NASA Astrophysics Data System (ADS)
Czakon, Michał; Heymes, David; Mitov, Alexander; Pagani, Davide; Tsinikos, Ioannis; Zaro, Marco
2017-10-01
In this work we present for the first time predictions for top-quark pair differential distributions at the LHC at NNLO QCD accuracy and including EW corrections. For the latter we include not only contributions of $\mathcal{O}(\alpha_s^2 \alpha)$, but also those of order $\mathcal{O}(\alpha_s \alpha^2)$ and $\mathcal{O}(\alpha^3)$. Besides providing phenomenological predictions for all main differential distributions with stable top quarks, we also study the following issues. 1) The effect of the photon PDF on top-pair spectra: we find it to be strongly dependent on the PDF set used, especially for the top $p_T$ distribution. 2) The difference between the additive and multiplicative approaches for combining QCD and EW corrections: with our scale choice, we find relatively small differences between the central predictions, but reduced scale dependence within the multiplicative approach. 3) The potential effect from the radiation of heavy bosons on inclusive top-pair spectra: we find it to be, typically, negligible.
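Schematically (notation mine, not the paper's), with $\Delta\sigma^{\mathrm{EW}}$ the absolute EW correction in a given bin and $\delta_{\mathrm{EW}}$ its relative size, the two combinations read:

```latex
\sigma^{\mathrm{add}} = \sigma^{\mathrm{NNLO\,QCD}} + \Delta\sigma^{\mathrm{EW}},
\qquad
\sigma^{\mathrm{mult}} = \sigma^{\mathrm{NNLO\,QCD}} \left(1 + \delta_{\mathrm{EW}}\right),
```

so the multiplicative approach rescales the QCD prediction bin by bin, which is consistent with the reduced scale dependence the authors report for it.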
Chauhan, Rinki; Ravi, Janani; Datta, Pratik; Chen, Tianlong; Schnappinger, Dirk; Bassler, Kevin E.; Balázsi, Gábor; Gennaro, Maria Laura
2016-01-01
Accessory sigma factors, which reprogram RNA polymerase to transcribe specific gene sets, activate bacterial adaptive responses to noxious environments. Here we reconstruct the complete sigma factor regulatory network of the human pathogen Mycobacterium tuberculosis by an integrated approach. The approach combines identification of direct regulatory interactions between M. tuberculosis sigma factors in an E. coli model system, validation of selected links in M. tuberculosis, and extensive literature review. The resulting network comprises 41 direct interactions among all 13 sigma factors. Analysis of network topology reveals (i) a three-tiered hierarchy initiating at master regulators, (ii) high connectivity and (iii) distinct communities containing multiple sigma factors. These topological features are likely associated with multi-layer signal processing and specialized stress responses involving multiple sigma factors. Moreover, the identification of overrepresented network motifs, such as autoregulation and coregulation of sigma and anti-sigma factor pairs, provides structural information that is relevant for studies of network dynamics. PMID:27029515
Spatial resolution enhancement of satellite image data using fusion approach
NASA Astrophysics Data System (ADS)
Lestiana, H.; Sukristiyanti
2018-02-01
Object identification using remote sensing data is problematic when the spatial resolution does not match the object. The fusion approach is one method to solve this problem, improving object recognition and increasing object information by combining data from multiple sensors. Fusion images can be used to estimate environmental components that need to be monitored from multiple perspectives, such as evapotranspiration estimation, 3D ground-based characterisation, smart-city applications, urban environments, terrestrial mapping, and water vegetation. Fusion methods have made visible objects in land areas easy to recognize, and the richer object information over land has broadened the range of environmental components that can be estimated. The difficulty of recognizing invisible objects such as Submarine Groundwater Discharge (SGD), especially in tropical areas, might also be decreased by the fusion method. The low variability of such objects in sea surface temperature remains a challenge to be solved.
Aerodynamic analysis for aircraft with nacelles, pylons, and winglets at transonic speeds
NASA Technical Reports Server (NTRS)
Boppe, Charles W.
1987-01-01
A computational method has been developed to provide an analysis for complex realistic aircraft configurations at transonic speeds. Wing-fuselage configurations with various combinations of pods, pylons, nacelles, and winglets can be analyzed along with simpler shapes such as airfoils, isolated wings, and isolated bodies. The flexibility required for the treatment of such diverse geometries is obtained by using a multiple nested grid approach in the finite-difference relaxation scheme. Aircraft components (and their grid systems) can be added or removed as required. As a result, the computational method can be used in the same manner as a wind tunnel to study high-speed aerodynamic interference effects. The multiple grid approach also provides high boundary point density/cost ratio. High resolution pressure distributions can be obtained. Computed results are correlated with wind tunnel and flight data using four different transport configurations. Experimental/computational component interference effects are included for cases where data are available. The computer code used for these comparisons is described in the appendices.
Khotanlou, Hassan; Afrasiabi, Mahlagha
2012-10-01
This paper presents a new feature selection approach for automatically extracting multiple sclerosis (MS) lesions in three-dimensional (3D) magnetic resonance (MR) images. The presented method is applicable to different types of MS lesions. In this method, T1, T2, and fluid attenuated inversion recovery (FLAIR) images are first preprocessed. In the next phase, features effective for extracting MS lesions are selected using a genetic algorithm (GA). The fitness function of the GA is the Similarity Index (SI) of a support vector machine (SVM) classifier. The results obtained on different types of lesions were evaluated by comparison with manual segmentations. The algorithm was evaluated on 15 real 3D MR images using several measures. As a result, the SI between MS regions determined by the proposed method and radiologists was 87% on average. Experiments and comparisons with other methods show the effectiveness and efficiency of the proposed approach.
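A minimal sketch of the GA wrapper (illustrative; population size, operators, and the fitness wrapper are assumptions rather than the paper's settings). Here fitness(mask) would train the SVM on the masked feature set and return the resulting Similarity Index on validation data.

```python
import numpy as np

def ga_select(fitness, n_features, pop=20, gens=30, p_mut=0.05, seed=0):
    """Tiny binary GA for feature selection. Each individual is a 0/1
    mask over candidate features; `fitness(mask)` is assumed to wrap
    the SVM's Similarity Index on held-out data."""
    rng = np.random.default_rng(seed)
    population = rng.integers(0, 2, size=(pop, n_features))
    for _ in range(gens):
        scores = np.array([fitness(ind) for ind in population])
        parents = population[np.argsort(scores)[-(pop // 2):]]  # keep top half
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_features)                   # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_features) < p_mut               # bit-flip mutation
            child = child ^ flip.astype(child.dtype)
            children.append(child)
        population = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(ind) for ind in population])
    return population[np.argmax(scores)]
```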
Paradise, Jordan; Tisdale, Alison W; Hall, Ralph F; Kokkoli, Efrosini
2009-01-01
This article evaluates the oversight of drugs and medical devices by the U.S. Food and Drug Administration (FDA) using an integration of public policy, law, and bioethics approaches and employing multiple assessment criteria, including economic, social, safety, and technological. Criteria assessment and expert elicitation are combined with existing literature, case law, and regulations in an integrative historical case studies approach. We then use our findings as a tool to explore possibilities for effective oversight and regulatory mechanisms for nanobiotechnology. Section I describes oversight mechanisms for human drugs and medical devices and presents current nanotechnology products. Section II describes the results of expert elicitation research. Section III highlights key criteria and relates them to the literature and larger debate. We conclude with broad lessons for the oversight of nanobiotechnology informed by Sections I-III in order to provide useful analysis from multiple disciplines and perspectives to guide discussions regarding appropriate FDA oversight.
Promoting Healthy Outcomes Among Youth with Multiple Risks: Innovative Approaches
Greenberg, Mark T.; Lippold, Melissa A.
2015-01-01
Adolescent behavior problems such as substance use, antisocial behavior problems, and mental health problems have extremely high social costs and lead to overburdened mental health and juvenile justice systems in the United States and Europe. The prevalence of these problems is substantial, and at-risk youth often present with a combination of concerns. An understanding of risk and protective factors at multiple levels, including the child, family, peer, school, and community, has influenced intervention development. At the individual and family levels, the most effective and cost-effective programs work intensively with youth and their families or use individual and group cognitive-behavioral approaches. However, there is a paucity of careful studies of effective policies and programs in the juvenile justice system. Research is needed that focuses on adoption, financing, implementation, and sustainable use of evidence-based programs in public service systems. In addition, the field needs to understand better for whom current programs are most effective to create the next generation of more effective and efficient programs. PMID:23297659
Floden, Evan W; Tommaso, Paolo D; Chatzou, Maria; Magis, Cedrik; Notredame, Cedric; Chang, Jia-Ming
2016-07-08
The PSI/TM-Coffee web server performs multiple sequence alignment (MSA) of proteins by combining homology extension with a consistency-based alignment approach. Homology extension is performed with Position Specific Iterative (PSI) BLAST searches against a choice of redundant and non-redundant databases. The main novelty of this server is that it allows databases of reduced complexity to be used, making homology extension rapid. The server also offers transmembrane protein (TMP) reference databases, enabling even faster homology extension for this important category of proteins. Aside from an MSA, the server outputs a topological prediction for TMPs using the HMMTOP algorithm. Previous benchmarking has shown that this approach outperforms the most accurate alignment methods, such as MSAProbs, Kalign, PROMALS, MAFFT, ProbCons and PRALINE™. The web server is available at http://tcoffee.crg.cat/tmcoffee. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
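The consistency-based step that underlies T-Coffee-family aligners can be shown on a toy library. The scores below are fabricated for illustration; in the real pipeline they would come from pairwise alignments of the PSI-BLAST-extended profiles, and the sketch implements only the triplet extension, not the full progressive alignment.

```python
# Toy consistency transformation: support for pairing residue x with residue y
# is reinforced by every third sequence whose residues align well to both.
from collections import defaultdict

# Primary library: lib[(seqA, posA)][(seqB, posB)] = match score.
lib = defaultdict(dict)
def add(a, i, b, j, s):
    lib[(a, i)][(b, j)] = s
    lib[(b, j)][(a, i)] = s

add("A", 0, "B", 0, 10); add("A", 0, "C", 1, 8); add("B", 0, "C", 1, 9)
add("A", 1, "B", 2, 7)

def extended_score(x, y):
    """Primary score plus consistency support routed through third residues."""
    score = lib[x].get(y, 0)
    for z, sxz in lib[x].items():
        if z[0] not in (x[0], y[0]):             # z must lie in a third sequence
            score += min(sxz, lib[z].get(y, 0))  # weakest link on the x-z-y path
    return score

print(extended_score(("A", 0), ("B", 0)))  # 10 + min(8, 9) = 18
```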
Lin Wu, Fe-Lin; Wang, Jui; Ho, Wei; Chou, Chia-Hung; Wu, Yi-Jung; Choo, Dan-Wei; Wang, Yu-Wen; Chen, Po-Yu; Chien, Kuo-Liong; Lin, Zhen-Fang
2017-04-15
The clinical benefits of combining statins and ezetimibe in patients with acute coronary syndrome (ACS) were observed in a clinical trial. However, little is known about the effectiveness of statins with or without ezetimibe in patients with ACS and multiple comorbidities in real-world clinical practice. This nationwide population-based cohort study used the Taiwan National Health Insurance Research Database. A total of 212,110 patients with ACS who had been discharged after their first ACS events between 2006 and 2010 were enrolled. A propensity score matching approach was used to create matched cohorts that adjust for potential confounders, and Cox proportional hazards regressions were performed to estimate the risk of re-hospitalization for ACS and of revascularization. Patients in the statins-plus-ezetimibe group had a significantly lower risk of re-hospitalization for ACS (adjusted hazard ratio [HR]=0.64, 95% confidence interval [CI]: 0.60-0.69) and of revascularization (HR=0.69, 95% CI: 0.63-0.76) than those in the statins-alone group. Within the statins-plus-ezetimibe group, female patients had a lower risk of re-hospitalization for ACS than male patients, and patients without diabetes mellitus had a lower risk than patients with diabetes mellitus. In summary, patients with ACS and multiple comorbidities receiving combination therapy with statins and ezetimibe had a lower risk of re-hospitalization for ACS and of revascularization than those receiving statins alone, and significant interaction effects were observed between ezetimibe use, sex, and diabetes mellitus. Copyright © 2017 Elsevier B.V. All rights reserved.
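The two analysis steps, propensity score matching followed by Cox regression, can be outlined on synthetic data. The sketch below uses scikit-learn for the propensity model and the lifelines package for the Cox fit (library choices of ours, not the study's); the covariates, the greedy matching rule, and all numbers are placeholders rather than NHIRD fields.

```python
# Schematic propensity-score-matched Cox analysis on fabricated data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({"age": rng.normal(65, 10, n),
                   "diabetes": rng.integers(0, 2, n)})
df["ezetimibe"] = rng.binomial(1, 1 / (1 + np.exp(-(df.age - 65) / 10)))
df["time"] = rng.exponential(24 / (1 + 0.5 * df.ezetimibe), n)  # months to event
df["event"] = rng.binomial(1, 0.6, n)

# 1) Propensity score for receiving statin + ezetimibe.
ps = LogisticRegression().fit(df[["age", "diabetes"]], df.ezetimibe)
df["ps"] = ps.predict_proba(df[["age", "diabetes"]])[:, 1]

# 2) Greedy 1:1 nearest-neighbour matching on the propensity score (simplified).
treated = df[df.ezetimibe == 1].sort_values("ps")
control = df[df.ezetimibe == 0].sort_values("ps")
idx = np.searchsorted(control.ps.values, treated.ps.values).clip(0, len(control) - 1)
matched = pd.concat([treated, control.iloc[np.unique(idx)]])

# 3) Cox proportional hazards on the matched cohort.
cph = CoxPHFitter().fit(matched[["time", "event", "ezetimibe", "diabetes"]],
                        duration_col="time", event_col="event")
cph.print_summary()
```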
2013-01-01
Background: The synthesis of information across microarray studies has been performed either by combining the statistical results of individual studies (as in a mosaic) or by combining data from multiple studies into a large pool to be analyzed as a single data set (as in a melting pot of data). Specific issues of data heterogeneity across microarray studies, such as differences within and between labs or among experimental conditions, can lead to equivocal results in a melting-pot approach. Results: We applied statistical theory to determine the specific effect of different means and heteroskedasticity across 19 groups of microarray data on the sign and magnitude of gene-to-gene Pearson correlation coefficients obtained from the pool of 19 groups. We quantified the biases of the pooled coefficients and compared them to the biases of correlations estimated by an effect-size model. Mean differences across the 19 groups were the main factor determining the magnitude and sign of the pooled coefficients, which showed the largest bias as they approached ±1. Only heteroskedasticity across the pool of 19 groups resulted in less efficient estimates of correlation than a classical meta-analysis approach of combining correlation coefficients. These results were corroborated by simulation studies involving either mean differences or heteroskedasticity across a pool of N > 2 groups. Conclusions: Combining statistical results is best suited for synthesizing the correlation between expression profiles of a gene pair across several microarray studies. PMID:23822712
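The paper's central contrast is easy to reproduce in a few lines: group-specific mean shifts inflate correlations computed on pooled ("melting pot") data, while combining per-group correlations, here via Fisher's z-transform as a stand-in for the effect-size model, does not. The number of groups matches the paper's 19; the shift and sample sizes are arbitrary.

```python
# Pooled vs. meta-analytic correlation under group-specific mean differences.
import numpy as np

rng = np.random.default_rng(0)
groups, n = 19, 30
pooled_x, pooled_y, zs = [], [], []
for g in range(groups):
    shift = g * 2.0                       # group-specific mean difference
    x = rng.normal(shift, 1, n)           # two genes, uncorrelated within group
    y = rng.normal(shift, 1, n)
    pooled_x.append(x); pooled_y.append(y)
    r = np.corrcoef(x, y)[0, 1]
    zs.append(np.arctanh(r))              # Fisher z-transform

r_pooled = np.corrcoef(np.concatenate(pooled_x), np.concatenate(pooled_y))[0, 1]
r_meta = np.tanh(np.mean(zs))             # back-transformed mean of z's

print(f"pooled ('melting pot') r: {r_pooled:.2f}")  # near +1, despite zero
print(f"meta-analytic r:          {r_meta:.2f}")    # within-group correlation
```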
The potential for increased power from combining P-values testing the same hypothesis.
Ganju, Jitendra; Julie Ma, Guoguang
2017-02-01
The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. In practice, however, we usually have more than one candidate statistic for testing the null hypothesis of no treatment effect without knowing which is most powerful. Rather than relying on a single p-value, the p-values from multiple prespecified test statistics can be combined for inference, using combining functions such as Fisher's combination test or the minimum p-value. With randomization-based tests, the increase in power can be remarkable compared with a single test or with Simes's method. The method is versatile in that it also applies when the number of covariates exceeds the number of observations, and the gain in power is large enough to prefer combined p-values over a single p-value. Its limitations are that it does not provide an unbiased estimator of the treatment effect and does not apply when the model includes a treatment-by-covariate interaction.
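A minimal sketch of the approach, under our own choice of two prespecified statistics (a t-test and a Mann-Whitney test): compute Fisher's combining function on the observed p-values, then calibrate it against a randomization distribution obtained by permuting treatment labels.

```python
# Randomization-calibrated combination of p-values from two prespecified tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treat = rng.normal(0.4, 1, 50)
ctrl = rng.normal(0.0, 1, 50)

def pvals(t, c):
    """One-sided p-values for two prespecified statistics."""
    p1 = stats.ttest_ind(t, c, alternative="greater").pvalue
    p2 = stats.mannwhitneyu(t, c, alternative="greater").pvalue
    return np.array([p1, p2])

def fisher(p):          # Fisher's combining function: -2 * sum(log p)
    return -2 * np.sum(np.log(p))

obs = fisher(pvals(treat, ctrl))
pooled = np.concatenate([treat, ctrl])
null = []
for _ in range(2000):                     # randomization distribution
    rng.shuffle(pooled)
    null.append(fisher(pvals(pooled[:50], pooled[50:])))
print("combined p =", np.mean(np.array(null) >= obs))
```

For the independent-tests parametric case, scipy.stats.combine_pvalues implements Fisher's method directly; the randomization wrapper above is what keeps the combined test valid when the component statistics are correlated.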
Combined TRAF6 Targeting and Proteasome Blockade Has Anti-myeloma and Anti-Bone Resorptive Effects.
Chen, Haiming; Li, Mingjie; Sanchez, Eric; Wang, Cathy S; Lee, Tiffany; Soof, Camilia M; Casas, Christian E; Cao, Jasmin; Xie, Colin; Udd, Kyle A; DeCorso, Kevin; Tang, George Y; Spektor, Tanya M; Berenson, James R
2017-05-01
TNF receptor-associated factor 6 (TRAF6) has been implicated in polyubiquitin-mediated IL1R/TLR signaling through activation of IκB kinase (IKK) to regulate the NF-κB and JNK signaling pathways. Here, TRAF6 protein was determined to be overexpressed in bone marrow mononuclear cells (BMMCs) from patients with multiple myeloma. TRAF6 expression in BMMCs from patients with progressive disease is significantly elevated compared with individuals in complete remission, individuals with monoclonal gammopathy of undetermined significance, or healthy subjects. Furthermore, TRAF6 dominant-negative (TRAF6dn) peptides were constructed that specifically reduced TRAF6 signaling and activation of IKK. TRAF6dn not only reduced cellular growth but also increased apoptosis of multiple myeloma tumor cells in a concentration-dependent fashion. Because TRAF6 activates IKK through polyubiquitination, independent of proteasome activity, a TRAF6dn peptide was combined with the proteasome inhibitors bortezomib or carfilzomib to treat multiple myeloma. Importantly, targeting TRAF6 in the presence of proteasome inhibition enhanced anti-multiple myeloma effects and also decreased TLR/TRAF6/NF-κB-related signaling. Finally, TRAF6dn dose-dependently inhibited osteoclast formation from CD14+ monocytes induced with RANKL and M-CSF, and markedly reduced bone resorption in dentin pits. In all, these data demonstrate that blocking TRAF6 signaling has anti-multiple myeloma effects and reduces bone loss. Implications: The ability to target TRAF6 signaling and associated pathways in multiple myeloma suggests a promising new therapeutic approach. Mol Cancer Res; 15(5); 598-609. ©2017 American Association for Cancer Research.
Muller, Joséphine; Bolomsky, Arnold; Dubois, Sophie; Duray, Elodie; Stangelberger, Kathrin; Plougonven, Erwan; Lejeune, Margaux; Léonard, Angélique; Marty, Caroline; Hempel, Ute; Baron, Frédéric; Beguin, Yves; Cohen-Solal, Martine; Ludwig, Heinz; Heusschen, Roy; Caers, Jo
2018-05-10
Multiple myeloma bone disease is characterized by an uncoupling of bone remodeling in the multiple myeloma microenvironment, resulting in the development of lytic bone lesions. Most myeloma patients suffer from these bone lesions, which not only cause morbidity but also negatively impact survival. The development of novel therapies, ideally with a combined anti-resorptive and bone-anabolic effect, is of great interest because lesions persist under the current standard of care, even in patients in complete remission. We have previously shown that MELK plays a central role in proliferation-associated high-risk multiple myeloma and that its inhibition with OTSSP167 decreases tumor load. MELK inhibition in bone cells has not yet been explored, although some reports suggest that factors downstream of MELK stimulate osteoclast activity and inhibit osteoblast activity, making MELK inhibition a promising therapeutic approach. We therefore assessed the effect of OTSSP167 on bone cell activity and on the development of myeloma-induced bone disease. OTSSP167 inhibited osteoclast activity in vitro by decreasing progenitor viability as well as through a direct anti-resorptive effect on mature osteoclasts. In addition, OTSSP167 stimulated matrix deposition and mineralization by osteoblasts in vitro. This combined anti-resorptive and osteoblast-stimulating effect resulted in the complete prevention of lytic lesions and bone loss in myeloma-bearing mice, and immunohistomorphometric analyses corroborated our in vitro findings. In conclusion, OTSSP167 acts directly on myeloma-induced bone disease in addition to its anti-multiple myeloma effect, which warrants further clinical development of MELK inhibition in multiple myeloma. Copyright © 2018, Ferrata Storti Foundation.
Aptamer-conjugated nanoparticles for cancer cell detection.
Medley, Colin D; Bamrungsap, Suwussa; Tan, Weihong; Smith, Joshua E
2011-02-01
Aptamer-conjugated nanoparticles (ACNPs) have been used for a variety of applications, particularly dual nanoparticles for magnetic extraction and fluorescent labeling. In this type of assay, silica-coated magnetic nanoparticles and fluorophore-doped silica nanoparticles are conjugated to highly selective aptamers to detect and extract targeted cells from a variety of matrices. However, considerable improvements in selectivity and sensitivity are required for this two-particle assay to be useful in a clinical setting. To accomplish this, several parameters were investigated, including nanoparticle size, conjugation chemistry, the use of multiple aptamer sequences on a single nanoparticle, and the use of multiple nanoparticles carrying different aptamer sequences. After identifying the best-performing elements, the improvements to the assay's conditional parameters were combined, demonstrating the overall enhanced sensitivity and selectivity of the two-particle assay under this multiple-aptamer approach, a critical step in the advancement of the technique.