Sample records for "outperforms previously published"

  1. Optimizing the Learning Order of Chinese Characters Using a Novel Topological Sort Algorithm

    PubMed Central

    Wang, Jinzhao

    2016-01-01

    We present a novel algorithm for optimizing the order in which Chinese characters are learned, one that incorporates the benefits of learning them in order of usage frequency and in order of their hierarchical structural relationships. We show that our work outperforms previously published orders and algorithms. Our algorithm is applicable to any scheduling task where nodes have intrinsic differences in importance and must be visited in topological order. PMID:27706234
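The scheduling task described above (visit nodes in topological order, always preferring the most important node currently available) can be sketched as a priority-driven variant of Kahn's algorithm. This is an illustrative reconstruction, not the authors' published algorithm; the node names and weights are made up.

```python
import heapq

def weighted_topo_order(nodes, edges, weight):
    """Topological sort that, among the currently available nodes,
    always emits the one with the highest intrinsic weight (e.g.
    character usage frequency). `edges` maps each node to the nodes
    that depend on it (component -> compound character)."""
    indegree = {n: 0 for n in nodes}
    for u in edges:
        for v in edges[u]:
            indegree[v] += 1
    # max-heap via negated weights; only dependency-free nodes enter
    ready = [(-weight[n], n) for n in nodes if indegree[n] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, u = heapq.heappop(ready)
        order.append(u)
        for v in edges.get(u, ()):
            indegree[v] -= 1
            if indegree[v] == 0:
                heapq.heappush(ready, (-weight[v], v))
    return order

# "b" is most frequent and has no prerequisites, so it comes first;
# "c" must wait for its structural component "a".
print(weighted_topo_order(["a", "b", "c"],
                          {"a": ["c"]},
                          {"a": 1, "b": 5, "c": 9}))
```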

  2. Improving the efficiency of a user-driven learning system with reconfigurable hardware. Application to DNA splicing.

    PubMed

    Lemoine, E; Merceron, D; Sallantin, J; Nguifo, E M

    1999-01-01

    This paper describes a new approach to problem solving that splits a problem's component parts between software and hardware. Our main idea arises from the combination of two previously published works. The first proposed a conceptual environment for concept modelling in which the machine and the human expert interact. The second reported an algorithm based on a reconfigurable hardware system that outperforms any previously published genetic database scanning hardware or algorithm. Here we show how efficient the interaction between the machine and the expert becomes when the concept modelling is based on a reconfigurable hardware system: their cooperation is achieved at real-time interaction speed. The designed system has been partially applied to the recognition of primate splice junction sites in genetic sequences.

  3. Deep learning improves prediction of CRISPR-Cpf1 guide RNA activity.

    PubMed

    Kim, Hui Kwon; Min, Seonwoo; Song, Myungjae; Jung, Soobin; Choi, Jae Woo; Kim, Younggwang; Lee, Sangeun; Yoon, Sungroh; Kim, Hyongbum Henry

    2018-03-01

    We present two algorithms to predict the activity of AsCpf1 guide RNAs. Indel frequencies for 15,000 target sequences were used in a deep-learning framework based on a convolutional neural network to train Seq-deepCpf1. We then incorporated chromatin accessibility information to create the better-performing DeepCpf1 algorithm for cell lines for which such information is available, and show that both algorithms outperform previous machine learning algorithms on our own and published datasets.

  4. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.

    PubMed

    Zazo, Ruben; Lozano-Diez, Alicia; Gonzalez-Dominguez, Javier; Toledano, Doroteo T; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vectors and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end LSTM RNN system running on limited computational resources (a single GPU) that outperforms a reference i-vector system on a subset of the NIST Language Recognition Evaluation (8 target languages, 3s task) by up to 26%. This result is in line with previously published research using proprietary LSTM implementations and huge computational resources, which made those former results hard to reproduce. Further, we extend those previous experiments to model unseen languages (out-of-set, OOS, modeling), which is crucial in real applications. Results show that an LSTM RNN with OOS modeling is able to detect these languages and generalizes robustly to unseen OOS languages. Finally, we also analyze the effect of even more limited test data (from 2.25s to 0.1s), showing that with as little as 0.5s an accuracy of over 50% can be achieved.

  5. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks

    PubMed Central

    Zazo, Ruben; Lozano-Diez, Alicia; Gonzalez-Dominguez, Javier; T. Toledano, Doroteo; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vectors and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end LSTM RNN system running on limited computational resources (a single GPU) that outperforms a reference i-vector system on a subset of the NIST Language Recognition Evaluation (8 target languages, 3s task) by up to 26%. This result is in line with previously published research using proprietary LSTM implementations and huge computational resources, which made those former results hard to reproduce. Further, we extend those previous experiments to model unseen languages (out-of-set, OOS, modeling), which is crucial in real applications. Results show that an LSTM RNN with OOS modeling is able to detect these languages and generalizes robustly to unseen OOS languages. Finally, we also analyze the effect of even more limited test data (from 2.25s to 0.1s), showing that with as little as 0.5s an accuracy of over 50% can be achieved. PMID:26824467

  6. Classifying medical relations in clinical text via convolutional neural networks.

    PubMed

    He, Bin; Guan, Yi; Dai, Rui

    2018-05-16

    Deep learning research on relation classification has achieved solid performance in the general domain. This study proposes a convolutional neural network (CNN) architecture with a multi-pooling operation for medical relation classification on clinical records and explores a loss function with a category-level constraint matrix. Experiments using the 2010 i2b2/VA relation corpus demonstrate that these models, which do not depend on any external features, outperform previous single-model methods, and that our best model is competitive with the existing ensemble-based method. Copyright © 2018. Published by Elsevier B.V.

  7. Comparison of DNA preservation methods for environmental bacterial community samples.

    PubMed

    Gray, Michael A; Pratte, Zoe A; Kellogg, Christina A

    2013-02-01

    Field collection of environmental samples (for example, corals) for molecular microbial analyses presents distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO-EDTA-salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  8. Treatment planning for spinal radiosurgery : A competitive multiplatform benchmark challenge.

    PubMed

    Moustakis, Christos; Chan, Mark K H; Kim, Jinkoo; Nilsson, Joakim; Bergman, Alanah; Bichay, Tewfik J; Palazon Cano, Isabel; Cilla, Savino; Deodato, Francesco; Doro, Raffaela; Dunst, Jürgen; Eich, Hans Theodor; Fau, Pierre; Fong, Ming; Haverkamp, Uwe; Heinze, Simon; Hildebrandt, Guido; Imhoff, Detlef; de Klerck, Erik; Köhn, Janett; Lambrecht, Ulrike; Loutfi-Krauss, Britta; Ebrahimi, Fatemeh; Masi, Laura; Mayville, Alan H; Mestrovic, Ante; Milder, Maaike; Morganti, Alessio G; Rades, Dirk; Ramm, Ulla; Rödel, Claus; Siebert, Frank-Andre; den Toom, Wilhelm; Wang, Lei; Wurster, Stefan; Schweikard, Achim; Soltys, Scott G; Ryu, Samuel; Blanck, Oliver

    2018-05-25

    To investigate the quality of spinal radiosurgery treatment plans derived from different planning and delivery systems, including robotic delivery and intensity-modulated arc therapy (IMAT) approaches. Multiple centers with identical systems were used to reduce bias from individual planning ability. The study used a series of three complex spine lesions to maximize the difference in plan quality among the various approaches. Internationally recognized experts in the field of treatment planning and spinal radiosurgery from 12 centers with various treatment planning systems participated. For a complex spinal lesion, the results were compared against a previously published benchmark plan derived for CyberKnife radiosurgery (CKRS) using circular cones only. For two additional cases, one with multiple small lesions infiltrating three vertebrae and one with a single vertebral lesion treated with an integrated boost, the results were compared against a benchmark plan generated using a best-practice guideline for CKRS. All plans were rated based on a previously established ranking system. All 12 centers reached equality with (n = 4) or outperformed (n = 8) the benchmark plan. For the multiple-lesion and single-vertebra plans, only 5 and 3 of the 12 centers, respectively, reached equality or outperformed the best-practice benchmark plan. However, the absolute differences in target and critical-structure dosimetry were small and strongly planner-dependent rather than system-dependent. Overall, gantry-based IMAT with simple planning techniques (two coplanar arcs) produced faster treatments and significantly outperformed static-gantry intensity-modulated radiation therapy (IMRT) and multileaf collimator (MLC) or non-MLC CKRS plan quality regardless of the system (mean rank out of 4: 1.2 vs. 3.1, p = 0.002). High plan quality for complex spinal radiosurgery was achieved among all systems and all participating centers in this planning challenge. This study concludes that simple IMAT techniques can generate significantly better plan quality than previously established CKRS benchmarks.

  9. Advanced tools for the analysis of protein phosphorylation in yeast mitochondria.

    PubMed

    Walter, Corvin; Gonczarowska-Jorge, Humberto; Sickmann, Albert; Zahedi, René P; Meisinger, Chris; Schmidt, Oliver

    2018-05-24

    The biochemical analysis of protein phosphorylation in mitochondria lags behind that of cytosolic signaling events. One reason is the poor stability of many phosphorylation sites during common isolation procedures for mitochondria. We present here an optimized, fast protocol for the purification of yeast mitochondria that greatly increases recovery of phosphorylated mitochondrial proteins. Moreover, we describe improved protocols for the biochemical analysis of mitochondrial protein phosphorylation by Zn2+-Phos-tag electrophoresis under both denaturing and, for the first time, native conditions, and demonstrate that they outperform previously applied methods. Copyright © 2018. Published by Elsevier Inc.

  10. The role of socio-communicative rearing environments in the development of social and physical cognition in apes.

    PubMed

    Russell, Jamie L; Lyn, Heidi; Schaeffer, Jennifer A; Hopkins, William D

    2011-11-01

    The cultural intelligence hypothesis (CIH) claims that humans' advanced cognition is a direct result of human culture and that children are uniquely specialized to absorb and utilize this cultural experience (Tomasello, 2000). Comparative data demonstrating that 2.5-year-old human children outperform apes on measures of social cognition but not on measures of physical cognition support this claim (Herrmann et al., 2007). However, the previous study failed to control for rearing when comparing these two species. Specifically, the human children were raised in a human culture whereas the apes were raised in standard sanctuary settings. To further explore the CIH, here we compared performance on multiple measures of social and physical cognition between a group of standard-reared apes raised in conditions typical of zoo and biomedical laboratory settings and apes reared in an enculturated, socio-communicatively rich environment. Overall, the enculturated apes significantly outperformed their standard-reared counterparts on the cognitive tasks, and this was particularly true for measures of communication. Furthermore, the performance of the enculturated apes was very similar to previously reported data from 2.5-year-old children. We conclude that apes who are reared in a human-like, socio-communicatively rich environment develop superior communicative abilities compared to apes reared in standard laboratory settings, which supports some assumptions of the cultural intelligence hypothesis. © 2011 Blackwell Publishing Ltd.

  11. A Multi-Start Evolutionary Local Search for the Two-Echelon Location Routing Problem

    NASA Astrophysics Data System (ADS)

    Nguyen, Viet-Phuong; Prins, Christian; Prodhon, Caroline

    This paper presents a new hybrid metaheuristic combining a greedy randomized adaptive search procedure (GRASP) with an evolutionary/iterated local search (ELS/ILS), using a tabu list, to solve the two-echelon location routing problem (LRP-2E). The GRASP uses three constructive heuristics in turn, followed by local search, to generate the initial solutions. From a GRASP solution, an intensification strategy is carried out by dynamically alternating between ELS and ILS. In this phase, each child is obtained by mutation and evaluated through a giant-tour splitting procedure followed by a local search. The tabu list, defined by two characteristics of a solution (total cost and number of trips), is used to avoid searching a space already explored. The results show that our metaheuristic clearly outperforms all previously published methods on LRP-2E benchmark instances. Furthermore, it is competitive with the best metaheuristic published for the single-echelon LRP.
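The "giant-tour splitting procedure" mentioned above is a standard building block in route-first, cluster-second metaheuristics: a permutation of all customers is optimally cut into capacity-feasible trips via a shortest path on an auxiliary DAG (Prins-style split). The sketch below is a generic illustration under assumed inputs (a `dist` function and a `demand` map), not the paper's exact LRP-2E implementation.

```python
def split_giant_tour(tour, demand, capacity, dist, depot=0):
    """Cut a giant tour (permutation of customers, ignoring capacity)
    into feasible vehicle trips of minimum total distance.
    best[i] holds the cheapest cost of serving tour[:i]."""
    n = len(tour)
    INF = float("inf")
    best = [0.0] + [INF] * n
    pred = [0] * (n + 1)
    for i in range(n):                 # candidate trip starts at tour[i]
        load, trip_cost = 0, 0.0
        for j in range(i, n):          # candidate trip ends at tour[j]
            load += demand[tour[j]]
            if load > capacity:
                break
            if j == i:                 # depot -> tour[i] -> depot
                trip_cost = dist(depot, tour[i]) + dist(tour[i], depot)
            else:                      # extend the trip by tour[j]
                trip_cost += (dist(tour[j - 1], tour[j])
                              + dist(tour[j], depot)
                              - dist(tour[j - 1], depot))
            if best[i] + trip_cost < best[j + 1]:
                best[j + 1] = best[i] + trip_cost
                pred[j + 1] = i
    trips, j = [], n                   # recover the optimal cuts
    while j > 0:
        i = pred[j]
        trips.append(tour[i:j])
        j = i
    return best[n], trips[::-1]
```

With customers on a line at coordinates 1 and 2 (depot at 0) and unit demands, capacity 2 yields the single trip `[1, 2]` at cost 4, while capacity 1 forces two separate trips at cost 6.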

  12. Does Cognitive Behavioral Therapy for Youth Anxiety Outperform Usual Care in Community Clinics? An Initial Effectiveness Test

    ERIC Educational Resources Information Center

    Southam-Gerow, Michael A.; Weisz, John R.; Chu, Brian C.; McLeod, Bryce D.; Gordis, Elana B.; Connor-Smith, Jennifer K.

    2010-01-01

    Objective: Most tests of cognitive behavioral therapy (CBT) for youth anxiety disorders have shown beneficial effects, but these have been efficacy trials with recruited youths treated by researcher-employed therapists. One previous (nonrandomized) trial in community clinics found that CBT did not outperform usual care (UC). The present study used…

  13. Translation of Genotype to Phenotype by a Hierarchy of Cell Subsystems.

    PubMed

    Yu, Michael Ku; Kramer, Michael; Dutkowski, Janusz; Srivas, Rohith; Licon, Katherine; Kreisberg, Jason; Ng, Cherie T; Krogan, Nevan; Sharan, Roded; Ideker, Trey

    2016-02-24

    Accurately translating genotype to phenotype requires accounting for the functional impact of genetic variation at many biological scales. Here we present a strategy for genotype-phenotype reasoning based on existing knowledge of cellular subsystems. These subsystems and their hierarchical organization are defined by the Gene Ontology or a complementary ontology inferred directly from previously published datasets. Guided by the ontology's hierarchical structure, we organize genotype data into an "ontotype," that is, a hierarchy of perturbations representing the effects of genetic variation at multiple cellular scales. The ontotype is then interpreted using logical rules generated by machine learning to predict phenotype. This approach substantially outperforms previous, non-hierarchical methods for translating yeast genotype to cell growth phenotype, and it accurately predicts the growth outcomes of two new screens of 2,503 double gene knockouts impacting DNA repair or nuclear lumen. Ontotypes also generalize to larger knockout combinations, setting the stage for interpreting the complex genetics of disease.
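The core data transformation described above, turning a set of perturbed genes into per-term perturbation counts over an ontology hierarchy, can be sketched as a recursive aggregation over the DAG. This is a simplified illustration of the "ontotype" idea with invented term and gene names, not the authors' code.

```python
def build_ontotype(perturbed, annotations, children):
    """Map a genotype (set of perturbed genes) to an 'ontotype':
    for every ontology term, the number of perturbed genes annotated
    at or below that term in the hierarchy (a DAG).
    annotations: term -> set of directly annotated genes
    children:    term -> list of child terms"""
    memo = {}

    def genes_below(term):
        # union of genes annotated to the term or any descendant,
        # memoized so shared subtrees in the DAG are visited once
        if term not in memo:
            gs = set(annotations.get(term, ()))
            for c in children.get(term, ()):
                gs |= genes_below(c)
            memo[term] = gs
        return memo[term]

    terms = set(annotations) | set(children)
    return {t: len(genes_below(t) & perturbed) for t in terms}
```

A double knockout hitting one gene under each of two sibling subsystems perturbs each subsystem once and their common parent twice, which is exactly the multi-scale signal the abstract describes feeding into the phenotype predictor.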

  14. Executive functions in men and postmenopausal women.

    PubMed

    Castonguay, Nathalie; Lussier, Maxime; Bugaiska, Aurélia; Lord, Catherine; Bherer, Louis

    2015-01-01

    This study was designed to assess sex differences in executive functions in older adults (55-65 years old) and to examine the influence of hormone therapy (HT) in postmenopausal women. We assessed task performance in memory, visuospatial, and executive functions in 29 women using HT, 29 women who never used HT, and 30 men. Men outperformed never-users in task switching and updating. HT users outperformed never-users in updating. HT users outperformed never-users and men in visual divided attention. The present study supports previous findings that sex and HT impact cognition and brings new insights into sex- and HT-related differences in executive functions.

  15. Tactile Acuity in the Blind: A Closer Look Reveals Superiority over the Sighted in Some but Not All Cutaneous Tasks

    ERIC Educational Resources Information Center

    Alary, Flamine; Duquette, Marco; Goldstein, Rachel; Chapman, C. Elaine; Voss, Patrice; La Buissonniere-Ariza, Valerie; Lepore, Franco

    2009-01-01

    Previous studies have shown that blind subjects may outperform the sighted on certain tactile discrimination tasks. We recently showed that blind subjects outperformed the sighted in a haptic 2D-angle discrimination task. The purpose of this study was to compare the performance of the same blind (n = 16) and sighted (n = 17, G1) subjects in three…

  16. Bilingualism and increased attention to speech: Evidence from event-related potentials.

    PubMed

    Kuipers, Jan Rouke; Thierry, Guillaume

    2015-10-01

    A number of studies have shown that from an early age, bilinguals outperform their monolingual peers on executive control tasks. We previously found that bilingual children and adults also display greater attention to unexpected language switches within speech. Here, we investigated the effect of a bilingual upbringing on speech perception in one language. We recorded monolingual and bilingual toddlers' event-related potentials (ERPs) to spoken words preceded by pictures. Words matching the picture prime elicited an early frontal positivity in bilingual participants only, whereas later ERP amplitudes associated with semantic processing did not differ between groups. These results add to the growing body of evidence that bilingualism increases overall attention during speech perception whilst semantic integration is unaffected. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Fast and accurate estimation of the covariance between pairwise maximum likelihood distances.

    PubMed

    Gil, Manuel

    2014-01-01

    Pairwise evolutionary distances are a model-based summary statistic for a set of molecular sequences. They represent the leaf-to-leaf path lengths of the underlying phylogenetic tree. Estimates of pairwise distances with overlapping paths covary because of shared mutation events. It is desirable to take this covariance structure into account to increase precision in any process that compares or combines distances. This paper introduces a fast estimator for the covariance of two pairwise maximum likelihood distances, estimated under general Markov models. The estimator is based on a conjecture (going back to Nei & Jin, 1989) that links the covariance to path lengths. It is proven here under a simple symmetric substitution model. A simulation shows that the estimator outperforms previously published ones in terms of mean squared error.

  18. Fast and accurate estimation of the covariance between pairwise maximum likelihood distances

    PubMed Central

    2014-01-01

    Pairwise evolutionary distances are a model-based summary statistic for a set of molecular sequences. They represent the leaf-to-leaf path lengths of the underlying phylogenetic tree. Estimates of pairwise distances with overlapping paths covary because of shared mutation events. It is desirable to take this covariance structure into account to increase precision in any process that compares or combines distances. This paper introduces a fast estimator for the covariance of two pairwise maximum likelihood distances, estimated under general Markov models. The estimator is based on a conjecture (going back to Nei & Jin, 1989) that links the covariance to path lengths. It is proven here under a simple symmetric substitution model. A simulation shows that the estimator outperforms previously published ones in terms of mean squared error. PMID:25279263

  19. AptRank: an adaptive PageRank model for protein function prediction on bi-relational graphs.

    PubMed

    Jiang, Biaobin; Kloster, Kyle; Gleich, David F; Gribskov, Michael

    2017-06-15

    Diffusion-based network models are widely used for protein function prediction using protein network data and have been shown to outperform neighborhood-based and module-based methods. Recent studies have shown that integrating the hierarchical structure of the Gene Ontology (GO) data dramatically improves prediction accuracy. However, previous methods usually either used the GO hierarchy to refine the prediction results of multiple classifiers, or flattened the hierarchy into a function-function similarity kernel. No study has taken the GO hierarchy into account together with the protein network as a two-layer network model. We first construct a Bi-relational graph (Birg) model comprised of both protein-protein association and function-function hierarchical networks. We then propose two diffusion-based methods, BirgRank and AptRank, both of which use PageRank to diffuse information on this two-layer graph model. BirgRank is a direct application of traditional PageRank with fixed decay parameters. In contrast, AptRank utilizes an adaptive diffusion mechanism to improve the performance of BirgRank. We evaluate the ability of both methods to predict protein function on yeast, fly and human protein datasets, and compare with four previous methods: GeneMANIA, TMC, ProteinRank and clusDCA. We design four different validation strategies: missing function prediction, de novo function prediction, guided function prediction and newly discovered function prediction to comprehensively evaluate predictability of all six methods. We find that both BirgRank and AptRank outperform the previous methods, especially in missing function prediction when using only 10% of the data for training. The MATLAB code is available at https://github.rcac.purdue.edu/mgribsko/aptrank . gribskov@purdue.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. 

  20. Closing the loop: from paper to protein annotation using supervised Gene Ontology classification.

    PubMed

    Gobeill, Julien; Pasche, Emilie; Vishnyakova, Dina; Ruch, Patrick

    2014-01-01

    Gene function curation of the literature with Gene Ontology (GO) concepts is a particularly time-consuming task in genomics, and help from bioinformatics is much needed to keep up with the flow of publications. In 2004, the first BioCreative challenge already included a task of automatic GO concept assignment from full text. At that time, results were judged far from the performance required by real curation workflows. In particular, supervised approaches produced the most disappointing results because of a lack of training data. Ten years later, the available curation data have grown massively. In 2013, the BioCreative IV GO task revisited the automatic GO assignment task. For this challenge, we investigated the power of our supervised classifier, GOCat. GOCat computes similarities between an input text and already curated instances contained in a knowledge base to infer GO concepts. Subtask A consisted of selecting GO evidence sentences for a relevant gene in a full text. For this, we designed a state-of-the-art supervised statistical approach, using a naïve Bayes classifier and the official training set, and obtained fair results. Subtask B consisted of predicting GO concepts from the previous output. For this, we applied GOCat and achieved leading results, up to 65% hierarchical recall in the top 20 returned concepts. Thanks to BioCreative IV, we were able to design a complete workflow for curation: given a gene name and a full text, this system is able to select evidence sentences for curation and to deliver highly relevant GO concepts. Contrary to previous competitions, machine learning this time outperformed dictionary-based systems. Observed performances are sufficient for use in a real semiautomatic curation workflow. GOCat is available at http://eagl.unige.ch/GOCat/ and http://eagl.unige.ch/GOCat4FT/. © The Author(s) 2014. Published by Oxford University Press.

  1. Experiments to Determine Whether Recursive Partitioning (CART) or an Artificial Neural Network Overcomes Theoretical Limitations of Cox Proportional Hazards Regression

    NASA Technical Reports Server (NTRS)

    Kattan, Michael W.; Hess, Kenneth R.

    1998-01-01

    New computationally intensive tools for medical survival analyses include recursive partitioning (also called CART) and artificial neural networks. A remaining challenge is to better understand the behavior of these techniques in an effort to know when they will be effective tools. Theoretically, they may overcome limitations of the traditional multivariable survival technique, the Cox proportional hazards regression model. Experiments were designed to test whether the new tools would, in practice, overcome these limitations. Two datasets in which theory suggests CART and the neural network should outperform the Cox model were selected. The first was a published leukemia dataset manipulated to have a strong interaction that CART should detect. The second was a published cirrhosis dataset with pronounced nonlinear effects that a neural network should fit. Repeated sampling of 50 training and testing subsets was applied to each technique. The concordance index C was calculated as a measure of predictive accuracy by each technique on the testing dataset. In the interaction dataset, CART outperformed Cox (P < 0.05) with a C improvement of 0.1 (95% CI, 0.08 to 0.12). In the nonlinear dataset, the neural network outperformed the Cox model (P < 0.05), but by a very slight amount (0.015). As predicted by theory, CART and the neural network were able to overcome limitations of the Cox model. Experiments like these are important to increase our understanding of when one of these new techniques will outperform the standard Cox model. Further research is necessary to predict a priori which technique will do best and to assess the magnitude of superiority.
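The concordance index C used as the accuracy measure above can be computed directly from its definition. This is a minimal sketch of Harrell's C for right-censored survival data with toy inputs, not the study's evaluation code.

```python
def concordance_index(times, events, risks):
    """Harrell's concordance index. A pair (i, j) is comparable when
    subject i has the shorter observed time and experienced the event
    (events[i] == 1). The pair is concordant when the model assigned
    subject i the higher risk score; ties in risk count as 0.5."""
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i]:
                den += 1
                if risks[i] > risks[j]:
                    num += 1.0
                elif risks[i] == risks[j]:
                    num += 0.5
    return num / den
```

A model whose risk scores perfectly reverse the survival times scores C = 1.0; random scores hover around 0.5, which is why the 0.1 improvement reported for CART is substantial while 0.015 is marginal.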

  2. GStream: Improving SNP and CNV Coverage on Genome-Wide Association Studies

    PubMed Central

    Alonso, Arnald; Marsal, Sara; Tortosa, Raül; Canela-Xandri, Oriol; Julià, Antonio

    2013-01-01

    We present GStream, a method that combines genome-wide SNP and CNV genotyping in the Illumina microarray platform with unprecedented accuracy. This new method outperforms previous well-established SNP genotyping software. More importantly, the CNV calling algorithm of GStream dramatically improves the results obtained by previous state-of-the-art methods and yields an accuracy that is close to that obtained by purely CNV-oriented technologies like Comparative Genomic Hybridization (CGH). We demonstrate the superior performance of GStream using microarray data generated from HapMap samples. Using the reference CNV calls generated by the 1000 Genomes Project (1KGP) and well-known studies on whole genome CNV characterization based either on CGH or genotyping microarray technologies, we show that GStream can increase the number of reliably detected variants up to 25% compared to previously developed methods. Furthermore, the increased genome coverage provided by GStream allows the discovery of CNVs in close linkage disequilibrium with SNPs, previously associated with disease risk in published Genome-Wide Association Studies (GWAS). These results could provide important insights into the biological mechanism underlying the detected disease risk association. With GStream, large-scale GWAS will not only benefit from the combined genotyping of SNPs and CNVs at an unprecedented accuracy, but will also take advantage of the computational efficiency of the method. PMID:23844243

  3. Combining independent decisions increases diagnostic accuracy of reading lumbosacral radiographs and magnetic resonance imaging.

    PubMed

    Kurvers, Ralf H J M; de Zoete, Annemarie; Bachman, Shelby L; Algra, Paul R; Ostelo, Raymond

    2018-01-01

    Diagnosing the causes of low back pain is a challenging task, prone to errors. A novel approach to increasing diagnostic accuracy in medical decision making is collective intelligence, which refers to the ability of groups to outperform individual decision makers in solving problems. We investigated whether combining the independent ratings of chiropractors, chiropractic radiologists, and medical radiologists can improve diagnostic accuracy when interpreting diagnostic images of the lumbosacral spine. Evaluations were obtained from two previously published studies: study 1 consisted of 13 raters independently rating 300 lumbosacral radiographs; study 2 consisted of 14 raters independently rating 100 lumbosacral magnetic resonance images. In both studies, raters evaluated the presence of "abnormalities", which are indicators of a serious health risk and warrant immediate further examination. We combined the raters' independent decisions using a majority rule, which takes as the final diagnosis the decision of the majority of the group, and compared its performance to that of single raters. Our results show that with increasing group size (i.e., an increasing number of independent decisions), both sensitivity and specificity increased in both datasets, with groups consistently outperforming single raters. These results held for radiograph and MR image reading alike. Our findings suggest that combining independent ratings can improve the accuracy of lumbosacral diagnostic image reading.
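The majority rule described above, and why accuracy grows with group size, can be sketched with a short Monte-Carlo simulation. This is an illustrative model assuming independent raters with a fixed per-rater accuracy, not the study's data or analysis.

```python
import random

def majority_vote(ratings):
    """Final diagnosis = decision of the majority of independent
    raters (ratings are 0/1; use odd group sizes to avoid ties)."""
    return int(sum(ratings) > len(ratings) / 2)

def group_accuracy(p_correct, group_size, trials=10000, seed=0):
    """Monte-Carlo estimate of majority-rule accuracy when each rater
    is independently correct with probability p_correct (assume the
    true diagnosis is 1 without loss of generality)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        votes = [1 if rng.random() < p_correct else 0
                 for _ in range(group_size)]
        hits += majority_vote(votes)
    return hits / trials
```

With raters who are individually correct 70% of the time, a majority of 9 is right roughly 90% of the time, the Condorcet-jury effect that underlies the paper's finding that groups outperform single raters.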

  4. SINE_scan: an efficient tool to discover short interspersed nuclear elements (SINEs) in large-scale genomic datasets.

    PubMed

    Mao, Hongliang; Wang, Hao

    2017-03-01

    Short Interspersed Nuclear Elements (SINEs) are transposable elements (TEs) that amplify through a copy-and-paste mode via RNA intermediates. The computational identification of new SINEs is challenging because of their weak structural signals and rapid sequence diversification. Here we report SINE_Scan, a highly efficient program to predict SINE elements in genomic DNA sequences. SINE_Scan integrates hallmarks of SINE transposition, copy number, and structural signals to identify SINE elements. SINE_Scan outperforms the previously published de novo SINE discovery program. It shows high sensitivity and specificity in 19 plant and animal genome assemblies, whose sizes vary from 120 Mb to 3.5 Gb. It identifies numerous new families and substantially increases the estimated abundance of SINEs in these genomes. The code of SINE_Scan is freely available at http://github.com/maohlzj/SINE_Scan , implemented in PERL and supported on Linux. wangh8@fudan.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  5. Reciprocity Outperforms Conformity to Promote Cooperation.

    PubMed

    Romano, Angelo; Balliet, Daniel

    2017-10-01

    Evolutionary psychologists have proposed two processes that could give rise to the pervasiveness of human cooperation observed among individuals who are not genetically related: reciprocity and conformity. We tested whether reciprocity outperformed conformity in promoting cooperation, especially when these psychological processes would promote a different cooperative or noncooperative response. To do so, across three studies, we observed participants' cooperation with a partner after learning (a) that their partner had behaved cooperatively (or not) on several previous trials and (b) that their group members had behaved cooperatively (or not) on several previous trials with that same partner. Although we found that people both reciprocate and conform, reciprocity has a stronger influence on cooperation. Moreover, we found that conformity can be partly explained by a concern about one's reputation, a finding that supports a reciprocity framework.

  6. Comparative Analysis of User-Generated Online Yelp Reviews for Periodontal Practices in Multiple Metropolitan Markets.

    PubMed

    Holtzclaw, Dan J

    2017-02-01

    Previously published research for a single metropolitan market (Austin, Texas) found that periodontists fare poorly on the Yelp website for nearly all measured metrics, including average star ratings, number of reviews, review removal rate, and evaluations by "elite" Yelp users. The purpose of the current study is to confirm or refute these findings by expanding datasets to additional metropolitan markets of various sizes and geographic locations. A total of 6,559 Yelp reviews were examined for general dentists, endodontists, pediatric dentists, oral surgeons, orthodontists, and periodontists in small (Austin, Texas), medium (Seattle, Washington), and large (New York City, New York) metropolitan markets. Numerous review characteristics were evaluated, including: 1) total number of reviews; 2) average star rating; 3) review filtering rate; and 4) number of reviews by Yelp members with elite status. Results were compared in multiple ways to determine whether statistically significant differences existed. In all metropolitan markets, periodontists were outperformed by all other dental specialties for all measured Yelp metrics in this study. Intermetropolitan comparisons of periodontal practices showed no statistically significant differences. Periodontists were outperformed consistently by all other dental specialties in every measured metric on the Yelp website. These results were consistent and repeated in all three metropolitan markets evaluated in this study. Poor performance of periodontists on Yelp may be related to the age profile of patients in the typical periodontal practice. This may result in inadvertently biased filtering of periodontal reviews and subsequently poor performance in multiple other categories.

  7. A Review of Depth and Normal Fusion Algorithms

    PubMed Central

    Štolc, Svorad; Pock, Thomas

    2018-01-01

    Geometric surface information such as depth maps and surface normals can be acquired by various methods such as stereo, light fields, shape from shading and photometric stereo techniques. We compare several algorithms that combine depth with surface-normal information in order to reconstruct a refined depth map. The reasons for performance differences are examined from the perspective of alternative formulations of surface normals for depth reconstruction. We review and analyze methods in a systematic way. Based on our findings, we introduce a new generalized fusion method, which is formulated as a least squares problem and outperforms previous methods in the depth error domain by introducing a novel normal weighting that performs closer to the geodesic distance measure. Furthermore, a novel method is introduced based on Total Generalized Variation (TGV) which further outperforms previous approaches in terms of the geodesic normal distance error and maintains comparable quality in the depth error domain. PMID:29389903
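
    A minimal one-dimensional sketch of such a least-squares depth/normal fusion (our own toy illustration; the paper's weighting scheme and TGV variant are not reproduced): noisy depth samples d are fused with slopes g derived from surface normals by minimizing a data term plus a normal-consistency term with plain gradient descent.

```python
def fuse_depth_normals(d, g, lam=1.0, steps=500, lr=0.1):
    """Toy 1-D depth/normal fusion: minimize
        sum_i (z_i - d_i)^2 + lam * sum_i ((z_{i+1} - z_i) - g_i)^2
    by plain gradient descent. d: noisy depths, g: slopes from normals."""
    n = len(d)
    z = list(d)  # initialize with the noisy depths
    for _ in range(steps):
        grad = [2 * (z[i] - d[i]) for i in range(n)]   # data term
        for i in range(n - 1):                         # normal-consistency term
            r = (z[i + 1] - z[i]) - g[i]
            grad[i] -= 2 * lam * r
            grad[i + 1] += 2 * lam * r
        z = [z[i] - lr * grad[i] for i in range(n)]
    return z

def objective(z, d, g, lam=1.0):
    data = sum((zi - di) ** 2 for zi, di in zip(z, d))
    smooth = sum(((z[i + 1] - z[i]) - g[i]) ** 2 for i in range(len(z) - 1))
    return data + lam * smooth

d = [0.0, 1.2, 1.9, 3.1]   # noisy depths of a unit-slope plane (hypothetical)
g = [1.0, 1.0, 1.0]        # slopes derived from surface normals
z = fuse_depth_normals(d, g)
```

    The step size is chosen below the stability bound of this small quadratic problem, so the objective decreases monotonically; a real implementation would solve the sparse normal equations directly.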

  8. Using Collective Intelligence to Route Internet Traffic

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Tumer, Kagan; Frank, Jeremy

    1998-01-01

    A Collective Intelligence (COIN) is a community of interacting reinforcement learning (RL) algorithms designed so that their collective behavior maximizes a global utility function. We introduce the theory of COINs, then present experiments using that theory to design COINs to control internet traffic routing. These experiments indicate that COINs outperform previously investigated RL-based routing systems.

  9. Math Disabilities: A Selective Meta-Analysis of the Literature

    ERIC Educational Resources Information Center

    Swanson, H. Lee; Jerman, Olga

    2006-01-01

    This article synthesizes published literature comparing the cognitive functioning of children who have math disabilities (MD) with that of (a) average-achieving children; (b) children who have reading disabilities (RD); and (c) children who have co-morbid disabilities (MD+RD). Average achievers outperformed children with MD on measures of verbal…

  10. The development of creative cognition across adolescence: distinct trajectories for insight and divergent thinking.

    PubMed

    Kleibeuker, Sietske W; De Dreu, Carsten K W; Crone, Eveline A

    2013-01-01

    We examined developmental trajectories of creative cognition across adolescence. Participants (N = 98), divided into four age groups (12/13 yrs, 15/16 yrs, 18/19 yrs, and 25-30 yrs), were subjected to a battery of tasks gauging creative insight (visual; verbal) and divergent thinking (verbal; visuo-spatial). The two older age groups outperformed the two younger age groups on insight tasks. The 25-30-year-olds outperformed the two youngest age groups on the originality measure of verbal divergent thinking. No age-group differences were observed for verbal divergent thinking fluency and flexibility. On divergent thinking in the visuo-spatial domain, however, only 15/16-year-olds outperformed 12/13-year-olds; a model with peak performance at age 15/16 showed the best fit. The results for the different creativity processes are discussed in relation to cognitive and related neurobiological models. We conclude that mid-adolescence is a period not only of immaturities but also of creative potential in the visuo-spatial domain, possibly related to developing control functions and explorative behavior. © 2012 Blackwell Publishing Ltd.

  11. GUIDANCE2: accurate detection of unreliable alignment regions accounting for the uncertainty of multiple parameters.

    PubMed

    Sela, Itamar; Ashkenazy, Haim; Katoh, Kazutaka; Pupko, Tal

    2015-07-01

    Inference of multiple sequence alignments (MSAs) is a critical part of phylogenetic and comparative genomics studies. However, from the same set of sequences different MSAs are often inferred, depending on the methodologies used and the assumed parameters. Much effort has recently been devoted to improving the ability to identify unreliable alignment regions. Detecting such unreliable regions was previously shown to be important for downstream analyses relying on MSAs, such as the detection of positive selection. Here we developed GUIDANCE2, a new integrative methodology that accounts for: (i) uncertainty in the process of indel formation, (ii) uncertainty in the assumed guide tree and (iii) co-optimal solutions in the pairwise alignments, used as building blocks in progressive alignment algorithms. We compared GUIDANCE2 with seven methodologies to detect unreliable MSA regions using extensive simulations and empirical benchmarks. We show that GUIDANCE2 outperforms all previously developed methodologies. Furthermore, GUIDANCE2 also provides a set of alternative MSAs which can be useful for downstream analyses. The novel algorithm is implemented as a web-server, available at: http://guidance.tau.ac.il. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Metal artifact reduction for CT-based luggage screening.

    PubMed

    Karimi, Seemeen; Martz, Harry; Cosman, Pamela

    2015-01-01

    In aviation security, checked luggage is screened by computed tomography scanning. Metal objects in the bags create artifacts that degrade image quality. Existing metal artifact reduction (MAR) methods, mainly from the medical imaging literature, either require knowledge of the materials in the scan or are outlier-rejection methods. Our aim was to improve and evaluate a MAR method we previously introduced that does not require knowledge of the materials in the scan and gives good results on data with large quantities and different kinds of metal. We describe in detail an optimization that de-emphasizes metal projections and includes a constraint for beam hardening and scatter. This method isolates and reduces artifacts in an intermediate image, which is then fed to a previously published sinogram-replacement method. We evaluate the algorithm on luggage data containing multiple and large metal objects. We define measures of artifact reduction and compare this method against others in the MAR literature. Metal artifacts were reduced in our test images, even for multiple and large metal objects, without much loss of structure or resolution. Our MAR method outperforms the methods with which we compared it. Our approach does not make assumptions about image content, nor does it discard metal projections.

  13. Improving cerebellar segmentation with statistical fusion

    NASA Astrophysics Data System (ADS)

    Plassard, Andrew J.; Yang, Zhen; Prince, Jerry L.; Claassen, Daniel O.; Landman, Bennett A.

    2016-03-01

    The cerebellum is a somatotopically organized central component of the central nervous system well known to be involved with motor coordination and with increasingly recognized roles in cognition and planning. Recent work in multiatlas labeling has created methods that offer the potential for fully automated 3-D parcellation of the cerebellar lobules and vermis (which are organizationally equivalent to cortical gray matter areas). This work explores the trade-offs of using different statistical fusion techniques and post hoc optimizations in two datasets with distinct imaging protocols. We offer a novel fusion technique by extending the ideas of the Selective and Iterative Method for Performance Level Estimation (SIMPLE) to a patch-based performance model. We demonstrate the effectiveness of our algorithm, Non-Local SIMPLE, for segmentation of a mixed population of healthy subjects and patients with severe cerebellar atrophy. Under the first imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard segmentation techniques. In the second imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard techniques but is outperformed by a non-locally weighted vote with the deeper population of atlases available. This work advances the state of the art in open source cerebellar segmentation algorithms and offers the opportunity for routinely including cerebellar segmentation in magnetic resonance imaging studies that acquire whole brain T1-weighted volumes with approximately 1 mm isotropic resolution.
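
    The SIMPLE idea referenced here can be sketched in a deliberately simplified binary, non-patch-based form (our own toy code, not the authors' Non-Local SIMPLE): fuse atlas label maps by majority vote, score each atlas against the consensus with the Dice coefficient, keep the better-performing atlases, and iterate.

```python
def majority_vote(labelmaps):
    """Per-voxel strict majority vote across binary atlas label maps."""
    n = len(labelmaps)
    return [1 if sum(lm[v] for lm in labelmaps) * 2 > n else 0
            for v in range(len(labelmaps[0]))]

def dice(a, b):
    """Dice overlap between two binary label maps."""
    inter = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    sa, sb = sum(a), sum(b)
    return 2 * inter / (sa + sb) if sa + sb else 1.0

def simple_fusion(labelmaps, rounds=3, keep_fraction=0.5):
    """SIMPLE-style iterative fusion sketch: fuse, score each atlas
    against the consensus, keep the stronger half, re-fuse."""
    atlases = list(labelmaps)
    consensus = majority_vote(atlases)
    for _ in range(rounds):
        if len(atlases) <= 2:
            break
        scored = sorted(atlases, key=lambda lm: dice(lm, consensus), reverse=True)
        atlases = scored[:max(2, int(len(scored) * keep_fraction))]
        consensus = majority_vote(atlases)
    return consensus

# Hypothetical 4-voxel example: three agreeing atlases and one outlier.
maps = [[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0], [0, 1, 1, 1]]
fused = simple_fusion(maps)
```

    The published method estimates rater performance levels rather than using raw Dice, and Non-Local SIMPLE extends this to patch-wise performance; this sketch only conveys the select-and-iterate structure.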

  14. SANDPUMA: ensemble predictions of nonribosomal peptide chemistry reveal biosynthetic diversity across Actinobacteria.

    PubMed

    Chevrette, Marc G; Aicheler, Fabian; Kohlbacher, Oliver; Currie, Cameron R; Medema, Marnix H

    2017-10-15

    Nonribosomally synthesized peptides (NRPs) are natural products with widespread applications in medicine and biotechnology. Many algorithms have been developed to predict the substrate specificities of nonribosomal peptide synthetase adenylation (A) domains from DNA sequences, which enables prioritization and dereplication, and integration with other data types in discovery efforts. However, insufficient training data and a lack of clarity regarding prediction quality have impeded optimal use. Here, we introduce prediCAT, a new phylogenetics-inspired algorithm, which quantitatively estimates the degree of predictability of each A-domain. We then systematically benchmarked all algorithms on a newly gathered, independent test set of 434 A-domain sequences, showing that active-site-motif-based algorithms outperform whole-domain-based methods. Subsequently, we developed SANDPUMA, a powerful ensemble algorithm, based on newly trained versions of all high-performing algorithms, which significantly outperforms individual methods. Finally, we deployed SANDPUMA in a systematic investigation of 7635 Actinobacteria genomes, suggesting that NRP chemical diversity is much higher than previously estimated. SANDPUMA has been integrated into the widely used antiSMASH biosynthetic gene cluster analysis pipeline and is also available as an open-source, standalone tool. SANDPUMA is freely available at https://bitbucket.org/chevrm/sandpuma and as a docker image at https://hub.docker.com/r/chevrm/sandpuma/ under the GNU Public License 3 (GPL3). chevrette@wisc.edu or marnix.medema@wur.nl. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. Solving the productivity and impact puzzle: Do men outperform women, or are metrics biased?

    Treesearch

    Elissa Z. Cameron; Angela M. White; Meeghan E. Gray

    2016-01-01

    The attrition of women from science with increasing career stage continues, suggesting that current strategies are unsuccessful. Research evaluation using unbiased metrics could be important for the retention of women, because other factors such as implicit bias are unlikely to quickly change. We compare the publishing patterns of men and women within the...

  16. Sociocultural Perspectives on the Internationalization of Research in Mathematics Education: A Survey Based on "JRME," "ESM," and "MTL"

    ERIC Educational Resources Information Center

    Liu, Po-Hung

    2017-01-01

    The current main research trend in mathematics education is publishing studies by Western scholars pertaining to educational issues of the world in general, but Asia is mostly overlooked. Since international comparisons show Asian students outperform others in mathematics, the imbalance should receive more attention. To gain insight into this…

  17. Front-line intraperitoneal versus intravenous chemotherapy in stage III-IV epithelial ovarian, tubal, and peritoneal cancer with minimal residual disease: a competing risk analysis.

    PubMed

    Chang, Yen-Hou; Li, Wai-Hou; Chang, Yi; Peng, Chia-Wen; Cheng, Ching-Hsuan; Chang, Wei-Pin; Chuang, Chi-Mu

    2016-03-17

    In the analysis of survival data for cancer patients, the problem of competing risks is often ignored. Competing risks have been recognized as a special case of time-to-event analysis. The conventional techniques for time-to-event analysis applied in the presence of competing risks often give biased or uninterpretable results. Using a prospectively collected administrative health care database in a single institution, we identified patients diagnosed with stage III or IV primary epithelial ovarian, tubal, and peritoneal cancers with minimal residual disease after primary cytoreductive surgery between 1995 and 2012. Here, we sought to evaluate whether intraperitoneal chemotherapy outperforms intravenous chemotherapy in the presence of competing risks. Unadjusted and multivariable subdistribution hazards models were applied to this database with two types of competing risks (cancer-specific mortality and other-cause mortality) coded to measure the relative effects of intraperitoneal chemotherapy. A total of 1263 patients were recruited as the initial cohort. After propensity score matching, 381 patients in each arm entered into final competing risk analysis. Cumulative incidence estimates for cancer-specific mortality were statistically significantly lower (p = 0.017, Gray test) in patients receiving intraperitoneal chemotherapy (5-year estimates, 34.5%; 95% confidence interval [CI], 29.5-39.6%, and 10-year estimates, 60.7%; 95% CI, 52.2-68.0%) versus intravenous chemotherapy (5-year estimates, 41.3%; 95% CI, 36.2-46.3%, and 10-year estimates, 67.5%, 95% CI, 61.6-72.7%). In subdistribution hazards analysis, for cancer-specific mortality, intraperitoneal chemotherapy outperformed intravenous chemotherapy (subdistribution hazard ratio, 0.82; 95% CI, 0.70-0.96) after adjusting for other covariates.
In conclusion, results from this comparative effectiveness study provide supportive evidence for previously published randomized trials that intraperitoneal chemotherapy outperforms intravenous chemotherapy, even after eliminating the confounding of competing risks. We suggest that competing risk analysis should be strongly considered for investigations of cancer patients with medium- to long-term follow-up.
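
    The cumulative incidence function at the heart of such a competing-risk analysis can be sketched with a bare-bones Aalen-Johansen-style estimator (illustrative only; the study's subdistribution hazards models and matching are not reproduced):

```python
def cumulative_incidence(times, events, cause=1):
    """Toy Aalen-Johansen cumulative incidence for one cause.
    times: event/censoring times; events: 0 = censored, 1, 2, ... = cause codes.
    Returns (time, CIF) pairs at each observed event time for `cause`."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall event-free survival just before t
    cif = 0.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = n_censored = 0
        while i < len(data) and data[i][0] == t:   # handle ties at time t
            if data[i][1] == 0:
                n_censored += 1
            else:
                d_any += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        if d_cause:
            cif += surv * d_cause / n_at_risk
            out.append((t, cif))
        if d_any:
            surv *= 1 - d_any / n_at_risk
        n_at_risk -= d_any + n_censored
    return out

# Hypothetical data: cause-1 events at t=1 and t=3, a competing (cause-2)
# event at t=2, and one censored subject at t=4.
cif_curve = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0], cause=1)
```

    Note how the competing event at t=2 shrinks the at-risk survival and therefore the later cause-1 increment, which is exactly the effect a naive Kaplan-Meier analysis ignores.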

  18. ScaffoldScaffolder: solving contig orientation via bidirected to directed graph reduction.

    PubMed

    Bodily, Paul M; Fujimoto, M Stanley; Snell, Quinn; Ventura, Dan; Clement, Mark J

    2016-01-01

    The contig orientation problem, which we formally define as the MAX-DIR problem, has at times been addressed cursorily and at times using various heuristics. In setting forth a linear-time reduction from the MAX-CUT problem to the MAX-DIR problem, we prove the latter is NP-complete. We compare the relative performance of a novel greedy approach with several other heuristic solutions. Our results suggest that our greedy heuristic algorithm not only works well but also outperforms the other algorithms due to the nature of scaffold graphs. Our results also demonstrate a novel method for identifying inverted repeats and inversion variants, both of which contradict the basic single-orientation assumption. Such inversions have previously been noted as being difficult to detect and are directly involved in the genetic mechanisms of several diseases. http://bioresearch.byu.edu/scaffoldscaffolder. paulmbodily@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
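
    A greedy heuristic of the kind described can be sketched as follows (our own simplified illustration, not ScaffoldScaffolder's implementation): visit contigs one at a time and give each the orientation that satisfies the greater total weight of edges to already-oriented neighbors.

```python
def greedy_orient(n, edges):
    """Greedy heuristic for a MAX-DIR-like contig orientation problem.
    edges: (u, v, same, weight), where same=True means u and v should share
    an orientation and same=False means they should be relatively flipped.
    Returns a list of orientations (+1 / -1), assigned one contig at a time."""
    adj = {i: [] for i in range(n)}
    for u, v, same, w in edges:
        adj[u].append((v, same, w))
        adj[v].append((u, same, w))
    orient = [0] * n  # 0 = undecided
    for u in range(n):
        if orient[u]:
            continue
        # Weight favouring +1 minus weight favouring -1, over decided neighbors.
        score = sum(w * orient[v] * (1 if same else -1)
                    for v, same, w in adj[u] if orient[v])
        orient[u] = 1 if score >= 0 else -1
    return orient

def satisfied_weight(orient, edges):
    """Total weight of orientation constraints the assignment satisfies."""
    return sum(w for u, v, same, w in edges
               if (orient[u] == orient[v]) == same)

# Hypothetical 3-contig scaffold graph with one unsatisfiable triangle edge.
edges = [(0, 1, True, 2), (1, 2, False, 3), (0, 2, True, 1)]
orientations = greedy_orient(3, edges)
```

    This sketch visits contigs in index order; a production heuristic would typically order contigs by connectivity or edge weight, which is where the structure of real scaffold graphs helps the greedy approach.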

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lave, Matthew Samuel; Stein, Joshua S.; Burnham, Laurie

    A 9.6 kW test array of Prism bifacial modules and reference monofacial modules installed in February 2016 at the New Mexico Regional Test Center has produced six months of performance data. The data reveal that the Prism modules are out-performing the monofacial modules, with bifacial gains in energy over the six-month period ranging from 18% to 136%, depending on the orientation and ground albedo. These measured bifacial gains were found to be in good agreement with modeled bifacial gains using equations previously published by Prism. The most dramatic increase in performance was seen among the vertically tilted, west-facing modules, where the bifacial modules produced more than double the energy of monofacial modules and more energy than monofacial modules at any orientation. Because peak energy generation (mid-morning and mid-afternoon) for these bifacial modules may best match load on the electric grid, the west-facing orientation may be more economically desirable than traditional south-facing module orientations (which peak at solar noon).

  20. Can We Train Machine Learning Methods to Outperform the High-dimensional Propensity Score Algorithm?

    PubMed

    Karim, Mohammad Ehsanul; Pang, Menglan; Platt, Robert W

    2018-03-01

    The use of retrospective health care claims datasets is frequently criticized for the lack of complete information on potential confounders. By utilizing patients' health status-related information from claims datasets as surrogates or proxies for mismeasured and unobserved confounders, the high-dimensional propensity score algorithm enables us to reduce bias. Using a previously published cohort study of postmyocardial infarction statin use (1998-2012), we compare the performance of the algorithm with a number of popular machine learning approaches for confounder selection in high-dimensional covariate spaces: random forest, least absolute shrinkage and selection operator, and elastic net. Our results suggest that, when the data analysis is done with epidemiologic principles in mind, machine learning methods perform as well as the high-dimensional propensity score algorithm. Using a plasmode framework that mimicked the empirical data, we also showed that a hybrid of machine learning and high-dimensional propensity score algorithms generally performs slightly better than both in terms of mean squared error, when a bias-based analysis is used.
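
    The propensity-score machinery underlying both approaches can be sketched with a toy logistic treatment model and inverse-probability-of-treatment weights (our own illustration with made-up data; the high-dimensional covariate selection itself is not reproduced):

```python
import math

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (toy propensity model)."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(steps):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            for j, xj in enumerate(xi):
                gw[j] += (p - yi) * xj
            gb += p - yi
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def propensity_weights(X, treated, w, b):
    """Inverse-probability-of-treatment weights from the fitted model."""
    out = []
    for xi, t in zip(X, treated):
        p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
        out.append(1 / p if t == 1 else 1 / (1 - p))
    return out

# Hypothetical data: one proxy covariate that predicts treatment.
X = [[1.0], [1.0], [0.0], [0.0]]
treated = [1, 1, 0, 0]
w, b = fit_logistic(X, treated)
weights = propensity_weights(X, treated, w, b)
```

    In the article's setting, the interesting question is which of the thousands of claims-derived proxies enter X; the weighting step afterwards is the same.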

  1. PSI/TM-Coffee: a web server for fast and accurate multiple sequence alignments of regular and transmembrane proteins using homology extension on reduced databases.

    PubMed

    Floden, Evan W; Tommaso, Paolo D; Chatzou, Maria; Magis, Cedrik; Notredame, Cedric; Chang, Jia-Ming

    2016-07-08

    The PSI/TM-Coffee web server performs multiple sequence alignment (MSA) of proteins by combining homology extension with a consistency based alignment approach. Homology extension is performed with Position Specific Iterative (PSI) BLAST searches against a choice of redundant and non-redundant databases. The main novelty of this server is to allow databases of reduced complexity to rapidly perform homology extension. This server also gives the possibility to use transmembrane proteins (TMPs) reference databases to allow even faster homology extension on this important category of proteins. Aside from an MSA, the server also outputs topological prediction of TMPs using the HMMTOP algorithm. Previous benchmarking of the method has shown this approach outperforms the most accurate alignment methods such as MSAProbs, Kalign, PROMALS, MAFFT, ProbCons and PRALINE™. The web server is available at http://tcoffee.crg.cat/tmcoffee. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Corvids Outperform Pigeons and Primates in Learning a Basic Concept.

    PubMed

    Wright, Anthony A; Magnotti, John F; Katz, Jeffrey S; Leonard, Kevin; Vernouillet, Alizée; Kelly, Debbie M

    2017-04-01

    Corvids (birds of the family Corvidae) display intelligent behavior previously ascribed only to primates, but such feats are not directly comparable across species. To make direct species comparisons, we used a same/different task in the laboratory to assess abstract-concept learning in black-billed magpies (Pica hudsonia). Concept learning was tested with novel pictures after training. Concept learning improved with training-set size, and test accuracy eventually matched training accuracy-full concept learning-with a 128-picture set; this magpie performance was equivalent to that of Clark's nutcrackers (a species of corvid) and monkeys (rhesus, capuchin) and better than that of pigeons. Even with an initial 8-item picture set, both corvid species showed partial concept learning, outperforming both monkeys and pigeons. Similar corvid performance refutes the hypothesis that nutcrackers' prolific cache-location memory accounts for their superior concept learning, because magpies rely less on caching. That corvids with "primitive" neural architectures evolved to equal primates in full concept learning and even to outperform them on the initial 8-item picture test is a testament to the shared (convergent) survival importance of abstract-concept learning.

  3. Charter Schools, Academic Achievement and NCLB

    ERIC Educational Resources Information Center

    Lubienski, Christopher; Lubienski, Sarah Theule

    2006-01-01

    The reform movement embracing charter schools is based largely on the promise that these autonomous schools will out-perform public schools plagued by bureaucratic administration--an expectation reflected in the federal NCLB law. However, the many state-based reports have been mixed, and previous national studies have suffered from serious…

  4. Learning a weighted sequence model of the nucleosome core and linker yields more accurate predictions in Saccharomyces cerevisiae and Homo sapiens.

    PubMed

    Reynolds, Sheila M; Bilmes, Jeff A; Noble, William Stafford

    2010-07-08

    DNA in eukaryotes is packaged into a chromatin complex, the most basic element of which is the nucleosome. The precise positioning of the nucleosome cores allows for selective access to the DNA, and the mechanisms that control this positioning are important pieces of the gene expression puzzle. We describe a large-scale nucleosome pattern that jointly characterizes the nucleosome core and the adjacent linkers and is predominantly characterized by long-range oscillations in the mono-, di- and tri-nucleotide content of the DNA sequence, and we show that this pattern can be used to predict nucleosome positions in both Homo sapiens and Saccharomyces cerevisiae more accurately than previously published methods. Surprisingly, in both H. sapiens and S. cerevisiae, the most informative individual features are the mono-nucleotide patterns, although the inclusion of di- and tri-nucleotide features results in improved performance. Our approach combines a much longer pattern than has been previously used to predict nucleosome positioning from sequence (301 base pairs, centered at the position to be scored) with a novel discriminative classification approach that selectively weights the contributions from each of the input features. The resulting scores are relatively insensitive to local AT-content and can be used to accurately discriminate putative dyad positions from adjacent linker regions without requiring an additional dynamic programming step and without the attendant edge effects and assumptions about linker length modeling and overall nucleosome density. Our approach produces the best dyad-linker classification results published to date in H. sapiens, and outperforms two recently published models on a large set of S. cerevisiae nucleosome positions. Our results suggest that in both genomes, a comparable and relatively small fraction of nucleosomes are well-positioned and that these positions are predictable based on sequence alone.
We believe that the bulk of the remaining nucleosomes follow a statistical positioning model.

  5. Learning a Weighted Sequence Model of the Nucleosome Core and Linker Yields More Accurate Predictions in Saccharomyces cerevisiae and Homo sapiens

    PubMed Central

    Reynolds, Sheila M.; Bilmes, Jeff A.; Noble, William Stafford

    2010-01-01

    DNA in eukaryotes is packaged into a chromatin complex, the most basic element of which is the nucleosome. The precise positioning of the nucleosome cores allows for selective access to the DNA, and the mechanisms that control this positioning are important pieces of the gene expression puzzle. We describe a large-scale nucleosome pattern that jointly characterizes the nucleosome core and the adjacent linkers and is predominantly characterized by long-range oscillations in the mono, di- and tri-nucleotide content of the DNA sequence, and we show that this pattern can be used to predict nucleosome positions in both Homo sapiens and Saccharomyces cerevisiae more accurately than previously published methods. Surprisingly, in both H. sapiens and S. cerevisiae, the most informative individual features are the mono-nucleotide patterns, although the inclusion of di- and tri-nucleotide features results in improved performance. Our approach combines a much longer pattern than has been previously used to predict nucleosome positioning from sequence—301 base pairs, centered at the position to be scored—with a novel discriminative classification approach that selectively weights the contributions from each of the input features. The resulting scores are relatively insensitive to local AT-content and can be used to accurately discriminate putative dyad positions from adjacent linker regions without requiring an additional dynamic programming step and without the attendant edge effects and assumptions about linker length modeling and overall nucleosome density. Our approach produces the best dyad-linker classification results published to date in H. sapiens, and outperforms two recently published models on a large set of S. cerevisiae nucleosome positions. Our results suggest that in both genomes, a comparable and relatively small fraction of nucleosomes are well-positioned and that these positions are predictable based on sequence alone. 
We believe that the bulk of the remaining nucleosomes follow a statistical positioning model. PMID:20628623

  6. PhyloGibbs-MP: Module Prediction and Discriminative Motif-Finding by Gibbs Sampling

    PubMed Central

    Siddharthan, Rahul

    2008-01-01

    PhyloGibbs, our recent Gibbs-sampling motif-finder, takes phylogeny into account in detecting binding sites for transcription factors in DNA and assigns posterior probabilities to its predictions obtained by sampling the entire configuration space. Here, in an extension called PhyloGibbs-MP, we widen the scope of the program, addressing two major problems in computational regulatory genomics. First, PhyloGibbs-MP can localise predictions to small, undetermined regions of a large input sequence, thus effectively predicting cis-regulatory modules (CRMs) ab initio while simultaneously predicting binding sites in those modules—tasks that are usually done by two separate programs. PhyloGibbs-MP's performance at such ab initio CRM prediction is comparable with or superior to dedicated module-prediction software that use prior knowledge of previously characterised transcription factors. Second, PhyloGibbs-MP can predict motifs that differentiate between two (or more) different groups of regulatory regions, that is, motifs that occur preferentially in one group over the others. While other “discriminative motif-finders” have been published in the literature, PhyloGibbs-MP's implementation has some unique features and flexibility. Benchmarks on synthetic and actual genomic data show that this algorithm is successful at enhancing predictions of differentiating sites and suppressing predictions of common sites and compares with or outperforms other discriminative motif-finders on actual genomic data. Additional enhancements include significant performance and speed improvements, the ability to use “informative priors” on known transcription factors, and the ability to output annotations in a format that can be visualised with the Generic Genome Browser. In stand-alone motif-finding, PhyloGibbs-MP remains competitive, outperforming PhyloGibbs-1.0 and other programs on benchmark data. PMID:18769735

  7. Limited value of haptics in virtual reality laparoscopic cholecystectomy training.

    PubMed

    Thompson, Jonathan R; Leonard, Anthony C; Doarn, Charles R; Roesch, Matt J; Broderick, Timothy J

    2011-04-01

    Haptics is an expensive addition to virtual reality (VR) simulators, and the added value to training has not been proven. This study evaluated the benefit of haptics in VR laparoscopic surgery training for novices. The Simbionix LapMentor II haptic VR simulator was used in the study. Thirty-three laparoscopic novice students were randomly assigned to one of three groups: control, haptics-trained, or nonhaptics-trained. The control group performed nine basic laparoscopy tasks and four cholecystectomy procedural tasks one time with haptics engaged at the default setting. The haptics group was trained to proficiency in the basic tasks and then performed each of the procedural tasks one time with haptics engaged. The nonhaptics group used the same training protocol except that haptics was disengaged. The proficiency values used were previously published expert values. Each group was assessed in the performance of 10 laparoscopic cholecystectomies (alternating with and without haptics). Performance was measured via automatically collected simulator data. The three groups exhibited no differences in terms of sex, education level, hand dominance, video game experience, surgical experience, and nonsurgical simulator experience. The number of attempts required to reach proficiency did not differ between the haptics- and nonhaptics-training groups. The haptics and nonhaptics groups exhibited no difference in performance. Both training groups outperformed the control group in number of movements as well as path length of the left instrument. In addition, the nonhaptics group outperformed the control group in total time. Haptics does not improve the efficiency or effectiveness of LapMentor II VR laparoscopic surgery training. The limited benefit and the significant cost of haptics suggest that haptics should not be included routinely in VR laparoscopic surgery training.

  8. Post-Conflict Slowing Effects in Monolingual and Bilingual Children

    ERIC Educational Resources Information Center

    Grundy, John G.; Keyvani Chahi, Aram

    2017-01-01

    Previous research has shown that bilingual children outperform their monolingual peers on a wide variety of tasks measuring executive functions (EF). However, recent failures to replicate this finding have cast doubt on the idea that the bilingual experience leads to domain-general cognitive benefits. The present study explored the role of…

  9. Cultural Differences in Early Math Skills among U.S., Taiwanese, Dutch, and Peruvian Preschoolers

    ERIC Educational Resources Information Center

    Paik, Jae H.; van Gelderen, Loes; Gonzales, Manuel; de Jong, Peter F.; Hayes, Michael

    2011-01-01

    East Asian children have consistently outperformed children from other nations on mathematical tests. However, most previous cross-cultural studies mainly compared East Asian countries and the United States and have largely ignored cultures from other parts of the world. The present study explored cultural differences in young children's early…

  10. Some Numerical Simulations and an Experimental Investigation of Finger Seals

    NASA Technical Reports Server (NTRS)

    Braun, Minel J.; Smith, Ian; Marie, Hazel

    2007-01-01

    All seal types have been shown to lift effectively and experience only minor wear during startup. … The double pad design outperforms previous seals, providing lower operating temperatures and less leakage at higher pressures. … Future experimentation at higher pressures, temperatures, and operating speeds will show the full potential of finger sealing technology.

  11. Bilingualism as a Model for Multitasking

    PubMed Central

    Poarch, Gregory J.; Bialystok, Ellen

    2015-01-01

    Because both languages of bilinguals are constantly active, bilinguals need to manage attention to the target language and avoid interference from the non-target language. This process is likely carried out by recruiting the executive function (EF) system, a system that is also the basis for multitasking. In previous research, bilinguals have been shown to outperform monolinguals on tasks requiring EF, suggesting that practice using EF for language management benefits performance in other tasks as well. The present study examined 203 children, 8-11 years old, who were monolingual, partially bilingual, bilingual, or trilingual performing a flanker task. Two results support the interpretation that bilingualism is related to multitasking. First, bilingual children outperformed monolinguals on the conflict trials in the flanker task, confirming previous results for a bilingual advantage in EF. Second, the inclusion of partial bilinguals and trilinguals set limits on the role of experience: partial bilinguals performed similarly to monolinguals and trilinguals performed similarly to bilinguals, suggesting that degrees of experience are not well-calibrated to improvements in EF. Our conclusion is that the involvement of EF in bilingual language processing makes bilingualism a form of linguistic multitasking. PMID:25821336

  12. Bilingualism as a Model for Multitasking.

    PubMed

    Poarch, Gregory J; Bialystok, Ellen

    2015-03-01

    Because both languages of bilinguals are constantly active, bilinguals need to manage attention to the target language and avoid interference from the non-target language. This process is likely carried out by recruiting the executive function (EF) system, a system that is also the basis for multitasking. In previous research, bilinguals have been shown to outperform monolinguals on tasks requiring EF, suggesting that practice using EF for language management benefits performance in other tasks as well. The present study examined 203 children, 8-11 years old, who were monolingual, partially bilingual, bilingual, or trilingual performing a flanker task. Two results support the interpretation that bilingualism is related to multitasking. First, bilingual children outperformed monolinguals on the conflict trials in the flanker task, confirming previous results for a bilingual advantage in EF. Second, the inclusion of partial bilinguals and trilinguals set limits on the role of experience: partial bilinguals performed similarly to monolinguals and trilinguals performed similarly to bilinguals, suggesting that degrees of experience are not well-calibrated to improvements in EF. Our conclusion is that the involvement of EF in bilingual language processing makes bilingualism a form of linguistic multitasking.

  13. Prediction of heterotrimeric protein complexes by two-phase learning using neighboring kernels

    PubMed Central

    2014-01-01

    Background Protein complexes play important roles in biological systems such as gene regulatory networks and metabolic pathways. Most methods for predicting protein complexes try to find complexes of size greater than three. It is known, however, that complexes of smaller sizes make up a large fraction of all complexes in several species. In our previous work, we developed a method with several feature space mappings and the domain composition kernel for prediction of heterodimeric protein complexes, which outperforms existing methods. Results We propose methods for prediction of heterotrimeric protein complexes by extending techniques in the previous work, on the basis of the idea that most heterotrimeric protein complexes are not likely to share the same protein with each other. We make use of the discriminant function in support vector machines (SVMs), and design novel feature space mappings for the second phase. As the second classifier, we examine SVMs and relevance vector machines (RVMs). We perform 10-fold cross-validation computational experiments. The results suggest that our proposed two-phase methods and SVM with the extended features outperform the existing method NWE, which was reported to outperform other existing methods such as MCL, MCODE, DPClus, CMC, COACH, RRW, and PPSampler for prediction of heterotrimeric protein complexes. Conclusions We propose two-phase prediction methods with the extended features, the domain composition kernel, SVMs, and RVMs. The two-phase method with the extended features and the domain composition kernel using SVM as the second classifier is particularly useful for prediction of heterotrimeric protein complexes. PMID:24564744

  14. Real-time edge-enhanced optical correlator

    NASA Astrophysics Data System (ADS)

    Shihabi, Mazen M.; Hinedi, Sami M.; Shah, Biren N.

    1992-08-01

    The performance of five symbol lock detectors is compared: the square-law detectors with overlapping (SQOD) and non-overlapping (SQNOD) integrators, the absolute value detectors with overlapping and non-overlapping (AVNOD) integrators, and the signal power estimator detector (SPED). The analysis considers scenarios in which the observation interval is much larger than or equal to the symbol synchronizer loop bandwidth, which had not been considered in previous analyses. The case of setting the threshold in the absence of signal is also considered. It is shown that the SQOD outperforms all others when the threshold is set in the presence of signal, independent of the relationship between loop bandwidth and observation period. On the other hand, the SPED outperforms all others when the threshold is set in the presence of noise only.

  15. Real-time edge-enhanced optical correlator

    NASA Technical Reports Server (NTRS)

    Shihabi, Mazen M. (Inventor); Hinedi, Sami M. (Inventor); Shah, Biren N. (Inventor)

    1992-01-01

    The performance of five symbol lock detectors is compared: the square-law detectors with overlapping (SQOD) and non-overlapping (SQNOD) integrators, the absolute value detectors with overlapping and non-overlapping (AVNOD) integrators, and the signal power estimator detector (SPED). The analysis considers scenarios in which the observation interval is much larger than or equal to the symbol synchronizer loop bandwidth, which had not been considered in previous analyses. The case of setting the threshold in the absence of signal is also considered. It is shown that the SQOD outperforms all others when the threshold is set in the presence of signal, independent of the relationship between loop bandwidth and observation period. On the other hand, the SPED outperforms all others when the threshold is set in the presence of noise only.
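    The square-law statistic behind these two records can be illustrated with a toy BPSK simulation. This is a sketch under assumptions: it uses the textbook form (I1 + I2)² − (I1 − I2)² = 4·I1·I2 with non-overlapping half-symbol integrators, and the paper's exact detector definitions and thresholds are not reproduced here.

```python
import random

def sqnod_statistic(samples, sps):
    # Square-law statistic with non-overlapping integrators: split each
    # symbol interval (sps samples) into two halves, integrate each, and
    # accumulate (I1 + I2)^2 - (I1 - I2)^2 = 4 * I1 * I2.  With aligned
    # timing both halves carry the same symbol, so the products are
    # consistently positive; a half-symbol timing offset straddles
    # symbol transitions and the products average toward zero.
    half = sps // 2
    stat = 0.0
    for k in range(0, len(samples) - sps + 1, sps):
        i1 = sum(samples[k:k + half])
        i2 = sum(samples[k + half:k + sps])
        stat += 4.0 * i1 * i2
    return stat

def bpsk_stream(n_symbols, sps, noise_sd, rng):
    # Rectangular-pulse BPSK: each symbol is +/-1 held for sps samples,
    # plus additive Gaussian noise.
    out = []
    for _ in range(n_symbols):
        s = rng.choice((-1.0, 1.0))
        out.extend(s + rng.gauss(0.0, noise_sd) for _ in range(sps))
    return out
```

With correct timing the statistic is large and positive; shifting the integration window by half a symbol collapses it toward zero, which is the lock/no-lock contrast such detectors threshold on.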

  16. Effects of gender, imagery ability, and sports practice on the performance of a mental rotation task.

    PubMed

    Habacha, Hamdi; Molinaro, Corinne; Dosseville, Fabrice

    2014-01-01

    Mental rotation is one of the main spatial abilities necessary in the spatial transformation of mental images and the manipulation of spatial parameters. Researchers have shown that mental rotation abilities differ between populations depending on several variables. This study uses a mental rotation task to investigate the effects of several factors on the spatial abilities of 277 volunteers. The results demonstrate that high and low imagers performed equally well on this task. Athletes outperformed nonathletes regardless of their discipline, and athletes with greater expertise outperformed those with less experience. The results replicate the previously reported finding that men exhibit better spatial abilities than women. However, with high amounts of practice, the women in the current study were able to perform as well as men.

  17. Efficient sequential and parallel algorithms for record linkage.

    PubMed

    Mamun, Abdullah-Al; Mi, Tian; Aseltine, Robert; Rajasekaran, Sanguthevar

    2014-01-01

    Integrating data from multiple sources is a crucial and challenging problem. Even though there exist numerous algorithms for record linkage or deduplication, they suffer from either large time needs or restrictions on the number of datasets that they can integrate. In this paper we report efficient sequential and parallel algorithms for record linkage which handle any number of datasets and outperform previous algorithms. Our algorithms employ hierarchical clustering algorithms as the basis. A key idea that we use is radix sorting on certain attributes to eliminate identical records before any further processing. Another novel idea is to form a graph that links similar records and find the connected components. Our sequential and parallel algorithms have been tested on a real dataset of 1,083,878 records and synthetic datasets ranging in size from 50,000 to 9,000,000 records. Our sequential algorithm runs at least two times faster, for any dataset, than the previous best-known algorithm, the two-phase algorithm using faster computation of the edit distance (TPA (FCED)). The speedups obtained by our parallel algorithm are almost linear. For example, we get a speedup of 7.5 with 8 cores (residing in a single node), 14.1 with 16 cores (residing in two nodes), and 26.4 with 32 cores (residing in four nodes). We have compared the performance of our sequential algorithm with TPA (FCED) and found that our algorithm outperforms the previous one. The accuracy is the same as that of this previous best-known algorithm.
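    The two core ideas in this abstract — eliminating exact duplicates by grouping on key attributes, then linking similar records and extracting connected components — can be sketched as follows. This is a minimal illustration, not the authors' implementation: hashing stands in for the paper's radix sort, the O(n²) pairwise comparison is for clarity only, and `dedupe_exact`/`link_records` are hypothetical names.

```python
from collections import defaultdict

def dedupe_exact(records, key_fields):
    # Group records that agree exactly on key_fields (the paper uses
    # radix sort for this step; hash-based grouping gives the same
    # groups for illustration).
    groups = defaultdict(list)
    for i, rec in enumerate(records):
        groups[tuple(rec[f] for f in key_fields)].append(i)
    return groups

def edit_distance(a, b):
    # Standard dynamic-programming Levenshtein distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def link_records(records, field, max_dist):
    # Link records whose `field` values are within max_dist edits, then
    # return connected components via union-find with path halving.
    parent = list(range(len(records)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i in range(len(records)):
        for j in range(i + 1, len(records)):  # O(n^2); the paper avoids this
            if edit_distance(records[i][field], records[j][field]) <= max_dist:
                parent[find(i)] = find(j)
    comps = defaultdict(list)
    for i in range(len(records)):
        comps[find(i)].append(i)
    return list(comps.values())
```

Each connected component then represents one real-world entity whose records should be merged.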

  18. New technology in postfire rehab

    Treesearch

    Joe Sabel

    2007-01-01

    PAM-12™ is a recycled office paper byproduct made into a spreadable mulch with added Water Soluble Polyacrylamide (WSPAM), a previously difficult polymer to apply. PAM-12 is extremely versatile and can be applied through several methods. In a field test, PAM-12 outperformed straw in every targeted performance area: erosion control, improving soil hydrophobicity, and...

  19. Phonological Memory and the Acquisition of Grammar in Child L2 Learners

    ERIC Educational Resources Information Center

    Verhagen, Josje; Leseman, Paul; Messer, Marielle

    2015-01-01

    Previous studies show that second language (L2) learners with large phonological memory spans outperform learners with smaller memory spans on tests of L2 grammar. The current study investigated the relationship between phonological memory and L2 grammar in more detail than has been done earlier. Specifically, we asked how phonological memory…

  20. Using direct mail to promote organ donor registration: Two campaigns and a meta-analysis.

    PubMed

    Feeley, Thomas H; Quick, Brian L; Lee, Seyoung

    2016-12-01

    Two direct mail campaigns were undertaken in Rochester and Buffalo, New York, with the goal of enrolling adults aged 50-64 years into the state organ and tissue donation electronic registry. Meta-analytic methods were used to summarize the body of research on the effects of direct mail marketing to promote organ donation registration. In the first study, 40 000 mailers were sent to targeted adults in Rochester, New York, and varied by brochure-only, letter-only, and letter plus brochure mailing conditions. A follow-up letter-only mailer was sent to 20 000 individuals in the Buffalo, New York, area. In a second study, campaign results were combined with previously published direct mail campaigns in a random-effects meta-analysis. The overall registration rates were 1.6% and 4.6% for the Rochester and Buffalo campaigns, and the letter-only condition outperformed the brochure-only and letter plus brochure conditions in the Rochester area campaign. Meta-analysis indicated a 3.3% registration rate across 15 campaigns and 329 137 targeted individuals. Registration rates were higher when targeting 18-year-olds and when direct mail letters were authored by officials affiliated with state departments. Use of direct mail to promote organ donor registration is an inexpensive method to increase enrollments in state registries. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
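    A random-effects pooling of registration rates like the one described here can be sketched with the standard DerSimonian-Laird estimator on the logit scale. This is an assumption — the paper does not state its exact estimator — and the numbers in the test are illustrative, not the campaign data.

```python
import math

def random_effects_pool(events, totals):
    # DerSimonian-Laird random-effects pooling of event proportions on
    # the logit scale, with a 0.5 continuity correction.
    ys, vs = [], []
    for e, n in zip(events, totals):
        e, n = e + 0.5, n + 1.0
        p = e / n
        ys.append(math.log(p / (1.0 - p)))      # logit of each rate
        vs.append(1.0 / e + 1.0 / (n - e))      # within-study variance
    w = [1.0 / v for v in vs]                   # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, ys))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(ys) - 1)) / c)    # between-study variance
    wstar = [1.0 / (v + tau2) for v in vs]      # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wstar, ys)) / sum(wstar)
    return 1.0 / (1.0 + math.exp(-mu))          # pooled proportion
```

With no heterogeneity (Q ≤ df) the estimate collapses to the fixed-effect pooled rate; with heterogeneity, tau² widens the weights toward an unweighted average of the study logits.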

  1. SINE_scan: an efficient tool to discover short interspersed nuclear elements (SINEs) in large-scale genomic datasets

    PubMed Central

    Mao, Hongliang

    2017-01-01

    Motivation: Short Interspersed Nuclear Elements (SINEs) are transposable elements (TEs) that amplify through a copy-and-paste mode via RNA intermediates. The computational identification of new SINEs is challenging because of their weak structural signals and rapid diversification in sequence. Results: Here we report SINE_Scan, a highly efficient program to predict SINE elements in genomic DNA sequences. SINE_Scan integrates hallmarks of SINE transposition, copy number, and structural signals to identify a SINE element. SINE_Scan outperforms the previously published de novo SINE discovery program. It shows high sensitivity and specificity in 19 plant and animal genome assemblies, whose sizes vary from 120 Mb to 3.5 Gb. It identifies numerous new families and substantially increases the estimate of the abundance of SINEs in these genomes. Availability and Implementation: The code of SINE_Scan is freely available at http://github.com/maohlzj/SINE_Scan, implemented in PERL and supported on Linux. Contact: wangh8@fudan.edu.cn Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28062442

  2. Validation of a sensitive PCR assay for the detection of Chelonid fibropapilloma-associated herpesvirus in latent turtle infections.

    PubMed

    Alfaro-Núñez, Alonzo; Gilbert, M Thomas P

    2014-09-01

    The Chelonid fibropapilloma-associated herpesvirus (CFPHV) is hypothesized to be the causative agent of fibropapillomatosis, a neoplastic disease in sea turtles, given its consistent detection by PCR in fibropapilloma tumours. CFPHV has also been detected recently by PCR in tissue samples from clinically healthy (non exhibiting fibropapilloma tumours) turtles, thus representing presumably latent infections of the pathogen. Given that template copy numbers of viruses in latent infections can be very low, extremely sensitive PCR assays are needed to optimize detection efficiency. In this study, efficiency of several PCR assays designed for CFPHV detection is explored and compared to a method published previously. The results show that adoption of a triplet set of singleplex PCR assays outperforms other methods, with an approximately 3-fold increase in detection success in comparison to the standard assay. Thus, a new assay for the detection of CFPHV DNA markers is presented, and adoption of its methodology is recommended in future CFPHV screens among sea turtles. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Unifying error structures in commonly used biotracer mixing models.

    PubMed

    Stock, Brian C; Semmens, Brice X

    2016-10-01

    Mixing models are statistical tools that use biotracers to probabilistically estimate the contribution of multiple sources to a mixture. These biotracers may include contaminants, fatty acids, or stable isotopes, the latter of which are widely used in trophic ecology to estimate the mixed diet of consumers. Bayesian implementations of mixing models using stable isotopes (e.g., MixSIR, SIAR) are regularly used by ecologists for this purpose, but basic questions remain about when each is most appropriate. In this study, we describe the structural differences between common mixing model error formulations in terms of their assumptions about the predation process. We then introduce a new parameterization that unifies these mixing model error structures, as well as implicitly estimates the rate at which consumers sample from source populations (i.e., consumption rate). Using simulations and previously published mixing model datasets, we demonstrate that the new error parameterization outperforms existing models and provides an estimate of consumption. Our results suggest that the error structure introduced here will improve future mixing model estimates of animal diet. © 2016 by the Ecological Society of America.
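    The core mixing-model arithmetic behind this abstract can be shown in its simplest closed form, along with a toy simulation of why consumption rate shows up in the error term. This is a two-source, one-tracer sketch for illustration only; the paper's Bayesian models (MixSIR, SIAR, and the unified parameterization) handle many sources and tracers with full error structures, and `consumer_tracer_mean` is a hypothetical helper.

```python
import random

def two_source_mixture(d_mix, d_a, d_b):
    # Closed-form two-source, one-tracer mixing model:
    # d_mix = p * d_a + (1 - p) * d_b  =>  solve for p, the
    # proportional contribution of source A to the mixture.
    return (d_mix - d_b) / (d_a - d_b)

def consumer_tracer_mean(p, d_a, d_b, n_prey, sd, rng):
    # Tracer value of one consumer that samples n_prey items from the
    # two sources; per-item tracer values vary around each source mean.
    # The more items a consumer eats, the tighter its value clusters
    # around the true mixture -- the process the error structure encodes.
    vals = [(d_a if rng.random() < p else d_b) + rng.gauss(0.0, sd)
            for _ in range(n_prey)]
    return sum(vals) / n_prey
```

For example, a consumer tissue value of -20 per mil between sources at -12 and -28 implies a 50/50 diet; and simulated consumers eating 50 prey items show far less scatter than consumers eating 2, which is the signal that lets a model infer consumption rate from residual variance.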

  4. Neonatal Seizure Detection Using Deep Convolutional Neural Networks.

    PubMed

    Ansari, Amir H; Cherian, Perumpillichira J; Caicedo, Alexander; Naulaers, Gunnar; De Vos, Maarten; Van Huffel, Sabine

    2018-04-02

    Identifying a core set of features is one of the most important steps in the development of an automated seizure detector. In most published studies describing features and seizure classifiers, the features were hand-engineered, which may not be optimal. The main goal of the present paper is to use deep convolutional neural networks (CNNs) and a random forest to automatically optimize feature selection and classification. The input of the proposed classifier is raw multi-channel EEG and the output is the class label: seizure/nonseizure. By training this network, the required features are optimized, while fitting a nonlinear classifier on the features. After training the network with EEG recordings of 26 neonates, the five end layers performing the classification were replaced with a random forest classifier in order to improve the performance. This resulted in a false alarm rate of 0.9 per hour and a seizure detection rate of 77% using a test set of EEG recordings of 22 neonates that also included dubious seizures. The newly proposed CNN classifier outperformed three data-driven feature-based approaches and performed similarly to a previously developed heuristic method.

  5. Step changes in leaf oil accumulation via iterative metabolic engineering.

    PubMed

    Vanhercke, Thomas; Divi, Uday K; El Tahchy, Anna; Liu, Qing; Mitchell, Madeline; Taylor, Matthew C; Eastmond, Peter J; Bryant, Fiona; Mechanicos, Anna; Blundell, Cheryl; Zhi, Yao; Belide, Srinivas; Shrestha, Pushkar; Zhou, Xue-Rong; Ral, Jean-Philippe; White, Rosemary G; Green, Allan; Singh, Surinder P; Petrie, James R

    2017-01-01

    Synthesis and accumulation of plant oils in the entire vegetative biomass offers the potential to deliver yields surpassing those of oilseed crops. However, current levels still fall well short of those typically found in oilseeds. Here we show how transcriptome and biochemical analyses pointed to a futile cycle in a previously established Nicotiana tabacum line, accumulating up to 15% (dry weight) of the storage lipid triacylglycerol in leaf tissue. To overcome this metabolic bottleneck, we either silenced the SDP1 lipase or overexpressed the Arabidopsis thaliana LEC2 transcription factor in this transgenic background. Both strategies independently resulted in the accumulation of 30-33% triacylglycerol in leaf tissues. Our results demonstrate that the combined optimization of de novo fatty acid biosynthesis, storage lipid assembly and lipid turnover in leaf tissue results in a major overhaul of the plant central carbon allocation and lipid metabolism. The resulting further step changes in oil accumulation in the entire plant biomass offers the possibility of delivering yields that outperform current oilseed crops. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  6. Oxytocin increases bias, but not accuracy, in face recognition line-ups.

    PubMed

    Bate, Sarah; Bennetts, Rachel; Parris, Benjamin A; Bindemann, Markus; Udale, Robert; Bussunt, Amanda

    2015-07-01

    Previous work indicates that intranasal inhalation of oxytocin improves face recognition skills, raising the possibility that it may be used in security settings. However, it is unclear whether oxytocin directly acts upon the core face-processing system itself or indirectly improves face recognition via affective or social salience mechanisms. In a double-blind procedure, 60 participants received either an oxytocin or placebo nasal spray before completing the One-in-Ten task-a standardized test of unfamiliar face recognition containing target-present and target-absent line-ups. Participants in the oxytocin condition outperformed those in the placebo condition on target-present trials, yet were more likely to make false-positive errors on target-absent trials. Signal detection analyses indicated that oxytocin induced a more liberal response bias, rather than increasing accuracy per se. These findings support a social salience account of the effects of oxytocin on face recognition and indicate that oxytocin may impede face recognition in certain scenarios. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  7. One Year Performance Results for the Prism Solar Installation at the New Mexico Regional Test Center: Field Data from February 15 2016 - February 14 2017.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Joshua; Burnham, Laurie; Lave, Matthew Samuel

    A 9.6 kW test array of Prism bifacial modules and reference monofacial modules installed in February 2016 at the New Mexico Regional Test Center has produced one year of performance data. The data reveal that the Prism modules are outperforming the monofacial modules, with bifacial gains in energy over the twelve-month period ranging from 17% to 132%, depending on the orientation and ground albedo. These measured bifacial gains were found to be in good agreement with modeled bifacial gains using equations previously published by Prism Solar. The most dramatic increase in performance was seen among the vertically mounted, west-facing modules, where the bifacial modules produced more than double the energy of monofacial modules in the same orientation. Because peak energy generation (mid-morning and mid-afternoon) for these bifacial modules may best match load on the electric grid, the west-facing orientation may be more economically desirable than traditional south-facing module orientations (which peak at solar noon).

  8. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs.

    PubMed

    McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F

    2015-01-01

    Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes.
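    The link frequency and link ratio statistics described in this abstract (and the PMC copy below) can be sketched as follows. This is a hypothetical reconstruction — the paper does not reproduce its exact definitions here — assuming link frequency counts patient records where a problem and medication co-occur, and link ratio normalizes by how often the medication appears at all; `pair_statistics` and `knowledge_base` are illustrative names.

```python
from collections import Counter
from itertools import product

def pair_statistics(patients):
    # For each (problem, medication) pair across patient records:
    #   link frequency = number of records in which the pair co-occurs
    #   link ratio     = link frequency / records containing the medication
    # (Assumed definitions; the paper's may differ.)
    pair_counts, med_counts = Counter(), Counter()
    for rec in patients:
        for m in set(rec["medications"]):
            med_counts[m] += 1
        for p, m in product(set(rec["problems"]), set(rec["medications"])):
            pair_counts[(p, m)] += 1
    return {pair: (n, n / med_counts[pair[1]])
            for pair, n in pair_counts.items()}

def knowledge_base(patients, min_freq, min_ratio):
    # Keep pairs exceeding both thresholds (in the paper, the cutoff
    # was chosen through clinician review).
    return {pair for pair, (f, r) in pair_statistics(patients).items()
            if f >= min_freq and r >= min_ratio}
```

The ratio is what separates a true indication (lisinopril linked to hypertension in most records containing it) from an incidental co-occurrence.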

  9. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs

    PubMed Central

    Wright, A.; Krousel-Wood, M.; Thomas, E. J.; McCoy, J. A.; Sittig, D. F.

    2015-01-01

    Summary Background Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. Objective We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. Methods We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. Results The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. Conclusions We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes. PMID:26171079

  10. Are more complex physiological models of forest ecosystems better choices for plot and regional predictions?

    Treesearch

    Wenchi Jin; Hong S. He; Frank R. Thompson

    2016-01-01

    Process-based forest ecosystem models vary from simple physiological, to complex physiological, to hybrid empirical-physiological models. Previous studies indicate that complex models provide the best prediction at plot scale with a temporal extent of less than 10 years; however, it is largely untested whether complex models outperform the other two types of models...

  11. Natural image sequences constrain dynamic receptive fields and imply a sparse code.

    PubMed

    Häusler, Chris; Susemihl, Alex; Nawrot, Martin P

    2013-11-06

    In their natural environment, animals experience a complex and dynamic visual scenery. Under such natural stimulus conditions, neurons in the visual cortex employ a spatially and temporally sparse code. For the input scenario of natural still images, previous work demonstrated that unsupervised feature learning combined with the constraint of sparse coding can predict physiologically measured receptive fields of simple cells in the primary visual cortex. This convincingly indicated that the mammalian visual system is adapted to the natural spatial input statistics. Here, we extend this approach to the time domain in order to predict dynamic receptive fields that can account for both spatial and temporal sparse activation in biological neurons. We rely on temporal restricted Boltzmann machines and suggest a novel temporal autoencoding training procedure. When tested on a dynamic multi-variate benchmark dataset this method outperformed existing models of this class. Learning features on a large dataset of natural movies allowed us to model spatio-temporal receptive fields for single neurons. They resemble temporally smooth transformations of previously obtained static receptive fields and are thus consistent with existing theories. A neuronal spike response model demonstrates how the dynamic receptive field facilitates temporal and population sparseness. We discuss the potential mechanisms and benefits of a spatially and temporally sparse representation of natural visual input. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Limb-Enhancer Genie: An accessible resource of accurate enhancer predictions in the developing limb

    DOE PAGES

    Monti, Remo; Barozzi, Iros; Osterwalder, Marco; ...

    2017-08-21

    Epigenomic mapping of enhancer-associated chromatin modifications facilitates the genome-wide discovery of tissue-specific enhancers in vivo. However, reliance on single chromatin marks leads to high rates of false-positive predictions. More sophisticated, integrative methods have been described, but commonly suffer from limited accessibility to the resulting predictions and reduced biological interpretability. Here we present the Limb-Enhancer Genie (LEG), a collection of highly accurate, genome-wide predictions of enhancers in the developing limb, available through a user-friendly online interface. We predict limb enhancers using a combination of > 50 published limb-specific datasets and clusters of evolutionarily conserved transcription factor binding sites, taking advantage of the patterns observed at previously in vivo validated elements. By combining different statistical models, our approach outperforms current state-of-the-art methods and provides interpretable measures of feature importance. Our results indicate that including a previously unappreciated score that quantifies tissue-specific nuclease accessibility significantly improves prediction performance. We demonstrate the utility of our approach through in vivo validation of newly predicted elements. Moreover, we describe general features that can guide the type of datasets to include when predicting tissue-specific enhancers genome-wide, while providing an accessible resource to the general biological community and facilitating the functional interpretation of genetic studies of limb malformations.

  13. VSDMIP 1.5: an automated structure- and ligand-based virtual screening platform with a PyMOL graphical user interface.

    PubMed

    Cabrera, Álvaro Cortés; Gil-Redondo, Rubén; Perona, Almudena; Gago, Federico; Morreale, Antonio

    2011-09-01

    A graphical user interface (GUI) for our previously published virtual screening (VS) and data management platform VSDMIP (Gil-Redondo et al. J Comput Aided Mol Design, 23:171-184, 2009) that has been developed as a plugin for the popular molecular visualization program PyMOL is presented. In addition, a ligand-based VS module (LBVS) has been implemented that complements the already existing structure-based VS (SBVS) module and can be used in those cases where the receptor's 3D structure is not known or for pre-filtering purposes. This updated version of VSDMIP is placed in the context of similar available software and its LBVS and SBVS capabilities are tested here on a reduced set of the Directory of Useful Decoys database. Comparison of results from both approaches confirms the trend found in previous studies that LBVS outperforms SBVS. We also show that by combining LBVS and SBVS, and using a cluster of ~100 modern processors, it is possible to perform complete VS studies of several million molecules in less than a month. As the main processes in VSDMIP are 100% scalable, more powerful processors and larger clusters would notably decrease this time span. The plugin is distributed under an academic license upon request from the authors. © Springer Science+Business Media B.V. 2011

  14. Influences of gender and socioeconomic status on the motor proficiency of children in the UK.

    PubMed

    Morley, David; Till, Kevin; Ogilvie, Paul; Turner, Graham

    2015-12-01

    As the development of movement skills is so crucial to a child's involvement in lifelong physical activity and sport, the purpose of this study was to assess the motor proficiency of children aged 4-7 years (range = 4.3-7.2 years), whilst considering gender and socioeconomic status. 369 children (176 females, 193 males, age = 5.96 ± 0.57 years) were assessed for fine motor precision, fine motor integration, manual dexterity, bilateral co-ordination, balance, speed and agility, upper-limb co-ordination and strength. The average standard score for all participants was 44.4 ± 8.9, classifying the participants towards the lower end of the average range. Multivariate analysis of covariance identified significant effects for gender (p<0.001) and socioeconomic status (p<0.001). Females outperformed males for fine motor skills, and males outperformed females for the catch and dribble gross motor skills. Children of high socioeconomic status significantly outperformed those of middle and/or low socioeconomic status for total, fine and gross motor proficiency. Current motor proficiency of primary children aged 4-7 years in the UK is just below average, with differences evident across gender and socioeconomic status. Teachers and sport coaches working with primary-aged children should concentrate on the development of movement skills, whilst considering differences between genders and socioeconomic status. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  15. Efficient sequential and parallel algorithms for record linkage

    PubMed Central

    Mamun, Abdullah-Al; Mi, Tian; Aseltine, Robert; Rajasekaran, Sanguthevar

    2014-01-01

    Background and objective Integrating data from multiple sources is a crucial and challenging problem. Even though there exist numerous algorithms for record linkage or deduplication, they suffer from either large time needs or restrictions on the number of datasets that they can integrate. In this paper we report efficient sequential and parallel algorithms for record linkage which handle any number of datasets and outperform previous algorithms. Methods Our algorithms employ hierarchical clustering algorithms as the basis. A key idea that we use is radix sorting on certain attributes to eliminate identical records before any further processing. Another novel idea is to form a graph that links similar records and find the connected components. Results Our sequential and parallel algorithms have been tested on a real dataset of 1 083 878 records and synthetic datasets ranging in size from 50 000 to 9 000 000 records. Our sequential algorithm runs at least two times faster, for any dataset, than the previous best-known algorithm, the two-phase algorithm using faster computation of the edit distance (TPA (FCED)). The speedups obtained by our parallel algorithm are almost linear. For example, we get a speedup of 7.5 with 8 cores (residing in a single node), 14.1 with 16 cores (residing in two nodes), and 26.4 with 32 cores (residing in four nodes). Conclusions We have compared the performance of our sequential algorithm with TPA (FCED) and found that our algorithm outperforms the previous one. The accuracy is the same as that of this previous best-known algorithm. PMID:24154837
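    The pipeline described above can be sketched in miniature: eliminate identical records first, link the remaining records whose edit distance falls within a threshold, then report connected components as linked clusters. This is a toy illustration only (plain sorting stands in for radix sort, and the threshold value is hypothetical), not the authors' implementation:

```python
from itertools import combinations

def edit_distance(a, b):
    # Classic dynamic-programming Levenshtein distance.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                     prev + (ca != cb))
    return dp[-1]

def link_records(records, threshold=1):
    # Step 1: sorting the record strings groups exact duplicates,
    # so each distinct value is compared only once.
    distinct = sorted(set(records))
    # Step 2: link similar records (edit distance <= threshold)
    # with a union-find structure ...
    parent = list(range(len(distinct)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in combinations(range(len(distinct)), 2):
        if edit_distance(distinct[i], distinct[j]) <= threshold:
            parent[find(i)] = find(j)
    # Step 3: ... and report connected components as linked clusters.
    clusters = {}
    for i, rec in enumerate(distinct):
        clusters.setdefault(find(i), []).append(rec)
    return sorted(sorted(c) for c in clusters.values())

print(link_records(["jon smith", "john smith", "john smith", "mary jones"]))
```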

  16. Smiling on the Inside: The Social Benefits of Suppressing Positive Emotions in Outperformance Situations.

    PubMed

    Schall, Marina; Martiny, Sarah E; Goetz, Thomas; Hall, Nathan C

    2016-05-01

    Although expressing positive emotions is typically socially rewarded, in the present work, we predicted that people suppress positive emotions and thereby experience social benefits when outperformed others are present. We tested our predictions in three experimental studies with high school students. In Studies 1 and 2, we manipulated the type of social situation (outperformance vs. non-outperformance) and assessed suppression of positive emotions. In both studies, individuals reported suppressing positive emotions more in outperformance situations than in non-outperformance situations. In Study 3, we manipulated the social situation (outperformance vs. non-outperformance) as well as the videotaped person's expression of positive emotions (suppression vs. expression). The findings showed that when outperforming others, individuals were indeed evaluated more positively when they suppressed rather than expressed their positive emotions, and demonstrate the importance of the specific social situation with respect to the effects of suppression. © 2016 by the Society for Personality and Social Psychology, Inc.

  17. Multiscale site-response mapping: A case study of Parkfield, California

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Morgan, E.C.; Kaklamanos, J.

    2011-01-01

    The scale of previously proposed methods for mapping site-response ranges from global coverage down to individual urban regions. Typically, spatial coverage and accuracy are inversely related. We use the densely spaced strong-motion stations in Parkfield, California, to estimate the accuracy of different site-response mapping methods and demonstrate a method for integrating multiple site-response estimates from the site to the global scale. This method is simply a weighted mean of a suite of different estimates, where the weights are the inverse of the variance of the individual estimates. Thus, the dominant site-response model varies in space as a function of the accuracy of the different models. For mapping applications, site-response models should be judged in terms of both spatial coverage and the degree of correlation with observed amplifications. Performance varies with period, but in general the Parkfield data show that: (1) where a velocity profile is available, the square-root-of-impedance (SRI) method outperforms the measured VS30 (30 m divided by the S-wave travel time to 30 m depth) and (2) where velocity profiles are unavailable, the topographic slope method outperforms surficial geology for short periods, but geology outperforms slope at longer periods. We develop new equations to estimate site response from topographic slope, derived from the Next Generation Attenuation (NGA) database.
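    The integration scheme described above is a standard inverse-variance weighted mean; a minimal sketch with hypothetical amplification estimates given as (value, variance) pairs from three methods:

```python
def combine_estimates(estimates):
    """Inverse-variance weighted mean: each site-response estimate is a
    (value, variance) pair; weight = 1 / variance, so the most accurate
    model dominates wherever it is available."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    return sum(w * val for w, (val, _) in zip(weights, estimates)) / total

# Hypothetical amplification estimates at one site from three methods:
# a velocity-profile model (low variance), VS30, and topographic slope.
site = [(2.0, 0.1), (2.6, 0.4), (3.0, 0.8)]
print(round(combine_estimates(site), 3))  # prints 2.2
```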

  18. Female Chess Players Outperform Expectations When Playing Men.

    PubMed

    Stafford, Tom

    2018-03-01

    Stereotype threat has been offered as a potential explanation of differential performance between men and women in some cognitive domains. Questions remain about the reliability and generality of the phenomenon. Previous studies have found that stereotype threat is activated in female chess players when they are matched against male players. I used data from over 5.5 million games of international tournament chess and found no evidence of a stereotype-threat effect. In fact, female players outperform expectations when playing men. Further analysis showed no influence of degree of challenge, player age, or prevalence of female role models in national chess leagues on differences in performance when women play men versus when they play women. Though this analysis contradicts one specific mechanism of influence of gender stereotypes, the persistent differences between male and female players suggest that systematic factors do exist and remain to be uncovered.

  19. Experimental investigation of alternative transmission functions: Quantitative evidence for the importance of nonlinear transmission dynamics in host-parasite systems.

    PubMed

    Orlofske, Sarah A; Flaxman, Samuel M; Joseph, Maxwell B; Fenton, Andy; Melbourne, Brett A; Johnson, Pieter T J

    2018-05-01

    Understanding pathogen transmission is crucial for predicting and managing disease. Nonetheless, experimental comparisons of alternative functional forms of transmission remain rare, and those experiments that are conducted are often not designed to test the full range of possible forms. To differentiate among 10 candidate transmission functions, we used a novel experimental design in which we independently varied four factors (duration of exposure, number of parasites, number of hosts and parasite density) in laboratory infection experiments. We used interactions between amphibian hosts and trematode parasites as a model system, and all candidate models incorporated parasite depletion. An additional manipulation involving anaesthesia addressed the effects of host behaviour on transmission form. Across all experiments, nonlinear transmission forms involving either a power law or a negative binomial function were the best-fitting models and consistently outperformed the linear density-dependent and density-independent functions. By testing previously published data for two other host-macroparasite systems, we also found support for the same nonlinear transmission forms. Although manipulations of parasite density are common in transmission studies, the comprehensive set of variables tested in our experiments revealed that variation in density alone was least likely to differentiate among competing transmission functions. Across host-pathogen systems, nonlinear functions may often more accurately represent transmission dynamics and thus provide more realistic predictions for infection. © 2017 The Authors. Journal of Animal Ecology published by John Wiley & Sons Ltd on behalf of British Ecological Society.
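    For illustration, three of the candidate functional forms can be written out directly. The parameter values below are hypothetical, and the negative-binomial form follows the common k·S·ln(1 + βI/k) parameterization, which may differ in detail from the models actually fitted in this study:

```python
import math

def transmission_rate(S, I, form, beta=0.01, p=0.8, q=0.6, k=2.0):
    """Per-unit-time infection rate for S susceptible hosts and I
    parasites under three candidate transmission forms. All parameter
    values here are illustrative, not fitted estimates."""
    if form == "density_dependent":      # linear: beta * S * I
        return beta * S * I
    if form == "power_law":              # nonlinear: beta * S**p * I**q
        return beta * S**p * I**q
    if form == "negative_binomial":      # k * S * ln(1 + beta * I / k)
        return k * S * math.log(1.0 + beta * I / k)
    raise ValueError(form)

for form in ("density_dependent", "power_law", "negative_binomial"):
    print(form, round(transmission_rate(50, 20, form), 3))
```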

  20. Sma3s: a three-step modular annotator for large sequence datasets.

    PubMed

    Muñoz-Mérida, Antonio; Viguera, Enrique; Claros, M Gonzalo; Trelles, Oswaldo; Pérez-Pulido, Antonio J

    2014-08-01

    Automatic sequence annotation is an essential component of modern 'omics' studies, which aim to extract information from large collections of sequence data. Most existing tools use sequence homology to establish evolutionary relationships and assign putative functions to sequences. However, it can be difficult to define a similarity threshold that achieves sufficient coverage without sacrificing annotation quality. Defining the correct configuration is critical and can be challenging for non-specialist users. Thus, the development of robust automatic annotation techniques that generate high-quality annotations without needing expert knowledge would be very valuable for the research community. We present Sma3s, a tool for automatically annotating very large collections of biological sequences from any kind of gene library or genome. Sma3s is composed of three modules that progressively annotate query sequences using either: (i) very similar homologues, (ii) orthologous sequences or (iii) terms enriched in groups of homologous sequences. We trained the system using several random sets of known sequences, demonstrating average sensitivity and specificity values of ~85%. In conclusion, Sma3s is a versatile tool for high-throughput annotation of a wide variety of sequence datasets that outperforms the accuracy of other well-established annotation algorithms, and it can enrich existing database annotations and uncover previously hidden features. Importantly, Sma3s has already been used in the functional annotation of two published transcriptomes. © The Author 2014. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.

  1. Compensated Row-Column Ultrasound Imaging System Using Fisher Tippett Multilayered Conditional Random Field Model

    PubMed Central

    Ben Daya, Ibrahim; Chen, Albert I. H.; Shafiee, Mohammad Javad; Wong, Alexander; Yeow, John T. W.

    2015-01-01

    3-D ultrasound imaging offers unique opportunities in the field of nondestructive testing that cannot be easily found in A-mode and B-mode images. To acquire a 3-D ultrasound image without a mechanically moving transducer, a 2-D array can be used. The row-column technique is preferred over a fully addressed 2-D array as it requires a significantly lower number of interconnections. Recent advances in 3-D row-column ultrasound imaging systems were largely focused on sensor design. However, these imaging systems face three intrinsic challenges that cannot be addressed by improving sensor design alone: speckle noise, sparsity of data in the imaged volume, and the spatially dependent point spread function of the imaging system. In this paper, we propose a compensated row-column ultrasound image reconstruction system using a Fisher-Tippett multilayered conditional random field model. Tests carried out on both simulated and real row-column ultrasound images show the effectiveness of our proposed system compared with other published systems. Visual assessment of the results shows our proposed system's potential at preserving detail and reducing speckle. Quantitative analysis shows that our proposed system outperforms previously published systems when evaluated with metrics such as Peak Signal to Noise Ratio, Coefficient of Correlation, and Effective Number of Looks. These results show the potential of our proposed system as an effective tool for enhancing 3-D row-column imaging. PMID:26658577

  2. On finding bicliques in bipartite graphs: a novel algorithm and its application to the integration of diverse biological data types

    PubMed Central

    2014-01-01

    Background Integrating and analyzing heterogeneous genome-scale data is a huge algorithmic challenge for modern systems biology. Bipartite graphs can be useful for representing relationships across pairs of disparate data types, with the interpretation of these relationships accomplished through an enumeration of maximal bicliques. Most previously known techniques are generally ill-suited to this foundational task because they are relatively inefficient and do not scale effectively. In this paper, a powerful new algorithm is described that produces all maximal bicliques in a bipartite graph. Unlike most previous approaches, the new method neither places undue restrictions on its input nor inflates the problem size. Efficiency is achieved through an innovative exploitation of bipartite graph structure, and through computational reductions that rapidly eliminate non-maximal candidates from the search space. An iterative selection of vertices for consideration based on non-decreasing common neighborhood sizes boosts efficiency and leads to more balanced recursion trees. Results The new technique is implemented and compared to previously published approaches from graph theory and data mining. Formal time and space bounds are derived. Experiments are performed on both random graphs and graphs constructed from functional genomics data. It is shown that the new method substantially outperforms the best previous alternatives. Conclusions The new method is streamlined, efficient, and particularly well-suited to the study of huge and diverse biological data. A robust implementation has been incorporated into GeneWeaver, an online tool for integrating and analyzing functional genomics experiments, available at http://geneweaver.org. The enormous increase in scalability it provides empowers users to study complex and previously unassailable gene-set associations between genes and their biological functions in a hierarchical fashion and on a genome-wide scale. This practical computational resource is adaptable to almost any applications environment in which bipartite graphs can be used to model relationships between pairs of heterogeneous entities. PMID:24731198
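    The enumeration task itself can be illustrated with a deliberately naive brute-force sketch; it is exponential in the left vertex set and bears no resemblance to the paper's efficient algorithm, but it makes the notion of a maximal biclique concrete:

```python
from itertools import combinations

def maximal_bicliques(edges):
    """Enumerate maximal bicliques of a bipartite graph given as
    (left, right) edge pairs. Brute force over left-side subsets,
    so suitable for tiny graphs only."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
    left = sorted(adj)
    found = set()
    for r in range(1, len(left) + 1):
        for subset in combinations(left, r):
            common = set.intersection(*(adj[u] for u in subset))
            if not common:
                continue
            # Expand to every left vertex sharing this neighborhood,
            # so only maximal bicliques are recorded.
            full = frozenset(u for u in left if common <= adj[u])
            found.add((full, frozenset(common)))
    return found

edges = [("a", 1), ("a", 2), ("b", 1), ("b", 2), ("c", 2)]
for L, R in sorted(maximal_bicliques(edges), key=lambda p: sorted(p[0])):
    print(sorted(L), sorted(R))
```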

  3. Improved darunavir genotypic mutation score predicting treatment response for patients infected with HIV-1 subtype B and non-subtype B receiving a salvage regimen.

    PubMed

    De Luca, Andrea; Flandre, Philippe; Dunn, David; Zazzi, Maurizio; Wensing, Annemarie; Santoro, Maria Mercedes; Günthard, Huldrych F; Wittkop, Linda; Kordossis, Theodoros; Garcia, Federico; Castagna, Antonella; Cozzi-Lepri, Alessandro; Churchill, Duncan; De Wit, Stéphane; Brockmeyer, Norbert H; Imaz, Arkaitz; Mussini, Cristina; Obel, Niels; Perno, Carlo Federico; Roca, Bernardino; Reiss, Peter; Schülter, Eugen; Torti, Carlo; van Sighem, Ard; Zangerle, Robert; Descamps, Diane

    2016-05-01

    The objective of this study was to improve the prediction of the impact of HIV-1 protease mutations in different viral subtypes on virological response to darunavir. Darunavir-containing treatment change episodes (TCEs) in patients previously failing PIs were selected from large European databases. HIV-1 subtype B-infected patients were used as the derivation dataset and HIV-1 non-B-infected patients were used as the validation dataset. The adjusted association of each mutation with week 8 HIV RNA change from baseline was analysed by linear regression. A prediction model was derived based on best subset least squares estimation with mutational weights corresponding to regression coefficients. Virological outcome prediction accuracy was compared with that from existing genotypic resistance interpretation systems (GISs) (ANRS 2013, Rega 9.1.0 and HIVdb 7.0). TCEs were selected from 681 subtype B-infected and 199 non-B-infected adults. Accompanying drugs were NRTIs in 87%, NNRTIs in 27% and raltegravir or maraviroc or enfuvirtide in 53%. The prediction model included weighted protease mutations, HIV RNA, CD4 and activity of accompanying drugs. The model's association with week 8 HIV RNA change in the subtype B (derivation) set was R^2 = 0.47 [average squared error (ASE) = 0.67, P < 10^-6]; in the non-B (validation) set, ASE was 0.91. Accuracy investigated by means of area under the receiver operating characteristic curves with a binary response (above the threshold value of HIV RNA reduction) showed that our final model outperformed models with existing interpretation systems in both training and validation sets. A model with a new darunavir-weighted mutation score outperformed existing GISs in both B and non-B subtypes in predicting virological response to darunavir. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved.

  4. Machine Learning Algorithms Outperform Conventional Regression Models in Predicting Development of Hepatocellular Carcinoma

    PubMed Central

    Singal, Amit G.; Mukherjee, Ashin; Elmunzer, B. Joseph; Higgins, Peter DR; Lok, Anna S.; Zhu, Ji; Marrero, Jorge A; Waljee, Akbar K

    2015-01-01

    Background Predictive models for hepatocellular carcinoma (HCC) have been limited by modest accuracy and lack of validation. Machine learning algorithms offer a novel methodology, which may improve HCC risk prognostication among patients with cirrhosis. Our study's aim was to develop and compare predictive models for HCC development among cirrhotic patients, using conventional regression analysis and machine learning algorithms. Methods We enrolled 442 patients with Child A or B cirrhosis at the University of Michigan between January 2004 and September 2006 (UM cohort) and prospectively followed them until HCC development, liver transplantation, death, or study termination. Regression analysis and machine learning algorithms were used to construct predictive models for HCC development, which were tested on an independent validation cohort from the Hepatitis C Antiviral Long-term Treatment against Cirrhosis (HALT-C) Trial. Both models were also compared to the previously published HALT-C model. Discrimination was assessed using receiver operating characteristic curve analysis and diagnostic accuracy was assessed with net reclassification improvement and integrated discrimination improvement statistics. Results After a median follow-up of 3.5 years, 41 patients developed HCC. The UM regression model had a c-statistic of 0.61 (95%CI 0.56-0.67), whereas the machine learning algorithm had a c-statistic of 0.64 (95%CI 0.60–0.69) in the validation cohort. The machine learning algorithm had significantly better diagnostic accuracy as assessed by net reclassification improvement (p<0.001) and integrated discrimination improvement (p=0.04). The HALT-C model had a c-statistic of 0.60 (95%CI 0.50-0.70) in the validation cohort and was outperformed by the machine learning algorithm (p=0.047). Conclusion Machine learning algorithms improve the accuracy of risk stratifying patients with cirrhosis and can be used to accurately identify patients at high-risk for developing HCC. 
PMID:24169273
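    The c-statistic used to compare these models is the probability that a randomly chosen case is ranked above a randomly chosen control, and it can be computed directly from risk scores. A minimal sketch with hypothetical scores and outcomes:

```python
def c_statistic(scores, outcomes):
    """Concordance (c-) statistic for a binary outcome: the fraction of
    case/control pairs in which the case received the higher risk score,
    counting ties as half-concordant. Equivalent to the ROC AUC."""
    cases = [s for s, y in zip(scores, outcomes) if y == 1]
    controls = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = len(cases) * len(controls)
    conc = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return conc / pairs

# Hypothetical risk scores and outcomes (1 = developed HCC).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1, 0, 1, 0, 0, 0]
print(c_statistic(scores, outcomes))  # prints 0.875
```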

  5. Evolutionary History of the Asian Horned Frogs (Megophryinae): Integrative Approaches to Timetree Dating in the Absence of a Fossil Record.

    PubMed

    Mahony, Stephen; Foley, Nicole M; Biju, S D; Teeling, Emma C

    2017-03-01

    Molecular dating studies typically need fossils to calibrate the analyses. Unfortunately, the fossil record is extremely poor or presently nonexistent for many species groups, rendering such dating analysis difficult. One such group is the Asian horned frogs (Megophryinae). Sampling all generic nomina, we combined a novel ∼5 kb dataset composed of four nuclear and three mitochondrial gene fragments to produce a robust phylogeny, with an extensive external morphological study to produce a working taxonomy for the group. Expanding the molecular dataset to include out-groups of fossil-represented ancestral anuran families, we compared the priorless RelTime dating method with the widely used prior-based Bayesian timetree method, MCMCtree, utilizing a novel combination of fossil priors for anuran phylogenetic dating. The phylogeny was then subjected to ancestral phylogeographic analyses, and dating estimates were compared with likely biogeographic vicariant events. Phylogenetic analyses demonstrated that previously proposed systematic hypotheses were incorrect due to the paraphyly of genera. Molecular phylogenetic, morphological, and timetree results support the recognition of Megophryinae as a single genus, Megophrys, with a subgenus level classification. Timetree results using RelTime better corresponded with the known fossil record for the out-group anuran tree. For the priorless in-group, it also outperformed MCMCtree when node date estimates were compared with likely influential historical biogeographic events, providing novel insights into the evolutionary history of this pan-Asian anuran group. Given a relatively small molecular dataset, and limited prior knowledge, this study demonstrates that the computationally rapid RelTime dating tool may outperform more popular and complex prior reliant timetree methodologies. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. 

  6. Oversimplifying quantum factoring.

    PubMed

    Smolin, John A; Smith, Graeme; Vargo, Alexander

    2013-07-11

    Shor's quantum factoring algorithm exponentially outperforms known classical methods. Previous experimental implementations have used simplifications dependent on knowing the factors in advance. However, as we show here, all composite numbers admit simplification of the algorithm to a circuit equivalent to flipping coins. The difficulty of a particular experiment therefore depends on the level of simplification chosen, not the size of the number factored. Valid implementations should not make use of the answer sought.

  7. Human Splice-Site Prediction with Deep Neural Networks.

    PubMed

    Naito, Tatsuhiko

    2018-04-18

    Accurate splice-site prediction is essential to delineate gene structures from sequence data. Several computational techniques have been applied to create a system to predict canonical splice sites. For classification tasks, deep neural networks (DNNs) have achieved record-breaking results and often outperformed other supervised learning techniques. In this study, a new method of splice-site prediction using DNNs was proposed. The proposed system receives an input sequence and returns an answer as to whether it is a splice site. The length of the input is 140 nucleotides, with the consensus sequence (i.e., "GT" and "AG" for the donor and acceptor sites, respectively) in the middle. Each input sequence is fed to the pretrained DNN model, which determines the probability that the input is a splice site. The model consists of convolutional layers and bidirectional long short-term memory network layers. The pretraining and validation were conducted using the data set tested in previously reported methods. The performance evaluation results showed that the proposed method can outperform the previous methods. In addition, the patterns learned by the DNNs were visualized as position frequency matrices (PFMs). Some of the PFMs were very similar to the consensus sequence. The trained DNN model and the brief source code for the prediction system have been uploaded. Further improvement will be achieved following the further development of DNNs.
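    As a small illustration of the described input format (not the paper's actual preprocessing code; the exact centre indexing is an assumption), a 140-nt window with the consensus dinucleotide in the middle can be one-hot encoded as follows:

```python
def encode_splice_window(seq, site="donor"):
    """One-hot encode a 140-nt candidate window with the consensus
    dinucleotide ("GT" for donor, "AG" for acceptor) at the centre,
    taken here as 0-based positions 69-70. Returns a 140 x 4 matrix
    of the kind a convolutional network could consume. Illustrative
    preprocessing only, not the paper's trained model."""
    assert len(seq) == 140, "window must be exactly 140 nt"
    consensus = {"donor": "GT", "acceptor": "AG"}[site]
    if seq[69:71] != consensus:
        raise ValueError(f"no {consensus} consensus at the centre")
    channels = "ACGT"
    return [[1 if base == c else 0 for c in channels] for base in seq]

window = "A" * 69 + "GT" + "C" * 69          # toy donor-site window
matrix = encode_splice_window(window, "donor")
print(len(matrix), matrix[69], matrix[70])   # prints: 140 [0, 0, 1, 0] [0, 0, 0, 1]
```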

  8. Military and academic programs outperform community programs on the American Board of Surgery Examinations.

    PubMed

    Falcone, John L; Charles, Anthony G

    2013-01-01

    There is a paucity of American Board of Surgery (ABS) Qualifying Examination (QE) and Certifying Examination (CE) outcomes comparing residency programs by academic, community, or military affiliation. We hypothesize that the larger academic programs will outperform the smaller community programs. In this retrospective study from 2002 to 2012, examination performance on the ABS QE and CE were obtained from the ABS for all of the general surgery residency programs. Programs were categorized by academic, community, and military affiliation. Both nonparametric and parametric statistics were used for comparison, using an α = 0.05. There were 137/235 (58.3%) academic programs, 90/235 (38.3%) community programs, and 8/235 (3.4%) military programs that satisfied inclusion criteria for this study. The Mann-Whitney U tests showed that the military programs outperformed academic and community programs on the ABS QE and the ABS CE, and had a higher proportion of examinees passing both examinations on the first attempt (all p≤0.02). One-tailed Student t-tests showed that academic programs had higher pass rates than community programs on the ABS QE (85.4%±9.5% vs. 81.9%±11.5%), higher pass rates on the ABS CE (83.6%±8.3% vs. 80.6%±11.0%), and a higher proportion of examinees passing both examinations on the first attempt (0.73±0.12 vs. 0.68±0.15) (all p≤0.01). The chi-square and Fisher exact tests showed that examinees performed highest in military programs, followed by academic programs, and lowest in community programs on the ABS QE and ABS CE (all p≤ 0.01). Military programs have the highest degrees of success on all of the ABS examinations. Academic programs outperform community programs. These results have the potential to affect application patterns to established general surgery residency programs. Copyright © 2013 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  9. Audiovisual preconditioning enhances the efficacy of an anatomical dissection course: A randomised study.

    PubMed

    Collins, Anne M; Quinlan, Christine S; Dolan, Roisin T; O'Neill, Shane P; Tierney, Paul; Cronin, Kevin J; Ridgway, Paul F

    2015-07-01

    The benefits of incorporating audiovisual materials into learning are well recognised. The outcome of integrating such a modality into anatomical education has not been reported previously. The aim of this randomised study was to determine whether audiovisual preconditioning is a useful adjunct to learning at an upper limb dissection course. Prior to instruction, participants completed a standardised pre-course multiple-choice questionnaire (MCQ). The intervention group was subsequently shown a video with a pre-recorded commentary. Following initial dissection, both groups completed a second MCQ. The final MCQ was completed at the conclusion of the course. Statistical analysis confirmed a significant improvement in the performance of both groups over the duration of the three MCQs. The intervention group significantly outperformed their control group counterparts immediately following audiovisual preconditioning and in the post-course MCQ. Audiovisual preconditioning is a practical and effective tool that should be incorporated into future course curricula to optimise learning. Level of evidence This study appraises an intervention in medical education. Kirkpatrick Level 2b (modification of knowledge). Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  10. Global transcriptome analysis of formalin-fixed prostate cancer specimens identifies biomarkers of disease recurrence.

    PubMed

    Long, Qi; Xu, Jianpeng; Osunkoya, Adeboye O; Sannigrahi, Soma; Johnson, Brent A; Zhou, Wei; Gillespie, Theresa; Park, Jong Y; Nam, Robert K; Sugar, Linda; Stanimirovic, Aleksandra; Seth, Arun K; Petros, John A; Moreno, Carlos S

    2014-06-15

    Prostate cancer remains the second leading cause of cancer death in American men and there is an unmet need for biomarkers to identify patients with aggressive disease. In an effort to identify biomarkers of recurrence, we performed global RNA sequencing on 106 formalin-fixed, paraffin-embedded prostatectomy samples from 100 patients at three independent sites, defining a 24-gene signature panel. The 24 genes in this panel function in cell-cycle progression, angiogenesis, hypoxia, apoptosis, PI3K signaling, steroid metabolism, translation, chromatin modification, and transcription. Sixteen genes have been associated with cancer, with five specifically associated with prostate cancer (BTG2, IGFBP3, SIRT1, MXI1, and FDPS). Validation was performed on an independent publicly available dataset of 140 patients, where the new signature panel outperformed markers published previously in terms of predicting biochemical recurrence. Our work also identified differences in gene expression between Gleason pattern 4 + 3 and 3 + 4 tumors, including several genes involved in the epithelial-to-mesenchymal transition and developmental pathways. Overall, this study defines a novel biomarker panel that has the potential to improve the clinical management of prostate cancer. ©2014 American Association for Cancer Research.

  11. Comparison of Bayesian clustering and edge detection methods for inferring boundaries in landscape genetics

    USGS Publications Warehouse

    Safner, T.; Miller, M.P.; McRae, B.H.; Fortin, M.-J.; Manel, S.

    2011-01-01

    Recently, techniques available for identifying clusters of individuals or boundaries between clusters using genetic data from natural populations have expanded rapidly. Consequently, there is a need to evaluate these different techniques. We used spatially-explicit simulation models to compare three spatial Bayesian clustering programs and two edge detection methods. Spatially-structured populations were simulated where a continuous population was subdivided by barriers. We evaluated the ability of each method to correctly identify boundary locations while varying: (i) time after divergence, (ii) strength of isolation by distance, (iii) level of genetic diversity, and (iv) amount of gene flow across barriers. To further evaluate the methods' effectiveness at detecting genetic clusters in natural populations, we used previously published data on North American pumas and a European shrub. Our results show that with simulated and empirical data, the Bayesian spatial clustering algorithms outperformed direct edge detection methods. All methods incorrectly detected boundaries in the presence of strong patterns of isolation by distance. Based on this finding, we support the application of Bayesian spatial clustering algorithms for boundary detection in empirical datasets, with necessary tests for the influence of isolation by distance. © 2011 by the authors; licensee MDPI, Basel, Switzerland.

  12. Efficient weighting strategy for enhancing synchronizability of complex networks

    NASA Astrophysics Data System (ADS)

    Wang, Youquan; Yu, Feng; Huang, Shucheng; Tu, Juanjuan; Chen, Yan

    2018-04-01

    Networks with a high propensity to synchronization are desired in many applications ranging from biology to engineering. In general, there are two ways to enhance the synchronizability of a network: link rewiring and/or link weighting. In this paper, we propose a new link weighting strategy based on the concept of the neighborhood subgroup. The neighborhood subgroup of a node i through node j in a network, i.e. Gi→j, is the set of nodes u that are first-order neighbors of j, excluding i itself. Our proposed weighting scheme uses local and global structural properties of the network, such as node degree, betweenness centrality and closeness centrality. We applied the method to scale-free and Watts-Strogatz networks with different structural properties and show the good performance of the proposed weighting scheme. Furthermore, as model networks cannot capture all essential features of real-world complex networks, we considered a number of undirected and unweighted real-world networks. To the best of our knowledge, the proposed weighting strategy outperformed the previously published weighting methods by enhancing the synchronizability of these real-world networks.
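    The neighborhood-subgroup definition above is simple to state in code. A minimal Python sketch, assuming a toy adjacency-list graph (the full weighting scheme built on top of Gi→j, using betweenness and closeness centrality, is not reproduced here):

    ```python
    # Minimal sketch of the neighborhood-subgroup idea: G_{i->j} is the set of
    # first-order neighbors of j, excluding i itself.

    def neighborhood_subgroup(adj, i, j):
        """Return G_{i->j}: neighbors of j in the adjacency list, without i."""
        return set(adj[j]) - {i}

    # Toy undirected graph as an adjacency list (illustrative only).
    adj = {
        0: [1, 2],
        1: [0, 2, 3],
        2: [0, 1],
        3: [1],
    }

    g_0_1 = neighborhood_subgroup(adj, 0, 1)  # neighbors of 1, excluding 0
    ```

    A real implementation would compute these subgroups for every ordered pair of adjacent nodes and combine them with centrality measures to assign link weights.
    
    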

  13. A perturbative approach for enhancing the performance of time series forecasting.

    PubMed

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, then it can be used to improve the accuracy of the initial model, and a new predictive model is adjusted using the residual series. The whole process is repeated until convergence is reached or the residual series becomes white noise. The output of the method is then given by summing up the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods and with ten results previously reported in the literature. Results show not only that the performance of the initial model is significantly improved, but also that the proposed method outperforms the previously published results. Copyright © 2017 Elsevier Ltd. All rights reserved.
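    The iterative residual-fitting loop described in the abstract can be sketched in a few lines. This is an illustrative stand-in, not the paper's implementation: the stage model here is a plain linear fit and the white-noise check is a crude closeness test, whereas the method admits arbitrary forecasting models and statistical tests for whiteness:

    ```python
    import numpy as np

    def fit_stage(t, y):
        """One 'predictive model' stage: a simple linear fit (illustrative)."""
        a, b = np.polyfit(t, y, 1)
        return lambda t: a * t + b

    def perturbative_forecast(y, n_stages=3):
        """Sum of stage models, each fitted to the previous stage's residual."""
        t = np.arange(len(y), dtype=float)
        residual = np.asarray(y, dtype=float)
        stages = []
        for _ in range(n_stages):
            model = fit_stage(t, residual)
            stages.append(model)
            residual = residual - model(t)
            if np.allclose(residual, 0.0):  # crude stand-in for a white-noise test
                break
        # Final forecast: outputs of all trained models summed "perturbatively".
        return lambda t_new: sum(m(t_new) for m in stages)

    y = 2.0 * np.arange(10) + 1.0          # noiseless toy linear series
    f = perturbative_forecast(y)
    pred = f(np.array([10.0]))             # one-step-ahead forecast
    ```

    With real data the loop continues adding stages until the residual passes a formal whiteness test rather than the exact-zero shortcut used here.
    
    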

  14. Sex Differences in Spatial Memory in Brown-Headed Cowbirds: Males Outperform Females on a Touchscreen Task

    PubMed Central

    Guigueno, Mélanie F.; MacDougall-Shackleton, Scott A.; Sherry, David F.

    2015-01-01

    Spatial cognition in females and males can differ in species in which there are sex-specific patterns in the use of space. Brown-headed cowbirds are brood parasites that show a reversal of the sex-typical space use often seen in mammals. Female cowbirds search for, revisit and parasitize host nests, have a larger hippocampus than males, and have better memory than males for a rewarded location in an open spatial environment. In the current study, we tested female and male cowbirds in breeding and non-breeding conditions on a touchscreen delayed-match-to-sample task using both spatial and colour stimuli. Our goal was to determine whether sex differences in spatial memory in cowbirds generalize to all spatial tasks or are task-dependent. Both sexes performed better on the spatial than on the colour touchscreen task. On the spatial task, breeding males outperformed breeding females. On the colour task, females and males did not differ, but females performed better in breeding condition than in non-breeding condition. Although female cowbirds were observed to outperform males on a previous larger-scale spatial task, males performed better than females on a task testing spatial memory in the cowbirds’ immediate visual field. Spatial abilities in cowbirds can favour males or females depending on the type of spatial task, as has been observed in mammals, including humans. PMID:26083573

  15. Cleaning lateral morphological features of the root canal: the role of streaming and cavitation.

    PubMed

    Robinson, J P; Macedo, R G; Verhaagen, B; Versluis, M; Cooper, P R; van der Sluis, L W M; Walmsley, A D

    2018-01-01

    To investigate the effects of ultrasonic activation file type, lateral canal location and irrigant on the removal of a biofilm-mimicking hydrogel from a fabricated lateral canal. Additionally, the amount of cavitation and streaming was quantified for these parameters. An intracanal sonochemical dosimetry method was used to quantify the cavitation generated by an IrriSafe 25 mm length, size 25 file inside a root canal model filled with filtered degassed/saturated water or three different concentrations of NaOCl. Removal of a hydrogel, demonstrated previously to be an appropriate biofilm mimic, was recorded to measure the lateral canal cleaning rate of two different instruments (IrriSafe 25 mm length, size 25 and K 21 mm length, size 15) activated with a P5 Suprasson (Satelec) at power P8.5 in degassed/saturated water or NaOCl. Removal rates were compared for significant differences using nonparametric Kruskal-Wallis and/or Mann-Whitney U-tests. Streaming was measured using high-speed particle imaging velocimetry at 250 kfps, analysing both the oscillatory and steady flow inside the lateral canals. There was no significant difference in the amount of cavitation between tap water and oversaturated water (P = 0.538), although more cavitation was observed than in degassed water. The highest cavitation signal was generated with NaOCl solutions (1.0%, 4.5%, 9.0%) (P < 0.007) and increased with concentration (P < 0.014). The IrriSafe file significantly outperformed the K-file in removing hydrogel (P < 0.05). Up to 64% of the total hydrogel volume was removed after 20 s. The IrriSafe file typically outperformed the K-file in generating streaming. The oscillatory velocities were higher inside the lateral canal 3 mm from the working length (WL) compared with 6 mm, and were higher for NaOCl than for saturated water, which in turn was higher than for degassed water. Measurements of cavitation and acoustic streaming have provided insight into their contribution to cleaning.
Significant differences in cleaning, cavitation and streaming were found depending on the file type and size, lateral canal location and irrigant used. In general, the IrriSafe file outperformed the K-file, and NaOCl performed better than the other irrigants tested. The cavitation and streaming measurements revealed that both contributed to hydrogel removal and both play a significant role in root canal cleaning. © 2017 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  16. Combining MEDLINE and publisher data to create parallel corpora for the automatic translation of biomedical text

    PubMed Central

    2013-01-01

    Background Most of the institutional and research information in the biomedical domain is available in the form of English text. Even in countries where English is an official language, such as the United States, language can be a barrier for accessing biomedical information for non-native speakers. Recent progress in machine translation suggests that this technique could help make English texts accessible to speakers of other languages. However, the lack of adequate specialized corpora needed to train statistical models currently limits the quality of automatic translations in the biomedical domain. Results We show how a large-sized parallel corpus can automatically be obtained for the biomedical domain, using the MEDLINE database. The corpus generated in this work comprises article titles obtained from MEDLINE and abstract text automatically retrieved from journal websites, which substantially extends the corpora used in previous work. After assessing the quality of the corpus for two language pairs (English/French and English/Spanish) we use the Moses package to train a statistical machine translation model that outperforms previous models for automatic translation of biomedical text. Conclusions We have built translation data sets in the biomedical domain that can easily be extended to other languages available in MEDLINE. These sets can successfully be applied to train statistical machine translation models. While further progress should be made by incorporating out-of-domain corpora and domain-specific lexicons, we believe that this work improves the automatic translation of biomedical texts. PMID:23631733

  17. Normative data for the "Sniffin' Sticks" including tests of odor identification, odor discrimination, and olfactory thresholds: an upgrade based on a group of more than 3,000 subjects.

    PubMed

    Hummel, T; Kobal, G; Gudziol, H; Mackay-Sim, A

    2007-03-01

    "Sniffin' Sticks" is a test of nasal chemosensory function that is based on pen-like odor dispensing devices, introduced some 10 years ago by Kobal and co-workers. It consists of tests for odor threshold, discrimination, and identification. Previous work established its test-retest reliability and validity. Results of the test are presented as a "TDI score", the sum of results obtained for threshold, discrimination, and identification measures. While normative data have been established, they are based on a relatively small number of subjects, especially with regard to subjects older than 55 years, for whom data from only 30 healthy subjects had been used. The present study aimed to remedy this situation. Now data are available from 3,282 subjects as compared to data from 738 subjects published previously. Disregarding sex-related differences, the TDI score at the tenth percentile was 24.9 in subjects younger than 15 years, 30.3 for ages from 16 to 35 years, 27.3 for ages from 36 to 55 years, and 19.6 for subjects older than 55 years. Because the tenth percentile has been defined to separate hyposmia from normosmia, these data can be used as a guide to estimate individual olfactory ability in relation to the subject's age. Absolute hyposmia was defined as the tenth percentile score of 16-35 year old subjects. Unlike previous reports, the present norms are also sex-differentiated, with women outperforming men in the three olfactory tests. Further, the present data suggest specific changes of individual olfactory functions in relation to age, with odor thresholds declining most dramatically compared to odor discrimination and odor identification.

  18. Role of strategies and prior exposure in mental rotation.

    PubMed

    Cherney, Isabelle D; Neff, Nicole L

    2004-06-01

    The purpose of these two studies was to examine sex differences in strategy use and the effect of prior exposure on the performance on Vandenberg and Kuse's 1978 Mental Rotation Test. A total of 152 participants completed the spatial task and self-reported their strategy use. Consistent with previous studies, men outperformed women. Strategy usage did not account for these differences, although guessing did. Previous exposure to the Mental Rotation Test, American College Test scores and frequent computer or video game play predicted performance on the test. These results suggest that prior exposure to spatial tasks may provide cues to improve participants' performance.

  19. Framework for Instructional Technology: Methods of Implementing Adaptive Training and Education

    DTIC Science & Technology

    2014-01-01

    with when they were correct and certain, performed better on a posttest than students who got the same (positive) feedback for every correct...analogous step in previous examples. Students in this condition performed better on a posttest than students who had received fading of worked examples...well on a module posttest would get the next module at a higher level. Students learning with this system outperformed those learning with the

  20. Comparing Bilingual to Monolingual Learners on English Spelling: A Meta-analytic Review.

    PubMed

    Zhao, Jing; Quiroz, Blanca; Dixon, L Quentin; Joshi, R Malatesha

    2016-08-01

    This study reports on a meta-analysis to examine how bilingual learners compare with English monolingual learners on two English spelling outcomes: real-word spelling and pseudo-word spelling. Eighteen studies published in peer-reviewed journals between 1990 and 2014 were retrieved. The study-level variables and characteristics (e.g. sample size, study design and research instruments) were coded, and 29 independent effect sizes across the 18 retrieved studies were analysed. We found that bilinguals outperformed monolinguals on real-word spelling overall and more so in early grades, but monolinguals outperformed bilinguals on pseudo-word spelling. Further, bilinguals at risk for reading difficulties did better on real-word spelling than monolinguals at risk for reading difficulties. Having investigated systematic sources of variability in effect sizes, we conclude that in comparison with their monolingual peers, bilingual learners, especially those from alphabetic L1 backgrounds, are able to master constrained skills, such as English spelling, in the current instructional context. Copyright © 2016 John Wiley & Sons, Ltd.

  1. Abstract Spatial Reasoning as an Autistic Strength

    PubMed Central

    Stevenson, Jennifer L.; Gernsbacher, Morton Ann

    2013-01-01

    Autistic individuals typically excel on spatial tests that measure abstract reasoning, such as the Block Design subtest on intelligence test batteries and the Raven’s Progressive Matrices nonverbal test of intelligence. Such well-replicated findings suggest that abstract spatial processing is a relative and perhaps absolute strength of autistic individuals. However, previous studies have not systematically varied reasoning level – concrete vs. abstract – and test domain – spatial vs. numerical vs. verbal, which the current study did. Autistic participants (N = 72) and non-autistic participants (N = 72) completed a battery of 12 tests that varied by reasoning level (concrete vs. abstract) and domain (spatial vs. numerical vs. verbal). Autistic participants outperformed non-autistic participants on abstract spatial tests. Non-autistic participants did not outperform autistic participants on any of the three domains (spatial, numerical, and verbal) or at either of the two reasoning levels (concrete and abstract), suggesting similarity in abilities between autistic and non-autistic individuals, with abstract spatial reasoning as an autistic strength. PMID:23533615

  2. Anonymizing 1:M microdata with high utility

    PubMed Central

    Gong, Qiyuan; Luo, Junzhou; Yang, Ming; Ni, Weiwei; Li, Xiao-Bai

    2016-01-01

    Preserving privacy and utility during data publishing and data mining is essential for individuals, data providers and researchers. However, studies in this area typically assume that one individual has only one record in a dataset, which is unrealistic in many applications. Having multiple records for an individual leads to new privacy leakages. We call such a dataset a 1:M dataset. In this paper, we propose a novel privacy model called (k, l)-diversity that addresses disclosure risks in 1:M data publishing. Based on this model, we develop an efficient algorithm named 1:M-Generalization to preserve privacy and data utility, and compare it with alternative approaches. Extensive experiments on real-world data show that our approach outperforms the state-of-the-art technique, in terms of data utility and computational cost. PMID:28603388

  3. Tail Biting Trellis Representation of Codes: Decoding and Construction

    NASA Technical Reports Server (NTRS)

    Shao, Rose Y.; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents two new iterative algorithms for decoding linear codes based on their tail biting trellises: one is unidirectional and the other bidirectional. Both algorithms are computationally efficient and achieve virtually optimum error performance with a small number of decoding iterations. They outperform all the previous suboptimal decoding algorithms. The bidirectional algorithm also reduces decoding delay. Also presented in the paper is a method for constructing tail biting trellises for linear block codes.

  4. Support vector regression scoring of receptor-ligand complexes for rank-ordering and virtual screening of chemical libraries.

    PubMed

    Li, Liwei; Wang, Bo; Meroueh, Samy O

    2011-09-26

    The Community Structure-Activity Resource (CSAR) data sets are used to develop and test a support vector machine-based scoring function in regression mode (SVR). Two scoring functions (SVR-KB and SVR-EP) are derived with the objective of reproducing the trend of the experimental binding affinities provided within the two CSAR data sets. The features used to train SVR-KB are knowledge-based pairwise potentials, while SVR-EP is based on physicochemical properties. SVR-KB and SVR-EP were compared to seven other widely used scoring functions, including Glide, X-score, GoldScore, ChemScore, Vina, Dock, and PMF. Results showed that SVR-KB trained with features obtained from three-dimensional complexes of the PDBbind data set outperformed all other scoring functions, including the best-performing X-score, by nearly 0.1 using three correlation coefficients, namely Pearson, Spearman, and Kendall. Interestingly, higher performance in rank ordering did not translate into greater enrichment in virtual screening assessed using the 40 targets of the Directory of Useful Decoys (DUD). To remedy this situation, a variant of SVR-KB (SVR-KBD) was developed by following a target-specific tailoring strategy that we had previously employed to derive SVM-SP. SVR-KBD showed a much higher enrichment, outperforming all other scoring functions tested, and was comparable in performance to our previously derived scoring function SVM-SP.
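    Rank-ordering performance of a scoring function, as in the comparison above, is summarized with Pearson, Spearman and Kendall coefficients. A minimal numpy sketch of the Spearman variant (a rank transform followed by Pearson correlation); the score/affinity values are hypothetical and ties are not handled:

    ```python
    import numpy as np

    def rankdata(x):
        """Simple 1-based ranks; ties are not handled (fine for illustration)."""
        order = np.argsort(x)
        ranks = np.empty(len(x))
        ranks[order] = np.arange(1, len(x) + 1)
        return ranks

    def spearman(x, y):
        """Spearman rho = Pearson correlation of the rank-transformed data."""
        rx, ry = rankdata(x), rankdata(y)
        return np.corrcoef(rx, ry)[0, 1]

    scores     = np.array([1.2, 3.4, 2.2, 5.0])  # hypothetical predicted scores
    affinities = np.array([0.5, 2.0, 1.0, 3.1])  # hypothetical measured affinities

    rho = spearman(scores, affinities)  # rankings agree exactly here -> 1.0
    ```

    In practice one would use a library routine that handles ties (e.g. an average-rank correction), but the rank-then-correlate structure is the same.
    
    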

  5. Optimal healthcare decision making under multiple mathematical models: application in prostate cancer screening.

    PubMed

    Bertsimas, Dimitris; Silberholz, John; Trikalinos, Thomas

    2018-03-01

    Important decisions related to human health, such as screening strategies for cancer, need to be made without a satisfactory understanding of the underlying biological and other processes. Rather, they are often informed by mathematical models that approximate reality. Often multiple models have been made to study the same phenomenon, which may lead to conflicting decisions. It is natural to seek a decision making process that identifies decisions that all models find to be effective, and we propose such a framework in this work. We apply the framework in prostate cancer screening to identify prostate-specific antigen (PSA)-based strategies that perform well under all considered models. We use heuristic search to identify strategies that trade off between optimizing the average across all models' assessments and being "conservative" by optimizing the most pessimistic model assessment. We identified three recently published mathematical models that can estimate quality-adjusted life expectancy (QALE) of PSA-based screening strategies and identified 64 strategies that trade off between maximizing the average and the most pessimistic model assessments. All prescribe PSA thresholds that increase with age, and 57 involve biennial screening. Strategies with higher assessments with the pessimistic model start screening later, stop screening earlier, and use higher PSA thresholds at earlier ages. The 64 strategies outperform 22 previously published expert-generated strategies. The 41 most "conservative" ones remained better than no screening with all models in extensive sensitivity analyses. We augment current comparative modeling approaches by identifying strategies that perform well under all models, for various degrees of decision makers' conservativeness.
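    The trade-off between the average model assessment and the most pessimistic one can be written as a one-line objective. A minimal sketch; the lambda knob and the QALE numbers are illustrative, not the paper's heuristic search:

    ```python
    # Sketch of trading off the average assessment across models against the
    # most pessimistic single-model assessment of a candidate strategy.

    def robust_value(assessments, lam=0.5):
        """lam=0 -> pure average; lam=1 -> worst-case (most pessimistic model)."""
        avg = sum(assessments) / len(assessments)
        worst = min(assessments)
        return (1 - lam) * avg + lam * worst

    # Hypothetical QALE estimates of one screening strategy under three models:
    qale = [11.2, 11.8, 10.9]
    score_avg  = robust_value(qale, lam=0.0)   # average across models
    score_cons = robust_value(qale, lam=1.0)   # most conservative assessment
    ```

    Sweeping lam from 0 to 1 and optimizing over strategies at each value traces out the kind of average-vs-pessimistic frontier the abstract describes.
    
    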

  6. PANTHER-PSEP: predicting disease-causing genetic variants using position-specific evolutionary preservation.

    PubMed

    Tang, Haiming; Thomas, Paul D

    2016-07-15

    PANTHER-PSEP is a new software tool for predicting non-synonymous genetic variants that may play a causal role in human disease. Several previous variant pathogenicity prediction methods have been proposed that quantify evolutionary conservation among homologous proteins from different organisms. PANTHER-PSEP employs a related but distinct metric based on 'evolutionary preservation': homologous proteins are used to reconstruct the likely sequences of ancestral proteins at nodes in a phylogenetic tree, and the history of each amino acid can be traced back in time from its current state to estimate how long that state has been preserved in its ancestors. Here, we describe the PSEP tool, and assess its performance on standard benchmarks for distinguishing disease-associated from neutral variation in humans. On these benchmarks, PSEP outperforms not only previous tools that utilize evolutionary conservation, but also several highly used tools that include multiple other sources of information as well. For predicting pathogenic human variants, the trace back of course starts with a human 'reference' protein sequence, but the PSEP tool can also be applied to predicting deleterious or pathogenic variants in reference proteins from any of the ∼100 other species in the PANTHER database. PANTHER-PSEP is freely available on the web at http://pantherdb.org/tools/csnpScoreForm.jsp. Users can also download the command-line tool at ftp://ftp.pantherdb.org/cSNP_analysis/PSEP/. Contact: pdthomas@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
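    The 'evolutionary preservation' idea, tracing a residue's current state back through reconstructed ancestors, can be sketched as a walk toward the root that accumulates branch lengths until the state first changes. The tree encoding and states below are illustrative toys, not PANTHER's data structures:

    ```python
    # Sketch of the preservation metric: sum branch lengths from a leaf toward
    # the root while the reconstructed ancestral state matches the leaf's state.

    def preservation_time(leaf, parent, branch_len, state):
        """Accumulate branch lengths upward while the amino-acid state persists."""
        node, total = leaf, 0.0
        while parent.get(node) is not None and state[parent[node]] == state[node]:
            total += branch_len[node]   # branch from node up to its parent
            node = parent[node]
        return total

    # Toy tree: human <- anc1 <- anc2 (root); the state changes at the root.
    parent     = {"human": "anc1", "anc1": "anc2", "anc2": None}
    branch_len = {"human": 0.1, "anc1": 0.3, "anc2": 0.0}
    state      = {"human": "A", "anc1": "A", "anc2": "S"}

    t = preservation_time("human", parent, branch_len, state)
    ```

    Here the state A is preserved only along the human branch (the ancestor one level higher already differs), so the preservation time is just that branch's length; a long-preserved residue would accumulate many branches, and a variant disrupting it would be flagged as more likely pathogenic.
    
    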

  7. Approximation algorithms for the min-power symmetric connectivity problem

    NASA Astrophysics Data System (ADS)

    Plotnikov, Roman; Erzin, Adil; Mladenovic, Nenad

    2016-10-01

    We consider the NP-hard problem of synthesizing an optimal spanning communication subgraph in a given arbitrary simple edge-weighted graph. This problem arises in wireless networks when minimizing the total transmission power consumption. We propose several new heuristics based on the variable neighborhood search metaheuristic for the approximate solution of the problem. We performed a numerical experiment in which all proposed algorithms were executed on randomly generated test samples. For these instances, on average, our algorithms outperform the previously known heuristics.
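    In the standard min-power symmetric connectivity objective, each node's transmission power must reach its farthest neighbor in the chosen subgraph, so the cost of a candidate spanning subgraph is the sum over nodes of the heaviest incident edge. A minimal sketch of evaluating that objective (the toy tree is illustrative; the search heuristics themselves are not reproduced):

    ```python
    # Objective of the min-power symmetric connectivity problem: each node's
    # power equals the weight of its heaviest incident edge in the chosen
    # spanning subgraph; minimize the sum of node powers.

    def total_power(nodes, edges):
        """edges: iterable of (u, v, w) in the chosen spanning subgraph."""
        power = {v: 0.0 for v in nodes}
        for u, v, w in edges:
            power[u] = max(power[u], w)
            power[v] = max(power[v], w)
        return sum(power.values())

    # Toy spanning tree on 3 nodes: node 1 must reach both neighbors.
    cost = total_power([0, 1, 2], [(0, 1, 2.0), (1, 2, 3.0)])
    ```

    A variable neighborhood search would repeatedly perturb the edge set (e.g. swap edges while keeping connectivity) and keep changes that reduce this total.
    
    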

  8. The Achilles' heel of left ventricular assist device therapy: right ventricle.

    PubMed

    Ranganath, Neel K; Smith, Deane E; Moazami, Nader

    2018-06-01

    Many patients either suffer from persistent right ventricular failure (RVF) at the time of left ventricular assist device (LVAD) implantation or have ongoing symptoms consistent with RVF during chronic mechanical circulatory support. The lack of long-term right ventricular assist devices (RVADs) has limited the impact that mechanical circulatory support can provide to patients with biventricular failure. We aim to review the entire spectrum of RVF in patients receiving LVADs and reflect on why this entity remains the Achilles' heel of LVAD therapy. In the early postoperative period, LVAD implantation reduces right ventricle (RV) afterload, but RV dysfunction may be exacerbated secondary to increased venous return. With prolonged therapy, the decreased RV afterload leads to improved RV contractile function. Bayesian statistical models outperform previously published preoperative risk scores by considering inter-relationships and conditional probabilities amongst independent variables. Various echocardiographic parameters and the pulmonary artery pulsatility index have shown promise in predicting post-LVAD RVF. Recent publications have delineated the emergence of 'delayed' RVF. Several devices are currently being investigated for use as RVADs. Post-LVAD RVF depends on the RV's ability to adapt to acute hemodynamic changes imposed by the LVAD. Management options are limited due to the lack of an easily implantable, chronic-use RVAD.

  9. Simultaneous determination of thirteen different steroid hormones using micro UHPLC-MS/MS with on-line SPE system.

    PubMed

    Márta, Zoltán; Bobály, Balázs; Fekete, Jenő; Magda, Balázs; Imre, Tímea; Mészáros, Katalin Viola; Bálint, Mária; Szabó, Pál Tamás

    2018-02-20

    Ultratrace analysis of sample components requires excellent analytical performance in terms of limits of quantitation (LOQ). Micro UHPLC coupled to sensitive tandem mass spectrometry provides a state-of-the-art solution for such analytical problems. Using on-line SPE with column switching on a micro UHPLC-MS/MS system made it possible to decrease LOQs without any complex sample preparation protocol. The presented method is capable of reaching satisfactorily low LOQ values for the analysis of thirteen different steroid molecules from human plasma without the commonly used off-line SPE or compound derivatization. Steroids were determined by using two simple sample preparation methods, based on lower and higher plasma steroid concentrations. In the first method, higher analyte concentrations were directly determined after protein precipitation with methanol. The organic phase obtained from the precipitation was diluted with water and directly injected into the LC-MS system. In the second method, low steroid levels were determined by concentrating the organic phase after steroid extraction. In this case, analytes were extracted with ethyl acetate and reconstituted in 90/10 water/acetonitrile following evaporation to dryness. This step provided much lower LOQs, outperforming previously published values. The method has been validated and subsequently applied to clinical laboratory measurement. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Ensemble Clustering Classification compete SVM and One-Class classifiers applied on plant microRNAs Data.

    PubMed

    Yousef, Malik; Khalifa, Waleed; AbedAllah, Loai

    2016-12-22

    The performance of many learning and data mining algorithms depends critically on suitable metrics to assess efficiency over the input space. Learning a suitable metric from examples may, therefore, be the key to successful application of these algorithms. We have demonstrated that k-nearest neighbor (kNN) classification can be significantly improved by learning a distance metric from labeled examples. The clustering ensemble is used to define the distance between points with respect to how they co-cluster. This distance is then used within the framework of the kNN algorithm to define a classifier named the ensemble clustering kNN classifier (EC-kNN). In many instances in our experiments we achieved the highest accuracy while SVM failed to perform as well. In this study, we compare the performance of a two-class classifier using EC-kNN with different one-class and two-class classifiers. The comparison was applied to seven different plant microRNA species considering eight feature selection methods. In this study, the averaged results show that EC-kNN outperforms all other methods employed here and previously published results for the same data. In conclusion, this study shows that the chosen classifier achieves high performance when the distance metric is carefully chosen.
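    The co-clustering distance at the heart of EC-kNN is easy to sketch: run several clusterings, then define the distance between two points as the fraction of runs in which they were not placed in the same cluster. The labelings below are illustrative, not the paper's data:

    ```python
    # Sketch of the ensemble-clustering distance used by EC-kNN.

    def ensemble_distance(labelings, i, j):
        """1 - (fraction of clustering runs in which points i and j co-cluster)."""
        same = sum(1 for labels in labelings if labels[i] == labels[j])
        return 1.0 - same / len(labelings)

    # Three hypothetical clustering runs over four points:
    runs = [
        [0, 0, 1, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0],
    ]

    d01 = ensemble_distance(runs, 0, 1)  # co-clustered in every run
    d03 = ensemble_distance(runs, 0, 3)  # never co-clustered
    ```

    A kNN classifier then uses this matrix of pairwise distances in place of, say, Euclidean distance when selecting each query point's nearest labeled neighbors.
    
    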

  11. Ensemble Clustering Classification Applied to Competing SVM and One-Class Classifiers Exemplified by Plant MicroRNAs Data.

    PubMed

    Yousef, Malik; Khalifa, Waleed; AbdAllah, Loai

    2016-12-01

    The performance of many learning and data mining algorithms depends critically on suitable metrics to assess efficiency over the input space. Learning a suitable metric from examples may, therefore, be the key to successful application of these algorithms. We have demonstrated that k-nearest neighbor (kNN) classification can be significantly improved by learning a distance metric from labeled examples. The clustering ensemble is used to define the distance between points with respect to how they co-cluster. This distance is then used within the framework of the kNN algorithm to define a classifier named the ensemble clustering kNN classifier (EC-kNN). In many instances in our experiments we achieved the highest accuracy while SVM failed to perform as well. In this study, we compare the performance of a two-class classifier using EC-kNN with different one-class and two-class classifiers. The comparison was applied to seven different plant microRNA species considering eight feature selection methods. In this study, the averaged results show that EC-kNN outperforms all other methods employed here and previously published results for the same data. In conclusion, this study shows that the chosen classifier achieves high performance when the distance metric is carefully chosen.

  12. A comprehensive statistical classifier of foci in the cell transformation assay for carcinogenicity testing.

    PubMed

    Callegaro, Giulia; Malkoc, Kasja; Corvi, Raffaella; Urani, Chiara; Stefanini, Federico M

    2017-12-01

    The identification of the carcinogenic risk of chemicals is currently mainly based on animal studies. The in vitro Cell Transformation Assays (CTAs) are a promising alternative to be considered in an integrated approach. CTAs measure the induction of foci of transformed cells. CTAs model key stages of the in vivo neoplastic process and are able to detect both genotoxic and some non-genotoxic compounds, being the only in vitro method able to deal with the latter. Despite their favorable features, CTAs can be further improved, especially by reducing the possible subjectivity arising from the last phase of the protocol, namely visual scoring of foci using coded morphological features. Taking advantage of digital image analysis, we aim to translate morphological features into statistical descriptors of foci images and to use them to mimic the classification performance of the visual scorer in discriminating between transformed and non-transformed foci. Here we present a classifier based on five descriptors trained on a dataset of 1364 foci, obtained with different compounds and concentrations. Our classifier showed accuracy, sensitivity and specificity equal to 0.77 and an area under the curve (AUC) of 0.84. The presented classifier outperforms a previously published model. Copyright © 2017 Elsevier Ltd. All rights reserved.
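    The AUC figure reported above has a direct probabilistic reading: it is the chance that a randomly chosen transformed focus receives a higher classifier score than a randomly chosen non-transformed one (the Mann-Whitney formulation). A minimal sketch with made-up scores:

    ```python
    # Minimal AUC sketch (Mann-Whitney formulation): the probability that a
    # random positive example outscores a random negative one; ties count half.

    def auc(scores_pos, scores_neg):
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in scores_pos for n in scores_neg)
        return wins / (len(scores_pos) * len(scores_neg))

    # Hypothetical scores for transformed (positive) and normal (negative) foci:
    a = auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.3])
    ```

    This pairwise formulation is equivalent to integrating the ROC curve, which is how libraries usually compute it.
    
    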

  13. An Ensemble Approach for Drug Side Effect Prediction

    PubMed Central

    Jahid, Md Jamiul; Ruan, Jianhua

    2014-01-01

    In silico prediction of drug side-effects in the early stages of drug development is becoming increasingly popular, as it reduces both the time and the cost of drug design. In this article we propose an ensemble approach to predict the side-effects of drug molecules from their chemical structure. Our idea originates from the observation that similar drugs have similar side-effects. Based on this observation, we design an ensemble approach that combines the results of different classification models, where each model is generated from a different set of similar drugs. We applied our approach to 1385 side-effects in the SIDER database for 888 drugs. Results show that our approach outperformed previously published approaches and standard classifiers. Furthermore, we applied our method to a number of uncharacterized drug molecules in the DrugBank database and predicted their side-effect profiles for future use. Results from various sources confirm that our method is able to predict the side-effects of uncharacterized drugs and, more importantly, to predict rare side-effects, which are often ignored by other approaches. The method described in this article can be used to predict side-effects at an early stage of drug design, reducing experimental cost and time. PMID:25327524
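
    The combination step of such a similarity-driven ensemble can be sketched as a weighted average of per-model probabilities. This is a hedged sketch, not the authors' method: the function name, the dict-based score format, and the use of similarity weights are all assumptions made for illustration.

    ```python
    def ensemble_side_effect_scores(model_scores, similarities):
        """Combine per-model side-effect probabilities into one profile.

        model_scores: list of dicts {side_effect: probability}, one per model
                      (each model trained on a different set of similar drugs).
        similarities: one weight per model, e.g. how similar that model's
                      training drugs are to the query drug."""
        total = sum(similarities)
        combined = {}
        for scores, w in zip(model_scores, similarities):
            for side_effect, p in scores.items():
                combined[side_effect] = combined.get(side_effect, 0.0) + w * p
        return {se: v / total for se, v in combined.items()}

    # A model built from very similar drugs (weight 3.0) dominates one
    # built from less similar drugs (weight 1.0).
    profile = ensemble_side_effect_scores(
        [{'nausea': 0.8}, {'nausea': 0.4}],
        [1.0, 3.0],
    )
    ```

    Weighting by similarity encodes the abstract's core observation directly: models trained on drugs most like the query drug get the largest say in its predicted profile.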

  14. OH-PRED: prediction of protein hydroxylation sites by incorporating adapted normal distribution bi-profile Bayes feature extraction and physicochemical properties of amino acids.

    PubMed

    Jia, Cang-Zhi; He, Wen-Ying; Yao, Yu-Hua

    2017-03-01

    Hydroxylation of proline or lysine residues in proteins is a common post-translational modification event, and such modifications are found in many physiological and pathological processes. Nonetheless, the exact molecular mechanism of hydroxylation remains under investigation. Because experimental identification of hydroxylation is time-consuming and expensive, bioinformatics tools with high accuracy represent desirable alternatives for large-scale rapid identification of protein hydroxylation sites. In view of this, we developed a support vector machine-based tool, OH-PRED, for the prediction of protein hydroxylation sites using adapted normal distribution bi-profile Bayes feature extraction in combination with the physicochemical property indexes of the amino acids. In a jackknife cross-validation, OH-PRED yields an accuracy of 91.88% and a Matthews correlation coefficient (MCC) of 0.838 for the prediction of hydroxyproline sites, and an accuracy of 97.42% and an MCC of 0.949 for the prediction of hydroxylysine sites. These results demonstrate that OH-PRED significantly increased the prediction accuracy for hydroxyproline and hydroxylysine sites, by 7.37% and 14.09% respectively, when compared with the latest predictor, PredHydroxy. In independent tests, OH-PRED also outperforms previously published methods.

  15. Impact of vocational interests, previous academic experience, gender and age on Situational Judgement Test performance.

    PubMed

    Schripsema, Nienke R; van Trigt, Anke M; Borleffs, Jan C C; Cohen-Schotanus, Janke

    2017-05-01

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree programme in Medicine at University of Groningen, the Netherlands. All applicants for the academic year 2015-2016 were included and had to choose between learning communities Global Health (n = 126), Sustainable Care (n = 149), Intramural Care (n = 225), or Molecular Medicine (n = 116). This choice was used as a proxy for vocational interest. In addition, all graduate-entry applicants for academic year 2015-2016 (n = 213) were included to examine the effect of previous academic experience on performance. We used MANCOVA analyses with Bonferroni post hoc multiple comparisons tests for applicant performance on a six-scenario SJT. The MANCOVA analyses showed that for all scenarios, the independent variables were significantly related to performance (Pillai's Trace: 0.02-0.47, p < .01). Vocational interest was related to performance on three scenarios (p < .01). Graduate-entry applicants outperformed all other groups on three scenarios (p < .01) and at least one other group on the other three scenarios (p < .01). Female applicants outperformed male applicants on three scenarios (p < .01) and age was positively related to performance on two scenarios (p < .05). A good fit between applicants' vocational interests and SJT scenario was related to better performance, as was previous academic experience. Gender and age were related to performance on SJT scenarios in different settings. The first effect in particular might be helpful in selecting appropriate candidates for areas of health care in which more professionals are needed.

  16. Multivariate decoding of brain images using ordinal regression.

    PubMed

    Doyle, O M; Ashburner, J; Zelaya, F O; Williams, S C R; Mehta, M A; Marquand, A F

    2013-11-01

    Neuroimaging data are increasingly being used to predict potential outcomes or groupings, such as clinical severity, drug dose response, and transitional illness states. In these examples, the variable (target) we want to predict is ordinal in nature. Conventional classification schemes assume that the targets are nominal and hence ignore their ranked nature, whereas parametric and/or non-parametric regression models enforce a metric notion of distance between classes. Here, we propose a novel, alternative multivariate approach that overcomes these limitations - whole brain probabilistic ordinal regression using a Gaussian process framework. We applied this technique to two data sets of pharmacological neuroimaging data from healthy volunteers. The first study was designed to investigate the effect of ketamine on brain activity and its subsequent modulation with two compounds - lamotrigine and risperidone. The second study investigates the effect of scopolamine on cerebral blood flow and its modulation using donepezil. We compared ordinal regression to multi-class classification schemes and metric regression. Considering the modulation of ketamine with lamotrigine, we found that ordinal regression significantly outperformed multi-class classification and metric regression in terms of accuracy and mean absolute error. However, for risperidone, ordinal regression significantly outperformed metric regression but performed similarly to multi-class classification in terms of both accuracy and mean absolute error. For the scopolamine data set, ordinal regression was found to outperform both multi-class and metric regression techniques considering the regional cerebral blood flow in the anterior cingulate cortex. Ordinal regression was thus the only method that performed well in all cases. 
Our results indicate the potential of an ordinal regression approach for neuroimaging data while providing a fully probabilistic framework with elegant approaches for model selection. Copyright © 2013. Published by Elsevier Inc.
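
    To make the "ordinal, not nominal" distinction concrete: one standard baseline (the Frank and Hall decomposition, not the Gaussian-process method of this paper) turns an ordinal target with K ordered levels into K-1 "is the target above level k?" binary problems, so the ranking information is preserved. The function names and the toy probabilities below are assumptions made for illustration.

    ```python
    def ordinal_encode(y, levels):
        """Frank & Hall decomposition: an ordinal target over K ordered
        levels becomes K-1 binary targets, the k-th asking 'is y > level_k?'."""
        return [int(y > lv) for lv in levels[:-1]]

    def ordinal_decode(greater_probs, levels):
        """Convert K-1 'greater-than' probabilities back into per-level
        probabilities and return the most likely level.

        P(level_0) = 1 - p_0,  P(level_k) = p_{k-1} - p_k,  P(level_K-1) = p_{K-2}."""
        per_level = []
        prev = 1.0
        for p in list(greater_probs) + [0.0]:
            per_level.append(prev - p)
            prev = p
        best = max(range(len(levels)), key=lambda i: per_level[i])
        return levels[best]

    # Three severity levels 0 < 1 < 2; hypothetical classifier outputs.
    codes = ordinal_encode(1, [0, 1, 2])          # -> [1, 0]
    predicted = ordinal_decode([0.9, 0.2], [0, 1, 2])
    ```

    The decomposition respects ordering where plain multi-class classification would not, which is exactly the limitation of nominal schemes the abstract points out.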

  17. Hybrid Power Management for Office Equipment

    NASA Astrophysics Data System (ADS)

    Gingade, Ganesh P.

    Office machines (such as printers, scanners, fax machines, and copiers) can consume significant amounts of power, yet few studies have been devoted to power management of office equipment. Most office machines have sleep modes to save power, and power management of these machines is usually timeout-based: a machine sleeps after it has been idle long enough. Setting the timeout duration can be difficult: if it is too long, the machine wastes power while idle; if it is too short, the machine sleeps too soon and too often, and the wakeup delay can significantly degrade productivity. Power management is thus a tradeoff between saving energy and keeping response times short. Many power management policies have been published, and one policy may outperform another in some scenarios; there is no definite conclusion that any single policy is always better. This thesis describes two methods for office equipment power management. The first method adaptively reduces power subject to a constraint on the wakeup delay. The second method is a hybrid that selects the most appropriate policy from multiple candidate policies. Using six months of request traces from 18 different offices, we demonstrate that the hybrid policy outperforms the individual policies. We also find that power management based on business hours does not produce consistent energy savings.
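
    The energy/delay tradeoff of a timeout policy can be sketched with a tiny trace simulation. This is a simplified model under stated assumptions, not the thesis's evaluation framework: the function name and parameters are invented, and transition energy (the extra power drawn while waking up) is ignored for brevity.

    ```python
    def timeout_policy_cost(idle_gaps, timeout, p_active, p_sleep, wake_delay):
        """Return (energy, total_wakeup_delay) of a timeout-based sleep policy.

        idle_gaps:  lengths (seconds) of the idle periods between requests.
        timeout:    seconds of idleness before the machine goes to sleep.
        p_active:   power draw (watts) while awake and idle.
        p_sleep:    power draw (watts) while asleep.
        wake_delay: seconds the next user waits whenever the machine slept."""
        energy = 0.0
        delay = 0.0
        for gap in idle_gaps:
            if gap <= timeout:
                energy += gap * p_active                 # never slept
            else:
                energy += timeout * p_active + (gap - timeout) * p_sleep
                delay += wake_delay                      # user pays the wakeup cost
        return energy, delay

    # A short gap (no sleep) and a long gap (sleeps after the timeout).
    energy, delay = timeout_policy_cost([10.0, 100.0], 30.0, 50.0, 5.0, 6.0)
    ```

    Sweeping `timeout` over a real trace exposes the tradeoff the abstract describes: a longer timeout drives `delay` toward zero while `energy` grows, and vice versa, which is why no single fixed policy wins in every scenario.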

  18. Integration of copy number and transcriptomics provides risk stratification in prostate cancer: A discovery and validation cohort study

    PubMed Central

    Ross-Adams, H.; Lamb, A.D.; Dunning, M.J.; Halim, S.; Lindberg, J.; Massie, C.M.; Egevad, L.A.; Russell, R.; Ramos-Montoya, A.; Vowler, S.L.; Sharma, N.L.; Kay, J.; Whitaker, H.; Clark, J.; Hurst, R.; Gnanapragasam, V.J.; Shah, N.C.; Warren, A.Y.; Cooper, C.S.; Lynch, A.G.; Stark, R.; Mills, I.G.; Grönberg, H.; Neal, D.E.

    2015-01-01

    Background Understanding the heterogeneous genotypes and phenotypes of prostate cancer is fundamental to improving the way we treat this disease. As yet, there are no validated descriptions of prostate cancer subgroups derived from integrated genomics linked with clinical outcome. Methods In a study of 482 tumour, benign and germline samples from 259 men with primary prostate cancer, we used integrative analysis of copy number alterations (CNA) and array transcriptomics to identify genomic loci that affect expression levels of mRNA in an expression quantitative trait loci (eQTL) approach, to stratify patients into subgroups that we then associated with future clinical behaviour, and compared with either CNA or transcriptomics alone. Findings We identified five separate patient subgroups with distinct genomic alterations and expression profiles based on 100 discriminating genes in our separate discovery and validation sets of 125 and 103 men. These subgroups were able to consistently predict biochemical relapse (p = 0.0017 and p = 0.016 respectively) and were further validated in a third cohort with long-term follow-up (p = 0.027). We show the relative contributions of gene expression and copy number data on phenotype, and demonstrate the improved power gained from integrative analyses. We confirm alterations in six genes previously associated with prostate cancer (MAP3K7, MELK, RCBTB2, ELAC2, TPD52, ZBTB4), and also identify 94 genes not previously linked to prostate cancer progression that would not have been detected using either transcript or copy number data alone. We confirm a number of previously published molecular changes associated with high risk disease, including MYC amplification, and NKX3-1, RB1 and PTEN deletions, as well as over-expression of PCA3 and AMACR, and loss of MSMB in tumour tissue. 
A subset of the 100 genes outperforms established clinical predictors of poor prognosis (PSA, Gleason score), as well as previously published gene signatures (p = 0.0001). We further show how our molecular profiles can be used for the early detection of aggressive cases in a clinical setting, and inform treatment decisions. Interpretation For the first time in prostate cancer this study demonstrates the importance of integrated genomic analyses incorporating both benign and tumour tissue data in identifying molecular alterations leading to the generation of robust gene sets that are predictive of clinical outcome in independent patient cohorts. PMID:26501111

  19. Improving personalized link prediction by hybrid diffusion

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Hu; Zhu, Yu-Xiao; Zhou, Tao

    2016-04-01

    Inspired by traditional link prediction, and to address the problem of recommending friends in social networks, we introduce personalized link prediction, in which each individual receives an equal number of diverse predictions. Because the performance of many classical algorithms is unsatisfactory under this framework, new algorithms are urgently needed. Motivated by previous research in other fields, we generalize the heat conduction process to the framework of personalized link prediction and find that this method outperforms many classical similarity-based algorithms, especially in diversity. In addition, we demonstrate that adding one ground node, connected to all the nodes in the system, greatly benefits the performance of heat conduction. Finally, we propose improved hybrid algorithms composed of local random walk and heat conduction. Numerical results show that the hybrid algorithms can outperform the other algorithms simultaneously on all four adopted metrics: AUC, precision, recall, and Hamming distance. In short, this work may shed some light on an in-depth understanding of the effect of physical processes in personalized link prediction.
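
    The basic heat-conduction step can be sketched as follows. This is a minimal illustration of the physical process only, not the paper's full algorithm (which adds a ground node and hybridizes with local random walk); the adjacency format, the seeding choice, and the function name are assumptions.

    ```python
    def heat_conduction_scores(adj, u, steps=1):
        """Score candidate links for node u by heat conduction.

        adj: dict node -> set of neighbor nodes (undirected graph).
        Seed node u's neighbors with temperature 1, then at each step every
        node's temperature becomes the AVERAGE of its neighbors' temperatures
        (averaging, rather than summing, is what favors diverse, low-degree
        candidates). Returns scores for u's non-neighbors."""
        temp = {v: (1.0 if v in adj[u] else 0.0) for v in adj}
        for _ in range(steps):
            temp = {v: (sum(temp[w] for w in adj[v]) / len(adj[v]) if adj[v] else 0.0)
                    for v in adj}
        return {v: t for v, t in temp.items() if v != u and v not in adj[u]}

    # Node 3 shares two neighbors with node 0; node 4 shares none,
    # so 0-3 should be the stronger predicted link.
    adj = {0: {1, 2}, 1: {0, 3}, 2: {0, 3}, 3: {1, 2, 4}, 4: {3}}
    scores = heat_conduction_scores(adj, 0, steps=1)
    ```

    Because each node averages over its own degree, high-degree hubs do not automatically dominate the ranking, which is the intuition behind heat conduction's strong diversity performance noted in the abstract.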

  20. Learning by (video) example: a randomized study of communication skills training for end-of-life and error disclosure family care conferences.

    PubMed

    Schmitz, Connie C; Braman, Jonathan P; Turner, Norman; Heller, Stephanie; Radosevich, David M; Yan, Yelena; Miller, Jane; Chipman, Jeffrey G

    2016-11-01

    Teaching residents to lead end-of-life (EOL) and error disclosure (ED) conferences is important. We developed and tested an intervention using videotapes of EOL and ED encounters from previous Objective Structured Clinical Exams. Residents (n = 72) from general and orthopedic surgery programs at 2 sites were enrolled. Using a prospective, pre-post, block group design with stratified randomization, we hypothesized that the treatment group would outperform the control on EOL and ED cases. We also hypothesized that online course usage would correlate positively with post-test scores. All residents improved (pre-post). At the group level, treatment effects were insignificant, and post-test performance was unrelated to course usage. At the subgroup level for EOL, low performers assigned to treatment scored higher than controls at post-test; and within the treatment group, postgraduate year 3 residents outperformed postgraduate year 1 residents. To be effective, online curricula illustrating communication behaviors need face-to-face interaction, individual role play with feedback, and discussion. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. On the optimum signal constellation design for high-speed optical transport networks.

    PubMed

    Liu, Tao; Djordjevic, Ivan B

    2012-08-27

    In this paper, we first describe an optimum signal constellation design algorithm, optimum in the MMSE sense, called MMSE-OSCD, for a channel-capacity-achieving source distribution. Second, we introduce a feedback-channel-capacity-inspired optimum signal constellation design (FCC-OSCD) to further improve on MMSE-OSCD, motivated by the fact that the capacity of a channel with feedback is higher than that of one without. The constellations obtained by FCC-OSCD are, however, OSNR dependent. The optimization is performed jointly with the design of regular quasi-cyclic low-density parity-check (LDPC) codes. The resulting coded-modulation scheme, in combination with polarization-multiplexing, is suitable as an enabling technology for both 400 Gb/s and multi-Tb/s optical transport. Using a large-girth LDPC code, we demonstrate by Monte Carlo simulations that a 32-ary signal constellation obtained by FCC-OSCD outperforms the previously proposed optimized 32-ary CIPQ signal constellation by 0.8 dB at a BER of 10^-7. The LDPC-coded 16-ary FCC-OSCD, in turn, outperforms 16-QAM by 1.15 dB at the same BER.

  2. Are gamers better crossers? An examination of action video game experience and dual task effects in a simulated street crossing task.

    PubMed

    Gaspar, John G; Neider, Mark B; Crowell, James A; Lutz, Aubrey; Kaczmarski, Henry; Kramer, Arthur F

    2014-05-01

    A high-fidelity street crossing simulator was used to test the hypothesis that experienced action video game players are less vulnerable than non-gamers to dual task costs in complex tasks. Previous research has shown that action video game players outperform nonplayers on many single task measures of perception and attention. It is unclear, however, whether action video game players outperform nonplayers in complex, divided attention tasks. Experienced action video game players and nongamers completed a street crossing task in a high-fidelity simulator. Participants walked on a manual treadmill to cross the street. During some crossings, a cognitively demanding working memory task was added. Dividing attention resulted in more collisions and increased decision making time. Of importance, these dual task costs were equivalent for the action video game players and the nongamers. These results suggest that action video game players are equally susceptible to the costs of dividing attention in a complex task. Perceptual and attentional benefits associated with action video game experience may not translate to performance benefits in complex, real-world tasks.

  3. Echo state networks with filter neurons and a delay&sum readout.

    PubMed

    Holzmann, Georg; Hauser, Helmut

    2010-03-01

    Echo state networks (ESNs) are a novel approach to recurrent neural network training with the advantage of a very simple and linear learning algorithm. It has been demonstrated that ESNs outperform other methods on a number of benchmark tasks. Although the approach is appealing, there are still some inherent limitations in the original formulation. Here we suggest two enhancements of this network model. First, the previously proposed idea of filters in neurons is extended to arbitrary infinite impulse response (IIR) filter neurons. This enables such networks to learn multiple attractors and signals at different timescales, which is especially important for modeling real-world time series. Second, a delay&sum readout is introduced, which adds trainable delays in the synaptic connections of output neurons and therefore vastly improves the memory capacity of echo state networks. It is shown in commonly used benchmark tasks and real-world examples, that this new structure is able to significantly outperform standard ESNs and other state-of-the-art models for nonlinear dynamical system modeling. Copyright 2009 Elsevier Ltd. All rights reserved.

  4. The Role of Socio-Communicative Rearing Environments on the Development of Social and Physical Cognition in Apes

    PubMed Central

    Russell, J. L.; Lyn, H.; Schaeffer, J. A.; Hopkins, W. D.

    2011-01-01

    The cultural intelligence hypothesis (CIH) claims that humans' advanced cognition is a direct result of human culture and that children are uniquely specialized to absorb and utilize this cultural experience (Tomasello, 2000). Comparative data demonstrating that 2.5-year-old human children outperform apes on measures of social cognition but not on measures of physical cognition support this claim (E. Herrmann, J. Call, M. V. Hernandez-Lloreda, B. Hare, & M. Tomasello, 2007). However, the previous study failed to control for rearing when comparing these two species. Specifically, the human children were raised in a human culture whereas the apes were raised in standard sanctuary settings. To further explore the CIH, here we compared performance on multiple measures of social and physical cognition in a group of standard-reared apes raised in conditions typical of zoo and biomedical laboratory settings to that of apes reared in an enculturated, socio-communicatively rich environment. Overall, the enculturated apes significantly outperformed their standard-reared counterparts on the cognitive tasks, and this was particularly true for measures of communication. Furthermore, the performance of the enculturated apes was very similar to previously reported data from 2.5-year-old children. We conclude that apes who are reared in a human-like, socio-communicatively rich environment develop superior communicative abilities compared to apes reared in standard laboratory settings, which supports some assumptions of the cultural intelligence hypothesis. PMID:22010903

  5. An artificial bioindicator system for network intrusion detection.

    PubMed

    Blum, Christian; Lozano, José A; Davidson, Pedro Pinacho

    An artificial bioindicator system is developed in order to solve a network intrusion detection problem. The system, inspired by an ecological approach to biological immune systems, evolves a population of agents that learn to survive in their environment. An adaptation process allows the transformation of the agent population into a bioindicator that is capable of reacting to system anomalies. Two characteristics stand out in our proposal. On the one hand, it is able to discover new, previously unseen attacks, and on the other hand, contrary to most of the existing systems for network intrusion detection, it does not need any previous training. We experimentally compare our proposal with three state-of-the-art algorithms and show that it outperforms the competing approaches on widely used benchmark data.

  6. Figure-ground segmentation based on class-independent shape priors

    NASA Astrophysics Data System (ADS)

    Li, Yang; Liu, Yang; Liu, Guojun; Guo, Maozu

    2018-01-01

    We propose a method to generate figure-ground segmentation by incorporating shape priors into the graph-cuts algorithm. Given an image, we first obtain a linear representation of an image and then apply directional chamfer matching to generate class-independent, nonparametric shape priors, which provide shape clues for the graph-cuts algorithm. We then enforce shape priors in a graph-cuts energy function to produce object segmentation. In contrast to previous segmentation methods, the proposed method shares shape knowledge for different semantic classes and does not require class-specific model training. Therefore, the approach obtains high-quality segmentation for objects. We experimentally validate that the proposed method outperforms previous approaches using the challenging PASCAL VOC 2010/2012 and Berkeley (BSD300) segmentation datasets.

  7. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    PubMed

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  8. Recurrent personality dimensions in inclusive lexical studies: indications for a big six structure.

    PubMed

    Saucier, Gerard

    2009-10-01

    Previous evidence for both the Big Five and the alternative six-factor model has been drawn from lexical studies with relatively narrow selections of attributes. This study examined factors from previous lexical studies using a wider selection of attributes in 7 languages (Chinese, English, Filipino, Greek, Hebrew, Spanish, and Turkish) and found 6 recurrent factors, each with common conceptual content across most of the studies. The previous narrow-selection-based six-factor model outperformed the Big Five in capturing the content of the 6 recurrent wideband factors. Adjective markers of the 6 recurrent wideband factors showed substantial incremental prediction of important criterion variables over and above the Big Five. Correspondence between the wideband 6 and narrowband 6 factors indicates they are variants of a "Big Six" model that is more general across variable-selection procedures and may be more general across languages and populations.

  9. Evaluating alternative systems of peer review: a large-scale agent-based modelling approach to scientific publication.

    PubMed

    Kovanis, Michail; Trinquart, Ludovic; Ravaud, Philippe; Porcher, Raphaël

    2017-01-01

    The debate on whether the peer-review system is in crisis has been heated recently. A variety of alternative systems have been proposed to improve the system and make it sustainable, but we lack sufficient evidence and data on these issues. Here we used a previously developed agent-based model of the scientific publication and peer-review system, calibrated with empirical data, to compare the efficiency of five alternative peer-review systems with the conventional system. We modelled two systems of immediate publication, with and without online reviews (crowdsourcing); a system with only one round of reviews and revisions allowed (re-review opt-out); and two review-sharing systems in which rejected manuscripts are resubmitted along with their past reviews, either to any other journal (portable) or only to journals of the same publisher with a lower impact factor (cascade). The review-sharing systems outperformed or matched the conventional system on all of the peer-review efficiency, reviewer effort and scientific dissemination metrics we used. In particular, they showed a large decrease in the total duration of the peer-review process and in the total time reviewers devoted to completing all reports in a year. The two systems with immediate publication released more scientific information than the conventional one but provided almost no other benefit. Re-review opt-out decreased the time reviewers devoted to peer review, but performed worse than the conventional system both at screening out papers that should not be published and in the relative increase in the intrinsic quality of papers due to peer review. Sensitivity analyses showed findings consistent with those from our main simulations. We recommend prioritizing a review-sharing system to create a sustainable scientific publication and peer-review system.

  10. Exploring the efficacy of cyclic vs static aspiration in a cerebral thrombectomy model: an initial proof of concept study.

    PubMed

    Simon, Scott; Grey, Casey Paul; Massenzo, Trisha; Simpson, David G; Longest, P Worth

    2014-11-01

    Current technology for endovascular thrombectomy in ischemic stroke utilizes static loading and is successful in approximately 85% of cases. Existing technology uses either static suction (applied via a continuous pump or syringe) or flow arrest with a proximal balloon. In this paper we evaluate the potential of cyclic loading in aspiration thrombectomy. In order to evaluate the efficacy of cyclic aspiration, a model was created using a Penumbra aspiration system, a three-way valve and a Penumbra 5Max catheter. Synthetic clots were aspirated at different frequencies and using different aspiration mediums. Success or failure of clot removal and time were recorded. All statistical analyses were based on either a one-way or two-way analysis of variance with the Holm-Sidak pairwise multiple comparison procedure (α=0.05). Cyclic aspiration outperformed static aspiration in overall clot removal and removal speed (p<0.001). Within cyclic aspiration, Max Hz frequencies (∼6.3 Hz) cleared clots faster than 1 Hz (p<0.001) and 2 Hz (p=0.024). Loading cycle dynamics (specific pressure waveforms) affected speed and overall clearance (p<0.001). Water as the aspiration medium was more effective at clearing clots than air (p=0.019). Cyclic aspiration significantly outperformed static aspiration in speed and overall clearance of synthetic clots in our experimental model. Within cyclic aspiration, efficacy is improved by increasing cycle frequency, utilizing specific pressure cycle waveforms and using water rather than air as the aspiration medium. These findings provide a starting point for altering existing thrombectomy technology or perhaps the development of new technologies with higher recanalization rates. Published by the BMJ Publishing Group Limited.

  11. Proximal tubule proteins are significantly elevated in bladder urine of patients with ureteropelvic junction obstruction and may represent novel biomarkers: A pilot study.

    PubMed

    Gerber, Claire; Harel, Miriam; Lynch, Miranda L; Herbst, Katherine W; Ferrer, Fernando A; Shapiro, Linda H

    2016-04-01

    Ureteropelvic junction obstruction (UPJO) is the major cause of hydronephrosis in children and may lead to renal injury and early renal dysfunction. However, diagnosis of the degree of obstruction and severity of renal injury relies on invasive and often inconclusive renal scans. Biomarkers from voided urine that detect early renal injury are highly desirable because of their noninvasive collection and their potential to assist in earlier and more reliable diagnosis of the severity of obstruction. Early in response to UPJO, increased intrarenal pressure directly impacts the proximal tubule brush border. We hypothesize that single-pass, apically expressed proximal tubule brush border proteins will be shed into the urine early and rapidly and will be reliable noninvasive urinary biomarkers, providing the tools for a more reliable stratification of UPJO patients. We performed a prospective cohort study at Connecticut Children's Medical Center. Bladder urine samples from 12 UPJO patients were obtained prior to surgical intervention. Control urine samples were collected from healthy pediatric patients presenting with primary nocturnal enuresis. We determined levels of NGAL, KIM-1 (previously identified biomarkers), CD10, CD13, and CD26 (potentially novel biomarkers) by ELISA in control and experimental urine samples. Urinary creatinine levels were used to normalize the urinary protein levels measured by ELISA. Each of the proximal tubule proteins outperformed the previously published biomarkers. No differences in urinary NGAL and KIM-1 levels were observed between control and obstructed patients (p = 0.932 and p = 0.799, respectively). However, levels of CD10, CD13, and CD26 were significantly higher in the voided urine of obstructed individuals when compared with controls (p = 0.002, p = 0.024, and p = 0.007, respectively) (Figure). 
Targeted identification of reliable, noninvasive biomarkers of renal injury is critical to aid in diagnosing patients at risk, guiding therapeutic decisions and monitoring treatment efficacy. Proximal tubule brush border proteins are reliably detected in the urine of obstructed patients and may be more effective at predicting UPJO. Copyright © 2015 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  12. Development and validation of an objective instrument to measure surgical performance at tonsillectomy.

    PubMed

    Roberson, David W; Kentala, Erna; Forbes, Peter

    2005-12-01

    The goals of this project were 1) to develop and validate an objective instrument to measure surgical performance at tonsillectomy, 2) to assess its interobserver and interobservation reliability and construct validity, and 3) to select those items with best reliability and most independent information to design a simplified form suitable for routine use in otolaryngology surgical evaluation. Prospective, observational data collection for an educational quality improvement project. The evaluation instrument was based on previous instruments developed in general surgery with input from attending otolaryngologic surgeons and experts in medical education. It was pilot tested and subjected to iterative improvements. After the instrument was finalized, a total of 55 tonsillectomies were observed and scored during academic year 2002 to 2003: 45 cases by residents at different points during their rotation, 5 by fellows, and 5 by faculty. Results were assessed for interobserver reliability, interobservation reliability, and construct validity. Factor analysis was used to identify items with independent information. Interobserver and interobservation reliability was high. On technical items, faculty substantially outperformed fellows, who in turn outperformed residents (P < .0001 for both comparisons). On the "global" scale (overall assessment), residents improved an average of 1 full point (on a 5 point scale) during a 3 month rotation (P = .01). In the subscale of "patient care," results were less clear cut: fellows outperformed residents, who in turn outperformed faculty, but only the fellows to faculty comparison was statistically significant (P = .04), and residents did not clearly improve over time (P = .36). Factor analysis demonstrated that technical items and patient care items factor separately and thus represent separate skill domains in surgery. 
It is possible to objectively measure surgical skill at tonsillectomy with high reliability and good construct validity. Factor analysis demonstrated that patient care is a distinct domain in surgical skill. Although the interobserver reliability for some patient care items reached statistical significance, it was not high enough for "high stakes testing" purposes. Using reliability and factor analysis results, we propose a simplified instrument for use in evaluating trainees in otolaryngologic surgery.

  13. A comprehensive and quantitative comparison of text-mining in 15 million full-text articles versus their corresponding abstracts.

    PubMed

    Westergaard, David; Stærfeldt, Hans-Henrik; Tønsberg, Christian; Jensen, Lars Juhl; Brunak, Søren

    2018-02-01

    Across academia and industry, text mining has become a popular strategy for keeping up with the rapid growth of the scientific literature. Text mining of the scientific literature has mostly been carried out on collections of abstracts, due to their availability. Here we present an analysis of 15 million English scientific full-text articles published during the period 1823-2016. We describe the development in article length and publication sub-topics during these nearly 250 years. We showcase the potential of text mining by extracting published protein-protein, disease-gene, and protein subcellular associations using a named entity recognition system, and quantitatively report on their accuracy using gold standard benchmark data sets. We subsequently compare the findings to corresponding results obtained on 16.5 million abstracts included in MEDLINE and show that text mining of full-text articles consistently outperforms using abstracts only.

  14. A comprehensive and quantitative comparison of text-mining in 15 million full-text articles versus their corresponding abstracts

    PubMed Central

    Westergaard, David; Stærfeldt, Hans-Henrik

    2018-01-01

    Across academia and industry, text mining has become a popular strategy for keeping up with the rapid growth of the scientific literature. Text mining of the scientific literature has mostly been carried out on collections of abstracts, due to their availability. Here we present an analysis of 15 million English scientific full-text articles published during the period 1823–2016. We describe the development in article length and publication sub-topics during these nearly 250 years. We showcase the potential of text mining by extracting published protein–protein, disease–gene, and protein subcellular associations using a named entity recognition system, and quantitatively report on their accuracy using gold standard benchmark data sets. We subsequently compare the findings to corresponding results obtained on 16.5 million abstracts included in MEDLINE and show that text mining of full-text articles consistently outperforms using abstracts only. PMID:29447159

  15. Robust Vacuum-/Air-Dried Graphene Aerogels and Fast Recoverable Shape-Memory Hybrid Foams.

    PubMed

    Li, Chenwei; Qiu, Ling; Zhang, Baoqing; Li, Dan; Liu, Chen-Yang

    2016-02-17

    New graphene aerogels can be fabricated by vacuum/air drying, and because of the mechanical robustness of the graphene aerogels, shape-memory polymer/graphene hybrid foams can be fabricated by a simple infiltration-air-drying-crosslinking method. Due to the superelasticity, high strength, and good electrical conductivity of the as-prepared graphene aerogels, the shape-memory hybrid foams exhibit excellent thermotropical and electrical shape-memory properties, outperforming previously reported shape-memory polymer foams. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Continuous-variable quantum-key-distribution protocols with a non-Gaussian modulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leverrier, Anthony; Grangier, Philippe; Laboratoire Charles Fabry, Institut d'Optique, CNRS, Univ. Paris-Sud, Campus Polytechnique, RD 128, F-91127 Palaiseau Cedex

    2011-04-15

    In this paper, we consider continuous-variable quantum-key-distribution (QKD) protocols which use non-Gaussian modulations. These specific modulation schemes are compatible with very efficient error-correction procedures, hence allowing the protocols to outperform previous protocols in terms of achievable range. In their simplest implementation, these protocols are secure for any linear quantum channels (hence against Gaussian attacks). We also show how the use of decoy states makes the protocols secure against arbitrary collective attacks, which implies their unconditional security in the asymptotic limit.

  17. Multicenter clinical assessment of improved wearable multimodal convulsive seizure detectors.

    PubMed

    Onorati, Francesco; Regalia, Giulia; Caborni, Chiara; Migliorini, Matteo; Bender, Daniel; Poh, Ming-Zher; Frazier, Cherise; Kovitch Thropp, Eliana; Mynatt, Elizabeth D; Bidwell, Jonathan; Mai, Roberto; LaFrance, W Curt; Blum, Andrew S; Friedman, Daniel; Loddenkemper, Tobias; Mohammadpour-Touserkani, Fatemeh; Reinsberger, Claus; Tognetti, Simone; Picard, Rosalind W

    2017-11-01

    New devices are needed for monitoring seizures, especially those associated with sudden unexpected death in epilepsy (SUDEP). They must be unobtrusive and automated, and provide false alarm rates (FARs) bearable in everyday life. This study quantifies the performance of new multimodal wrist-worn convulsive seizure detectors. Hand-annotated video-electroencephalographic seizure events were collected from 69 patients at six clinical sites. Three different wristbands were used to record electrodermal activity (EDA) and accelerometer (ACM) signals, obtaining 5,928 h of data, including 55 convulsive epileptic seizures (six focal tonic-clonic seizures and 49 focal to bilateral tonic-clonic seizures) from 22 patients. Recordings were analyzed offline to train and test two new machine learning classifiers and a published classifier based on EDA and ACM. Moreover, wristband data were analyzed to estimate seizure-motion duration and autonomic responses. The two novel classifiers consistently outperformed the previous detector. The most efficient (Classifier III) yielded sensitivity of 94.55%, and an FAR of 0.2 events/day. No nocturnal seizures were missed. Most patients had <1 false alarm every 4 days, with an FAR below their seizure frequency. When increasing the sensitivity to 100% (no missed seizures), the FAR is up to 13 times lower than with the previous detector. Furthermore, all detections occurred before the seizure ended, providing reasonable latency (median = 29.3 s, range = 14.8-151 s). Automatically estimated seizure durations were correlated with true durations, enabling reliable annotations. Finally, EDA measurements confirmed the presence of postictal autonomic dysfunction, exhibiting a significant rise in 73% of the convulsive seizures. 
The proposed multimodal wrist-worn convulsive seizure detectors provide seizure counts that are more accurate than previous automated detectors and typical patient self-reports, while maintaining a tolerable FAR for ambulatory monitoring. Furthermore, the multimodal system provides an objective description of motor behavior and autonomic dysfunction, aimed at enriching seizure characterization, with potential utility for SUDEP warning. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  18. Change in end-tidal carbon dioxide outperforms other surrogates for change in cardiac output during fluid challenge.

    PubMed

    Lakhal, K; Nay, M A; Kamel, T; Lortat-Jacob, B; Ehrmann, S; Rozec, B; Boulain, T

    2017-03-01

    During fluid challenge, the volume expansion (VE)-induced increase in cardiac output (ΔVECO) is seldom measured. In patients with shock undergoing strictly controlled mechanical ventilation and receiving VE, we assessed minimally invasive surrogates for ΔVECO (by transthoracic echocardiography): fluid-induced increases in end-tidal carbon dioxide (ΔVEE'CO2); pulse (ΔVEPP), systolic (ΔVESBP), and mean systemic blood pressure (ΔVEMBP); and femoral artery Doppler flow (ΔVEFemFlow). In the absence of arrhythmia, the fluid-induced decrease in heart rate (ΔVEHR) and in pulse pressure respiratory variation (ΔVEPPV) were also evaluated. Areas under the receiver operating characteristic curves (AUCROCs) reflect the ability to identify a response to VE (ΔVECO ≥15%). In 86 patients, ΔVEE'CO2 had an AUCROC of 0.82 [interquartile range 0.73-0.90], significantly higher than the AUCROCs for ΔVEPP, ΔVESBP, ΔVEMBP, and ΔVEFemFlow (AUCROC=0.61-0.65, all P<0.05). A value of ΔVEE'CO2 >1 mm Hg (>0.13 kPa) had good positive (5.0 [2.6-9.8]) and fair negative (0.29 [0.2-0.5]) likelihood ratios. The 16 patients with arrhythmia had relationships between ΔVEE'CO2 and ΔVECO similar to those of patients with regular rhythm (r²=0.23 in both subgroups). In 60 patients with no arrhythmia, ΔVEE'CO2 (AUCROC=0.84 [0.72-0.92]) outperformed ΔVEHR (AUCROC=0.52 [0.39-0.66], P<0.05) and tended to outperform ΔVEPPV (AUCROC=0.73 [0.60-0.84], P=0.21). In the 45 patients with no arrhythmia receiving ventilation with tidal volume <8 ml kg⁻¹, ΔVEE'CO2 performed better than ΔVEPPV (AUCROC=0.86 [0.72-0.95] vs 0.66 [0.49-0.80], P=0.02). ΔVEE'CO2 outperformed ΔVEPP, ΔVESBP, ΔVEMBP, ΔVEFemFlow, and ΔVEHR and, during protective ventilation, arrhythmia, or both, it also outperformed ΔVEPPV. A value of ΔVEE'CO2 >1 mm Hg (>0.13 kPa) indicated a likely response to VE. © The Author 2017. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com
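
    The positive and negative likelihood ratios quoted for the >1 mm Hg cutoff follow directly from sensitivity and specificity. A minimal sketch of that relationship (the input values below are illustrative, chosen only to be consistent with the reported LR+ ≈ 5.0 and LR− ≈ 0.29; they are not the study's raw counts):

    ```python
    def likelihood_ratios(sensitivity, specificity):
        """Positive and negative likelihood ratios for a binary diagnostic cutoff."""
        lr_pos = sensitivity / (1.0 - specificity)   # how much a positive test raises the odds
        lr_neg = (1.0 - sensitivity) / specificity   # how much a negative test lowers the odds
        return lr_pos, lr_neg

    # Illustrative sensitivity/specificity, not values from the paper:
    lr_pos, lr_neg = likelihood_ratios(sensitivity=0.75, specificity=0.85)
    ```

    With these inputs, LR+ = 0.75/0.15 = 5.0 and LR− = 0.25/0.85 ≈ 0.29, matching the order of magnitude reported above.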

  19. Model-based functional neuroimaging using dynamic neural fields: An integrative cognitive neuroscience approach

    PubMed Central

    Wijeakumar, Sobanawartiny; Ambrose, Joseph P.; Spencer, John P.; Curtu, Rodica

    2017-01-01

    A fundamental challenge in cognitive neuroscience is to develop theoretical frameworks that effectively span the gap between brain and behavior, between neuroscience and psychology. Here, we attempt to bridge this divide by formalizing an integrative cognitive neuroscience approach using dynamic field theory (DFT). We begin by providing an overview of how DFT seeks to understand the neural population dynamics that underlie cognitive processes through previous applications and comparisons to other modeling approaches. We then use previously published behavioral and neural data from a response selection Go/Nogo task as a case study for model simulations. Results from this study served as the ‘standard’ for comparisons with a model-based fMRI approach using dynamic neural fields (DNF). The tutorial explains the rationale and hypotheses involved in the process of creating the DNF architecture and fitting model parameters. Two DNF models, with similar structure and parameter sets, are then compared. Both models effectively simulated reaction times from the task as we varied the number of stimulus-response mappings and the proportion of Go trials. Next, we directly simulated hemodynamic predictions from the neural activation patterns from each model. These predictions were tested using general linear models (GLMs). Results showed that the DNF model that was created by tuning parameters to capture simultaneously trends in neural activation and behavioral data quantitatively outperformed a Standard GLM analysis of the same dataset. Further, by using the GLM results to assign functional roles to particular clusters in the brain, we illustrate how DNF models shed new light on the neural populations’ dynamics within particular brain regions. Thus, the present study illustrates how an interactive cognitive neuroscience model can be used in practice to bridge the gap between brain and behavior. PMID:29118459

  20. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    NASA Astrophysics Data System (ADS)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method. Applied to a hydrological relief model derived from a multiple direction flow routing algorithm, the normalized closed contour method performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized relief closed contour method may be the most capable method to date, but more development is required.

  1. Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation

    NASA Astrophysics Data System (ADS)

    Sleesongsom, S.; Bureerat, S.

    2018-03-01

    This paper extends a new concept for path generation from our previous work by adding a new constraint handling technique. The proposed technique was initially designed for problems without prescribed timing by avoiding the timing constraint, while the remaining constraints are handled with a new constraint handling technique, a kind of penalty technique. In a comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning based optimization (SAP-TLBO) and the original TLBO. Two traditional path generation test problems are used to test the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original TLBO.
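
    The constraint handling described above is a penalty-type technique: constraint violations are folded into the objective so an unconstrained optimiser (here, TLBO) can be applied. A generic static-penalty sketch, not the authors' exact formulation:

    ```python
    def penalized_objective(f, constraints, x, rho=1e3):
        """Static penalty: add rho * violation**2 for each constraint g(x) <= 0 that is violated."""
        penalty = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + rho * penalty

    # Toy problem: minimise f(x) = x^2 subject to x >= 1, written as g(x) = 1 - x <= 0.
    f = lambda x: x * x
    g = lambda x: 1.0 - x
    value_feasible = penalized_objective(f, [g], 2.0)    # no violation, just f(2) = 4
    value_infeasible = penalized_objective(f, [g], 0.0)  # violation of 1, heavily penalized
    ```

    Any population-based optimiser can then rank candidate linkage designs by this penalized value, which is the general mechanism a penalty technique relies on.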

  2. Evaluating and addressing the effects of regression to the mean phenomenon in estimating collision frequencies on urban high collision concentration locations.

    PubMed

    Lee, Jinwoo; Chung, Koohong; Kang, Seungmo

    2016-12-01

    Two different methods for addressing the regression to the mean phenomenon (RTM) were evaluated using empirical data from 110 miles of freeway located in California: the empirical Bayes (EB) method and the CRP method. CRP outperformed the EB method in estimating collision frequencies at selected high collision concentration locations (HCCLs). Findings indicate that the performance of the EB method can be markedly affected when the safety performance function (SPF) is biased, while the performance of CRP remains much less affected. The CRP method was more effective in addressing RTM. Published by Elsevier Ltd.
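
    For context, the EB method referenced above conventionally counteracts regression to the mean by shrinking a site's observed collision count toward its SPF prediction. A sketch using the weight form given in the Highway Safety Manual (an assumption here; the study's actual SPF and parameters are not stated in the abstract):

    ```python
    def eb_estimate(observed, predicted, k):
        """Empirical Bayes expected collision frequency for one site.

        observed:  collision count recorded at the site
        predicted: SPF-predicted count for the site
        k:         overdispersion parameter of the negative binomial SPF
        Weight form as in the Highway Safety Manual: w = 1 / (1 + k * predicted).
        """
        w = 1.0 / (1.0 + k * predicted)
        return w * predicted + (1.0 - w) * observed

    # A high observed count at a candidate HCCL is pulled back toward the SPF
    # prediction, which is exactly how EB tempers regression to the mean.
    eb = eb_estimate(observed=12, predicted=4.0, k=0.5)
    ```

    Note the finding above: if `predicted` comes from a biased SPF, this weighted average inherits that bias, which is the weakness the CRP comparison exposes.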

  3. Location Estimation of Urban Images Based on Geographical Neighborhoods

    NASA Astrophysics Data System (ADS)

    Huang, Jie; Lo, Sio-Long

    2018-04-01

    Estimating the location of an image is a challenging computer vision problem, and the recent decade has witnessed increasing research efforts towards the solution of this problem. In this paper, we propose a new approach to the location estimation of images taken in urban environments. Experiments are conducted to quantitatively compare the estimation accuracy of our approach, against three representative approaches in the existing literature, using a recently published dataset of over 150 thousand Google Street View images and 259 user uploaded images as queries. According to the experimental results, our approach outperforms three baseline approaches and shows its robustness across different distance thresholds.

  4. Functional status predicts acute care readmission in the traumatic spinal cord injury population.

    PubMed

    Huang, Donna; Slocum, Chloe; Silver, Julie K; Morgan, James W; Goldstein, Richard; Zafonte, Ross; Schneider, Jeffrey C

    2018-03-29

    Context/objective: Acute care readmission has been identified as an important marker of healthcare quality. Most previous models assessing risk prediction of readmission incorporate variables for medical comorbidity. We hypothesized that functional status is a more robust predictor of readmission in the spinal cord injury population than medical comorbidities. Design: Retrospective cross-sectional analysis. Setting: Inpatient rehabilitation facilities; Uniform Data System for Medical Rehabilitation data from 2002 to 2012. Participants: Traumatic spinal cord injury patients. Outcome measures: A logistic regression model for predicting acute care readmission based on demographic variables and functional status (Functional Model) was compared with models incorporating demographics, functional status, and medical comorbidities (Functional-Plus) or models including demographics and medical comorbidities (Demographic-Comorbidity). The primary outcomes were 3- and 30-day readmission, and the primary measure of model performance was the c-statistic. Results: There were a total of 68,395 patients, with 1,469 (2.15%) readmitted at 3 days and 7,081 (10.35%) readmitted at 30 days. The c-statistics for the Functional Model were 0.703 and 0.654 for 3 and 30 days, respectively. The Functional Model outperformed the Demographic-Comorbidity models at 3 days (c-statistic difference: 0.066-0.096) and outperformed two of the three Demographic-Comorbidity models at 30 days (c-statistic difference: 0.029-0.056). The Functional-Plus models exhibited negligible improvements (0.002-0.010) in model performance compared to the Functional models. Conclusion: Readmissions are used as a marker of hospital performance. Function-based readmission models in the spinal cord injury population outperform models incorporating medical comorbidities. Readmission risk models for this population would benefit from the inclusion of functional status.
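
    The c-statistic used above as the primary performance measure is the concordance probability: the chance that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted one. A reference implementation on toy data (the scores below are hypothetical, not from the study):

    ```python
    def c_statistic(labels, scores):
        """Concordance (c-) statistic, equivalent to the area under the ROC curve.
        Ties in score count as half-concordant. O(n*m) reference implementation."""
        pos = [s for y, s in zip(labels, scores) if y == 1]
        neg = [s for y, s in zip(labels, scores) if y == 0]
        concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return concordant / (len(pos) * len(neg))

    # Hypothetical predicted readmission risks for 2 readmitted (1) and 3 not (0):
    labels = [1, 1, 0, 0, 0]
    scores = [0.9, 0.4, 0.4, 0.2, 0.1]
    c = c_statistic(labels, scores)  # 5.5 concordant of 6 pairs
    ```

    A c-statistic of 0.5 is chance discrimination and 1.0 is perfect, which is the scale on which the 0.703 vs 0.654 figures above should be read.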

  5. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    PubMed Central

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies and outperforms “classic” meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances from the published ones by introducing the contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulation shows that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those using the benchmark and has better prediction performance. PMID:24395534

  6. Thermodynamic Constraints Improve Metabolic Networks.

    PubMed

    Krumholz, Elias W; Libourel, Igor G L

    2017-08-08

    In pursuit of establishing a realistic metabolic phenotypic space, the reversibility of reactions is thermodynamically constrained in modern metabolic networks. The reversibility constraints follow from heuristic thermodynamic poise approximations that take anticipated cellular metabolite concentration ranges into account. Because constraints reduce the feasible space, draft metabolic network reconstructions may need more extensive reconciliation, and a larger number of genes may become essential. Notwithstanding ubiquitous application, the effect of reversibility constraints on the predictive capabilities of metabolic networks has not been investigated in detail. Instead, work has focused on the implementation and validation of the thermodynamic poise calculation itself. With the advance of fast linear programming-based network reconciliation, the effects of reversibility constraints on network reconciliation and gene essentiality predictions have become feasible to study and are the subject of this work. Networks with thermodynamically informed reversibility constraints yielded better gene essentiality predictions than networks constrained with randomly shuffled constraints. Unconstrained networks predicted gene essentiality as accurately as thermodynamically constrained networks, but predicted substantially fewer essential genes. Networks that were reconciled with sequence similarity data and strongly enforced reversibility constraints outperformed all other networks. We conclude that metabolic network analysis confirmed the validity of the thermodynamic constraints, and that thermodynamic poise information is actionable during network reconciliation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  7. Comparison of DNA preservation methods for environmental bacterial community samples

    USGS Publications Warehouse

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.

    2013-01-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  8. Optimal lattice-structured materials

    DOE PAGES

    Messner, Mark C.

    2016-07-09

    This paper describes a method for optimizing the mesostructure of lattice-structured materials. These materials are periodic arrays of slender members resembling efficient, lightweight macroscale structures like bridges and frame buildings. Current additive manufacturing technologies can assemble lattice structures with length scales ranging from nanometers to millimeters. Previous work demonstrates that lattice materials have excellent stiffness- and strength-to-weight scaling, outperforming natural materials. However, there are currently no methods for producing optimal mesostructures that consider the full space of possible 3D lattice topologies. The inverse homogenization approach for optimizing the periodic structure of lattice materials requires a parameterized, homogenized material model describing the response of an arbitrary structure. This work develops such a model, starting with a method for describing the long-wavelength, macroscale deformation of an arbitrary lattice. The work combines the homogenized model with a parameterized description of the total design space to generate a parameterized model. Finally, the work describes an optimization method capable of producing optimal mesostructures. Several examples demonstrate the optimization method. One of these examples produces an elastically isotropic, maximally stiff structure, here called the isotruss, that arguably outperforms the anisotropic octet truss topology.

  9. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition.

    PubMed

    Ordóñez, Francisco Javier; Roggen, Daniel

    2016-01-18

    Human activity recognition (HAR) tasks have traditionally been solved using engineered features obtained by heuristic processes. Current research suggests that deep convolutional neural networks are suited to automate feature extraction from raw sensor inputs. However, human activities are made of complex sequences of motor movements, and capturing these temporal dynamics is fundamental for successful HAR. Based on the recent success of recurrent neural networks for time series domains, we propose a generic deep framework for activity recognition based on convolutional and LSTM recurrent units, which: (i) is suitable for multimodal wearable sensors; (ii) can perform sensor fusion naturally; (iii) does not require expert knowledge in designing features; and (iv) explicitly models the temporal dynamics of feature activations. We evaluate our framework on two datasets, one of which has been used in a public activity recognition challenge. Our results show that our framework outperforms competing deep non-recurrent networks on the challenge dataset by 4% on average, outperforming some previously reported results by up to 9%. Our results show that the framework can be applied to homogeneous sensor modalities, but can also fuse multimodal sensors to improve performance. We characterise the influence of key architectural hyperparameters on performance to provide insights about their optimisation.
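
    Frameworks like the one above typically consume raw sensor streams segmented into fixed-length overlapping windows before any convolutional or recurrent layers are applied. A minimal sketch of that preprocessing step (window and step sizes are illustrative, not the paper's settings):

    ```python
    def sliding_windows(samples, window, step):
        """Segment a multichannel sensor stream into fixed-length overlapping
        windows, the usual input representation for deep HAR models."""
        return [samples[i:i + window]
                for i in range(0, len(samples) - window + 1, step)]

    # 10 timesteps of 2-channel data (e.g. accelerometer x/y), 5-sample windows,
    # ~50% overlap; windows start at t = 0, 2, 4.
    stream = [(t, -t) for t in range(10)]
    windows = sliding_windows(stream, window=5, step=2)
    ```

    Each window then becomes one training example, with convolutional layers extracting per-window features and recurrent units modelling the dynamics across timesteps within it.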

  10. Small quaternary alkyl phosphonium bis(fluorosulfonyl)imide ionic liquid electrolytes for sodium-ion batteries with P2- and O3-Na2/3[Fe2/3Mn1/3]O2 cathode material

    NASA Astrophysics Data System (ADS)

    Hilder, Matthias; Howlett, Patrick C.; Saurel, Damien; Gonzalo, Elena; Armand, Michel; Rojo, Teófilo; Macfarlane, Douglas R.; Forsyth, Maria

    2017-05-01

    A saturated solution of 2.3 M sodium bis(fluorosulfonyl)imide in trimethyl iso-butyl phosphonium bis(fluorosulfonyl)imide ionic liquid shows a high conductivity (0.94 mS cm⁻¹ at 50 °C), low ion association, and a wide operational temperature window (−71 °C to 305 °C), making it a promising electrolyte for sodium battery applications. Cells cycled with P2- and O3-Na2/3[Fe2/3Mn1/3]O2 cathodes display excellent performance at 50 °C, outperforming conventional organic solvent based electrolytes in terms of capacities (at C/10) and long term cycle stability (at C/2). Post analysis of the electrolyte shows no measurable changes, while the sodium metal anode and the cathode surface show the presence of electrolyte specific elements after cycling, suggesting the formation of a stabilizing solid electrolyte interface. Additionally, cycling changes the topography and particle morphology of the cathode. Thus, the electrolyte properties and cell performance match or outperform previously reported results with the additional benefit of replacing the hazardous and flammable organic solvent solutions commonly employed.

  11. Normalization of urinary pteridines by urine specific gravity for early cancer detection.

    PubMed

    Burton, Casey; Shi, Honglan; Ma, Yinfa

    2014-08-05

    Urinary biomarkers, such as pteridines, require normalization with respect to an individual's hydration status and time since last urination. Conventional creatinine-based corrections are affected by a multitude of patient factors, whereas urine specific gravity (USG) is a bulk specimen property that may better resist those same factors. We examined the performance of traditional creatinine adjustments relative to USG for six urinary pteridines in aggressive and benign breast cancers. 6-Biopterin, neopterin, pterin, 6-hydroxymethylpterin, isoxanthopterin, xanthopterin, and creatinine were analyzed in 50 urine specimens with a previously developed liquid chromatography-tandem mass spectrometry technique. Creatinine and USG performance were evaluated with non-parametric Mann-Whitney hypothesis testing. USG and creatinine were moderately correlated (r=0.857), with deviations occurring in dilute and concentrated specimens. In 48 aggressive and benign breast cancers, normalization by USG significantly outperformed creatinine adjustments, which marginally outperformed uncorrected pteridines in predicting pathological status. In addition, isoxanthopterin and xanthopterin were significantly higher in pathological specimens when normalized by USG. USG, as a bulk property, can provide better performance over creatinine-based normalizations for urinary pteridines in cancer detection applications. Copyright © 2014 Elsevier B.V. All rights reserved.
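
    USG normalization of the kind evaluated above is commonly applied as a simple density-ratio correction scaling each analyte to a reference urine density. A sketch of that correction (the reference gravity of 1.020 and the concentrations below are illustrative assumptions, not values from the study):

    ```python
    def usg_normalize(concentration, usg_sample, usg_reference=1.020):
        """Specific-gravity correction for a urinary analyte: scale the measured
        concentration by the ratio of (reference - 1) to (sample - 1) gravity.
        usg_reference = 1.020 is a commonly used population value (assumed here)."""
        return concentration * (usg_reference - 1.0) / (usg_sample - 1.0)

    # A dilute specimen (USG 1.010) has its measured pteridine level scaled up,
    # a concentrated one (USG 1.030) scaled down, toward a common reference.
    dilute = usg_normalize(5.0, 1.010)
    concentrated = usg_normalize(5.0, 1.030)
    ```

    Because this uses only a bulk property of the specimen, it sidesteps the patient-specific factors (muscle mass, diet, renal handling) that confound creatinine-based corrections.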

  12. Thresher: an improved algorithm for peak height thresholding of microbial community profiles.

    PubMed

    Starke, Verena; Steele, Andrew

    2014-11-15

    This article presents Thresher, an improved technique for finding peak height thresholds for automated rRNA intergenic spacer analysis (ARISA) profiles. We argue that thresholds must be sample dependent, taking community richness into account. In most previous fragment analyses, a common threshold is applied to all samples simultaneously, ignoring richness variations among samples and thereby compromising cross-sample comparison. Our technique solves this problem, and at the same time provides a robust method for outlier rejection, selecting for removal any replicate pairs that are not valid replicates. Thresholds are calculated individually for each replicate in a pair, and separately for each sample. The thresholds are selected to be the ones that minimize the dissimilarity between the replicates after thresholding. If a choice of threshold results in the two replicates in a pair failing a quantitative test of similarity, either that threshold or that sample must be rejected. We compare thresholded ARISA results with sequencing results, and demonstrate that the Thresher algorithm outperforms conventional thresholding techniques. The software is implemented in R, and the code is available at http://verenastarke.wordpress.com or by contacting the author. vstarke@ciw.edu or http://verenastarke.wordpress.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
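
    The core idea above, choosing for each replicate pair the thresholds that minimise post-thresholding dissimilarity between the replicates, can be sketched as follows. This is a simplified illustration using Bray-Curtis dissimilarity over aligned peak-height profiles, not the published R implementation:

    ```python
    def bray_curtis(a, b):
        """Bray-Curtis dissimilarity between two equal-length peak-height profiles."""
        num = sum(abs(x - y) for x, y in zip(a, b))
        den = sum(a) + sum(b)
        return num / den if den else 0.0

    def best_thresholds(rep1, rep2, candidates):
        """Pick the per-replicate threshold pair minimising dissimilarity after
        zeroing out sub-threshold peaks (sample-dependent, as Thresher argues)."""
        def clip(profile, t):
            return [h if h >= t else 0.0 for h in profile]
        return min(((t1, t2) for t1 in candidates for t2 in candidates),
                   key=lambda ts: bray_curtis(clip(rep1, ts[0]), clip(rep2, ts[1])))

    # Two replicate profiles whose small peaks (< 60) are noise that disagrees
    # between replicates; thresholding both at 60 makes them most similar.
    rep1 = [100.0, 80.0, 50.0, 5.0]
    rep2 = [105.0, 75.0, 0.0, 40.0]
    t1, t2 = best_thresholds(rep1, rep2, candidates=[0.0, 60.0])
    ```

    A replicate pair whose minimum achievable dissimilarity still fails a similarity test would then be rejected as an outlier, per the article's quality-control step.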

  13. Fetal QRS detection and heart rate estimation: a wavelet-based approach.

    PubMed

    Almeida, Rute; Gonçalves, Hernâni; Bernardes, João; Rocha, Ana Paula

    2014-08-01

    Fetal heart rate monitoring is used for pregnancy surveillance in obstetric units all over the world, but in spite of recent advances in analysis methods, there are still inherent technical limitations that bound its contribution to the improvement of perinatal indicators. In this work, a previously published wavelet-transform-based QRS detector, validated over standard electrocardiogram (ECG) databases, is adapted to fetal QRS detection over abdominal fetal ECG. Maternal ECG waves were first located using the original detector, and afterwards a version with parameters adapted for fetal physiology was applied to detect fetal QRS, excluding signal singularities associated with maternal heartbeats. Single-lead (SL) based marks were combined into a single annotator with post-processing rules (SLR), from which fetal RR and fetal heart rate (FHR) measures can be computed. Data from PhysioNet with reference fetal QRS locations was considered for validation, with SLR outperforming SL, including ICA-based detections. The error in estimated FHR using SLR was lower than 20 bpm for more than 80% of the processed files. The median error in 1 min based FHR estimation was 0.13 bpm, with a correlation between reference and estimated FHR of 0.48, which increased to 0.73 when considering only records for which estimated FHR > 110 bpm. This allows us to conclude that the proposed methodology is able to provide a clinically useful estimation of the FHR.
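Once fetal R-peaks have been detected, converting them into the FHR estimates evaluated above is straightforward: FHR = 60 / RR-interval. A minimal sketch (the QRS detection itself is omitted, and the choice of the median as the per-window summary is an illustrative assumption):

```python
from statistics import median

def fhr_from_peaks(peak_times_s):
    """Instantaneous fetal heart rate (bpm) from successive R-peak
    times in seconds: FHR = 60 / RR-interval."""
    return [60.0 / (b - a) for a, b in zip(peak_times_s, peak_times_s[1:])]

def fhr_estimate(peak_times_s):
    """Median of instantaneous FHR over a window, a simple robust
    summary of the kind compared against reference FHR."""
    return median(fhr_from_peaks(peak_times_s))
```

With peaks 0.4 s apart this yields 150 bpm; the median makes the window estimate robust to a single missed or spurious detection.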

  14. A real-time heat strain risk classifier using heart rate and skin temperature.

    PubMed

    Buller, Mark J; Latzka, William A; Yokota, Miyo; Tharion, William J; Moran, Daniel S

    2008-12-01

    Heat injury is a real concern to workers engaged in physically demanding tasks in high heat strain environments. Several real-time physiological monitoring systems exist that can provide indices of heat strain, e.g. physiological strain index (PSI), and provide alerts to medical personnel. However, these systems depend on core temperature measurement using expensive, ingestible thermometer pills. Seeking a better solution, we suggest the use of a model which can identify the probability that individuals are 'at risk' from heat injury using non-invasive measures. The intent is for the system to identify individuals who need monitoring more closely or who should apply heat strain mitigation strategies. We generated a model that can identify 'at risk' (PSI ≥ 7.5) workers from measures of heart rate and chest skin temperature. The model was built using data from six previously published exercise studies in which some subjects wore chemical protective equipment. The model has an overall classification error rate of 10% with one false negative error (2.7%), and outperforms an earlier model and a least squares regression model with classification errors of 21% and 14%, respectively. Additionally, the model allows the classification criteria to be adjusted based on the task and acceptable level of risk. We conclude that the model could be a valuable part of a multi-faceted heat strain management system.
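For context, the physiological strain index referenced above combines core temperature and heart rate on a 0-10 scale (Moran et al.); the paper's contribution is predicting the resulting at-risk label without the core-temperature input. A sketch of the index and the ≥ 7.5 cutoff, with illustrative resting baselines:

```python
def psi(core_temp, hr, core_temp0=37.0, hr0=60.0):
    """Physiological Strain Index (Moran et al.): 0-10 scale combining
    core temperature (deg C) and heart rate (bpm) relative to resting
    baselines. The baseline values here are illustrative assumptions."""
    return (5.0 * (core_temp - core_temp0) / (39.5 - core_temp0)
            + 5.0 * (hr - hr0) / (180.0 - hr0))

def at_risk(core_temp, hr, threshold=7.5):
    """Flag 'at risk' status at the PSI >= 7.5 criterion used above."""
    return psi(core_temp, hr) >= threshold
```

A resting subject scores 0 and a subject at 39.5 deg C and 180 bpm scores 10; the cited model instead estimates the probability of crossing the 7.5 line from heart rate and chest skin temperature alone.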

  15. Prediction of spatially explicit rainfall intensity–duration thresholds for post-fire debris-flow generation in the western United States

    USGS Publications Warehouse

    Staley, Dennis M.; Negri, Jacquelyn; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2017-01-01

    Early warning of post-fire debris-flow occurrence during intense rainfall has traditionally relied upon a library of regionally specific empirical rainfall intensity–duration thresholds. Development of this library and the calculation of rainfall intensity-duration thresholds often require several years of monitoring local rainfall and hydrologic response to rainstorms, a time-consuming approach where results are often only applicable to the specific region where data were collected. Here, we present a new, fully predictive approach that utilizes rainfall, hydrologic response, and readily available geospatial data to predict rainfall intensity–duration thresholds for debris-flow generation in recently burned locations in the western United States. Unlike the traditional approach to defining regional thresholds from historical data, the proposed methodology permits the direct calculation of rainfall intensity–duration thresholds for areas where no such data exist. The thresholds calculated by this method are demonstrated to provide predictions of similar accuracy to, and in some cases better than, previously published regional intensity–duration thresholds. The method also provides improved predictions of debris-flow likelihood, which can be incorporated into existing approaches for post-fire debris-flow hazard assessment. Our results also provide guidance for the operational expansion of post-fire debris-flow early warning systems in areas where empirically defined regional rainfall intensity–duration thresholds do not currently exist.
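Empirical intensity-duration thresholds of the kind discussed above are conventionally expressed as a power law, I = alpha * D^beta; a warning is issued when observed rainfall intensity over a given duration exceeds the curve. A minimal sketch with illustrative coefficients (not values from the study):

```python
def id_threshold(duration_h, alpha, beta):
    """Power-law rainfall intensity-duration threshold I = alpha * D**beta,
    the standard empirical form; alpha and beta are illustrative here."""
    return alpha * duration_h ** beta

def exceeds_threshold(intensity_mm_h, duration_h, alpha=12.0, beta=-0.7):
    """True when observed rainfall intensity (mm/h) over the given
    duration exceeds the threshold, i.e. debris flow is considered likely."""
    return intensity_mm_h > id_threshold(duration_h, alpha, beta)
```

The paper's advance is predicting alpha and beta for a burned area from geospatial data rather than fitting them to years of local storm records.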

  16. SeqTrim: a high-throughput pipeline for pre-processing any type of sequence read

    PubMed Central

    2010-01-01

    Background High-throughput automated sequencing has enabled an exponential growth rate of sequencing data. This requires increasing sequence quality and reliability in order to avoid database contamination with artefactual sequences. The arrival of pyrosequencing exacerbates this problem and necessitates customisable pre-processing algorithms. Results SeqTrim has been implemented both as a Web and as a standalone command line application. Already-published and newly-designed algorithms have been included to identify sequence inserts, to remove low quality, vector, adaptor, low complexity and contaminant sequences, and to detect chimeric reads. The availability of several input and output formats allows its inclusion in sequence processing workflows. Due to its specific algorithms, SeqTrim outperforms other pre-processors implemented as Web services or standalone applications. It performs equally well with sequences from EST libraries, SSH libraries, genomic DNA libraries and pyrosequencing reads and does not lead to over-trimming. Conclusions SeqTrim is an efficient pipeline designed for pre-processing of any type of sequence read, including next-generation sequencing. It is easily configurable and provides a friendly interface that allows users to know what happened with sequences at every pre-processing stage, and to verify pre-processing of an individual sequence if desired. The recommended pipeline reveals more information about each sequence than previously described pre-processors and can discard more sequencing or experimental artefacts. PMID:20089148

  17. SimBA: simulation algorithm to fit extant-population distributions.

    PubMed

    Parida, Laxmi; Haiminen, Niina

    2015-03-14

    Simulation of populations with specified characteristics such as allele frequencies, linkage disequilibrium etc., is an integral component of many studies, including in-silico breeding optimization. Since the accuracy and sensitivity of population simulation are critical to the quality of the output of the applications that use them, accurate algorithms are required to provide a strong foundation for the methods in these studies. In this paper we present SimBA (Simulation using Best-fit Algorithm), a non-generative approach based on a combination of stochastic techniques and discrete methods. We optimize a hill climbing algorithm and extend the framework to include multiple subpopulation structures. Additionally, we show that SimBA is very sensitive to the input specifications, i.e., very similar but distinct input characteristics result in distinct outputs with high fidelity to the specified distributions. This property of the simulation is not explicitly modeled or studied by previous methods. We show that SimBA outperforms the existing population simulation methods, both in terms of accuracy as well as time-efficiency. Not only does it construct populations that meet the input specifications more stringently than other published methods, SimBA is also easy to use. It does not require explicit parameter adaptations or calibrations. Also, it can work with input specified as distributions, without an exemplar matrix or population as required by some methods. SimBA is available at http://researcher.ibm.com/project/5669 .
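The hill-climbing idea mentioned above can be illustrated on the simplest target, allele frequencies: propose a single allele flip, keep it only if it reduces the misfit to the specified frequencies. This is a simplified stand-in for SimBA's best-fit search, not its actual algorithm:

```python
import random

def allele_freqs(pop):
    """Column-wise allele frequencies of a 0/1 haplotype matrix."""
    n = len(pop)
    return [sum(row[j] for row in pop) / n for j in range(len(pop[0]))]

def fit_error(pop, target):
    """Squared error between current and target allele frequencies."""
    return sum((f - t) ** 2 for f, t in zip(allele_freqs(pop), target))

def hill_climb(pop, target, steps=2000, seed=0):
    """Flip one random allele at a time; keep the flip only if it
    reduces the misfit (a toy best-fit search)."""
    rng = random.Random(seed)
    err = fit_error(pop, target)
    for _ in range(steps):
        i = rng.randrange(len(pop))
        j = rng.randrange(len(pop[0]))
        pop[i][j] ^= 1
        new_err = fit_error(pop, target)
        if new_err < err:
            err = new_err
        else:
            pop[i][j] ^= 1  # revert the unhelpful flip
    return pop, err
```

SimBA additionally fits linkage disequilibrium and subpopulation structure; the objective simply gains extra terms while the accept/revert loop stays the same.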

  18. Prediction of spatially explicit rainfall intensity-duration thresholds for post-fire debris-flow generation in the western United States

    NASA Astrophysics Data System (ADS)

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2017-02-01

    Early warning of post-fire debris-flow occurrence during intense rainfall has traditionally relied upon a library of regionally specific empirical rainfall intensity-duration thresholds. Development of this library and the calculation of rainfall intensity-duration thresholds often require several years of monitoring local rainfall and hydrologic response to rainstorms, a time-consuming approach where results are often only applicable to the specific region where data were collected. Here, we present a new, fully predictive approach that utilizes rainfall, hydrologic response, and readily available geospatial data to predict rainfall intensity-duration thresholds for debris-flow generation in recently burned locations in the western United States. Unlike the traditional approach to defining regional thresholds from historical data, the proposed methodology permits the direct calculation of rainfall intensity-duration thresholds for areas where no such data exist. The thresholds calculated by this method are demonstrated to provide predictions of similar accuracy to, and in some cases better than, previously published regional intensity-duration thresholds. The method also provides improved predictions of debris-flow likelihood, which can be incorporated into existing approaches for post-fire debris-flow hazard assessment. Our results also provide guidance for the operational expansion of post-fire debris-flow early warning systems in areas where empirically defined regional rainfall intensity-duration thresholds do not currently exist.

  19. Supervised multiblock sparse multivariable analysis with application to multimodal brain imaging genetics.

    PubMed

    Kawaguchi, Atsushi; Yamashita, Fumio

    2017-10-01

    This article proposes a procedure for describing the relationship between high-dimensional data sets, such as multimodal brain images and genetic data. We propose a supervised technique to incorporate the clinical outcome to determine a score, which is a linear combination of variables with hierarchical structures across multimodalities. This approach is expected to yield interpretable and predictive scores. The proposed method was applied to a study of Alzheimer's disease (AD). We propose a diagnostic method for AD that involves using whole-brain magnetic resonance imaging (MRI) and positron emission tomography (PET), and we select effective brain regions for the diagnostic probability and investigate the genome-wide association with the regions using single nucleotide polymorphisms (SNPs). The two-step dimension reduction method, which we previously introduced, was considered applicable to such a study and allows us to partially incorporate the proposed method. We show that the proposed method offers classification functions with feasibility and reasonable prediction accuracy based on the receiver operating characteristic (ROC) analysis and reasonable regions of the brain and genomes. Our simulation study based on the synthetic structured data set showed that the proposed method outperformed the original method and provided the characteristic for the supervised feature. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. ERPs and oscillations during encoding predict retrieval of digit memory in superior mnemonists.

    PubMed

    Pan, Yafeng; Li, Xianchun; Chen, Xi; Ku, Yixuan; Dong, Yujie; Dou, Zheng; He, Lin; Hu, Yi; Li, Weidong; Zhou, Xiaolin

    2017-10-01

    Previous studies have consistently demonstrated that superior mnemonists (SMs) outperform normal individuals in domain-specific memory tasks. However, the neural correlates of memory-related processes remain unclear. In the current EEG study, SMs and control participants performed a digit memory task during which their brain activity was recorded. Chinese SMs used a digit-image mnemonic for encoding digits, in which they associated 2-digit groups with images immediately after the presentation of each even-position digit in sequences. Behaviorally, SMs' memory of digit sequences was better than the controls'. During encoding in the study phase, SMs showed an increased right central P2 (150-250 ms post-onset) and a larger right posterior high-alpha (10-14 Hz, 500-1720 ms) oscillation on digits at even positions compared with digits at odd positions. Both P2 and high-alpha oscillations in the study phase co-varied with performance in the recall phase, but only in SMs, indicating that neural dynamics during encoding could predict successful retrieval of digit memory in SMs. Our findings suggest that representation of a digit sequence in SMs using mnemonics may recruit both the early-stage attention allocation process and the sustained information preservation process. This study provides evidence for the role of dynamic and efficient neural encoding processes in mnemonists. Copyright © 2017. Published by Elsevier Inc.

  1. Age-Dependent Effects of Catechol-O-Methyltransferase (COMT) Gene Val158Met Polymorphism on Language Function in Developing Children.

    PubMed

    Sugiura, Lisa; Toyota, Tomoko; Matsuba-Kurita, Hiroko; Iwayama, Yoshimi; Mazuka, Reiko; Yoshikawa, Takeo; Hagiwara, Hiroko

    2017-01-01

    The genetic basis controlling language development remains elusive. Previous studies of the catechol-O-methyltransferase (COMT) Val158Met genotype and cognition have focused on prefrontally guided executive functions involving dopamine. However, COMT may further influence posterior cortical regions implicated in language perception. We investigated whether COMT influences language ability and cortical language processing involving the posterior language regions in 246 children aged 6-10 years. We assessed language ability using a language test and cortical responses recorded during language processing using a word repetition task and functional near-infrared spectroscopy. The COMT genotype had significant effects on language performance and processing. Importantly, Met carriers outperformed Val homozygotes in language ability during the early elementary school years (6-8 years), whereas Val homozygotes exhibited significant language development during the later elementary school years. Both genotype groups exhibited equal language performance at approximately 10 years of age. Val homozygotes exhibited significantly less cortical activation compared with Met carriers during word processing, particularly at older ages. These findings regarding dopamine transmission efficacy may be explained by a hypothetical inverted U-shaped curve. Our findings indicate that the effects of the COMT genotype on language ability and cortical language processing may change in a narrow age window of 6-10 years. © The Author 2016. Published by Oxford University Press.

  2. Mass spectrometric detection of 27-hydroxycholesterol in breast cancer exosomes.

    PubMed

    Roberg-Larsen, Hanne; Lund, Kaja; Seterdal, Kristina Erikstad; Solheim, Stian; Vehus, Tore; Solberg, Nina; Krauss, Stefan; Lundanes, Elsa; Wilson, Steven Ray

    2017-05-01

    Exosomes from cancer cells are rich sources of biomarkers and may contain elevated levels of lipids of diagnostic value. 27-Hydroxycholesterol (27-OHC) is associated with proliferation and metastasis in estrogen receptor positive (ER+) breast cancer. In this study, we investigated the levels of 27-OHC, and other sidechain-hydroxylated oxysterols in exosomes. To study both cytoplasmic and exosomal oxysterol samples of limited size, we have developed a capillary liquid chromatography-mass spectrometry platform that outperforms our previously published systems regarding chromatographic resolution, analysis time and sensitivity. In the analyzed samples, the quantified level of cytoplasmic 27-OHC using this platform fitted with mRNA levels of 27-OHC's corresponding enzyme, CYP27A1. We find clearly increased levels of 27-OHC in exosomes (i.e., enrichment) from an ER+ breast cancer cell line (MCF-7) compared to exosomes derived from an estrogen receptor (ER-) breast cancer cell line (MDA-MB-231) and other control exosomes (non-cancerous cell line (HEK293) and human pooled serum). The exosomal oxysterol profile did not reflect cytoplasmic oxysterol profiles in the cells of origin; cytoplasmic 27-OHC was low in ER+ MCF-7 cells while high in MDA-MB-231 cells. Other control cancer cells showed varied cytoplasmic oxysterol levels. Hence, exosome profiling in cancer cells might provide complementary information with the possibility of diagnostic value. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Extracting PICO Sentences from Clinical Trial Reports using Supervised Distant Supervision

    PubMed Central

    Wallace, Byron C.; Kuiper, Joël; Sharma, Aakash; Zhu, Mingxi (Brian); Marshall, Iain J.

    2016-01-01

    Systematic reviews underpin Evidence Based Medicine (EBM) by addressing precise clinical questions via comprehensive synthesis of all relevant published evidence. Authors of systematic reviews typically define a Population/Problem, Intervention, Comparator, and Outcome (a PICO criteria) of interest, and then retrieve, appraise and synthesize results from all reports of clinical trials that meet these criteria. Identifying PICO elements in the full-texts of trial reports is thus a critical yet time-consuming step in the systematic review process. We seek to expedite evidence synthesis by developing machine learning models to automatically extract sentences from articles relevant to PICO elements. Collecting a large corpus of training data for this task would be prohibitively expensive. Therefore, we derive distant supervision (DS) with which to train models using previously conducted reviews. DS entails heuristically deriving ‘soft’ labels from an available structured resource. However, we have access only to unstructured, free-text summaries of PICO elements for corresponding articles; we must derive from these the desired sentence-level annotations. To this end, we propose a novel method – supervised distant supervision (SDS) – that uses a small amount of direct supervision to better exploit a large corpus of distantly labeled instances by learning to pseudo-annotate articles using the available DS. We show that this approach tends to outperform existing methods with respect to automated PICO extraction. PMID:27746703

  4. A novel missense-mutation-related feature extraction scheme for 'driver' mutation identification.

    PubMed

    Tan, Hua; Bao, Jiguang; Zhou, Xiaobo

    2012-11-15

    It has become widely accepted that human cancer is a disease involving dynamic changes in the genome and that missense mutations constitute the bulk of human genetic variations. A multitude of computational algorithms, especially the machine learning-based ones, has consequently been proposed to distinguish missense changes that contribute to the cancer progression ('driver' mutation) from those that do not ('passenger' mutation). However, the existing methods have multifaceted shortcomings, in the sense that they either adopt incomplete feature space or depend on protein structural databases which are usually far from integrated. In this article, we investigated multiple aspects of a missense mutation and identified a novel feature space that well distinguishes cancer-associated driver mutations from passenger ones. An index (DX score) was proposed to evaluate the discriminating capability of each feature, and a subset of these features which ranks top was selected to build the SVM classifier. Cross-validation showed that the classifier trained on our selected features significantly outperforms the existing ones both in precision and robustness. We applied our method to several datasets of missense mutations culled from published databases and the literature and obtained more reasonable results than previous studies. The software is available online at http://www.methodisthealth.com/software and https://sites.google.com/site/drivermutationidentification/. Contact: xzhou@tmhs.org. Supplementary data are available at Bioinformatics online.
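The score-then-select workflow described above can be sketched generically: score each feature by how well it separates driver from passenger examples, then keep the top-ranked subset for the classifier. The separation score and feature names below are illustrative stand-ins; the paper's exact DX formula is not reproduced here:

```python
from statistics import mean, pstdev

def dx_like_score(driver_vals, passenger_vals):
    """Illustrative per-feature discrimination score: class-mean
    separation divided by pooled spread (a stand-in, not the paper's
    exact DX definition)."""
    spread = pstdev(driver_vals) + pstdev(passenger_vals)
    return abs(mean(driver_vals) - mean(passenger_vals)) / (spread or 1.0)

def rank_features(feature_values, k):
    """feature_values maps feature name -> (driver samples, passenger
    samples); return the k most discriminating feature names, which
    would then feed the SVM classifier."""
    scored = sorted(feature_values.items(),
                    key=lambda kv: dx_like_score(*kv[1]), reverse=True)
    return [name for name, _ in scored[:k]]
```

A feature whose class distributions barely overlap scores high and survives selection; one with identical class means scores zero and is dropped.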

  5. PockDrug-Server: a new web server for predicting pocket druggability on holo and apo proteins.

    PubMed

    Hussein, Hiba Abi; Borrel, Alexandre; Geneix, Colette; Petitjean, Michel; Regad, Leslie; Camproux, Anne-Claude

    2015-07-01

    Predicting protein pocket's ability to bind drug-like molecules with high affinity, i.e. druggability, is of major interest in the target identification phase of drug discovery. Therefore, pocket druggability investigations represent a key step of compound clinical progression projects. Currently, computational druggability prediction models are tied to one unique pocket estimation method despite pocket estimation uncertainties. In this paper, we propose 'PockDrug-Server' to predict pocket druggability, efficient on both (i) estimated pockets guided by the ligand proximity (extracted by proximity to a ligand from a holo protein structure) and (ii) estimated pockets based solely on protein structure information (based on amino atoms that form the surface of potential binding cavities). PockDrug-Server provides consistent druggability results using different pocket estimation methods. It is robust with respect to pocket boundary and estimation uncertainties, thus efficient using apo pockets that are challenging to estimate. It clearly distinguishes druggable from less druggable pockets using different estimation methods and outperformed recent druggability models for apo pockets. It can be run on one or a set of apo/holo proteins using the different pocket estimation methods proposed by our web server, or on any pocket previously estimated by the user. PockDrug-Server is publicly available at: http://pockdrug.rpbs.univ-paris-diderot.fr. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Smoothing of climate time series revisited

    NASA Astrophysics Data System (ADS)

    Mann, Michael E.

    2008-08-01

    We present an easily implemented method for smoothing climate time series, generalizing upon an approach previously described by Mann (2004). The method adaptively weights the three lowest order time series boundary constraints to optimize the fit with the raw time series. We apply the method to the instrumental global mean temperature series from 1850 to 2007 and to various surrogate global mean temperature series from 1850 to 2100 derived from the CMIP3 multimodel intercomparison project. These applications demonstrate that the adaptive method systematically outperforms certain widely used default smoothing methods, and is more likely to yield accurate assessments of long-term warming trends.
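The adaptive idea can be sketched as follows: pad the series under each of the three lowest-order boundary constraints (minimum norm, minimum slope, minimum roughness), smooth each padded version, and keep the candidate that best fits the raw series. The moving-average smoother and the mean-squared-misfit criterion below stand in for the paper's lowpass filter and fit measure:

```python
def pad(x, w, constraint):
    """Pad both ends of x by w points under one boundary constraint:
    'norm' pads with the series mean, 'slope' reflects about the
    boundary, 'roughness' reflects and flips about the boundary value."""
    m = sum(x) / len(x)
    if constraint == "norm":
        left, right = [m] * w, [m] * w
    elif constraint == "slope":
        left, right = x[1:w + 1][::-1], x[-w - 1:-1][::-1]
    else:  # roughness
        left = [2 * x[0] - v for v in x[1:w + 1][::-1]]
        right = [2 * x[-1] - v for v in x[-w - 1:-1][::-1]]
    return left + x + right

def smooth(x, w=2):
    """Centered moving average of half-width w (a stand-in lowpass)."""
    k = 2 * w + 1
    return [sum(x[i - w:i + w + 1]) / k for i in range(w, len(x) - w)]

def adaptive_smooth(x, w=2):
    """Try each boundary constraint; keep the smoothed series with the
    smallest mean-squared misfit to the raw series."""
    best = None
    for c in ("norm", "slope", "roughness"):
        s = smooth(pad(x, w, c), w)
        mse = sum((a - b) ** 2 for a, b in zip(s, x)) / len(x)
        if best is None or mse < best[0]:
            best = (mse, c, s)
    return best[1], best[2]
```

For a series trending at the boundary, the minimum-roughness (reflect-and-flip) pad preserves the trend and wins; for a flat, noisy end the minimum-norm pad usually fits best.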

  7. AGU journals increase in importance according to 2010 Impact Factors

    NASA Astrophysics Data System (ADS)

    Cook, Bill

    2011-07-01

    AGU journals continue to rank highly in many categories in the 2010 Journal Citation Report (JCR), which was released by Thomson Reuters on 28 June. JCR reports on several measures of journal usage, including a journal's Eigenfactor score, its Article Influence score, its Impact Factor, and its rank within a cohort of similar journals. According to the 2010 statistics, AGU again has outperformed its larger competitors. Four different AGU titles are ranked in the top three journals in six different cohorts. The Impact Factor of several AGU journals increased significantly over the previous year.

  8. Using Objective Structured Clinical Examinations to Assess Intern Orthopaedic Physical Examination Skills: A Multimodal Didactic Comparison.

    PubMed

    Phillips, Donna; Pean, Christian A; Allen, Kathleen; Zuckerman, Joseph; Egol, Kenneth

    Patient care is 1 of the 6 core competencies defined by the Accreditation Council for Graduate Medical Education (ACGME). The physical examination (PE) is a fundamental skill to evaluate patients and make an accurate diagnosis. The purpose of this study was to investigate 3 different methods to teach PE skills and to assess the ability to do a complete PE in a simulated patient encounter. Design: prospective, uncontrolled, observational. Setting: Northeastern academic medical center. A total of 32 orthopedic surgery residents participated and were divided into 3 didactic groups: Group 1 (n = 12) live interactive lectures, demonstration on standardized patients, and textbook reading; Group 2 (n = 11) video recordings of the lectures given to Group 1 and textbook reading alone; Group 3 (n = 9): 90-minute modules taught by residents to interns in near-peer format and textbook reading. The overall score for objective structured clinical examinations from the combined groups was 66%. There was a trend toward more complete PEs in Group 1 taught via live lectures and demonstrations compared to Group 2 that relied on video recording. Near-peer taught residents from Group 3 significantly outperformed Group 2 residents overall (p = 0.02), and trended toward significantly outperforming Group 1 residents as well, with significantly higher scores in the ankle (p = 0.02) and shoulder (p = 0.02) PE cases. This study found that orthopedic interns taught musculoskeletal PE skills by near-peers outperformed other groups overall. An overall score of 66% for the combined didactic groups suggests a baseline deficit in first-year resident musculoskeletal PE skills. The PE should continue to be taught and objectively assessed throughout residency to confirm that budding surgeons have mastered these fundamental skills before going into practice. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  9. Implicit Incompressible SPH.

    PubMed

    Ihmsen, Markus; Cornelis, Jens; Solenthaler, Barbara; Horvath, Christopher; Teschner, Matthias

    2013-07-25

    We propose a novel formulation of the projection method for Smoothed Particle Hydrodynamics (SPH). We combine a symmetric SPH pressure force and an SPH discretization of the continuity equation to obtain a discretized form of the pressure Poisson equation (PPE). In contrast to previous projection schemes, our system does consider the actual computation of the pressure force. This incorporation improves the convergence rate of the solver. Furthermore, we propose to compute the density deviation based on velocities instead of positions as this formulation improves the robustness of the time-integration scheme. We show that our novel formulation outperforms previous projection schemes and state-of-the-art SPH methods. Large time steps and small density deviations of down to 0.01% can be handled in typical scenarios. The practical relevance of the approach is illustrated by scenarios with up to 40 million SPH particles.
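Discretized pressure Poisson equations of the kind described above are typically solved with a relaxed Jacobi iteration. A generic dense-matrix sketch of that solver (the paper works matrix-free on SPH particle neighborhoods; the relaxation factor and test system here are illustrative):

```python
def jacobi_solve(A, b, iterations=100, omega=0.5):
    """Relaxed Jacobi iteration for A x = b: each unknown is updated
    from the current residual of its own row, blended with its previous
    value by the relaxation factor omega (0.5 is an illustrative choice)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        x_new = []
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new.append((1 - omega) * x[i] + omega * (b[i] - s) / A[i][i])
        x = x_new
    return x
```

In the SPH setting each row of A couples a particle only to its neighbors, so the per-iteration cost stays linear in particle count, which is what makes the large particle counts quoted above feasible.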

  10. Implicit incompressible SPH.

    PubMed

    Ihmsen, Markus; Cornelis, Jens; Solenthaler, Barbara; Horvath, Christopher; Teschner, Matthias

    2014-03-01

    We propose a novel formulation of the projection method for Smoothed Particle Hydrodynamics (SPH). We combine a symmetric SPH pressure force and an SPH discretization of the continuity equation to obtain a discretized form of the pressure Poisson equation (PPE). In contrast to previous projection schemes, our system does consider the actual computation of the pressure force. This incorporation improves the convergence rate of the solver. Furthermore, we propose to compute the density deviation based on velocities instead of positions as this formulation improves the robustness of the time-integration scheme. We show that our novel formulation outperforms previous projection schemes and state-of-the-art SPH methods. Large time steps and small density deviations of down to 0.01 percent can be handled in typical scenarios. The practical relevance of the approach is illustrated by scenarios with up to 40 million SPH particles.

  11. Strong Showing for AGU Journals in 2009 Impact Factors

    NASA Astrophysics Data System (ADS)

    Cook, Bill

    2010-06-01

    AGU publishes great science, which is recognized in several ways. One of the most widely recognized is from Thomson Reuters, which provides the Journal Citation Report (JCR) each year as a component of the Web of Science®. JCR reports on several measures of journal usage, including a journal's Eigenfactor score, its Article Influence score, its Impact Factor, and its rank within a cohort of similar journals. According to the 2009 statistics released last week, AGU again has outperformed its larger competitors. For the twelfth time, two different AGU titles hold the top rank in their categories, and AGU titles hold the second spot in two other categories and third in two more.

  12. Scientific Eminence: Where Are the Women?

    PubMed

    Eagly, Alice H; Miller, David I

    2016-11-01

    Women are sparsely represented among psychologists honored for scientific eminence. However, most currently eminent psychologists started their careers when far fewer women pursued training in psychological science. Now that women earn the majority of psychology Ph.D.'s, will they predominate in the next generation's cadre of eminent psychologists? Comparing currently active female and male psychology professors on publication metrics such as the h index provides clues for answering this question. Men outperform women on the h index and its two components: scientific productivity and citations of contributions. To interpret these gender gaps, we first evaluate whether publication metrics are affected by gender bias in obtaining grant support, publishing papers, or gaining citations of published papers. We also consider whether women's chances of attaining eminence are compromised by two intertwined sets of influences: (a) gender bias stemming from social norms pertaining to gender and to science and (b) the choices that individual psychologists make in pursuing their careers. © The Author(s) 2016.

  13. Increasing the relevance of GCM simulations for Climate Services

    NASA Astrophysics Data System (ADS)

    Smith, L. A.; Suckling, E.

    2012-12-01

    The design and interpretation of model simulations for climate services differ significantly from experimental design for the advancement of the fundamental research on predictability that underpins it. Climate services consider the best sources of information available today; this calls for a frank evaluation of model skill in the face of statistical benchmarks defined by empirical models. The fact that physical simulation models are thought to provide the only reliable method for extrapolating into conditions not previously observed has no bearing on whether or not today's simulation models outperform empirical models. Evidence on the length scales on which today's simulation models fail to outperform empirical benchmarks is presented; it is illustrated that this occurs even on global scales in decadal prediction. At all timescales considered thus far (as of July 2012), predictions based on simulation models are improved by blending with the output of statistical models. Blending is shown to be more interesting in the climate context than it is in the weather context, where blending with a history-based climatology is straightforward. As GCMs improve and as the Earth's climate moves further from that of the last century, the skill from simulation models and their relevance to climate services is expected to increase. Examples from both seasonal and decadal forecasting will be used to discuss a third approach that may increase the role of current GCMs more quickly. Specifically, aspects of the experimental design in previous hindcast experiments are shown to hinder the use of GCM simulations for climate services. Alternative designs are proposed. The value in revisiting Thompson's classic approach to improving weather forecasting in the fifties in the context of climate services is discussed.

  14. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advancements in Information and Communication Technologies (ICT) have increased the demand for cloud services that share users’ private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private information involves medical, census, voter registration, social network, and customer service data. A primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that addresses these confidentiality concerns is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to meet the privacy requirements. However, the well-known privacy-preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, we propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed that the developed technique outperforms K-anonymity, L-diversity, and (α, k)-anonymity.
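
    The K-anonymity property that this record builds on can be made concrete with a minimal sketch (not the paper's heuristic; the records, field names, and bin width below are hypothetical): every combination of quasi-identifier values must be shared by at least k records, and generalization (here, binning ages) is one way to reach that requirement.

```python
# Minimal illustration of k-anonymity, assuming hypothetical records with
# quasi-identifiers "age" and "zip" and a sensitive attribute "disease".
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every quasi-identifier combination occurs in >= k records."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

def generalize_age(records, width=10):
    """Generalize exact ages into ranges, e.g. 34 -> "30-39"."""
    out = []
    for r in records:
        g = dict(r)
        low = (r["age"] // width) * width
        g["age"] = f"{low}-{low + width - 1}"
        out.append(g)
    return out

records = [
    {"age": 31, "zip": "120**", "disease": "flu"},
    {"age": 34, "zip": "120**", "disease": "asthma"},
    {"age": 38, "zip": "120**", "disease": "flu"},
    {"age": 45, "zip": "130**", "disease": "cold"},
    {"age": 47, "zip": "130**", "disease": "flu"},
]
qi = ["age", "zip"]
```

    The exact ages make every record unique on (age, zip), so the raw table fails 2-anonymity; after binning, each group holds at least two records. The limitations mentioned in the abstract (e.g. homogeneous sensitive values within a group) are what L-diversity and (α, k)-anonymity address.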

  15. Media and memory: the efficacy of video and print materials for promoting patient education about asthma.

    PubMed

    Wilson, Elizabeth A H; Park, Denise C; Curtis, Laura M; Cameron, Kenzie A; Clayman, Marla L; Makoul, Gregory; Vom Eigen, Keith; Wolf, Michael S

    2010-09-01

    We examined the effects of presentation medium on immediate and delayed recall of information and assessed the effect of giving patients take-home materials after initial presentations. Primary-care patients received video-based, print-based or no asthma education about asthma symptoms and triggers and then answered knowledge-based questions. Print participants and half the video participants received take-home print materials. A week later, available participants completed the knowledge assessment again. Participants receiving either intervention outperformed controls on immediate and delayed assessments (p<0.001). For symptom-related information, immediate performance did not significantly differ between print and video participants. A week later, receiving take-home print predicted better performance (p<0.05), as did self-reported review among recipients of take-home print (p<0.01). For content about inhaler usage, although video watchers outperformed print participants immediately after seeing the materials (p<0.001), a week later these two groups' performance did not significantly differ. Among participants given take-home materials, review predicted marginally better recall (p=0.06). Video and print interventions can promote recall of health-related information. Additionally, reviewable materials, if they are utilized, may improve retention. When creating educational tools, providers should consider how long information must be retained, its content, and the feasibility of providing tangible supporting materials. Copyright (c) 2010. Published by Elsevier Ireland Ltd.

  16. Comparative study of the effectiveness and limitations of current methods for detecting sequence coevolution.

    PubMed

    Mao, Wenzhi; Kaya, Cihan; Dutta, Anindita; Horovitz, Amnon; Bahar, Ivet

    2015-06-15

    With rapid accumulation of sequence data on several species, extracting rational and systematic information from multiple sequence alignments (MSAs) is becoming increasingly important. Currently, there is a plethora of computational methods for investigating coupled evolutionary changes in pairs of positions along the amino acid sequence, and making inferences on structure and function. Yet, the significance of coevolution signals remains to be established. Also, a large number of false positives (FPs) arise from insufficient MSA size, phylogenetic background and indirect couplings. Here, a set of 16 pairs of non-interacting proteins is thoroughly examined to assess the effectiveness and limitations of different methods. The analysis shows that recent computationally expensive methods designed to remove biases from indirect couplings outperform others in detecting tertiary structural contacts as well as eliminating intermolecular FPs; whereas traditional methods such as mutual information benefit from refinements such as shuffling, while being highly efficient. Computations repeated with 2,330 pairs of protein families from the Negatome database corroborated these results. Finally, using a training dataset of 162 families of proteins, we propose a combined method that outperforms existing individual methods. Overall, the study provides simple guidelines towards the choice of suitable methods and strategies based on available MSA size and computing resources. Software is freely available through the Evol component of ProDy API. © The Author 2015. Published by Oxford University Press.

  17. Risk-based management of invading plant disease.

    PubMed

    Hyatt-Twynam, Samuel R; Parnell, Stephen; Stutt, Richard O J H; Gottwald, Tim R; Gilligan, Christopher A; Cunniffe, Nik J

    2017-05-01

    Effective control of plant disease remains a key challenge. Eradication attempts often involve removal of host plants within a certain radius of detection, targeting asymptomatic infection. Here we develop and test potentially more effective, epidemiologically motivated, control strategies, using a mathematical model previously fitted to the spread of citrus canker in Florida. We test risk-based control, which preferentially removes hosts expected to cause a high number of infections in the remaining host population. Removals then depend on past patterns of pathogen spread and host removal, which might be nontransparent to affected stakeholders. This motivates a variable radius strategy, which approximates risk-based control via removal radii that vary by location, but which are fixed in advance of any epidemic. Risk-based control outperforms variable radius control, which in turn outperforms constant radius removal. This result is robust to changes in disease spread parameters and initial patterns of susceptible host plants. However, efficiency degrades if epidemiological parameters are incorrectly characterised. Risk-based control including additional epidemiology can be used to improve disease management, but it requires good prior knowledge for optimal performance. This focuses attention on gaining maximal information from past epidemics, on understanding model transferability between locations and on adaptive management strategies that change over time. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.

  18. HMM-based lexicon-driven and lexicon-free word recognition for online handwritten Indic scripts.

    PubMed

    Bharath, A; Madhvanath, Sriganesh

    2012-04-01

    Research for recognizing online handwritten words in Indic scripts is at its early stages when compared to Latin and Oriental scripts. In this paper, we address this problem specifically for two major Indic scripts--Devanagari and Tamil. In contrast to previous approaches, the techniques we propose are largely data driven and script independent. We propose two different techniques for word recognition based on Hidden Markov Models (HMM): lexicon driven and lexicon free. The lexicon-driven technique models each word in the lexicon as a sequence of symbol HMMs according to a standard symbol writing order derived from the phonetic representation. The lexicon-free technique uses a novel Bag-of-Symbols representation of the handwritten word that is independent of symbol order and allows rapid pruning of the lexicon. On handwritten Devanagari word samples featuring both standard and nonstandard symbol writing orders, a combination of lexicon-driven and lexicon-free recognizers significantly outperforms either of them used in isolation. In contrast, most Tamil word samples feature the standard symbol order, and the lexicon-driven recognizer outperforms the lexicon free one as well as their combination. The best recognition accuracies obtained for 20,000 word lexicons are 87.13 percent for Devanagari when the two recognizers are combined, and 91.8 percent for Tamil using the lexicon-driven technique.
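
    The order-independence of the Bag-of-Symbols representation described above can be sketched with a toy pruning step (this is not the authors' recognizer; the symbol inventory and words below are Latin-letter stand-ins for Devanagari/Tamil symbol sequences): compare the multiset of symbols hypothesized for the ink with each lexicon entry's multiset, ignoring writing order.

```python
# Sketch of bag-of-symbols lexicon pruning, assuming toy single-character
# "symbols"; real systems would use script-specific symbol hypotheses.
from collections import Counter

def bag_distance(a, b):
    """Size of the symmetric difference of two symbol multisets."""
    ca, cb = Counter(a), Counter(b)
    return sum(((ca - cb) + (cb - ca)).values())

def prune_lexicon(observed_symbols, lexicon, max_dist=1):
    """Keep only lexicon entries whose symbol bag is close to the observation."""
    obs = Counter(observed_symbols)
    return [w for w in lexicon if bag_distance(obs, Counter(w)) <= max_dist]

lexicon = ["karna", "karan", "kiran", "naman", "raman"]
# Hypothesized symbols for the ink, in an arbitrary (nonstandard) order:
observed = list("narka")
candidates = prune_lexicon(observed, lexicon)
```

    Because only the multiset matters, words written in a nonstandard symbol order still match their lexicon entries, which is exactly why the lexicon-free recognizer helps on the nonstandard Devanagari samples.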

  19. Adult Cleaner Wrasse Outperform Capuchin Monkeys, Chimpanzees and Orang-utans in a Complex Foraging Task Derived from Cleaner – Client Reef Fish Cooperation

    PubMed Central

    Proctor, Darby; Essler, Jennifer; Pinto, Ana I.; Wismer, Sharon; Stoinski, Tara; Brosnan, Sarah F.; Bshary, Redouan

    2012-01-01

    The insight that animals' cognitive abilities are linked to their evolutionary history, and hence their ecology, provides the framework for the comparative approach. Despite primates' renowned dietary complexity and social cognition, including cooperative abilities, we here demonstrate that cleaner wrasse outperform three primate species, capuchin monkeys, chimpanzees and orang-utans, in a foraging task involving a choice between two actions, both of which yield identical immediate rewards, but only one of which yields an additional delayed reward. The foraging task decisions involve partner choice in cleaners: they must service visiting client reef fish before resident clients to access both; otherwise the former switch to a different cleaner. Wild caught adult, but not juvenile, cleaners learned to solve the task quickly and relearned the task when it was reversed. The majority of primates failed to perform above chance after 100 trials, which is in sharp contrast to previous studies showing that primates easily learn to choose an action that yields immediate double rewards compared to an alternative action. In conclusion, the adult cleaners' ability to choose a superior action with initially neutral consequences is likely due to repeated exposure in nature, which leads to specific learned optimal foraging decision rules. PMID:23185293

  20. Deep Networks Can Resemble Human Feed-forward Vision in Invariant Object Recognition

    PubMed Central

    Kheradpisheh, Saeed Reza; Ghodrati, Masoud; Ganjtabesh, Mohammad; Masquelier, Timothée

    2016-01-01

    Deep convolutional neural networks (DCNNs) have attracted much attention recently, and have been shown to recognize thousands of object categories in natural image databases. Their architecture is somewhat similar to that of the human visual system: both use restricted receptive fields, and a hierarchy of layers which progressively extract more and more abstracted features. Yet it is unknown whether DCNNs match human performance at the task of view-invariant object recognition, whether they make similar errors and use similar representations for this task, and whether the answers depend on the magnitude of the viewpoint variations. To investigate these issues, we benchmarked eight state-of-the-art DCNNs, the HMAX model, and a baseline shallow model and compared their results to those of humans with backward masking. Unlike in all previous DCNN studies, we carefully controlled the magnitude of the viewpoint variations to demonstrate that shallow nets can outperform deep nets and humans when variations are weak. When facing larger variations, however, more layers were needed to match human performance and error distributions, and to have representations that are consistent with human behavior. A very deep net with 18 layers even outperformed humans at the highest variation level, using the most human-like representations. PMID:27601096

  1. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition

    PubMed Central

    Ordóñez, Francisco Javier; Roggen, Daniel

    2016-01-01

    Human activity recognition (HAR) tasks have traditionally been solved using engineered features obtained by heuristic processes. Current research suggests that deep convolutional neural networks are suited to automate feature extraction from raw sensor inputs. However, human activities are made of complex sequences of motor movements, and capturing these temporal dynamics is fundamental for successful HAR. Based on the recent success of recurrent neural networks for time series domains, we propose a generic deep framework for activity recognition based on convolutional and LSTM recurrent units, which: (i) is suitable for multimodal wearable sensors; (ii) can perform sensor fusion naturally; (iii) does not require expert knowledge in designing features; and (iv) explicitly models the temporal dynamics of feature activations. We evaluate our framework on two datasets, one of which has been used in a public activity recognition challenge. Our results show that our framework outperforms competing deep non-recurrent networks on the challenge dataset by 4% on average, and improves on some previously reported results by up to 9%. The framework can be applied to homogeneous sensor modalities, but can also fuse multimodal sensors to improve performance. We characterise the influence of key architectural hyperparameters on performance to provide insights into their optimisation. PMID:26797612

  2. Effective Diagnosis of Alzheimer's Disease by Means of Association Rules

    NASA Astrophysics Data System (ADS)

    Chaves, R.; Ramírez, J.; Górriz, J. M.; López, M.; Salas-Gonzalez, D.; Illán, I.; Segovia, F.; Padilla, P.

    In this paper we present a novel classification method for SPECT images for the early diagnosis of Alzheimer's disease (AD). The proposed method is based on Association Rules (ARs), aiming to discover interesting associations between attributes contained in the database. The system first uses voxel-as-features (VAF) and Activation Estimation (AE) to find three-dimensional activated brain regions of interest (ROIs) for each patient. These ROIs then serve as inputs for mining ARs between activated blocks for controls, with a specified minimum support and minimum confidence. ARs are mined in supervised mode, using information previously extracted from the most discriminant rules to focus on the relevant brain areas, reducing the computational requirements of the system. Finally, classification is performed according to the number of previously mined rules verified by each subject, yielding up to 95.87% classification accuracy and thus outperforming recently developed methods for AD diagnosis.
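
    The support/confidence machinery behind AR mining can be shown with a toy example (this is not the paper's pipeline; the "transactions" below are hypothetical sets of activated blocks labelled b1..b4, standing in for the activated SPECT ROIs):

```python
# Toy association-rule miner over pairs: a rule a => b is kept when the pair's
# support and the confidence P(b | a) clear the given thresholds.
from itertools import combinations

def mine_rules(transactions, min_support, min_confidence):
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    rules = []
    for a, b in combinations(items, 2):
        both = sum(1 for t in transactions if a in t and b in t) / n
        supp_a = sum(1 for t in transactions if a in t) / n
        if both >= min_support and both / supp_a >= min_confidence:
            rules.append((a, b, both, both / supp_a))
    return rules

# Hypothetical activation patterns for four control subjects:
controls = [
    {"b1", "b2", "b3"},
    {"b1", "b2"},
    {"b1", "b2", "b4"},
    {"b1", "b3"},
]
rules = mine_rules(controls, min_support=0.5, min_confidence=0.7)
```

    Here only b1 => b2 survives both thresholds; classification in the abstract then amounts to counting how many such control-derived rules a new subject's activation pattern verifies.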

  3. SASS Applied to Optimum Work Roll Profile Selection in the Hot Rolling of Wide Steel

    NASA Astrophysics Data System (ADS)

    Nolle, Lars

    The quality of steel strip produced in a wide strip rolling mill depends heavily on the careful selection of initial ground work roll profiles for each of the mill stands in the finishing train. In the past, these profiles were determined by human experts, based on their knowledge and experience. In previous work, the profiles were successfully optimised using a self-organising migration algorithm (SOMA). In this research, SASS, a novel heuristic optimisation algorithm that has only one control parameter, has been used to find the optimum profiles for a simulated rolling mill. The resulting strip quality produced using the profiles found by SASS is compared with results from previous work and with the quality produced using the original profile specifications. The best set of profiles found by SASS clearly outperformed the original set and performed as well as SOMA, without the need to find a suitable set of control parameters.

  4. Correlation Filter Learning Toward Peak Strength for Visual Tracking.

    PubMed

    Sui, Yao; Wang, Guanghui; Zhang, Li

    2018-04-01

    This paper presents a novel visual tracking approach to correlation filter learning toward peak strength of correlation response. Previous methods leverage all features of the target and the immediate background to learn a correlation filter. Some features, however, may be distractive to tracking, like those from occlusion and local deformation, resulting in unstable tracking performance. This paper aims at solving this issue and proposes a novel algorithm to learn the correlation filter. The proposed approach, by imposing an elastic net constraint on the filter, can adaptively eliminate those distractive features in the correlation filtering. A new peak strength metric is proposed to measure the discriminative capability of the learned correlation filter. It is demonstrated that the proposed approach effectively strengthens the peak of the correlation response, leading to more discriminative performance than previous methods. Extensive experiments on a challenging visual tracking benchmark demonstrate that the proposed tracker outperforms most state-of-the-art methods.

  5. Extracting Cross-Ontology Weighted Association Rules from Gene Ontology Annotations.

    PubMed

    Agapito, Giuseppe; Milano, Marianna; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-01-01

    Gene Ontology (GO) is a structured repository of concepts (GO Terms) that are associated with one or more gene products through a process referred to as annotation. The analysis of annotated data is an important opportunity for bioinformatics. Among the different approaches to this analysis, the use of association rules (ARs) provides useful knowledge by discovering biologically relevant associations between GO terms that were not previously known. In previous work, we introduced GO-WAR (Gene Ontology-based Weighted Association Rules), a methodology for extracting weighted association rules from ontology-based annotated datasets. Here we adapt the GO-WAR algorithm to mine cross-ontology association rules, i.e., rules that involve GO terms present in the three sub-ontologies of GO. We conduct a thorough performance evaluation of GO-WAR by mining publicly available GO-annotated datasets, showing how GO-WAR outperforms current state-of-the-art approaches.

  6. Learning polynomial feedforward neural networks by genetic programming and backpropagation.

    PubMed

    Nikolaev, N Y; Iba, H

    2003-01-01

    This paper presents an approach to learning polynomial feedforward neural networks (PFNNs). The approach suggests, first, finding the polynomial network structure by means of a population-based search technique relying on the genetic programming paradigm, and second, further adjustment of the best discovered network weights by an especially derived backpropagation algorithm for higher order networks with polynomial activation functions. These two stages of the PFNN learning process enable us to identify networks with good training as well as generalization performance. Empirical results show that this approach finds PFNNs that considerably outperform some previous constructive polynomial network algorithms on benchmark time-series processing.

  7. A SAT Based Effective Algorithm for the Directed Hamiltonian Cycle Problem

    NASA Astrophysics Data System (ADS)

    Jäger, Gerold; Zhang, Weixiong

    The Hamiltonian cycle problem (HCP) is an important combinatorial problem with applications in many areas. While thorough theoretical and experimental analyses have been made on the HCP in undirected graphs, little is known for the HCP in directed graphs (DHCP). The contribution of this work is an effective algorithm for the DHCP. Our algorithm explores and exploits the close relationship between the DHCP and the Assignment Problem (AP) and utilizes a technique based on Boolean satisfiability (SAT). By combining effective algorithms for the AP and SAT, our algorithm significantly outperforms previous exact DHCP algorithms including an algorithm based on the award-winning Concorde TSP algorithm.
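
    The abstract does not spell out the AP+SAT algorithm, so it is not reproduced here; a brute-force checker (viable only for tiny graphs, which is precisely why effective algorithms like the authors' are needed) at least makes the DHCP concrete: find a directed cycle that visits every vertex exactly once.

```python
# Brute-force DHCP check for small directed graphs; the example graphs are
# hypothetical. Exact solvers like the authors' AP+SAT approach scale far
# beyond what enumeration of (n-1)! tours can handle.
from itertools import permutations

def has_directed_hamiltonian_cycle(n, edges):
    adj = set(edges)
    for perm in permutations(range(1, n)):      # fix vertex 0 to break symmetry
        tour = (0,) + perm
        if all((tour[i], tour[(i + 1) % n]) in adj for i in range(n)):
            return True
    return False

# A directed 4-cycle plus a chord contains a Hamiltonian cycle...
cycle_graph = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
# ...whereas an acyclic orientation cannot contain any directed cycle at all.
dag = [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3)]
```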

  8. iPARTS2: an improved tool for pairwise alignment of RNA tertiary structures, version 2.

    PubMed

    Yang, Chung-Han; Shih, Cheng-Ting; Chen, Kun-Tze; Lee, Po-Han; Tsai, Ping-Han; Lin, Jian-Cheng; Yen, Ching-Yu; Lin, Tiao-Yin; Lu, Chin Lung

    2016-07-08

    Since its first release in 2010, iPARTS has become a valuable tool for globally or locally aligning two RNA 3D structures. It was implemented by a structural alphabet (SA)-based approach, which uses an SA of 23 letters to reduce RNA 3D structures into 1D sequences of SA letters and applies traditional sequence alignment to these SA-encoded sequences for determining their global or local similarity. In this version, we have re-implemented iPARTS into a new web server iPARTS2 by constructing a totally new SA, which consists of 92 elements with each carrying both information of base and backbone geometry for a representative nucleotide. This SA is significantly different from the one used in iPARTS, because the latter consists of only 23 elements with each carrying only the backbone geometry information of a representative nucleotide. Our experimental results have shown that iPARTS2 outperforms its previous version iPARTS and also achieves better accuracy than other popular tools, such as SARA, SETTER and RASS, in RNA alignment quality and function prediction. iPARTS2 takes as input two RNA 3D structures in the PDB format and outputs their global or local alignments with graphical display. iPARTS2 is now available online at http://genome.cs.nthu.edu.tw/iPARTS2/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Theory of mind and executive function during middle childhood across cultures.

    PubMed

    Wang, Zhenlin; Devine, Rory T; Wong, Keri K; Hughes, Claire

    2016-09-01

    Previous studies with preschoolers have reported "East-West" contrasts in children's executive function (East>West) and theory of mind (East<West).

  10. NeEMO: a method using residue interaction networks to improve prediction of protein stability upon mutation.

    PubMed

    Giollo, Manuel; Martin, Alberto J M; Walsh, Ian; Ferrari, Carlo; Tosatto, Silvio C E

    2014-01-01

    The rapid growth of un-annotated missense variants poses challenges requiring novel strategies for their interpretation. From the thermodynamic point of view, amino acid changes can lead to a change in the internal energy of a protein and induce structural rearrangements. This is of great relevance for the study of diseases and protein design, justifying the development of prediction methods for variant-induced stability changes. Here we propose NeEMO, a tool for the evaluation of stability changes using an effective representation of proteins based on residue interaction networks (RINs). RINs are used to extract useful features describing interactions of the mutant amino acid with its structural environment. Benchmarking shows NeEMO to be very effective, allowing reliable predictions in different parts of the protein such as β-strands and buried residues. Validation on a previously published independent dataset shows that NeEMO has a Pearson correlation coefficient of 0.77 and a standard error of 1 kcal/mol, outperforming nine recent methods. The NeEMO web server can be freely accessed from URL: http://protein.bio.unipd.it/neemo/. NeEMO offers an innovative and reliable tool for the annotation of amino acid changes. A key contribution is the use of RINs, which can be used for modeling proteins and their interactions effectively. Interestingly, the approach is very general, and can motivate the development of a new family of RIN-based protein structure analyzers. NeEMO may suggest innovative strategies for bioinformatics tools beyond protein stability prediction.

  11. The role of first impression in operant learning.

    PubMed

    Shteingart, Hanan; Neiman, Tal; Loewenstein, Yonatan

    2013-05-01

    We quantified the effect of first experience on behavior in operant learning and studied its underlying computational principles. To that end, we analyzed more than 200,000 choices in a repeated-choice experiment. We found that the outcome of the first experience has a substantial and lasting effect on participants' subsequent behavior, which we term outcome primacy. We found that this outcome primacy can account for much of the underweighting of rare events, where participants apparently underestimate small probabilities. We modeled behavior in this task using a standard, model-free reinforcement learning algorithm. In this model, the values of the different actions are learned over time and are used to determine the next action according to a predefined action-selection rule. We used a novel nonparametric method to characterize this action-selection rule and showed that the substantial effect of first experience on behavior is consistent with the reinforcement learning model if we assume that the outcome of first experience resets the values of the experienced actions, but not if we assume arbitrary initial conditions. Moreover, our resetting model outperforms previously published models in predicting aggregate choice behavior. These findings suggest that first experience has a disproportionately large effect on subsequent actions, similar to primacy effects in other fields of cognitive psychology. The mechanism of resetting of the initial conditions that underlies outcome primacy may thus also account for other forms of primacy. PsycINFO Database Record (c) 2013 APA, all rights reserved.
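
    The resetting assumption described above can be sketched in a few lines (a hypothetical illustration, not the authors' fitted model; the learning rate and two-armed task below are made up): a standard incremental value update, except that an action's first observed outcome replaces its arbitrary initial value instead of being averaged into it.

```python
# Value update with first-outcome resetting, assuming a hypothetical
# learning rate alpha and a toy two-action task.
def update(value, seen, outcome, alpha=0.1):
    if not seen:
        return outcome, True                    # first experience resets the value
    return value + alpha * (outcome - value), True  # later outcomes: incremental

v = {"A": 0.0, "B": 0.0}
seen = {"A": False, "B": False}

v["A"], seen["A"] = update(v["A"], seen["A"], 1.0)   # first outcome dominates
v["A"], seen["A"] = update(v["A"], seen["A"], 0.0)   # later outcomes nudge it
v["B"], seen["B"] = update(v["B"], seen["B"], 0.0)
```

    After one good and one bad outcome, action A's value sits at 0.9 rather than near the sample mean of 0.5, which is the "outcome primacy" the abstract describes: the first experience carries disproportionate, lasting weight.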

  12. De-identification of patient notes with recurrent neural networks.

    PubMed

    Dernoncourt, Franck; Lee, Ji Young; Uzuner, Ozlem; Szolovits, Peter

    2017-05-01

    Patient notes in electronic health records (EHRs) may contain critical information for medical investigations. However, the vast majority of medical investigators can only access de-identified notes, in order to protect the confidentiality of patients. In the United States, the Health Insurance Portability and Accountability Act (HIPAA) defines 18 types of protected health information that must be removed to de-identify patient notes. Manual de-identification is impractical given the size of electronic health record databases, the limited number of researchers with access to non-de-identified notes, and the frequent mistakes of human annotators. A reliable automated de-identification system would consequently be of high value. We introduce the first de-identification system based on artificial neural networks (ANNs), which requires no handcrafted features or rules, unlike existing systems. We compare the performance of the system with state-of-the-art systems on two datasets: the i2b2 2014 de-identification challenge dataset, which is the largest publicly available de-identification dataset, and the MIMIC de-identification dataset, which we assembled and is twice as large as the i2b2 2014 dataset. Our ANN model outperforms the state-of-the-art systems. It yields an F1-score of 97.85 on the i2b2 2014 dataset, with a recall of 97.38 and a precision of 98.32, and an F1-score of 99.23 on the MIMIC de-identification dataset, with a recall of 99.25 and a precision of 99.21. Our findings support the use of ANNs for de-identification of patient notes, as they show better performance than previously published systems while requiring no manual feature engineering. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
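
    The F1-scores quoted above are the harmonic mean of precision and recall, and both reported values can be reproduced directly from the precision and recall figures in the abstract:

```python
# F1 as the harmonic mean of precision and recall, using the precision/recall
# values reported in the abstract.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

i2b2_f1 = round(f1(98.32, 97.38), 2)    # i2b2 2014 dataset
mimic_f1 = round(f1(99.21, 99.25), 2)   # MIMIC dataset
```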

  13. Detecting causality from online psychiatric texts using inter-sentential language patterns

    PubMed Central

    2012-01-01

    Background Online psychiatric texts are natural language texts expressing depressive problems, published by Internet users via community-based web services such as web forums, message boards and blogs. Understanding the cause-effect relations embedded in these psychiatric texts can provide insight into the authors’ problems, thus increasing the effectiveness of online psychiatric services. Methods Previous studies have proposed the use of word pairs extracted from a set of sentence pairs to identify cause-effect relations between sentences. A word pair is made up of two words, with one coming from the cause text span and the other from the effect text span. Analysis of the relationship between these words can be used to capture individual word associations between cause and effect sentences. For instance, (broke up, life) and (boyfriend, meaningless) are two word pairs extracted from the sentence pair: “I broke up with my boyfriend. Life is now meaningless to me”. The major limitation of word pairs is that individual words in sentences usually cannot reflect the exact meaning of the cause and effect events, and thus may produce semantically incomplete word pairs, as the previous examples show. Therefore, this study proposes the use of inter-sentential language patterns such as <broke up, boyfriend>,

  14. Augmenting Conceptual Design Trajectory Tradespace Exploration with Graph Theory

    NASA Technical Reports Server (NTRS)

    Dees, Patrick D.; Zwack, Mathew R.; Steffens, Michael; Edwards, Stephen

    2016-01-01

    Within conceptual design, changes occur rapidly due to a combination of uncertainty and shifting requirements. To stay relevant in this fluid time, trade studies must also be performed rapidly. In order to drive down analysis time while improving the information gained by these studies, surrogate models can be created to represent the complex output of a tool or tools within a specified tradespace. In order to create this model, however, a large amount of data must be collected in a short amount of time. Given this requirement, the historical approach of relying on subject matter experts to generate the required data is infeasible within schedule. However, by implementing automation and distributed analysis, the required data can be generated in a fraction of the time. Previous work focused on setting up a tool called multiPOST, capable of orchestrating many simultaneous runs of an analysis tool and assessing these automated analyses using heuristics gleaned from the best practices of current subject matter experts. In this update to the previous work, elements of graph theory are included to further drive down analysis time by leveraging previously gathered data. The updated method is shown to outperform the previous one in both the time required and the quantity and quality of data produced.

  15. Interior search algorithm (ISA): a novel approach for global optimization.

    PubMed

    Gandomi, Amir H

    2014-07-01

    This paper presents the interior search algorithm (ISA) as a novel method for solving optimization tasks. The proposed ISA is inspired by interior design and decoration. The algorithm differs from other metaheuristic algorithms and provides new insight for global optimization. The proposed method is verified using benchmark mathematical and engineering problems commonly used in the area of optimization. ISA results are further compared with well-known optimization algorithms. The results show that ISA is capable of efficiently solving optimization problems and can outperform the other well-known algorithms. Further, the proposed algorithm is very simple and has only one parameter to tune. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Application of machine learning techniques to analyse the effects of physical exercise in ventricular fibrillation.

    PubMed

    Caravaca, Juan; Soria-Olivas, Emilio; Bataller, Manuel; Serrano, Antonio J; Such-Miquel, Luis; Vila-Francés, Joan; Guerrero, Juan F

    2014-02-01

    This work presents the application of machine learning techniques to analyse the influence of physical exercise on the physiological properties of the heart during ventricular fibrillation. To this end, different kinds of classifiers (linear and neural models) are used to classify between trained and sedentary rabbit hearts. The use of these classifiers in combination with a wrapper feature selection algorithm allows knowledge to be extracted about the most relevant features in the problem. The obtained results show that neural models outperform linear classifiers (better performance indices and better dimensionality reduction). The most relevant features for describing the benefits of physical exercise are those related to myocardial heterogeneity, mean activation rate and activation complexity. © 2013 Published by Elsevier Ltd.

  17. On the Quantification of Cellular Velocity Fields.

    PubMed

    Vig, Dhruv K; Hamby, Alex E; Wolgemuth, Charles W

    2016-04-12

    The application of flow visualization in biological systems is becoming increasingly common in studies ranging from intracellular transport to the movements of whole organisms. In cell biology, the standard method for measuring cell-scale flows and/or displacements has been particle image velocimetry (PIV); however, alternative methods exist, such as optical flow constraint. Here we review PIV and optical flow, focusing on the accuracy and efficiency of these methods in the context of cellular biophysics. Although optical flow is not as common, a relatively simple implementation of this method can outperform PIV and is easily augmented to extract additional biophysical/chemical information such as local vorticity or net polymerization rates from speckle microscopy. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
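The optical flow constraint mentioned above, I_x·v + I_t = 0, can be illustrated in one dimension with a least-squares solve. This is a minimal NumPy sketch on synthetic data, not the implementation evaluated in the paper:

```python
import numpy as np

def optical_flow_1d(f0, f1, dx=1.0, dt=1.0):
    """Estimate a single translation velocity from two 1-D intensity
    profiles via the optical flow constraint I_x * v + I_t = 0,
    solved in a least-squares sense over all samples."""
    Ix = np.gradient(f0, dx)          # spatial intensity derivative
    It = (f1 - f0) / dt               # temporal intensity derivative
    # Least-squares solution: v = -sum(Ix * It) / sum(Ix^2)
    return -np.sum(Ix * It) / np.sum(Ix ** 2)

# Synthetic test: a Gaussian bump translated by 0.5 units per frame
x = np.linspace(-10, 10, 400)
v_true = 0.5
f0 = np.exp(-x ** 2)
f1 = np.exp(-(x - v_true) ** 2)
v_est = optical_flow_1d(f0, f1, dx=x[1] - x[0])
```

Because the constraint linearizes the intensity change, the estimate is only approximate for larger displacements; practical cell-scale implementations add windowing and regularization.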

  18. Gender differences in recognition of toy faces suggest a contribution of experience.

    PubMed

    Ryan, Kaitlin F; Gauthier, Isabel

    2016-12-01

    When there is a gender effect, women perform better than men in face recognition tasks. Prior work has not documented a male advantage on a face recognition task, suggesting that women may outperform men at face recognition generally, either for evolutionary reasons or through the influence of social roles. Here, we question the idea that women excel at all face recognition and provide a proof of concept based on a face category for which men outperform women. We developed a test of face learning to measure individual differences with face categories for which men and women may differ in experience, using the faces of Barbie dolls and of Transformers. The results show a crossover interaction between subject gender and category, where men outperform women with Transformers' faces. We demonstrate that men can outperform women with some categories of faces, suggesting that explanations for a general face recognition advantage for women are in fact not needed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. The AJCC 8th Edition Staging System for Soft Tissue Sarcoma of the Extremities or Trunk: A Cohort Study of the SEER Database.

    PubMed

    Cates, Justin M M

    2018-02-01

    Background: The AJCC recently published the 8th edition of its cancer staging system. Significant changes were made to the staging algorithm for soft tissue sarcoma (STS) of the extremities or trunk, including the addition of 2 new T (size) classifications in lieu of tumor depth and the grouping of lymph node metastasis (LNM) with distant metastasis as stage IV disease. Whether these changes improve staging system performance is questionable. Patients and Methods: This retrospective cohort analysis of 21,396 adult patients with STS of the extremity or trunk in the SEER database compares the AJCC 8th edition staging system with the 7th edition and a newly proposed staging algorithm using a variety of statistical techniques. The effect of tumor size on disease-specific survival was assessed by flexible, nonlinear Cox proportional hazard regression using restricted cubic splines and fractional polynomials. Results: The slope of covariate-adjusted log hazards for sarcoma-specific survival decreases for tumors >8 cm in greatest dimension, limiting the prognostic information contributed by the new T4 classification in the AJCC 8th edition. Anatomic depth independently provides significant prognostic information. LNM is not equivalent to distant, non-nodal metastasis. Based on these findings, an alternative staging system is proposed and demonstrated to outperform both AJCC staging schemes. The analyses presented also disclose no evidence of improved clinical performance of the 8th edition compared with the previous edition. Conclusions: The AJCC 8th edition staging system for STS is no better than the previous 7th edition. Instead, a proposed staging system based on histologic grade, tumor size, and anatomic depth shows significantly higher predictive accuracy, with higher model concordance than either AJCC staging system. Changes to existing staging systems should improve the performance of prognostic models.
Until such improvements are documented, AJCC committees should refrain from modifying established staging schemes. Copyright © 2018 by the National Comprehensive Cancer Network.

  20. Functional Status Outperforms Comorbidities as a Predictor of 30-Day Acute Care Readmissions in the Inpatient Rehabilitation Population.

    PubMed

    Shih, Shirley L; Zafonte, Ross; Bates, David W; Gerrard, Paul; Goldstein, Richard; Mix, Jacqueline; Niewczyk, Paulette; Greysen, S Ryan; Kazis, Lewis; Ryan, Colleen M; Schneider, Jeffrey C

    2016-10-01

    Functional status is associated with patient outcomes, but is rarely included in hospital readmission risk models. The objective of this study was to determine whether functional status is a better predictor of 30-day acute care readmission than traditionally investigated variables including demographics and comorbidities. Retrospective database analysis between 2002 and 2011. 1158 US inpatient rehabilitation facilities. 4,199,002 inpatient rehabilitation facility admissions comprising patients from 16 impairment groups within the Uniform Data System for Medical Rehabilitation database. Logistic regression models predicting 30-day readmission were developed based on age, gender, comorbidities (Elixhauser comorbidity index, Deyo-Charlson comorbidity index, and Medicare comorbidity tier system), and functional status [Functional Independence Measure (FIM)]. We hypothesized that (1) function-based models would outperform demographic- and comorbidity-based models and (2) the addition of demographic and comorbidity data would not significantly enhance function-based models. For each impairment group, Function Only Models were compared against Demographic-Comorbidity Models and Function Plus Models (Function-Demographic-Comorbidity Models). The primary outcome was 30-day readmission, and the primary measure of model performance was the c-statistic. All-cause 30-day readmission rate from inpatient rehabilitation facilities to acute care hospitals was 9.87%. C-statistics for the Function Only Models were 0.64 to 0.70. For all 16 impairment groups, the Function Only Model demonstrated better c-statistics than the Demographic-Comorbidity Models (c-statistic difference: 0.03-0.12). The best-performing Function Plus Models exhibited negligible improvements in model performance compared to Function Only Models, with c-statistic improvements of only 0.01 to 0.05. 
Readmissions are currently used as a marker of hospital performance, with recent financial penalties to hospitals for excessive readmissions. Function-based readmission models outperform models based only on demographics and comorbidities. Readmission risk models would benefit from the inclusion of functional status as a primary predictor. Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
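The c-statistic used to compare the models above is the probability that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted patient (equivalent to the ROC AUC). A minimal sketch with hypothetical risks and outcomes:

```python
def c_statistic(scores, labels):
    """Concordance probability (equivalent to ROC AUC):
    P(score_positive > score_negative), ties counted as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    total = concordant = 0.0
    for p in pos:
        for n in neg:
            concordant += 1.0 if p > n else (0.5 if p == n else 0.0)
            total += 1
    return concordant / total

# Hypothetical predicted readmission risks and observed outcomes
risks  = [0.9, 0.8, 0.3, 0.2, 0.6]
labels = [1,   0,   1,   0,   0]
c = c_statistic(risks, labels)   # 4 of 6 positive/negative pairs concordant
```

A value of 0.5 is chance discrimination and 1.0 is perfect ranking, which is why c-statistics of 0.64 to 0.70 indicate moderate discriminative ability.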

  1. The Role of Instructional Quality within School Sectors: A Multi-Level Analysis

    ERIC Educational Resources Information Center

    Miller, Saralyn J.

    2013-01-01

    On average, private school students outperform public school students on standardized tests. Research confirms these differences in student scores, but also shows that when student background characteristics are controlled, public school students on average outperform private school students. Explaining achievement differences between sectors…

  2. From free energy to expected energy: Improving energy-based value function approximation in reinforcement learning.

    PubMed

    Elfwing, Stefan; Uchibe, Eiji; Doya, Kenji

    2016-12-01

    Free-energy based reinforcement learning (FERL) was proposed for learning in high-dimensional state and action spaces. However, the FERL method only works well with binary, or close to binary, state input, where the number of active states is fewer than the number of non-active states. In the FERL method, the value function is approximated by the negative free energy of a restricted Boltzmann machine (RBM). In our earlier study, we demonstrated that the performance and the robustness of the FERL method can be improved by scaling the free energy by a constant related to the size of the network. In this study, we propose that RBM function approximation can be further improved by approximating the value function with the negative expected energy (EERL) instead of the negative free energy; this also enables handling of continuous state input. We validate our proposed method by demonstrating that EERL: (1) outperforms FERL, as well as standard neural network and linear function approximation, for three versions of a gridworld task with high-dimensional image state input; (2) achieves new state-of-the-art results in stochastic SZ-Tetris in both model-free and model-based learning settings; and (3) significantly outperforms FERL and standard neural network function approximation for a robot navigation task with raw and noisy RGB images as state input and a large number of actions. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
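The two value estimates can be sketched for a small RBM with random weights (hypothetical sizes, not the authors' code). The negative free energy used in FERL exceeds the negative expected energy used in EERL by exactly the entropy of the conditional hidden distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid = 6, 4                          # hypothetical sizes
W = rng.normal(0, 0.1, (n_hid, n_vis))       # weights
b = rng.normal(0, 0.1, n_vis)                # visible biases
c = rng.normal(0, 0.1, n_hid)                # hidden biases
s = rng.integers(0, 2, n_vis).astype(float)  # binary state input

a = c + W @ s                       # hidden unit pre-activations
p = 1.0 / (1.0 + np.exp(-a))        # P(h_j = 1 | s)

# FERL-style value: negative free energy, -F(s) = b.s + sum_j softplus(a_j)
v_free = b @ s + np.sum(np.logaddexp(0.0, a))
# EERL-style value: negative expected energy under p(h | s)
v_expected = b @ s + p @ a
```

Since softplus(a_j) = p_j·a_j + H(p_j), the free-energy value is always at least the expected-energy value, with the gap given by the hidden units' conditional entropy.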

  3. Heuristics as Bayesian inference under extreme priors.

    PubMed

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
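The continuum described above can be illustrated with Bayesian linear regression whose prior mean sits at unit ("tallying") weights: with prior strength zero the posterior mean is ordinary least squares, and as the prior strength grows without bound it collapses to tallying. A sketch on synthetic data (all values hypothetical):

```python
import numpy as np

def posterior_weights(X, y, lam, mu0):
    """Posterior mean of Bayesian linear regression with
    prior w ~ N(mu0, I / lam) and unit noise variance.
    lam = 0 recovers ordinary least squares."""
    d = X.shape[1]
    A = X.T @ X + lam * np.eye(d)
    return np.linalg.solve(A, X.T @ y + lam * mu0)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
w_true = np.array([0.9, 0.6, 0.2])
y = X @ w_true + 0.1 * rng.normal(size=50)

mu0 = np.ones(3)                                # tallying: equal unit weights
w_ols    = posterior_weights(X, y, 0.0, mu0)    # ordinary regression
w_strong = posterior_weights(X, y, 1e6, mu0)    # heuristic (strong-prior) limit
```

Intermediate values of lam trace out the continuum the paper describes, where moderately strong priors can generalize best on small samples.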

  4. Clinical studies of fiber posts: a literature review.

    PubMed

    Cagidiaco, Maria C; Goracci, Cecilia; Garcia-Godoy, Franklin; Ferrari, Marco

    2008-01-01

    This literature review aimed to find answers to relevant questions regarding the clinical outcome of endodontically treated teeth restored with fiber posts. All clinical studies published since 1990 in journals indexed in MEDLINE were retrieved by searching PubMed with the query terms "fiber posts and clinical studies." The reference list of the collected articles was also screened for further relevant citations. The strength of the evidence provided by the reviewed papers was assessed according to the criteria of evidence-based dentistry. Five randomized controlled trials (RCTs) on fiber posts have been published in peer-reviewed journals. A meta-analysis is not applicable to these studies since they do not address the same specific clinical question. Retrospective and prospective trials without controls are also available. Two RCTs indicate that fiber-reinforced composite posts outperform metal posts in the restoration of endodontically treated teeth. However, this evidence cannot be considered conclusive. Longer-term RCTs would be desirable. The placement of a fiber-reinforced composite post protects against failure, especially under conditions of extensive coronal destruction. The most common type of failure with fiber-reinforced composite posts is debonding.

  5. Wavelet tree structure based speckle noise removal for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Yuan, Xin; Liu, Xuan; Liu, Yang

    2018-02-01

    We report a new speckle noise removal algorithm in optical coherence tomography (OCT). Though wavelet domain thresholding algorithms have demonstrated superior advantages in suppressing noise magnitude and preserving image sharpness in OCT, the wavelet tree structure has not been investigated in previous applications. In this work, we propose an adaptive wavelet thresholding algorithm via exploiting the tree structure in wavelet coefficients to remove the speckle noise in OCT images. The threshold for each wavelet band is adaptively selected following a special rule to retain the structure of the image across different wavelet layers. Our results demonstrate that the proposed algorithm outperforms conventional wavelet thresholding, with significant advantages in preserving image features.
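For illustration, a conventional single-level Haar soft-thresholding pass in one dimension is sketched below; the paper's actual contribution, the adaptive tree-structured threshold selection across wavelet layers, is not reproduced here:

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet decomposition, soft-threshold the
    detail coefficients, then reconstruct. len(x) must be even."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)      # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)      # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    y = np.empty_like(x)                      # inverse Haar transform
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(2)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + 0.3 * rng.normal(size=256)
denoised = haar_denoise(noisy, thresh=0.4)
```

Because speckle noise concentrates in the detail bands while smooth structure stays in the approximation band, thresholding the details reduces noise with little loss of signal.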

  6. The A2iA French handwriting recognition system at the Rimes-ICDAR2011 competition

    NASA Astrophysics Data System (ADS)

    Menasri, Farès; Louradour, Jérôme; Bianne-Bernard, Anne-Laure; Kermorvant, Christopher

    2012-01-01

    This paper describes the system for the recognition of French handwriting submitted by A2iA to the competition organized at ICDAR2011 using the Rimes database. This system is composed of several recognizers based on three different recognition technologies, combined using a novel combination method. A multi-word recognition framework based on weighted finite-state transducers is presented, using an explicit word segmentation, a combination of isolated word recognizers and a language model. The system was tested both for isolated word recognition and for multi-word line recognition and submitted to the RIMES-ICDAR2011 competition. This system outperformed all previously proposed systems on these tasks.

  7. Accurate and diverse recommendations via eliminating redundant correlations

    NASA Astrophysics Data System (ADS)

    Zhou, Tao; Su, Ri-Qi; Liu, Run-Ran; Jiang, Luo-Luo; Wang, Bing-Hong; Zhang, Yi-Cheng

    2009-12-01

    In this paper, based on a weighted projection of a bipartite user-object network, we introduce a personalized recommendation algorithm, called network-based inference (NBI), which has higher accuracy than the classical algorithm, namely collaborative filtering. In NBI, the correlation resulting from a specific attribute may be repeatedly counted in the cumulative recommendations from different objects. By considering the higher order correlations, we design an improved algorithm that can, to some extent, eliminate the redundant correlations. We test our algorithm on two benchmark data sets, MovieLens and Netflix. Compared with NBI, the algorithmic accuracy, measured by the ranking score, can be further improved by 23 per cent for MovieLens and 22 per cent for Netflix. The present algorithm can even outperform the Latent Dirichlet Allocation algorithm, which requires much longer computational time. Furthermore, most previous studies considered the algorithmic accuracy only; in this paper, we argue that the diversity and popularity, as two significant criteria of algorithmic performance, should also be taken into account. With more or less the same accuracy, an algorithm giving higher diversity and lower popularity is more favorable. Numerical results show that the present algorithm can outperform the standard one simultaneously in all five adopted metrics: lower ranking score and higher precision for accuracy, larger Hamming distance and lower intra-similarity for diversity, as well as smaller average degree for popularity.
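The baseline NBI step can be sketched on a toy user-object matrix (hypothetical data; the higher-order redundancy correction proposed in the paper is omitted):

```python
import numpy as np

# Hypothetical user-object adjacency: rows = users, cols = objects
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)

ku = A.sum(axis=1)            # user degrees
ko = A.sum(axis=0)            # object degrees

# Object-object resource transfer matrix (mass diffusion):
# W[i, j] = (1 / ko[j]) * sum_l A[l, i] * A[l, j] / ku[l]
W = (A / ku[:, None]).T @ A / ko[None, :]

# Recommendation scores for user 0: diffuse the resource of the
# objects user 0 has collected, then rank the uncollected objects.
r = A[0]
scores = W @ r
scores[r > 0] = -np.inf       # mask already-collected objects
recommended = int(np.argmax(scores))
```

Here object 2 is recommended because it shares co-collectors with both of user 0's objects; the paper's improvement then subtracts redundant contributions that arise when the same attribute is counted repeatedly.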

  8. Prediction of nocturnal hypoglycemia by an aggregation of previously known prediction approaches: proof of concept for clinical application.

    PubMed

    Tkachenko, Pavlo; Kriukova, Galyna; Aleksandrova, Marharyta; Chertov, Oleg; Renard, Eric; Pereverzyev, Sergei V

    2016-10-01

    Nocturnal hypoglycemia (NH) is common in patients with insulin-treated diabetes. Despite the risk associated with NH, there are only a few methods aiming at the prediction of such events based on intermittent blood glucose monitoring data and none has been validated for clinical use. Here we propose a method of combining several predictors into a new one that will perform at the level of the best involved one, or even outperform all individual candidates. The idea of the method is to use a recently developed strategy for aggregating ranking algorithms. The method has been calibrated and tested on data extracted from clinical trials, performed in the European FP7-funded project DIAdvisor. Then we have tested the proposed approach on other datasets to show the portability of the method. This feature of the method allows its simple implementation in the form of a diabetic smartphone app. On the considered datasets the proposed approach exhibits good performance in terms of sensitivity, specificity and predictive values. Moreover, the resulting predictor automatically performs at the level of the best involved method or even outperforms it. We propose a strategy for a combination of NH predictors that leads to a method exhibiting a reliable performance and the potential for everyday use by any patient who performs self-monitoring of blood glucose. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. Predicting and understanding law-making with word vectors and an ensemble model.

    PubMed

    Nay, John J

    2017-01-01

    Out of nearly 70,000 bills introduced in the U.S. Congress from 2001 to 2015, only 2,513 were enacted. We developed a machine learning approach to forecasting the probability that any bill will become law. Starting in 2001 with the 107th Congress, we trained models on data from previous Congresses, predicted all bills in the current Congress, and repeated until the 113th Congress served as the test. For prediction we scored each sentence of a bill with a language model that embeds legislative vocabulary into a high-dimensional, semantic-laden vector space. This language representation enables our investigation into which words increase the probability of enactment for any topic. To test the relative importance of text and context, we compared the text model to a context-only model that uses variables such as whether the bill's sponsor is in the majority party. To test the effect of changes to bills after their introduction on our ability to predict their final outcome, we compared using the bill text and meta-data available at the time of introduction with using the most recent data. At the time of introduction context-only predictions outperform text-only, and with the newest data text-only outperforms context-only. Combining text and context always performs best. We conducted a global sensitivity analysis on the combined model to determine important variables predicting enactment.

  10. Static and Dynamic Frequency Scaling on Multicore CPUs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bao, Wenlei; Hong, Changwan; Chunduri, Sudheer

    2016-12-28

    Dynamic voltage and frequency scaling (DVFS) adapts CPU power consumption by modifying a processor’s operating frequency (and the associated voltage). Typical approaches employing DVFS involve default strategies such as running at the lowest or the highest frequency, or observing the CPU’s runtime behavior and dynamically adapting the voltage/frequency configuration based on CPU usage. In this paper, we argue that many previous approaches suffer from inherent limitations, such as not accounting for the processor-specific impact of frequency changes on energy for different workload types. We first propose a lightweight runtime-based approach to automatically adapt the frequency based on the CPU workload, that is agnostic of the processor characteristics. We then show that further improvements can be achieved for affine kernels in the application, using a compile-time characterization instead of run-time monitoring to select the frequency and number of CPU cores to use. Our framework relies on a one-time energy characterization of CPU-specific DVFS profiles followed by a compile-time categorization of loop-based code segments in the application. These are combined to determine a priori the frequency and the number of cores to use to execute the application so as to optimize energy or energy-delay product, outperforming the runtime approach. Extensive evaluation on 60 benchmarks and five multi-core CPUs shows that our approach systematically outperforms the powersave Linux governor, while improving overall performance.

  11. A dual-trace model for visual sensory memory.

    PubMed

    Cappiello, Marcus; Zhang, Weiwei

    2016-11-01

    Visual sensory memory refers to a transient memory lingering briefly after the stimulus offset. Although previous literature suggests that visual sensory memory is supported by a fine-grained trace for continuous representation and a coarse-grained trace of categorical information, simultaneous separation and assessment of these traces can be difficult without a quantitative model. The present study used a continuous estimation procedure to test a novel mathematical model of the dual-trace hypothesis of visual sensory memory, according to which visual sensory memory could be modeled as a mixture of 2 von Mises (2VM) distributions differing in standard deviation. When visual sensory memory and working memory (WM) for colors were distinguished using different experimental manipulations in the first 3 experiments, the 2VM model outperformed Zhang and Luck's (2008) standard mixture model (SM), which represents a mixture of a single memory trace and random guesses, even though SM outperformed 2VM for WM. Experiment 4 generalized the advantage of 2VM over SM in fitting visual sensory memory data from color to orientation. Furthermore, a single-trace model and 4 other alternative models were ruled out, suggesting the necessity and sufficiency of dual traces for visual sensory memory. Together these results support the dual-trace model of visual sensory memory and provide a preliminary inquiry into the nature of information loss from visual sensory memory to WM. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Predicting and understanding law-making with word vectors and an ensemble model

    PubMed Central

    Nay, John J.

    2017-01-01

    Out of nearly 70,000 bills introduced in the U.S. Congress from 2001 to 2015, only 2,513 were enacted. We developed a machine learning approach to forecasting the probability that any bill will become law. Starting in 2001 with the 107th Congress, we trained models on data from previous Congresses, predicted all bills in the current Congress, and repeated until the 113th Congress served as the test. For prediction we scored each sentence of a bill with a language model that embeds legislative vocabulary into a high-dimensional, semantic-laden vector space. This language representation enables our investigation into which words increase the probability of enactment for any topic. To test the relative importance of text and context, we compared the text model to a context-only model that uses variables such as whether the bill’s sponsor is in the majority party. To test the effect of changes to bills after their introduction on our ability to predict their final outcome, we compared using the bill text and meta-data available at the time of introduction with using the most recent data. At the time of introduction context-only predictions outperform text-only, and with the newest data text-only outperforms context-only. Combining text and context always performs best. We conducted a global sensitivity analysis on the combined model to determine important variables predicting enactment. PMID:28489868

  13. FIND: difFerential chromatin INteractions Detection using a spatial Poisson process.

    PubMed

    Djekidel, Mohamed Nadhir; Chen, Yang; Zhang, Michael Q

    2018-02-12

    Polymer-based simulations and experimental studies indicate the existence of a spatial dependency between the adjacent DNA fibers involved in the formation of chromatin loops. However, the existing strategies for detecting differential chromatin interactions assume that the interacting segments are spatially independent from the other segments nearby. To resolve this issue, we developed a new computational method, FIND, which considers the local spatial dependency between interacting loci. FIND uses a spatial Poisson process to detect differential chromatin interactions that show a significant difference in their interaction frequency and the interaction frequency of their neighbors. Simulation and biological data analysis show that FIND outperforms the widely used count-based methods and has a better signal-to-noise ratio. © 2018 Djekidel et al.; Published by Cold Spring Harbor Laboratory Press.

  14. Application of the Teager-Kaiser energy operator in bearing fault diagnosis.

    PubMed

    Henríquez Rodríguez, Patricia; Alonso, Jesús B; Ferrer, Miguel A; Travieso, Carlos M

    2013-03-01

    Condition monitoring of rotating machines is important in the prevention of failures. As most machine malfunctions are related to bearing failures, several bearing diagnosis techniques have been developed. Some of them feature the bearing vibration signal with statistical measures and others extract the bearing fault characteristic frequency from the AM component of the vibration signal. In this paper, we propose to transform the vibration signal to the Teager-Kaiser domain and feature it with statistical and energy-based measures. A bearing database with normal and faulty bearings is used. The diagnosis is performed with two classifiers: a neural network classifier and a LS-SVM classifier. Experiments show that the Teager domain features outperform those based on the temporal or AM signal. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
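The discrete Teager-Kaiser energy operator itself is compact: psi[n] = x[n]^2 - x[n-1]*x[n+1]. For a pure tone A*cos(w*n) it returns the constant A^2*sin(w)^2, tracking both amplitude and frequency, which is what makes it useful for amplitude-modulated bearing vibration signals. A minimal sketch:

```python
import numpy as np

def teager_kaiser(x):
    """Discrete Teager-Kaiser energy operator:
    psi[n] = x[n]^2 - x[n-1] * x[n+1], valid for 1 <= n <= N-2."""
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

# For a pure tone A*cos(w*n), psi equals the constant A^2 * sin(w)^2.
n = np.arange(200)
A, w = 2.0, 0.3
psi = teager_kaiser(A * np.cos(w * n))
```

Statistical features (mean, variance, kurtosis, etc.) computed over psi are then used in place of features of the raw or demodulated vibration signal.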

  15. BPP: a sequence-based algorithm for branch point prediction.

    PubMed

    Zhang, Qing; Fan, Xiaodan; Wang, Yejun; Sun, Ming-An; Shao, Jianlin; Guo, Dianjing

    2017-10-15

    Although high-throughput sequencing methods have been proposed to identify splicing branch points in the human genome, these methods can only detect a small fraction of the branch points, subject to the sequencing depth, experimental cost and the expression level of the mRNA. An accurate computational model for branch point prediction is therefore an ongoing objective in human genome research. We here propose a novel branch point prediction algorithm that utilizes information on the branch point sequence and the polypyrimidine tract. Using experimentally validated data, we demonstrate that our proposed method outperforms existing methods. Availability and implementation: https://github.com/zhqingit/BPP. Contact: djguo@cuhk.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  16. Sex Discrimination and Cerebral Bias: Implications for the Reading Curriculum.

    ERIC Educational Resources Information Center

    Keenan, Donna; Smith, Michael

    1983-01-01

    Reviews research supporting the concept that girls usually outperform boys on tasks requiring verbal skills and that boys outperform girls on tasks using visual and spatial skills. Offers an explanation for this situation based on left brain/right brain research. Concludes that the curriculum in American schools is clearly left-brain biased. (FL)

  17. Using Outperformance Pay to Motivate Academics: Insiders' Accounts of Promises and Problems

    ERIC Educational Resources Information Center

    Field, Laurie

    2015-01-01

    Many researchers have investigated the appropriateness of pay for outperformance, (also called "merit-based pay" and "performance-based pay") for academics, but a review of this body of work shows that the voice of academics themselves is largely absent. This article is a contribution to addressing this gap, summarising the…

  18. Character-level neural network for biomedical named entity recognition.

    PubMed

    Gridach, Mourad

    2017-06-01

    Biomedical named entity recognition (BNER), which extracts important named entities such as genes and proteins, is a challenging task in automated systems that mine knowledge in biomedical texts. The previous state-of-the-art systems required large amounts of task-specific knowledge in the form of feature engineering, lexicons and data pre-processing to achieve high performance. In this paper, we introduce a novel neural network architecture that benefits from both word- and character-level representations automatically, by using a combination of bidirectional long short-term memory (LSTM) and conditional random field (CRF) eliminating the need for most feature engineering tasks. We evaluate our system on two datasets: JNLPBA corpus and the BioCreAtIvE II Gene Mention (GM) corpus. We obtained state-of-the-art performance by outperforming the previous systems. To the best of our knowledge, we are the first to investigate the combination of deep neural networks, CRF, word embeddings and character-level representation in recognizing biomedical named entities. Copyright © 2017 Elsevier Inc. All rights reserved.
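The decoding step of the CRF layer in such a tagger is typically a Viterbi pass over emission and transition scores. A self-contained sketch with hypothetical scores for BIO-style gene tags (not the authors' trained model):

```python
import numpy as np

def viterbi(emissions, transitions):
    """Most likely tag sequence given per-token emission scores
    (shape T x K) and tag-to-tag transition scores (shape K x K)."""
    T, K = emissions.shape
    score = emissions[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        total = score[:, None] + transitions + emissions[t][None, :]
        back[t] = np.argmax(total, axis=0)   # best previous tag per tag
        score = np.max(total, axis=0)
    path = [int(np.argmax(score))]           # backtrack from best final tag
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Hypothetical scores for tags (O=0, B-GENE=1, I-GENE=2) over 4 tokens
em = np.array([[2., 0., -1.],    # "the"
               [0., 2., 0.],     # "BRCA1"
               [0., 0., 2.],     # "gene"
               [2., 0., 0.]])    # "."
tr = np.array([[0., 0., -5.],    # O -> I-GENE is strongly discouraged
               [0., -1., 1.],
               [0., -1., 1.]])
path = viterbi(em, tr)
```

In a BiLSTM-CRF, the emission scores come from the LSTM over word- and character-level representations, and the transition matrix is learned jointly; the transition penalty is what prevents invalid sequences such as O followed directly by I-GENE.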

  19. Predicting Pharmacodynamic Drug-Drug Interactions through Signaling Propagation Interference on Protein-Protein Interaction Networks.

    PubMed

    Park, Kyunghyun; Kim, Docyong; Ha, Suhyun; Lee, Doheon

    2015-01-01

    As pharmacodynamic drug-drug interactions (PD DDIs) could lead to severe adverse effects in patients, it is important to identify potential PD DDIs in drug development. The signaling starting from drug targets is propagated through protein-protein interaction (PPI) networks. PD DDIs could occur by close interference on the same targets or within the same pathways as well as distant interference through cross-talking pathways. However, most of the previous approaches have considered only close interference by measuring distances between drug targets or comparing target neighbors. We have applied a random walk with restart algorithm to simulate signaling propagation from drug targets in order to capture the possibility of their distant interference. Cross validation with DrugBank and Kyoto Encyclopedia of Genes and Genomes DRUG shows that the proposed method outperforms the previous methods significantly. We also provide a web service with which PD DDIs for drug pairs can be analyzed at http://biosoft.kaist.ac.kr/targetrw.
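The random walk with restart used for signaling propagation can be sketched on a toy PPI adjacency matrix (hypothetical network; restart probability chosen arbitrarily):

```python
import numpy as np

def random_walk_restart(W, seeds, r=0.3, tol=1e-10):
    """Steady-state visiting probabilities of a random walk on a
    column-normalized adjacency W that restarts at the seed nodes
    (here: a drug's target proteins) with probability r at each step."""
    n = W.shape[0]
    P = W / W.sum(axis=0, keepdims=True)   # column-stochastic transition matrix
    e = np.zeros(n)
    e[seeds] = 1.0 / len(seeds)            # uniform restart over seeds
    p = e.copy()
    while True:
        p_next = (1 - r) * P @ p + r * e
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Hypothetical 5-node PPI adjacency (symmetric, unweighted)
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 1],
              [0, 1, 0, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
p = random_walk_restart(A, seeds=[0])
```

The resulting probability vector quantifies how strongly the drug's signal reaches every protein in the network; in the proposed method, overlap between two drugs' propagation profiles indicates potential distant interference.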

  20. Performance evaluation of a semi-active cladding connection for multi-hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gong, Yongqiang; Cao, Liang; Micheli, Laura; Laflamme, Simon; Quiel, Spencer; Ricles, James

    2018-03-01

    A novel semi-active damping device termed Variable Friction Cladding Connection (VFCC) has been previously proposed to leverage cladding systems for the mitigation of natural and man-made hazards. The VFCC is a semi-active friction damper that connects cladding elements to the structural system. The friction force is generated by sliding plates and varied using an actuator through a system of adjustable toggles. The dynamics of the device has been previously characterized in a laboratory environment. In this paper, the performance of the VFCC at mitigating non-simultaneous multi-hazard excitations that includes wind and seismic loads is investigated on a simulated benchmark building. Simulations consider the robustness with respect to some uncertainties, including the wear of the friction surfaces and sensor failure. The performance of the VFCC is compared against other connection strategies including traditional stiffness, passive viscous, and passive friction elements. Results show that the VFCC is robust and capable of outperforming passive systems for the mitigation of multiple hazards.

  1. Cataract surgeons outperform medical students in Eyesi virtual reality cataract surgery: evidence for construct validity.

    PubMed

    Selvander, Madeleine; Asman, Peter

    2013-08-01

    To investigate construct validity for the hydromaneuvers and phaco modules on the Eyesi surgical simulator. Seven cataract surgeons and 17 medical students performed capsulorhexis, hydromaneuvers, phaco, navigation, forceps, and cracking and chopping modules in a standardized manner. Three trials were performed on each module (two on phaco) in the above order. Performance parameters as calculated by the simulator for each trial were saved. Video recordings of the second trial of the capsulorhexis, hydromaneuvers and phaco modules were evaluated with the modified Objective Structured Assessment of Surgical Skill (OSATS) and Objective Structured Assessment of Cataract Surgical Skill (OSACSS) tools. Cataract surgeons outperformed medical students with regard to overall score on capsulorhexis (p < 0.001, p = 0.035, p = 0.010 for the three iterations, respectively), navigation (p = 0.024, p = 0.307, p = 0.007) and forceps (p = 0.017, p = 0.03, p = 0.028). Less obvious differences in overall score were found for the cracking and chopping (p = 0.266, p = 0.022, p = 0.324) and phaco (p = 0.011, p = 0.081 for the two iterations, respectively) modules. No differences in overall score were found on hydromaneuvers (p = 0.588, p = 0.503, p = 0.773), but surgeons received better scores from the evaluations of the modified OSATS (p = 0.001) and OSACSS (capsulorhexis, p = 0.003; hydromaneuvers, p = 0.017; phaco, p = 0.001). Construct validity was found on several modules not previously investigated (phaco, hydromaneuvers, cracking and chopping, navigation), and our results confirm previously demonstrated construct validity for the capsulorhexis and forceps modules. Interestingly, validation of the hydromaneuvers module required the OSACSS video evaluation tool. Further development of the scoring system in the simulator for the hydromaneuvers module would be advantageous and would make training and evaluation of progress more accessible and immediate. © 2012 The Authors. Acta Ophthalmologica © 2012 Acta Ophthalmologica Scandinavica Foundation.

  2. The cost-constrained traveling salesman problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokkappa, P.R.

    1990-10-01

    The Cost-Constrained Traveling Salesman Problem (CCTSP) is a variant of the well-known Traveling Salesman Problem (TSP). In the TSP, the goal is to find a tour of a given set of cities such that the total cost of the tour is minimized. In the CCTSP, each city is given a value, and a fixed cost-constraint is specified. The objective is to find a subtour of the cities that achieves maximum value without exceeding the cost-constraint. Thus, unlike the TSP, the CCTSP requires both selection and sequencing. As a consequence, most results for the TSP cannot be extended to the CCTSP. We show that the CCTSP is NP-hard and that no K-approximation algorithm or fully polynomial approximation scheme exists, unless P = NP. We also show that several special cases are polynomially solvable. Algorithms for the CCTSP, which outperform previous methods, are developed in three areas: upper bounding methods, exact algorithms, and heuristics. We found that a bounding strategy based on the knapsack problem performs better, both in speed and in the quality of the bounds, than methods based on the assignment problem. Likewise, we found that a branch-and-bound approach using the knapsack bound was superior to a method based on a common branch-and-bound method for the TSP. In our study of heuristic algorithms, we found that, when selecting nodes for inclusion in the subtour, it is important to consider the "neighborhood" of the nodes. A node with low value that brings the subtour near many other nodes may be more desirable than an isolated node of high value. We found two types of repetition to be desirable: repetitions based on randomization in the subtour building process, and repetitions encouraging the inclusion of different subsets of the nodes. By varying the number and type of repetitions, we can adjust the computation time required by our method to obtain algorithms that outperform previous methods.
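
    The bounding and heuristic machinery described above is not reproduced here. As a minimal illustration of the "selection plus sequencing" structure that distinguishes the CCTSP from the TSP, the following brute-force solver handles a tiny invented instance (the instance data and function name are hypothetical; real instances need the bounds and heuristics the abstract describes).

```python
from itertools import permutations

def best_subtour(values, cost, budget, depot=0):
    """Exhaustively solve a tiny Cost-Constrained TSP instance.

    Tour starts and ends at `depot`; cost[i][j] is the travel cost
    between cities i and j. Returns (best value, best route).
    """
    n = len(values)
    others = [i for i in range(n) if i != depot]
    best = (0, (depot,))
    # Try every subset of cities (selection) and every ordering (sequencing).
    for r in range(len(others) + 1):
        for perm in permutations(others, r):
            route = (depot,) + perm + (depot,)
            c = sum(cost[a][b] for a, b in zip(route, route[1:]))
            v = sum(values[i] for i in perm)
            if c <= budget and v > best[0]:
                best = (v, route)
    return best

# Four cities: values and a symmetric cost matrix, budget 7.
v, route = best_subtour([0, 5, 4, 3],
                        [[0, 2, 2, 2],
                         [2, 0, 2, 3],
                         [2, 2, 0, 2],
                         [2, 3, 2, 0]],
                        budget=7)
```

    Here visiting cities 1 and 2 (value 9, cost 6) beats any tour that also includes city 3, whose cheapest insertion would exceed the budget.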

  3. DeepLoc: prediction of protein subcellular localization using deep learning.

    PubMed

    Almagro Armenteros, José Juan; Sønderby, Casper Kaae; Sønderby, Søren Kaae; Nielsen, Henrik; Winther, Ole

    2017-11-01

    The prediction of eukaryotic protein subcellular localization is a well-studied topic in bioinformatics due to its relevance in proteomics research. Many machine learning methods have been successfully applied in this task, but in most of them, predictions rely on annotation of homologues from knowledge databases. For novel proteins where no annotated homologues exist, and for predicting the effects of sequence variants, it is desirable to have methods for predicting protein properties from sequence information only. Here, we present a prediction algorithm using deep neural networks to predict protein subcellular localization relying only on sequence information. At its core, the prediction model uses a recurrent neural network that processes the entire protein sequence and an attention mechanism identifying protein regions important for the subcellular localization. The model was trained and tested on a protein dataset extracted from one of the latest UniProt releases, in which experimentally annotated proteins follow more stringent criteria than previously. We demonstrate that our model achieves a good accuracy (78% for 10 categories; 92% for membrane-bound or soluble), outperforming current state-of-the-art algorithms, including those relying on homology information. The method is available as a web server at http://www.cbs.dtu.dk/services/DeepLoc. Example code is available at https://github.com/JJAlmagro/subcellular_localization. The dataset is available at http://www.cbs.dtu.dk/services/DeepLoc/data.php. jjalma@dtu.dk. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  4. RENT+: an improved method for inferring local genealogical trees from haplotypes with recombination.

    PubMed

    Mirzaei, Sajad; Wu, Yufeng

    2017-04-01

    Haplotypes from one or multiple related populations share a common genealogical history. If this shared genealogy can be inferred from haplotypes, it can be very useful for many population genetics problems. However, with the presence of recombination, the genealogical history of haplotypes is complex and cannot be represented by a single genealogical tree. Therefore, inference of genealogical history with recombination is much more challenging than the case of no recombination. In this paper, we present a new approach called RENT+ for the inference of local genealogical trees from haplotypes with the presence of recombination. RENT+ builds on a previous genealogy inference approach called RENT, which infers a set of related genealogical trees at different genomic positions. RENT+ represents a significant improvement over RENT in that it is more effective in extracting the information contained in the haplotype data about the underlying genealogy. The key components of RENT+ are several greatly enhanced genealogy inference rules. Through simulation, we show that RENT+ is more efficient and accurate than several existing genealogy inference methods. As an application, we apply RENT+ in the inference of population demographic history from haplotypes, which outperforms several existing methods. RENT+ is implemented in Java, and is freely available for download from https://github.com/SajadMirzaei/RentPlus. Contact: sajad@engr.uconn.edu or ywu@engr.uconn.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  5. The Bologna Annotation Resource (BAR 3.0): improving protein functional annotation.

    PubMed

    Profiti, Giuseppe; Martelli, Pier Luigi; Casadio, Rita

    2017-07-03

    BAR 3.0 updates our server BAR (Bologna Annotation Resource) for predicting protein structural and functional features from sequence. We increase data volume, query capabilities and information conveyed to the user. The core of BAR 3.0 is a graph-based clustering procedure of UniProtKB sequences, following strict pairwise similarity criteria (sequence identity ≥40% with alignment coverage ≥90%). Each cluster contains the available annotation downloaded from UniProtKB, GO, PFAM and PDB. After statistical validation, GO terms and PFAM domains are cluster-specific and annotate new sequences entering the cluster after satisfying similarity constraints. BAR 3.0 includes 28 869 663 sequences in 1 361 773 clusters, of which 22.2% (22 241 661 sequences) and 47.4% (24 555 055 sequences) have at least one validated GO term and one PFAM domain, respectively. 1.4% of the clusters (36% of all sequences) include PDB structures and the cluster is associated to a hidden Markov model that allows building template-target alignment suitable for structural modeling. Some other 3 399 026 sequences are singletons. BAR 3.0 offers an improved search interface, allowing queries by UniProtKB-accession, Fasta sequence, GO-term, PFAM-domain, organism, PDB and ligand/s. When evaluated on the CAFA2 targets, BAR 3.0 largely outperforms our previous version and scores among state-of-the-art methods. BAR 3.0 is publicly available and accessible at http://bar.biocomp.unibo.it/bar3. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
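
    BAR's clustering operates on tens of millions of UniProtKB sequences; as a toy sketch of the graph-based step it describes, the following builds connected components over sequence pairs passing the stated thresholds (identity ≥40%, coverage ≥90%) via union-find. All pair data here is invented for illustration.

```python
def cluster(n, pairs):
    """Connected components over similarity edges via union-find.

    pairs: (seq_a, seq_b, identity, coverage) tuples; an edge joins two
    sequences only when both pairwise constraints are satisfied.
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for a, b, identity, coverage in pairs:
        if identity >= 0.40 and coverage >= 0.90:
            parent[find(a)] = find(b)       # union the two components
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Five sequences; the (3, 4) pair fails the identity threshold.
groups = cluster(5, [(0, 1, 0.55, 0.95),
                     (1, 2, 0.41, 0.92),
                     (3, 4, 0.39, 0.99)])
```

    Sequences left in singleton components correspond to the singletons the abstract mentions; annotation (GO terms, PFAM domains) would then be pooled and validated per cluster.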

  6. The Neuropsychological Profile of Comorbid Post-Traumatic Stress Disorder in Adult ADHD.

    PubMed

    Antshel, Kevin M; Biederman, Joseph; Spencer, Thomas J; Faraone, Stephen V

    2016-12-01

    ADHD and post-traumatic stress disorder (PTSD) are often comorbid, yet to our knowledge no data have been published regarding the neuropsychological profile of adults with comorbid ADHD and PTSD. Likewise, previous empirical studies of the neuropsychology of PTSD did not control for ADHD status. We sought to fill this gap in the literature and to assess the extent to which neuropsychological test performance predicted psychosocial functioning and perceived quality of life. Participants were 201 adults with ADHD attending an outpatient mental health clinic between 1998 and 2003 and 123 controls without ADHD. Participants completed a large battery of self-report measures and psychological tests. Diagnoses were made using data obtained from structured psychiatric interviews (i.e., Structured Clinical Interview for DSM-IV, Schedule for Affective Disorders and Schizophrenia for School-Age Children Epidemiologic Version). Differences emerged between control participants and participants with ADHD on multiple neuropsychological tests. Across all tests, control participants outperformed participants with ADHD. Differences between the two ADHD groups emerged on seven psychological subtests, including multiple Wechsler Adult Intelligence Scale-Third Edition and Rey-Osterrieth Complex Figure Test measures. These test differences did not account for self-reported quality of life differences between groups. The comorbidity with PTSD in adults with ADHD is associated with weaker cognitive performance on several tasks that appear related to spatial/perceptual abilities and fluency. Neuropsychological test performances may share variance with the quality of life variables yet are not mediators of the quality of life ratings. © The Author(s) 2014.

  7. A novel, fast and efficient single-sensor automatic sleep-stage classification based on complementary cross-frequency coupling estimates.

    PubMed

    Dimitriadis, Stavros I; Salis, Christos; Linden, David

    2018-04-01

    Limitations of the manual scoring of polysomnograms, which include data from electroencephalogram (EEG), electro-oculogram (EOG), electrocardiogram (ECG) and electromyogram (EMG) channels have long been recognized. Manual staging is resource intensive and time consuming, and thus considerable effort must be spent to ensure inter-rater reliability. As a result, there is a great interest in techniques based on signal processing and machine learning for a completely Automatic Sleep Stage Classification (ASSC). In this paper, we present a single-EEG-sensor ASSC technique based on the dynamic reconfiguration of different aspects of cross-frequency coupling (CFC) estimated between predefined frequency pairs over 5 s epoch lengths. The proposed analytic scheme is demonstrated using the PhysioNet Sleep European Data Format (EDF) Database with repeat recordings from 20 healthy young adults. We validate our methodology in a second sleep dataset. We achieved very high classification sensitivity, specificity and accuracy of 96.2 ± 2.2%, 94.2 ± 2.3%, and 94.4 ± 2.2% across 20 folds, respectively, and also a high mean F1 score (92%, range 90-94%) when a multi-class Naive Bayes classifier was applied. High classification performance has been achieved also in the second sleep dataset. Our method outperformed the accuracy of previous studies not only on different datasets but also on the same database. Single-sensor ASSC makes the entire methodology appropriate for longitudinal monitoring using wearable EEG in real-world and laboratory-oriented environments. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.

  8. Direct PCR amplification of DNA from human bloodstains, saliva, and touch samples collected with microFLOQ® swabs.

    PubMed

    Ambers, Angie; Wiley, Rachel; Novroski, Nicole; Budowle, Bruce

    2018-01-01

    Previous studies have shown that nylon flocked swabs outperform traditional fiber swabs in DNA recovery due to their innovative design and lack of an internal absorbent core to entrap cellular material. The microFLOQ® Direct swab, a miniaturized version of the 4N6 FLOQSwab®, has a small swab head that is treated with a lysing agent, allowing direct amplification and DNA profiling from sample collection to final result in less than two hours. Additionally, the microFLOQ® system subsamples only a minute portion of a stain and preserves the vast majority of the sample for subsequent testing or re-analysis, if desired. The efficacy of direct amplification of DNA from dilute bloodstains, saliva stains, and touch samples was evaluated using microFLOQ® Direct swabs and the GlobalFiler™ Express system. Comparisons were made to traditional methods to assess the robustness of this alternate workflow. Controlled studies with 1:19 and 1:99 dilutions of bloodstains and saliva stains consistently yielded higher STR peak heights than standard methods with 1 ng input DNA from the same samples. Touch samples from common items yielded single-source and mixed profiles that were consistent with the primary users of the objects. With this novel methodology/workflow, no sample loss occurs and therefore more template DNA is available during amplification. This approach may have important implications for analysis of low-quantity and/or degraded samples that plague forensic casework. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Analysis of Point Based Image Registration Errors With Applications in Single Molecule Microscopy

    PubMed Central

    Cohen, E. A. K.; Ober, R. J.

    2014-01-01

    We present an asymptotic treatment of errors involved in point-based image registration where control point (CP) localization is subject to heteroscedastic noise; a suitable model for image registration in fluorescence microscopy. Assuming an affine transform, CPs are used to solve a multivariate regression problem. With measurement errors existing for both sets of CPs this is an errors-in-variable problem and linear least squares is inappropriate; the correct method being generalized least squares. To allow for point dependent errors the equivalence of a generalized maximum likelihood and heteroscedastic generalized least squares model is achieved allowing previously published asymptotic results to be extended to image registration. For a particularly useful model of heteroscedastic noise where covariance matrices are scalar multiples of a known matrix (including the case where covariance matrices are multiples of the identity) we provide closed form solutions to estimators and derive their distribution. We consider the target registration error (TRE) and define a new measure called the localization registration error (LRE) believed to be useful, especially in microscopy registration experiments. Assuming Gaussianity of the CP localization errors, it is shown that the asymptotic distribution for the TRE and LRE are themselves Gaussian and the parameterized distributions are derived. Results are successfully applied to registration in single molecule microscopy to derive the key dependence of the TRE and LRE variance on the number of CPs and their associated photon counts. Simulations show asymptotic results are robust for low CP numbers and non-Gaussianity. The method presented here is shown to outperform GLS on real imaging data. PMID:24634573
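
    The closed-form GLS estimators and the TRE/LRE distributions derived in the paper are not reproduced in this listing. A minimal numpy sketch of the scalar-covariance special case it highlights, an affine fit by weighted least squares with per-point inverse-variance weights, is below; the control points and variances are invented for illustration.

```python
import numpy as np

def fit_affine_wls(src, dst, var):
    """Affine fit dst ~ src under point-dependent scalar noise variances.

    Weighted least squares with weights 1/var: the special case where each
    control point's covariance is a scalar multiple of the identity.
    src, dst: (n, 2) arrays of control points. Returns a (3, 2) matrix M
    whose first two rows act on coordinates and whose last row is the shift.
    """
    n = len(src)
    X = np.hstack([src, np.ones((n, 1))])        # homogeneous coordinates
    W = np.diag(1.0 / np.asarray(var, dtype=float))
    # Weighted normal equations, solved jointly for both output coordinates.
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ dst)

def apply_affine(M, pts):
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M

# Synthetic check: noiseless points under a known affine map are recovered
# exactly, regardless of the weights.
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 3]], dtype=float)
A_true = np.array([[1.1, 0.2], [-0.1, 0.9]])
t_true = np.array([3.0, -2.0])
dst = src @ A_true + t_true
M = fit_affine_wls(src, dst, var=[1, 2, 1, 4, 1])
```

    With noisy control points, the variance of the resulting target registration error would then be studied as a function of the number of points and their photon counts, as in the paper.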

  10. Automatic lesion detection in capsule endoscopy based on color saliency: closer to an essential adjunct for reviewing software.

    PubMed

    Iakovidis, Dimitris K; Koulaouzidis, Anastasios

    2014-11-01

    The advent of wireless capsule endoscopy (WCE) has revolutionized the diagnostic approach to small-bowel disease. However, the task of reviewing WCE video sequences is laborious and time-consuming; software tools offering automated video analysis would enable a timelier and potentially a more accurate diagnosis. To assess the validity of innovative, automatic lesion-detection software in WCE. A color feature-based pattern recognition methodology was devised and applied to the aforementioned image group. This study was performed at the Royal Infirmary of Edinburgh, United Kingdom, and the Technological Educational Institute of Central Greece, Lamia, Greece. A total of 137 deidentified WCE single images, 77 showing pathology and 60 normal images. The proposed methodology, unlike state-of-the-art approaches, is capable of detecting several different types of lesions. The average performance, in terms of the area under the receiver-operating characteristic curve, reached 89.2 ± 0.9%. The best average performance was obtained for angiectasias (97.5 ± 2.4%) and nodular lymphangiectasias (96.3 ± 3.6%). Single expert for annotation of pathologies, single type of WCE model, use of single images instead of entire WCE videos. A simple, yet effective, approach allowing automatic detection of all types of abnormalities in capsule endoscopy is presented. Based on color pattern recognition, it outperforms previous state-of-the-art approaches. Moreover, it is robust in the presence of luminal contents and is capable of detecting even very small lesions. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  11. Learning and generalization from reward and punishment in opioid addiction.

    PubMed

    Myers, Catherine E; Rego, Janice; Haber, Paul; Morley, Kirsten; Beck, Kevin D; Hogarth, Lee; Moustafa, Ahmed A

    2017-01-15

    This study adapts a widely-used acquired equivalence paradigm to investigate how opioid-addicted individuals learn from positive and negative feedback, and how they generalize this learning. The opioid-addicted group consisted of 33 participants with a history of heroin dependency currently in a methadone maintenance program; the control group consisted of 32 healthy participants without a history of drug addiction. All participants performed a novel variant of the acquired equivalence task, where they learned to map some stimuli to correct outcomes in order to obtain reward, and to map other stimuli to correct outcomes in order to avoid punishment; some stimuli were implicitly "equivalent" in the sense of being paired with the same outcome. On the initial training phase, both groups performed similarly on learning to obtain reward, but as memory load grew, the control group outperformed the addicted group on learning to avoid punishment. On a subsequent testing phase, the addicted and control groups performed similarly on retention trials involving previously-trained stimulus-outcome pairs, as well as on generalization trials to assess acquired equivalence. Since prior work with acquired equivalence tasks has associated stimulus-outcome learning with the nigrostriatal dopamine system, and generalization with the hippocampal region, the current results are consistent with basal ganglia dysfunction in the opioid-addicted patients. Further, a selective deficit in learning from punishment could contribute to processes by which addicted individuals continue to pursue drug use even at the cost of negative consequences such as loss of income and the opportunity to engage in other life activities. Published by Elsevier B.V.

  12. Theory of mind may be contagious, but you don't catch it from your twin.

    PubMed

    Wright Cassidy, Kimberly; Shaw Fineberg, Deborah; Brown, Kimberly; Perkins, Alexis

    2005-01-01

    The theory-of-mind abilities of twins, children with nontwin siblings, and only children were compared to investigate further the link between number and type of siblings and theory-of-mind abilities. Three- to 5-year-old children with nontwin siblings outperformed both only children and twins with no other siblings, twins who also had other siblings outperformed twins who did not, and children with at least 1 opposite-sex sibling outperformed children with only same-sex siblings. Twins performed significantly better when asked about the false beliefs of their twins than they did when asked about the false beliefs of their friends. Results are discussed in terms of potential mechanisms that may account for the twin and sibling effects.

  13. Sexual-orientation-related differences in verbal fluency.

    PubMed

    Rahman, Qazi; Abrahams, Sharon; Wilson, Glenn D

    2003-04-01

    This study examined the performance of 60 heterosexual men, 60 gay men, 60 heterosexual women, and 60 lesbians on 3 tests of verbal fluency known to show gender differences: letter, category, and synonym fluency. Gay men and lesbians showed opposite-sex shifts in their profile of scores. For letter fluency, gay men outperformed all other groups; lesbians showed the lowest scores. For category fluency, gay men and heterosexual women jointly outperformed lesbians and heterosexual men. Finally, gay men outperformed all other groups on synonym fluency, whereas lesbians and heterosexual men performed similarly. A difference between heterosexual men and women was demonstrated on category and synonym fluency only. The findings implicate within-sex differences in the functioning of the prefrontal and temporal cortices.

  14. Reference Accuracy among Research Articles Published in "Research on Social Work Practice"

    ERIC Educational Resources Information Center

    Wilks, Scott E.; Geiger, Jennifer R.; Bates, Samantha M.; Wright, Amy L.

    2017-01-01

    Objective: The objective was to examine reference errors in research articles published in Research on Social Work Practice. High rates of reference errors in other top social work journals have been noted in previous studies. Methods: Via a sampling frame of 22,177 total references among 464 research articles published in the previous decade, a…

  15. A Bayesian estimate of the concordance correlation coefficient with skewed data.

    PubMed

    Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir

    2015-01-01

    Concordance correlation coefficient (CCC) is one of the most popular scaled indices used to evaluate agreement. Most commonly, it is used under the assumption that data is normally distributed. This assumption, however, does not apply to skewed data sets. While methods for the estimation of the CCC of skewed data sets have been introduced and studied, the Bayesian approach and its comparison with the previous methods has been lacking. In this study, we propose a Bayesian method for the estimation of the CCC of skewed data sets and compare it with the best method previously investigated. The proposed method has certain advantages. It tends to outperform the best method studied before when the variation of the data is mainly from the random subject effect instead of error. Furthermore, it allows for greater flexibility in application by enabling incorporation of missing data, confounding covariates, and replications, which was not considered previously. The superiority of this new approach is demonstrated using simulation as well as real-life biomarker data sets used in an electroencephalography clinical study. The implementation of the Bayesian method is accessible through the Comprehensive R Archive Network. Copyright © 2015 John Wiley & Sons, Ltd.
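
    The Bayesian estimator itself is not shown in this record; for reference, the sample version of the quantity being estimated, Lin's concordance correlation coefficient, takes only a few lines of numpy (function name and example data are illustrative).

```python
import numpy as np

def concordance_cc(x, y):
    """Sample concordance correlation coefficient (Lin's CCC)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    sx2, sy2 = x.var(), y.var()            # biased (1/n) variances
    sxy = ((x - mx) * (y - my)).mean()     # cross-covariance
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)
```

    Perfect agreement gives 1; a constant offset between the two raters drags the coefficient below the ordinary correlation, which is exactly the penalty that makes the CCC an agreement index rather than a correlation.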

  16. Compartmentalization of the Coso East Flank geothermal field imaged by 3-D full-tensor MT inversion

    USGS Publications Warehouse

    Lindsey, Nathaniel J.; Kaven, Joern; Davatzes, Nicholas C.; Newman, Gregory A.

    2017-01-01

    Previous magnetotelluric (MT) studies of the high-temperature Coso geothermal system in California identified a subvertical feature of low resistivity (2–5 Ohm m) and appreciable lateral extent (>1 km) in the producing zone of the East Flank field. However, these models could not reproduce gross 3-D effects in the recorded data. We perform 3-D full-tensor inversion and retrieve a resistivity model that outperforms previous 2-D and 3-D off-diagonal models in terms of its fit to the complete 3-D MT data set as well as the degree of modelling bias. Inclusion of secondary Zxx and Zyy data components leads to a robust east dip (60°) of the previously identified conductive East Flank reservoir feature, which correlates strongly with recently mapped surface faults, downhole well temperatures, 3-D seismic reflection data, and local microseismicity. We perform synthetic forward modelling to test the best-fit dip of this conductor using the response at a nearby MT station. We interpret the dipping conductor as a fractured and fluidized compartment, which is structurally controlled by an unmapped blind East Flank fault zone.

  17. Compartmentalization of the Coso East Flank geothermal field imaged by 3-D full-tensor MT inversion

    NASA Astrophysics Data System (ADS)

    Lindsey, Nathaniel J.; Kaven, Joern Ole; Davatzes, Nicholas; Newman, Gregory A.

    2017-02-01

    Previous magnetotelluric (MT) studies of the high-temperature Coso geothermal system in California identified a subvertical feature of low resistivity (2-5 Ohm m) and appreciable lateral extent (>1 km) in the producing zone of the East Flank field. However, these models could not reproduce gross 3-D effects in the recorded data. We perform 3-D full-tensor inversion and retrieve a resistivity model that outperforms previous 2-D and 3-D off-diagonal models in terms of its fit to the complete 3-D MT data set as well as the degree of modelling bias. Inclusion of secondary Zxx and Zyy data components leads to a robust east dip (60°) of the previously identified conductive East Flank reservoir feature, which correlates strongly with recently mapped surface faults, downhole well temperatures, 3-D seismic reflection data, and local microseismicity. We perform synthetic forward modelling to test the best-fit dip of this conductor using the response at a nearby MT station. We interpret the dipping conductor as a fractured and fluidized compartment, which is structurally controlled by an unmapped blind East Flank fault zone.

  18. Enhancing Performance and Bit Rates in a Brain-Computer Interface System With Phase-to-Amplitude Cross-Frequency Coupling: Evidences From Traditional c-VEP, Fast c-VEP, and SSVEP Designs.

    PubMed

    Dimitriadis, Stavros I; Marimpis, Avraam D

    2018-01-01

    A brain-computer interface (BCI) is a channel of communication that transforms brain activity into specific commands for manipulating a personal computer or other home or electrical devices. In other words, a BCI is an alternative way of interacting with the environment by using brain activity instead of muscles and nerves. For that reason, BCI systems are of high clinical value for targeted populations suffering from neurological disorders. In this paper, we present a new processing approach in three publicly available BCI data sets: (a) a well-known multi-class (N = 6) coded-modulated visual evoked potential (c-VEP)-based BCI system for able-bodied and disabled subjects; (b) a multi-class (N = 32) c-VEP with slow and fast stimulus representation; and (c) a steady-state visual evoked potential (SSVEP) multi-class (N = 5) flickering BCI system. Estimating cross-frequency coupling (CFC), namely δ-θ [δ: (0.5-4 Hz), θ: (4-8 Hz)] phase-to-amplitude coupling (PAC), within sensors and across experimental time, we succeeded in achieving high classification accuracy and information transfer rates (ITR) in the three data sets. Our approach outperformed the originally presented ITR on the three data sets. The bit rates obtained for both the disabled and able-bodied subjects reached the fastest reported level of 324 bits/min with the PAC estimator. Additionally, our approach outperformed alternative signal features such as the relative power (29.73 bits/min) and raw time series analysis (24.93 bits/min), and also the originally reported bit rates of 10-25 bits/min. In the second data set, we achieved an average ITR of 124.40 ± 11.68 for the slow 60 Hz condition and an average ITR of 233.99 ± 15.75 for the fast 120 Hz condition. In the third data set, we achieved an average ITR of 106.44 ± 8.94. The current methodology outperforms previous methodologies applied to each of these three freely available BCI datasets.
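
    The abstract names δ-θ phase-to-amplitude coupling but not the estimator's internals. One common choice, a Tort-style modulation index, can be sketched with numpy alone; the FFT-mask filtering, sampling setup, and synthetic signals below are simplifying assumptions rather than the paper's pipeline.

```python
import numpy as np

def analytic(sig):
    """Analytic signal via an FFT-based Hilbert transform (numpy only)."""
    n = len(sig)
    spec = np.fft.fft(sig)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(spec * h)

def bandpass(sig, fs, lo, hi):
    """Crude FFT-mask band-pass filter; adequate for a demonstration."""
    freqs = np.fft.fftfreq(len(sig), 1.0 / fs)
    spec = np.fft.fft(sig)
    spec[(np.abs(freqs) < lo) | (np.abs(freqs) > hi)] = 0.0
    return np.fft.ifft(spec).real

def pac_mi(sig, fs, phase_band=(0.5, 4), amp_band=(4, 8), nbins=18):
    """Modulation index: how strongly delta phase shapes theta amplitude."""
    phase = np.angle(analytic(bandpass(sig, fs, *phase_band)))
    amp = np.abs(analytic(bandpass(sig, fs, *amp_band)))
    edges = np.linspace(-np.pi, np.pi, nbins + 1)
    mean_amp = np.array([amp[(phase >= a) & (phase < b)].mean()
                         for a, b in zip(edges[:-1], edges[1:])])
    p = mean_amp / mean_amp.sum()   # amplitude distribution over phase bins
    # KL divergence from the uniform distribution, normalized into [0, 1].
    return np.sum(p * np.log(p * nbins)) / np.log(nbins)

# Synthetic check: a 2 Hz rhythm whose phase modulates a 6 Hz amplitude
# should score higher than the same two rhythms without coupling.
fs = 128
t = np.arange(0, 8, 1.0 / fs)
slow = np.sin(2 * np.pi * 2 * t)
coupled = slow + (1 + slow) * np.sin(2 * np.pi * 6 * t)
uncoupled = slow + np.sin(2 * np.pi * 6 * t)
mi_coupled, mi_uncoupled = pac_mi(coupled, fs), pac_mi(uncoupled, fs)
```

    Features like these, computed per sensor and per time window, are what a downstream classifier would consume to decode the c-VEP/SSVEP target.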

  19. Assessment of physiological noise modelling methods for functional imaging of the spinal cord.

    PubMed

    Kong, Yazhuo; Jenkinson, Mark; Andersson, Jesper; Tracey, Irene; Brooks, Jonathan C W

    2012-04-02

    The spinal cord is the main pathway for information between the central and the peripheral nervous systems. Non-invasive functional MRI offers the possibility of studying spinal cord function and central sensitisation processes. However, imaging neural activity in the spinal cord is more difficult than in the brain. A significant challenge when dealing with such data is the influence of physiological noise (primarily cardiac and respiratory), and currently there is no standard approach to account for these effects. We have previously studied the various sources of physiological noise for spinal cord fMRI at 1.5T and proposed a physiological noise model (PNM) (Brooks et al., 2008). An alternative de-noising strategy, selective averaging filter (SAF), was proposed by Deckers et al. (2006). In this study we reviewed and implemented published physiological noise correction methods at higher field (3T) and aimed to find the optimal models for gradient-echo-based BOLD acquisitions. Two general techniques were compared: physiological noise model (PNM) and selective averaging filter (SAF), along with regressors designed to account for specific signal compartments and physiological processes: cerebrospinal fluid (CSF), motion correction (MC) parameters, heart rate (HR), respiration volume per time (RVT), and the associated cardiac and respiratory response functions. Functional responses were recorded from the cervical spinal cord of 18 healthy subjects in response to noxious thermal and non-noxious punctate stimulation. The various combinations of models and regressors were compared in three ways: the model fit residuals, regression model F-tests and the number of activated voxels. The PNM was found to outperform SAF in all three tests. Furthermore, inclusion of the CSF regressor was crucial as it explained a significant amount of signal variance in the cord and increased the number of active cord voxels. 
Whilst HR, RVT and MC explained additional signal (noise) variance, they were also found (HR and RVT in particular) to have a negative impact on the parameter estimates of interest, as they may be correlated with task conditions, e.g. noxious thermal stimuli. Convolution with previously published cardiac and respiratory impulse response functions was not found to be beneficial. Another novel aspect of the current study is the investigation of the influence of pre-whitening together with PNM regressors on spinal fMRI data. Pre-whitening was found to reduce non-white noise that was not accounted for by physiological noise correction, and to decrease false positive detection rates. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. High-performance reconfigurable coincidence counting unit based on a field programmable gate array.

    PubMed

    Park, Byung Kwon; Kim, Yong-Su; Kwon, Osung; Han, Sang-Wook; Moon, Sung

    2015-05-20

    We present a high-performance reconfigurable coincidence counting unit (CCU) using a low-end field programmable gate array (FPGA) and peripheral circuits. Because of the flexibility guaranteed by the FPGA program, we can easily change system parameters, such as internal input delays, coincidence configurations, and the coincidence time window. In spite of a low-cost implementation, the proposed CCU architecture outperforms previous ones in many aspects: it has 8 logic inputs and 4 coincidence outputs that can measure up to eight-fold coincidences. The minimum coincidence time window and the maximum input frequency are 0.47 ns and 163 MHz, respectively. The CCU will be useful in various experimental research areas, including the field of quantum optics and quantum information.
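
    The core logic such a unit implements can be illustrated in software (a hypothetical sketch, not the paper's FPGA design): given sorted photon-timestamp streams from two channels, a two-pointer sweep counts event pairs falling within the coincidence window.

```python
# Illustrative two-fold coincidence counting between two sorted
# timestamp streams (values and window in nanoseconds).

def count_coincidences(ch_a, ch_b, window_ns):
    """Count pairs (a, b) with |a - b| <= window_ns.

    Each timestamp is matched at most once, mimicking a coincidence
    gate that fires once per overlapping event pair.
    """
    i = j = hits = 0
    while i < len(ch_a) and j < len(ch_b):
        delta = ch_a[i] - ch_b[j]
        if abs(delta) <= window_ns:
            hits += 1
            i += 1
            j += 1          # consume both events
        elif delta < 0:
            i += 1          # channel A lags; advance it
        else:
            j += 1          # channel B lags; advance it
    return hits

a = [10, 55, 120, 300]
b = [11, 70, 121, 290, 500]
print(count_coincidences(a, b, 2))   # pairs (10, 11) and (120, 121)
```

    The FPGA performs the equivalent comparison in parallel gate logic, which is how it reaches a sub-nanosecond window and 163 MHz input rates.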

  1. Sentence-Level Attachment Prediction

    NASA Astrophysics Data System (ADS)

    Albakour, M.-Dyaa; Kruschwitz, Udo; Lucas, Simon

    Attachment prediction is the task of automatically identifying email messages that should contain an attachment. This can be useful to tackle the problem of sending out emails but forgetting to include the relevant attachment (something that happens all too often). A common Information Retrieval (IR) approach in analyzing documents such as emails is to treat the entire document as a bag of words. Here we propose a finer-grained analysis to address the problem. We aim at identifying individual sentences within an email that refer to an attachment. If we detect any such sentence, we predict that the email should have an attachment. Using part of the Enron corpus for evaluation we find that our finer-grained approach outperforms previously reported document-level attachment prediction in similar evaluation settings.
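
    The sentence-level idea can be sketched as follows (the cue phrases and the naive sentence splitter are hypothetical illustrations, not the paper's trained classifier): flag an email as needing an attachment if any single sentence mentions one.

```python
# Toy sentence-level attachment predictor: split the body into
# sentences, then test each sentence against attachment cue phrases.
import re

CUES = re.compile(r"\b(attach(ed|ment|ing)?|enclosed|see the file)\b", re.I)

def predict_attachment(email_body):
    sentences = re.split(r"(?<=[.!?])\s+", email_body)
    return any(CUES.search(s) for s in sentences)

print(predict_attachment("Hi Bob. I attached the Q3 report. Thanks."))  # True
print(predict_attachment("Lunch at noon? See you there."))              # False
```

    A learned sentence classifier replaces the hand-written cue list, but the document-level decision is still the disjunction over sentence-level decisions.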

  2. Distributed Efficient Similarity Search Mechanism in Wireless Sensor Networks

    PubMed Central

    Ahmed, Khandakar; Gregory, Mark A.

    2015-01-01

    The Wireless Sensor Network similarity search problem has received considerable research attention due to sensor hardware imprecision and environmental parameter variations. Most state-of-the-art distributed data centric storage (DCS) schemes lack optimization for similarity queries of events. In this paper, a DCS scheme with metric based similarity searching (DCSMSS) is proposed. DCSMSS takes its motivation from a vector distance index, iDistance, to transform the problem of similarity searching into a one-dimensional interval search. In addition, a sector based distance routing algorithm is used to efficiently route messages. Extensive simulation results reveal that DCSMSS is highly efficient and significantly outperforms previous approaches in processing similarity search queries. PMID:25751081
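
    The iDistance idea the scheme builds on can be sketched briefly (the reference points and data below are made up for illustration): each point is keyed by i * C + dist(p, ref_i) for its nearest reference point ref_i, so a similarity query becomes a handful of one-dimensional interval lookups.

```python
# Minimal iDistance-style sketch: map 2-D points to 1-D keys, then
# answer a range query with interval searches on the sorted keys.
import bisect, math

REFS = [(0.0, 0.0), (10.0, 10.0)]   # reference points, one per partition
C = 1000.0                          # stretch constant separating partitions

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def key(p):
    i, r = min(enumerate(REFS), key=lambda ir: dist(p, ir[1]))
    return i * C + dist(p, r)

points = [(1, 1), (0, 2), (9, 9), (10, 12)]
index = sorted((key(p), p) for p in points)
keys = [k for k, _ in index]

# Range query around q with radius eps: in each partition, candidates
# lie in the 1-D interval [i*C + d(q,ref_i) - eps, i*C + d(q,ref_i) + eps].
q, eps = (1.0, 1.5), 1.0
cands = []
for i, ref in enumerate(REFS):
    centre = i * C + dist(q, ref)
    lo = bisect.bisect_left(keys, centre - eps)
    hi = bisect.bisect_right(keys, centre + eps)
    cands.extend(p for _, p in index[lo:hi])
print(cands)
```

    The 1-D interval only yields candidates; an exact distance check would normally follow. In DCSMSS the sorted key range maps onto sensor nodes, so the interval search becomes a routing problem.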

  3. Hybrid glowworm swarm optimization for task scheduling in the cloud environment

    NASA Astrophysics Data System (ADS)

    Zhou, Jing; Dong, Shoubin

    2018-06-01

    In recent years many heuristic algorithms have been proposed to solve task scheduling problems in the cloud environment owing to their optimization capability. This article proposes a hybrid glowworm swarm optimization (HGSO) based on glowworm swarm optimization (GSO), which uses a technique of evolutionary computation, a strategy of quantum behaviour based on the principle of neighbourhood, offspring production and random walk, to achieve more efficient scheduling with reasonable scheduling costs. The proposed HGSO reduces the redundant computation and the dependence on the initialization of GSO, accelerates the convergence and more easily escapes from local optima. The conducted experiments and statistical analysis showed that in most cases the proposed HGSO algorithm outperformed previous heuristic algorithms to deal with independent tasks.

  4. Noise adaptive wavelet thresholding for speckle noise removal in optical coherence tomography.

    PubMed

    Zaki, Farzana; Wang, Yahui; Su, Hao; Yuan, Xin; Liu, Xuan

    2017-05-01

    Optical coherence tomography (OCT) is based on coherence detection of interferometric signals and hence inevitably suffers from speckle noise. To remove speckle noise in OCT images, wavelet domain thresholding has demonstrated significant advantages in suppressing noise magnitude while preserving image sharpness. However, speckle noise in OCT images has different characteristics in different spatial scales, which has not been considered in previous applications of wavelet domain thresholding. In this study, we demonstrate a noise adaptive wavelet thresholding (NAWT) algorithm that exploits the difference of noise characteristics in different wavelet sub-bands. The algorithm is simple, fast, effective and is closely related to the physical origin of speckle noise in OCT image. Our results demonstrate that NAWT outperforms conventional wavelet thresholding.
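
    The principle behind sub-band-adaptive thresholding can be shown with a 1-D Haar toy example (this is an illustrative sketch, not the paper's OCT pipeline): each decomposition level gets its own threshold because noise energy differs across scales.

```python
# 1-D Haar decomposition with per-sub-band soft thresholding.
def haar_forward(x):
    levels = []
    while len(x) > 1:
        a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
        d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
        levels.append(d)            # levels[0] is the finest sub-band
        x = a
    return x, levels                # coarse approximation + detail sub-bands

def haar_inverse(a, levels):
    for d in reversed(levels):
        x = []
        for ai, di in zip(a, d):
            x += [ai + di, ai - di]
        a = x
    return a

def soft(v, t):
    return 0.0 if abs(v) <= t else (v - t if v > 0 else v + t)

def denoise(x, thresholds):         # one threshold per detail sub-band
    a, levels = haar_forward(x)
    levels = [[soft(v, t) for v in d] for d, t in zip(levels, thresholds)]
    return haar_inverse(a, levels)

noisy = [4.0, 4.2, 3.9, 4.1, 8.0, 8.1, 7.9, 8.2]
# The finest sub-band gets the largest threshold in this toy setting.
print(denoise(noisy, thresholds=[0.2, 0.1, 0.0]))
```

    In NAWT the per-sub-band thresholds are derived from the measured noise statistics of each scale rather than fixed by hand.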

  5. Body-Earth Mover's Distance: A Matching-Based Approach for Sleep Posture Recognition.

    PubMed

    Xu, Xiaowei; Lin, Feng; Wang, Aosen; Hu, Yu; Huang, Ming-Chun; Xu, Wenyao

    2016-10-01

    Sleep posture is a key component in sleep quality assessment and pressure ulcer prevention. Body pressure analysis is currently a popular method for sleep posture recognition. In this paper, a matching-based approach, Body-Earth Mover's Distance (BEMD), for sleep posture recognition is proposed. BEMD treats pressure images as weighted 2D shapes, and combines EMD and Euclidean distance for similarity measure. Compared with existing work, sleep posture recognition is achieved with posture similarity rather than multiple features for specific postures. A pilot study is performed with 14 persons for six different postures. The experimental results show that the proposed BEMD can achieve 91.21% accuracy, which outperforms the previous method with an improvement of 8.01%.
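
    The Earth Mover's Distance ingredient is easiest to see in one dimension (a hedged illustration; the paper works on 2-D pressure images with a combined EMD/Euclidean measure): for two normalized 1-D histograms, EMD equals the summed absolute difference of their cumulative distributions.

```python
# 1-D EMD between two normalized histograms via CDF differences.
def emd_1d(p, q):
    assert len(p) == len(q)
    total, cum = 0.0, 0.0
    for pi, qi in zip(p, q):
        cum += pi - qi          # running CDF difference
        total += abs(cum)       # cost of mass still "in transit"
    return total

a = [0.5, 0.5, 0.0, 0.0]
b = [0.0, 0.0, 0.5, 0.5]        # same mass shifted two bins right
c = [0.0, 0.5, 0.5, 0.0]        # same mass shifted one bin right
print(emd_1d(a, b), emd_1d(a, c))   # moving mass farther costs more
```

    Unlike a bin-wise Euclidean distance, EMD rewards near-misses: a posture image shifted slightly on the mattress stays close to its template.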

  6. Seminar 14 - Desiccant Enhanced Air Conditioning: Desiccant Enhanced Evaporative Air Conditioning (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozubal, E.

    2013-02-01

    This presentation explains how a liquid desiccant system coupled with an indirect evaporative cooler can efficiently produce cool, dry air, and how a liquid desiccant membrane air conditioner can efficiently provide cooling and dehumidification without the carryover problems of previous generations of liquid desiccant systems. It provides an overview of a liquid desiccant DX air conditioner that can efficiently provide cooling and dehumidification under high latent loads without the need for reheat, explains how liquid desiccant cooling and dehumidification systems can outperform vapor compression based air conditioning systems in hot and humid climates, explains how these systems work, and describes a refrigerant-free liquid desiccant based cooling system.

  7. A blur-invariant local feature for motion blurred image matching

    NASA Astrophysics Data System (ADS)

    Tong, Qiang; Aoki, Terumasa

    2017-07-01

    Image matching between a blurred image (caused by camera motion, defocus, etc.) and a non-blurred image is a critical task for many image/video applications. However, most existing local feature schemes fail at this task. This paper presents a blur-invariant descriptor and a novel local feature scheme comprising the descriptor and an interest point detector based on moment symmetry, the authors' previous work. The descriptor is based on a new concept, the center peak moment-like element (CPME), which is robust to blur and boundary effects. By constructing CPMEs, the descriptor is also distinctive and thus suitable for image matching. Experimental results show our scheme outperforms state-of-the-art methods for blurred image matching.

  8. Exploring multiple feature combination strategies with a recurrent neural network architecture for off-line handwriting recognition

    NASA Astrophysics Data System (ADS)

    Mioulet, L.; Bideault, G.; Chatelain, C.; Paquet, T.; Brunessaux, S.

    2015-01-01

    The BLSTM-CTC is a novel recurrent neural network architecture that has outperformed previous state-of-the-art algorithms in tasks such as speech recognition and handwriting recognition. It has the ability to process long-term dependencies in temporal signals in order to label unsegmented data. This paper describes different ways of combining features using a BLSTM-CTC architecture. We explore not only low-level combination (feature space combination) but also mid-level combination (internal system representation combination) and high-level combination (decoding combination). The results are compared on the RIMES word database. Our results show that the low-level combination works best, thanks to the powerful data modeling of the LSTM neurons.

  9. A set-covering formulation for a drayage problem with single and double container loads

    NASA Astrophysics Data System (ADS)

    Ghezelsoflu, A.; Di Francesco, M.; Frangioni, A.; Zuddas, P.

    2018-01-01

    This paper addresses a drayage problem, which is motivated by the case study of a real carrier. Its trucks carry one or two containers from a port to importers and from exporters to the port. Since up to four customers can be served in each route, we propose a set-covering formulation for this problem where all possible routes are enumerated. This model can be efficiently solved to optimality by a commercial solver, significantly outperforming a previously proposed node-arc formulation. Moreover, the model can be effectively used to evaluate a new distribution policy, which results in an enlarged set of feasible routes and can increase savings w.r.t. the policy currently employed by the carrier.

  10. Network immunization under limited budget using graph spectra

    NASA Astrophysics Data System (ADS)

    Zahedi, R.; Khansari, M.

    2016-03-01

    In this paper, we propose a new algorithm that minimizes the worst expected growth of an epidemic by reducing the size of the largest connected component (LCC) of the underlying contact network. The proposed algorithm is applicable to any level of available resources and, despite the greedy approaches of most immunization strategies, selects nodes simultaneously. In each iteration, the proposed method partitions the LCC into two groups. These are the best candidates for communities in that component, and the available resources are sufficient to separate them. Using Laplacian spectral partitioning, the proposed method performs community detection inference with a time complexity that rivals that of the best previous methods. Experiments show that our method outperforms targeted immunization approaches in both real and synthetic networks.
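
    Laplacian spectral partitioning, the community detection step the method relies on, can be sketched on a toy graph (pure Python for illustration; real contact networks would use a sparse eigensolver): the sign pattern of the Fiedler vector splits the largest component into its two best community candidates.

```python
# Fiedler-vector bipartition of a bridge graph: two triangles
# {0,1,2} and {3,4,5} joined by the edge (2, 3).
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
n = 6
L = [[0.0] * n for _ in range(n)]
for u, v in edges:                      # graph Laplacian L = D - A
    L[u][u] += 1; L[v][v] += 1
    L[u][v] -= 1; L[v][u] -= 1

# Power iteration on M = c*I - L, projected off the all-ones vector,
# converges to the eigenvector of L with the smallest nonzero
# eigenvalue (the Fiedler vector).
c = 2 * max(L[i][i] for i in range(n))
v = [1.0, -0.5, 0.3, 0.9, -1.2, 0.1]    # arbitrary start vector
for _ in range(500):
    w = [c * v[i] - sum(L[i][j] * v[j] for j in range(n)) for i in range(n)]
    mean = sum(w) / n
    w = [x - mean for x in w]           # stay orthogonal to (1, ..., 1)
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

part = [i for i in range(n) if v[i] >= 0]
print(sorted(part))                     # one of the two triangles
```

    Cutting the nodes that sit on the boundary between the two sign groups is then what shrinks the LCC with the fewest immunized nodes.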

  11. What are the most important variables for Poaceae airborne pollen forecasting?

    PubMed

    Navares, Ricardo; Aznarte, José Luis

    2017-02-01

    In this paper, the problem of predicting future concentrations of airborne pollen is solved through a computational intelligence data-driven approach. The proposed method is able to identify the most important variables among those considered by other authors (mainly recent pollen concentrations and weather parameters), without any prior assumptions about the phenological relevance of the variables. Furthermore, an inferential procedure based on non-parametric hypothesis testing is presented to provide statistical evidence of the results, which are coherent to the literature and outperform previous proposals in terms of accuracy. The study is built upon Poaceae airborne pollen concentrations recorded in seven different locations across the Spanish province of Madrid. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Complex refractive index measurements for BaF2 and CaF2 via single-angle infrared reflectance spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly-Gorham, Molly Rose K.; DeVetter, Brent M.; Brauer, Carolyn S.

    We have re-investigated the optical constants n and k for the homologous series of inorganic salts barium fluoride (BaF2) and calcium fluoride (CaF2) using a single-angle near-normal incidence reflectance device in combination with a calibrated Fourier transform infrared (FTIR) spectrometer. Our results are in good qualitative agreement with most previous works. However, certain features of the previously published data near the reststrahlen band exhibit distinct differences in spectral characteristics. Notably, our measurements of BaF2 do not include a spectral feature in the ~250 cm-1 reststrahlen band that was previously published. Additionally, CaF2 exhibits a distinct wavelength shift relative to the model derived from previously published data. We confirmed our results against recently published works that use significantly more modern instrumentation and data reduction techniques.

  13. Importance of a species' socioecology: Wolves outperform dogs in a conspecific cooperation task.

    PubMed

    Marshall-Pescini, Sarah; Schwarz, Jonas F L; Kostelnik, Inga; Virányi, Zsófia; Range, Friederike

    2017-10-31

    A number of domestication hypotheses suggest that dogs have acquired a more tolerant temperament than wolves, promoting cooperative interactions with humans and conspecifics. This selection process has been proposed to resemble the one responsible for our own greater cooperative inclinations in comparison with our closest living relatives. However, the socioecology of wolves and dogs, with the former relying more heavily on cooperative activities, predicts that at least with conspecifics, wolves should cooperate better than dogs. Here we tested similarly raised wolves and dogs in a cooperative string-pulling task with conspecifics and found that wolves outperformed dogs, despite comparable levels of interest in the task. Whereas wolves coordinated their actions so as to simultaneously pull the rope ends, leading to success, dogs pulled the ropes in alternate moments, thereby never succeeding. Indeed in dog dyads it was also less likely that both members simultaneously engaged in other manipulative behaviors on the apparatus. Different conflict-management strategies are likely responsible for these results, with dogs' avoidance of potential competition over the apparatus constraining their capacity to coordinate actions. Wolves, in contrast, did not hesitate to manipulate the ropes simultaneously, and once cooperation was initiated, rapidly learned to coordinate in more complex conditions as well. Social dynamics (rank and affiliation) played a key role in success rates. Results call those domestication hypotheses that suggest dogs evolved greater cooperative inclinations into question, and rather support the idea that dogs' and wolves' different social ecologies played a role in affecting their capacity for conspecific cooperation and communication. Published under the PNAS license.

  14. Combining aneuploidy and dysplasia for colitis' cancer risk assessment outperforms current surveillance efficiency: a meta-analysis.

    PubMed

    Meyer, Rüdiger; Freitag-Wolf, Sandra; Blindow, Silke; Büning, Jürgen; Habermann, Jens K

    2017-02-01

    Cancer risk assessment for ulcerative colitis patients by evaluating histological changes through colonoscopy surveillance is still challenging. Thus, additional parameters of high prognostic impact for the development of colitis-associated carcinoma are necessary. This meta-analysis was conducted to clarify the value of aneuploidy as a predictor of individual cancer risk compared with current surveillance parameters. A systematic web-based search identified studies published in English that addressed the relevance of the ploidy status for individual cancer risk during surveillance in comparison to neoplastic mucosal changes. The resulting data were included in a meta-analysis, and odds ratios (OR) were calculated for aneuploidy, dysplasia, and aneuploidy plus dysplasia. Twelve studies addressing the relevance of aneuploidy compared to dysplasia were comprehensively evaluated and further used for meta-analysis. The meta-analysis revealed that aneuploidy (OR 5.31 [95 % CI 2.03, 13.93]) is an equally effective parameter for cancer risk assessment in ulcerative colitis patients as dysplasia (OR 4.93 [1.61, 15.11]). Strikingly, the combined assessment of dysplasia and aneuploidy is superior to applying each parameter alone (OR 8.99 [3.08, 26.26]). This meta-analysis reveals that aneuploidy is an equally effective parameter for individual cancer risk assessment in ulcerative colitis as the detection of dysplasia. More importantly, the combined assessment of dysplasia and aneuploidy outperforms the use of each parameter alone. We suggest image cytometry for ploidy assessment become an additional feature of consensus criteria to individually assess cancer risk in UC.

  15. BESST--efficient scaffolding of large fragmented assemblies.

    PubMed

    Sahlin, Kristoffer; Vezzi, Francesco; Nystedt, Björn; Lundeberg, Joakim; Arvestad, Lars

    2014-08-15

    The use of short reads from High Throughput Sequencing (HTS) techniques is now commonplace in de novo assembly. Yet, obtaining contiguous assemblies from short reads is challenging, thus making scaffolding an important step in the assembly pipeline. Different algorithms have been proposed, but many of them use the number of read pairs supporting a linking of two contigs as an indicator of reliability. This reasoning is intuitive, but fails to account for variation in link count due to contig features. We have also noted that published scaffolders are only evaluated on small datasets using output from only one assembler. Two issues arise from this. Firstly, some of the available tools are not well suited for complex genomes. Secondly, these evaluations provide little support for inferring a software's general performance. We propose a new algorithm, implemented in a tool called BESST, which can scaffold genomes of all sizes and complexities and was used to scaffold the genome of P. abies (20 Gbp). We performed a comprehensive comparison of BESST against the most popular stand-alone scaffolders on a large variety of datasets. Our results confirm that some of the popular scaffolders are not practical to run on complex datasets. Furthermore, no single stand-alone scaffolder outperforms the others on all datasets. However, BESST compares favorably with the other tested scaffolders on the GAGE datasets and, moreover, outperforms the other methods when the library insert size distribution is wide. We conclude from our results that information sources other than the quantity of links, as is commonly used, can provide useful information about genome structure when scaffolding.

  16. Quantifying Uncertainties in N2O Emission Due to N Fertilizer Application in Cultivated Areas

    PubMed Central

    Philibert, Aurore; Loyce, Chantal; Makowski, David

    2012-01-01

    Nitrous oxide (N2O) is a greenhouse gas with a global warming potential approximately 298 times greater than that of CO2. In 2006, the Intergovernmental Panel on Climate Change (IPCC) estimated N2O emission due to synthetic and organic nitrogen (N) fertilization at 1% of applied N. We investigated the uncertainty of this estimated value by fitting 13 different models to a published dataset including 985 N2O measurements. These models were characterized by (i) the presence or absence of the explanatory variable "applied N", (ii) the function relating N2O emission to applied N (exponential or linear), (iii) fixed or random background N2O emission (i.e. emission in the absence of N application) and (iv) a fixed or random applied-N effect. We calculated ranges of uncertainty on N2O emissions from a subset of these models, and compared them with the uncertainty ranges currently used in the IPCC-Tier 1 method. The exponential models outperformed the linear models, and models including one or two random effects outperformed those including fixed effects only. The use of an exponential function rather than a linear function has an important practical consequence: the emission factor is not constant and increases as a function of applied N. Emission factors estimated using the exponential function were lower than 1% when the amount of N applied was below 160 kg N ha−1. Our uncertainty analysis shows that the uncertainty range currently used by the IPCC-Tier 1 method could be reduced. PMID:23226430
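
    The practical consequence of the exponential fit can be made concrete with illustrative numbers (the coefficients below are made up, not the paper's estimates): under an exponential response E(N) = exp(a + b*N), the emission factor EF(N) = (E(N) - E(0)) / N grows with the applied dose, unlike the constant factor implied by a linear model.

```python
# Dose-dependent emission factor under a hypothetical exponential model.
import math

a, b = -0.5, 0.008          # hypothetical fitted parameters

def emission(n_applied):
    return math.exp(a + b * n_applied)

def emission_factor(n_applied):
    return (emission(n_applied) - emission(0)) / n_applied

for n in (80, 160, 240):    # kg N per hectare
    print(n, round(emission_factor(n), 4))
```

    This is why a single IPCC-style 1% factor can simultaneously overestimate emissions at low N rates and underestimate them at high rates.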

  17. Delimiting Species-Poor Data Sets using Single Molecular Markers: A Study of Barcode Gaps, Haplowebs and GMYC.

    PubMed

    Dellicour, Simon; Flot, Jean-François

    2015-11-01

    Most single-locus molecular approaches to species delimitation available to date have been designed and tested on data sets comprising at least tens of species, whereas the opposite case (species-poor data sets for which the hypothesis that all individuals are conspecific cannot be rejected beforehand) has rarely been the focus of such attempts. Here we compare the performance of barcode gap detection, haplowebs and generalized mixed Yule-coalescent (GMYC) models to delineate chimpanzees and bonobos using nuclear sequence markers, then apply these single-locus species delimitation methods to data sets of one, three, or six species simulated under a wide range of population sizes, speciation rates, mutation rates and sampling efforts. Our results show that barcode gap detection and GMYC models are unable to delineate species properly in data sets composed of one or two species, two situations in which haplowebs outperform them. For data sets composed of three or six species, bGMYC and haplowebs outperform the single-threshold and multiple-threshold versions of GMYC, whereas a clear barcode gap is only observed when population sizes and speciation rates are both small. The latter conditions represent a "sweet spot" for molecular taxonomy where all the single-locus approaches tested work well; however, the performance of these methods decreases strongly when population sizes and speciation rates are high, suggesting that multilocus approaches may be necessary to tackle such cases. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. FastGCN: A GPU Accelerated Tool for Fast Gene Co-Expression Networks

    PubMed Central

    Liang, Meimei; Zhang, Futao; Jin, Gulei; Zhu, Jun

    2015-01-01

    Gene co-expression networks comprise one type of valuable biological networks. Many methods and tools have been published to construct gene co-expression networks; however, most of these tools and methods are inconvenient and time-consuming for large datasets. We have developed a user-friendly, accelerated and optimized tool for constructing gene co-expression networks that can fully harness the parallel nature of GPU (Graphic Processing Unit) architectures. Genetic entropies were exploited to filter out genes with no or small expression changes in the raw data preprocessing step. Pearson correlation coefficients were then calculated. After that, we normalized these coefficients and employed the False Discovery Rate to control the multiple tests. Finally, module identification was conducted to construct the co-expression networks. All of these calculations were implemented on a GPU. We also compressed the coefficient matrix to save space. We compared the performance of the GPU implementation with those of multi-core CPU implementations with 16 CPU threads, single-thread C/C++ implementation and single-thread R implementation. Our results show that GPU implementation largely outperforms single-thread C/C++ implementation and single-thread R implementation, and GPU implementation outperforms multi-core CPU implementation when the number of genes increases. With the test dataset containing 16,000 genes and 590 individuals, we can achieve greater than 63 times the speed using a GPU implementation compared with a single-thread R implementation when 50 percent of genes were filtered out and about 80 times the speed when no genes were filtered out. PMID:25602758

  20. Native Honey Bees Outperform Adventive Honey Bees in Increasing Pyrus bretschneideri (Rosales: Rosaceae) Pollination.

    PubMed

    Gemeda, Tolera Kumsa; Shao, Youquan; Wu, Wenqin; Yang, Huipeng; Huang, Jiaxing; Wu, Jie

    2017-12-05

    The foraging behavior of different bee species is a key factor influencing the pollination efficiency of different crops. Most pear species exhibit full self-incompatibility and thus depend entirely on cross-pollination. However, little is known about the pear visitation preferences of native Apis cerana (Fabricius; Hymenoptera: Apidae) and adventive Apis mellifera (L.; Hymenoptera: Apidae) in China. A comparative analysis was performed to explore the pear-foraging differences of these species under the natural conditions of pear growing areas. The results show significant variability in the pollen-gathering tendency of these honey bees. Compared to A. mellifera, A. cerana begins foraging at an earlier time of day and gathers a larger amount of pollen in the morning. Based on pollen collection data, A. mellifera shows variable preferences: vigorously foraging on pear on the first day of observation but collecting pollen from non-target floral resources on other experimental days. Conversely, A. cerana persists in pear pollen collection, without shifting preference to other competitive flowers. Therefore, A. cerana outperforms adventive A. mellifera with regard to pear pollen collection under natural conditions, which may lead to increased pear pollination. This study supports arguments in favor of further multiplication and maintenance of A. cerana for pear and other native crop pollination. Moreover, it is essential to develop alternative pollination management techniques to utilize A. mellifera for pear pollination. © The Author(s) 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Gender Differences in Primary and Secondary Education: Are Girls Really Outperforming Boys?

    ERIC Educational Resources Information Center

    Driessen, Geert; van Langen, Annemarie

    2013-01-01

    A moral panic has broken out in several countries after recent studies showed that girls were outperforming boys in education. Commissioned by the Dutch Ministry of Education, the present study examines the position of boys and girls in Dutch primary education and in the first phase of secondary education over the past ten to fifteen years. On the…

  2. Why Do Chinese Students Out-Perform Those from the West? Do Approaches to Learning Contribute to the Explanation?

    ERIC Educational Resources Information Center

    Kember, David

    2016-01-01

    One of the major current issues in education is the question of why Chinese and East Asian students are outperforming those from Western countries. Research into the approaches to learning of Chinese students revealed the existence of intermediate approaches, combining memorising and understanding, which were distinct from rote learning. At the…

  3. High precision tracking control of a servo gantry with dynamic friction compensation.

    PubMed

    Zhang, Yangming; Yan, Peng; Zhang, Zhen

    2016-05-01

    This paper is concerned with the tracking control problem of a voice coil motor (VCM) actuated servo gantry system. By utilizing an adaptive control technique combined with a sliding mode approach, an adaptive sliding mode control (ASMC) law with a friction compensation scheme is proposed in the presence of both friction and external disturbances. Based on the LuGre dynamic friction model, a dual-observer structure is used to estimate the unmeasurable friction state, and an adaptive control law is synthesized to effectively handle the unknown friction model parameters as well as the bound of the disturbances. Moreover, the proposed control law is also implemented on a VCM servo gantry system for motion tracking. Simulations and experimental results demonstrate good tracking performance that outperforms traditional control approaches. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Design and analysis of a model predictive controller for active queue management.

    PubMed

    Wang, Ping; Chen, Hong; Yang, Xiaoping; Ma, Yan

    2012-01-01

    Model predictive (MP) control is proposed as a novel active queue management (AQM) algorithm for dynamic computer networks. According to the predicted future queue length in the data buffer, the MPAQM controller drops early packets at the router so that the queue length tracks the desired value with minimal error. The drop probability is obtained by optimizing the network performance. Further, randomized algorithms are applied to analyze the robustness of MPAQM successfully, and also to provide the stability domain of systems with uncertain network parameters. The performances of MPAQM are evaluated through a series of simulations in NS2. The simulation results show that the MPAQM algorithm outperforms RED, PI, and REM algorithms in terms of stability, disturbance rejection, and robustness. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
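
    The receding-horizon idea can be sketched on a toy fluid queue (the queue model and its constants are invented for illustration, not the paper's network model): at each step, pick the drop probability whose predicted queue trajectory tracks the reference best, apply it once, and repeat.

```python
# Toy model-predictive AQM on a fluid queue model:
#   q[k+1] = max(0, q[k] + arrivals*(1 - p) - service)
def predict(q, p, arrivals, service, horizon):
    """Predicted squared tracking error if drop probability p is held."""
    err = 0.0
    for _ in range(horizon):
        q = max(0.0, q + arrivals * (1.0 - p) - service)
        err += (q - Q_REF) ** 2
    return err

Q_REF, HORIZON = 100.0, 5
arrivals, service = 60.0, 40.0      # packets per control interval
q = 300.0                           # initial (congested) queue length
for step in range(60):
    # one-dimensional "optimization": grid search over drop probabilities
    p = min((k / 100 for k in range(101)),
            key=lambda p: predict(q, p, arrivals, service, HORIZON))
    q = max(0.0, q + arrivals * (1.0 - p) - service)
print(round(q))   # queue settles near the 100-packet reference
```

    The real controller replaces the grid search with a proper optimizer and a fitted TCP/AQM model, but the structure (predict over a horizon, apply the first move, re-measure) is the same.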

  5. Direct power control of DFIG wind turbine systems based on an intelligent proportional-integral sliding mode control.

    PubMed

    Li, Shanzhi; Wang, Haoping; Tian, Yang; Aitouch, Abdel; Klein, John

    2016-09-01

    This paper presents an intelligent proportional-integral sliding mode control (iPISMC) approach for direct power control of a variable-speed constant-frequency wind turbine system. The approach addresses optimal power production (in the maximum power point tracking sense) under several disturbance factors such as turbulent wind. The controller is made of two sub-components: (i) an intelligent proportional-integral module for online disturbance compensation and (ii) a sliding mode module for circumventing disturbance estimation errors. The iPISMC method has been tested on the FAST/Simulink platform with a 5 MW wind turbine system. The obtained results demonstrate that the proposed iPISMC method outperforms both classical PI and intelligent proportional-integral (iPI) control in terms of active power and response time. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  6. An Optimization-based Framework to Learn Conditional Random Fields for Multi-label Classification

    PubMed Central

    Naeini, Mahdi Pakdaman; Batal, Iyad; Liu, Zitao; Hong, CharmGil; Hauskrecht, Milos

    2015-01-01

    This paper studies the multi-label classification problem, in which data instances are associated with multiple, possibly high-dimensional, label vectors. This problem is especially challenging when labels are dependent and one cannot decompose the problem into a set of independent classification problems. To address the problem and properly represent label dependencies, we propose and study a pairwise conditional random field (CRF) model. We develop a new approach for learning the structure and parameters of the CRF from data. The approach maximizes the pseudo-likelihood of observed labels and relies on fast proximal gradient descent for learning the structure and limited-memory BFGS for learning the parameters of the model. Empirical results on several datasets show that our approach outperforms several multi-label classification baselines, including recently published state-of-the-art methods. PMID:25927015

  7. Structural Acoustic Prediction and Interior Noise Control Technology

    NASA Technical Reports Server (NTRS)

    Mathur, G. P.; Chin, C. L.; Simpson, M. A.; Lee, J. T.; Palumbo, Daniel L. (Technical Monitor)

    2001-01-01

    This report documents the results of Task 14, "Structural Acoustic Prediction and Interior Noise Control Technology". The task was to evaluate the performance of tuned foam elements (termed Smart Foam) both analytically and experimentally. Results taken from a three-dimensional finite element model of an active, tuned foam element are presented. Measurements of sound absorption and sound transmission loss were taken using the model. These results agree well with published data. Experimental performance data were taken in Boeing's Interior Noise Test Facility where 12 smart foam elements were applied to a 757 sidewall. Several configurations were tested. Noise reductions of 5-10 dB were achieved over the 200-800 Hz bandwidth of the controller. Accelerometers mounted on the panel provided a good reference for the controller. Configurations with far-field error microphones outperformed near-field cases.

  8. Probability genotype imputation method and integrated weighted lasso for QTL identification.

    PubMed

    Demetrashvili, Nino; Van den Heuvel, Edwin R; Wit, Ernst C

    2013-12-30

    Many QTL studies share two common features: (1) there is often missing marker information, and (2) among the many markers involved in the biological process only a few are causal. In statistics, the second issue falls under the headings "sparsity" and "causal inference". The goal of this work is to develop a two-step statistical methodology for QTL mapping for markers with binary genotypes. The first step introduces a novel imputation method for missing genotypes. The outcomes of the proposed imputation method are probabilities, which serve as weights in the second step, namely weighted lasso. Sparse phenotype inference is employed to select a set of predictive markers for the trait of interest. Simulation studies validate the proposed methodology under a wide range of realistic settings, and the methodology outperforms alternative imputation and variable selection methods in such studies. The methodology was applied to an Arabidopsis experiment containing 69 markers for 165 recombinant inbred lines of an F8 generation. The results confirm previously identified regions, but several new markers are also found. On the basis of the inferred ROC behavior these markers show good potential for being real, especially for the germination trait Gmax. Our imputation method shows higher accuracy in terms of sensitivity and specificity compared to an alternative imputation method. Also, the proposed weighted lasso outperforms commonly practiced multiple regression as well as the traditional lasso and adaptive lasso with three weighting schemes. This means that under realistic missing-data settings this methodology can be used for QTL identification.
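    A weighted lasso with per-marker penalty weights w_j reduces to a plain lasso after rescaling each column of the design matrix by 1/w_j and unscaling the fitted coefficients. The sketch below illustrates that reduction with a simple ISTA solver on synthetic data; the weights, penalty, and data are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:2] = [3.0, -2.0]                 # only two causal "markers"
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Imputation-derived weights: confident markers get small penalties,
# uncertain ones large penalties (values here are purely illustrative).
w = np.ones(p)
w[2:] = 5.0
lam = 0.5

# Weighted lasso via rescaling: solve a plain lasso on X/w, then unscale.
Xs = X / w                                  # divide column j by w[j]
b = np.zeros(p)
L = np.linalg.norm(Xs, 2) ** 2              # Lipschitz constant of gradient
for _ in range(2000):                       # ISTA iterations
    g = Xs.T @ (Xs @ b - y)
    z = b - g / L
    b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
beta = b / w                                # coefficients on original scale

print(np.round(beta, 2))                    # causal markers survive, rest ~0
```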

  9. Book review: The Wilderness Debate Rages On: Continuing the Great New Wilderness Debate

    Treesearch

    Peter Landres

    2009-01-01

    The Wilderness Debate Rages On is a collection of mostly previously published papers about the meaning, value, and role of wilderness and continues the discussion that was propelled by the editors' previous book The Great New Wilderness Debate (also a collection of papers) published in 1998. The editors state that this sequel to their previous book is mandated...

  10. Forecasting influenza-like illness dynamics for military populations using neural networks and social media

    DOE PAGES

    Volkova, Svitlana; Ayton, Ellyn; Porterfield, Katherine; ...

    2017-12-15

    This work is the first to take advantage of recurrent neural networks to predict influenza-like-illness (ILI) dynamics from various linguistic signals extracted from social media data. Unlike other approaches that rely on time-series analysis of historical ILI data [1, 2] and state-of-the-art machine learning models [3, 4], we build and evaluate the predictive power of Long Short-Term Memory (LSTM) architectures capable of nowcasting (predicting in "real time") and forecasting (predicting the future) ILI dynamics in the 2011-2014 influenza seasons. To build our models we integrate information people post in social media, e.g., topics, stylistic and syntactic patterns, emotions and opinions, and communication behavior. We then quantitatively evaluate the predictive power of different social media signals and contrast the performance of state-of-the-art regression models with neural networks. Finally, we combine ILI and social media signals to build joint neural network models for ILI dynamics prediction. Unlike the majority of the existing work, we specifically focus on developing models for local rather than national ILI surveillance [1], specifically for military rather than general populations [3], in 26 U.S. and six international locations. Our approach demonstrates several advantages: (a) Neural network models learned from social media data yield the best performance compared to previously used regression models. (b) Previously under-explored language and communication behavior features are more predictive of ILI dynamics than syntactic and stylistic signals expressed in social media. (c) Neural network models learned exclusively from social media signals yield comparable or better performance than models learned from ILI historical data; thus, signals from social media can potentially be used to accurately forecast ILI dynamics for regions where ILI historical data are not available. (d) Neural network models learned from combined ILI and social media signals significantly outperform models that rely solely on ILI historical data, which adds to the great potential of alternative public sources for ILI dynamics prediction. (e) Location-specific models outperform previously used location-independent models, e.g., U.S. only. (f) Prediction results vary significantly across geolocations depending on the amount of social media data available and ILI activity patterns.

  11. Forecasting influenza-like illness dynamics for military populations using neural networks and social media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Volkova, Svitlana; Ayton, Ellyn; Porterfield, Katherine

    This work is the first to take advantage of recurrent neural networks to predict influenza-like-illness (ILI) dynamics from various linguistic signals extracted from social media data. Unlike other approaches that rely on time-series analysis of historical ILI data [1, 2] and state-of-the-art machine learning models [3, 4], we build and evaluate the predictive power of Long Short-Term Memory (LSTM) architectures capable of nowcasting (predicting in "real time") and forecasting (predicting the future) ILI dynamics in the 2011-2014 influenza seasons. To build our models we integrate information people post in social media, e.g., topics, stylistic and syntactic patterns, emotions and opinions, and communication behavior. We then quantitatively evaluate the predictive power of different social media signals and contrast the performance of state-of-the-art regression models with neural networks. Finally, we combine ILI and social media signals to build joint neural network models for ILI dynamics prediction. Unlike the majority of the existing work, we specifically focus on developing models for local rather than national ILI surveillance [1], specifically for military rather than general populations [3], in 26 U.S. and six international locations. Our approach demonstrates several advantages: (a) Neural network models learned from social media data yield the best performance compared to previously used regression models. (b) Previously under-explored language and communication behavior features are more predictive of ILI dynamics than syntactic and stylistic signals expressed in social media. (c) Neural network models learned exclusively from social media signals yield comparable or better performance than models learned from ILI historical data; thus, signals from social media can potentially be used to accurately forecast ILI dynamics for regions where ILI historical data are not available. (d) Neural network models learned from combined ILI and social media signals significantly outperform models that rely solely on ILI historical data, which adds to the great potential of alternative public sources for ILI dynamics prediction. (e) Location-specific models outperform previously used location-independent models, e.g., U.S. only. (f) Prediction results vary significantly across geolocations depending on the amount of social media data available and ILI activity patterns.

  12. Effects of linking a soil-water-balance model with a groundwater-flow model

    USGS Publications Warehouse

    Stanton, Jennifer S.; Ryter, Derek W.; Peterson, Steven M.

    2013-01-01

    A previously published regional groundwater-flow model in north-central Nebraska was sequentially linked with the recently developed soil-water-balance (SWB) model to analyze effects to groundwater-flow model parameters and calibration results. The linked models provided a more detailed spatial and temporal distribution of simulated recharge based on hydrologic processes, improvement of simulated groundwater-level changes and base flows at specific sites in agricultural areas, and a physically based assessment of the relative magnitude of recharge for grassland, nonirrigated cropland, and irrigated cropland areas. Root-mean-squared (RMS) differences between the simulated and estimated or measured target values for the previously published model and linked models were relatively similar and did not improve for all types of calibration targets. However, without any adjustment to the SWB-generated recharge, the RMS difference between simulated and estimated base-flow target values for the groundwater-flow model was slightly smaller than for the previously published model, possibly indicating that the volume of recharge simulated by the SWB code was closer to actual hydrogeologic conditions than the previously published model provided. Groundwater-level and base-flow hydrographs showed that temporal patterns of simulated groundwater levels and base flows were more accurate for the linked models than for the previously published model at several sites, particularly in agricultural areas.

  13. Advanced Placement Results, 2013-14. Measuring Up. D&A Report No.15.01

    ERIC Educational Resources Information Center

    Gilleland, Kevin; Muli, Juliana

    2015-01-01

    Advanced Placement (AP) outcomes for Wake County Public School System (WCPSS) students have continued an upward trend for over 18 years, outperforming the state and the nation on all measures. In 2013-14 there were 13,757 exams taken by 6,955 WCPSS test-takers, with almost 76% of the exams resulting in scores at or above 3, outperforming Guilford…

  14. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
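    A prediction market aggregates bets into prices that can be read as probabilities. The paper does not specify its market mechanism, so the sketch below uses the logarithmic market scoring rule (LMSR), a common automated market maker, purely for illustration; the share quantities and liquidity parameter are hypothetical:

```python
import math

# LMSR prices: q[i] is the net number of shares sold for outcome i,
# b is the liquidity parameter. Prices sum to 1 and act as the
# market-implied probabilities of the outcomes.
def lmsr_prices(q, b=100.0):
    m = max(x / b for x in q)                 # stabilise the exponentials
    exps = [math.exp(x / b - m) for x in q]
    s = sum(exps)
    return [e / s for e in exps]

# A market that has sold 120 "replicates" shares vs 40 "fails to replicate".
p_replicate, p_fail = lmsr_prices([120.0, 40.0])
print(round(p_replicate, 2))                  # implied replication probability
```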

  15. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  16. Bioengineered Renal Cell Therapy Device for Clinical Translation

    PubMed Central

    Pino, Christopher J.; Westover, Angela J.; Buffington, Deborah A.; Humes, H. David

    2016-01-01

    The Bioartificial Renal Epithelial Cell System (BRECS) is a cell-based device to treat acute kidney injury through renal cell therapy from an extracorporeal circuit. To enable widespread implementation of cell therapy, the BRECS was designed to be cryopreserved as a complete device, cryostored, cryoshipped to an end-use site, thawed as a complete device, and employed in a therapeutic extracorporeal hemofiltration circuit. This strategy overcomes storage and distribution issues that have been previous barriers to cell therapy. Previous BRECS housings were produced by Computer Numerical Control (CNC) machining, a slow process that takes hours to produce one bioreactor and is prohibitively expensive (>$600/CNC-BRECS); both are major obstacles to mass production. The goal of this study was to produce a BRECS that could be mass-produced by injection molding (IM-BRECS), decreasing cost (<$20/unit) and improving manufacturing speed (hundreds of units/hr), while maintaining the same cell therapy function as the previous CNC-BRECS, first evaluated through prototypes produced by stereolithography (SLA-BRECS). The finalized IM-BRECS design had a significantly lower fill volume (10 mL), mass (49 g) and footprint (8.5 cm × 8.5 cm × 1.5 cm), and was demonstrated to outperform the previous BRECS designs with respect to heat transfer, significantly improving control of cooling during cryopreservation and reducing thaw times during warming. During in vitro culture, IM-BRECS performed similarly to previous CNC-BRECS with respect to cell metabolic activity (lactate production, oxygen consumption and glutathione metabolism) and the number of cells supported. PMID:27922886

  17. The role of auditory feedback in music-supported stroke rehabilitation: A single-blinded randomised controlled intervention.

    PubMed

    van Vugt, F T; Kafczyk, T; Kuhn, W; Rollnik, J D; Tillmann, B; Altenmüller, E

    2016-01-01

    Learning to play a musical instrument such as the piano was previously shown to benefit post-stroke motor rehabilitation. Previous work hypothesised that the mechanism of this rehabilitation is that patients use auditory feedback to correct their movements and therefore show motor learning. We tested this hypothesis by manipulating the auditory feedback timing in a way that should disrupt such error-based learning. We contrasted a patient group undergoing music-supported therapy on a piano that emits sounds immediately (as in previous studies) with a group whose sounds were presented after a jittered delay. The delay was not noticeable to patients. Thirty-four patients in early stroke rehabilitation with moderate motor impairment and no previous musical background learned to play the piano using simple finger exercises and familiar children's songs. Rehabilitation outcome was not impaired in the jitter group relative to the normal group. Conversely, some clinical tests suggest the jitter group outperformed the normal group. Auditory feedback-based motor learning is therefore not the beneficial mechanism of music-supported therapy, and immediate auditory feedback therapy may be suboptimal. A jittered delay may increase the efficacy of the proposed therapy and allow patients to fully benefit from the motivational factors of music training. Our study shows a novel way to test hypotheses concerning music training in a single-blinded way, which is an important improvement over existing unblinded tests of music interventions.

  18. 15 CFR 10.10 - Review of published standards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Review of published standards. 10.10... DEVELOPMENT OF VOLUNTARY PRODUCT STANDARDS § 10.10 Review of published standards. (a) Each standard published... considered until a replacement standard is published. (b) Each standard published under these or previous...

  19. Image watermarking against lens flare effects

    NASA Astrophysics Data System (ADS)

    Chotikawanid, Piyanart; Amornraksa, Thumrongrat

    2017-02-01

    Lens flare effects in today's photo and camera software can partially or fully damage the watermark information within a watermarked image. We propose in this paper a spatial-domain image watermarking method robust against lens flare effects. The watermark embedding is based on the modification of the saturation color component in the HSV color space of a host image. For watermark extraction, a homomorphic filter is used to predict the original embedding component from the watermarked component, and the watermark is blindly recovered by differentiating both components. The watermarked image's quality is evaluated by wPSNR, while the extracted watermark's accuracy is evaluated by NC. The experimental results against various types of lens flare effects from both computer software and mobile applications showed that our proposed method outperformed the previous methods.

  20. Identifying synonymy between relational phrases using word embeddings.

    PubMed

    Nguyen, Nhung T H; Miwa, Makoto; Tsuruoka, Yoshimasa; Tojo, Satoshi

    2015-08-01

    Many text mining applications in the biomedical domain benefit from automatic clustering of relational phrases into synonymous groups, since it alleviates the problem of spurious mismatches caused by the diversity of natural language expressions. Most of the previous work that has addressed this task of synonymy resolution uses similarity metrics between relational phrases based on textual strings or dependency paths, which, for the most part, ignore the context around the relations. To overcome this shortcoming, we employ a word embedding technique to encode relational phrases. We then apply the k-means algorithm on top of the distributional representations to cluster the phrases. Our experimental results show that this approach outperforms state-of-the-art statistical models including latent Dirichlet allocation and Markov logic networks. Copyright © 2015 Elsevier Inc. All rights reserved.
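    The pipeline in this abstract is: encode each relational phrase from its word embeddings, then cluster with k-means. A minimal sketch of that idea, with toy hand-built 8-dimensional "embeddings" standing in for trained vectors (the phrases and vectors are illustrative, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 8-d word vectors; in the paper these come from embeddings
# trained on biomedical text. Synonymous verbs get nearby vectors.
inh, act = np.ones(8), -np.ones(8)
vocab = {
    "inhibits": inh,
    "suppresses": inh + 0.1 * rng.normal(size=8),
    "blocks": inh + 0.1 * rng.normal(size=8),
    "activates": act,
    "induces": act + 0.1 * rng.normal(size=8),
    "expression": rng.normal(size=8),
    "of": rng.normal(size=8),
}
phrases = ["inhibits expression of", "suppresses expression of",
           "blocks expression of", "activates expression of",
           "induces expression of"]
# Encode a relational phrase as the mean of its word vectors.
P = np.array([np.mean([vocab[w] for w in p.split()], axis=0)
              for p in phrases])

# Plain k-means with k=2 (Lloyd's algorithm), seeded from two phrases.
centers = P[[0, 3]].copy()
for _ in range(10):
    dists = ((P[:, None, :] - centers[None]) ** 2).sum(-1)
    labels = np.argmin(dists, axis=1)
    centers = np.array([P[labels == k].mean(axis=0) for k in range(2)])

print(labels)   # the three "inhibit"-type phrases share one cluster
```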

  1. Modeling Aromatic Liquids: Toluene, Phenol, and Pyridine.

    PubMed

    Baker, Christopher M; Grant, Guy H

    2007-03-01

    Aromatic groups are now acknowledged to play an important role in many systems of interest. However, existing molecular mechanics methods provide a poor representation of these groups. In a previous paper, we have shown that the molecular mechanics treatment of benzene can be improved by the incorporation of an explicit representation of the aromatic π electrons. Here, we develop this concept further, developing charge-separation models for toluene, phenol, and pyridine. Monte Carlo simulations are used to parametrize the models, via the reproduction of experimental thermodynamic data, and our models are shown to outperform an existing atom-centered model. The models are then used to make predictions about the structures of the liquids at the molecular level and are tested further through their application to the modeling of gas-phase dimers and cation-π interactions.

  2. Constructive autoassociative neural network for facial recognition.

    PubMed

    Fernandes, Bruno J T; Cavalcanti, George D C; Ren, Tsang I

    2014-01-01

    Autoassociative artificial neural networks have been used in many different computer vision applications. However, it is difficult to define the most suitable neural network architecture because this definition is based on previous knowledge and depends on the problem domain. To address this problem, we propose a constructive autoassociative neural network called CANet (Constructive Autoassociative Neural Network). CANet integrates the concepts of receptive fields and autoassociative memory in a dynamic architecture that changes the configuration of the receptive fields by adding new neurons in the hidden layer, while a pruning algorithm removes neurons from the output layer. Neurons in the CANet output layer present lateral inhibitory connections that improve the recognition rate. Experiments in face recognition and facial expression recognition show that the CANet outperforms other methods presented in the literature.

  3. Optimal design of radial Bragg cavities and lasers.

    PubMed

    Ben-Bassat, Eyal; Scheuer, Jacob

    2015-07-01

    We present a new and optimal design approach for obtaining maximal confinement of the field in radial Bragg cavities and lasers for TM polarization. The presented approach substantially outperforms the previously employed periodic and semi-periodic design schemes for such lasers. We show that in order to obtain maximal confinement, it is essential to consider the complete reflection properties (amplitude and phase) of the propagating radial waves at the interfaces between Bragg layers. When these properties are taken into account, we find that it is necessary to introduce a wider ("half-wavelength") layer at a specific radius in the "quarter-wavelength" radial Bragg stack. It is shown that this radius corresponds to the cylindrical equivalent of Brewster's angle. The confinement and field profile are calculated numerically by means of the transfer matrix method.
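    The transfer matrix method mentioned here multiplies a 2x2 characteristic matrix per layer. A planar, normal-incidence analogue of a quarter-wave Bragg stack (not the paper's radial geometry; indices, wavelength, and layer counts are illustrative) looks like this:

```python
import numpy as np

# Characteristic-matrix calculation for a planar quarter-wave Bragg
# mirror at normal incidence -- a planar stand-in for the radial case.
lam0 = 1.55                  # design wavelength (um), illustrative
n_hi, n_lo = 2.0, 1.5        # alternating layer indices
n_in, n_out = 1.0, 1.5       # incidence and substrate media
pairs = 8

M = np.eye(2, dtype=complex)
for _ in range(pairs):
    for n in (n_hi, n_lo):
        d = lam0 / (4.0 * n)                  # quarter-wave thickness
        delta = 2.0 * np.pi * n * d / lam0    # phase thickness (= pi/2)
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer

# Reflection coefficient from the stack's characteristic matrix.
num = n_in * M[0, 0] + n_in * n_out * M[0, 1] - M[1, 0] - n_out * M[1, 1]
den = n_in * M[0, 0] + n_in * n_out * M[0, 1] + M[1, 0] + n_out * M[1, 1]
R = abs(num / den) ** 2
print(round(R, 4))           # near-unity reflectivity at design wavelength
```

    The cylindrical problem replaces these plane-wave matrices with Bessel-function solutions, which is what makes the reflection phase radius-dependent and motivates the paper's "half-wavelength" defect layer.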

  4. Single image super-resolution via regularized extreme learning regression for imagery from microgrid polarimeters

    NASA Astrophysics Data System (ADS)

    Sargent, Garrett C.; Ratliff, Bradley M.; Asari, Vijayan K.

    2017-08-01

    The advantage of division of focal plane imaging polarimeters is their ability to obtain temporally synchronized intensity measurements across a scene; however, they sacrifice spatial resolution in doing so due to their spatially modulated arrangement of the pixel-to-pixel polarizers and often result in aliased imagery. Here, we propose a super-resolution method based upon two previously trained extreme learning machines (ELM) that attempt to recover missing high frequency and low frequency content beyond the spatial resolution of the sensor. This method yields a computationally fast and simple way of recovering lost high and low frequency content from demosaicing raw microgrid polarimetric imagery. The proposed method outperforms other state-of-the-art single-image super-resolution algorithms in terms of structural similarity and peak signal-to-noise ratio.
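    An extreme learning machine is a single hidden layer whose input weights are random and fixed; only the linear readout is solved, here with a ridge (regularized) least squares step as the abstract's "regularized extreme learning regression" suggests. The toy 1-d regression below is a sketch; hidden size, ridge strength, and the data are assumptions, and the actual polarimetric demosaicing pipeline is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

# ELM regression: random hidden layer + closed-form ridge readout.
def elm_fit(X, y, hidden=200, ridge=1e-3, rng=rng):
    W = rng.normal(size=(X.shape[1], hidden))   # random, never trained
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                      # random feature map
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy 1-d regression problem standing in for the image regression task.
X = rng.uniform(-1, 1, size=(300, 1))
y = np.sin(3 * X[:, 0])
W, b, beta = elm_fit(X, y)
err = np.max(np.abs(elm_predict(X, W, b, beta) - y))
print(err)                                      # small training error
```

    Because the readout is a single linear solve, training is fast, which is the property that makes ELMs attractive for per-frame super-resolution.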

  5. Overcoming Communication Restrictions in Collectives

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Agogino, Adrian K.

    2004-01-01

    Many large distributed systems are characterized by having a large number of components (e.g., agents, neurons) whose actions and interactions determine a world utility which rates the performance of the overall system. Such collectives are often subject to communication restrictions, making it difficult for components, which try to optimize their own private utilities, to take actions that also help optimize the world utility. In this article we address that coordination problem and derive four utility functions which present different compromises between how aligned a component's private utility is with the world utility and how readily that component can determine the actions that optimize its utility. The results show that the utility functions specifically derived to operate under communication restrictions outperform both traditional methods and previous collective-based methods by up to 75%.

  6. Dictionary learning and time sparsity in dynamic MRI.

    PubMed

    Caballero, Jose; Rueckert, Daniel; Hajnal, Joseph V

    2012-01-01

    Sparse representation methods have been shown to tackle adequately the inherent speed limits of magnetic resonance imaging (MRI) acquisition. Recently, learning-based techniques have been used to further accelerate the acquisition of 2D MRI. The extension of such algorithms to dynamic MRI (dMRI) requires careful examination of the signal sparsity distribution among the different dimensions of the data. Notably, the potential of temporal gradient (TG) sparsity in dMRI has not yet been explored. In this paper, a novel method for the acceleration of cardiac dMRI is presented which investigates the potential benefits of enforcing sparsity constraints on patch-based learned dictionaries and TG at the same time. We show that an algorithm exploiting sparsity on these two domains can outperform previous sparse reconstruction techniques.

  7. Coping efficiently with now-relative medical data.

    PubMed

    Stantic, Bela; Terenziani, Paolo; Sattar, Abdul

    2008-11-06

    In medical informatics, there is an increasing awareness that temporal information plays a crucial role, so suitable database approaches are needed to store and support it. Specifically, most clinical data are intrinsically temporal, and a relevant part of them are now-relative (i.e., valid at the current time). Even though previous studies indicate that the treatment of now-relative data has a crucial impact on efficiency, current approaches have several limitations. In this paper we propose a novel approach based on a new representation of now and on query transformations. We also experimentally demonstrate that our approach outperforms its best competitors in the literature by a factor of more than ten, in both the number of disk accesses and CPU usage.
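    A common baseline encoding of "now-relative" validity, which the paper improves upon, stores a sentinel maximum timestamp for open-ended intervals and rewrites queries accordingly. The sketch below only illustrates that baseline idea (table, column names, and data are hypothetical, not the paper's schema or representation):

```python
import sqlite3

FOREVER = "9999-12-31"   # sentinel end time for now-relative rows

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE therapy("
            "patient TEXT, drug TEXT, vt_start TEXT, vt_end TEXT)")
con.executemany("INSERT INTO therapy VALUES (?,?,?,?)", [
    ("p1", "aspirin",   "2008-01-01", "2008-03-01"),   # closed interval
    ("p1", "metformin", "2008-02-01", FOREVER),        # now-relative row
])

# "Which therapies are valid today?" -- ISO date strings compare
# lexicographically, so the sentinel always tests as still valid.
today = "2008-06-15"
rows = con.execute(
    "SELECT drug FROM therapy WHERE vt_start <= ? AND ? < vt_end",
    (today, today)).fetchall()
print([r[0] for r in rows])   # only the still-open therapy qualifies
```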

  8. Global Search Capabilities of Indirect Methods for Impulsive Transfers

    NASA Astrophysics Data System (ADS)

    Shen, Hong-Xin; Casalino, Lorenzo; Luo, Ya-Zhong

    2015-09-01

    An optimization method which combines an indirect method with homotopic approach is proposed and applied to impulsive trajectories. Minimum-fuel, multiple-impulse solutions, with either fixed or open time are obtained. The homotopic approach at hand is relatively straightforward to implement and does not require an initial guess of adjoints, unlike previous adjoints estimation methods. A multiple-revolution Lambert solver is used to find multiple starting solutions for the homotopic procedure; this approach can guarantee to obtain multiple local solutions without relying on the user's intuition, thus efficiently exploring the solution space to find the global optimum. The indirect/homotopic approach proves to be quite effective and efficient in finding optimal solutions, and outperforms the joint use of evolutionary algorithms and deterministic methods in the test cases.

  9. Distant supervision for neural relation extraction integrated with word attention and property features.

    PubMed

    Qu, Jianfeng; Ouyang, Dantong; Hua, Wen; Ye, Yuxin; Li, Ximing

    2018-04-01

    Distant supervision for neural relation extraction is an efficient approach to extracting massive relations with reference to plain texts. However, the existing neural methods fail to capture the critical words in sentence encoding and meanwhile lack useful sentence information for some positive training instances. To address the above issues, we propose a novel neural relation extraction model. First, we develop a word-level attention mechanism to distinguish the importance of each individual word in a sentence, increasing the attention weights for those critical words. Second, we investigate the semantic information from word embeddings of target entities, which can be developed as a supplementary feature for the extractor. Experimental results show that our model outperforms previous state-of-the-art baselines. Copyright © 2018 Elsevier Ltd. All rights reserved.
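    Word-level attention reduces to scoring each word vector, softmaxing the scores into weights, and pooling. The sketch below uses a plain dot-product score against a query vector; the dimensions, embeddings, and scoring form are illustrative assumptions, not the paper's exact parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Score words against a query, softmax, and pool into a sentence encoding.
def attend(word_vecs, query):
    scores = word_vecs @ query               # one scalar score per word
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ word_vecs            # attention-weighted encoding
    return weights, context

words = ["Obama", "was", "born", "in", "Hawaii"]
E = rng.normal(size=(5, 16))                 # toy word embeddings
query = E[2]                                 # make "born" the critical word
weights, context = attend(E, query)
print(words[int(np.argmax(weights))])        # highest-weight word
```

    In the paper's setting the query would itself be learned, so that relation-critical words such as "born" receive the largest weights during training.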

  10. Deep Learning for Computer Vision: A Brief Review

    PubMed Central

    Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

Over recent years, deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, namely Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619

  11. Learning to forget: continual prediction with LSTM.

    PubMed

    Gers, F A; Schmidhuber, J; Cummins, F

    2000-10-01

    Long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset. Without resets, the state may grow indefinitely and eventually cause the network to break down. Our remedy is a novel, adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review illustrative benchmark problems on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve continual versions of these problems. LSTM with forget gates, however, easily solves them, and in an elegant way.
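The role of the forget gate can be illustrated with a single-unit LSTM cell step. This is a schematic pure-Python sketch of the standard gated equations with made-up weights, not the paper's implementation; in the original (forget-gate-free) LSTM the factor f below is fixed at 1, which is why the state can grow without bound:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One step of a single-unit LSTM cell with a forget gate.
    W maps each gate to (input weight, recurrent weight, bias)."""
    i = sigmoid(W['i'][0] * x + W['i'][1] * h_prev + W['i'][2])    # input gate
    f = sigmoid(W['f'][0] * x + W['f'][1] * h_prev + W['f'][2])    # forget gate
    o = sigmoid(W['o'][0] * x + W['o'][1] * h_prev + W['o'][2])    # output gate
    g = math.tanh(W['g'][0] * x + W['g'][1] * h_prev + W['g'][2])  # candidate
    c = f * c_prev + i * g    # f scales the old state; without resets f = 1
    h = o * math.tanh(c)
    return h, c, f

# A strongly negative forget-gate pre-activation drives f toward 0, so the
# cell discards its accumulated state instead of letting it grow.
W = {'i': (0.5, 0.1, 0.0), 'f': (0.0, 0.0, -10.0),
     'o': (0.5, 0.1, 0.0), 'g': (0.5, 0.1, 0.0)}
h, c, f = lstm_step(x=1.0, h_prev=0.0, c_prev=5.0, W=W)
```

Here the large prior state (c_prev = 5.0) is almost entirely forgotten, which is the "learn to reset itself at appropriate times" mechanism the abstract proposes; in the trained network the gate's weights, not a hand-set bias, decide when to reset.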

  12. Knowledge boosting: a graph-based integration approach with multi-omics data and genomic knowledge for cancer clinical outcome prediction.

    PubMed

    Kim, Dokyoon; Joung, Je-Gun; Sohn, Kyung-Ah; Shin, Hyunjung; Park, Yu Rang; Ritchie, Marylyn D; Kim, Ju Han

    2015-01-01

    Cancer can involve gene dysregulation via multiple mechanisms, so no single level of genomic data fully elucidates tumor behavior due to the presence of numerous genomic variations within or between levels in a biological system. We have previously proposed a graph-based integration approach that combines multi-omics data including copy number alteration, methylation, miRNA, and gene expression data for predicting clinical outcome in cancer. However, genomic features likely interact with other genomic features in complex signaling or regulatory networks, since cancer is caused by alterations in pathways or complete processes. Here we propose a new graph-based framework for integrating multi-omics data and genomic knowledge to improve power in predicting clinical outcomes and elucidate interplay between different levels. To highlight the validity of our proposed framework, we used an ovarian cancer dataset from The Cancer Genome Atlas for predicting stage, grade, and survival outcomes. Integrating multi-omics data with genomic knowledge to construct pre-defined features resulted in higher performance in clinical outcome prediction and higher stability. For the grade outcome, the model with gene expression data produced an area under the receiver operating characteristic curve (AUC) of 0.7866. However, models of the integration with pathway, Gene Ontology, chromosomal gene set, and motif gene set consistently outperformed the model with genomic data only, attaining AUCs of 0.7873, 0.8433, 0.8254, and 0.8179, respectively. Integrating multi-omics data and genomic knowledge to improve understanding of molecular pathogenesis and underlying biology in cancer should improve diagnostic and prognostic indicators and the effectiveness of therapies. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  13. Triplet supertree heuristics for the tree of life

    PubMed Central

    Lin, Harris T; Burleigh, J Gordon; Eulenstein, Oliver

    2009-01-01

Background There is much interest in developing fast and accurate supertree methods to infer the tree of life. Supertree methods combine smaller input trees with overlapping sets of taxa to make a comprehensive phylogenetic tree that contains all of the taxa in the input trees. The intrinsically hard triplet supertree problem takes a collection of input species trees and seeks a species tree (supertree) that maximizes the number of triplet subtrees it shares with the input trees. However, the utility of this supertree problem has been limited by a lack of efficient and effective heuristics. Results We introduce fast hill-climbing heuristics for the triplet supertree problem that perform a step-wise search of the tree space, where each step is guided by an exact solution to an instance of a local search problem. To realize time-efficient heuristics we designed the first nontrivial algorithms for two standard search problems, which improve on the time complexity of the best known (naïve) solutions by factors of n and n², respectively, where n is the number of taxa in the supertree. These algorithms enable large-scale supertree analyses based on the triplet supertree problem that were previously not possible. We implemented hill-climbing heuristics based on our new algorithms, and in analyses of two published supertree data sets we demonstrate that our new heuristics outperform other standard supertree methods in maximizing the number of triplets shared with the input trees. Conclusion With our new heuristics, the triplet supertree problem is now computationally more tractable for large-scale supertree analyses, and it provides a potentially more accurate alternative to existing supertree methods. PMID:19208181
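The objective being maximized, the number of rooted triplets a candidate supertree shares with an input tree, can be computed naïvely for small trees. The sketch below (rooted trees as nested tuples, leaves as strings) is purely illustrative; the paper's contribution is precisely to avoid this kind of enumeration inside the local search steps:

```python
from itertools import combinations

def leafset(tree):
    """Leaves below a node; a tree is a nested tuple, a leaf a string."""
    if not isinstance(tree, tuple):
        return frozenset([tree])
    return frozenset().union(*(leafset(child) for child in tree))

def clades(tree):
    """All leaf sets induced by the subtrees of a rooted tree."""
    out = {leafset(tree)}
    if isinstance(tree, tuple):
        for child in tree:
            out |= clades(child)
    return out

def triplets(tree):
    """Resolved rooted triplets ab|c: some clade contains {a, b} but not c."""
    cs, trips = clades(tree), set()
    for a, b, c in combinations(sorted(leafset(tree)), 3):
        for pair, out in (((a, b), c), ((a, c), b), ((b, c), a)):
            if any(pair[0] in s and pair[1] in s and out not in s for s in cs):
                trips.add((frozenset(pair), out))
                break
    return trips

# Two conflicting input trees over four taxa share two of their four triplets.
t1 = ((('a', 'b'), 'c'), 'd')
t2 = ((('a', 'b'), 'd'), 'c')
shared = len(triplets(t1) & triplets(t2))
```

Enumerating all triples costs O(n³) per tree, which is exactly the kind of factor the paper's local search algorithms shave off.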

  14. Can physiological engineering/programming increase multi-generational thermal tolerance to extreme temperature events?

    PubMed

    Sorby, Kris L; Green, Mark P; Dempster, Tim D; Jessop, Tim S

    2018-05-29

Organisms increasingly encounter higher frequencies of extreme weather events as a consequence of global climate change. Currently, few strategies are available to mitigate climate change effects on animals arising from acute extreme high-temperature events. We tested the capacity of physiological engineering to influence the intra- and multi-generational upper thermal tolerance of a model organism, Artemia, subjected to extreme high temperatures. Enhancement of specific physiological regulators during development could affect thermal tolerances or life-history attributes affecting subsequent fitness. Using experimental Artemia populations, we exposed F0 individuals to one of four treatments: heat hardening (28°C to 36°C, 1°C per 10 minutes), heat hardening plus serotonin (0.056 µg ml⁻¹), heat hardening plus methionine (0.79 mg ml⁻¹), and a control treatment. Regulator concentrations were based on previous literature. Serotonin may promote thermotolerance, acting upon metabolism and life history. Methionine acts as a methylation agent across generations. For all groups, measurements were collected for three individual thermal tolerance traits (upper sublethal thermal limit, lethal limit, and dysregulation range) over two generations. No treatment increased the upper thermal limit during acute thermal stress, although serotonin-treated and methionine-treated individuals outperformed controls across multiple thermal performance traits. Additionally, some effects were evident across generations. Together these results suggest phenotypic engineering produces complex outcomes; if implemented with heat hardening, it can further influence performance in multiple thermal tolerance traits, within and across generations. Potentially, such techniques could be up-scaled to provide resilience and stability in populations susceptible to extreme temperature events. © 2018. Published by The Company of Biologists Ltd.

  15. Predicting Relapse in Patients With Medulloblastoma by Integrating Evidence From Clinical and Genomic Features

    PubMed Central

    Tamayo, Pablo; Cho, Yoon-Jae; Tsherniak, Aviad; Greulich, Heidi; Ambrogio, Lauren; Schouten-van Meeteren, Netteke; Zhou, Tianni; Buxton, Allen; Kool, Marcel; Meyerson, Matthew; Pomeroy, Scott L.; Mesirov, Jill P.

    2011-01-01

    Purpose Despite significant progress in the molecular understanding of medulloblastoma, stratification of risk in patients remains a challenge. Focus has shifted from clinical parameters to molecular markers, such as expression of specific genes and selected genomic abnormalities, to improve accuracy of treatment outcome prediction. Here, we show how integration of high-level clinical and genomic features or risk factors, including disease subtype, can yield more comprehensive, accurate, and biologically interpretable prediction models for relapse versus no-relapse classification. We also introduce a novel Bayesian nomogram indicating the amount of evidence that each feature contributes on a patient-by-patient basis. Patients and Methods A Bayesian cumulative log-odds model of outcome was developed from a training cohort of 96 children treated for medulloblastoma, starting with the evidence provided by clinical features of metastasis and histology (model A) and incrementally adding the evidence from gene-expression–derived features representing disease subtype–independent (model B) and disease subtype–dependent (model C) pathways, and finally high-level copy-number genomic abnormalities (model D). The models were validated on an independent test cohort (n = 78). Results On an independent multi-institutional test data set, models A to D attain an area under receiver operating characteristic (au-ROC) curve of 0.73 (95% CI, 0.60 to 0.84), 0.75 (95% CI, 0.64 to 0.86), 0.80 (95% CI, 0.70 to 0.90), and 0.78 (95% CI, 0.68 to 0.88), respectively, for predicting relapse versus no relapse. Conclusion The proposed models C and D outperform the current clinical classification schema (au-ROC, 0.68), our previously published eight-gene outcome signature (au-ROC, 0.71), and several new schemas recently proposed in the literature for medulloblastoma risk stratification. PMID:21357789
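The incremental, per-patient accumulation of evidence in a cumulative log-odds model can be sketched as a sum of per-feature contributions on the log-odds scale. The likelihood ratios below are invented for illustration, not the published model's fitted values:

```python
import math

def posterior_probability(prior_prob, log_likelihood_ratios):
    """Combine a prior with per-feature evidence on the log-odds scale.
    Each entry is log(P(feature | relapse) / P(feature | no relapse))."""
    log_odds = math.log(prior_prob / (1.0 - prior_prob))
    log_odds += sum(log_likelihood_ratios)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical patient: metastasis present (evidence for relapse),
# favorable histology (evidence against), subtype signature mildly for.
evidence = [math.log(2.5), math.log(0.6), math.log(1.4)]
p = posterior_probability(prior_prob=0.3, log_likelihood_ratios=evidence)
```

Because each feature contributes an additive term, the model can be read off feature by feature for a given patient, which is what the nomogram visualizes.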

  16. Teaching for understanding in medical classrooms using multimedia design principles.

    PubMed

    Issa, Nabil; Mayer, Richard E; Schuller, Mary; Wang, Edward; Shapiro, Michael B; DaRosa, Debra A

    2013-04-01

    In line with a recent report entitled Effective Use of Educational Technology in Medical Education from the Association of American Medical Colleges Institute for Improving Medical Education (AAMC-IME), this study examined whether revising a medical lecture based on evidence-based principles of multimedia design would lead to improved long-term transfer and retention in Year 3 medical students. A previous study yielded positive effects on an immediate retention test, but did not investigate long-term effects. In a pre-test/post-test control design, a cohort of 37 Year 3 medical students at a private, midwestern medical school received a bullet point-based PowerPoint™ lecture on shock developed by the instructor as part of their core curriculum (the traditional condition group). Another cohort of 43 similar medical students received a lecture covering identical content using slides redesigned according to Mayer's evidence-based principles of multimedia design (the modified condition group). Findings showed that the modified condition group significantly outscored the traditional condition group on delayed tests of transfer given 1 week (d = 0.83) and 4 weeks (d = 1.17) after instruction, and on delayed tests of retention given 1 week (d = 0.83) and 4 weeks (d = 0.79) after instruction. The modified condition group also significantly outperformed the traditional condition group on immediate tests of retention (d = 1.49) and transfer (d = 0.76). This study provides the first evidence that applying multimedia design principles to an actual medical lecture has significant effects on measures of learner understanding (i.e. long-term transfer and long-term retention). This work reinforces the need to apply the science of learning and instruction in medical education. © Blackwell Publishing Ltd 2013.

  17. Accurate shade image matching by using a smartphone camera.

    PubMed

    Tam, Weng-Kong; Lee, Hsi-Jian

    2017-04-01

Dental shade matching using digital images may be feasible when suitable color features are properly manipulated. Separating the color features into feature spaces facilitates favorable matching. We propose using support vector machines (SVM), which are outstanding classifiers, for shade classification. A total of 1300 shade tab images were captured using a smartphone camera with auto-mode settings and no flash. The images were shot at angled distances of 14-20 cm from a shade guide at a clinic equipped with light tubes producing a 4000 K color temperature. The Group 1 samples comprised 1040 tab images, for which the shade guide was randomly positioned in the clinic, and the Group 2 samples comprised 260 tab images, for which the shade guide had a fixed position in the clinic. Rectangular content was cropped manually from each shade tab image and further divided into 10×2 blocks. The color features extracted from the blocks were described by a feature vector. The feature vectors in each group underwent SVM training and classification using the "leave-one-out" strategy. The top-1 and top-3 accuracies of Group 1 were 0.86 and 0.98, respectively, and those of Group 2 were 0.97 and 1.00, respectively. This study provides a feasible technique for dental shade classification using the camera of a mobile device. The findings reveal that the proposed SVM classification may outperform the shade-matching results of previous studies that performed similarity measurements of ΔE levels or used an S, a*, b* feature set. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
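The "leave-one-out" evaluation protocol can be sketched as follows. This pure-Python illustration uses synthetic feature vectors and a nearest-centroid classifier as a stand-in for the SVM (in practice each vector would hold the per-block color features and an SVM, e.g. scikit-learn's `SVC`, would be trained at each fold):

```python
def nearest_centroid_predict(train, test_vec):
    """Stand-in classifier: assign the label whose mean feature vector is
    closest to the test vector (an SVM would be trained here instead)."""
    centroids = {}
    for label in {lab for _, lab in train}:
        vecs = [v for v, lab in train if lab == label]
        centroids[label] = [sum(col) / len(vecs) for col in zip(*vecs)]
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(centroids, key=lambda lab: dist(centroids[lab], test_vec))

def leave_one_out_accuracy(samples):
    """Hold out one sample, train on the rest, test, and repeat for all."""
    correct = 0
    for i, (vec, label) in enumerate(samples):
        train = samples[:i] + samples[i + 1:]
        correct += nearest_centroid_predict(train, vec) == label
    return correct / len(samples)

# Synthetic two-shade data: each vector mimics per-block color features.
samples = [([0.10, 0.20], 'A1'), ([0.15, 0.25], 'A1'), ([0.12, 0.22], 'A1'),
           ([0.80, 0.90], 'B2'), ([0.85, 0.95], 'B2'), ([0.82, 0.92], 'B2')]
acc = leave_one_out_accuracy(samples)
```

Leave-one-out makes maximal use of a limited image set, which matters when only 260 images are available for a fixed-position group.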

  18. Optimal expression of a Fab-effector fusion protein in Escherichia coli by removing the cysteine residues responsible for an interchain disulfide bond of a Fab molecule.

    PubMed

    Kang, Hyeon-Ju; Kim, Hye-Jin; Jung, Mun-Sik; Han, Jae-Kyu; Cha, Sang-Hoon

    2017-04-01

Development of novel bi-functional or even tri-functional Fab-effector fusion proteins would have great potential in the biomedical sciences. However, the expression of Fab-effector fusion proteins in Escherichia coli is problematic, especially when a eukaryotic effector moiety is genetically linked to a Fab, due to the lack of proper chaperone proteins and an inappropriate physicochemical environment intrinsic to microbial hosts. We previously reported that a human Fab molecule reactive to human serum albumin, referred to as SL335, has a prolonged in vivo serum half-life in rats. Here, we tested six discrete SL335-human growth hormone (hGH) fusion constructs as a model system to define an optimal Fab-effector fusion format for E. coli expression. We found that one variant, referred to as HserG/Lser, outperformed the others in soluble expression yield and functionality: it retains hGH bioactivity and possesses a serum albumin-binding affinity comparable to that of SL335. Our results clearly demonstrate that genetic linkage of an effector domain to the C-terminus of Fd (VH + CH1) and removal of the cysteine (Cys) residues responsible for the interchain disulfide bond (IDB) in a Fab molecule optimize the periplasmic expression of a Fab-effector fusion protein in E. coli. We believe that our approach can contribute to the development of diverse bi-functional Fab-effector fusion proteins by providing a simple strategy that enables the reliable expression of functional fusion proteins in E. coli. Copyright © 2017 European Federation of Immunological Societies. Published by Elsevier B.V. All rights reserved.

  19. The Importance of Non-accessible Crosslinks and Solvent Accessible Surface Distance in Modeling Proteins with Restraints From Crosslinking Mass Spectrometry*

    PubMed Central

    Bullock, Joshua Matthew Allen; Schwab, Jannik; Thalassinos, Konstantinos; Topf, Maya

    2016-01-01

    Crosslinking mass spectrometry (XL-MS) is becoming an increasingly popular technique for modeling protein monomers and complexes. The distance restraints garnered from these experiments can be used alone or as part of an integrative modeling approach, incorporating data from many sources. However, modeling practices are varied and the difference in their usefulness is not clear. Here, we develop a new scoring procedure for models based on crosslink data—Matched and Nonaccessible Crosslink score (MNXL). We compare its performance with that of other commonly-used scoring functions (Number of Violations and Sum of Violation Distances) on a benchmark of 14 protein domains, each with 300 corresponding models (at various levels of quality) and associated, previously published, experimental crosslinks (XLdb). The distances between crosslinked lysines are calculated either as Euclidean distances or Solvent Accessible Surface Distances (SASD) using a newly-developed method (Jwalk). MNXL takes into account whether a crosslink is nonaccessible, i.e. an experimentally observed crosslink has no corresponding SASD in a model due to buried lysines. This metric alone is shown to have a significant impact on modeling performance and is a concept that is not considered at present if only Euclidean distances are used. Additionally, a comparison between modeling with SASD or Euclidean distance shows that SASD is superior, even when factoring out the effect of the nonaccessible crosslinks. Our benchmarking also shows that MNXL outperforms the other tested scoring functions in terms of precision and correlation to Cα-RMSD from the crystal structure. We finally test the MNXL at different levels of crosslink recovery (i.e. the percentage of crosslinks experimentally observed out of all theoretical ones) and set a target recovery of ∼20% after which the performance plateaus. PMID:27150526
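The distinction MNXL draws between matched, violated, and nonaccessible crosslinks can be sketched for a single model as a three-way classification of the observed crosslinks. This is an illustrative toy with invented residue pairs and distances; the published score additionally weights the matched SASDs, and Jwalk is what computes the solvent-accessible paths:

```python
def classify_crosslinks(crosslinks, model_distances, cutoff=33.0):
    """Sort experimentally observed crosslinks into three bins for one model.

    crosslinks      -- residue pairs observed by XL-MS
    model_distances -- pair -> SASD in the model, or None when a lysine is
                       buried and no solvent-accessible path exists
    cutoff          -- maximum span of the crosslinker (angstroms)
    """
    matched, violated, nonaccessible = [], [], []
    for pair in crosslinks:
        d = model_distances.get(pair)
        if d is None:
            nonaccessible.append(pair)   # penalized by MNXL; invisible to
        elif d <= cutoff:                # Euclidean-distance-only scoring
            matched.append(pair)
        else:
            violated.append(pair)
    return matched, violated, nonaccessible

# Hypothetical example: three observed crosslinks scored against one model.
observed = [(12, 45), (12, 88), (45, 130)]
distances = {(12, 45): 21.5, (12, 88): 41.0, (45, 130): None}
m, v, n = classify_crosslinks(observed, distances)
```

A scoring function based only on Euclidean violations would treat the third crosslink as unscoreable, whereas counting it as nonaccessible is the extra signal the paper shows to matter.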

  20. Health Education and Symptom Flare Management Using a Video-Based m-Health System for Caring Women with IC/BPS.

    PubMed

    Lee, Ming-Huei; Wu, Huei-Ching; Tseng, Chien-Ming; Ko, Tsung-Liang; Weng, Tang-Jun; Chen, Yung-Fu

    2018-06-10

To assess the effectiveness of a video-based m-health system that provides physician-narrated videos for health education and symptom self-management in patients with IC/BPS. An m-health system was designed to provide videos for weekly health education and symptom flare self-management. The O'Leary-Sant index and VAS pain scale, as well as the SF-36 health survey, were administered to evaluate disease severity and quality of life (QoL), respectively. A total of 60 IC/BPS patients were recruited and randomly assigned to either a control group (30 patients) or a study group (30 patients) in the order in which they visited our urological clinic. Patients in both groups received regular treatments, while those in the study group received an additional video-based intervention. Statistical analyses compared the outcomes between baseline and post-intervention for both groups. The outcomes of the video-based intervention were also compared with those of the text-based intervention conducted in our previous study. After the video-based intervention, patients in the study group showed significant improvement in all disease severity and QoL assessments except the VAS pain scale, while no significant change was found in the control group. Moreover, the study group exhibited more significant net improvements than the control group in seven SF-36 constructs, the exception being mental health. The limitations include the short intervention duration (8 weeks) and the different study periods of the text-based and video-based interventions. Video-based intervention is effective in improving the QoL of IC/BPS patients and outperforms the text-based intervention even over a short intervention period. Copyright © 2018. Published by Elsevier Inc.

  1. Validation of a virtual reality-based robotic surgical skills curriculum.

    PubMed

    Connolly, Michael; Seligman, Johnathan; Kastenmeier, Andrew; Goldblatt, Matthew; Gould, Jon C

    2014-05-01

The clinical application of robotic-assisted surgery (RAS) is rapidly increasing. The da Vinci Surgical System™ is currently the only commercially available RAS system. The skills necessary to perform robotic surgery are distinct from those required for open and laparoscopic surgery. A validated laparoscopic surgical skills curriculum (Fundamentals of Laparoscopic Surgery, or FLS™) has transformed the way surgeons acquire laparoscopic skills. There is a need for a similar skills training and assessment tool specific to robotic surgery. Based on previously published data and expert opinion, we developed a robotic skills curriculum. We sought to evaluate this curriculum for evidence of construct validity (the ability to discriminate between users of different skill levels). Four experienced surgeons (>20 RAS procedures) and 20 novice surgeons (first-year medical students with no surgical or RAS experience) were evaluated. The curriculum comprised five tasks utilizing the da Vinci™ Skills Simulator (Pick and Place, Camera Targeting 2, Peg Board 2, Matchboard 2, and Suture Sponge 3). After an orientation to the robot and a period of acclimation in the simulator, all subjects completed three consecutive repetitions of each task. Computer-derived performance metrics included time, economy of motion, master workspace, instrument collisions, excessive force, distance of instruments out of view, drops, missed targets, and overall scores (a composite of all metrics). Experienced surgeons significantly outperformed novice surgeons on most metrics. Statistically significant differences were detected for each task in mean overall scores and mean time (in seconds) to completion. The proposed curriculum is a valid method of assessing and distinguishing robotic surgical skill levels on the da Vinci Si™ Surgical System. Further study is needed to establish proficiency levels and to demonstrate that training on the simulator with the proposed curriculum leads to improved robotic surgical performance in the operating room.

  2. An improved conjugate vaccine technology; induction of antibody responses to the tumor vasculature.

    PubMed

    Huijbers, Elisabeth J M; van Beijnum, Judy R; Lê, Chung T; Langman, Sofya; Nowak-Sliwinska, Patrycja; Mayo, Kevin H; Griffioen, Arjan W

    2018-05-17

The induction of an antibody response against self-antigens requires a conjugate vaccine technology, in which the self-antigen is conjugated to a foreign protein sequence, together with the co-application of a potent adjuvant. The choice of this foreign sequence is crucial, as a very strong antibody response towards it may compromise the anti-self immune response. Here, we aimed to optimize the conjugate design for vaccination against the tumor vasculature, using two different approaches. First, the immunogenicity of the previously employed bacterial thioredoxin (TRX) was reduced by using a truncated form (TRXtr). Second, the Escherichia coli proteome was scrutinized to identify alternative proteins suitable for use in a conjugate vaccine, based on their immunogenicity and their capacity to increase solubility. This technology was used for vaccination against a marker of the tumor vasculature, the well-known extra domain B (EDB) of fibronectin. We demonstrate that engineering of the foreign sequence of a conjugate vaccine can significantly improve antibody production. The TRXtr construct outperformed the one containing full-length TRX for the production of anti-self antibodies to EDB. In addition, efficient tumor growth inhibition was observed with the new TRXtr-EDB vaccine. Microvessel density was decreased and enhanced leukocyte infiltration was observed, indicative of an active immune response directed against the tumor vasculature. In summary, we have identified a truncated form of the foreign antigen TRX that can improve conjugate vaccine technology for the induction of anti-self antibody titers. This technology was named Immuno-Boost (I-Boost). Our findings are important for the clinical development of cancer vaccines directed against self-antigens, e.g. those selectively found in the tumor vasculature. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  3. Revising superior planning performance in chess players: the impact of time restriction and motivation aspects.

    PubMed

    Unterrainer, Josef Martin; Kaller, Christoph Philipp; Leonhart, Rainer; Rahm, Benjamin

    2011-01-01

In a previous study (Unterrainer, Kaller, Halsband, & Rahm, 2006), chess players outperformed non-chess players in the Tower of London planning task but exhibited disproportionately longer processing times. This pattern of results raises the question of whether chess players' planning capabilities are superior or whether the results reflect differences in the speed-accuracy trade-off between the groups, possibly attributable to sports motivation. The present study was designed to disambiguate these alternatives by implementing various constraints on planning time and by assessing self-reported motivation. In contrast to the previous study, chess players' performance was not superior, regardless of whether problems had to be solved with (Experiment 1) or without (Experiment 2) time limits. As expected, chess players reported higher overall trait and state motivation scores across both experiments. These findings revise the notion of superior planning performance in chess players. In consequence, they do not support the assumption of a general transfer of chess-related planning expertise to other cognitive domains, instead suggesting that superior performance may be possible only under specific circumstances, such as receiving competitive instructions.

  4. An Information Retrieval Approach for Robust Prediction of Road Surface States.

    PubMed

    Park, Jae-Hyung; Kim, Kwanho

    2017-01-28

Due to the increasing importance of reducing severe vehicle accidents on roads (especially on highways), the automatic identification of road surface conditions, and the provision of such information to drivers in advance, has recently been gaining significant momentum as a proactive solution for decreasing the number of vehicle accidents. In this paper, we first propose an information retrieval approach that identifies road surface states by combining conventional machine-learning techniques with moving-average methods. Specifically, when signal information is received from a radar system, our approach estimates the current state of the road surface based on similar instances observed previously, as measured by a given similarity function. The estimated state is then calibrated using the recently estimated states, to yield both effective and robust predictions. To validate the performance of the proposed approach, we established a real-world experimental setting on a section of actual highway in South Korea and compared it with conventional approaches in terms of accuracy. The experimental results show that the proposed approach successfully outperforms the previously developed methods.
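The retrieve-then-calibrate idea can be sketched in two steps: nearest-instance retrieval under a similarity function, followed by a majority vote over a moving window of recent estimates. The radar features, states, and the negative-squared-distance similarity below are invented for illustration and are not the authors' exact formulation:

```python
from collections import Counter, deque

def most_similar_state(history, signal):
    """Retrieve the road state of the most similar previously observed
    signal, using negative squared distance as the similarity function."""
    def similarity(a, b):
        return -sum((x - y) ** 2 for x, y in zip(a, b))
    _, best_state = max(history, key=lambda h: similarity(h[0], signal))
    return best_state

def calibrated_state(recent_states, window=5):
    """Smooth the raw estimates with a majority vote over a moving window,
    so that a one-off misclassification is less likely to reach drivers."""
    window_states = list(recent_states)[-window:]
    return Counter(window_states).most_common(1)[0][0]

# Synthetic history of (radar feature vector, labeled road state) instances.
history = [([0.9, 0.1], 'dry'), ([0.2, 0.8], 'icy'), ([0.5, 0.5], 'wet')]
recent = deque(maxlen=5)
for signal in ([0.88, 0.12], [0.85, 0.15], [0.3, 0.7], [0.9, 0.1]):
    recent.append(most_similar_state(history, signal))
state = calibrated_state(recent)
```

The calibration step is what buys the robustness: one 'icy' retrieval amid 'dry' readings does not flip the reported state.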

  5. Acoustic-tactile rendering of visual information

    NASA Astrophysics Data System (ADS)

    Silva, Pubudu Madhawa; Pappas, Thrasyvoulos N.; Atkins, Joshua; West, James E.; Hartmann, William M.

    2012-03-01

    In previous work, we have proposed a dynamic, interactive system for conveying visual information via hearing and touch. The system is implemented with a touch screen that allows the user to interrogate a two-dimensional (2-D) object layout by active finger scanning while listening to spatialized auditory feedback. Sound is used as the primary source of information for object localization and identification, while touch is used both for pointing and for kinesthetic feedback. Our previous work considered shape and size perception of simple objects via hearing and touch. The focus of this paper is on the perception of a 2-D layout of simple objects with identical size and shape. We consider the selection and rendition of sounds for object identification and localization. We rely on the head-related transfer function for rendering sound directionality, and consider variations of sound intensity and tempo as two alternative approaches for rendering proximity. Subjective experiments with visually-blocked subjects are used to evaluate the effectiveness of the proposed approaches. Our results indicate that intensity outperforms tempo as a proximity cue, and that the overall system for conveying a 2-D layout is quite promising.

  6. Emotional Intelligence and cognitive abilities - associations and sex differences.

    PubMed

    Pardeller, Silvia; Frajo-Apor, Beatrice; Kemmler, Georg; Hofer, Alex

    2017-09-01

In order to expand on previous research, this cross-sectional study investigated the relationship between Emotional Intelligence (EI) and cognitive abilities in healthy adults with a special focus on potential sex differences. EI was assessed by means of the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), whereas cognitive abilities were investigated using the Brief Assessment of Cognition in Schizophrenia (BACS), which measures key aspects of cognitive functioning, i.e. verbal memory, working memory, motor speed, verbal fluency, attention and processing speed, and reasoning and problem solving. 137 subjects (65% female) with a mean age of 38.7 ± 11.8 years were included in the study. While males and females were comparable with regard to EI, men achieved significantly higher BACS composite scores and outperformed women in the BACS subscales motor speed, attention and processing speed, and reasoning and problem solving. Verbal fluency significantly predicted EI, whereas the MSCEIT subscale understanding emotions significantly predicted the BACS composite score. Our findings support previous research and emphasize the relevance of considering cognitive abilities when assessing ability EI in healthy individuals.

  7. Fast Brain Plasticity during Word Learning in Musically-Trained Children.

    PubMed

    Dittinger, Eva; Chobert, Julie; Ziegler, Johannes C; Besson, Mireille

    2017-01-01

Children learn new words every day and this ability requires auditory perception, phoneme discrimination, attention, associative learning and semantic memory. Based on previous results showing that some of these functions are enhanced by music training, we investigated learning of novel words through picture-word associations in musically-trained and control children (8-12 years old) to determine whether music training would positively influence word learning. Results showed that musically-trained children outperformed controls in a learning paradigm that included picture-sound matching and semantic associations. Moreover, the differences between unexpected and expected learned words, as reflected by the N200 and N400 effects, were larger in children with music training compared to controls after only 3 min of learning the meaning of novel words. In line with previous results in adults, these findings clearly demonstrate a correlation between music training and better word learning. It is argued that these benefits reflect both bottom-up and top-down influences. The present learning paradigm might provide a useful dynamic diagnostic tool to determine which perceptive and cognitive functions are impaired in children with learning difficulties.

  8. A Model-Based Approach for Identifying Signatures of Ancient Balancing Selection in Genetic Data

    PubMed Central

    DeGiorgio, Michael; Lohmueller, Kirk E.; Nielsen, Rasmus

    2014-01-01

    While much effort has focused on detecting positive and negative directional selection in the human genome, relatively little work has been devoted to balancing selection. This lack of attention is likely due to the paucity of sophisticated methods for identifying sites under balancing selection. Here we develop two composite likelihood ratio tests for detecting balancing selection. Using simulations, we show that these methods outperform competing methods under a variety of assumptions and demographic models. We apply the new methods to whole-genome human data, and find a number of previously-identified loci with strong evidence of balancing selection, including several HLA genes. Additionally, we find evidence for many novel candidates, the strongest of which is FANK1, an imprinted gene that suppresses apoptosis, is expressed during meiosis in males, and displays marginal signs of segregation distortion. We hypothesize that balancing selection acts on this locus to stabilize the segregation distortion and negative fitness effects of the distorter allele. Thus, our methods are able to reproduce many previously-hypothesized signals of balancing selection, as well as discover novel interesting candidates. PMID:25144706

  9. A model-based approach for identifying signatures of ancient balancing selection in genetic data.

    PubMed

    DeGiorgio, Michael; Lohmueller, Kirk E; Nielsen, Rasmus

    2014-08-01

    While much effort has focused on detecting positive and negative directional selection in the human genome, relatively little work has been devoted to balancing selection. This lack of attention is likely due to the paucity of sophisticated methods for identifying sites under balancing selection. Here we develop two composite likelihood ratio tests for detecting balancing selection. Using simulations, we show that these methods outperform competing methods under a variety of assumptions and demographic models. We apply the new methods to whole-genome human data, and find a number of previously-identified loci with strong evidence of balancing selection, including several HLA genes. Additionally, we find evidence for many novel candidates, the strongest of which is FANK1, an imprinted gene that suppresses apoptosis, is expressed during meiosis in males, and displays marginal signs of segregation distortion. We hypothesize that balancing selection acts on this locus to stabilize the segregation distortion and negative fitness effects of the distorter allele. Thus, our methods are able to reproduce many previously-hypothesized signals of balancing selection, as well as discover novel interesting candidates.

  10. Heating and flooding: A unified approach for rapid generation of free energy surfaces

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Cuendet, Michel A.; Tuckerman, Mark E.

    2012-07-01

    We propose a general framework for the efficient sampling of conformational equilibria in complex systems and the generation of associated free energy hypersurfaces in terms of a set of collective variables. The method is a strategic synthesis of the adiabatic free energy dynamics approach, previously introduced by us and others, and existing schemes using Gaussian-based adaptive bias potentials to disfavor previously visited regions. In addition, we suggest sampling the thermodynamic force instead of the probability density to reconstruct the free energy hypersurface. All these elements are combined into a robust extended phase-space formalism that can be easily incorporated into existing molecular dynamics packages. The unified scheme is shown to outperform both metadynamics and adiabatic free energy dynamics in generating two-dimensional free energy surfaces for several example cases including the alanine dipeptide in the gas and aqueous phases and the met-enkephalin oligopeptide. In addition, the method can efficiently generate higher dimensional free energy landscapes, which we demonstrate by calculating a four-dimensional surface in the Ramachandran angles of the gas-phase alanine tripeptide.
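As a rough illustration of the Gaussian adaptive-bias ("flooding") ingredient the abstract builds on, here is a generic metadynamics-style sketch in one collective variable; the toy parameters and the double-well free energy are assumptions for illustration, not the authors' unified scheme or their thermodynamic-force estimator:

```python
import numpy as np

w, sigma = 0.1, 0.2                # hill height and width (arbitrary units)
centers = []                       # CV values where hills were deposited

def bias(s):
    """Accumulated Gaussian bias potential at CV value s."""
    return sum(w * np.exp(-(s - c) ** 2 / (2 * sigma ** 2)) for c in centers)

def bias_grad(s):
    """Gradient of the accumulated bias at s."""
    return sum(-w * (s - c) / sigma ** 2 * np.exp(-(s - c) ** 2 / (2 * sigma ** 2))
               for c in centers)

rng = np.random.default_rng(0)
s = -1.0
for step in range(2000):           # overdamped Langevin dynamics on F + bias
    grad_F = 4 * s * (s ** 2 - 1)  # F(s) = (s^2 - 1)^2, a double well
    s += -0.01 * (grad_F + bias_grad(s)) + 0.05 * rng.normal()
    if step % 20 == 0:
        centers.append(s)          # flooding: disfavor visited regions

# At convergence, -bias(s) approximates F(s) up to an additive constant.
grid = np.linspace(-1.5, 1.5, 7)
estimate = [-bias(x) for x in grid]
```

The growing bias gradually fills the well the walker starts in, pushing it over the barrier; the negative of the accumulated bias then serves as a free energy estimate.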

  11. Efficient Record Linkage Algorithms Using Complete Linkage Clustering.

    PubMed

    Mamun, Abdullah-Al; Aseltine, Robert; Rajasekaran, Sanguthevar

    2016-01-01

    Datasets from different agencies often contain records of the same individuals. Linking these datasets to identify all the records belonging to the same individuals is a crucial and challenging problem, especially given the large volumes of data. Many available record linkage algorithms suffer from either time inefficiency or low accuracy in finding matches and non-matches among the records. In this paper we propose efficient and reliable sequential and parallel algorithms for the record linkage problem employing hierarchical clustering methods. We employ complete linkage hierarchical clustering algorithms to address this problem. In addition to hierarchical clustering, we also use two other techniques: elimination of duplicate records and blocking. Our algorithms use sorting as a sub-routine to identify identical copies of records. We have tested our algorithms on datasets with millions of synthetic records. Experimental results show that our algorithms achieve nearly 100% accuracy. Parallel implementations achieve almost linear speedups. Time complexities of these algorithms do not exceed those of the previous best-known algorithms. Our proposed algorithms outperform the previous best-known algorithms in terms of accuracy while consuming reasonable run times.
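The pipeline summarized above (exact-duplicate removal via sorting, blocking, then complete-linkage clustering within blocks) can be sketched roughly as follows; the blocking key, field-wise distance function and threshold are illustrative assumptions, not the authors' exact implementation:

```python
# Minimal sketch of a record linkage pipeline: dedup by sorting, block on a
# cheap key, then single-pass complete-linkage clustering within each block.

def distance(a, b):
    """Field-wise mismatch count between two records (illustrative metric)."""
    return sum(x != y for x, y in zip(a, b))

def link_records(records, max_dist=1):
    records = sorted(set(records))          # sorting exposes exact duplicates
    blocks = {}
    for rec in records:                     # block on first letter of surname
        blocks.setdefault(rec[0][:1], []).append(rec)
    clusters = []
    for block in blocks.values():
        block_clusters = []
        for rec in block:
            # complete linkage: join a cluster only if rec is close to ALL members
            for cl in block_clusters:
                if all(distance(rec, m) <= max_dist for m in cl):
                    cl.append(rec)
                    break
            else:
                block_clusters.append([rec])
        clusters.extend(block_clusters)
    return clusters

people = [("smith", "john", "1980"), ("smith", "jon", "1980"),
          ("smith", "john", "1980"), ("doe", "jane", "1975")]
linked = link_records(people)   # exact duplicate dropped, near-match merged
```

Blocking keeps the pairwise comparisons within small candidate sets, which is what makes this approach tractable on millions of records.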

  12. Remember Hard But Think Softly: Metaphorical Effects of Hardness/Softness on Cognitive Functions.

    PubMed

    Xie, Jiushu; Lu, Zhi; Wang, Ruiming; Cai, Zhenguang G

    2016-01-01

    Previous studies have found that bodily stimulation, such as the experience of hardness, biases social judgment and evaluation via metaphorical association; however, it remains unclear whether bodily stimulation also affects cognitive functions, such as memory and creativity. The current study used the metaphorical associations between "hard" and "rigid" and between "soft" and "flexible" in Chinese to investigate whether the experience of hardness affects cognitive functions whose performance depends respectively on rigidity (memory) and flexibility (creativity). In Experiment 1, we found that Chinese-speaking participants performed better at recalling previously memorized words while sitting on a hard-surface stool (the hard condition) than on a cushioned one (the soft condition). In Experiment 2, participants sitting on a cushioned stool outperformed those sitting on a hard-surface stool on a Chinese riddle task, which required creative/flexible thinking, but not on an analogical reasoning task, which required both rigid and flexible thinking. The results suggest that the hardness experience affects cognitive functions that are metaphorically associated with rigidity or flexibility. They support the embodiment proposition that cognitive functions and representations can be grounded in bodily states via metaphorical associations.

  13. Fast Brain Plasticity during Word Learning in Musically-Trained Children

    PubMed Central

    Dittinger, Eva; Chobert, Julie; Ziegler, Johannes C.; Besson, Mireille

    2017-01-01

    Children learn new words every day, and this ability requires auditory perception, phoneme discrimination, attention, associative learning and semantic memory. Based on previous results showing that some of these functions are enhanced by music training, we investigated learning of novel words through picture-word associations in musically-trained and control children (8–12 years old) to determine whether music training would positively influence word learning. Results showed that musically-trained children outperformed controls in a learning paradigm that included picture-sound matching and semantic associations. Moreover, the differences between unexpected and expected learned words, as reflected by the N200 and N400 effects, were larger in children with music training compared to controls after only 3 min of learning the meaning of novel words. In line with previous results in adults, these findings clearly demonstrate a correlation between music training and better word learning. It is argued that these benefits reflect both bottom-up and top-down influences. The present learning paradigm might provide a useful dynamic diagnostic tool to determine which perceptive and cognitive functions are impaired in children with learning difficulties. PMID:28553213

  14. Efficient Record Linkage Algorithms Using Complete Linkage Clustering

    PubMed Central

    Mamun, Abdullah-Al; Aseltine, Robert; Rajasekaran, Sanguthevar

    2016-01-01

    Datasets from different agencies often contain records of the same individuals. Linking these datasets to identify all the records belonging to the same individuals is a crucial and challenging problem, especially given the large volumes of data. Many available record linkage algorithms suffer from either time inefficiency or low accuracy in finding matches and non-matches among the records. In this paper we propose efficient and reliable sequential and parallel algorithms for the record linkage problem employing hierarchical clustering methods. We employ complete linkage hierarchical clustering algorithms to address this problem. In addition to hierarchical clustering, we also use two other techniques: elimination of duplicate records and blocking. Our algorithms use sorting as a sub-routine to identify identical copies of records. We have tested our algorithms on datasets with millions of synthetic records. Experimental results show that our algorithms achieve nearly 100% accuracy. Parallel implementations achieve almost linear speedups. Time complexities of these algorithms do not exceed those of the previous best-known algorithms. Our proposed algorithms outperform the previous best-known algorithms in terms of accuracy while consuming reasonable run times. PMID:27124604

  15. An Information Retrieval Approach for Robust Prediction of Road Surface States

    PubMed Central

    Park, Jae-Hyung; Kim, Kwanho

    2017-01-01

    Due to the increasing importance of reducing severe vehicle accidents on roads, and especially on highways, the automatic identification of road surface conditions, and the provisioning of such information to drivers in advance, have recently been gaining significant momentum as a proactive solution to decrease the number of vehicle accidents. In this paper, we propose an information retrieval approach that identifies road surface states by combining conventional machine-learning techniques and moving average methods. Specifically, when signal information is received from a radar system, our approach estimates the current state of the road surface from similar instances observed previously, using a given similarity function. The estimated state is then calibrated using the recently estimated states to yield both effective and robust prediction results. To validate the performance of the proposed approach, we established a real-world experimental setting on a section of actual highway in South Korea and conducted a comparison with conventional approaches in terms of accuracy. The experimental results show that the proposed approach successfully outperforms the previously developed methods. PMID:28134859
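The two-stage idea described above (similarity-based retrieval of past signal instances, then moving-average calibration of the running estimates) can be sketched like this; the feature vectors, the 0/1 state encoding (0 = dry, 1 = wet) and all parameters below are invented for illustration, not the authors' system:

```python
import numpy as np

# Stage 1: estimate the current state from the most similar past instances.
def estimate_state(signal, history, k=3):
    """Return the majority state among the k most similar past signals."""
    dists = sorted((np.linalg.norm(signal - s), state) for s, state in history)
    labels = [state for _, state in dists[:k]]
    return max(set(labels), key=labels.count)

# Stage 2: calibrate with a moving average over recent estimates.
def calibrate(estimates, window=3):
    recent = estimates[-window:]
    return int(round(sum(recent) / len(recent)))

history = [(np.array([1.0, 0.1]), 0), (np.array([0.2, 0.9]), 1),
           (np.array([0.9, 0.2]), 0), (np.array([0.1, 1.1]), 1)]

estimates = []
for sig in (np.array([0.95, 0.15]), np.array([1.0, 0.2]), np.array([0.15, 1.0])):
    estimates.append(estimate_state(sig, history))

smoothed = calibrate(estimates)   # a single outlier is damped by the window
```

The calibration step is what gives the robustness the abstract mentions: a one-off retrieval error is outvoted by the surrounding estimates.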

  16. Cost-effective handoff scheme based on mobility-aware dual pointer forwarding in proxy mobile IPv6 networks.

    PubMed

    Son, Seungsik; Jeong, Jongpil

    2014-01-01

    In this paper, a mobility-aware Dual Pointer Forwarding scheme (mDPF) is applied in Proxy Mobile IPv6 (PMIPv6) networks. The movement of a Mobile Node (MN) is classified as intra-domain or inter-domain handoff. When the MN moves, this scheme can reduce the high signaling overhead of intra-domain and inter-domain handoffs, because the Local Mobility Anchor (LMA) and Mobile Access Gateway (MAG) are connected by pointer chains. In other words, a handoff between the previously attached MAG (pMAG) and the newly attached MAG (nMAG) indicates low mobility, whereas a handoff between the previously attached LMA (pLMA) and the newly attached LMA (nLMA) indicates high mobility. Based on these mobility-aware binding updates, the overhead of packet delivery can be reduced. We also analyse the binding update cost and packet delivery cost for route optimization, based on a mathematical analytic model. Analytical results show that our mDPF outperforms PMIPv6 and the other pointer forwarding schemes in terms of reducing the total cost of signaling.

  17. Plasticity in older adults' theory of mind performance: the impact of motivation.

    PubMed

    Zhang, Xin; Lecce, Serena; Ceccato, Irene; Cavallini, Elena; Zhang, Linfang; Chen, Tianyong

    2017-09-08

    Recently, motivation has been found to attenuate the age-related decline in Theory of Mind (ToM) performance (i.e. faux pas recognition). However, whether or not this effect could be generalized to other ToM tasks is still unknown. In the present study, we investigated whether and how motivation could enhance older adults' performance and reduce age differences in ToM tasks (Faux Pas vs. Animation task) that differ in familiarity. Following a previous paradigm, 171 Chinese adults (87 younger adults and 84 older adults) were recruited, and we experimentally manipulated the level of perceived closeness between participants and the experimenter before administering the ToM tasks in order to enhance participants' motivation. Results showed that, for the Faux Pas task, we replicated previous findings such that older adults under the enhanced motivation conditions performed equally well as younger adults. Conversely, for the Animation task, younger adults outperformed older adults, regardless of motivation. These results indicate that motivation can enhance older adults' performance in ToM tasks, however, this beneficial effect cannot be generalized across ToM tasks.

  18. An evaluation of ferrihydrite- and Metsorb™-DGT techniques for measuring oxyanion species (As, Se, V, P): effective capacity, competition and diffusion coefficients.

    PubMed

    Price, Helen L; Teasdale, Peter R; Jolley, Dianne F

    2013-11-25

    This study investigated several knowledge gaps with respect to the diffusive gradients in thin films (DGT) technique for the measurement of oxyanions (As(III), As(V), Se(IV), Se(VI), PO4(3-), and V(V)) using the ferrihydrite and Metsorb™ binding layers. Elution efficiencies for each binding layer were higher with 1:20 dilutions, as analytical interferences for ICP-MS were minimised. Diffusion coefficients measured by diffusion cell and by DGT time-series experiments agreed well with each other and generally agreed with previously reported values, although a range of diffusion coefficients has been reported for inorganic As and Se species. The relative binding affinity for both ferrihydrite and Metsorb™ was PO4(3-) ≈ As(V) > V(V) ≈ As(III) > Se(IV) > Se(VI), and effective binding capacities were measured in single-ion solutions and in spiked synthetic freshwater and seawater, informing practical decisions about DGT monitoring. Under the conditions tested, the performance of the ferrihydrite and Metsorb™ binding layers was directly comparable for As(V), As(III), Se(IV), V(V) and PO4(3-) over deployments of ≤ 2 days in both freshwater and seawater. In order to return quantitative data for several analytes, we recommend that the DGT method using either ferrihydrite or Metsorb™ be deployed for a maximum of 2 days in marine waters likely to contain high levels of the most strongly adsorbing oxyanion contaminants. The high pH, the competitive ions present in seawater and the identity of co-adsorbing ions affect the capacity of each binding layer for the analytes of interest. In freshwaters, longer deployment times can be considered, but the concentration and identity of co-adsorbing ions may impact quantitative uptake of Se(IV). This study found that ferrihydrite-DGT outperformed Metsorb-DGT, while previous studies have found the opposite, with variation in the masses of binding material used being a likely reason. Clearly, preparation of both binding layers should always be optimised to produce the highest capacity possible, especially for seawater deployments. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.

  19. 3D-Printed Permanent Magnets Outperform Conventional Versions, Conserve Rare Materials

    ScienceCinema

    Paranthaman, Parans

    2018-06-13

    Researchers at the Department of Energy’s Oak Ridge National Laboratory have demonstrated that permanent magnets produced by additive manufacturing can outperform bonded magnets made using traditional techniques while conserving critical materials. The project is part of DOE’s Critical Materials Institute (CMI), which seeks ways to eliminate and reduce reliance on rare earth metals and other materials critical to the success of clean energy technologies.

  20. Analyzing Enron Data: Bitmap Indexing Outperforms MySQL Queries by Several Orders of Magnitude

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stockinger, Kurt; Rotem, Doron; Shoshani, Arie

    2006-01-28

    FastBit is an efficient, compressed bitmap indexing technology that was developed in our group. In this report we evaluate the performance of MySQL and FastBit for analyzing the email traffic of the Enron dataset. The first finding shows that materializing the join results of several tables significantly improves the query performance. The second finding shows that FastBit outperforms MySQL by several orders of magnitude.
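The core bitmap-indexing idea behind such results can be illustrated in a few lines (a schematic equality-encoded index on toy data; FastBit's actual word-aligned compressed implementation is far more involved):

```python
# One bitmap per distinct column value; bit i is set iff row i holds that
# value. Conjunctive queries then reduce to cheap bitwise ANDs instead of
# row-by-row scans.

def build_index(column):
    index = {}
    for row, value in enumerate(column):
        index[value] = index.get(value, 0) | (1 << row)
    return index

# Toy stand-in for email metadata columns (not actual Enron data)
senders = ["lay", "skilling", "lay", "fastow", "lay"]
years   = [2000, 2001, 2001, 2001, 2001]
idx_from = build_index(senders)
idx_year = build_index(years)

# "emails from 'lay' in 2001" becomes a single AND over two bitmaps
hits = idx_from["lay"] & idx_year[2001]
rows = [i for i in range(len(senders)) if hits >> i & 1]
```

Because the AND touches whole machine words at a time, the per-row cost is tiny, which is the intuition behind the orders-of-magnitude speedups reported above.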

  1. Genome-wide assessment of differential translations with ribosome profiling data.

    PubMed

    Xiao, Zhengtao; Zou, Qin; Liu, Yu; Yang, Xuerui

    2016-04-04

    The closely regulated process of mRNA translation is crucial for precise control of protein abundance and quality. Ribosome profiling, a combination of ribosome foot-printing and RNA deep sequencing, has been used in a large variety of studies to quantify genome-wide mRNA translation. Here, we developed Xtail, an analysis pipeline tailored for ribosome profiling data that comprehensively and accurately identifies differentially translated genes in pairwise comparisons. Applied to simulated and real datasets, Xtail exhibits high sensitivity with minimal false-positive rates, outperforming existing methods in the accuracy of quantifying differential translation. With published ribosome profiling datasets, Xtail not only reveals differentially translated genes that make biological sense, but also uncovers new events of differential translation in human cancer cells upon mTOR signalling perturbation and in human primary macrophages upon interferon gamma (IFN-γ) treatment. This demonstrates the value of Xtail in providing novel insights into the molecular mechanisms that involve translational dysregulation.

  2. A watershed model of individual differences in fluid intelligence.

    PubMed

    Kievit, Rogier A; Davis, Simon W; Griffiths, John; Correia, Marta M; Cam-Can; Henson, Richard N

    2016-10-01

    Fluid intelligence is a crucial cognitive ability that predicts key life outcomes across the lifespan. Strong empirical links exist between fluid intelligence and processing speed on the one hand, and white matter integrity and processing speed on the other. We propose a watershed model that integrates these three explanatory levels in a principled manner in a single statistical model, with processing speed and white matter figuring as intermediate endophenotypes. We fit this model in a large (N=555) adult lifespan cohort from the Cambridge Centre for Ageing and Neuroscience (Cam-CAN) using multiple measures of processing speed, white matter health and fluid intelligence. The model fit the data well, outperforming competing models and providing evidence for a many-to-one mapping between white matter integrity, processing speed and fluid intelligence. The model can be naturally extended to integrate other cognitive domains, endophenotypes and genotypes. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Robust decentralized controller for minimizing coupling effect in single inductor multiple output DC-DC converter operating in continuous conduction mode.

    PubMed

    Medeiros, Renan Landau Paiva de; Barra, Walter; Bessa, Iury Valente de; Chaves Filho, João Edgar; Ayres, Florindo Antonio de Cavalho; Neves, Cleonor Crescêncio das

    2018-02-01

    This paper describes a novel robust decentralized control design methodology for a single inductor multiple output (SIMO) DC-DC converter. Based on a nominal multiple input multiple output (MIMO) plant model and performance requirements, a pairing input-output analysis is performed to select the suitable input to control each output aiming to attenuate the loop coupling. Thus, the plant uncertainty limits are selected and expressed in interval form with parameter values of the plant model. A single inductor dual output (SIDO) DC-DC buck converter board is developed for experimental tests. The experimental results show that the proposed methodology can maintain a desirable performance even in the presence of parametric uncertainties. Furthermore, the performance indexes calculated from experimental data show that the proposed methodology outperforms classical MIMO control techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Heat transfer analysis of catheters used for localized tissue cooling to attenuate reperfusion injury.

    PubMed

    Merrill, Thomas L; Mitchell, Jennifer E; Merrill, Denise R

    2016-08-01

    Recent revascularization success for ischemic stroke patients using stentrievers has created a new opportunity for therapeutic hypothermia. By providing short-term localized tissue cooling, interventional catheters can be used to reduce reperfusion injury and improve neurological outcomes. Using experimental testing and a well-established heat exchanger design approach, the ɛ-NTU method, this paper examines the cooling performance of commercially available catheters as a function of four practical parameters: (1) infusion flow rate, (2) catheter location in the body, (3) catheter configuration and design, and (4) cooling approach. While saline batch cooling outperformed closed-loop autologous blood cooling at all equivalent flow rates in terms of lower delivered temperatures and cooling capacity, hemodilution, both systemic and local, remains a concern. For clinicians and engineers this paper provides insights for the selection, design, and operation of commercially available catheters used for localized tissue cooling. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  5. Evaluation of nine popular de novo assemblers in microbial genome assembly.

    PubMed

    Forouzan, Esmaeil; Maleki, Masoumeh Sadat Mousavi; Karkhane, Ali Asghar; Yakhchali, Bagher

    2017-12-01

    Next generation sequencing (NGS) technologies are revolutionizing biology, with Illumina being the most popular NGS platform. Short read assembly is a critical part of most genome studies using NGS. Hence, in this study, the performance of nine well-known assemblers was evaluated in the assembly of seven different microbial genomes. The effects of different read coverages and k-mer parameters on assembly quality were also evaluated on both simulated and real read datasets. Our results show that the performance of assemblers on real and simulated datasets can differ significantly, mainly because of coverage bias. According to outputs on actual read datasets, for all studied read coverages (of 7×, 25× and 100×), SPAdes and IDBA-UD clearly outperformed other assemblers based on NGA50 and accuracy metrics. Velvet is the most conservative assembler with the lowest NGA50 and error rate. Copyright © 2017. Published by Elsevier B.V.

  6. The effect of action video game playing on sensorimotor learning: Evidence from a movement tracking task.

    PubMed

    Gozli, Davood G; Bavelier, Daphne; Pratt, Jay

    2014-10-12

    Research on the impact of action video game playing has revealed performance advantages on a wide range of perceptual and cognitive tasks. It is not known, however, if playing such games confers similar advantages in sensorimotor learning. To address this issue, the present study used a manual motion-tracking task that allowed for a sensitive measure of both accuracy and improvement over time. When the target motion pattern was consistent over trials, gamers improved with a faster rate and eventually outperformed non-gamers. Performance between the two groups, however, did not differ initially. When the target motion was inconsistent, changing on every trial, results revealed no difference between gamers and non-gamers. Together, our findings suggest that video game playing confers no reliable benefit in sensorimotor control, but it does enhance sensorimotor learning, enabling superior performance in tasks with consistent and predictable structure. Copyright © 2014. Published by Elsevier B.V.

  7. Grouped gene selection and multi-classification of acute leukemia via new regularized multinomial regression.

    PubMed

    Li, Juntao; Wang, Yanyan; Jiang, Tao; Xiao, Huimin; Song, Xuekun

    2018-05-09

    Diagnosing acute leukemia is the necessary prerequisite to treating it. Multi-class classification of acute leukemia gene expression data, which covers B-cell acute lymphoblastic leukemia (BALL), T-cell acute lymphoblastic leukemia (TALL) and acute myeloid leukemia (AML), is helpful for diagnosis. However, selecting cancer-causing genes is a challenging problem in performing multi-classification. In this paper, weighted gene co-expression networks are employed to divide the genes into groups. Based on these groups, a new regularized multinomial regression with an overlapping group lasso penalty (MROGL) is presented to simultaneously perform multi-classification and select gene groups. By applying this method to three-class acute leukemia data, grouped genes that work synergistically are identified, and the overlapping genes shared by different groups are also highlighted. Moreover, MROGL outperforms five other methods in multi-classification accuracy. Copyright © 2017. Published by Elsevier B.V.
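For orientation, a standard overlapping group lasso formulation of a penalized multinomial objective looks as follows (generic background only; the paper's exact weights, groups and parameterization may differ):

```latex
\min_{\beta}\; -\frac{1}{n}\sum_{i=1}^{n}\log \Pr\!\left(y_i \mid x_i;\beta\right)
\;+\;\lambda \sum_{g \in \mathcal{G}} w_g \left\lVert \beta_g \right\rVert_2
```

where the groups g ∈ G would come from the co-expression network modules and may share genes (overlap), and the unsquared group-wise ℓ2 norms drive entire groups of coefficients to zero together, which is what performs the grouped gene selection.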

  8. Real-time biscuit tile image segmentation method based on edge detection.

    PubMed

    Matić, Tomislav; Aleksi, Ivan; Hocenski, Željko; Kraus, Dieter

    2018-05-01

    In this paper we propose a novel real-time Biscuit Tile Segmentation (BTS) method for images from a ceramic tile production line. The BTS method is based on signal change detection and contour tracing, with the main goal of separating tile pixels from the background in images captured on the production line. Usually, human operators visually inspect and classify produced ceramic tiles. Computer vision and image processing techniques can automate the visual inspection process if they fulfill real-time requirements. An important step in this process is real-time segmentation of tile pixels. The BTS method is implemented for parallel execution on a GPU device to satisfy the real-time constraints of the tile production line. The BTS method outperforms 2D threshold-based methods, 1D edge detection methods and contour-based methods. The proposed BTS method is in use in a biscuit tile production line. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Towards large-scale FAME-based bacterial species identification using machine learning techniques.

    PubMed

    Slabbinck, Bram; De Baets, Bernard; Dawyndt, Peter; De Vos, Paul

    2009-05-01

    In the last decade, bacterial taxonomy witnessed a huge expansion. The swift pace of bacterial species (re-)definitions has a serious impact on the accuracy and completeness of first-line identification methods. Consequently, back-end identification libraries need to be synchronized with the List of Prokaryotic names with Standing in Nomenclature. In this study, we focus on bacterial fatty acid methyl ester (FAME) profiling as a broadly used first-line identification method. From the BAME@LMG database, we selected FAME profiles of individual strains belonging to the genera Bacillus, Paenibacillus and Pseudomonas. Only those profiles resulting from standard growth conditions were retained. The corresponding data set covers 74, 44 and 95 validly published bacterial species, respectively, represented by 961, 378 and 1673 standard FAME profiles. Through the application of machine learning techniques in a supervised strategy, different computational models were built for genus and species identification. Three techniques were considered: artificial neural networks, random forests and support vector machines. Nearly perfect identification was achieved at the genus level. Notwithstanding the known limited discriminative power of FAME analysis for species identification, the computational models yielded good species identification results for the three genera. For Bacillus, Paenibacillus and Pseudomonas, random forests achieved sensitivity values of 0.847, 0.901 and 0.708, respectively. The random forest models outperform those of the other machine learning techniques. Moreover, our machine learning approach also outperformed the Sherlock MIS (MIDI Inc., Newark, DE, USA). These results show that machine learning proves very useful for FAME-based bacterial species identification. Besides good bacterial identification at the species level, speed and ease of taxonomic synchronization are major advantages of this computational species identification strategy.
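For orientation, the random-forest idea used in the study above can be sketched as bagged one-feature decision stumps on synthetic two-class data (the study itself used full random forests on real FAME profiles; everything below is an illustrative stand-in):

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_stump(X, y):
    """Exhaustively pick the (feature, threshold, polarity) split with the
    fewest training errors."""
    best = None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            above = (X[:, f] > t).astype(int)
            for pred, pol in ((above, True), (1 - above, False)):
                err = int(np.sum(pred != y))
                if best is None or err < best[0]:
                    best = (err, f, t, pol)
    return best[1:]

def predict_stump(stump, X):
    f, t, pol = stump
    above = (X[:, f] > t).astype(int)
    return above if pol else 1 - above

def fit_forest(X, y, n_trees=25):
    """Bagging: each stump is trained on a bootstrap resample."""
    samples = (rng.integers(0, len(X), len(X)) for _ in range(n_trees))
    return [fit_stump(X[i], y[i]) for i in samples]

def predict_forest(forest, X):
    votes = np.mean([predict_stump(s, X) for s in forest], axis=0)
    return (votes > 0.5).astype(int)

# Two synthetic "taxa" whose profiles differ along feature 0 only
X = np.vstack([rng.normal(0.0, 0.3, (30, 4)), rng.normal(0.0, 0.3, (30, 4))])
X[30:, 0] += 2.0
y = np.array([0] * 30 + [1] * 30)

forest = fit_forest(X, y)
acc = float(np.mean(predict_forest(forest, X) == y))
```

Averaging many weak, independently resampled learners is what gives random forests their robustness on noisy, high-dimensional profiles such as FAME data.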

  10. A Practical Guide to Estimating the Heritability of Pathogen Traits.

    PubMed

    Mitov, Venelin; Stadler, Tanja

    2018-01-09

    Pathogen traits, such as the virulence of an infection, can vary significantly between patients. A major challenge is to measure the extent to which genetic differences between infecting strains explain the observed variation of the trait. This is quantified by the trait's broad-sense heritability, H2. A recent discrepancy between estimates of the heritability of HIV-virulence has opened a debate on the estimators' accuracy. Here, we show that the discrepancy originates from model limitations and important lifecycle differences between sexually reproducing organisms and transmittable pathogens. In particular, current quantitative genetics methods, such as donor-recipient regression (DR) of surveyed serodiscordant couples and the phylogenetic mixed model (PMM), are prone to underestimate H2, because they neglect or do not fit the loss of resemblance between transmission partners caused by within-host evolution. In a phylogenetic analysis of 8,483 HIV patients from the UK, we show that the phenotypic correlation between transmission partners decays with the amount of within-host evolution of the virus. We reproduce this pattern in toy-model simulations and show that a phylogenetic Ornstein-Uhlenbeck model (POUMM) outperforms the PMM in capturing this correlation pattern and in quantifying H2. In particular, we show that POUMM outperforms PMM even in simulations without selection, as it captures the mentioned correlation pattern, a fact that has not been appreciated until now. By cross-validating the POUMM estimates with ANOVA on closest phylogenetic pairs (ANOVA-CPP), we obtain H2≈0.2, meaning about 20% of the variation in HIV-virulence is explained by the virus genome for both European and African data. © The Author(s) 2018. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
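As background, the Ornstein-Uhlenbeck process at the core of POUMM-type models evolves a trait z along each branch of the phylogeny as (standard form; the paper's exact parameterization may differ):

```latex
\mathrm{d}z(t) \;=\; \alpha\,\bigl[\theta - z(t)\bigr]\,\mathrm{d}t \;+\; \sigma\,\mathrm{d}W(t)
```

with α pulling the trait toward an optimum θ and σ scaling random drift; as α → 0 this reduces to the Brownian motion underlying the PMM, which is why the OU form can capture a decay of resemblance between transmission partners that the PMM cannot.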

  11. Inhibition of DNA damage repair by the CDK4/6 inhibitor palbociclib delays irradiated intracranial atypical teratoid rhabdoid tumor and glioblastoma xenograft regrowth.

    PubMed

    Hashizume, Rintaro; Zhang, Ali; Mueller, Sabine; Prados, Michael D; Lulla, Rishi R; Goldman, Stewart; Saratsis, Amanda M; Mazar, Andrew P; Stegh, Alexander H; Cheng, Shi-Yuan; Horbinski, Craig; Haas-Kogan, Daphne A; Sarkaria, Jann N; Waldman, Todd; James, C David

    2016-11-01

    Radiation therapy is the most commonly used postsurgical treatment for primary malignant brain tumors. Consequently, investigating the efficacy of chemotherapeutics combined with radiation for treating malignant brain tumors is of high clinical relevance. In this study, we examined the cyclin-dependent kinase 4/6 inhibitor palbociclib, when used in combination with radiation for treating human atypical teratoid rhabdoid tumor (ATRT) as well as glioblastoma (GBM). Evaluation of treatment antitumor activity in vitro was based upon results from cell proliferation assays, clonogenicity assays, flow cytometry, and immunocytochemistry for DNA double-strand break repair. Interpretation of treatment antitumor activity in vivo was based upon bioluminescence imaging, animal subject survival analysis, and staining of tumor sections for markers of proliferation and apoptosis. For each of the retinoblastoma protein (RB)-proficient tumor models examined (2 ATRTs and 2 GBMs), one or more of the combination therapy regimens significantly (P < .05) outperformed both monotherapies with respect to animal subject survival benefit. Among the combination therapy regimens, concurrent palbociclib and radiation treatment and palbociclib treatment following radiation consistently outperformed the sequence in which radiation followed palbociclib treatment. In vitro investigation revealed that the concurrent use of palbociclib with radiation, as well as palbociclib following radiation, inhibited DNA double-strand break repair and promoted increased tumor cell apoptosis. Our results support further investigation and possible clinical translation of palbociclib as an adjuvant to radiation therapy for patients with malignant brain tumors that retain RB expression. © The Author(s) 2016. Published by Oxford University Press on behalf of the Society for Neuro-Oncology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Gene set analysis using variance component tests.

    PubMed

    Huang, Yen-Tsung; Lin, Xihong

    2013-06-28

    Gene set analyses have become increasingly important in genomic research, as many complex diseases arise from the joint alteration of numerous genes. Genes often act together as a functional repertoire, e.g., a biological pathway/network, and are highly correlated. However, most existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to exploit this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects by assuming a common distribution for regression coefficients in multivariate linear regression models, and calculate the p-values using permutation and a scaled chi-square approximation. We show using simulations that type I error is protected under different choices of working covariance matrices and power is improved as the working covariance approaches the true covariance. The global test is a special case of TEGS when correlation among genes in a gene set is ignored. Using both simulation data and a published diabetes dataset, we show that our test outperforms the commonly used approaches, the global test and gene set enrichment analysis (GSEA). We develop a gene set analysis method (TEGS) under the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and the global test, in both simulations and a diabetes microarray dataset.
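
    As the abstract notes, the global test is a special case of TEGS when gene-gene correlation is ignored (an identity working covariance). A minimal permutation-based version of that special case, on simulated data with invented parameters rather than the authors' implementation, might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

def gene_set_perm_test(G, x, n_perm=2000):
    """Score-type test for a gene set effect with an identity working
    covariance (global-test-like special case).

    G : (n, p) expression matrix; x : (n,) binary exposure.
    The p-value comes from permuting exposure labels."""
    xc = x - x.mean()
    Gc = G - G.mean(axis=0)
    q_obs = np.sum((xc @ Gc) ** 2)          # quadratic score statistic
    perm = np.array([np.sum((rng.permutation(xc) @ Gc) ** 2)
                     for _ in range(n_perm)])
    return (1 + np.sum(perm >= q_obs)) / (1 + n_perm)

# Simulated gene set: 10 correlated genes (shared latent factor) with a
# mean shift of 0.8 in exposed samples.
n, p = 60, 10
x = np.repeat([0, 1], n // 2)
base = rng.normal(size=(n, 1))              # shared factor -> correlated genes
G = base + rng.normal(scale=0.7, size=(n, p)) + 0.8 * x[:, None]
pval = gene_set_perm_test(G, x)
print("p-value:", pval)
```

    The full TEGS method would replace the identity weighting with a working covariance estimated from the data.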

  13. The effects of cigarette smoking behavior and psychosis history on general and social cognition in bipolar disorder.

    PubMed

    Ospina, Luz H; Russo, Manuela; Nitzburg, George M; Cuesta-Diaz, Armando; Shanahan, Megan; Perez-Rodriguez, Mercedes M; Mcgrath, Meaghan; Levine, Hannah; Mulaimovic, Sandra; Burdick, Katherine E

    2016-09-01

    Several studies have documented the prevalence and effects of cigarette smoking on cognition in psychotic disorders; fewer have focused on bipolar disorder (BD). Cognitive and social dysfunction are common in BD, and the severity of these deficits may be related both to illness features (e.g., current symptoms, psychosis history) and health-related behaviors (e.g., smoking, alcohol use). The current study assessed the influence of cigarette smoking on general and social cognition in a BD cohort, accounting for illness features with a focus on psychosis history. We assessed smoking status in 105 euthymic patients with BD, who completed a comprehensive battery including social (facial affect recognition, emotional problem-solving, and theory of mind) and general (the MATRICS Consensus Cognitive Battery and executive functioning) cognitive measures. We compared smokers vs nonsmokers on cognitive performance and tested for the effects of psychosis history, premorbid intellectual functioning, substance use, and current affective symptoms. Within the nonpsychotic subgroup with BD (n=45), smokers generally outperformed nonsmokers; by contrast, for subjects with BD with a history of psychosis (n=41), nonsmokers outperformed smokers. This pattern was noted more globally using a general composite cognitive score and on social/affective measures assessing patients' ability to identify emotions of facial stimuli and solve emotional problems. Cigarette smoking differentially affects performance on both general and social cognition in patients with BD as a function of psychosis history. These results suggest that there may be at least partially divergent underlying neurobiological causes for cognitive dysfunction in patients with BD with and without psychosis. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Combining multiple tools outperforms individual methods in gene set enrichment analyses.

    PubMed

    Alhamdoosh, Monther; Ng, Milica; Wilson, Nicholas J; Sheridan, Julie M; Huynh, Huy; Wilson, Michael J; Ritchie, Matthew E

    2017-02-01

    Gene set enrichment (GSE) analysis allows researchers to efficiently extract biological insight from long lists of differentially expressed genes by interrogating them at a systems level. In recent years, there has been a proliferation of GSE analysis methods and hence it has become increasingly difficult for researchers to select an optimal GSE tool based on their particular dataset. Moreover, the majority of GSE analysis methods do not allow researchers to simultaneously compare gene set level results between multiple experimental conditions. The ensemble of gene set enrichment analyses (EGSEA) is a method developed for RNA-sequencing data that combines results from twelve algorithms and calculates collective gene set scores to improve the biological relevance of the highest ranked gene sets. EGSEA's gene set database contains around 25 000 gene sets from sixteen collections. It has multiple visualization capabilities that allow researchers to view gene sets at various levels of granularity. EGSEA has been tested on simulated data and on a number of human and mouse datasets and, based on biologists' feedback, consistently outperforms the individual tools that have been combined. Our evaluation demonstrates the superiority of the ensemble approach for GSE analysis, and its utility to effectively and efficiently extrapolate biological functions and potential involvement in disease processes from lists of differentially regulated genes. EGSEA is available as an R package at http://www.bioconductor.org/packages/EGSEA/ . The gene set collections are available in the R package EGSEAdata from http://www.bioconductor.org/packages/EGSEAdata/ . Contact: monther.alhamdoosh@csl.com.au; mritchie@wehi.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  15. Whole-Body Single-Bed Time-of-Flight RPC-PET: Simulation of Axial and Planar Sensitivities With NEMA and Anthropomorphic Phantoms

    NASA Astrophysics Data System (ADS)

    Crespo, Paulo; Reis, João; Couceiro, Miguel; Blanco, Alberto; Ferreira, Nuno C.; Marques, Rui Ferreira; Martins, Paulo; Fonte, Paulo

    2012-06-01

    A single-bed, whole-body positron emission tomograph based on resistive plate chambers has been proposed (RPC-PET). An RPC-PET system with an axial field-of-view (AFOV) of 2.4 m has been shown in simulation to have higher system sensitivity using the NEMA NU2-1994 protocol than commercial PET scanners. However, that protocol does not correlate directly with lesion detectability. The latter is better correlated with the planar (slice) sensitivity, obtained with a NEMA NU2-2001 line-source phantom. After validation with published data for the GE Advance, Siemens TruePoint and TrueV, we study by simulation their axial sensitivity profiles, comparing results with RPC-PET. Planar sensitivities indicate that RPC-PET is expected to outperform 16-cm (22-cm) AFOV scanners by a factor 5.8 (3.0) for 70-cm-long scans. For 1.5-m scans (head to mid-legs), the sensitivity gain increases to 11.7 (6.7). Yet, PET systems with large AFOV provide larger coverage but also larger attenuation in the object. We studied these competing effects with both spherical- and line-sources immersed in a 27-cm-diameter water cylinder. For 1.5-m-long scans, the planar sensitivity drops one order of magnitude in all scanners, with RPC-PET outperforming 16-cm (22-cm) AFOV scanners by a factor 9.2 (5.3) without considering the TOF benefit. A gain in the effective sensitivity is expected with TOF iterative reconstruction. Finally, object scatter in an anthropomorphic phantom is similar for RPC-PET and modern, scintillator-based scanners, although RPC-PET benefits further if its TOF information is utilized to exclude scatter events occurring outside the anthropomorphic phantom.

  16. Predictive and External Validity of a Pre-Market Study to Determine the Most Effective Pictorial Health Warning Label Content for Cigarette Packages.

    PubMed

    Huang, Li-Ling; Thrasher, James F; Reid, Jessica L; Hammond, David

    2016-05-01

    Studies examining cigarette package pictorial health warning label (HWL) content have primarily used designs that do not allow determination of effectiveness after repeated, naturalistic exposure. This research aimed to determine the predictive and external validity of a pre-market evaluation study of pictorial HWLs. Data were analyzed from: (1) a pre-market convenience sample of 544 adult smokers who participated in field experiments in Mexico City before pictorial HWL implementation (September 2010); and (2) a post-market population-based representative sample of 1765 adult smokers in the Mexican administration of the International Tobacco Control Policy Evaluation Survey after pictorial HWL implementation. Participants in both samples rated six HWLs that appeared on cigarette packs, and also ranked HWLs with four different themes. Mixed effects models were estimated for each sample to assess ratings of relative effectiveness for the six HWLs, and to assess which HWL themes were ranked as the most effective. Pre- and post-market data showed similar relative ratings across the six HWLs, with the least and most effective HWLs consistently differentiated from other HWLs. Models predicting rankings of HWL themes in post-market sample indicated: (1) pictorial HWLs were ranked as more effective than text-only HWLs; (2) HWLs with both graphic and "lived experience" content outperformed symbolic content; and, (3) testimonial content significantly outperformed didactic content. Pre-market data showed a similar pattern of results, but with fewer statistically significant findings. The study suggests well-designed pre-market studies can have predictive and external validity, helping regulators select HWL content. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Gene expression inference with deep learning.

    PubMed

    Chen, Yifei; Li, Yi; Narayan, Rajiv; Subramanian, Aravind; Xie, Xiaohui

    2016-06-15

    Large-scale gene expression profiling has been widely used to characterize cellular states in response to various disease conditions, genetic perturbations, etc. Although the cost of whole-genome expression profiles has been dropping steadily, generating a compendium of expression profiling over thousands of samples is still very expensive. Recognizing that gene expressions are often highly correlated, researchers from the NIH LINCS program have developed a cost-effective strategy of profiling only ∼1000 carefully selected landmark genes and relying on computational methods to infer the expression of remaining target genes. However, the computational approach adopted by the LINCS program is currently based on linear regression (LR), limiting its accuracy since it does not capture complex nonlinear relationship between expressions of genes. We present a deep learning method (abbreviated as D-GEX) to infer the expression of target genes from the expression of landmark genes. We used the microarray-based Gene Expression Omnibus dataset, consisting of 111K expression profiles, to train our model and compare its performance to those from other methods. In terms of mean absolute error averaged across all genes, deep learning significantly outperforms LR with 15.33% relative improvement. A gene-wise comparative analysis shows that deep learning achieves lower error than LR in 99.97% of the target genes. We also tested the performance of our learned model on an independent RNA-Seq-based GTEx dataset, which consists of 2921 expression profiles. Deep learning still outperforms LR with 6.57% relative improvement, and achieves lower error in 81.31% of the target genes. D-GEX is available at https://github.com/uci-cbcl/D-GEX . Contact: xhx@ics.uci.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
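
    The linear-regression baseline that D-GEX is compared against amounts to one least-squares fit mapping landmark-gene expression to each target gene. A sketch on simulated profiles, with placeholder dimensions far smaller than the real ∼1000 landmark genes:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical dimensions: 20 landmark genes predicting 5 target genes
# across 500 simulated expression profiles.
n_profiles, n_landmark, n_target = 500, 20, 5
L = rng.normal(size=(n_profiles, n_landmark))                 # landmark expression
W_true = rng.normal(size=(n_landmark, n_target))
T = L @ W_true + rng.normal(scale=0.1, size=(n_profiles, n_target))  # targets

# LR baseline: a single least-squares fit (with intercept) per target gene.
train, test = slice(0, 400), slice(400, None)
X = np.hstack([L, np.ones((n_profiles, 1))])
W, *_ = np.linalg.lstsq(X[train], T[train], rcond=None)
pred = X[test] @ W

mae = float(np.mean(np.abs(pred - T[test])))   # the paper's error metric
print(f"mean absolute error on held-out profiles: {mae:.3f}")
```

    D-GEX replaces this linear map with a multi-layer network, which is what lets it capture nonlinear relationships among genes.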

  18. Natural and Artificial Intelligence in Neurosurgery: A Systematic Review.

    PubMed

    Senders, Joeky T; Arnaout, Omar; Karhade, Aditya V; Dasenbrock, Hormuzdiyar H; Gormley, William B; Broekman, Marike L; Smith, Timothy R

    2017-09-07

    Machine learning (ML) is a domain of artificial intelligence that allows computer algorithms to learn from experience without being explicitly programmed. To summarize neurosurgical applications of ML where it has been compared to clinical expertise, here referred to as "natural intelligence." A systematic search was performed in the PubMed and Embase databases as of August 2016 to review all studies comparing the performance of various ML approaches with that of clinical experts in neurosurgical literature. Twenty-three studies were identified that used ML algorithms for diagnosis, presurgical planning, or outcome prediction in neurosurgical patients. Compared to clinical experts, ML models demonstrated a median absolute improvement in accuracy and area under the receiver operating curve of 13% (interquartile range 4-21%) and 0.14 (interquartile range 0.07-0.21), respectively. In 29 (58%) of the 50 outcome measures for which a P-value was provided or calculated, ML models outperformed clinical experts (P < .05). In 18 of 50 (36%), no difference was seen between ML and expert performance (P > .05), while in 3 of 50 (6%) clinical experts outperformed ML models (P < .05). All 4 studies that compared clinicians assisted by ML models vs clinicians alone demonstrated a better performance in the first group. We conclude that ML models have the potential to augment the decision-making capacity of clinicians in neurosurgical applications; however, significant hurdles remain associated with creating, validating, and deploying ML models in the clinical setting. Shifting from the preconceptions of a human-vs-machine to a human-and-machine paradigm could be essential to overcome these hurdles. Published by Oxford University Press on behalf of Congress of Neurological Surgeons 2017.

  19. Computational Prediction of Electron Ionization Mass Spectra to Assist in GC/MS Compound Identification.

    PubMed

    Allen, Felicity; Pon, Allison; Greiner, Russ; Wishart, David

    2016-08-02

    We describe a tool, competitive fragmentation modeling for electron ionization (CFM-EI) that, given a chemical structure (e.g., in SMILES or InChI format), computationally predicts an electron ionization mass spectrum (EI-MS) (i.e., the type of mass spectrum commonly generated by gas chromatography mass spectrometry). The predicted spectra produced by this tool can be used for putative compound identification, complementing measured spectra in reference databases by expanding the range of compounds able to be considered when availability of measured spectra is limited. The tool extends CFM-ESI, a recently developed method for computational prediction of electrospray tandem mass spectra (ESI-MS/MS), but unlike CFM-ESI, CFM-EI can handle odd-electron ions and isotopes and incorporates an artificial neural network. Tests on EI-MS data from the NIST database demonstrate that CFM-EI is able to model fragmentation likelihoods in low-resolution EI-MS data, producing predicted spectra whose dot product scores are significantly better than full enumeration "bar-code" spectra. CFM-EI also outperformed previously reported results for MetFrag, MOLGEN-MS, and Mass Frontier on one compound identification task. It also outperformed MetFrag in a range of other compound identification tasks involving a much larger data set, containing both derivatized and nonderivatized compounds. While replicate EI-MS measurements of chemical standards are still a more accurate point of comparison, CFM-EI's predictions provide a much-needed alternative when no reference standard is available for measurement. CFM-EI is available at https://sourceforge.net/projects/cfm-id/ for download and http://cfmid.wishartlab.com as a web service.
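
    The dot product scoring used to compare predicted spectra against measured ones can be sketched as a cosine similarity over binned stick spectra. This is an unweighted variant (library search engines often additionally weight peaks by m/z and intensity); the peak lists below are invented:

```python
import numpy as np

def vectorize(peaks, n_bins=500):
    """Bin a stick spectrum, a list of (m/z, intensity) pairs, at unit m/z."""
    v = np.zeros(n_bins)
    for mz, inten in peaks:
        v[int(round(mz))] += inten
    return v

def dot_product_score(spec_a, spec_b):
    """Normalized dot product (cosine) between two spectra; 1.0 = identical."""
    a, b = vectorize(spec_a), vectorize(spec_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

measured  = [(41, 30), (43, 100), (58, 85), (71, 20)]
predicted = [(41, 25), (43, 90), (58, 80), (70, 10)]   # one mispredicted peak
barcode   = [(m, 100) for m in range(40, 75)]          # full-enumeration "bar-code"

print(dot_product_score(measured, predicted))   # close to 1
print(dot_product_score(measured, barcode))     # much lower
```

    This is the sense in which CFM-EI's predicted spectra score "significantly better than full enumeration bar-code spectra": a spectrum that assigns realistic intensities to plausible fragments is far closer, in cosine terms, to the measured one.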

  20. Self-organizing neural integration of pose-motion features for human action recognition

    PubMed Central

    Parisi, German I.; Weber, Cornelius; Wermter, Stefan

    2015-01-01

    The visual recognition of complex, articulated human movements is fundamental for a wide range of artificial systems oriented toward human-robot communication, action classification, and action-driven perception. These challenging tasks may generally involve the processing of a huge amount of visual information and learning-based mechanisms for generalizing a set of training actions and classifying new samples. To operate in natural environments, a crucial property is the efficient and robust recognition of actions, also under noisy conditions caused by, for instance, systematic sensor errors and temporarily occluded persons. Studies of the mammalian visual system and its outperforming ability to process biological motion information suggest separate neural pathways for the distinct processing of pose and motion features at multiple levels and the subsequent integration of these visual cues for action perception. We present a neurobiologically-motivated approach to achieve noise-tolerant action recognition in real time. Our model consists of self-organizing Growing When Required (GWR) networks that obtain progressively generalized representations of sensory inputs and learn inherent spatio-temporal dependencies. During the training, the GWR networks dynamically change their topological structure to better match the input space. We first extract pose and motion features from video sequences and then cluster actions in terms of prototypical pose-motion trajectories. Multi-cue trajectories from matching action frames are subsequently combined to provide action dynamics in the joint feature space. Reported experiments show that our approach outperforms previous results on a dataset of full-body actions captured with a depth sensor, and ranks among the best results for a public benchmark of domestic daily actions. PMID:26106323

  1. An empirical model to forecast solar wind velocity through statistical modeling

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Ridley, A. J.

    2013-12-01

    The accurate prediction of the solar wind velocity has been a major challenge in the space weather community. Previous studies proposed many empirical and semi-empirical models to forecast the solar wind velocity based on either historical observations, e.g. the persistence model, or instantaneous observations of the sun, e.g. the Wang-Sheeley-Arge model. In this study, we use the one-minute WIND data from January 1995 to August 2012 to investigate and compare the performances of 4 models often used in the literature, here referred to as the null model, the persistence model, the one-solar-rotation-ago model, and the Wang-Sheeley-Arge model. It is found that, measured by root mean square error, the persistence model gives the most accurate predictions within two days. Beyond two days, the Wang-Sheeley-Arge model serves as the best model, though it only slightly outperforms the null model and the one-solar-rotation-ago model. Finally, we apply least-squares regression to linearly combine the null model, the persistence model, and the one-solar-rotation-ago model into a 'general persistence model'. By comparing its performance against the 4 aforementioned models, it is found that the general persistence model is the most accurate of the five within five days. Due to its great simplicity and superb performance, we believe that the general persistence model can serve as a benchmark in the forecast of solar wind velocity and has the potential to be modified to arrive at better models.
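
    The "general persistence model" construction, a least-squares linear combination of the three simple predictors, can be sketched on synthetic data. Everything below (the toy series, lead time, and units) is illustrative, not the authors' WIND analysis:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy hourly solar wind speed series (km/s): a slowly recurring structure
# plus noise, standing in for real WIND observations.
t = np.arange(4000)
v = 400 + 80 * np.sin(2 * np.pi * t / 650) + rng.normal(0, 30, t.size)

lead = 48            # forecast lead time in hours
rot = 27 * 24        # one solar rotation in hours

null_pred = np.full_like(v, v.mean())   # null model: climatological mean
persist   = np.roll(v, lead)            # persistence: value `lead` hours ago
rotation  = np.roll(v, rot)             # one-solar-rotation-ago model

valid = slice(rot, None)                # drop the wrap-around warm-up period
X = np.column_stack([null_pred, persist, rotation])[valid]
y = v[valid]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares combination
combined = X @ coef

def rmse(pred):
    return float(np.sqrt(np.mean((pred - y) ** 2)))

for name, pred in [("null", null_pred[valid]), ("persistence", persist[valid]),
                   ("rotation", rotation[valid]), ("combined", combined)]:
    print(f"{name:12s} RMSE = {rmse(pred):.1f} km/s")
```

    Because each component model lies in the column space of X, the fitted combination can never do worse than any single component on the fitting data; its practical value, as the abstract argues, is that this gain carries over to forecasting.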

  2. Butch-Femme Identity and Visuospatial Performance Among Lesbian and Bisexual Women in China.

    PubMed

    Zheng, Lijun; Wen, Guangju; Zheng, Yong

    2018-05-01

    Lesbian and bisexual women who self-identify as "butch" show a masculine profile with regard to gender roles, gender nonconformity, and systemizing cognitive style, whereas lesbian and bisexual women who self-identify as "femme" show a corresponding feminine profile and those who self-identify as "androgynes" show an intermediate profile. This study examined the association between butch or femme lesbian or bisexual identity and visuospatial ability among 323 lesbian and bisexual women, compared to heterosexual women (n = 207) and men (n = 125), from multiple cities in China. Visuospatial ability was assessed using a Shepard and Metzler-type mental rotation task and Judgment of Line Angle and Position (JLAP) test on the Internet. Heterosexual men outperformed heterosexual women on both mental rotation and JLAP tasks. Lesbian and bisexual women outperformed heterosexual women on mental rotation, but not on JLAP. There were significant differences in mental rotation performance among women, with butch- and androgyne-identified lesbian/bisexual women outperforming femme-identified and heterosexual women. There were also significant differences in JLAP performance among women, with butch- and androgyne-identified lesbian/bisexual women and heterosexual women outperforming femme-identified lesbian/bisexual women. The butch-femme differences in visuospatial ability indicated an association between cognitive ability and butch-femme identity and suggest that neurobiological underpinnings may contribute to butch-femme identity although alternative explanations exist.

  3. Neural networks for link prediction in realistic biomedical graphs: a multi-dimensional evaluation of graph embedding-based approaches.

    PubMed

    Crichton, Gamal; Guo, Yufan; Pyysalo, Sampo; Korhonen, Anna

    2018-05-21

    Link prediction in biomedical graphs has several important applications including predicting Drug-Target Interactions (DTI), Protein-Protein Interaction (PPI) prediction and Literature-Based Discovery (LBD). It can be done using a classifier to output the probability of link formation between nodes. Recently several works have used neural networks to create node representations which allow rich inputs to neural classifiers. This preliminary work reported promising results, but did not use realistic settings like time-slicing, evaluate performance with comprehensive metrics, or explain when or why neural network methods outperform. We investigated how inputs from four node representation algorithms affect the performance of a neural link predictor on random- and time-sliced biomedical graphs of real-world sizes (∼ 6 million edges) containing information relevant to DTI, PPI and LBD. We compared the performance of the neural link predictor to those of established baselines and report performance across five metrics. In random- and time-sliced experiments, when the neural network methods were able to learn good node representations and there was a negligible number of disconnected nodes, those approaches outperformed the baselines. In the smallest graph (∼ 15,000 edges) and in larger graphs with approximately 14% disconnected nodes, baselines such as Common Neighbours proved a justifiable choice for link prediction. At low recall levels (∼ 0.3) the approaches were mostly equal, but at higher recall levels across all nodes and for average performance at individual nodes, neural network approaches were superior. Analysis showed that neural network methods performed well on links between nodes with no previous common neighbours, potentially the most interesting links. Additionally, while neural network methods benefit from large amounts of data, they require considerable computational resources to utilise them.
Our results indicate that when there is enough data for the neural network methods to use and there is a negligible number of disconnected nodes, those approaches outperform the baselines. At low recall levels the approaches are mostly equal, but at higher recall levels and for average performance at individual nodes, neural network approaches are superior. Superior performance at nodes without common neighbours, which indicate more unexpected and perhaps more useful links, accounts for this.
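
    The Common Neighbours baseline referred to above is simple to state: score every unlinked pair of nodes by how many neighbours they share. A toy sketch on an invented drug/protein graph (node names are hypothetical):

```python
from itertools import combinations

def common_neighbours_scores(edges):
    """Score every unlinked node pair by its number of shared neighbours,
    a classical link-prediction baseline."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return {(u, v): len(adj[u] & adj[v])
            for u, v in combinations(sorted(adj), 2)
            if v not in adj[u]}

# Toy graph: drug D1 and target T1 share three protein neighbours, so the
# missing D1-T1 link gets the highest score.
edges = [("D1", "P1"), ("D1", "P2"), ("D1", "P4"),
         ("T1", "P1"), ("T1", "P2"), ("T1", "P3"), ("T1", "P4"),
         ("D2", "P3")]
scores = common_neighbours_scores(edges)
best = max(scores, key=scores.get)
print(best, scores[best])
```

    By construction this baseline scores zero for any pair without shared neighbours, which is exactly why neural representation methods can add value on those "most interesting" links.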

  4. Externalities and article citations: experience of a national public health journal (Gaceta Sanitaria).

    PubMed

    Ruano-Ravina, Alberto; Álvarez-Dardet, Carlos; Domínguez-Berjón, M Felicitas; Fernández, Esteve; García, Ana M; Borrell, Carme

    2016-01-01

    The purpose of the study was to analyze the determinants of citations such as publication year, article type, article topic, article selected for a press release, number of articles previously published by the corresponding author, and publication language in a Spanish journal of public health. Observational study including all articles published in Gaceta Sanitaria during 2007-2011. We retrieved the number of citations from the ISI Web of Knowledge database in June 2013 and also information on other variables such as number of articles published by the corresponding author in the previous 5 years (searched through PubMed), selection for a press release, publication language, article type and topic, and others. We included 542 articles. Of these, 62.5% were cited in the period considered. We observed an increased odds ratio of citations for articles selected for a press release and also with the number of articles published previously by the corresponding author. Articles published in English do not seem to increase their citations. Certain externalities such as number of articles published by the corresponding author and being selected for a press release seem to influence the number of citations in national journals. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. The Impact of Guided Notes on Post-Secondary Student Achievement: A Meta-Analysis

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2013-01-01

    The common practice of using of guided notes in the post-secondary classroom is not fully appreciated or understood. In an effort to add to the existing research about this phenomenon, the current investigation expands on previously published research and one previously published meta-analysis that examined the impact of guided notes on…

  6. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions.
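
    ARGO's core idea, an autoregression on past flu activity augmented with current search-query volumes, can be sketched with ordinary least squares on synthetic data. The published model uses L1-regularized regression with dynamic retraining; everything below (series, lag count, query proxies) is illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated weekly flu activity (%ILI) with seasonality, plus two noisy
# search-query proxies standing in for Google Trends series.
weeks = 300
ili = 2 + 1.5 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 0.2, weeks)
queries = np.column_stack([ili + rng.normal(0, 0.4, weeks),
                           0.5 * ili + rng.normal(0, 0.4, weeks)])

# Design matrix: intercept, p autoregressive lags of past %ILI, and the
# current week's query volumes (available in real time, unlike surveillance
# reports, which arrive with a delay).
p = 4
X = np.array([np.concatenate(([1.0], ili[t - p:t], queries[t]))
              for t in range(p, weeks)])
y = ili[p:]

train, test = slice(0, 200), slice(200, None)
beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

argo_rmse = rmse(X[test] @ beta, y[test])
persist_rmse = rmse(ili[p - 1:-1][test], y[test])   # naive: last week's value
print(f"ARX {argo_rmse:.3f} vs persistence {persist_rmse:.3f}")
```

    The exogenous query terms act as a noisy real-time measurement of the current week, which is what lets the combined model beat purely historical predictors.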

  7. Deep Learning for Brain MRI Segmentation: State of the Art and Future Directions.

    PubMed

    Akkus, Zeynettin; Galimzianova, Alfiia; Hoogi, Assaf; Rubin, Daniel L; Erickson, Bradley J

    2017-08-01

    Quantitative analysis of brain MRI is routine for many neurological diseases and conditions and relies on accurate segmentation of structures of interest. Deep learning-based segmentation approaches for brain MRI are gaining interest due to their self-learning and generalization ability over large amounts of data. As the deep learning architectures are becoming more mature, they gradually outperform previous state-of-the-art classical machine learning algorithms. This review aims to provide an overview of current deep learning-based segmentation approaches for quantitative brain MRI. First we review the current deep learning architectures used for segmentation of anatomical brain structures and brain lesions. Next, the performance, speed, and properties of deep learning approaches are summarized and discussed. Finally, we provide a critical assessment of the current state and identify likely future developments and trends.

  8. RCQ-GA: RDF Chain Query Optimization Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Hogenboom, Alexander; Milea, Viorel; Frasincar, Flavius; Kaymak, Uzay

    The application of Semantic Web technologies in an Electronic Commerce environment implies a need for good support tools. Fast query engines are needed for efficient querying of large amounts of data, usually represented using RDF. We focus on optimizing a special class of SPARQL queries, the so-called RDF chain queries. For this purpose, we devise a genetic algorithm called RCQ-GA that determines the order in which joins need to be performed for an efficient evaluation of RDF chain queries. The approach is benchmarked against a two-phase optimization algorithm, previously proposed in literature. The more complex a query is, the more RCQ-GA outperforms the benchmark in solution quality, execution time needed, and consistency of solution quality. When the algorithms are constrained by a time limit, the overall performance of RCQ-GA compared to the benchmark further improves.
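
    A genetic algorithm over join orders of the kind described can be sketched with order crossover and swap mutation. The cost model below is an invented stand-in (a position-weighted sum rewarding selective joins early), not the paper's RDF-specific cost function:

```python
import random

random.seed(5)

# Stand-in per-join base costs; a real optimizer would derive costs from
# relation cardinalities and join selectivities.
COST = {0: 500, 1: 80, 2: 900, 3: 60, 4: 300}

def cost(order):
    """Position-weighted cost: joins executed later are weighted more,
    so expensive joins should be scheduled first under this toy model."""
    return sum(COST[j] * (pos + 1) for pos, j in enumerate(order))

def ox(p1, p2):
    """Order crossover: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    hole = p1[a:b]
    rest = [g for g in p2 if g not in hole]
    return rest[:a] + hole + rest[a:]

def mutate(p, rate=0.3):
    p = p[:]
    if random.random() < rate:
        i, j = random.sample(range(len(p)), 2)
        p[i], p[j] = p[j], p[i]
    return p

genes = list(COST)
pop = [random.sample(genes, len(genes)) for _ in range(40)]
for _ in range(100):
    pop.sort(key=cost)                      # fittest (cheapest) first
    elite = pop[:10]                        # elitism keeps the best orders
    pop = elite + [mutate(ox(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]
best = min(pop, key=cost)
print(best, cost(best))
```

    With five joins the optimum could be brute-forced; the GA's advantage, as the abstract notes, grows with query complexity, where exhaustive enumeration becomes infeasible.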

  9. A new pattern associative memory model for image recognition based on Hebb rules and dot product

    NASA Astrophysics Data System (ADS)

    Gao, Mingyue; Deng, Limiao; Wang, Yanjiang

    2018-04-01

    A great number of associative memory models have been proposed in recent years to realize information storage and retrieval inspired by the human brain. However, there is still much room for improvement in these models. In this paper, we extend a binary pattern associative memory model to accomplish real-world image recognition. The learning process is based on the fundamental Hebb rules, and retrieval is implemented by a normalized dot-product operation. Our proposed model not only fulfills rapid memory storage and retrieval for visual information but also supports incremental learning without destroying previously learned information. Experimental results demonstrate that our model outperforms the existing Self-Organizing Incremental Neural Network (SOINN) and Back Propagation Neural Network (BPNN) in recognition accuracy and time efficiency.
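    The two rules described above can be sketched in a few lines: Hebbian storage sums outer products (which is why adding a new pattern does not disturb previously stored ones), and retrieval picks the stored pattern with the highest normalized dot product (cosine) against the cue's projection. The pattern values and dimensions below are invented.

    ```python
    import numpy as np

    def train_hebb(patterns):
        """Hebbian learning: the weight matrix is the sum of outer products.
        A new pattern is stored incrementally by W += p p^T."""
        d = patterns[0].size
        W = np.zeros((d, d))
        for p in patterns:
            W += np.outer(p, p)
        return W

    def retrieve(W, cue, patterns):
        """Recall the stored pattern with the largest normalized dot
        product (cosine similarity) against the cue's projection W @ cue."""
        proj = W @ cue
        def cos(a, b):
            return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return max(patterns, key=lambda p: cos(proj, p))
    ```

    With bipolar patterns, a cue corrupted by a flipped entry still projects closest to its original pattern.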

  10. Virtual fringe projection system with nonparallel illumination based on iteration

    NASA Astrophysics Data System (ADS)

    Zhou, Duo; Wang, Zhangying; Gao, Nan; Zhang, Zonghua; Jiang, Xiangqian

    2017-06-01

    Fringe projection profilometry has been widely applied in many fields. To set up an ideal measuring system, a virtual fringe projection technique has been studied to assist in the design of hardware configurations. However, existing virtual fringe projection systems use parallel illumination and have a fixed optical framework. This paper presents a virtual fringe projection system with nonparallel illumination. Using an iterative method to calculate intersection points between rays and reference planes or object surfaces, the proposed system can simulate projected fringe patterns and captured images. A new explicit calibration method has been presented to validate the precision of the system. Simulated results indicate that the proposed iterative method outperforms previous systems. Our virtual system can be applied to error analysis and algorithm optimization, and can help operators find ideal system parameter settings for actual measurements.
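    The iterative ray-surface intersection at the heart of such a virtual system can be sketched as a fixed-point search: repeatedly move to the ray parameter where the ray's height matches the surface height under the current footprint. The surface parameterization z = height(x, y) and the convergence scheme are illustrative assumptions, not the paper's exact formulation (assumes a non-horizontal ray, dz != 0).

    ```python
    def intersect_surface(origin, direction, height, t0=0.0, iters=50, tol=1e-9):
        """Fixed-point search for the intersection of a nonparallel ray
        origin + t*direction with a surface z = height(x, y).
        Converges quickly for gently varying surfaces."""
        ox, oy, oz = origin
        dx, dy, dz = direction
        t = t0
        for _ in range(iters):
            x, y = ox + t * dx, oy + t * dy
            t_new = (height(x, y) - oz) / dz  # match ray z to surface z
            if abs(t_new - t) < tol:
                t = t_new
                break
            t = t_new
        x, y = ox + t * dx, oy + t * dy
        return (x, y, height(x, y))
    ```

    For a flat reference plane the iteration terminates after one step, reproducing the closed-form ray-plane intersection.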

  11. Discriminating between intentional and unintentional gaze fixation using multimodal-based fuzzy logic algorithm for gaze tracking system with NIR camera sensor

    NASA Astrophysics Data System (ADS)

    Naqvi, Rizwan Ali; Park, Kang Ryoung

    2016-06-01

    Gaze tracking systems are widely used in human-computer interfaces, interfaces for the disabled, game interfaces, and for controlling home appliances. Most studies on gaze detection have focused on enhancing its accuracy, whereas few have considered the discrimination of intentional gaze fixation (looking at a target to activate or select it) from unintentional fixation while using gaze detection systems. Previous research methods based on the use of a keyboard or mouse button, eye blinking, and the dwell time of gaze position have various limitations. Therefore, we propose a method for discriminating between intentional and unintentional gaze fixation using a multimodal fuzzy logic algorithm applied to a gaze tracking system with a near-infrared camera sensor. Experimental results show that the proposed method outperforms the conventional method for determining gaze fixation.
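    A minimal sketch of how a fuzzy rule could combine two gaze features (dwell time and fixation dispersion) into an intentionality decision. The membership ramps (200-600 ms dwell, 20-80 px dispersion) and the 0.5 threshold are invented for illustration; the paper's multimodal rule base is more elaborate.

    ```python
    def intentional_score(dwell_ms, dispersion_px):
        """Tiny Mamdani-style rule: long dwell AND low dispersion ->
        intentional fixation. Rule strength is the min of the
        antecedent memberships (illustrative thresholds)."""
        long_dwell = min(1.0, max(0.0, (dwell_ms - 200) / 400))   # ramp 200..600 ms
        low_disp = min(1.0, max(0.0, (80 - dispersion_px) / 60))  # ramp 80..20 px
        return min(long_dwell, low_disp)

    def is_intentional(dwell_ms, dispersion_px, threshold=0.5):
        """Crisp decision by thresholding the fuzzy rule strength."""
        return intentional_score(dwell_ms, dispersion_px) >= threshold
    ```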

  12. Information filtering via preferential diffusion.

    PubMed

    Lü, Linyuan; Liu, Weiping

    2011-06-01

    Recommender systems have shown great potential in addressing the information overload problem, namely helping users in finding interesting and relevant objects within a huge information space. Some physical dynamics, including the heat conduction process and mass or energy diffusion on networks, have recently found applications in personalized recommendation. Most of the previous studies focus overwhelmingly on recommendation accuracy as the only important factor, while overlooking the significance of diversity and novelty that indeed provide the vitality of the system. In this paper, we propose a recommendation algorithm based on the preferential diffusion process on a user-object bipartite network. Numerical analyses on two benchmark data sets, MovieLens and Netflix, indicate that our method outperforms the state-of-the-art methods. Specifically, it can not only provide more accurate recommendations, but also generate more diverse and novel recommendations by accurately recommending unpopular objects.
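    A compact sketch of diffusion-based recommendation on a user-object bipartite graph, with a degree exponent in the final redistribution step that favors low-degree (unpopular) objects when negative, in the spirit of preferential diffusion. The exponent value and the toy ratings data are invented, not the paper's parameterization.

    ```python
    from collections import defaultdict

    def recommend(ratings, user, eps=-0.8):
        """Two-step resource diffusion on a user-object bipartite graph.
        `ratings` maps user -> set of collected objects."""
        obj_deg = defaultdict(int)
        for objs in ratings.values():
            for o in objs:
                obj_deg[o] += 1
        # step 1: the target user's objects send resource to their users
        user_res = defaultdict(float)
        for o in ratings[user]:
            for u, objs in ratings.items():
                if o in objs:
                    user_res[u] += 1.0 / obj_deg[o]
        # step 2: users spread resource back, weighted by degree ** eps,
        # so unpopular objects receive a preferential share when eps < 0
        scores = defaultdict(float)
        for u, res in user_res.items():
            weights = {o: obj_deg[o] ** eps for o in ratings[u]}
            z = sum(weights.values())
            for o, w in weights.items():
                scores[o] += res * w / z
        # rank objects the target user has not collected yet
        return sorted((o for o in scores if o not in ratings[user]),
                      key=lambda o: -scores[o])
    ```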

  13. Information filtering via preferential diffusion

    NASA Astrophysics Data System (ADS)

    Lü, Linyuan; Liu, Weiping

    2011-06-01

    Recommender systems have shown great potential in addressing the information overload problem, namely helping users in finding interesting and relevant objects within a huge information space. Some physical dynamics, including the heat conduction process and mass or energy diffusion on networks, have recently found applications in personalized recommendation. Most of the previous studies focus overwhelmingly on recommendation accuracy as the only important factor, while overlooking the significance of diversity and novelty that indeed provide the vitality of the system. In this paper, we propose a recommendation algorithm based on the preferential diffusion process on a user-object bipartite network. Numerical analyses on two benchmark data sets, MovieLens and Netflix, indicate that our method outperforms the state-of-the-art methods. Specifically, it can not only provide more accurate recommendations, but also generate more diverse and novel recommendations by accurately recommending unpopular objects.

  14. On piecewise interpolation techniques for estimating solar radiation missing values in Kedah

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saaban, Azizan; Zainudin, Lutfi; Bakar, Mohd Nazari Abu

    2014-12-04

    This paper discusses the use of a piecewise interpolation method, based on cubic Ball and Bézier curve representations, to estimate missing solar radiation values in Kedah. An hourly solar radiation dataset was collected at the Alor Setar Meteorology Station, obtained from the Malaysian Meteorological Department. The piecewise cubic Ball and Bézier functions that interpolate the data points are defined on each hourly interval of solar radiation measurement and are obtained by prescribing first-order derivatives at the start and end of each interval. We compare the performance of our proposed method with existing methods using Root Mean Squared Error (RMSE) and Coefficient of Determination (CoD) on simulated missing-value datasets. The results show that our method outperforms the previous methods.
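    A minimal illustration of the machinery involved: a cubic segment with prescribed endpoint values and first derivatives (the Hermite form, expressible equivalently in the cubic Ball/Bézier bases), plus the RMSE and CoD metrics. All data values are invented.

    ```python
    import math

    def hermite(t, p0, p1, m0, m1):
        """Cubic segment on [0, 1] with endpoint values p0, p1 and
        prescribed endpoint derivatives m0, m1."""
        h00 = 2 * t**3 - 3 * t**2 + 1
        h10 = t**3 - 2 * t**2 + t
        h01 = -2 * t**3 + 3 * t**2
        h11 = t**3 - t**2
        return h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1

    def rmse(actual, estimated):
        """Root Mean Squared Error between observed and estimated values."""
        return math.sqrt(sum((a - e) ** 2 for a, e in zip(actual, estimated))
                         / len(actual))

    def cod(actual, estimated):
        """Coefficient of Determination: 1 - SS_res / SS_tot."""
        mean = sum(actual) / len(actual)
        ss_res = sum((a - e) ** 2 for a, e in zip(actual, estimated))
        ss_tot = sum((a - mean) ** 2 for a in actual)
        return 1.0 - ss_res / ss_tot
    ```

    With unit slopes at both ends, the segment reproduces a straight line exactly, a quick sanity check on the basis functions.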

  15. Deep Convolutional Neural Network-Based Early Automated Detection of Diabetic Retinopathy Using Fundus Image.

    PubMed

    Xu, Kele; Feng, Dawei; Mi, Haibo

    2017-11-23

    The automatic detection of diabetic retinopathy is of vital importance, as it is the main cause of irreversible vision loss in the working-age population in the developed world. The early detection of diabetic retinopathy occurrence can be very helpful for clinical treatment; although several different feature extraction approaches have been proposed, the classification task for retinal images is still tedious even for trained clinicians. Recently, deep convolutional neural networks have manifested superior performance in image classification compared to previous handcrafted feature-based image classification methods. Thus, in this paper, we explored the use of deep convolutional neural network methodology for the automatic classification of diabetic retinopathy using color fundus images, and obtained an accuracy of 94.5% on our dataset, outperforming the results obtained by using classical approaches.

  16. Retinal artery-vein classification via topology estimation

    PubMed Central

    Estrada, Rolando; Allingham, Michael J.; Mettu, Priyatham S.; Cousins, Scott W.; Tomasi, Carlo; Farsiu, Sina

    2015-01-01

    We propose a novel, graph-theoretic framework for distinguishing arteries from veins in a fundus image. We make use of the underlying vessel topology to better classify small and midsized vessels. We extend our previously proposed tree topology estimation framework by incorporating expert, domain-specific features to construct a simple, yet powerful global likelihood model. We efficiently maximize this model by iteratively exploring the space of possible solutions consistent with the projected vessels. We tested our method on four retinal datasets and achieved classification accuracies of 91.0%, 93.5%, 91.7%, and 90.9%, outperforming existing methods. Our results show the effectiveness of our approach, which is capable of analyzing the entire vasculature, including peripheral vessels, in wide field-of-view fundus photographs. This topology-based method is a potentially important tool for diagnosing diseases with retinal vascular manifestation. PMID:26068204

  17. High performance computation of radiative transfer equation using the finite element method

    NASA Astrophysics Data System (ADS)

    Badri, M. A.; Jolivet, P.; Rousseau, B.; Favennec, Y.

    2018-05-01

    This article deals with an efficient strategy for numerically simulating radiative transfer phenomena using distributed computing. The finite element method alongside the discrete ordinate method is used for spatio-angular discretization of the monochromatic steady-state radiative transfer equation in an anisotropically scattering medium. Two very different parallelization methods, angular and spatial decomposition, are presented. To do so, the finite element method is used in a vectorial way. A detailed comparison of scalability, performance, and efficiency on thousands of processors is established for two- and three-dimensional heterogeneous test cases. Timings show that both algorithms scale well when using proper preconditioners. It is also observed that our angular decomposition scheme outperforms our domain decomposition method. Overall, we perform numerical simulations at scales that were previously unattainable by standard radiative transfer equation solvers.

  18. Updated logistic regression equations for the calculation of post-fire debris-flow likelihood in the western United States

    USGS Publications Warehouse

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2016-06-30

    Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
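    The logistic form of such a likelihood model is standard; the sketch below uses placeholder coefficients and two hypothetical predictors (rainfall intensity, burn-severity fraction) rather than the equations published by the USGS.

    ```python
    import math

    def debris_flow_likelihood(x, coeffs, intercept):
        """Logistic regression likelihood: P = 1 / (1 + exp(-(b0 + sum b_i x_i))).
        Coefficients here are placeholders, not the published equations."""
        z = intercept + sum(b * xi for b, xi in zip(coeffs, x))
        return 1.0 / (1.0 + math.exp(-z))
    ```

    With positive rainfall and burn-severity coefficients, the predicted likelihood increases monotonically with storm intensity, which is the qualitative behavior the model is designed to capture.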

  19. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  20. Multi-National Banknote Classification Based on Visible-light Line Sensor and Convolutional Neural Network.

    PubMed

    Pham, Tuyen Danh; Lee, Dong Eun; Park, Kang Ryoung

    2017-07-08

    Automatic recognition of banknotes is applied in payment facilities, such as automated teller machines (ATMs) and banknote counters. Besides the popular approaches that focus on studying the methods applied to various individual types of currencies, there have been studies conducted on simultaneous classification of banknotes from multiple countries. However, their methods were conducted with limited numbers of banknote images, national currencies, and denominations. To address this issue, we propose a multi-national banknote classification method based on visible-light banknote images captured by a one-dimensional line sensor and classified by a convolutional neural network (CNN) considering the size information of each denomination. Experiments conducted on the combined banknote image database of six countries with 62 denominations gave a classification accuracy of 100%, and results show that our proposed algorithm outperforms previous methods.

  1. Road Lane Detection by Discriminating Dashed and Solid Road Lanes Using a Visible Light Camera Sensor.

    PubMed

    Hoang, Toan Minh; Hong, Hyung Gil; Vokhidov, Husan; Park, Kang Ryoung

    2016-08-18

    With the increasing need for road lane detection used in lane departure warning systems and autonomous vehicles, many studies have been conducted to turn road lane detection into a virtual assistant to improve driving safety and reduce car accidents. Most of the previous research approaches detect the central line of a road lane and not the accurate left and right boundaries of the lane. In addition, they do not discriminate between dashed and solid lanes when detecting the road lanes. However, this discrimination is necessary for the safety of autonomous vehicles and the safety of vehicles driven by human drivers. To overcome these problems, we propose a method for road lane detection that distinguishes between dashed and solid lanes. Experimental results with the Caltech open database showed that our method outperforms conventional methods.

  2. Road Lane Detection by Discriminating Dashed and Solid Road Lanes Using a Visible Light Camera Sensor

    PubMed Central

    Hoang, Toan Minh; Hong, Hyung Gil; Vokhidov, Husan; Park, Kang Ryoung

    2016-01-01

    With the increasing need for road lane detection used in lane departure warning systems and autonomous vehicles, many studies have been conducted to turn road lane detection into a virtual assistant to improve driving safety and reduce car accidents. Most of the previous research approaches detect the central line of a road lane and not the accurate left and right boundaries of the lane. In addition, they do not discriminate between dashed and solid lanes when detecting the road lanes. However, this discrimination is necessary for the safety of autonomous vehicles and the safety of vehicles driven by human drivers. To overcome these problems, we propose a method for road lane detection that distinguishes between dashed and solid lanes. Experimental results with the Caltech open database showed that our method outperforms conventional methods. PMID:27548176

  3. Fast Object Motion Estimation Based on Dynamic Stixels.

    PubMed

    Morales, Néstor; Morell, Antonio; Toledo, Jonay; Acosta, Leopoldo

    2016-07-28

    The stixel world is a simplification of the world in which obstacles are represented as vertical instances, called stixels, standing on a surface assumed to be planar. In this paper, previous approaches for stixel tracking are extended using a two-level scheme. In the first level, stixels are tracked by matching them between frames using a bipartite graph in which edges represent a matching cost function. Then, stixels are clustered into sets representing objects in the environment. These objects are matched based on the number of stixels paired inside them. Furthermore, a faster, but less accurate approach is proposed in which only the second level is used. Several configurations of our method are compared to an existing state-of-the-art approach to show how our methodology outperforms it in several areas, including an improvement in the quality of the depth reconstruction.
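    A greedy stand-in for the first-level matching step (the paper formulates it as a bipartite graph whose edges carry a matching cost). The (column, disparity) stixel summary, the cost weights, and the gating threshold below are invented for illustration.

    ```python
    def match_stixels(prev, curr, max_cost=50.0):
        """Greedy minimum-cost matching of stixels between two frames.
        Each stixel is summarized as (column, disparity); pairs whose
        cost exceeds `max_cost` are left unmatched."""
        def cost(a, b):
            # weighted distance in image column and disparity (weights invented)
            return abs(a[0] - b[0]) + 2.0 * abs(a[1] - b[1])
        pairs = sorted((cost(p, c), i, j)
                       for i, p in enumerate(prev)
                       for j, c in enumerate(curr))
        used_p, used_c, matches = set(), set(), []
        for c, i, j in pairs:
            if c > max_cost:
                break  # remaining pairs are even more expensive
            if i not in used_p and j not in used_c:
                matches.append((i, j))
                used_p.add(i)
                used_c.add(j)
        return matches
    ```

    An optimal bipartite assignment (e.g., the Hungarian algorithm) would replace the greedy pass in a faithful implementation; the greedy version trades accuracy for speed, mirroring the paper's faster second-level-only variant.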

  4. Control of microparticles packing density in a microfluidic channel for bead based immunoassays applications

    NASA Astrophysics Data System (ADS)

    Caballero-Robledo, Gabriel; Guevara-Pantoja, Pablo

    2014-11-01

    Bead-based immunoassays in microfluidic devices have been shown to greatly outperform conventional methods. However, if functional point-of-care devices are to be developed, precise and reproducible control over the granular packings inside microchannels is needed. In this work we study the efficiency of a nanoparticle magnetic trap previously developed by B. Teste et al. [Lab Chip 11, 4207 (2011)] when we vary the compaction of micrometric iron beads packed against a restriction inside a microfluidic channel. The packing density of the beads is finely and reproducibly changed by applying a vibrational protocol originally developed for macroscopic, dry granular systems. We find, counterintuitively, that the most compact and stable packings are up to four times less efficient at trapping nanoparticles than the loosest packings. This work has been supported by Conacyt, Mexico, under Grant No. 180873.

  5. Corrected goodness-of-fit test in covariance structure analysis.

    PubMed

    Hayakawa, Kazuhiko

    2018-05-17

    Many previous studies report simulation evidence that the goodness-of-fit test in covariance structure analysis or structural equation modeling suffers from the overrejection problem when the number of manifest variables is large compared with the sample size. In this study, we demonstrate that one of the tests considered in Browne (1974) can address this long-standing problem. We also propose a simple modification of Satorra and Bentler's mean and variance adjusted test for non-normal data. A Monte Carlo simulation is carried out to investigate the performance of the corrected tests in the context of a confirmatory factor model, a panel autoregressive model, and a cross-lagged panel (panel vector autoregressive) model. The simulation results reveal that the corrected tests overcome the overrejection problem and outperform existing tests in most cases. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  6. Multi-National Banknote Classification Based on Visible-light Line Sensor and Convolutional Neural Network

    PubMed Central

    Pham, Tuyen Danh; Lee, Dong Eun; Park, Kang Ryoung

    2017-01-01

    Automatic recognition of banknotes is applied in payment facilities, such as automated teller machines (ATMs) and banknote counters. Besides the popular approaches that focus on studying the methods applied to various individual types of currencies, there have been studies conducted on simultaneous classification of banknotes from multiple countries. However, their methods were conducted with limited numbers of banknote images, national currencies, and denominations. To address this issue, we propose a multi-national banknote classification method based on visible-light banknote images captured by a one-dimensional line sensor and classified by a convolutional neural network (CNN) considering the size information of each denomination. Experiments conducted on the combined banknote image database of six countries with 62 denominations gave a classification accuracy of 100%, and results show that our proposed algorithm outperforms previous methods. PMID:28698466

  7. The effects of bilingualism on toddlers’ executive functioning

    PubMed Central

    Poulin-Dubois, Diane; Blaye, Agnes; Coutya, Julie; Bialystok, Ellen

    2015-01-01

    Bilingual children have been shown to outperform monolingual children on tasks measuring executive functioning skills. This advantage is usually attributed to bilinguals’ extensive practice in exercising selective attention and cognitive flexibility during language use because both languages are active when one of them is being used. We examined whether this advantage is observed in 24-month-olds who have had much less experience in language production. A battery of executive functioning tasks and the cognitive scale of the Bayley test were administered to 63 monolingual and bilingual children. Native bilingual children performed significantly better than monolingual children on the Stroop task, with no difference between groups on the other tasks, confirming the specificity of bilingual effects to conflict tasks reported in older children. These results demonstrate that bilingual advantages in executive control emerge at an age not previously shown. PMID:21122877

  8. Forecasting influenza-like illness dynamics for military populations using neural networks and social media

    PubMed Central

    Volkova, Svitlana; Ayton, Ellyn; Porterfield, Katherine; Corley, Courtney D.

    2017-01-01

    This work is the first to take advantage of recurrent neural networks to predict influenza-like illness (ILI) dynamics from various linguistic signals extracted from social media data. Unlike other approaches that rely on time-series analysis of historical ILI data and state-of-the-art machine learning models, we build and evaluate the predictive power of neural network architectures based on Long Short-Term Memory (LSTM) units capable of nowcasting (predicting in “real-time”) and forecasting (predicting the future) ILI dynamics in the 2011-2014 influenza seasons. To build our models we integrate information people post in social media, e.g., topics, embeddings, word ngrams, stylistic patterns, and communication behavior using hashtags and mentions. We then quantitatively evaluate the predictive power of different social media signals and contrast the performance of state-of-the-art regression models with neural networks using a diverse set of evaluation metrics. Finally, we combine ILI and social media signals to build a joint neural network model for ILI dynamics prediction. Unlike the majority of the existing work, we specifically focus on developing models for local rather than national ILI surveillance, specifically for military rather than general populations in 26 U.S. and six international locations, and analyze how model performance depends on the amount of social media data available per location. Our approach demonstrates several advantages: (a) Neural network architectures that rely on LSTM units trained on social media data yield the best performance compared to previously used regression models. (b) Previously under-explored language and communication behavior features are more predictive of ILI dynamics than stylistic and topic signals expressed in social media. 
(c) Neural network models learned exclusively from social media signals yield comparable or better performance to models learned from ILI historical data; thus, signals from social media can potentially be used to accurately forecast ILI dynamics for regions where ILI historical data are not available. (d) Neural network models learned from combined ILI and social media signals significantly outperform models that rely solely on ILI historical data, which adds to the great potential of alternative public sources for ILI dynamics prediction. (e) Location-specific models outperform previously used location-independent models (e.g., U.S.-only models). (f) Prediction results vary significantly across geolocations depending on the amount of social media data available and ILI activity patterns. (g) Model performance improves with more tweets available per geolocation, e.g., the error gets lower and the Pearson score gets higher for locations with more tweets. PMID:29244814
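    The per-location evaluation the abstract refers to (lower error, higher Pearson score for locations with more data) can be sketched with standard metrics; MAPE is used here as a generic error measure, an assumption rather than the paper's exact choice, and the series values are invented.

    ```python
    import math

    def pearson(xs, ys):
        """Pearson correlation between predicted and observed ILI series."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    def mape(actual, predicted):
        """Mean absolute percentage error, a common forecast error metric
        (assumes no zero observations)."""
        return sum(abs(a - p) / abs(a)
                   for a, p in zip(actual, predicted)) / len(actual)
    ```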

  9. Forecasting influenza-like illness dynamics for military populations using neural networks and social media.

    PubMed

    Volkova, Svitlana; Ayton, Ellyn; Porterfield, Katherine; Corley, Courtney D

    2017-01-01

    This work is the first to take advantage of recurrent neural networks to predict influenza-like illness (ILI) dynamics from various linguistic signals extracted from social media data. Unlike other approaches that rely on time-series analysis of historical ILI data and state-of-the-art machine learning models, we build and evaluate the predictive power of neural network architectures based on Long Short-Term Memory (LSTM) units capable of nowcasting (predicting in "real-time") and forecasting (predicting the future) ILI dynamics in the 2011-2014 influenza seasons. To build our models we integrate information people post in social media, e.g., topics, embeddings, word ngrams, stylistic patterns, and communication behavior using hashtags and mentions. We then quantitatively evaluate the predictive power of different social media signals and contrast the performance of state-of-the-art regression models with neural networks using a diverse set of evaluation metrics. Finally, we combine ILI and social media signals to build a joint neural network model for ILI dynamics prediction. Unlike the majority of the existing work, we specifically focus on developing models for local rather than national ILI surveillance, specifically for military rather than general populations in 26 U.S. and six international locations, and analyze how model performance depends on the amount of social media data available per location. Our approach demonstrates several advantages: (a) Neural network architectures that rely on LSTM units trained on social media data yield the best performance compared to previously used regression models. (b) Previously under-explored language and communication behavior features are more predictive of ILI dynamics than stylistic and topic signals expressed in social media. 
(c) Neural network models learned exclusively from social media signals yield comparable or better performance to models learned from ILI historical data; thus, signals from social media can potentially be used to accurately forecast ILI dynamics for regions where ILI historical data are not available. (d) Neural network models learned from combined ILI and social media signals significantly outperform models that rely solely on ILI historical data, which adds to the great potential of alternative public sources for ILI dynamics prediction. (e) Location-specific models outperform previously used location-independent models (e.g., U.S.-only models). (f) Prediction results vary significantly across geolocations depending on the amount of social media data available and ILI activity patterns. (g) Model performance improves with more tweets available per geolocation, e.g., the error gets lower and the Pearson score gets higher for locations with more tweets.

  10. Survival of extensively damaged endodontically treated incisors restored with different types of posts-and-core foundation restoration material.

    PubMed

    Lazari, Priscilla Cardoso; de Carvalho, Marco Aurélio; Del Bel Cury, Altair A; Magne, Pascal

    2018-05-01

    Which post-and-core combination will best improve the performance of extensively damaged endodontically treated incisors without a ferrule is still unclear. The purpose of this in vitro study was to investigate the restoration of extensively damaged endodontically treated incisors without a ferrule using glass-ceramic crowns bonded to various composite resin foundation restorations and 2 types of posts. Sixty decoronated endodontically treated bovine incisors without a ferrule were divided into 4 groups and restored with various post-and-core foundation restorations. NfPfB=no-ferrule (Nf) with glass-fiber post (Pf) and bulk-fill resin foundation restoration (B); NfPfP=no-ferrule (Nf) with glass-fiber post (Pf) and dual-polymerized composite resin core foundation restoration (P); NfPt=no-ferrule (Nf) with titanium post (Pt) and resin core foundation restoration; and NfPtB=no-ferrule (Nf) with titanium post (Pt) and bulk-fill resin core foundation restoration (B). Two additional groups from previously published data from the same authors (FPf=2mm of ferrule (F) and glass-fiber post (Pf) and composite resin core foundation restoration; and NfPf=no-ferrule (Nf) with glass-fiber post (Pf) and composite resin core foundation restoration), which were tested concomitantly and using the same experimental arrangement, were included for comparison. All teeth were prepared to receive bonded glass-ceramic crowns luted with dual-polymerized resin cement and were subjected to accelerated fatigue testing under submerged conditions at room temperature. Cyclic isometric loading was applied to the incisal edge at an angle of 30 degrees with a frequency of 5 Hz, beginning with a load of 100 N (5000 cycles). A 100-N load increase was applied every 15000 cycles. The specimens were loaded until failure or to a maximum of 1000 N (140000 cycles). 
The 6 groups (4 groups from the present study and 2 groups from the previously published study) were compared using the Kaplan-Meier survival analysis (log-rank post hoc test at α=.05 for pairwise comparisons). None of the tested specimens withstood all 140,000 cycles. All specimens without a ferrule were affected by an initial failure phenomenon (wide gap at the lingual margin between the core foundation restoration/crown assembly and the root). NfPfP, NfPt, and NfPtB had similar survival (29649 to 30987 mean cycles until initial failure). NfPfB outperformed NfPt and NfPtB. None of the post-and-core foundation restoration materials were able to match the performance of the ferrule group FPf (72667 cycles). In all groups, 100% of failures were catastrophic. The survival of extensively damaged endodontically treated incisors without a ferrule was slightly improved by the use of a fiber post with a bulk-fill composite resin core foundation restoration. However, none of the post-and-core techniques was able to compensate for the absence of a ferrule. The presence of the posts always adversely affected the failure mode. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
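    The Kaplan-Meier estimate used to compare the groups can be sketched as follows; at each distinct failure time the survival probability is multiplied by (1 - failures / number at risk). The cycle counts and censoring flags in the example are invented, not the study's data.

    ```python
    def kaplan_meier(times, events):
        """Kaplan-Meier survival curve.
        `times` are cycles to failure (or censoring); `events` is 1 for
        an observed failure, 0 for a censored specimen."""
        order = sorted(range(len(times)), key=lambda i: times[i])
        at_risk = len(times)
        s = 1.0
        curve = []
        i = 0
        while i < len(order):
            t = times[order[i]]
            d = n_t = 0
            while i < len(order) and times[order[i]] == t:
                d += events[order[i]]   # failures at time t
                n_t += 1                # all specimens leaving at time t
                i += 1
            if d:
                s *= 1.0 - d / at_risk
                curve.append((t, s))
            at_risk -= n_t
        return curve
    ```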

  11. Outperforming whom? A multilevel study of performance-prove goal orientation, performance, and the moderating role of shared team identification.

    PubMed

    Dietz, Bart; van Knippenberg, Daan; Hirst, Giles; Restubog, Simon Lloyd D

    2015-11-01

    Performance-prove goal orientation affects performance because it drives people to try to outperform others. A proper understanding of the performance-motivating potential of performance-prove goal orientation requires, however, that we consider the question of whom people desire to outperform. In a multilevel analysis of this issue, we propose that the shared team identification of a team plays an important moderating role here, directing the performance-motivating influence of performance-prove goal orientation to either the team level or the individual level of performance. A multilevel study of salespeople nested in teams supports this proposition, showing that performance-prove goal orientation motivates team performance more with higher shared team identification, whereas performance-prove goal orientation motivates individual performance more with lower shared team identification. Establishing the robustness of these findings, a second study replicates them with individual and team performance in an educational context. (c) 2015 APA, all rights reserved.

  12. The Impact of Transcription Writing Interventions for First-Grade Students

    PubMed Central

    Wanzek, Jeanne; Gatlin, Brandy; Al Otaiba, Stephanie; Kim, Young-Suk Grace

    2016-01-01

    We examined the effects of transcription instruction for students in first grade. Students in the lowest 70% of the participating schools were selected for the study. These 81 students were randomly assigned to: (a) spelling instruction, (b) handwriting instruction, (c) combination spelling and handwriting instruction, or (d) no intervention. Intervention was provided in small groups of 4 students, 25 min a day, 4 days a week for 8 weeks. Students in the spelling condition outperformed the control group on spelling measures with moderate effect sizes noted on curriculum-based writing measures (e.g., correct word sequence; g range = 0.34 to 0.68). Students in the handwriting condition outperformed the control group on correct word sequences with small to moderate effects on other handwriting and writing measures (g range = 0.31 to 0.71). Students in the combined condition outperformed the control group on correct word sequences with a small effect on total words written (g range = 0.39 to 0.84). PMID:28989267
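The effect sizes (g) reported above are standardized mean differences with a small-sample correction. A sketch of Hedges' g on hypothetical scores (illustrative values, not the study's data; the study's exact estimator may differ in details):

```python
import math

def hedges_g(group, control):
    """Hedges' g: pooled-SD standardized mean difference with the usual
    small-sample bias correction (a sketch, not the study's computation)."""
    n1, n2 = len(group), len(control)
    m1 = sum(group) / n1
    m2 = sum(control) / n2
    v1 = sum((x - m1) ** 2 for x in group) / (n1 - 1)    # sample variances
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd                            # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)             # small-sample correction
    return d * correction

# Hypothetical spelling scores for intervention vs. control students.
g = hedges_g([12, 14, 15, 13, 16], [10, 11, 12, 11, 13])
```

A positive g indicates the intervention group scored higher; values around 0.3-0.8, as in the abstract, are conventionally read as small-to-moderate effects.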

  13. Gender and sexual orientation differences in cognition across adulthood: age is kinder to women than to men regardless of sexual orientation.

    PubMed

    Maylor, Elizabeth A; Reimers, Stian; Choi, Jean; Collaer, Marcia L; Peters, Michael; Silverman, Irwin

    2007-04-01

    Despite some evidence of greater age-related deterioration of the brain in males than in females, gender differences in rates of cognitive aging have proved inconsistent. The present study employed web-based methodology to collect data from people aged 20-65 years (109,612 men; 88,509 women). As expected, men outperformed women on tests of mental rotation and line angle judgment, whereas women outperformed men on tests of category fluency and object location memory. Performance on all tests declined with age but significantly more so for men than for women. Heterosexuals of each gender generally outperformed bisexuals and homosexuals on tests where that gender was superior; however, there were no clear interactions between age and sexual orientation for either gender. At least for these particular tests from young adulthood to retirement, age is kinder to women than to men, but treats heterosexuals, bisexuals, and homosexuals just the same.

  14. Forecasting Tehran stock exchange volatility; Markov switching GARCH approach

    NASA Astrophysics Data System (ADS)

    Abounoori, Esmaiel; Elmi, Zahra (Mila); Nademi, Younes

    2016-03-01

This paper evaluates several GARCH models regarding their ability to forecast volatility in the Tehran Stock Exchange (TSE). These include GARCH models with both Gaussian and fat-tailed residual conditional distributions, concerning their ability to describe and forecast volatility from a 1-day to a 22-day horizon. Results indicate that the AR(2)-MRSGARCH-GED model outperforms other models at the 1-day horizon. At the 5-day horizon, both the AR(2)-MRSGARCH-GED and AR(2)-MRSGARCH-t models outperform the others. At the 10-day horizon, the three AR(2)-MRSGARCH models outperform the others. At the 22-day horizon, results indicate no difference between MRSGARCH models and standard GARCH models. Regarding an out-of-sample risk-management evaluation (95% VaR), a few models seem to provide reasonable and accurate VaR estimates at the 1-day horizon, with a coverage rate close to the nominal level. According to the risk-management loss functions, no model is uniformly most accurate.
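The "coverage rate" used above to judge 95% VaR forecasts is simply the fraction of days on which the realized loss exceeded the forecast; for a well-calibrated model it should sit near the nominal 5%. A sketch with hypothetical returns (the paper applies formal coverage tests on TSE data):

```python
def var_coverage(returns, var_forecasts, alpha=0.05):
    """Empirical hit rate of a VaR forecast series: the fraction of periods
    in which the return fell below the (negative) VaR forecast. A calibrated
    100*(1-alpha)% VaR model has a hit rate close to alpha."""
    hits = sum(1 for r, v in zip(returns, var_forecasts) if r < v)
    return hits / len(returns)

# Hypothetical daily returns against a constant -2% one-day VaR forecast.
returns = [0.01, -0.005, -0.025, 0.002, -0.01,
           0.015, -0.03, 0.004, 0.006, -0.001]
rate = var_coverage(returns, [-0.02] * len(returns))
```

Here 2 of 10 returns breach the forecast, a hit rate of 0.2 — far above the nominal 0.05, so this toy forecast would be rejected as underestimating risk.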

  15. Visuospatial performance on an internet line judgment task and potential hormonal markers: sex, sexual orientation, and 2D:4D.

    PubMed

    Collaer, Marcia L; Reimers, Stian; Manning, John T

    2007-04-01

    We investigated whether performance on a visuospatial line judgment task, the Judgment of Line Angle and Position-15 test (JLAP-15), showed evidence of sensitivity to early sex steroid exposure by examining how it related to sex, as well as to sexual orientation and 2D:4D digit ratios. Participants were drawn from a large Internet study with over 250,000 participants. In the main sample (ages 12-58 years), males outperformed females on the JLAP-15, showing a moderate effect size for sex. In agreement with a prenatal sex hormone hypothesis, line judgment accuracy in adults related to 2D:4D and sexual orientation, both of which are postulated to be influenced by early steroids. In both sexes, better visuospatial performance was associated with lower (more male-typical) digit ratios. For men, heterosexual participants outperformed homosexual/bisexual participants on the JLAP-15 and, for women, homosexual/bisexual participants outperformed heterosexual participants. In children aged 8-10 years, presumed to be a largely prepubertal group, boys also outperformed girls. These findings are consistent with the hypothesis that visuospatial ability is influenced by early sex steroids, although they do not rule out alternative explanations or additional influences. More broadly, such results support a prenatal sex hormone hypothesis that degree of androgen exposure may influence the neural circuitry underlying cognition (visuospatial ability) and sexual orientation as well as aspects of somatic (digit ratio) development.

  16. Oxidation Mechanisms of Toluene and Benzene

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1995-01-01

    An expanded and improved version of a previously published benzene oxidation mechanism is presented and shown to model published experimental data fairly successfully. This benzene submodel is coupled to a modified version of a toluene oxidation submodel from the recent literature. This complete mechanism is shown to successfully model published experimental toluene oxidation data for a highly mixed flow reactor and for higher temperature ignition delay times in a shock tube. A comprehensive sensitivity analysis showing the most important reactions is presented for both the benzene and toluene reacting systems. The NASA Lewis toluene mechanism's modeling capability is found to be equivalent to that of the previously published mechanism which contains a somewhat different benzene submodel.

  17. Validating internet research: a test of the psychometric equivalence of internet and in-person samples.

    PubMed

    Meyerson, Paul; Tryon, Warren W

    2003-11-01

This study evaluated the psychometric equivalency of Web-based research. The Sexual Boredom Scale was presented via the World-Wide Web along with five additional scales used to validate it. A subset of 533 participants who matched a previously published sample (Watt & Ewing, 1996) on age, gender, and race was identified. An 8 × 8 correlation matrix from the matched Internet sample was compared via structural equation modeling with a similar 8 × 8 correlation matrix from the previously published study. The Internet and previously published samples were psychometrically equivalent. Coefficient alpha values calculated on the matched Internet sample yielded reliability coefficients almost identical to those for the previously published sample. Factors such as computer administration and uncontrollable administration settings did not appear to affect the results. Demographic data indicated an overrepresentation of males by about 6% and Caucasians by about 13% relative to the U.S. Census (2000). A total of 2,230 participants were obtained in about 8 months without remuneration. These results suggest that data collection on the Web is (1) reliable, (2) valid, (3) reasonably representative, (4) cost effective, and (5) efficient.
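Coefficient alpha, the reliability statistic compared above, can be computed directly from item scores. A minimal sketch with hypothetical ratings (not the study's data):

```python
def cronbach_alpha(item_scores):
    """Cronbach's coefficient alpha for a list of item-score columns
    (one list per item, same respondents in the same order)."""
    k = len(item_scores)
    n = len(item_scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_var = sum(variance(item) for item in item_scores)
    # Total score per respondent across all items.
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    total_var = variance(totals)
    return k / (k - 1) * (1 - sum_item_var / total_var)

# Hypothetical 3-item scale answered by 5 respondents.
alpha = cronbach_alpha([[3, 4, 5, 2, 4],
                        [3, 5, 5, 1, 4],
                        [2, 4, 4, 2, 5]])
```

Values near 1 indicate highly consistent items; "almost identical" alphas across the Web and in-person samples is the equivalence evidence the abstract describes.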

  18. Third-order elastic constants of diamond determined from experimental data

    DOE PAGES

    Winey, J. M.; Hmiel, A.; Gupta, Y. M.

    2016-06-01

The pressure derivatives of the second-order elastic constants (SOECs) of diamond were determined by analyzing previous sound velocity measurements under hydrostatic stress [McSkimin and Andreatch, J. Appl. Phys. 43, 294 (1972)]. Our analysis corrects an error in the previously reported results. Using the corrected pressure derivatives, together with published data for the nonlinear elastic response of shock-compressed diamond [Lang and Gupta, Phys. Rev. Lett. 106, 125502 (2011)], we present a complete and corrected set of third-order elastic constants (TOECs), which differs significantly from TOECs published previously.

  19. Site fidelity, territory fidelity, and natal philopatry in Willow Flycatchers (Empidonax traillii)

    USGS Publications Warehouse

    Sedgwick, James A.

    2004-01-01

I investigated the causes and consequences of adult breeding-site fidelity, territory fidelity, and natal philopatry in Willow Flycatchers (Empidonax traillii) in southeastern Oregon over a 10-year period, testing the general hypothesis that fidelity and dispersal distances are influenced by previous breeding performance. Willow Flycatchers adhered to the generally observed tendencies of passerine birds for low natal philopatry and high breeding-site fidelity. Site fidelity (return to the study area) of adult males (52.0%) and females (51.3%), and median dispersal distances between seasons (16 m vs. 19 m) were similar. Previous breeding performance and residency (age-experience), but not study-site quality, explained site fidelity in females. Site fidelity of females rearing 4–5 young (64.4%) exceeded that of unsuccessful females (40.0%), breeding dispersal was less (successful: 15 m; unsuccessful: 33 m), and novice residents were more site-faithful than former residents. Probability of site fidelity was higher for previously successful females (odds ratio = 4.76), those with greater seasonal fecundity (odds ratio = 1.58), novice residents (odds ratio = 1.41), and unparasitized females (odds ratio = 2.76). Male site fidelity was not related to residency, site quality, or previous breeding performance. Territory fidelity (return to the previous territory) in females was best explained by previous breeding performance, but not by site quality or residency. Previously successful females were more likely to return to their territory of the previous season than either unsuccessful (odds ratio = 14.35) or parasitized birds (odds ratio = 6.38). Male territory fidelity was not related to residency, site quality, or previous breeding performance. Natal philopatry was low (7.8%) and similar for males and females.
Site quality appeared to influence philopatry, given that no birds reared at a low-quality study site returned there to breed, and birds reared there dispersed farther than birds reared at two other study sites. My results partially support the hypothesis that site fidelity is an adaptive response: (1) previously successful females that switched territories underperformed those that did not switch (P = 0.01); and (2) previously unsuccessful females that switched territories outperformed those that did not switch, but not significantly (P = 0.22).

  20. Feature Selection for Chemical Sensor Arrays Using Mutual Information

    PubMed Central

    Wang, X. Rosalind; Lizier, Joseph T.; Nowotny, Thomas; Berna, Amalia Z.; Prokopenko, Mikhail; Trowell, Stephen C.

    2014-01-01

    We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set, and established best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study using a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian Networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near optimal features for chemical sensor arrays. PMID:24595058
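The filter approach described above scores each (discretized) sensor feature by its mutual information with the chemical identity and keeps the highest-scoring ones, with no exhaustive search. A self-contained sketch on a toy discrete data set (hypothetical values, not the paper's sensor data):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete sequences of equal length."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def select_features(feature_columns, labels, k):
    """Rank discretized feature columns by MI with the class labels and
    keep the indices of the top k (a simple univariate filter; feature
    interactions are ignored, unlike a wrapper's exhaustive search)."""
    ranked = sorted(range(len(feature_columns)),
                    key=lambda i: mutual_information(feature_columns[i], labels),
                    reverse=True)
    return ranked[:k]

# Toy example: feature 0 tracks the label perfectly, feature 1 is noise.
labels = ['a', 'a', 'b', 'b', 'a', 'b']
features = [[0, 0, 1, 1, 0, 1],   # identical to labels -> high MI
            [1, 0, 1, 0, 0, 1]]   # weakly related -> low MI
best = select_features(features, labels, k=1)
```

Because MI is computed once per feature, the filter is far cheaper than the exhaustive wrapper search it is compared against in the abstract.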

  1. Exploiting molecular dynamics in Nested Sampling simulations of small peptides

    NASA Astrophysics Data System (ADS)

    Burkoff, Nikolas S.; Baldock, Robert J. N.; Várnai, Csilla; Wild, David L.; Csányi, Gábor

    2016-04-01

Nested Sampling (NS) is a parameter space sampling algorithm which can be used for sampling the equilibrium thermodynamics of atomistic systems. NS has previously been used to explore the potential energy surface of a coarse-grained protein model and has significantly outperformed parallel tempering when calculating heat capacity curves of Lennard-Jones clusters. The original NS algorithm uses Monte Carlo (MC) moves; however, a variant, Galilean NS, has recently been introduced which allows NS to be incorporated into a molecular dynamics framework, so NS can be used for systems which lack efficient prescribed MC moves. In this work we demonstrate the applicability of Galilean NS to atomistic systems. We present an implementation of Galilean NS using the Amber molecular dynamics package and demonstrate its viability by sampling alanine dipeptide, both in vacuo and in implicit solvent. Unlike previous studies of this system, we present the heat capacity curves of alanine dipeptide, whose calculation provides a stringent test for sampling algorithms. We also compare our results with those calculated using replica exchange molecular dynamics (REMD) and find good agreement. We show the computational effort required for accurate heat capacity estimation for small peptides. We also calculate the alanine dipeptide Ramachandran free energy surface for a range of temperatures and use it to compare the results using the latest Amber force field with previous theoretical and experimental results.
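The core NS loop maintains a set of "live" points, repeatedly discards the worst-likelihood point, and replaces it by sampling the prior under a rising likelihood floor; the discarded points yield an evidence estimate. A bare-bones sketch on a toy 1D Gaussian (the paper's Galilean NS replaces the Monte Carlo replacement step below with dynamics-based moves; everything here is illustrative, not the authors' implementation):

```python
import math
import random

random.seed(0)

def log_likelihood(x):
    # Toy 1D Gaussian likelihood on a uniform prior over [-10, 10].
    return -0.5 * x * x

def nested_sampling(n_live=100, n_iter=1000):
    """Bare-bones Nested Sampling with a random-walk replacement move."""
    live = [random.uniform(-10.0, 10.0) for _ in range(n_live)]
    log_z_terms = []
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda j: log_likelihood(live[j]))
        l_star = log_likelihood(live[worst])
        # Prior volume shrinks by ~exp(-1/n_live) per iteration; the
        # discarded point contributes (volume shell) * likelihood to Z.
        log_z_terms.append(-i / n_live
                           + math.log(1 - math.exp(-1 / n_live)) + l_star)
        # Replace the worst point: random-walk from a surviving live point,
        # accepting only moves that stay above the likelihood floor L*.
        idx = random.randrange(n_live)
        while idx == worst:
            idx = random.randrange(n_live)
        x = live[idx]
        for _ in range(20):
            y = x + random.gauss(0.0, 1.0)
            if -10.0 < y < 10.0 and log_likelihood(y) > l_star:
                x = y
        live[worst] = x
    # Log-sum-exp of the evidence terms (the residual live-point mass is
    # negligible for this toy run and omitted for brevity).
    m = max(log_z_terms)
    return m + math.log(sum(math.exp(t - m) for t in log_z_terms))

log_z = nested_sampling()  # analytic evidence: log(sqrt(2*pi)/20) ≈ -2.08
```

Thermodynamic quantities such as heat capacity curves follow from reweighting the same discarded-point sequence at different temperatures, which is why NS gives the whole curve from one run.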

  2. The financial performance of the health care industry: a global, regional and industry specific empirical investigation.

    PubMed

    Dorfleitner, Gregor; Rößle, Felix

    2018-05-01

This article analyzes the financial (out)performance of all listed health care companies. The health care sector outperformed the market in the period from 2000 to June 2015. The performance was driven by companies from the Americas and Asia, as well as by companies from the pharmaceuticals sub-segment. Additionally, bull periods appear to be the main driver of the outperformance. Euro-based investors can expect outcomes of their investments different from those of USD investors. However, the main trends remain unchanged.

  3. Comparison of crisp and fuzzy character networks in handwritten word recognition

    NASA Technical Reports Server (NTRS)

    Gader, Paul; Mohamed, Magdi; Chiang, Jung-Hsien

    1992-01-01

    Experiments involving handwritten word recognition on words taken from images of handwritten address blocks from the United States Postal Service mailstream are described. The word recognition algorithm relies on the use of neural networks at the character level. The neural networks are trained using crisp and fuzzy desired outputs. The fuzzy outputs were defined using a fuzzy k-nearest neighbor algorithm. The crisp networks slightly outperformed the fuzzy networks at the character level but the fuzzy networks outperformed the crisp networks at the word level.
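The fuzzy desired outputs mentioned above replace a one-hot training target with soft class memberships derived from a sample's k nearest neighbors. A sketch in the spirit of the classic fuzzy k-NN labeling scheme (the 0.51/0.49 constants follow that scheme; the paper's exact variant may differ):

```python
def fuzzy_memberships(neighbor_labels, own_label, classes):
    """Soft target vector for one training sample: its own class gets a
    guaranteed majority share (0.51), and the rest is distributed in
    proportion to how often each class appears among the k neighbors."""
    k = len(neighbor_labels)
    memberships = {}
    for c in classes:
        share = neighbor_labels.count(c) / k
        memberships[c] = 0.51 + 0.49 * share if c == own_label else 0.49 * share
    return memberships

# A handwritten 'a' whose 5 nearest neighbors are mostly 'a' but include 'o'
# (hypothetical example, not the Postal Service data):
targets = fuzzy_memberships(['a', 'a', 'o', 'a', 'o'], 'a', ['a', 'o'])
```

A character sitting in an ambiguous region of feature space thus trains the network toward a graded output rather than a hard 0/1 label, which is what the word-level recognizer exploits.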

  4. Improving biomedical information retrieval by linear combinations of different query expansion techniques.

    PubMed

    Abdulla, Ahmed AbdoAziz Ahmed; Lin, Hongfei; Xu, Bo; Banbhrani, Santosh Kumar

    2016-07-25

Biomedical literature retrieval is becoming increasingly complex, and there is a fundamental need for advanced information retrieval systems. Information Retrieval (IR) programs scour unstructured materials such as text documents in large reserves of data that are usually stored on computers. IR is concerned with the representation, storage, and organization of information items, as well as with access to them. One of the main problems in IR is to determine which documents are relevant to the user's needs and which are not. Under the current regime, users cannot construct queries precisely enough to retrieve particular pieces of data from large reserves of data, and basic information retrieval systems produce low-quality search results. In this paper we present a new technique for refining information retrieval searches to better represent the user's information need and thereby enhance retrieval performance, applying different query expansion techniques and linear combinations of them, where two expansion results are combined linearly at a time. Query expansions expand the search query, for example, by finding synonyms and reweighting original terms. They provide significantly more focused, particularized search results than do basic search queries. Retrieval performance is measured by variants of MAP (Mean Average Precision); according to our experimental results, the combination of the best query expansion results enhances the retrieved documents, outperforming our baseline by 21.06 % and even outperforming a previous study by 7.12 %. We propose several query expansion techniques and their linear combinations to make user queries more cognizable to search engines and to produce higher-quality search results.
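Combining two expansion results linearly amounts to interpolating the document scores each technique produces and re-ranking. A minimal sketch with hypothetical document scores (the paper tunes the interpolation weight empirically; names here are made up):

```python
def combine_scores(scores_a, scores_b, weight=0.5):
    """Linearly interpolate two retrieval score maps over the same corpus:
    score(d) = weight * a(d) + (1 - weight) * b(d).
    Returns document ids ranked by the combined score, best first."""
    docs = set(scores_a) | set(scores_b)
    combined = {d: weight * scores_a.get(d, 0.0)
                   + (1 - weight) * scores_b.get(d, 0.0)
                for d in docs}
    return sorted(combined, key=combined.get, reverse=True)

# Two hypothetical expansion techniques scoring three documents:
ranking = combine_scores({'d1': 0.9, 'd2': 0.4, 'd3': 0.1},
                         {'d1': 0.2, 'd2': 0.8, 'd3': 0.3},
                         weight=0.5)
```

A document ranked highly by both techniques dominates the combined list even when neither technique alone puts it first, which is the intuition behind combining complementary expansions.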

  5. Wafer-scale, massively parallel carbon nanotube arrays for realizing field effect transistors with current density exceeding silicon and gallium arsenide

    NASA Astrophysics Data System (ADS)

    Arnold, Michael

Calculations have indicated that aligned arrays of semiconducting carbon nanotubes (CNTs) promise to outperform conventional semiconducting materials in short-channel, aggressively scaled field effect transistors (FETs) like those used in semiconductor logic and high-frequency amplifier technologies. These calculations have been based on extrapolation of measurements of FETs based on one CNT, in which ballistic transport approaching the quantum conductance limit of 2G₀ = 4e²/h has been achieved. However, constraints in CNT sorting, processing, alignment, and contacts give rise to non-idealities when CNTs are implemented in densely packed parallel arrays, which has resulted in a conductance per CNT far from 2G₀. The consequence has been that it has been very difficult to create high-performance CNT array FETs, and CNT array FETs have not outperformed but rather underperformed channel materials such as Si by 6× or more. Here, we report nearly ballistic CNT array FETs at a density of 50 CNTs μm⁻¹, created via CNT sorting, wafer-scale alignment and assembly, and treatment. The on-state conductance in the arrays is as high as 0.46 G₀ per CNT, and the conductance of the arrays reaches 1.7 mS μm⁻¹, which is 7× higher than previous state-of-the-art CNT array FETs made by other methods. The saturated on-state current density reaches 900 μA μm⁻¹ and is similar to or exceeds that of Si FETs when compared at equivalent gate oxide thickness, off-state current density, and channel length. The on-state current density exceeds that of GaAs FETs as well. This leap in CNT array FET performance is a significant advance towards the exploitation of CNTs in high-performance semiconductor electronics technologies.

  6. More Technology, Better Learning Resources, Better Learning? Lessons from Adopting Virtual Microscopy in Undergraduate Medical Education

    PubMed Central

    Helle, Laura; Nivala, Markus; Kronqvist, Pauliina

    2013-01-01

    The adoption of virtual microscopy at the University of Turku, Finland, created a unique real-world laboratory for exploring ways of reforming the learning environment. The purpose of this study was to evaluate the students' reactions and the impact of a set of measures designed to boost an experimental group's understanding of abnormal histology through an emphasis on knowledge of normal cells and tissues. The set of measures included (1) digital resources to review normal structures and an entrance examination for enforcement, (2) digital course slides highlighting normal and abnormal tissues, and (3) self-diagnostic quizzes. The performance of historical controls was used as a baseline, as previous students had never been exposed to the above-mentioned measures. The students' understanding of normal histology was assessed in the beginning of the module to determine the impact of the first set of measures, whereas that of abnormal histology was assessed at the end of the module to determine the impact of the whole set of measures. The students' reactions to the instructional measures were assessed by course evaluation data. Additionally, four students were interviewed. Results confirmed that the experimental group significantly outperformed the historical controls in understanding normal histology. The students held favorable opinions on the idea of emphasizing normal structures. However, with regards to abnormal histology, the historical controls outperformed the experimental group. In conclusion, allowing students access to high-quality digitized materials and boosting prerequisite skills are clearly not sufficient to boost final competence. Instead, the solution may lie in making students externally accountable for their learning throughout their training. Anat Sci Educ 6: 73–80. © 2012 American Association of Anatomists. PMID:22930425

  7. Team Learning for Healthcare Quality Improvement

    PubMed Central

    Eppstein, Margaret J.; Horbar, Jeffrey D.

    2014-01-01

In organized healthcare quality improvement collaboratives (QICs), teams of practitioners from different hospitals exchange information on clinical practices with the aim of improving health outcomes at their own institutions. However, what works in one hospital may not work in others with different local contexts because of nonlinear interactions among various demographics, treatments, and practices. In previous studies of collaborations where the goal is collective problem solving, teams of diverse individuals have been shown to outperform teams of similar individuals. However, when the purpose of collaboration is knowledge diffusion in complex environments, it is not clear whether team diversity will help or hinder effective learning. In this paper, we first use an agent-based model of QICs to show that teams comprising similar individuals outperform those with more diverse individuals under nearly all conditions, and that this advantage increases with the complexity of the landscape and level of noise in assessing performance. Examination of data from a network of real hospitals provides encouraging evidence of a high degree of similarity in clinical practices, especially within teams of hospitals engaging in QIC teams. However, our model also suggests that groups of similar hospitals could benefit from larger teams and more open sharing of details on clinical outcomes than is currently the norm. To facilitate this, we propose a secure virtual collaboration system that would allow hospitals to efficiently identify potentially better practices in use at other institutions similar to theirs without any institution having to sacrifice the privacy of its own data. Our results may also have implications for other types of data-driven diffusive learning such as in personalized medicine and evolutionary search in noisy, complex combinatorial optimization problems. PMID:25360395

  8. Attentional focus and performance anxiety: effects on simulated race-driving performance and heart rate variability.

    PubMed

    Mullen, Richard; Faull, Andrea; Jones, Eleri S; Kingston, Kieran

    2012-01-01

    Previous studies have demonstrated that an external focus can enhance motor learning compared to an internal focus. The benefits of adopting an external focus are attributed to the use of less effortful automatic control processes, while an internal focus relies upon more effort-intensive consciously controlled processes. The aim of this study was to compare the effectiveness of a distal external focus with an internal focus in the acquisition of a simulated driving task and subsequent performance in a competitive condition designed to increase state anxiety. To provide further evidence for the automatic nature of externally controlled movements, the study included heart rate variability (HRV) as an index of mental effort. Sixteen participants completed eight blocks of four laps in either a distal external or internal focus condition, followed by two blocks of four laps in the competitive condition. During acquisition, the performance of both groups improved; however, the distal external focus group outperformed the internal focus group. The poorer performance of the internal focus group was accompanied by a larger reduction in HRV, indicating a greater investment of mental effort. In the competition condition, state anxiety increased, and for both groups, performance improved as a function of the increased anxiety. Increased heart rate and self-reported mental effort accompanied the performance improvement. The distal external focus group also outperformed the internal focus group across both neutral and competitive conditions and this more effective performance was again associated with lower levels of HRV. Overall, the results offer support for the suggestion that an external focus promotes a more automatic mode of functioning. In the competitive condition, both foci enhanced performance and while the improved performance may have been achieved at the expense of greater compensatory mental effort, this was not reflected in HRV scores.

  9. Extracting laboratory test information from biomedical text

    PubMed Central

    Kang, Yanna Shen; Kayaalp, Mehmet

    2013-01-01

    Background: No previous study reported the efficacy of current natural language processing (NLP) methods for extracting laboratory test information from narrative documents. This study investigates the pathology informatics question of how accurately such information can be extracted from text with the current tools and techniques, especially machine learning and symbolic NLP methods. The study data came from a text corpus maintained by the U.S. Food and Drug Administration, containing a rich set of information on laboratory tests and test devices. Methods: The authors developed a symbolic information extraction (SIE) system to extract device and test specific information about four types of laboratory test entities: Specimens, analytes, units of measures and detection limits. They compared the performance of SIE and three prominent machine learning based NLP systems, LingPipe, GATE and BANNER, each implementing a distinct supervised machine learning method, hidden Markov models, support vector machines and conditional random fields, respectively. Results: Machine learning systems recognized laboratory test entities with moderately high recall, but low precision rates. Their recall rates were relatively higher when the number of distinct entity values (e.g., the spectrum of specimens) was very limited or when lexical morphology of the entity was distinctive (as in units of measures), yet SIE outperformed them with statistically significant margins on extracting specimen, analyte and detection limit information in both precision and F-measure. Its high recall performance was statistically significant on analyte information extraction. Conclusions: Despite its shortcomings against machine learning methods, a well-tailored symbolic system may better discern relevancy among a pile of information of the same type and may outperform a machine learning system by tapping into lexically non-local contextual information such as the document structure. PMID:24083058
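The precision, recall, and F-measure comparisons above are computed over sets of extracted entities scored against a gold standard. A minimal sketch with hypothetical analyte mentions (made-up values, not the FDA corpus):

```python
def precision_recall_f1(true_entities, predicted_entities):
    """Exact-match precision, recall, and F1 over entity sets, as commonly
    used to score extraction systems like those compared in this study."""
    tp = len(true_entities & predicted_entities)          # true positives
    precision = tp / len(predicted_entities) if predicted_entities else 0.0
    recall = tp / len(true_entities) if true_entities else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical analyte mentions: gold standard vs. one system's output.
gold = {'glucose', 'hemoglobin', 'creatinine', 'sodium'}
pred = {'glucose', 'hemoglobin', 'potassium'}
p, r, f = precision_recall_f1(gold, pred)
```

The "high recall, low precision" pattern reported for the machine learning systems corresponds to predictions that cover most gold entities but also include many spurious ones.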

  10. Automatic identification of high impact articles in PubMed to support clinical decision making.

    PubMed

    Bian, Jiantao; Morid, Mohammad Amin; Jonnalagadda, Siddhartha; Luo, Gang; Del Fiol, Guilherme

    2017-09-01

    The practice of evidence-based medicine involves integrating the latest best available evidence into patient care decisions. Yet, critical barriers exist for clinicians' retrieval of evidence that is relevant for a particular patient from primary sources such as randomized controlled trials and meta-analyses. To help address those barriers, we investigated machine learning algorithms that find clinical studies with high clinical impact from PubMed®. Our machine learning algorithms use a variety of features including bibliometric features (e.g., citation count), social media attention, journal impact factors, and citation metadata. The algorithms were developed and evaluated with a gold standard composed of 502 high impact clinical studies that are referenced in 11 clinical evidence-based guidelines on the treatment of various diseases. We tested the following hypotheses: (1) our high impact classifier outperforms a state-of-the-art classifier based on citation metadata and citation terms, and PubMed's® relevance sort algorithm; and (2) the performance of our high impact classifier does not decrease significantly after removing proprietary features such as citation count. The mean top 20 precision of our high impact classifier was 34% versus 11% for the state-of-the-art classifier and 4% for PubMed's® relevance sort (p=0.009); and the performance of our high impact classifier did not decrease significantly after removing proprietary features (mean top 20 precision=34% vs. 36%; p=0.085). The high impact classifier, using features such as bibliometrics, social media attention and MEDLINE® metadata, outperformed previous approaches and is a promising alternative to identifying high impact studies for clinical decision support. Copyright © 2017 Elsevier Inc. All rights reserved.
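The "top 20 precision" metric used above is the fraction of the first 20 ranked articles that belong to the gold standard. A sketch with a hypothetical ranking (made-up identifiers, not the study's data):

```python
def top_k_precision(ranked_ids, relevant_ids, k=20):
    """Fraction of the top k retrieved articles that are in the gold
    standard set (the 'top 20 precision' reported in the abstract)."""
    top = ranked_ids[:k]
    return sum(1 for doc in top if doc in relevant_ids) / len(top)

# Hypothetical ranking where 5 of the first 20 articles are gold-standard.
ranking = [f'pmid{i}' for i in range(30)]
gold = {'pmid0', 'pmid3', 'pmid7', 'pmid12', 'pmid19', 'pmid25'}
p_at_20 = top_k_precision(ranking, gold, k=20)
```

On this toy ranking p@20 is 0.25; the abstract's 34% vs. 11% comparison is the same computation averaged over its 11 guideline-derived query sets.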

  11. Aging of theory of mind: the influence of educational level and cognitive processing.

    PubMed

    Li, Xiaoming; Wang, Kai; Wang, Fan; Tao, Qian; Xie, Yu; Cheng, Qi

    2013-01-01

    Previous studies of theory of mind (ToM) in old age have provided mixed results. We predicted that educational level and cognitive processing are two factors influencing the pattern of the aging of ToM. To test this hypothesis, a younger group who received higher education (mean age 20.46 years), an older group with an education level equal to that of the young group (mean age 76.29 years), and an older group with less education (mean age 73.52 years) were recruited. ToM tasks included the following tests: the second-order false-belief task, the faux-pas task, the eyes test, and tests of fundamental aspects of cognitive function that included two background tests (memory span and processing speed) and three subcomponents of executive function (inhibition, updating, and shifting). We found that the younger group and the older group with equally high education outperformed the older group with less education in false-belief and faux-pas tasks. However, there was no significant difference between the two former groups. The three groups of participants performed equivalently in the eyes test as well as in control tasks (false-belief control question, faux-pas control question, faux-pas control story, and Eyes Test control task). The younger group outperformed the other two groups in the cognitive processing tasks. Mediation analyses showed that difficulties in inhibition, memory span, and processing speed mediated the age differences in false-belief reasoning. Also, the variables of inhibition, updating, memory span, and processing speed mediated age-related variance in faux-pas. Discussion focused on the links between ToM aging, educational level, and cognitive processing. Supported by Chinese National Natural Science Foundation (number: 30870766) and Anhui Province Natural Science Foundation (number: 11040606M166).

  12. More technology, better learning resources, better learning? Lessons from adopting virtual microscopy in undergraduate medical education.

    PubMed

    Helle, Laura; Nivala, Markus; Kronqvist, Pauliina

    2013-01-01

The adoption of virtual microscopy at the University of Turku, Finland, created a unique real-world laboratory for exploring ways of reforming the learning environment. The purpose of this study was to evaluate the students' reactions and the impact of a set of measures designed to boost an experimental group's understanding of abnormal histology through an emphasis on knowledge of normal cells and tissues. The set of measures included (1) digital resources to review normal structures and an entrance examination for enforcement, (2) digital course slides highlighting normal and abnormal tissues, and (3) self-diagnostic quizzes. The performance of historical controls was used as a baseline, as previous students had never been exposed to the above-mentioned measures. The students' understanding of normal histology was assessed at the beginning of the module to determine the impact of the first set of measures, whereas that of abnormal histology was assessed at the end of the module to determine the impact of the whole set of measures. The students' reactions to the instructional measures were assessed using course evaluation data. Additionally, four students were interviewed. Results confirmed that the experimental group significantly outperformed the historical controls in understanding normal histology. The students held favorable opinions on the idea of emphasizing normal structures. However, with regard to abnormal histology, the historical controls outperformed the experimental group. In conclusion, allowing students access to high-quality digitized materials and boosting prerequisite skills are clearly not sufficient to boost final competence. Instead, the solution may lie in making students externally accountable for their learning throughout their training. Copyright © 2012 American Association of Anatomists.

  13. Noise Tolerance of Attractor and Feedforward Memory Models

    PubMed Central

    Lim, Sukbin; Goldman, Mark S.

    2017-01-01

    In short-term memory networks, transient stimuli are represented by patterns of neural activity that persist long after stimulus offset. Here, we compare the performance of two prominent classes of memory networks, feedback-based attractor networks and feedforward networks, in conveying information about the amplitude of a briefly presented stimulus in the presence of Gaussian noise. Using Fisher information as a metric of memory performance, we find that the optimal form of network architecture depends strongly on assumptions about the forms of nonlinearities in the network. For purely linear networks, we find that feedforward networks outperform attractor networks because noise is continually removed from feedforward networks when signals exit the network; as a result, feedforward networks can amplify signals they receive faster than noise accumulates over time. By contrast, attractor networks must operate in a signal-attenuating regime to avoid the buildup of noise. However, if the amplification of signals is limited by a finite dynamic range of neuronal responses or if noise is reset at the time of signal arrival, as suggested by recent experiments, we find that attractor networks can outperform feedforward ones. Under a simple model in which neurons have a finite dynamic range, we find that the optimal attractor networks are forgetful if there is no mechanism for noise reduction with signal arrival but nonforgetful (perfect integrators) in the presence of a strong reset mechanism. Furthermore, we find that the maximal Fisher information for the feedforward and attractor networks exhibits power law decay as a function of time and scales linearly with the number of neurons. These results highlight prominent factors that lead to trade-offs in the memory performance of networks with different architectures and constraints, and suggest conditions under which attractor or feedforward networks may be best suited to storing information about previous stimuli. PMID:22091664
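The purely linear comparison above can be sketched numerically. The parameters below (delay-line length, leak factor, noise level) are illustrative choices, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
T, sigma, trials = 50, 0.1, 2000
stim = 1.0  # stimulus amplitude to be remembered

# Feedforward delay line: the stored value picks up fresh noise once per
# stage as it is handed along, for T stages in total (unit gain assumed).
ff = np.full(trials, stim)
for _ in range(T):
    ff = ff + sigma * rng.standard_normal(trials)

# Signal-attenuating attractor (leak lam < 1): the stored value is
# revisited every step, so the signal decays while the noise equilibrates.
lam = 0.9
att = np.full(trials, stim)
for _ in range(T):
    att = lam * att + sigma * rng.standard_normal(trials)

# Fisher information about the stimulus: (d mean / d stim)^2 / variance.
fi_ff = 1.0 / ff.var()               # signal gain is 1 for the delay line
fi_att = lam ** (2 * T) / att.var()  # signal gain is lam**T
```

With these settings the delay line retains far more Fisher information after T steps than the attenuating attractor, illustrating the linear-regime trade-off described in the abstract.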

  14. Correlation, evaluation, and extension of linearized theories for tire motion and wheel shimmy

    NASA Technical Reports Server (NTRS)

    Smiley, Robert F

    1957-01-01

An evaluation is made of the existing theories of linearized tire motion and wheel shimmy. It is demonstrated that most of the previously published theories represent varying degrees of approximation to a summary theory developed in this report, which is a minor modification of the basic theory of Von Schlippe and Dietrich. In most cases where strong differences exist between the previously published theories and the summary theory, the previously published theories are shown to possess certain deficiencies. A series of systematic approximations to the summary theory is developed for the treatment of problems too simple to merit the use of the complete summary theory, and procedures are discussed for applying the summary theory and its systematic approximations to the shimmy of more complex landing-gear structures than have previously been considered. Comparisons of the existing experimental data with the predictions of the summary theory and the systematic approximations provide a fair substantiation of the more detailed approximate theories.

  15. Combining multiple positive training sets to generate confidence scores for protein-protein interactions.

    PubMed

    Yu, Jingkai; Finley, Russell L

    2009-01-01

    High-throughput experimental and computational methods are generating a wealth of protein-protein interaction data for a variety of organisms. However, data produced by current state-of-the-art methods include many false positives, which can hinder the analyses needed to derive biological insights. One way to address this problem is to assign confidence scores that reflect the reliability and biological significance of each interaction. Most previously described scoring methods use a set of likely true positives to train a model to score all interactions in a dataset. A single positive training set, however, may be biased and not representative of true interaction space. We demonstrate a method to score protein interactions by utilizing multiple independent sets of training positives to reduce the potential bias inherent in using a single training set. We used a set of benchmark yeast protein interactions to show that our approach outperforms other scoring methods. Our approach can also score interactions across data types, which makes it more widely applicable than many previously proposed methods. We applied the method to protein interaction data from both Drosophila melanogaster and Homo sapiens. Independent evaluations show that the resulting confidence scores accurately reflect the biological significance of the interactions.

  16. In-Network Processing of an Iceberg Join Query in Wireless Sensor Networks Based on 2-Way Fragment Semijoins

    PubMed Central

    Kang, Hyunchul

    2015-01-01

    We investigate the in-network processing of an iceberg join query in wireless sensor networks (WSNs). An iceberg join is a special type of join where only those joined tuples whose cardinality exceeds a certain threshold (called iceberg threshold) are qualified for the result. Processing such a join involves the value matching for the join predicate as well as the checking of the cardinality constraint for the iceberg threshold. In the previous scheme, the value matching is carried out as the main task for filtering non-joinable tuples while the iceberg threshold is treated as an additional constraint. We take an alternative approach, meeting the cardinality constraint first and matching values next. In this approach, with a logical fragmentation of the join operand relations on the aggregate counts of the joining attribute values, the optimal sequence of 2-way fragment semijoins is generated, where each fragment semijoin employs a Bloom filter as a synopsis of the joining attribute values. This sequence filters non-joinable tuples in an energy-efficient way in WSNs. Through implementation and a set of detailed experiments, we show that our alternative approach considerably outperforms the previous one. PMID:25774710
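As a rough sketch of the filtering idea, a Bloom filter can serve as the compact synopsis of joining-attribute values that one node ships to another in a fragment semijoin. The values and tuple names below are hypothetical, and the cardinality (iceberg) check is omitted:

```python
import hashlib

class BloomFilter:
    """Bit-array synopsis of a set: false positives are possible but
    false negatives are not, so no joinable tuple is ever dropped."""
    def __init__(self, m=256, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, value):
        # k independent hash positions derived from salted SHA-256 digests.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{value}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, value):
        for pos in self._positions(value):
            self.bits |= 1 << pos

    def might_contain(self, value):
        return all(self.bits >> pos & 1 for pos in self._positions(value))

# One 2-way fragment semijoin step: node A ships only the synopsis of its
# joining-attribute values; node B forwards just the possibly-matching tuples.
a_join_values = {"v1", "v2", "v7"}
synopsis = BloomFilter()
for v in a_join_values:
    synopsis.add(v)

b_tuples = [("t1", "v2"), ("t2", "v9"), ("t3", "v7")]
forwarded = [t for t in b_tuples if synopsis.might_contain(t[1])]
```

Shipping a few hundred bits instead of whole tuples is what makes such semijoin sequences energy-efficient in a WSN setting.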

  17. Feigning Amnesia Moderately Impairs Memory for a Mock Crime Video.

    PubMed

    Mangiulli, Ivan; van Oorsouw, Kim; Curci, Antonietta; Merckelbach, Harald; Jelicic, Marko

    2018-01-01

Previous studies showed that feigning amnesia for a crime impairs actual memory for the target event. Lack of rehearsal has been proposed as an explanation for this memory-undermining effect of feigning. The aim of the present study was to replicate and extend previous research by adopting a mock crime video instead of a narrative story. We showed participants a video of a violent crime. Next, they were requested to imagine that they had committed this offense and to either feign amnesia or confess the crime. A third condition was included: participants in the delayed test-only control condition did not receive any instruction. On subsequent recall tests, participants in all three conditions were instructed to report as much information as possible about the offense. On the free recall test, feigning amnesia impaired memory for the video clip, but participants who were asked to feign crime-related amnesia outperformed controls. However, no differences between simulators and confessors were found in either correct cued recollection or distortion and commission rates. We also explored whether inner speech might modulate memory for the crime. Inner speech traits were not found to be related to the simulating amnesia effect. Theoretical and practical implications of our results are discussed.

  18. Music training and working memory: an ERP study.

    PubMed

    George, Elyse M; Coch, Donna

    2011-04-01

    While previous research has suggested that music training is associated with improvements in various cognitive and linguistic skills, the mechanisms mediating or underlying these associations are mostly unknown. Here, we addressed the hypothesis that previous music training is related to improved working memory. Using event-related potentials (ERPs) and a standardized test of working memory, we investigated both neural and behavioral aspects of working memory in college-aged, non-professional musicians and non-musicians. Behaviorally, musicians outperformed non-musicians on standardized subtests of visual, phonological, and executive memory. ERPs were recorded in standard auditory and visual oddball paradigms (participants responded to infrequent deviant stimuli embedded in lists of standard stimuli). Electrophysiologically, musicians demonstrated faster updating of working memory (shorter latency P300s) in both the auditory and visual domains and musicians allocated more neural resources to auditory stimuli (larger amplitude P300), showing increased sensitivity to the auditory standard/deviant difference and less effortful updating of auditory working memory. These findings demonstrate that long-term music training is related to improvements in working memory, in both the auditory and visual domains and in terms of both behavioral and ERP measures. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. A New Approach for Resolving Conflicts in Actionable Behavioral Rules

    PubMed Central

    Zhu, Dan; Zeng, Daniel

    2014-01-01

Knowledge is considered actionable if users can take direct actions based on such knowledge to their advantage. Among the most important and distinctive actionable knowledge are actionable behavioral rules that can directly and explicitly suggest specific actions to take to influence (restrain or encourage) the behavior in the users' best interest. However, in mining such rules, it often occurs that different rules suggest the same actions with different expected utilities; we call these conflicting rules. A previously proposed method can resolve such conflicts, but the inconsistency of its rule-evaluation measure may hinder its performance. To overcome this problem, we develop a new method that uses a rule-ranking procedure as the basis for selecting the rule with the highest utility prediction accuracy. More specifically, we propose an integrative measure, which combines the measures of support and antecedent length, to evaluate the utility prediction accuracies of conflicting rules. We also introduce a tunable weight parameter to allow flexibility in the integration. We conduct several experiments to test our proposed approach and evaluate the sensitivity of the weight parameter. Empirical results indicate that our approach outperforms those from previous research. PMID:25162054
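The abstract does not give the exact formula, so the sketch below assumes a simple convex combination of support and normalized antecedent length, with a tunable weight w; the rule names and numbers are hypothetical:

```python
def integrative_score(support, antecedent_len, max_len, w=0.5):
    """Hypothetical integrative measure: a convex combination of support
    and normalized antecedent length, with w tuning the balance."""
    return w * support + (1 - w) * (antecedent_len / max_len)

def resolve_conflict(rules, w=0.5):
    """Among rules suggesting the same action, keep the rule ranked
    highest by the integrative measure."""
    return max(rules, key=lambda r: integrative_score(r["support"], r["len"], max_len=5, w=w))

conflicting = [
    {"name": "r1", "support": 0.30, "len": 4},  # specific rule, low support
    {"name": "r2", "support": 0.60, "len": 1},  # general rule, high support
]
support_winner = resolve_conflict(conflicting, w=0.9)   # w near 1: support dominates
length_winner = resolve_conflict(conflicting, w=0.1)    # w near 0: specificity dominates
```

Sweeping w is exactly the kind of sensitivity analysis the abstract describes for the weight parameter.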

  20. Retrieval evaluation and distance learning from perceived similarity between endomicroscopy videos.

    PubMed

    André, Barbara; Vercauteren, Tom; Buchner, Anna M; Wallace, Michael B; Ayache, Nicholas

    2011-01-01

Evaluating content-based retrieval (CBR) is challenging because it requires an adequate ground-truth. When the available ground-truth is limited to textual metadata such as pathological classes, retrieval results can only be evaluated indirectly, for example in terms of classification performance. In this study we first present a tool to generate perceived-similarity ground-truth that enables direct evaluation of endomicroscopic video retrieval. This tool uses a four-point Likert scale and collects subjective pairwise similarities perceived by multiple expert observers. We then evaluate against the generated ground-truth a previously developed dense bag-of-visual-words method for endomicroscopic video retrieval. Confirming the results of previous indirect evaluation based on classification, our direct evaluation shows that this method significantly outperforms several other state-of-the-art CBR methods. In a second step, we propose to improve the CBR method by learning an adjusted similarity metric from the perceived-similarity ground-truth. By minimizing a margin-based cost function that differentiates similar and dissimilar video pairs, we learn a weight vector applied to the visual word signatures of videos. Using cross-validation, we demonstrate that the learned similarity distance is significantly better correlated with the perceived similarity than the original visual-word-based distance.
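A minimal sketch of the margin-based idea, assuming a diagonal (per-visual-word) weighted squared distance and plain subgradient steps rather than the authors' exact optimizer; the toy signatures are hypothetical:

```python
import numpy as np

def wdist(w, x, y):
    """Weighted squared distance between visual-word signatures."""
    return float(np.dot(w, (x - y) ** 2))

def learn_weights(sim_pairs, dis_pairs, dim, margin=1.0, lr=0.05, epochs=200):
    """Minimize a hinge loss pushing similar pairs closer than dissimilar
    pairs by at least `margin`; weights are kept nonnegative so the
    result remains a valid distance."""
    w = np.ones(dim)
    for _ in range(epochs):
        for (xs, ys), (xd, yd) in zip(sim_pairs, dis_pairs):
            if wdist(w, xs, ys) - wdist(w, xd, yd) + margin > 0:
                grad = (xs - ys) ** 2 - (xd - yd) ** 2
                w = np.maximum(w - lr * grad, 0.0)
    return w

rng = np.random.default_rng(1)
# Toy signatures: dimension 0 carries the perceived similarity, dimension 1 is noise.
sim = [(np.array([0.1, rng.random()]), np.array([0.12, rng.random()])) for _ in range(20)]
dis = [(np.array([0.1, rng.random()]), np.array([0.9, rng.random()])) for _ in range(20)]
w = learn_weights(sim, dis, dim=2)
```

The learned weights should up-weight the informative dimension, which is the diagonal analogue of the weight vector described in the abstract.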

  1. Combining guilt-by-association and guilt-by-profiling to predict Saccharomyces cerevisiae gene function

    PubMed Central

    Tian, Weidong; Zhang, Lan V; Taşan, Murat; Gibbons, Francis D; King, Oliver D; Park, Julie; Wunderlich, Zeba; Cherry, J Michael; Roth, Frederick P

    2008-01-01

    Background: Learning the function of genes is a major goal of computational genomics. Methods for inferring gene function have typically fallen into two categories: 'guilt-by-profiling', which exploits correlation between function and other gene characteristics; and 'guilt-by-association', which transfers function from one gene to another via biological relationships. Results: We have developed a strategy ('Funckenstein') that performs guilt-by-profiling and guilt-by-association and combines the results. Using a benchmark set of functional categories and input data for protein-coding genes in Saccharomyces cerevisiae, Funckenstein was compared with a previous combined strategy. Subsequently, we applied Funckenstein to 2,455 Gene Ontology terms. In the process, we developed 2,455 guilt-by-profiling classifiers based on 8,848 gene characteristics and 12 functional linkage graphs based on 23 biological relationships. Conclusion: Funckenstein outperforms a previous combined strategy using a common benchmark dataset. The combination of 'guilt-by-profiling' and 'guilt-by-association' gave significant improvement over the component classifiers, showing the greatest synergy for the most specific functions. Performance was evaluated by cross-validation and by literature examination of the top-scoring novel predictions. These quantitative predictions should help prioritize experimental study of yeast gene functions. PMID:18613951

  2. Multi-Robot Coalitions Formation with Deadlines: Complexity Analysis and Solutions

    PubMed Central

    2017-01-01

Multi-robot task allocation is one of the main problems to address in order to design a multi-robot system, especially when robots form coalitions that must carry out tasks before a deadline. Many factors affect the performance of these systems; among them, this paper focuses on the physical interference effect, produced when two or more robots want to access the same point simultaneously. To the best of our knowledge, this paper presents the first formal description of multi-robot task allocation that includes a model of interference. Thanks to this description, the complexity of the allocation problem is analyzed. Moreover, the main contribution of this paper is to provide the conditions under which the optimal solution of the aforementioned allocation problem can be obtained by solving an integer linear problem. The optimal results are compared to previous allocation algorithms already proposed by the first two authors of this paper and to a new method proposed in this paper. The results obtained show how the new task allocation algorithms reach more than 80% of the median of the optimal solution, outperforming previous auction algorithms with a huge reduction of the execution time. PMID:28118384

  3. Delay of gratification by orangutans (Pongo pygmaeus) in the accumulation task.

    PubMed

    Parrish, Audrey E; Perdue, Bonnie M; Stromberg, Erin E; Bania, Amanda E; Evans, Theodore A; Beran, Michael J

    2014-05-01

    There is considerable evidence indicating that chimpanzees can delay gratification for extended time intervals, particularly in the accumulation task in which food items accumulate within a participant's reach until the participant begins to consume them. However, there is limited evidence that other ape species might also exhibit this capacity, despite there being a number of similar studies indicating that nonape species (e.g., monkeys and birds) can delay gratification, but not for nearly as long as chimpanzees. To help define the taxonomic distribution of delay of gratification behavior in the order Primates, we tested 6 orangutans in the current experiments and compared their performance with comparable data from a previous study with capuchin monkeys. We varied delay length and visibility of the items that were still available for accumulation to determine the impact of these factors on performance. Species differences on the accumulation task emerged when comparing the current data to data from a previous study. Orangutans outperformed capuchin monkeys, suggesting that ape species may generally show better delay of gratification and delay maintenance abilities than monkeys. However, more studies are necessary to rule out alternative hypotheses on the distribution of delay maintenance abilities across primate species. ©2014 APA, all rights reserved.

  4. Performance comparison of six independent components analysis algorithms for fetal signal extraction from real fMCG data

    NASA Astrophysics Data System (ADS)

    Hild, Kenneth E.; Alleva, Giovanna; Nagarajan, Srikantan; Comani, Silvia

    2007-01-01

    In this study we compare the performance of six independent components analysis (ICA) algorithms on 16 real fetal magnetocardiographic (fMCG) datasets for the application of extracting the fetal cardiac signal. We also compare the extraction results for real data with the results previously obtained for synthetic data. The six ICA algorithms are FastICA, CubICA, JADE, Infomax, MRMI-SIG and TDSEP. The results obtained using real fMCG data indicate that the FastICA method consistently outperforms the others in regard to separation quality and that the performance of an ICA method that uses temporal information suffers in the presence of noise. These two results confirm the previous results obtained using synthetic fMCG data. There were also two notable differences between the studies based on real and synthetic data. The differences are that all six ICA algorithms are independent of gestational age and sensor dimensionality for synthetic data, but depend on gestational age and sensor dimensionality for real data. It is possible to explain these differences by assuming that the number of point sources needed to completely explain the data is larger than the dimensionality used in the ICA extraction.
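For illustration, a minimal two-component FastICA (tanh contrast, symmetric decorrelation) can separate a toy instantaneous mixture. This is a sketch, not the benchmarked implementations, and the signals are synthetic stand-ins rather than fMCG data:

```python
import numpy as np

def fastica_2(X, iters=200, seed=0):
    """Minimal two-component FastICA: whiten, then fixed-point updates
    with a tanh contrast and symmetric decorrelation."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # Whitening: decorrelate channels and scale them to unit variance.
    d, E = np.linalg.eigh(np.cov(X))
    Z = E @ np.diag(d ** -0.5) @ E.T @ X
    W = rng.standard_normal((2, 2))
    for _ in range(iters):
        G = np.tanh(W @ Z)
        # Fixed-point step: E[z g(w'z)] - E[g'(w'z)] w, row-wise.
        W = (G @ Z.T) / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W')^(-1/2) W
        d2, E2 = np.linalg.eigh(W @ W.T)
        W = E2 @ np.diag(d2 ** -0.5) @ E2.T @ W
    return W @ Z

t = np.linspace(0, 8, 4000)
S = np.vstack([np.sin(7 * t), np.sign(np.sin(3 * t))])  # two independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])                  # unknown mixing matrix
recovered = fastica_2(A @ S)
```

ICA recovers sources only up to sign, permutation, and scale, so any quality check should compare absolute correlations rather than raw values.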

  5. Combined assessment of DYRK1A, BDNF and homocysteine levels as diagnostic marker for Alzheimer's disease.

    PubMed

    Janel, N; Alexopoulos, P; Badel, A; Lamari, F; Camproux, A C; Lagarde, J; Simon, S; Feraudet-Tarisse, C; Lamourette, P; Arbones, M; Paul, J L; Dubois, B; Potier, M C; Sarazin, M; Delabar, J M

    2017-06-20

    Early identification of Alzheimer's disease (AD) risk factors would aid development of interventions to delay the onset of dementia, but current biomarkers are invasive and/or costly to assess. Validated plasma biomarkers would circumvent these challenges. We previously identified the kinase DYRK1A in plasma. To validate DYRK1A as a biomarker for AD diagnosis, we assessed the levels of DYRK1A and the related markers brain-derived neurotrophic factor (BDNF) and homocysteine in two unrelated AD patient cohorts with age-matched controls. Receiver-operating characteristic curves and logistic regression analyses showed that combined assessment of DYRK1A, BDNF and homocysteine has a sensitivity of 0.952, a specificity of 0.889 and an accuracy of 0.933 in testing for AD. The blood levels of these markers provide a diagnosis assessment profile. Combined assessment of these three markers outperforms most of the previous markers and could become a useful substitute to the current panel of AD biomarkers. These results associate a decreased level of DYRK1A with AD and challenge the use of DYRK1A inhibitors in peripheral tissues as treatment. These measures will be useful for diagnosis purposes.
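The combined-marker idea can be sketched with a plain logistic regression on synthetic marker levels. The numbers below are invented stand-ins, not the cohort data, and the study itself derives its cut-offs from ROC analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
# Hypothetical marker levels (arbitrary units): a DYRK1A-like marker that
# is lower in patients and a homocysteine-like marker that is higher.
patients = np.c_[rng.normal(0.8, 0.3, n), rng.normal(16.0, 3.0, n)]
controls = np.c_[rng.normal(1.5, 0.3, n), rng.normal(11.0, 3.0, n)]
X = np.vstack([patients, controls])
y = np.r_[np.ones(n), np.zeros(n)]
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each marker

def logistic_fit(X, y, lr=0.1, epochs=500):
    """Gradient-descent logistic regression combining the markers into
    one diagnostic score."""
    Xb = np.c_[np.ones(len(X)), X]
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w, 1.0 / (1.0 + np.exp(-Xb @ w))

w, scores = logistic_fit(X, y)
pred = scores > 0.5
sensitivity = pred[y == 1].mean()     # fraction of patients flagged
specificity = (~pred)[y == 0].mean()  # fraction of controls cleared
```

Combining the two markers separates the groups better than either alone, which is the point of the multi-marker profile in the abstract.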

  6. Multi-Robot Coalitions Formation with Deadlines: Complexity Analysis and Solutions.

    PubMed

    Guerrero, Jose; Oliver, Gabriel; Valero, Oscar

    2017-01-01

Multi-robot task allocation is one of the main problems to address in order to design a multi-robot system, especially when robots form coalitions that must carry out tasks before a deadline. Many factors affect the performance of these systems; among them, this paper focuses on the physical interference effect, produced when two or more robots want to access the same point simultaneously. To the best of our knowledge, this paper presents the first formal description of multi-robot task allocation that includes a model of interference. Thanks to this description, the complexity of the allocation problem is analyzed. Moreover, the main contribution of this paper is to provide the conditions under which the optimal solution of the aforementioned allocation problem can be obtained by solving an integer linear problem. The optimal results are compared to previous allocation algorithms already proposed by the first two authors of this paper and to a new method proposed in this paper. The results obtained show how the new task allocation algorithms reach more than 80% of the median of the optimal solution, outperforming previous auction algorithms with a huge reduction of the execution time.
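The deadline-constrained allocation can be illustrated with a toy exhaustive search standing in for the integer linear program. Coalition speed-up is assumed linear and interference is ignored, so this shows only the shape of the problem, not the paper's formulation:

```python
from itertools import product

# Toy instance: each robot joins one task's coalition; a task is completed
# only if its coalition is large enough to finish before the deadline.
tasks = {
    "t1": dict(work=6.0, deadline=2.0, utility=10.0),
    "t2": dict(work=2.0, deadline=2.0, utility=4.0),
}
robot_speed = 1.0
robots = ["r1", "r2", "r3", "r4"]

def value(assignment):
    """Total utility of tasks whose coalition meets the deadline:
    a coalition of n robots finishes work w in w / (n * speed)."""
    total = 0.0
    for t, spec in tasks.items():
        n = sum(1 for a in assignment if a == t)
        if n and spec["work"] / (n * robot_speed) <= spec["deadline"]:
            total += spec["utility"]
    return total

# Optimal allocation by exhaustive enumeration (a stand-in for the ILP;
# feasible only for tiny instances, which is why the complexity matters).
best = max(product(tasks, repeat=len(robots)), key=value)
```

Here t1 needs a coalition of at least three robots to beat its deadline, so the optimum sends three robots to t1 and one to t2.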

  7. Deep Recurrent Neural Network-Based Autoencoders for Acoustic Novelty Detection

    PubMed Central

    Vesperini, Fabio; Schuller, Björn

    2017-01-01

In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short-term frame are predicted from the previous frames by means of Long Short-Term Memory recurrent denoising autoencoders. The reconstruction error between the input and the output of the autoencoder is used as the activation signal to detect novel events. There is no evidence of studies focused on comparing previous efforts to automatically recognize novel events from audio signals and giving a broad and in-depth evaluation of recurrent neural network (RNN)-based autoencoders. The present contribution aims to consistently evaluate our recent novel approaches to fill this gap in the literature and provide insight through extensive evaluations carried out on three databases: A3Novelty, PASCAL CHiME, and PROMETHEUS. Besides providing an extensive analysis of novel and state-of-the-art methods, the article shows how RNN-based autoencoders outperform statistical approaches by up to an absolute improvement of 16.4% average F-measure over the three databases. PMID:28182121

  8. The use of genetic programming to develop a predictor of swash excursion on sandy beaches

    NASA Astrophysics Data System (ADS)

    Passarella, Marinella; Goldstein, Evan B.; De Muro, Sandro; Coco, Giovanni

    2018-02-01

We use genetic programming (GP), a type of machine learning (ML) approach, to predict the total and infragravity swash excursion using previously published data sets that have been used extensively in swash prediction studies. Three previously published works with a range of new conditions are added to this data set to extend the range of measured swash conditions. Using this newly compiled data set, we demonstrate that a ML approach can reduce prediction errors compared to well-established parameterizations and therefore may improve coastal hazard assessment (e.g. coastal inundation). Predictors obtained using GP can also be physically sound and replicate the functionality and dependencies of previously published formulas. Overall, we show that ML techniques are capable of both improving predictability (compared to classical regression approaches) and providing physical insight into coastal processes.

  9. Smell and taste function in the visually impaired.

    PubMed

    Smith, R S; Doty, R L; Burlingame, G K; McKeown, D A

    1993-11-01

    Surprisingly few quantitative studies have addressed the question of whether visually impaired individuals evidence, perhaps in compensation for their loss of vision, increased acuteness in their other senses. In this experiment we sought to determine whether blind subjects outperform sighted subjects on a number of basic tests of chemosensory function. Over 50 blind and 75 sighted subjects were administered the following olfactory and gustatory tests: the University of Pennsylvania Smell Identification Test (UPSIT); a 16-item odor discrimination test; and a suprathreshold taste test in which measures of taste-quality identification and ratings of the perceived intensity and pleasantness of sucrose, citric acid, sodium chloride, and caffeine were obtained. In addition, 39 blind subjects and 77 sighted subjects were administered a single staircase phenyl ethyl alcohol (PEA) odor detection threshold test. Twenty-three of the sighted subjects were employed by the Philadelphia Water Department and trained to serve on its water quality evaluation panel. The primary findings of the study were that (a) the blind subjects did not outperform sighted subjects on any test of chemosensory function and (b) the trained subjects significantly outperformed the other two groups on the odor detection, odor discrimination, and taste identification tests, and nearly outperformed the blind subjects on the UPSIT. The citric acid concentrations received larger pleasantness ratings from the trained panel members than from the blind subjects, whose ratings did not differ significantly from those of the untrained sighted subjects. Overall, the data imply that blindness, per se, has little influence on chemosensory function and add further support to the notion that specialized training enhances performance on a number of chemosensory tasks.

  10. Extraction of memory colors for preferred color correction in digital TVs

    NASA Astrophysics Data System (ADS)

    Ryu, Byong Tae; Yeom, Jee Young; Kim, Choon-Woo; Ahn, Ji-Young; Kang, Dong-Woo; Shin, Hyun-Ho

    2009-01-01

Subjective image quality is one of the most important performance indicators for digital TVs. In order to improve subjective image quality, preferred color correction is often employed. More specifically, areas of memory colors such as skin, grass, and sky are modified to generate a pleasing impression for viewers. Before applying the preferred color correction, the tendency of preference for memory colors should be identified. This is often accomplished by off-line human visual tests. Areas containing the memory colors should be extracted, and then color correction is applied to the extracted areas. These processes should be performed on-line. This paper presents a new method for area extraction of three types of memory colors. Performance of the proposed method is evaluated by calculating the correct and false detection ratios. Experimental results indicate that the proposed method outperforms previous methods for memory color extraction.

  11. Accurate estimation of influenza epidemics using Google search data via ARGO

    PubMed Central

    Yang, Shihao; Santillana, Mauricio; Kou, S. C.

    2015-01-01

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980
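The core of an ARGO-style model, autoregression on past activity plus a current search-volume regressor, can be sketched with ordinary least squares on synthetic series. The real ARGO uses regularization and many query terms; the data below are invented stand-ins, not ILI or Google Trends data:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 300
# Synthetic "flu activity": a seasonal signal plus noise, and a noisy
# "search volume" proxy that co-moves with it.
flu = 10 + 3 * np.sin(2 * np.pi * np.arange(T) / 52) + 0.3 * rng.standard_normal(T)
search = flu + 0.5 * rng.standard_normal(T)

def fit_argo_like(y, x, p=3):
    """Least-squares regression of y on an intercept, its own p lags,
    and the current exogenous search term."""
    rows = [np.r_[1.0, y[t - p:t][::-1], x[t]] for t in range(p, len(y))]
    X, target = np.array(rows), y[p:]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return beta, X @ beta

beta, fitted = fit_argo_like(flu, search)
resid = flu[3:] - fitted  # in-sample tracking error
```

The search regressor carries most of the predictive weight here, while the autoregressive lags supply the seasonality and self-correction the abstract emphasizes.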

  12. Modifications to the Objective Lightning Probability Forecast Tool at Kennedy Space Center/Cape Canaveral Air Force Station, Florida

    NASA Technical Reports Server (NTRS)

    Crawford, Winifred; Roeder, William

    2010-01-01

    The 45th Weather Squadron (45 WS) at Cape Canaveral Air Force Station (CCAFS) includes the probability of lightning occurrence in their 24-Hour and Weekly Planning Forecasts, briefed at 0700 EDT for daily operations planning on Kennedy Space Center (KSC) and CCAFS. This forecast is based on subjective analyses of model and observational data and output from an objective tool developed by the Applied Meteorology Unit (AMU). This tool was developed over two phases (Lambert and Wheeler 2005, Lambert 2007). It consists of five equations, one for each warm season month (May-Sep), that calculate the probability of lightning occurrence for the day and a graphical user interface (GUI) to display the output. The Phase I and II equations outperformed previous operational tools by a total of 56%. Based on this success, the 45 WS tasked the AMU with Phase III to improve the tool further.

  13. Small target detection using objectness and saliency

    NASA Astrophysics Data System (ADS)

    Zhang, Naiwen; Xiao, Yang; Fang, Zhiwen; Yang, Jian; Wang, Li; Li, Tao

    2017-10-01

    We are motivated by the need for a generic object detection algorithm that achieves high recall for small targets in complex scenes with acceptable computational efficiency. We propose a novel object detection algorithm with high localization quality and acceptable computational cost. First, we obtain the objectness map as in BING[1] and use non-maximum suppression (NMS) to keep the top N points. Then, the k-means algorithm clusters these points into K classes according to their locations, and the K class centers are taken as seed points. For each seed point, an object potential region is extracted. Finally, a fast salient object detection algorithm[2] is applied to the object potential regions to highlight object-like pixels, and a series of efficient post-processing operations locate the targets. Our method runs at 5 FPS on 1000×1000 images and significantly outperforms previous methods on small targets in cluttered backgrounds.
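
    The seeding step described above, clustering the top-N objectness peaks into K groups and taking the cluster centres as seed points, can be sketched with a minimal Lloyd's k-means. The deterministic initialisation and the toy coordinates are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def seed_points(points, k=3, iters=20):
    """Cluster top-N objectness peaks (x, y) into k groups with a minimal
    Lloyd's k-means and return the cluster centres, which serve as seed
    points for object potential regions."""
    pts = np.asarray(points, dtype=float)
    centres = pts[:k].copy()  # simple deterministic initialisation
    for _ in range(iters):
        # assign each point to its nearest centre
        labels = np.argmin(((pts[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        # move each centre to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centres[j] = pts[labels == j].mean(axis=0)
    return centres
```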

  14. Ultra-low contact resistance in graphene devices at the Dirac point

    NASA Astrophysics Data System (ADS)

    Anzi, Luca; Mansouri, Aida; Pedrinazzi, Paolo; Guerriero, Erica; Fiocco, Marco; Pesquera, Amaia; Centeno, Alba; Zurutuza, Amaia; Behnam, Ashkan; Carrion, Enrique A.; Pop, Eric; Sordan, Roman

    2018-04-01

    Contact resistance is one of the main factors limiting performance of short-channel graphene field-effect transistors (GFETs), preventing their use in low-voltage applications. Here we investigated the contact resistance between graphene grown by chemical vapor deposition (CVD) and different metals, and found that etching holes in graphene below the contacts consistently reduced the contact resistance, down to 23 Ω·μm with Au contacts. This low contact resistance was obtained at the Dirac point of graphene, in contrast to previous studies where the lowest contact resistance was obtained at the highest carrier density in graphene (here 200 Ω·μm was obtained under such conditions). The ‘holey’ Au contacts were implemented in GFETs which exhibited an average transconductance of 940 S m-1 at a drain bias of only 0.8 V and gate length of 500 nm, outperforming GFETs with conventional Au contacts.

  15. Network-assisted target identification for haploinsufficiency and homozygous profiling screens

    PubMed Central

    Wang, Sheng

    2017-01-01

    Chemical genomic screens have recently emerged as a systematic approach to drug discovery on a genome-wide scale. Drug target identification and elucidation of the mechanism of action (MoA) of hits from these noisy high-throughput screens remain difficult. Here, we present GIT (Genetic Interaction Network-Assisted Target Identification), a network analysis method for drug target identification in haploinsufficiency profiling (HIP) and homozygous profiling (HOP) screens. In addition to the drug-induced fitness defect of a gene's deletion, GIT incorporates the fitness defects of that gene's neighbors in the genetic interaction network. On three genome-scale yeast chemical genomic screens, GIT substantially outperforms previous scoring methods for target identification in both HIP and HOP assays. Finally, we show that combining the HIP and HOP assays further boosts target identification and reveals a drug's potential mechanism of action. PMID:28574983
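
    The core scoring idea, combining a gene's own drug-induced fitness defect with the defects of its genetic-interaction neighbours, can be sketched as a simple weighted combination. The weight alpha, the linear form, and the toy data are illustrative assumptions, not the published scoring function.

```python
def git_score(defect, neighbours, alpha=0.7):
    """Score each gene as a weighted mix of its own drug-induced fitness
    defect and the mean defect of its genetic-interaction neighbours.
    defect: {gene: fitness defect}; neighbours: {gene: [neighbour genes]}."""
    scores = {}
    for gene, own in defect.items():
        nbrs = [defect[n] for n in neighbours.get(gene, []) if n in defect]
        neigh = sum(nbrs) / len(nbrs) if nbrs else 0.0
        scores[gene] = alpha * own + (1 - alpha) * neigh
    return scores
```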

  16. Using the MMPI-2-RF to discriminate psychometrically identified schizotypic college students from a matched comparison sample.

    PubMed

    Hunter, Helen K; Bolinskey, P Kevin; Novi, Jonathan H; Hudak, Daniel V; James, Alison V; Myers, Kevin R; Schuder, Kelly M

    2014-01-01

    This study investigates the extent to which the Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF) profiles of 52 individuals making up a psychometrically identified schizotypes (SZT) sample could be successfully discriminated from the protocols of 52 individuals in a matched comparison (MC) sample. Replication analyses were performed with an additional 53 pairs of SZT and MC participants. Results showed significant differences in mean T-score values between these 2 groups across a variety of MMPI-2-RF scales. Results from discriminant function analyses indicate that schizotypy can be predicted effectively using 4 MMPI-2-RF scales and that this method of classification held up on replication. Additional results demonstrated that these MMPI-2-RF scales nominally outperformed MMPI-2 scales suggested by previous research as being indicative of schizophrenia liability. Directions for future research with the MMPI-2-RF are suggested.

  17. Joint histogram-based cost aggregation for stereo matching.

    PubMed

    Min, Dongbo; Lu, Jiangbo; Do, Minh N

    2013-10-01

    This paper presents a novel method for performing efficient cost aggregation in stereo matching. The cost aggregation problem is reformulated from the perspective of a histogram, giving us the potential to reduce the complexity of cost aggregation in stereo matching significantly. Unlike previous methods, which tried to reduce the complexity in terms of the size of the image and the matching window, our approach focuses on reducing the computational redundancy across the disparity search range caused by repeated filtering for all hypotheses. Moreover, we also reduce the complexity of the window-based filtering through an efficient sampling scheme inside the matching window. The tradeoff between accuracy and complexity is extensively investigated by varying the parameters used in the proposed method. Experimental results show that the proposed method provides high-quality disparity maps with low complexity and outperforms existing local methods. This paper also provides new insights into complexity-constrained stereo-matching algorithm design.

  18. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG-inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to outperform Ptrc constructs. Finally, an RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  19. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE PAGES

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.; ...

    2014-09-12

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG-inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to outperform Ptrc constructs. Finally, an RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  20. Personalized recommendation based on preferential bidirectional mass diffusion

    NASA Astrophysics Data System (ADS)

    Chen, Guilin; Gao, Tianrun; Zhu, Xuzhen; Tian, Hui; Yang, Zhao

    2017-03-01

    Recommendation systems provide a promising way to alleviate the dilemma of information overload. In physical dynamics, mass diffusion has been used to design effective recommendation algorithms on bipartite networks. However, most previous studies focus overwhelmingly on unidirectional mass diffusion from collected objects to uncollected objects, while overlooking the opposite direction, leading to the risk of similarity estimation deviation and performance degradation. In addition, they are biased towards recommending popular objects, which does not necessarily improve accuracy and deprives recommendations of the diversity and novelty that contribute to the vitality of the system. To overcome these disadvantages, we propose a preferential bidirectional mass diffusion (PBMD) algorithm that penalizes the weight of popular objects in bidirectional diffusion. Experiments are evaluated on three benchmark datasets (Movielens, Netflix and Amazon) by 10-fold cross validation, and results indicate that PBMD remarkably outperforms the mainstream methods in accuracy, diversity and novelty.
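
    The diffusion mechanics can be sketched on a small user-object adjacency matrix. The sketch below is a standard two-step mass-diffusion pass with a degree-based popularity penalty; the exponent beta, where the penalty is applied, and the bidirectional refinement of PBMD itself are simplified or omitted, so this is an illustration of the family of methods rather than the paper's exact update.

```python
import numpy as np

def diffusion_scores(A, user, beta=0.8):
    """Two-step mass diffusion on a user-object bipartite adjacency matrix
    A (users x objects).  Object degrees are raised to -beta so popular
    objects receive less resource."""
    k_obj = A.sum(axis=0)  # object degrees
    k_usr = A.sum(axis=1)  # user degrees
    # objects -> users: each collected object spreads its (penalised) resource
    res_u = A @ (A[user] / np.where(k_obj > 0, k_obj, 1) ** beta)
    # users -> objects: redistribute the resource back to the object side
    scores = A.T @ (res_u / np.where(k_usr > 0, k_usr, 1))
    scores[A[user] > 0] = 0.0  # do not re-recommend collected objects
    return scores
```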

  1. Information Filtering via Heterogeneous Diffusion in Online Bipartite Networks

    PubMed Central

    Zhang, Fu-Guo; Zeng, An

    2015-01-01

    The rapid expansion of the Internet brings us more online information than any individual can go through. Recommender systems were therefore created to help people dig through this abundance of information. In networks composed of users and objects, recommender algorithms based on diffusion have proven to be among the best performing methods. Previous works considered the diffusion process from user to object and from object to user to be equivalent. We show in this work that this is not the case, and we improve the quality of the recommendation by taking into account the asymmetrical nature of this process. We apply this idea to modify state-of-the-art recommendation methods. The simulation results show that the new methods can outperform these existing methods in both recommendation accuracy and diversity. Finally, this modification is shown to improve the recommendation in a realistic case. PMID:26125631

  2. A vertical handoff decision algorithm based on ARMA prediction model

    NASA Astrophysics Data System (ADS)

    Li, Ru; Shen, Jiao; Chen, Jun; Liu, Qiuhuan

    2012-01-01

    With the development of computer technology and the increasing demand for mobile communications, next generation wireless networks will be composed of various wireless networks (e.g., WiMAX and WiFi). Vertical handoff is a key technology of next generation wireless networks, and during the vertical handoff procedure the handoff decision is crucial for efficient mobility. Based on the auto-regressive moving average (ARMA) prediction model, we propose a vertical handoff decision algorithm that aims to improve the performance of vertical handoff and avoid unnecessary handoffs. From the current and previous received signal strength (RSS) values, the proposed approach uses an ARMA model to predict the next RSS, and then uses the predicted RSS to decide whether to fire the link-layer triggering event and complete the vertical handoff. The simulation results indicate that the proposed algorithm outperforms the threshold-based RSS scheme in both handoff performance and the number of handoffs.
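
    The predict-then-trigger logic can be sketched with the autoregressive part of the model: fit an AR(p) model to the RSS history and raise the trigger only when the predicted, rather than current, RSS crosses the handoff threshold. A pure AR fit by least squares stands in here for the full ARMA estimate, and the threshold value is illustrative.

```python
import numpy as np

def predict_next_rss(rss, p=2):
    """Fit an AR(p) model to the RSS history by least squares and return
    the one-step-ahead prediction."""
    rows = [rss[t - p:t] for t in range(p, len(rss))]
    coef, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rss[p:]), rcond=None)
    return float(np.asarray(rss[-p:]) @ coef)

def should_trigger(rss, threshold=-85.0):
    """Raise the link-layer trigger only when the *predicted* RSS falls
    below the handoff threshold (threshold in dBm, illustrative)."""
    return predict_next_rss(rss) < threshold
```

    A steadily falling RSS trace triggers the handoff one step early, while a merely fluctuating trace does not, which is the "avoid unnecessary handoff" behaviour the abstract describes.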

  3. Simulating and assessing boson sampling experiments with phase-space representations

    NASA Astrophysics Data System (ADS)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.

    2018-04-01

    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  4. A quasi-current representation for information needs inspired by Two-State Vector Formalism

    NASA Astrophysics Data System (ADS)

    Wang, Panpan; Hou, Yuexian; Li, Jingfei; Zhang, Yazhou; Song, Dawei; Li, Wenjie

    2017-09-01

    Recently, a number of quantum theory (QT)-based information retrieval (IR) models have been proposed for modeling the session search task, in which users issue queries continuously to describe their evolving information needs (IN). However, the standard formalism of QT cannot provide a complete description of users' current IN, in the sense that it does not take 'future' information into consideration. Therefore, to seek a more proper and complete representation of users' IN, we construct a representation of quasi-current IN inspired by the emerging Two-State Vector Formalism (TSVF). Motivated by the completeness of TSVF, a "two-state vector" derived from the 'future' (the current query) and the 'history' (the previous query) is employed to describe users' quasi-current IN in a more complete way. Extensive experiments are conducted on the session tracks of TREC 2013 & 2014, and show that our model outperforms a series of compared IR models.

  5. Target Fishing for Chemical Compounds using Target-Ligand Activity data and Ranking based Methods

    PubMed Central

    Wale, Nikil; Karypis, George

    2009-01-01

    In recent years the development of computational techniques that identify all the likely targets for a given chemical compound, termed the problem of Target Fishing, has been an active area of research. Identification of the likely targets of a chemical compound helps in understanding problems such as toxicity, lack of efficacy in humans, and poor physical properties associated with that compound in the early stages of drug discovery. In this paper we present a set of techniques whose goal is to rank or prioritize targets in the context of a given chemical compound, such that most targets that this compound may show activity against appear higher in the ranked list. These methods are based on our extensions to the SVM and Ranking Perceptron algorithms for this problem. Our extensive experimental study shows that the methods developed in this work outperform previous approaches by 2% to 60% under different evaluation criteria. PMID:19764745
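
    The pairwise update behind a basic ranking perceptron, pushing the weight vector so that targets the compound is active against score above inactive ones, can be sketched as below. The feature vectors and pair construction are illustrative; the paper's extensions to SVM and Ranking Perceptron differ in detail.

```python
import numpy as np

def train_ranking_perceptron(X, pos_pairs, epochs=10):
    """Pairwise ranking-perceptron sketch.  X maps each (compound, target)
    pair to a feature vector; pos_pairs lists index pairs
    (i_active, j_inactive) stating that row i should outrank row j."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i, j in pos_pairs:
            if w @ X[i] <= w @ X[j]:  # active target not ranked above inactive
                w += X[i] - X[j]      # perceptron-style correction
    return w
```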

  6. Solving Assembly Sequence Planning using Angle Modulated Simulated Kalman Filter

    NASA Astrophysics Data System (ADS)

    Mustapa, Ainizar; Yusof, Zulkifli Md.; Adam, Asrul; Muhammad, Badaruddin; Ibrahim, Zuwairie

    2018-03-01

    This paper presents an implementation of the Simulated Kalman Filter (SKF) algorithm for optimizing an Assembly Sequence Planning (ASP) problem. The SKF search strategy consists of three simple steps: predict, measure, and estimate. The main objective of ASP is to determine the sequence of component installation that shortens assembly time or saves assembly costs. Initially, a permutation sequence is generated to represent each agent. Each agent is then subjected to a precedence matrix constraint to produce a feasible assembly sequence. Next, the Angle Modulated SKF (AMSKF) is proposed for solving the ASP problem. The main idea of the angle-modulated approach to combinatorial optimization is to use a function, g(x), to create a continuous signal. The performance of the proposed AMSKF is compared against previous ASP solutions based on BGSA, BPSO, and MSPSO. On a case study of ASP, the results show that AMSKF outperformed all the other algorithms in obtaining the best solution.
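
    The angle-modulation trick mentioned above, using a generating function g(x) to let a continuous optimiser such as SKF search a binary or combinatorial space, is classically realised with a four-parameter sinusoid sampled at integer positions and thresholded at zero. The parameter values in the example are illustrative, not taken from the paper.

```python
import math

def angle_modulate(params, n_bits):
    """Classic angle-modulation generating function
    g(x) = sin(2*pi*(x-a)*b*cos(2*pi*(x-a)*c)) + d,
    sampled at x = 0..n_bits-1 and thresholded at zero to yield a bit
    string; (a, b, c, d) are the continuous variables the optimiser tunes."""
    a, b, c, d = params
    bits = []
    for x in range(n_bits):
        g = math.sin(2 * math.pi * (x - a) * b
                     * math.cos(2 * math.pi * (x - a) * c)) + d
        bits.append(1 if g > 0 else 0)
    return bits
```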

  7. Image contrast enhancement with brightness preservation using an optimal gamma correction and weighted sum approach

    NASA Astrophysics Data System (ADS)

    Jiang, G.; Wong, C. Y.; Lin, S. C. F.; Rahman, M. A.; Ren, T. R.; Kwok, Ngaiming; Shi, Haiyan; Yu, Ying-Hao; Wu, Tonghai

    2015-04-01

    The enhancement of image contrast and the preservation of image brightness are two important but conflicting objectives in image restoration. Previous attempts based on linear histogram equalization achieved contrast enhancement, but exact preservation of brightness was not accomplished. A new perspective is taken here to provide balanced performance of contrast enhancement and brightness preservation simultaneously, by casting the search for such a solution as an optimization problem. Specifically, the non-linear gamma correction method is adopted to enhance the contrast, while a weighted sum approach is employed for brightness preservation. In addition, the efficient golden-section search algorithm is exploited to determine the optimal parameters required to produce the enhanced images. Experiments are conducted on natural colour images captured under various indoor, outdoor and illumination conditions. Results show that the proposed method outperforms currently available methods in contrast enhancement and brightness preservation.
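
    A minimal sketch of the optimisation loop, assuming a weighted objective that trades brightness drift from the input mean against contrast (standard deviation), with gamma chosen by golden-section search. The weight, the gamma search range, and the objective form are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

def enhance(img, w=0.5, tol=1e-3):
    """Gamma-correct img, choosing gamma by golden-section search on a
    weighted cost: brightness drift from the input mean, minus contrast
    (std. dev.), so more contrast lowers the cost."""
    img = np.asarray(img, dtype=float) / 255.0
    mean0 = img.mean()

    def cost(g):
        out = img ** g
        return w * abs(out.mean() - mean0) - (1 - w) * out.std()

    lo, hi = 0.2, 3.0                 # assumed gamma search range
    invphi = (5 ** 0.5 - 1) / 2
    c = hi - invphi * (hi - lo)
    d = lo + invphi * (hi - lo)
    while hi - lo > tol:              # classic golden-section interval shrink
        if cost(c) < cost(d):
            hi, d = d, c
            c = hi - invphi * (hi - lo)
        else:
            lo, c = c, d
            d = lo + invphi * (hi - lo)
    gamma = (lo + hi) / 2
    return (img ** gamma * 255).round().astype(np.uint8), gamma
```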

  8. A Dual-Stimuli-Responsive Sodium-Bromine Battery with Ultrahigh Energy Density.

    PubMed

    Wang, Faxing; Yang, Hongliu; Zhang, Jian; Zhang, Panpan; Wang, Gang; Zhuang, Xiaodong; Cuniberti, Gianaurelio; Feng, Xinliang

    2018-06-01

    Stimuli-responsive energy storage devices have emerged with the fast-growing popularity of intelligent electronics. However, all previously reported stimuli-responsive energy storage devices have rather low energy densities (<250 Wh kg-1) and respond to only a single stimulus, which seriously limits their application scope in intelligent electronics. Herein, a dual-stimuli-responsive sodium-bromine (Na//Br2) battery featuring ultrahigh energy density, an electrochromic effect, and fast thermal response is demonstrated. Remarkably, the fabricated Na//Br2 battery exhibits a large operating voltage of 3.3 V and an energy density up to 760 Wh kg-1, which outperforms the state-of-the-art stimuli-responsive electrochemical energy storage devices. This work offers a promising approach for designing multi-stimuli-responsive and high-energy rechargeable batteries without sacrificing electrochemical performance. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Edge map analysis in chest X-rays for automatic pulmonary abnormality screening.

    PubMed

    Santosh, K C; Vajda, Szilárd; Antani, Sameer; Thoma, George R

    2016-09-01

    Our particular motivation is the need to screen HIV+ populations in resource-constrained regions for evidence of tuberculosis, using posteroanterior chest radiographs (CXRs). The proposed method is motivated by the observation that abnormal CXRs tend to exhibit corrupted and/or deformed thoracic edge maps. We study histograms of thoracic edges for all possible orientations of gradients in the range [Formula: see text] at different numbers of bins and different pyramid levels, using five different region-of-interest selections. We used two CXR benchmark collections made available by the U.S. National Library of Medicine and achieved a maximum abnormality detection accuracy (ACC) of 86.36 % and area under the ROC curve (AUC) of 0.93 at 1 s per image, on average. We have presented an automatic method for screening pulmonary abnormalities using thoracic edge maps in CXR images. The proposed method outperforms previously reported state-of-the-art results.

  10. Intrinsic melanin and hemoglobin colour components for skin lesion malignancy detection.

    PubMed

    Madooei, Ali; Drew, Mark S; Sadeghi, Maryam; Atkins, M Stella

    2012-01-01

    In this paper we propose a new log-chromaticity 2-D colour space, an extension of previous approaches, which succeeds in removing confounding factors from dermoscopic images: (i) the effects of the particular camera characteristics of the camera system used in forming RGB images; (ii) the colour of the light used in the dermoscope; (iii) shading induced by imaging non-flat skin surfaces; and (iv) light intensity, removing the effect of light-intensity falloff toward the edges of the dermoscopic image. In the context of a blind source separation of the underlying colour, we arrive at intrinsic melanin and hemoglobin images, whose properties are then used in supervised learning to achieve excellent malignant vs. benign skin lesion classification. In addition, we propose using the geometric mean of colour for skin lesion segmentation based on simple grey-level thresholding, with results outperforming the state of the art.

  11. Information Filtering via Heterogeneous Diffusion in Online Bipartite Networks.

    PubMed

    Zhang, Fu-Guo; Zeng, An

    2015-01-01

    The rapid expansion of the Internet brings us more online information than any individual can go through. Recommender systems were therefore created to help people dig through this abundance of information. In networks composed of users and objects, recommender algorithms based on diffusion have proven to be among the best performing methods. Previous works considered the diffusion process from user to object and from object to user to be equivalent. We show in this work that this is not the case, and we improve the quality of the recommendation by taking into account the asymmetrical nature of this process. We apply this idea to modify state-of-the-art recommendation methods. The simulation results show that the new methods can outperform these existing methods in both recommendation accuracy and diversity. Finally, this modification is shown to improve the recommendation in a realistic case.

  12. Highly efficient frequency conversion with bandwidth compression of quantum light

    PubMed Central

    Allgaier, Markus; Ansari, Vahid; Sansoni, Linda; Eigner, Christof; Quiring, Viktor; Ricken, Raimund; Harder, Georg; Brecht, Benjamin; Silberhorn, Christine

    2017-01-01

    Hybrid quantum networks rely on efficient interfacing of dissimilar quantum nodes, as elements based on parametric downconversion sources, quantum dots, colour centres or atoms are fundamentally different in their frequencies and bandwidths. Although pulse manipulation has been demonstrated in very different systems, to date no interface exists that provides both an efficient bandwidth compression and a substantial frequency translation at the same time. Here we demonstrate an engineered sum-frequency-conversion process in lithium niobate that achieves both goals. We convert pure photons at telecom wavelengths to the visible range while compressing the bandwidth by a factor of 7.47 under preservation of non-classical photon-number statistics. We achieve internal conversion efficiencies of 61.5%, significantly outperforming spectral filtering for bandwidth compression. Our system thus makes the connection between previously incompatible quantum systems as a step towards usable quantum networks. PMID:28134242

  13. Design and development of an ancient Chinese document recognition system

    NASA Astrophysics Data System (ADS)

    Peng, Liangrui; Xiu, Pingping; Ding, Xiaoqing

    2003-12-01

    The digitization of ancient Chinese documents presents new challenges to the OCR (Optical Character Recognition) research field due to the large character set of ancient Chinese characters, variant font types, and versatile document layout styles, as these documents are historical reflections of thousands of years of Chinese civilization. After analyzing the general characteristics of ancient Chinese documents, we present a solution for recognition of ancient Chinese documents with regular font types and layout styles. Based on previous work on multilingual OCR in the TH-OCR system, we focus on the design and development of two key technologies: character recognition and page segmentation. Experimental results show that the developed character recognition kernel of 19,635 Chinese characters outperforms our original traditional Chinese recognition kernel; a benchmark test on printed ancient Chinese books shows that the proposed system is effective for regular ancient Chinese documents.

  14. Visuo-spatial processing in autism--testing the predictions of extreme male brain theory.

    PubMed

    Falter, Christine M; Plaisted, Kate C; Davis, Greg

    2008-03-01

    It has been hypothesised that autism is an extreme version of the male brain, caused by high levels of prenatal testosterone (Baron-Cohen 1999). To test this proposal, associations were assessed between three visuo-spatial tasks and prenatal testosterone, indexed in second-to-fourth digit length ratios (2D:4D). The study included children with Autism Spectrum Disorder, ASD (N = 28), and chronological as well as mental age matched typically-developing children (N = 31). While the group with ASD outperformed the control group at Mental Rotation and Figure-Disembedding, these group differences were not related to differences in prenatal testosterone level. Previous findings of an association between Targeting and 2D:4D were replicated in typically-developing children and children with ASD. The implications of these results for the extreme male brain (EMB) theory of autism are discussed.

  15. Flexible Kernel Memory

    PubMed Central

    Nowicki, Dimitri; Siegelmann, Hava

    2010-01-01

    This paper introduces a new model of associative memory, capable of both binary and continuous-valued inputs. Based on kernel theory, the memory model is on one hand a generalization of Radial Basis Function networks and, on the other, is in feature space analogous to a Hopfield network. Attractors can be added, deleted, and updated on-line simply, without harming existing memories, and the number of attractors is independent of input dimension. Input vectors do not have to adhere to a fixed or bounded dimensionality; they can increase and decrease it without relearning previous memories. A memory consolidation process enables the network to generalize concepts and form clusters of input data, outperforming many unsupervised clustering techniques; this process is demonstrated on handwritten digits from MNIST. Another process, reminiscent of memory reconsolidation, is introduced, in which existing memories are refreshed and tuned with new inputs; this process is demonstrated on a series of morphed faces. PMID:20552013

  16. Integrating a dual-silicon photoelectrochemical cell into a redox flow battery for unassisted photocharging.

    PubMed

    Liao, Shichao; Zong, Xu; Seger, Brian; Pedersen, Thomas; Yao, Tingting; Ding, Chunmei; Shi, Jingying; Chen, Jian; Li, Can

    2016-05-04

    Solar rechargeable flow cells (SRFCs) provide an attractive approach for in situ capture and storage of intermittent solar energy via photoelectrochemical regeneration of discharged redox species for electricity generation. However, overall SRFC performance is restricted by inefficient photoelectrochemical reactions. Here we report an efficient SRFC based on a dual-silicon photoelectrochemical cell and a quinone/bromine redox flow battery for in situ solar energy conversion and storage. Using narrow-bandgap silicon for efficient photon collection and fast redox couples for rapid interface charge injection, our device shows an optimal solar-to-chemical conversion efficiency of ∼5.9% and an overall photon-chemical-electricity energy conversion efficiency of ∼3.2%, which, to our knowledge, outperforms previously reported SRFCs. The proposed SRFC can be self-photocharged to 0.8 V and delivers a discharge capacity of 730 mAh l-1. Our work may guide future designs for highly efficient solar rechargeable devices.

  17. Fast Acquisition and Reconstruction of Optical Coherence Tomography Images via Sparse Representation

    PubMed Central

    Li, Shutao; McNabb, Ryan P.; Nie, Qing; Kuo, Anthony N.; Toth, Cynthia A.; Izatt, Joseph A.; Farsiu, Sina

    2014-01-01

    In this paper, we present a novel technique, based on compressive sensing principles, for reconstruction and enhancement of multi-dimensional image data. Our method is a major improvement and generalization of the multi-scale sparsity based tomographic denoising (MSBTD) algorithm we recently introduced for reducing speckle noise. Our new technique exhibits several advantages over MSBTD, including its capability to simultaneously reduce noise and interpolate missing data. Unlike MSBTD, our new method does not require an a priori high-quality image from the target imaging subject and thus offers the potential to shorten clinical imaging sessions. This novel image restoration method, which we termed sparsity based simultaneous denoising and interpolation (SBSDI), utilizes sparse representation dictionaries constructed from previously collected datasets. We tested the SBSDI algorithm on retinal spectral domain optical coherence tomography images captured in the clinic. Experiments showed that the SBSDI algorithm qualitatively and quantitatively outperforms other state-of-the-art methods. PMID:23846467

  18. Proficiency and sentence constraint effects on second language word learning.

    PubMed

    Ma, Tengfei; Chen, Baoguo; Lu, Chunming; Dunlap, Susan

    2015-07-01

    This paper presents an experiment that investigated the effects of L2 proficiency and sentence constraint on semantic processing of unknown L2 words (pseudowords). All participants were Chinese native speakers who learned English as a second language. In the experiment, we used a whole sentence presentation paradigm with a delayed semantic relatedness judgment task. Both higher and lower-proficiency L2 learners could make use of the high-constraint sentence context to judge the meaning of novel pseudowords, and higher-proficiency L2 learners outperformed lower-proficiency L2 learners in all conditions. These results demonstrate that both L2 proficiency and sentence constraint affect subsequent word learning among second language learners. We extended L2 word learning into a sentence context, replicated the sentence constraint effects previously found among native speakers, and found proficiency effects in L2 word learning. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Strong systematicity through sensorimotor conceptual grounding: an unsupervised, developmental approach to connectionist sentence processing

    NASA Astrophysics Data System (ADS)

    Jansen, Peter A.; Watter, Scott

    2012-03-01

    Connectionist language modelling typically has difficulty with syntactic systematicity, or the ability to generalise language learning to untrained sentences. This work develops an unsupervised connectionist model of infant grammar learning. Following the semantic bootstrapping hypothesis, the network distils word categories using a developmentally plausible infant-scale database of grounded sensorimotor conceptual representations, as well as a biologically plausible semantic co-occurrence activation function. The network then uses this knowledge to acquire an early benchmark clausal grammar using correlational learning, and further acquires separate conceptual and grammatical category representations. The network displays strongly systematic behaviour indicative of the general acquisition of the combinatorial systematicity present in the grounded infant-scale language stream, outperforms previous contemporary models that contain primarily noun and verb word categories, and successfully generalises broadly to novel untrained sensorimotor-grounded sentences composed of unfamiliar nouns and verbs. Limitations as well as implications for later grammar learning are discussed.

  20. Localized Dictionaries Based Orientation Field Estimation for Latent Fingerprints.

    PubMed

    Xiao Yang; Jianjiang Feng; Jie Zhou

    2014-05-01

    The dictionary-based orientation field estimation approach has shown promising performance for latent fingerprints. In this paper, we seek to exploit stronger prior knowledge of fingerprints in order to further improve the performance. Realizing that ridge orientations at different locations of fingerprints have different characteristics, we propose a localized dictionaries-based orientation field estimation algorithm, in which a noisy orientation patch at a given location, output by a local estimation approach, is replaced by a real orientation patch from the local dictionary at the same location. The precondition of applying localized dictionaries is that the pose of the latent fingerprint needs to be estimated. We propose a Hough transform-based fingerprint pose estimation algorithm, in which the predictions about fingerprint pose made by all orientation patches in the latent fingerprint are accumulated. Experimental results on challenging latent fingerprint datasets show that the proposed method outperforms previous ones markedly.
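
    The Hough-style pose estimation in this record can be pictured as patches voting for a common fingerprint center. Below is a toy sketch under simplifying assumptions: votes for translation only (the actual algorithm also estimates orientation), with hypothetical patch predictions and bin size, not the authors' implementation.

```python
from collections import Counter

def hough_pose(patch_predictions, bin_size=10):
    # Each orientation patch votes for a candidate fingerprint center;
    # votes are accumulated in a coarse grid and the fullest bin wins.
    votes = Counter()
    for (x, y) in patch_predictions:
        votes[(x // bin_size, y // bin_size)] += 1
    (bx, by), _ = votes.most_common(1)[0]
    # Return the center of the winning bin as the pose estimate.
    return (bx * bin_size + bin_size / 2, by * bin_size + bin_size / 2)

# Noisy center predictions from hypothetical patches, true center near (52, 47);
# the outlier vote (12, 90) is ignored by the accumulation.
preds = [(51, 46), (53, 48), (50, 45), (55, 49), (12, 90), (52, 44)]
print(hough_pose(preds))  # -> (55.0, 45.0)
```

    Coarse binning is what makes the voting robust: a single wrong patch prediction cannot outvote the consistent majority.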

  1. Geometry Helps to Compare Persistence Diagrams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerber, Michael; Morozov, Dmitriy; Nigmetov, Arnur

    2015-11-16

    Exploiting geometric structure to improve the asymptotic complexity of discrete assignment problems is a well-studied subject. In contrast, the practical advantages of using geometry for such problems have not been explored. We implement geometric variants of the Hopcroft--Karp algorithm for bottleneck matching (based on previous work by Efrat et al.), and of the auction algorithm by Bertsekas for Wasserstein distance computation. Both implementations use k-d trees to replace a linear scan with a geometric proximity query. Our interest in this problem stems from the desire to compute distances between persistence diagrams, a problem that comes up frequently in topological data analysis. We show that our geometric matching algorithms lead to a substantial performance gain, both in running time and in memory consumption, over their purely combinatorial counterparts. Moreover, our implementation significantly outperforms the only other implementation available for comparing persistence diagrams.
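
    The speed-up reported here comes from answering proximity queries with a k-d tree instead of a linear scan. A minimal pure-Python sketch of that core idea follows, on hypothetical 2-D persistence-diagram points; it is not the authors' implementation, which also handles diagonal points and the full matching loop.

```python
import math

def build_kdtree(points, depth=0):
    # Recursively build a 2-d k-d tree: split on x at even depths, y at odd.
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build_kdtree(points[:mid], depth + 1),
            build_kdtree(points[mid + 1:], depth + 1))

def nearest(node, query, depth=0, best=None):
    # Nearest-neighbor query; prunes the far subtree whenever the splitting
    # plane is farther away than the best distance found so far.
    if node is None:
        return best
    point, left, right = node
    d = math.dist(point, query)
    if best is None or d < best[1]:
        best = (point, d)
    axis = depth % 2
    diff = query[axis] - point[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    best = nearest(near, query, depth + 1, best)
    if abs(diff) < best[1]:  # far subtree may still hold a closer point
        best = nearest(far, query, depth + 1, best)
    return best

diagram = [(0.1, 0.9), (0.4, 0.6), (0.2, 0.8), (0.7, 0.3)]
tree = build_kdtree(diagram)
print(nearest(tree, (0.12, 0.88))[0])  # closest diagram point to the query
```

    Replacing each O(n) scan with a pruned tree descent is what turns the combinatorial matching algorithms into their faster geometric variants.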

  2. Numerical approach for unstructured quantum key distribution

    PubMed Central

    Coles, Patrick J.; Metodiev, Eric M.; Lütkenhaus, Norbert

    2016-01-01

    Quantum key distribution (QKD) allows for communication with security guaranteed by quantum theory. The main theoretical problem in QKD is to calculate the secret key rate for a given protocol. Analytical formulas are known for protocols with symmetries, since symmetry simplifies the analysis. However, experimental imperfections break symmetries, hence the effect of imperfections on key rates is difficult to estimate. Furthermore, it is an interesting question whether (intentionally) asymmetric protocols could outperform symmetric ones. Here we develop a robust numerical approach for calculating the key rate for arbitrary discrete-variable QKD protocols. Ultimately this will allow researchers to study 'unstructured' protocols, that is, those that lack symmetry. Our approach relies on transforming the key rate calculation to the dual optimization problem, which markedly reduces the number of parameters and hence the calculation time. We illustrate our method by investigating some unstructured protocols for which the key rate was previously unknown. PMID:27198739

  3. A Social Diffusion Model with an Application on Election Simulation

    PubMed Central

    Wang, Fu-Min; Hung, San-Chuan; Kung, Perng-Hwa; Lin, Shou-De

    2014-01-01

    Issues of opinion diffusion have been studied for decades, yet there has so far been no empirical approach to modelling the interflow and formation of crowd opinion in elections, for two reasons. First, unlike the spread of information or flu, individuals hold intrinsic attitudes toward election candidates in advance. Second, opinions are generally treated as single values in most diffusion models, whereas in this case an opinion must represent preference toward multiple candidates. Previous models therefore may not intuitively capture such a scenario. This work designs a diffusion model capable of managing the aforementioned scenario. To demonstrate the usefulness of our model, we simulate the diffusion on a network built from a publicly available bibliography dataset. We compare the proposed model with other well-known models such as independent cascade, and our model consistently outperforms the others. We additionally investigate electoral issues with our model simulator. PMID:24995351
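
    For context, the independent cascade baseline mentioned in this abstract can be simulated in a few lines. This is a minimal sketch with a hypothetical directed graph and a single activation probability p; the paper's own model extends the idea to multi-candidate preferences.

```python
import random

def independent_cascade(graph, seeds, p=0.3, rng=None):
    # Independent cascade model: each newly activated node gets exactly one
    # chance to activate each of its inactive neighbors, with probability p.
    rng = rng or random.Random(0)  # seeded for reproducibility
    active, frontier = set(seeds), list(seeds)
    while frontier:
        newly_activated = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    newly_activated.append(v)
        frontier = newly_activated
    return active

# Hypothetical directed social network as an adjacency list:
graph = {1: [2, 3], 2: [4], 3: [4, 5], 4: [6], 5: [6]}
spread = independent_cascade(graph, seeds=[1], p=0.5)
print(sorted(spread))  # nodes reached by the cascade from seed 1
```

    In practice the simulation is repeated many times and the spread sizes averaged, since each run is a random draw.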

  4. Spatiotemporal Recurrent Convolutional Networks for Traffic Prediction in Transportation Networks

    PubMed Central

    Yu, Haiyang; Wu, Zhihai; Wang, Shuqin; Wang, Yunpeng; Ma, Xiaolei

    2017-01-01

    Predicting large-scale transportation network traffic has become an important and challenging topic in recent decades. Inspired by the domain knowledge of motion prediction, in which the future motion of an object can be predicted based on previous scenes, we propose a network grid representation method that can retain the fine-scale structure of a transportation network. Network-wide traffic speeds are converted into a series of static images and input into a novel deep architecture, namely, spatiotemporal recurrent convolutional networks (SRCNs), for traffic forecasting. The proposed SRCNs inherit the advantages of deep convolutional neural networks (DCNNs) and long short-term memory (LSTM) neural networks. The spatial dependencies of network-wide traffic can be captured by DCNNs, and the temporal dynamics can be learned by LSTMs. An experiment on a Beijing transportation network with 278 links demonstrates that SRCNs outperform other deep learning-based algorithms in both short-term and long-term traffic prediction. PMID:28672867

  5. A Longitudinal Study of Memory Advantages in Bilinguals

    PubMed Central

    Ljungberg, Jessica K.; Hansson, Patrik; Andrés, Pilar; Josefsson, Maria; Nilsson, Lars-Göran

    2013-01-01

    Typically, studies of cognitive advantages in bilinguals have been conducted using executive and inhibitory tasks (e.g. the Simon task) and cross-sectional designs. This study longitudinally investigated bilingual advantages in episodic memory recall and in verbal letter and categorical fluency across the lifespan. Monolingual and bilingual participants (n = 178) between 35 and 70 years at baseline were drawn from the Betula Prospective Cohort Study of aging, memory, and health. Results showed that bilinguals outperformed monolinguals at the first testing session and across time, both in episodic memory recall and in letter fluency. No interaction with age was found, indicating that the rate of change across ages was similar for bilinguals and monolinguals. As predicted, and in line with studies applying cross-sectional designs, no advantages associated with bilingualism were found in the categorical fluency task. The results are discussed in the light of successful aging. PMID:24023803

  6. Automatic Summarization as a Combinatorial Optimization Problem

    NASA Astrophysics Data System (ADS)

    Hirao, Tsutomu; Suzuki, Jun; Isozaki, Hideki

    We derived the oracle summary with the highest ROUGE score that can be achieved by integrating sentence extraction with sentence compression from the reference abstract. The analysis results of the oracle revealed that summarization systems have to assign an appropriate compression rate to each sentence in the document. In accordance with this observation, this paper formulates summarization as a combinatorial optimization: selecting the set of sentences that maximizes the sum of the sentence scores from a pool consisting of the sentences at various compression rates, subject to a length constraint. The score of a sentence is defined by its compression rate, content words and positional information. The parameters for the compression rates and positional information are optimized by minimizing the loss between the scores of the oracles and those of the candidates. The results obtained on the TSC-2 corpus showed that our method outperformed the previous systems with statistical significance.
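
    The selection step described here is a multiple-choice knapsack: pick at most one compression-rate variant per sentence so that total score is maximal under a length budget. A minimal dynamic-programming sketch follows, with hypothetical (length, score) variants; the scoring function is not the authors'.

```python
def select_summary(sentence_variants, budget):
    # Multiple-choice knapsack: choose at most one (length, score) variant
    # per sentence so that total length <= budget and total score is maximal.
    # dp maps total-length-used -> (best_score, chosen (sentence, variant) pairs)
    dp = {0: (0.0, [])}
    for i, variants in enumerate(sentence_variants):
        new_dp = dict(dp)  # copying dp allows skipping sentence i entirely
        for length_used, (score, chosen) in dp.items():
            for j, (vlen, vscore) in enumerate(variants):
                total = length_used + vlen
                if total <= budget:
                    cand = (score + vscore, chosen + [(i, j)])
                    if total not in new_dp or cand[0] > new_dp[total][0]:
                        new_dp[total] = cand
        dp = new_dp
    return max(dp.values())

# Each sentence offers (length, score) at several compression rates:
variants = [
    [(10, 3.0), (6, 2.2)],   # sentence 0: full vs. compressed
    [(8, 2.5), (5, 1.8)],    # sentence 1
    [(12, 4.0), (7, 2.9)],   # sentence 2
]
score, picks = select_summary(variants, budget=20)
print(score, picks)
```

    With a budget of 20, taking all three compressed variants (total length 18) beats any combination that includes a full-length sentence, which illustrates the paper's point that the compression rate must be chosen per sentence.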

  7. Parallel Solver for H(div) Problems Using Hybridization and AMG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chak S.; Vassilevski, Panayot S.

    2016-01-15

    In this paper, a scalable parallel solver is proposed for H(div) problems discretized by arbitrary order finite elements on general unstructured meshes. The solver is based on hybridization and algebraic multigrid (AMG). Unlike some previously studied H(div) solvers, the hybridization solver does not require discrete curl and gradient operators as additional input from the user. Instead, only some element information is needed in the construction of the solver. The hybridization results in an H1-equivalent symmetric positive definite system, which is then rescaled and solved by AMG solvers designed for H1 problems. Weak and strong scaling of the method are examined through several numerical tests. Our numerical results show that the proposed solver provides a promising alternative to ADS, a state-of-the-art solver [12], for H(div) problems. In fact, it outperforms ADS for higher order elements.

  8. Spatiotemporal Recurrent Convolutional Networks for Traffic Prediction in Transportation Networks.

    PubMed

    Yu, Haiyang; Wu, Zhihai; Wang, Shuqin; Wang, Yunpeng; Ma, Xiaolei

    2017-06-26

    Predicting large-scale transportation network traffic has become an important and challenging topic in recent decades. Inspired by the domain knowledge of motion prediction, in which the future motion of an object can be predicted based on previous scenes, we propose a network grid representation method that can retain the fine-scale structure of a transportation network. Network-wide traffic speeds are converted into a series of static images and input into a novel deep architecture, namely, spatiotemporal recurrent convolutional networks (SRCNs), for traffic forecasting. The proposed SRCNs inherit the advantages of deep convolutional neural networks (DCNNs) and long short-term memory (LSTM) neural networks. The spatial dependencies of network-wide traffic can be captured by DCNNs, and the temporal dynamics can be learned by LSTMs. An experiment on a Beijing transportation network with 278 links demonstrates that SRCNs outperform other deep learning-based algorithms in both short-term and long-term traffic prediction.

  9. Preliminary Multivariable Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multi-variable space telescope cost model. The validity of previously published models is tested. Cost estimating relationships which are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  10. Correlating student interest and high school preparation with learning and performance in an introductory university physics course

    NASA Astrophysics Data System (ADS)

    Harlow, Jason J. B.; Harrison, David M.; Meyertholen, Andrew

    2014-06-01

    We have studied the correlation of student performance in a large first year university physics course with their reasons for taking the course and whether or not the student took a senior-level high school physics course. Performance was measured both by the Force Concept Inventory and by the grade on the final examination. Students who took the course primarily for their own interest outperformed students who took the course primarily because it was required, both on the Force Concept Inventory and on the final examination; students who took a senior-level high school physics course outperformed students who did not, also both on the Force Concept Inventory and on the final exam. Students who took the course for their own interest and took high school physics outperformed students who took the course because it was required and did not take high school physics by a wide margin. However, the normalized gain on the Force Concept Inventory was the same within uncertainties for all groups and subgroups of students.

  11. New Method of Calculating a Multiplication by using the Generalized Bernstein-Vazirani Algorithm

    NASA Astrophysics Data System (ADS)

    Nagata, Koji; Nakamura, Tadao; Geurdes, Han; Batle, Josep; Abdalla, Soliman; Farouk, Ahmed

    2018-06-01

    We present a new method of more speedily calculating a multiplication by using the generalized Bernstein-Vazirani algorithm and many parallel quantum systems. Given the set of real values a_1, a_2, a_3, …, a_N and a function g: R → {0,1}, we determine the values g(a_1), g(a_2), g(a_3), …, g(a_N) simultaneously. The speed of determining the values is shown to outperform the classical case by a factor of N. Next, we consider the result as a number in binary representation: M_1 = (g(a_1), g(a_2), g(a_3), …, g(a_N)). By using M parallel quantum systems, we obtain M such numbers in binary representation simultaneously. The speed of obtaining the M numbers is shown to outperform the classical case by a factor of M. Finally, we calculate the product M_1 × M_2 × ⋯ × M_M. The speed of obtaining the product is shown to outperform the classical case by a factor of N × M.
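
    The number construction in this abstract can be illustrated classically; the quantum speed-up comes from evaluating all g(a_i) in parallel, which the sequential sketch below does not capture. The threshold function g and the input values are hypothetical examples.

```python
def build_number(values, g):
    # Mimic the abstract's construction classically: evaluate the 0/1-valued
    # function g on each value and read the bits g(a_1), ..., g(a_N) as one
    # number in binary representation (most significant bit first).
    bits = [g(a) for a in values]
    return int("".join(map(str, bits)), 2)

g = lambda a: 1 if a >= 0 else 0         # hypothetical threshold function
rows = [[3.1, -2.0, 0.5, -0.1],           # values defining M_1 -> bits 1010
        [-1.0, 4.2, 7.7, 2.3]]            # values defining M_2 -> bits 0111
numbers = [build_number(r, g) for r in rows]
print(numbers, numbers[0] * numbers[1])   # M_1, M_2 and their product
```

    Here M_1 = 0b1010 = 10 and M_2 = 0b0111 = 7, so the product is 70; the quantum algorithm's advantage lies entirely in how quickly the bit strings are obtained.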

  12. Performance of 68Ga-PSMA PET/CT for Prostate Cancer Management at Initial Staging and Time of Biochemical Recurrence.

    PubMed

    Bailey, Jason; Piert, Morand

    2017-09-09

    Recently introduced Gallium-68 labeled PSMA ligands such as HBED-CC (68Ga-PSMA) have shown promise for unmet diagnostic needs in prostate cancer. 68Ga-PSMA has demonstrated improved detection rates and specificity for prostate cancer compared to standard imaging approaches. In the setting of primary disease, 68Ga-PSMA appears to preferentially identify treatment-relevant intermediate- and high-risk prostate cancer. There is also growing evidence that 68Ga-PSMA positron emission tomography (PET) outperforms alternative conventional imaging methods, including choline-based radiotracers, for the localization of disease sites at biochemical recurrence, particularly at lower prostate-specific antigen (PSA) levels (<1 ng/mL). However, the majority of published work lacks rigorous verification of imaging results. 68Ga-PSMA offers significant promise for both primary disease and biochemically recurrent prostate cancer. The evidence base to support 68Ga-PSMA is, however, still underdeveloped, and more rigorous studies substantiating efficacy are needed.

  13. Genome-wide assessment of differential translations with ribosome profiling data

    PubMed Central

    Xiao, Zhengtao; Zou, Qin; Liu, Yu; Yang, Xuerui

    2016-01-01

    The closely regulated process of mRNA translation is crucial for precise control of protein abundance and quality. Ribosome profiling, a combination of ribosome foot-printing and RNA deep sequencing, has been used in a large variety of studies to quantify genome-wide mRNA translation. Here, we developed Xtail, an analysis pipeline tailored for ribosome profiling data that comprehensively and accurately identifies differentially translated genes in pairwise comparisons. Applied to simulated and real datasets, Xtail exhibits high sensitivity with minimal false-positive rates, outperforming existing methods in the accuracy of quantifying differential translations. With published ribosome profiling datasets, Xtail not only reveals differentially translated genes that make biological sense, but also uncovers new events of differential translation in human cancer cells upon mTOR signalling perturbation and in human primary macrophages upon interferon gamma (IFN-γ) treatment. This demonstrates the value of Xtail in providing novel insights into the molecular mechanisms that involve translational dysregulation. PMID:27041671

  14. A randomized controlled trial of yoga for pregnant women with symptoms of depression and anxiety.

    PubMed

    Davis, Kyle; Goodman, Sherryl H; Leiferman, Jenn; Taylor, Mary; Dimidjian, Sona

    2015-08-01

    Yoga may be well suited for depressed and anxious pregnant women, given reported benefits of meditation and physical activity and pregnant women's preference for nonpharmacological treatments. We randomly assigned 46 pregnant women with symptoms of depression and anxiety to an 8-week yoga intervention or treatment-as-usual (TAU) in order to examine feasibility and preliminary outcomes. Yoga was associated with high levels of credibility and satisfaction as an intervention for depression and anxiety during pregnancy. Participants in both conditions reported significant improvement in symptoms of depression and anxiety over time; and yoga was associated with significantly greater reduction in negative affect as compared to TAU (β = -0.53, SE = 0.20, p = .011). Prenatal yoga was found to be a feasible and acceptable intervention and was associated with reductions in symptoms of anxiety and depression; however, prenatal yoga only significantly outperformed TAU on reduction of negative affect. Published by Elsevier Ltd.

  15. Discriminating Projections for Estimating Face Age in Wild Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tokola, Ryan A; Bolme, David S; Ricanek, Karl

    2014-01-01

    We introduce a novel approach to estimating the age of a human from a single uncontrolled image. Current face age estimation algorithms work well in highly controlled images, and some are robust to changes in illumination, but it is usually assumed that images are close to frontal. This bias is clearly seen in the datasets that are commonly used to evaluate age estimation, which either entirely or mostly consist of frontal images. Using pose-specific projections, our algorithm maps image features into a pose-insensitive latent space that is discriminative with respect to age. Age estimation is then performed using a multi-class SVM. We show that our approach outperforms other published results on the Images of Groups dataset, which is the only age-related dataset with a non-trivial number of off-axis face images, and that we are competitive with recent age estimation algorithms on the mostly-frontal FG-NET dataset. We also experimentally demonstrate that our feature projections introduce insensitivity to pose.

  16. Non-faradic carbon nanotube-based supercapacitors: state of the art. Analysis of all the main scientific contributions from 1997 to our days

    NASA Astrophysics Data System (ADS)

    Bondavalli, P.; Pribat, D.; Schnell, J.-P.; Delfaure, C.; Gorintin, L.; Legagneux, P.; Baraton, L.; Galindo, C.

    2012-10-01

    This contribution deals with the state of the art of studies concerning the fabrication of electric double-layer capacitors (EDLCs), also called super- or ultracapacitors, obtained using carbon nanotubes (CNTs) without exploiting Faradic reactions. Since the first work published in 1997, EDLCs fabricated using carbon nanotubes as the constitutive material for electrodes have shown very interesting characteristics. It appeared that they could potentially outperform traditional technologies based on activated carbon. Different methods to fabricate the CNT-based electrodes have been proposed in order to improve the performances (mainly energy densities and power densities), for example filtration, direct growth on the metal collector or deposition using an air-brush technique. In this contribution we introduce the main works in the field. Finally, we point out an emerging interest in supercapacitors fabricated on flexible substrates, exploiting the outstanding mechanical performances of CNTs, for new kinds of applications such as portable electronics.

  17. Pathway-GPS and SIGORA: identifying relevant pathways based on the over-representation of their gene-pair signatures

    PubMed Central

    Foroushani, Amir B.K.; Brinkman, Fiona S.L.

    2013-01-01

    Motivation. Predominant pathway analysis approaches treat pathways as collections of individual genes and consider all pathway members as equally informative. As a result, at times spurious and misleading pathways are inappropriately identified as statistically significant, solely due to components that they share with the more relevant pathways. Results. We introduce the concept of Pathway Gene-Pair Signatures (Pathway-GPS) as pairs of genes that, as a combination, are specific to a single pathway. We devised and implemented a novel approach to pathway analysis, Signature Over-representation Analysis (SIGORA), which focuses on the statistically significant enrichment of Pathway-GPS in a user-specified gene list of interest. In a comparative evaluation of several published datasets, SIGORA outperformed traditional methods by delivering biologically more plausible and relevant results. Availability. An efficient implementation of SIGORA, as an R package with precompiled GPS data for several human and mouse pathway repositories is available for download from http://sigora.googlecode.com/svn/. PMID:24432194

  18. Factoring local sequence composition in motif significance analysis.

    PubMed

    Ng, Patrick; Keich, Uri

    2008-01-01

    We recently introduced a biologically realistic and reliable significance analysis of the output of a popular class of motif finders. In this paper we further improve our significance analysis by incorporating local base composition information. Relying on realistic biological data simulation, as well as on FDR analysis applied to real data, we show that our method is significantly better than the increasingly popular practice of using the normal approximation to estimate the significance of a finder's output. Finally we turn to leveraging our reliable significance analysis to improve the actual motif finding task. Specifically, endowing a variant of the Gibbs Sampler with our improved significance analysis we demonstrate that de novo finders can perform better than has been perceived. Significantly, our new variant outperforms all the finders reviewed in a recently published comprehensive analysis of the Harbison genome-wide binding location data. Interestingly, many of these finders incorporate additional information such as nucleosome positioning and the significance of binding data.

  19. Belief attribution in deaf and hearing infants.

    PubMed

    Meristo, Marek; Morgan, Gary; Geraci, Alessandra; Iozzi, Laura; Hjelmquist, Erland; Surian, Luca; Siegal, Michael

    2012-09-01

    Based on anticipatory looking and reactions to violations of expected events, infants have been credited with 'theory of mind' (ToM) knowledge that a person's search behaviour for an object will be guided by true or false beliefs about the object's location. However, little is known about the preconditions for looking patterns consistent with belief attribution in infants. In this study, we compared the performance of 17- to 26-month-olds on anticipatory looking in ToM tasks. The infants were either hearing or were deaf from hearing families and thus delayed in communicative experience gained from access to language and conversational input. Hearing infants significantly outperformed their deaf counterparts in anticipating the search actions of a cartoon character that held a false belief about a target-object location. By contrast, the performance of the two groups in a true belief condition did not differ significantly. These findings suggest for the first time that access to language and conversational input contributes to early ToM reasoning. © 2012 Blackwell Publishing Ltd.

  20. Winners. Strategies of ten of America's most successful hospitals.

    PubMed

    Beckham, J D

    1989-01-01

    Through our work in strategy consulting, we knew that there were hospitals significantly outperforming their competitors in markets nationwide. In 1988, we set out to develop a better understanding of why these hospitals have been so successful. We isolated metropolitan areas and identified "winners" in each of those markets by drawing upon data from the American Hospital Association, Medicare cost reports, and market research analysis. We examined market share performance, customer preference, and relative financial performance. We then created a short list of hospitals that shared the common characteristic of success but were likely to be different enough to provide some varying perspectives. Included in the list was the largest hospital in the HCA stable, a Catholic hospital, a Harvard teaching hospital, and a Florida hospital that had bounced back from near insolvency. Initial analysis was followed by hundreds of hours of personal interviews with hospital executives and doctors. This article is an overview of what we found. The results of the investigation will be published as a book.

  1. Functional brain connectivity is predictable from anatomic network's Laplacian eigen-structure.

    PubMed

    Abdelnour, Farras; Dayan, Michael; Devinsky, Orrin; Thesen, Thomas; Raj, Ashish

    2018-05-15

    How structural connectivity (SC) gives rise to functional connectivity (FC) is not fully understood. Here we mathematically derive a simple relationship between SC measured from diffusion tensor imaging, and FC from resting state fMRI. We establish that SC and FC are related via (structural) Laplacian spectra, whereby FC and SC share eigenvectors and their eigenvalues are exponentially related. This gives, for the first time, a simple and analytical relationship between the graph spectra of structural and functional networks. Laplacian eigenvectors are shown to be good predictors of functional eigenvectors and networks based on independent component analysis of functional time series. A small number of Laplacian eigenmodes are shown to be sufficient to reconstruct FC matrices, serving as basis functions. This approach is fast, and requires no time-consuming simulations. It was tested on two empirical SC/FC datasets, and was found to significantly outperform generative model simulations of coupled neural masses. Copyright © 2018. Published by Elsevier Inc.
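
    The stated relationship — FC and SC share Laplacian eigenvectors, with exponentially related eigenvalues — can be sketched on a toy two-node graph whose eigenpairs are known in closed form. The decay parameter beta below is a hypothetical free parameter for illustration, not a value from the paper.

```python
import math

def predicted_fc(beta):
    # Toy 2-node graph: Laplacian L = [[1, -1], [-1, 1]] has eigenpairs
    # (0, [1, 1]/sqrt(2)) and (2, [1, -1]/sqrt(2)). The model's form is
    # FC = sum_k exp(-beta * lambda_k) * u_k u_k^T.
    modes = [(0.0, (1 / math.sqrt(2), 1 / math.sqrt(2))),
             (2.0, (1 / math.sqrt(2), -1 / math.sqrt(2)))]
    fc = [[0.0, 0.0], [0.0, 0.0]]
    for lam, u in modes:
        w = math.exp(-beta * lam)  # exponential map from structural eigenvalue
        for i in range(2):
            for j in range(2):
                fc[i][j] += w * u[i] * u[j]
    return fc

fc = predicted_fc(beta=0.5)
print(fc)  # off-diagonal entries give the predicted functional coupling
```

    Because only a few low-eigenvalue modes carry appreciable weight under the exponential, a small number of Laplacian eigenmodes suffices to reconstruct the FC matrix, which is the basis-function argument made in the abstract.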

  2. Orally Bioavailable Metal Chelators and Radical Scavengers: Multifunctional Antioxidants for the Coadjutant Treatment of Neurodegenerative Diseases.

    PubMed

    Kawada, Hiroyoshi; Kador, Peter F

    2015-11-25

    Neurodegenerative diseases are associated with oxidative stress that is induced by the presence of reactive oxygen species and the abnormal cellular accumulation of transition metals. Here, a new series of orally bioavailable multifunctional antioxidants (MFAO-2s) possessing a 2-diacetylamino-5-hydroxypyrimidine moiety is described. These MFAO-2s demonstrate both free radical and metal attenuating properties that are similar to the originally published MFAO-1s, which are based on 1-N,N'-dimethylsulfamoyl-1-4-(2-pyrimidyl)piperazine. Oral bioavailability studies in C57BL/6 mice demonstrate that the MFAO-2s accumulate in the brain at significantly higher levels than the MFAO-1s while achieving similar neural retina levels. The MFAO-2s protect human neuroblastoma and retinal pigmented epithelial cells against hydroxyl radicals in a dose-dependent manner by maintaining cell viability and intracellular glutathione levels. The MFAO-2s outperform clioquinol, a metal attenuator that has been investigated for the treatment of Alzheimer's disease.

  3. Application of ion implantation in tooling industry

    NASA Astrophysics Data System (ADS)

    Straede, Christen A.

    1996-06-01

    In papers published during the last half of the 1980s it is often stated that the application of ion beams to non-semiconductor purposes seems ready for full-scale industrial exploitation. However, progress with respect to commercialisation of ion implantation has been slower than predicted, although the process is quite clearly building up niche markets, especially in the tooling industry. It is the main purpose of this paper to discuss the implementation of the process in the tooling market, and to describe strategies used to ensure its success. The basic idea has been to find niches where ion implantation outperforms other processes both technically and in price. For instance, it has been clearly realised that one should avoid competing with physical vapour deposition or other coating techniques in market areas where they perform excellently, and instead find niches where the advantages of the ion implantation technique can be fully utilised. The paper presents typical case studies in order to illustrate the market niches where the technique has had its greatest successes and potential.

  4. Writing and Publishing Handbook.

    ERIC Educational Resources Information Center

    Hansen, William F., Ed.

    Intended to provide guidance in academic publishing to faculty members, especially younger faculty members, this handbook is a compilation of four previously published essays by different authors. Following a preface and an introduction, the four essays and their authors are as follows: (1) "One Writer's Secrets" (Donald M. Murray); (2)…

  5. A Systematic Review of Virtual Reality Simulators for Robot-assisted Surgery.

    PubMed

    Moglia, Andrea; Ferrari, Vincenzo; Morelli, Luca; Ferrari, Mauro; Mosca, Franco; Cuschieri, Alfred

    2016-06-01

    No single large published randomized controlled trial (RCT) has confirmed the efficacy of virtual simulators in the acquisition of skills to the standard required for safe clinical robotic surgery. This remains the main obstacle for the adoption of these virtual simulators in surgical residency curricula. To evaluate the level of evidence in published studies on the efficacy of training on virtual simulators for robotic surgery. In April 2015 a literature search was conducted on PubMed, Web of Science, Scopus, Cochrane Library, the Clinical Trials Database (US) and the Meta Register of Controlled Trials. All publications were scrutinized for relevance to the review and for assessment of the levels of evidence provided using the classification developed by the Oxford Centre for Evidence-Based Medicine. The publications included in the review consisted of one RCT and 28 cohort studies on validity, and seven RCTs and two cohort studies on skills transfer from virtual simulators to robot-assisted surgery. Simulators were rated good for realism (face validity) and for usefulness as a training tool (content validity). However, the studies included used various simulation training methodologies, limiting the assessment of construct validity. The review confirms the absence of any consensus on which tasks and metrics are the most effective for the da Vinci Skills Simulator and dV-Trainer, the most widely investigated systems. Although there is consensus for the RoSS simulator, this is based on only two studies on construct validity involving four exercises. One study on initial evaluation of an augmented reality module for partial nephrectomy using the dV-Trainer reported high correlation (r=0.8) between in vivo porcine nephrectomy and a virtual renorrhaphy task according to the overall Global Evaluation Assessment of Robotic Surgery (GEARS) score. 
In one RCT on skills transfer, the experimental group outperformed the control group, with a significant difference in overall GEARS score (p=0.012) during performance of urethrovesical anastomosis on an inanimate model. Only one study included assessment of a surgical procedure on real patients: subjects trained on a virtual simulator outperformed the control group following traditional training. However, besides the small numbers, this study was not randomized. There is an urgent need for a large, well-designed, preferably multicenter RCT to study the efficacy of virtual simulation for the acquisition of competence in, and safe execution of, clinical robot-assisted surgery. We reviewed the literature on virtual simulators for robot-assisted surgery. Validity studies used various simulation training methodologies. It is not clear which exercises and metrics are the most effective in distinguishing different levels of experience on the da Vinci robot. There is no reported evidence of skills transfer from simulation to clinical surgery on real patients. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  6. Calibrating random forests for probability estimation.

    PubMed

    Dankowski, Theresa; Ziegler, Andreas

    2016-09-30

    Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
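The logistic re-calibration idea in this record can be illustrated with a minimal, hypothetical sketch. Rather than translating terminal nodes into logistic models as the authors do, the toy below refits a two-parameter logistic model (a Platt-style slope and intercept on the logit scale) to a forest's raw probabilities against outcomes observed at a new center; the function names and data are illustrative, not from the paper.

```python
import math

def logit(p, eps=1e-6):
    p = min(max(p, eps), 1 - eps)
    return math.log(p / (1 - p))

def fit_recalibrator(probs, labels, lr=0.1, steps=2000):
    """Fit y ~ sigmoid(a * logit(p) + b) by gradient descent on the
    log-loss -- a Platt-style logistic re-calibration of the forest's
    raw probabilities against outcomes at the new center."""
    a, b = 1.0, 0.0
    xs = [logit(p) for p in probs]
    n = len(xs)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, labels):
            pred = 1 / (1 + math.exp(-(a * x + b)))
            ga += (pred - y) * x / n
            gb += (pred - y) / n
        a -= lr * ga
        b -= lr * gb
    return a, b

def recalibrate(p, a, b):
    """Apply the fitted re-calibration to a raw forest probability."""
    return 1 / (1 + math.exp(-(a * logit(p) + b)))
```

If the forest is systematically overconfident at the new center, the fitted slope shrinks toward zero and predictions are pulled toward the locally observed event rate.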

  7. Is email a reliable means of contacting authors of previously published papers? A study of the Emergency Medicine Journal for 2001.

    PubMed

    O'Leary, F

    2003-07-01

    To determine whether it is possible to contact authors of previously published papers via email. A cross sectional study of the Emergency Medicine Journal for 2001. 118 articles were included in the study. The response rate from those with valid email addresses was 73%. There was no statistical difference between the type of email address used and the address being invalid (p=0.392) or between the type of article and the likelihood of a reply (p=0.197). More responses were obtained from work addresses when compared with Hotmail addresses (86% v 57%, p=0.02). Email is a valid means of contacting authors of previously published articles, particularly within the emergency medicine specialty. A work based email address may be a more valid means of contact than a Hotmail address.
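The comparison of response rates in this record (86% for work addresses vs 57% for Hotmail, p=0.02) is the kind of result a two-proportion z-test produces. The abstract reports only percentages, so the counts in the example below are invented for illustration:

```python
import math

def two_proportion_z(successes1, n1, successes2, n2):
    """Two-sided two-proportion z-test with a pooled standard error,
    a standard way to compare two response rates."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts roughly matching the reported 86% vs 57% rates
z, p = two_proportion_z(43, 50, 17, 30)
```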

  8. Relationship Between the Menstrual Cycle and Timing of Ovulation Revealed by New Protocols: Analysis of Data from a Self-Tracking Health App.

    PubMed

    Sohda, Satoshi; Suzuki, Kenta; Igari, Ichiro

    2017-11-27

    There are many mobile phone apps aimed at helping women map their ovulation and menstrual cycles and facilitating successful conception (or avoiding pregnancy). These apps usually ask users to input various biological features and have accumulated the menstrual cycle data of a vast number of women. The purpose of our study was to clarify how the data obtained from a self-tracking health app for female mobile phone users can be used to improve the accuracy of prediction of the date of next ovulation. Using the data of 7043 women who had reliable menstrual and ovulation records out of 8,000,000 users of a mobile phone app of a health care service, we analyzed the relationship between the menstrual cycle length, follicular phase length, and luteal phase length. Then we fitted a linear function to the relationship between the length of the menstrual cycle and timing of ovulation and compared it with the existing calendar-based methods. The correlation between the length of the menstrual cycle and the length of the follicular phase was stronger than the correlation between the length of the menstrual cycle and the length of the luteal phase, and there was a positive correlation between the lengths of past and future menstrual cycles. A strong positive correlation was also found between the mean length of past cycles and the length of the follicular phase. The correlation between the mean cycle length and the luteal phase length was also statistically significant. In most of the subjects, our method (ie, the calendar-based method based on the optimized function) outperformed the Ogino method of predicting the next ovulation date. Our method also outperformed the ovulation date prediction method that assumes the middle day of a mean menstrual cycle as the date of the next ovulation. The large number of subjects allowed us to capture the relationships between the lengths of the menstrual cycle, follicular phase, and luteal phase in more detail than previous studies. 
We then demonstrated how the present calendar methods could be improved by the better grouping of women. This study suggested that even without integrating various biological metrics, the dataset collected by a self-tracking app can be used to develop formulas that predict the ovulation day when the data are aggregated. Because the method that we developed requires data only on the first day of menstruation, it would be the best option for couples during the early stages of their attempt to have a baby or for those who want to avoid the cost associated with other methods. Moreover, the result will be the baseline for more advanced methods that integrate other biological metrics. ©Satoshi Sohda, Kenta Suzuki, Ichiro Igari. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 27.11.2017.
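The calendar-based method described above can be sketched as a least-squares fit of follicular-phase length against mean cycle length. The data and fitted coefficients below are made up for illustration (they happen to encode an exactly 14-day luteal phase); the paper's function was fitted on the app's real records.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict_ovulation_day(mean_cycle_len, slope, intercept):
    """Days from the first day of menstruation to predicted ovulation,
    i.e. the predicted follicular-phase length."""
    return slope * mean_cycle_len + intercept

# Hypothetical records: (mean cycle length, observed follicular length)
cycles = [25, 27, 28, 29, 31, 33]
follicular = [11, 13, 14, 15, 17, 19]
s, c = fit_line(cycles, follicular)
```

The predicted ovulation date is then the first day of the last menses plus this follicular-phase estimate; the Ogino-style rules mentioned in the abstract instead count back a fixed 12-16 days from the expected next menses or take the midpoint of a mean cycle.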

  9. Anaesthetic efficacy of articaine versus lidocaine in children's dentistry: a systematic review and meta-analysis.

    PubMed

    Tong, Huei Jinn; Alzahrani, Fatma Salem; Sim, Yu Fan; Tahmassebi, Jinous F; Duggal, Monty

    2018-04-10

    Over the last few years, numerous reviews and studies have awarded articaine hydrochloride local anaesthetic (LA) a superior reputation, with outcomes of different studies demonstrating a general tendency for articaine hydrochloride to outperform lidocaine hydrochloride for dental treatment. Nevertheless, there seems to be no clear agreement on which LA solution is more efficacious in dental treatment for children. There is no previous publication systematically reviewing and summarising the current best evidence with respect to the success rates of LA solutions in children. To evaluate the available evidence on the efficacy of lidocaine and articaine used in paediatric dentistry. A systematic search was conducted on the Cochrane CENTRAL Register of Controlled Trials, MEDLINE (OVID; 1950 to June 2017), Cumulative Index to Nursing and Allied Health Literature (CINAHL; EBSCOhost; 1982 to June 2017), EMBASE (OVID; 1980 to June 2017), SCI-EXPANDED (ISI Web of Knowledge; 1900 to June 2017), key journals, and previous review bibliographies through June 2017. Original research studies that compared articaine with lidocaine for dental treatment in children were included. Methodological quality assessment and assessment of risk of bias were carried out for each of the included studies. Electronic searching identified 525 publications. Following the primary and secondary assessment process, six randomised controlled trials (RCTs) were included in the final analysis. There was no difference in patient self-reported pain between articaine and lidocaine during treatment procedures (SMD = 0.06, P-value = 0.614), and no difference in the occurrence of adverse events between articaine and lidocaine injections following treatment in paediatric patients (RR = 1.10, P-value = 0.863). Yet, patients reported significantly less pain post-procedure following articaine injections (SMD = 0.37, P-value = 0.013). 
Substantial heterogeneity was noted in the reporting of outcomes among studies, with the majority of studies being at high risk of bias. There is low-quality evidence suggesting that articaine infiltration and lidocaine inferior alveolar nerve blocks present the same efficacy when used for routine dental treatments, with no difference in patient self-reported pain between articaine and lidocaine during treatment procedures. Yet, significantly less pain post-procedure was reported following articaine injections. There was no difference in the occurrence of adverse events between articaine and lidocaine injections following treatment in paediatric patients. © 2018 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. RCK: accurate and efficient inference of sequence- and structure-based protein-RNA binding models from RNAcompete data.

    PubMed

    Orenstein, Yaron; Wang, Yuhao; Berger, Bonnie

    2016-06-15

    Protein-RNA interactions, which play vital roles in many processes, are mediated through both RNA sequence and structure. CLIP-based methods, which measure protein-RNA binding in vivo, suffer from experimental noise and systematic biases, whereas in vitro experiments capture a clearer signal of protein-RNA binding. Among them, RNAcompete provides binding affinities of a specific protein to more than 240,000 unstructured RNA probes in one experiment. The computational challenge is to infer RNA structure- and sequence-based binding models from these data. The state of the art in sequence models, DeepBind, does not model structural preferences. RNAcontext models both sequence and structure preferences, but is outperformed by GraphProt. Unfortunately, GraphProt cannot detect structural preferences from RNAcompete data due to the unstructured nature of the data, as noted by its developers, nor can it be tractably run on the full RNAcompete dataset. We develop RCK, an efficient, scalable algorithm that infers both sequence and structure preferences based on a new k-mer based model. Remarkably, even though RNAcompete data is designed to be unstructured, RCK can still learn structural preferences from it. RCK significantly outperforms both RNAcontext and DeepBind in in vitro binding prediction for 244 RNAcompete experiments. Moreover, RCK is also faster and uses less memory, which enables scalability. While currently on par with existing methods in in vivo binding prediction on a small-scale test, we demonstrate that RCK will increasingly benefit from experimentally measured RNA structure profiles as compared to computationally predicted ones. By running RCK on the entire RNAcompete dataset, we generate and provide as a resource a set of protein-RNA structure-based models on an unprecedented scale. Software and models are freely available at http://rck.csail.mit.edu/. Contact: bab@mit.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
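The k-mer based scoring at the heart of a model like RCK can be caricatured as follows. This sketch covers only the sequence half (RCK additionally learns weights for each k-mer in each predicted structural context, which is omitted here), and the weights are invented for illustration:

```python
def kmer_score(seq, weights, k=3):
    """Score a sequence as the mean of learned k-mer weights over all
    overlapping k-mers -- the sequence component of a k-mer binding
    model (structural contexts omitted in this sketch)."""
    scores = [weights.get(seq[i:i + k], 0.0) for i in range(len(seq) - k + 1)]
    return sum(scores) / max(len(scores), 1)

# Illustrative weights for a hypothetical protein preferring UGU motifs
weights = {"UGU": 2.0, "GUU": 1.0, "AAA": -0.5}
```

In practice the weights would be fitted to the measured RNAcompete binding affinities; probes can then be ranked by this score to evaluate in vitro binding prediction.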

  11. Development and Validation of a Novel Robotic Procedure Specific Simulation Platform: Partial Nephrectomy.

    PubMed

    Hung, Andrew J; Shah, Swar H; Dalag, Leonard; Shin, Daniel; Gill, Inderbir S

    2015-08-01

    We developed a novel procedure specific simulation platform for robotic partial nephrectomy. In this study we prospectively evaluate its face, content, construct and concurrent validity. This hybrid platform features augmented reality and virtual reality. Augmented reality involves 3-dimensional robotic partial nephrectomy surgical videos overlaid with virtual instruments to teach surgical anatomy, technical skills and operative steps. Advanced technical skills are assessed with an embedded full virtual reality renorrhaphy task. Participants were classified as novice (no surgical training, n=15), intermediate (less than 100 robotic cases, n=13) or expert (100 or more robotic cases, n=14) and prospectively assessed. Cohort performance was compared with the Kruskal-Wallis test (construct validity). A post-study questionnaire was used to assess the realism of the simulation (face validity) and its usefulness for training (content validity). Concurrent validity evaluated the correlation between the virtual reality renorrhaphy task and live porcine robotic partial nephrectomy performance (Spearman's analysis). Experts rated the augmented reality content as realistic (median 8/10) and helpful for resident/fellow training (8.0-8.2/10). Experts rated the platform highly for teaching anatomy (9/10) and operative steps (8.5/10) but moderately for technical skills (7.5/10). Experts and intermediates outperformed novices (construct validity) in efficiency (p=0.0002) and accuracy (p=0.002). For virtual reality renorrhaphy, experts outperformed intermediates on GEARS metrics (p=0.002). Virtual reality renorrhaphy and in vivo porcine robotic partial nephrectomy performance correlated significantly (r=0.8, p <0.0001) (concurrent validity). This augmented reality simulation platform displayed face, content and construct validity. Performance in the procedure specific virtual reality task correlated highly with a porcine model (concurrent validity). 
Future efforts will integrate procedure specific virtual reality tasks and their global assessment. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
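The concurrent-validity statistic reported here (Spearman's r=0.8) is the Pearson correlation of rank-transformed scores. A minimal sketch, with tie handling via average ranks (the scores below are invented, not the study's data):

```python
import math

def rank(xs):
    """Average 1-based ranks, assigning tied values their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho: the Pearson correlation of the ranks."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / math.sqrt(vx * vy)
```

Here the two inputs would be each participant's virtual reality renorrhaphy score and their live porcine performance score.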

  12. Predicting 30-Day Hospital Readmissions in Acute Myocardial Infarction: The AMI "READMITS" (Renal Function, Elevated Brain Natriuretic Peptide, Age, Diabetes Mellitus, Nonmale Sex, Intervention with Timely Percutaneous Coronary Intervention, and Low Systolic Blood Pressure) Score.

    PubMed

    Nguyen, Oanh Kieu; Makam, Anil N; Clark, Christopher; Zhang, Song; Das, Sandeep R; Halm, Ethan A

    2018-04-17

    Readmissions after hospitalization for acute myocardial infarction (AMI) are common. However, the few currently available AMI readmission risk prediction models have poor-to-modest predictive ability and are not readily actionable in real time. We sought to develop an actionable and accurate AMI readmission risk prediction model to identify high-risk patients as early as possible during hospitalization. We used electronic health record data from consecutive AMI hospitalizations from 6 hospitals in north Texas from 2009 to 2010 to derive and validate models predicting all-cause nonelective 30-day readmissions, using stepwise backward selection and 5-fold cross-validation. Of 826 patients hospitalized with AMI, 13% had a 30-day readmission. The first-day AMI model (the AMI "READMITS" score) included 7 predictors: renal function, elevated brain natriuretic peptide, age, diabetes mellitus, nonmale sex, intervention with timely percutaneous coronary intervention, and low systolic blood pressure, had an optimism-corrected C-statistic of 0.73 (95% confidence interval, 0.71-0.74) and was well calibrated. The full-stay AMI model, which included 3 additional predictors (use of intravenous diuretics, anemia on discharge, and discharge to postacute care), had an optimism-corrected C-statistic of 0.75 (95% confidence interval, 0.74-0.76) with minimally improved net reclassification and calibration. Both AMI models outperformed corresponding multicondition readmission models. The parsimonious AMI READMITS score enables early prospective identification of high-risk AMI patients for targeted readmissions reduction interventions within the first 24 hours of hospitalization. A full-stay AMI readmission model only modestly outperformed the AMI READMITS score in terms of discrimination, but surprisingly did not meaningfully improve reclassification. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
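As a rough illustration of how a parsimonious bedside score like READMITS is applied, the sketch below counts how many of the seven named risk factors a patient has. The thresholds (eGFR < 60, age ≥ 65, SBP < 100 mmHg) and the equal one-point weighting are assumptions for illustration only; the published score derives its own point values from the fitted regression coefficients.

```python
def readmits_points(patient):
    """Count how many of the seven READMITS-style risk factors are
    present. NOTE: thresholds and equal 1-point weighting here are
    illustrative, not the published score's point assignments."""
    factors = [
        patient["egfr"] < 60,        # impaired renal function
        patient["bnp_elevated"],     # elevated brain natriuretic peptide
        patient["age"] >= 65,        # older age
        patient["diabetes"],         # diabetes mellitus
        not patient["male"],         # nonmale sex
        not patient["timely_pci"],   # no timely percutaneous intervention
        patient["sbp"] < 100,        # low systolic blood pressure
    ]
    return sum(factors)
```

Because all seven inputs are available within the first day of hospitalization, such a score can flag high-risk patients early, which is the point the abstract makes about actionability.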

  13. Detection of prostate cancer using magnetic resonance imaging/ultrasonography image-fusion targeted biopsy in African-American men.

    PubMed

    Shin, Toshitaka; Smyth, Thomas B; Ukimura, Osamu; Ahmadi, Nariman; de Castro Abreu, Andre Luis; Oishi, Masakatsu; Mimata, Hiromitsu; Gill, Inderbir S

    2017-08-01

    To assess the diagnostic yield of targeted prostate biopsy in African-American (A-A) men using image fusion of multi-parametric magnetic resonance imaging (mpMRI) with real-time transrectal ultrasonography (US). We retrospectively analysed 661 patients (117 A-A and 544 Caucasian) who had mpMRI before biopsy and then underwent MRI/US image-fusion targeted biopsy (FTB) between October 2012 and August 2015. The mpMRIs were reported on a 5-point Likert scale of suspicion. Clinically significant prostate cancer (CSPC) was defined as biopsy Gleason score ≥7. After controlling for age, prostate-specific antigen level and prostate volume, there were no significant differences between A-A and Caucasian men in the detection rate of overall cancer (35.0% vs 34.2%, P = 0.9) and CSPC (18.8% vs 21.7%, P = 0.3) with MRI/US FTB. There were no significant differences between the races in the location of dominant lesions on mpMRI, and in the proportion of 5-point Likert scoring. In A-A men, MRI/US FTB from grade 4-5 lesions outperformed random biopsy in the detection rate of overall cancer (70.6% vs 37.2%, P = 0.003) and CSPC (52.9% vs 12.4%, P < 0.001). MRI/US FTB outperformed random biopsy in cancer core length (5.0 vs 2.4 mm, P = 0.001), in cancer rate per core (24.9% vs 6.8%, P < 0.001), and in efficiency for detecting one patient with CSPC (mean number of cores needed 13.3 vs 81.9, P < 0.001). Our key finding confirms a lack of racial difference in the detection rate of overall prostate cancers and CSPC with MRI/US FTB between A-A and Caucasian men. MRI/US FTB detected more CSPC using fewer cores compared with random biopsy. © 2017 The Authors. BJU International © 2017 BJU International. Published by John Wiley & Sons Ltd.

  14. Effective? Engaging? Secure? Applying the ORCHA-24 framework to evaluate apps for chronic insomnia disorder.

    PubMed

    Leigh, Simon; Ouyang, Jing; Mimnagh, Chris

    2017-11-01

    Mobile health offers many opportunities; however, the 'side-effects' of health apps are often unclear. With no guarantee that health apps first do no harm, their role as a viable, safe and effective therapeutic option is limited. To assess the quality of apps for chronic insomnia disorder, available on the Android Google Play Store, and determine whether a novel approach to app assessment could identify high-quality and low-risk health apps in the absence of indicators such as National Health Service (NHS) approval. The Organisation for the Review of Care and Health Applications 24-Question Assessment (ORCHA-24), a set of 24 app assessment criteria concerning data privacy, clinical efficacy and user experience, each answered on a 'yes' or 'no', evidence-driven basis, was applied to assess 18 insomnia apps identified via the Android Google Play Store, in addition to the NHS-approved iOS app Sleepio. 63.2% of apps (12/19) provided a privacy policy, with seven (36.8%) stating no user data would be shared without explicit consent. 10.5% (2/19) stated they had been shown to be of benefit to those with insomnia, with cognitive behavioural therapy apps outperforming hypnosis and meditation apps (p=0.046). Both the number of app downloads (p=0.29) and user-review scores (p=0.23) were unrelated to ORCHA-24 scores. The NHS-approved app Sleepio consistently outperformed non-accredited apps across all domains of the ORCHA-24. Apps for chronic insomnia disorder exhibit substantial variation in adherence to published data privacy, user experience and clinical efficacy standards, which are not clearly correlated with app downloads or user-review scores. In the absence of formal app accreditation, the ORCHA-24 could feasibly be used to highlight the risk-benefit profiles of health apps prior to downloading. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
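Because the ORCHA-24 is answered on a binary, evidence-driven basis, summarising an app's assessment reduces to counting criteria met per domain. The sketch below assumes a simple unweighted 8/8/8 split of the 24 items across the three domains; the framework's actual domain sizes and any weighting may differ.

```python
def orcha24_summary(answers):
    """answers: dict mapping domain -> list of yes/no booleans
    (data privacy, clinical efficacy, user experience; 24 items total).
    Returns the per-domain and overall fractions of criteria met."""
    per_domain = {d: sum(a) / len(a) for d, a in answers.items()}
    total = sum(sum(a) for a in answers.values())
    n = sum(len(a) for a in answers.values())
    return per_domain, total / n

# Hypothetical assessment of one insomnia app
answers = {
    "data_privacy":      [True, False, True, True, False, True, True, False],
    "clinical_efficacy": [False] * 6 + [True, True],
    "user_experience":   [True] * 5 + [False] * 3,
}
```

Low per-domain fractions (e.g. clinical efficacy here) are what the study found to be uncorrelated with download counts and user-review scores.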

  15. Can CT and MR Shape and Textural Features Differentiate Benign Versus Malignant Pleural Lesions?

    PubMed

    Pena, Elena; Ojiaku, MacArinze; Inacio, Joao R; Gupta, Ashish; Macdonald, D Blair; Shabana, Wael; Seely, Jean M; Rybicki, Frank J; Dennie, Carole; Thornhill, Rebecca E

    2017-10-01

    The study aimed to identify a radiomic approach based on CT and/or magnetic resonance (MR) features (shape and texture) that may help differentiate benign versus malignant pleural lesions, and to assess whether the radiomic model may improve the confidence and accuracy of radiologists with different subspecialty backgrounds. Twenty-nine patients with pleural lesions studied on both contrast-enhanced CT and MR imaging were reviewed retrospectively. Three texture and three shape features were extracted. Combinations of features were used to generate logistic regression models using histopathology as outcome. Two thoracic and two abdominal radiologists evaluated their degree of confidence in malignancy. Diagnostic accuracy of radiologists was determined using contingency tables. Cohen's kappa coefficient was used to assess inter-reader agreement. Using optimal threshold criteria, the sensitivity, specificity, and accuracy of each feature and combination of features were obtained and compared to the accuracy and confidence of the radiologists. The CT model that best discriminated malignant from benign lesions revealed an AUC of 0.92 ± 0.05 (P < 0.0001). The most discriminative MR model showed an AUC of 0.87 ± 0.09 (P < 0.0001). The CT model was compared to the diagnostic confidence of all radiologists and the model outperformed both abdominal radiologists (P < 0.002), whereas the top discriminative MR model outperformed one of the abdominal radiologists (P = 0.02). The most discriminative MR model was more accurate than one abdominal (P = 0.04) and one thoracic radiologist (P = 0.02). Quantitative textural and shape analysis may help distinguish malignant from benign lesions. A radiomics-based approach may increase the diagnostic confidence of abdominal radiologists on CT and MR and may potentially improve radiologists' accuracy in the assessment of pleural lesions characterized by MR. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
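The AUC figures reported for the CT and MR models are typically computed from model scores and histopathology labels via the rank-based (Mann-Whitney) estimator, sketched here with invented scores:

```python
def auc(scores, labels):
    """AUC as the Mann-Whitney probability that a randomly chosen
    malignant case (label 1) scores higher than a randomly chosen
    benign case (label 0); ties contribute 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Here the scores would be the logistic regression model's predicted probabilities of malignancy and the labels the histopathology outcomes.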

  16. Retracted articles in surgery journals. What are surgeons doing wrong?

    PubMed

    Cassão, Bruna Dell'Acqua; Herbella, Fernando A M; Schlottmann, Francisco; Patti, Marco G

    2018-06-01

    Retraction of previously published scientific articles is an important mechanism to preserve the integrity of scientific work. This study analyzed retractions of previously published articles from surgery journals. We searched for retracted articles in the 100 surgery journals with the highest SJR2 indicator grades. We found 130 retracted articles in 49 journals (49%). Five or more retracted articles were published in 8 journals (8%). The mean time between publication and retraction was 26 months (range 1 to 158 months). The United States, China, Germany, Japan, and the United Kingdom accounted for more than 3 out of 4 of the retracted articles. The greatest number of retractions came from manuscripts about orthopedics and traumatology, general surgery, anesthesiology, cardiothoracic surgery, and plastic surgery. Nonsurgeons were responsible for 16% of retractions in these surgery journals. The main reasons for retraction were duplicate publication (42%), plagiarism (16%), absence of proven integrity of the study (14%), incorrect data (13%), data published without authorization (12%), violation of research ethics (11%), documented fraud (11%), request of an author(s) (5%), and unknown (3%). In 25% of the retracted articles, other publications by the same authors also had been retracted. Retraction of published articles does not occur frequently in surgery journals. Some form of scientific misconduct was present in the majority of retractions, especially duplication of publication and plagiarism. Retractions of previously published articles were most frequent from countries with the greatest number of publications; some authors showed recidivism. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...

  18. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  19. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2012, to the most recent Index published prior to December 1, 2013... change in the cost of living during the period from the most recent index published prior to the previous...

  20. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...

  1. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2012, to the most recent Index published prior to December 1, 2013... change in the cost of living during the period from the most recent index published prior to the previous...

  2. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  3. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  4. 37 CFR 253.10 - Cost of living adjustment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...

  5. 37 CFR 381.10 - Cost of living adjustment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...

  6. A Two-Locus Model of the Evolution of Insecticide Resistance to Inform and Optimise Public Health Insecticide Deployment Strategies

    PubMed Central

    2017-01-01

    We develop a flexible, two-locus model for the spread of insecticide resistance applicable to mosquito species that transmit human diseases such as malaria. The model allows differential exposure of males and females, allows them to encounter high or low concentrations of insecticide, and allows selection pressures and dominance values to differ depending on the concentration of insecticide encountered. We demonstrate its application by investigating the relative merits of sequential use of insecticides versus their deployment as a mixture to minimise the spread of resistance. We recover previously published results as subsets of this model and conduct a sensitivity analysis over an extensive parameter space to identify what circumstances favour mixtures over sequences. Both strategies lasted more than 500 mosquito generations (or about 40 years) in 24% of runs, while in those runs where resistance had spread to high levels by 500 generations, 56% favoured sequential use and 44% favoured mixtures. Mixtures are favoured when insecticide effectiveness (their ability to kill homozygous susceptible mosquitoes) is high and exposure (the proportion of mosquitoes that encounter the insecticide) is low. If insecticides do not reliably kill homozygous sensitive genotypes, it is likely that sequential deployment will be a more robust strategy. Resistance to an insecticide always spreads slower if that insecticide is used in a mixture although this may be insufficient to outperform sequential use: for example, a mixture may last 5 years while the two insecticides deployed individually may last 3 and 4 years giving an overall ‘lifespan’ of 7 years for sequential use. We emphasise that this paper is primarily about designing and implementing a flexible modelling strategy to investigate the spread of insecticide resistance in vector populations and demonstrate how our model can identify vector control strategies most likely to minimise the spread of insecticide resistance. 
PMID:28095406
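A drastically simplified version of the mixture-versus-sequence question can be sketched with a deterministic haploid caricature (the paper's model is diploid, two-locus, with dominance and sex-specific exposure, none of which is captured here). Parameter names follow the abstract: e is insecticide effectiveness, x is exposure; the "redundant killing" term in the mixture update is why per-locus spread is slower when two compounds are deployed together.

```python
def step_single(p, x, e):
    """One generation of haploid selection when one insecticide is
    deployed alone: resistant (R) alleles survive, susceptible (S)
    alleles are killed with probability e if exposed (probability x)."""
    w_r = 1.0
    w_s = (1 - x) + x * (1 - e)
    return p * w_r / (p * w_r + (1 - p) * w_s)

def step_mixture(p1, p2, x, e):
    """One generation under a mixture, assuming linkage equilibrium:
    an exposed mosquito must survive both compounds, so an R allele at
    one locus is still removed when its carrier is susceptible at the
    other locus ('redundant killing')."""
    def update(p_self, p_other):
        survive_other = 1 - (1 - p_other) * e
        w_r = (1 - x) + x * survive_other
        w_s = (1 - x) + x * (1 - e) * survive_other
        return p_self * w_r / (p_self * w_r + (1 - p_self) * w_s)
    return update(p1, p2), update(p2, p1)
```

Iterating these updates shows resistance rising much more slowly per locus under the mixture, while both loci evolve simultaneously; comparing the generations until a resistance threshold is reached under each policy reproduces the abstract's trade-off between slower spread under a mixture and the combined "lifespan" of sequential use.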

  7. Development of a methodology for the standardisation and improvement of 'Smartphone' photography of patterned bruises and other cutaneous injuries.

    PubMed

    Biggs, Paul R; Evans, Samuel T; Jones, Michael D; Theobald, Peter S

    2013-09-01

    Human bite-mark analyses can play a prominent role in forensic case investigations, including those involving sexual assault. High-quality photographs routinely secure a link between a bite-mark and an individual's dentition. Access to around-the-clock forensic photography, however, is often limited, resulting in delay and/or missed opportunities to record valuable evidence. The emergence of Smartphone high-quality photographic technology now provides a previously unimagined opportunity to gather timely forensic photographic evidence. Problems can arise, however, due to the relatively poor quality of the photographs, as a result of many of those taking photographs having received little or no forensic photography training. This study compares unassisted photography with assisted photography, by a specifically developed camera application (App), to provide a standardised method for taking forensic photographs. An App, written in Java, was hosted on the Google Android Operating System, on a Samsung Galaxy SII Smartphone. Twenty-four volunteers participated in a study to photograph a pseudo bite-mark using three methods: (1) unassisted (as a control), (2) assisted by an ABFO No.2 right-angled photographic reference scale, and (3) assisted by the App. The App, method (3), was shown to consistently outperform methods (1) and (2), demonstrating greater standardisation and precision (p<0.001). Analysis of the data showed the extent to which acquiring an accurate photograph depends on the image being orthogonal to the camera. It appears likely that the relatively inaccurate photographs acquired by methods (1) and (2) were a result of deviation from the plane orthogonal to the bite-mark. Therefore, the App was successful in ensuring that the camera was both orthogonal and at an appropriate distance, relative to the bite-mark. 
Thus, the App enhanced the abilities of non-experts to acquire more accurate photographs and created the potential to significantly improve the quality of forensic photographs. Copyright © 2013. Published by Elsevier Ireland Ltd.

  8. Understanding the Barriers to Hiring and Promoting Women in Surgical Subspecialties.

    PubMed

    Valsangkar, Nakul; Fecher, Alison M; Rozycki, Grace S; Blanton, Cassie; Bell, Teresa M; Freischlag, Julie; Ahuja, Nita; Zimmers, Teresa A; Koniaris, Leonidas G

    2016-08-01

    The objective of this study was to characterize potential disparities in academic output, NIH funding, and academic rank between male and female surgical faculty and to identify subspecialties in which these differences may be more pronounced. Eighty metrics for 4,015 faculty members at the top 55 NIH-funded departments of surgery were collected. Demographic characteristics, NIH funding details, and scholarly output were analyzed. A new metric, academic velocity (V), reflecting recent citations, is defined. Overall, 21.5% of surgical faculty are women. The percentage of female faculty is highest in science/research (41%) and surgical oncology (34%), and lowest in cardiothoracic surgery (9%). Female faculty are less likely to be full professors (22.7% vs 41.2%) or division chiefs (6.2% vs 13.6%). The fraction of women who are full professors is lowest in cardiothoracic surgery. Overall median numbers of publications/citations are lower for female than for male surgical faculty (21/364 vs 43/723; p < 0.001), and these differences are more pronounced for assistant professors. Rates of current/previous NIH funding are similar between women and men (21.3% vs 24%; p = NS), and surgical departments with more female full professors have higher NIH funding rankings (R(2) = 0.14, p < 0.05). In certain subspecialties, female associate and full professors outperform their male counterparts. Overall, female authors have higher numbers of more recent citations. Differences in subspecialty involvement and academic performance by sex vary greatly by subspecialty type and are most pronounced at the assistant professor level. Identification of potential barriers to the entry of women into certain subspecialties, the causes of the observed lower number of publications/citations among female assistant professors, and the obstacles to attaining leadership roles need to be determined. We propose a new metric for assessment of publications/citations that can offset the effects of seniority differences between male and female faculty members. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  9. A numerical study of combined use of two biocontrol agents with different biocontrol mechanisms in controlling foliar pathogens.

    PubMed

    Xu, X-M; Jeffries, P; Pautasso, M; Jeger, M J

    2011-09-01

    Effective use of biocontrol agents is an important component of sustainable agriculture. A previous numerical study of a generic model showed that biocontrol efficacy was greatest for a single biocontrol agent (BCA) combining competition with mycoparasitism or antibiosis. This study uses the same mathematical model to investigate whether the biocontrol efficacy of combined use of two BCAs with different biocontrol mechanisms is greater than that of a single BCA with either or both of the two mechanisms, assuming that two BCAs occupy the same host tissue as the pathogen. Within the parameter values considered, a BCA with two biocontrol mechanisms always outperformed the combined use of two BCAs with a single but different biocontrol mechanism. Similarly, combined use of two BCAs with a single but different biocontrol mechanism is shown to be far less effective than that of a single BCA with both mechanisms. Disease suppression from combined use of two BCAs was very similar to that achieved by the more efficacious one. As expected, a higher BCA introduction rate led to increased disease suppression. Incorporation of interactions between two BCAs did not greatly affect the disease dynamics except when a mycoparasitic and, to a lesser extent, an antibiotic-producing BCA was involved. Increasing the competitiveness of a mycoparasitic BCA over a BCA whose biocontrol mechanism is either competition or antibiosis may lead to improved biocontrol initially and reduced fluctuations in disease dynamics. The present study suggests that, under the model assumptions, combined use of two BCAs with different biocontrol mechanisms in most cases only results in control efficacies similar to using the more efficacious one alone. These predictions are consistent with published experimental results, suggesting that combined use of BCAs should not be recommended without clear understanding of their main biocontrol mechanisms and relative competitiveness, and experimental evaluation.

  10. Multisite external validation of a risk prediction model for the diagnosis of blood stream infections in febrile pediatric oncology patients without severe neutropenia.

    PubMed

    Esbenshade, Adam J; Zhao, Zhiguo; Aftandilian, Catherine; Saab, Raya; Wattier, Rachel L; Beauchemin, Melissa; Miller, Tamara P; Wilkes, Jennifer J; Kelly, Michael J; Fernbach, Alison; Jeng, Michael; Schwartz, Cindy L; Dvorak, Christopher C; Shyr, Yu; Moons, Karl G M; Sulis, Maria-Luisa; Friedman, Debra L

    2017-10-01

    Pediatric oncology patients are at increased risk of invasive bacterial infection due to immunosuppression. The risk of such infection in the absence of severe neutropenia (absolute neutrophil count ≥ 500/μL) is not well established, and a validated prediction model for blood stream infection (BSI) risk would offer clinical utility. A 6-site retrospective external validation was conducted using a previously published risk prediction model for BSI in febrile pediatric oncology patients without severe neutropenia: the Esbenshade/Vanderbilt (EsVan) model. A reduced model (EsVan2), excluding 2 less clinically reliable variables, also was created using the initial EsVan model derivation cohort and was validated using all 5 external validation cohorts. One data set was used only in sensitivity analyses because some variables were missing. Across the 5 primary data sets, there were a total of 1197 febrile episodes and 76 episodes of bacteremia. The overall C statistic for predicting bacteremia was 0.695, with a calibration slope of 0.50 for the original model and a calibration slope of 1.0 when recalibration was applied to the model. The model performed better in predicting high-risk bacteremia (gram-negative or Staphylococcus aureus infection) than BSI overall, with a C statistic of 0.801 and a calibration slope of 0.65. The EsVan2 model outperformed the EsVan model across data sets, with a C statistic of 0.733 for predicting BSI and a C statistic of 0.841 for high-risk BSI. The results of this external validation demonstrate that the EsVan and EsVan2 models are able to predict BSI across multiple performance sites and, once validated and implemented prospectively, could assist decision making in clinical practice. Cancer 2017;123:3781-3790. © 2017 American Cancer Society.
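
    The C statistics reported in this record measure discrimination: the probability that a randomly chosen bacteremia episode was assigned a higher predicted risk than a randomly chosen episode without bacteremia. A minimal pairwise sketch of that computation (illustrative only, not the authors' code; real validation studies usually obtain it from ROC analysis):

```python
import itertools

def c_statistic(risks, outcomes):
    """Concordance (C) statistic for a binary outcome.

    Fraction of (event, non-event) pairs in which the event case
    received the higher predicted risk; ties count as 0.5.
    """
    events = [r for r, y in zip(risks, outcomes) if y == 1]
    nonevents = [r for r, y in zip(risks, outcomes) if y == 0]
    concordant = 0.0
    for e, n in itertools.product(events, nonevents):
        if e > n:
            concordant += 1.0
        elif e == n:
            concordant += 0.5
    return concordant / (len(events) * len(nonevents))
```

    The calibration slope quoted alongside it is obtained separately, by regressing observed outcomes on the model's linear predictor.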

  11. Comparison of the HEART and TIMI Risk Scores for Suspected Acute Coronary Syndrome in the Emergency Department.

    PubMed

    Sun, Benjamin C; Laurie, Amber; Fu, Rongwei; Ferencik, Maros; Shapiro, Michael; Lindsell, Christopher J; Diercks, Deborah; Hoekstra, James W; Hollander, Judd E; Kirk, J Douglas; Peacock, W Frank; Anantharaman, Venkataraman; Pollack, Charles V

    2016-03-01

    The emergency department evaluation for suspected acute coronary syndrome (ACS) is common, costly, and challenging. Risk scores may help standardize clinical care and screening for research studies. The Thrombolysis in Myocardial Infarction (TIMI) and HEART scores are two commonly cited risk scores. We tested the null hypothesis that the TIMI and HEART risk scores have equivalent test characteristics. We analyzed data from the Internet Tracking Registry of Acute Coronary Syndromes (i*trACS) from 9 EDs on patients with suspected ACS, 1999-2001. We excluded patients with an emergency department diagnosis consistent with ACS, or without sufficient data to calculate TIMI and HEART scores. The primary outcome was 30-day major adverse cardiovascular events, including all-cause death, acute myocardial infarction, and urgent revascularization. We describe test characteristics of the TIMI and HEART risk scores. The study cohort included 8255 patients with 508 (6.2%) 30-day major adverse cardiovascular events. Receiver operating characteristic and reclassification analyses favored HEART [c statistic: 0.753, 95% confidence interval (CI): 0.733-0.773; continuous net reclassification improvement: 0.608, 95% CI: 0.527-0.689] over TIMI (c statistic: 0.678, 95% CI: 0.655-0.702). A HEART score of 0-3 [negative predictive value (NPV) 0.982, 95% CI: 0.978-0.986; positive predictive value (PPV) 0.103, 95% CI: 0.094-0.113; likelihood ratio (LR) positive 1.76; LR negative 0.28] demonstrates similar or superior NPV/PPV/LR compared with TIMI = 0 (NPV 0.978, 95% CI: 0.971-0.983; PPV 0.077, 95% CI: 0.071-0.084; LR positive 1.28; LR negative 0.35) and TIMI = 0-1 (NPV 0.963, 95% CI: 0.958-0.968; PPV 0.102, 95% CI: 0.092-0.113; LR positive 1.73; LR negative 0.58). The HEART score has better discrimination than TIMI and outperforms TIMI within previously published "low-risk" categories.
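
    The NPV, PPV, and likelihood ratios compared in this record all derive from the same 2×2 table of rule-positive/rule-negative patients against 30-day events. A small sketch of those definitions (the counts below are illustrative, not the i*trACS data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Test characteristics of a binary decision rule from 2x2 counts.

    tp: rule-positive patients who had a 30-day event
    fp: rule-positive patients with no event
    fn: rule-negative patients who had an event
    tn: rule-negative patients with no event
    """
    sens = tp / (tp + fn)        # sensitivity
    spec = tn / (tn + fp)        # specificity
    ppv = tp / (tp + fp)         # positive predictive value
    npv = tn / (tn + fn)         # negative predictive value
    lr_pos = sens / (1 - spec)   # likelihood ratio positive
    lr_neg = (1 - sens) / spec   # likelihood ratio negative
    return {"sens": sens, "spec": spec, "ppv": ppv,
            "npv": npv, "lr+": lr_pos, "lr-": lr_neg}
```

    Note that PPV and NPV depend on the event prevalence in the cohort, which is one reason such figures are reported together with likelihood ratios.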

  12. Behavioural and neural basis of anomalous motor learning in children with autism.

    PubMed

    Marko, Mollie K; Crocetti, Deana; Hulst, Thomas; Donchin, Opher; Shadmehr, Reza; Mostofsky, Stewart H

    2015-03-01

    Autism spectrum disorder is a developmental disorder characterized by deficits in social and communication skills and repetitive and stereotyped interests and behaviours. Although not part of the diagnostic criteria, individuals with autism experience a host of motor impairments, potentially due to abnormalities in how they learn motor control throughout development. Here, we used behavioural techniques to quantify motor learning in autism spectrum disorder, and structural brain imaging to investigate the neural basis of that learning in the cerebellum. Twenty children with autism spectrum disorder and 20 typically developing control subjects, aged 8-12, made reaching movements while holding the handle of a robotic manipulandum. In random trials the reach was perturbed, resulting in errors that were sensed through vision and proprioception. The brain learned from these errors and altered the motor commands on the subsequent reach. We measured learning from error as a function of the sensory modality of that error, and found that children with autism spectrum disorder outperformed typically developing children when learning from errors that were sensed through proprioception, but underperformed typically developing children when learning from errors that were sensed through vision. Previous work had shown that this learning depends on the integrity of a region in the anterior cerebellum. Here we found that the anterior cerebellum, extending into lobule VI, and parts of lobule VIII were smaller than normal in children with autism spectrum disorder, with a volume that was predicted by the pattern of learning from visual and proprioceptive errors. We suggest that the abnormal patterns of motor learning in children with autism spectrum disorder, showing an increased sensitivity to proprioceptive error and a decreased sensitivity to visual error, may be associated with abnormalities in the cerebellum. © The Author (2015). 
Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. Combining Gene Signatures Improves Prediction of Breast Cancer Survival

    PubMed Central

    Zhao, Xi; Naume, Bjørn; Langerød, Anita; Frigessi, Arnoldo; Kristensen, Vessela N.; Børresen-Dale, Anne-Lise; Lingjærde, Ole Christian

    2011-01-01

    Background Several gene sets for prediction of breast cancer survival have been derived from whole-genome mRNA expression profiles. Here, we develop a statistical framework to explore whether combination of the information from such sets may improve prediction of recurrence and breast cancer specific death in early-stage breast cancers. Microarray data from two clinically similar cohorts of breast cancer patients are used as training (n = 123) and test set (n = 81), respectively. Gene sets from eleven previously published gene signatures are included in the study. Principal Findings To investigate the relationship between breast cancer survival and gene expression on a particular gene set, a Cox proportional hazards model is applied using partial likelihood regression with an L2 penalty to avoid overfitting and using cross-validation to determine the penalty weight. The fitted models are applied to an independent test set to obtain a predicted risk for each individual and each gene set. Hierarchical clustering of the test individuals on the basis of the vector of predicted risks results in two clusters with distinct clinical characteristics in terms of the distribution of molecular subtypes, ER, PR status, TP53 mutation status and histological grade category, and associated with significantly different survival probabilities (recurrence: p = 0.005; breast cancer death: p = 0.014). Finally, principal components analysis of the gene signatures is used to derive combined predictors used to fit a new Cox model. This model classifies test individuals into two risk groups with distinct survival characteristics (recurrence: p = 0.003; breast cancer death: p = 0.001). The latter classifier outperforms all the individual gene signatures, as well as Cox models based on traditional clinical parameters and the Adjuvant! Online for survival prediction. Conclusion Combining the predictive strength of multiple gene signatures improves prediction of breast cancer survival. 
The presented methodology is broadly applicable to breast cancer risk assessment using any new identified gene set. PMID:21423775
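
    The combination step described above — collapse the vector of per-signature risk predictions by principal components analysis and feed the leading components to a new Cox model — can be sketched for the PCA part as follows (stand-in data; the penalized Cox fit itself is omitted):

```python
import numpy as np

def leading_components(risk_matrix, k=2):
    """PCA of an (n_patients x n_signatures) matrix of predicted risks.

    Returns each patient's scores on the top-k principal components,
    which can then serve as covariates in a downstream Cox model.
    """
    X = risk_matrix - risk_matrix.mean(axis=0)   # center each signature
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T                          # scores on top-k PCs
```

    The design rationale is that the signatures' risk predictions are highly correlated, so a few components capture most of their joint information while avoiding an over-parameterized survival model.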

  14. Experimental testing of the noise-canceling processor.

    PubMed

    Collins, Michael D; Baer, Ralph N; Simpson, Harry J

    2011-09-01

    Signal-processing techniques for localizing an acoustic source buried in noise are tested in a tank experiment. Noise is generated using a discrete source, a bubble generator, and a sprinkler. The experiment has essential elements of a realistic scenario in matched-field processing, including complex source and noise time series in a waveguide with water, sediment, and multipath propagation. The noise-canceling processor is found to outperform the Bartlett processor and provide the correct source range for signal-to-noise ratios below -10 dB. The multivalued Bartlett processor is found to outperform the Bartlett processor but not the noise-canceling processor. © 2011 Acoustical Society of America
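
    The Bartlett processor, the baseline both alternatives are compared against here, scores each candidate source location by correlating a modeled replica field with the sample covariance of the measured snapshots. A minimal sketch (not the authors' implementation):

```python
import numpy as np

def bartlett_power(replica, snapshots):
    """Normalized Bartlett matched-field power at one candidate location.

    replica:   modeled complex pressure at the array, shape (n_sensors,)
    snapshots: measured data snapshots, shape (n_snapshots, n_sensors)
    Returns a value in [0, 1]; 1 indicates a perfect match.
    """
    r = np.asarray(replica, dtype=complex)
    D = np.asarray(snapshots, dtype=complex)
    w = r / np.linalg.norm(r)            # unit-norm replica vector
    K = D.T @ D.conj() / len(D)          # sample covariance, sum d d^H / M
    return float(np.real(w.conj() @ K @ w) / np.real(np.trace(K)))
```

    In matched-field processing the candidate location whose replica maximizes this power is taken as the source estimate; the noise-canceling processor tested in the paper modifies this objective to suppress the noise contribution.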

  15. Profitability of simple technical trading rules of Chinese stock exchange indexes

    NASA Astrophysics Data System (ADS)

    Zhu, Hong; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing

    2015-12-01

    Although technical trading rules have been widely used by practitioners in financial markets, their profitability still remains controversial. We here investigate the profitability of moving average (MA) and trading range break (TRB) rules by using the Shanghai Stock Exchange Composite Index (SHCI) from May 21, 1992 through December 31, 2013 and Shenzhen Stock Exchange Component Index (SZCI) from April 3, 1991 through December 31, 2013. The t-test is adopted to check whether the mean returns which are conditioned on the trading signals are significantly different from unconditioned returns and whether the mean returns conditioned on the buy signals are significantly different from the mean returns conditioned on the sell signals. We find that TRB rules outperform MA rules and short-term variable moving average (VMA) rules outperform long-term VMA rules. By applying White's Reality Check test and accounting for the data snooping effects, we find that the best trading rule outperforms the buy-and-hold strategy when transaction costs are not taken into consideration. Once transaction costs are included, trading profits will be eliminated completely. Our analysis suggests that simple trading rules like MA and TRB cannot beat the standard buy-and-hold strategy for the Chinese stock exchange indexes.
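
    A variable moving average rule of the kind tested above emits a buy signal whenever the short-window average of prices sits above the long-window average, and a sell signal otherwise. A minimal sketch (window lengths are illustrative, not those used in the paper):

```python
import numpy as np

def ma_signals(prices, short=5, long=20):
    """Variable moving average (VMA) trading signals.

    Returns +1 (buy) on days when the short moving average exceeds the
    long moving average, and -1 (sell) otherwise; both averages are
    taken over windows ending on the same day.
    """
    p = np.asarray(prices, dtype=float)

    def sma(x, w):                       # mean of every length-w window
        c = np.cumsum(np.insert(x, 0, 0.0))
        return (c[w:] - c[:-w]) / w

    s, l = sma(p, short), sma(p, long)
    s = s[long - short:]                 # align both windows to end together
    return np.where(s > l, 1, -1)
```

    Conditioned mean returns are then computed by grouping the subsequent daily returns by signal, which is the quantity the t-tests in the study compare against unconditioned returns.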

  16. REMARK checklist elaborated to improve tumor marker prognostic studies

    Cancer.gov

    Experts have elaborated on a previously published checklist of 20 items -- including descriptions of design, methods, and analysis -- that researchers should address when publishing studies of prognostic markers. These markers are indicators that enable d

  17. Periodicity of Strong Seismicity in Italy: Schuster Spectrum Analysis Extended to the Destructive Earthquakes of 2016

    NASA Astrophysics Data System (ADS)

    Bragato, P. L.

    2017-10-01

    The strong earthquakes that occurred in Italy between 2009 and 2016 represent an abrupt acceleration of seismicity with respect to the previous 30 years. Such behavior seems to agree with the periodic rate change I observed in a previous paper. The present work improves that study by extending the data set to the end of 2016, adopting the latest version of the historical seismic catalog of Italy, and introducing Schuster spectrum analysis for the detection of the oscillatory period and the assessment of its statistical significance. Applied to the declustered catalog of Mw ≥ 6 earthquakes that occurred between 1600 and 2016, the analysis identifies a marked periodicity of 46 years, which is recognized above the 95% confidence level. Monte Carlo simulation shows that the oscillatory behavior is robust to random errors in magnitude estimation. A parametric oscillatory model for the annual rate of seismicity is estimated by likelihood maximization under the hypothesis of an inhomogeneous Poisson point process. According to the Akaike Information Criterion, this model outperforms the simpler homogeneous one with a constant annual rate. A further element emerges from the analysis: so far, despite the recent earthquakes, Italian seismicity remains within a long-term decreasing trend established since the first half of the twentieth century.
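
    Schuster's test, on which the spectrum analysis above is built, maps each event time to a phase on the candidate period and measures how far the resultant of the unit phase vectors departs from the random-walk expectation. A simplified sketch of the test for a single candidate period (not the author's code):

```python
import numpy as np

def schuster_p_value(event_times, period):
    """Schuster test for periodicity in a point process.

    The statistic is the squared length R^2 of the resultant of the
    unit phase vectors; under the null of no periodicity,
    p ~ exp(-R^2 / N) for N events.
    """
    t = np.asarray(event_times, dtype=float)
    phases = 2.0 * np.pi * t / period            # phase of each event
    R2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
    return float(np.exp(-R2 / len(t)))
```

    A Schuster spectrum scans this p-value over a range of candidate periods; the 46-year periodicity reported above corresponds to a candidate period whose p-value falls below the chosen significance threshold.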

  18. Evolutionary Design of Convolutional Neural Networks for Human Activity Recognition in Sensor-Rich Environments.

    PubMed

    Baldominos, Alejandro; Saez, Yago; Isasi, Pedro

    2018-04-23

    Human activity recognition is a challenging problem for context-aware systems and applications. It is gaining interest due to the ubiquity of different sensor sources, wearable smart objects, ambient sensors, etc. This task is usually approached as a supervised machine learning problem, where a label is to be predicted given some input data, such as the signals retrieved from different sensors. For tackling the human activity recognition problem in sensor network environments, in this paper we propose the use of deep learning (convolutional neural networks) to perform activity recognition using the publicly available OPPORTUNITY dataset. Instead of manually choosing a suitable topology, we will let an evolutionary algorithm design the optimal topology in order to maximize the classification F1 score. After that, we will also explore the performance of committees of the models resulting from the evolutionary process. Results analysis indicates that the proposed model was able to perform activity recognition within a heterogeneous sensor network environment, achieving very high accuracies when tested with new sensor data. Based on all conducted experiments, the proposed neuroevolutionary system has proved to be able to systematically find a classification model which is capable of outperforming previous results reported in the state-of-the-art, showing that this approach is useful and improves upon previously manually-designed architectures.
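
    The neuroevolutionary loop described above — select, recombine, and mutate candidate topologies, scoring each by its classification F1 — can be sketched generically. Here fitness is an arbitrary stand-in for training and evaluating a network on OPPORTUNITY data (which is omitted), and the gene names, values, and rates are all illustrative:

```python
import random

def evolve(fitness, genome_space, pop_size=10, generations=30, seed=0):
    """Toy evolutionary search over a discrete hyperparameter space.

    genome_space: dict mapping gene name -> list of allowed values.
    fitness:      callable scoring a genome dict (stand-in for the F1
                  obtained by training the corresponding network).
    """
    rng = random.Random(seed)

    def random_genome():
        return {k: rng.choice(v) for k, v in genome_space.items()}

    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                  # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = {k: rng.choice([a[k], b[k]]) for k in a}  # uniform crossover
            if rng.random() < 0.2:                      # point mutation
                gene = rng.choice(list(child))
                child[gene] = rng.choice(genome_space[gene])
            children.append(child)
        pop = parents + children                        # elitist replacement
    return max(pop, key=fitness)
```

    Because the top half of each generation survives unchanged, the best fitness in the population never decreases; committees of the final population's best models can then be formed as the paper describes.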

  19. Improving Simulated Annealing by Recasting it as a Non-Cooperative Game

    NASA Technical Reports Server (NTRS)

    Wolpert, David; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three factors at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.
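
    For reference, the baseline being recast here is standard simulated annealing. A minimal sketch of that baseline (illustrative cooling schedule and parameters, not the paper's setup):

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, t0=1.0, cooling=0.995,
                        steps=5000, seed=0):
    """Plain simulated annealing over a generic state space.

    Downhill moves are always accepted; uphill moves are accepted with
    probability exp(-dE/T) under a geometric cooling schedule.
    """
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best, best_e = x, e
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)                 # propose a random perturbation
        ey = energy(y)
        if ey <= e or rng.random() < math.exp((e - ey) / t):
            x, e = y, ey
            if e < best_e:
                best, best_e = x, e
        t *= cooling                         # geometric cooling
    return best, best_e
```

    The COIN recasting replaces the single global perturbation step with per-variable players, each choosing its move from its own utility; the sketch above shows only the conventional algorithm it modifies.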

  20. Feigning Amnesia Moderately Impairs Memory for a Mock Crime Video

    PubMed Central

    Mangiulli, Ivan; van Oorsouw, Kim; Curci, Antonietta; Merckelbach, Harald; Jelicic, Marko

    2018-01-01

    Previous studies showed that feigning amnesia for a crime impairs actual memory for the target event. Lack of rehearsal has been proposed as an explanation for this memory-undermining effect of feigning. The aim of the present study was to replicate and extend previous research by adopting a mock crime video instead of a narrative story. We showed participants a video of a violent crime. Next, they were requested to imagine that they had committed this offense and either to feign amnesia or to confess to the crime. A third condition was included: participants in the delayed test-only control condition did not receive any instruction. On subsequent recall tests, participants in all three conditions were instructed to report as much information as possible about the offense. On the free recall test, feigning amnesia impaired memory for the video clip, but participants who were asked to feign crime-related amnesia outperformed controls. However, no differences between simulators and confessors were found in either correct cued recollection or distortion and commission rates. We also explored whether inner speech might modulate memory for the crime. Inner speech traits were not found to be related to the simulating-amnesia effect. Theoretical and practical implications of our results are discussed. PMID:29760675
