Sample records for prediction method abstracts

  1. Beyond Captions: Linking Figures with Abstract Sentences in Biomedical Articles

    PubMed Central

    Bockhorst, Joseph P.; Conroy, John M.; Agarwal, Shashank; O’Leary, Dianne P.; Yu, Hong

    2012-01-01

    Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically consider only caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods, and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org. PMID:22815711
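
    The record above combines language models, HMMs, and CRFs over several evidence sources. As a much simpler illustration of the lexical-similarity evidence source alone, the sketch below links each abstract sentence to the figure caption with the highest word-overlap cosine similarity; all sentences, captions, and function names here are hypothetical, not taken from the paper.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words representations."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def associate(abstract_sentences, figure_captions):
    """Link each abstract sentence to its most lexically similar caption."""
    return {i: max(range(len(figure_captions)),
                   key=lambda j: cosine(s, figure_captions[j]))
            for i, s in enumerate(abstract_sentences)}

sentences = ["Expression of gene X rises under heat stress",
             "Protein Y localizes to the nucleus"]
captions = ["Figure 1: gene X expression under heat stress",
            "Figure 2: nuclear localization of protein Y"]
links = associate(sentences, captions)  # sentence index -> figure index
```

    A real system would add the positional and association-pattern evidence the paper describes; this sketch shows only why caption text by itself already carries signal.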

  2. Probabilistic Seeking Prediction in P2P VoD Systems

    NASA Astrophysics Data System (ADS)

    Wang, Weiwei; Xu, Tianyin; Gao, Yang; Lu, Sanglu

    In P2P VoD streaming systems, user behavior modeling is critical to help optimise user experience as well as system throughput. However, it remains a challenging task due to the dynamic characteristics of user viewing behavior. In this paper, we consider the problem of user seeking prediction, which is to predict the user's next seeking position so that the system can respond proactively. We present a novel method for solving this problem. In our method, frequent sequential pattern mining is first performed to extract abstract states that do not overlap and together cover the whole video file. After mapping the raw training dataset to state transitions according to the abstract states, we use a simple probabilistic contingency table to build the prediction model. We design an experiment on a synthetic P2P VoD dataset. The results demonstrate the effectiveness of our method.
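
    The probabilistic contingency table the record describes can be sketched as transition counts between abstract states, with prediction by the most frequent observed successor. The state names and transitions below are hypothetical, for illustration only.

```python
from collections import defaultdict

def build_table(state_transitions):
    """Count observed (current_state -> next_state) seeking transitions."""
    table = defaultdict(lambda: defaultdict(int))
    for cur, nxt in state_transitions:
        table[cur][nxt] += 1
    return table

def predict_next(table, state):
    """Predict the most probable next seeking state, or None if unseen."""
    counts = table[state]
    return max(counts, key=counts.get) if counts else None

# Hypothetical transitions between abstract states A, B, C of a video
transitions = [("A", "B"), ("A", "B"), ("A", "C"), ("B", "C")]
table = build_table(transitions)
```

    With these counts, a seek from state A is predicted to land in B (seen twice) rather than C (seen once), letting the system prefetch B's segments.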

  3. EVALUATION OF METHODS FOR PREDICTING THE TOXICITY OF POLYCYCLIC AROMATIC HYDROCARBON MIXTURES. (R825408)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. USSR and Eastern Europe Scientific Abstracts, Geophysics, Astronomy and Space, Number 394.

    DTIC Science & Technology

    1977-04-13

    Abstracts of scientific articles, including: Temperature and Wind Variations in Upper Atmosphere; Prediction of Precipitation for Five Days; Short...; Method in Sea Gravimetry; Frequency Characteristics of Filter in "Points of Intersection" Method; Gravitational Anomalies in the...; Inclination of Axes of Rotation and Orbits of Planets; Effect of Solar Activity on Precipitation Regime; Gravitational Orientation Systems with Two...

  5. Predicting landslides in clearcut patches

    Treesearch

    Raymond M. Rice; Norman H. Pillsbury

    1982-01-01

    Abstract - Accelerated erosion in the form of landslides can be an undesirable consequence of clearcut logging on steep slopes. Forest managers need a method of predicting the risk of such erosion. Data collected after logging in a granitic area of northwestern California were used to develop a predictive equation. A linear discriminant function was developed that...

  6. Ab Initio Kinetics of Hydrogen Abstraction from Methyl Acetate by Hydrogen, Methyl, Oxygen, Hydroxyl, and Hydroperoxy Radicals.

    PubMed

    Tan, Ting; Yang, Xueliang; Krauter, Caroline M; Ju, Yiguang; Carter, Emily A

    2015-06-18

    The kinetics of hydrogen abstraction by five radicals (H, O(3P), OH, CH3, and HO2) from methyl acetate (MA) is investigated theoretically in order to gain further understanding of certain aspects of the combustion chemistry of biodiesels, such as the effect of the ester moiety. We employ ab initio quantum chemistry methods, coupled cluster singles and doubles with perturbative triples correction (CCSD(T)) and multireference averaged coupled pair functional theory (MRACPF2), to predict chemically accurate reaction energetics. Overall, MRACPF2 predicts slightly higher barrier heights than CCSD(T) for MA + H/CH3/O/OH, but slightly lower barrier heights for hydrogen abstraction by HO2. Based on the obtained reaction energies, we also report high-pressure-limit rate constants using transition state theory (TST) in conjunction with the separable-hindered-rotor approximation, the variable reaction coordinate TST, and the multi-structure all-structure approach. The fitted modified Arrhenius expressions are provided over a temperature range of 250 to 2000 K. The predictions are in good agreement with available experimental results. Abstractions from both of the methyl groups in MA are expected to contribute to consumption of the fuel as they exhibit similar rate coefficients. The reactions involving the OH radical are predicted to have the highest rates among the five abstracting radicals, while those initiated by HO2 are expected to be the lowest.
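
    The "fitted modified Arrhenius expressions" mentioned above take the standard form k(T) = A * T^n * exp(-Ea / (R*T)). A minimal evaluator, with purely illustrative parameter values (not the paper's fitted constants):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def rate_constant(A, n, Ea, T):
    """Modified Arrhenius expression: k(T) = A * T**n * exp(-Ea / (R * T))."""
    return A * T ** n * math.exp(-Ea / (R * T))

# Illustrative (not fitted) parameters: A pre-factor, n temperature
# exponent, Ea activation energy in J/mol
k_500 = rate_constant(1.0e6, 2.0, 4.0e4, 500.0)
k_1000 = rate_constant(1.0e6, 2.0, 4.0e4, 1000.0)
```

    Both the T**n factor and the exponential grow with temperature here, so k(1000 K) exceeds k(500 K), the qualitative behavior expected of an abstraction reaction with a positive barrier.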

  7. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges

    PubMed Central

    Goldstein, Benjamin A.; Navar, Ann Marie; Carter, Rickey E.

    2017-01-01

    Abstract Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black-box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider the problem of predicting mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction to the diffuse field of machine learning for those working on risk modelling. PMID:27436868

  8. Applying temporal abstraction and case-based reasoning to predict approaching influenza waves.

    PubMed

    Schmidt, Rainer; Gierl, Lothar

    2002-01-01

    The goal of the TeCoMed project is to send early warnings against forthcoming waves or even epidemics of infectious diseases, especially of influenza, to interested practitioners, pharmacists etc. in the German federal state Mecklenburg-Western Pomerania. The forecast of these waves is based on written confirmations of unfitness for work of the main German health insurance company. Since influenza waves are difficult to predict because of their cyclic but not regular behaviour, statistical methods based on the computation of mean values are not helpful. Instead, we have developed a prognostic model that makes use of similar former courses. Our method combines Case-based Reasoning with Temporal Abstraction to decide whether early warning is appropriate.
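
    The core of the case-based reasoning step described above can be sketched as a nearest-neighbour lookup over former incidence courses: warn if the most similar past course was followed by a wave. The course data, distance measure, and function names below are hypothetical, not TeCoMed's actual model.

```python
def distance(a, b):
    """Euclidean distance between two equally long weekly-count courses."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def warn(current_course, former_courses):
    """Warn if the most similar former course was followed by a wave.

    former_courses: list of (weekly incapacity counts, wave_followed) pairs.
    """
    nearest = min(former_courses, key=lambda c: distance(current_course, c[0]))
    return nearest[1]

# Hypothetical former courses of weekly unfitness-for-work confirmations
former = [([10, 12, 15], True),   # rising counts, a wave followed
          ([10, 9, 8], False)]    # declining counts, no wave
```

    The paper's method additionally applies temporal abstraction to smooth the courses before comparison; this sketch shows only the similarity-based retrieval step.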

  9. Prediction method abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-31

    This conference was held December 4-8, 1994, in Asilomar, California. The purpose of this meeting was to provide a forum for the exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence-to-fold assignment; and ab initio folding.

  10. Locally Weighted Learning Methods for Predicting Dose-Dependent Toxicity with Application to the Human Maximum Recommended Daily Dose

    DTIC Science & Technology

    2012-09-10

    Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702, United States. ABSTRACT: Toxicological ...species. Thus, it is more advantageous to predict the toxicological effects of a compound on humans directly from the human toxicological data of related...compounds. However, many popular quantitative structure-activity relationship (QSAR) methods that build a single global model by fitting all training...

  11. Towards A Predictive First Principles Understanding Of Molecular Adsorption On Graphene

    DTIC Science & Technology

    2016-10-05

    used and developed state-of-the-art quantum mechanical methods to make accurate predictions about the interaction strength and adsorption structure... density functional theory, ab initio methods... important physical properties for a whole class of systems with weak non-covalent interactions, for example those involving the binding between water

  12. Predicting landslides related to clearcut logging, northwestern California, U.S.A.

    Treesearch

    David J. Furbish; Raymond M. Rice

    1983-01-01

    Abstract - Landslides related to clearcut logging are a significant source of erosion in the mountains of northwestern California. Forest managers, therefore, frequently must include assessments of landslide risk in their land-use plans. A quantitative method is needed to predict such risk over large areas of rugged mountainous terrain. From air photographs, data...

  13. Development of a nonlinear vortex method. [steady and unsteady aerodynamic loads of highly sweptback wings

    NASA Technical Reports Server (NTRS)

    Kandil, O. A.

    1981-01-01

    Progress is reported in the development of reliable nonlinear vortex methods for predicting the steady and unsteady aerodynamic loads of highly sweptback wings at large angles of attack. Abstracts of the papers, talks, and theses produced through this research are included. The modified nonlinear discrete vortex method and the nonlinear hybrid vortex method are highlighted.

  14. Effluent composition prediction of a two-stage anaerobic digestion process: machine learning and stoichiometry techniques.

    PubMed

    Alejo, Luz; Atkinson, John; Guzmán-Fierro, Víctor; Roeckel, Marlene

    2018-05-16

    Computational self-adapting methods (Support Vector Machines, SVM) are compared with an analytical method in effluent composition prediction of a two-stage anaerobic digestion (AD) process. Experimental data for the AD of poultry manure were used. The analytical method considers the protein as the only source of ammonia production in AD after degradation. Total ammonia nitrogen (TAN), total solids (TS), chemical oxygen demand (COD), and total volatile solids (TVS) were measured in the influent and effluent of the process. The TAN concentration in the effluent was predicted, this being the most inhibiting and polluting compound in AD. Despite the limited data available, the SVM-based model outperformed the analytical method for the TAN prediction, achieving a relative average error of 15.2% against 43% for the analytical method. Moreover, SVM showed higher prediction accuracy in comparison with Artificial Neural Networks. This result reveals the future promise of SVM for prediction in non-linear and dynamic AD processes.

  15. The application of top-down abstraction learning using prediction as a supervisory signal to cyber security

    NASA Astrophysics Data System (ADS)

    Mugan, Jonathan; Khalili, Aram E.

    2014-05-01

    Current computer systems are dumb automatons, and their blind execution of instructions makes them open to attack. Their inability to reason means that they don't consider the larger, constantly changing context outside their immediate inputs. Their nearsightedness is particularly dangerous because, in our complex systems, it is difficult to prevent all exploitable situations. Additionally, the lack of autonomous oversight of our systems means they are unable to fight through attacks. Keeping adversaries completely out of systems may be an unreasonable expectation, and our systems need to adapt to attacks and other disruptions to achieve their objectives. What is needed is an autonomous controller within the computer system that can sense the state of the system and reason about that state. In this paper, we present Self-Awareness Through Predictive Abstraction Modeling (SATPAM). SATPAM uses prediction to learn abstractions that allow it to recognize the right events at the right level of detail. These abstractions allow SATPAM to break the world into small, relatively independent, pieces that allow employment of existing reasoning methods. SATPAM goes beyond classification-based machine learning and statistical anomaly detection to be able to reason about the system, and SATPAM's knowledge representation and reasoning is more like that of a human. For example, humans intuitively know that the color of a car is not relevant to any mechanical problem, and SATPAM provides a plausible method whereby a machine can acquire such reasoning patterns. In this paper, we present the initial experimental results using SATPAM.

  16. Theoretical study of the thermodynamics and kinetics of hydrogen abstractions from hydrocarbons.

    PubMed

    Vandeputte, Aäron G; Sabbe, Maarten K; Reyniers, Marie-Françoise; Van Speybroeck, Veronique; Waroquier, Michel; Marin, Guy B

    2007-11-22

    Thermochemical and kinetic data were calculated at four cost-effective levels of theory for a set consisting of five hydrogen abstraction reactions between hydrocarbons for which experimental data are available. The selection of a reliable, yet cost-effective method to study this type of reaction for a broad range of applications was done on the basis of comparison with experimental data or with results obtained from computationally demanding high level of theory calculations. For this benchmark study two composite methods (CBS-QB3 and G3B3) and two density functional theory (DFT) methods, MPW1PW91/6-311G(2d,d,p) and BMK/6-311G(2d,d,p), were selected. All four methods succeeded well in describing the thermochemical properties of the five studied hydrogen abstraction reactions. High-level Weizmann-1 (W1) calculations indicated that CBS-QB3 succeeds in predicting the most accurate reaction barrier for the hydrogen abstraction of methane by methyl but tends to underestimate the reaction barriers for reactions where spin contamination is observed in the transition state. Experimental rate coefficients were most accurately predicted with CBS-QB3. Therefore, CBS-QB3 was selected to investigate the influence of both the 1D hindered internal rotor treatment about the forming bond (1D-HR) and tunneling on the rate coefficients for a set of 21 hydrogen abstraction reactions. Three zero curvature tunneling (ZCT) methods were evaluated (Wigner, Skodje & Truhlar, Eckart). As the computationally more demanding centrifugal dominant small curvature semiclassical (CD-SCS) tunneling method did not yield significantly better agreement with experiment compared to the ZCT methods, CD-SCS tunneling contributions were only assessed for the hydrogen abstractions by methyl from methane and ethane. The best agreement with experimental rate coefficients was found when Eckart tunneling and 1D-HR corrections were applied. A mean deviation of a factor of 6 on the rate coefficients is found for the complete set of 21 reactions at temperatures ranging from 298 to 1000 K. Tunneling corrections play a critical role in obtaining accurate rate coefficients, especially at lower temperatures, whereas the hindered rotor treatment only improves the agreement with experiment in the high-temperature range.
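
    Of the zero-curvature tunneling corrections evaluated in the record above, the Wigner correction is the simplest: kappa = 1 + (1/24) * (h*nu / (kB*T))**2, where nu is the magnitude of the imaginary frequency of the transition state. A minimal sketch (the 1000 cm^-1 frequency is an illustrative value, not one from the paper):

```python
H = 6.62607015e-34   # Planck constant, J s
KB = 1.380649e-23    # Boltzmann constant, J/K
C = 2.99792458e10    # speed of light, cm/s

def wigner_kappa(nu_imag_cm, T):
    """Wigner tunneling correction from an imaginary frequency in cm^-1."""
    nu = nu_imag_cm * C              # wavenumber (cm^-1) -> frequency (s^-1)
    u = H * nu / (KB * T)
    return 1.0 + u * u / 24.0

kappa_300 = wigner_kappa(1000.0, 300.0)    # roughly 2x rate enhancement
kappa_1000 = wigner_kappa(1000.0, 1000.0)  # correction shrinks as T rises
```

    This reproduces the qualitative conclusion of the abstract: tunneling matters most at low temperature, where kappa is largest.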

  17. Kinetic modeling of α-hydrogen abstractions from unsaturated and saturated oxygenate compounds by hydrogen atoms.

    PubMed

    Paraskevas, Paschalis D; Sabbe, Maarten K; Reyniers, Marie-Françoise; Papayannakos, Nikos G; Marin, Guy B

    2014-10-09

    Hydrogen-abstraction reactions play a significant role in thermal biomass conversion processes, as well as regular gasification, pyrolysis, or combustion. In this work, a group additivity model is constructed that allows prediction of reaction rates and Arrhenius parameters of hydrogen abstractions by hydrogen atoms from alcohols, ethers, esters, peroxides, ketones, aldehydes, acids, and diketones in a broad temperature range (300-2000 K). A training set of 60 reactions was developed with rate coefficients and Arrhenius parameters calculated by the CBS-QB3 method in the high-pressure limit with tunneling corrections using Eckart tunneling coefficients. From this set of reactions, 15 group additive values were derived for the forward and the reverse reaction, 4 referring to primary and 11 to secondary contributions. The accuracy of the model is validated upon an ab initio and an experimental validation set of 19 and 21 reaction rates, respectively, showing that reaction rates can be predicted with a mean factor of deviation of 2 for the ab initio and 3 for the experimental values. Hence, this work illustrates that the developed group additive model can be reliably applied for the accurate prediction of kinetics of α-hydrogen abstractions by hydrogen atoms from a broad range of oxygenates.
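
    The group additivity model described above predicts Arrhenius parameters by summing per-group contributions. The sketch below uses invented group names and contribution values (NOT the paper's fitted group additive values) purely to show the mechanics of the method.

```python
import math

# Hypothetical group additive values: each group contributes to
# log10(A) and to the activation energy Ea (kJ/mol)
GAV = {
    "primary_C-H": (8.0, 40.0),
    "alpha_to_C=O": (-0.3, -6.0),
}

def arrhenius_from_groups(groups, T):
    """Sum group contributions, then evaluate k = 10**logA * exp(-Ea / (R*T))."""
    log_A = sum(GAV[g][0] for g in groups)
    Ea = sum(GAV[g][1] for g in groups) * 1000.0  # kJ/mol -> J/mol
    return 10.0 ** log_A * math.exp(-Ea / (8.314 * T))

# In this toy table an alpha C-H next to a carbonyl has a lower barrier,
# so its rate at 300 K is higher despite the smaller pre-factor.
k_alpha = arrhenius_from_groups(["primary_C-H", "alpha_to_C=O"], 300.0)
k_plain = arrhenius_from_groups(["primary_C-H"], 300.0)
```

    The appeal of the approach is exactly this additivity: once the 15 group values are fitted against CBS-QB3 reference kinetics, rate coefficients for unseen oxygenates follow from a table lookup and a sum.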

  18. Patient Similarity in Prediction Models Based on Health Data: A Scoping Review

    PubMed Central

    Sharafoddini, Anis; Dubin, Joel A

    2017-01-01

    Background Physicians and health policy makers are required to make predictions during their decision making in various medical problems. Many advances have been made in predictive modeling toward outcome prediction, but these innovations target an average patient and are insufficiently adjustable for individual patients. One developing idea in this field is individualized predictive analytics based on patient similarity. The goal of this approach is to identify patients who are similar to an index patient and derive insights from the records of similar patients to provide personalized predictions. Objective The aim is to summarize and review published studies describing computer-based approaches for predicting patients’ future health status based on health data and patient similarity, identify gaps, and provide a starting point for related future research. Methods The method involved (1) conducting the review by performing automated searches in Scopus, PubMed, and ISI Web of Science, selecting relevant studies by first screening titles and abstracts and then analyzing full texts, and (2) documenting by extracting publication details and information on context, predictors, missing data, modeling algorithm, outcome, and evaluation methods into a matrix table, synthesizing data, and reporting results. Results After duplicate removal, 1339 articles were screened by title and abstract and 67 were selected for full-text review. In total, 22 articles met the inclusion criteria. Within included articles, hospitals were the main source of data (n=10). Cardiovascular disease (n=7) and diabetes (n=4) were the dominant patient diseases. Most studies (n=18) used neighborhood-based approaches in devising prediction models. Two studies showed that patient similarity-based modeling outperformed population-based predictive methods. Conclusions Interest in patient similarity-based predictive modeling for diagnosis and prognosis has been growing. In addition to raw/coded health data, wavelet transform and term frequency-inverse document frequency methods were employed to extract predictors. Selecting predictors with potential to highlight special cases and defining new patient similarity metrics were among the gaps identified in the existing literature that provide starting points for future work. Patient status prediction models based on patient similarity and health data offer exciting potential for personalizing and ultimately improving health care, leading to better patient outcomes. PMID:28258046
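
    The neighborhood-based approach that most of the reviewed studies used can be sketched as a k-nearest-neighbour prediction over patient records: score similarity to the index patient, then average the outcomes of the k most similar patients. The features, records, and similarity measure below are hypothetical illustrations.

```python
def similarity(a, b):
    """Negative Euclidean distance over the index patient's features."""
    return -sum((a[k] - b[k]) ** 2 for k in a) ** 0.5

def predict_outcome(index_patient, records, k=3):
    """Average the outcomes of the k patients most similar to the index."""
    ranked = sorted(records, key=lambda r: similarity(index_patient, r[0]),
                    reverse=True)
    top = ranked[:k]
    return sum(outcome for _, outcome in top) / len(top)

# Hypothetical records: ({feature: value}, outcome in {0, 1})
records = [({"age": 61, "sbp": 138}, 1),
           ({"age": 59, "sbp": 142}, 1),
           ({"age": 30, "sbp": 110}, 0),
           ({"age": 62, "sbp": 139}, 1)]
risk = predict_outcome({"age": 60, "sbp": 140}, records, k=3)
```

    Defining a better similarity function than raw Euclidean distance is precisely one of the open gaps the review identifies.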

  19. Estimating the Accuracy of the Chedoke–McMaster Stroke Assessment Predictive Equations for Stroke Rehabilitation

    PubMed Central

    Dang, Mia; Ramsaran, Kalinda D.; Street, Melissa E.; Syed, S. Noreen; Barclay-Goddard, Ruth; Miller, Patricia A.

    2011-01-01

    ABSTRACT Purpose: To estimate the predictive accuracy and clinical usefulness of the Chedoke–McMaster Stroke Assessment (CMSA) predictive equations. Method: A longitudinal prognostic study using historical data obtained from 104 patients admitted post cerebrovascular accident was undertaken. Data were abstracted for all patients undergoing rehabilitation post stroke who also had documented admission and discharge CMSA scores. Published predictive equations were used to determine predicted outcomes. To determine the accuracy and clinical usefulness of the predictive model, shrinkage coefficients and predictions with 95% confidence bands were calculated. Results: Complete data were available for 74 patients with a mean age of 65.3±12.4 years. The shrinkage values for the six Impairment Inventory (II) dimensions varied from −0.05 to 0.09; the shrinkage value for the Activity Inventory (AI) was 0.21. The error associated with predictive values was greater than ±1.5 stages for the II dimensions and greater than ±24 points for the AI. Conclusions: This study shows that the large error associated with the predictions (as defined by the confidence band) for the CMSA II and AI limits their clinical usefulness as a predictive measure. Further research to establish predictive models using alternative statistical procedures is warranted. PMID:22654239

  20. New insights from cluster analysis methods for RNA secondary structure prediction

    PubMed Central

    Rogers, Emily; Heitsch, Christine

    2016-01-01

    A widening gap exists between the best practices for RNA secondary structure prediction developed by computational researchers and the methods used in practice by experimentalists. Minimum free energy (MFE) predictions, although broadly used, are outperformed by methods which sample from the Boltzmann distribution and data mine the results. In particular, moving beyond the single structure prediction paradigm yields substantial gains in accuracy. Furthermore, the largest improvements in accuracy and precision come from viewing secondary structures not at the base pair level but at lower granularity/higher abstraction. This suggests that random errors affecting precision and systematic ones affecting accuracy are both reduced by this “fuzzier” view of secondary structures. Thus experimentalists who are willing to adopt a more rigorous, multilayered approach to secondary structure prediction by iterating through these levels of granularity will be much better able to capture fundamental aspects of RNA base pairing. PMID:26971529
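
    Moving "beyond the single structure prediction paradigm," as the record above advocates, can be illustrated by a consensus over Boltzmann-sampled structures: keep only base pairs that recur across samples. The sampled structures below are invented toy data, and real samplers (e.g. partition-function-based tools) would supply them.

```python
from collections import Counter

def consensus_pairs(sampled_structures, threshold=0.5):
    """Keep base pairs occurring in at least `threshold` of the samples."""
    counts = Counter(p for s in sampled_structures for p in s)
    n = len(sampled_structures)
    return {p for p, c in counts.items() if c / n >= threshold}

# Hypothetical Boltzmann samples, each a set of (i, j) base pairs
samples = [{(1, 20), (2, 19), (5, 15)},
           {(1, 20), (2, 19)},
           {(1, 20), (5, 15), (6, 14)}]
stable = consensus_pairs(samples)
```

    Raising the threshold yields the "fuzzier," higher-abstraction view the authors argue for: rarely sampled pairs (random error) drop out, leaving the base pairing signal that is stable across the ensemble.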

  1. Thinking Through Computational Exposure as an Evolving Paradigm Shift for Exposure Science: Development and Application of Predictive Models from Big Data

    EPA Science Inventory

    Symposium Abstract: Exposure science has evolved from a time when the primary focus was on measurements of environmental and biological media and the development of enabling field and laboratory methods. The Total Exposure Assessment Method (TEAM) studies of the 1980s were class...

  2. Predictive local receptive fields based respiratory motion tracking for motion-adaptive radiotherapy.

    PubMed

    Yubo Wang; Tatinati, Sivanagaraja; Liyu Huang; Kim Jeong Hong; Shafiq, Ghufran; Veluvolu, Kalyana C; Khong, Andy W H

    2017-07-01

    Extracranial robotic radiotherapy employs external markers and a correlation model to trace tumor motion caused by respiration. Real-time tracking of tumor motion, however, requires a prediction model to compensate for the latencies induced by the software (image data acquisition and processing) and hardware (mechanical and kinematic) limitations of the treatment system. A new prediction algorithm based on local receptive fields extreme learning machines (pLRF-ELM) is proposed for respiratory motion prediction. Existing respiratory motion prediction methods model the non-stationary respiratory motion traces directly to predict future values. Unlike these methods, pLRF-ELM performs prediction by modeling the higher-level features obtained by mapping the raw respiratory motion into the random feature space of ELM, instead of directly modeling the raw respiratory motion. The developed method is evaluated using a dataset acquired from 31 patients for two horizons in line with the latencies of treatment systems such as CyberKnife. Results showed that pLRF-ELM is superior to existing prediction methods. Results further highlight that the abstracted higher-level features are suitable for approximating the nonlinear and non-stationary characteristics of respiratory motion for accurate prediction.

  3. Integration of element specific persistent homology and machine learning for protein-ligand binding affinity prediction.

    PubMed

    Cang, Zixuan; Wei, Guo-Wei

    2018-02-01

    Protein-ligand binding is a fundamental biological process that is paramount to many other biological processes, such as signal transduction, metabolic pathways, enzyme construction, cell secretion, and gene expression. Accurate prediction of protein-ligand binding affinities is vital to rational drug design and the understanding of protein-ligand binding and binding induced function. Existing binding affinity prediction methods are inundated with geometric detail and involve excessively high dimensions, which undermines their predictive power for massive binding data. Topology provides the ultimate level of abstraction and thus incurs too much reduction in geometric information. Persistent homology embeds geometric information into topological invariants and bridges the gap between complex geometry and abstract topology. However, it oversimplifies biological information. This work introduces element specific persistent homology (ESPH), or multicomponent persistent homology, to retain crucial biological information during topological simplification. The combination of ESPH and machine learning gives rise to a powerful paradigm for macromolecular analysis. Tests on two large data sets indicate that the proposed topology-based machine-learning paradigm outperforms other existing methods in protein-ligand binding affinity predictions. ESPH reveals protein-ligand binding mechanisms that cannot be attained from other conventional techniques. The present approach reveals that protein-ligand hydrophobic interactions extend to 40 Å away from the binding site, which has a significant ramification for drug and protein design. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Energy-Efficient Integration of Continuous Context Sensing and Prediction into Smartwatches.

    PubMed

    Rawassizadeh, Reza; Tomitsch, Martin; Nourizadeh, Manouchehr; Momeni, Elaheh; Peery, Aaron; Ulanova, Liudmila; Pazzani, Michael

    2015-09-08

    As the availability and use of wearables increases, they are becoming a promising platform for context sensing and context analysis. Smartwatches are a particularly interesting platform for this purpose, as they offer salient advantages, such as their proximity to the human body. However, they also have limitations associated with their small form factor, such as processing power and battery life, which makes it difficult to simply transfer smartphone-based context sensing and prediction models to smartwatches. In this paper, we introduce an energy-efficient, generic, integrated framework for continuous context sensing and prediction on smartwatches. Our work extends previous approaches for context sensing and prediction on wrist-mounted wearables that perform predictive analytics outside the device. We offer a generic sensing module and a novel energy-efficient, on-device prediction module that is based on a semantic abstraction approach to convert sensor data into meaningful information objects, similar to human perception of a behavior. Through six evaluations, we analyze the energy efficiency of our framework modules, identify the optimal file structure for data access and demonstrate an increase in accuracy of prediction through our semantic abstraction method. The proposed framework is hardware independent and can serve as a reference model for implementing context sensing and prediction on small wearable devices beyond smartwatches, such as body-mounted cameras.
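
    The on-device semantic abstraction described above converts raw sensor readings into meaningful information objects before prediction. A toy sketch of that idea: map accelerometer/heart-rate readings to activity labels, then run-length-encode the stream. All thresholds, labels, and readings here are hypothetical, not the framework's actual rules.

```python
def abstract_reading(accel_magnitude, heart_rate):
    """Map raw sensor values to a semantic activity label (toy thresholds)."""
    if accel_magnitude < 0.5 and heart_rate < 80:
        return "resting"
    if accel_magnitude < 2.0:
        return "walking"
    return "running"

def summarize(readings):
    """Compress a raw stream into run-length-encoded semantic objects."""
    out = []
    for r in readings:
        label = abstract_reading(*r)
        if out and out[-1][0] == label:
            out[-1] = (label, out[-1][1] + 1)
        else:
            out.append((label, 1))
    return out

events = summarize([(0.1, 70), (0.2, 72), (1.5, 95), (3.0, 140)])
```

    Storing and predicting over a handful of labeled events instead of a dense sensor stream is what makes the approach energy-efficient enough for a smartwatch.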

  5. Energy-Efficient Integration of Continuous Context Sensing and Prediction into Smartwatches

    PubMed Central

    Rawassizadeh, Reza; Tomitsch, Martin; Nourizadeh, Manouchehr; Momeni, Elaheh; Peery, Aaron; Ulanova, Liudmila; Pazzani, Michael

    2015-01-01

    As the availability and use of wearables increases, they are becoming a promising platform for context sensing and context analysis. Smartwatches are a particularly interesting platform for this purpose, as they offer salient advantages, such as their proximity to the human body. However, they also have limitations associated with their small form factor, such as processing power and battery life, which makes it difficult to simply transfer smartphone-based context sensing and prediction models to smartwatches. In this paper, we introduce an energy-efficient, generic, integrated framework for continuous context sensing and prediction on smartwatches. Our work extends previous approaches for context sensing and prediction on wrist-mounted wearables that perform predictive analytics outside the device. We offer a generic sensing module and a novel energy-efficient, on-device prediction module that is based on a semantic abstraction approach to convert sensor data into meaningful information objects, similar to human perception of a behavior. Through six evaluations, we analyze the energy efficiency of our framework modules, identify the optimal file structure for data access and demonstrate an increase in accuracy of prediction through our semantic abstraction method. The proposed framework is hardware independent and can serve as a reference model for implementing context sensing and prediction on small wearable devices beyond smartwatches, such as body-mounted cameras. PMID:26370997

  6. New structural information on food allergens (abstract)

    USDA-ARS?s Scientific Manuscript database

    A small number of protein families are responsible for food allergies suffered by the majority of allergy patients. What properties of these proteins make them allergens is not clear at present. Reliable methods for allergen prediction and mitigation are lacking. Most the immediate type of food alle...

  7. Boosting compound-protein interaction prediction by deep learning.

    PubMed

    Tian, Kai; Shao, Mingyu; Wang, Yang; Guan, Jihong; Zhou, Shuigeng

    2016-11-01

The identification of interactions between compounds and proteins plays an important role in network pharmacology and drug discovery. However, experimentally identifying compound-protein interactions (CPIs) is generally expensive and time-consuming, so computational approaches have been introduced. Among these, machine-learning-based methods have achieved considerable success. However, due to the nonlinear and imbalanced nature of biological data, many machine learning approaches have their own limitations. Recently, deep learning techniques have shown advantages over many state-of-the-art machine learning methods in some applications. In this study, we aim at improving the performance of CPI prediction based on deep learning, and propose a method called DL-CPI (short for Deep Learning for Compound-Protein Interactions prediction), which employs a deep neural network (DNN) to effectively learn representations of compound-protein pairs. Extensive experiments show that DL-CPI can learn useful features of compound-protein pairs by layerwise abstraction, and thus achieves better prediction performance than existing methods on both balanced and imbalanced datasets. Copyright © 2016 Elsevier Inc. All rights reserved.
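The abstract gives no architecture details, so the following is only a generic sketch of the kind of model it describes: a small feed-forward network classifying concatenated compound and protein descriptor vectors. The descriptor sizes, hidden width, training scheme, and synthetic data are all invented for illustration and are not DL-CPI's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented descriptor sizes; DL-CPI's real features are not specified here.
N_COMPOUND, N_PROTEIN = 16, 32
n = 400

# Synthetic compound-protein pairs; labels follow a hidden linear rule so
# the toy task is learnable.
X = rng.normal(size=(n, N_COMPOUND + N_PROTEIN))
y = (X @ rng.normal(size=X.shape[1]) > 0).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# One hidden layer, trained with full-batch gradient descent on log-loss.
H = 24
W1 = rng.normal(scale=0.1, size=(X.shape[1], H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 1)); b2 = np.zeros(1)
lr = 1.0
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)           # learned pair representation
    p = sigmoid(h @ W2 + b2).ravel()   # predicted interaction probability
    g_out = ((p - y) / n)[:, None]     # dLoss/d(logit), averaged over pairs
    g_h = (g_out @ W2.T) * (1 - h**2)  # backprop through tanh
    W2 -= lr * h.T @ g_out
    b2 -= lr * g_out.sum(axis=0)
    W1 -= lr * X.T @ g_h
    b1 -= lr * g_h.sum(axis=0)

acc = float(((p > 0.5) == (y > 0.5)).mean())
print(f"training accuracy on the toy task: {acc:.2f}")
```

The hidden activations `h` play the role of the "layerwise abstraction" the abstract mentions: the readout sees a learned representation of the pair rather than the raw descriptors.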

  8. An Analysis of Factors Predicting Successful Transition From Pancreatology Abstracts to Full Publications.

    PubMed

    Grunwald, Douglas; Feuerstein, Joseph D; Maier, Irena Maria; Sheth, Sunil G

    2017-01-01

    Historically, less than half of peer-reviewed abstracts are published. We set out to determine how many pancreas-related abstracts are published within 5 years of presentation at gastroenterology conferences and to determine a model that predicts successful transition from abstract to journal publication. We collected data on study design from all pancreas-related abstracts at the 2010 Digestive Disease Week (DDW), American College of Gastroenterology, and American Pancreatic Association conferences. We then determined whether an abstract was published by October 2015 using a standardized search algorithm. Of 412 abstracts, 39.8% were published. Studies that were of basic science or translational design (P = 0.02, 0.01, respectively); had more listed authors (P = 0.05); employed randomized, prospective, and multicenter methodology (P = 0.02); and were accepted to DDW (P = 0.02) were more likely to be published. After regression, basic/translational studies (P = 0.002, 0.02, respectively) and DDW-accepted abstracts (P = 0.004) continued to predict successful publication. It is not clear why only 40% of the pancreas abstracts from 2010 were published 5 years later. Some abstracts may go unpublished because of methodological flaws that escape detection during abstract peer review. Therefore, physicians should use caution when applying abstract data to their clinical decision making.
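The regression step described above can be sketched as follows. The data are synthetic and the effect sizes invented, so this only illustrates how such odds ratios are obtained from a logistic fit, not the paper's actual results; the predictor names merely echo the abstract's factors.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Hypothetical predictors named after the abstract's factors; the data and
# effect sizes below are invented for illustration only.
basic_science = rng.integers(0, 2, n).astype(float)
ddw_accepted = rng.integers(0, 2, n).astype(float)
n_authors = rng.integers(1, 12, n).astype(float)

# Simulate "published within 5 years" from made-up true log-odds.
logit = -1.5 + 0.9 * basic_science + 0.8 * ddw_accepted + 0.05 * n_authors
published = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit logistic regression by Newton-Raphson (iteratively reweighted least
# squares) -- no external statistics library needed.
X = np.column_stack([np.ones(n), basic_science, ddw_accepted, n_authors])
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (p - published)
    hess = X.T @ (X * (p * (1 - p))[:, None]) + 1e-9 * np.eye(X.shape[1])
    beta -= np.linalg.solve(hess, grad)

odds_ratios = np.exp(beta[1:])   # one odds ratio per predictor
print("ORs (basic science, DDW acceptance, per extra author):",
      np.round(odds_ratios, 2))
```

Exponentiating each fitted coefficient gives the odds ratio for that factor, which is how "basic/translational design" and "DDW acceptance" would be reported as independent predictors after regression.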

  9. Families Who Begin versus Decline Therapy for Children Who Are Sexually Abused

    ERIC Educational Resources Information Center

    Lippert, Tonya; Favre, Tricia; Alexander, Cindy; Cross, Theodore P.

    2008-01-01

    Objective: To identify child characteristics, factors related to the therapy referral, and caregivers' psychological and social variables that predict sexually abused children's beginning therapy following a therapy referral. Method: Investigators abstracted data from case records of 101 families whose children were referred to a Children's…

  10. Evaluation of spectrally-selective materials for multi-layer solar thermal crop drying (abstract)

    USDA-ARS?s Scientific Manuscript database

    Solar thermal (ST) drying is a widely used method for fruit and vegetable crop preservation in developing countries; however, it has had limited commercialization in the United States due to concerns about slow drying rates, poor product quality, and predicted low return-on-investme...

  11. EMISSIONS INVENTORY OF PM 2.5 TRACE ELEMENTS ACROSS THE U.S.

    EPA Science Inventory

    This abstract describes work done to speciate PM2.5 emissions into emissions of trace metals to enable concentrations of metal species to be predicted by air quality models. Methods are described and initial results are presented. A technique for validating the resul...

  12. Predicting Semantic Changes in Abstraction in Tutor Responses to Students

    ERIC Educational Resources Information Center

    Lipschultz, Michael; Litman, Diane; Katz, Sandra; Albacete, Patricia; Jordan, Pamela

    2014-01-01

    Post-problem reflective tutorial dialogues between human tutors and students are examined to predict when the tutor changed the level of abstraction from the student's preceding turn (i.e., used more general terms or more specific terms); such changes correlate with learning. Prior work examined lexical changes in abstraction. In this work, we…

  13. Point-Mass Aircraft Trajectory Prediction Using a Hierarchical, Highly-Adaptable Software Design

    NASA Technical Reports Server (NTRS)

    Karr, David A.; Vivona, Robert A.; Woods, Sharon E.; Wing, David J.

    2017-01-01

    A highly adaptable and extensible method for predicting four-dimensional trajectories of civil aircraft has been developed. This method, Behavior-Based Trajectory Prediction, is based on taxonomic concepts developed for the description and comparison of trajectory prediction software. A hierarchical approach to the "behavioral" layer of a point-mass model of aircraft flight, a clear separation between the "behavioral" and "mathematical" layers of the model, and an abstraction of the methods of integrating differential equations in the "mathematical" layer have been demonstrated to support aircraft models of different types (in particular, turbojet vs. turboprop aircraft) using performance models at different levels of detail and in different formats, and promise to be easily extensible to other aircraft types and sources of data. The resulting trajectories predict location, altitude, lateral and vertical speeds, and fuel consumption along the flight path of the subject aircraft accurately and quickly, accounting for local conditions of wind and outside air temperature. The Behavior-Based Trajectory Prediction concept was implemented in NASA's Traffic Aware Planner (TAP) flight-optimizing cockpit software application.

  14. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
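The first step the abstract describes, converting time-series data into time-interval sequences of temporal abstractions, can be illustrated with a minimal value-abstraction routine. The three-label scheme and the thresholds are illustrative choices, not taken from the paper.

```python
# Value abstraction: map a numeric time series into labeled intervals,
# merging consecutive equal labels into one time interval. This is the
# kind of preprocessing that precedes temporal-pattern mining; the
# Low/Normal/High labels and cutoffs here are invented for illustration.
def abstract_series(values, low=4.0, high=10.0):
    """Return a list of (label, start, end) intervals over index time."""
    def label(v):
        return "Low" if v < low else ("High" if v > high else "Normal")
    intervals = []
    for t, v in enumerate(values):
        lab = label(v)
        if intervals and intervals[-1][0] == lab:
            intervals[-1] = (lab, intervals[-1][1], t)   # extend interval
        else:
            intervals.append((lab, t, t))                # open new interval
    return intervals

# Example: a glucose-like series that rises above the High cutoff at t=3..4.
series = [5.1, 5.3, 9.8, 12.0, 11.2, 6.0]
print(abstract_series(series))
# → [('Normal', 0, 2), ('High', 3, 4), ('Normal', 5, 5)]
```

Pattern mining then operates on these labeled intervals (e.g., "High follows Normal") rather than on the raw numeric samples, which is what makes constructing time-interval patterns with temporal operators tractable.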

  15. TOXICO-CHEMINFORMATICS AND QSAR MODELING OF ...

    EPA Pesticide Factsheets

    This abstract concludes that QSAR approaches combined with toxico-chemoinformatics descriptors can enhance predictive toxicology models.

  16. A Novel Method Using Abstract Convex Underestimation in Ab-Initio Protein Structure Prediction for Guiding Search in Conformational Feature Space.

    PubMed

    Hao, Xiao-Hu; Zhang, Gui-Jun; Zhou, Xiao-Gen; Yu, Xu-Feng

    2016-01-01

    To address the problem of searching the protein conformational space in ab-initio protein structure prediction, a novel method using abstract convex underestimation (ACUE), built on an evolutionary-algorithm framework, is proposed. Computing such conformations, which is essential for associating structural and functional information with gene sequences, is challenging due to the high dimensionality and rugged energy surface of the protein conformational space. Consequently, the dimension of the conformational space must be reduced to a manageable level. In this paper, the original high-dimensional conformational space is converted by a feature-extraction technique into a feature space of considerably lower dimension, and an underestimate space is constructed according to abstract convex theory. This conversion avoids the entropy effect caused by searching in the high-dimensional conformational space. The tight lower-bound estimate guides the search direction, and invalid search areas that cannot contain the global optimum are eliminated in advance. Moreover, instead of expensively evaluating the energy of conformations in the original conformational space, the estimate value is used to judge whether a conformation is worth exploring, reducing evaluation time and thereby making the computation cheaper and the search more efficient. Additionally, fragment assembly and the Monte Carlo method are combined to generate a series of metastable conformations by sampling the conformational space. The proposed method provides a novel technique for the search problem of protein conformational space. Twenty small-to-medium, structurally diverse proteins were tested, and the proposed ACUE method was compared with ItFix, HEA, Rosetta, and the developed method LEDE without underestimate information. Test results show that the ACUE method obtains near-native protein structures more rapidly and more efficiently.

  17. Planner-Based Control of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott

    2005-01-01

    The paper describes an approach to integrating qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This allows effective management of the combinatorics arising from the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but more complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.

  18. Competition H(D) Kinetic Isotope Effects in the Autoxidation of Hydrocarbons

    PubMed Central

    Muchalski, Hubert; Levonyak, Alexander J.; Xu, Libin; Ingold, Keith U.; Porter, Ned A.

    2016-01-01

    Hydrogen atom transfer is central to many important radical chain sequences. We report here a method for determination of both the primary and secondary isotope effects for symmetrical substrates by the use of NMR. Intramolecular competition reactions were carried out on substrates having an increasing number of deuterium atoms at symmetry-related sites. Products that arise from peroxyl radical abstraction at each position of the various substrates reflect the competition rates for H(D) abstraction. The primary KIE for autoxidation of tetralin was determined to be 15.9 ± 1.4, a value that exceeds the maximum predicted by differences in H(D) zero-point energies (~7) and strongly suggests that H atom abstraction by the peroxyl radical occurs with substantial quantum mechanical tunneling. PMID:25533605

  19. Competition H(D) kinetic isotope effects in the autoxidation of hydrocarbons.

    PubMed

    Muchalski, Hubert; Levonyak, Alexander J; Xu, Libin; Ingold, Keith U; Porter, Ned A

    2015-01-14

    Hydrogen atom transfer is central to many important radical chain sequences. We report here a method for determination of both the primary and secondary isotope effects for symmetrical substrates by the use of NMR. Intramolecular competition reactions were carried out on substrates having an increasing number of deuterium atoms at symmetry-related sites. Products that arise from peroxyl radical abstraction at each position of the various substrates reflect the competition rates for H(D) abstraction. The primary KIE for autoxidation of tetralin was determined to be 15.9 ± 1.4, a value that exceeds the maximum predicted by differences in H(D) zero-point energies (∼7) and strongly suggests that H atom abstraction by the peroxyl radical occurs with substantial quantum mechanical tunneling.
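The "~7" semiclassical ceiling quoted in both versions of this abstract comes from the difference in zero-point energies of the C-H and C-D stretches. A back-of-the-envelope check makes the number concrete; the 2900 cm⁻¹ stretch frequency and the 1/√2 mass scaling are textbook approximations, not values from this paper.

```python
import math

# Semiclassical upper bound on the primary H/D kinetic isotope effect from
# zero-point-energy differences alone (no tunneling). The C-H stretch is
# taken as ~2900 cm^-1 (an illustrative textbook value); the C-D frequency
# is scaled by ~1/sqrt(2) for the doubled reduced mass.
KB_CM = 0.69504          # Boltzmann constant, cm^-1 per kelvin
T = 298.0                # room temperature, K
NU_CH = 2900.0           # assumed C-H stretch, cm^-1
NU_CD = NU_CH / math.sqrt(2)

# The zero-point energy is h*nu/2, so the difference enters the rate ratio:
delta_zpe = (NU_CH - NU_CD) / 2.0
kie_max = math.exp(delta_zpe / (KB_CM * T))
print(f"semiclassical KIE limit at 298 K: {kie_max:.1f}")   # ≈ 7-8
```

The measured value of 15.9 ± 1.4 is roughly double this ceiling, which is why the authors conclude that hydrogen abstraction by the peroxyl radical must involve substantial quantum-mechanical tunneling.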

  20. Relative Packing Groups in Template-Based Structure Prediction: Cooperative Effects of True Positive Constraints

    PubMed Central

    Day, Ryan; Qu, Xiaotao; Swanson, Rosemarie; Bohannan, Zach; Bliss, Robert

    2011-01-01

    Most current template-based structure prediction methods concentrate on finding the correct backbone conformation and then packing sidechains within that backbone. Our packing-based method derives distance constraints from conserved relative packing groups (RPGs). In our refinement approach, the RPGs provide a level of resolution that restrains global topology while allowing conformational sampling. In this study, we test our template-based structure prediction method using 51 prediction units from CASP7 experiments. RPG-based constraints are able to substantially improve approximately two-thirds of starting templates. Upon deeper investigation, we find that true positive spatial constraints, especially those non-local in sequence, derived from the RPGs were important to building nearer-native models. Surprisingly, the fraction of incorrect or false positive constraints does not strongly influence the quality of the final candidate. This result indicates that our RPG-based true positive constraints sample the self-consistent, cooperative interactions of the native structure. The lack of such reinforcing cooperativity explains the weaker effect of false positive constraints. Generally, these findings are encouraging indications that RPGs will improve template-based structure prediction. PMID:21210729

  1. Identification and in silico prediction of metabolites of the model compound, tebufenozide by human CYP3A4 and CYP2C19.

    PubMed

    Shirotani, Naoki; Togawa, Moe; Ikushiro, Shinichi; Sakaki, Toshiyuki; Harada, Toshiyuki; Miyagawa, Hisashi; Matsui, Masayoshi; Nagahori, Hirohisa; Mikata, Kazuki; Nishioka, Kazuhiko; Hirai, Nobuhiro; Akamatsu, Miki

    2015-10-15

    The metabolites of tebufenozide, a model compound, formed by the yeast-expressed human CYP3A4 and CYP2C19 were identified to clarify the substrate recognition mechanism of the human cytochrome P450 (CYP) isozymes. We then determined whether tebufenozide metabolites may be predicted in silico. Hydrogen abstraction energies were calculated with the density functional theory method B3LYP/6-31G(∗). A docking simulation was performed using FRED software. Several alkyl sites of tebufenozide were hydroxylated by CYP3A4 whereas only one site was modified by CYP2C19. The accessibility of each site of tebufenozide to the reaction center of CYP enzymes and the susceptibility of each hydrogen atom for metabolism by CYP enzymes were evaluated by a docking simulation and hydrogen abstraction energy estimation, respectively. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Intuitive reasoning about abstract and familiar physics problems

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary Kister; Jonides, John; Alexander, Joanne

    1986-01-01

    Previous research has demonstrated that many people have misconceptions about basic properties of motion. Two experiments examined whether people are more likely to produce dynamically correct predictions about basic motion problems involving situations with which they are familiar, and whether solving such problems enhances performance on a subsequent abstract problem. In experiment 1, college students were asked to predict the trajectories of objects exiting a curved tube. Subjects were more accurate on the familiar version of the problem, and there was no evidence of transfer to the abstract problem. In experiment 2, two familiar problems were provided in an attempt to enhance subjects' tendency to extract the general structure of the problems. Once again, they gave more correct responses to the familiar problems but failed to generalize to the abstract problem. Formal physics training was associated with correct predictions for the abstract problem but was unrelated to performance on the familiar problems.

  3. Abstract Conceptual Feature Ratings Predict Gaze within Written Word Arrays: Evidence from a Visual Wor(l)d Paradigm

    ERIC Educational Resources Information Center

    Primativo, Silvia; Reilly, Jamie; Crutch, Sebastian J

    2017-01-01

    The Abstract Conceptual Feature (ACF) framework predicts that word meaning is represented within a high-dimensional semantic space bounded by weighted contributions of perceptual, affective, and encyclopedic information. The ACF, like latent semantic analysis, is amenable to distance metrics between any two words. We applied predictions of the ACF…

  4. Frame prediction using recurrent convolutional encoder with residual learning

    NASA Astrophysics Data System (ADS)

    Yue, Boxuan; Liang, Jun

    2018-05-01

    Predicting future frames of a video is difficult but urgently needed for autonomous driving. Conventional methods can predict only abstract trends in the region of interest. The rise of deep learning makes frame prediction possible. In this paper, we propose a novel recurrent convolutional encoder and deconvolutional decoder structure to predict frames. We introduce residual learning into the convolutional encoder to address gradient issues: residual learning transforms gradient back-propagation into an identity mapping, preserving the full gradient information and overcoming the gradient problems of Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs). Moreover, compared with the branches in CNNs and the gated structures in RNNs, residual learning saves significant training time. In the experiments, we train our networks on the UCF101 dataset and compare the predictions with some state-of-the-art methods. The results show that our networks can predict frames quickly and efficiently. Furthermore, we apply our networks to driving video to verify their practicality.

  5. Journal of Aeronautics.

    DTIC Science & Technology

    1982-07-21

    aerodynamic tool for design of elastic aircraft. Several numerical examples are given and some dynamical problems of elastic aircraft are also discussed... Qiangang, Wu Changlin, Jian Zheng, Northwestern Polytechnical University. Abstract: A numerical method is presented for predicting the aerodynamic characteristics... Numerical calculation is an important means of current research on the aerodynamic characteristics of elastic aircraft. Because this

  6. Theoretical Study of the Electronic Spectra of a Polycyclic Aromatic Hydrocarbon, Naphthalene, and its Derivatives

    NASA Technical Reports Server (NTRS)

    Du, Ping; Salama, Farid; Loew, Gilda H.

    1993-01-01

    In order to preselect possible candidates for the origin of the observed diffuse interstellar bands, the semiempirical quantum mechanical method INDO/S was applied to the optical spectra of the neutral, cationic, and anionic states of naphthalene and its hydrogen-abstraction and -addition derivatives. Comparison with experiment shows that the spectra of naphthalene and its ions were reliably predicted. The configuration-interaction calculations with single-electron excitations provided reasonable excited-state wavefunctions compared with ab initio calculations that included higher excitations. The degree of similarity of the predicted spectra of the hydrogen-abstraction derivatives to those of naphthalene and its ions depends largely on the similarity of their π-electron configurations. For the hydrogen-addition derivatives, very little resemblance of the predicted spectra to naphthalene was found because of the disruption of the aromatic conjugation system. The relevance of these calculations to astrophysical issues is discussed within the context of these polycyclic aromatic hydrocarbon models. By comparing the calculated electronic energies with the Diffuse Interstellar Bands (DIBs), a list of possible naphthalene-derivative candidates is established, providing selected candidates for a definitive test through laboratory studies.

  7. Fate of abstracts presented at the 2008 European Congress of Physical and Rehabilitation Medicine.

    PubMed

    Allart, E; Beaucamp, F; Tiffreau, V; Thevenon, A

    2015-08-01

    The subsequent full-text publication of abstracts presented at a scientific congress reflects the congress's scientific quality. The aim of this paper was to evaluate the publication rate for abstracts presented at the 2008 European Congress of Physical and Rehabilitation Medicine (ECPRM), characterize the publications, and identify factors predictive of publication. The study was a bibliographic search: we used the PubMed database to search for subsequent publication of abstracts, and screened the abstracts' characteristics, such as the status of the authors, the topic, and the type of work, for features predictive of publication. We performed univariate analyses and a logistic regression analysis. Of 779 abstracts presented at ECPRM 2008, 169 (21.2%) were subsequently published. The mean time to publication was 12±15.7 months, and the mean impact factor of the publishing journals was 2.05±2.1. In univariate analysis, university status (P<10⁻⁶), geographic origin (P=10⁻³), oral presentation (P<10⁻⁶), and original research (P<10⁻⁶) (particularly multicentre trials [P<0.01] and randomized controlled trials [P=10⁻³]) were predictive of publication. In logistic regression analysis, oral presentation (odds ratio [OR]=0.37) and university status (OR=0.36) were significant, independent predictors of publication. The ECPRM 2008 publication rate and impact factor were relatively low compared with most other national and international conferences in this field. University status, abstract type, and oral presentation were predictive of subsequent publication.

  8. Deep-Learning-Based Drug-Target Interaction Prediction.

    PubMed

    Wen, Ming; Zhang, Zhimin; Niu, Shaoyu; Sha, Haozhi; Yang, Ruihan; Yun, Yonghuan; Lu, Hongmei

    2017-04-07

    Identifying interactions between known drugs and targets is a major challenge in drug repositioning. In silico prediction of drug-target interactions (DTIs) can speed up the expensive and time-consuming experimental work by providing the most potent DTIs. In silico prediction of DTIs can also provide insights into potential drug-drug interactions and promote the exploration of drug side effects. Traditionally, the performance of DTI prediction depends heavily on the descriptors used to represent the drugs and the target proteins. In this paper, to accurately predict new DTIs between approved drugs and targets without separating the targets into different classes, we developed a deep-learning-based algorithmic framework named DeepDTIs. It first abstracts representations from raw input descriptors using unsupervised pretraining and then applies known label pairs of interaction to build a classification model. Compared with other methods, DeepDTIs is found to reach or outperform other state-of-the-art methods. DeepDTIs can further be used to predict whether a new drug targets some existing target or whether a new target interacts with some existing drugs.

  9. Framework for Smart Electronic Health Record-Linked Predictive Models to Optimize Care for Complex Digestive Diseases

    DTIC Science & Technology

    2012-06-01

    indices was independent of the presence or absence of hepatic steatosis on abdominal imaging. Key Research Accomplishments: Abstracts presented at... transplant; 3) hepatic encephalopathy; and 4) hepatocellular carcinoma. Logistic regression analysis confirmed that the clinical predictive value of the... Gumus S, Saul MI, Bae KT. Noninvasive Hepatic Fibrosis Scores Predict Liver-Related Outcomes in Diabetic Patients [abstract]. Gastroenterology. 2012

  10. BepiPred-2.0: improving sequence-based B-cell epitope prediction using conformational epitopes

    PubMed Central

    Jespersen, Martin Closter; Peters, Bjoern

    2017-01-01

    Antibodies have become an indispensable tool for many biotechnological and clinical applications. They bind their molecular target (antigen) by recognizing a portion of its structure (epitope) in a highly specific manner. The ability to predict epitopes from antigen sequences alone is a complex task. Despite substantial effort, limited advancement has been achieved over the last decade in the accuracy of epitope prediction methods, especially for those that rely on the sequence of the antigen only. Here, we present BepiPred-2.0 (http://www.cbs.dtu.dk/services/BepiPred/), a web server for predicting B-cell epitopes from antigen sequences. BepiPred-2.0 is based on a random forest algorithm trained on epitopes annotated from antibody-antigen protein structures. This new method was found to outperform other available tools for sequence-based epitope prediction both on epitope data derived from solved 3D structures, and on a large collection of linear epitopes downloaded from the IEDB database. The method displays results in a user-friendly and informative way, both for computer-savvy and non-expert users. We believe that BepiPred-2.0 will be a valuable tool for the bioinformatics and immunology community. PMID:28472356

  11. Mammalian genomic regulatory regions predicted by utilizing human genomics, transcriptomics, and epigenetics data

    PubMed Central

    Nguyen, Quan H; Tellam, Ross L; Naval-Sanchez, Marina; Porto-Neto, Laercio R; Barendse, William; Reverter, Antonio; Hayes, Benjamin; Kijas, James; Dalrymple, Brian P

    2018-01-01

    Genome sequences for hundreds of mammalian species are available, but an understanding of their genomic regulatory regions, which control gene expression, is only beginning. A comprehensive prediction of potential active regulatory regions is necessary to functionally study the roles of the majority of genomic variants in evolution, domestication, and animal production. We developed a computational method to predict regulatory DNA sequences (promoters, enhancers, and transcription factor binding sites) in production animals (cows and pigs) and extended its broad applicability to other mammals. The method utilizes human regulatory features identified from thousands of tissues, cell lines, and experimental assays to find homologous regions that are conserved in sequences and genome organization and are enriched for regulatory elements in the genome sequences of other mammalian species. Importantly, we developed a filtering strategy, including a machine learning classification method, to utilize a very small number of species-specific experimental datasets available to select for the likely active regulatory regions. The method finds the optimal combination of sensitivity and accuracy to unbiasedly predict regulatory regions in mammalian species. Furthermore, we demonstrated the utility of the predicted regulatory datasets in cattle for prioritizing variants associated with multiple production and climate change adaptation traits and identifying potential genome editing targets. PMID:29618048

  12. Sphinx: merging knowledge-based and ab initio approaches to improve protein loop prediction

    PubMed Central

    Marks, Claire; Nowak, Jaroslaw; Klostermann, Stefan; Georges, Guy; Dunbar, James; Shi, Jiye; Kelm, Sebastian

    2017-01-01

    Motivation: Loops are often vital for protein function, however, their irregular structures make them difficult to model accurately. Current loop modelling algorithms can mostly be divided into two categories: knowledge-based, where databases of fragments are searched to find suitable conformations and ab initio, where conformations are generated computationally. Existing knowledge-based methods only use fragments that are the same length as the target, even though loops of slightly different lengths may adopt similar conformations. Here, we present a novel method, Sphinx, which combines ab initio techniques with the potential extra structural information contained within loops of a different length to improve structure prediction. Results: We show that Sphinx is able to generate high-accuracy predictions and decoy sets enriched with near-native loop conformations, performing better than the ab initio algorithm on which it is based. In addition, it is able to provide predictions for every target, unlike some knowledge-based methods. Sphinx can be used successfully for the difficult problem of antibody H3 prediction, outperforming RosettaAntibody, one of the leading H3-specific ab initio methods, both in accuracy and speed. Availability and Implementation: Sphinx is available at http://opig.stats.ox.ac.uk/webapps/sphinx. Contact: deane@stats.ox.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28453681

  13. Priming of Spatial Distance Enhances Children's Creative Performance

    ERIC Educational Resources Information Center

    Liberman, Nira; Polack, Orli; Hameiri, Boaz; Blumenfeld, Maayan

    2012-01-01

    According to construal level theory, psychological distance promotes more abstract thought. Theories of creativity, in turn, suggest that abstract thought promotes creativity. Based on these lines of theorizing, we predicted that spatial distancing would enhance creative performance in elementary school children. To test this prediction, we primed…

  14. Nonpolitical images evoke neural predictors of political ideology.

    PubMed

    Ahn, Woo-Young; Kishida, Kenneth T; Gu, Xiaosi; Lohrenz, Terry; Harvey, Ann; Alford, John R; Smith, Kevin B; Yaffe, Gideon; Hibbing, John R; Dayan, Peter; Montague, P Read

    2014-11-17

    Political ideologies summarize dimensions of life that define how a person organizes their public and private behavior, including their attitudes associated with sex, family, education, and personal autonomy. Despite the abstract nature of such sensibilities, fundamental features of political ideology have been found to be deeply connected to basic biological mechanisms that may serve to defend against environmental challenges like contamination and physical threat. These results invite the provocative claim that neural responses to nonpolitical stimuli (like contaminated food or physical threats) should be highly predictive of abstract political opinions (like attitudes toward gun control and abortion). We applied a machine-learning method to fMRI data to test the hypotheses that brain responses to emotionally evocative images predict individual scores on a standard political ideology assay. Disgusting images, especially those related to animal-reminder disgust (e.g., mutilated body), generate neural responses that are highly predictive of political orientation even though these neural predictors do not agree with participants' conscious rating of the stimuli. Images from other affective categories do not support such predictions. Remarkably, brain responses to a single disgusting stimulus were sufficient to make accurate predictions about an individual subject's political ideology. These results provide strong support for the idea that fundamental neural processing differences that emerge under the challenge of emotionally evocative stimuli may serve to structure political beliefs in ways formerly unappreciated. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions.

    PubMed

    Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-29

    Article summaries' information and structure may influence researchers' and clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MAs) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MAs about psoriasis were undertaken in MEDLINE, EMBASE, and the Cochrane database. For each review, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, reviews reported 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low bias risk showed higher total PRISMA-A values than reviews with high bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95%CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95%CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95%CI: 1.785-10.98) predicted PRISMA-A variability.
    Reviews with a total PRISMA-A score < 6 that lacked identification as an SR or MA in the title and lacked an explanation of the bias risk assessment methods were classified as low methodological quality. Abstracts with a total PRISMA-A score ≥ 9 that included main outcome results and an explanation of the bias risk assessment method were classified as having low bias risk. The methodological quality and bias risk of SRs may thus be determined by analysing the quality and completeness of their abstracts. Our proposal aims to facilitate the evaluation of synthesised evidence by clinical professionals who lack methodological training. External validation is necessary.
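    The classification cut-offs reported above can be written as a hand-coded decision rule. This is a hedged sketch only: the feature names are hypothetical simplifications, and the study's actual models are learned decision trees over per-item and total PRISMA-A scores.

```python
def classify_review(prisma_total, titled_as_sr_or_ma, explains_bias_method,
                    reports_main_outcomes):
    """Toy decision rules mirroring the thresholds quoted in the abstract.

    Feature names are hypothetical; the published trees were learned from
    the data, not hand-written.
    """
    low_quality = (prisma_total < 6
                   and not titled_as_sr_or_ma
                   and not explains_bias_method)
    low_bias_risk = (prisma_total >= 9
                     and reports_main_outcomes
                     and explains_bias_method)
    return {"low_methodological_quality": low_quality,
            "low_bias_risk": low_bias_risk}
```

    For example, an abstract scoring 5 that is not identified as an SR/MA and gives no bias-assessment detail falls in the low-quality class under these rules.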

  16. Tracking perturbations in Boolean networks with spectral methods

    NASA Astrophysics Data System (ADS)

    Kesseli, Juha; Rämö, Pauli; Yli-Harja, Olli

    2005-08-01

    In this paper we present a method for predicting the spread of perturbations in Boolean networks. The method is applicable to networks that have no regular topology. The prediction can be performed efficiently using a result, presented here, that enables computation of the required iterative formulas; this result is based on the abstract Fourier transform of the functions in the network. We apply the method to show the spread of perturbations in networks containing a distribution of functions found in biological data. These advances can be applied directly to quantify chaos in Boolean networks: Derrida plots over an arbitrary number of time steps can be computed, and distributions of functions can thus be compared with respect to the amount of order they create in random networks.

  17. Predicting Heart Rate at the Ventilatory Threshold for Aerobic Exercise Prescription in Persons With Chronic Stroke.

    PubMed

    Boyne, Pierce; Buhr, Sarah; Rockwell, Bradley; Khoury, Jane; Carl, Daniel; Gerson, Myron; Kissela, Brett; Dunning, Kari

    2015-10-01

    Treadmill aerobic exercise improves gait, aerobic capacity, and cardiovascular health after stroke, but a lack of specificity in current guidelines could lead to underdosing or overdosing of aerobic intensity. The ventilatory threshold (VT) has been recommended as an optimal, specific starting point for continuous aerobic exercise. However, VT measurement is not available in clinical stroke settings. Therefore, the purpose of this study was to identify an accurate method to predict heart rate at the VT (HRVT) for use as a surrogate for VT. A cross-sectional design was employed. Using symptom-limited graded exercise test (GXT) data from 17 subjects more than 6 months poststroke, prediction methods for HRVT were derived by traditional target HR calculations (percentage of HRpeak achieved during GXT, percentage of peak HR reserve [HRRpeak], percentage of age-predicted maximal HR, and percentage of age-predicted maximal HR reserve) and by regression analysis. The validity of the prediction methods was then tested among 8 additional subjects. All prediction methods were validated by the second sample, so data were pooled to calculate refined prediction equations. HRVT was accurately predicted by 80% HRpeak (R, 0.62; standard deviation of error [SDerror], 7 bpm), 62% HRRpeak (R, 0.66; SDerror, 7 bpm), and regression models that included HRpeak (R, 0.62-0.75; SDerror, 5-6 bpm). Derived regression equations, 80% HRpeak and 62% HRRpeak, provide a specific target intensity for initial aerobic exercise prescription that should minimize underdosing and overdosing for persons with chronic stroke. The specificity of these methods may lead to more efficient and effective treatment for poststroke deconditioning. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A114).
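    The two simplest predictors reported above translate directly into a target-HR calculation. A minimal sketch (the study's derived regression equations themselves are not reproduced here):

```python
def predict_hrvt(hr_peak, hr_rest=None):
    """Estimate heart rate at the ventilatory threshold (bpm) from GXT data.

    Uses the two simple methods reported: 80% of GXT peak HR, and 62% of
    peak heart-rate reserve (Karvonen-style, added back to resting HR).
    """
    estimates = {"pct_hr_peak": 0.80 * hr_peak}
    if hr_rest is not None:
        estimates["pct_hrr_peak"] = hr_rest + 0.62 * (hr_peak - hr_rest)
    return estimates
```

    For a hypothetical patient with a GXT peak of 150 bpm and a resting HR of 70 bpm, the two methods give targets of 120 bpm and about 120 bpm respectively.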

  18. Learning Traffic as Images: A Deep Convolutional Neural Network for Large-Scale Transportation Network Speed Prediction.

    PubMed

    Ma, Xiaolei; Dai, Zhuang; He, Zhengbing; Ma, Jihui; Wang, Yong; Wang, Yunpeng

    2017-04-10

    This paper proposes a convolutional neural network (CNN)-based method that learns traffic as images and predicts large-scale, network-wide traffic speed with high accuracy. Spatiotemporal traffic dynamics are converted to images describing the time and space relations of traffic flow via a two-dimensional time-space matrix. A CNN is applied to the image following two consecutive steps: abstract traffic feature extraction and network-wide traffic speed prediction. The effectiveness of the proposed method is evaluated by taking two real-world transportation networks, the second ring road and north-east transportation network in Beijing, as examples, and comparing the method with four prevailing algorithms, namely, ordinary least squares, k-nearest neighbors, artificial neural network, and random forest, and three deep learning architectures, namely, stacked autoencoder, recurrent neural network, and long short-term memory network. The results show that the proposed method outperforms other algorithms by an average accuracy improvement of 42.91% within an acceptable execution time. The CNN can train the model in a reasonable time and, thus, is suitable for large-scale transportation networks.
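    The first step, arranging speeds into a two-dimensional time-space matrix, can be sketched with NumPy. The record layout and the simple gap-filling below are illustrative assumptions, not the paper's exact preprocessing:

```python
import numpy as np

def speeds_to_image(speed_records, n_segments, n_steps):
    """Arrange (time_step, segment, speed) records into a 2-D time-space
    matrix -- the 'image' a CNN would consume.  Rows are time steps,
    columns are road segments; values are min-max scaled to [0, 1].
    """
    img = np.full((n_steps, n_segments), np.nan)
    for t, s, v in speed_records:
        img[t, s] = v
    # fill missing observations with the segment mean, then scale
    col_mean = np.nanmean(img, axis=0)
    idx = np.where(np.isnan(img))
    img[idx] = np.take(col_mean, idx[1])
    rng = img.max() - img.min()
    return (img - img.min()) / rng if rng else np.zeros_like(img)
```

    The resulting matrix can then be fed to any image-style convolutional architecture.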

  19. Learning Traffic as Images: A Deep Convolutional Neural Network for Large-Scale Transportation Network Speed Prediction

    PubMed Central

    Ma, Xiaolei; Dai, Zhuang; He, Zhengbing; Ma, Jihui; Wang, Yong; Wang, Yunpeng

    2017-01-01

    This paper proposes a convolutional neural network (CNN)-based method that learns traffic as images and predicts large-scale, network-wide traffic speed with high accuracy. Spatiotemporal traffic dynamics are converted to images describing the time and space relations of traffic flow via a two-dimensional time-space matrix. A CNN is applied to the image following two consecutive steps: abstract traffic feature extraction and network-wide traffic speed prediction. The effectiveness of the proposed method is evaluated by taking two real-world transportation networks, the second ring road and north-east transportation network in Beijing, as examples, and comparing the method with four prevailing algorithms, namely, ordinary least squares, k-nearest neighbors, artificial neural network, and random forest, and three deep learning architectures, namely, stacked autoencoder, recurrent neural network, and long short-term memory network. The results show that the proposed method outperforms other algorithms by an average accuracy improvement of 42.91% within an acceptable execution time. The CNN can train the model in a reasonable time and, thus, is suitable for large-scale transportation networks. PMID:28394270

  20. USSR and Eastern Europe Scientific Abstracts Geophysics, Astronomy and Space No. 404

    DTIC Science & Technology

    1977-09-01

    atmospheric circulation. A reliable linear correlation was established between the monthly fallout activity of 106Ru + 106Rh and monthly precipitation and...therefore the washing out of this radionuclide from tropospheric air by precipitation is more important for its fallout. [153] ANALYTICAL...development of some methods for predicting definite weather phenomena (such as precipitation), taking into account the evolution of the

  1. Challenges of Aircraft Design Integration

    DTIC Science & Technology

    2003-03-01

    predicted by the conceptual stick model and the full FEM of the Challenger wing without winglets. Advanced aerodynamic wing design methods To design wings...Piperni, E. Laurendeau Advanced Aerodynamics Bombardier Aerospace 400 Côte Vertu Road Dorval, Quebec, Canada, H4S 1Y9 Fassi.Kafyeke@notes.canadair.ca Tel...514) 855-7186 Abstract The design of a modern airplane brings together many disciplines: structures, aerodynamics, controls, systems, propulsion

  2. A univariate model of river water nitrate time series

    NASA Astrophysics Data System (ADS)

    Worrall, F.; Burt, T. P.

    1999-01-01

    Four time series were taken from three catchments in the North and South of England. The sites chosen included two in predominantly agricultural catchments, one at the tidal limit and one downstream of a sewage treatment works. A time series model was constructed for each of these series as a means of decomposing the elements controlling river water nitrate concentrations and to assess whether this approach could provide a simple management tool for protecting water abstractions. Autoregressive (AR) modelling of the detrended and deseasoned time series showed a "memory effect". This memory effect expressed itself as an increase in the winter-summer difference in nitrate levels that was dependent upon the nitrate concentration 12 or 6 months previously. Autoregressive moving average (ARMA) modelling showed that one of the series contained seasonal, non-stationary elements that appeared as an increasing trend in the winter-summer difference. The ARMA model was used to predict nitrate levels and predictions were tested against data held back from the model construction process - predictions gave average percentage errors of less than 10%. Empirical modelling can therefore provide a simple, efficient method for constructing management models for downstream water abstraction.
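    The AR stage of such a model can be sketched as an ordinary least-squares AR(1) fit on the detrended, deseasoned series. This is a minimal illustration of the idea only; the study fits seasonal ARMA models with 6- and 12-month lag structure:

```python
import numpy as np

def fit_ar1(x):
    """Least-squares AR(1) fit x[t] = a + b * x[t-1] for a detrended,
    deseasoned series.  Returns the coefficients and the one-step-ahead
    predictions for x[1:].
    """
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    a, b = np.linalg.lstsq(X, x[1:], rcond=None)[0]
    return (a, b), a + b * x[:-1]
```

    Held-back observations can then be compared against the one-step predictions to compute the average percentage error used for model testing.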

  3. Coupling Radar Rainfall to Hydrological Models for Water Abstraction Management

    NASA Astrophysics Data System (ADS)

    Asfaw, Alemayehu; Shucksmith, James; Smith, Andrea; MacDonald, Ken

    2015-04-01

    The impacts of climate change and growing water use are likely to put considerable pressure on water resources and the environment. In the UK, a reform to surface water abstraction policy has recently been proposed which aims to increase the efficiency of using available water resources whilst minimising impacts on the aquatic environment. Key aspects of this reform include the consideration of dynamic rather than static abstraction licensing as well as the introduction of water trading concepts. Dynamic licensing will permit varying levels of abstraction dependent on environmental conditions (i.e. river flow and quality). The practical implementation of an effective dynamic abstraction strategy requires suitable flow forecasting techniques to inform abstraction asset management. Potentially, the predicted availability of water resources within a catchment can be coupled to predicted demand and current storage to inform a cost-effective water resource management strategy which minimises environmental impacts. The aim of this work is to use a historical analysis of a UK case study catchment to compare potential water resource availability under a modelled dynamic abstraction scenario informed by a flow forecasting model against observed abstraction under a conventional abstraction regime. The work also demonstrates the impacts of modelling uncertainties on the accuracy of predicted water availability over a range of forecast lead times. The study utilised the PDM (Probability-Distributed Model), a conceptual rainfall-runoff model developed by the Centre for Ecology & Hydrology, set up in the Dove River catchment (UK) using 1 km2 resolution radar rainfall as input and 15 min resolution gauged flow data for calibration and validation. Data assimilation procedures are implemented to improve flow predictions using observed flow data.
    Uncertainties in the radar rainfall data used in the model are quantified using an artificial statistical error model described by a Gaussian distribution and propagated through the model to assess their influence on the forecast flow uncertainty. Furthermore, the effects of uncertainties at different forecast lead times on potential abstraction strategies are assessed. The results show that over a 10 year period, an average of approximately 70 ML/d of potential water is missed in the study catchment under a conventional abstraction regime. This indicates considerable potential for the use of flow forecasting models to effectively implement advanced abstraction management and more efficiently utilise available water resources in the study catchment.
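    The Gaussian error propagation described above can be sketched as a Monte Carlo loop. The error level and the toy runoff model below are illustrative assumptions, not the study's calibrated PDM setup:

```python
import numpy as np

def propagate_rainfall_error(rain_mm, runoff_model, err_sd=0.2, n=1000, seed=0):
    """Monte Carlo propagation of a multiplicative Gaussian radar-rainfall
    error through a rainfall-runoff model.  Returns the mean and standard
    deviation of the resulting flow ensemble.
    """
    rng = np.random.default_rng(seed)
    flows = np.array([runoff_model(rain_mm * rng.normal(1.0, err_sd, rain_mm.shape))
                      for _ in range(n)])
    return flows.mean(), flows.std()

# toy event-total runoff model for illustration: a 40% runoff coefficient
toy_model = lambda rain: 0.4 * rain.sum()
```

    The ensemble spread gives a direct handle on how rainfall uncertainty widens the forecast flow band at longer lead times.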

  4. Reaction of Pentanol isomers with OH radical – A theoretical perspective

    NASA Astrophysics Data System (ADS)

    Aazaad, Basheer; Lakshmipathi, Senthilkumar

    2018-05-01

    The stability of all three isomeric forms of pentanol has been examined with relative energy analysis. Even though 2-pentanol is predicted to be the most stable isomer, all three isomeric forms undergo hydrogen atom abstraction reactions with the OH radical. Among the 18 proposed hydrogen atom abstraction reactions, abstraction from the CH2 and CH functional groups is found to be favourable, with low energy barriers at the M06-2X/6-311+G(d,p) level of theory. Wiberg bond order analysis shows that all the abstraction reactions are concerted but not synchronous in nature. Using force analysis, the calculated work done in individual reaction regions illustrates that structural rearrangements drive the reaction, with a higher contribution to the energy barrier. The rate constant calculated with the M06-2X method for the most favourable reaction matches well with available experimental data. Using the reported atmospheric OH concentration (1 × 10^6 molecules/cm3), the lifetimes of 1-pentanol, 2-pentanol and 3-pentanol are calculated to be 18.66, 0.36 and 2.86 days, respectively.
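    The quoted lifetimes follow from the standard relation τ = 1/(k·[OH]). A small sketch of that conversion; the rate constant used in the example is hypothetical, not one of the paper's fitted values:

```python
def atmospheric_lifetime_days(k_oh, oh_conc=1e6):
    """Atmospheric lifetime tau = 1 / (k_OH * [OH]), converted to days.

    k_oh    : bimolecular rate constant, cm^3 molecule^-1 s^-1
    oh_conc : OH concentration, molecules cm^-3 (1e6, as in the abstract)
    """
    return 1.0 / (k_oh * oh_conc) / 86400.0

# hypothetical rate constant of 1e-12 cm^3 molecule^-1 s^-1 for illustration
example_lifetime = atmospheric_lifetime_days(1e-12)
```

    A faster reaction (larger k) gives a proportionally shorter lifetime, which is why the three isomers' lifetimes differ so strongly.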

  5. Predicting volume of distribution with decision tree-based regression methods using predicted tissue:plasma partition coefficients.

    PubMed

    Freitas, Alex A; Limbu, Kriti; Ghafourian, Taravat

    2015-01-01

    Volume of distribution is an important pharmacokinetic property that indicates the extent of a drug's distribution in the body tissues. This paper addresses the problem of how to estimate the apparent volume of distribution at steady state (Vss) of chemical compounds in the human body using decision tree-based regression methods from the area of data mining (or machine learning). Hence, the pros and cons of several different types of decision tree-based regression methods have been discussed. The regression methods predict Vss using, as predictive features, both the compounds' molecular descriptors and the compounds' tissue:plasma partition coefficients (Kt:p) - often used in physiologically-based pharmacokinetics. Therefore, this work has assessed whether the data mining-based prediction of Vss can be made more accurate by using as input not only the compounds' molecular descriptors but also (a subset of) their predicted Kt:p values. Comparison of the models that used only molecular descriptors, in particular, the Bagging decision tree (mean fold error of 2.33), with those employing predicted Kt:p values in addition to the molecular descriptors, such as the Bagging decision tree using adipose Kt:p (mean fold error of 2.29), indicated that the use of predicted Kt:p values as descriptors may be beneficial for accurate prediction of Vss using decision trees if prior feature selection is applied. Decision tree based models presented in this work have an accuracy that is reasonable and similar to the accuracy of reported Vss inter-species extrapolations in the literature. The estimation of Vss for new compounds in drug discovery will benefit from methods that are able to integrate large and varied sources of data and flexible non-linear data mining methods such as decision trees, which can produce interpretable models. Graphical Abstract: Decision trees for the prediction of tissue partition coefficient and volume of distribution of drugs.
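    The mean fold error used to compare the models has a standard geometric definition, sketched below with illustrative data:

```python
import numpy as np

def mean_fold_error(predicted, observed):
    """Geometric mean fold error: 10 ** mean(|log10(predicted / observed)|).

    A value of 2.33 means predictions deviate from observations by roughly
    2.3-fold on average, in either direction.
    """
    log_ratio = np.log10(np.asarray(predicted, float) / np.asarray(observed, float))
    return float(10 ** np.mean(np.abs(log_ratio)))
```

    Because over- and under-predictions are penalised symmetrically on the log scale, a 2-fold over-prediction and a 2-fold under-prediction both contribute equally.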

  6. Abstracts for the symposium on the Application of neural networks to the earth sciences

    USGS Publications Warehouse

    Singer, Donald A.

    2002-01-01

    Artificial neural networks are a group of mathematical methods that attempt to mimic some of the processes in the human mind. Although the foundations for these ideas were laid as early as 1943 (McCulloch and Pitts, 1943), it wasn't until 1986 (Rumelhart and McClelland, 1986; Masters, 1995) that applications to practical problems became possible. It is the acknowledged superiority of the human mind at recognizing patterns that the artificial neural networks are trying to imitate with their interconnected neurons. Interconnections used in the methods that have been developed allow robust learning. Capabilities of neural networks fall into three kinds of applications: (1) function fitting or prediction, (2) noise reduction or pattern recognition, and (3) classification or placing into types. Because of these capabilities and the powerful abilities of artificial neural networks, there have been increasing applications of these methods in the earth sciences. The abstracts in this document represent excellent samples of the range of applications. Talks associated with the abstracts were presented at the Symposium on the Application of Neural Networks to the Earth Sciences: Seventh International Symposium on Mineral Exploration (ISME–02), held August 20–21, 2002, at NASA Moffett Field, Mountain View, California. This symposium was sponsored by the Mining and Materials Processing Institute of Japan (MMIJ), the U.S. Geological Survey, the Circum-Pacific Council, and NASA. The ISME symposia have been held every two years in order to bring together scientists actively working on diverse quantitative methods applied to the earth sciences. Although the title, International Symposium on Mineral Exploration, suggests exclusive focus on mineral exploration, interests and presentations have always been wide-ranging—abstracts presented here are no exception.

  7. A link prediction method for heterogeneous networks based on BP neural network

    NASA Astrophysics Data System (ADS)

    Li, Ji-chao; Zhao, Dan-ling; Ge, Bing-Feng; Yang, Ke-Wei; Chen, Ying-Wu

    2018-04-01

    Most real-world systems, composed of different types of objects connected via many interconnections, can be abstracted as various complex heterogeneous networks. Link prediction for heterogeneous networks is of great significance for mining missing links and reconfiguring networks according to observed information, with considerable applications in, for example, friend and location recommendations and disease-gene candidate detection. In this paper, we put forward a novel integrated framework, called MPBP (Meta-Path feature-based BP neural network model), to predict multiple types of links for heterogeneous networks. More specifically, the concept of the meta-path is introduced, followed by the extraction of meta-path features for heterogeneous networks. Next, based on the extracted meta-path features, a supervised link prediction model is built with a three-layer BP neural network. Then, a solution algorithm for the proposed link prediction model is put forward to obtain predicted results by iteratively training the network. Last, numerical experiments on example datasets of a gene-disease network and a combat network are conducted to verify the effectiveness and feasibility of the proposed MPBP. The results show that MPBP performs well and is superior to the baseline methods.
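    The supervised stage, a three-layer BP (backpropagation) network over extracted features, can be sketched in NumPy. This is a generic illustration trained on a toy labelling task, not the MPBP implementation, and the meta-path feature extraction is omitted:

```python
import numpy as np

def train_bp(X, y, hidden=8, lr=0.5, epochs=5000, seed=0):
    """Train a minimal input-hidden-output sigmoid network with plain
    batch gradient descent (squared-error loss).  Returns a predictor
    mapping feature rows to link probabilities.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (X.shape[1], hidden))
    W2 = rng.normal(0.0, 1.0, (hidden, 1))
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sig(X @ W1)                        # hidden activations
        out = sig(H @ W2)                      # predicted link probability
        d_out = (out - y[:, None]) * out * (1.0 - out)
        d_hid = (d_out @ W2.T) * H * (1.0 - H)
        W2 -= lr * H.T @ d_out                 # backpropagate the error
        W1 -= lr * X.T @ d_hid
    return lambda Xq: sig(sig(Xq @ W1) @ W2).ravel()
```

    In the MPBP setting, each row of X would hold the meta-path feature counts for a candidate node pair, and y whether a link of the given type exists.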

  8. An automated procedure to identify biomedical articles that contain cancer-associated gene variants.

    PubMed

    McDonald, Ryan; Scott Winters, R; Ankuda, Claire K; Murphy, Joan A; Rogers, Amy E; Pereira, Fernando; Greenblatt, Marc S; White, Peter S

    2006-09-01

    The proliferation of biomedical literature makes it increasingly difficult for researchers to find and manage relevant information. However, identifying research articles containing mutation data, a requisite first step in integrating large and complex mutation data sets, is currently tedious, time-consuming and imprecise. More effective mechanisms for identifying articles containing mutation information would be beneficial both for the curation of mutation databases and for individual researchers. We developed an automated method that uses information extraction, classifier, and relevance ranking techniques to determine the likelihood of MEDLINE abstracts containing information regarding genomic variation data suitable for inclusion in mutation databases. We targeted the CDKN2A (p16) gene and the procedure for document identification currently used by CDKN2A Database curators as a measure of feasibility. A set of abstracts was manually identified from a MEDLINE search as potentially containing specific CDKN2A mutation events. A subset of these abstracts was used as a training set for a maximum entropy classifier to identify text features distinguishing "relevant" from "not relevant" abstracts. Each document was represented as a set of indicative word, word pair, and entity tagger-derived genomic variation features. When applied to a test set of 200 candidate abstracts, the classifier predicted 88 articles as being relevant; of these, 29 of 32 manuscripts in which manual curation found CDKN2A sequence variants were positively predicted. Thus, the set of potentially useful articles that a manual curator would have to review was reduced by 56%, maintaining 91% recall (sensitivity) and more than doubling precision (positive predictive value). 
Subsequent expansion of the training set to 494 articles yielded similar precision and recall rates, and comparison of the original and expanded trials demonstrated that the average precision improved with the larger data set. Our results show that automated systems can effectively identify article subsets relevant to a given task and may prove to be powerful tools for the broader research community. This procedure can be readily adapted to any or all genes, organisms, or sets of documents. Published 2006 Wiley-Liss, Inc.
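    The retrieval figures reported above can be checked against the standard definitions of precision and recall; a quick sketch using the counts quoted in the abstract:

```python
def precision_recall(tp, fp, fn):
    """Precision = tp / (tp + fp); recall (sensitivity) = tp / (tp + fn)."""
    return tp / (tp + fp), tp / (tp + fn)

# counts from the abstract: 88 abstracts predicted relevant, of which 29
# of the 32 truly relevant articles were captured
precision, recall = precision_recall(tp=29, fp=88 - 29, fn=32 - 29)
baseline_precision = 32 / 200  # reviewing all 200 candidates by hand
```

    This gives recall 29/32 ≈ 0.91 and precision 29/88 ≈ 0.33, roughly double the 16% baseline of reviewing all 200 candidates, consistent with the figures quoted.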

  9. The Minimal Control Principle Predicts Strategy Shifts in the Abstract Decision Making Task

    ERIC Educational Resources Information Center

    Taatgen, Niels A.

    2011-01-01

    The minimal control principle (Taatgen, 2007) predicts that people strive for problem-solving strategies that require as few internal control states as possible. In an experiment with the Abstract Decision Making task (ADM task; Joslyn & Hunt, 1998) the reward structure was manipulated to make either a low-control strategy or a high-strategy…

  10. Process improvement: a multi-registry database abstraction success story.

    PubMed

    Abrich, Victor; Rokey, Roxann; Devadas, Christopher; Uebel, Julie

    2014-01-01

    The St. Joseph Hospital/Marshfield Clinic Cardiac Database Registry submits data to the National Cardiovascular Data Registry (NCDR) and to the Society of Thoracic Surgeons (STS) National Database. Delayed chart abstraction is problematic, since hospital policy prohibits patient care clarifications made to the medical record more than 1 month after hospital discharge. This can also lead to late identification of missed care opportunities and untimely notification to providers. Our institution was 3.5 months behind in retrospective postdischarge case abstraction. A process improvement plan was implemented to shorten this delay to 1 month postdischarge. Daily demand of incoming cases and abstraction capacity were determined for 4 employees. Demand was matched to capacity, with the remaining time allocated to reducing backlog. Daily demand of new cases was 17.1 hours. Daily abstraction capacity was 24 hours, assuming 6 hours of effective daily abstraction time per employee, leaving 7 hours per day for backlogged case abstraction. The predicted time to reach abstraction target was 10 weeks. This was accomplished after 10 weeks, as predicted, leading to a 60% reduction of backlogged cases. The delay of postdischarge chart abstraction was successfully shortened from 3.5 months to 1 month. We intend to maintain same-day abstraction efficiency without reaccumulating substantial backlog.
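    The scheduling arithmetic above generalises to a simple capacity formula. A sketch using the article's daily figures and a hypothetical backlog size (the actual backlog hours are not stated in the abstract):

```python
def weeks_to_clear(backlog_hours, daily_capacity, daily_demand, workdays=5):
    """Weeks to clear a chart-abstraction backlog when the daily surplus
    (capacity minus incoming demand) is applied to old cases each workday.
    """
    weekly_surplus = (daily_capacity - daily_demand) * workdays
    return backlog_hours / weekly_surplus

# with 24 h/day capacity and 17.1 h/day demand, a hypothetical 345-hour
# backlog clears in about 10 weeks
```

    The same formula shows why matching demand to capacity first is essential: with no surplus, the backlog never shrinks.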

  11. Evaluation of free modeling targets in CASP11 and ROLL.

    PubMed

    Kinch, Lisa N; Li, Wenlin; Monastyrskyy, Bohdan; Kryshtafovych, Andriy; Grishin, Nick V

    2016-09-01

    We present an assessment of 'template-free modeling' (FM) in CASP11 and ROLL. Community-wide server performance suggested that the use of automated scores similar to those of previous CASPs would provide a good system for evaluating performance, even in the absence of comprehensive manual assessment. The CASP11 FM category included several outstanding examples, including successful prediction by the Baker group of a 256-residue target (T0806-D1) that lacked sequence similarity to any existing template. The top server model prediction by Zhang's Quark, which was apparently selected and refined by several manual groups, encompassed the entire fold of target T0837-D1. Methods from the same two groups tended to dominate overall CASP11 FM and ROLL rankings. Comparison of top FM predictions with those from the previous CASP experiment revealed progress in the category, particularly reflected in high prediction accuracy for larger protein domains. FM prediction models for two cases were sufficient to provide functional insights that were otherwise not obtainable by traditional sequence analysis methods. Importantly, CASP11 abstracts revealed that alignment-based contact prediction methods brought about much of the CASP11 progress, producing both of the functionally relevant models as well as several of the other outstanding structure predictions. These methodological advances enabled de novo modeling of much larger domain structures than was previously possible and allowed prediction of functional sites. Proteins 2016; 84(Suppl 1):51-66. © 2015 Wiley Periodicals, Inc.

  12. Nonpolitical Images Evoke Neural Predictors of Political Ideology

    PubMed Central

    Ahn, Woo-Young; Kishida, Kenneth T.; Gu, Xiaosi; Lohrenz, Terry; Harvey, Ann; Alford, John R.; Smith, Kevin B.; Yaffe, Gideon; Hibbing, John R.; Dayan, Peter; Montague, P. Read

    2014-01-01

    Political ideologies summarize dimensions of life that define how a person organizes their public and private behavior, including their attitudes associated with sex, family, education, and personal autonomy [1, 2]. Despite the abstract nature of such sensibilities, fundamental features of political ideology have been found to be deeply connected to basic biological mechanisms [3–7] that may serve to defend against environmental challenges like contamination and physical threat [8–12]. These results invite the provocative claim that neural responses to nonpolitical stimuli (like contaminated food or physical threats) should be highly predictive of abstract political opinions (like attitudes toward gun control and abortion) [13]. We applied a machine-learning method to fMRI data to test the hypothesis that brain responses to emotionally evocative images predict individual scores on a standard political ideology assay. Disgusting images, especially those related to animal-reminder disgust (e.g., a mutilated body), generate neural responses that are highly predictive of political orientation even though these neural predictors do not agree with participants’ conscious ratings of the stimuli. Images from other affective categories do not support such predictions. Remarkably, brain responses to a single disgusting stimulus were sufficient to make accurate predictions about an individual subject’s political ideology. These results provide strong support for the idea that fundamental neural processing differences that emerge under the challenge of emotionally evocative stimuli may serve to structure political beliefs in ways formerly unappreciated. PMID:25447997

  13. On Predicting the Crystal Structure of Energetic Materials From Quantum Mechanics

    DTIC Science & Technology

    2008-12-01

    ABSTRACT A quantum-mechanically-based potential energy function that describes interactions of dimers of the explosive ...method is capable of producing force fields for interactions of the molecular crystalline explosive RDX, and appears to be suitable to enable reliable...Ridge, TN. Byrd, E.F.C., Scuseria, G.E., Chabalowski, C.F., 2004: “An ab initio study of solid nitromethane, HMX, RDX and CL20: Successes and

  14. A Comparison of Manual Scaled and Predicted foE and foF1 Critical Frequencies

    DTIC Science & Technology

    1990-07-01

    Statistics for the Ionograms Studied 1.0 INTRODUCTION The ARTIST autoscaling routines use a predicted foE to determine a range to search for the...recommendations are made to help improve autoscaling....to estimate foE. In the ARTIST, the predicted foE is the CCIR model described in the CCIR Supplement Report 252-2. We have also tested a foE

  15. The Exciting World of Binary Stars: Not Just Eclipses Anymore (Abstract)

    NASA Astrophysics Data System (ADS)

    Pablo, B.

    2018-06-01

    (Abstract only) Binary stars have always been essential to astronomy. Their periodic eclipses are the most common and efficient method for determining precise masses and radii of stars. Binaries are known for their predictability and have been observed for hundreds if not thousands of years. As such, they are often ignored by observers as uninteresting; however, nothing could be farther from the truth. In the last ten years alone, the importance of binary stars, as well as our knowledge of them, has changed significantly. In this talk, I will introduce you to this new frontier of heartbeats, mergers, and evolution, while hopefully motivating a change in the collective thinking of how this unique class of objects is viewed. Most importantly,

  16. A theoretical and shock tube kinetic study on hydrogen abstraction from phenyl formate.

    PubMed

    Ning, Hongbo; Liu, Dapeng; Wu, Junjun; Ma, Liuhao; Ren, Wei; Farooq, Aamir

    2018-06-12

    The hydrogen abstraction reactions of phenyl formate (PF) by different radicals (H/O(3P)/OH/HO2) were theoretically investigated. We calculated the reaction energetics for PF + H/O/OH using the composite method ROCBS-QB3//M06-2X/cc-pVTZ and those for PF + HO2 at the M06-2X/cc-pVTZ level of theory. The high-pressure-limit rate constants were calculated using transition state theory in conjunction with the 1-D hindered rotor approximation and tunneling correction. Three-parameter Arrhenius expressions of the rate constants were provided over the temperature range of 500-2000 K. To validate the theoretical calculations, the overall rate constants of PF + OH → Products were measured in shock tube experiments at 968-1128 K and 1.16-1.25 atm using OH laser absorption. The predicted overall rate constants agree well with the shock tube data (within 15%) across the entire range of experimental conditions. Rate constant analysis indicates that H-abstraction at the formic acid site dominates PF consumption, whereas the contribution of H-abstractions at the aromatic ring increases with temperature. Additionally, site-specific H-abstractions from PF were compared with those from methyl formate, ethyl formate, benzene, and toluene to understand the effects of the aromatic ring and side-chain substituent on H-abstraction rate constants.
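
    The three-parameter (modified) Arrhenius form fitted in studies like this, k(T) = A·T^n·exp(−Ea/RT), is straightforward to evaluate numerically. The sketch below uses placeholder parameters for illustration only, not the paper's fitted PF + OH values:

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def arrhenius(T, A, n, Ea):
    """Modified Arrhenius rate constant k(T) = A * T**n * exp(-Ea / (R*T))."""
    return A * T**n * math.exp(-Ea / (R * T))

# Placeholder parameters for illustration only (not the fitted PF + OH values).
A, n, Ea = 1.0e6, 2.0, 2.5  # pre-factor, temperature exponent, kcal/mol

k_1000 = arrhenius(1000.0, A, n, Ea)
k_500 = arrhenius(500.0, A, n, Ea)
# Abstraction rate constants grow with temperature.
assert k_1000 > k_500
```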

  17. CisMapper: predicting regulatory interactions from transcription factor ChIP-seq data

    PubMed Central

    O'Connor, Timothy; Bodén, Mikael

    2017-01-01

    Abstract Identifying the genomic regions and regulatory factors that control the transcription of genes is an important, unsolved problem. The current method of choice predicts transcription factor (TF) binding sites using chromatin immunoprecipitation followed by sequencing (ChIP-seq), and then links the binding sites to putative target genes solely on the basis of the genomic distance between them. Evidence from chromatin conformation capture experiments shows that this approach is inadequate due to long-distance regulation via chromatin looping. We present CisMapper, which predicts the regulatory targets of a TF using the correlation between a histone mark at the TF's bound sites and the expression of each gene across a panel of tissues. Using both chromatin conformation capture and differential expression data, we show that CisMapper is more accurate at predicting the target genes of a TF than the distance-based approaches currently used, and is particularly advantageous for predicting the long-range regulatory interactions typical of tissue-specific gene expression. CisMapper also predicts which TF binding sites regulate a given gene more accurately than using genomic distance. Unlike distance-based methods, CisMapper can predict which transcription start site of a gene is regulated by a particular binding site of the TF. PMID:28204599
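
    CisMapper's core signal, as described above, is the correlation between a histone mark at a TF's bound sites and gene expression across a tissue panel. A minimal sketch of that idea, with entirely hypothetical signal and expression values:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical histone-mark signal at one TF binding site across 5 tissues,
# and expression of two candidate target genes across the same tissues.
site_signal = [2.0, 8.5, 1.0, 6.0, 9.0]
gene_a_expr = [1.5, 7.0, 0.5, 5.5, 8.0]   # tracks the site: likely target
gene_b_expr = [5.0, 1.0, 6.0, 2.0, 1.5]   # anti-correlated: unlikely target

# The candidate with the higher correlation is predicted as the target.
assert pearson(site_signal, gene_a_expr) > pearson(site_signal, gene_b_expr)
```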

  18. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges.

    PubMed

    Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E

    2017-06-14

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black-box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis and that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider the problem of predicting mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods, including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction to the diffuse field of machine learning for those working on risk modelling. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.

  19. Sixty-five years of the long march in protein secondary structure prediction: the final stretch?

    PubMed Central

    Yang, Yuedong; Gao, Jianzhao; Wang, Jihua; Heffernan, Rhys; Hanson, Jack; Paliwal, Kuldip; Zhou, Yaoqi

    2018-01-01

    Abstract Protein secondary structure prediction began in 1951 when Pauling and Corey predicted helical and sheet conformations for the protein polypeptide backbone even before the first protein structure was determined. Sixty-five years later, powerful new methods breathe new life into this field. The highest three-state accuracy without relying on structure templates is now at 82–84%, a number unthinkable just a few years ago. These improvements came from increasingly larger databases of protein sequences and structures for training, the use of template secondary structure information and more powerful deep learning techniques. As we approach the theoretical limit of three-state prediction (88–90%), alternatives to secondary structure prediction (prediction of backbone torsion angles and Cα-atom-based angles and torsion angles) not only have more room for further improvement but also allow direct prediction of three-dimensional fragment structures with steadily improving accuracy. About 20% of all 40-residue fragments in a database of 1199 non-redundant proteins have <6 Å root-mean-squared distance from the native conformations as predicted by SPIDER2. More powerful deep learning methods with improved capability of capturing long-range interactions are beginning to emerge as the next generation of techniques for secondary structure prediction. The time has come to finish off the final stretch of the long march towards protein secondary structure prediction. PMID:28040746
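
    The three-state (Q3) accuracy quoted above (82–84%, against an 88–90% theoretical limit) is simply the fraction of residues assigned the correct helix/strand/coil state. A minimal sketch on an invented ten-residue example (real evaluations first reduce DSSP's eight states to three):

```python
def q3_accuracy(predicted, observed):
    """Three-state (helix H, strand E, coil C) per-residue accuracy."""
    assert len(predicted) == len(observed)
    matches = sum(p == o for p, o in zip(predicted, observed))
    return matches / len(observed)

# Hypothetical 10-residue example (not from the reviewed benchmarks).
obs = "HHHHEEECCC"
pred = "HHHHEEECCH"
print(q3_accuracy(pred, obs))  # 0.9
```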

  20. Predicting plant biomass accumulation from image-derived parameters

    PubMed Central

    Chen, Dijun; Shi, Rongli; Pape, Jean-Michel; Neumann, Kerstin; Graner, Andreas; Chen, Ming; Klukas, Christian

    2018-01-01

    Abstract Background Image-based high-throughput phenotyping technologies have been rapidly developed in plant science recently, and they offer great potential to gain more valuable information than traditional destructive methods. Predicting plant biomass is regarded as a key goal for plant breeders and ecologists. However, it is a great challenge to find a biomass model that is predictive across experiments. Results In the present study, we constructed 4 predictive models to examine the quantitative relationship between image-based features and plant biomass accumulation. Our methodology has been applied to 3 consecutive barley (Hordeum vulgare) experiments with control and stress treatments. The results showed that plant biomass can be accurately predicted from image-based parameters using a random forest model. The high prediction accuracy of this model will contribute to relieving the phenotyping bottleneck in biomass measurement in breeding applications. The prediction performance remains relatively high across experiments under similar conditions. The relative contribution of individual features for predicting biomass was further quantified, revealing new insights into the phenotypic determinants of the plant biomass outcome. Furthermore, these methods could also be used to determine the most important image-based features related to plant biomass accumulation, which would be promising for subsequent genetic mapping to uncover the genetic basis of biomass. Conclusions We have developed quantitative models to accurately predict plant biomass accumulation from image data. We anticipate that the analysis results will be useful in advancing our views of the phenotypic determinants of plant biomass outcome, and that the statistical methods can be broadly applied to other plant species. PMID:29346559
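
    A random forest regressor of the kind used here can be sketched, in miniature, as bootstrap-aggregated regression stumps. The feature names, coefficients, and data below are invented for illustration; real forests use deep trees and random feature subsets:

```python
import random

def fit_stump(data):
    """Exhaustively choose the (feature, threshold) split minimising squared error."""
    best, best_err = None, float("inf")
    n_features = len(data[0][0])
    for f in range(n_features):
        for x, _ in data:
            t = x[f]
            left = [y for xi, y in data if xi[f] <= t]
            right = [y for xi, y in data if xi[f] > t]
            if not left or not right:
                continue
            lm = sum(left) / len(left)
            rm = sum(right) / len(right)
            err = sum((y - lm) ** 2 for y in left) + sum((y - rm) ** 2 for y in right)
            if err < best_err:
                best, best_err = (f, t, lm, rm), err
    return best

def forest_predict(stumps, x):
    """Average the stump predictions (bagging)."""
    preds = [(lm if x[f] <= t else rm) for f, t, lm, rm in stumps]
    return sum(preds) / len(preds)

random.seed(0)
# Hypothetical training data: (projected leaf area, plant height) -> biomass.
train = [((a, h), 0.8 * a + 0.3 * h) for a in range(1, 9) for h in range(1, 6)]
stumps = []
for _ in range(25):
    boot = [random.choice(train) for _ in train]  # bootstrap resample
    stumps.append(fit_stump(boot))

# A plant with larger image-derived features gets a larger predicted biomass.
assert forest_predict(stumps, (8, 5)) > forest_predict(stumps, (1, 1))
```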

  1. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis

    PubMed Central

    Gong, Xiajing; Hu, Meng

    2018-01-01

    Abstract Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time‐to‐event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high‐dimensional data featured by a large number of predictor variables. Our results showed that ML‐based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high‐dimensional data. The prediction performances of ML‐based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML‐based methods provide a powerful tool for time‐to‐event analysis, with a built‐in capacity for high‐dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
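
    The concordance index used above to compare the ML methods with the Cox model can be computed directly from pairwise comparisons of predicted risk and observed survival; the toy data are hypothetical:

```python
def concordance_index(times, events, risks):
    """Harrell's c-index: fraction of comparable pairs ordered correctly.

    times: observed follow-up times; events: 1 if the event was observed,
    0 if censored; risks: predicted risk scores (higher -> earlier event).
    """
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable if subject i had an observed event before t_j.
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / comparable

# Toy data: higher predicted risk corresponds to shorter survival.
times = [2, 4, 6, 8]
events = [1, 1, 0, 1]
risks = [0.9, 0.7, 0.4, 0.2]
print(concordance_index(times, events, risks))  # 1.0 on perfectly ordered data
```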

  2. High-level theoretical characterization of the vinoxy radical (•CH2CHO) + O2 reaction

    NASA Astrophysics Data System (ADS)

    Weidman, Jared D.; Allen, Ryan T.; Moore, Kevin B.; Schaefer, Henry F.

    2018-05-01

    Numerous processes in atmospheric and combustion chemistry produce the vinoxy radical (•CH2CHO). To understand the fate of this radical and to provide reliable energies needed for kinetic modeling of such processes, we have examined its reaction with O2 using highly reliable theoretical methods. Utilizing the focal point approach, the energetics of this reaction and subsequent reactions were obtained using coupled-cluster theory with single, double, and perturbative triple excitations [CCSD(T)] extrapolated to the complete basis set limit. These extrapolated energies were appended with several corrections including a treatment of full triples and connected quadruple excitations, i.e., CCSDT(Q). In addition, this study models the initial vinoxy radical + O2 reaction for the first time with multireference methods. We predict a barrier for this reaction of approximately 0.4 kcal mol-1. This result agrees with experimental findings but is in disagreement with previous theoretical studies. The vinoxy radical + O2 reaction produces a 2-oxoethylperoxy radical which can undergo a number of unimolecular reactions. Abstraction of a β-hydrogen (a 1,4-hydrogen shift) and dissociation back to reactants are predicted to be competitive to each other due to their similar barriers of 21.2 and 22.3 kcal mol-1, respectively. The minimum-energy β-hydrogen abstraction pathway produces a hydroperoxy radical (QOOH) that eventually decomposes to formaldehyde, CO, and •OH. Two other unimolecular reactions of the peroxy radical are α-hydrogen abstraction (38.7 kcal mol-1 barrier) and HO2• elimination (43.5 kcal mol-1 barrier). These pathways lead to glyoxal + •OH and ketene + HO2• formation, respectively, but they are expected to be uncompetitive due to their high barriers.
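
    The claim that the 21.2 and 22.3 kcal/mol channels are competitive follows from the small barrier difference: assuming similar pre-exponential factors (a rough sketch, not the paper's kinetic model), the rate ratio is roughly exp(−ΔEa/RT):

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def rate_ratio(ea1, ea2, T):
    """Ratio k1/k2 assuming equal pre-exponential factors (rough sketch only)."""
    return math.exp(-(ea1 - ea2) / (R * T))

# Barriers from the study: 21.2 (beta-H shift) vs 22.3 kcal/mol (re-dissociation).
r_1000 = rate_ratio(21.2, 22.3, 1000.0)
r_300 = rate_ratio(21.2, 22.3, 300.0)

# A 1.1 kcal/mol difference gives less than a factor of ~2 at 1000 K,
# which is why the two channels are described as competitive.
assert 1.0 < r_1000 < 2.0
assert r_300 > r_1000  # the lower barrier wins by more at lower temperature
```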

  3. Uncertainty in Predicted Neighborhood-Scale Green Stormwater Infrastructure Performance Informed by field monitoring of Hydrologic Abstractions

    NASA Astrophysics Data System (ADS)

    Smalls-Mantey, L.; Jeffers, S.; Montalto, F. A.

    2013-12-01

    Human alterations to the environment provide infrastructure for housing and transportation but have drastically changed local hydrology. Excess stormwater runoff from impervious surfaces generates erosion, overburdens sewer infrastructure, and can pollute receiving water bodies. Increased attention to green stormwater management controls is based on the premise that some of these issues can be mitigated by capturing or slowing the flow of stormwater. However, our ability to predict actual green infrastructure facility performance using physical or statistical methods needs additional validation, and efforts to incorporate green infrastructure controls into hydrologic models are still in their infancy. We use more than three years of field monitoring data to derive facility-specific probability density functions characterizing the hydrologic abstractions provided by a stormwater treatment wetland, a streetside bioretention facility, and a green roof. The monitoring results are normalized by impervious area treated and incorporated into a neighborhood-scale agent-based model allowing probabilistic comparisons of the stormwater capture outcomes associated with alternative urban greening scenarios. Specifically, we compare the uncertainty introduced into the model by facility performance (as represented by the variability in the abstraction) to that introduced by both precipitation variability and the spatial patterns of emergence of different types of green infrastructure. The modeling results are used to update a discussion about the potential effectiveness of urban green infrastructure implementation plans.
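
    The probabilistic comparison described above can be sketched as a Monte Carlo simulation that samples each facility's abstraction from a fitted distribution. The distributions, treated areas, and storm depth below are hypothetical stand-ins for the monitored values:

```python
import random
import statistics

random.seed(1)

# Hypothetical per-event capture fractions (dimensionless) for three facility
# types, modeled here as triangular distributions for illustration only.
facility_pdfs = {
    "wetland":      lambda: random.triangular(0.5, 1.0, 0.9),
    "bioretention": lambda: random.triangular(0.3, 0.9, 0.6),
    "green_roof":   lambda: random.triangular(0.2, 0.8, 0.5),
}

# Impervious area (m^2) treated by each facility type in one greening scenario.
scenario = {"wetland": 400.0, "bioretention": 250.0, "green_roof": 150.0}

def simulate_captured_volume(rain_mm, n_runs=2000):
    """Monte Carlo: sample each facility's abstraction, sum captured volume (L)."""
    totals = []
    for _ in range(n_runs):
        captured = sum(area * rain_mm * facility_pdfs[kind]()
                       for kind, area in scenario.items())
        totals.append(captured)  # m^2 * mm = litres
    return totals

runs = simulate_captured_volume(rain_mm=10.0)
mean = statistics.mean(runs)
spread = statistics.stdev(runs)
# The spread across runs quantifies uncertainty from facility performance.
assert 0 < spread < mean
```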

  4. The Use of Satellite Observed Cloud Patterns in Northern Hemisphere 300 mb and 1000/300 mb Numerical Analysis.

    DTIC Science & Technology

    1984-02-01

    Keywords: prediction, extratropical cyclones, objective analysis, bogus techniques. A quasi-objective statistical method for deriving 300 mb geopotential heights and 1000/300 mb thicknesses in the vicinity of extratropical cyclones with the aid of satellite imagery is presented. The technique utilizes satellite observed extratropical spiral cloud pattern parameters in conjunction

  5. Multi-Entity Bayesian Networks Learning in Predictive Situation Awareness

    DTIC Science & Technology

    2013-06-01

    ...algorithm for MEBN. The methods are evaluated on a case study from PROGNOS. Over the past two decades, machine learning has...the MFrag of the child node. Lastly, in the third For-Loop, for all resident nodes in the MTheory, LPDs are generated by MLE.

  6. Frontotemporal Functional Connectivity and Executive Functions Contribute to Episodic Memory Performance

    PubMed Central

    Blankenship, Tashauna L.; O'Neill, Meagan; Deater-Deckard, Kirby; Diana, Rachel A.; Bell, Martha Ann

    2016-01-01

    The contributions of hemispheric-specific electrophysiology (electroencephalogram or EEG) and independent executive functions (inhibitory control, working memory, cognitive flexibility) to episodic memory performance were examined using abstract paintings. Right hemisphere frontotemporal functional connectivity during encoding and retrieval, measured via EEG alpha coherence, statistically predicted performance on recency but not recognition judgments for the abstract paintings. Theta coherence, however, did not predict performance. Likewise, cognitive flexibility statistically predicted performance on recency judgments, but not recognition. These findings suggest that recognition and recency operate via separate electrophysiological and executive mechanisms. PMID:27388478

  7. Aggregating and Predicting Sequence Labels from Crowd Annotations

    PubMed Central

    Nguyen, An T.; Wallace, Byron C.; Li, Junyi Jessy; Nenkova, Ani; Lease, Matthew

    2017-01-01

    Despite sequences being core to NLP, scant work has considered how to handle noisy sequence labels from multiple annotators for the same text. Given such annotations, we consider two complementary tasks: (1) aggregating sequential crowd labels to infer a best single set of consensus annotations; and (2) using crowd annotations as training data for a model that can predict sequences in unannotated text. For aggregation, we propose a novel Hidden Markov Model variant. To predict sequences in unannotated text, we propose a neural approach using Long Short Term Memory. We evaluate a suite of methods across two different applications and text genres: Named-Entity Recognition in news articles and Information Extraction from biomedical abstracts. Results show improvement over strong baselines. Our source code and data are available online. PMID:29093611
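
    A token-level majority vote is the natural baseline that the paper's HMM variant improves on, since it ignores annotator reliability and label-transition structure. A minimal sketch (the BIO tags and annotator data are invented for illustration):

```python
from collections import Counter

def aggregate_majority(annotations):
    """Token-level majority vote over multiple annotators' label sequences.

    annotations: list of equal-length label sequences, one per annotator.
    """
    aggregated = []
    for position_labels in zip(*annotations):
        label, _ = Counter(position_labels).most_common(1)[0]
        aggregated.append(label)
    return aggregated

# Three annotators labelling the same 5-token sentence with BIO NER tags.
crowd = [
    ["B-PER", "I-PER", "O", "O",     "B-LOC"],
    ["B-PER", "O",     "O", "O",     "B-LOC"],
    ["B-PER", "I-PER", "O", "B-LOC", "B-LOC"],
]
print(aggregate_majority(crowd))
# ['B-PER', 'I-PER', 'O', 'O', 'B-LOC']
```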

  8. Biological and functional relevance of CASP predictions

    PubMed Central

    Liu, Tianyun; Ish‐Shalom, Shirbi; Torng, Wen; Lafita, Aleix; Bock, Christian; Mort, Matthew; Cooper, David N; Bliven, Spencer; Capitani, Guido; Mooney, Sean D.

    2017-01-01

    Abstract Our goal is to answer the question: compared with experimental structures, how useful are predicted models for functional annotation? We assessed the functional utility of predicted models by comparing the performances of a suite of methods for functional characterization on the predictions and the experimental structures. We identified 28 sites in 25 protein targets to perform functional assessment. These 28 sites included nine sites with known ligand binding (holo-sites), nine sites that are expected or suggested by experimental authors for small molecule binding (apo-sites), and ten sites containing important motifs, loops, or key residues with important disease-associated mutations. We evaluated the utility of the predictions by comparing their microenvironments to the experimental structures. Overall structural quality correlates with functional utility. However, the best-ranked predictions (global) may not have the best functional quality (local). Our assessment provides an ability to discriminate between predictions with high structural quality. When assessing ligand-binding sites, most prediction methods have higher performance on apo-sites than holo-sites. Some servers show consistently high performance for certain types of functional sites. Finally, many functional sites are associated with protein-protein interaction. We also analyzed biologically relevant features from the protein assemblies of two targets where the active site spanned the protein-protein interface. For the assembly targets, we find that the features in the models are mainly determined by the choice of template. PMID:28975675

  9. Abstract conceptual feature ratings predict gaze within written word arrays: evidence from a Visual Wor(l)d paradigm

    PubMed Central

    Primativo, Silvia; Reilly, Jamie; Crutch, Sebastian J

    2016-01-01

    The Abstract Conceptual Feature (ACF) framework predicts that word meaning is represented within a high-dimensional semantic space bounded by weighted contributions of perceptual, affective, and encyclopedic information. The ACF, like latent semantic analysis, is amenable to distance metrics between any two words. We applied predictions of the ACF framework to abstract words using eye tracking via an adaptation of the classical ‘visual word paradigm’. Healthy adults (N=20) selected the lexical item most related to a probe word in a 4-item written word array comprising the target and three distractors. The relation between the probe and each of the four words was determined using the semantic distance metrics derived from ACF ratings. Eye-movement data indicated that the word that was most semantically related to the probe received more and longer fixations relative to distractors. Importantly, in sets where participants did not provide an overt behavioral response, the fixation rates were none the less significantly higher for targets than distractors, closely resembling trials where an expected response was given. Furthermore, ACF ratings which are based on individual words predicted eye fixation metrics of probe-target similarity at least as well as latent semantic analysis ratings which are based on word co-occurrence. The results provide further validation of Euclidean distance metrics derived from ACF ratings as a measure of one facet of the semantic relatedness of abstract words and suggest that they represent a reasonable approximation of the organization of abstract conceptual space. The data are also compatible with the broad notion that multiple sources of information (not restricted to sensorimotor and emotion information) shape the organization of abstract concepts. 
Whilst the adapted ‘visual word paradigm’ is potentially a more metacognitive task than the classical visual world paradigm, we argue that it offers potential utility for studying abstract word comprehension. PMID:26901571
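
    The distance computation underlying these predictions is straightforward: each word is a vector of ACF ratings, and the predicted fixation target is the array word nearest the probe in that space. The vectors below are invented for illustration, not actual ACF norms:

```python
import math

def euclidean(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Hypothetical ACF-style feature vectors (e.g., ratings on perceptual,
# affective, and encyclopedic dimensions), for illustration only.
ratings = {
    "justice": (1.2, 3.5, 4.8),
    "law":     (1.4, 3.1, 4.6),   # near "justice" in the semantic space
    "banana":  (4.9, 1.0, 1.2),
    "sleep":   (3.0, 2.2, 2.0),
}

def closest_to(probe, candidates):
    """Predict the array word receiving most fixations: the nearest neighbour."""
    return min(candidates, key=lambda w: euclidean(ratings[probe], ratings[w]))

print(closest_to("justice", ["law", "banana", "sleep"]))  # law
```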

  10. Abstracting Dance: Detaching Ourselves from the Habitual Perception of the Moving Body.

    PubMed

    Aviv, Vered

    2017-01-01

    This work explores to what extent the notion of abstraction in dance is valid and what it entails. Unlike abstraction in the fine arts that aims for a certain independence from representation of the external world through the use of non-figurative elements, dance is realized by a highly familiar object - the human body. In fact, we are all experts in recognizing the human body. For instance, we can mentally reconstruct its motion from minimal information (e.g., via a "dot display"), predict body trajectory during movement and identify emotional expressions of the body. Nonetheless, despite the presence of a human dancer on stage and our extreme familiarity with the human body, the process of abstraction is applicable also to dance. Abstract dance removes itself from familiar daily movements, violates the observer's predictions about future movements and detaches itself from narratives. In so doing, abstract dance exposes the observer to perceptions of unfamiliar situations, thus paving the way to new interpretations of human motion and hence to perceiving ourselves differently in both the physical and emotional domains.

  11. Abstracting Dance: Detaching Ourselves from the Habitual Perception of the Moving Body

    PubMed Central

    Aviv, Vered

    2017-01-01

    This work explores to what extent the notion of abstraction in dance is valid and what it entails. Unlike abstraction in the fine arts that aims for a certain independence from representation of the external world through the use of non-figurative elements, dance is realized by a highly familiar object – the human body. In fact, we are all experts in recognizing the human body. For instance, we can mentally reconstruct its motion from minimal information (e.g., via a “dot display”), predict body trajectory during movement and identify emotional expressions of the body. Nonetheless, despite the presence of a human dancer on stage and our extreme familiarity with the human body, the process of abstraction is applicable also to dance. Abstract dance removes itself from familiar daily movements, violates the observer’s predictions about future movements and detaches itself from narratives. In so doing, abstract dance exposes the observer to perceptions of unfamiliar situations, thus paving the way to new interpretations of human motion and hence to perceiving ourselves differently in both the physical and emotional domains. PMID:28559871

  12. Water Breakthrough Pressure of Cotton Fabrics Treated with Fluorinated Silsesquioxane / Fluoroelastomer Coatings (Preprint)

    DTIC Science & Technology

    2012-10-01

    Clearance Date: 7/20/2012. Breakthrough pressure is an important parameter associated with the performance of water-resistant fabrics... predicted values based on the geometry of the samples and the surface energy of the components. The theoretical predictions, however, do not explain...

  13. When will I see you again? The fate of research findings from international wound care conferences*.

    PubMed

    Dumville, Jo C; Petherick, Emily S; Cullum, Nicky

    2008-03-01

    Medical conferences provide a forum for the rapid dissemination of research directly to health professionals and academics. However, the published record of poster and oral presentations from these meetings is usually limited to abstracts. We aimed to assess how many wound studies presented as conference abstracts were eventually published in journals and to identify the factors that predicted publication. The study was a retrospective review. We identified abstracts relating to oral and poster presentations from two large wound conferences. Following data extraction from the abstracts, a systematic search was conducted to examine whether the research was subsequently published as a journal article. A time-to-event analysis was conducted to assess predictive associations between features of the research reported in the conference abstracts and time to full publication. In total, 492 abstracts from two European wound care conferences were identified (467 after exclusions). Of the abstracts included, 60% (279) were for posters and 40% (188) were for oral presentations. Over half of the abstracts (53%) reported results from case study or case series designs. In total, only 57 (12%) of the included abstracts resulted in a related publication. Analysis suggested that studies reporting positive findings were significantly more likely to be published (hazard ratio 1.79, P = 0.001, 95% CI 1.26-2.55). Few studies presented as conference abstracts at these two wound conferences were subsequently published. This may be because of the low methodological quality of studies accepted for poster or oral presentation.
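
    The time-to-event analysis described above rests on standard survival-analysis machinery. A minimal Kaplan-Meier estimator shows how "time to full publication" is summarized when some abstracts are censored (never published by the search date); the follow-up data below are hypothetical:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve as a list of (time, S(t)) steps.

    times: months from conference presentation to journal publication,
           or to the end of the search window if never published;
    events: 1 if the abstract reached full publication, 0 if censored.
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        published = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        censored = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 0)
        if published:
            surv *= (at_risk - published) / at_risk
            curve.append((t, surv))
        at_risk -= published + censored
    return curve

# Hypothetical follow-up of six abstracts (months, published?).
times = [6, 12, 12, 24, 36, 36]
events = [1, 1, 0, 1, 0, 0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))  # 6 0.833 / 12 0.667 / 24 0.444
```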

  14. Enzymatic aspects in ENT cancer-Matrix metalloproteinases

    PubMed Central

    Zamfir Chiru, AA; Popescu, CR; Gheorghe, DC

    2014-01-01

    Abstract The study of ENT cancer allows the implementation of molecular biology methods in diagnosis, in predicting the evolution of the disease, and in guiding treatment. MMPs are proteolytic enzymes, zinc-dependent endopeptidases, secreted by tissues and proinflammatory cells, that play a role in the clearance of cell surface receptors. They are expressed as zymogens (inactive forms), which proteolytic enzymes cleave to generate the active forms. They are involved in cell proliferation, adhesion, differentiation, migration, angiogenesis, apoptosis and host defense. PMID:25408759

  15. Compilation of Abstracts of Theses Submitted By Candidates for Degrees

    DTIC Science & Technology

    1990-09-30

    based on a relationship between the Chaos methods (the Poincare section and Van der Pol plane) and the vibration amplitude and phase was discovered... half-subsampled fields scored well and the one-eighth fields were poor. Even in the latter case, the model filled data gaps and areas of cyclonic and...flight test of a half-scale unmanned air vehicle was conducted for the purpose of predicting the longitudinal and lateral-directional behavior of

  16. Clathrate Structure Determination by Combining Crystal Structure Prediction with Computational and Experimental 129Xe NMR Spectroscopy

    PubMed Central

    Selent, Marcin; Nyman, Jonas; Roukala, Juho; Ilczyszyn, Marek; Oilunkaniemi, Raija; Bygrave, Peter J.; Laitinen, Risto; Jokisaari, Jukka

    2017-01-01

    Abstract An approach is presented for the structure determination of clathrates using NMR spectroscopy of enclathrated xenon to select from a set of predicted crystal structures. Crystal structure prediction methods have been used to generate an ensemble of putative structures of o‐ and m‐fluorophenol, whose previously unknown clathrate structures have been studied by 129Xe NMR spectroscopy. The high sensitivity of the 129Xe chemical shift tensor to the chemical environment and shape of the crystalline cavity makes it ideal as a probe for porous materials. The experimental powder NMR spectra can be used to directly confirm or reject hypothetical crystal structures generated by computational prediction, whose chemical shift tensors have been simulated using density functional theory. For each fluorophenol isomer one predicted crystal structure was found, whose measured and computed chemical shift tensors agree within experimental and computational error margins and these are thus proposed as the true fluorophenol xenon clathrate structures. PMID:28111848
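
    The selection step, matching computed 129Xe chemical shift tensors against the measured spectrum within combined experimental and computational error margins, can be sketched as follows; all shift values and the tolerance are invented for illustration:

```python
def tensors_agree(computed, measured, tolerance):
    """True if all principal components of the 129Xe shift tensor agree
    within the combined experimental + computational error margin (ppm)."""
    return all(abs(c - m) <= tolerance for c, m in zip(computed, measured))

# Hypothetical principal shift components (ppm) for three predicted crystal
# structures vs one measured powder spectrum; values are illustrative only.
measured = (160.0, 185.0, 230.0)
candidates = {
    "structure_A": (158.0, 186.5, 228.5),
    "structure_B": (120.0, 150.0, 300.0),
    "structure_C": (165.0, 210.0, 260.0),
}

# Only structures consistent with experiment survive the comparison.
matches = [name for name, t in candidates.items()
           if tensors_agree(t, measured, tolerance=5.0)]
print(matches)  # ['structure_A']
```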

  17. Miscellaneous Topics in Computer-Aided Drug Design: Synthetic Accessibility and GPU Computing, and Other Topics

    PubMed Central

    Fukunishi, Yoshifumi; Mashimo, Tadaaki; Misoo, Kiyotaka; Wakabayashi, Yoshinori; Miyaki, Toshiaki; Ohta, Seiji; Nakamura, Mayu; Ikeda, Kazuyoshi

    2016-01-01

    Abstract: Background Computer-aided drug design is still a state-of-the-art process in medicinal chemistry, and the main topics in this field have been extensively studied and well reviewed. These topics include compound databases, ligand-binding pocket prediction, protein-compound docking, virtual screening, target/off-target prediction, physical property prediction, molecular simulation and pharmacokinetics/pharmacodynamics (PK/PD) prediction. Message and Conclusion: However, there are also a number of secondary or miscellaneous topics that have been less well covered. For example, methods for synthesizing and predicting the synthetic accessibility (SA) of designed compounds are important in practical drug development, and hardware/software resources for performing the computations in computer-aided drug design are crucial. Cloud computing and general purpose graphics processing unit (GPGPU) computing have been used in virtual screening and molecular dynamics simulations. Not surprisingly, there is a growing demand for computer systems which combine these resources. In the present review, we summarize and discuss these various topics of drug design. PMID:27075578

  18. Life prediction of materials exposed to monotonic and cyclic loading: A new technology survey

    NASA Technical Reports Server (NTRS)

    Stuhrke, W. F.; Carpenter, J. L., Jr.

    1975-01-01

    Reviewed and evaluated technical abstracts for about 100 significant documents are reported, relating primarily to life prediction for structural materials exposed to monotonic and cyclic loading, particularly in elevated-temperature environments. The abstracts are mostly for publications in the period April 1962 through April 1974. The purpose of this report is to provide, in quick-reference form, a dependable source of current information.

  19. Biological Environmental Arctic Project (BEAP) Preliminary Data (Arctic West Summer 1986 Cruise).

    DTIC Science & Technology

    1986-11-01

    predictive model of bioluminescence in near-surface arctic waters. Data were collected during Arctic West Summer 1986 from USCG POLAR STAR (WAGB 10). ...correlates for a predictive model of bioluminescence in near-surface arctic waters. In previous years, these measurements were conducted from the USCG

  20. The Motor System Contributes to Comprehension of Abstract Language

    PubMed Central

    Guan, Connie Qun; Meng, Wanjin; Yao, Ru; Glenberg, Arthur M.

    2013-01-01

    If language comprehension requires a sensorimotor simulation, how can abstract language be comprehended? We show that preparation to respond in an upward or downward direction affects comprehension of the abstract quantifiers “more and more” and “less and less” as indexed by an N400-like component. Conversely, the semantic content of the sentence affects the motor potential measured immediately before the upward or downward action is initiated. We propose that this bidirectional link between motor system and language arises because the motor system implements forward models that predict the sensory consequences of actions. Because the same movement (e.g., raising the arm) can have multiple forward models for different contexts, the models can make different predictions depending on whether the arm is raised, for example, to place an object or raised as a threat. Thus, different linguistic contexts invoke different forward models, and the predictions constitute different understandings of the language. PMID:24086463

  1. Techniques for the Enhancement of Linear Predictive Speech Coding in Adverse Conditions

    NASA Astrophysics Data System (ADS)

    Wrench, Alan A.

    Available from UMI in association with The British Library. Requires signed TDF. The Linear Prediction model was first applied to speech two and a half decades ago. Since then it has been the subject of intense research and continues to be one of the principal tools in the analysis of speech. Its mathematical tractability makes it a suitable subject for study and its proven success in practical applications makes the study worthwhile. The model is known to be unsuited to speech corrupted by background noise. This has led many researchers to investigate ways of enhancing the speech signal prior to Linear Predictive analysis. In this thesis this body of work is extended. The chosen application is low bit-rate (2.4 kbits/sec) speech coding. For this task the performance of the Linear Prediction algorithm is crucial because there is insufficient bandwidth to encode the error between the modelled speech and the original input. A review of the fundamentals of Linear Prediction and an independent assessment of the relative performance of methods of Linear Prediction modelling are presented. A new method is proposed which is fast and facilitates stability checking; however, its stability is shown to be unacceptably poor compared with existing methods. A novel supposition governing the positioning of the analysis frame relative to a voiced speech signal is proposed and supported by observation. The problem of coding noisy speech is examined. Four frequency domain speech processing techniques are developed and tested. These are: (i) Combined Order Linear Prediction Spectral Estimation; (ii) Frequency Scaling According to an Aural Model; (iii) Amplitude Weighting Based on Perceived Loudness; (iv) Power Spectrum Squaring. These methods are compared with the Recursive Linearised Maximum a Posteriori method. Following on from work done in the frequency domain, a time domain implementation of spectrum squaring is developed. 
In addition, a new method of power spectrum estimation is developed based on the Minimum Variance approach. This new algorithm is shown to be closely related to Linear Prediction but produces slightly broader spectral peaks. Spectrum squaring is applied to both the new algorithm and standard Linear Prediction and their relative performance is assessed. (Abstract shortened by UMI.).
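The Linear Prediction modelling assessed throughout this thesis is conventionally computed by the autocorrelation method with the Levinson-Durbin recursion. The sketch below is a generic textbook version of that recursion, not the thesis's own code, and the sinusoidal test signal is invented for illustration:

```python
import math

def autocorr(x, order):
    # Autocorrelation estimates r[0..order] of the (rectangular-windowed) signal.
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) for k in range(order + 1)]

def levinson_durbin(r, order):
    # Solve the LP normal equations order-recursively; returns the predictor
    # polynomial coefficients a (with a[0] = 1) and the prediction-error power.
    a = [1.0] + [0.0] * order
    err = r[0]
    for m in range(1, order + 1):
        k = -sum(a[j] * r[m - j] for j in range(m)) / err  # reflection coefficient
        new_a = a[:]
        for j in range(1, m):
            new_a[j] = a[j] + k * a[m - j]
        new_a[m] = k
        a = new_a
        err *= 1.0 - k * k  # error power shrinks (or stays equal) at each order
    return a, err

# A pure sinusoid is an exact AR(2) process: x[n] = 2*cos(w)*x[n-1] - x[n-2],
# so an order-2 model should capture it almost perfectly.
x = [math.sin(0.3 * i) for i in range(200)]
a, err = levinson_durbin(autocorr(x, 2), 2)
```

For this signal the recursion recovers coefficients close to [1, -2cos(0.3), 1], with a small residual energy left over from windowing effects, which illustrates why stability checking via the reflection coefficients (|k| < 1) is convenient in this formulation.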

  2. When do self-discrepancies predict negative emotions? Exploring formal operational thought and abstract reasoning skills as moderators.

    PubMed

    Stevens, Erin N; Holmberg, Nicole J; Lovejoy, M Christine; Pittman, Laura D

    2014-01-01

    Individual differences in higher-order cognitive abilities may be an important piece to understanding how and when self-discrepancies lead to negative emotions. In the current study, three measures of reasoning abilities were considered as potential moderators of the relationship between self-discrepancies and depression and anxiety symptoms. Participants (N = 162) completed measures assessing self-discrepancies and depression and anxiety symptoms, and were administered measures examining formal operational thought and verbal and non-verbal abstract reasoning skills. Both formal operational thought and verbal abstract reasoning were significant moderators of the relationship between actual:ideal discrepancies and depressive symptoms. Discrepancies predicted depressive symptoms for individuals with higher levels of formal operational thought and verbal abstract reasoning skills, but not for those with lower levels. The discussion focuses on the need to consider advanced reasoning skills when examining self-discrepancies.

  3. RKNNMDA: Ranking-based KNN for MiRNA-Disease Association prediction

    PubMed Central

    Chen, Xing; Yan, Gui-Ying

    2017-01-01

    ABSTRACT Accumulating verified experimental studies have demonstrated that microRNAs (miRNAs) can be closely related to the development and progression of human complex diseases. Based on the assumption that functionally similar miRNAs tend to be associated with phenotypically similar diseases and vice versa, researchers have developed various effective computational models which combine heterogeneous biologic data sets, including the disease similarity network, the miRNA similarity network, and the known disease-miRNA association network, to identify potential relationships between miRNAs and diseases in biomedical research. Considering the limitations of previous computational studies, we introduced a novel computational method of Ranking-based KNN for miRNA-Disease Association prediction (RKNNMDA) to predict potentially related miRNAs for diseases, and our method obtained an AUC of 0.8221 based on leave-one-out cross validation. In addition, RKNNMDA was applied to 3 kinds of important human cancers for further performance evaluation. The results showed that 96%, 80% and 94% of the predicted top 50 potentially related miRNAs for Colon Neoplasms, Esophageal Neoplasms, and Prostate Neoplasms, respectively, have been confirmed in the experimental literature. Moreover, RKNNMDA could be used to predict potential miRNAs for diseases without any known miRNAs, and it is anticipated that RKNNMDA will be of great use for novel miRNA-disease association identification. PMID:28421868
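The core neighbour-voting idea behind KNN-based association prediction can be illustrated with a toy sketch: score each candidate miRNA by the similarity-weighted votes of its nearest neighbours among miRNAs already associated with the disease. The names and similarity values below are invented, and this omits RKNNMDA's ranking-model stage:

```python
def rank_candidates(disease, candidates, known_assoc, sim, k=2):
    # Score each candidate miRNA by the summed similarity of its k most
    # similar miRNAs already known to be associated with the disease,
    # then return the candidates ranked best-first.
    scores = {}
    for m in candidates:
        neighbours = sorted(known_assoc[disease], key=lambda s: sim[m][s],
                            reverse=True)[:k]
        scores[m] = sum(sim[m][s] for s in neighbours)
    return sorted(scores, key=scores.get, reverse=True)

# Toy similarity network: m3 is functionally similar to the miRNAs already
# linked to disease d1, while m4 is not.
sim = {"m3": {"m1": 0.9, "m2": 0.8}, "m4": {"m1": 0.2, "m2": 0.1}}
ranking = rank_candidates("d1", ["m3", "m4"], {"d1": ["m1", "m2"]}, sim)
```

Under the functional-similarity assumption stated in the abstract, m3 ranks above m4 because its neighbours carry the known association evidence.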

  4. A neural network-based algorithm for predicting stone-free status after ESWL therapy

    PubMed Central

    Seckiner, Ilker; Seckiner, Serap; Sen, Haluk; Bayrak, Omer; Dogan, Kazım; Erturhan, Sakip

    2017-01-01

    ABSTRACT Objective: The prototype artificial neural network (ANN) model was developed using data from patients with renal stones, in order to predict stone-free status and to help in planning treatment with Extracorporeal Shock Wave Lithotripsy (ESWL) for kidney stones. Materials and Methods: Data on eleven variables were collected from the 203 patients: gender, single or multiple nature of the stone, location of the stone, infundibulopelvic angle, primary or secondary nature of the stone, status of hydronephrosis, stone size after ESWL, age, size, skin-to-stone distance, stone density and creatinine. Regression analysis and the ANN method were applied to predict treatment success using the same series of data. Results: Subsequently, patients were divided into three groups by the neural network software, in order to implement the ANN: a training group (n=139), a validation group (n=32), and a test group (n=32). ANN analysis demonstrated that the prediction accuracy of the stone-free rate was 99.25% in the training group, 85.48% in the validation group, and 88.70% in the test group. Conclusions: Successful results were obtained in predicting the stone-free rate with the help of the ANN model, designed using a series of data collected from real patients in whom ESWL was implemented, to help in planning treatment for kidney stones. PMID:28727384

  5. Performance of combined fragmentation and retention prediction for the identification of organic micropollutants by LC-HRMS.

    PubMed

    Hu, Meng; Müller, Erik; Schymanski, Emma L; Ruttkies, Christoph; Schulze, Tobias; Brack, Werner; Krauss, Martin

    2018-03-01

    In nontarget screening, structure elucidation of small molecules from high resolution mass spectrometry (HRMS) data is challenging, particularly the selection of the most likely candidate structure among the many retrieved from compound databases. Several fragmentation and retention prediction methods have been developed to improve this candidate selection. In order to evaluate their performance, we compared two in silico fragmenters (MetFrag and CFM-ID) and two retention time prediction models (based on the chromatographic hydrophobicity index (CHI) and on log D). A set of 78 known organic micropollutants was analyzed by liquid chromatography coupled to a LTQ Orbitrap HRMS with electrospray ionization (ESI) in positive and negative mode using two fragmentation techniques with different collision energies. Both fragmenters (MetFrag and CFM-ID) performed well for most compounds, ranking the correct candidate structure on average within the top 25% and the top 22 to 37% for ESI+ and ESI- mode, respectively. The rank of the correct candidate structure slightly improved when MetFrag and CFM-ID were combined. For unknown compounds detected in both ESI+ and ESI-, positive mode mass spectra were generally better for further structure elucidation. Both retention prediction models performed reasonably well for more hydrophobic compounds but not for early eluting hydrophilic substances. The log D prediction showed a better accuracy than the CHI model. Although the two fragmentation prediction methods are more diagnostic and sensitive for candidate selection, the inclusion of retention prediction by calculating a consensus score with optimized weighting can improve the ranking of correct candidates as compared to the individual methods. Graphical abstract: Consensus workflow for combining fragmentation and retention prediction in LC-HRMS-based micropollutant identification.
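The consensus scoring named in the graphical abstract can be illustrated with a toy weighted combination. The field names, the normalisation to [0, 1], and the 0.8/0.2 weighting below are assumptions for illustration, not the optimized weights from the paper:

```python
def consensus_rank(candidates, w_frag=0.8, w_rt=0.2):
    # Each candidate carries scores normalised to [0, 1]: two in silico
    # fragmenter scores ('metfrag', 'cfmid') and a retention score ('rt').
    # The fragmenter scores are averaged, then blended with retention.
    def score(c):
        frag = (c["metfrag"] + c["cfmid"]) / 2.0  # combined fragmenter evidence
        return w_frag * frag + w_rt * c["rt"]
    return sorted(candidates, key=score, reverse=True)

# Hypothetical candidates: A has strong fragmentation evidence, B has a
# better retention match but weaker spectra.
candidates = [
    {"name": "A", "metfrag": 0.9, "cfmid": 0.8, "rt": 0.2},
    {"name": "B", "metfrag": 0.6, "cfmid": 0.7, "rt": 0.9},
]
ranked = consensus_rank(candidates)
```

With the fragmenters weighted more heavily, as the abstract's conclusion suggests they are the more diagnostic evidence, candidate A wins despite its poorer retention score.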

  6. Concrete Model Checking with Abstract Matching and Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Pelanek, Radek; Visser, Willem

    2005-01-01

    We propose an abstraction-based model checking method which relies on refinement of an under-approximation of the feasible behaviors of the system under analysis. The method preserves errors to safety properties, since all analyzed behaviors are feasible by definition. The method does not require an abstract transition relation to be generated, but instead executes the concrete transitions while storing abstract versions of the concrete states, as specified by a set of abstraction predicates. For each explored transition, the method checks, with the help of a theorem prover, whether there is any loss of precision introduced by abstraction. The results of these checks are used to decide termination or to refine the abstraction, by generating new abstraction predicates. If the (possibly infinite) concrete system under analysis has a finite bisimulation quotient, then the method is guaranteed to eventually explore an equivalent finite bisimilar structure. We illustrate the application of the approach for checking concurrent programs. We also show how a lightweight variant can be used for efficient software testing.
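The core loop, executing concrete transitions while matching states by their abstraction, can be sketched as below. The state space, successor function, and predicate are invented for illustration; the real method additionally consults a theorem prover and refines the predicates when precision is lost:

```python
def explore(initial, successors, predicates):
    # Search over concrete states, but prune a state when the tuple of its
    # predicate values (its abstract version) has been stored already --
    # an under-approximation of the reachable concrete states.
    seen, frontier, visited = set(), [initial], []
    while frontier:
        s = frontier.pop()
        sig = tuple(p(s) for p in predicates)
        if sig in seen:
            continue
        seen.add(sig)
        visited.append(s)
        frontier.extend(successors(s))
    return visited

# Toy counter system over states 0..20 whose abstraction tracks only parity.
visited = explore(0, lambda s: [s + 1] if s < 20 else [],
                  [lambda s: s % 2 == 0])
```

Only two concrete states survive here because the single parity predicate cannot distinguish the rest; in the paper's method, the precision checks would trigger refinement with new predicates to recover the lost behaviors.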

  7. Contextual Processing of Abstract Concepts Reveals Neural Representations of Non-Linguistic Semantic Content

    PubMed Central

    Wilson-Mendenhall, Christine D.; Simmons, W. Kyle; Martin, Alex; Barsalou, Lawrence W.

    2014-01-01

    Concepts develop for many aspects of experience, including abstract internal states and abstract social activities that do not refer to concrete entities in the world. The current study assessed the hypothesis that, like concrete concepts, distributed neural patterns of relevant, non-linguistic semantic content represent the meanings of abstract concepts. In a novel neuroimaging paradigm, participants processed two abstract concepts (convince, arithmetic) and two concrete concepts (rolling, red) deeply and repeatedly during a concept-scene matching task that grounded each concept in typical contexts. Using a catch trial design, neural activity associated with each concept word was separated from neural activity associated with subsequent visual scenes to assess activations underlying the detailed semantics of each concept. We predicted that brain regions underlying mentalizing and social cognition (e.g., medial prefrontal cortex, superior temporal sulcus) would become active to represent semantic content central to convince, whereas brain regions underlying numerical cognition (e.g., bilateral intraparietal sulcus) would become active to represent semantic content central to arithmetic. The results supported these predictions, suggesting that the meanings of abstract concepts arise from distributed neural systems that represent concept-specific content. PMID:23363408

  8. Learning and Processing Abstract Words and Concepts: Insights From Typical and Atypical Development.

    PubMed

    Vigliocco, Gabriella; Ponari, Marta; Norbury, Courtenay

    2018-05-21

    The paper describes two plausible hypotheses concerning the learning of abstract words and concepts. According to the first hypothesis, children would learn abstract words by extracting co-occurrences among words in linguistic input, using, for example, mechanisms as described by models of Distributional Semantics. According to the second hypothesis, children would exploit the fact that abstract words tend to have more emotional associations than concrete words to infer that they refer to internal/mental states. Each hypothesis makes specific predictions with regard to when, and which, abstract words are more likely to be learned; they also make different predictions concerning the impact of developmental disorders. We start by providing a review of work characterizing how abstract words and concepts are learned in development, especially between the ages of 6 and 12. Second, we review some work from our group that tests the two hypotheses above. This work investigates typically developing (TD) children and children with atypical development (developmental language disorders [DLD] and autism spectrum disorder [ASD] with and without language deficits). We conclude that the use of strategies based on emotional information, or on co-occurrences in language, may play a role at different developmental stages. © 2018 Cognitive Science Society Inc.

  9. Analysis of Particulate Composite Behavior Based on Nonlinear Elasticity and an Improved Mori-Tanaka Theory

    DTIC Science & Technology

    1998-09-01

    to characterize the weakening constraint power of the matrix, as opposed to earlier analyses that used an additional eigenstrain term. It also... matrix Poisson ratio was constant and the inclusions were rigid, he showed that the disturbed strain and the eigenstrain in the Eshelby method could... Keywords: Eshelby, elastic properties, prediction, energy balance, mechanical behavior, eigenstrain, nonlinear

  10. Evaluating In Vitro-In Vivo Extrapolation of Toxicokinetics

    PubMed Central

    MacMillan, Denise K; Ford, Jermaine; Fennell, Timothy R; Black, Sherry R; Snyder, Rodney W; Sipes, Nisha S; Westerhout, Joost; Setzer, R Woodrow; Pearce, Robert G; Simmons, Jane Ellen; Thomas, Russell S

    2018-01-01

    Abstract Prioritizing the risk posed by thousands of chemicals potentially present in the environment requires exposure, toxicity, and toxicokinetic (TK) data, which are often unavailable. Relatively high throughput, in vitro TK (HTTK) assays and in vitro-to-in vivo extrapolation (IVIVE) methods have been developed to predict TK, but most of the in vivo TK data available to benchmark these methods are from pharmaceuticals. Here we report on new, in vivo rat TK experiments for 26 non-pharmaceutical chemicals with environmental relevance. Both intravenous and oral dosing were used to calculate bioavailability. These chemicals, and an additional 19 chemicals (including some pharmaceuticals) from previously published in vivo rat studies, were systematically analyzed to estimate in vivo TK parameters (e.g., volume of distribution [Vd], elimination rate). For each of the chemicals, rat-specific HTTK data were available and key TK predictions were examined: oral bioavailability, clearance, Vd, and uncertainty. For the non-pharmaceutical chemicals, predictions for bioavailability were not effective. While no pharmaceutical was absorbed at less than 10%, the fraction bioavailable for non-pharmaceutical chemicals was as low as 0.3%. Total clearance was generally more under-estimated for non-pharmaceuticals, and Vd methods calibrated to pharmaceuticals may not be appropriate for other chemicals. However, the steady-state, peak, and time-integrated plasma concentrations of non-pharmaceuticals were predicted with reasonable accuracy. The plasma concentration predictions improved when experimental measurements of bioavailability were incorporated. In summary, HTTK and IVIVE methods are adequately robust to be applied to high throughput in vitro toxicity screening data of environmentally relevant chemicals for prioritizing based on human health risks. PMID:29385628

  11. Power differences in the construal of a crisis: the immediate aftermath of September 11, 2001.

    PubMed

    Magee, Joe C; Milliken, Frances J; Lurie, Adam R

    2010-03-01

    In this research, we examine the relationship between power and three characteristics of construal (abstraction, valence, and certainty) in individuals' verbatim reactions to the events of September 11, 2001, during the immediate aftermath of the terrorist attacks. We conceptualize power as a form of social distance and find that position power (but not expert power) was positively associated with the use of language that was more abstract (vs. concrete), positive (vs. negative), and certain (vs. uncertain). These effects persist after controlling for temporal distance, geographic distance, and impression management motivation. Our results support central and corollary predictions of Construal Level Theory (Liberman, Trope, & Stephan, 2007; Trope & Liberman, 2003) in a high-consequence, real-world context, and our method provides a template for future research in this area outside of the laboratory.

  12. EXAMINING PREDICTIVE ACCURACY AMONG DISCOUNTING MODELS. (R826611)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  13. Neural activity during affect labeling predicts expressive writing effects on well-being: GLM and SVM approaches

    PubMed Central

    Memarian, Negar; Torre, Jared B.; Haltom, Kate E.; Stanton, Annette L.

    2017-01-01

    Abstract Affect labeling (putting feelings into words) is a form of incidental emotion regulation that could underpin some benefits of expressive writing (i.e. writing about negative experiences). Here, we show that neural responses during affect labeling predicted changes in psychological and physical well-being outcome measures 3 months later. Furthermore, neural activity of specific frontal regions and amygdala predicted those outcomes as a function of expressive writing. Using supervised learning (support vector machines regression), improvements in four measures of psychological and physical health (physical symptoms, depression, anxiety and life satisfaction) after an expressive writing intervention were predicted with an average of 0.85% prediction error [root mean square error (RMSE) %]. The predictions were significantly more accurate with machine learning than with the conventional generalized linear model method (average RMSE: 1.3%). Consistent with affect labeling research, right ventrolateral prefrontal cortex (RVLPFC) and amygdalae were top predictors of improvement in the four outcomes. Moreover, RVLPFC and left amygdala predicted benefits due to expressive writing in satisfaction with life and depression outcome measures, respectively. This study demonstrates the substantial merit of supervised machine learning for real-world outcome prediction in social and affective neuroscience. PMID:28992270

  14. FIT: statistical modeling tool for transcriptome dynamics under fluctuating field conditions

    PubMed Central

    Iwayama, Koji; Aisaka, Yuri; Kutsuna, Natsumaro

    2017-01-01

    Abstract Motivation: Considerable attention has been given to the quantification of environmental effects on organisms. In natural conditions, environmental factors are continuously changing in a complex manner. To reveal the effects of such environmental variations on organisms, transcriptome data in field environments have been collected and analyzed. Nagano et al. proposed a model that describes the relationship between transcriptomic variation and environmental conditions and demonstrated the capability to predict transcriptome variation in rice plants. However, the computational cost of parameter optimization has prevented its wide application. Results: We propose a new statistical model and efficient parameter optimization based on the previous study. We developed and released FIT, an R package that offers functions for parameter optimization and transcriptome prediction. The proposed method achieves comparable or better prediction performance within a shorter computational time than the previous method. The package will facilitate the study of the environmental effects on transcriptomic variation in field conditions. Availability and Implementation: Freely available from CRAN (https://cran.r-project.org/web/packages/FIT/). Contact: anagano@agr.ryukoku.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158396

  15. Predictive performance of the Vitrigel‐eye irritancy test method using 118 chemicals

    PubMed Central

    Yamaguchi, Hiroyuki; Kojima, Hajime

    2015-01-01

    Abstract We recently developed a novel Vitrigel‐eye irritancy test (EIT) method. The Vitrigel‐EIT method is composed of two parts, i.e., the construction of a human corneal epithelium (HCE) model in a collagen vitrigel membrane chamber and the prediction of eye irritancy by analyzing the time‐dependent profile of transepithelial electrical resistance values for 3 min after exposing a chemical to the HCE model. In this study, we estimated the predictive performance of the Vitrigel‐EIT method by testing a total of 118 chemicals. The category determined by the Vitrigel‐EIT method, compared against the globally harmonized system classification, yielded a sensitivity, specificity and accuracy of 90.1%, 65.9% and 80.5%, respectively. Five of the seven false‐negative chemicals were acidic chemicals inducing an irregular rise of transepithelial electrical resistance values. When test chemical solutions showing pH 5 or lower were excluded, the sensitivity, specificity and accuracy improved to 96.8%, 67.4% and 84.4%, respectively. Meanwhile, nine of the 16 false‐positive chemicals are classified as irritants by the US Environmental Protection Agency. In addition, the disappearance of ZO‐1 (a tight junction‐associated protein) and MUC1 (a cell membrane‐spanning mucin) was immunohistologically confirmed in the HCE models after exposure not only to eye‐irritant chemicals but also to false‐positive chemicals, suggesting that such false‐positive chemicals have an eye‐irritant potential. These data demonstrated that the Vitrigel‐EIT method could provide excellent predictive performance in judging widespread eye irritancy, including very mild irritant chemicals. We hope that the Vitrigel‐EIT method contributes to the development of safe commodity chemicals. Copyright © 2015 The Authors. Journal of Applied Toxicology published by John Wiley & Sons Ltd. PMID:26472347
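The reported rates are consistent with a confusion matrix of 64 true positives, 7 false negatives, 31 true negatives, and 16 false positives over the 118 chemicals; these counts are inferred here from the percentages and the stated numbers of false calls, not given explicitly in the abstract:

```python
def performance(tp, fn, tn, fp):
    # Standard binary-classification summary of a test method:
    # sensitivity = recall on irritants, specificity = recall on
    # non-irritants, accuracy = overall fraction correct.
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Counts consistent with the 90.1% / 65.9% / 80.5% figures for 118 chemicals.
sens, spec, acc = performance(tp=64, fn=7, tn=31, fp=16)
```

The same function applied to the pH-filtered subset would reproduce the improved 96.8% / 67.4% / 84.4% figures with correspondingly adjusted counts.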

  16. The need to approximate the use-case in clinical machine learning

    PubMed Central

    Saeb, Sohrab; Jayaraman, Arun; Mohr, David C.; Kording, Konrad P.

    2017-01-01

    Abstract The availability of smartphone and wearable sensor technology is leading to a rapid accumulation of human subject data, and machine learning is emerging as a technique to map those data into clinical predictions. As machine learning algorithms are increasingly used to support clinical decision making, it is vital to reliably quantify their prediction accuracy. Cross-validation (CV) is the standard approach where the accuracy of such algorithms is evaluated on part of the data the algorithm has not seen during training. However, for this procedure to be meaningful, the relationship between the training and the validation set should mimic the relationship between the training set and the dataset expected for the clinical use. Here we compared two popular CV methods: record-wise and subject-wise. While the subject-wise method mirrors the clinically relevant use-case scenario of diagnosis in newly recruited subjects, the record-wise strategy has no such interpretation. Using both a publicly available dataset and a simulation, we found that record-wise CV often massively overestimates the prediction accuracy of the algorithms. We also conducted a systematic review of the relevant literature, and found that this overly optimistic method was used by almost half of the retrieved studies that used accelerometers, wearable sensors, or smartphones to predict clinical outcomes. As we move towards an era of machine learning-based diagnosis and treatment, using proper methods to evaluate their accuracy is crucial, as inaccurate results can mislead both clinicians and data scientists. PMID:28327985
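The record-wise leakage the authors describe is easy to reproduce on synthetic data: when a feature encodes subject identity rather than the clinical label, record-wise CV looks perfect while subject-wise CV collapses toward chance. The data and the 1-nearest-neighbour classifier below are invented for illustration, not taken from the paper's datasets:

```python
# Each record is (subject, feature, label). The feature clusters by subject
# and carries no cross-subject information about the label.
DATA = [
    ("A", 0.0, 1), ("A", 0.1, 1), ("A", 0.2, 1),
    ("B", 1.0, 0), ("B", 1.1, 0), ("B", 1.2, 0),
    ("C", 2.4, 0), ("C", 2.5, 0), ("C", 2.6, 0),
    ("D", 4.5, 1), ("D", 4.6, 1), ("D", 4.7, 1),
]

def predict_1nn(train, x):
    # Label of the training record whose feature is closest to x.
    return min(train, key=lambda r: abs(r[1] - x))[2]

def record_wise_cv():
    # Leave-one-record-out: records from the test subject stay in training.
    hits = sum(predict_1nn([r for r in DATA if r is not t], t[1]) == t[2]
               for t in DATA)
    return hits / len(DATA)

def subject_wise_cv():
    # Leave-one-subject-out: the clinically relevant use-case of
    # predicting for a newly recruited subject.
    hits = sum(predict_1nn([r for r in DATA if r[0] != t[0]], t[1]) == t[2]
               for t in DATA)
    return hits / len(DATA)
```

Record-wise CV scores 100% because each held-out record's nearest neighbour is another record from the same subject, while subject-wise CV exposes that the feature predicts nothing about unseen subjects, which is exactly the overestimation the review quantifies.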

  17. Application of model abstraction techniques to simulate transport in soils

    USDA-ARS?s Scientific Manuscript database

    Successful understanding and modeling of contaminant transport in soils is the precondition of risk-informed predictions of the subsurface contaminant transport. Exceedingly complex models of subsurface contaminant transport are often inefficient. Model abstraction is the methodology for reducing th...

  18. Chiral Brønsted Acid‐Catalyzed Enantioselective α‐Amidoalkylation Reactions: A Joint Experimental and Predictive Study

    PubMed Central

    Aranzamendi, Eider; Arrasate, Sonia; Sotomayor, Nuria

    2016-01-01

    Abstract Enamides with a free NH group have been evaluated as nucleophiles in chiral Brønsted acid‐catalyzed enantioselective α‐amidoalkylation reactions of bicyclic hydroxylactams for the generation of quaternary stereocenters. A quantitative structure–reactivity relationship (QSRR) method has been developed to find a useful tool to rationalize the enantioselectivity in this and related processes and to orient the catalyst choice. This correlative perturbation theory (PT)‐QSRR approach has been used to predict the effect of the structure of the substrate, nucleophile, and catalyst, as well as the experimental conditions, on the enantioselectivity. In this way, trends to improve the experimental results could be found without engaging in a long‐term empirical investigation. PMID:28032023

  19. Predictive factors for the Nursing Diagnoses in people living with Acquired Immune Deficiency Syndrome

    PubMed Central

    da Silva, Richardson Augusto Rosendo; Costa, Romanniny Hévillyn Silva; Nelson, Ana Raquel Cortês; Duarte, Fernando Hiago da Silva; Prado, Nanete Caroline da Costa; Rodrigues, Eduardo Henrique Fagundes

    2016-01-01

    Abstract Objective: to identify the predictive factors for the nursing diagnoses in people living with Acquired Immune Deficiency Syndrome. Method: a cross-sectional study, undertaken with 113 people living with AIDS. The data were collected using an interview script and physical examination. Logistic regression was used for the data analysis, considering a level of significance of 10%. Results: the predictive factors identified were, for the nursing diagnosis of knowledge deficit: inadequate following of instructions and verbalization of the problem; for the nursing diagnosis of failure to adhere: years of study, behavior indicative of failure to adhere, participation in the treatment, and forgetfulness; for the nursing diagnosis of sexual dysfunction: family income, reduced frequency of sexual practice, perceived deficit in sexual desire, perceived limitations imposed by the disease, and altered body function. Conclusion: the predictive factors for these nursing diagnoses involved sociodemographic and clinical characteristics, defining characteristics, and related factors, which must be taken into consideration during the assistance provided by the nurse. PMID:27384466

  20. A Temporal Pattern Mining Approach for Classifying Electronic Health Record Data

    PubMed Central

    Batal, Iyad; Valizadegan, Hamed; Cooper, Gregory F.; Hauskrecht, Milos

    2013-01-01

    We study the problem of learning classification models from complex multivariate temporal data encountered in electronic health record systems. The challenge is to define a good set of features that are able to represent well the temporal aspect of the data. Our method relies on temporal abstractions and temporal pattern mining to extract the classification features. Temporal pattern mining usually returns a large number of temporal patterns, most of which may be irrelevant to the classification task. To address this problem, we present the Minimal Predictive Temporal Patterns framework to generate a small set of predictive and non-spurious patterns. We apply our approach to the real-world clinical task of predicting patients who are at risk of developing heparin induced thrombocytopenia. The results demonstrate the benefit of our approach in efficiently learning accurate classifiers, which is a key step for developing intelligent clinical monitoring systems. PMID:25309815
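The support-based filtering at the heart of such temporal pattern mining can be sketched as follows. The abstract-state sequences and the lift threshold are invented, and this simplification omits the paper's statistical test for spurious patterns:

```python
def support(pattern, sequences):
    # Fraction of sequences containing `pattern` as an ordered (not
    # necessarily contiguous) subsequence of abstract states.
    def contains(seq):
        it = iter(seq)
        return all(sym in it for sym in pattern)  # consumes `it` in order
    return sum(contains(s) for s in sequences) / len(sequences)

def predictive_patterns(patterns, cases, controls, min_lift=1.5):
    # Keep patterns whose support among case sequences exceeds their
    # support among controls by at least `min_lift` -- a toy stand-in
    # for the Minimal Predictive Temporal Patterns criterion.
    keep = []
    for p in patterns:
        s_case, s_ctrl = support(p, cases), support(p, controls)
        if s_case >= min_lift * max(s_ctrl, 1e-9):
            keep.append(p)
    return keep

# Hypothetical abstracted lab-value sequences for cases vs. controls.
cases = [["LOW", "HIGH"], ["LOW", "NORM", "HIGH"]]
controls = [["HIGH", "LOW"], ["NORM"]]
kept = predictive_patterns([("LOW", "HIGH"), ("NORM",)], cases, controls)
```

The LOW-then-HIGH pattern survives because it occurs in every case sequence and no control sequence, while the uninformative NORM pattern is discarded, illustrating how the framework shrinks the mined pattern set to predictive, non-spurious features.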

  1. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster. Graphical abstract.

  2. Assessing Strategies Against Gambiense Sleeping Sickness Through Mathematical Modeling

    PubMed Central

    Rock, Kat S; Ndeffo-Mbah, Martial L; Castaño, Soledad; Palmer, Cody; Pandey, Abhishek; Atkins, Katherine E; Ndung’u, Joseph M; Hollingsworth, T Déirdre; Galvani, Alison; Bever, Caitlin; Chitnis, Nakul; Keeling, Matt J

    2018-01-01

    Abstract Background: Control of gambiense sleeping sickness relies predominantly on passive and active screening of people, followed by treatment. Methods: Mathematical modeling explores the potential of 3 complementary interventions in high- and low-transmission settings. Results: Intervention strategies that included vector control are predicted to halt transmission most quickly. Targeted active screening, with better and more focused coverage, and enhanced passive surveillance, with improved access to diagnosis and treatment, are both estimated to avert many new infections but, when used alone, are unlikely to halt transmission before 2030 in high-risk settings. Conclusions: There was general model consensus in the ranking of the 3 complementary interventions studied, although with discrepancies between the quantitative predictions due to differing epidemiological assumptions within the models. While these predictions provide generic insights into improving control, the most effective strategy in any situation depends on the specific epidemiology in the region and the associated costs. PMID:29860287

  3. Temporal abstraction and temporal Bayesian networks in clinical domains: a survey.

    PubMed

    Orphanou, Kalia; Stassopoulou, Athena; Keravnou, Elpida

    2014-03-01

    Temporal abstraction (TA) of clinical data aims to abstract and interpret clinical data into meaningful higher-level interval concepts. Abstracted concepts are used for diagnosis, prediction, and therapy planning. Temporal Bayesian networks (TBNs), on the other hand, are temporal extensions of Bayesian networks, the well-known probabilistic graphical models. TBNs can represent temporal relationships between events and their state changes, or the evolution of a process, through time. This paper offers a survey of techniques/methods from these two areas that have been used independently in many clinical domains (e.g. diabetes, hepatitis, cancer) for various clinical tasks (e.g. diagnosis, prognosis). A main objective of this survey, in addition to presenting the key aspects of TA and TBNs, is to point out important benefits of a potential integration of TA and TBNs in medical domains and tasks. The motivation for integrating these two areas is their complementary function: TA provides clinicians with high-level views of data, while TBNs serve as a knowledge representation and reasoning tool under uncertainty, which is inherent in all clinical tasks. Key publications from these two areas of relevance to clinical systems, mainly circumscribed to the last two decades, are reviewed and classified. TA techniques are compared on the basis of: (a) knowledge acquisition and representation for deriving TA concepts and (b) methodology for deriving basic and complex temporal abstractions. TBNs are compared on the basis of: (a) representation of time, (b) knowledge representation and acquisition, (c) inference methods and the computational demands of the network, and (d) their applications in medicine. 
The survey performs an extensive comparative analysis to illustrate the separate merits and limitations of various TA and TBN techniques used in clinical systems with the purpose of anticipating potential gains through an integration of the two techniques, thus leading to a unified methodology for clinical systems. The surveyed contributions are evaluated using frameworks of respective key features. In addition, for the evaluation of TBN methods, a unifying clinical domain (diabetes) is used. The main conclusion transpiring from this review is that techniques/methods from these two areas, that so far are being largely used independently of each other in clinical domains, could be effectively integrated in the context of medical decision-support systems. The anticipated key benefits of the perceived integration are: (a) during problem solving, the reasoning can be directed at different levels of temporal and/or conceptual abstractions since the nodes of the TBNs can be complex entities, temporally and structurally and (b) during model building, knowledge generated in the form of basic and/or complex abstractions, can be deployed in a TBN. Copyright © 2014 Elsevier B.V. All rights reserved.
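
    A minimal sketch of the basic temporal abstraction described above, mapping timestamped readings to qualitative states and merging consecutive equal states into interval concepts (illustrative only; the thresholds and data are hypothetical, not from any surveyed system):

    ```python
    # Minimal temporal-abstraction sketch: raw glucose readings are mapped to
    # qualitative states, then merged into higher-level interval concepts.

    def abstract_states(readings, low=70, high=180):
        """readings: list of (time, value). Returns (start, end, state) intervals."""
        def state(v):
            return "LOW" if v < low else "HIGH" if v > high else "NORMAL"

        intervals = []
        for t, v in readings:
            s = state(v)
            if intervals and intervals[-1][2] == s:
                intervals[-1] = (intervals[-1][0], t, s)   # extend current interval
            else:
                intervals.append((t, t, s))                # open a new interval
        return intervals

    readings = [(0, 95), (1, 110), (2, 190), (3, 210), (4, 130)]
    print(abstract_states(readings))
    # -> [(0, 1, 'NORMAL'), (2, 3, 'HIGH'), (4, 4, 'NORMAL')]
    ```

    The resulting intervals are the kind of higher-level concepts that could serve as nodes in a TBN, which is the integration the survey anticipates.
    
    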

  4. PHENOTYPE PREDICTS THE GENOTYPE OF BRAIN STEM INJURY IN AUTISM. (R824758)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  5. NAUTILUS: A MODEL FOR PREDICTING CHEMICAL EMISSIONS FROM SEWERS (R823335)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  6. PREDICTING INVASIONS: PROPAGULE PRESSURE AND THE GRAVITY OF ALLEE EFFECTS. (R828899)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  7. Can the six-minute walk distance predict the occurrence of acute exacerbations of COPD in patients in Brazil?

    PubMed Central

    Morakami, Fernanda Kazmierski; Morita, Andrea Akemi; Bisca, Gianna Waldrich; Felcar, Josiane Marques; Ribeiro, Marcos; Furlanetto, Karina Couto; Hernandes, Nidia Aparecida; Pitta, Fabio

    2017-01-01

    ABSTRACT Objective: To evaluate whether a six-minute walk distance (6MWD) of ≤ 80% of the predicted value can predict the occurrence of acute exacerbations of COPD in patients in Brazil over a 2-year period. Methods: This was a retrospective cross-sectional study involving 50 COPD patients in Brazil. At enrollment, anthropometric data were collected and patients were assessed for pulmonary function (by spirometry) and functional exercise capacity (by the 6MWD). The patients were subsequently divided into two groups: 6MWD ≤ 80% of predicted and 6MWD > 80% of predicted. The occurrence of acute exacerbations of COPD over 2 years was identified by analyzing medical records and contacting patients by telephone. Results: In the sample as a whole, there was moderate-to-severe airflow obstruction (mean FEV1 = 41 ± 12% of predicted) and the mean 6MWD was 469 ± 60 m (86 ± 10% of predicted). Over the 2-year follow-up period, 25 patients (50%) experienced acute exacerbations of COPD. The Kaplan-Meier method showed that the patients in whom the 6MWD was ≤ 80% of predicted were more likely to have exacerbations than were those in whom the 6MWD was > 80% of predicted (p = 0.01), whereas the Cox regression model showed that the former were 2.6 times as likely to have an exacerbation over a 2-year period as were the latter (p = 0.02). Conclusions: In Brazil, the 6MWD can predict acute exacerbations of COPD over a 2-year period. The risk of experiencing an acute exacerbation of COPD within 2 years is more than twice as high in patients in whom the 6MWD is ≤ 80% of predicted. PMID:29365003
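
    The group comparison above rests on the Kaplan-Meier estimator. A minimal sketch of that estimator (the event times below are hypothetical, not the study's data):

    ```python
    # Sketch of the Kaplan-Meier survival estimate used to compare the two
    # 6MWD groups (illustrative; follow-up times in months are hypothetical).

    def kaplan_meier(times, events):
        """times: follow-up times; events: 1 = exacerbation, 0 = censored.
        Returns [(t, S(t))] at each distinct event time."""
        at_risk = len(times)
        surv, curve = 1.0, []
        for t in sorted(set(times)):
            d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
            if d:
                surv *= 1 - d / at_risk          # S(t) = prod over events (1 - d_i/n_i)
                curve.append((t, surv))
            at_risk -= sum(1 for ti in times if ti == t)
        return curve

    times  = [3, 5, 5, 8, 12, 12, 12]   # months to exacerbation or censoring
    events = [1, 1, 0, 1,  0,  0,  0]
    print(kaplan_meier(times, events))
    ```

    Censored patients leave the risk set without triggering a drop in the curve, which is what distinguishes this estimate from a naive event fraction.
    
    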

  8. Hierarchical Spatial Concept Formation Based on Multimodal Information for Human Support Robots.

    PubMed

    Hagiwara, Yoshinobu; Inoue, Masakazu; Kobayashi, Hiroyoshi; Taniguchi, Tadahiro

    2018-01-01

    In this paper, we propose a hierarchical spatial concept formation method based on a Bayesian generative model with multimodal information, e.g., vision, position, and word information. Since humans have the ability to select an appropriate level of abstraction according to the situation and describe their position linguistically, e.g., "I am in my home" and "I am in front of the table," a hierarchical structure of spatial concepts is necessary in order for human support robots to communicate smoothly with users. The proposed method enables a robot to form hierarchical spatial concepts by categorizing multimodal information using hierarchical multimodal latent Dirichlet allocation (hMLDA). Object recognition results using a convolutional neural network (CNN), the hierarchical k-means clustering result of the self-position estimated by Monte Carlo localization (MCL), and a set of location names are used, respectively, as features in vision, position, and word information. Experiments in forming hierarchical spatial concepts and evaluating how the proposed method can predict unobserved location names and position categories are performed using a robot in the real world. Results verify that, relative to comparable baseline methods, the proposed method enables a robot to predict location names and position categories closer to predictions made by humans. As an application example of the proposed method in a home environment, a demonstration in which a human support robot moves to an instructed place based on human speech instructions is achieved based on the formed hierarchical spatial concept.

  9. Hierarchical Spatial Concept Formation Based on Multimodal Information for Human Support Robots

    PubMed Central

    Hagiwara, Yoshinobu; Inoue, Masakazu; Kobayashi, Hiroyoshi; Taniguchi, Tadahiro

    2018-01-01

    In this paper, we propose a hierarchical spatial concept formation method based on a Bayesian generative model with multimodal information, e.g., vision, position, and word information. Since humans have the ability to select an appropriate level of abstraction according to the situation and describe their position linguistically, e.g., “I am in my home” and “I am in front of the table,” a hierarchical structure of spatial concepts is necessary in order for human support robots to communicate smoothly with users. The proposed method enables a robot to form hierarchical spatial concepts by categorizing multimodal information using hierarchical multimodal latent Dirichlet allocation (hMLDA). Object recognition results using a convolutional neural network (CNN), the hierarchical k-means clustering result of the self-position estimated by Monte Carlo localization (MCL), and a set of location names are used, respectively, as features in vision, position, and word information. Experiments in forming hierarchical spatial concepts and evaluating how the proposed method can predict unobserved location names and position categories are performed using a robot in the real world. Results verify that, relative to comparable baseline methods, the proposed method enables a robot to predict location names and position categories closer to predictions made by humans. As an application example of the proposed method in a home environment, a demonstration in which a human support robot moves to an instructed place based on human speech instructions is achieved based on the formed hierarchical spatial concept. PMID:29593521

  10. Openness to Experience and Intellect differentially predict creative achievement in the arts and sciences

    PubMed Central

    Kaufman, Scott Barry; Quilty, Lena C.; Grazioplene, Rachael G.; Hirsh, Jacob B.; Gray, Jeremy R.; Peterson, Jordan B.; DeYoung, Colin G.

    2014-01-01

    Objective The Big Five personality dimension Openness/Intellect is the trait most closely associated with creativity and creative achievement. Little is known, however, regarding the discriminant validity of its two aspects— Openness to Experience (reflecting cognitive engagement with perception, fantasy, aesthetics, and emotions) and Intellect (reflecting cognitive engagement with abstract and semantic information, primarily through reasoning)— in relation to creativity. Method In four demographically diverse samples totaling 1035 participants, we investigated the independent predictive validity of Openness and Intellect by assessing the relations among cognitive ability, divergent thinking, personality, and creative achievement across the arts and sciences. Results and Conclusions We confirmed the hypothesis that whereas Openness predicts creative achievement in the arts, Intellect predicts creative achievement in the sciences. Inclusion of performance measures of general cognitive ability and divergent thinking indicated that the relation of Intellect to scientific creativity may be due at least in part to these abilities. Lastly, we found that Extraversion additionally predicted creative achievement in the arts, independently of Openness. Results are discussed in the context of dual-process theory. PMID:25487993

  11. An intermediate level of abstraction for computational systems chemistry.

    PubMed

    Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F

    2017-12-28

    Computational techniques are required for narrowing down the vast space of possibilities to plausible prebiotic scenarios, because precise information on the molecular composition, the dominant reaction chemistry and the conditions for that era is scarce. The exploration of large chemical reaction networks is a central aspect in this endeavour. While quantum chemical methods can accurately predict the structures and reactivities of small molecules, they are not efficient enough to cope with large-scale reaction systems. The formalization of chemical reactions as graph grammars provides a generative system, well grounded in category theory, at the right level of abstraction for the analysis of large and complex reaction networks. An extension of the basic formalism into the realm of integer hyperflows allows for the identification of complex reaction patterns, such as autocatalysis, in large reaction networks using optimization techniques. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  12. SYNOPTIC CLIMATOLOGY PREDICTIONS OF FRESHWATER FLOW TO CHESAPEAKE BAY. (R828677C002)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  13. TRANSFORMATIONS MODEL FOR PREDICTING SIZE AND COMPOSITION OF ASH DURING COAL COMBUSTIONS. (R827649)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  14. ECOLOGICAL PREDICTIONS AND RISK ASSESSMENT FOR ALIEN FISHES IN NORTH AMERICA. (R828899)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  15. Inference of Gene Regulatory Networks Incorporating Multi-Source Biological Knowledge via a State Space Model with L1 Regularization

    PubMed Central

    Hasegawa, Takanori; Yamaguchi, Rui; Nagasaki, Masao; Miyano, Satoru; Imoto, Seiya

    2014-01-01

    Comprehensive understanding of gene regulatory networks (GRNs) is a major challenge in the field of systems biology. Currently, there are two main approaches in GRN analysis using time-course observation data, namely an ordinary differential equation (ODE)-based approach and a statistical model-based approach. The ODE-based approach can generate complex dynamics of GRNs according to biologically validated nonlinear models. However, it cannot be applied to ten or more genes to simultaneously estimate system dynamics and regulatory relationships due to the computational difficulties. The statistical model-based approach uses highly abstract models to simply describe biological systems and to infer relationships among several hundreds of genes from the data. However, the high abstraction generates false regulations that are not permitted biologically. Thus, when dealing with several tens of genes of which the relationships are partially known, a method that can infer regulatory relationships based on a model with low abstraction and that can emulate the dynamics of ODE-based models while incorporating prior knowledge is urgently required. To accomplish this, we propose a method for inference of GRNs using a state space representation of a vector auto-regressive (VAR) model with L1 regularization. This method can estimate the dynamic behavior of genes based on linear time-series modeling constructed from an ODE-based model and can infer the regulatory structure among several tens of genes maximizing prediction ability for the observational data. Furthermore, the method is capable of incorporating various types of existing biological knowledge, e.g., drug kinetics and literature-recorded pathways. The effectiveness of the proposed method is shown through a comparison of simulation studies with several previous methods. 
For an application example, we evaluated mRNA expression profiles over time upon corticosteroid stimulation in rats, thus incorporating corticosteroid kinetics/dynamics, literature-recorded pathways and transcription factor (TF) information. PMID:25162401
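
    A toy sketch of the core estimation idea above, sparse fitting of a first-order vector auto-regressive model by proximal gradient descent (ISTA) with an L1 penalty. This is a stand-in for illustration only; the paper's state space formulation with observation noise is richer, and the two-gene data below are hypothetical:

    ```python
    # Proximal-gradient (ISTA) sketch of sparse VAR(1) estimation with an L1
    # penalty (illustrative; pure Python, two "genes", noise-free toy data).

    def soft(x, t):
        """Soft-thresholding operator, the proximal map of the L1 penalty."""
        return max(x - t, 0.0) if x > 0 else min(x + t, 0.0)

    def fit_var_l1(pairs, lam=0.01, step=0.05, iters=3000):
        """pairs: list of (x_t, x_{t+1}); returns the p x p matrix A minimizing
        mean squared error of x_{t+1} ~ A x_t plus lam * ||A||_1."""
        p, n = len(pairs[0][0]), len(pairs)
        A = [[0.0] * p for _ in range(p)]
        for _ in range(iters):
            grad = [[0.0] * p for _ in range(p)]
            for xt, xn in pairs:
                pred = [sum(A[i][j] * xt[j] for j in range(p)) for i in range(p)]
                for i in range(p):
                    for j in range(p):
                        grad[i][j] += 2 * (pred[i] - xn[i]) * xt[j] / n
            for i in range(p):
                for j in range(p):
                    A[i][j] = soft(A[i][j] - step * grad[i][j], step * lam)
        return A

    # Toy regulatory structure: gene 0 decays on its own and drives gene 1.
    A_true = [[0.5, 0.0], [0.4, 0.3]]
    pairs = []
    for start in ([1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, -0.5]):
        xt = start
        for _ in range(3):
            xn = [sum(A_true[i][j] * xt[j] for j in range(2)) for i in range(2)]
            pairs.append((xt, xn))
            xt = xn
    A = fit_var_l1(pairs)
    print([[round(v, 2) for v in row] for row in A])
    ```

    The L1 penalty shrinks the absent edge (gene 1 to gene 0) to zero while keeping the true regulatory edge, which is how sparsity encodes inferred network structure.
    
    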

  16. Predicting Character Traits Through Reddit

    DTIC Science & Technology

    2015-01-01

    This paper discusses the use of subreddits in the prediction of personality as determined by... Prior work has used social media platforms, such as Facebook and Twitter, to predict the personalities of users (Schwartz et al., 2013). The question is, can these predictions be... (Predicting Character Traits Through Reddit; Naval Research Laboratory; Clarissa Scoggins, Thomas Jefferson High School for Science and Technology; mentor: Myriam Abramson.)

  17. Learning and cognitive styles in web-based learning: theory, evidence, and application.

    PubMed

    Cook, David A

    2005-03-01

    Cognitive and learning styles (CLS) have long been investigated as a basis to adapt instruction and enhance learning. Web-based learning (WBL) can reach large, heterogeneous audiences, and adaptation to CLS may increase its effectiveness. Adaptation is only useful if some learners (with a defined trait) do better with one method and other learners (with a complementary trait) do better with another method (aptitude-treatment interaction). A comprehensive search of health professions education literature found 12 articles on CLS in computer-assisted learning and WBL. Because so few reports were found, research from non-medical education was also included. Among all the reports, four CLS predominated. Each CLS construct was used to predict relationships between CLS and WBL. Evidence was then reviewed to support or refute these predictions. The wholist-analytic construct shows consistent aptitude-treatment interactions consonant with predictions (wholists need structure, a broad-before-deep approach, and social interaction, while analytics need less structure and a deep-before-broad approach). Limited evidence for the active-reflective construct suggests aptitude-treatment interaction, with active learners doing better with interactive learning and reflective learners doing better with methods to promote reflection. As predicted, no consistent interaction between the concrete-abstract construct and computer format was found, but one study suggests that there is interaction with instructional method. Contrary to predictions, no interaction was found for the verbal-imager construct. Teachers developing WBL activities should consider assessing and adapting to accommodate learners defined by the wholist-analytic and active-reflective constructs. Other adaptations should be considered experimental. Further WBL research could clarify the feasibility and effectiveness of assessing and adapting to CLS.

  18. Prediction of maize phenotype based on whole-genome single nucleotide polymorphisms using deep belief networks

    NASA Astrophysics Data System (ADS)

    Rachmatia, H.; Kusuma, W. A.; Hasibuan, L. S.

    2017-05-01

    Selection in plant breeding could be more effective and more efficient if it were based on genomic data. Genomic selection (GS) is a new approach to plant-breeding selection that exploits genomic data through a mechanism called genomic prediction (GP). Most GP models use linear methods that ignore the effects of interactions among genes and of higher-order nonlinearities. The deep belief network (DBN), one of the architectures used in deep learning, is able to model data at a high level of abstraction that captures nonlinear effects in the data. This study implemented a DBN to develop a GP model, using whole-genome Single Nucleotide Polymorphisms (SNPs) as data for training and testing. The case study was a set of traits in maize. The maize dataset was acquired from CIMMYT’s (International Maize and Wheat Improvement Center) Global Maize program. Based on Pearson correlation, the DBN outperformed the other methods, reproducing kernel Hilbert space (RKHS) regression, Bayesian LASSO (BL), and best linear unbiased predictor (BLUP), in the case of allegedly non-additive traits. The DBN achieves a correlation of 0.579 within the -1 to 1 range.
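
    The model comparison above uses Pearson correlation between predicted and observed phenotypes. A minimal sketch of that metric (phenotype values below are toy numbers, not the maize data):

    ```python
    # Pearson correlation between observed and predicted phenotypes, the
    # evaluation metric used to compare GP models (toy values; illustrative).

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    observed  = [1.2, 0.4, -0.3, 0.9, -1.1]   # standardized trait values
    predicted = [1.0, 0.1, -0.2, 0.7, -0.8]   # model output for same plants
    print(round(pearson(observed, predicted), 3))
    ```
    
    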

  19. Combining PubMed knowledge and EHR data to develop a weighted bayesian network for pancreatic cancer prediction.

    PubMed

    Zhao, Di; Weng, Chunhua

    2011-10-01

    In this paper, we propose a novel method that combines PubMed knowledge and Electronic Health Records to develop a weighted Bayesian Network Inference (BNI) model for pancreatic cancer prediction. We selected 20 common risk factors associated with pancreatic cancer and used PubMed knowledge to weigh the risk factors. A keyword-based algorithm was developed to extract and classify PubMed abstracts into three categories that represented positive, negative, or neutral associations between each risk factor and pancreatic cancer. Then we designed a weighted BNI model by adding the normalized weights into a conventional BNI model. We used this model to extract the EHR values for patients with or without pancreatic cancer, which then enabled us to calculate the prior probabilities for the 20 risk factors in the BNI. The software iDiagnosis was designed to use this weighted BNI model for predicting pancreatic cancer. In an evaluation using a case-control dataset, the weighted BNI model significantly outperformed the conventional BNI and two other classifiers (k-Nearest Neighbor and Support Vector Machine). We conclude that the weighted BNI using PubMed knowledge and EHR data shows remarkable accuracy improvement over existing representative methods for pancreatic cancer prediction. Copyright © 2011 Elsevier Inc. All rights reserved.
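
    A toy sketch of the weighting idea described above: each risk factor's evidence contribution is scaled by a literature-derived weight inside a naive-Bayes-style posterior. All numbers are hypothetical, and this simplified score is a stand-in, not the paper's exact weighted BNI formulation:

    ```python
    # Illustrative sketch of weighting risk-factor evidence with
    # literature-derived weights (hypothetical numbers throughout).
    import math

    def weighted_posterior(prior, factors):
        """factors: list of (p_given_cancer, p_given_healthy, weight) for risk
        factors present in a patient's record. Each log-likelihood-ratio
        contribution is scaled by its PubMed-derived weight."""
        logit = math.log(prior / (1 - prior))
        for p1, p0, w in factors:
            logit += w * math.log(p1 / p0)
        return 1 / (1 + math.exp(-logit))

    factors = [(0.30, 0.10, 1.0),   # strongly supported in the literature
               (0.20, 0.10, 0.5)]   # weaker literature support, down-weighted
    print(round(weighted_posterior(0.01, factors), 4))
    ```

    Down-weighting a weakly supported factor moderates its influence on the posterior relative to treating all factors equally.
    
    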

  20. Combining PubMed Knowledge and EHR Data to Develop a Weighted Bayesian Network for Pancreatic Cancer Prediction

    PubMed Central

    Zhao, Di; Weng, Chunhua

    2011-01-01

    In this paper, we propose a novel method that combines PubMed knowledge and Electronic Health Records to develop a weighted Bayesian Network Inference (BNI) model for pancreatic cancer prediction. We selected 20 common risk factors associated with pancreatic cancer and used PubMed knowledge to weigh the risk factors. A keyword-based algorithm was developed to extract and classify PubMed abstracts into three categories that represented positive, negative, or neutral associations between each risk factor and pancreatic cancer. Then we designed a weighted BNI model by adding the normalized weights into a conventional BNI model. We used this model to extract the EHR values for patients with or without pancreatic cancer, which then enabled us to calculate the prior probabilities for the 20 risk factors in the BNI. The software iDiagnosis was designed to use this weighted BNI model for predicting pancreatic cancer. In an evaluation using a case-control dataset, the weighted BNI model significantly outperformed the conventional BNI and two other classifiers (k-Nearest Neighbor and Support Vector Machine). We conclude that the weighted BNI using PubMed knowledge and EHR data shows remarkable accuracy improvement over existing representative methods for pancreatic cancer prediction. PMID:21642013

  1. CONSTANT SLOPE IMPEDANCE FACTOR MODEL FOR PREDICTING THE SOLUTE DIFFUSION COEFFICIENT IN UNSATURATED SOIL. (R825433)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  2. ARE LABORATORY MICROCOSM EXPERIMENTS USEFUL FOR PREDICTING THE BIOCONTROL EFFECTIVENESS OF GENERALIST PREDATORS? (R826099)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  3. AIR PERMEABILITY IN UNDISTURBED VOLCANIC ASH SOILS: PREDICTIVE MODEL TEST AND SOIL STRUCTURE FINGERPRINT. (R825433)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. Method for Localizing and Differentiating Bacteria within Biofilms Grown on Indium Tin Oxide: Spatial Distribution of Exoelectrogenic Bacteria within Intact ITO Biofilms via FISH

    DTIC Science & Technology

    2017-11-01

    (Project “Microbe-surface Interactions in Biofouling and Corrosion,” report ERDC/CERL TR-17-42.) With a limited supply of fossil fuel, there has been... There is a finite amount of fossil fuel remaining in the world, and at currently predicted rates of consumption, it is estimated that strategic re... [Figure 1 caption: DFR system used to rapidly form biofilms on ITO-coated glass substrates; (a) bioreactor with ITO...]

  5. Establishing Lower Developmental Thresholds for a Common Blowfly: For Use in Estimating Elapsed Time since Death Using Entomological Methods

    DTIC Science & Technology

    2011-10-01

    Forensic entomology is a science used to estimate a post-mortem interval (PMI). Larvae develop at predictable rates and the time interval... (Warren, Jodie; DRDC CSS CR 2011-23; Defence R&D Canada – CSS; October 2011.) Introduction or background: Forensic entomology is the study of insects... in Europe since the 1850’s. Forensic entomology is now an integral part of a death investigation when estimating a time since death beyond 72 hours

  6. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    USDA-ARS?s Scientific Manuscript database

    Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...

  7. An injury mortality prediction based on the anatomic injury scale

    PubMed Central

    Wang, Muding; Wu, Dan; Qiu, Wusi; Wang, Weimi; Zeng, Yunji; Shen, Yi

    2017-01-01

    Abstract To determine whether the injury mortality prediction (IMP) statistically outperforms the trauma mortality prediction model (TMPM) as a predictor of mortality. The TMPM is currently the best trauma score method based on anatomic injury. Its mortality prediction is superior to that of the injury severity score (ISS) and the new injury severity score (NISS). However, despite its statistical significance, the predictive power of TMPM needs further improvement. This retrospective cohort study is based on data from 1,148,359 injured patients in the National Trauma Data Bank hospitalized from 2010 to 2011. Sixty percent of the data was used to derive an empiric measure of severity for different Abbreviated Injury Scale predot codes by taking the weighted average death probabilities of trauma patients. Twenty percent of the data was used to create the computing method of the IMP model. The remaining 20% of the data was used to evaluate the statistical performance of IMP and compare it with TMPM and the single worst injury by examining the area under the receiver operating characteristic curve (ROC), the Hosmer–Lemeshow (HL) statistic, and the Akaike information criterion. IMP exhibits significantly better discrimination (ROC-IMP, 0.903 [0.899–0.907] and ROC-TMPM, 0.890 [0.886–0.895]) and calibration (HL-IMP, 9.9 [4.4–14.7] and HL-TMPM, 197 [143–248]) compared with TMPM. All models show slight changes after the extension of age, gender, and mechanism of injury, but the extended IMP still dominated TMPM in every performance measure. The IMP has a slight improvement in discrimination and calibration compared with the TMPM and can accurately predict mortality. Therefore, we consider it a new feasible scoring method in trauma research. PMID:28858124
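
    Discrimination above is measured by the area under the ROC curve. A minimal sketch using the rank (Mann-Whitney) formulation of AUC (the scores below are hypothetical, not the registry data):

    ```python
    # Discrimination sketch: AUC as the probability that a randomly chosen
    # death outranks a randomly chosen survivor (toy scores; illustrative).

    def auc(scores_pos, scores_neg):
        """Mann-Whitney form of the area under the ROC curve; ties count 0.5."""
        wins = sum((p > n) + 0.5 * (p == n)
                   for p in scores_pos for n in scores_neg)
        return wins / (len(scores_pos) * len(scores_neg))

    died     = [0.9, 0.8, 0.6]        # predicted death probability, deceased
    survived = [0.7, 0.3, 0.2, 0.1]   # predicted death probability, survivors
    print(auc(died, survived))
    ```
    
    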

  8. Comparative study of joint analysis of microarray gene expression data in survival prediction and risk assessment of breast cancer patients

    PubMed Central

    2016-01-01

    Abstract Microarray gene expression data sets are jointly analyzed to increase statistical power. They could either be merged together or analyzed by meta-analysis. For a given ensemble of data sets, it cannot be foreseen which of these paradigms, merging or meta-analysis, works better. In this article, three joint analysis methods, Z-score normalization, ComBat and the inverse normal method (meta-analysis), were selected for survival prognosis and risk assessment of breast cancer patients. The methods were applied to eight microarray gene expression data sets, totaling 1324 patients with two clinical endpoints, overall survival and relapse-free survival. The performance derived from the joint analysis methods was evaluated using Cox regression for survival analysis and independent validation used as bias estimation. Overall, Z-score normalization had a better performance than ComBat and meta-analysis. Higher Area Under the Receiver Operating Characteristic curve and hazard ratio were also obtained when independent validation was used as bias estimation. With a lower time and memory complexity, Z-score normalization is a simple method for joint analysis of microarray gene expression data sets. The derived findings suggest further assessment of this method in future survival prediction and cancer classification applications. PMID:26504096
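
    A minimal sketch of the Z-score normalization idea above: each dataset is standardized separately so values from different platforms become comparable before merging (the expression values are toy numbers):

    ```python
    # Z-score normalization sketch: standardize each batch before merging so
    # platform-specific scales cancel out (toy values; illustrative only).

    def zscore(values):
        n = len(values)
        mean = sum(values) / n
        sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
        return [(v - mean) / sd for v in values]

    batch_a = [5.1, 6.3, 4.8, 7.0]      # one gene's log-expression, study A
    batch_b = [50.0, 63.0, 47.0, 71.0]  # same gene, different platform scale
    merged = zscore(batch_a) + zscore(batch_b)
    print([round(v, 2) for v in merged])
    ```

    After standardization each batch has mean 0 and unit variance, so the merged vector no longer reflects the original platform scales.
    
    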

  9. Prediction of brain tissue temperature using near-infrared spectroscopy

    PubMed Central

    Holper, Lisa; Mitra, Subhabrata; Bale, Gemma; Robertson, Nicola; Tachtsidis, Ilias

    2017-01-01

    Abstract. Broadband near-infrared spectroscopy (NIRS) can provide an endogenous indicator of tissue temperature based on the temperature dependence of the water absorption spectrum. We describe a first evaluation of the calibration and prediction of brain tissue temperature obtained during hypothermia in newborn piglets (animal dataset) and rewarming in newborn infants (human dataset) based on measured body (rectal) temperature. The calibration using partial least squares regression proved to be a reliable method to predict brain tissue temperature with respect to core body temperature in the wavelength interval of 720 to 880 nm with a strong mean predictive power of R2=0.713±0.157 (animal dataset) and R2=0.798±0.087 (human dataset). In addition, we applied regression receiver operating characteristic curves for the first time to evaluate the temperature prediction, which provided an overall mean error bias between NIRS predicted brain temperature and body temperature of 0.436±0.283°C (animal dataset) and 0.162±0.149°C (human dataset). We discuss main methodological aspects, particularly the well-known aspect of over- versus underestimation between brain and body temperature, which is relevant for potential clinical applications. PMID:28630878
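
    A calibration sketch in the same spirit as the study above, with ordinary least squares standing in for the partial least squares regression actually used, and with all numbers being toy values rather than measured spectra:

    ```python
    # Calibration sketch: fit a least-squares line mapping an NIRS-derived
    # feature to body temperature (OLS stands in for the study's PLS; toy data).

    def fit_line(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx   # temp = slope * feature + intercept

    feature = [0.10, 0.15, 0.22, 0.30]   # toy water-absorption feature
    temp    = [33.5, 34.4, 35.8, 37.3]   # measured body temperature (deg C)
    a, b = fit_line(feature, temp)
    print(round(a * 0.25 + b, 2))        # predicted temperature at feature 0.25
    ```

    PLS differs from this sketch by projecting many correlated wavelengths onto a few latent components before regression, which matters when the predictor is a full spectrum rather than one feature.
    
    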

  10. PSSMHCpan: a novel PSSM-based software for predicting class I peptide-HLA binding affinity

    PubMed Central

    Liu, Geng; Li, Dongli; Li, Zhang; Qiu, Si; Li, Wenhui; Chao, Cheng-chi; Yang, Naibo; Li, Handong; Cheng, Zhen; Song, Xin; Cheng, Le; Zhang, Xiuqing; Wang, Jian; Yang, Huanming

    2017-01-01

    Abstract Predicting peptide binding affinity with human leukocyte antigen (HLA) is a crucial step in developing powerful antitumor vaccines for cancer immunotherapy. Currently available methods work quite well in predicting peptide binding affinity with HLA alleles such as HLA-A*0201, HLA-A*0101, and HLA-B*0702 in terms of sensitivity and specificity. However, quite a few types of HLA alleles that are present in the majority of human populations, including HLA-A*0202, HLA-A*0203, HLA-A*6802, HLA-B*5101, HLA-B*5301, HLA-B*5401, and HLA-B*5701, still cannot be predicted with satisfactory accuracy using currently available methods. Furthermore, the methods currently most popularly used for predicting peptide binding affinity are inefficient in identifying neoantigens from a large quantity of whole genome and transcriptome sequencing data. Here we present a Position Specific Scoring Matrix (PSSM)-based software called PSSMHCpan to accurately and efficiently predict peptide binding affinity with a broad coverage of HLA class I alleles. We evaluated the performance of PSSMHCpan by 10-fold cross-validation on a training database containing 87 HLA alleles and obtained an average area under the receiver operating characteristic curve (AUC) of 0.94 and accuracy (ACC) of 0.85. In an independent dataset (Peptide Database of Cancer Immunity) evaluation, PSSMHCpan is substantially better than the popularly used NetMHC-4.0, NetMHCpan-3.0, PickPocket, sNebula, and SMM, with a sensitivity of 0.90, as compared to 0.74, 0.81, 0.77, 0.24, and 0.79. In addition, PSSMHCpan is more than 197 times faster than NetMHC-4.0, NetMHCpan-3.0, PickPocket, sNebula, and SMM when predicting neoantigens from 661,263 peptides from a breast tumor sample. Finally, we built a neoantigen prediction pipeline and identified 117,017 neoantigens from 467 cancer samples of various cancers from TCGA. 
PSSMHCpan is superior to the currently available methods in predicting peptide binding affinity with a broad coverage of HLA class I alleles. PMID:28327987
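    The scoring idea behind a PSSM-based predictor can be sketched as follows. This is a toy illustration with an invented 3-position matrix, not PSSMHCpan's trained model (real HLA class I matrices cover the 20 amino acids over 8-11-mer peptides):

```python
# Toy sketch of PSSM-based peptide scoring (invented weights, not
# PSSMHCpan's trained matrices). A PSSM assigns each residue a
# position-specific weight; a peptide's binding score is the average
# of the weights of its residues at each position.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

# Hypothetical 3-position matrix: larger weights mimic anchor positions.
pssm = {
    0: {aa: (0.5 if aa in "LM" else -0.1) for aa in AMINO_ACIDS},
    1: {aa: 0.0 for aa in AMINO_ACIDS},
    2: {aa: (0.8 if aa == "V" else -0.2) for aa in AMINO_ACIDS},
}

def score_peptide(peptide: str, pssm: dict) -> float:
    """Average position-specific weight over the peptide."""
    assert len(peptide) == len(pssm)
    return sum(pssm[i][aa] for i, aa in enumerate(peptide)) / len(peptide)

# A peptide matching both anchor positions outscores one matching neither.
assert score_peptide("LAV", pssm) > score_peptide("GAG", pssm)
```

    Scoring a peptide is a single table lookup per residue, which is what makes PSSM scoring fast enough for whole-transcriptome neoantigen screens of hundreds of thousands of candidate peptides.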

  11. Event-related potentials reveal the relations between feature representations at different levels of abstraction.

    PubMed

    Hannah, Samuel D; Shedden, Judith M; Brooks, Lee R; Grundy, John G

    2016-11-01

    In this paper, we use behavioural methods and event-related potentials (ERPs) to explore the relations between informational and instantiated features, as well as the relation between feature abstraction and rule type. Participants are trained to categorize two species of fictitious animals and then identify perceptually novel exemplars. Critically, two groups are given a perfectly predictive counting rule that, according to Hannah and Brooks (2009. Featuring familiarity: How a familiar feature instantiation influences categorization. Canadian Journal of Experimental Psychology/Revue Canadienne de Psychologie Expérimentale, 63, 263-275. Retrieved from http://doi.org/10.1037/a0017919), should orient them to using abstract informational features when categorizing the novel transfer items. A third group is taught a feature-list rule, which should orient them to using detailed instantiated features. One counting-rule group is taught the rule before any exposure to the actual stimuli, and the other immediately after training, having learned the instantiations first. The feature-list group is also taught its rule after training. The ERP results suggest that at test, the two counting-rule groups processed items differently, despite their identical rule. This not only supports the view that informational and instantiated features are qualitatively different feature representations, but also implies that rules can readily operate over concrete inputs, contradicting traditional approaches that assume rules necessarily act on abstract inputs.

  12. Predicting pathological complete response to neoadjuvant chemoradiotherapy in locally advanced rectal cancer: a systematic review.

    PubMed

    Ryan, J E; Warrier, S K; Lynch, A C; Ramsay, R G; Phillips, W A; Heriot, A G

    2016-03-01

    Approximately 20% of patients treated with neoadjuvant chemoradiotherapy (nCRT) for locally advanced rectal cancer achieve a pathological complete response (pCR), while the remainder derive the benefit of improved local control and downstaging, and a small proportion show a minimal response. The ability to predict which patients will benefit would allow improved patient stratification, directing therapy to those who are likely to achieve a good response and thereby avoiding ineffective treatment in those unlikely to benefit. A systematic review of the English-language literature was conducted to identify pathological factors, imaging modalities and molecular factors that predict pCR following chemoradiotherapy. PubMed, MEDLINE and Cochrane Database searches were conducted with the following keywords and MeSH search terms: 'rectal neoplasm', 'response', 'neoadjuvant', 'preoperative chemoradiation', 'tumor response'. After review of titles and abstracts, 85 articles addressing the prediction of pCR were selected. Clear methods to predict pCR before chemoradiotherapy have not been defined. Clinical and radiological features of the primary cancer have limited ability to predict response. Molecular profiling holds the greatest potential to predict pCR, but adoption of this technology will require greater concordance between cohorts for the biomarkers currently under investigation. At present no robust markers of the prediction of pCR have been identified, and the topic remains an area for future research. This review critically evaluates the existing literature, providing an overview of the methods currently available to predict pCR to nCRT for locally advanced rectal cancer. The review also provides a comprehensive comparison of the accuracy of each modality. Colorectal Disease © 2015 The Association of Coloproctology of Great Britain and Ireland.

  13. MULTI-SCALE GRID DEFINITION IMPACTS ON REGIONAL, THREE-DIMENSIONAL AIR QUALITY MODEL PREDICTIONS AND PERFORMANCE. (R825821)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  14. PREDICTION OF SINGLE PHASE TRANSPORT PARAMETERS IN A VARIABLE APERTURE FRACTURE. (R825689C063,R825689C080)

    EPA Science Inventory


  15. GAS DIFFUSIVITY IN UNDISTURBED VOLCANIC ASH SOILS: TEST OF SOIL-WATER-CHARACTERISTIC-BASED PREDICTION MODELS. (R825433)

    EPA Science Inventory


  16. PREDICTING THE RESPONSE OF GULF OF MEXICO HYPOXIA TO VARIATIONS IN MISSISSIPPI RIVER NITROGEN LOAD. (R828009)

    EPA Science Inventory


  17. INTERPRETING THE INFORMATION IN OZONE OBSERVATIONS AND MODEL PREDICTIONS RELEVANT TO REGULATORY POLICIES IN THE EASTERN UNITED STATES. (R825260)

    EPA Science Inventory


  18. Where Full-Text Is Viable.

    ERIC Educational Resources Information Center

    Cotton, P. L.

    1987-01-01

    Defines two types of online databases: source, referring to those intended to be complete in themselves, whether full-text or abstracts; and bibliographic, meaning those that are not complete. Predictions are made about the future growth rate of these two types of databases, as well as full-text versus abstract databases. (EM)

  19. Computing and Applying Atomic Regulons to Understand Gene Expression and Regulation

    DOE PAGES

    Faria, José P.; Davis, James J.; Edirisinghe, Janaka N.; ...

    2016-11-24

    Understanding gene function and regulation is essential for the interpretation, prediction, and ultimate design of cell responses to changes in the environment. A multitude of technologies, abstractions, and interpretive frameworks have emerged to answer the challenges presented by genome function and regulatory network inference. Here, we propose a new approach for producing biologically meaningful clusters of coexpressed genes, called Atomic Regulons (ARs), based on expression data, gene context, and functional relationships. We demonstrate this new approach by computing ARs for Escherichia coli, which we compare with the coexpressed gene clusters predicted by two prevalent existing methods: hierarchical clustering and k-means clustering. We test the consistency of ARs predicted by all methods against expected interactions predicted by the Context Likelihood of Relatedness (CLR) mutual-information-based method, finding that the ARs produced by our approach show better agreement with CLR interactions. We then apply our method to compute ARs for four other genomes: Shewanella oneidensis, Pseudomonas aeruginosa, Thermus thermophilus, and Staphylococcus aureus. We compare the AR clusters from all genomes to study the similarity of coexpression among a phylogenetically diverse set of species, identifying subsystems that show remarkable similarity over wide phylogenetic distances. We also study the sensitivity of our method for computing ARs to the expression data used in the computation, showing that our new approach requires less data than competing approaches to converge to a near-final configuration of ARs. We go on to use our sensitivity analysis to identify the specific experiments that lead most rapidly to the final set of ARs for E. coli. As a result, this analysis produces insights into improving the design of gene expression experiments.

  20. Computing and Applying Atomic Regulons to Understand Gene Expression and Regulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faria, José P.; Davis, James J.; Edirisinghe, Janaka N.

    Understanding gene function and regulation is essential for the interpretation, prediction, and ultimate design of cell responses to changes in the environment. A multitude of technologies, abstractions, and interpretive frameworks have emerged to answer the challenges presented by genome function and regulatory network inference. Here, we propose a new approach for producing biologically meaningful clusters of coexpressed genes, called Atomic Regulons (ARs), based on expression data, gene context, and functional relationships. We demonstrate this new approach by computing ARs for Escherichia coli, which we compare with the coexpressed gene clusters predicted by two prevalent existing methods: hierarchical clustering and k-means clustering. We test the consistency of ARs predicted by all methods against expected interactions predicted by the Context Likelihood of Relatedness (CLR) mutual-information-based method, finding that the ARs produced by our approach show better agreement with CLR interactions. We then apply our method to compute ARs for four other genomes: Shewanella oneidensis, Pseudomonas aeruginosa, Thermus thermophilus, and Staphylococcus aureus. We compare the AR clusters from all genomes to study the similarity of coexpression among a phylogenetically diverse set of species, identifying subsystems that show remarkable similarity over wide phylogenetic distances. We also study the sensitivity of our method for computing ARs to the expression data used in the computation, showing that our new approach requires less data than competing approaches to converge to a near-final configuration of ARs. We go on to use our sensitivity analysis to identify the specific experiments that lead most rapidly to the final set of ARs for E. coli. As a result, this analysis produces insights into improving the design of gene expression experiments.

  1. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    ... approach for applying model checking to unbounded systems is to extract finite-state models from them using conservative abstraction techniques. ... Applying model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite ...

  2. Methods for solving reasoning problems in abstract argumentation – A survey

    PubMed Central

    Charwat, Günther; Dvořák, Wolfgang; Gaggl, Sarah A.; Wallner, Johannes P.; Woltran, Stefan

    2015-01-01

    Within the last decade, abstract argumentation has emerged as a central field in Artificial Intelligence. Besides providing a core formalism for many advanced argumentation systems, abstract argumentation has also served to capture several non-monotonic logics and other AI-related principles. Although the idea of abstract argumentation is appealingly simple, several reasoning problems in this formalism exhibit high computational complexity. This calls for advanced techniques when it comes to implementation issues, a challenge which has recently been faced from different angles. In this survey, we give an overview of different methods for solving reasoning problems in abstract argumentation and compare their particular features. Moreover, we highlight available state-of-the-art systems for abstract argumentation, which put these methods into practice. PMID:25737590
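    The reasoning problems surveyed can be made concrete with a brute-force sketch over a toy Dung-style framework. The enumeration below is exponential and workable only for tiny instances; the advanced solving techniques surveyed exist precisely because this does not scale:

```python
from itertools import combinations

# Toy Dung-style abstract argumentation framework:
# arguments and an attack relation (a attacks b, b attacks c).
args = {"a", "b", "c"}
attacks = {("a", "b"), ("b", "c")}

def conflict_free(S):
    """No member of S attacks another member of S."""
    return not any((x, y) in attacks for x in S for y in S)

def defends(S, x):
    """S defends x: every attacker of x is attacked by some member of S."""
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S):
    return conflict_free(S) and all(defends(S, x) for x in S)

# Enumerate all admissible sets by brute force over the power set.
subsets = [set(c) for r in range(len(args) + 1)
           for c in combinations(sorted(args), r)]
admissible_sets = [S for S in subsets if admissible(S)]
```

    For the chain a → b → c this yields the admissible sets {}, {a} and {a, c}; deciding membership in such extensions for richer semantics is exactly where the high computational complexity mentioned above arises.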

  3. Toward High-Level Theoretical Studies of Large Biodiesel Molecules: An ONIOM [QCISD(T)/CBS:DFT] Study of the Reactions between Unsaturated Methyl Esters (CnH2n-1COOCH3) and Hydrogen Radical.

    PubMed

    Zhang, Lidong; Meng, Qinghui; Chi, Yicheng; Zhang, Peng

    2018-05-31

    A two-layer ONIOM[QCISD(T)/CBS:DFT] method was proposed for high-level single-point energy calculations of large biodiesel molecules and was validated for the hydrogen abstraction reactions of unsaturated methyl esters that are important components of real biodiesel. The reactions under investigation include all the reactions on the potential energy surface of CnH2n-1COOCH3 (n = 2-5, 17) + H, including the hydrogen abstraction, the hydrogen addition, the isomerization (intramolecular hydrogen shift), and the β-scission reactions. By virtue of the introduced concept of a chemically active center, a unified specification of the chemically active portion for the ONIOM (ONIOM = our own n-layered integrated molecular orbital and molecular mechanics) method was proposed to account for the additional influence of the C═C double bond. The energy barriers and heats of reaction predicted using the ONIOM method are in very good agreement with those obtained using the widely accepted high-level QCISD(T)/CBS theory, with computational deviations of less than 0.15 kcal/mol for almost all the reaction pathways under investigation. The method provides a computationally accurate and affordable approach for combustion chemists to the high-level theoretical chemical kinetics of large biodiesel molecules.

  4. Dynamic motif occupancy (DynaMO) analysis identifies transcription factors and their binding sites driving dynamic biological processes

    PubMed Central

    Kuang, Zheng; Ji, Zhicheng

    2018-01-01

    Abstract Biological processes are usually associated with genome-wide remodeling of transcription driven by transcription factors (TFs). Identifying key TFs and their spatiotemporal binding patterns is indispensable to understanding how dynamic processes are programmed. However, most methods are designed to predict TF binding sites only. We present a computational method, dynamic motif occupancy analysis (DynaMO), to infer important TFs and their spatiotemporal binding activities in dynamic biological processes using chromatin profiling data from multiple biological conditions, such as time-course histone modification ChIP-seq data. In the first step, DynaMO predicts TF binding sites with a random forests approach. Next and uniquely, DynaMO infers dynamic TF binding activities at predicted binding sites using their local chromatin profiles from multiple biological conditions. Another hallmark of DynaMO is the identification of key TFs in a dynamic process using a clustering and enrichment analysis of dynamic TF binding patterns. Application of DynaMO to the yeast ultradian cycle, the mouse circadian clock and human neural differentiation exhibits its accuracy and versatility. We anticipate DynaMO will be generally useful for elucidating transcriptional programs in dynamic processes. PMID:29325176

  5. Kinetic modeling of α-hydrogen abstractions from unsaturated and saturated oxygenate compounds by carbon-centered radicals.

    PubMed

    Paraskevas, Paschalis D; Sabbe, Maarten K; Reyniers, Marie-Françoise; Papayannakos, Nikos; Marin, Guy B

    2014-06-23

    Hydrogen abstractions are important elementary reactions in a variety of reacting media at high temperatures in which oxygenates and hydrocarbon radicals are present. Accurate kinetic data are obtained from CBS-QB3 ab initio (AI) calculations by using conventional transition-state theory within the high-pressure limit, including corrections for hindered rotation and tunneling. From the obtained results, a group-additive (GA) model is developed that allows the Arrhenius parameters and rate coefficients for abstraction of the α-hydrogen from a wide range of oxygenate compounds to be predicted at temperatures ranging from 300 to 1500 K. From a training set of 60 hydrogen abstractions from oxygenates by carbon-centered radicals, 15 GA values (ΔGAV°s) are obtained for both the forward and reverse reactions. Among them, four ΔGAV°s refer to primary contributions, and the remaining 11 ΔGAV°s refer to secondary ones. The accuracy of the model is further improved by introducing seven corrections for cross-resonance stabilization of the transition state from an additional set of 43 reactions. The determined ΔGAV°s are validated against a test set of AI data for 17 reactions. The mean absolute deviations of the pre-exponential factors (log A) and activation energies (Ea) for the forward reaction at 300 K are 0.238 log(m^3 mol^-1 s^-1) and 1.5 kJ mol^-1, respectively, whereas the mean factor of deviation <ρ> between the GA-predicted and the AI-calculated rate coefficients is 1.6. In comparison with a compilation of 33 experimental rate coefficients, the <ρ> between the GA-predicted and experimental values is only 2.2. Hence, the constructed GA model can be reliably used in the prediction of the kinetics of α-hydrogen-abstraction reactions between a broad range of oxygenates and oxygenate radicals. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
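    The group-additivity scheme can be sketched numerically as follows. The reference parameters and ΔGAV° values below are invented for illustration; the paper's fitted contributions are not reproduced here:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical reference Arrhenius parameters and group-additive
# corrections (ΔGAV°); each structural group perturbs log10(A) and Ea.
reference = {"logA": 8.5, "Ea": 40.0e3}   # log10(A / m^3 mol^-1 s^-1), Ea in J/mol
gavs = [
    {"logA": -0.3, "Ea": -5.0e3},         # primary contribution (example value)
    {"logA": 0.1,  "Ea": 2.0e3},          # secondary contribution (example value)
]

def rate_coefficient(T: float) -> float:
    """k(T) = 10**logA * exp(-Ea / (R*T)) with group-additive sums."""
    logA = reference["logA"] + sum(g["logA"] for g in gavs)
    Ea = reference["Ea"] + sum(g["Ea"] for g in gavs)
    return 10.0 ** logA * math.exp(-Ea / (R * T))

# Rate coefficients rise steeply with temperature over the 300-1500 K range.
assert rate_coefficient(1500) > rate_coefficient(300)
```

    The appeal of the scheme is that once the ΔGAV°s are tabulated, Arrhenius parameters for an unseen oxygenate follow from a simple sum, with no new ab initio calculation.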

  6. Kinetics and Thermodynamics of the Reaction between the •OH Radical and Adenine: A Theoretical Investigation.

    PubMed

    Milhøj, Birgitte O; Sauer, Stephan P A

    2015-06-18

    The accessibility of all possible reaction paths for the reaction between the nucleobase adenine and the •OH radical is investigated through quantum chemical calculations of barrier heights and rate constants at the ωB97X-D/6-311++G(2df,2pd) level with Eckart tunneling corrections. First, the computational method is validated by considering the hydrogen abstraction from the heterocyclic N9 nitrogen in adenine as a test system. Geometries for all molecules in the reaction are optimized with four different DFT exchange-correlation functionals (B3LYP, BHandHLYP, M06-2X, and ωB97X-D), in combination with Pople and Dunning basis sets, all of which have been employed in similar investigations in the literature. Improved energies are obtained through single-point calculations with CCSD(T) and the same basis sets, and reaction rate constants are calculated for all methods both without tunneling corrections and with the Wigner, Bell, and Eckart corrections. In comparison to CCSD(T)//BHandHLYP/aug-cc-pVTZ reference results, the ωB97X-D/6-311++G(2df,2pd) method combined with Eckart tunneling corrections provides a sensible compromise between accuracy and time. Using this method, all subreactions of the reaction between adenine and the •OH radical are investigated. The total rate constants for hydrogen abstraction and addition for adenine are predicted with this method to be 1.06 × 10^-12 and 1.10 × 10^-12 cm^3 molecule^-1 s^-1, respectively. Abstractions of H61 and H62 contribute the most, while only addition to the C8 carbon is found to be of any significance, in contrast to previous claims that addition is the dominant reaction pathway. The overall rate constant for the complete reaction is found to be 2.17 × 10^-12 cm^3 molecule^-1 s^-1, which agrees exceptionally well with experimental results.
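    Of the tunneling corrections compared above, the Wigner correction is simple enough to sketch directly. The 1500i cm^-1 barrier frequency below is an arbitrary illustrative value, and the study's preferred Eckart correction additionally requires the barrier heights:

```python
import math

# CODATA physical constants (SI, with c in cm/s for wavenumber conversion).
h = 6.62607015e-34    # Planck constant, J s
kB = 1.380649e-23     # Boltzmann constant, J K^-1
c = 2.99792458e10     # speed of light, cm s^-1 (converts cm^-1 -> Hz)

def wigner_kappa(imag_freq_cm1: float, T: float) -> float:
    """Wigner tunneling correction: kappa = 1 + (1/24) * (h*nu / (kB*T))**2,
    where nu is the magnitude of the imaginary barrier frequency."""
    nu = imag_freq_cm1 * c
    x = h * nu / (kB * T)
    return 1.0 + x * x / 24.0

# Tunneling matters most at low temperature: kappa shrinks toward 1 as T grows.
assert wigner_kappa(1500.0, 298.0) > wigner_kappa(1500.0, 1000.0) > 1.0
```

    A transition-state-theory rate constant is then simply multiplied by kappa; the Eckart correction used in the paper replaces this quadratic estimate with a fit to the full barrier shape, which matters for the strongly curved barriers of radical abstractions.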

  7. In a year, memory will benefit from learning, tomorrow it won't: distance and construal level effects on the basis of metamemory judgments.

    PubMed

    Halamish, Vered; Nussinson, Ravit; Ben-Ari, Liat

    2013-09-01

    Metamemory judgments may rely on 2 bases of information: subjective experience and abstract theories about memory. On the basis of construal level theory, we predicted that psychological distance and construal level (i.e., concrete vs. abstract thinking) would have a qualitative impact on the relative reliance on these 2 bases: When considering learning from proximity or under a low-construal mindset, learners would rely more heavily on their experience, whereas when considering learning from a distance or under a high-construal mindset, they would rely more heavily on their abstract theories. Consistent with this prediction, results of 2 experiments revealed that temporal distance (Experiment 1) and construal level (Experiment 2) affected the stability bias--the failure to predict the benefits of learning. When considering learning from proximity or using a low-construal mindset, participants relied less heavily on their theory regarding the benefits of learning and were therefore insensitive to future learning. However, when considering learning from temporal distance or using a high-construal mindset, participants relied more heavily on their theory and were therefore better able to predict the benefits of future learning, thus overcoming the stability bias. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  8. Humanoid infers Archimedes' principle: understanding physical relations and object affordances through cumulative learning experiences.

    PubMed

    Bhat, Ajaz Ahmad; Mohan, Vishwanathan; Sandini, Giulio; Morasso, Pietro

    2016-07-01

    Emerging studies indicate that several species such as corvids, apes and children solve 'The Crow and the Pitcher' task (from Aesop's Fables) in diverse conditions. Hidden beneath this fascinating paradigm is a fundamental question: by cumulatively interacting with different objects, how can an agent abstract the underlying cause-effect relations to predict and creatively exploit potential affordances of novel objects in the context of sought goals? Re-enacting this Aesop's Fable task on a humanoid within an open-ended 'learning-prediction-abstraction' loop, we address this problem and (i) present a brain-guided neural framework that emulates rapid one-shot encoding of ongoing experiences into a long-term memory and (ii) propose four task-agnostic learning rules (elimination, growth, uncertainty and status quo) that correlate predictions from remembered past experiences with the unfolding present situation to gradually abstract the underlying causal relations. Driven by the proposed architecture, the ensuing robot behaviours illustrated causal learning and anticipation similar to natural agents. Results further demonstrate that by cumulatively interacting with few objects, the predictions of the robot in case of novel objects converge close to the physical law, i.e. the Archimedes principle: this being independent of both the objects explored during learning and the order of their cumulative exploration. © 2016 The Author(s).

  9. The Development of Shared Liking of Representational but not Abstract Art in Primary School Children and Their Justifications for Liking

    PubMed Central

    Rodway, Paul; Kirkham, Julie; Schepman, Astrid; Lambert, Jordana; Locke, Anastasia

    2016-01-01

    Understanding how aesthetic preferences are shared among individuals, and its developmental time course, is a fundamental question in aesthetics. It has been shown that semantic associations, in response to representational artworks, overlap more strongly among individuals than those generated by abstract artworks and that the emotional valence of the associations also overlaps more for representational artworks. This valence response may be a key driver in aesthetic appreciation. The current study tested predictions derived from the semantic association account in a developmental context. Twenty 4-, 6-, 8- and 10-year-old children (n = 80) were shown 20 artworks (10 representational, 10 abstract) and were asked to rate each artwork and to explain their decision. Cross-observer agreement in aesthetic preferences increased with age from 4–8 years for both abstract and representational art. However, after age 6 the level of shared appreciation for representational and abstract artworks diverged, with significantly higher levels of agreement for representational than abstract artworks at age 8 and 10. The most common justifications for representational artworks involved subject matter, while for abstract artworks formal artistic properties and color were the most commonly used justifications. Representational artwork also showed a significantly higher proportion of associations and emotional responses than abstract artworks. In line with predictions from developmental cognitive neuroscience, references to the artist as an agent increased between ages 4 and 6 and again between ages 6 and 8, following the development of Theory of Mind. The findings support the view that increased experience with representational content during the life span reduces inter-individual variation in aesthetic appreciation and increases shared preferences. In addition, brain and cognitive development appear to impact on art appreciation at milestone ages. PMID:26903834

  10. Publication rate of studies presented at veterinary anaesthesia specialty meetings during the years 2003-2008.

    PubMed

    Wieser, Marilies; Braun, Christina; Moens, Yves

    2016-03-01

    To assess publication rates, factors predicting publication, and discrepancies between conference abstracts and subsequent full-text publications of abstracts from the veterinary meetings of the American College of Veterinary Anesthesiologists and the Association of Veterinary Anaesthetists from 2003 to 2008. Retrospective cohort study. A total of 607 abstracts were identified and a database search (Scopus, PubMed, CAB) was conducted to identify matching publications. Authors of nonmatching abstracts were contacted to participate in a confidential online survey. Risk ratios were used to assess factors predicting publication and these were tested for significance (p < 0.05) using Fisher's exact test. The overall publication rate was 63.3% and the mean (± SD) time to publication was 25 ± 19 months. Factors significantly associated with subsequent full publication (i.e. publication of a full manuscript in a peer-reviewed journal) were continent of origin (North America), study design (experimental studies), specialty (analgesia) and the presence of a source of funding. The principal reasons why studies remained unpublished were lack of time and responsibility lying with co-authors. Minor changes compared with the original abstract were found in 71.6% of all publications. Major changes were noted in 34.6% and the outcome of the study changed in 7.6%. These data suggest that some of the abstracts reported preliminary findings. Therefore, caution is warranted when quoting abstracts as references in scientific publications. To date, major veterinary journals have not issued recommendations in their author guidelines addressing the use of abstracts as a reference. The authors propose the inclusion of such a statement in author guidelines. © 2015 Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesia and Analgesia.
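    The statistics named above (risk ratios, Fisher's exact test) can be sketched with the standard library alone. The 2x2 table below is hypothetical, not data from the study:

```python
from math import comb

def risk_ratio(a: int, b: int, c: int, d: int) -> float:
    """Risk ratio for a 2x2 table:
                  published   unpublished
    factor present    a            b
    factor absent     c            d
    """
    return (a / (a + b)) / (c / (c + d))

def fisher_one_sided(a: int, b: int, c: int, d: int) -> float:
    """One-sided Fisher exact p-value: P(X >= a) under the
    hypergeometric null with the table margins fixed."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    return sum(comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
               for x in range(a, min(row1, col1) + 1))

# Hypothetical table: 8/10 funded studies published vs 1/10 unfunded.
rr = risk_ratio(8, 2, 1, 9)
p = fisher_one_sided(8, 2, 1, 9)
assert rr == 8.0 and p < 0.05
```

    For real analyses, `scipy.stats.fisher_exact` with `alternative='greater'` should give the same one-sided p-value and also supports the two-sided test used at the p < 0.05 threshold above.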

  11. The Development of Shared Liking of Representational but not Abstract Art in Primary School Children and Their Justifications for Liking.

    PubMed

    Rodway, Paul; Kirkham, Julie; Schepman, Astrid; Lambert, Jordana; Locke, Anastasia

    2016-01-01

    Understanding how aesthetic preferences are shared among individuals, and its developmental time course, is a fundamental question in aesthetics. It has been shown that semantic associations, in response to representational artworks, overlap more strongly among individuals than those generated by abstract artworks and that the emotional valence of the associations also overlaps more for representational artworks. This valence response may be a key driver in aesthetic appreciation. The current study tested predictions derived from the semantic association account in a developmental context. Twenty 4-, 6-, 8- and 10-year-old children (n = 80) were shown 20 artworks (10 representational, 10 abstract) and were asked to rate each artwork and to explain their decision. Cross-observer agreement in aesthetic preferences increased with age from 4-8 years for both abstract and representational art. However, after age 6 the level of shared appreciation for representational and abstract artworks diverged, with significantly higher levels of agreement for representational than abstract artworks at age 8 and 10. The most common justifications for representational artworks involved subject matter, while for abstract artworks formal artistic properties and color were the most commonly used justifications. Representational artwork also showed a significantly higher proportion of associations and emotional responses than abstract artworks. In line with predictions from developmental cognitive neuroscience, references to the artist as an agent increased between ages 4 and 6 and again between ages 6 and 8, following the development of Theory of Mind. The findings support the view that increased experience with representational content during the life span reduces inter-individual variation in aesthetic appreciation and increases shared preferences. In addition, brain and cognitive development appear to impact on art appreciation at milestone ages.

  12. DeepSynergy: predicting anti-cancer drug synergy with Deep Learning

    PubMed Central

    Preuer, Kristina; Lewis, Richard P I; Hochreiter, Sepp; Bender, Andreas; Bulusu, Krishna C; Klambauer, Günter

    2018-01-01

    Abstract Motivation While drug combination therapies are a well-established concept in cancer treatment, identifying novel synergistic combinations is challenging due to the size of the combinatorial space. However, computational approaches have emerged as a time- and cost-efficient way to prioritize combinations to test, based on recently available large-scale combination screening data. Recently, Deep Learning has had an impact on many research areas by achieving new state-of-the-art model performance. However, Deep Learning has not yet been applied to drug synergy prediction; here we present such an approach, termed DeepSynergy. DeepSynergy uses chemical and genomic information as input, a normalization strategy to account for input data heterogeneity, and conical layers to model drug synergies. Results DeepSynergy was compared to other machine learning methods such as Gradient Boosting Machines, Random Forests, Support Vector Machines and Elastic Nets on the largest publicly available synergy dataset with respect to mean squared error. DeepSynergy significantly outperformed the other methods, with an improvement of 7.2% over the second-best method at the prediction of novel drug combinations within the space of explored drugs and cell lines. At this task, the mean Pearson correlation coefficient between the measured and predicted values of DeepSynergy was 0.73. Applying DeepSynergy to classification of these novel drug combinations resulted in a high predictive performance, with an AUC of 0.90. Furthermore, we found that all compared methods exhibit low predictive performance when extrapolating to unexplored drugs or cell lines, which we suggest is due to limitations in the size and diversity of the dataset. We envision that DeepSynergy could be a valuable tool for selecting novel synergistic drug combinations. Availability and implementation DeepSynergy is available via www.bioinf.jku.at/software/DeepSynergy. Contact klambauer@bioinf.jku.at Supplementary information Supplementary data are available at Bioinformatics online. PMID:29253077

  13. Improving protein–protein interactions prediction accuracy using protein evolutionary information and relevance vector machine model

    PubMed Central

    An, Ji‐Yong; Meng, Fan‐Rong; Chen, Xing; Yan, Gui‐Ying; Hu, Ji‐Pu

    2016-01-01

    Abstract Predicting protein–protein interactions (PPIs) is a challenging task that is essential for constructing protein interaction networks and for facilitating our understanding of the mechanisms of biological systems. Although a number of high‐throughput technologies have been proposed to predict PPIs, they have unavoidable shortcomings, including high cost, time intensity, and inherently high false positive rates. For these reasons, many computational methods have been proposed for predicting PPIs. However, the problem is still far from being solved. In this article, we propose a novel computational method called RVM‐BiGP that combines the relevance vector machine (RVM) model and Bi‐gram Probabilities (BiGP) for PPIs detection from protein sequences. The major improvements include: (1) protein sequences are represented using the Bi‐gram probabilities (BiGP) feature representation of a Position Specific Scoring Matrix (PSSM), which contains the protein evolutionary information; (2) to reduce the influence of noise, the Principal Component Analysis (PCA) method is used to reduce the dimension of the BiGP vector; (3) the powerful and robust Relevance Vector Machine (RVM) algorithm is used for classification. Five‐fold cross‐validation experiments were executed on the yeast and Helicobacter pylori datasets, achieving very high accuracies of 94.57% and 90.57%, respectively. These experimental results are significantly better than those of previous methods. To further evaluate the proposed method, we compare it with the state‐of‐the‐art support vector machine (SVM) classifier on the yeast dataset. The experimental results demonstrate that our RVM‐BiGP method is significantly better than the SVM‐based method. In addition, we achieved 97.15% accuracy on the imbalanced yeast dataset, which is higher than that on the balanced yeast dataset.
The promising experimental results show the efficiency and robustness of the proposed method, which can serve as an automatic decision support tool for future proteomics research. To facilitate such studies, we developed a freely available web server called RVM‐BiGP‐PPIs in Hypertext Preprocessor (PHP) for predicting PPIs. The web server, including source code and the datasets, is available at http://219.219.62.123:8888/BiGP/. PMID:27452983
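
    The BiGP feature construction on a PSSM described in improvement (1) can be sketched as follows. The PSSM here is random and row-normalised purely for illustration; the bi-gram matrix accumulates, over consecutive positions, the expected frequency of amino acid a followed by amino acid b, giving a 400-dimensional vector.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical PSSM for a protein of length L over the 20 amino acids,
# row-normalised so each position is a probability distribution.
L = 50
pssm = rng.random((L, 20))
pssm /= pssm.sum(axis=1, keepdims=True)

# Bi-gram probability matrix: expected frequency of amino acid a at
# position i followed by amino acid b at position i+1, summed over i.
bigp = pssm[:-1].T @ pssm[1:]   # shape (20, 20)
feature_vector = bigp.ravel()   # 400-dimensional BiGP feature
```

    In the paper this vector is then reduced with PCA before being fed to the RVM classifier.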

  14. Accurate disulfide-bonding network predictions improve ab initio structure prediction of cysteine-rich proteins

    PubMed Central

    Yang, Jing; He, Bao-Ji; Jang, Richard; Zhang, Yang; Shen, Hong-Bin

    2015-01-01

    Abstract Motivation: Cysteine-rich proteins cover many important families in nature but there are currently no methods specifically designed for modeling the structure of these proteins. The accuracy of disulfide connectivity pattern prediction, particularly for the proteins of higher-order connections, e.g. >3 bonds, is too low to effectively assist structure assembly simulations. Results: We propose a new hierarchical order reduction protocol called Cyscon for disulfide-bonding prediction. The most confident disulfide bonds are first identified and bonding prediction is then focused on the remaining cysteine residues based on SVR training. Compared with purely machine learning-based approaches, Cyscon improved the average accuracy of connectivity pattern prediction by 21.9%. For proteins with more than 5 disulfide bonds, Cyscon improved the accuracy by 585% on the benchmark set of PDBCYS. When applied to 158 non-redundant cysteine-rich proteins, Cyscon predictions helped increase (or decrease) the TM-score (or RMSD) of the ab initio QUARK modeling by 12.1% (or 14.4%). This result demonstrates a new avenue to improve the ab initio structure modeling for cysteine-rich proteins. Availability and implementation: http://www.csbio.sjtu.edu.cn/bioinf/Cyscon/ Contact: zhng@umich.edu or hbshen@sjtu.edu.cn Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26254435

  15. Frailty screening methods for predicting outcome of a comprehensive geriatric assessment in elderly patients with cancer: a systematic review.

    PubMed

    Hamaker, Marije E; Jonker, Judith M; de Rooij, Sophia E; Vos, Alinda G; Smorenburg, Carolien H; van Munster, Barbara C

    2012-10-01

    Comprehensive geriatric assessment (CGA) is done to detect vulnerability in elderly patients with cancer so that treatment can be adjusted accordingly; however, this process is time-consuming, and pre-screening is often used to identify fit patients who are able to receive standard treatment versus those in whom a full CGA should be done. We aimed to assess which of the available frailty screening methods show the best sensitivity and specificity for predicting the presence of impairments on CGA in elderly patients with cancer. We did a systematic search of Medline and Embase, and a hand-search of conference abstracts, for studies on the association between frailty screening outcome and results of CGA in elderly patients with cancer. Our search identified 4440 reports, of which 22 publications from 14 studies were included in this Review. Seven different frailty screening methods were assessed. The median sensitivity and specificity of each screening method for predicting frailty on CGA were as follows: Vulnerable Elders Survey-13 (VES-13), 68% and 78%; Geriatric 8 (G8), 87% and 61%; Triage Risk Screening Tool (TRST 1+; patient considered frail if one or more impairments present), 92% and 47%; Groningen Frailty Index (GFI), 57% and 86%; Fried frailty criteria, 31% and 91%; Barber, 59% and 79%; and abbreviated CGA (aCGA), 51% and 97%. However, even in the case of the highest sensitivity, the negative predictive value was only roughly 60%. G8 and TRST 1+ had the highest sensitivity for frailty, but both had poor specificity and negative predictive value. These findings suggest that, for now, it might be beneficial for all elderly patients with cancer to receive a complete geriatric assessment, since available frailty screening methods have insufficient discriminative power to select patients for further assessment. Copyright © 2012 Elsevier Ltd. All rights reserved.
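
    The review's central finding, that even a highly sensitive screen yields a modest negative predictive value, follows directly from the high prevalence of frailty in this population. A short worked example using the G8 figures quoted above; the 70% prevalence is an illustrative assumption, not a number from the review:

```python
# Why a sensitive screen can still have a modest negative predictive value:
# NPV depends on prevalence. Uses the G8 sensitivity/specificity from this
# review and an assumed 70% prevalence of frailty on CGA (illustrative only).
sensitivity = 0.87   # G8
specificity = 0.61   # G8
prevalence = 0.70    # assumption

tp = sensitivity * prevalence            # true positives per unit population
fn = (1 - sensitivity) * prevalence      # frail patients the screen misses
tn = specificity * (1 - prevalence)      # correctly screened-out fit patients
fp = (1 - specificity) * (1 - prevalence)

npv = tn / (tn + fn)   # probability a "fit" screen result is truly fit
ppv = tp / (tp + fp)
```

    With these inputs the NPV comes out near two-thirds, consistent with the "roughly 60%" figure in the abstract.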

  16. The Direction of Hemispheric Asymmetries for Object Categorization at Different Levels of Abstraction Depends on the Task

    ERIC Educational Resources Information Center

    Studer, Tobias; Hubner, Ronald

    2008-01-01

    In this study hemispheric asymmetries for categorizing objects at the basic versus subordinate level of abstraction were investigated. As predictions derived from different theoretical approaches are contradictory and experimental evidence is inconclusive in this regard, we conducted two categorization experiments, where we contrasted two…

  17. The Association between Psychological Distance and Construal Level: Evidence from an Implicit Association Test

    ERIC Educational Resources Information Center

    Bar-Anan, Yoav; Liberman, Nira; Trope, Yaacov

    2006-01-01

    According to construal level theory (N. Liberman, Y. Trope, & E. Stephan, in press; Y. Trope & N. Liberman, 2003), people use a more abstract, high construal level when judging, perceiving, and predicting more psychologically distal targets, and they judge more abstract targets as being more psychologically distal. The present research…

  18. THE PUTATIVE HIGH ACTIVITY VARIANT CYP3A4*1B PREDICTS THE ONSET OF PUBERTY IN YOUNG GIRLS. (R825816)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  19. Asking Questions in Child English: Evidence for Early Abstract Representations

    ERIC Educational Resources Information Center

    Pozzan, Lucia; Valian, Virginia

    2017-01-01

    We compare the predictions of two different accounts of first language acquisition by investigating the relative contributions of abstract syntax and input frequency to the elicited production of main and embedded questions by 36 monolingual English-speaking toddlers aged 3;00 to 5;11. In particular, we investigate whether children's accuracy…

  20. THREE-DIMENSIONAL QUANTITATIVE STRUCTURE-PROPERTY RELATIONSHIP (3D-QSPR) MODELS FOR PREDICTION OF THERMODYNAMIC PROPERTIES OF POLYCHLORINATED BIPHENYLS (PCBS): ENTHALPY OF VAPORIZATION. (R826133)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  1. Deep learning of mutation-gene-drug relations from the literature.

    PubMed

    Lee, Kyubum; Kim, Byounggun; Choi, Yonghwa; Kim, Sunkyu; Shin, Wonho; Lee, Sunwon; Park, Sungjoon; Kim, Seongsoon; Tan, Aik Choon; Kang, Jaewoo

    2018-01-25

    Molecular biomarkers that can predict drug efficacy in cancer patients are crucial components for the advancement of precision medicine. However, identifying these molecular biomarkers remains a laborious and challenging task. Next-generation sequencing of patients and preclinical models has increasingly led to the identification of novel gene-mutation-drug relations, and these results have been reported and published in the scientific literature. Here, we present two new computational methods that utilize all the PubMed articles as domain-specific background knowledge to assist in the extraction and curation of gene-mutation-drug relations from the literature. The first method uses the Biomedical Entity Search Tool (BEST) scoring results as some of the features to train the machine learning classifiers. The second method uses not only the BEST scoring results, but also word vectors in a deep convolutional neural network model that are constructed from and trained on numerous documents such as PubMed abstracts and Google News articles. Using the features obtained from both the BEST search engine scores and word vectors, we extract mutation-gene and mutation-drug relations from the literature using machine learning classifiers such as random forests and deep convolutional neural networks. Our methods achieved better results compared with the state-of-the-art methods. We used our proposed features in a simple machine learning model, and obtained F1-scores of 0.96 and 0.82 for mutation-gene and mutation-drug relation classification, respectively. We also developed a deep learning classification model using convolutional neural networks, BEST scores, and word embeddings that are pre-trained on PubMed or Google News data. Using deep learning, the classification accuracy improved, and F1-scores of 0.96 and 0.86 were obtained for the mutation-gene and mutation-drug relations, respectively.
We believe that our computational methods described in this research could be used as an important tool in identifying molecular biomarkers that predict drug responses in cancer patients. We also built a database of these mutation-gene-drug relations that were extracted from all the PubMed abstracts. We believe that our database can prove to be a valuable resource for precision medicine researchers.

  2. Similarity-based Regularized Latent Feature Model for Link Prediction in Bipartite Networks.

    PubMed

    Wang, Wenjun; Chen, Xue; Jiao, Pengfei; Jin, Di

    2017-12-05

    Link prediction is an attractive research topic in the field of data mining and has significant applications in improving the performance of recommendation systems and exploring the evolving mechanisms of complex networks. A variety of real-world complex systems can be abstractly represented as bipartite networks, in which there are two types of nodes and no links connect nodes of the same type. In this paper, we propose a framework for link prediction in bipartite networks by combining the similarity-based structure and the latent feature model from a new perspective. The framework is called Similarity Regularized Nonnegative Matrix Factorization (SRNMF); it explicitly takes local characteristics into consideration and encodes the geometrical information of the networks by constructing a similarity-based matrix. We also develop an iterative scheme to solve the objective function based on gradient descent. Extensive experiments on a variety of real-world bipartite networks show that the proposed framework delivers more competitive and stable link-prediction performance than state-of-the-art methods.
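
    A minimal sketch of similarity-regularised nonnegative factorisation for bipartite link prediction, in the spirit of the framework above. The cosine-similarity matrix and the plain projected-gradient updates are illustrative assumptions; the paper's actual similarity construction and iterative scheme may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy bipartite adjacency matrix (type-1 nodes x type-2 nodes), links in {0,1}.
B = (rng.random((8, 6)) < 0.3).astype(float)

# Node similarity among the first node type: cosine similarity of adjacency
# rows (an illustrative stand-in for the paper's similarity-based matrix).
norms = np.linalg.norm(B, axis=1, keepdims=True) + 1e-12
S = (B / norms) @ (B / norms).T
Lap = np.diag(S.sum(axis=1)) - S        # graph Laplacian of the similarity

k, lam, lr = 3, 0.1, 0.01               # rank, regularisation, step size
U = rng.random((8, k))
V = rng.random((6, k))

for _ in range(200):
    R = U @ V.T - B                      # reconstruction residual
    grad_U = R @ V + lam * (Lap @ U)     # similarity regularisation on U
    grad_V = R.T @ U
    U = np.maximum(U - lr * grad_U, 0.0) # projected gradient: keep nonnegative
    V = np.maximum(V - lr * grad_V, 0.0)

scores = U @ V.T                          # link-prediction scores
```

    Candidate links are then ranked by `scores` for node pairs without an observed edge.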

  3. B-mode Ultrasound Versus Color Doppler Twinkling Artifact in Detecting Kidney Stones

    PubMed Central

    Harper, Jonathan D.; Hsi, Ryan S.; Shah, Anup R.; Dighe, Manjiri K.; Carter, Stephen J.; Moshiri, Mariam; Paun, Marla; Lu, Wei; Bailey, Michael R.

    2013-01-01

    Abstract Purpose To compare color Doppler twinkling artifact and B-mode ultrasonography in detecting kidney stones. Patients and Methods Nine patients with recent CT scans prospectively underwent B-mode and twinkling artifact color Doppler ultrasonography on a commercial ultrasound machine. Video segments of the upper pole, interpolar area, and lower pole were created, randomized, and independently reviewed by three radiologists. Receiver operating characteristics were determined. Results There were 32 stones in 18 kidneys with a mean stone size of 8.9±7.5 mm. B-mode ultrasonography had 71% sensitivity, 48% specificity, 52% positive predictive value, and 68% negative predictive value, while twinkling artifact Doppler ultrasonography had 56% sensitivity, 74% specificity, 62% positive predictive value, and 68% negative predictive value. Conclusions When used alone, B-mode is more sensitive, but twinkling artifact is more specific in detecting kidney stones. This information may help users employ twinkling and B-mode to identify stones and developers to improve signal processing to harness the fundamental acoustic differences to ultimately improve stone detection. PMID:23067207

  4. In Silico Dynamics: computer simulation in a Virtual Embryo (SOT)

    EPA Science Inventory

    Abstract: Utilizing cell biological information to predict higher order biological processes is a significant challenge in predictive toxicology. This is especially true for highly dynamical systems such as the embryo where morphogenesis, growth and differentiation require preci...

  5. Impact of Initial Condition Errors and Precipitation Forecast Bias on Drought Simulation and Prediction in the Huaihe River Basin

    NASA Astrophysics Data System (ADS)

    Xu, H.; Luo, L.; Wu, Z.

    2016-12-01

    Drought, regarded as one of the major natural disasters worldwide, is not always easy to detect and forecast. Hydrological models coupled with Numerical Weather Prediction (NWP) have become a relatively effective approach for drought monitoring and prediction. The accuracy of the hydrological initial condition (IC) and the skill of the NWP precipitation forecast can both heavily affect the quality and skill of a hydrological forecast. In this study, the Variable Infiltration Capacity (VIC) model and the Global Environmental Multiscale (GEM) model were used to investigate the roles of IC and NWP forecast accuracy in hydrological predictions. A rev-ESP type experiment was conducted for a number of drought events in the Huaihe river basin. The experiment suggests that errors in ICs indeed affect the drought simulations by VIC and thus drought monitoring. Although errors introduced in the ICs diminish gradually, their influence can sometimes last beyond 12 months. Using the soil moisture anomaly percentage index (SMAPI) as the metric of drought severity for the study region, we quantify the time scale over which IC errors influence the forecast. The analysis shows that this time scale is directly related to the magnitude of the introduced IC error and the average precipitation intensity. To explore how systematic bias correction of GEM-forecasted precipitation affects precipitation and hydrological forecasts, we used both station and gridded observations to remove biases from the forecast data. VIC was then run with the different bias-corrected precipitation inputs during drought events to investigate changes in the drought simulations, demonstrating short-term rolling drought prediction with the better-performing corrected precipitation forecast.
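
    The severity metric named in this abstract, SMAPI, is simply the departure of soil moisture from its climatological mean expressed as a percentage of that mean. A minimal sketch with illustrative values and an arbitrary drought threshold:

```python
# Soil Moisture Anomaly Percentage Index (SMAPI): percentage departure of
# soil moisture from its climatological mean. Values and threshold below
# are illustrative, not from the Huaihe study.
climatology = 0.30          # long-term mean soil moisture for this period
observed = 0.21             # simulated/observed soil moisture

smapi = (observed - climatology) / climatology * 100.0   # percent anomaly

drought = smapi <= -15.0    # example severity threshold (assumption)
```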

  6. Clarifying the abstracts of systematic literature reviews*

    PubMed Central

    Hartley, James

    2000-01-01

    Background: There is a small body of research on improving the clarity of abstracts in general that is relevant to improving the clarity of abstracts of systematic reviews. Objectives: To summarize this earlier research and indicate its implications for writing the abstracts of systematic reviews. Method: Literature review with commentary on three main features affecting the clarity of abstracts: their language, structure, and typographical presentation. Conclusions: The abstracts of systematic reviews should be easier to read than the abstracts of medical research articles, as they are targeted at a wider audience. The aims, methods, results, and conclusions of systematic reviews need to be presented in a consistent way to help search and retrieval. The typographic detailing of the abstracts (type-sizes, spacing, and weights) should be planned to help, rather than confuse, the reader. PMID:11055300

  7. Odors and Air Pollution: A Bibliography with Abstracts.

    ERIC Educational Resources Information Center

    Environmental Protection Agency, Research Triangle Park, NC. Office of Air Programs.

    The annotated bibliography presents a compilation of abstracts which deal with odors as they relate to air pollution. The abstracts are arranged within the following categories: Emission sources; Control methods; Measurement methods; Air quality measurements; Atmospheric interaction; Basic science and technology; Effects-human health;…

  8. Efficient embedding of complex networks to hyperbolic space via their Laplacian

    PubMed Central

    Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.

    2016-01-01

    The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction. PMID:27445157
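
    The Laplacian-based step of this approach can be sketched as a Laplacian-eigenmaps computation on a toy random graph: the eigenvectors of the graph Laplacian with the smallest non-zero eigenvalues supply node coordinates, whose angular part can serve as a hyperbolic angular coordinate. This sketch assumes that reading and omits the radial (popularity) coordinate and all tuning of the published method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy undirected network as a symmetric adjacency matrix with empty diagonal.
n = 10
A = (rng.random((n, n)) < 0.35).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Graph Laplacian and its spectrum; eigenvalues come back in ascending order.
D = np.diag(A.sum(axis=1))
Lap = D - A
eigvals, eigvecs = np.linalg.eigh(Lap)

coords = eigvecs[:, 1:3]                        # skip the trivial constant eigenvector
theta = np.arctan2(coords[:, 1], coords[:, 0])  # angular coordinate per node
```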

  9. Estimating West Nile virus transmission period in Pennsylvania using an optimized degree-day model.

    PubMed

    Chen, Shi; Blanford, Justine I; Fleischer, Shelby J; Hutchinson, Michael; Saunders, Michael C; Thomas, Matthew B

    2013-07-01

    Abstract We provide calibrated degree-day models to predict potential West Nile virus (WNV) transmission periods in Pennsylvania. We begin by following the standard approach of treating the degree-days necessary for the virus to complete the extrinsic incubation period (EIP), and mosquito longevity, as constants. This approach failed to adequately explain virus transmission periods based on mosquito surveillance data from 4 locations (Harrisburg, Philadelphia, Pittsburgh, and Williamsport) in Pennsylvania from 2002 to 2008. Allowing the EIP and adult longevity to vary across time and space improved model fit substantially. The calibrated models increase the ability to successfully predict the WNV transmission period in Pennsylvania to 70-80%, compared to less than 30% for the uncalibrated model. Model validation showed the optimized models to be robust in 3 of the locations, although errors remained for Philadelphia. These models and methods could provide useful tools to predict the WNV transmission period from surveillance datasets, assess potential WNV risk, and inform mosquito surveillance strategies.
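
    The degree-day accumulation underlying such EIP models can be sketched as follows: daily degree-days above a developmental base temperature are summed until a required total is reached. The base temperature, the degree-day requirement, and the temperature series below are illustrative assumptions, not the calibrated Pennsylvania values.

```python
# Minimal degree-day accumulation for an extrinsic incubation period (EIP).
base_temp = 14.3            # assumed developmental threshold (deg C)
eip_requirement = 50.0      # assumed degree-days needed to complete EIP

daily_mean_temps = [18.0, 21.5, 24.0, 26.5, 25.0, 23.0, 20.0]

accumulated = 0.0
days_to_complete = None
for day, temp in enumerate(daily_mean_temps, start=1):
    accumulated += max(temp - base_temp, 0.0)   # no credit below the base
    if days_to_complete is None and accumulated >= eip_requirement:
        days_to_complete = day                  # first day the EIP is done
```

    Calibration, as described in the abstract, amounts to letting quantities like `eip_requirement` vary across time and space instead of holding them constant.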

  10. Efficient embedding of complex networks to hyperbolic space via their Laplacian

    NASA Astrophysics Data System (ADS)

    Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.

    2016-07-01

    The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction.

  11. Thermomechanical Modeling of Sintered Silver - A Fracture Mechanics-based Approach: Extended Abstract: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paret, Paul P; DeVoto, Douglas J; Narumanchi, Sreekant V

    Sintered silver has proven to be a promising candidate for use as a die-attach and substrate-attach material in automotive power electronics components. It holds promise of greater reliability than lead-based and lead-free solders, especially at higher temperatures (greater than 200 degrees Celsius). Accurate predictive lifetime models of sintered silver need to be developed and its failure mechanisms thoroughly characterized before it can be deployed as a die-attach or substrate-attach material in wide-bandgap device-based packages. We present a finite element method (FEM) modeling methodology that can offer greater accuracy in predicting the failure of sintered silver under accelerated thermal cycling. A fracture mechanics-based approach is adopted in the FEM model, and J-integral/thermal cycle values are computed. In this paper, we outline the procedures for obtaining the J-integral/thermal cycle values in a computational model and report on the possible advantage of using these values as modeling parameters in a predictive lifetime model.

  12. A hierarchical clustering methodology for the estimation of toxicity.

    PubMed

    Martin, Todd M; Harten, Paul; Venkatapathy, Raghuraman; Das, Shashikala; Young, Douglas M

    2008-01-01

    ABSTRACT A quantitative structure-activity relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural similarity is defined in terms of 2-D physicochemical descriptors (such as connectivity and E-state indices). A genetic algorithm-based technique is used to generate statistically valid QSAR models for each cluster (using the pool of descriptors described above). The toxicity for a given query compound is estimated using the weighted average of the predictions from the closest cluster from each step in the hierarchical clustering assuming that the compound is within the domain of applicability of the cluster. The hierarchical clustering methodology was tested using a Tetrahymena pyriformis acute toxicity data set containing 644 chemicals in the training set and with two prediction sets containing 339 and 110 chemicals. The results from the hierarchical clustering methodology were compared to the results from several different QSAR methodologies.
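
    The prediction step described above, a weighted average over the closest clusters whose domain of applicability contains the query compound, can be sketched with stand-in numbers. The centroids, per-cluster model outputs, distance weighting, and applicability radius are all hypothetical; the actual methodology uses Ward clustering on 2-D descriptors and per-cluster genetic-algorithm QSAR models.

```python
import numpy as np

# Each cluster of structurally similar training compounds has its own fitted
# model; a query compound is predicted by a distance-weighted average over
# the clusters whose domain of applicability it falls inside.
centroids = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])  # cluster centres
cluster_predictions = np.array([1.2, 3.4, 2.1])  # per-cluster model outputs
applicability_radius = 6.0                        # assumed domain cut-off

query = np.array([1.0, 1.0])                      # descriptor vector of query

dists = np.linalg.norm(centroids - query, axis=1)
in_domain = dists <= applicability_radius         # applicability-domain check
weights = np.where(in_domain, 1.0 / (dists + 1e-9), 0.0)
toxicity_estimate = float(np.sum(weights * cluster_predictions) / weights.sum())
```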

  13. Intra Frame Coding In Advanced Video Coding Standard (H.264) to Obtain Consistent PSNR and Reduce Bit Rate for Diagonal Down Left Mode Using Gaussian Pulse

    NASA Astrophysics Data System (ADS)

    Manjanaik, N.; Parameshachari, B. D.; Hanumanthappa, S. N.; Banu, Reshma

    2017-08-01

    The intra-prediction process of the H.264 video coding standard is used to code the first (intra) frame of a video and achieves better coding efficiency than earlier video coding standards. Intra-frame coding reduces spatial pixel redundancy within the current frame, reduces computational complexity, and provides better rate-distortion performance. Intra frames are conventionally coded with the rate-distortion optimization (RDO) method, which increases computational complexity and bit rate and reduces picture quality, making it difficult to use in real-time applications; many researchers have therefore developed fast mode-decision algorithms for intra-frame coding. Previous fast mode-decision intra-prediction algorithms for H.264, based on various techniques, suffered increased bit rate and degraded picture quality (PSNR) at different quantization parameters; many of them only reduced computational complexity or saved encoding time, at the cost of higher bit rate and lower picture quality. To avoid the increase in bit rate and the loss of picture quality, this paper develops a better approach: a Gaussian pulse for intra-frame coding with the diagonal down-left intra-prediction mode, to achieve higher coding efficiency in terms of PSNR and bit rate. In the proposed method, a Gaussian pulse is multiplied with each 4x4 block of frequency-domain coefficients of the 4x4 sub-macroblocks of the current frame before quantization. Multiplying each 4x4 integer-transformed coefficient block by the Gaussian pulse scales the coefficient information in a reversible manner, so the resulting signal is abstracted.
Frequency samples are abstracted in a known and controllable manner without intermixing of coefficients, which prevents the picture from being badly degraded at higher quantization parameter values. The proposed work was implemented using MATLAB and the JM 18.6 reference software. It measures PSNR, bit rate, and compression of the intra frame of YUV video sequences at QCIF resolution under different quantization parameter values, with the Gaussian pulse applied in the diagonal down-left intra-prediction mode. The simulation results of the proposed algorithm are tabulated and compared with the previous algorithm of Tian et al. The proposed algorithm reduces bit rate by 30.98% on average and maintains consistent picture quality for QCIF sequences compared with the Tian et al. method.
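
    The reversible coefficient scaling at the heart of the method can be sketched as follows: each 4x4 block of transform coefficients is multiplied element-wise by a Gaussian pulse before quantization, and the decoder divides by the same pulse. The pulse width and the stand-in coefficient block are illustrative assumptions; the integration with the H.264 quantizer is in the paper.

```python
import numpy as np

# 4x4 Gaussian pulse: weight decays with frequency index (u, v).
sigma = 2.0                                          # assumed pulse width
u, v = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
pulse = np.exp(-(u**2 + v**2) / (2.0 * sigma**2))

coeffs = np.arange(16, dtype=float).reshape(4, 4)    # stand-in transform block
scaled = coeffs * pulse                              # encoder side
restored = scaled / pulse                            # decoder side: reversible
```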

  14. A novel two-dimensional liquid-chromatography method for online prediction of the toxicity of transformation products of benzophenones after water chlorination.

    PubMed

    Li, Jian; Ma, Li-Yun; Xu, Li; Shi, Zhi-Guo

    2015-08-01

    Benzophenone-type UV filters (BPs) are ubiquitous in the environment. Transformation products (TPs) of BPs with suspected toxicity are likely to be produced during disinfection of water by chlorination. To quickly predict the toxicity of TPs, in this study, a novel two-dimensional liquid-chromatography (2D-LC) method was established in which the objective of the first dimension was to separate the multiple components of the BPs sample after chlorination, using a reversed-phase liquid-chromatography mode. A biochromatographic system, i.e. bio-partitioning micellar chromatography with the polyoxyethylene (23) lauryl ether aqueous solution as the mobile phase, served as the second dimension to predict the toxicity of the fraction from the first dimension on the basis of the quantitative retention-activity relationships (QRARs) model. Six BPs, namely 2,4-dihydroxybenzophenone, oxybenzone, 4-hydroxybenzophenone, 2-hydroxy-4-methoxybenzophenone-5-sulfonic acid, 2,2'-dihydroxy-4,4'-dimethoxybenzophenone and 2,2'-dihydroxy-4-methoxybenzophenone, were the target analytes subjected to chlorination. The products of these BPs after chlorination were directly injected to the 2D-LC system for analysis. The results indicated that most TPs may be less toxic than their parent chemicals, but some may be more toxic, and that intestinal toxicity of TPs may be more obvious than blood toxicity. The proposed method is time-saving, high-throughput, and reliable, and has great potential for predicting toxicity or bioactivity of unknown and/or known components in a complex sample. Graphical Abstract The scheme for the 2D-LC online prediction of toxicity of the transformation products of benzophenone-type UV filters after chlorination.

  15. Characteristics of genomic signatures derived using univariate methods and mechanistically anchored functional descriptors for predicting drug- and xenobiotic-induced nephrotoxicity.

    PubMed

    Shi, Weiwei; Bugrim, Andrej; Nikolsky, Yuri; Nikolskya, Tatiana; Brennan, Richard J

    2008-01-01

    ABSTRACT The ideal toxicity biomarker is composed of the properties of prediction (is detected prior to traditional pathological signs of injury), accuracy (high sensitivity and specificity), and mechanistic relationships to the endpoint measured (biological relevance). Gene expression-based toxicity biomarkers ("signatures") have shown good predictive power and accuracy, but are difficult to interpret biologically. We have compared different statistical methods of feature selection with knowledge-based approaches, using GeneGo's database of canonical pathway maps, to generate gene sets for the classification of renal tubule toxicity. The gene set selection algorithms include four univariate analyses: t-statistics, fold-change, B-statistics, and RankProd, and their combination and overlap for the identification of differentially expressed probes. Enrichment analysis following the results of the four univariate analyses, Hotelling T-square test, and, finally out-of-bag selection, a variant of cross-validation, were used to identify canonical pathway maps-sets of genes coordinately involved in key biological processes-with classification power. Differentially expressed genes identified by the different statistical univariate analyses all generated reasonably performing classifiers of tubule toxicity. Maps identified by enrichment analysis or Hotelling T-square had lower classification power, but highlighted perturbed lipid homeostasis as a common discriminator of nephrotoxic treatments. The out-of-bag method yielded the best functionally integrated classifier. The map "ephrins signaling" performed comparably to a classifier derived using sparse linear programming, a machine learning algorithm, and represents a signaling network specifically involved in renal tubule development and integrity. 
Such functional descriptors of toxicity promise to better integrate predictive toxicogenomics with mechanistic analysis, facilitating the interpretation and risk assessment of predictive genomic investigations.

  16. Clinicopathologic characteristics associated with long-term survival in advanced epithelial ovarian cancer: an NRG Oncology/Gynecologic Oncology Group ancillary data study

    PubMed Central

    Hamilton, C. A.; Miller, A.; Casablanca, Y.; Horowitz, N. S.; Rungruang, B.; Krivak, T. C.; Richard, S. D.; Rodriguez, N.; Birrer, M.J.; Backes, F.J.; Geller, M.A.; Quinn, M.; Goodheart, M.J.; Mutch, D.G.; Kavanagh, J.J.; Maxwell, G. L.; Bookman, M. A.

    2018-01-01

    Objective To identify clinicopathologic factors associated with 10-year overall survival in epithelial ovarian cancer (EOC) and primary peritoneal cancer (PPC), and to develop a predictive model identifying long-term survivors. Methods Demographic, surgical, and clinicopathologic data were abstracted from GOG 182 records. The association between clinical variables and long-term survival (LTS) (>10 years) was assessed using multivariable regression analysis. Bootstrap methods were used to develop predictive models from known prognostic clinical factors and predictive accuracy was quantified using optimism-adjusted area under the receiver operating characteristic curve (AUC). Results The analysis dataset included 3,010 evaluable patients, of whom 195 survived more than ten years. These patients were more likely to have better performance status, endometrioid histology, stage III (rather than stage IV) disease, absence of ascites, less extensive preoperative disease distribution, microscopic residual disease following cytoreduction (R0), and decreased complexity of surgery (p<0.01). Multivariable regression analysis revealed that lower CA-125 levels, absence of ascites, stage, and R0 were significant independent predictors of LTS. A predictive model created using these variables had an AUC=0.729, which outperformed any of the individual predictors. Conclusions The absence of ascites, a low CA-125, stage, and R0 at the time of cytoreduction are factors associated with LTS when controlling for other confounders. An extensively annotated clinicopathologic prediction model for LTS fell short of clinical utility, suggesting that prognostic molecular profiles are needed to better predict which patients are likely to be long-term survivors. PMID:29195926
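    The bootstrap optimism correction used to adjust the AUC can be illustrated in miniature. The sketch below is not the study's model: it substitutes a plain least-squares linear score for the study's prognostic model and synthetic data for the GOG 182 records, and shows only the generic Harrell-style procedure of subtracting the average bootstrap optimism from the apparent AUC.

```python
import numpy as np

rng = np.random.default_rng(0)

def auc(scores, y):
    """Rank-based AUC: probability that a random positive outranks a
    random negative (ties ignored for brevity)."""
    pos, neg = scores[y == 1], scores[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

def optimism_adjusted_auc(X, y, n_boot=200):
    """Bootstrap optimism correction.  The 'model' here is a plain
    least-squares linear score -- a stand-in, not the study's model."""
    fit = lambda A, t: np.linalg.lstsq(A, t, rcond=None)[0]
    w = fit(X, y)
    apparent = auc(X @ w, y)
    optimism = 0.0
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))   # resample with replacement
        wb = fit(X[idx], y[idx])
        # optimism = performance on the bootstrap sample minus performance
        # of the same refitted model on the original data
        optimism += auc(X[idx] @ wb, y[idx]) - auc(X @ wb, y)
    return apparent - optimism / n_boot
```

    The idea is that refitting on each bootstrap sample and rescoring on the original data estimates how much the apparent AUC flatters the model; subtracting the average gap gives the optimism-adjusted estimate.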

  17. Lossless medical image compression using geometry-adaptive partitioning and least square-based prediction.

    PubMed

    Song, Xiaoying; Huang, Qijun; Chang, Sheng; He, Jin; Wang, Hao

    2018-06-01

    To improve the compression rates for lossless compression of medical images, an efficient algorithm, based on irregular segmentation and region-based prediction, is proposed in this paper. Considering that the first step of a region-based compression algorithm is segmentation, this paper proposes a hybrid method that combines geometry-adaptive partitioning and quadtree partitioning to achieve adaptive irregular segmentation for medical images. Then, least square (LS)-based predictors are adaptively designed for each region (regular subblock or irregular subregion). The proposed adaptive algorithm not only exploits spatial correlation between pixels but also utilizes local structural similarity, resulting in efficient compression performance. Experimental results show that the average compression performance of the proposed algorithm is 10.48, 4.86, 3.58, and 0.10% better than that of JPEG 2000, CALIC, EDP, and JPEG-LS, respectively.
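    The core of LS-based prediction can be sketched for a single pixel. This is an illustrative sketch only, not the paper's algorithm: the paper trains one predictor per segmented region, while the snippet below simply fits least-squares weights over a small causal window of already-coded pixels.

```python
import numpy as np

def ls_predict_pixel(img, r, c, window=6):
    """Least-squares prediction of img[r, c] from its west, north and
    north-west neighbours, with weights trained on pixels above/left of
    (r, c) in a local window.  Illustrative sketch only."""
    samples, targets = [], []
    for i in range(max(1, r - window), r + 1):
        for j in range(max(1, c - window), min(img.shape[1] - 1, c + window) + 1):
            if i == r and j >= c:          # stop at the pixel being predicted
                break
            samples.append([img[i, j - 1], img[i - 1, j], img[i - 1, j - 1]])
            targets.append(img[i, j])
    # fit weights so that  W, N, NW  combine into the current pixel
    w, *_ = np.linalg.lstsq(np.array(samples, float),
                            np.array(targets, float), rcond=None)
    context = np.array([img[r, c - 1], img[r - 1, c], img[r - 1, c - 1]], float)
    return float(context @ w)
```

    In a lossless coder only the prediction residual is entropy-coded; the better the local fit, the smaller the residuals.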

  18. Speaking two "Languages" in America: A semantic space analysis of how presidential candidates and their supporters represent abstract political concepts differently.

    PubMed

    Li, Ping; Schloss, Benjamin; Follmer, D Jake

    2017-10-01

    In this article we report a computational semantic analysis of the presidential candidates' speeches in the two major political parties in the USA. In Study One, we modeled the political semantic spaces as a function of party, candidate, and time of election, and findings revealed patterns of differences in the semantic representation of key political concepts and the changing landscapes in which the presidential candidates align or misalign with their parties in terms of the representation and organization of politically central concepts. Our models further showed that the 2016 US presidential nominees had distinct conceptual representations from those of previous election years, and these patterns did not necessarily align with their respective political parties' average representation of the key political concepts. In Study Two, structural equation modeling demonstrated that reported political engagement among voters differentially predicted reported likelihoods of voting for Clinton versus Trump in the 2016 presidential election. Study Three indicated that Republicans and Democrats showed distinct, systematic word association patterns for the same concepts/terms, which could be reliably distinguished using machine learning methods. These studies suggest that given an individual's political beliefs, we can make reliable predictions about how they understand words, and given how an individual understands those same words, we can also predict an individual's political beliefs. Our study provides a bridge between semantic space models and abstract representations of political concepts on the one hand, and the representations of political concepts and citizens' voting behavior on the other.

  19. "What is relevant in a text document?": An interpretable machine learning approach

    PubMed Central

    Arras, Leila; Horn, Franziska; Montavon, Grégoire; Müller, Klaus-Robert

    2017-01-01

    Text documents can be described by a number of abstract concepts such as semantic category, writing style, or sentiment. Machine learning (ML) models have been trained to automatically map documents to these abstract concepts, making it possible to annotate very large text collections, more than could be processed by a human in a lifetime. Besides predicting the text’s category very accurately, it is also highly desirable to understand how and why the categorization process takes place. In this paper, we demonstrate that such understanding can be achieved by tracing the classification decision back to individual words using layer-wise relevance propagation (LRP), a recently developed technique for explaining predictions of complex non-linear classifiers. We train two word-based ML models, a convolutional neural network (CNN) and a bag-of-words SVM classifier, on a topic categorization task and adapt the LRP method to decompose the predictions of these models onto words. The resulting scores indicate how much individual words contribute to the overall classification decision. This enables one to distill relevant information from text documents without an explicit semantic information extraction step. We further use the word-wise relevance scores for generating novel vector-based document representations which capture semantic information. Based on these document vectors, we introduce a measure of model explanatory power and show that, although the SVM and CNN models perform similarly in terms of classification accuracy, the latter exhibits a higher level of explainability which makes it more comprehensible for humans and potentially more useful for other applications. PMID:28800619
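    For a purely linear bag-of-words classifier, the base case of relevance decomposition reduces to attributing each word its additive contribution to the score. The sketch below uses made-up weights and vocabulary (not from the paper); for deep models such as the CNN, LRP propagates these relevances backwards layer by layer, which is not shown here.

```python
import numpy as np

def lrp_linear(w, bias, x):
    """Base case of layer-wise relevance propagation for a linear
    bag-of-words classifier: each word's relevance is its additive
    contribution w_i * x_i, so relevances sum to (score - bias)."""
    relevance = w * x
    return relevance, float(relevance.sum() + bias)

# Hypothetical vocabulary, weights and word counts (illustrative only).
vocab = ["goal", "match", "election", "vote"]
w = np.array([1.2, 0.8, -1.5, -1.0])   # positive = sports, negative = politics
x = np.array([2.0, 1.0, 0.0, 1.0])     # word counts in one document
rel, score = lrp_linear(w, 0.0, x)     # rel = [2.4, 0.8, 0.0, -1.0]
```

    The conservation property, that word relevances sum to the classifier's score, is what lets such scores be read as a decomposition of the decision rather than an arbitrary saliency heuristic.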

  20. Exponential Formulae and Effective Operations

    NASA Technical Reports Server (NTRS)

    Mielnik, Bogdan; Fernandez, David J. C.

    1996-01-01

    One of the standard methods to predict the phenomena of squeezing consists in splitting the unitary evolution operator into a product of simpler operations. The technique, while mathematically general, is not so simple in applications and leaves some pragmatic problems open. We report an extended class of exponential formulae, which yield a quicker insight into the laboratory details for a class of squeezing operations and, moreover, can alternatively be used to programme different types of operations, such as: (1) the free evolution inversion; and (2) the soft simulation of sharp kicks (so that all abstract results involving kicks of the oscillator potential become realistic laboratory prescriptions).

  1. Definition and Formulation of Scientific Prediction and Its Role in Inquiry-Based Laboratories

    ERIC Educational Resources Information Center

    Mauldin, Robert F.

    2011-01-01

    The formulation of a scientific prediction by students in college-level laboratories is proposed. This activity will develop the students' ability to apply abstract concepts via deductive reasoning. For instances in which a hypothesis will be tested by an experiment, students should develop a prediction that states what sort of experimental…

  2. The Relationship of Language and Emotion: N400 Support for an Embodied View of Language Comprehension

    ERIC Educational Resources Information Center

    Chwilla, Dorothee J.; Virgillito, Daniele; Vissers, Constance Th. W. M.

    2011-01-01

    According to embodied theories, the symbols used by language are meaningful because they are grounded in perception, action, and emotion. In contrast, according to abstract symbol theories, meaning arises from the syntactic combination of abstract, amodal symbols. If language is grounded in internal bodily states, then one would predict that…

  3. NUMERICAL INVESTIGATION OF THE EFFECTS OF BOUNDARY-LAYER EVOLUTION ON THE PREDICTIONS OF OZONE AND THE EFFICACY OF EMISSION CONTROL OPTIONS IN THE NORTHEASTERN UNITED STATES. (R826373)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. Qualitative Differences in the Representation of Abstract versus Concrete Words: Evidence from the Visual-World Paradigm

    ERIC Educational Resources Information Center

    Dunabeitia, Jon Andoni; Aviles, Alberto; Afonso, Olivia; Scheepers, Christoph; Carreiras, Manuel

    2009-01-01

    In the present visual-world experiment, participants were presented with visual displays that included a target item that was a semantic associate of an abstract or a concrete word. This manipulation allowed us to test a basic prediction derived from the qualitatively different representational framework that supports the view of different…

  5. The method of abstraction in the design of databases and the interoperability

    NASA Astrophysics Data System (ADS)

    Yakovlev, Nikolay

    2018-03-01

    The method of abstraction can be applied when designing a database structure oriented to the contents of the indicators presented in the documents and communications of a subject area. First, abstraction is applied by extending the set of indicators with new, artificially constructed abstract concepts. The use of abstract concepts makes it possible to avoid registering many-to-many relations; for this reason, structures built using abstract concepts demonstrate greater stability as the subject area evolves. An example of such an abstract concept for address structures is the unique house number. Second, the method of abstraction can be used to transform concepts by omitting attributes that are unnecessary for solving certain classes of problems. Processing of the data associated with the modified concepts is simpler, without losing the ability to solve the considered classes of problems. For example, the concept "street" loses its binding to the land: the content of the modified concept "street" is only the relation of houses to the declared name, which is sufficient for most accounting and communication tasks.

  6. Multivariate prediction of motor diagnosis in Huntington's disease: 12 years of PREDICT‐HD

    PubMed Central

    Long, Jeffrey D.

    2015-01-01

    Abstract Background It is well known in Huntington's disease that cytosine‐adenine‐guanine expansion and age at study entry are predictive of the timing of motor diagnosis. The goal of this study was to assess whether additional motor, imaging, cognitive, functional, psychiatric, and demographic variables measured at study entry increased the ability to predict the risk of motor diagnosis over 12 years. Methods One thousand seventy‐eight Huntington's disease gene–expanded carriers (64% female) from the Neurobiological Predictors of Huntington's Disease study were followed up for up to 12 y (mean = 5, standard deviation = 3.3) covering 2002 to 2014. No one had a motor diagnosis at study entry, but 225 (21%) carriers prospectively received a motor diagnosis. Analysis was performed with random survival forests, which is a machine learning method for right‐censored data. Results Adding 34 variables along with cytosine‐adenine‐guanine and age substantially increased predictive accuracy relative to cytosine‐adenine‐guanine and age alone. Adding six of the common motor and cognitive variables (total motor score, diagnostic confidence level, Symbol Digit Modalities Test, three Stroop tests) resulted in lower predictive accuracy than the full set, but still had twice the 5‐y predictive accuracy of cytosine‐adenine‐guanine and age alone. Additional analysis suggested interactions and nonlinear effects that were characterized in a post hoc Cox regression model. Conclusions Measurement of clinical variables can substantially increase the accuracy of predicting motor diagnosis over and above cytosine‐adenine‐guanine and age (and their interaction). Estimated probabilities can be used to characterize progression level and aid in future studies' sample selection. © 2015 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society PMID:26340420

  7. Synthetic biology: new engineering rules for an emerging discipline

    PubMed Central

    Andrianantoandro, Ernesto; Basu, Subhayu; Karig, David K; Weiss, Ron

    2006-01-01

    Synthetic biologists engineer complex artificial biological systems to investigate natural biological phenomena and for a variety of applications. We outline the basic features of synthetic biology as a new engineering discipline, covering examples from the latest literature and reflecting on the features that make it unique among all other existing engineering fields. We discuss methods for designing and constructing engineered cells with novel functions in a framework of an abstract hierarchy of biological devices, modules, cells, and multicellular systems. The classical engineering strategies of standardization, decoupling, and abstraction will have to be extended to take into account the inherent characteristics of biological devices and modules. To achieve predictability and reliability, strategies for engineering biology must include the notion of cellular context in the functional definition of devices and modules, use rational redesign and directed evolution for system optimization, and focus on accomplishing tasks using cell populations rather than individual cells. The discussion brings to light issues at the heart of designing complex living systems and provides a trajectory for future development. PMID:16738572

  8. Synthetic biology: new engineering rules for an emerging discipline.

    PubMed

    Andrianantoandro, Ernesto; Basu, Subhayu; Karig, David K; Weiss, Ron

    2006-01-01

    Synthetic biologists engineer complex artificial biological systems to investigate natural biological phenomena and for a variety of applications. We outline the basic features of synthetic biology as a new engineering discipline, covering examples from the latest literature and reflecting on the features that make it unique among all other existing engineering fields. We discuss methods for designing and constructing engineered cells with novel functions in a framework of an abstract hierarchy of biological devices, modules, cells, and multicellular systems. The classical engineering strategies of standardization, decoupling, and abstraction will have to be extended to take into account the inherent characteristics of biological devices and modules. To achieve predictability and reliability, strategies for engineering biology must include the notion of cellular context in the functional definition of devices and modules, use rational redesign and directed evolution for system optimization, and focus on accomplishing tasks using cell populations rather than individual cells. The discussion brings to light issues at the heart of designing complex living systems and provides a trajectory for future development.

  9. PRISM 3: expanded prediction of natural product chemical structures from microbial genomes

    PubMed Central

    Skinnider, Michael A.; Merwin, Nishanth J.; Johnston, Chad W.

    2017-01-01

    Abstract Microbial natural products represent a rich resource of pharmaceutically and industrially important compounds. Genome sequencing has revealed that the majority of natural products remain undiscovered, and computational methods to connect biosynthetic gene clusters to their corresponding natural products therefore have the potential to revitalize natural product discovery. Previously, we described PRediction Informatics for Secondary Metabolomes (PRISM), a combinatorial approach to chemical structure prediction for genetically encoded nonribosomal peptides and type I and II polyketides. Here, we present a ground-up rewrite of the PRISM structure prediction algorithm to derive prediction of natural products arising from non-modular biosynthetic paradigms. Within this new version, PRISM 3, natural product scaffolds are modeled as chemical graphs, permitting structure prediction for aminocoumarins, antimetabolites, bisindoles and phosphonate natural products, and building upon the addition of ribosomally synthesized and post-translationally modified peptides. Further, with the addition of cluster detection for 11 new cluster types, PRISM 3 expands to detect 22 distinct natural product cluster types. Other major modifications to PRISM include improved sequence input and ORF detection, user-friendliness and output. Distribution of PRISM 3 over a 300-core server grid improves the speed and capacity of the web application. PRISM 3 is available at http://magarveylab.ca/prism/. PMID:28460067

  10. LitMiner and WikiGene: identifying problem-related key players of gene regulation using publication abstracts.

    PubMed

    Maier, Holger; Döhr, Stefanie; Grote, Korbinian; O'Keeffe, Sean; Werner, Thomas; Hrabé de Angelis, Martin; Schneider, Ralf

    2005-07-01

    The LitMiner software is a literature data-mining tool that facilitates the identification of major gene regulation key players related to a user-defined field of interest in PubMed abstracts. The prediction of gene-regulatory relationships is based on co-occurrence analysis of key terms within the abstracts. LitMiner predicts relationships between key terms from the biomedical domain in four categories (genes, chemical compounds, diseases and tissues). Owing to the limitations (no direction, unverified automatic prediction) of the co-occurrence approach, the primary data in the LitMiner database represent postulated basic gene-gene relationships. The usefulness of the LitMiner system has been demonstrated recently in a study that reconstructed disease-related regulatory networks by promoter modelling that was initiated by a LitMiner generated primary gene list. To overcome the limitations and to verify and improve the data, we developed WikiGene, a Wiki-based curation tool that allows revision of the data by expert users over the Internet. LitMiner (http://andromeda.gsf.de/litminer) and WikiGene (http://andromeda.gsf.de/wiki) can be used unrestricted with any Internet browser.
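    The co-occurrence analysis underlying such predictions can be sketched as simple pair counting over abstracts. The snippet below is an illustration of the general idea only, not LitMiner's implementation (which adds term categories, ranking, and curation), and the gene/term names are hypothetical examples.

```python
from collections import Counter
from itertools import combinations

def cooccurrence(abstracts, terms):
    """Count how often each pair of key terms appears in the same
    abstract.  A sketch of the co-occurrence idea only."""
    counts = Counter()
    for text in abstracts:
        lowered = text.lower()
        # key terms present in this abstract, in a canonical order
        present = sorted(t for t in terms if t.lower() in lowered)
        for a, b in combinations(present, 2):
            counts[(a, b)] += 1
    return counts

# Toy abstracts and key terms (illustrative, not real LitMiner data).
docs = [
    "TP53 regulates MDM2 in tumour tissue.",
    "MDM2 inhibitors restore TP53 pathway activity.",
    "Liver tissue expression of MDM2.",
]
pairs = cooccurrence(docs, ["TP53", "MDM2", "tissue"])
```

    As the abstract notes, such counts postulate undirected, unverified relationships; high co-occurrence suggests a candidate gene-gene link that still needs expert verification.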

  11. Abstract shapes of RNA.

    PubMed

    Giegerich, Robert; Voss, Björn; Rehmsmeier, Marc

    2004-01-01

    The function of a non-protein-coding RNA is often determined by its structure. Since experimental determination of RNA structure is time-consuming and expensive, its computational prediction is of great interest, and efficient solutions based on thermodynamic parameters are known. Frequently, however, the predicted minimum free energy structures are not the native ones, leading to the necessity of generating suboptimal solutions. While this can be accomplished by a number of programs, the user is often confronted with large outputs of similar structures, although he or she is interested in structures with more fundamental differences, or, in other words, with different abstract shapes. Here, we formalize the concept of abstract shapes and introduce their efficient computation. Each shape of an RNA molecule comprises a class of similar structures and has a representative structure of minimal free energy within the class. Shape analysis is implemented in the program RNAshapes. We applied RNAshapes to the prediction of optimal and suboptimal abstract shapes of several RNAs. For a given energy range, the number of shapes is considerably smaller than the number of structures, and in all cases, the native structures were among the top shape representatives. This demonstrates that the researcher can quickly focus on the structures of interest, without processing up to thousands of near-optimal solutions. We complement this study with a large-scale analysis of the growth behaviour of structure and shape spaces. RNAshapes is available for download and as an online version on the Bielefeld Bioinformatics Server.
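    The notion of an abstract shape can be made concrete with a small sketch that maps a dot-bracket secondary structure to a coarse shape string: stacked pairs collapse into one helix, bulges and internal loops are dropped, and unpaired bases disappear. This is a simplified reimplementation of the coarsest abstraction level, not the RNAshapes program itself.

```python
def abstract_shape(db):
    """Map a dot-bracket structure to a coarse abstract shape string,
    e.g. '((..((...)).((...))))' -> '[[][]]'.  Simplified sketch of the
    RNAshapes idea, not the actual program."""
    stack, pairs = [], {}
    for i, ch in enumerate(db):
        if ch == '(':
            stack.append(i)
        elif ch == ')':
            pairs[stack.pop()] = i

    def shape(lo, hi):
        out, i = "", lo
        while i < hi:
            if i in pairs:
                j = pairs[i]
                a, b = i, j
                # directly stacked pairs form a single helix
                while a + 1 in pairs and pairs[a + 1] == b - 1:
                    a, b = a + 1, b - 1
                inner = shape(a + 1, b)
                # a lone nested helix (bulge/internal loop) collapses too
                out += "[]" if inner in ("", "[]") else "[" + inner + "]"
                i = j + 1
            else:
                i += 1              # unpaired bases do not affect the shape
        return out

    return shape(0, len(db))
```

    Because many near-optimal structures fold into the same shape, grouping suboptimal predictions by shape shrinks thousands of structures down to a handful of representatives.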

  12. Design of Biomedical Robots for Phenotype Prediction Problems

    PubMed Central

    deAndrés-Galiana, Enrique J.; Sonis, Stephen T.

    2016-01-01

    Abstract Genomics has been used with varying degrees of success in the context of drug discovery and in defining mechanisms of action for diseases like cancer and neurodegenerative and rare diseases in the quest for orphan drugs. To improve its utility, accuracy, and cost-effectiveness optimization of analytical methods, especially those that translate to clinically relevant outcomes, is critical. Here we define a novel tool for genomic analysis termed a biomedical robot in order to improve phenotype prediction, identifying disease pathogenesis and significantly defining therapeutic targets. Biomedical robot analytics differ from historical methods in that they are based on melding feature selection methods and ensemble learning techniques. The biomedical robot mathematically exploits the structure of the uncertainty space of any classification problem conceived as an ill-posed optimization problem. Given a classifier, there exist different equivalent small-scale genetic signatures that provide similar predictive accuracies. We perform the sensitivity analysis to noise of the biomedical robot concept using synthetic microarrays perturbed by different kinds of noises in expression and class assignment. Finally, we show the application of this concept to the analysis of different diseases, inferring the pathways and the correlation networks. The final aim of a biomedical robot is to improve knowledge discovery and provide decision systems to optimize diagnosis, treatment, and prognosis. This analysis shows that the biomedical robots are robust against different kinds of noises and particularly to a wrong class assignment of the samples. Assessing the uncertainty that is inherent to any phenotype prediction problem is the right way to address this kind of problem. PMID:27347715

  13. Granular Activated Carbon Performance Capability and Availability.

    DTIC Science & Technology

    1983-06-01

    services were surveyed to determine availability of data and to develop a strategy for later computerized searches: Chemical Abstracts; Engineering Abstracts; Environmental Abstracts; Selected Water Resources Abstracts; Pollution Abstracts; and the U.S. ... chemicals addressed, and scientific and engineering methods used. Publications were also reviewed for quality and consistency with the bulk of available data

  14. Accurate and reproducible functional maps in 127 human cell types via 2D genome segmentation

    PubMed Central

    Hardison, Ross C.

    2017-01-01

    Abstract The Roadmap Epigenomics Consortium has published whole-genome functional annotation maps in 127 human cell types by integrating data from studies of multiple epigenetic marks. These maps have been widely used for studying gene regulation in cell type-specific contexts and predicting the functional impact of DNA mutations on disease. Here, we present a new map of functional elements produced by applying a method called IDEAS on the same data. The method has several unique advantages and outperforms existing methods, including that used by the Roadmap Epigenomics Consortium. Using five categories of independent experimental datasets, we compared the IDEAS and Roadmap Epigenomics maps. While the overall concordance between the two maps is high, the maps differ substantially in the prediction details and in their consistency of annotation of a given genomic position across cell types. The annotation from IDEAS is uniformly more accurate than the Roadmap Epigenomics annotation and the improvement is substantial based on several criteria. We further introduce a pipeline that improves the reproducibility of functional annotation maps. Thus, we provide a high-quality map of candidate functional regions across 127 human cell types and compare the quality of different annotation methods in order to facilitate biomedical research in epigenomics. PMID:28973456

  15. The Use of Remotely Controlled Mandibular Positioner as a Predictive Screening Tool for Mandibular Advancement Device Therapy in Patients with Obstructive Sleep Apnea through Single-Night Progressive Titration of the Mandible: A Systematic Review

    PubMed Central

    Kastoer, Chloé; Dieltjens, Marijke; Oorts, Eline; Hamans, Evert; Braem, Marc J.; Van de Heyning, Paul H.; Vanderveken, Olivier M.

    2016-01-01

    Study Objectives: To perform a review of the current evidence regarding the use of a remotely controlled mandibular positioner (RCMP) and to analyze the efficacy of RCMP as a predictive selection tool in the treatment of obstructive sleep apnea (OSA) with oral appliances that protrude the mandible (OAm), exclusively relying on single-night RCMP titration. Methods: An extensive literature search was performed through PubMed.com, Thecochranelibrary.com (CENTRAL only), Embase.com, and recent conference meeting abstracts in the field. Results: A total of 254 OSA patients from four full-text articles and 5 conference meeting abstracts contributed data to the review. Criteria for a successful RCMP test and success with OAm differed between studies. Study populations were not fully comparable due to differences in baseline apnea-hypopnea index (AHI) range. However, in all studies, elimination of airway obstruction events during sleep by RCMP titration predicted OAm therapy success through determination of the most effective target protrusive position (ETPP). A statistically significant association was found between mean AHI predicted outcome with RCMP and treatment outcome with OAm on polysomnographic or portable sleep monitoring evaluation (p < 0.05). Conclusions: The existing evidence regarding the use of RCMP in patients with OSA indicates that it might be possible to protrude the mandible progressively during sleep under poly(somno)graphic observation by RCMP until respiratory events are eliminated, without disturbing sleep or arousing the patient. ETPP as measured by the use of RCMP was significantly associated with success of OAm therapy in the reported studies. RCMP might be a promising instrument for predicting OAm treatment outcome and targeting the degree of mandibular advancement needed. Citation: Kastoer C, Dieltjens M, Oorts E, Hamans E, Braem MJ, Van de Heyning PH, Vanderveken OM. 
The use of remotely controlled mandibular positioner as a predictive screening tool for mandibular advancement device therapy in patients with obstructive sleep apnea through single-night progressive titration of the mandible: a systematic review. J Clin Sleep Med 2016;12(10):1411–1421. PMID:27568892

  16. Hydride, hydrogen, proton, and electron affinities of imines and their reaction intermediates in acetonitrile and construction of thermodynamic characteristic graphs (TCGs) of imines as a "molecule ID card".

    PubMed

    Zhu, Xiao-Qing; Liu, Qiao-Yun; Chen, Qiang; Mei, Lian-Rui

    2010-02-05

    A series of 61 imines with various typical structures were synthesized, and the thermodynamic affinities (defined as enthalpy changes or redox potentials in this work) of the imines to abstract hydride anions, hydrogen atoms, and electrons, the thermodynamic affinities of the radical anions of the imines to abstract hydrogen atoms and protons, and the thermodynamic affinities of the hydrogen adducts of the imines to abstract electrons in acetonitrile were determined by using titration calorimetry and electrochemical methods. The pure heterolytic and homolytic dissociation energies of the C=N pi-bond in the imines were estimated. The polarity of the C=N double bond in the imines was examined using a linear free-energy relationship. The idea of a thermodynamic characteristic graph (TCG) of imines as an efficient "Molecule ID Card" was introduced. The TCG can be used to quantitatively diagnose and predict the characteristic chemical properties of imines and their various reaction intermediates, as well as the reduction mechanism of the imines. The information disclosed in this work not only fills a thermodynamic gap in the chemistry of imines but could also strongly promote the further development of imine applications.

  17. SAR/QSAR MODELS FOR TOXICITY PREDICTION: APPROACHES AND NEW DIRECTIONS

    EPA Science Inventory

    Abstract

    SAR/QSAR MODELS FOR TOXICITY PREDICTION: APPROACHES AND NEW DIRECTIONS

    Risk assessment typically incorporates some relevant toxicity information upon which to base a sound estimation for a chemical of concern. However, there are many circumstances in whic...

  18. Overview of the status of predictive computer models for skin sensitization (JRC Expert meeting on pre- and pro-haptens )

    EPA Science Inventory

    No abstract was prepared or requested. This is a short presentation aiming to present the status of existing in silico models and approaches for the prediction of skin sensitization potential and/or potency.

  19. Digital Troposcatter Performance Model. Users Manual.

    DTIC Science & Technology

    1983-11-01

    Keywords: Diffraction Multipath Prediction; MD-918 Modem Error Rate Prediction; AN/TRC-170 Link Analysis. ...configurations used in the Defense Communications System (DCS), and prediction of the performance of both the MD-918 and AN/TRC-170 digital troposcatter modems

  20. Local contextual processing of abstract and meaningful real-life images in professional athletes.

    PubMed

    Fogelson, Noa; Fernandez-Del-Olmo, Miguel; Acero, Rafael Martín

    2012-05-01

    We investigated the effect of abstract versus real-life meaningful images from sports on local contextual processing in two groups of professional athletes. Local context was defined as the occurrence of a short predictive series of stimuli occurring before delivery of a target event. EEG was recorded in 10 professional basketball players and 9 professional athletes of individual sports during three sessions. In each session, a different set of visual stimuli was presented: triangles facing left, up, right, or down; four images of a basketball player throwing a ball; four images of a baseball player pitching a baseball. Stimuli consisted of 15% targets and 85% standards (equal numbers of three standard types). Recording blocks consisted of targets preceded by randomized sequences of standards and by sequences including a predictive sequence signaling the occurrence of a subsequent target event. Subjects pressed a button in response to targets. In all three sessions, reaction times and peak P3b latencies were shorter for predicted targets compared with random targets, the last and most informative stimulus of the predictive sequence induced a robust P3b, and N2 amplitude was larger for random targets compared with predicted targets. P3b and N2 peak amplitudes were larger in the professional basketball group in comparison with the professional athletes of individual sports, across the three sessions. The findings of this study suggest that local contextual information is processed similarly for abstract and for meaningful images, and that professional basketball players seem to allocate more attentional resources to the processing of these visual stimuli.

  1. Abstract Interpreters for Free

    NASA Astrophysics Data System (ADS)

    Might, Matthew

    In small-step abstract interpretations, the concrete and abstract semantics bear an uncanny resemblance. In this work, we present an analysis-design methodology that both explains and exploits that resemblance. Specifically, we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the "optimal" abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization. To illustrate the method, we systematically abstract the continuation-passing style lambda calculus to arrive at two distinct families of analyses. The first is the well-known k-CFA family of analyses. The second consists of novel "environment-centric" abstract interpretations, none of which appear in the literature on static analysis of higher-order programs.
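
    The abstraction/concretization pairing at the heart of the method can be illustrated with a deliberately tiny example. The sketch below (in Python, not the paper's formalism) shows a sign domain with an abstraction function alpha and a sound abstract multiplication; the paper's actual construction operates on CPS state-spaces and derives the Galois connection systematically rather than by hand.

```python
# Toy sign-domain abstract interpretation: sets of concrete integers are
# abstracted to one of five sign elements. Illustrative only -- not the
# paper's CPS-machine state-space construction.

NEG, ZERO, POS, TOP, BOT = "-", "0", "+", "T", "_"

def alpha(ns):
    """Abstraction: map a set of concrete integers to a sign element."""
    signs = {NEG if n < 0 else ZERO if n == 0 else POS for n in ns}
    if not signs:
        return BOT
    return signs.pop() if len(signs) == 1 else TOP

def abs_mul(a, b):
    """Sound abstract multiplication on signs."""
    if BOT in (a, b):
        return BOT
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG
```

    Soundness here means abs_mul over-approximates the abstraction of the concrete products, e.g. alpha({x*y}) for negative x and positive y agrees with abs_mul("-", "+").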

  2. A Bayesian Framework for Analysis of Pseudo-Spatial Models of Comparable Engineered Systems with Application to Spacecraft Anomaly Prediction Based on Precedent Data

    NASA Astrophysics Data System (ADS)

    Ndu, Obibobi Kamtochukwu

    To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract but fundamentally human cognitive ability and extends it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems, which elucidates the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.

  3. Emerging trend prediction in biomedical literature.

    PubMed

    Moerchen, Fabian; Fradkin, Dmitriy; Dejori, Mathaeus; Wachmann, Bernd

    2008-11-06

    We present a study on how to predict new emerging trends in the biomedical domain based on textual data. We thereby propose a way of anticipating the transformation of arbitrary information into ground truth knowledge by predicting the inclusion of new terms into the MeSH ontology. We also discuss the preparation of a dataset for the evaluation of emerging trend prediction algorithms that is based on PubMed abstracts and related MeSH terms. The results suggest that early prediction of emerging trends is possible.

  4. GOClonto: an ontological clustering approach for conceptualizing PubMed abstracts.

    PubMed

    Zheng, Hai-Tao; Borchert, Charles; Kim, Hong-Gee

    2010-02-01

    Concurrent with progress in biomedical sciences, an overwhelming amount of textual knowledge is accumulating in the biomedical literature. PubMed is the most comprehensive database collecting and managing biomedical literature. To help researchers easily understand collections of PubMed abstracts, numerous clustering methods have been proposed to group similar abstracts based on their shared features. However, most of these methods do not explore the semantic relationships among groupings of documents, which could help better illuminate the groupings of PubMed abstracts. To address this issue, we proposed an ontological clustering method called GOClonto for conceptualizing PubMed abstracts. GOClonto uses latent semantic analysis (LSA) and gene ontology (GO) to identify key gene-related concepts and their relationships, as well as to allocate PubMed abstracts based on these key gene-related concepts. Based on two PubMed abstract collections, the experimental results show that GOClonto is able to identify key gene-related concepts and outperforms the STC (suffix tree clustering) algorithm, the Lingo algorithm, the Fuzzy Ants algorithm, and the clustering-based TRS (tolerance rough set) algorithm. Moreover, the two ontologies generated by GOClonto show significant informative conceptual structures.
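
    The allocation step, grouping abstracts by shared features, can be sketched with a much simpler stand-in for GOClonto's LSA-plus-Gene-Ontology pipeline: a greedy bag-of-words cosine grouping. The document text below is invented for illustration.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def group_abstracts(docs, threshold=0.5):
    """Greedy single-pass grouping of abstracts by cosine similarity --
    a far simpler stand-in for GOClonto's LSA + GO pipeline, meant only
    to show the document-allocation step."""
    vecs = [Counter(d.lower().split()) for d in docs]
    clusters = []
    for i, v in enumerate(vecs):
        for c in clusters:
            if cosine(v, vecs[c[0]]) >= threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters
```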

  5. How does tunneling contribute to counterintuitive H-abstraction reactivity of nonheme Fe(IV)O oxidants with alkanes?

    PubMed

    Mandal, Debasish; Ramanan, Rajeev; Usharani, Dandamudi; Janardanan, Deepa; Wang, Binju; Shaik, Sason

    2015-01-21

    This article addresses the intriguing hydrogen-abstraction (H-abstraction) and oxygen-transfer (O-transfer) reactivity of a series of nonheme [Fe(IV)(O)(TMC)(Lax)](z+) complexes, with a tetramethyl cyclam ligand and a variable axial ligand (Lax), toward three substrates: 1,4-cyclohexadiene, 9,10-dihydroanthracene, and triphenyl phosphine. Experimentally, O-transfer reactivity follows the relative electrophilicity of the complexes, whereas the corresponding H-abstraction reactivity generally increases as the axial ligand becomes a better electron donor, hence exhibiting an antielectrophilic trend. Our theoretical results show that the antielectrophilic trend in H-abstraction is caused by tunneling contributions. Room-temperature tunneling increases with the electron donation power of the axial ligand; this reverses the natural electrophilic trend, as revealed through calculations without tunneling, and leads to the observed antielectrophilic trend. By contrast, O-transfer reactivity, not being subject to tunneling, retains an electrophilic-dependent reactivity trend, as revealed experimentally and computationally. Tunneling-corrected kinetic-isotope effect (KIE) calculations matched the experimental KIE values only if all of the H-abstraction reactions proceeded on the quintet state (S = 2) surface. As such, the present results corroborate the initially predicted two-state reactivity (TSR) scenario for these reactions. The increase of tunneling with the electron-releasing power of the axial ligand, and the reversal of the "natural" reactivity pattern, support the "tunneling control" hypothesis (Schreiner et al., ref 19). Should these predictions be corroborated, the entire field of C-H bond activation in bioinorganic chemistry would lie open to reinvestigation.
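
    As a rough illustration of how tunneling contributions grow with barrier frequency and shrink with temperature, the simplest textbook estimate is the Wigner correction, kappa = 1 + (1/24)(h*nu/kB*T)^2, where nu is the magnitude of the imaginary frequency at the transition state. The paper's calculations use more rigorous tunneling treatments; the numbers below are purely illustrative.

```python
import math

# Physical constants (SI units; barrier frequency given in cm^-1).
H = 6.62607015e-34    # Planck constant, J s
KB = 1.380649e-23     # Boltzmann constant, J/K
C_CM = 2.99792458e10  # speed of light, cm/s

def wigner_kappa(nu_imag_cm, temp_k):
    """First-order Wigner tunneling correction:
    kappa = 1 + (1/24) * (h * nu / (kB * T))**2,
    with nu the magnitude of the imaginary barrier frequency."""
    u = H * C_CM * nu_imag_cm / (KB * temp_k)
    return 1.0 + u * u / 24.0
```

    For a 1000 cm^-1 imaginary frequency at room temperature this already roughly doubles the rate, and the correction grows quickly as the barrier narrows (higher imaginary frequency) or the temperature drops.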

  6. The semantic richness of abstract concepts

    PubMed Central

    Recchia, Gabriel; Jones, Michael N.

    2012-01-01

    We contrasted the predictive power of three measures of semantic richness—number of features (NFs), contextual dispersion (CD), and a novel measure of number of semantic neighbors (NSN)—for a large set of concrete and abstract concepts on lexical decision and naming tasks. NSN (but not NF) facilitated processing for abstract concepts, while NF (but not NSN) facilitated processing for the most concrete concepts, consistent with claims that linguistic information is more relevant for abstract concepts in early processing. Additionally, converging evidence from two datasets suggests that when NSN and CD are controlled for, the features that most facilitate processing are those associated with a concept's physical characteristics and real-world contexts. These results suggest that rich linguistic contexts (many semantic neighbors) facilitate early activation of abstract concepts, whereas concrete concepts benefit more from rich physical contexts (many associated objects and locations). PMID:23205008

  7. COPEPOD REPRODUCTIVE STRATEGIES: LIFE-HISTORY THEORY, PHYLOGENETIC PATTERN AND INVASION OF INLAND WATERS. (R824771)

    EPA Science Inventory

    Abstract

    Life-history theory predicts that different reproductive strategies should evolve in environments that differ in resource availability, mortality, seasonality, and in spatial or temporal variation. Within a population, the predicted optimal strategy is driven ...

  8. P09.62 Towards individualized survival prediction in glioblastoma patients using machine learning methods

    PubMed Central

    Vera, L.; Pérez-Beteta, J.; Molina, D.; Borrás, J. M.; Benavides, M.; Barcia, J. A.; Velásquez, C.; Albillo, D.; Lara, P.; Pérez-García, V. M.

    2017-01-01

    Abstract Introduction: Machine learning methods are integrated into clinical research studies because of their strong capability to discover parameters with high information content and their combined predictive potential. Several studies have been developed using glioblastoma patients' imaging data. Many of them have focused on including large numbers of variables, mostly two-dimensional textural features and/or genomic data, regardless of their meaning or potential clinical relevance. Materials and methods: 193 glioblastoma patients were included in the study. Preoperative 3D magnetic resonance images were collected and semi-automatically segmented using in-house software. After segmentation, a database of 90 parameters, including geometrical and textural image-based measures together with patients' clinical data (age, survival, type of treatment, etc.), was constructed. The criterion for including variables in the study was that they had either shown individual impact on survival in single or multivariate analyses or had a precise clinical or geometrical meaning. These variables were used to perform several machine learning experiments. In a first set of computational cross-validation experiments based on regression trees, the attributes showing the highest information measures were extracted. In the second phase, more sophisticated learning methods were employed to validate the potential of the previous variables for predicting survival; concretely, support vector machines, neural networks, and sparse grid methods were used. Results: Variables showing a high information measure in the first phase provided the best prediction results in the second phase. Specifically, patient age, Stupp regimen, and a geometrical measure related to the irregularity of contrast-enhancing areas were the variables showing the highest information measure in the first stage. For the second phase, the combination of patient age and Stupp regimen together with one tumor geometrical measure and one tumor heterogeneity feature yielded the best prediction quality. Conclusions: Advanced machine learning methods identified the parameters with the highest information measure and survival predictive potential. The uninformed machine learning methods identified a novel feature measure with direct impact on survival. Used in combination with other previously known variables, multi-indexes can be defined that can help in tumor characterization and prognosis prediction. Recent advances in the definition of those multi-indexes will be reported at the conference. Funding: James S. McDonnell Foundation (USA) 21st Century Science Initiative in Mathematical and Complex Systems Approaches for Brain Cancer [Collaborative award 220020450 and planning grant 220020420], MINECO/FEDER [MTM2015-71200-R], JCCM [PEII-2014-031-P].
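
    The first-phase screening step, ranking variables by information measure, can be illustrated with a plain information-gain computation. This is a sketch only; the study's regression-tree experiments and the actual clinical variables are not reproduced here.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """Reduction in label entropy after splitting on a discrete feature --
    the kind of score a regression/decision tree uses to rank variables."""
    n = len(labels)
    groups = {}
    for f, y in zip(feature, labels):
        groups.setdefault(f, []).append(y)
    cond = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - cond
```

    A variable that perfectly separates the outcome attains the full label entropy as its gain; an uninformative variable scores near zero, and screening keeps the top scorers for the second phase.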

  9. Deep 3D Convolutional Encoder Networks With Shortcuts for Multiscale Feature Integration Applied to Multiple Sclerosis Lesion Segmentation.

    PubMed

    Brosch, Tom; Tang, Lisa Y W; Youngjin Yoo; Li, David K B; Traboulsee, Anthony; Tam, Roger

    2016-05-01

    We propose a novel segmentation approach based on deep 3D convolutional encoder networks with shortcut connections and apply it to the segmentation of multiple sclerosis (MS) lesions in magnetic resonance images. Our model is a neural network that consists of two interconnected pathways, a convolutional pathway, which learns increasingly more abstract and higher-level image features, and a deconvolutional pathway, which predicts the final segmentation at the voxel level. The joint training of the feature extraction and prediction pathways allows for the automatic learning of features at different scales that are optimized for accuracy for any given combination of image types and segmentation task. In addition, shortcut connections between the two pathways allow high- and low-level features to be integrated, which enables the segmentation of lesions across a wide range of sizes. We have evaluated our method on two publicly available data sets (MICCAI 2008 and ISBI 2015 challenges) with the results showing that our method performs comparably to the top-ranked state-of-the-art methods, even when only relatively small data sets are available for training. In addition, we have compared our method with five freely available and widely used MS lesion segmentation methods (EMS, LST-LPA, LST-LGA, Lesion-TOADS, and SLS) on a large data set from an MS clinical trial. The results show that our method consistently outperforms these other methods across a wide range of lesion sizes.

  10. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators: parameterized, fast-running proxies of large-scale scientific applications that use parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them with the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior of the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.
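
    The core idea, replacing compute stages with the passage of time inside a discrete-event loop, can be sketched in a few lines. The stage names and durations below are invented; a real application simulator would parameterize them from measurements of the actual code.

```python
import heapq

def simulate(jobs):
    """Minimal discrete-event engine. Each job is (start_time, duration,
    name); computation is abstracted away and only the passage of time is
    modeled. Returns the makespan and the completion order."""
    events = [(start + dur, name) for start, dur, name in jobs]
    heapq.heapify(events)
    order, makespan = [], 0.0
    while events:
        t, name = heapq.heappop(events)
        order.append(name)
        makespan = max(makespan, t)
    return makespan, order

# Sequential cycle of three invented stages, run back to back.
seq = [(0.0, 5.0, "md"), (5.0, 2.0, "nudge"), (7.0, 1.0, "bookkeep")]
# Speculative what-if: the compute-bound "nudge" stage is spawned at t=0,
# predicting the improvement without implementing the real extension.
spec = [(0.0, 5.0, "md"), (0.0, 2.0, "nudge"), (5.0, 1.0, "bookkeep")]
```

    Comparing the two makespans answers the "is speculative spawning worth implementing?" question cheaply, which is the style of trade-off study the abstract describes.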

  11. Statistical Learning Is Constrained to Less Abstract Patterns in Complex Sensory Input (but not the Least)

    PubMed Central

    Emberson, Lauren L.; Rubinstein, Dani

    2016-01-01

    The influence of statistical information on behavior (either through learning or adaptation) is quickly becoming foundational to many domains of cognitive psychology and cognitive neuroscience, from language comprehension to visual development. We investigate a central problem impacting these diverse fields: when encountering input with rich statistical information, are there any constraints on learning? This paper examines learning outcomes when adult learners are given statistical information across multiple levels of abstraction simultaneously: from abstract, semantic categories of everyday objects to individual viewpoints on these objects. After revealing statistical learning of abstract, semantic categories with scrambled individual exemplars (Exp. 1), participants viewed pictures where the categories as well as the individual objects predicted picture order (e.g., bird1—dog1, bird2—dog2). Our findings suggest that participants preferentially encode the relationships between the individual objects, even in the presence of statistical regularities linking semantic categories (Exps. 2 and 3). In a final experiment we investigate whether learners are biased towards learning object-level regularities or simply construct the most detailed model given the data (and therefore best able to predict the specifics of the upcoming stimulus) by investigating whether participants preferentially learn from the statistical regularities linking individual snapshots of objects or the relationship between the objects themselves (e.g., bird_picture1—dog_picture1, bird_picture2—dog_picture2). We find that participants fail to learn the relationships between individual snapshots, suggesting a bias towards object-level statistical regularities as opposed to merely constructing the most complete model of the input. This work moves beyond the previous existence proofs that statistical learning is possible at both very high and very low levels of abstraction (categories vs. individual objects) and suggests that, at least with the current categories and type of learner, there are biases to pick up on statistical regularities between individual objects even when robust statistical information is present at other levels of abstraction. These findings speak directly to emerging theories about how systems supporting statistical learning and prediction operate in our structure-rich environments. Moreover, the theoretical implications of the current work across multiple domains of study are already clear: statistical learning cannot be assumed to be unconstrained even if statistical learning has previously been established at a given level of abstraction when that information is presented in isolation. PMID:27139779
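
    The contrast between object-level and category-level regularities can be expressed as transition counts over the same stimulus stream at two levels of abstraction. The stream below is an invented stand-in for the paper's picture sequences.

```python
from collections import Counter

def transition_counts(stream, level):
    """Adjacent-pair statistics of a stimulus stream at a chosen level of
    abstraction. `level` maps each stimulus to its representation."""
    mapped = [level(s) for s in stream]
    return Counter(zip(mapped, mapped[1:]))

def category(s):
    """Invented mapping from an exemplar label to its semantic category."""
    return s.rstrip("12")  # "bird1" -> "bird", "dog2" -> "dog"

# Object-level pairings are deterministic: bird1 -> dog1, bird2 -> dog2.
stream = ["bird1", "dog1", "bird2", "dog2", "bird1", "dog1", "bird2", "dog2"]
```

    Both descriptions fit the same input (bird -> dog holds at the category level too), which is exactly why the experiments are needed to tell which level learners actually encode.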

  12. Construal level as a moderator of the role of affective and cognitive attitudes in the prediction of health-risk behavioural intentions.

    PubMed

    Carrera, Pilar; Caballero, Amparo; Muñoz, Dolores; González-Iraizoz, Marta; Fernández, Itziar

    2014-12-01

    In two preliminary control checks it was shown that affective attitudes presented greater abstraction than cognitive attitudes. Three further studies explored how construal level moderated the role of affective and cognitive attitudes in predicting one health-promoting behaviour (exercising) and two risk behaviours (sleep debt and binge drinking). There was a stronger influence of affective attitudes both when participants were in abstract (vs. concrete) mindsets induced by a priming task in Studies 1a and 1b, and when behavioural intentions were formed for the distant (vs. near) future in Study 2. In the case of concrete mindsets, the results were inconclusive; the interaction between construal level and cognitive attitudes was only marginally significant in Study 1b. The present research supports the assertion that in abstract mindsets (vs. concrete mindsets) people use more affective attitudes to construe their behavioural intentions. Practical implications for health promotion are discussed in the framework of construal-level theory. © 2014 The British Psychological Society.

  13. Laser-Induced Thermal Damage of Skin

    DTIC Science & Technology

    1977-12-01

    Subject terms: Skin Burns, Skin Model, Laser Effects, Thermal Predictions. Abstract: A computerized model was developed for predicting thermal damage of skin by laser exposures. Thermal, optical, and physiological data are presented for the model. Model predictions of extent of irreversible damage were compared with histologic determinations of the extent of damage.

  14. Multivariate Patterns in the Human Object-Processing Pathway Reveal a Shift from Retinotopic to Shape Curvature Representations in Lateral Occipital Areas, LO-1 and LO-2.

    PubMed

    Vernon, Richard J W; Gouws, André D; Lawrence, Samuel J D; Wade, Alex R; Morland, Antony B

    2016-05-25

    Representations in early visual areas are organized on the basis of retinotopy, but this organizational principle appears to lose prominence in the extrastriate cortex. Nevertheless, an extrastriate region, such as the shape-selective lateral occipital cortex (LO), must still base its activation on the responses from earlier retinotopic visual areas, implying that a transition from retinotopic to "functional" organizations should exist. We hypothesized that such a transition may lie in LO-1 or LO-2, two visual areas lying between retinotopically defined V3d and functionally defined LO. Using a rapid event-related fMRI paradigm, we measured neural similarity in 12 human participants between pairs of stimuli differing along dimensions of shape exemplar and shape complexity within both retinotopically and functionally defined visual areas. These neural similarity measures were then compared with low-level and more abstract (curvature-based) measures of stimulus similarity. We found that low-level, but not abstract, stimulus measures predicted V1-V3 responses, whereas the converse was true for LO, a double dissociation. Critically, abstract stimulus measures were most predictive of responses within LO-2, akin to LO, whereas both low-level and abstract measures were predictive for responses within LO-1, perhaps indicating a transitional point between those two organizational principles. Similar transitions to abstract representations were not observed in the more ventral stream passing through V4 and VO-1/2. The transition we observed in LO-1 and LO-2 demonstrates that a more "abstracted" representation, typically considered the preserve of "category-selective" extrastriate cortex, can nevertheless emerge in retinotopic regions. Visual areas are typically identified either through retinotopy (e.g., V1-V3) or from functional selectivity [e.g., shape-selective lateral occipital complex (LOC)]. 
We combined these approaches to explore the nature of shape representations through the visual hierarchy. Two different representations emerged: the first reflected low-level shape properties (dependent on the spatial layout of the shape outline), whereas the second captured more abstract curvature-related shape features. Critically, early visual cortex represented low-level information but this diminished in the extrastriate cortex (LO-1/LO-2/LOC), in which the abstract representation emerged. Therefore, this work further elucidates the nature of shape representations in the LOC, provides insight into how those representations emerge from early retinotopic cortex, and crucially demonstrates that retinotopically tuned regions (LO-1/LO-2) are not necessarily constrained to retinotopic representations. Copyright © 2016 Vernon et al.

  15. Prognostic Physiology: Modeling Patient Severity in Intensive Care Units Using Radial Domain Folding

    PubMed Central

    Joshi, Rohit; Szolovits, Peter

    2012-01-01

    Real-time scalable predictive algorithms that can mine big health data as the care is happening can become the new “medical tests” in critical care. This work describes a new unsupervised learning approach, radial domain folding, to scale and summarize the enormous amount of data collected and to visualize the degradations or improvements in multiple organ systems in real time. Our proposed system is based on learning multi-layer lower dimensional abstractions from routinely generated patient data in modern Intensive Care Units (ICUs), and is dramatically different from most of the current work being done in ICU data mining that rely on building supervised predictive models using commonly measured clinical observations. We demonstrate that our system discovers abstract patient states that summarize a patient’s physiology. Further, we show that a logistic regression model trained exclusively on our learned layer outperforms a customized SAPS II score on the mortality prediction task. PMID:23304406
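
    The final supervised step, a logistic regression trained on the learned layer, is standard; a minimal from-scratch version is sketched below. The unsupervised radial-domain-folding layer itself is not reproduced, and the data in the test are synthetic.

```python
import math

def train_logreg(X, y, lr=0.5, epochs=2000):
    """Plain logistic regression by per-sample gradient descent -- the kind
    of simple supervised layer trained on top of learned abstract states."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi  # gradient of log-loss w.r.t. z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Predicted probability of the positive class (e.g., mortality)."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

    The abstract's claim is that inputs derived from the learned abstractions make even this simple classifier outperform a customized SAPS II score on mortality prediction.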

  16. Abstract Linguistic Structure Correlates with Temporal Activity during Naturalistic Comprehension

    PubMed Central

    Brennan, Jonathan R.; Stabler, Edward P.; Van Wagenen, Sarah E.; Luh, Wen-Ming; Hale, John T.

    2016-01-01

    Neurolinguistic accounts of sentence comprehension identify a network of relevant brain regions, but do not detail the information flowing through them. We investigate syntactic information. Does brain activity implicate a computation over hierarchical grammars, or does it simply reflect linear order, as in a Markov chain? To address this question, we quantify the cognitive states implied by alternative parsing models. We compare processing-complexity predictions from these states against fMRI timecourses from regions that have been implicated in sentence comprehension. We find that hierarchical grammars independently predict timecourses from left anterior and posterior temporal lobe. Markov models are predictive in these regions and across a broader network that includes the inferior frontal gyrus. These results suggest that while linear effects are widespread across the language network, certain areas in the left temporal lobe deal with abstract, hierarchical syntactic representations. PMID:27208858
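
    The Markov-chain baseline corresponds to word-by-word surprisal under an n-gram model. A minimal bigram version with add-one smoothing is sketched below; the corpus is a toy stand-in, and the hierarchical-grammar complexity metrics the study also evaluates are not shown.

```python
import math
from collections import Counter

def bigram_surprisal(corpus, sentence):
    """Word-by-word surprisal -log2 P(w_i | w_{i-1}) under a bigram Markov
    chain estimated from a toy corpus, with add-one smoothing. This is the
    linear-order baseline, not a hierarchical parsing model."""
    words = [w for s in corpus for w in ["<s>"] + s]
    vocab = set(words)
    bigrams = Counter(zip(words, words[1:]))
    unigrams = Counter(words)
    out, prev = [], "<s>"
    for w in sentence:
        p = (bigrams[(prev, w)] + 1) / (unigrams[prev] + len(vocab))
        out.append(-math.log2(p))
        prev = w
    return out
```

    In the study, per-word complexity series like this one are convolved with a hemodynamic response and regressed against the fMRI timecourses region by region.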

  17. In a Year, Memory Will Benefit from Learning, Tomorrow It Won't: Distance and Construal Level Effects on the Basis of Metamemory Judgments

    ERIC Educational Resources Information Center

    Halamish, Vered; Nussinson, Ravit; Ben-Ari, Liat

    2013-01-01

    Metamemory judgments may rely on 2 bases of information: subjective experience and abstract theories about memory. On the basis of construal level theory, we predicted that psychological distance and construal level (i.e., concrete vs. abstract thinking) would have a qualitative impact on the relative reliance on these 2 bases: When considering…

  18. Advanced Fuel Properties; A Computer Program for Estimating Property Values

    DTIC Science & Technology

    1993-05-01

    Subject terms: Fuel Properties, Physical Properties, Thermodynamics, Predictions (175 pages). Abstract (fragment): ... found in fuels.

  19. Understanding and predicting changing use of groundwater with climate and other uncertainties: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Costa, F. A. F.; Keir, G.; McIntyre, N.; Bulovic, N.

    2015-12-01

    Most groundwater supply bores in Australia do not have flow metering equipment and so regional groundwater abstraction rates are not well known. Past estimates of unmetered abstraction for regional numerical groundwater modelling typically have not attempted to quantify the uncertainty inherent in the estimation process in detail. In particular, the spatial properties of errors in the estimates are almost always neglected. Here, we apply Bayesian spatial models to estimate these abstractions at a regional scale, using the state-of-the-art computationally inexpensive approaches of integrated nested Laplace approximation (INLA) and stochastic partial differential equations (SPDE). We examine a case study in the Condamine Alluvium aquifer in southern Queensland, Australia; even in this comparatively data-rich area with extensive groundwater abstraction for agricultural irrigation, approximately 80% of bores do not have reliable metered flow records. Additionally, the metering data in this area are characterised by complicated statistical features, such as zero-valued observations, non-normality, and non-stationarity. While this precludes the use of many classical spatial estimation techniques, such as kriging, our model (using the R-INLA package) is able to accommodate these features. We use a joint model to predict both probability and magnitude of abstraction from bores in space and time, and examine the effect of a range of high-resolution gridded meteorological covariates upon the predictive ability of the model. Deviance Information Criterion (DIC) scores are used to assess a range of potential models, which reward good model fit while penalising excessive model complexity. 
We conclude that maximum air temperature (as a reasonably effective surrogate for evapotranspiration) is the most significant single predictor of abstraction rate; and that a significant spatial effect exists (represented by the SPDE approximation of a Gaussian random field with a Matérn covariance function). Our final model adopts air temperature, solar exposure, and normalized difference vegetation index (NDVI) as covariates, shows good agreement with previous estimates at a regional scale, and additionally offers rigorous quantification of uncertainty in the estimate.
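
    Model comparison in the study relies on DIC, which rewards fit while penalising effective complexity. Given posterior log-likelihood draws, the score itself is easy to compute; the sketch below shows only this scoring step, not the R-INLA/SPDE machinery, and the numbers in the test are invented.

```python
def dic(logliks_by_draw, loglik_at_mean):
    """Deviance Information Criterion from posterior draws:
    DIC = Dbar + pD, where Dbar is the mean deviance (-2 log L) over
    posterior draws and pD = Dbar - D(theta_bar) is the effective number
    of parameters (the complexity penalty). Lower DIC is better."""
    dbar = sum(-2.0 * ll for ll in logliks_by_draw) / len(logliks_by_draw)
    p_d = dbar - (-2.0 * loglik_at_mean)
    return dbar + p_d
```

    For example, draws with log-likelihoods -10, -12, -11 and a log-likelihood of -10 at the posterior mean give Dbar = 22, pD = 2, and DIC = 24.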

  20. Abstracting Concepts and Methods.

    ERIC Educational Resources Information Center

    Borko, Harold; Bernier, Charles L.

    This text provides a complete discussion of abstracts--their history, production, organization, publication--and of indexing. Instructions for abstracting are outlined, and standards and criteria for abstracting are stated. Management, automation, and personnel are discussed in terms of possible economies that can be derived from the introduction…

  1. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    DTIC Science & Technology

    2012-09-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due ... future evolution of the system, accumulating additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of uncertainty are misrepresenting the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in

  2. Success Prediction: A DDC Bibliography. December 1949--December 1971.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    This document contains bibliographic information, descriptive terms, and abstracts for 145 technical reports on the general subject of success prediction. The bibliography includes reports on development of individuals during military training, peer evaluation, biographical inventory, and the validity of tests which may be used as predictors of…

  3. MOAtox: A comprehensive mode of action and acute aquatic toxicity database for predictive model development (SETAC abstract)

    EPA Science Inventory

    The mode of toxic action (MOA) has been recognized as a key determinant of chemical toxicity and as an alternative to chemical class-based predictive toxicity modeling. However, the development of quantitative structure activity relationship (QSAR) and other models has been limit...

  4. Prediction of lung function response for populations exposed to a wide range of ozone conditions

    EPA Science Inventory

    Abstract Context: A human exposure-response (E-R) model that has previously been demonstrated to accurately predict population mean FEV1 response to ozone exposure has been proposed as the foundation for future risk assessments for ambient ozone. Objective: Fit the origi...

  5. Analogical scaffolding: Making meaning in physics through representation and analogy

    NASA Astrophysics Data System (ADS)

    Podolefsky, Noah Solomon

    This work reviews the literature on analogy, introduces a new model of analogy, and presents a series of experiments that test and confirm the utility of this model to describe and predict student learning in physics with analogy. Pilot studies demonstrate that representations (e.g., diagrams) can play a key role in students' use of analogy. A new model of analogy, Analogical Scaffolding, is developed to explain these initial empirical results. This model will be described in detail, and then applied to describe and predict the outcomes of further experiments. Two large-scale (N>100) studies will demonstrate that: (1) students taught with analogies, according to the Analogical Scaffolding model, outperform students taught without analogies on pre-post assessments focused on electromagnetic waves; (2) the representational forms used to teach with analogy can play a significant role in student learning, with students in one treatment group outperforming students in other treatment groups by factors of two or three. It will be demonstrated that Analogical Scaffolding can be used to predict these results, as well as finer-grained results such as the types of distracters students choose in different treatment groups, and to describe and analyze student reasoning in interviews. Abstraction in physics is reconsidered using Analogical Scaffolding. An operational definition of abstraction is developed within the Analogical Scaffolding framework and employed to explain (a) why physicists consider some ideas more abstract than others in physics, and (b) how students' conceptions of these ideas can be modeled. This new approach to abstraction suggests novel approaches to curriculum design in physics using Analogical Scaffolding.

  6. Cross-scale modeling of surface temperature and tree seedling establishment in mountain landscapes

    USGS Publications Warehouse

    Dingman, John; Sweet, Lynn C.; McCullough, Ian M.; Davis, Frank W.; Flint, Alan L.; Franklin, Janet; Flint, Lorraine E.

    2013-01-01

    Abstract: Introduction: Estimating surface temperature from above-ground field measurements is important for understanding the complex landscape patterns of plant seedling survival and establishment, processes which occur at heights of only several centimeters. Currently, future climate models predict temperature at 2 m above ground, leaving ground-surface microclimate not well characterized. Methods: Using a network of field temperature sensors and climate models, a ground-surface temperature method was used to estimate microclimate variability of minimum and maximum temperature. Temperature lapse rates were derived from field temperature sensors and distributed across the landscape capturing differences in solar radiation and cold air drainages modeled at a 30-m spatial resolution. Results: The surface temperature estimation method used for this analysis successfully estimated minimum surface temperatures on north-facing, south-facing, valley, and ridgeline topographic settings, and when compared to measured temperatures yielded an R2 of 0.88, 0.80, 0.88, and 0.80, respectively. Maximum surface temperatures generally had slightly more spatial variability than minimum surface temperatures, resulting in R2 values of 0.86, 0.77, 0.72, and 0.79 for north-facing, south-facing, valley, and ridgeline topographic settings. Quasi-Poisson regressions predicting recruitment of Quercus kelloggii (black oak) seedlings from temperature variables were significantly improved using these estimates of surface temperature compared to air temperature modeled at 2 m. Conclusion: Predicting minimum and maximum ground-surface temperatures using a downscaled climate model coupled with temperature lapse rates estimated from field measurements provides a method for modeling temperature effects on plant recruitment. Such methods could be applied to improve projections of species’ range shifts under climate change. 
Areas of complex topography can provide intricate microclimates that may allow species to redistribute locally as climate changes.
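
The downscaling step in the entry above distributes a field-derived lapse rate across the landscape. A minimal sketch of that idea (function and parameter names are hypothetical, and the solar-exposure correction is reduced to a single additive term):

```python
def downscale_temperature(t2m_c, lapse_rate_c_per_m, dz_m, radiation_adj_c=0.0):
    """Estimate ground-surface temperature (deg C) from a 2-m air temperature,
    a locally derived lapse rate (deg C per metre), and the elevation offset
    dz_m between the model node and the target cell. radiation_adj_c is an
    optional correction for modeled solar exposure / cold-air drainage."""
    return t2m_c + lapse_rate_c_per_m * dz_m + radiation_adj_c

# e.g. a -0.0065 C/m lapse rate applied 100 m below the model node
surface_t = downscale_temperature(10.0, -0.0065, -100.0)
```

In the study itself the lapse rates and corrections are estimated from the sensor network and 30-m terrain model rather than supplied by hand.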

  7. Quantifying and Validating Rapid Floodplain Geomorphic Evolution, a Monitoring and Modelling Case Study

    NASA Astrophysics Data System (ADS)

    Scott, R.; Entwistle, N. S.

    2017-12-01

    Gravel bed rivers and their associated wider systems present an ideal subject for the development and improvement of rapid monitoring tools, with features dynamic enough to evolve within relatively short timescales. For detecting and quantifying topographical evolution, UAV-based remote sensing has emerged as a reliable, low cost, and accurate means of topographic data collection. Here we present some validated methodologies for detection of geomorphic change at resolutions down to 0.05 m, building on the work of Wheaton et al. (2009) and Milan et al. (2007), to generate mesh-based and pointcloud comparison data and produce a reliable picture of topographic evolution. Results are presented for the River Glen, Northumberland, UK. Recent channel avulsion and floodplain interaction, resulting in damage to flood defence structures, make this site a particularly suitable case for application of geomorphic change detection methods, with the UAV platform at its centre. We compare multi-temporal, high-resolution point clouds derived from SfM processing, cross-referenced with aerial LiDAR data, over a 1.5 km reach of the watercourse. Changes detected included bank erosion, bar and splay deposition, vegetation stripping and incipient channel avulsion. Numerical modelling using the topographic data, carried out with CAESAR-Lisflood, predicted the avulsion of the main channel, resulting in erosion of, and potentially complete circumvention of, the original channel and flood levees. A subsequent UAV survey highlighted topographic change and reconfiguration of the local sedimentary conveyor as we predicted with preliminary modelling. The combined monitoring and modelling approach has allowed probable future geomorphic configurations to be predicted, permitting more informed implementation of channel and floodplain management strategies.
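
The core of geomorphic change detection in work of this kind is differencing two DEMs and discarding changes below a minimum level of detection. A simplified cell-by-cell sketch (not the GCD software of Wheaton et al., which propagates per-cell uncertainty), using the 0.05 m resolution mentioned above as an illustrative threshold:

```python
def dem_of_difference(dem_new, dem_old, lod=0.05):
    """DEM of Difference: elevation change per cell (new minus old), with
    changes smaller in magnitude than the minimum level of detection (lod,
    metres) treated as noise and zeroed out."""
    return [[(b - a) if abs(b - a) >= lod else 0.0
             for a, b in zip(row_old, row_new)]
            for row_old, row_new in zip(dem_old, dem_new)]
```

Positive cells then indicate deposition and negative cells erosion, which can be summed into volumetric budgets for the reach.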

  8. Cognitive control over learning: Creating, clustering and generalizing task-set structure

    PubMed Central

    Collins, Anne G.E.; Frank, Michael J.

    2013-01-01

    Executive functions and learning share common neural substrates essential for their expression, notably in prefrontal cortex and basal ganglia. Understanding how they interact requires studying how cognitive control facilitates learning, but also how learning provides the (potentially hidden) structure, such as abstract rules or task-sets, needed for cognitive control. We investigate this question from three complementary angles. First, we develop a new computational “C-TS” (context-task-set) model inspired by non-parametric Bayesian methods, specifying how the learner might infer hidden structure and decide whether to re-use that structure in new situations, or to create new structure. Second, we develop a neurobiologically explicit model to assess potential mechanisms of such interactive structured learning in multiple circuits linking frontal cortex and basal ganglia. We systematically explore the link between these levels of modeling across multiple task demands. We find that the network provides an approximate implementation of high-level C-TS computations, where manipulations of specific neural mechanisms are well captured by variations in distinct C-TS parameters. Third, this synergism across models yields strong predictions about the nature of human optimal and suboptimal choices and response times during learning. In particular, the models suggest that participants spontaneously build task-set structure into a learning problem when not cued to do so, which predicts positive and negative transfer in subsequent generalization tests. We provide evidence for these predictions in two experiments and show that the C-TS model provides a good quantitative fit to human sequences of choices in this task. These findings implicate a strong tendency to interactively engage cognitive control and learning, resulting in structured abstract representations that afford generalization opportunities, and thus potentially long-term rather than short-term optimality.
PMID:23356780
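
The "re-use or create" decision in non-parametric Bayesian models of this kind is commonly expressed with a Chinese-restaurant-process prior: the probability of re-using an existing task-set grows with how often it has been used, while a concentration parameter reserves mass for creating a new one. A minimal sketch of that prior (the full C-TS model also conditions on context and learned values, which this omits):

```python
def crp_probs(counts, alpha):
    """Chinese restaurant process prior over task-sets.
    counts[k] = number of times task-set k has been used so far;
    alpha = concentration parameter favouring creation of a new set.
    Returns probabilities for re-using each existing set, plus one final
    entry for creating a new set."""
    total = sum(counts) + alpha
    return [c / total for c in counts] + [alpha / total]
```

With usage counts [3, 1] and alpha = 1, the learner re-uses the popular set with probability 0.6 and creates a fresh one with probability 0.2.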

  9. Direct Quantification of Cd2+ in the Presence of Cu2+ by a Combination of Anodic Stripping Voltammetry Using a Bi-Film-Modified Glassy Carbon Electrode and an Artificial Neural Network.

    PubMed

    Zhao, Guo; Wang, Hui; Liu, Gang

    2017-07-03

    Abstract: In this study, a novel method based on a Bi/glassy carbon electrode (Bi/GCE) for quantitatively and directly detecting Cd2+ in the presence of Cu2+ without further electrode modifications by combining square-wave anodic stripping voltammetry (SWASV) and a back-propagation artificial neural network (BP-ANN) has been proposed. The influence of the Cu2+ concentration on the stripping response to Cd2+ was studied. In addition, the effect of the ferrocyanide concentration on the SWASV detection of Cd2+ in the presence of Cu2+ was investigated. A BP-ANN with two inputs and one output was used to establish the nonlinear relationship between the concentration of Cd2+ and the stripping peak currents of Cu2+ and Cd2+. The factors affecting the SWASV detection of Cd2+ and the key parameters of the BP-ANN were optimized. Moreover, the direct calibration model (i.e., adding 0.1 mM ferrocyanide before detection), the BP-ANN model and other prediction models were compared to verify the prediction performance of these models in terms of their mean absolute errors (MAEs), root mean square errors (RMSEs) and correlation coefficients. The BP-ANN model exhibited higher prediction accuracy than the direct calibration model and the other prediction models. Finally, the proposed method was used to detect Cd2+ in soil samples with satisfactory results.
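
The BP-ANN described above maps two inputs (the Cd2+ and Cu2+ stripping peak currents) to one output (estimated Cd2+ concentration). A forward pass of such a network can be sketched in a few lines; the weights here are placeholders, since the real model's weights come from back-propagation training on calibration data:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_cd(i_cd, i_cu, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a tiny two-input feed-forward network: stripping peak
    currents of Cd2+ and Cu2+ in, estimated Cd2+ concentration out.
    w_hidden: list of [w_cd, w_cu] per hidden unit; b_hidden: hidden biases;
    w_out: output weights; b_out: output bias."""
    hidden = [sigmoid(w[0] * i_cd + w[1] * i_cu + b)
              for w, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out
```

The hidden sigmoid layer is what lets the model capture the nonlinear interference of Cu2+ on the Cd2+ signal, which a direct linear calibration cannot.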

  10. Frontal Theta Links Prediction Errors to Behavioral Adaptation in Reinforcement Learning

    PubMed Central

    Cavanagh, James F.; Frank, Michael J.; Klein, Theresa J.; Allen, John J.B.

    2009-01-01

    Investigations into action monitoring have consistently detailed a fronto-central voltage deflection in the Event-Related Potential (ERP) following the presentation of negatively valenced feedback, sometimes termed the Feedback Related Negativity (FRN). The FRN has been proposed to reflect a neural response to prediction errors during reinforcement learning, yet the single trial relationship between neural activity and the quanta of expectation violation remains untested. Although ERP methods are not well suited to single trial analyses, the FRN has been associated with theta band oscillatory perturbations in the medial prefrontal cortex. Medio-frontal theta oscillations have been previously associated with expectation violation and behavioral adaptation and are well suited to single trial analysis. Here, we recorded EEG activity during a probabilistic reinforcement learning task and fit the performance data to an abstract computational model (Q-learning) for calculation of single-trial reward prediction errors. Single-trial theta oscillatory activities following feedback were investigated within the context of expectation (prediction error) and adaptation (subsequent reaction time change). Results indicate that interactive medial and lateral frontal theta activities reflect the degree of negative and positive reward prediction error in the service of behavioral adaptation. These different brain areas use prediction error calculations for different behavioral adaptations: with medial frontal theta reflecting the utilization of prediction errors for reaction time slowing (specifically following errors), but lateral frontal theta reflecting prediction errors leading to working memory-related reaction time speeding for the correct choice. PMID:19969093
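
The single-trial reward prediction errors in the study above come from fitting a Q-learning model to behaviour. The core quantity is the temporal-difference error delta = r + gamma*max_a' Q(s',a') - Q(s,a), which also drives the value update; a minimal sketch (parameter values illustrative, not the authors' fitted ones):

```python
def reward_prediction_error(q, state, action, reward, next_state, gamma=0.9):
    """TD reward prediction error: delta = r + gamma * max_a' Q(s',a') - Q(s,a)."""
    return reward + gamma * max(q[next_state].values()) - q[state][action]

def q_update(q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """Apply one Q-learning step and return the prediction error for the trial."""
    delta = reward_prediction_error(q, state, action, reward, next_state, gamma)
    q[state][action] += alpha * delta
    return delta
```

In the analysis, the per-trial delta values are then regressed against theta-band power rather than used for control.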

  11. Disciplinarity and sport science in Europe: A statistical and sociological study of ECSS conference abstracts.

    PubMed

    Champely, Stéphane; Fargier, Patrick; Camy, Jean

    2017-02-01

    Abstracts of European College of Sports Science conferences (1995-2014) are studied. The number of abstracts has been increasing regularly (+90 per year). This rise is in recent years largely due to extra-European countries. The magnitude and accumulation of the different topics of discussion are examined. An operational criterion determines four stages of evolution of a topic: social network, cluster, specialty, and discipline. The scientific production can, therefore, be classified as disciplinary or non-disciplinary. The disciplinary part is more important but has been less dynamic recently. The cognitive content of sport science is then explored through a multidimensional scaling of the topics based on the keywords used in the abstracts. Three areas are visible: social sciences and humanities, sports medicine and physiology, and biomechanics and neurophysiology. According to the field theory of Bourdieu (1975), three scientific habitus are distinguished. The logic of academic disciplinary excellence is the consequence of the autonomy of this scientific field, its closure, peer-review process, and barriers to entry. The distribution of scientific capital and professional capital is unequal across the three areas. Basically, conservation strategies of academic disciplinary excellence are predicted in biomechanics and neurophysiology, subversion strategies of interdisciplinarity based on professional concerns can appear in the sports medicine and physiology area, and critical strategies of interdisciplinarity based on social utility in social sciences and humanities. Moreover, additional tensions within these areas are depicted. Lastly, methods based on co-citations of disciplines and boundary objects are proposed to find tangible patterns of multidisciplinarity confirming these strategies.

  12. Assessing the impacts of water abstractions on river ecosystem services: an eco-hydraulic modelling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carolli, Mauro, E-mail: mauro.carolli@unitn.it; Geneletti, Davide, E-mail: davide.geneletti@unitn.it; Zolezzi, Guido, E-mail: guido.zolezzi@unitn.it

    The provision of important river ecosystem services (ES) is dependent on the flow regime. This requires methods to assess the impacts on ES caused by interventions on rivers that affect flow regime, such as water abstractions. This study proposes a method to i) quantify the provision of a set of river ES, ii) simulate the effects of water abstraction alternatives that differ in location and abstracted flow, and iii) assess the impact of water abstraction alternatives on the selected ES. The method is based on river modelling science, and integrates spatially distributed hydrological, hydraulic and habitat models at different spatial and temporal scales. The method is applied to the hydropeaked upper Noce River (Northern Italy), which is regulated by hydropower operations. We selected locally relevant river ES: habitat suitability for the adult marble trout, white-water rafting suitability, and hydroelectricity production from run-of-river (RoR) plants. Our results quantify the seasonality of river ES response variables and their intrinsic non-linearity, which explains why the same abstracted flow can produce different effects on trout habitat and rafting suitability depending on the morphology of the abstracted reach. An economic valuation of the examined river ES suggests that incomes from RoR hydropower plants are of comparable magnitude to touristic revenue losses related to the decrease in rafting suitability.

  13. Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech

    PubMed Central

    Huebner, Philip A.; Willits, Jon A.

    2018-01-01

    Previous research has suggested that distributional learning mechanisms may contribute to the acquisition of semantic knowledge. However, distributional learning mechanisms, statistical learning, and contemporary “deep learning” approaches have been criticized for being incapable of learning the kind of abstract and structured knowledge that many think is required for acquisition of semantic knowledge. In this paper, we show that recurrent neural networks, trained on noisy naturalistic speech to children, do in fact learn what appears to be abstract and structured knowledge. We trained two types of recurrent neural networks (Simple Recurrent Network, and Long Short-Term Memory) to predict word sequences in a 5-million-word corpus of speech directed to children ages 0–3 years old, and assessed what semantic knowledge they acquired. We found that learned internal representations are encoding various abstract grammatical and semantic features that are useful for predicting word sequences. Assessing the organization of semantic knowledge in terms of the similarity structure, we found evidence of emergent categorical and hierarchical structure in both models. We found that the Long Short-term Memory (LSTM) and SRN are both learning very similar kinds of representations, but the LSTM achieved higher levels of performance on a quantitative evaluation. We also trained a non-recurrent neural network, Skip-gram, on the same input to compare our results to the state-of-the-art in machine learning. We found that Skip-gram achieves relatively similar performance to the LSTM, but is representing words more in terms of thematic compared to taxonomic relations, and we provide reasons why this might be the case. Our findings show that a learning system that derives abstract, distributed representations for the purpose of predicting sequential dependencies in naturalistic language may provide insight into emergence of many properties of the developing semantic system. 
PMID:29520243
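
The training objective in the study above is next-word prediction over child-directed speech. As a deliberately crude, count-based stand-in for that objective (nothing like an SRN or LSTM, but the same prediction task), a bigram model can be sketched as:

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count-based next-word model: for each word, count which words follow it.
    A toy stand-in for the sequence-prediction objective used to train the
    recurrent networks in the study."""
    model = defaultdict(Counter)
    for w1, w2 in zip(tokens, tokens[1:]):
        model[w1][w2] += 1
    return model

def predict_next(model, word):
    """Most frequent continuation seen in training, or None if unseen."""
    return model[word].most_common(1)[0][0] if model[word] else None
```

Unlike the recurrent models, such a table cannot generalize beyond observed pairs, which is precisely why the paper's emergent abstract structure in distributed representations is notable.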

  14. Solution of magnetic field and eddy current problem induced by rotating magnetic poles (abstract)

    NASA Astrophysics Data System (ADS)

    Liu, Z. J.; Low, T. S.

    1996-04-01

    The magnetic field and eddy current problems induced by rotating permanent magnet poles occur in electromagnetic dampers, magnetic couplings, and many other devices. Whereas numerical techniques, for example finite element methods, can be exploited to study various features of these problems, such as heat generation and drag torque development, the analytical solution is always of interest to designers, since it helps them gain insight into the interdependence of the parameters involved and provides an efficient design tool. Some of the previous work showed that the solution of the eddy current problem due to linearly moving magnet poles can give a satisfactory approximation for the eddy current problem due to rotating fields. However, in many practical cases, especially when the number of magnet poles is small, there is a significant effect of flux focusing due to the geometry. The above approximation can therefore lead to marked errors in the theoretical predictions of the device performance. Bernot et al. recently described an analytical solution in a polar coordinate system where the radial field is excited by a time-varying source. A discussion of an analytical solution of the magnetic field and eddy current problems induced by moving magnet poles in radial field machines will be given in this article. The theoretical predictions obtained from this method are compared with the results obtained from finite element calculations. The validity of the method is also checked by comparing the theoretical predictions with measurements from a test machine. It is shown that the introduced solution leads to a significant improvement in the air gap field prediction as compared with the results obtained from the analytical solution that models the eddy current problems induced by linearly moving magnet poles.

  15. Density functional calculations on structural materials for nuclear energy applications and functional materials for photovoltaic energy applications (abstract only).

    PubMed

    Domain, C; Olsson, P; Becquart, C S; Legris, A; Guillemoles, J F

    2008-02-13

    Ab initio density functional theory calculations are carried out in order to predict the evolution of structural materials under aggressive working conditions such as cases with exposure to corrosion and irradiation, as well as to predict and investigate the properties of functional materials for photovoltaic energy applications. Structural metallic materials used in nuclear facilities are subjected to irradiation which induces the creation of large amounts of point defects. These defects interact with each other as well as with the different elements constituting the alloys, which leads to modifications of the microstructure and the mechanical properties. VASP (Vienna Ab initio Simulation Package) has been used to determine the properties of point defect clusters and also those of extended defects such as dislocations. The resulting quantities, such as interaction energies and migration energies, are used in larger scale simulation methods in order to build predictive tools. For photovoltaic energy applications, ab initio calculations are used in order to search for new semiconductors and possible element substitutions for existing ones in order to improve their efficiency.

  16. Knowledge-based prediction of protein backbone conformation using a structural alphabet.

    PubMed

    Vetrivel, Iyanar; Mahajan, Swapnil; Tyagi, Manoj; Hoffmann, Lionel; Sanejouand, Yves-Henri; Srinivasan, Narayanaswamy; de Brevern, Alexandre G; Cadet, Frédéric; Offmann, Bernard

    2017-01-01

    Libraries of structural prototypes that abstract protein local structures are known as structural alphabets and have proven to be very useful in various aspects of protein structure analyses and predictions. One such library, Protein Blocks, is composed of 16 standard 5-residue-long structural prototypes. This form of analyzing proteins involves drafting its structure as a string of Protein Blocks. Predicting the local structure of a protein in terms of protein blocks is the general objective of this work. A new approach, PB-kPRED, is proposed towards this aim. It involves (i) organizing the structural knowledge in the form of a database of pentapeptide fragments extracted from all protein structures in the PDB and (ii) applying a knowledge-based algorithm that does not rely on any secondary structure predictions and/or sequence alignment profiles, to scan this database and predict the most probable backbone conformations for the protein local structures. PB-kPRED does, however, preferentially use structural information from homologues when it is available. The predictions were evaluated rigorously on 15,544 query proteins representing a non-redundant subset of the PDB filtered at 30% sequence identity cut-off. We have shown that the kPRED method was able to achieve mean accuracies ranging from 40.8% to 66.3% depending on the availability of homologues. The impact of the different strategies for scanning the database on the prediction was evaluated and is discussed. Our results highlight the usefulness of the method in the context of proteins without any known structural homologues. A scoring function that gives a good estimate of the accuracy of prediction was further developed. This score estimates very well the accuracy of the algorithm (R2 of 0.82). An online version of the tool is provided freely for non-commercial usage at http://www.bo-protscience.fr/kpred/.
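
The scanning step described above slides a 5-residue window along the query sequence and looks each pentapeptide up in the fragment database. A toy sketch of that idea (the dictionary here is a hypothetical stand-in for the PDB-derived database, which in PB-kPRED also scores competing conformations rather than storing a single block per fragment):

```python
def predict_protein_blocks(sequence, fragment_db, unknown="Z"):
    """Slide a 5-residue window over the sequence and look up each
    pentapeptide in a fragment database mapping it to a Protein Block
    letter; unmatched fragments get a placeholder letter."""
    blocks = []
    for i in range(len(sequence) - 4):
        penta = sequence[i:i + 5]
        blocks.append(fragment_db.get(penta, unknown))
    return "".join(blocks)
```

The output is a string over the structural alphabet, one letter per overlapping window, which is the representation the method's accuracy is measured on.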

  17. Assessing effects of water abstraction on fish assemblages in Mediterranean streams

    USGS Publications Warehouse

    Benejam, Lluis; Angermeier, Paul L.; Munne, Antoni; García-Berthou, Emili

    2010-01-01

    1. Water abstraction strongly affects streams in arid and semiarid ecosystems, particularly where there is a Mediterranean climate. Excessive abstraction reduces the availability of water for human uses downstream and impairs the capacity of streams to support native biota. 2. We investigated the flow regime and related variables in six river basins of the Iberian Peninsula and show that they have been strongly altered, with declining flows (autoregressive models) and groundwater levels during the 20th century. These streams had lower flows and more frequent droughts than predicted by the official hydrological model used in this region. Three of these rivers were sometimes dry, whereas they were predicted by the model to be permanently flowing. Meanwhile, there has been no decrease in annual precipitation. 3. We also investigated the fish assemblage of a stream in one of these river basins (Tordera) for 6 years and show that sites more affected by water abstraction display significant differences in four fish metrics (catch per unit effort, number of benthic species, number of intolerant species and proportional abundance of intolerant individuals) commonly used to assess the biotic condition of streams. 4. We discuss the utility of these metrics in assessing impacts of water abstraction and point out the need for detailed characterisation of the natural flow regime (and hence drought events) prior to the application of biotic indices in streams severely affected by water abstraction. In particular, in cases of artificially dry streams, it is more appropriate for regulatory agencies to assign index scores that reflect biotic degradation than to assign ‘missing’ scores, as is presently customary in assessments of Iberian streams.

  18. Dynamics of Small-Scale Perched Aquifers in the Semi-Arid South-Western Region of Madagascar and Implications for the Sustainable Groundwater Exploitation

    NASA Astrophysics Data System (ADS)

    Englert, A.; Brinkmann, K.; Kobbe, S.; Buerkert, A.

    2016-12-01

    The south-western region of Madagascar is characterized by limited water resources throughout the year and recurrent droughts, which affect agricultural production and increase the risk of food insecurity. To deliver reliable estimates on the availability and dynamics of water resources, we studied the hydrogeology of several villages in the Mahafaly region. Detailed investigations were conducted for a selected village on a calcareous plateau to predict the local water resources under changing boundary conditions, including enhanced water abstraction and changes in groundwater recharge. In 2014 a participatory monitoring network was established, which allowed groundwater level measurements in three wells twice a day. Additional hydrogeological investigations included pumping tests, automatic monitoring of meteorological data, daily groundwater abstraction appraisal and mapping of the spatial extent of the perched aquifer using satellite data. Analysis of the measured data revealed that the aquifer dynamics are dominated by a groundwater-level-driven leakage process, superimposed by groundwater recharge in the rainy season and by daily groundwater abstraction. Based on these findings we developed a model for the aquifer, which allows us to predict the duration of groundwater availability as a function of annual precipitation and daily water abstraction. This model will be implemented in an agent-based land-use model, where groundwater abstraction is a function of population and livestock. The main objective is to model land use scenarios and global trends (climate, market trends and population development) through explicit embedding of artificial and natural groundwater dynamics. This is expected to enable the evaluation of additional water abstraction for agricultural purposes without endangering the water supply of the local population and their livestock.
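
The aquifer behaviour described above (level-driven leakage, plus recharge and daily pumping) amounts to a simple daily water balance. A bucket-model sketch under assumed units (all terms as per-day water-column equivalents in metres; parameter values illustrative, not calibrated to the Mahafaly data):

```python
def simulate_aquifer(h0, recharge, abstraction, leak_coeff, days):
    """Daily explicit water balance for a perched aquifer: head rises with
    recharge and declines through level-driven leakage (leak_coeff * h)
    plus pumping; head cannot fall below zero (aquifer dry)."""
    h, series = h0, [h0]
    for _ in range(days):
        h = max(0.0, h + recharge - abstraction - leak_coeff * h)
        series.append(h)
    return series
```

Running such a model over a dry season gives the "duration of groundwater availability" as the day the head first hits zero, which is the quantity the agent-based land-use model would consume.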

  19. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    Abstract: To overcome the problems of significant differences among samples and of nonlinearity between the property and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method was first used to obtain the net analyte signal of the calibration samples and unknown samples; then the Euclidean distance between the net analyte signal of a sample and the net analyte signals of the calibration samples was calculated and used as a similarity index. According to this similarity index, a local calibration set was individually selected for each unknown sample. Finally, a local PLS regression model was built on each local calibration set for each unknown sample. The proposed method was applied to a set of near infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and a conventional local regression algorithm based on spectral Euclidean distance.
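
The select-then-fit structure of such local regression algorithms can be sketched in a few lines. This toy version substitutes a plain scalar distance for the net-analyte-signal similarity and an ordinary least-squares line for the local PLS model, so it only illustrates the neighbourhood-selection idea, not the NAS projection itself:

```python
def local_fit(query, calib_x, calib_y, k=3):
    """Pick the k calibration samples nearest the query (scalar distance here
    stands in for the NAS-based similarity index) and fit a local straight
    line y = a + b*x by least squares, then evaluate it at the query."""
    nearest = sorted(range(len(calib_x)),
                     key=lambda i: abs(calib_x[i] - query))[:k]
    xs = [calib_x[i] for i in nearest]
    ys = [calib_y[i] for i in nearest]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a + b * query
```

Because each unknown sample gets its own calibration subset and model, the approach adapts to between-sample differences that a single global model averages away.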

  20. Chemotherapy Necessitates Increased Immune Control of HHVs: A Cause of Persistent Inflammation Enabling Protracted Fatigue in Breast Cancer Survivors

    DTIC Science & Technology

    2014-10-01

    This work hypothesizes that chemotherapy can permanently alter the balance between the immune system and chronic herpes virus...infections. We predicted that herpes virus-driven inflammatory cytokines exacerbate cancer treatment related fatigue (CTRF). Here we report the significant... Subject terms: breast cancer, chemotherapy, immunology, human herpes viruses, survivor fatigue

  1. Molecular Solutions to Low Vision Resulting from Battlefield Injuries

    DTIC Science & Technology

    2007-05-01

    nerve regeneration; (5) dry eye by determining how to minimize dry eye after LASIK refractive surgery by developing new tests to predict pre-disposition...Conjunctiva after Corneal Wounding Mimicking Nerve Loss in LASIK Surgery, Invest. Ophthalmol. Vis. Sci. 2006 47: E-Abstract 4600. CONCLUSIONS We...Mimicking Nerve Loss in LASIK Surgery, Invest. Ophthalmol. Vis. Sci. 2006 47: E-Abstract 4600. Table 1. Alterations in phosphoprotein levels in non-wounded

  2. Software Design Description for the Tidal Open-boundary Prediction System (TOPS)

    DTIC Science & Technology

    2010-05-04

    Naval Research Laboratory, Stennis Space Center, MS 39529-5004. Report NRL/MR/7320--10-9209. Approved for public release; distribution is unlimited.

  3. Modeling Multiple Stresses Placed Upon A Groundwater System In A Semi-Arid Brackish Environment

    NASA Astrophysics Data System (ADS)

    Toll, M.; Salameh, E.; Sauter, M.

    2008-12-01

    In semi-arid areas, groundwater systems are frequently not sufficiently characterized hydrogeologically, and long-term data records are generally not available. Long-term time series are necessary, however, to design future groundwater abstraction scenarios or to predict the influence of future climate change on groundwater resources. To overcome these problems, an integrated approach for the provision of a reliable database based on sparse and fuzzy data is proposed. This integrated approach is demonstrated in the lowermost area of the Jordan Valley. The Jordan Valley is part of the Jordan Dead Sea Wadi Araba Rift Valley, which extends from the Red Sea to Lake Tiberias and beyond, with a major 107 km sinistral strike-slip fault between the Arabian plate to the east and the northeastern part of the African plate to the west. Due to extensional forces, a topographic depression was formed. As a result of the arid environment, it is filled with evaporites, lacustrine sediments, and clastic fluvial components. A subtropical climate with hot, dry summers and mild, humid winters with low amounts of rainfall provides excellent farming conditions. The Jordan Valley is therefore considered the food basket of Jordan and is used intensively for agriculture. As a result, hundreds of shallow wells were drilled and large amounts of groundwater were abstracted, since groundwater is the major source for irrigation. Consequently, groundwater quality has decreased rapidly since the sixties, and signs of overpumping and an increase in soil salinity can clearly be seen. In order to achieve a sustainable state of water resources and to quantify the impact of climate change on water resources, a proper assessment of the groundwater resources as well as their quality is a prerequisite. In order to sufficiently describe the complex hydrogeologic flow system, an integrated approach combining geological, geophysical, hydrogeological, historical, and chemical methods was chosen.
The aquifer geometry and composition are described with the help of geological, hydrochemical, and geophysical methods. As far as the water budget is concerned, the recharge to the considered aquifer is estimated with geological methods and available data sets, while the abstraction from the aquifer is estimated with the help of remote sensing techniques. A historical approach is used to detect the general conditions under which the groundwater system operated in the past. This information is then implemented into a flow model. On the basis of these findings, a numerical 3-D transient model integrating all important features of the hydrogeological system was developed. In order to give reliable predictions about the impacts of climate change scenarios on the groundwater system, the flow model was tested against stress periods identified during the historical review of the test area (model period: 1955-2008). These stress periods include periods of intense rainfall, of drought, and of anthropogenic impacts, such as the building of storage dams and violent conflicts. Recommendations for future sustainable groundwater abstractions are given.

  4. A prior-based integrative framework for functional transcriptional regulatory network inference

    PubMed Central

    Siahpirani, Alireza F.

    2017-01-01

    Abstract Transcriptional regulatory networks specify the regulatory proteins controlling the context-specific expression levels of genes. Inference of genome-wide regulatory networks is central to understanding gene regulation but remains an open challenge. Expression-based network inference is among the most popular methods to infer regulatory networks; however, networks inferred from such methods have low overlap with experimentally derived (e.g. ChIP-chip and transcription factor (TF) knockout) networks, and we currently have a limited understanding of this discrepancy. To address this gap, we first develop a regulatory network inference algorithm, based on probabilistic graphical models, to integrate expression with auxiliary datasets supporting a regulatory edge. Second, we comprehensively analyze our and other state-of-the-art methods on different expression perturbation datasets. Networks inferred by integrating sequence-specific motifs with expression have substantially greater agreement with experimentally derived networks, while remaining more predictive of expression than motif-based networks. Our analysis suggests natural genetic variation is the most informative perturbation for network inference, and identifies core TFs whose targets are predictable from expression. Multiple reasons make the identification of targets of other TFs difficult, including network architecture and insufficient variation of TF mRNA levels. Finally, we demonstrate the utility of our inference algorithm for inferring stress-specific regulatory networks and for regulator prioritization. PMID:27794550

  5. Maintainability Improvement Through Corrosion Prediction

    DTIC Science & Technology

    1997-12-01

    Aluminum base alloys - Mechanical properties; Lithium - Alloying elements; Crack propagation - Corrosion effects; Fatigue life - Corrosion... effects on the corrosion fatigue life of 7075-T6 aluminum alloy. Ma, L. CORPORATE SOURCE: University of Utah. JOURNAL: Dissertation Abstracts International... Diffusion effects; Hydrogen - Diffusion. SECTION HEADINGS: 64 (Corrosion) 52. 715866 87-640094 The Life Prediction for 2024

  6. USE OF INTERSPECIES CORRELATION ESTIMATIONS TO PREDICT HC5'S BASED ON QSAR

    EPA Science Inventory

    Dyer, S.D., S. Belanger, J. Chaney, D. Versteeg and F. Mayer. In press. Use of Interspecies Correlation Estimations to predict HC5's Based on QSARs (Abstract). To be presented at the SETAC Europe 14th Annual Meeting: Environmental Science Solution: A Pan-European Perspective, 18-...

  7. Reliability Prediction for Aerospace Electronics

    DTIC Science & Technology

    2015-04-20

    RESEARCH AUTHORITY, 3 KIRYAT HAMADA, ARIEL, ISRAEL. EOARD GRANT FA9550-14-1-0216. Report Date: April 2015. Final Report for 15 July 2014 to 14... Ariel, Israel. Period of Performance: 15 July 2014 - 14 April 2015. Abstract... AFRL-AFOSR-UK-TR-2015-0028. Reliability Prediction for Aerospace Electronics. Joseph B. Bernstein, ARIEL UNIVERSITY

  8. IN SILICO APPROACHES TO MECHANISTIC AND PREDICTIVE TOXICOLOGY: AN INTRODUCTION TO BIOINFORMATICS FOR TOXICOLOGISTS. (R827402)

    EPA Science Inventory

    Abstract

    Bioinformatics, or in silico biology, is a rapidly growing field that encompasses the theory and application of computational approaches to model, predict, and explain biological function at the molecular level. This information rich field requires new ...

  9. A NONSTEADY-STATE ANALYTICAL MODEL TO PREDICT GASEOUS EMISSIONS OF VOLATILE ORGANIC COMPOUNDS FROM LANDFILLS. (R825689C072)

    EPA Science Inventory

    Abstract

    A general mathematical model is developed to predict emissions of volatile organic compounds (VOCs) from hazardous or sanitary landfills. The model is analytical in nature and includes important mechanisms occurring in unsaturated subsurface landfill environme...

  10. Navigating the Functional Landscape of Transcription Factors via Non-Negative Tensor Factorization Analysis of MEDLINE Abstracts

    PubMed Central

    Roy, Sujoy; Yun, Daqing; Madahian, Behrouz; Berry, Michael W.; Deng, Lih-Yuan; Goldowitz, Daniel; Homayouni, Ramin

    2017-01-01

    In this study, we developed and evaluated a novel text-mining approach, using non-negative tensor factorization (NTF), to simultaneously extract and functionally annotate transcriptional modules consisting of sets of genes, transcription factors (TFs), and terms from MEDLINE abstracts. A sparse 3-mode term × gene × TF tensor was constructed that contained weighted frequencies of 106,895 terms in 26,781 abstracts shared among 7,695 genes and 994 TFs. The tensor was decomposed into sub-tensors using non-negative tensor factorization (NTF) across 16 different approximation ranks. Dominant entries of each of 2,861 sub-tensors were extracted to form term–gene–TF annotated transcriptional modules (ATMs). More than 94% of the ATMs were found to be enriched in at least one KEGG pathway or GO category, suggesting that the ATMs are functionally relevant. One advantage of this method is that it can discover potentially new gene–TF associations from the literature. Using a set of microarray and ChIP-Seq datasets as gold standard, we show that the precision of our method for predicting gene–TF associations is significantly higher than chance. In addition, we demonstrate that the terms in each ATM can be used to suggest new GO classifications to genes and TFs. Taken together, our results indicate that NTF is useful for simultaneous extraction and functional annotation of transcriptional regulatory networks from unstructured text, as well as for literature based discovery. A web tool called Transcriptional Regulatory Modules Extracted from Literature (TREMEL), available at http://binf1.memphis.edu/tremel, was built to enable browsing and searching of ATMs. PMID:28894735
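The decomposition step described above can be illustrated with a small stand-alone sketch. This is not the authors' implementation: it is a basic non-negative CP (PARAFAC) decomposition of a 3-mode tensor using multiplicative updates, with arbitrary dimensions and rank chosen only for illustration.

```python
import numpy as np

def ntf_cp(T, rank, n_iter=200, eps=1e-9, seed=0):
    """Non-negative CP decomposition of a 3-mode tensor via
    multiplicative updates (a simple stand-in for the paper's NTF)."""
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    A = rng.random((I, rank))
    B = rng.random((J, rank))
    C = rng.random((K, rank))

    def kr(U, V):
        # column-wise Khatri-Rao product, rows indexed (u, v), v fastest
        return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

    for _ in range(n_iter):
        # each update is NMF-style on the corresponding mode unfolding
        A *= (T.reshape(I, -1) @ kr(B, C)) / (A @ ((B.T @ B) * (C.T @ C)) + eps)
        B *= (T.transpose(1, 0, 2).reshape(J, -1) @ kr(A, C)) / (B @ ((A.T @ A) * (C.T @ C)) + eps)
        C *= (T.transpose(2, 0, 1).reshape(K, -1) @ kr(A, B)) / (C @ ((A.T @ A) * (B.T @ B)) + eps)
    return A, B, C

def reconstruct(A, B, C):
    return np.einsum('ir,jr,kr->ijk', A, B, C)
```

In the paper's setting the three modes would be term, gene, and TF, and dominant entries of each rank-1 component would form one annotated transcriptional module.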

  11. Enhanced Regulatory Sequence Prediction Using Gapped k-mer Features

    PubMed Central

    Mohammad-Noori, Morteza; Beer, Michael A.

    2014-01-01

    Abstract Oligomers of length k, or k-mers, are convenient and widely used features for modeling the properties and functions of DNA and protein sequences. However, k-mers suffer from the inherent limitation that if the parameter k is increased to resolve longer features, the probability of observing any specific k-mer becomes very small, and k-mer counts approach a binary variable, with most k-mers absent and a few present once. Thus, any statistical learning approach using k-mers as features becomes susceptible to noisy training set k-mer frequencies once k becomes large. To address this problem, we introduce alternative feature sets using gapped k-mers, a new classifier, gkm-SVM, and a general method for robust estimation of k-mer frequencies. To make the method applicable to large-scale genome wide applications, we develop an efficient tree data structure for computing the kernel matrix. We show that compared to our original kmer-SVM and alternative approaches, our gkm-SVM predicts functional genomic regulatory elements and tissue specific enhancers with significantly improved accuracy, increasing the precision by up to a factor of two. We then show that gkm-SVM consistently outperforms kmer-SVM on human ENCODE ChIP-seq datasets, and further demonstrate the general utility of our method using a Naïve-Bayes classifier. Although developed for regulatory sequence analysis, these methods can be applied to any sequence classification problem. PMID:25033408
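The gapped k-mer idea above can be made concrete with a short sketch: count all length-l windows in which only k positions are informative and the rest are wildcards. This is only an illustration of the feature construction, not the gkm-SVM kernel or its tree-based implementation; parameter values are arbitrary.

```python
from itertools import combinations
from collections import Counter

def gapped_kmers(seq, l=4, k=3):
    """Count gapped k-mer features: every length-l window contributes one
    count for each choice of k informative positions, with the remaining
    l-k positions replaced by a wildcard '.'."""
    feats = Counter()
    for i in range(len(seq) - l + 1):
        window = seq[i:i + l]
        for pos in combinations(range(l), k):
            key = ''.join(window[p] if p in pos else '.' for p in range(l))
            feats[key] += 1
    return feats
```

Because many distinct l-mers collapse onto the same gapped pattern, counts stay informative even when l is large enough that exact l-mer counts would be mostly zeros, which is the robustness argument made in the abstract.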

  12. Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).

    PubMed

    Paivio, Allan

    2013-02-01

    Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics.

  13. Convincing similar and dissimilar others: the power of language abstraction in political communication.

    PubMed

    Menegatti, Michela; Rubini, Monica

    2013-05-01

    Three studies examined the production of political messages and their persuasive impact on recipients as a function of speaker-audience similarity. The first two studies found support for the hypothesis that political leaders (Study 1) and party activists (Study 2) formulate more abstract messages when the audience is politically similar to them than when the audience is dissimilar or heterogeneous. The third study examined the persuasive impact of message abstractness versus concreteness. We predicted and found that abstract messages are more effective in convincing an audience whose political positions are similar to the speaker's and concrete messages are more effective in convincing an audience whose political positions differ from the speaker's or are heterogeneous. Implications of these findings for the relation between language and social cognition are discussed.

  14. The effects of recall-concurrent visual-motor distraction on picture and word recall.

    PubMed

    Warren, M W

    1977-05-01

    The dual-coding model (Paivio, 1971, 1975) predicts a larger imaginal component in the recall of pictures relative to words and a larger imaginal component in the recall of concrete words relative to abstract words. These predictions were tested by examining the effect of a recall-concurrent imagery-suppression task (pursuit-rotor tracking) on the recall of pictures vs picture labels and on the recall of concrete words vs abstract words. The results showed that recall-concurrent pursuit-rotor tracking interfered with picture recall, but not word recall (Experiments 1 and 2); however, there was no evidence of an effect of recall-concurrent tracking on the recall of concrete words (Experiment 3). The results suggested a revision of the dual-coding model.

  15. Grounding Abstractness: Abstract Concepts and the Activation of the Mouth

    PubMed Central

    Borghi, Anna M.; Zarcone, Edoardo

    2016-01-01

    One key issue for theories of cognition is how abstract concepts, such as freedom, are represented. According to the WAT (Words As social Tools) proposal, abstract concepts activate both sensorimotor and linguistic/social information, and their acquisition modality involves the linguistic experience more than the acquisition of concrete concepts. We report an experiment in which participants were presented with abstract and concrete definitions followed by concrete and abstract target-words. When the definition and the word matched, participants were required to press a key, either with the hand or with the mouth. Response times and accuracy were recorded. As predicted, we found that abstract definitions and abstract words yielded slower responses and more errors compared to concrete definitions and concrete words. More crucially, there was an interaction between the target-words and the effector used to respond (hand, mouth). While responses with the mouth were overall slower, the advantage of the hand over the mouth responses was more marked with concrete than with abstract concepts. The results are in keeping with grounded and embodied theories of cognition and support the WAT proposal, according to which abstract concepts evoke linguistic-social information, hence activate the mouth. The mechanisms underlying the mouth activation with abstract concepts (re-enactment of acquisition experience, or re-explanation of the word meaning, possibly through inner talk) are discussed. To our knowledge this is the first behavioral study demonstrating with real words that the advantage of the hand over the mouth is more marked with concrete than with abstract concepts, likely because of the activation of linguistic information with abstract concepts. PMID:27777563

  16. Comparison of human gastrocnemius forces predicted by Hill-type muscle models and estimated from ultrasound images

    PubMed Central

    Biewener, Andrew A.; Wakeling, James M.

    2017-01-01

    ABSTRACT Hill-type models are ubiquitous in the field of biomechanics, providing estimates of a muscle's force as a function of its activation state and its assumed force–length and force–velocity properties. However, despite their routine use, the accuracy with which Hill-type models predict the forces generated by muscles during submaximal, dynamic tasks remains largely unknown. This study compared human gastrocnemius forces predicted by Hill-type models with the forces estimated from ultrasound-based measures of tendon length changes and stiffness during cycling, over a range of loads and cadences. We tested both a traditional model, with one contractile element, and a differential model, with two contractile elements that accounted for independent contributions of slow and fast muscle fibres. Both models were driven by subject-specific, ultrasound-based measures of fascicle lengths, velocities and pennation angles and by activation patterns of slow and fast muscle fibres derived from surface electromyographic recordings. The models predicted, on average, 54% of the time-varying gastrocnemius forces estimated from the ultrasound-based methods. However, differences between predicted and estimated forces were smaller under low speed–high activation conditions, with models able to predict nearly 80% of the gastrocnemius force over a complete pedal cycle. Additionally, the predictions from the Hill-type muscle models tested here showed that a similar pattern of force production could be achieved for most conditions with and without accounting for the independent contributions of different muscle fibre types. PMID:28202584
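The one-contractile-element model the study tests multiplies activation by assumed force-length and force-velocity relations and projects along the pennation angle. A minimal sketch follows; the Gaussian force-length curve, the Hill hyperbola, and all parameter values here are generic textbook choices, not the subject-specific properties used in the paper.

```python
import numpy as np

def hill_force(a, l_norm, v_norm, F_max=1.0, pennation=0.0,
               width=0.45, af=0.25):
    """Single-contractile-element Hill-type force estimate.
    a        : activation in [0, 1]
    l_norm   : fascicle length / optimal fascicle length
    v_norm   : shortening velocity / max shortening velocity (>= 0)
    pennation: pennation angle in radians
    """
    f_l = np.exp(-((l_norm - 1.0) / width) ** 2)   # force-length (Gaussian)
    f_v = (1.0 - v_norm) / (1.0 + v_norm / af)     # force-velocity (Hill hyperbola)
    return a * F_max * f_l * max(f_v, 0.0) * np.cos(pennation)
```

A differential (two-element) variant would evaluate this expression separately for slow and fast fibre pools, each with its own activation signal and velocity scaling, and sum the contributions.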

  17. A Lagrangian cylindrical coordinate system for characterizing dynamic surface geometry of tubular anatomic structures.

    PubMed

    Lundh, Torbjörn; Suh, Ga-Young; DiGiacomo, Phillip; Cheng, Christopher

    2018-03-03

    Vascular morphology characterization is useful for disease diagnosis, risk stratification, treatment planning, and prediction of treatment durability. To quantify the dynamic surface geometry of tubular-shaped anatomic structures, we propose a simple, rigorous Lagrangian cylindrical coordinate system to monitor well-defined surface points. Specifically, the proposed system enables quantification of surface curvature and cross-sectional eccentricity. Using idealized software phantom examples, we validate the method's ability to accurately quantify longitudinal and circumferential surface curvature, as well as eccentricity and orientation of eccentricity. We then apply the method to several medical imaging data sets of human vascular structures to exemplify the utility of this coordinate system for analyzing morphology and dynamic geometric changes in blood vessels throughout the body. Graphical abstract: Pointwise longitudinal curvature of a thoracic aortic endograft surface for systole and diastole, with their absolute difference.
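The longitudinal-curvature quantity mentioned above can be illustrated on a sampled 3-D curve using the standard formula kappa = |r' x r''| / |r'|^3 with finite differences. This is only a generic sketch of pointwise curvature on a centerline, not the paper's surface-point coordinate system.

```python
import numpy as np

def centerline_curvature(points):
    """Pointwise curvature of a sampled 3-D curve (N x 3 array) via
    kappa = |r' x r''| / |r'|^3, using finite differences.
    The formula is invariant to the curve's parameterization, so
    derivatives with respect to sample index suffice."""
    d1 = np.gradient(points, axis=0)   # first derivative
    d2 = np.gradient(d1, axis=0)       # second derivative
    num = np.linalg.norm(np.cross(d1, d2), axis=1)
    den = np.linalg.norm(d1, axis=1) ** 3
    return num / den
```

On a circle of radius R the interior values converge to 1/R; the one-sided differences at the ends of an open curve are less accurate.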

  18. Functional competency and cognitive ability in mild Alzheimer's disease: relationship between ADL assessed by a relative/carer-rated scale and neuropsychological performance.

    PubMed

    Matsuda, Osamu; Saito, Masahiko

    2005-06-01

    Alzheimer's disease (AD) is characterized by multiple cognitive deficits and affects functional competency to perform daily activities (ADL). As this may contribute to the patient's overall disability, it is important to identify factors that compromise competency. The relationship between different cognitive domains and functional activities in AD was studied. The functional competency of 73 Japanese AD patients, most with mild dementia, was assessed using a 27-item relative/carer-rating scale covering 7 ADL: managing finances, using transportation, taking precautions, self-care, housekeeping, communication and taking medicine. Cognitive assessment used 16 neuropsychological tests from the Japanese version of the WAIS-R and COGNISTAT, covering 9 cognitive domains: orientation, attention, episodic memory, semantic memory, language, visuoperceptual and construction abilities, computational ability, abstract thinking, and psychomotor speed. Multiple regression analysis by the stepwise method indicated that functional competency could, for the most part, be predicted from test scores for orientation, abstract thinking and psychomotor speed. The results of this study suggest that impairment of these three cognitive domains plays an important role in the functional deterioration of AD.

  19. RNA-protein binding motifs mining with a new hybrid deep learning based cross-domain knowledge integration approach.

    PubMed

    Pan, Xiaoyong; Shen, Hong-Bin

    2017-02-28

    RNAs play key roles in cells through interactions with proteins known as RNA-binding proteins (RBPs), and their binding motifs enable crucial understanding of the post-transcriptional regulation of RNAs. How RBPs correctly recognize their target RNAs and why they bind specific positions is still far from clear. Machine learning-based algorithms are widely acknowledged to be capable of speeding up this process. Although many automatic tools have been developed to predict RNA-protein binding sites from the rapidly growing multi-resource data, e.g. sequence and structure, their domain-specific features and formats have posed significant computational challenges. One current difficulty is that the knowledge shared across sources sits at a higher abstraction level than the observed data, resulting in a low efficiency of direct integration of observed data across domains. The other difficulty is how to interpret the prediction results: existing approaches tend to terminate after outputting the potential discrete binding sites on the sequences, but how to assemble them into meaningful binding motifs is a topic worthy of further investigation. In view of these challenges, we propose a deep learning-based framework (iDeep) using a novel hybrid convolutional neural network and deep belief network to predict RBP interaction sites and motifs on RNAs. This new protocol transforms the original observed data into a high-level abstraction feature space using multiple layers of learning blocks, where the shared representations across different domains are integrated.
To validate our iDeep method, we performed experiments on 31 large-scale CLIP-seq datasets. Our results show that by integrating multiple sources of data, the average AUC can be improved by 8% compared to the best single-source-based predictor, and that through cross-domain knowledge integration at an abstraction level, iDeep outperforms the state-of-the-art predictors by 6%. Besides the overall enhanced prediction performance, the convolutional neural network module embedded in iDeep is also able to automatically capture interpretable binding motifs for RBPs. Large-scale experiments demonstrate that these mined binding motifs agree well with experimentally verified results, suggesting that iDeep is a promising approach for real-world applications. The iDeep framework not only achieves better performance than state-of-the-art predictors but also easily captures interpretable binding motifs. iDeep is available at http://www.csbio.sjtu.edu.cn/bioinf/iDeep.
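The motif-capturing behavior of a convolutional layer can be illustrated in miniature: a single filter slid over a one-hot-encoded RNA sequence scores highest where the sequence matches the filter's pattern. This toy scan is not the iDeep architecture; the sequence and the 'GCAUG' filter below are made-up examples.

```python
import numpy as np

def one_hot(seq):
    """One-hot encode an RNA sequence as a (len, 4) matrix."""
    idx = {'A': 0, 'C': 1, 'G': 2, 'U': 3}
    m = np.zeros((len(seq), 4))
    for i, base in enumerate(seq):
        m[i, idx[base]] = 1.0
    return m

def conv_scan(seq, filt):
    """Slide a single convolutional filter (width x 4 weight matrix)
    over the one-hot sequence; returns one score per position."""
    x = one_hot(seq)
    w = len(filt)
    return np.array([np.sum(x[i:i + w] * filt) for i in range(len(seq) - w + 1)])
```

In a trained network, filters whose learned weights resemble position weight matrices play exactly this role, which is why the convolutional module can be read out as binding motifs.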

  20. Knowledge acquisition for temporal abstraction.

    PubMed

    Stein, A; Musen, M A; Shahar, Y

    1996-01-01

    Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.

  1. Linear free energy relationships between aqueous phase hydroxyl radical reaction rate constants and free energy of activation.

    PubMed

    Minakata, Daisuke; Crittenden, John

    2011-04-15

    The hydroxyl radical (HO(•)) is a strong oxidant that reacts with electron-rich sites on organic compounds and initiates complex radical chain reactions in aqueous phase advanced oxidation processes (AOPs). Computer based kinetic modeling requires a reaction pathway generator and predictions of associated reaction rate constants. Previously, we reported a reaction pathway generator that can enumerate the most important elementary reactions for aliphatic compounds. For the reaction rate constant predictor, we develop linear free energy relationships (LFERs) between aqueous phase literature-reported HO(•) reaction rate constants and theoretically calculated free energies of activation for H-atom abstraction from a C-H bond and HO(•) addition to alkenes. The theoretical method uses ab initio quantum mechanical calculations, Gaussian 1-3, for gas phase reactions and a solvation method, COSMO-RS theory, to estimate the impact of water. Theoretically calculated free energies of activation are found to be within approximately ±3 kcal/mol of experimental values. Considering errors that arise from quantum mechanical calculations and experiments, this should be within the acceptable errors. The established LFERs are used to predict the HO(•) reaction rate constants within a factor of 5 from the experimental values. This approach may be applied to other reaction mechanisms to establish a library of rate constant predictions for kinetic modeling of AOPs.
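The core of an LFER is a straight-line fit between the logarithm of the rate constant and the free energy of activation, after which a newly computed activation energy maps to a predicted rate constant. The sketch below uses entirely synthetic, hypothetical data (the coefficients 10.5 and -0.45 are invented for illustration) rather than the paper's fitted relationships.

```python
import numpy as np

# hypothetical training data: computed activation free energies (kcal/mol)
# and corresponding log10 rate constants, synthetic for illustration only
dG_act = np.array([3.0, 4.2, 5.1, 6.3, 7.0, 8.4])
log_k = 10.5 - 0.45 * dG_act + np.random.default_rng(1).normal(0, 0.05, 6)

# fit the linear free energy relationship log10(k) = a * dG + b
slope, intercept = np.polyfit(dG_act, log_k, 1)

def predict_k(dG_new):
    """Predict a HO* rate constant from a computed activation free energy."""
    return 10 ** (intercept + slope * dG_new)
```

A ±3 kcal/mol uncertainty in the computed activation energy, as quoted in the abstract, translates through the fitted slope into roughly the factor-of-5 spread in predicted rate constants the authors report.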

  2. Thermodynamic DFT analysis of natural gas.

    PubMed

    Neto, Abel F G; Huda, Muhammad N; Marques, Francisco C; Borges, Rosivaldo S; Neto, Antonio M J C

    2017-08-01

    Density functional theory calculations were performed for thermodynamic predictions on natural gas, using the B3LYP/6-311++G(d,p), B3LYP/6-31+G(d), CBS-QB3, G3, and G4 methods. Additionally, we carried out thermodynamic predictions using averaged G3/G4 results. The calculations were performed for each major component of seven kinds of natural gas and for their respective air + natural gas mixtures at thermal equilibrium between room temperature and the initial temperature of a combustion chamber during the injection stage. The following thermodynamic properties were obtained: internal energy, enthalpy, Gibbs free energy, and entropy, which enabled us to investigate the thermal resistance of the fuels. We also estimated an important parameter, the specific heat ratio of each natural gas; this allowed us to compare the results with empirical functions of this parameter, where the B3LYP/6-311++G(d,p) and G3/G4 methods showed better agreement. In addition, relevant information on the thermal and mechanical resistance of natural gases was investigated, as well as the standard thermodynamic properties for the combustion of natural gas. We thus show that density functional theory can be useful for predicting the thermodynamic properties of natural gas, enabling the production of more efficient compositions for the investigated fuels. Graphical abstract: Investigation of the thermodynamic properties of natural gas through the canonical ensemble model and density functional theory.

  3. Prediction of Reduction Potentials of Copper Proteins with Continuum Electrostatics and Density Functional Theory

    PubMed Central

    Fowler, Nicholas J.; Blanford, Christopher F.

    2017-01-01

    Abstract Blue copper proteins, such as azurin, show dramatic changes in Cu2+/Cu+ reduction potential upon mutation over the full physiological range. Hence, they have important functions in electron transfer and oxidation chemistry and have applications in industrial biotechnology. The details of what determines these reduction potential changes upon mutation are still unclear. Moreover, it has been difficult to model and predict the reduction potential of azurin mutants and currently no unique procedure or workflow pattern exists. Furthermore, high‐level computational methods can be accurate but are too time consuming for practical use. In this work, a novel approach for calculating reduction potentials of azurin mutants is shown, based on a combination of continuum electrostatics, density functional theory and empirical hydrophobicity factors. Our method accurately reproduces experimental reduction potential changes of 30 mutants with respect to wildtype within experimental error and highlights the factors contributing to the reduction potential change. Finally, reduction potentials are predicted for a series of 124 new mutants that have not yet been investigated experimentally. Several mutants are identified that are located well over 10 Å from the copper center that change the reduction potential by more than 85 mV. The work shows that secondary coordination sphere mutations mostly lead to long‐range electrostatic changes and hence can be modeled accurately with continuum electrostatics. PMID:28815759

  4. Predicting the mortality in geriatric patients with dengue fever

    PubMed Central

    Huang, Hung-Sheng; Hsu, Chien-Chin; Ye, Je-Chiuan; Su, Shih-Bin; Huang, Chien-Cheng; Lin, Hung-Jung

    2017-01-01

    Abstract Geriatric patients have high mortality for dengue fever (DF); however, there is no adequate method to predict mortality in geriatric patients. Therefore, we conducted this study to develop a tool in an attempt to address this issue. We conducted a retrospective case–control study in a tertiary medical center during the DF outbreak in Taiwan in 2015. All the geriatric patients (aged ≥65 years) who visited the study hospital between September 1, 2015, and December 31, 2015, were recruited into this study. Variables included demographic data, vital signs, symptoms and signs, comorbidities, living status, laboratory data, and 30-day mortality. We investigated independent mortality predictors by univariate analysis and multivariate logistic regression analysis and then combined these predictors to predict the mortality. A total of 627 geriatric DF patients were recruited, with a mortality rate of 4.3% (27 deaths and 600 survivals). The following 4 independent mortality predictors were identified: severe coma [Glasgow Coma Scale: ≤8; adjusted odds ratio (AOR): 11.36; 95% confidence interval (CI): 1.89–68.19], bedridden (AOR: 10.46; 95% CI: 1.58–69.16), severe hepatitis (aspartate aminotransferase >1000 U/L; AOR: 96.08; 95% CI: 14.11–654.40), and renal failure (serum creatinine >2 mg/dL; AOR: 6.03; 95% CI: 1.50–24.24). When we combined the predictors, we found that the sensitivity, specificity, positive predictive value, and negative predictive value for patients with 1 or more predictors were 70.37%, 88.17%, 21.11%, and 98.51%, respectively. For patients with 2 or more predictors, the respective values were 33.33%, 99.44%, 57.14%, and 98.51%. We developed a new method to help decision making. Among geriatric patients with none of the predictors, the survival rate was 98.51%, and among those with 2 or more predictors, the mortality rate was 57.14%. This method is simple and useful, especially in an outbreak. PMID:28906367
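The combined-predictor rule described above amounts to counting how many of the four independent predictors a patient meets. A minimal sketch, using only the thresholds and predictive values quoted in the abstract:

```python
def mortality_predictors(gcs, bedridden, ast, creatinine):
    """Count the four independent mortality predictors reported in the
    study: severe coma (GCS <= 8), bedridden status, severe hepatitis
    (AST > 1000 U/L), and renal failure (creatinine > 2 mg/dL)."""
    return sum([gcs <= 8, bedridden, ast > 1000, creatinine > 2])

def risk_note(n_predictors):
    """Map the predictor count to the figures reported in the abstract."""
    if n_predictors == 0:
        return "survival rate 98.51%"
    if n_predictors == 1:
        return "PPV for mortality 21.11% (>=1 predictor)"
    return "PPV for mortality 57.14% (>=2 predictors)"
```

For example, a patient with GCS 7, bedridden, AST 1500 U/L, and creatinine 2.5 mg/dL meets all four predictors and falls in the highest-risk group.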

  5. Models to predict length of stay in the Intensive Care Unit after coronary artery bypass grafting: a systematic review.

    PubMed

    Atashi, Alireza; Verburg, Ilona W; Karim, Hesam; Miri, Mirmohammad; Abu-Hanna, Ameen; de Jonge, Evert; de Keizer, Nicolette F; Eslami, Saeid

    2018-06-01

    Intensive Care Unit (ICU) length of stay (LoS) prediction models are used to compare institutions and surgeons on their performance, and are useful as efficiency indicators for quality control. There is little consensus about which prediction methods are most suitable for predicting ICU LoS. The aim of this study is to systematically review models for predicting ICU LoS after coronary artery bypass grafting (CABG) and to assess the reporting and methodological quality of these models for use in benchmarking. A general search was conducted in Medline and Embase up to 31-12-2016. Three authors classified the papers for inclusion by reading their title, abstract, and full text. All original papers describing development and/or validation of a prediction model for LoS in the ICU after CABG surgery were included. We used a checklist developed for critical appraisal and data extraction in systematic reviews of prediction modeling and extended it to cover the handling of specific patient subgroups. We also defined additional items and scores to assess the methodological and reporting quality of the models. Of 5181 uniquely identified articles, fifteen studies were included, of which twelve described development of new models and three described validation of existing models. All studies used linear or logistic regression for model development and reported various performance measures based on the difference between predicted and observed ICU LoS. Most used a prospective (46.6%) or retrospective (40%) study design. We found heterogeneity in patient inclusion/exclusion criteria, sample size, reported accuracy rates, and methods of candidate predictor selection. Most (60%) studies did not mention the handling of missing values, and none compared the model outcome measure between survivors and non-survivors. For model development and validation studies respectively, the maximum reporting (methodological) scores were 66/78 and 62/62 (14/22 and 12/22). 
There are relatively few models for predicting ICU length of stay after CABG. Several aspects of methodological and reporting quality of studies in this field should be improved. There is a need for standardizing outcome and risk factor definitions in order to develop/validate a multi-institutional and international risk scoring system.

  6. Is a Picture Worth a Thousand Words? Using Images to Create a Concreteness Effect for Abstract Words: Evidence from Beginning L2 Learners of Spanish

    ERIC Educational Resources Information Center

    Farley, Andrew; Pahom, Olga; Ramonda, Kris

    2014-01-01

    This study examines the lexical representation and recall of abstract words by beginning L2 learners of Spanish in the light of the predictions of the dual coding theory (Paivio 1971; Paivio and Desrochers 1980). Ninety-seven learners (forty-four males and fifty-three females) were randomly placed in the picture or non-picture group and taught…

  7. Molecular Solutions to Low Vision Resulting from Battlefield Injuries

    DTIC Science & Technology

    2006-05-01

    promote optic nerve regeneration; (5) dry eye by determining how to minimize dry eye after LASIK refractive surgery by developing new tests to predict...Nerve Loss in LASIK Surgery, Invest. Ophthalmol. Vis. Sci. 2006 47: E-Abstract 4600. CONCLUSIONS We conclude that in the normal mouse conjunctiva, the...

  8. Application of Computational and High-Throughput in vitro ...

    EPA Pesticide Factsheets

    Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models. This talk will review a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for a variety of toxicity endpoints, with a focus on their potential to be endocrine disruptors as part of the Endocrine Disruptor Screening Program (EDSP). These methods all start with the use of in vitro assays, e.g. for activity against the estrogen and androgen receptors (ER and AR) and targets in the steroidogenesis and thyroid signaling pathways. Because all individual assays are subject to a variety of noise processes and technology-specific assay artefacts, we have developed methods to create consensus predictions from multiple assays against the same target. The goal of these models is to both robustly predict in vivo activity, and also to provide quantitative estimates of uncertainty. This talk will describe these models, and how they are validated against both in vitro and in vivo reference chemicals. The U.S. EPA has deemed the in vitro ER model results to be of high enough accuracy t

  9. Application of computational and high-throughput in vitro ...

    EPA Pesticide Factsheets

    Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models. This talk will review a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for their potential to be endocrine disruptors as part of the Endocrine Disruptor Screening Program (EDSP). These methods all start with the use of in vitro assays, e.g. for activity against the estrogen and androgen receptors (ER and AR) and targets in the steroidogenesis and thyroid signaling pathways. Because all individual assays are subject to a variety of noise processes and technology-specific assay artefacts, we have developed methods to create consensus predictions from multiple assays against the same target. The goal of these models is to both robustly predict in vivo activity, and also to provide quantitative estimates of uncertainty. This talk will describe these models, and how they are validated against both in vitro and in vivo reference chemicals. The U.S. EPA has deemed the in vitro ER model results to be of high enough accuracy to be used as a substitute for the current EDSP Ti

  10. Decomposition of the complex system into nonlinear spatio-temporal modes: algorithm and application to climate data mining

    NASA Astrophysics Data System (ADS)

    Feigin, Alexander; Gavrilov, Andrey; Loskutov, Evgeny; Mukhin, Dmitry

    2015-04-01

    Proper decomposition of a complex system into well-separated "modes" is a way to reveal and understand the mechanisms governing the system's behaviour, as well as to discover essential feedbacks and nonlinearities. Decomposition is also a natural procedure for constructing models, of both the corresponding sub-systems and of the system as a whole, that are adequate yet as simple as possible. In recent works, two new methods of decomposition of the Earth's climate system into well-separated modes were discussed. The first method [1-3] is based on MSSA (Multichannel Singular Spectrum Analysis) [4] for linear expansion of vector (space-distributed) time series and makes allowance for delayed correlations between processes recorded at spatially separated points. The second [5-7] constructs nonlinear dynamic modes but neglects delayed correlations. It was demonstrated [1-3] that the first method provides effective separation of different time scales but prevents correct reduction of the data dimension: the slope of the variance spectrum of the spatio-temporal empirical orthogonal functions, which are the "structural material" for linear spatio-temporal modes, is too flat. The second method overcomes this problem: the variance spectrum of nonlinear modes falls much more sharply [5-7]. However, neglecting time-lag correlations introduces a mode-selection error that is uncontrolled and grows with the mode's time scale. In this report we combine these two methods so that the resulting algorithm allows construction of nonlinear spatio-temporal modes. The algorithm is applied to the decomposition of (i) multi-hundred-year, globally distributed data generated by the INM RAS Coupled Climate Model [8], and (ii) a 156-year time series of SST anomalies distributed over the globe [9]. 
We compare efficiency of different methods of decomposition and discuss the abilities of nonlinear spatio-temporal modes for construction of adequate and concurrently simplest ("optimal") models of climate systems. 1. Feigin A.M., Mukhin D., Gavrilov A., Volodin E.M., and Loskutov E.M. (2013) "Separation of spatial-temporal patterns ("climatic modes") by combined analysis of really measured and generated numerically vector time series", AGU 2013 Fall Meeting, Abstract NG33A-1574. 2. Alexander Feigin, Dmitry Mukhin, Andrey Gavrilov, Evgeny Volodin, and Evgeny Loskutov (2014) "Approach to analysis of multiscale space-distributed time series: separation of spatio-temporal modes with essentially different time scales", Geophysical Research Abstracts, Vol. 16, EGU2014-6877. 3. Dmitry Mukhin, Dmitri Kondrashov, Evgeny Loskutov, Andrey Gavrilov, Alexander Feigin, and Michael Ghil (2014) "Predicting critical transitions in ENSO models, Part II: Spatially dependent models", Journal of Climate (accepted, doi: 10.1175/JCLI-D-14-00240.1). 4. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 5. Dmitry Mukhin, Andrey Gavrilov, Evgeny M Loskutov and Alexander M Feigin (2014) "Nonlinear Decomposition of Climate Data: a New Method for Reconstruction of Dynamical Modes", AGU 2014 Fall Meeting, Abstract NG43A-3752. 6. Andrey Gavrilov, Dmitry Mukhin, Evgeny Loskutov, and Alexander Feigin (2015) "Empirical decomposition of climate data into nonlinear dynamic modes", Geophysical Research Abstracts, Vol. 17, EGU2015-627. 7. Dmitry Mukhin, Andrey Gavrilov, Evgeny Loskutov, Alexander Feigin, and Juergen Kurths (2015) "Reconstruction of principal dynamical modes from climatic variability: nonlinear approach", Geophysical Research Abstracts, Vol. 17, EGU2015-5729. 8. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm. 9. 
http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/.
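The MSSA step referenced as the first method amounts to delay-embedding each channel, stacking the lagged copies into a trajectory matrix, and extracting spatio-temporal EOFs from its SVD. A minimal sketch on toy data (the window length M and the synthetic series are assumptions for illustration, not the authors' settings):

```python
import numpy as np

def mssa_modes(X, M):
    """Multichannel SSA: X is a (T, C) time series, M the embedding window.

    Returns the singular values (variance spectrum) and spatio-temporal
    EOFs of the stacked, mean-centred trajectory matrix.
    """
    T, C = X.shape
    N = T - M + 1
    # Trajectory matrix: M lagged copies of each channel, side by side.
    traj = np.hstack([
        np.column_stack([X[j:j + N, c] for j in range(M)])
        for c in range(C)
    ])                                            # shape (N, M*C)
    U, s, Vt = np.linalg.svd(traj - traj.mean(0), full_matrices=False)
    return s, Vt

# Toy field: one oscillatory mode observed at two "points" with a phase lag.
rng = np.random.default_rng(0)
t = np.arange(500)
X = np.column_stack([np.sin(0.1 * t), np.sin(0.1 * t + 1.0)])
X += 0.1 * rng.standard_normal(X.shape)
s, Vt = mssa_modes(X, M=40)
print(s[:4] / s.sum())  # a dominant pair of EOFs captures the oscillation
```

The dominant sine/cosine pair illustrates why MSSA separates time scales well; the flat tail of the spectrum is the noise floor the abstract refers to.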

  11. Toward a detailed understanding of search trajectories in fragment assembly approaches to protein structure prediction

    PubMed Central

    Handl, Julia; Lovell, Simon C.

    2016-01-01

    ABSTRACT Energy functions, fragment libraries, and search methods constitute three key components of fragment‐assembly methods for protein structure prediction, which are all crucial for their ability to generate high‐accuracy predictions. All of these components are tightly coupled; efficient searching becomes more important as the quality of fragment libraries decreases. Given these relationships, there is currently a poor understanding of the strengths and weaknesses of the sampling approaches currently used in fragment‐assembly techniques. Here, we determine how the performance of search techniques can be assessed in a meaningful manner, given the above problems. We describe a set of techniques that aim to reduce the impact of the energy function, and assess exploration in view of the search space defined by a given fragment library. We illustrate our approach using Rosetta and EdaFold, and show how certain features of these methods encourage or limit conformational exploration. We demonstrate that individual trajectories of Rosetta are susceptible to local minima in the energy landscape, and that this can be linked to non‐uniform sampling across the protein chain. We show that EdaFold's novel approach can help balance broad exploration with locating good low‐energy conformations. This occurs through two mechanisms which cannot be readily differentiated using standard performance measures: exclusion of false minima, followed by an increasingly focused search in low‐energy regions of conformational space. Measures such as ours can be helpful in characterizing new fragment‐based methods in terms of the quality of conformational exploration realized. Proteins 2016; 84:411–426. © 2016 The Authors Proteins: Structure, Function, and Bioinformatics Published by Wiley Periodicals, Inc. PMID:26799916

  12. Emerging Patterns of Abstraction in Environmental Education: A Review of Materials, Methods and Professional Development Perspectives

    ERIC Educational Resources Information Center

    O'Donoghue, Rob; Russo, Vladimir

    2004-01-01

    This paper examines how emerging materials and associated methods became inscribed within and have shaped developing patterns of practice in environmental education. In so doing, it gives attention to how materials and methods have informed methodological narratives and shaped abstracted propositions used in professional development activities.…

  13. Promoting Students' Self-Directed Learning Ability through Teaching Mathematics for Social Justice

    ERIC Educational Resources Information Center

    Voss, Richard; Rickards, Tony

    2016-01-01

    Mathematics is a subject which is often taught using abstract methods and processes. These methods may, by their very nature, alienate students from the relationship between Mathematics and real-life situations. Further, these abstract methods and processes may disenfranchise students from becoming self-directed learners of Mathematics. A solution to…

  14. Theoretical Kinetics Analysis for Ḣ Atom Addition to 1,3-Butadiene and Related Reactions on the Ċ4H7 Potential Energy Surface.

    PubMed

    Li, Yang; Klippenstein, Stephen J; Zhou, Chong-Wen; Curran, Henry J

    2017-10-12

    The oxidation chemistry of the simplest conjugated hydrocarbon, 1,3-butadiene, can provide a first step in understanding the role of polyunsaturated hydrocarbons in combustion and, in particular, an understanding of their contribution toward soot formation. On the basis of our previous work on propene and the butene isomers (1-, 2-, and isobutene), it was found that the reaction kinetics of Ḣ-atom addition to the C═C double bond plays a significant role in fuel consumption kinetics and influences the predictions of high-temperature ignition delay times, product species concentrations, and flame speed measurements. In this study, the rate constants and thermodynamic properties for Ḣ-atom addition to 1,3-butadiene and related reactions on the Ċ4H7 potential energy surface have been calculated using two different series of quantum chemical methods and two different kinetic codes. Excellent agreement is obtained between the two kinetics codes. The calculated results, including zero-point energies, single-point energies, rate constants, barrier heights, and thermochemistry, are systematically compared between the two quantum chemical methods. The 1-methylallyl (Ċ4H7 1-3) and 3-buten-1-yl (Ċ4H7 1-4) radicals and C2H4 + Ċ2H3 are found to be the most important channels and reactivity-promoting products, respectively. We calculated that terminal addition is dominant (>80%) compared to internal Ḣ-atom addition at all temperatures in the range 298-2000 K. However, this dominance decreases with increasing temperature. The calculated rate constants for the bimolecular reactions C4H6 + Ḣ → products and C2H4 + Ċ2H3 → products are in excellent agreement with both experimental and theoretical results from the literature. For selected C4 species, the calculated thermochemical values are also in good agreement with literature data. 
In addition, the rate constants for H atom abstraction by Ḣ atoms have also been calculated, and it is found that abstraction from the central carbon atoms is the dominant channel (>70%) at temperatures in the range of 298-2000 K. Finally, by incorporating our calculated rate constants for both Ḣ atom addition and abstraction into our recently developed 1,3-butadiene model, we show that laminar flame speed predictions are significantly improved, emphasizing the value of this study.
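The temperature dependence of the terminal/internal branching described above follows from the ratio of the two channels' rate constants. A minimal sketch using the modified Arrhenius form k(T) = A·Tⁿ·exp(-Ea/RT), with purely illustrative parameters (the A, n, and Ea values below are hypothetical, not the fitted values from this study):

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def k_mod_arrhenius(A, n, Ea, T):
    """Modified Arrhenius rate constant k(T) = A * T**n * exp(-Ea / (R*T))."""
    return A * T**n * math.exp(-Ea / (R * T))

# Hypothetical channel parameters: terminal addition has the lower barrier.
terminal = dict(A=1.0e8, n=1.6, Ea=1.2)   # kcal/mol
internal = dict(A=1.0e8, n=1.6, Ea=4.0)

fracs = {}
for T in (500, 1000, 2000):
    kt = k_mod_arrhenius(T=T, **terminal)
    ki = k_mod_arrhenius(T=T, **internal)
    fracs[T] = kt / (kt + ki)        # branching fraction to terminal addition
    print(T, round(fracs[T], 3))
```

With any barrier gap of this kind, the terminal fraction shrinks as T rises, reproducing qualitatively the trend the abstract reports.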

  15. Sex, drugs and moral goals: reproductive strategies and views about recreational drugs

    PubMed Central

    Kurzban, Robert; Dukes, Amber; Weeden, Jason

    2010-01-01

    Humans, unlike most other species, show intense interest in the activities of conspecifics, even when the activities in question pose no obvious fitness threat or opportunity. Here, we investigate one content domain in which people show substantial interest, the use of drugs for non-medical purposes. Drawing from two subject populations—one undergraduate and one Internet-based—we look at the relationships among (i) abstract political commitments; (ii) attitudes about sexuality; and (iii) views surrounding recreational drugs. Whereas some theories suggest that drug views are best understood as the result of abstract political ideology, we suggest that these views can be better understood in the context of reproductive strategy. We show that, as predicted by a strategic construal, drug attitudes are best predicted by sexual items rather than by abstract political commitments and, further, that the relationships between factors such as political ideology and drug views, while positive, are reduced to zero or nearly zero when items assessing sexuality are controlled for. We conclude that considering morality from the standpoint of strategic interests is a potentially useful way to understand why humans care about third party behaviour. PMID:20554547

  16. Effects of induced rumination on body dissatisfaction: Is there any difference between men and women?

    PubMed

    Rivière, Julie; Rousseau, Amélie; Douilliez, Céline

    2018-05-19

    Rumination is a factor in the development and maintenance of body dissatisfaction. However, no study has yet investigated the impact of the type of rumination on body image. The first aim of this study was to examine whether the induction of analytic-abstract vs. concrete-experiential rumination affects body dissatisfaction following an induction of negative body image. The second objective was to examine gender differences in these effects. Following induction of negative body image, 102 university undergraduates were randomly assigned to one of three experimental conditions: distraction, concrete rumination, or abstract rumination. As expected, there were significant main effects of gender and condition, and a significant interaction between gender and condition on change in body dissatisfaction. In women, abstract rumination predicted the highest increase in body dissatisfaction, whereas in men concrete rumination predicted the highest increase. Given that our sample consisted of undergraduate students, our findings cannot be generalized to clinical samples suffering from eating disorders. The different types of rumination seem to impact body dissatisfaction differently in men and women. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Multivariate Patterns in the Human Object-Processing Pathway Reveal a Shift from Retinotopic to Shape Curvature Representations in Lateral Occipital Areas, LO-1 and LO-2

    PubMed Central

    Vernon, Richard J. W.; Gouws, André D.; Lawrence, Samuel J. D.; Wade, Alex R.

    2016-01-01

    Representations in early visual areas are organized on the basis of retinotopy, but this organizational principle appears to lose prominence in the extrastriate cortex. Nevertheless, an extrastriate region, such as the shape-selective lateral occipital cortex (LO), must still base its activation on the responses from earlier retinotopic visual areas, implying that a transition from retinotopic to “functional” organizations should exist. We hypothesized that such a transition may lie in LO-1 or LO-2, two visual areas lying between retinotopically defined V3d and functionally defined LO. Using a rapid event-related fMRI paradigm, we measured neural similarity in 12 human participants between pairs of stimuli differing along dimensions of shape exemplar and shape complexity within both retinotopically and functionally defined visual areas. These neural similarity measures were then compared with low-level and more abstract (curvature-based) measures of stimulus similarity. We found that low-level, but not abstract, stimulus measures predicted V1–V3 responses, whereas the converse was true for LO, a double dissociation. Critically, abstract stimulus measures were most predictive of responses within LO-2, akin to LO, whereas both low-level and abstract measures were predictive for responses within LO-1, perhaps indicating a transitional point between those two organizational principles. Similar transitions to abstract representations were not observed in the more ventral stream passing through V4 and VO-1/2. The transition we observed in LO-1 and LO-2 demonstrates that a more “abstracted” representation, typically considered the preserve of “category-selective” extrastriate cortex, can nevertheless emerge in retinotopic regions. SIGNIFICANCE STATEMENT Visual areas are typically identified either through retinotopy (e.g., V1–V3) or from functional selectivity [e.g., shape-selective lateral occipital complex (LOC)]. 
We combined these approaches to explore the nature of shape representations through the visual hierarchy. Two different representations emerged: the first reflected low-level shape properties (dependent on the spatial layout of the shape outline), whereas the second captured more abstract curvature-related shape features. Critically, early visual cortex represented low-level information but this diminished in the extrastriate cortex (LO-1/LO-2/LOC), in which the abstract representation emerged. Therefore, this work further elucidates the nature of shape representations in the LOC, provides insight into how those representations emerge from early retinotopic cortex, and crucially demonstrates that retinotopically tuned regions (LO-1/LO-2) are not necessarily constrained to retinotopic representations. PMID:27225766

  18. Overhead Projector Spectrum of Polymethine Dye: A Physical Chemistry Demonstration.

    ERIC Educational Resources Information Center

    Solomon, Sally; Hur, Chinhyu

    1995-01-01

    Encourages the incorporation into lecture of live experiments that can be predicted or interpreted with abstract models. A demonstration is described where the position of the predominant peak of 1,1'-diethyl-4,4'-cyanine iodide is measured in class using an overhead projector spectrometer, then predicted using the model of a particle in a…

  19. USE OF INTERSPECIES CORRELATION ESTIMATIONS TO PREDICT HC5'S BASED ON MINIMAL DATA

    EPA Science Inventory

    Dyer, S., S. Belanger, J. Chaney, D. Versteeg and F. Mayer. In press. Use of Interspecies Correlation Estimations to Predict HC5's Based on Minimal Data (Abstract). To be presented at the SETAC Fourth World Congress, 14-18 November 2004, Portland, OR. 1 p. (ERL,GB R1013).

  20. Improving Environmental Model Calibration and Prediction

    DTIC Science & Technology

    2011-01-18

    Final Report: Improving Environmental Model Calibration and Prediction (report date 18-01-2011). Abstract: First, we have continued to develop tools for efficient global optimization of environmental models. Our algorithms are hybrid algorithms that combine evolutionary strategies...toward practical hybrid optimization tools for environmental models.

  1. Measurement of reaeration coefficients for selected Florida streams

    USGS Publications Warehouse

    Hampson, P.S.; Coffin, J.E.

    1989-01-01

    A total of 29 separate reaeration coefficient determinations were performed on 27 subreaches of 12 selected Florida streams between October 1981 and May 1985. Measurements performed before June 1984 used the peak and area methods with ethylene and propane as the tracer gases; later measurements used the steady-state method with propane as the only tracer gas. The reaeration coefficients ranged from 1.07 to 45.9 per day, with a mean estimated probable error of ±16.7%. Ten predictive equations (compiled from the literature) were also evaluated using the measured coefficients. The most representative equation was one of the energy dissipation type, with a standard error of 60.3%. Seven of the 10 predictive equations were modified using the measured coefficients and nonlinear regression techniques. The most accurate of the developed equations was also of the energy dissipation form and had a standard error of 54.9%. For 5 of the 13 subreaches in which both ethylene and propane were used, the ethylene data resulted in substantially larger reaeration coefficient values, which were rejected. In these reaches, ethylene concentrations were probably significantly affected by one or more electrophilic addition reactions known to occur in aqueous media. (Author's abstract)

  2. Virtual reality laparoscopy: which potential trainee starts with a higher proficiency level?

    PubMed

    Paschold, M; Schröder, M; Kauff, D W; Gorbauch, T; Herzer, M; Lang, H; Kneist, W

    2011-09-01

    Minimally invasive surgery requires technical skills distinct from those used in conventional surgery. The aim of this prospective study was to identify personal characteristics that may predict the attainable proficiency level of first-time virtual reality laparoscopy (VRL) trainees. Two hundred and seventy-nine consecutive undergraduate medical students without prior experience attended a standardized VRL training. Performance data from an abstract task and a procedural task were correlated with factors potentially predictive of competence in VRL. The median global score requirement status was 86.7% (interquartile range (IQR) 75-93) for the abstract task and 74.4% (IQR 67-88) for the procedural task. Unadjusted analysis showed a significant increase in the global score on both tasks for trainees who had a gaming console at home and frequently used it, as well as for trainees who felt self-confident to assist in a laparoscopic operation. Multiple logistic regression analysis identified frequency of video gaming (often/frequently vs. rarely/not at all; odds ratio: abstract model 2.1 (95% confidence interval 1.2; 3.6), P = 0.009; virtual reality operation procedure 2.4 (95% confidence interval 1.3; 4.2), P = 0.003) as a predictive factor for VRL performance. Frequency of video gaming is associated with the quality of first-time VRL performance. Video game experience may be used as a trainee selection criterion for tailored VRL training programs.
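Odds ratios like those reported above are exponentiated logistic-regression coefficients, and the Wald 95% confidence interval is symmetric on the log scale. A minimal sketch that back-derives the coefficient and standard error implied by the reported OR of 2.1 (95% CI 1.2-3.6); the derived β and SE are our reconstruction for illustration, not values from the paper:

```python
import math

def odds_ratio(beta):
    """For a binary predictor in logistic regression, OR = exp(beta)."""
    return math.exp(beta)

def ci_95(beta, se):
    """Wald 95% confidence interval for the odds ratio."""
    return math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se)

beta = math.log(2.1)                                # coefficient implied by OR = 2.1
se = (math.log(3.6) - math.log(1.2)) / (2 * 1.96)   # SE implied by the CI bounds
print(round(odds_ratio(beta), 1))                   # -> 2.1
print(tuple(round(x, 1) for x in ci_95(beta, se)))  # -> (1.2, 3.6)
```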

  3. Predictive coding accelerates word recognition and learning in the early stages of language development.

    PubMed

    Ylinen, Sari; Bosseler, Alexis; Junttila, Katja; Huotilainen, Minna

    2017-11-01

    The ability to predict future events in the environment and learn from them is a fundamental component of adaptive behavior across species. Here we propose that inferring predictions facilitates speech processing and word learning in the early stages of language development. Twelve- and 24-month olds' electrophysiological brain responses to heard syllables are faster and more robust when the preceding word context predicts the ending of a familiar word. For unfamiliar, novel word forms, however, word-expectancy violation generates a prediction error response, the strength of which significantly correlates with children's vocabulary scores at 12 months. These results suggest that predictive coding may accelerate word recognition and support early learning of novel words, including not only the learning of heard word forms but also their mapping to meanings. Prediction error may mediate learning via attention, since infants' attention allocation to the entire learning situation in natural environments could account for the link between prediction error and the understanding of word meanings. On the whole, the present results on predictive coding support the view that principles of brain function reported across domains in humans and non-human animals apply to language and its development in the infant brain. A video abstract of this article can be viewed at: http://hy.fi/unitube/video/e1cbb495-41d8-462e-8660-0864a1abd02c. [Correction added on 27 January 2017, after first online publication: The video abstract link was added.]. © 2016 John Wiley & Sons Ltd.

  4. Annual Quality Assurance Conference Files by Tom Mancuso

    EPA Pesticide Factsheets

    25th Annual Quality Assurance Conference. Abstract: Learn about the NEW EPA Method 325b for Refinery Fence Line Monitoring and TO-17 Extended for Soil Gas by Tom Mancuso and Abstract: Success Using Alternate Carrier Gases for Volatile Methods

  5. Cell death, perfusion and electrical parameters are critical in models of hepatic radiofrequency ablation

    PubMed Central

    Hall, Sheldon K.; Ooi, Ean H.; Payne, Stephen J.

    2015-01-01

    Abstract Purpose: A sensitivity analysis has been performed on a mathematical model of radiofrequency ablation (RFA) in the liver. The purpose of this is to identify the most important parameters in the model, defined as those that produce the largest changes in the prediction. This is important in understanding the role of uncertainty and when comparing the model predictions to experimental data. Materials and methods: The Morris method was chosen to perform the sensitivity analysis because it is ideal for models with many parameters or that take a significant length of time to obtain solutions. A comprehensive literature review was performed to obtain ranges over which the model parameters are expected to vary, crucial input information. Results: The most important parameters in predicting the ablation zone size in our model of RFA are those representing the blood perfusion, electrical conductivity and the cell death model. The size of the 50 °C isotherm is sensitive to the electrical properties of tissue while the heat source is active, and to the thermal parameters during cooling. Conclusions: The parameter ranges chosen for the sensitivity analysis are believed to represent all that is currently known about their values in combination. The Morris method is able to compute global parameter sensitivities taking into account the interaction of all parameters, something that has not been done before. Research is needed to better understand the uncertainties in the cell death, electrical conductivity and perfusion models, but the other parameters are only of second order, providing a significant simplification. PMID:26000972
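The Morris method used in this sensitivity analysis ranks parameters by elementary effects: one-at-a-time perturbations from many random base points, summarized per parameter by μ* (mean absolute effect) and σ (spread, which reflects nonlinearity and interactions). A minimal radial-design sketch on a toy model (the model and parameter bounds below are illustrative, not the RFA model itself):

```python
import numpy as np

def morris_effects(model, bounds, r=50, delta=0.5, seed=0):
    """One-at-a-time Morris screening in normalized [0, 1] parameter space.

    For r random base points, perturb each parameter by `delta` and record
    the elementary effect EE = (f(x + delta*e_j) - f(x)) / delta.
    Returns mu* (mean |EE|) and sigma (std of EE) per parameter.
    """
    rng = np.random.default_rng(seed)
    k = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    ee = np.empty((r, k))
    for i in range(r):
        x = rng.uniform(0, 0.5, size=k)        # leave room to add delta
        y0 = model(lo + (hi - lo) * x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta
            ee[i, j] = (model(lo + (hi - lo) * xp) - y0) / delta
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# Toy "ablation" model: strongly driven by p0, weakly by p2.
model = lambda p: 10 * p[0] + p[1] ** 2 + 0.1 * p[2]
mu_star, sigma = morris_effects(model, bounds=[(0, 1), (0, 1), (0, 1)])
print(np.round(mu_star, 2))   # p0 has by far the largest mu*
```

Ranking parameters by μ* is what identifies perfusion, electrical conductivity, and the cell-death parameters as dominant in the study above, while small-μ* parameters can be fixed at nominal values.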

  6. Performance of wave function and density functional methods for water hydrogen bond spin-spin coupling constants.

    PubMed

    García de la Vega, J M; Omar, S; San Fabián, J

    2017-04-01

    Spin-spin coupling constants in the water monomer and dimer have been calculated using several wave function and density functional based methods. The CCSD, MCSCF, and SOPPA wave function methods yield similar results, especially when an additive approach is used with MCSCF. Several functionals were used to analyze their performance along Jacob's ladder, and a set of functionals with different fractions of HF exchange was tested. Functionals with large HF exchange appropriately predict the 1J(OH), 2J(HH), and 2hJ(OO) couplings, while 1hJ(OH) is better calculated with functionals that include a reduced fraction of HF exchange. Accurate functionals for 1J(OH) and 2J(HH) were tested in a tetramer water model. The hydrogen-bond effects on these intramolecular couplings are additive when calculated by the SOPPA(CCSD) wave function and DFT methods. Graphical Abstract: Evaluation of the additive effect of the hydrogen bond on spin-spin coupling constants of water using WF and DFT methods.

  7. Apparatuses and Methods for Producing Runtime Architectures of Computer Program Modules

    NASA Technical Reports Server (NTRS)

    Abi-Antoun, Marwan Elia (Inventor); Aldrich, Jonathan Erik (Inventor)

    2013-01-01

    Apparatuses and methods for producing run-time architectures of computer program modules. One embodiment includes creating an abstract graph from the computer program module and from containment information corresponding to the computer program module, wherein the abstract graph has nodes including types and objects, and wherein the abstract graph relates an object to a type, and wherein for a specific object the abstract graph relates the specific object to a type containing the specific object; and creating a runtime graph from the abstract graph, wherein the runtime graph is a representation of the true runtime object graph, wherein the runtime graph represents containment information such that, for a specific object, the runtime graph relates the specific object to another object that contains the specific object.

  8. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  9. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  10. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  11. X-Graphs: Language and Algorithms for Heterogeneous Graph Streams

    DTIC Science & Technology

    2017-09-01

    METHODS, ASSUMPTIONS, AND PROCEDURES: Software Abstractions for Graph Analytic Applications; High-Performance Platforms for Graph Processing... data is stored in a distributed file system... implementations of novel methods for network analysis: several methods for detection of overlapping communities, personalized PageRank, node embeddings into a d

  12. Comprehensive curation and analysis of global interaction networks in Saccharomyces cerevisiae

    PubMed Central

    Reguly, Teresa; Breitkreutz, Ashton; Boucher, Lorrie; Breitkreutz, Bobby-Joe; Hon, Gary C; Myers, Chad L; Parsons, Ainslie; Friesen, Helena; Oughtred, Rose; Tong, Amy; Stark, Chris; Ho, Yuen; Botstein, David; Andrews, Brenda; Boone, Charles; Troyanskya, Olga G; Ideker, Trey; Dolinski, Kara; Batada, Nizar N; Tyers, Mike

    2006-01-01

    Background The study of complex biological networks and prediction of gene function has been enabled by high-throughput (HTP) methods for detection of genetic and protein interactions. Sparse coverage in HTP datasets may, however, distort network properties and confound predictions. Although a vast number of well substantiated interactions are recorded in the scientific literature, these data have not yet been distilled into networks that enable system-level inference. Results We describe here a comprehensive database of genetic and protein interactions, and associated experimental evidence, for the budding yeast Saccharomyces cerevisiae, as manually curated from over 31,793 abstracts and online publications. This literature-curated (LC) dataset contains 33,311 interactions, on the order of all extant HTP datasets combined. Surprisingly, HTP protein-interaction datasets currently achieve only around 14% coverage of the interactions in the literature. The LC network nevertheless shares attributes with HTP networks, including scale-free connectivity and correlations between interactions, abundance, localization, and expression. We find that essential genes or proteins are enriched for interactions with other essential genes or proteins, suggesting that the global network may be functionally unified. This interconnectivity is supported by a substantial overlap of protein and genetic interactions in the LC dataset. We show that the LC dataset considerably improves the predictive power of network-analysis approaches. The full LC dataset is available at the BioGRID and SGD databases. Conclusion Comprehensive datasets of biological interactions derived from the primary literature provide critical benchmarks for HTP methods, augment functional prediction, and reveal system-level attributes of biological networks. PMID:16762047

  13. A systematic review of models to predict recruitment to multicentre clinical trials

    PubMed Central

    2010-01-01

    Background Less than one third of publicly funded trials manage to recruit according to their original plan, often resulting in requests for additional funding and/or time extensions. The aim was to identify models that might be useful to a major public funder of randomised controlled trials when estimating likely time requirements for recruiting trial participants. The requirements of a useful model were identified as: usability, grounding in experience, the ability to reflect time trends, accounting for centre recruitment, and contribution to a commissioning decision. Methods A systematic review of English-language articles using MEDLINE and EMBASE. Search terms included: randomised controlled trial, patient, accrual, predict, enrol, models, statistical; Bayes Theorem; Decision Theory; Monte Carlo Method; and Poisson. Only studies discussing prediction of recruitment to trials using a modelling approach were included. Information was extracted from articles by one author, and checked by a second, using a pre-defined form. Results Of 326 identified abstracts, only 8 met all the inclusion criteria. These 8 studies discussed five major classes of model: the unconditional model, the conditional model, the Poisson model, Bayesian models, and Monte Carlo simulation of Markov models. None of these meets all the pre-identified needs of the funder. Conclusions To meet the needs of a number of research programmes, a new model is urgently required. Any model chosen should be validated against both retrospective and prospective data, to ensure that its predictions are superior to those currently used. PMID:20604946
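
Of the model classes identified, the Poisson model is the simplest to illustrate: treat each centre as recruiting at a constant Poisson rate and simulate the months needed to reach the target. All numbers below are hypothetical, not drawn from the reviewed studies:

```python
import numpy as np

def simulate_recruitment(target, centres, rate_per_centre_per_month, n_sims=2000, seed=1):
    """Monte Carlo draws of the months needed to reach `target` under a Poisson accrual model."""
    rng = np.random.default_rng(seed)
    months = np.zeros(n_sims)
    for s in range(n_sims):
        recruited, t = 0, 0
        while recruited < target:
            # each month, total accrual across all centres is Poisson(centres * rate)
            recruited += rng.poisson(centres * rate_per_centre_per_month)
            t += 1
        months[s] = t
    return months

# Hypothetical trial: 300 participants, 10 centres, 2.5 recruits/centre/month
est = simulate_recruitment(target=300, centres=10, rate_per_centre_per_month=2.5)
```

With an expected 25 recruits per month, the simulated completion time clusters around 12 months; the spread of the draws is what a funder would use to judge timeline risk.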

  14. In-silico prediction of concentration-dependent viscosity curves for monoclonal antibody solutions

    PubMed Central

    Tomar, Dheeraj S.; Li, Li; Broulidakis, Matthew P.; Luksha, Nicholas G.; Burns, Christopher T.; Singh, Satish K.; Kumar, Sandeep

    2017-01-01

    Early-stage developability assessments of monoclonal antibody (mAb) candidates can help reduce the risks and costs associated with their product development. Forecasting the viscosity of highly concentrated mAb solutions is an important aspect of such assessments: reliable predictions of concentration-dependent viscosity behavior for mAb solutions in platform formulations can help screen or optimize drug candidates for flexible manufacturing and drug delivery options. Here, we present a computational method to predict concentration-dependent viscosity curves for mAbs solely from their sequence and structural attributes. The method was developed using experimental data on 16 different mAbs whose concentration-dependent viscosity curves were obtained under standardized conditions. Each curve was fitted with a straight line, via logarithmic manipulation, to obtain an intercept and a slope. The intercept, which relates to antibody diffusivity, was found to be nearly constant. In contrast, the slope, the rate of increase in solution viscosity with solute concentration, varied significantly across mAbs, demonstrating the importance of intermolecular interactions to viscosity. Next, several molecular descriptors of the electrostatic and hydrophobic properties of the 16 mAbs, derived from their full-length homology models, were examined for potential correlations with the slope. An equation consisting of the hydrophobic surface area of the full-length antibody and the charges on the VH, VL, and hinge regions was found capable of predicting the concentration-dependent viscosity curves of the antibody solutions. Availability of this computational tool may facilitate material-free, high-throughput screening of antibody candidates during early stages of drug discovery and development. PMID:28125318
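
The fitting step described above (a straight line through log-transformed viscosity versus concentration) can be sketched as follows, with hypothetical measurements in place of the paper's data:

```python
import numpy as np

# Hypothetical (illustrative) measurements: concentration (mg/mL) vs. viscosity (cP)
conc = np.array([25.0, 50.0, 100.0, 150.0])
visc = np.array([1.5, 2.4, 6.1, 15.8])

# Linearize: ln(viscosity) = intercept + slope * concentration
slope, intercept = np.polyfit(conc, np.log(visc), 1)

def predict_viscosity(c):
    """Predicted viscosity (cP) at concentration c from the fitted log-linear curve."""
    return np.exp(intercept + slope * c)
```

The slope is the quantity the paper then correlates with molecular descriptors (hydrophobic surface area, domain charges); the intercept plays the role of the near-constant diffusivity term.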

  15. From abstract to publication: the fate of research presented at an annual forensic meeting.

    PubMed

    Tambuscio, Silvia; Boghossian, Elie; Sauvageau, Anny

    2010-11-01

    In forensic sciences, the fate of abstracts presented at international meetings has not yet been assessed. The purpose of this study is to estimate the publication rate and evaluate possible predictors of publication after the 58th edition of the 2006 American Academy of Forensic Sciences annual meeting. Section of the meeting, type of presentation (oral platform or poster), number of authors per abstract and per paper, time span to publication, countries involved, and journal of publication were tabulated. A total of 623 abstracts were presented, of which 102 were subsequently published as full papers. The overall publication rate was 16.4%, ranging from 3.4% (jurisprudence) to 28.8% (toxicology). The type of presentation (oral platform or poster) did not significantly affect the outcome of the abstract. However, a higher number of authors, foreign authors, and international collaboration were found to be good predictive factors of publication. © 2010 American Academy of Forensic Sciences.

  16. Real-time, adaptive machine learning for non-stationary, near chaotic gasoline engine combustion time series.

    PubMed

    Vaughan, Adam; Bohac, Stanislav V

    2015-10-01

    Fuel-efficient Homogeneous Charge Compression Ignition (HCCI) engine combustion timing predictions must contend with non-linear chemistry, non-linear physics, period-doubling bifurcation(s), turbulent mixing, model parameters that can drift day-to-day, and air-fuel mixture state information that cannot typically be resolved on a cycle-to-cycle basis, especially during transients. In previous work, an abstract cycle-to-cycle mapping function coupled with ϵ-Support Vector Regression was shown to predict experimentally observed cycle-to-cycle combustion timing over a wide range of engine conditions, despite some of the aforementioned difficulties. The main limitation of the previous approach was that a partially acausal, randomly sampled training dataset was used to train proof-of-concept offline predictions. The objective of this paper is to address this limitation by proposing a new online adaptive Extreme Learning Machine (ELM) extension named Weighted Ring-ELM. This extension enables fully causal combustion timing predictions at randomly chosen engine set points, and is shown to achieve results that are as good as or better than the previous offline method. The broader objective of this approach is to enable a new class of real-time model predictive control strategies for high-variability HCCI and, ultimately, to bring HCCI's low engine-out NOx and reduced CO2 emissions to production engines. Copyright © 2015 Elsevier Ltd. All rights reserved.
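
The basic ELM idea underlying the paper's Weighted Ring-ELM extension is a random hidden layer with a least-squares readout. The sketch below is the generic algorithm on a toy regression problem, not the paper's weighted, ring-buffered variant:

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Basic ELM: fixed random hidden layer, output weights solved by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # random nonlinear feature map
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only the readout is fitted
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: learn y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
pred = elm_predict(X, W, b, beta)
```

Because only the linear readout is solved, refitting is cheap, which is what makes online, cycle-to-cycle adaptation plausible for engine control.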

  17. Functional analysis of rare variants in mismatch repair proteins augments results from computation-based predictive methods

    PubMed Central

    Arora, Sanjeevani; Huwe, Peter J.; Sikder, Rahmat; Shah, Manali; Browne, Amanda J.; Lesh, Randy; Nicolas, Emmanuelle; Deshpande, Sanat; Hall, Michael J.; Dunbrack, Roland L.; Golemis, Erica A.

    2017-01-01

    The cancer-predisposing Lynch Syndrome (LS) arises from germline mutations in DNA mismatch repair (MMR) genes, predominantly MLH1, MSH2, MSH6, and PMS2. A major challenge for clinical diagnosis of LS is the frequent identification of variants of uncertain significance (VUS) in these genes, as it is often difficult to determine variant pathogenicity, particularly for missense variants. Generic programs such as SIFT and PolyPhen-2, and MMR gene-specific programs such as PON-MMR and MAPP-MMR, are often used to predict deleterious or neutral effects of VUS in MMR genes. We evaluated the performance of multiple predictive programs in the context of functional biologic data for 15 VUS in MLH1, MSH2, and PMS2. Using cell line models, we characterized VUS predicted to range from neutral to pathogenic on mRNA and protein expression, basal cellular viability, viability following treatment with a panel of DNA-damaging agents, and functionality in DNA damage response (DDR) signaling, benchmarking to wild-type MMR proteins. Our results suggest that the MMR gene-specific classifiers do not always align with the experimental phenotypes related to DDR. Our study highlights the importance of complementary experimental and computational assessment to develop future predictors for the assessment of VUS. PMID:28494185

  18. Abstracts of Research Papers 1970.

    ERIC Educational Resources Information Center

    Drowatzky, John N., Ed.

    This publication includes the abstracts of 199 research papers presented at the 1970 American Association for Health, Physical Education, and Recreation convention in Seattle, Washington. Abstracts from symposia on environmental quality education, obesity, motor development, research methods, and laboratory equipment are also included. Each…

  19. Validity of VO(2 max) in Predicting Blood Volume: Implications for the Effect of Fitness on Aging

    DTIC Science & Technology

    2000-09-01

    not unexpected, since an expansion of BV typically accompanies an increase in V̇O2 max with exercise training (9). However, other investigations...

  20. USSR and Eastern Europe Scientific Abstracts, Physics and Mathematics, Number 39

    DTIC Science & Technology

    1978-01-17

    examination of a monoclinic single crystal has revealed a UO2(2+) ion, and helical polyphosphate chains with six PO4 tetrahedra per link. Corrugated uranyl...mean mass temperature and local Nusselt number. Figures 5; references 13: 3 Russian, 10 Western. USSR UDC 535.334 DETERMINATION OF THE PARAMETERS...Nuclear Research [Abstract] The theory of pion condensation predicts the existence of superdense nuclei, on the basis of the structure of the

  1. Generic robot architecture

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2010-09-21

    The present invention provides methods, computer readable media, and apparatuses for a generic robot architecture providing a framework that is easily portable to a variety of robot platforms and is configured to provide hardware abstractions, abstractions for generic robot attributes, environment abstractions, and robot behaviors. The generic robot architecture includes a hardware abstraction level and a robot abstraction level. The hardware abstraction level is configured for developing hardware abstractions that define, monitor, and control hardware modules available on a robot platform. The robot abstraction level is configured for defining robot attributes and provides a software framework for building robot behaviors from the robot attributes. Each of the robot attributes includes hardware information from at least one hardware abstraction. In addition, each robot attribute is configured to substantially isolate the robot behaviors from the at least one hardware abstraction.
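
The two abstraction levels described in the patent can be sketched as a toy layered design: hardware abstractions wrap concrete modules, robot attributes are built from hardware abstractions, and behaviors depend only on attributes. All class and method names here are hypothetical illustrations, not the patent's actual interfaces:

```python
from abc import ABC, abstractmethod

class HardwareAbstraction(ABC):
    """Hardware abstraction level: defines, monitors, and controls one hardware module."""
    @abstractmethod
    def read(self) -> float: ...

class RangeSensor(HardwareAbstraction):
    """A concrete hardware module behind the abstraction (stubbed with a fixed reading)."""
    def __init__(self, raw_value: float):
        self.raw_value = raw_value
    def read(self) -> float:
        return self.raw_value

class ObstacleAttribute:
    """Robot abstraction level: an attribute built from hardware information,
    isolating behaviors from the underlying hardware abstraction."""
    def __init__(self, sensor: HardwareAbstraction, threshold: float = 1.0):
        self.sensor, self.threshold = sensor, threshold
    def blocked(self) -> bool:
        return self.sensor.read() < self.threshold

def avoid_behavior(attr: ObstacleAttribute) -> str:
    # the behavior consults only the attribute, never a concrete sensor type
    return "turn" if attr.blocked() else "forward"
```

Porting to a new robot platform then means supplying new `HardwareAbstraction` subclasses; attributes and behaviors are unchanged, which is the portability claim of the architecture.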

  2. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    ERIC Educational Resources Information Center

    Melton, Jessica S.

    Objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject-indexing rationale, had two components: (1) a stored dictionary of words…

  3. JDFTx: Software for joint density-functional theory

    DOE PAGES

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...

    2017-11-14

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  4. JDFTx: Software for joint density-functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  5. Developing a Risk Model to Target High-risk Preventive Interventions for Sexual Assault Victimization among Female U.S. Army Soldiers

    PubMed Central

    Street, Amy E.; Rosellini, Anthony J.; Ursano, Robert J.; Heeringa, Steven G.; Hill, Eric D.; Monahan, John; Naifeh, James A.; Petukhova, Maria V.; Reis, Ben Y.; Sampson, Nancy A.; Bliese, Paul D.; Stein, Murray B.; Zaslavsky, Alan M.; Kessler, Ronald C.

    2016-01-01

    Sexual violence victimization is a significant problem among female U.S. military personnel. Preventive interventions for high-risk individuals might reduce prevalence, but would require accurate targeting. We attempted to develop a targeting model for female Regular U.S. Army soldiers based on theoretically guided predictors abstracted from administrative data records. As administrative reports of sexual assault victimization are known to be incomplete, parallel machine learning models were developed to predict administratively recorded (in the population) and self-reported (in a representative survey) victimization. Capture-recapture methods were used to combine predictions across models. Key predictors included low status, crime involvement, and treated mental disorders. Area under the Receiver Operating Characteristic curve was 0.83-0.88. Between 33.7% and 63.2% of victimizations occurred among soldiers in the highest-risk ventile (5%). This high concentration of risk suggests that the models could be useful in targeting preventive interventions, although a final determination would require careful weighing of intervention costs, effectiveness, and competing risks. PMID:28154788
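
The capture-recapture step can be illustrated with Chapman's two-list estimator, which infers the total number of events from two incomplete lists (here, administratively recorded and self-reported cases) and their overlap. The counts below are hypothetical:

```python
def chapman_estimate(n1, n2, m):
    """Chapman's capture-recapture estimator of the total event count
    given two incomplete lists of sizes n1 and n2 with overlap m."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: administratively recorded (n1), self-reported (n2), both (m)
total = chapman_estimate(n1=120, n2=200, m=60)
```

The estimate exceeds both list sizes, reflecting the events missed by each source; the same logic lets the combined models correct for under-reporting in either data stream.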

  6. The impact of machine learning techniques in the study of bipolar disorder: A systematic review.

    PubMed

    Librenza-Garcia, Diego; Kotzian, Bruno Jaskulski; Yang, Jessica; Mwangi, Benson; Cao, Bo; Pereira Lima, Luiza Nunes; Bermudez, Mariane Bagatin; Boeira, Manuela Vianna; Kapczinski, Flávio; Passos, Ives Cavalcante

    2017-09-01

    Machine learning techniques provide new methods to predict diagnosis and clinical outcomes at an individual level. We aim to review the existing literature on the use of machine learning techniques in the assessment of subjects with bipolar disorder. We systematically searched PubMed, Embase and Web of Science for articles published in any language up to January 2017. We found 757 abstracts and included 51 studies in our review. Most of the included studies used multiple levels of biological data to distinguish the diagnosis of bipolar disorder from other psychiatric disorders or healthy controls. We also found studies that assessed the prediction of clinical outcomes and studies using unsupervised machine learning to build more consistent clinical phenotypes of bipolar disorder. We concluded that given the clinical heterogeneity of samples of patients with BD, machine learning techniques may provide clinicians and researchers with important insights in fields such as diagnosis, personalized treatment and prognosis orientation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Genome Sequences of Three Cluster AU Arthrobacter Phages, Caterpillar, Nightmare, and Teacup

    PubMed Central

    Adair, Tamarah L.; Stowe, Emily; Pizzorno, Marie C.; Krukonis, Gregory; Harrison, Melinda; Garlena, Rebecca A.; Russell, Daniel A.; Jacobs-Sera, Deborah

    2017-01-01

    Caterpillar, Nightmare, and Teacup are cluster AU siphoviral phages isolated from enriched soil on Arthrobacter sp. strain ATCC 21022. These genomes are 58 kbp long with an average G+C content of 50%. Sequence analysis predicts 86 to 92 protein-coding genes, including a large number of small proteins with predicted transmembrane domains. PMID:29122860

  8. Geophysical abstracts 167, October-December 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  9. Geophysical abstracts 164, January-March 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. A new table of contents, alphabetically arranged, has been adapted to show more clearly the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  10. Geophysical abstracts 166, July-September 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  11. Geophysical abstracts 165, April-June 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered. Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  12. Quantifying Grain Level Stress-Strain Behavior for AM40 via Instrumented Microindentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Guang; Barker, Erin I.; Stephens, Elizabeth V.

    2016-01-01

    Microindentation is performed on hot isostatic pressed (HIP) Mg-Al (AM40) alloy samples produced by the high-pressure die cast (HPDC) process for the purpose of quantifying the mechanical properties of the α-Mg grains. The process of obtaining elastic modulus and hardness from indentation load-depth curves is well established in the literature. A new inverse method is developed to extract plastic properties in this study. The method utilizes the empirical yield strength-hardness relationship reported in the literature together with finite element modeling of the individual indentation. Due to the shallow depth of the indentation, the indentation size effect (ISE) is taken into account when determining plastic properties. The stress versus strain behavior is determined for a series of indents. The resulting average values and standard deviations are obtained for future use as input distributions for microstructure-based property prediction of AM40.

  13. Pyrcca: Regularized Kernel Canonical Correlation Analysis in Python and Its Applications to Neuroimaging.

    PubMed

    Bilenko, Natalia Y; Gallant, Jack L

    2016-01-01

    In this article we introduce Pyrcca, an open-source Python package for performing canonical correlation analysis (CCA). CCA is a multivariate analysis method for identifying relationships between sets of variables. Pyrcca supports CCA with or without regularization, and with or without linear, polynomial, or Gaussian kernelization. We first use an abstract example to describe Pyrcca functionality. We then demonstrate how Pyrcca can be used to analyze neuroimaging data. Specifically, we use Pyrcca to implement cross-subject comparison in a natural movie functional magnetic resonance imaging (fMRI) experiment by finding a data-driven set of functional response patterns that are similar across individuals. We validate this cross-subject comparison method in Pyrcca by predicting responses to novel natural movies across subjects. Finally, we show how Pyrcca can reveal retinotopic organization in brain responses to natural movies without the need for an explicit model.

  14. Pyrcca: Regularized Kernel Canonical Correlation Analysis in Python and Its Applications to Neuroimaging

    PubMed Central

    Bilenko, Natalia Y.; Gallant, Jack L.

    2016-01-01

    In this article we introduce Pyrcca, an open-source Python package for performing canonical correlation analysis (CCA). CCA is a multivariate analysis method for identifying relationships between sets of variables. Pyrcca supports CCA with or without regularization, and with or without linear, polynomial, or Gaussian kernelization. We first use an abstract example to describe Pyrcca functionality. We then demonstrate how Pyrcca can be used to analyze neuroimaging data. Specifically, we use Pyrcca to implement cross-subject comparison in a natural movie functional magnetic resonance imaging (fMRI) experiment by finding a data-driven set of functional response patterns that are similar across individuals. We validate this cross-subject comparison method in Pyrcca by predicting responses to novel natural movies across subjects. Finally, we show how Pyrcca can reveal retinotopic organization in brain responses to natural movies without the need for an explicit model. PMID:27920675
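
Canonical correlation itself (without Pyrcca's regularization or kernelization) can be sketched with a standard QR-plus-SVD formulation. This is a generic illustration, not Pyrcca's API, and the synthetic data with a shared latent signal is hypothetical:

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between the column sets of X and Y (QR + SVD formulation)."""
    Xc = X - X.mean(axis=0)                       # center each variable
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)                      # orthonormal bases of the column spaces
    Qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)  # singular values = canonical correlations
    return np.clip(s, 0.0, 1.0)

# Synthetic datasets sharing one latent signal z (the "cross-subject" analogue)
rng = np.random.default_rng(0)
z = rng.normal(size=(200, 1))
X = np.hstack([z + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 1))])
Y = np.hstack([z + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 1))])
corrs = canonical_correlations(X, Y)
```

The first canonical correlation is large because one dimension of each set tracks the shared signal z, mirroring how CCA recovers response patterns common across subjects.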

  15. Exploring the potential of high resolution mass spectrometry for the investigation of lignin-derived phenol substitutes in phenolic resin syntheses.

    PubMed

    Dier, Tobias K F; Fleckenstein, Marco; Militz, Holger; Volmer, Dietrich A

    2017-05-01

    Chemical degradation is an efficient method to obtain bio-oils and other compounds from lignin. Lignin bio-oils are potential substitutes for the phenol component of phenol formaldehyde (PF) resins. Here, we developed an analytical method based on high resolution mass spectrometry that provided structural information for the synthesized lignin-derived resins and supported the prediction of their properties. Different model resins based on typical lignin degradation products were analyzed by electrospray ionization in negative ionization mode. Utilizing enhanced mass defect filter techniques provided detailed structural information of the lignin-based model resins and readily complemented the analytical data from differential scanning calorimetry and thermogravimetric analysis. Relative reactivity and chemical diversity of the phenol substitutes were significant determinants of the outcome of the PF resin synthesis and thus controlled the areas of application of the resulting polymers.
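
    Mass defect filtering of the kind mentioned above is often implemented with Kendrick mass scaling; a minimal sketch under one common convention (CH2-based scaling, with the defect taken as nominal minus Kendrick mass) is:

```python
def kendrick_mass_defect(mz, base_exact=14.01565, base_nominal=14):
    # Rescale so that one CH2 unit counts as exactly 14 mass units,
    # then take the defect relative to the nearest integer mass.
    km = mz * base_nominal / base_exact
    return round(km) - km
```

    Members of a homologous series differing only by CH2 units share the same Kendrick mass defect, which is what makes the filter useful for spotting oligomer series in resin spectra.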

  16. Predictive value of the present-on-admission indicator for hospital-acquired venous thromboembolism.

    PubMed

    Khanna, Raman R; Kim, Sharon B; Jenkins, Ian; El-Kareh, Robert; Afsarmanesh, Nasim; Amin, Alpesh; Sand, Heather; Auerbach, Andrew; Chia, Catherine Y; Maynard, Gregory; Romano, Patrick S; White, Richard H

    2015-04-01

    Hospital-acquired venous thromboembolic (HA-VTE) events are an important, preventable cause of morbidity and death, but accurately identifying HA-VTE events requires labor-intensive chart review. Administrative diagnosis codes and their associated "present-on-admission" (POA) indicator might allow automated identification of HA-VTE events, but only if VTE codes are accurately flagged "not present-on-admission" (POA=N). New codes were introduced in 2009 to improve accuracy. We identified all medical patients with at least 1 VTE "other" discharge diagnosis code from 5 academic medical centers over a 24-month period. We then sampled, within each center, patients with VTE codes flagged POA=N or POA=U (insufficient documentation) and POA=Y or POA=W (timing clinically uncertain) and abstracted each chart to clarify VTE timing. All events that were not clearly POA were classified as HA-VTE. We then calculated predictive values of the POA=N/U flags for HA-VTE and the POA=Y/W flags for non-HA-VTE. Among 2070 cases with at least 1 "other" VTE code, we found 339 codes flagged POA=N/U and 1941 flagged POA=Y/W. Among 275 POA=N/U abstracted codes, 75.6% (95% CI, 70.1%-80.6%) were HA-VTE; among 291 POA=Y/W abstracted events, 73.5% (95% CI, 68.0%-78.5%) were non-HA-VTE. Extrapolating from this sample, we estimated that 59% of actual HA-VTE codes were incorrectly flagged POA=Y/W. POA indicator predictive values did not improve after new codes were introduced in 2009. The predictive value of VTE events flagged POA=N/U for HA-VTE was 75%. However, sole reliance on this flag may substantially underestimate the incidence of HA-VTE.
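
    The predictive values reported above are proportions of abstracted codes confirmed on chart review; a minimal sketch with a normal-approximation (Wald) 95% confidence interval, using illustrative counts rather than the study's exact numbers, is:

```python
import math

def predictive_value_ci(events, n, z=1.96):
    # Proportion of flagged codes confirmed on chart review,
    # with a normal-approximation (Wald) 95% confidence interval.
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p, (p - z * se, p + z * se)
```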

  17. Text Mining for Protein Docking

    PubMed Central

    Badal, Varsha D.; Kundrotas, Petras J.; Vakser, Ilya A.

    2015-01-01

    The rapidly growing amount of publicly available information from biomedical research is readily accessible on the Internet, providing a powerful resource for predictive biomolecular modeling. The accumulated data on experimentally determined structures transformed structure prediction of proteins and protein complexes. Instead of exploring the enormous search space, predictive tools can simply proceed to the solution based on similarity to existing, previously determined structures. A similar major paradigm shift is emerging due to the rapidly expanding amount of information, other than experimentally determined structures, that can still be used as constraints in biomolecular structure prediction. Automated text mining has been widely used in recreating protein interaction networks, as well as in detecting small-ligand binding sites on protein structures. Combining and expanding these two well-developed areas of research, we applied text mining to structural modeling of protein-protein complexes (protein docking). Protein docking can be significantly improved when constraints on the docking mode are available. We developed a procedure that retrieves published abstracts on a specific protein-protein interaction and extracts information relevant to docking. The procedure was assessed on protein complexes from Dockground (http://dockground.compbio.ku.edu). The results show that correct information on binding residues can be extracted for about half of the complexes. The amount of irrelevant information was reduced by conceptual analysis of a subset of the retrieved abstracts, based on the bag-of-words (features) approach. Support Vector Machine models were trained and validated on the subset. The remaining abstracts were filtered by the best-performing models, which decreased the irrelevant information for ~25% of the complexes in the dataset. The extracted constraints were incorporated in the docking protocol and tested on the Dockground unbound benchmark set, significantly increasing the docking success rate. PMID:26650466
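
    The bag-of-words step described above can be sketched as binary presence features over a vocabulary; the whitespace tokenizer and function names here are hypothetical simplifications of the authors' pipeline:

```python
def bag_of_words(abstracts, vocab=None):
    # Binary presence/absence features over a word vocabulary,
    # as would feed the SVM relevance filters.
    tokenized = [text.lower().split() for text in abstracts]
    if vocab is None:
        vocab = sorted({w for tokens in tokenized for w in tokens})
    index = {w: i for i, w in enumerate(vocab)}
    features = []
    for tokens in tokenized:
        row = [0] * len(vocab)
        for w in tokens:
            if w in index:
                row[index[w]] = 1
        features.append(row)
    return features, vocab
```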

  18. Abstracts of Presentations at Workshop on Unsteady and Two-Phase-Flows, Held in London, England on June 28-29, 1990

    DTIC Science & Technology

    1990-06-29

    has been found to be a modification of the STAN program from Crawford and Kays. An important characteristic of any boundary layer prediction program...function of freestream turbulence intensity, helped in predicting heat transfer rates between the hot gases and the blade surface...be a modulator of transition to turbulence, and the boundary layer prediction programs currently available have a poor performance in such flows

  19. A theoretical study of the H-abstraction reactions from HOI by moist air radiolytic products (H, OH, and O (3P)) and iodine atoms (2P(3/2)).

    PubMed

    Hammaecher, Catherine; Canneaux, Sébastien; Louis, Florent; Cantrel, Laurent

    2011-06-23

    The rate constants of the reactions of HOI molecules with H, OH, O ((3)P), and I ((2)P(3/2)) atoms have been estimated over the temperature range 300-2500 K using four different levels of theory. Geometry optimizations and vibrational frequency calculations are performed using MP2 methods combined with two basis sets (cc-pVTZ and 6-311G(d,p)). Single-point energy calculations are performed with the highly correlated ab initio coupled cluster method in the space of single, double, and triple (perturbatively) electron excitations CCSD(T) using the cc-pVTZ, cc-pVQZ, 6-311+G(3df,2p), and 6-311++G(3df,3pd) basis sets. Reaction enthalpies at 0 K were calculated at the CCSD(T)/cc-pVnZ//MP2/cc-pVTZ (n = T and Q), CCSD(T)/6-311+G(3df,2p)//MP2/6-311G(d,p), and CCSD(T)/6-311++G(3df,3pd)//MP2/6-311G(d,p) levels of theory and compared to the experimental values taken from the literature. Canonical transition-state theory with an Eckart tunneling correction is used to predict the rate constants as a function of temperature. The computational procedure has been used to predict rate constants for H-abstraction elementary reactions because there are actually no literature data to which the calculated rate constants can be directly compared. The final objective is to implement kinetics of gaseous reactions in the ASTEC (accident source term evaluation code) program to improve speciation of fission products, which can be transported along the reactor coolant system (RCS) of a pressurized water reactor (PWR) in the case of a severe accident.
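
    Canonical transition-state theory of the kind used above can be sketched with the Eyring expression k(T) = κ (kB·T/h) exp(−ΔG‡/RT); the minimal example below omits the Eckart tunneling correction (κ is left as a plain prefactor) and is illustrative only:

```python
import math

KB = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
R = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(delta_g_dagger, temperature, kappa=1.0):
    # Conventional TST rate constant (s^-1 for a unimolecular step);
    # kappa would carry a tunneling correction such as Eckart's.
    return kappa * KB * temperature / H * math.exp(
        -delta_g_dagger / (R * temperature))
```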

  20. Metabolic Pathway Assignment of Plant Genes based on Phylogenetic Profiling–A Feasibility Study

    PubMed Central

    Weißenborn, Sandra; Walther, Dirk

    2017-01-01

    Despite many developed experimental and computational approaches, functional gene annotation remains challenging. With the rapidly growing number of sequenced genomes, the concept of phylogenetic profiling, which predicts functional links between genes that share a common co-occurrence pattern across different genomes, has gained renewed attention as it promises to annotate gene functions based on presence/absence calls alone. We applied phylogenetic profiling to the problem of metabolic pathway assignments of plant genes with a particular focus on secondary metabolism pathways. We determined phylogenetic profiles for 40,960 metabolic pathway enzyme genes with assigned EC numbers from 24 plant species based on sequence and pathway annotation data from KEGG and Ensembl Plants. For gene sequence family assignments, needed to determine the presence or absence of particular gene functions in the given plant species, we included data of all 39 species available at the Ensembl Plants database and established gene families based on pairwise sequence identities and annotation information. Aside from performing profiling comparisons, we used machine learning approaches to predict pathway associations from phylogenetic profiles alone. Selected metabolic pathways were indeed found to be composed of gene families of greater than expected phylogenetic profile similarity. This was particularly evident for primary metabolism pathways, whereas for secondary pathways, both the available annotation in different species as well as the abstraction of functional association via distinct pathways proved limiting. While phylogenetic profile similarity was generally not found to correlate with gene co-expression, direct physical interactions of proteins were reflected by a significantly increased profile similarity suggesting an application of phylogenetic profiling methods as a filtering step in the identification of protein-protein interactions. 
    This feasibility study highlights the potential of, and the challenges associated with, phylogenetic profiling methods for detecting functional relationships between genes, as well as the need to enlarge the set of plant genes with proven secondary metabolism involvement and the limitations of distinct pathways as abstractions of relationships between genes. PMID:29163570
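
    The profile-similarity idea underlying phylogenetic profiling can be sketched as a Jaccard similarity over presence/absence vectors; this is an illustrative measure, not necessarily the similarity used in the study:

```python
def profile_similarity(p, q):
    # Jaccard similarity of two presence/absence profiles
    # (1 = gene family present in that genome, 0 = absent).
    both = sum(1 for a, b in zip(p, q) if a and b)
    either = sum(1 for a, b in zip(p, q) if a or b)
    return both / either if either else 0.0
```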

  1. Localization Versus Abstraction: A Comparison of Two Search Reduction Techniques

    NASA Technical Reports Server (NTRS)

    Lansky, Amy L.

    1992-01-01

    There has been much recent work on the use of abstraction to improve planning behavior and cost. Another technique for dealing with the inherently explosive cost of planning is localization. This paper compares the relative strengths of localization and abstraction in reducing planning search cost. In particular, localization is shown to subsume abstraction. Localization techniques can model the various methods of abstraction that have been used, but also provide a much more flexible framework, with a broader range of benefits.

  2. Air Pollution Translations: A Bibliography with Abstracts - Volume 4.

    ERIC Educational Resources Information Center

    Environmental Protection Agency, Research Triangle Park, NC. Air Pollution Technical Information Center.

    This volume is the fourth in a series of compilations presenting abstracts and indexes of translations of technical air pollution literature. The entries are grouped into 12 subject categories: Emission Sources, Control Methods, Measurement Methods, Air Quality Measurements, Atmospheric Interaction, Basic Science and Technology, Effects--Human…

  3. Modeling observations of solar coronal mass ejections with heliospheric imagers verified with the Heliophysics System Observatory

    PubMed Central

    Isavnin, A.; Boakes, P. D.; Kilpua, E. K. J.; Davies, J. A.; Harrison, R. A.; Barnes, D.; Krupar, V.; Eastwood, J. P.; Good, S. W.; Forsyth, R. J.; Bothmer, V.; Reiss, M. A.; Amerstorfer, T.; Winslow, R. M.; Anderson, B. J.; Philpott, L. C.; Rodriguez, L.; Rouillard, A. P.; Gallagher, P.; Nieves‐Chinchilla, T.; Zhang, T. L.

    2017-01-01

    Abstract We present an advance toward accurately predicting the arrivals of coronal mass ejections (CMEs) at the terrestrial planets, including Earth. For the first time, we are able to assess a CME prediction model using data over two thirds of a solar cycle of observations with the Heliophysics System Observatory. We validate modeling results of 1337 CMEs observed with the Solar Terrestrial Relations Observatory (STEREO) heliospheric imagers (HI) (science data) from 8 years of observations by five in situ observing spacecraft. We use the self‐similar expansion model for CME fronts assuming 60° longitudinal width, constant speed, and constant propagation direction. With these assumptions we find that 23%–35% of all CMEs that were predicted to hit a certain spacecraft lead to clear in situ signatures, so that for one correct prediction, two to three false alarms would have been issued. In addition, we find that the prediction accuracy does not degrade with the HI longitudinal separation from Earth. Predicted arrival times are on average within 2.6 ± 16.6 h of the in situ arrival times, comparable to analytical and numerical modeling, with a true skill statistic of 0.21. We also discuss various factors that may improve the accuracy of space weather forecasting using wide‐angle heliospheric imager observations. These results form a first‐order approximated baseline of the prediction accuracy that is possible with HI and other methods used for data by an operational space weather mission at the Sun‐Earth L5 point. PMID:28983209
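
    The true skill statistic quoted above is computed from a 2x2 contingency table of predicted versus observed arrivals; a minimal sketch with illustrative counts (not the study's numbers) is:

```python
def true_skill_statistic(hits, misses, false_alarms, correct_negatives):
    # TSS = probability of detection minus probability of false detection
    pod = hits / (hits + misses)
    pofd = false_alarms / (false_alarms + correct_negatives)
    return pod - pofd
```

    A TSS of 1 is a perfect forecast, 0 is no better than chance, and negative values indicate systematically wrong predictions.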

  4. Language use and stereotyping: the role of approach and avoidance motivation goals.

    PubMed

    Gil de Montes, Lorena; Ortiz, Garbiñe; Valencia, José F; Larrañaga, Maider; Agirrezabal, Arrate

    2012-11-01

    The use of more abstract language to describe expected behaviors as opposed to unexpected behaviors has traditionally been considered a way of stereotype maintenance. This tendency is known as linguistic expectancy bias. Two experiments examined the influence of approach and avoidance motivational orientations on the production of this linguistic expectancy bias. It was predicted that approach strategic orientation is likely to describe expectancy consistent behaviors at a higher level of linguistic abstraction than expectancy inconsistent behaviors. In contrast, avoidance strategic orientation is likely to describe both expectancy consistent behaviors and expectancy inconsistent behaviors at a lower level of linguistic abstraction, thus facilitating the disappearance of linguistic expectancy bias. Two experiments confirmed these expectations, using strategic orientation manipulations based either on communication goals or on motor action, and measuring linguistic abstraction either on forced-choice answer format or on free descriptions. Implications for the generalisation of linguistic expectancy bias are discussed.

  5. Identifying Best Practices for and Utilities of the Pharmacy Curriculum Outcome Assessment Examination.

    PubMed

    Mok, Timothy Y; Romanelli, Frank

    2016-12-25

    Objective. A review was conducted to determine implementation strategies, utilities, score interpretation, and limitations of the Pharmacy Curriculum Outcome Assessment (PCOA) examination. Methods. Articles were identified through the PubMed, American Journal of Pharmaceutical Education, and International Pharmaceutical Abstracts databases using the following terms: "Pharmacy Curriculum Outcomes Assessment," "pharmacy comprehensive examination," and "curricular assessment." Studies containing information regarding implementation, utility, and predictive values for US student pharmacists, curricula, and/or PGY1/PGY2 residents were included. Publications from the journal Academic Medicine, the Accreditation Council for Pharmacy Education (ACPE), and the American Association of Colleges of Pharmacy (AACP) were included for background information and comparison of predictive utilities of comprehensive examinations in medicine. Results. Ten PCOA and nine residency-related publications were identified. Based on published information, the PCOA may be best used as an additional tool to identify knowledge gaps for third-year student pharmacists. Conclusion. Administering the PCOA to students after they have completed their didactic coursework may yield scores that reflect student knowledge. Predictive utility regarding the North American Pharmacist Licensure Examination (NAPLEX) and potential applications is limited, and more research is required to determine ways to use the PCOA.

  6. A nucleobase-centered coarse-grained representation for structure prediction of RNA motifs

    PubMed Central

    Poblete, Simón; Bottaro, Sandro; Bussi, Giovanni

    2018-01-01

    Abstract We introduce the SPlit-and-conQueR (SPQR) model, a coarse-grained (CG) representation of RNA designed for structure prediction and refinement. In our approach, the representation of a nucleotide consists of a point particle for the phosphate group and an anisotropic particle for the nucleoside. The interactions are, in principle, knowledge-based potentials inspired by the $\mathcal{E}$SCORE function, a base-centered scoring function. However, a special treatment is given to base-pairing interactions and certain geometrical conformations which are lost in a raw knowledge-based model. This results in a representation able to describe planar canonical and non-canonical base pairs and base–phosphate interactions and to distinguish sugar puckers and glycosidic torsion conformations. The model is applied to the folding of several structures, including duplexes with internal loops of non-canonical base pairs, tetraloops, junctions and a pseudoknot. For the majority of these systems, experimental structures are correctly predicted at the level of individual contacts. We also propose a method for efficiently reintroducing atomistic detail from the CG representation. PMID:29272539

  7. Synthesis and Bioassay of Improved Mosquito Repellents Predicted From Chemical Structure

    DTIC Science & Technology

    2008-05-27

    blood meal to develop their eggs. Repellents play a vital role in interrupting this mosquito/human interaction by serving as a means of personal ...

  8. Comprehension of concrete and abstract words in semantic dementia

    PubMed Central

    Jefferies, Elizabeth; Patterson, Karalyn; Jones, Roy W.; Lambon Ralph, Matthew A.

    2009-01-01

    The vast majority of brain-injured patients with semantic impairment have better comprehension of concrete than abstract words. In contrast, several patients with semantic dementia (SD), who show circumscribed atrophy of the anterior temporal lobes bilaterally, have been reported to show reverse imageability effects, i.e., relative preservation of abstract knowledge. Although these reports largely concern individual patients, some researchers have recently proposed that superior comprehension of abstract concepts is a characteristic feature of SD. This would imply that the anterior temporal lobes are particularly crucial for processing sensory aspects of semantic knowledge, which are associated with concrete not abstract concepts. However, functional neuroimaging studies of healthy participants do not unequivocally predict reverse imageability effects in SD because the temporal poles sometimes show greater activation for more abstract concepts. We examined a case-series of eleven SD patients on a synonym judgement test that orthogonally varied the frequency and imageability of the items. All patients had higher success rates for more imageable as well as more frequent words, suggesting that (a) the anterior temporal lobes underpin semantic knowledge for both concrete and abstract concepts, (b) more imageable items – perhaps due to their richer multimodal representations – are typically more robust in the face of global semantic degradation and (c) reverse imageability effects are not a characteristic feature of SD. PMID:19586212

  9. ChimeRScope: a novel alignment-free algorithm for fusion transcript prediction using paired-end RNA-Seq data

    PubMed Central

    Li, You; Heavican, Tayla B.; Vellichirammal, Neetha N.; Iqbal, Javeed

    2017-01-01

    Abstract The RNA-Seq technology has revolutionized transcriptome characterization not only by accurately quantifying gene expression, but also by the identification of novel transcripts like chimeric fusion transcripts. The ‘fusion’ or ‘chimeric’ transcripts have improved the diagnosis and prognosis of several tumors, and have led to the development of novel therapeutic regimens. Fusion transcript detection is currently accomplished by several software packages, primarily relying on sequence alignment algorithms. The alignment of sequencing reads from fusion transcript loci in cancer genomes can be highly challenging due to the incorrect mapping induced by genomic alterations, thereby limiting the performance of alignment-based fusion transcript detection methods. Here, we developed a novel alignment-free method, ChimeRScope, that accurately predicts fusion transcripts based on the gene fingerprint (as k-mers) profiles of the RNA-Seq paired-end reads. Results on published datasets and in-house cancer cell line datasets followed by experimental validations demonstrate that ChimeRScope consistently outperforms other popular methods irrespective of the read lengths and sequencing depth. More importantly, results on our in-house datasets show that ChimeRScope is a better tool capable of identifying novel fusion transcripts with potential oncogenic functions. ChimeRScope is accessible as a standalone software at (https://github.com/ChimeRScope/ChimeRScope/wiki) or via the Galaxy web-interface at (https://galaxy.unmc.edu/). PMID:28472320
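
    The k-mer fingerprint idea underlying ChimeRScope can be sketched as follows; this is an illustrative simplification, not the tool's actual implementation (function names are hypothetical):

```python
def kmer_fingerprint(read, k=7):
    # Set of all k-mers occurring in a sequencing read
    return {read[i:i + k] for i in range(len(read) - k + 1)}

def shared_kmers(read, gene_kmers, k=7):
    # Number of k-mers a read shares with a gene's k-mer fingerprint;
    # reads of a fusion transcript share k-mers with two different genes.
    return len(kmer_fingerprint(read, k) & gene_kmers)
```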

  10. Development and Psychometric Evaluation of the HPV Clinical Trial Survey for Parents (CTSP‐HPV) Using Traditional Survey Development Methods and Community Engagement Principles

    PubMed Central

    Wallston, Kenneth A.; Wilkins, Consuelo H.; Hull, Pamela C.; Miller, Stephania T.

    2015-01-01

    Abstract Objective This study describes the development and psychometric evaluation of the HPV Clinical Trial Survey for Parents with Children Aged 9 to 15 (CTSP‐HPV) using traditional instrument development methods and community engagement principles. Methods An expert panel and parental input informed survey content, and parents recommended study design changes (e.g., flyer wording). A convenience sample of 256 parents completed the final survey measuring parental willingness to consent to HPV clinical trial (CT) participation and other factors hypothesized to influence willingness (e.g., HPV vaccine benefits). Cronbach's α, Spearman correlations, and multiple linear regression were used to estimate internal consistency, convergent and discriminant validity, and predictive validity, respectively. Results Internal reliability was confirmed for all scales (α ≥ 0.70). Parental willingness was positively associated (p < 0.05) with trust in medical researchers, adolescent CT knowledge, HPV vaccine benefits, and advantages of adolescent CTs (r range 0.33–0.42), supporting convergent validity. Moderate discriminant construct validity was also demonstrated. Regression results indicate reasonable predictive validity, with the six scales accounting for 31% of the variance in parents' willingness. Conclusions This instrument can inform interventions based on factors that influence parental willingness, which may lead to the eventual increase in trial participation. Further psychometric testing is warranted. PMID:26530324
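
    The internal-consistency check reported above uses Cronbach's α = k/(k−1)·(1 − Σ item variances / variance of totals); a minimal sketch using population variances is:

```python
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    # items[i] is one item's scores across all respondents, in the same order
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))
```

    Perfectly parallel items give α = 1; the α ≥ 0.70 threshold above is a common rule of thumb for acceptable reliability.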

  11. Predict drug permeability to blood–brain-barrier from clinical phenotypes: drug side effects and drug indications

    PubMed Central

    Gao, Zhen; Chen, Yang; Cai, Xiaoshu; Xu, Rong

    2017-01-01

    Abstract Motivation: The Blood–Brain-Barrier (BBB) is a rigorous permeability barrier for maintaining homeostasis of the Central Nervous System (CNS). Determination of a compound's permeability to the BBB is a prerequisite in CNS drug discovery. Existing computational methods usually predict drug BBB permeability from chemical structure, and they generally apply to small compounds passing the BBB through passive diffusion. As abundant information on drug side effects and indications has been recorded over time through extensive clinical usage, we aim to explore BBB permeability prediction from a new angle and introduce a novel approach to predict BBB permeability from drug clinical phenotypes (drug side effects and drug indications). This method can apply to both small compounds and macro-molecules penetrating the BBB through various mechanisms besides passive diffusion. Results: We composed a training dataset of 213 drugs with known brain and blood steady-state concentration ratios and extracted their side effects and indications as features. Next, we trained SVM models with a polynomial kernel and obtained accuracy of 76.0%, AUC 0.739, and F1 score (macro weighted) 0.760 with Monte Carlo cross validation. The independent test accuracy was 68.3%, AUC 0.692, F1 score 0.676. When both chemical features and clinical phenotypes were available, combining the two types of features achieved significantly better performance than the chemical feature based approach (accuracy 85.5% versus 72.9%, AUC 0.854 versus 0.733, F1 score 0.854 versus 0.725; P < e−90). We also conducted de novo prediction and identified 110 drugs in the SIDER database having the potential to penetrate the BBB, which could serve as a starting point for CNS drug repositioning research. Availability and Implementation: https://github.com/bioinformatics-gao/CASE-BBB-prediction-Data Contact: rxx@case.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27993785
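
    The F1 scores reported above combine precision and recall computed from confusion-matrix counts; a minimal sketch is:

```python
def f1_score(tp, fp, fn):
    # Harmonic mean of precision and recall from confusion counts
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```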

  12. THE NATIONAL CHILDREN'S STUDY: PROGRESS DEVELOPING METHODS APPROPRIATE FOR ASSESSING CHILDREN'S EXPOSURE, BIOMARKERS AND GENETIC SUSCEPTIBILITY

    EPA Science Inventory

    Invited presentation: no abstract submission fee required
    Introduction abstract for Workshop.


  13. An Improved Method of Predicting Extinction Coefficients for the Determination of Protein Concentration.

    PubMed

    Hilario, Eric C; Stern, Alan; Wang, Charlie H; Vargas, Yenny W; Morgan, Charles J; Swartz, Trevor E; Patapoff, Thomas W

    2017-01-01

    Concentration determination is an important method of protein characterization required in the development of protein therapeutics. There are many known methods for determining the concentration of a protein solution, but the easiest to implement in a manufacturing setting is absorption spectroscopy in the ultraviolet region. For typical proteins composed of the standard amino acids, absorption at wavelengths near 280 nm is due to the three amino acid chromophores tryptophan, tyrosine, and phenylalanine in addition to a contribution from disulfide bonds. According to the Beer-Lambert law, absorbance is proportional to concentration and path length, with the proportionality constant being the extinction coefficient. Typically the extinction coefficient of proteins is experimentally determined by measuring a solution absorbance then experimentally determining the concentration, a measurement with some inherent variability depending on the method used. In this study, extinction coefficients were calculated based on the measured absorbance of model compounds of the four amino acid chromophores. These calculated values for an unfolded protein were then compared with an experimental concentration determination based on enzymatic digestion of proteins. The experimentally determined extinction coefficient for the native proteins was consistently found to be 1.05 times the calculated value for the unfolded proteins for a wide range of proteins with good accuracy and precision under well-controlled experimental conditions. The value of 1.05 times the calculated value was termed the predicted extinction coefficient. Statistical analysis shows that the differences between predicted and experimentally determined coefficients are scattered randomly, indicating no systematic bias between the values among the proteins measured. The predicted extinction coefficient was found to be accurate and not subject to the inherent variability of experimental methods. 
We propose the use of a predicted extinction coefficient for determining the protein concentration of therapeutic proteins starting from early development through the lifecycle of the product. LAY ABSTRACT: Knowing the concentration of a protein in a pharmaceutical solution is important to the drug's development and posology. There are many ways to determine the concentration, but the easiest one to use in a testing lab employs absorption spectroscopy. Absorbance of ultraviolet light by a protein solution is proportional to its concentration and path length; the proportionality constant is the extinction coefficient. The extinction coefficient of a protein therapeutic is usually determined experimentally during early product development and has some inherent method variability. In this study, extinction coefficients of several proteins were calculated based on the measured absorbance of model compounds. These calculated values for an unfolded protein were then compared with experimental concentration determinations based on enzymatic digestion of the proteins. The experimentally determined extinction coefficient for the native protein was 1.05 times the calculated value for the unfolded protein with good accuracy and precision under controlled experimental conditions, so the value of 1.05 times the calculated coefficient was called the predicted extinction coefficient. Comparison of predicted and measured extinction coefficients indicated that the predicted value was very close to the experimentally determined values for the proteins. The predicted extinction coefficient was accurate and removed the variability inherent in experimental methods. © PDA, Inc. 2017.
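
    The calculation described above follows the Beer-Lambert law, A = ε·c·l. A minimal sketch using widely cited literature chromophore coefficients at 280 nm (not the model-compound values derived in this study), together with the study's empirical 1.05 native-state factor, is:

```python
# Literature molar extinction coefficients at 280 nm (M^-1 cm^-1);
# commonly used values, not this study's model-compound values.
# Phenylalanine's small contribution is neglected here.
EXT_280 = {"Trp": 5500, "Tyr": 1490, "cystine": 125}

def extinction_unfolded(n_trp, n_tyr, n_cystine):
    # Calculated coefficient of the unfolded protein from its composition
    return (n_trp * EXT_280["Trp"] + n_tyr * EXT_280["Tyr"]
            + n_cystine * EXT_280["cystine"])

def extinction_native_predicted(eps_unfolded):
    # The study's empirical factor: native value = 1.05 x unfolded value
    return 1.05 * eps_unfolded

def concentration_molar(absorbance, epsilon, path_cm=1.0):
    # Beer-Lambert law: A = epsilon * c * l, so c = A / (epsilon * l)
    return absorbance / (epsilon * path_cm)
```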

  14. Atmospheric reaction of Cl + methacrolein: a theoretical study on the mechanism, and pressure- and temperature-dependent rate constants.

    PubMed

    Sun, Cuihong; Xu, Baoen; Zhang, Shaowen

    2014-05-22

    Methacrolein is a major degradation product of isoprene, and the reaction of methacrolein with Cl atoms may play some role in the degradation of isoprene where these species are relatively abundant. However, the energetics and kinetics of this reaction, which govern the reaction branching, are still not well understood. In the present study, two-dimensional potential energy surfaces were constructed to analyze the minimum energy path of the barrierless addition process between Cl and the C═C double bond of methacrolein, which reveals that the terminal addition intermediate is directly formed from the addition reaction. The terminal addition intermediate can further yield different products, among which the reaction paths abstracting the aldehyde hydrogen atom and the methyl hydrogen atom are the dominant reaction exits. The minimum reaction path for the direct aldehydic hydrogen atom abstraction is also obtained. The reaction kinetics was calculated by the variational transition state theory in conjunction with the master equation method. From the theoretical model we predicted that the overall rate constant of the Cl + methacrolein reaction at 297 K and atmospheric pressure is k(overall) = 2.3 × 10(-10) cm(3) molecule(-1) s(-1), and the branching ratio of the aldehydic hydrogen abstraction is about 12%. The reaction is pressure dependent at P < 10 Torr, with the high pressure limit at about 100 Torr. The calculated results could well account for the experimental observations.

  15. Air Pollution Translations: A Bibliography with Abstracts - Volume 2.

    ERIC Educational Resources Information Center

    National Air Pollution Control Administration (DHEW), Raleigh, NC.

    This volume is the second in a series of compilations presenting abstracts and indexes of translations of technical air pollution literature. The 444 entries are grouped into 12 subject categories: General; Emission Sources; Atmospheric Interaction; Measurement Methods; Control Methods; Effects--Human Health; Effects--Plants and Livestock;…

  16. Multiple Grammars and the Logic of Learnability in Second Language Acquisition.

    PubMed

    Roeper, Tom W

    2016-01-01

    The core notion of modern Universal Grammar is that language ability requires abstract representation in terms of hierarchy, movement operations, abstract features on words, and fixed mapping to meaning. These mental structures are a step toward integrating representational knowledge of all kinds into a larger model of cognitive psychology. Examining first and second language at once provides clues as to how abstractly we should represent this knowledge. The abstract nature of grammar allows both the formulation of many grammars and the possibility that a rule of one grammar could apply to another grammar. We argue that every language contains Multiple Grammars which may reflect different language families. We develop numerous examples of how the same abstract rules can apply in various languages and develop a theory of how language modules (case-marking, topicalization, and quantification) interact to predict L2 acquisition paths. In particular we show in depth how Germanic Verb-second operations, based on Verb-final structure, can apply in English. The argument is built around how and where V2 from German can apply in English, seeking to explain the crucial contrast: "nothing" yelled out Bill/(*)"nothing" yelled Bill out in terms of the necessary abstractness of the V2 rule.

  17. Overcoming the Challenges of Unstructured Data in Multisite, Electronic Medical Record-based Abstraction.

    PubMed

    Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy J H

    2016-10-01

Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to reliably abstract, as these data are often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. As standard abstraction approaches resulted in substandard data reliability for unstructured data elements collected as part of a multisite, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply, and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. We adopted a "fit-for-use" framework to guide the development and evaluation of abstraction methods using a 4-step, phase-based approach including (1) team building; (2) identification of challenges; (3) adaptation of abstraction methods; and (4) systematic data quality monitoring. Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (eg, warfarin initiation) and medical follow-up (eg, timeframe for follow-up). After implementation of the phase-based approach, interrater reliability for all unstructured data elements demonstrated κ values of ≥0.89, an average increase of +0.25 for each unstructured data element. As compared with standard abstraction methodologies, this phase-based approach was more time intensive, but it markedly increased abstraction reliability for unstructured data elements within multisite EMR documentation.

  18. Mixed metaphors: Electrophysiological brain responses to (un)expected concrete and abstract prepositional phrases.

    PubMed

    Zane, Emily; Shafer, Valerie

    2018-02-01

Languages around the world use spatial terminology, like prepositions, to describe non-spatial, abstract concepts, including time (e.g., in the moment). The Metaphoric Mapping Theory explains this pattern by positing that a universal human cognitive process underlies it, whereby abstract concepts are conceptualized via the application of concrete, three-dimensional space onto abstract domains. The alternative view is that the use of spatial prepositions in abstract phrases is idiomatic, and thus does not trigger metaphoric mapping. In the current study, event-related potentials (ERPs) were used to examine the time-course of neural processing of concrete and abstract phrases consisting of the prepositions in or on followed by congruent and incongruent nouns (e.g., in the bowl/plate and in the moment/mend). ERPs were recorded from the onset of reference nouns in 28 adult participants using a 128-channel electrode net. Results show that congruency has differential effects on neural measures, depending on whether the noun is concrete or abstract. Incongruent reference nouns in concrete phrases (e.g., on the bowl) elicited a significant central negativity (an N400 effect), while incongruent reference nouns in abstract phrases (e.g., on the moment) did not. These results suggest that spatially incongruent concrete nouns are semantically unexpected (N400 effect). A P600 effect, which might indicate rechecking, reanalysis and/or reconstruction, was predicted for incongruent abstract nouns, but was not observed, possibly due to the variability in abstract stimuli. Findings cast doubt on accounts claiming that abstract uses of prepositions are cognitively and metaphorically linked to their spatial sense during natural, on-line processing. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Abstract analogical reasoning in high-functioning children with autism spectrum disorders.

    PubMed

    Green, Adam E; Kenworthy, Lauren; Mosner, Maya G; Gallagher, Natalie M; Fearon, Edward W; Balhana, Carlos D; Yerys, Benjamin E

    2014-12-01

    Children with autism spectrum disorders (ASD) exhibit a deficit in spontaneously recognizing abstract similarities that are crucial for generalizing learning to new situations. This may contribute to deficits in the development of appropriate schemas for navigating novel situations, including social interactions. Analogical reasoning is the central cognitive mechanism that enables typically developing children to understand abstract similarities between different situations. Intriguingly, studies of high-functioning children with ASD point to a relative cognitive strength in basic, nonabstract forms of analogical reasoning. If this analogical reasoning ability extends to abstract analogical reasoning (i.e., between superficially dissimilar situations), it may provide a bridge between a cognitive capability and core ASD deficits in areas such as generalization and categorization. This study tested whether preserved analogical reasoning abilities in ASD can be extended to abstract analogical reasoning, using photographs of real-world items and situations. Abstractness of the analogies was determined via a quantitative measure of semantic distance derived from latent semantic analysis. Children with ASD performed as well as typically developing children at identifying abstract analogical similarities when explicitly instructed to apply analogical reasoning. Individual differences in abstract analogical reasoning ability predicted individual differences in a measure of social function in the ASD group. Preliminary analyses indicated that children with ASD, but not typically developing children, showed an effect of age on abstract analogical reasoning. These results provide new evidence that children with ASD are capable of identifying abstract similarities through analogical reasoning, pointing to abstract analogical reasoning as a potential lever for improving generalization skills and social function in ASD. 
© 2014 International Society for Autism Research, Wiley Periodicals, Inc.

  20. Modeling Habitat Suitability of Migratory Birds from Remote Sensing Images Using Convolutional Neural Networks

    PubMed Central

    Su, Jin-He; Piao, Ying-Chao; Luo, Ze; Yan, Bao-Ping

    2018-01-01

    Simple Summary Understanding the spatio-temporal distribution of species habitats would facilitate wildlife resource management and conservation efforts. Existing methods have poor performance due to the limited availability of training samples. More recently, location-aware sensors have been widely used to track animal movements. The aim of the study was to generate suitability maps for bar-headed geese using movement data coupled with environmental parameters, such as remote sensing images and temperature data. Therefore, we modified a deep convolutional neural network to accept multi-scale inputs. The results indicate that the proposed method can identify areas with dense concentrations of the goose species around Qinghai Lake. In addition, this approach might also be of interest for other species with different niche factors or for areas where biological survey data are scarce. Abstract With the application of various data acquisition devices, large volumes of animal movement data can be used to label presence data in remote sensing images and predict species distribution. In this paper, a two-stage classification approach combining movement data and moderate-resolution remote sensing images was proposed. First, we introduced a new density-based clustering method to identify stopovers from migratory birds' movement data and generated classification samples based on the clustering result. We split the remote sensing images into 16 × 16 patches and labeled them as positive samples if they overlapped with stopovers. Second, a multi-convolution neural network model was proposed for extracting features from temperature data and remote sensing images, respectively. A Support Vector Machine (SVM) model was then used to combine the features and produce the final classification results. The experimental analysis was carried out on public Landsat 5 TM images and a GPS dataset collected from 29 birds over three years. The results indicated that our proposed method outperforms the existing baseline methods and achieves good performance in habitat suitability prediction. PMID:29701686
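The first stage, identifying stopovers from movement data, can be sketched with a simple density-style grouping: a run of consecutive GPS fixes that stays within a radius of its first fix for a minimum dwell time forms one stopover. This is a generic stand-in for the paper's clustering method; the radius and dwell-time thresholds, and the toy track, are invented.

```python
# Sketch of stopover detection from GPS fixes: a run of consecutive
# positions staying within `radius_km` of its first fix for at least
# `min_dwell_h` hours counts as one stopover. Thresholds are invented;
# this is a generic stand-in for the paper's density-based clustering.
import math

def dist_km(a, b):
    """Rough planar distance in km between (lat, lon) points."""
    dlat = (a[0] - b[0]) * 111.0
    dlon = (a[1] - b[1]) * 111.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def stopovers(fixes, radius_km=5.0, min_dwell_h=24.0):
    """fixes: list of (hours, lat, lon) tuples sorted by time."""
    found, i = [], 0
    while i < len(fixes):
        j = i
        while j + 1 < len(fixes) and dist_km(fixes[i][1:], fixes[j + 1][1:]) <= radius_km:
            j += 1
        if fixes[j][0] - fixes[i][0] >= min_dwell_h:
            found.append((fixes[i][0], fixes[j][0]))  # (start_h, end_h)
        i = j + 1
    return found

track = [(0, 36.9, 100.1), (12, 36.91, 100.12), (30, 36.9, 100.11),  # dwell
         (36, 38.5, 102.0),                                          # transit
         (40, 39.0, 103.0), (70, 39.01, 103.01)]                     # dwell
print(stopovers(track))  # two stopovers: hours 0-30 and 40-70
```

In the paper's pipeline, image patches overlapping such stopover locations would then become positive training samples for the classifier.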

  1. Prognostic and predictive value of TP53 mutations in node-positive breast cancer patients treated with anthracycline- or anthracycline/taxane-based adjuvant therapy: results from the BIG 02-98 phase III trial

    PubMed Central

    2012-01-01

    Abstract Introduction Pre-clinical data suggest p53-dependent anthracycline-induced apoptosis and p53-independent taxane activity. However, dedicated clinical research has not defined a predictive role for TP53 gene mutations. The aim of the current study was to retrospectively explore the prognostic and predictive value of TP53 somatic mutations in the BIG 02-98 randomized phase III trial, in which women with node-positive breast cancer were treated with adjuvant doxorubicin-based chemotherapy with or without docetaxel. Methods The prognostic and predictive values of TP53 were analyzed in tumor samples by gene sequencing within exons 5 to 8. Patients were classified according to p53 protein status predicted from TP53 gene sequence, as wild-type (no TP53 variation, or TP53 variations predicted not to modify the p53 protein sequence) or mutant (p53 nonsynonymous mutations). Mutations were subcategorized as missense or truncating. Survival analyses were performed using the Kaplan-Meier method and log-rank test. Cox regression analysis was used to identify independent predictors of outcome. Results TP53 gene status was determined for 18% (520 of 2887) of the women enrolled in BIG 02-98. TP53 gene variations were found in 17% (90 of 520). Nonsynonymous p53 mutations, found in 16.3% (85 of 520), were associated with older age, ductal morphology, higher grade and hormone-receptor negativity. Of the nonsynonymous mutations, 12.3% (64 of 520) were missense and 3.6% (19 of 520) were truncating. Only truncating mutations showed significant independent prognostic value, with an increased recurrence risk compared to patients with non-modified p53 protein (hazard ratio = 3.21, 95% confidence interval = 1.740 to 5.935, P = 0.0002). p53 status had no significant predictive value for response to docetaxel. Conclusions p53 truncating mutations were uncommon but associated with poor prognosis. No significant predictive role for p53 status was detected. 
Trial registration ClinicalTrials.gov NCT00174655 PMID:22551440

  2. Is gastroenterology research in decline? A comparison of abstract publication rates from The British Society of Gastroenterology meetings between 1995 and 2005.

    PubMed

    Prendergast, Sarah; Mattishent, Katharina; Broughton, Tom; Beales, Ian

    2013-01-01

    Background: Reports have suggested that academic medicine may be in decline within the UK. Further evidence suggests that rates of subsequent full publication of abstracts presented at major scientific meetings are low and may be declining. We compared the publication rates of abstracts presented at meetings of the British Society of Gastroenterology (BSG) between 1995 and 2005 and examined factors associated with full paper publication. Methods: Abstracts presented at BSG meetings in 1995 and 2005 were assessed by cross-referencing with multiple databases. Abstract characteristics associated with publication were analysed. Results: There were no differences in overall publication rates, impact factors or time to publication between 1995 and 2005. Overall, basic-science abstracts were twice as likely to achieve full publication as non-basic-science abstracts. There was a significant fall in the publication rates for case series and audits, and significantly increased rates for fundamental/basic-science abstracts over the study period. There were non-significant increases in publication rates for controlled trials and systematic reviews. In general, publication rates for all predominantly clinically orientated abstracts fell between the two periods, with the most notable fall occurring in nutrition. Conclusions: There was no evidence of a decline in overall abstract publication rates between 1995 and 2005. There seemed to be a trend toward increased publication rates for abstracts using perceived high-quality study methodologies, with a corresponding decrease in those with lower-quality methods. The proportion of basic-science abstracts is likely to be a determinant of overall full publication rates following scientific meetings.

  3. First-Day Newborn Weight Loss Predicts In-Hospital Weight Nadir for Breastfeeding Infants

    PubMed Central

    Bokser, Seth; Newman, Thomas B.

    2010-01-01

    Abstract Background Exclusive breastfeeding reduces infant infectious disease. Losing ≥10% of birth weight may lead to formula use. The predictive value of first-day weight loss for subsequent weight loss has not been studied. The objective of the present study was to evaluate the relationship between weight loss at <24 hours and subsequent in-hospital weight loss ≥10%. Methods For 1,049 infants, we extracted gestational age, gender, delivery method, feeding type, and weights from medical records. Weight nadir was defined as the lowest weight recorded during birth hospitalization. We used multivariate logistic regression to assess the effect of first-day weight loss on subsequent in-hospital weight loss. Results Mean in-hospital weight nadir was 6.0 ± 2.6%, and mean age at in-hospital weight nadir was 38.7 ± 18.5 hours. While in the hospital, 6.4% of infants lost ≥10% of birth weight. Infants losing ≥4.5% of birth weight at <24 hours had greater risk of eventual in-hospital weight loss ≥10% (adjusted odds ratio 3.57 [1.75, 7.28]). In this cohort, 798 (76.1%) infants did not have documented weight gain while in the hospital. Conclusions Early weight loss predicts higher risk of ≥10% in-hospital weight loss. Infants with high first-day weight loss could be targeted for further research into improved interventions to promote breastfeeding. PMID:20113202
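To make the reported adjusted odds ratio concrete: applying OR = 3.57 to the cohort-wide 6.4% rate of ≥10% loss gives a rough implied risk for infants with high first-day loss. This is a back-of-the-envelope conversion that ignores covariate adjustment, so it is only a ballpark illustration, not a figure from the study.

```python
# Rough illustration: convert the reported adjusted odds ratio (3.57)
# into an implied risk, using the cohort-wide 6.4% rate as the baseline.
# Ignores covariate adjustment, so this is only a ballpark figure.

def odds(p):
    """Probability -> odds."""
    return p / (1 - p)

def prob(o):
    """Odds -> probability."""
    return o / (1 + o)

baseline_risk = 0.064   # cohort rate of >=10% in-hospital weight loss
odds_ratio = 3.57       # reported adjusted OR for >=4.5% first-day loss

implied_risk = prob(odds_ratio * odds(baseline_risk))
print(f"{implied_risk:.1%}")  # roughly one in five under these assumptions
```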

  4. Theory of Mind: A Neural Prediction Problem

    PubMed Central

    Koster-Hale, Jorie; Saxe, Rebecca

    2014-01-01

    Predictive coding posits that neural systems make forward-looking predictions about incoming information. Neural signals contain information not about the currently perceived stimulus, but about the difference between the observed and the predicted stimulus. We propose to extend the predictive coding framework from high-level sensory processing to the more abstract domain of theory of mind; that is, to inferences about others’ goals, thoughts, and personalities. We review evidence that, across brain regions, neural responses to depictions of human behavior, from biological motion to trait descriptions, exhibit a key signature of predictive coding: reduced activity to predictable stimuli. We discuss how future experiments could distinguish predictive coding from alternative explanations of this response profile. This framework may provide an important new window on the neural computations underlying theory of mind. PMID:24012000

  5. Complete Genome Sequence of Thermus thermophilus TMY, Isolated from a Geothermal Power Plant

    PubMed Central

    Fujino, Yasuhiro; Nagayoshi, Yuko; Ohshima, Toshihisa; Ogata, Seiya

    2017-01-01

    ABSTRACT Thermus thermophilus TMY (JCM 10668) was isolated from silica scale formed at a geothermal power plant in Japan. Here, we report the complete genome sequence for this strain, which contains a chromosomal DNA of 2,121,526 bp with 2,500 predicted genes and a pTMY plasmid of 19,139 bp, with 28 predicted genes. PMID:28153912

  6. Survey of Condition Indicators for Condition Monitoring Systems (Open Access)

    DTIC Science & Technology

    2014-09-29

ABSTRACT Currently, the wind energy industry is swiftly changing its maintenance strategy from schedule-based maintenance to predictive-based maintenance. Condition monitoring systems (CMS) play an important role in the predictive maintenance cycle. As condition monitoring systems are being adopted by more and more OEM and O&M service providers in the wind energy industry, it is…

  7. A risk analysis approach for using discriminant functions to manage logging-related landslides on granitic terrain

    Treesearch

    Raymond M. Rice; Norman H. Pillsbury; Kurt W. Schmidt

    1985-01-01

    Abstract - A linear discriminant function, developed to predict debris avalanches after clearcut logging on a granitic batholith in northwestern California, was tested on data from two batholiths. The equation was inaccurate in predicting slope stability on one of them. A new equation based on slope, crown cover, and distance from a stream (retained from the original...

  8. Abstraction and model evaluation in category learning.

    PubMed

    Vanpaemel, Wolf; Storms, Gert

    2010-05-01

    Thirty previously published data sets, from seminal category learning tasks, are reanalyzed using the varying abstraction model (VAM). Unlike a prototype-versus-exemplar analysis, which focuses on extreme levels of abstraction only, a VAM analysis also considers the possibility of partial abstraction. Whereas most data sets support no abstraction when only the extreme possibilities are considered, we show that evidence for abstraction can be provided using the broader view on abstraction provided by the VAM. The present results generalize earlier demonstrations of partial abstraction (Vanpaemel & Storms, 2008), in which only a small number of data sets was analyzed. Following the dominant modus operandi in category learning research, Vanpaemel and Storms evaluated the models on their best fit, a practice known to ignore the complexity of the models under consideration. In the present study, in contrast, model evaluation not only relies on the maximal likelihood, but also on the marginal likelihood, which is sensitive to model complexity. Finally, using a large recovery study, it is demonstrated that, across the 30 data sets, complexity differences between the models in the VAM family are small. This indicates that a (computationally challenging) complexity-sensitive model evaluation method is uncalled for, and that the use of a (computationally straightforward) complexity-insensitive model evaluation method is justified.

  9. Discrepancies Between Plastic Surgery Meeting Abstracts and Subsequent Full-Length Manuscript Publications.

    PubMed

    Denadai, Rafael; Araujo, Gustavo Henrique; Pinho, Andre Silveira; Denadai, Rodrigo; Samartine, Hugo; Raposo-Amaral, Cassio Eduardo

    2016-10-01

    The purpose of this bibliometric study was to assess the discrepancies between plastic surgery meeting abstracts and subsequent full-length manuscript publications. Abstracts presented at the Brazilian Congress of Plastic Surgery from 2010 to 2011 were compared with matching manuscript publications. Discrepancies between the abstract and the subsequent manuscript were categorized as major (changes in the purpose, methods, study design, sample size, statistical analysis, results, and conclusions) and minor (changes in the title and authorship) variations. The overall discrepancy rate was 96 %, with at least one major (76 %) and/or minor (96 %) variation. There were inconsistencies between the study title (56 %), authorship (92 %), purpose (6 %), methods (20 %), study design (36 %), sample size (51.2 %), statistical analysis (14 %), results (20 %), and conclusions (8 %) of manuscripts compared with their corresponding meeting abstracts. As changes occur before manuscript publication of plastic surgery meeting abstracts, caution should be exercised in referencing abstracts or altering surgical practices based on abstracts' content. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  10. Current Literature on Venereal Disease, 1972. Number Three. Abstracts and Bibliography.

    ERIC Educational Resources Information Center

    Lea, Mildred V., Ed.

    Presented are abstracts of documents and research pertaining to the clinical description, laboratory diagnosis, management, and therapy of syphilis and gonorrhea. Abstracted case studies of other minor venereal and related diseases are also included, as are bibliographies on current research and evaluation, public health methods, and behavioral…

  11. Current Literature on Venereal Disease, 1972. Number Two. Abstracts and Bibliography.

    ERIC Educational Resources Information Center

    Lea, Mildred V., Ed.

    Presented are abstracts of documents and research pertaining to the clinical description, laboratory diagnosis, management, and therapy of syphilis and gonorrhea. Abstracted case studies of other minor venereal and related diseases are also included, as are bibliographies on current research and evaluation, public health methods, and behavioral…

  12. Current Literature on Venereal Disease, 1972. Number One. Abstracts and Bibliography.

    ERIC Educational Resources Information Center

    Lea, Mildred V., Ed.

    Presented are abstracts of documents and research pertaining to the clinical description, laboratory diagnosis, management, and therapy of syphilis and gonorrhea. Abstracted case studies of other minor venereal and related diseases are also included, as are bibliographies on current research and evaluation, public health methods, and behavioral…

  13. [Qualitative research in health services research - discussion paper, Part 2: Qualitative research in health services research in Germany - an overview].

    PubMed

    Karbach, U; Stamer, M; Holmberg, C; Güthlin, C; Patzelt, C; Meyer, T

    2012-08-01

    This is the second part of a 3-part discussion paper by the working group on "Qualitative Methods" in the German network of health services research (DNVF) that shall contribute to the development of a memorandum concerning qualitative health services research. It aims to depict the different types of qualitative research that are conducted in health services research in Germany. In addition, the authors present a specific set of qualitative data collection and analysis tools to demonstrate the potential of qualitative research for health services research. QUALITATIVE RESEARCH IN HEALTH SERVICES RESEARCH - AN OVERVIEW: To give an overview of the types of qualitative research conducted in German health services research, the abstracts of the 8th German Conference on Health Services Research were filtered to identify qualitative or mixed-methods studies. These were then analysed by looking at the context which was studied, who was studied, the aims of the studies, and what type of methods were used. Those methods that were mentioned most often for data collection and analysis are described in detail. QUALITATIVE RESEARCH AT THE CONFERENCE FOR HEALTH SERVICES RESEARCH 2009: Approximately a fifth of all abstracts (n=74) had a qualitative (n=47) or a mixed-methods approach combining quantitative and qualitative methods (n=27). Research aims included needs assessment (41%), survey development (36%), evaluation (22%), and theorizing (1%). Data collection mostly consisted of one-on-one interviews (n=45) and group discussions (n=29). Qualitative content analysis was named in 35 abstracts, 30 abstracts did not reference their method of analysis. In addition to a quantitative summary of the abstract findings, the diversity of fields addressed by qualitative methods is highlighted. 
Although drawing conclusions on the use of qualitative methods in German health services research from the analysis of conference abstracts is not possible, the overview we present demonstrates the diversity of methods used for data collection and analysis and showed that a few select methods are extensively used. One of the tasks a memorandum of qualitative health services research should accomplish is to highlight underutilized research methods, which may help to develop the potential of qualitative methodology in German health services research. © Georg Thieme Verlag KG Stuttgart · New York.

  14. Patient-Reported Outcomes After Radiation Therapy in Men With Prostate Cancer: A Systematic Review of Prognostic Tool Accuracy and Validity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Callaghan, Michael E., E-mail: elspeth.raymond@health.sa.gov.au; Freemasons Foundation Centre for Men's Health, University of Adelaide; Urology Unit, Repatriation General Hospital, SA Health, Flinders Centre for Innovation in Cancer

    Purpose: To identify, through a systematic review, all validated tools used for the prediction of patient-reported outcome measures (PROMs) in patients being treated with radiation therapy for prostate cancer, and provide a comparative summary of accuracy and generalizability. Methods and Materials: PubMed and EMBASE were searched from July 2007. Title/abstract screening, full text review, and critical appraisal were undertaken by 2 reviewers, whereas data extraction was performed by a single reviewer. Eligible articles had to provide a summary measure of accuracy and undertake internal or external validation. Tools were recommended for clinical implementation if they had been externally validated and found to have accuracy ≥70%. Results: The search strategy identified 3839 potential studies, of which 236 progressed to full text review and 22 were included. From these studies, 50 tools predicted gastrointestinal/rectal symptoms, 29 tools predicted genitourinary symptoms, 4 tools predicted erectile dysfunction, and no tools predicted quality of life. For patients treated with external beam radiation therapy, 3 tools could be recommended for the prediction of rectal toxicity, gastrointestinal toxicity, and erectile dysfunction. For patients treated with brachytherapy, 2 tools could be recommended for the prediction of urinary retention and erectile dysfunction. Conclusions: A large number of tools for the prediction of PROMs in prostate cancer patients treated with radiation therapy have been developed. Only a small minority are accurate and have been shown to be generalizable through external validation. This review provides an accessible catalogue of tools that are ready for clinical implementation as well as which should be prioritized for validation.

  15. Use of a Machine-learning Method for Predicting Highly Cited Articles Within General Radiology Journals.

    PubMed

    Rosenkrantz, Andrew B; Doshi, Ankur M; Ginocchio, Luke A; Aphinyanaphongs, Yindalon

    2016-12-01

    This study aimed to assess the performance of a text classification machine-learning model in predicting highly cited articles within the recent radiological literature and to identify the model's most influential article features. We downloaded from PubMed the title, abstract, and medical subject heading terms for 10,065 articles published in 25 general radiology journals in 2012 and 2013. Three machine-learning models were applied to predict the top 10% of included articles in terms of the number of citations to the article in 2014 (reflecting the 2-year time window in conventional impact factor calculations). The model having the highest area under the curve was selected to derive a list of article features (words) predicting high citation volume, which was iteratively reduced to identify the smallest possible core feature list maintaining predictive power. Overall themes were qualitatively assigned to the core features. The regularized logistic regression (Bayesian binary regression) model had highest performance, achieving an area under the curve of 0.814 in predicting articles in the top 10% of citation volume. We reduced the initial 14,083 features to 210 features that maintain predictivity. These features corresponded with topics relating to various imaging techniques (eg, diffusion-weighted magnetic resonance imaging, hyperpolarized magnetic resonance imaging, dual-energy computed tomography, computed tomography reconstruction algorithms, tomosynthesis, elastography, and computer-aided diagnosis), particular pathologies (prostate cancer; thyroid nodules; hepatic adenoma, hepatocellular carcinoma, non-alcoholic fatty liver disease), and other topics (radiation dose, electroporation, education, general oncology, gadolinium, statistics). Machine learning can be successfully applied to create specific feature-based models for predicting articles likely to achieve high influence within the radiological literature. 
Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
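The core technique here, regularized logistic regression over word features, can be sketched in a few lines of plain Python. The toy titles and labels below are invented, and a real pipeline would use TF-IDF features over thousands of articles; this is a sketch of the technique, not the study's Bayesian binary regression implementation.

```python
# Minimal L2-regularized logistic regression on bag-of-words features,
# trained by stochastic gradient descent. The toy corpus is invented;
# this sketches the technique, not the study's actual pipeline.
import math

docs = ["dual energy computed tomography dose",
        "hyperpolarized magnetic resonance imaging",
        "case report of a rare fracture",
        "teaching file classic radiograph case"]
labels = [1, 1, 0, 0]  # 1 = highly cited (toy labels)

vocab = sorted({w for d in docs for w in d.split()})
X = [[d.split().count(w) for w in vocab] for d in docs]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w = [0.0] * len(vocab)
lr, l2 = 0.5, 0.01
for _ in range(400):
    for xi, yi in zip(X, labels):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
        # gradient step on log-likelihood with an L2 penalty
        w = [wj + lr * ((yi - p) * xj - l2 * wj) for wj, xj in zip(w, xi)]

preds = [int(sigmoid(sum(wj * xj for wj, xj in zip(w, x))) > 0.5) for x in X]
print(preds)  # training-set predictions
```

Inspecting the largest-magnitude weights after training is the analogue of the study's reduction to a small core feature list.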

  16. Tracking Temporal Hazard in the Human Electroencephalogram Using a Forward Encoding Model

    PubMed Central

    2018-01-01

    Abstract Human observers automatically extract temporal contingencies from the environment and predict the onset of future events. Temporal predictions are modeled by the hazard function, which describes the instantaneous probability for an event to occur given it has not occurred yet. Here, we tackle the question of whether and how the human brain tracks continuous temporal hazard on a moment-to-moment basis, and how flexibly it adjusts to strictly implicit variations in the hazard function. We applied an encoding-model approach to human electroencephalographic data recorded during a pitch-discrimination task, in which we implicitly manipulated temporal predictability of the target tones by varying the interval between cue and target tone (i.e. the foreperiod). Critically, temporal predictability either was driven solely by the passage of time (resulting in a monotonic hazard function) or was modulated to increase at intermediate foreperiods (resulting in a modulated hazard function with a peak at the intermediate foreperiod). Forward-encoding models trained to predict the recorded EEG signal from different temporal hazard functions were able to distinguish between experimental conditions, showing that implicit variations of temporal hazard bear tractable signatures in the human electroencephalogram. Notably, this tracking signal was reconstructed best from the supplementary motor area, underlining this area’s link to cognitive processing of time. Our results underline the relevance of temporal hazard to cognitive processing and show that the predictive accuracy of the encoding-model approach can be utilized to track abstract time-resolved stimuli. PMID:29740594
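The hazard function at the heart of this work has a simple discrete form: h(tᵢ) = p(tᵢ) / (1 − Σⱼ<ᵢ p(tⱼ)), the probability of the event at tᵢ given it has not yet occurred. A classic illustration (an assumed toy example, not the study's design) is a uniform distribution over three foreperiods, for which the hazard rises over time even though each foreperiod is equally likely.

```python
# Discrete hazard: h(t_i) = p(t_i) / (1 - sum of p(t_j) for j < i).
# A uniform distribution over three foreperiods (assumed toy example)
# yields a monotonically rising hazard: 1/3, 1/2, 1.
def hazard(probs):
    hazards, survived = [], 1.0
    for p in probs:
        hazards.append(p / survived)  # event prob given not yet occurred
        survived -= p
    return hazards

foreperiod_probs = [1/3, 1/3, 1/3]  # e.g. foreperiods 0.5 s, 1.0 s, 1.5 s
print([round(h, 3) for h in hazard(foreperiod_probs)])
```

Modulating the probabilities to peak at the intermediate foreperiod, as in the study's second condition, changes this hazard profile, which is exactly the implicit variation their encoding models were trained to distinguish.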

  17. Traversing psychological distance.

    PubMed

    Liberman, Nira; Trope, Yaacov

    2014-07-01

    Traversing psychological distance involves going beyond direct experience, and includes planning, perspective taking, and contemplating counterfactuals. Consistent with this view, temporal, spatial, and social distances as well as hypotheticality are associated, affect each other, and are inferred from one another. Moreover, traversing all distances involves the use of abstraction, which we define as forming a belief about the substitutability for a specific purpose of subjectively distinct objects. Indeed, across many instances of both abstraction and psychological distancing, more abstract constructs are used for more distal objects. Here, we describe the implications of this relation for prediction, choice, communication, negotiation, and self-control. We ask whether traversing distance is a general mental ability and whether distance should replace expectancy in expected-utility theories. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Modelling of the material flow of Nd-Fe-B magnets under high temperature deformation via finite element simulation method

    PubMed Central

    Chen, Yen-Ju; Lee, Yen-I; Chang, Wen-Cheng; Hsiao, Po-Jen; You, Jr-Shian; Wang, Chun-Chieh; Wei, Chia-Min

    2017-01-01

    Abstract Hot deformation of Nd-Fe-B magnets has been studied for more than three decades. With a good combination of forming processing parameters, the remanence and (BH)max values of Nd-Fe-B magnets could be greatly increased due to the formation of anisotropic microstructures during hot deformation. In this work, a methodology is proposed for visualizing the material flow in hot-deformed Nd-Fe-B magnets via finite element simulation. Material flow in hot-deformed Nd-Fe-B magnets could be predicted by simulation, which fitted with experimental results. By utilizing this methodology, the correlation between strain distribution and magnetic properties enhancement could be better understood. PMID:28970869

  19. How to prepare and submit abstracts for scientific meetings

    PubMed Central

    Japiassú, Andre Miguel

    2013-01-01

    The presentation of study results is a key step in scientific research, and submitting an abstract to a meeting is often the first form of public communication. Meeting abstracts have a defined structure that is similar to abstracts for scientific articles, with an introduction, the objective, methods, results and conclusions. However, abstracts for meetings are not presented as part of a full article and, therefore, must contain the necessary and most relevant data. In this article, we detail their structure and include tips to make them technically correct. PMID:23917970

  20. How to prepare and submit abstracts for scientific meetings.

    PubMed

    Japiassú, Andre Miguel

    2013-01-01

    The presentation of study results is a key step in scientific research, and submitting an abstract to a meeting is often the first form of public communication. Meeting abstracts have a defined structure that is similar to abstracts for scientific articles, with an introduction, the objective, methods, results and conclusions. However, abstracts for meetings are not presented as part of a full article and, therefore, must contain the necessary and most relevant data. In this article, we detail their structure and include tips to make them technically correct.

  1. Disease Prediction Models and Operational Readiness

    PubMed Central

    Corley, Courtney D.; Pullum, Laura L.; Hartley, David M.; Benedum, Corey; Noonan, Christine; Rabinowitz, Peter M.; Lancaster, Mary J.

    2014-01-01

    The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. We define a disease event to be a biological event with a focus on the One Health paradigm. These events are characterized by evidence of infection and/or disease condition. We reviewed models that attempted to predict a disease event, not merely its transmission dynamics, and we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). We searched commercial and government databases and harvested Google search results for eligible models, using terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. After removal of duplications and extraneous material, a core collection of 6,524 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. As a result, we systematically reviewed 44 papers, and the results are presented in this analysis. We identified 44 models, classified as one or more of the following: event prediction (4), spatial (26), ecological niche (28), diagnostic or clinical (6), spread or response (9), and reviews (3). The model parameters (e.g., etiology, climatic, spatial, cultural) and data sources (e.g., remote sensing, non-governmental organizations, expert opinion, epidemiological) were recorded and reviewed. A component of this review is the identification of verification and validation (V&V) methods applied to each model, if any V&V method was reported. All models were classified as either having undergone some verification or validation method, or no verification or validation.
We close by outlining an initial set of operational readiness level guidelines for disease prediction models based upon established Technology Readiness Level definitions. PMID:24647562

  2. Development of a Screening Tool for Predicting Adverse Outcomes of Gestational Diabetes Mellitus

    PubMed Central

    Park, Jee Soo; Kim, Deok Won; Kwon, Ja-Young; Park, Yong Won; Kim, Young Han; Cho, Hee Young

    2016-01-01

    Abstract Gestational diabetes mellitus (GDM) is a common disease in pregnancy causing maternal and fetal complications. To prevent these adverse outcomes, optimal screening and diagnostic criteria must be adequate, timely, and efficient. This study suggests a novel approach that is practical, efficient, and patient- and clinician-friendly in predicting adverse outcomes of GDM. The authors conducted a retrospective cohort study via medical record review of patients admitted between March 2001 and April 2013 at the Severance Hospital, Seoul, South Korea. Patients diagnosed by a conventional 2-step method were evaluated according to the presence of adverse outcomes (neonatal hypoglycemia, hyperbilirubinemia, and hyperinsulinemia; admission to the neonatal intensive care unit; large for gestational age; gestational insulin therapy; and gestational hypertension). Of 802 women who had an abnormal 50-g, 1-hour glucose challenge test, 306 were diagnosed with GDM and 496 did not have GDM (false-positive group). In the GDM group, 218 women (71.2%) had adverse outcomes. In contrast, 240 women (48.4%) in the false-positive group had adverse outcomes. Women with adverse outcomes had a significantly higher body mass index (BMI) at entry (P = 0.03) and fasting blood glucose (FBG) (P = 0.03). Our logistic regression model derived from 2 variables, BMI at entry and FBG, predicted GDM adverse outcome with an area under the curve of 0.642, accuracy of 61.3%, sensitivity of 57.2%, and specificity of 66.9% compared with the conventional 2-step method with an area under the curve of 0.610, accuracy of 59.1%, sensitivity of 47.6%, and specificity of 74.4%. Our model performed better in predicting GDM adverse outcomes than the conventional 2-step method using only BMI at entry and FBG. Moreover, our model represents a practical, inexpensive, efficient, reproducible, easy, and patient- and clinician-friendly approach. PMID:26735528
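    The performance figures this record reports (area under the ROC curve, accuracy, sensitivity, specificity) can all be computed from predicted risk scores and observed outcomes. A self-contained sketch follows; the scores and labels are invented, while the study's actual model scored patients from BMI at entry and fasting blood glucose.

    ```python
    def auc(scores, labels):
        """Area under the ROC curve via the rank (Mann-Whitney) statistic."""
        pos = [s for s, l in zip(scores, labels) if l == 1]
        neg = [s for s, l in zip(scores, labels) if l == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def confusion_metrics(pred, labels):
        """Accuracy, sensitivity, specificity from binary predictions."""
        tp = sum(p == 1 and l == 1 for p, l in zip(pred, labels))
        tn = sum(p == 0 and l == 0 for p, l in zip(pred, labels))
        fp = sum(p == 1 and l == 0 for p, l in zip(pred, labels))
        fn = sum(p == 0 and l == 1 for p, l in zip(pred, labels))
        return {"accuracy": (tp + tn) / len(labels),
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp)}

    # Hypothetical risk scores and adverse-outcome labels for six patients.
    scores = [0.9, 0.8, 0.7, 0.4, 0.35, 0.2]
    labels = [1,   1,   0,   1,   0,    0]
    pred = [int(s >= 0.5) for s in scores]  # arbitrary decision threshold
    m = confusion_metrics(pred, labels)
    ```

    The AUC is threshold-free (it depends only on how the scores rank cases against non-cases), which is why the study can report it alongside threshold-dependent accuracy, sensitivity, and specificity.
    
    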

  3. Development of a screening tool to predict malnutrition among children under two years old in Zambia

    PubMed Central

    Hasegawa, Junko; Ito, Yoichi M; Yamauchi, Taro

    2017-01-01

    ABSTRACT Background: Maternal and child undernutrition is an important issue, particularly in low- and middle-income countries. Children at high risk of malnutrition should be prioritized to receive necessary interventions to minimize such risk. Several risk factors have been proposed; however, until now, there has been no appropriate evaluation method to identify these children. In sub-Saharan Africa, children commonly receive regular check-ups from community health workers. A simple and easy nutrition assessment method is therefore needed for use by semi-professional health workers. Objectives: The aim of this study was to develop and test a practical screening tool for community use in predicting growth stunting in children under two years in rural Zambia. Methods: Field research was conducted from July to August 2014 in Southern Province, Zambia. Two hundred and sixty-four mother-child pairs participated in the study. Anthropometric measurements were performed on all children and mothers, and all mothers were interviewed. Risk factors for the screening test were estimated by using least absolute shrinkage and selection operator analysis. After re-evaluating all participants using the new screening tool, a receiver operating characteristic curve was drawn to set the cut-off value. Sensitivity and specificity were also calculated. Results: The screening tool included age, weight-for-age Z-score status, birth weight, feeding status, history of sibling death, multiple birth, and maternal education level. The total score ranged from 0 to 22, and the cut-off value was eight. Sensitivity and specificity were 0.963 and 0.697 respectively. Conclusions: A screening tool was developed to predict children at high risk of malnutrition living in Zambia. Further longitudinal studies are needed to confirm the test’s validity in detecting future stunting and to investigate the effectiveness of malnutrition treatment. PMID:28730929
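    A community screening tool of the kind described here reduces to an additive point score compared against a cutoff. The sketch below uses the seven risk factors, the 0-22 total range, and the cutoff of eight from the abstract, but the individual point values are invented placeholders (the abstract does not give the published weights).

    ```python
    # Hypothetical point values per risk factor (illustrative only); the
    # real tool scores these seven items with a total range of 0-22.
    WEIGHTS = {
        "age_high_risk_band": 2,
        "low_weight_for_age_z": 6,
        "low_birth_weight": 4,
        "poor_feeding_status": 4,
        "history_of_sibling_death": 2,
        "multiple_birth": 2,
        "low_maternal_education": 2,
    }
    CUTOFF = 8  # cut-off value set by the ROC analysis in the study

    def screen(child):
        """Return (score, high_risk) for a dict of boolean risk factors."""
        score = sum(w for k, w in WEIGHTS.items() if child.get(k, False))
        return score, score >= CUTOFF

    score, high_risk = screen({"low_weight_for_age_z": True,
                               "poor_feeding_status": True})
    ```

    A design like this is deliberately computable by hand, which is what makes it usable by semi-professional community health workers during routine check-ups.
    
    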

  4. Spectral response model for a multibin photon-counting spectral computed tomography detector and its applications

    PubMed Central

    Liu, Xuejin; Persson, Mats; Bornefalk, Hans; Karlsson, Staffan; Xu, Cheng; Danielsson, Mats; Huber, Ben

    2015-01-01

    Abstract. Variations among detector channels in computed tomography can lead to ring artifacts in the reconstructed images and biased estimates in projection-based material decomposition. Typically, the ring artifacts are corrected by compensation methods based on flat fielding, where transmission measurements are required for a number of material-thickness combinations. Phantoms used in these methods can be rather complex and require an extensive number of transmission measurements. Moreover, material decomposition needs knowledge of the individual response of each detector channel to account for the detector inhomogeneities. For this purpose, we have developed a spectral response model that binwise predicts the response of a multibin photon-counting detector individually for each detector channel. The spectral response model is performed in two steps. The first step employs a forward model to predict the expected numbers of photon counts, taking into account parameters such as the incident x-ray spectrum, absorption efficiency, and energy response of the detector. The second step utilizes a limited number of transmission measurements with a set of flat slabs of two absorber materials to fine-tune the model predictions, resulting in a good correspondence with the physical measurements. To verify the response model, we apply the model in two cases. First, the model is used in combination with a compensation method which requires an extensive number of transmission measurements to determine the necessary parameters. Our spectral response model successfully replaces these measurements by simulations, saving a significant amount of measurement time. Second, the spectral response model is used as the basis of the maximum likelihood approach for projection-based material decomposition. The reconstructed basis images show a good separation between the calcium-like material and the contrast agents, iodine and gadolinium. 
The contrast agent concentrations are reconstructed with more than 94% accuracy. PMID:26839904

  5. Complete Genome Sequence of Cluster J Mycobacteriophage Superphikiman

    PubMed Central

    Pradhan, Pratik; Nako, Sprikena; Tran, Trinh; Aluri, Lavanya S.; Anandarajan, Dharman; Betini, Niteesha; Bhatt, Shivangi D.; Chengalvala, Swetha; Cox, Nicole E.; Delvadia, Bela P.; Desai, Aishwary S.; Devaney, Andrew M.; Doyle, Brenna K.; Edgerton, Arden O.; Erlich, Matthew C.; Fitzpatrick, Kevin C.; Gajjar, Esha A.; Ganguly, Anjali; Gill, Ramnik S.; Good, Pauline M.; Gupta, Nishtha; Haddad, Leila M.; Han, Esther J.; Jain, Shelby; Jiang, Andrew; Jurgielewicz, Andrew D.; Kainth, Devneet K.; Karam, Jawhara M.; Kodavatiganti, Mallika; Kriete, Sinja J.; MacDonald, Catherine E.; Maret, Josh P.; Mathew, Ashley E.; Natrajan, Maanasa; Nishu, Nusrat M.; Patel, Nirali; Patel, Pooja D.; Patel, Shivani; Patra, Kaustav; Rai, Karima K.; Sarkar, Arghyadeep; Shah, Priyanka; Tata, Ravi K.; Tawfik, Andrew H.; Thuremella, Bhavya T.; Toma, Justina; Veera, Shika; Vemulapalli, Vamsee K.; Vidas, Trevor V.; Vieira, Katy S.; Vijayakumar, Gayathri; Walor, Tru A.; White, Clara R.; Wong, Brianna M.; Zhao, Shu L.; Bollivar, David W.; McDonald, Matthew T.; Dalia, Ritu R.; Smith, Kevin P. W.; Little, Joy L.

    2018-01-01

    ABSTRACT Mycobacteriophage Superphikiman is a cluster J bacteriophage which was isolated from soil collected in Philadelphia, PA. Superphikiman has a 109,799-bp genome with 239 predicted genes, including 2 tRNA genes. PMID:29437101

  6. Sessions with Associated Abstracts by Day: Teaching Materials and Methods.

    ERIC Educational Resources Information Center

    Physiologist, 1984

    1984-01-01

    Presented are abstracts of five papers on teaching materials/methods presented at the 35th annual meeting of the American Physiological Society. Topic areas include expert system used as a teacher/consultant in hemostasis problems, computer assisted testing, and excitation/conduction properties of membranes as illustrated by the compound action…

  7. Enumerating Small Sudoku Puzzles in a First Abstract Algebra Course

    ERIC Educational Resources Information Center

    Lorch, Crystal; Lorch, John

    2008-01-01

    Two methods are presented for counting small "essentially different" sudoku puzzles using elementary group theory: one method (due to Jarvis and Russell) uses Burnside's counting formula, while the other employs an invariant property of sudoku puzzles. Ideas are included for incorporating this material into an introductory abstract algebra course.…

  8. Prediction of functional aerobic capacity without exercise testing

    NASA Technical Reports Server (NTRS)

    Jackson, A. S.; Blair, S. N.; Mahar, M. T.; Wier, L. T.; Ross, R. M.; Stuteville, J. E.

    1990-01-01

    The purpose of this study was to develop functional aerobic capacity prediction models without using exercise tests (N-Ex) and to compare their accuracy with Astrand single-stage submaximal prediction methods. The data of 2,009 subjects (9.7% female) were randomly divided into validation (N = 1,543) and cross-validation (N = 466) samples. The validation sample was used to develop two N-Ex models to estimate VO2peak. Gender, age, body composition, and self-reported activity were used to develop the two N-Ex prediction models. One model estimated percent fat from skinfolds (N-Ex %fat) and the other used body mass index (N-Ex BMI) to represent body composition. The multiple correlations for the developed models were R = 0.81 (SE = 5.3 ml.kg-1.min-1) and R = 0.78 (SE = 5.6 ml.kg-1.min-1). This accuracy was confirmed when the models were applied to the cross-validation sample. The N-Ex models were more accurate than VO2peak estimates obtained from the Astrand prediction models, whose SEs ranged from 5.5-9.7 ml.kg-1.min-1. The N-Ex models were also cross-validated on 59 men on hypertensive medication and 71 men who were found to have a positive exercise ECG; the SEs of the N-Ex models ranged from 4.6-5.4 ml.kg-1.min-1 with these subjects. (ABSTRACT TRUNCATED AT 250 WORDS)

  9. Effect of visual target blurring on accommodation under distance viewing

    NASA Astrophysics Data System (ADS)

    Iwata, Yo; Handa, Tomoya; Ishikawa, Hitoshi

    2018-04-01

    Abstract Purpose: To examine the effect of visual target blurring on accommodation. Methods: We evaluated objective refraction values when the visual target (asterisk; 8°) was changed from the state without Gaussian blur (15 s) to the state with Gaussian blur applied [0 (without blur) → 10, 0 → 50, 0 → 100; 15 s each]. Results: With Gaussian blur 10, the refraction value did not change significantly when blurring of the target occurred. With Gaussian blur 50 and 100, the refraction value became significantly myopic when blurring of the target occurred. Conclusion: Blurring of the distant visual target results in intervention of accommodation.

  10. Gas-phase kinetics study of reaction of OH radical with CH3NHNH2 by second-order multireference perturbation theory.

    PubMed

    Sun, Hongyan; Zhang, Peng; Law, Chung K

    2012-05-31

    The gas-phase kinetics of H-abstraction reactions of monomethylhydrazine (MMH) by OH radical was investigated by second-order multireference perturbation theory and a two-transition-state kinetic model. It was found that the abstractions of the central and terminal amine H atoms by the OH radical proceed through the formation of two hydrogen-bonded preactivated complexes with energies 6.16 and 5.90 kcal mol(-1) lower than that of the reactants, whereas the abstraction of a methyl H atom is direct. Due to the multireference character of the transition states, the geometries and ro-vibrational frequencies of the reactant, transition states, reactant complexes, and product complexes were optimized by the multireference CASPT2/aug-cc-pVTZ method, and the energies of the stationary points on the potential energy surface were refined at the QCISD(T)/CBS level via extrapolation of the QCISD(T)/cc-pVTZ and QCISD(T)/cc-pVQZ energies. It was found that the abstraction reactions of the central and two terminal amine H atoms of MMH have submerged energy barriers with energies 2.95, 2.12, and 1.24 kcal mol(-1) lower than that of the reactants, respectively, and the abstraction of a methyl H atom has a real energy barrier of 3.09 kcal mol(-1). Furthermore, four MMH radical-H(2)O complexes were found to connect with product channels and the corresponding transition states. Consequently, the rate coefficients of MMH + OH for the H-abstraction of the amine H atoms were determined on the basis of a two-transition-state model, with the total energy E and angular momentum J conserved between the two transition-state regions.
In units of cm(3) molecule(-1) s(-1), the rate coefficient was found to be k(1) = 3.37 × 10(-16)T(1.295) exp(1126.17/T) for the abstraction of the central amine H to form the CH(3)N(•)NH(2) radical, k(2) = 2.34 × 10(-17)T(1.907) exp(1052.26/T) for the abstraction of the terminal amine H to form the trans-CH(3)NHN(•)H radical, k(3) = 7.41 × 10(-20)T(2.428) exp(1343.20/T) for the abstraction of the terminal amine H to form the cis-CH(3)NHN(•)H radical, and k(4) = 9.13 × 10(-21)T(2.964) exp(-114.09/T) for the abstraction of the methyl H atom to form the C(•)H(2)NHNH(2) radical, respectively. Assuming that the rate coefficients are additive, the total rate coefficient of these theoretical predictions quantitatively agrees with the measured rate constant at temperatures of 200-650 K, with no adjustable parameters.
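    Each channel's rate coefficient above is a modified Arrhenius expression, k(T) = A * T^n * exp(B/T), and the total H-abstraction rate is their sum (the abstract's additivity assumption). The sketch below transcribes the four fitted parameter sets quoted in the abstract (units: cm3 molecule-1 s-1) and evaluates the total at a given temperature; channel names use "*" for the radical site.

    ```python
    import math

    def k_mod_arrhenius(A, n, B, T):
        """Modified Arrhenius form k(T) = A * T**n * exp(B / T)."""
        return A * T**n * math.exp(B / T)

    # (A, n, B) for each channel, as quoted in the abstract.
    CHANNELS = {
        "central amine H  -> CH3N*NH2":        (3.37e-16, 1.295, 1126.17),
        "terminal amine H -> trans-CH3NHN*H":  (2.34e-17, 1.907, 1052.26),
        "terminal amine H -> cis-CH3NHN*H":    (7.41e-20, 2.428, 1343.20),
        "methyl H         -> C*H2NHNH2":       (9.13e-21, 2.964, -114.09),
    }

    def k_total(T):
        """Total MMH + OH rate coefficient, assuming channel rates add."""
        return sum(k_mod_arrhenius(A, n, B, T) for A, n, B in CHANNELS.values())

    k300 = k_total(300.0)  # within the 200-650 K range of validity
    ```

    The negative-barrier (submerged) amine channels carry positive B, so their rates fall with temperature in this range, while the methyl channel (real barrier, negative B) grows; summing the four reproduces the measured overall rate constant per the abstract.
    
    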

  11. A Mathematical Model for Railway Control Systems

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.

    1996-01-01

    We present a general method for modeling safety aspects of railway control systems. Using our modeling method, one can progressively refine an abstract railway safety model, successively adding layers of detail about how a real system actually operates, while maintaining a safety property that refines the original abstract safety property. This method supports a top-down approach to specification of railway control systems and to proof of a variety of safety-related properties. We demonstrate our method by proving safety of the classical block control system.

  12. Urological research in sub-Saharan Africa: a retrospective cohort study of abstracts presented at the Nigerian Association of Urological Surgeons conferences.

    PubMed

    Bello, Jibril Oyekunle

    2013-11-14

    Nigeria is one of the top three countries in Africa in terms of science research output and Nigerian urologists' biomedical research output contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not thoroughly vetted as are full length manuscripts published in peer reviewed journals but the information they disseminate may affect clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual conferences of NAUS, the quality of the abstracts as determined by the subsequent publication of full length manuscripts in peer-review indexed journals and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstracts books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts characteristics were analyzed and their quality judged by subsequent successful publishing of full length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts was subsequently published as full length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with 'beyond basic' statistics of frequencies and averages were more likely to be published than those with basic or no statistics. Quality of the abstracts and thus subsequent publication success is influenced by the use of 'beyond basic' statistics in analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.

  13. Comparison of Eight Equations That Predict Percent Body Fat Using Skinfolds in American Youth

    PubMed Central

    Roberts, Amy; Cai, Jianwen; Berge, Jerica M.; Stevens, June

    2016-01-01

    Abstract Background: Skinfolds are often used in equations to predict percent body fat (PBF) in youth. Although there are numerous such equations published, there is limited information to help researchers determine which equation to use for their sample. Methods: Using data from the 1999–2006 National Health and Nutrition Examination Surveys (NHANES), we compared eight published equations for prediction of PBF. These published equations all included triceps and/or subscapular skinfold measurements. We examined the PBF equations in a nationally representative sample of American youth that was matched by age, sex, and race/ethnicity to the original equation development population and a full sample of 8- to 18-year-olds. We compared the equation-predicted PBF to the dual-emission X-ray absorptiometry (DXA)-measured PBF. The adjusted R2, root mean square error (RMSE), and mean signed difference (MSD) were compared. The MSDs were used to examine accuracy and differential bias by age, sex, and race/ethnicity. Results: When applied to the full range of 8- to 18-year-old youth, the R2 values ranged from 0.495 to 0.738. The MSD between predicted and DXA-measured PBF indicated high average accuracy (MSD between −1.0 and 1.0) for only three equations (Bray subscapular equation and Dezenberg equations [with and without race/ethnicity]). The majority of the equations showed differential bias by sex, race/ethnicity, weight status, or age. Conclusions: These findings indicate that investigators should use caution in the selection of an equation to predict PBF in youth given that results may vary systematically in important subgroups. PMID:27045618
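    The two comparison metrics in this record are straightforward to compute: the mean signed difference (MSD) captures bias (positive when an equation over-predicts on average; "accurate on average" here means an MSD between -1.0 and 1.0), while the RMSE captures overall error. A sketch with invented predicted and DXA-measured PBF values:

    ```python
    import math

    def mean_signed_difference(pred, measured):
        """Mean of (predicted - measured): positive = over-prediction bias."""
        return sum(p - m for p, m in zip(pred, measured)) / len(pred)

    def rmse(pred, measured):
        """Root mean square error of the predictions."""
        return math.sqrt(sum((p - m) ** 2 for p, m in zip(pred, measured)) / len(pred))

    # Hypothetical percent-body-fat values for five youths.
    predicted = [22.0, 30.5, 18.0, 27.0, 35.0]
    dxa       = [21.0, 32.0, 19.5, 26.0, 34.5]

    msd = mean_signed_difference(predicted, dxa)
    err = rmse(predicted, dxa)
    accurate_on_average = -1.0 <= msd <= 1.0
    ```

    The distinction matters for the study's conclusion: an equation can have a small MSD (errors cancel across the sample) yet still show differential bias when the MSD is recomputed within age, sex, or race/ethnicity subgroups.
    
    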

  14. Clinicopathologic characteristics associated with long-term survival in advanced epithelial ovarian cancer: an NRG Oncology/Gynecologic Oncology Group ancillary data study.

    PubMed

    Hamilton, C A; Miller, A; Casablanca, Y; Horowitz, N S; Rungruang, B; Krivak, T C; Richard, S D; Rodriguez, N; Birrer, M J; Backes, F J; Geller, M A; Quinn, M; Goodheart, M J; Mutch, D G; Kavanagh, J J; Maxwell, G L; Bookman, M A

    2018-02-01

    To identify clinicopathologic factors associated with 10-year overall survival in epithelial ovarian cancer (EOC) and primary peritoneal cancer (PPC), and to develop a predictive model identifying long-term survivors. Demographic, surgical, and clinicopathologic data were abstracted from GOG 182 records. The association between clinical variables and long-term survival (LTS) (>10 years) was assessed using multivariable regression analysis. Bootstrap methods were used to develop predictive models from known prognostic clinical factors and predictive accuracy was quantified using optimism-adjusted area under the receiver operating characteristic curve (AUC). The analysis dataset included 3010 evaluable patients, of whom 195 survived greater than ten years. These patients were more likely to have better performance status, endometrioid histology, stage III (rather than stage IV) disease, absence of ascites, less extensive preoperative disease distribution, microscopic disease residual following cytoreduction (R0), and decreased complexity of surgery (p<0.01). Multivariable regression analysis revealed that lower CA-125 levels, absence of ascites, stage, and R0 were significant independent predictors of LTS. A predictive model created using these variables had an AUC=0.729, which outperformed any of the individual predictors. The absence of ascites, a low CA-125, stage, and R0 at the time of cytoreduction are factors associated with LTS when controlling for other confounders. An extensively annotated clinicopathologic prediction model for LTS fell short of clinical utility, suggesting that prognostic molecular profiles are needed to better predict which patients are likely to be long-term survivors. Published by Elsevier Inc.
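    The "optimism-adjusted" performance this record reports comes from a standard bootstrap correction: refit the model on each bootstrap resample, measure how much better it does on its own resample than on the original data (the optimism), and subtract the average optimism from the apparent performance. The sketch below illustrates the loop with invented data and a deliberately overfit-prone toy model (a threshold on one predictor, tuned to maximize training accuracy) in place of the study's multivariable regression, and accuracy in place of AUC.

    ```python
    import random

    def fit_threshold(xs, ys):
        """Pick the cutoff on a single predictor maximizing training accuracy."""
        best_t, best_acc = xs[0], 0.0
        for t in xs:
            acc = sum((x >= t) == bool(y) for x, y in zip(xs, ys)) / len(xs)
            if acc > best_acc:
                best_t, best_acc = t, acc
        return best_t

    def accuracy(t, xs, ys):
        return sum((x >= t) == bool(y) for x, y in zip(xs, ys)) / len(xs)

    def optimism_adjusted(xs, ys, n_boot=200, seed=0):
        """Apparent performance minus average bootstrap optimism."""
        rng = random.Random(seed)
        apparent = accuracy(fit_threshold(xs, ys), xs, ys)
        optimism = 0.0
        for _ in range(n_boot):
            idx = [rng.randrange(len(xs)) for _ in xs]
            bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
            t = fit_threshold(bx, by)  # refit on the resample
            optimism += accuracy(t, bx, by) - accuracy(t, xs, ys)
        return apparent - optimism / n_boot

    # Invented predictor values and long-term-survival labels.
    xs = [0.1, 0.3, 0.35, 0.5, 0.6, 0.7, 0.8, 0.9]
    ys = [0,   0,   1,    0,   1,   1,   0,   1]
    adj = optimism_adjusted(xs, ys)
    ```

    The adjusted figure is an internal-validation estimate of out-of-sample performance, which is why the study can honestly report AUC=0.729 without a separate test cohort.
    
    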

  15. Overcoming the Challenges of Unstructured Data in Multi-site, Electronic Medical Record-based Abstraction

    PubMed Central

    Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy JH

    2014-01-01

    Background Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to reliably abstract, as this data is often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. Objective As standard abstraction approaches resulted in sub-standard data reliability for unstructured data elements collected as part of a multi-site, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. Research Design We adopted a “fit-for-use” framework to guide the development and evaluation of abstraction methods using a four step, phase-based approach including (1) team building, (2) identification of challenges, (3) adaptation of abstraction methods, and (4) systematic data quality monitoring. Measures Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (e.g., warfarin initiation) and medical follow-up (e.g., timeframe for follow-up). Results After implementation of the phase-based approach, inter-rater reliability for all unstructured data elements demonstrated kappas of ≥ 0.89 -- an average increase of + 0.25 for each unstructured data element. Conclusions As compared to standard abstraction methodologies, this phase-based approach was more time intensive, but did markedly increase abstraction reliability for unstructured data elements within multi-site EMR documentation. PMID:27624585
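    The inter-rater reliability figures in this record (kappas >= 0.89) correct raw agreement between two abstractors for agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e). A self-contained sketch for two abstractors' binary judgments; the ratings below are invented, and the data element name is only an example drawn from the abstract.

    ```python
    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters coding the same items."""
        n = len(rater_a)
        categories = set(rater_a) | set(rater_b)
        p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        p_expected = sum(
            (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
        )
        return (p_observed - p_expected) / (1 - p_expected)

    # Hypothetical judgments on whether each of ten discharge records
    # documents warfarin initiation (1 = documented, 0 = not).
    a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
    kappa = cohens_kappa(a, b)
    ```

    Monitoring kappa per data element, as the phase-based approach does in its data-quality step, pinpoints exactly which unstructured elements need adapted abstraction rules.
    
    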

  16. On the State of Stress and Failure Prediction Near Planetary Surface Loads

    NASA Astrophysics Data System (ADS)

    Schultz, R. A.

    1996-03-01

    The state of stress surrounding planetary surface loads has been used extensively to predict failure of surface rocks and to invert this information for effective elastic thickness. As demonstrated previously, however, several factors can be important including an explicit comparison between model stresses and rock strength as well as the magnitude of calculated stress. As re-emphasized below, failure to take stress magnitudes into account can lead to erroneous predictions of near-surface faulting. This abstract results from discussions on graben formation at Fall 1995 AGU.

  17. Application of Earth Sciences Products for use in Next Generation Numerical Aerosol Prediction Models

    DTIC Science & Technology

    2008-09-30

    retrievals, Geophysical Research Abstracts, Vol. 10, EGU2008-A-11193, 2008, SRef-ID: 1607-7962/gra/EGU2008-A-11193, EGU General Assembly 2008. Liu, M...Application of Earth Sciences Products for use in Next Generation Numerical Aerosol...can be generated and predicted. Through this system, we will be able to advance a number of US Navy Applied Science needs in the areas of improved

  18. Subharmonic Imaging and Pressure Estimation for Monitoring Neoadjuvant Chemotherapy

    DTIC Science & Technology

    2014-09-01

    and therapy response [10]. However, the level of IFP has been shown to predict disease free survival for cervix cancer (34% disease free survival...p. 1951-1961. 11. Milosevic M, et al., Interstitial fluid pressure predicts survival in patients with cervix cancer independent of clinical... Neoadjuvant chemotherapy is currently the standard of care for locally advanced breast cancer

  19. Software Design Description for the Polar Ice Prediction System (PIPS) Version 3.0

    DTIC Science & Technology

    2008-11-05

    Naval Research Laboratory, Stennis Space Center, MS 39529-5004. NRL/MR/7320--08-9150. Approved for public release; distribution is unlimited. Software Design Description for the Polar Ice Prediction System (PIPS) Version 3.0. Pamela G

  20. Publication rates of abstracts presented at the Association of Chiropractic Colleges Educational Conference/Research Agenda Conference from 2002 to 2008

    PubMed Central

    Bakkum, Barclay W.; Chapman, Cynthia; Johnson, Claire

    2014-01-01

    Objective The purposes of this study were to investigate the overall publication rates of presentations at the Association of Chiropractic Colleges Educational Conference/Research Agenda Conference (ACC/RAC) meetings (2002–2008), differences in the publication rates of platform vs poster presentations, and the consistency of the meeting abstract compared to the full-length journal article. Methods Abstracts were obtained from proceedings published in the Journal of Chiropractic Education. Literature searches using PubMed and the Index to the Chiropractic Literature (ICL) were performed to locate peer-reviewed journal articles based upon those abstracts. Whether the article was based upon a poster or platform presentation, and the congruence of the information in the abstract and article were recorded. Results We identified 776 proceeding abstracts, 249 of which eventually were published between 2002 and 2012. The overall publication rate was 32.2%. A total of 42.7% of platform presentations eventually were published vs 20.3% of posters. Congruency showed that 43.2% had the same title as the meeting abstract, 59.7% had the same authorship, and 88.8% had the same methods. Conclusion Publication rates of abstracts from spine and orthopedic surgery national meetings range from 34% to 59%. The ACC/RAC meetings have similar publication rates. More platform than poster presentations reach full publication. The congruency of ACC/RAC abstracts to published articles is higher than national meetings in other fields. PMID:24295363

  1. Partition dataset according to amino acid type improves the prediction of deleterious non-synonymous SNPs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Jing; Li, Yuan-Yuan; Shanghai Center for Bioinformation Technology, Shanghai 200235

    2012-03-02

    Highlights: ► Proper dataset partition can improve the prediction of deleterious nsSNPs. ► Partition according to the original residue type at the nsSNP site is a good criterion. ► A similar strategy is expected to be promising in other machine learning problems. -- Abstract: Many non-synonymous SNPs (nsSNPs) are associated with diseases, and numerous machine learning methods have been applied to train classifiers for sorting disease-associated nsSNPs from neutral ones. The continuously accumulating nsSNP data allow us to further explore better prediction approaches. In this work, we partitioned the training data into 20 subsets according to either the original or the substituted amino acid type at the nsSNP site. Using a support vector machine (SVM), training classification models on each subset resulted in an overall accuracy of 76.3% or 74.9%, depending on which of the two partition criteria was used, while training on the whole dataset obtained an accuracy of only 72.6%. Moreover, when the dataset was instead divided randomly into 20 subsets, the corresponding accuracy was only 73.2%. Our results demonstrate that partitioning the whole training dataset into subsets properly, i.e., according to the residue type at the nsSNP site, will improve the performance of the trained classifiers significantly, which should be valuable in developing better tools for predicting the disease association of nsSNPs.
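
    The partition-then-train strategy evaluated in this entry can be sketched in a few lines. The data below are synthetic, and a nearest-centroid rule stands in for the per-subset SVM so the sketch stays dependency-free; all names are illustrative, not from the paper.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
random.seed(0)

# Synthetic stand-in records: original residue at the nsSNP site,
# a small feature vector, and a disease-association label (0/1).
records = [
    (random.choice(AMINO_ACIDS),
     [random.gauss(0, 1) for _ in range(5)],
     random.randint(0, 1))
    for _ in range(400)
]

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

# Partition by original residue type and fit one model per subset
# (the paper trains an SVM per subset; here, a nearest-centroid rule).
models = {}
for aa in AMINO_ACIDS:
    subset = [(x, y) for res, x, y in records if res == aa]
    if len(subset) >= 10:  # skip residues with too few training examples
        models[aa] = {c: centroid([x for x, y in subset if y == c])
                      for c in (0, 1) if any(y == c for _, y in subset)}

def predict(residue, features):
    """Route a query nsSNP to the model trained on its residue-type subset."""
    cents = models.get(residue)
    if not cents:
        return None  # residue type unseen or undertrained
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda c: dist(features, cents[c]))
```

    The random-partition baseline in the abstract corresponds to shuffling the records before splitting instead of grouping them by residue type.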

  2. ABodyBuilder: Automated antibody structure prediction with data–driven accuracy estimation

    PubMed Central

    Leem, Jinwoo; Dunbar, James; Georges, Guy; Shi, Jiye; Deane, Charlotte M.

    2016-01-01

    ABSTRACT Computational modeling of antibody structures plays a critical role in therapeutic antibody design. Several antibody modeling pipelines exist, but no freely available methods currently model nanobodies, provide estimates of expected model accuracy, or highlight potential issues with the antibody's experimental development. Here, we describe our automated antibody modeling pipeline, ABodyBuilder, designed to overcome these issues. The algorithm itself follows the standard 4 steps of template selection, orientation prediction, complementarity-determining region (CDR) loop modeling, and side chain prediction. ABodyBuilder then annotates the ‘confidence’ of the model as a probability that a component of the antibody (e.g., CDRL3 loop) will be modeled within a root–mean square deviation threshold. It also flags structural motifs on the model that are known to cause issues during in vitro development. ABodyBuilder was tested on 4 separate datasets, including the 11 antibodies from the Antibody Modeling Assessment–II competition. ABodyBuilder builds models that are of similar quality to other methodologies, with sub–Angstrom predictions for the ‘canonical’ CDR loops. Its ability to model nanobodies, and rapidly generate models (∼30 seconds per model) widens its potential usage. ABodyBuilder can also help users in decision–making for the development of novel antibodies because it provides model confidence and potential sequence liabilities. ABodyBuilder is freely available at http://opig.stats.ox.ac.uk/webapps/abodybuilder. PMID:27392298

  3. Affect systems, changes in body mass index, disordered eating and stress: an 18-month longitudinal study in women

    PubMed Central

    Kupeli, N.; Norton, S.; Chilcot, J.; Campbell, I. C.; Schmidt, U. H.; Troop, N. A.

    2017-01-01

    ABSTRACT Background: Evidence suggests that stress plays a role in changes in body weight and disordered eating. The present study examined the effect of mood, affect systems (attachment and social rank) and affect regulatory processes (self-criticism, self-reassurance) on the stress process and how this impacts on changes in weight and disordered eating. Methods: A large sample of women participated in a community-based prospective, longitudinal online study in which measures of body mass index (BMI), disordered eating, perceived stress, attachment, social rank, mood and self-criticism/reassurance were measured at 6-monthly intervals over an 18-month period. Results: Latent Growth Curve Modelling showed that BMI increased over 18 months while stress and disordered eating decreased and that these changes were predicted by high baseline levels of these constructs. Independently of this, however, increases in stress predicted a reduction in BMI which was, itself, predicted by baseline levels of self-hatred and unfavourable social comparison. Conclusions: This study adds support to the evidence that stress is important in weight change. In addition, this is the first study to show in a longitudinal design, that social rank and self-criticism (as opposed to self-reassurance) at times of difficulty predict increases in stress and, thus, suggests a role for these constructs in weight regulation. PMID:28553564

  4. Motor Skill Competence and Perceived Motor Competence: Which Best Predicts Physical Activity among Girls?

    PubMed Central

    Khodaverdi, Zeinab; Bahram, Abbas; Khalaji, Hassan; Kazemnejad, Anoshirvan

    2013-01-01

    Abstract Background The main purpose of this study was to determine which correlate, perceived motor competence or motor skill competence, best predicts girls’ physical activity behavior. Methods A sample of 352 girls (mean age = 8.7 yr, SD = 0.3) participated in this study. To assess motor skill competence and perceived motor competence, each child completed the Test of Gross Motor Development-2 and the Physical Ability sub-scale of Marsh’s Self-Description Questionnaire. Children’s physical activity was assessed by the Physical Activity Questionnaire for Older Children. A multiple linear regression model was used to determine whether perceived motor competence or motor skill competence better predicts moderate-to-vigorous self-reported physical activity. Results Multiple regression analysis indicated that motor skill competence and perceived motor competence together predicted 21% of the variance in physical activity (R2=0.21, F=48.9, P=0.001), with motor skill competence (R2=0.15, β=0.33, P=0.001) accounting for more of the variance than perceived motor competence (R2=0.06, β=0.25, P=0.001). Conclusion Results revealed that motor skill competence had more influence on physical activity level than perceived motor competence. We suggest that interventional programs based on motor skill competence and perceived motor competence be implemented to promote physical activity in young girls. PMID:26060623

  5. Use of risk assessment instruments to predict violence and antisocial behaviour in 73 samples involving 24 827 people: systematic review and meta-analysis

    PubMed Central

    Singh, Jay P; Doll, Helen; Grann, Martin

    2012-01-01

    Objective To investigate the predictive validity of tools commonly used to assess the risk of violence, sexual, and criminal behaviour. Design Systematic review and tabular meta-analysis of replication studies following PRISMA guidelines. Data sources PsycINFO, Embase, Medline, and United States Criminal Justice Reference Service Abstracts. Review methods We included replication studies from 1 January 1995 to 1 January 2011 if they provided contingency data for the offending outcome that the tools were designed to predict. We calculated the diagnostic odds ratio, sensitivity, specificity, area under the curve, positive predictive value, negative predictive value, the number needed to detain to prevent one offence, as well as a novel performance indicator—the number safely discharged. We investigated potential sources of heterogeneity using metaregression and subgroup analyses. Results Risk assessments were conducted on 73 samples comprising 24 847 participants from 13 countries, of whom 5879 (23.7%) offended over an average of 49.6 months. When used to predict violent offending, risk assessment tools produced low to moderate positive predictive values (median 41%, interquartile range 27-60%) and higher negative predictive values (91%, 81-95%), and a corresponding median number needed to detain of 2 (2-4) and number safely discharged of 10 (4-18). Instruments designed to predict violent offending performed better than those aimed at predicting sexual or general crime. Conclusions Although risk assessment tools are widely used in clinical and criminal justice settings, their predictive accuracy varies depending on how they are used. They seem to identify low risk individuals with high levels of accuracy, but their use as sole determinants of detention, sentencing, and release is not supported by the current evidence. Further research is needed to examine their contribution to treatment and management. PMID:22833604
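
    The battery of indicators reported in this entry all derive from a 2×2 contingency table of predicted risk versus observed offending. A minimal sketch follows; the standard formulas for sensitivity, specificity, PPV, NPV, and the diagnostic odds ratio are used, while treating "number needed to detain" as 1/PPV and "number safely discharged" as true negatives per false negative is this sketch's reading of the abstract, not the paper's verbatim definitions. The counts are invented.

```python
def risk_tool_metrics(tp, fp, fn, tn):
    """Performance indicators from a 2x2 table: tp = high-risk & offended,
    fp = high-risk & did not, fn = low-risk & offended, tn = low-risk & did not."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": ppv,
        "npv": npv,
        "diagnostic_odds_ratio": (tp * tn) / (fp * fn),
        # Assumed readings of the abstract's two count-based indicators:
        "number_needed_to_detain": 1 / ppv,   # detained per offence prevented
        "number_safely_discharged": tn / fn,  # discharged per offender released
    }

# Invented example counts, not data from the review.
m = risk_tool_metrics(tp=40, fp=60, fn=10, tn=90)
```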

  6. Human Purposive Movement Theory

    DTIC Science & Technology

    2012-03-01

    The report presents human purposive movement theory and provides examples of developmental and operational technologies that could use this theory in common settings. Subject terms: human activity, prediction of behavior, human algorithms, purposive movement theory.

  7. MODELING DEPOSITION OF INHALED PARTICLES

    EPA Science Inventory

    Modeling Deposition of Inhaled Particles: ABSTRACT

    The mathematical modeling of the deposition and distribution of inhaled aerosols within human lungs is an invaluable tool in predicting both the health risks associated with inhaled environmental aerosols and the therapeut...

  8. A crowdsourcing workflow for extracting chemical-induced disease relations from free text

    PubMed Central

    Li, Tong Shu; Bravo, Àlex; Furlong, Laura I.; Good, Benjamin M.; Su, Andrew I.

    2016-01-01

    Relations between chemicals and diseases are one of the most queried biomedical interactions. Although expert manual curation is the standard method for extracting these relations from the literature, it is expensive and impractical to apply to large numbers of documents, and therefore alternative methods are required. We describe here a crowdsourcing workflow for extracting chemical-induced disease relations from free text as part of the BioCreative V Chemical Disease Relation challenge. Five non-expert workers on the CrowdFlower platform were shown each potential chemical-induced disease relation highlighted in the original source text and asked to make binary judgments about whether the text supported the relation. Worker responses were aggregated through voting, and relations receiving four or more votes were predicted as true. On the official evaluation dataset of 500 PubMed abstracts, the crowd attained a 0.505 F-score (0.475 precision, 0.540 recall), with a maximum theoretical recall of 0.751 due to errors with named entity recognition. The total crowdsourcing cost was $1290.67 ($2.58 per abstract) and took a total of 7 h. A qualitative error analysis revealed that 46.66% of sampled errors were due to task limitations and gold standard errors, indicating that performance can still be improved. All code and results are publicly available at https://github.com/SuLab/crowd_cid_relex Database URL: https://github.com/SuLab/crowd_cid_relex PMID:27087308
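
    The voting rule this entry describes (five workers per candidate relation, predicted true at four or more positive judgments) and the precision/recall/F-score evaluation can be sketched directly; the relation identifiers, votes, and gold set below are invented for illustration.

```python
def aggregate_crowd(judgments, threshold=4):
    """judgments: {relation_id: list of binary worker votes}.
    Returns the set of relations with >= threshold positive votes."""
    return {rel for rel, votes in judgments.items() if sum(votes) >= threshold}

def precision_recall_f1(predicted, gold):
    """Standard set-based precision, recall, and F1."""
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if tp else 0.0
    return precision, recall, f1

# Invented worker judgments for three candidate relations.
judgments = {
    "chem1-disease1": [1, 1, 1, 1, 0],  # 4 votes -> predicted true
    "chem2-disease2": [1, 1, 0, 0, 0],  # 2 votes -> predicted false
    "chem3-disease3": [1, 1, 1, 1, 1],  # 5 votes -> predicted true
}
predicted = aggregate_crowd(judgments)
gold = {"chem1-disease1", "chem2-disease2"}  # invented gold standard
p, r, f1 = precision_recall_f1(predicted, gold)
```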

  9. Mining patterns in persistent surveillance systems with smart query and visual analytics

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.; Shirkhodaie, Amir

    2013-05-01

    In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps take pre-emptive steps to counter an adversary's actions. The Interactive Visual Analytic (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. The need to identify and offset these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision, and require filtering before being processed further. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of arrays of attributes extracted from semantically annotated messages generated by the sensors. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a temporal Dynamic Time Warping (DTW) method with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is intended for detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. We also present a new visual analytic tool for testing and evaluating group activities detected under this scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
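
    The similarity-matching step in this entry rests on dynamic time warping. A minimal sketch of the DTW distance on scalar sequences follows; the paper's version operates on high-dimensional semantic feature vectors and adds GMM/EM probability estimates, which are omitted here.

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two sequences,
    using absolute difference as the local cost."""
    INF = float("inf")
    n, m = len(a), len(b)
    # D[i][j] = cost of best alignment of a[:i] and b[:j].
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: insertion, deletion, or match.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

    Unlike Euclidean distance, DTW tolerates local stretching: a repeated element in one sequence aligns against a single element in the other at zero extra cost.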

  10. Improved Ant Colony Clustering Algorithm and Its Performance Study

    PubMed Central

    Gao, Wei

    2016-01-01

    Clustering analysis is used in many disciplines and applications; it is an important tool that descriptively identifies homogeneous groups of objects based on attribute values. The ant colony clustering algorithm is a swarm-intelligent method used for clustering problems that is inspired by the behavior of ant colonies that cluster their corpses and sort their larvae. A new abstraction ant colony clustering algorithm using a data combination mechanism is proposed to improve the computational efficiency and accuracy of the ant colony clustering algorithm. The abstraction ant colony clustering algorithm is used to cluster benchmark problems, and its performance is compared with the ant colony clustering algorithm and other methods used in existing literature. Based on similar computational difficulties and complexities, the results show that the abstraction ant colony clustering algorithm produces results that are not only more accurate but also more efficiently determined than the ant colony clustering algorithm and the other methods. Thus, the abstraction ant colony clustering algorithm can be used for efficient multivariate data clustering. PMID:26839533

  11. Non-Fourier based thermal-mechanical tissue damage prediction for thermal ablation

    PubMed Central

    Li, Xin; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2017-01-01

    ABSTRACT Prediction of tissue damage under thermal loads plays an important role in thermal ablation planning. A new methodology is presented in this paper that combines non-Fourier bio-heat transfer, constitutive elastic mechanics, and non-rigid motion dynamics to predict and analyze the thermal distribution, thermally induced mechanical deformation, and thermal-mechanical damage of soft tissues under thermal loads. Simulations and comparative analysis demonstrate that the proposed methodology, based on non-Fourier bio-heat transfer, can account for the thermally induced mechanical behaviors of soft tissues and predict tissue thermal damage more accurately than the classical Fourier bio-heat transfer based model. PMID:27690290

  12. USSR and Eastern Europe Scientific Abstracts, Materials Science and Metallurgy, Number 45

    DTIC Science & Technology

    1977-05-11

    [Garbled fragments of translated Soviet materials-science abstracts; recoverable content:] Values of the critical stress intensity factor obtained by the authors' indirect method are compared with the constants VQ and q. Another item, by Terekhov, A. N., of the Moscow Institute of Steel and Alloys (Russian abstract provided by the source), concerns a high-temperature method applied near the melting point (references: 9, all Russian). A further item (Kiev, UDC 539) addresses improving the precision of the acoustic method of stress determination.

  13. Theoretical Kinetics Analysis for $$\\dot{H}$$ Atom Addition to 1,3-Butadiene and Related Reactions on the $$\\dot{C}$$4H7 Potential Energy Surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yang; Klippenstein, Stephen J.; Zhou, Chong-Wen

    The oxidation chemistry of the simplest conjugated hydrocarbon, 1,3-butadiene, can provide a first step in understanding the role of poly-unsaturated hydrocarbons in combustion and, in particular, an understanding of their contribution towards soot formation. Based on our previous work on propene and the butene isomers (1-, 2- and isobutene), it was found that the reaction kinetics of H-atom addition to the C=C double bond plays a significant role in fuel consumption kinetics and influences the predictions of high-temperature ignition delay times, product species concentrations and flame speed measurements. In this study, the rate constants and thermodynamic properties for $$\\dot{H}$$-atom addition to 1,3-butadiene and related reactions on the $$\\dot{C}$$4H7 potential energy surface have been calculated using two different series of quantum chemical methods and two different kinetic codes. Excellent agreement is obtained between the two kinetics codes. The calculated results, including zero-point energies, single-point energies, rate constants, barrier heights and thermochemistry, are systematically compared between the two quantum chemical methods. The 1-methylallyl ($$\\dot{C}$$4H71-3) and 3-buten-1-yl ($$\\dot{C}$$4H71-4) radicals and C2H4 + $$\\dot{C}$$2H3 are found to be the most important channels and reactivity-promoting products, respectively. We calculated that terminal addition is dominant (> 80%) compared to internal $$\\dot{H}$$-atom addition at all temperatures in the range 298–2000 K. However, this dominance decreases with increasing temperature. The calculated rate constants for the bimolecular reactions C4H6 + $$\\dot{H}$$ → products and C2H4 + $$\\dot{C}$$2H3 → products are in excellent agreement with both experimental and theoretical results from the literature. For selected C4 species the calculated thermochemical values are also in good agreement with literature data.
In addition, the rate constants for H-atom abstraction by $$\\dot{H}$$ atoms have also been calculated, and it is found that abstraction from the central carbon atoms is the dominant channel (> 70%) at temperatures in the range 298–2000 K. Lastly, by incorporating our calculated rate constants for both H-atom addition and abstraction into our recently developed 1,3-butadiene model, we show that laminar flame speed predictions are significantly improved, emphasizing the value of this study.

  14. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    PubMed Central

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  15. An initial-abstraction, constant-loss model for unit hydrograph modeling for applicable watersheds in Texas

    USGS Publications Warehouse

    Asquith, William H.; Roussel, Meghan C.

    2007-01-01

    Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001–07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles. 
The analysis is limited to a previously described, watershed-specific, gamma distribution model of the unit hydrograph. In particular, the initial-abstraction, constant-loss model is tuned to the gamma distribution model of the unit hydrograph. A complex computational analysis of observed rainfall and runoff for the 92 watersheds was done to determine, by storm, optimal values of initial abstraction and constant loss. Optimal parameter values for a given storm were defined as those values that produced a modeled runoff hydrograph with volume equal to the observed runoff hydrograph and also minimized the residual sum of squares of the two hydrographs. Subsequently, the means of the optimal parameters were computed on a watershed-specific basis. These means for each watershed are considered the most representative, are tabulated, and are used in further statistical analyses. Statistical analyses of watershed-specific, initial abstraction and constant loss include documentation of the distribution of each parameter using the generalized lambda distribution. The analyses show that watershed development has substantial influence on initial abstraction and limited influence on constant loss. The means and medians of the 92 watershed-specific parameters are tabulated with respect to watershed development; although they have considerable uncertainty, these parameters can be used for parameter prediction for ungaged watersheds. The statistical analyses of watershed-specific, initial abstraction and constant loss also include development of predictive procedures for estimation of each parameter for ungaged watersheds. Both regression equations and regression trees for estimation of initial abstraction and constant loss are provided. 
The watershed characteristics included in the regression analyses are (1) main-channel length, (2) a binary factor representing watershed development, (3) a binary factor representing watersheds with an abundance of rocky and thin-soiled terrain, and (4) curve numb
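
    The two-parameter watershed-loss model this report develops lends itself to a short sketch: rainfall first fills the initial-abstraction depth, after which per-step rainfall in excess of the constant loss becomes runoff-producing. The function name and the per-time-step discretization (applying the constant loss to whatever depth remains within the step that exhausts the abstraction) are this sketch's choices, not the report's.

```python
def excess_rainfall(rainfall, initial_abstraction, constant_loss):
    """rainfall: list of depths per time step (same depth units for all
    three arguments). Returns the excess (runoff-producing) depth per step."""
    remaining_ia = initial_abstraction
    excess = []
    for depth in rainfall:
        # Fill the initial abstraction first; it produces no runoff.
        absorbed = min(depth, remaining_ia)
        remaining_ia -= absorbed
        # Remaining rainfall beyond the constant loss rate becomes excess.
        excess.append(max(depth - absorbed - constant_loss, 0.0))
    return excess
```

    With an abstraction depth of 1.0 and a constant loss of 0.3 per step, the series [0.5, 1.0, 2.0, 0.2] yields excess only in the steps after the first unit of depth has been stored and only where intensity exceeds the loss rate.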

  16. Comparison of Two Mnemonic Encoding Strategies on Children's Recognition and Recall of Abstract Prose Information.

    ERIC Educational Resources Information Center

    Shriberg, Linda K.

    As an extension of an earlier investigation that examined the effects of mnemonic strategy application on children's memory for abstract prose passages, a study compared the benefits accrued by students taught two different variations of the mnemonic keyword method for learning abstract prose information, via tasks of associative recognition and…

  17. Construction of High School Students' Abstraction Levels in Understanding the Concept of Quadrilaterals

    ERIC Educational Resources Information Center

    Budiarto, Mega Teguh; Khabibah, Siti; Setianingsih, Rini

    2017-01-01

    The purpose of this study was to examine the abstraction thinking or the vertical reorganization activity of mathematical concepts of high school students while taking account of the abstraction that was constructed earlier, and the socio-cultural background. This study was qualitative in nature with task-based interviews as the method of…

  18. Multiple Grammars and the Logic of Learnability in Second Language Acquisition

    PubMed Central

    Roeper, Tom W.

    2016-01-01

    The core notion of modern Universal Grammar is that language ability requires abstract representation in terms of hierarchy, movement operations, abstract features on words, and fixed mapping to meaning. These mental structures are a step toward integrating representational knowledge of all kinds into a larger model of cognitive psychology. Examining first and second language at once provides clues as to how abstractly we should represent this knowledge. The abstract nature of grammar allows both the formulation of many grammars and the possibility that a rule of one grammar could apply to another grammar. We argue that every language contains Multiple Grammars which may reflect different language families. We develop numerous examples of how the same abstract rules can apply in various languages and develop a theory of how language modules (case-marking, topicalization, and quantification) interact to predict L2 acquisition paths. In particular we show in depth how Germanic Verb-second operations, based on Verb-final structure, can apply in English. The argument is built around how and where V2 from German can apply in English, seeking to explain the crucial contrast: “nothing” yelled out Bill/*“nothing” yelled Bill out in terms of the necessary abstractness of the V2 rule. PMID:26869945

  19. Use of Recommended Search Strategies in Systematic Reviews and the Impact of Librarian Involvement: A Cross-Sectional Survey of Recent Authors

    PubMed Central

    Koffel, Jonathan B.

    2015-01-01

    Background Previous research looking at published systematic reviews has shown that their search strategies are often suboptimal and that librarian involvement, though recommended, is low. Confidence in these results, however, is limited due to poor reporting of search strategies in the published articles. Objectives To more accurately measure the use of recommended search methods in systematic reviews, the level of librarian involvement, and whether librarian involvement predicts the use of recommended methods. Methods A survey was sent to all authors of English-language systematic reviews indexed in the Database of Abstracts of Reviews of Effects (DARE) from January 2012 through January 2014. The survey asked about their use of search methods recommended by the Institute of Medicine, Cochrane Collaboration, and the Agency for Healthcare Research and Quality, and if and how a librarian was involved in the systematic review. Rates of use of recommended methods and of librarian involvement were summarized. The impact of librarian involvement on use of recommended methods was examined using a multivariate logistic regression. Results 1560 authors completed the survey. Use of recommended search methods ranged widely, from 98% for use of keywords to 9% for registration in PROSPERO, and was generally higher than in previous studies. 51% of studies involved a librarian, but only 64% of these acknowledged their assistance. Librarian involvement was significantly associated with the use of 65% of the recommended search methods after controlling for other potential predictors. Odds ratios ranged from 1.36 (95% CI 1.06 to 1.75) for including multiple languages to 3.07 (95% CI 2.06 to 4.58) for using controlled vocabulary. Conclusions Use of recommended search strategies is higher than previously reported, but many methods are still under-utilized. Librarian involvement predicts the use of most methods, but their involvement is under-reported within the published articles. PMID:25938454

  20. Nouns, verbs, objects, actions, and abstractions: Local fMRI activity indexes semantics, not lexical categories

    PubMed Central

    Moseley, Rachel L.; Pulvermüller, Friedemann

    2014-01-01

    Noun/verb dissociations in the literature defy interpretation due to the confound between lexical category and semantic meaning; nouns and verbs typically describe concrete objects and actions. Abstract words, pertaining to neither, are a critical test case: dissociations along lexical-grammatical lines would support models purporting lexical category as the principle governing brain organisation, whilst semantic models predict dissociation between concrete words but not abstract items. During fMRI scanning, participants read orthogonalised word categories of nouns and verbs, with or without concrete, sensorimotor meaning. Analysis of inferior frontal/insula, precentral and central areas revealed an interaction between lexical class and semantic factors with clear category differences between concrete nouns and verbs but not abstract ones. Though the brain stores the combinatorial and lexical-grammatical properties of words, our data show that topographical differences in brain activation, especially in the motor system and inferior frontal cortex, are driven by semantics and not by lexical class. PMID:24727103

  2. Pacifier Overuse and Conceptual Relations of Abstract and Emotional Concepts

    PubMed Central

    Barca, Laura; Mazzuca, Claudia; Borghi, Anna M.

    2017-01-01

    This study explores the impact of the extensive use of an oral device since infancy (pacifier) on the acquisition of concrete, abstract, and emotional concepts. While recent evidence showed a negative relation between pacifier use and children's emotional competence (Niedenthal et al., 2012), the possible interaction between pacifier use and the processing of emotional and abstract language has not been investigated. According to recent theories, while all concepts are grounded in sensorimotor experience, abstract concepts activate linguistic and social information more than concrete ones. Specifically, the Words As Social Tools (WAT) proposal predicts that the simulation of their meaning leads to an activation of the mouth (Borghi and Binkofski, 2014; Borghi and Zarcone, 2016). Since the pacifier affects facial mimicry by forcing the mouth muscles into a static position, we hypothesized that it would interfere more with the acquisition/consolidation of abstract emotional and abstract not-emotional concepts, which are mainly conveyed during social and linguistic interactions, than with that of concrete concepts. Fifty-nine first-grade children, with histories of different frequencies of pacifier use, provided oral definitions of the meaning of abstract not-emotional, abstract emotional, and concrete words. A main effect of concept type emerged, with higher accuracy in defining concrete and abstract emotional concepts than abstract not-emotional concepts, independently of pacifier use. Accuracy in definitions was not influenced by pacifier use, but correspondence and hierarchical clustering analyses suggest that pacifier use differently modulates the conceptual relations elicited by abstract emotional and abstract not-emotional concepts. 
While the majority of the children produced a similar pattern of conceptual relations, analyses of the few children (6) who overused the pacifier (for more than 3 years) showed that they tend to distinguish less clearly between concrete and abstract emotional concepts, and between concrete and abstract not-emotional concepts, than children who did not use it (5) or used it for a short period (17). As to the conceptual relations they produced, children who overused the pacifier tended to refer less to their own experience and to social and emotional situations, to use more exemplifications and functional relations, and to produce fewer free associations. PMID:29250003

  3. Pacifier Overuse and Conceptual Relations of Abstract and Emotional Concepts.

    PubMed

    Barca, Laura; Mazzuca, Claudia; Borghi, Anna M

    2017-01-01

    This study explores the impact of the extensive use of an oral device since infancy (pacifier) on the acquisition of concrete, abstract, and emotional concepts. While recent evidence showed a negative relation between pacifier use and children's emotional competence (Niedenthal et al., 2012), the possible interaction between pacifier use and the processing of emotional and abstract language has not been investigated. According to recent theories, while all concepts are grounded in sensorimotor experience, abstract concepts activate linguistic and social information more than concrete ones. Specifically, the Words As Social Tools (WAT) proposal predicts that the simulation of their meaning leads to an activation of the mouth (Borghi and Binkofski, 2014; Borghi and Zarcone, 2016). Since the pacifier affects facial mimicry by forcing the mouth muscles into a static position, we hypothesized that it would interfere more with the acquisition/consolidation of abstract emotional and abstract not-emotional concepts, which are mainly conveyed during social and linguistic interactions, than with that of concrete concepts. Fifty-nine first-grade children, with histories of different frequencies of pacifier use, provided oral definitions of the meaning of abstract not-emotional, abstract emotional, and concrete words. A main effect of concept type emerged, with higher accuracy in defining concrete and abstract emotional concepts than abstract not-emotional concepts, independently of pacifier use. Accuracy in definitions was not influenced by pacifier use, but correspondence and hierarchical clustering analyses suggest that pacifier use differently modulates the conceptual relations elicited by abstract emotional and abstract not-emotional concepts. 
While the majority of the children produced a similar pattern of conceptual relations, analyses of the few children (6) who overused the pacifier (for more than 3 years) showed that they tend to distinguish less clearly between concrete and abstract emotional concepts, and between concrete and abstract not-emotional concepts, than children who did not use it (5) or used it for a short period (17). As to the conceptual relations they produced, children who overused the pacifier tended to refer less to their own experience and to social and emotional situations, to use more exemplifications and functional relations, and to produce fewer free associations.

  4. Quality of reporting of trial abstracts needs to be improved: using the CONSORT for abstracts to assess the four leading Chinese medical journals of traditional Chinese medicine

    PubMed Central

    2010-01-01

    Background Due to language limitations, the abstract of a journal article may be the only way for people in non-Chinese-speaking countries to learn about trials in traditional Chinese medicine (TCM). However, little is known about the reporting quality of these trial abstracts. Our study aimed to assess the reporting quality of abstracts of randomized controlled trials (RCTs) published in four leading Chinese medical journals of TCM, and to identify any differences in reporting between the Chinese and English versions of the same abstract. Method Two reviewers hand-searched the Chinese Journal of Integrated Traditional and Western Medicine, the Chinese Journal of Integrative Medicine, the China Journal of Chinese Materia Medica and the Chinese Acupuncture & Moxibustion for all abstracts of RCTs published between 2006 and 2007. Two reviewers independently assessed the reporting quality of the Chinese and English versions of all eligible abstracts based on a modified version of the CONSORT for reporting randomised trials in journal and conference abstracts (CONSORT for abstracts). Results We identified a total of 345 RCTs of TCM with both a Chinese and an English abstract. More than half of the Chinese abstracts reported details of the trial participants (68%; 234/345), control group intervention (52%; 179/345), the number of participants randomized (73%; 253/345) and benefits when interpreting the trial results (55%; 190/345). Reporting of methodological quality or key features of trial design and trial results was poor; only 2% (7/345) included details of the trial design, 3% (11/345) defined the primary outcome, 5% (17/345) described the methods of random sequence generation, and only 4% (13/345) reported the number of participants analyzed. No abstracts provided details on allocation concealment or trial registration. 
The percentage agreement in reporting (between the Chinese and English versions of the same abstract) ranged from 84% to 100% across individual checklist items. Conclusion The reporting quality of abstracts of RCTs published in these four TCM journals needs to be improved. Since none of the four journals has adopted the CONSORT for Abstracts, we hope that its introduction and adoption by TCM journals will lead to an improvement in reporting quality. PMID:20615225

  5. Prediction of citation counts for clinical articles at two years using data available within three weeks of publication: retrospective cohort study

    PubMed Central

    2008-01-01

    Objective To determine if citation counts at two years could be predicted for clinical articles that pass basic criteria for critical appraisal using data within three weeks of publication from external sources and an online article rating service. Design Retrospective cohort study. Setting Online rating service, Canada. Participants 1274 articles from 105 journals published from January to June 2005, randomly divided into a 60:40 split to provide derivation and validation datasets. Main outcome measures 20 article and journal features, including ratings of clinical relevance and newsworthiness, routinely collected by the McMaster online rating of evidence system, compared with citation counts at two years. Results The derivation analysis showed that the regression equation accounted for 60% of the variation (R²=0.60, 95% confidence interval 0.538 to 0.629). This model applied to the validation dataset gave a similar prediction (R²=0.56, 0.476 to 0.596, shrinkage 0.04; shrinkage measures how well the derived equation matches data from the validation dataset). Cited articles in the top half and top third were predicted with 83% and 61% sensitivity and 72% and 82% specificity. Higher citations were predicted by indexing in numerous databases; number of authors; abstraction in synoptic journals; clinical relevance scores; number of cited references; and original, multicentred, and therapy articles from journals with a greater proportion of articles abstracted. Conclusion Citation counts can be reliably predicted at two years using data within three weeks of publication. PMID:18292132
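
The derivation/validation procedure this record describes can be illustrated with a minimal ordinary-least-squares sketch. The data below are synthetic and use a single predictor (the study used 20 article and journal features); R² is computed on the 60% derivation set, the fitted equation is re-scored on the 40% validation set, and shrinkage is the difference between the two R² values.

```python
# Minimal sketch of a derivation/validation split with shrinkage.
# Data are synthetic; one predictor stands in for the study's 20 features.
import random

def ols_fit(xs, ys):
    """Least-squares slope and intercept for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

def r_squared(xs, ys, slope, intercept):
    """Proportion of variance explained by the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Synthetic "feature score" vs "citations at two years", 60:40 split.
random.seed(1)
data = [(x, 2.0 * x + random.gauss(0, 5)) for x in range(100)]
random.shuffle(data)
derivation, validation = data[:60], data[60:]

slope, intercept = ols_fit(*zip(*derivation))
r2_deriv = r_squared(*zip(*derivation), slope, intercept)
r2_valid = r_squared(*zip(*validation), slope, intercept)
shrinkage = r2_deriv - r2_valid  # small shrinkage = equation generalizes well
```

A small shrinkage value, as in the study (0.04), indicates the derived equation fits unseen articles almost as well as the articles it was derived from.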

  6. Are groundwater nitrate concentrations reaching a turning point in some chalk aquifers?

    PubMed

    Smith, J T; Clarke, R T; Bowes, M J

    2010-09-15

    In past decades, there has been much scientific effort dedicated to the development of models for simulation and prediction of nitrate concentrations in groundwaters, but producing truly predictive models remains a major challenge. A time-series model, based on long-term variations in nitrate fertiliser applications and average rainfall, was calibrated against measured concentrations from five boreholes in the River Frome catchment of Southern England for the period spanning from the mid-1970s to 2003. The model was then used to "blind" predict nitrate concentrations for the period 2003-2008. To our knowledge, this represents the first "blind" test of a model for predicting nitrate concentrations in aquifers. It was found that relatively simple time-series models could explain and predict a significant proportion of the variation in nitrate concentrations in these groundwater abstraction points (R²=0.6-0.9 and mean absolute prediction errors 4.2-8.0%). The study highlighted some important limitations and uncertainties in this, and other modelling approaches, in particular regarding long-term nitrate fertiliser application data. In three of the five groundwater abstraction points (Hooke, Empool and Eagle Lodge), once seasonal variations were accounted for, there was a recent change in the generally upward historical trend in nitrate concentrations. This may be an early indication of a response to levelling-off (and declining) fertiliser application rates since the 1980s. There was no clear indication of trend change at the Forston and Winterbourne Abbas sites nor in the trend of nitrate concentration in the River Frome itself from 1965 to 2008. Copyright 2010 Elsevier B.V. All rights reserved.
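
The "blind" evaluation this record reports can be sketched as follows: a model calibrated on historical data is scored on later, unseen observations using the mean absolute prediction error expressed as a percentage. The concentration values below are illustrative, not the Frome borehole data.

```python
# Sketch of scoring a blind prediction with mean absolute percentage error.
# Observed/predicted values are hypothetical, not the study's data.

def mean_absolute_percentage_error(observed, predicted):
    """Average of |observed - predicted| / observed, as a percentage."""
    assert len(observed) == len(predicted)
    return 100.0 * sum(abs(o - p) / o
                       for o, p in zip(observed, predicted)) / len(observed)

observed  = [6.1, 6.3, 6.2, 6.5, 6.4]   # hypothetical mg N/L, held-out years
predicted = [6.0, 6.2, 6.4, 6.3, 6.6]   # model output for the same years

mape = mean_absolute_percentage_error(observed, predicted)
```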

  7. Chromatin accessibility prediction via convolutional long short-term memory networks with k-mer embedding

    PubMed Central

    Min, Xu; Zeng, Wanwen; Chen, Ning; Chen, Ting; Jiang, Rui

    2017-01-01

    Abstract Motivation: Experimental techniques for measuring chromatin accessibility are expensive and time consuming, motivating the development of computational approaches to predict open chromatin regions from DNA sequences. Along this direction, existing methods fall into two classes: one based on handcrafted k-mer features and the other based on convolutional neural networks. Although both categories have shown good performance in specific applications thus far, a comprehensive framework that integrates useful k-mer co-occurrence information with recent advances in deep learning is still lacking. Results: We fill this gap by addressing the problem of chromatin accessibility prediction with a convolutional Long Short-Term Memory (LSTM) network with k-mer embedding. We first split DNA sequences into k-mers and pre-train k-mer embedding vectors based on the co-occurrence matrix of k-mers by using an unsupervised representation learning approach. We then construct a supervised deep learning architecture comprising an embedding layer, three convolutional layers and a Bidirectional LSTM (BLSTM) layer for feature learning and classification. We demonstrate that our method gains high-quality fixed-length features from variable-length sequences and consistently outperforms baseline methods. By exploring different embedding strategies, we show that k-mer embedding can effectively enhance model performance. We also demonstrate the efficacy of both the convolution and the BLSTM layers by comparing two variations of the network architecture. We confirm the robustness of our model to hyper-parameters by performing sensitivity analysis. We hope our method can eventually reinforce our understanding of employing deep learning in genomic studies and shed light on research regarding mechanisms of chromatin accessibility. Availability and implementation: The source code can be downloaded from https://github.com/minxueric/ismb2017_lstm. 
Contact: tingchen@tsinghua.edu.cn or ruijiang@tsinghua.edu.cn Supplementary information: Supplementary materials are available at Bioinformatics online. PMID:28881969
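
The pre-processing step this abstract describes, splitting DNA sequences into overlapping k-mers and counting k-mer co-occurrences from which embeddings are pre-trained, can be sketched in a few lines. The embedding training itself and the conv/BLSTM classifier are omitted; the windowed counting below is a simplified stand-in for the study's co-occurrence matrix construction.

```python
# Sketch: split a DNA sequence into overlapping k-mers, then count k-mer
# pairs co-occurring within a fixed context window. The pre-trained
# embeddings and the conv/BLSTM network that consume this matrix are omitted.
from collections import Counter

def kmers(seq, k):
    """All overlapping substrings of length k."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def cooccurrence(seq, k, window):
    """Counts of k-mer pairs appearing within `window` positions of each other."""
    toks = kmers(seq, k)
    counts = Counter()
    for i, a in enumerate(toks):
        for b in toks[i + 1:i + 1 + window]:
            counts[tuple(sorted((a, b)))] += 1
    return counts

toks = kmers("ACGTAC", 3)                     # 3-mers of a short sequence
matrix = cooccurrence("ACGTAC", 3, window=2)  # sparse co-occurrence counts
```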

  8. Model studies of hydrogen atom addition and abstraction processes involving ortho-, meta-, and para-benzynes.

    PubMed

    Clark, A E; Davidson, E R

    2001-10-31

    H-atom addition and abstraction processes involving ortho-, meta-, and para-benzyne have been investigated by multiconfigurational self-consistent field methods. The H(A) + H(B)···H(C) reaction (where r(BC) is adjusted to mimic the appropriate singlet-triplet energy gap) is shown to effectively model H-atom addition to benzyne. The doublet multiconfiguration wave functions are shown to mix the "singlet" and "triplet" valence bond structures of H(B)···H(C) along the reaction coordinate; however, the extent of mixing is dependent on the singlet-triplet energy gap (ΔE(ST)) of the H(B)···H(C) diradical. Early in the reaction, the ground-state wave function is essentially the "singlet" VB function, yet it gains significant "triplet" VB character along the reaction coordinate that allows H(A)-H(B) bond formation. Conversely, the wave function of the first excited state is predominantly the "triplet" VB configuration early in the reaction coordinate, but gains "singlet" VB character when the H-atom is close to a radical center. As a result, the potential energy surface (PES) for H-atom addition to the triplet H(B)···H(C) diradical is repulsive! The H3 model predicts, in agreement with the actual calculations on benzyne, that the singlet diradical electrons are not coupled strongly enough to give rise to an activation barrier associated with C-H bond formation. Moreover, this model predicts that the PES for H-atom addition to triplet benzyne will be characterized by a repulsive curve early in the reaction coordinate, followed by a potential avoided crossing with the (π)¹(σ*)¹ state of the phenyl radical. In contrast to H-atom addition, large activation barriers characterize the abstraction process in both the singlet ground state and first triplet state. 
In the ground state, this barrier results from the weakly avoided crossing of the dominant VB configurations in the ground-state singlet (S0) and first excited singlet (S1) because of the large energy gap between S0 and S1 early in the reaction coordinate. Because the S1 state is best described as the combination of the triplet X-H bond and the triplet H(B)···H(C) spin couplings, the activation barrier along the S0 abstraction PES will have much less dependence on the ΔE(ST) of H(B)···H(C) than previously speculated. For similar reasons, the T1 potential surface is quite comparable to the S0 PES.

  9. Risk Prediction Models in Psychiatry: Toward a New Frontier for the Prevention of Mental Illnesses.

    PubMed

    Bernardini, Francesco; Attademo, Luigi; Cleary, Sean D; Luther, Charles; Shim, Ruth S; Quartesan, Roberto; Compton, Michael T

    2017-05-01

    We conducted a systematic, qualitative review of risk prediction models designed and tested for depression, bipolar disorder, generalized anxiety disorder, posttraumatic stress disorder, and psychotic disorders. Our aim was to understand the current state of research on risk prediction models for these 5 disorders and thus future directions as our field moves toward embracing prediction and prevention. Systematic searches of the entire MEDLINE electronic database were conducted independently by 2 of the authors (from 1960 through 2013) in July 2014 using defined search criteria. Search terms included risk prediction, predictive model, or prediction model combined with depression, bipolar, manic depressive, generalized anxiety, posttraumatic, PTSD, schizophrenia, or psychosis. We identified 268 articles based on the search terms and 3 criteria: published in English, provided empirical data (as opposed to review articles), and presented results pertaining to developing or validating a risk prediction model in which the outcome was the diagnosis of 1 of the 5 aforementioned mental illnesses. We selected 43 original research reports as a final set of articles to be qualitatively reviewed. The 2 independent reviewers abstracted 3 types of data (sample characteristics, variables included in the model, and reported model statistics) and reached consensus regarding any discrepant abstracted information. Twelve reports described models developed for prediction of major depressive disorder, 1 for bipolar disorder, 2 for generalized anxiety disorder, 4 for posttraumatic stress disorder, and 24 for psychotic disorders. Most studies reported on sensitivity, specificity, positive predictive value, negative predictive value, and area under the (receiver operating characteristic) curve. Recent studies demonstrate the feasibility of developing risk prediction models for psychiatric disorders (especially psychotic disorders). 
The field must now advance by (1) conducting more large-scale, longitudinal studies pertaining to depression, bipolar disorder, anxiety disorders, and other psychiatric illnesses; (2) replicating and carrying out external validations of proposed models; (3) further testing potential selective and indicated preventive interventions; and (4) evaluating effectiveness of such interventions in the context of risk stratification using risk prediction models. © Copyright 2017 Physicians Postgraduate Press, Inc.
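
The model statistics the reviewers abstracted (sensitivity, specificity, positive and negative predictive value) all derive from a 2×2 table of predicted versus actual diagnoses. A minimal sketch, with made-up counts purely for illustration:

```python
# Diagnostic statistics from a 2x2 confusion table of a risk prediction
# model. Counts below are illustrative, not from any reviewed study.

def diagnostic_stats(tp, fp, fn, tn):
    """Standard screening statistics from true/false positives/negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # P(predicted+ | disease+)
        "specificity": tn / (tn + fp),  # P(predicted- | disease-)
        "ppv":         tp / (tp + fp),  # P(disease+ | predicted+)
        "npv":         tn / (tn + fn),  # P(disease- | predicted-)
    }

stats = diagnostic_stats(tp=40, fp=10, fn=20, tn=130)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common the disorder is in the sample, which is one reason external validation on new populations (point 2 in the abstract) matters.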

  10. Development of the Rice Convection Model as a Space Weather Tool

    DTIC Science & Technology

    2015-05-31

    coupled to the ionosphere that is suitable for both scientific studies as well as a prediction tool. We are able to run the model faster than “real...of work by finding ways to fund a more systematic effort in making the RCM a space weather prediction tool for magnetospheric and ionospheric studies...convection electric field, total electron content, TEC, ionospheric convection, plasmasphere

  11. Predicting Suicide Attacks: Integrating Spatial, Temporal, and Social Features of Terrorist Attack Targets

    DTIC Science & Technology

    2013-01-01

    used in health and criminology to predict outcomes (Browning, Cagney, and Wen, 2003; Sampson, Morenoff, and Gannon-Rowley, 2002). For each...Bernard, H. Russell, and Gery W. Ryan, Analyzing Qualitative Data: Systematic Approaches, Thousand Oaks, Calif.: Sage, 2010. Berrebi, Claude, “Evidence...

  12. Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques

    DTIC Science & Technology

    2018-04-30

    Title: Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques Subject: Monthly Progress Report Period of...Resources: N/A TOTAL: $18,687 TECHNICAL STATUS REPORT Abstract The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program’s first month consisted of improvements to data processing code, inclusion of additional arctic sea ice

  13. Stochastic Prediction and Feedback Control of Router Queue Size in a Virtual Network Environment

    DTIC Science & Technology

    2014-09-18

    predictor equations, while the update equations for measurement can be thought of as corrector equations. 11 2.3.1.1 Predict Equations In the... Adaptive Filters and Self-Learning Systems. Springer London, 2005. [11] Zarchan, P., and Musoff, H. Fundamentals of Kalman filtering: A Practical...Abstract Modern congestion and routing management algorithms work well for networks with static topologies and moderate

  14. USSR and Eastern Europe Scientific Abstracts, Engineering and Equipment, Number 31

    DTIC Science & Technology

    1977-04-18

    average coefficient of air absorption is computed by the method of approximate replacement of the real spectrum by the graduated one. The entire range...end of transition area with an accuracy of 15%. Figures 5; References 7. USSR UDC 541.24:532.5 PARAMETRIC METHOD OF CALCULATION OF THERMODYNAMIC...12, 1976 Abstract No 12B723 by V. A. Polyanskiy] GLEBOV, G. A., and KOSHKIN, V. K. [Text] A method is presented for calculation of thermodynamic

  15. An Agent-Based Modeling Template for a Cohort of Veterans with Diabetic Retinopathy.

    PubMed

    Day, Theodore Eugene; Ravi, Nathan; Xian, Hong; Brugh, Ann

    2013-01-01

    Agent-based models are valuable for examining systems where large numbers of discrete individuals interact with each other, or with some environment. Diabetic Veterans seeking eye care at a Veterans Administration hospital represent one such cohort. The objective of this study was to develop an agent-based template to be used as a model for a patient with diabetic retinopathy (DR). This template may be replicated arbitrarily many times in order to generate a large cohort which is representative of a real-world population, upon which in silico experimentation may be conducted. Agent-based template development was performed in the Java-based computer simulation suite AnyLogic Professional 6.6. The model was informed by medical data abstracted from 535 patient records representing a retrospective cohort of current patients of the VA St. Louis Healthcare System Eye clinic. Logistic regression was performed to determine the predictors associated with advancing stages of DR. Predicted probabilities obtained from logistic regression were used to generate the stage of DR in the simulated cohort. The simulated cohort of DR patients exhibited no significant deviation from the test population of real-world patients in proportion of stage of DR, duration of diabetes mellitus (DM), or the other abstracted predictors. Simulated patients after 10 years were significantly more likely to exhibit proliferative DR (P<0.001). Agent-based modeling is an emerging platform, capable of simulating large cohorts of individuals based on manageable data abstraction efforts. The modeling method described may be useful in simulating many different conditions where course of disease is described in categorical stages.
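
The key template step, using logistic-regression predicted probabilities to assign a DR stage to each simulated patient, amounts to sampling from a categorical distribution. A minimal sketch; the stage labels and probabilities below are illustrative, not the values the study derived from its 535 abstracted records.

```python
# Sketch: draw a disease stage for each simulated patient from a
# categorical distribution of predicted probabilities. Stage labels and
# probabilities are illustrative assumptions, not the study's estimates.
import random

STAGES = ["none", "mild NPDR", "moderate NPDR", "severe NPDR", "proliferative DR"]

def draw_stage(probs, rng):
    """Sample one stage; `probs` must align with STAGES and sum to 1."""
    assert abs(sum(probs) - 1.0) < 1e-9
    r, cum = rng.random(), 0.0
    for stage, p in zip(STAGES, probs):
        cum += p
        if r < cum:
            return stage
    return STAGES[-1]  # guard against floating-point rounding

rng = random.Random(42)
cohort = [draw_stage([0.45, 0.25, 0.15, 0.10, 0.05], rng) for _ in range(1000)]
```

Replicating this draw once per agent yields a synthetic cohort whose stage proportions converge on the predicted probabilities, which is what the study's comparison against the real-world test population checks.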

  16. Systematic Review of Data Mining Applications in Patient-Centered Mobile-Based Information Systems.

    PubMed

    Fallah, Mina; Niakan Kalhori, Sharareh R

    2017-10-01

    Smartphones represent a promising technology for patient-centered healthcare. It is claimed that data mining techniques have improved mobile apps to address patients' needs at subgroup and individual levels. This study reviewed the current literature regarding data mining applications in patient-centered mobile-based information systems. We systematically searched PubMed, Scopus, and Web of Science for original studies reported from 2014 to 2016. After screening 226 records at the title/abstract level, the full texts of 92 relevant papers were retrieved and checked against inclusion criteria. Finally, 30 papers were included in this study and reviewed. Data mining techniques have been reported in development of mobile health apps for three main purposes: data analysis for follow-up and monitoring, early diagnosis and detection for screening purposes, classification/prediction of outcomes, and risk calculation (n = 27); data collection (n = 3); and provision of recommendations (n = 2). The most accurate and frequently applied data mining method was the support vector machine; however, the decision tree has shown superior performance in enhancing mobile apps applied for patients' self-management. Embedded data-mining-based features in mobile apps, such as case detection, prediction/classification, risk estimation, or collection of patient data, particularly during self-management, would save, apply, and analyze patient data during and after care. More intelligent methods, such as artificial neural networks, fuzzy logic, and genetic algorithms, and even hybrid methods, may result in more patient-centered recommendations, providing education, guidance, alerts, and awareness of personalized output.

  17. Java-Based Diabetes Type 2 Prediction Tool for Better Diagnosis

    PubMed Central

    Odedra, Devang; Mallick, Medhavi; Shukla, Prateek; Samanta, Subir; Vidyarthi, Ambarish S.

    2012-01-01

    Abstract Background The concept of classification of clinical data can be utilized in the development of an effective diagnosis system by taking advantage of computational intelligence. Diabetes disease diagnosis via proper interpretation of the diabetes data is an important problem for neural networks. Unfortunately, although several classification studies have been carried out with significant performance, many of the current methods often fail to reach out to patients. Graphical user interface-enabled tools need to be developed through which medical practitioners can simply enter the health profiles of their patients and receive an instant diabetes prediction with an acceptable degree of confidence. Methods In this study, the neural network approach was used for a dataset of 768 persons from a Pima Indian population living near Phoenix, AZ. A neural network mixture of experts model was trained with these data using the expectation-maximization algorithm. Results The mixture of experts method was used to train the algorithm with 97% accuracy. A graphical user interface was developed that would work in conjunction with the trained network to provide the output in a presentable format. Conclusions This study provides a machine-implementable approach that can be used by physicians and patients to minimize the extent of error in diagnosis. The authors are hopeful that replication of results of this study in other populations may lead to improved diagnosis. Physicians can simply enter the health profile of patients and get the diagnosis for diabetes type 2. PMID:22059431

  18. Simultaneous quantification of the boar-taint compounds skatole and androstenone by surface-enhanced Raman scattering (SERS) and multivariate data analysis.

    PubMed

    Sørensen, Klavs M; Westley, Chloe; Goodacre, Royston; Engelsen, Søren Balling

    2015-10-01

    This study investigates the feasibility of using surface-enhanced Raman scattering (SERS) for the quantification of absolute levels of the boar-taint compounds skatole and androstenone in porcine fat. By investigation of different types of nanoparticles, pH and aggregating agents, an optimized environment that promotes SERS of the analytes was developed and tested with different multivariate spectral pre-processing techniques, and this was combined with variable selection on a series of analytical standards. The resulting method exhibited prediction errors (root mean square error of cross validation, RMSECV) of 2.4 × 10⁻⁶ M skatole and 1.2 × 10⁻⁷ M androstenone, with a limit of detection corresponding to approximately 2.1 × 10⁻¹¹ M for skatole and approximately 1.8 × 10⁻¹⁰ M for androstenone. The method was subsequently tested on porcine fat extract, leading to prediction errors (RMSECV) of 0.17 μg/g for skatole and 1.5 μg/g for androstenone. It is clear that this optimized SERS method, when combined with multivariate analysis, shows great potential for optimization into an on-line application, which will be the first of its kind, and opens up possibilities for simultaneous detection of other meat-quality metabolites or pathogen markers. Graphical abstract Artistic rendering of a laser-illuminated gold colloid sphere with skatole and androstenone adsorbed on the surface.
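
The figure of merit in this record, RMSECV, is the root mean square error of cross-validation: each sample is predicted by a model calibrated with that sample held out. The sketch below runs leave-one-out cross-validation around a deliberately trivial mean-predictor stand-in; the actual study used multivariate calibration on SERS spectra.

```python
# Leave-one-out RMSECV sketch. The `fit`/`predict` pair below is a trivial
# mean-predictor stand-in for the study's multivariate calibration model.
import math

def loo_rmsecv(samples, fit, predict):
    """RMSE of predictions, each made with the target sample held out."""
    errs = []
    for i, held_out in enumerate(samples):
        train = samples[:i] + samples[i + 1:]
        model = fit(train)
        errs.append(predict(model, held_out) - held_out)
    return math.sqrt(sum(e * e for e in errs) / len(errs))

# Stand-in model: always predict the mean of the calibration set.
fit = lambda train: sum(train) / len(train)
predict = lambda model, _sample: model

rmsecv = loo_rmsecv([1.0, 1.2, 0.9, 1.1, 1.0], fit, predict)
```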

  19. Development of a Natural Language Processing Engine to Generate Bladder Cancer Pathology Data for Health Services Research.

    PubMed

    Schroeck, Florian R; Patterson, Olga V; Alba, Patrick R; Pattison, Erik A; Seigne, John D; DuVall, Scott L; Robertson, Douglas J; Sirovich, Brenda; Goodney, Philip P

    2017-12-01

    To take the first step toward assembling population-based cohorts of patients with bladder cancer with longitudinal pathology data, we developed and validated a natural language processing (NLP) engine that abstracts pathology data from full-text pathology reports. Using 600 bladder pathology reports randomly selected from the Department of Veterans Affairs, we developed and validated an NLP engine to abstract data on histology, invasion (presence vs absence and depth), grade, the presence of muscularis propria, and the presence of carcinoma in situ. Our gold standard was based on an independent review of reports by 2 urologists, followed by adjudication. We assessed the NLP performance by calculating the accuracy, the positive predictive value, and the sensitivity. We subsequently applied the NLP engine to pathology reports from 10,725 patients with bladder cancer. When comparing the NLP output to the gold standard, NLP achieved the highest accuracy (0.98) for the presence vs the absence of carcinoma in situ. Accuracy for histology, invasion (presence vs absence), grade, and the presence of muscularis propria ranged from 0.83 to 0.96. The most challenging variable was depth of invasion (accuracy 0.68), with an acceptable positive predictive value for lamina propria (0.82) and for muscularis propria (0.87) invasion. The validated engine was capable of abstracting pathologic characteristics for 99% of the patients with bladder cancer. NLP had high accuracy for 5 of 6 variables and abstracted data for the vast majority of the patients. This now allows for the assembly of population-based cohorts with longitudinal pathology data. Published by Elsevier Inc.
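
The kind of abstraction the NLP engine performs, flagging the presence or absence of findings such as carcinoma in situ or muscularis propria in free-text reports, can be sketched with sentence splitting and negation detection. The patterns and sample report below are illustrative assumptions; the validated VA engine is far more extensive.

```python
# Hedged sketch of rule-based pathology abstraction: report sentences are
# scanned for a target phrase and checked for negation cues. Patterns and
# the sample report are illustrative, not the validated engine's rules.
import re

NEGATION = re.compile(r"\b(no|without|negative for|absent)\b", re.I)

def mentions_with_negation(text, phrase):
    """Return 'present', 'absent', or 'not mentioned' for a pathology phrase."""
    for sentence in re.split(r"(?<=[.;])\s+", text):
        if phrase.lower() in sentence.lower():
            return "absent" if NEGATION.search(sentence) else "present"
    return "not mentioned"

report = ("High-grade papillary urothelial carcinoma. "
          "Muscularis propria is present. No carcinoma in situ identified.")

cis = mentions_with_negation(report, "carcinoma in situ")   # 'absent'
mp  = mentions_with_negation(report, "muscularis propria")  # 'present'
```

Real engines must also handle hedged language, section headers, and scope of negation, which is where variables like depth of invasion (the study's hardest, at 0.68 accuracy) become challenging.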

  20. Analysis of Hydrogen Atom Abstraction from Ethylbenzene by an FeVO(TAML) Complex.

    PubMed

    Shen, Longzhu Q; Kundu, Soumen; Collins, Terrence J; Bominaar, Emile L

    2017-04-17

    It was shown previously (Chem. Eur. J. 2015, 21, 1803) that the rate of hydrogen atom abstraction, k, from ethylbenzene (EB) by TAML complex [FeV(O)B*]⁻ (1) in acetonitrile exhibits a large kinetic isotope effect (KIE ∼ 26) in the experimental range 233-243 K. The extrapolated tangents of ln(k/T) vs T⁻¹ plots for EB-d₁₀ and EB gave a large, negative intercept difference, Int(EB) − Int(EB-d₁₀) = −34.5 J mol⁻¹ K⁻¹ for T⁻¹ → 0, which is shown to be exclusively due to an isotopic mass effect on tunneling. A decomposition of the apparent activation barrier in terms of electronic, ZPE, thermal enthalpic, tunneling, and entropic contributions is presented. Tunneling corrections to ΔH⧧ and ΔS⧧ are estimated to be large. The DFT prediction, using functional B3LYP and basis set 6-311G, for the electronic contribution is significantly smaller than suggested by experiment. However, the agreement improves after correction for the basis set superposition error in the interaction between EB and 1. The kinetic model employed has been used to predict rate constants outside the experimental temperature range, which enabled us to compare the reactivity of 1 with those of other hydrogen abstracting complexes.

  1. Automatic identification of abstract online groups

    DOEpatents

    Engel, David W; Gregory, Michelle L; Bell, Eric B; Cowell, Andrew J; Piatt, Andrew W

    2014-04-15

    Online abstract groups, in which members are not explicitly connected, can be automatically identified by computer-implemented methods. The methods involve harvesting records from social media and extracting content-based and structure-based features from each record. Each record includes a social-media posting and is associated with one or more entities. Each feature is stored on a data storage device and includes a computer-readable representation of an attribute of one or more records. The methods further involve grouping records into record groups according to the features of each record. Further still, the methods involve calculating an n-dimensional surface representing each record group and defining an outlier as a record whose feature-based distances, measured from every n-dimensional surface, exceed a threshold value. Each of the n-dimensional surfaces is described by a footprint that characterizes the respective record group as an online abstract group.
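    The patent's outlier rule can be sketched in simplified form; here each record group's n-dimensional surface is approximated by its centroid, which is an assumption for illustration rather than the patented footprint construction:

```python
# Simplified sketch of the patent's outlier rule. Each record group's
# "n-dimensional surface" is approximated by its centroid here -- an
# assumption for illustration, not the patented footprint construction.
import math

def centroid(points):
    dims = len(points[0])
    return tuple(sum(p[d] for p in points) / len(points) for d in range(dims))

def is_outlier(record, groups, threshold):
    """A record is an outlier if its distance to EVERY group exceeds the threshold."""
    return all(math.dist(record, centroid(g)) > threshold for g in groups)

groups = [[(0.0, 0.0), (0.2, 0.1)], [(5.0, 5.0), (5.1, 4.9)]]
flagged = is_outlier((10.0, 10.0), groups, threshold=1.0)  # far from both groups
```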

  2. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors.

    PubMed

    Koffel, Jonathan B

    2015-01-01

    Previous research examining published systematic reviews has shown that their search strategies are often suboptimal and that librarian involvement, though recommended, is low. Confidence in these results, however, is limited due to poor reporting of search strategies in the published articles. The objectives were to measure more accurately the use of recommended search methods in systematic reviews and the level of librarian involvement, and to determine whether librarian involvement predicts the use of recommended methods. A survey was sent to all authors of English-language systematic reviews indexed in the Database of Abstracts of Reviews of Effects (DARE) from January 2012 through January 2014. The survey asked about their use of search methods recommended by the Institute of Medicine, the Cochrane Collaboration, and the Agency for Healthcare Research and Quality, and whether and how a librarian was involved in the systematic review. Rates of use of recommended methods and of librarian involvement were summarized. The impact of librarian involvement on the use of recommended methods was examined using multivariate logistic regression. In total, 1,560 authors completed the survey. Use of recommended search methods ranged widely, from 98% for use of keywords to 9% for registration in PROSPERO, and was generally higher than in previous studies. Fifty-one percent of studies involved a librarian, but only 64% of those acknowledged their assistance. Librarian involvement was significantly associated with the use of 65% of recommended search methods after controlling for other potential predictors. Odds ratios ranged from 1.36 (95% CI 1.06 to 1.75) for including multiple languages to 3.07 (95% CI 2.06 to 4.58) for using controlled vocabulary. Use of recommended search strategies is higher than previously reported, but many methods are still under-utilized. Librarian involvement predicts the use of most methods, but this involvement is under-reported within published articles.

  3. Scientific meeting abstracts: significance, access, and trends.

    PubMed Central

    Kelly, J A

    1998-01-01

    Abstracts of scientific papers and posters that are presented at annual scientific meetings of professional societies are part of the broader category of conference literature. They are an important avenue for the dissemination of current data. While timely and succinct, these abstracts present problems such as an abbreviated peer review and incomplete bibliographic access. METHODS: Seventy societies of health sciences professionals were surveyed about the publication of abstracts from their annual meetings. Nineteen frequently cited journals also were contacted about their policies on the citation of meeting abstracts. Ten databases were searched for the presence of meetings abstracts. RESULTS: Ninety percent of the seventy societies publish their abstracts, with nearly half appearing in the society's journal. Seventy-seven percent of the societies supply meeting attendees with a copy of each abstract, and 43% make their abstracts available in an electronic format. Most of the journals surveyed allow meeting abstracts to be cited. Bibliographic access to these abstracts does not appear to be widespread. CONCLUSIONS: Meeting abstracts play an important role in the dissemination of scientific knowledge. Bibliographic access to meeting abstracts is very limited. The trend toward making meeting abstracts available via the Internet has the potential to give a broader audience access to the information they contain. PMID:9549015

  4. Abstracts of SIG Sessions.

    ERIC Educational Resources Information Center

    Proceedings of the ASIS Annual Meeting, 1997

    1997-01-01

    Presents abstracts of SIG Sessions. Highlights include digital collections; information retrieval methods; public interest/fair use; classification and indexing; electronic publication; funding; globalization; information technology projects; interface design; networking in developing countries; metadata; multilingual databases; networked…

  5. Writing Abstracts for MLIS Research Proposals Using Worked Examples: An Innovative Approach to Teaching the Elements of Research Design

    ERIC Educational Resources Information Center

    Ondrusek, Anita L.; Thiele, Harold E.; Yang, Changwoo

    2014-01-01

    The authors examined abstracts written by graduate students for their research proposals as a requirement for a course in research methods in a distance learning MLIS program. The students learned under three instructional conditions that involved varying levels of access to worked examples created from abstracts representing research in the LIS…

  6. Metagram Software - A New Perspective on the Art of Computation.

    DTIC Science & Technology

    1981-10-01

    Computer Programming; Information and Analysis; Metagramming; Philosophy; Intelligence Information Systems; Abstraction & Metasystems; Metagramming...control would also serve well in the analysis of military and political intelligence, and in other areas where highly abstract methods of thought serve...needed in intelligence because several levels of abstraction are involved in a political or military system, because analysis entails a complex interplay

  7. Identifying Best Practices for and Utilities of the Pharmacy Curriculum Outcome Assessment Examination

    PubMed Central

    Romanelli, Frank

    2016-01-01

    Objective. A review was conducted to determine implementation strategies, utilities, score interpretation, and limitations of the Pharmacy Curriculum Outcomes Assessment (PCOA) examination. Methods. Articles were identified through the PubMed, American Journal of Pharmaceutical Education, and International Pharmaceutical Abstracts databases using the following terms: “Pharmacy Curriculum Outcomes Assessment,” “pharmacy comprehensive examination,” and “curricular assessment.” Studies containing information regarding implementation, utility, and predictive value for US student pharmacists, curricula, and/or PGY1/PGY2 residents were included. Publications from the journal Academic Medicine, the Accreditation Council for Pharmacy Education (ACPE), and the American Association of Colleges of Pharmacy (AACP) were included for background information and for comparison with the predictive utilities of comprehensive examinations in medicine. Results. Ten PCOA and nine residency-related publications were identified. Based on published information, the PCOA may be best used as an additional tool to identify knowledge gaps for third-year student pharmacists. Conclusion. Administering the PCOA to students after they have completed their didactic coursework may yield scores that reflect student knowledge. Predictive utility regarding the North American Pharmacist Licensure Examination (NAPLEX) and potential applications is limited, and more research is required to determine ways to use the PCOA. PMID:28179712

  8. Simulating Dissolution of Intravitreal Triamcinolone Acetonide Suspensions in an Anatomically Accurate Rabbit Eye Model

    PubMed Central

    Horner, Marc; Muralikrishnan, R.

    2010-01-01

    ABSTRACT Purpose A computational fluid dynamics (CFD) study examined the impact of particle size on the dissolution rate and residence time of intravitreal suspension depots of triamcinolone acetonide (TAC). Methods A model of the rabbit eye was constructed using insights from high-resolution NMR imaging studies (Sawada 2002). The current model was compared with other published simulations in its ability to predict clearance of various intravitreally injected materials. Suspension depots were constructed by explicitly rendering individual particles in various configurations: 4 or 16 mg of drug confined to a 100 μL spherical depot, or 4 mg exploded to fill the entire vitreous. Particle size was reduced systematically in each configuration. The convective diffusion/dissolution process was simulated using a multiphase model. Results Release rate became independent of particle diameter below a certain value. The size-independent limits occurred for particle diameters ranging from 77 to 428 μm, depending upon the depot configuration. Residence time predicted for the spherical depots in the size-independent limit was comparable to that observed in vivo. Conclusions Since the size-independent limit was several-fold greater than the particle size of commercially available pharmaceutical TAC suspensions, differences in particle size amongst such products are predicted to be immaterial to their duration or performance. PMID:20467888

  9. Advances in In Vitro and In Silico Tools for Toxicokinetic Dose ...

    EPA Pesticide Factsheets

    Recent advances in in vitro assays, in silico tools, and systems biology approaches provide opportunities for a refined mechanistic understanding in chemical safety assessment that will ultimately reduce reliance on animal-based methods. With the U.S. commercial chemical landscape encompassing thousands of chemicals with limited data, safety assessment strategies that efficiently and reliably predict in vivo systemic exposures and subsequent in vivo effects are a priority. Quantitative in vitro-in vivo extrapolation (QIVIVE) is a methodology that facilitates the explicit and quantitative application of in vitro experimental data and in silico modeling to predict in vivo system behaviors; it can be applied to predict chemical toxicokinetics, toxicodynamics, and population variability. Tiered strategies that incorporate sufficient information to reliably inform the relevant decision context will facilitate acceptance of these alternative data streams for safety assessments. This abstract does not necessarily reflect U.S. EPA policy. This talk will provide an update to an international audience on the state of the science within the EPA's Office of Research and Development on developing and refining approaches that estimate internal chemical concentrations following a given exposure, known as toxicokinetics. Toxicokinetic approaches hold great potential in their ability to link in vitro activities or toxicities identified during high-throughput screen

  10. Changes in Water Mobility Measured by Diffusion MRI Predict Response of Metastatic Breast Cancer to Chemotherapy

    PubMed Central

    Theilmann, Rebecca J; Borders, Rebecca; Trouard, Theodore P; Xia, Guowei; Outwater, Eric; Ranger-Moore, James; Gillies, Robert J; Stopeck, Alison

    2004-01-01

    Abstract A goal of oncology is the individualization of patient care to optimize therapeutic responses and minimize toxicities. Achieving this will require noninvasive, quantifiable, and early markers of tumor response. Preclinical data from xenografted tumors using a variety of antitumor therapies have shown that magnetic resonance imaging (MRI)-measured mobility of tissue water (apparent diffusion coefficient of water, or ADCw) is a biomarker presaging cell death in the tumor. This communication tests the hypothesis that changes in water mobility will quantitatively presage tumor responses in patients with metastatic liver lesions from breast cancer. A total of 13 patients with metastatic breast cancer and 60 measurable liver lesions were monitored by diffusion MRI after initiation of new courses of chemotherapy. MR images were obtained prior to, and at 4, 11, and 39 days following, the initiation of therapy for determination of volumes and ADCw values. The data indicate that diffusion MRI can predict response by 4 or 11 days after commencement of therapy, depending on the analytic method. The highest concordance was observed in tumor lesions that were less than 8 cm3 in volume at presentation. These results suggest that diffusion MRI can be useful to predict the response of liver metastases to effective chemotherapy. PMID:15720810
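    For context, ADCw is conventionally estimated from the mono-exponential diffusion model S(b) = S0·exp(−b·ADC); a hedged sketch follows, with invented signal and b-values rather than the study's measurements:

```python
# ADC from the standard mono-exponential diffusion-MRI model
# S(b) = S0 * exp(-b * ADC). Signal values and b-values are invented.
import math

def adc_two_point(s_low, s_high, b_low, b_high):
    """ADC in mm^2/s from signals measured at two b-values (given in s/mm^2)."""
    return math.log(s_low / s_high) / (b_high - b_low)

adc = adc_two_point(s_low=1000.0, s_high=450.0, b_low=0.0, b_high=800.0)
```

    With these placeholder numbers the result lands near 1 × 10⁻³ mm²/s, a typical order of magnitude for soft tissue.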

  11. Modeling the Biodegradability of Chemical Compounds Using the Online CHEmical Modeling Environment (OCHEM)

    PubMed Central

    Vorberg, Susann

    2013-01-01

    Abstract Biodegradability describes the capacity of substances to be mineralized by free‐living bacteria. It is a crucial property in estimating a compound’s long‐term impact on the environment. The ability to reliably predict biodegradability would reduce the need for laborious experimental testing. However, this endpoint is difficult to model due to unavailability or inconsistency of experimental data. Our approach makes use of the Online Chemical Modeling Environment (OCHEM) and its rich supply of machine learning methods and descriptor sets to build classification models for ready biodegradability. These models were analyzed to determine the relationship between characteristic structural properties and biodegradation activity. The distinguishing feature of the developed models is their ability to estimate the accuracy of prediction for each individual compound. The models developed using seven individual descriptor sets were combined in a consensus model, which provided the highest accuracy. The identified overrepresented structural fragments can be used by chemists to improve the biodegradability of new chemical compounds. The consensus model, the datasets used, and the calculated structural fragments are publicly available at http://ochem.eu/article/31660. PMID:27485201

  12. Theoretical prediction of the mechanistic pathways and kinetics of methylcyclohexane initiated by OH radicals

    NASA Astrophysics Data System (ADS)

    Begum, Saheen Shehnaz; Deka, Ramesh Chandra; Gour, Nand Kishor

    2018-06-01

    In this manuscript, we systematically describe a theoretical prediction of hydrogen abstraction from methylcyclohexane (MCH) initiated by the OH radical. We performed dual-level quantum chemical calculations on the gas-phase reactions between MCH and the OH radical. Geometry optimisation and vibrational frequency calculations were performed at the BHandHLYP/6-311G(d,p) level of theory, with energies refined by the coupled-cluster CCSD(T) method using the same basis set. All stationary points of the title reaction were located on the potential energy surface. H-abstraction from the -CH site of MCH was found to be the minimum-energy pathway. The rate constant for the reaction of MCH with the OH radical, calculated using canonical transition state theory, is 3.27 × 10⁻¹² cm³ molecule⁻¹ s⁻¹, in sound agreement with reported experimental data. The atmospheric lifetime of MCH and the branching ratios of the reaction channels are also reported.
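    Canonical transition state theory evaluates the rate constant through the Eyring equation; a sketch follows, with an invented activation free energy (the paper's computed barrier is not reproduced here). Note this form gives a unimolecular rate in s⁻¹; converting to the bimolecular units quoted above requires a standard-state volume factor:

```python
# Eyring equation of canonical transition state theory:
#     k = (kB*T/h) * exp(-dG_act / (R*T))
# The activation free energy below is invented for illustration only.
import math

KB = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
R = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(dg_act, temp):
    """Unimolecular rate constant in s^-1 for barrier dg_act (J/mol) at temp (K)."""
    return (KB * temp / H) * math.exp(-dg_act / (R * temp))

k = eyring_rate(dg_act=70_000.0, temp=298.15)
```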

  13. Toward a Unified Sub-symbolic Computational Theory of Cognition

    PubMed Central

    Butz, Martin V.

    2016-01-01

    This paper proposes how various disciplinary theories of cognition may be combined into a unifying, sub-symbolic, computational theory of cognition. The following theories are considered for integration: psychological theories, including the theory of event coding, event segmentation theory, the theory of anticipatory behavioral control, and concept development; artificial intelligence and machine learning theories, including reinforcement learning and generative artificial neural networks; and theories from theoretical and computational neuroscience, including predictive coding and free energy-based inference. In the light of such a potential unification, it is discussed how abstract cognitive, conceptualized knowledge and understanding may be learned from actively gathered sensorimotor experiences. The unification rests on the free energy-based inference principle, which essentially implies that the brain builds a predictive, generative model of its environment. Neural activity-oriented inference causes the continuous adaptation of the currently active predictive encodings. Neural structure-oriented inference causes the longer-term adaptation of the developing generative model as a whole. Finally, active inference strives for maintaining internal homeostasis, causing goal-directed motor behavior. To learn abstract, hierarchical encodings, however, it is proposed that free energy-based inference needs to be enhanced with structural priors, which bias cognitive development toward the formation of particular, behaviorally suitable encoding structures. As a result, it is hypothesized how abstract concepts can develop from, and thus how they are structured by and grounded in, sensorimotor experiences. Moreover, it is sketched out how symbol-like thought can be generated by a temporarily active set of predictive encodings, which constitute a distributed neural attractor in the form of an interactive free-energy minimum.
The activated, interactive network attractor essentially characterizes the semantics of a concept or a concept composition, such as an actual or imagined situation in our environment. Temporal successions of attractors then encode unfolding semantics, which may be generated by a behavioral or mental interaction with an actual or imagined situation in our environment. Implications, further predictions, possible verification, and falsifications, as well as potential enhancements into a fully spelled-out unified theory of cognition are discussed at the end of the paper. PMID:27445895

  14. Functional Abstraction as a Method to Discover Knowledge in Gene Ontologies

    PubMed Central

    Ultsch, Alfred; Lötsch, Jörn

    2014-01-01

    Computational analyses of the functions of gene sets obtained in microarray analyses or by topical database searches are increasingly important in biology. To understand their functions, the sets are usually mapped to Gene Ontology knowledge bases by means of over-representation analysis (ORA). Its result represents the specific knowledge of the functionality of the gene set. However, the specific ontology typically consists of many terms and relationships, hindering understanding of the ‘main story’. We developed a methodology to identify a comprehensibly small number of GO terms as “headlines” of the specific ontology, allowing all central aspects of the roles of the involved genes to be understood. The Functional Abstraction method finds a set of headlines that is specific enough to cover all details of a specific ontology yet abstract enough for human comprehension. This method exceeds classical approaches to ORA abstraction; by focusing on information rather than decorrelation of GO terms, it directly targets human comprehension. Functional abstraction provides, with a maximum of certainty, information value, coverage, and conciseness, a representation of the biological functions in which a gene set plays a role. This is the necessary means to interpret complex Gene Ontology results, thus strengthening the role of functional genomics in biomarker and drug discovery. PMID:24587272

  15. Abstracting of suspected illegal land use in urban areas using case-based classification of remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Fulong; Wang, Chao; Yang, Chengyun; Zhang, Hong; Wu, Fan; Lin, Wenjuan; Zhang, Bo

    2008-11-01

    This paper proposes a method that uses case-based classification of remote sensing images and applies it to abstract information on suspected illegal land use in urban areas. Because the cases used for imagery classification are discrete, the proposed method copes with oscillation of spectrum or backscatter within the same land-use category; it not only overcomes the deficiency of maximum likelihood classification (that the prior probability of land use cannot be obtained) but also inherits the advantages of knowledge-based classification systems, such as artificial intelligence and automatic operation. Consequently, the proposed method classifies better. The researchers then used an object-oriented technique for shadow removal in highly dense city zones. With multi-temporal SPOT 5 images at a resolution of 2.5×2.5 meters, the researchers found that the method can abstract suspected illegal land use information in urban areas using a post-classification comparison technique.
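    Post-classification comparison, as used above, classifies each image date independently and then flags locations whose labels differ; a toy sketch with invented labels and a tiny grid:

```python
# Post-classification comparison: classify each image date independently,
# then flag cells whose land-use label changed. Labels/grid are invented.
def change_mask(labels_t1, labels_t2):
    return [
        [a != b for a, b in zip(row1, row2)]
        for row1, row2 in zip(labels_t1, labels_t2)
    ]

t1 = [["green", "built"], ["green", "green"]]
t2 = [["green", "built"], ["built", "green"]]
changed = change_mask(t1, t2)  # only the lower-left cell changed
```

    Cells flagged as changed (for example, green space becoming built-up) would then be reviewed as candidate sites of suspected illegal land use.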

  16. Clinical Application of Vibration Controlled Transient Elastography in Patients with Chronic Hepatitis B

    PubMed Central

    Liang, Xie-Er; Chen, Yong-Peng

    2017-01-01

    Abstract Evaluation of the extent and progression of liver fibrosis and cirrhosis is of critical importance in the management and prognosis of patients with chronic hepatitis B. Due to the limitations of liver biopsy, non-invasive methods, especially liver stiffness measurement (LSM) by vibration controlled transient elastography, have been developed and widely applied for liver fibrosis assessment. LSM aims to reduce, but not replace, the need for liver biopsy for fibrosis/cirrhosis diagnosis. While LSM may have potential utility in monitoring treatment response, its applications in prediction of liver complications, in terms of portal hypertension and esophageal varices, as well as disease prognosis, have been gradually validated. Here, we review the latest clinical applications of LSM in patients with chronic hepatitis B. PMID:29226103

  17. Overview: Parity Violation and Fundamental Symmetries

    NASA Astrophysics Data System (ADS)

    Carlini, Roger

    2017-09-01

    The fields of nuclear and particle physics have undertaken extensive programs of research to search for evidence of new phenomena via the precision measurement of observables that are well predicted within the standard model of electroweak interaction. It is already known that the standard model is incomplete, as it does not include gravity or dark matter/energy, and it is therefore likely the low-energy approximation of a more complex theory. This talk will be an overview of the motivation, experimental methods, and status of some of these efforts (past and future) related to precision indirect searches that are complementary to the direct searches underway at the Large Hadron Collider. This abstract is for the invited talk associated with the mini-symposium titled ``Electro-weak Physics and Fundamental Symmetries'' organized by Julie Roche.

  18. The effect of word concreteness on recognition memory.

    PubMed

    Fliessbach, K; Weis, S; Klaver, P; Elger, C E; Weber, B

    2006-09-01

    Concrete words that are readily imagined are better remembered than abstract words. Theoretical explanations for this effect either claim a dual coding of concrete words in the form of both a verbal and a sensory code (dual-coding theory), or a more accessible semantic network for concrete words than for abstract words (context-availability theory). However, the neural mechanisms of improved memory for concrete versus abstract words are poorly understood. Here, we investigated the processing of concrete and abstract words during encoding and retrieval in a recognition memory task using event-related functional magnetic resonance imaging (fMRI). As predicted, memory performance was significantly better for concrete words than for abstract words. Abstract words elicited stronger activations of the left inferior frontal cortex both during encoding and recognition than did concrete words. Stronger activation of this area was also associated with successful encoding for both abstract and concrete words. Concrete words elicited stronger activations bilaterally in the posterior inferior parietal lobe during recognition. The left parietal activation was associated with correct identification of old stimuli. The anterior precuneus, left cerebellar hemisphere and the posterior and anterior cingulate cortex showed activations both for successful recognition of concrete words and for online processing of concrete words during encoding. Additionally, we observed a correlation across subjects between brain activity in the left anterior fusiform gyrus and hippocampus during recognition of learned words and the strength of the concreteness effect. These findings support the idea of specific brain processes for concrete words, which are reactivated during successful recognition.

  19. Some of the most interesting CASP11 targets through the eyes of their authors

    PubMed Central

    Kryshtafovych, Andriy; Moult, John; Baslé, Arnaud; Burgin, Alex; Craig, Timothy K.; Edwards, Robert A.; Fass, Deborah; Hartmann, Marcus D.; Korycinski, Mateusz; Lewis, Richard J.; Lorimer, Donald; Lupas, Andrei N.; Newman, Janet; Peat, Thomas S.; Piepenbrink, Kurt H.; Prahlad, Janani; van Raaij, Mark J.; Rohwer, Forest; Segall, Anca M.; Seguritan, Victor; Sundberg, Eric J.; Singh, Abhimanyu K.; Wilson, Mark A.

    2015-01-01

    ABSTRACT The Critical Assessment of protein Structure Prediction (CASP) experiment would not have been possible without the prediction targets provided by the experimental structural biology community. In this article, selected crystallographers providing targets for the CASP11 experiment discuss the functional and biological significance of the target proteins, highlight their most interesting structural features, and assess whether these features were correctly reproduced in the predictions submitted to CASP11. Proteins 2016; 84(Suppl 1):34–50. © 2015 The Authors. Proteins: Structure, Function, and Bioinformatics Published by Wiley Periodicals, Inc. PMID:26473983

  20. Safe surgery: validation of pre and postoperative checklists 1

    PubMed Central

    Alpendre, Francine Taporosky; Cruz, Elaine Drehmer de Almeida; Dyniewicz, Ana Maria; Mantovani, Maria de Fátima; Silva, Ana Elisa Bauer de Camargo e; dos Santos, Gabriela de Souza

    2017-01-01

    ABSTRACT Objective: to develop, evaluate and validate a surgical safety checklist for patients in the pre and postoperative periods in surgical hospitalization units. Method: methodological research carried out in a large public teaching hospital in the South of Brazil, applying the principles of the Safe Surgery Saves Lives Programme of the World Health Organization. The checklist was applied by 16 nurses in 8 surgical units and submitted for validation to a group of eight experts using the Delphi method online. Results: the instrument was validated, achieving a mean score ≥1, a level of agreement ≥75%, and a Cronbach’s alpha >0.90. The final version included 97 safety indicators organized into six categories: identification, preoperative, immediate postoperative, late postoperative, other surgical complications, and hospital discharge. Conclusion: the Surgical Safety Checklist for the Pre and Postoperative periods is a further strategy to promote patient safety, as it allows the monitoring of predictive signs and symptoms of surgical complications and the early detection of adverse events. PMID:28699994

  1. Performance of the dipstick screening test as a predictor of negative urine culture

    PubMed Central

    Marques, Alexandre Gimenes; Doi, André Mario; Pasternak, Jacyr; Damascena, Márcio dos Santos; França, Carolina Nunes; Martino, Marinês Dalla Valle

    2017-01-01

    ABSTRACT Objective To investigate whether the urine dipstick screening test can be used to predict urine culture results. Methods A retrospective study conducted between January and December 2014 based on data from 8,587 patients with a medical order for urine dipstick test, urine sediment analysis and urine culture. Sensitivity, specificity, positive and negative predictive values were determined and ROC curve analysis was performed. Results The percentage of positive cultures was 17.5%. Nitrite had 28% sensitivity and 99% specificity, with positive and negative predictive values of 89% and 87%, respectively. Leukocyte esterase had 79% sensitivity and 84% specificity, with positive and negative predictive values of 51% and 95%, respectively. The combination of positive nitrite or positive leukocyte esterase tests had 85% sensitivity and 84% specificity, with positive and negative predictive values of 53% and 96%, respectively. Positive urinary sediment (more than ten leukocytes per microliter) had 92% sensitivity and 71% specificity, with positive and negative predictive values of 40% and 98%, respectively. The combination of nitrite positive test and positive urinary sediment had 82% sensitivity and 99% specificity, with positive and negative predictive values of 91% and 98%, respectively. The combination of nitrite or leukocyte esterase positive tests and positive urinary sediment had the highest sensitivity (94%) and specificity (84%), with positive and negative predictive values of 58% and 99%, respectively. 
Based on ROC curve analysis, the best indicator of positive urine culture was the combination of positive leukocyte esterase or nitrite tests and positive urinary sediment, followed by positive leukocyte esterase and nitrite tests, positive urinary sediment alone, positive leukocyte esterase test alone, positive nitrite test alone, and finally the combination of positive nitrite test and positive urinary sediment (AUC: 0.845, 0.844, 0.817, 0.814, 0.635, and 0.626, respectively). Conclusion A negative urine culture can be predicted by negative dipstick test results. Therefore, this test may be a reliable predictor of negative urine culture. PMID:28444086
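    The reported predictive values are consistent with Bayes' rule applied to sensitivity, specificity, and the 17.5% positive-culture rate; a sketch follows (small differences from the published figures arise from rounding of the inputs):

```python
# Bayes'-rule check: PPV and NPV from sensitivity, specificity, and the
# prevalence of positive cultures (17.5% per the abstract).
def ppv_npv(sens, spec, prev):
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# nitrite test: 28% sensitivity, 99% specificity
nitrite_ppv, nitrite_npv = ppv_npv(sens=0.28, spec=0.99, prev=0.175)
```

    The same function applied to the other test combinations in the abstract shows how high specificity drives PPV while high sensitivity drives NPV at this prevalence.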

  2. Uniform deposition of uranium hexafluoride (UF6): Standardized mass deposits and controlled isotopic ratios using a thermal fluorination method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNamara, Bruce K.; O’Hara, Matthew J.; Casella, Andrew M.

    2016-07-01

    Abstract: We report a convenient method for the generation of volatile uranium hexafluoride (UF6) from solid uranium oxides and other uranium compounds, followed by uniform deposition of low levels of UF6 onto sampling coupons. Under laminar flow conditions, UF6 is shown to interact with surfaces within the chamber to a highly predictable degree. We demonstrate the preparation of uranium deposits that range between ~0.01 and 470 ± 34 ng·cm⁻². The data suggest the method can be extended to creating depositions at the sub-picogram·cm⁻² level. Additionally, the isotopic composition of the deposits can be customized by selection of the uranium source materials. We demonstrate a layering technique whereby two uranium solids, each with a different isotopic composition, are employed to form successive layers of UF6 on a surface. The result is an ultra-thin deposit of UF6 that bears an isotopic signature that is a composite of the two uranium sources. The reported deposition method has direct application to the development of unique analytical standards for nuclear safeguards and forensics.

  3. Analysis of multiple soybean phytonutrients by near-infrared reflectance spectroscopy.

    PubMed

    Zhang, Gaoyang; Li, Penghui; Zhang, Wenfei; Zhao, Jian

    2017-05-01

Improvement of the nutritional quality of soybean is usually facilitated by a vast range of soybean germplasm with enough information about their multiple phytonutrients. In order to acquire this essential information from a huge number of soybean samples, a rapid analytic method is urgently required. Here, a nondestructive near-infrared reflectance spectroscopy (NIRS) method was developed for rapid and accurate measurement of 25 nutritional components in soybean simultaneously, including the fatty acids palmitic acid, stearic acid, oleic acid, linoleic acid, and linolenic acid; vitamin E (VE), α-VE, γ-VE, and δ-VE; saponins; isoflavonoids; and flavonoids. Modified partial least squares regression and first, second, third, and fourth derivative transformations were applied for model development. The 1 minus variance ratio (1-VR) values of the optimal models ranged from a low of 0.64 to a high of 0.95. The predicted values of phytonutrients in soybean using NIRS technology are comparable to those obtained using traditional spectroscopic or chemical methods. A robust NIRS can be adopted as a reliable method to evaluate complex plant constituents for screening large-scale samples of soybean germplasm resources or genetic populations for improvement of nutritional qualities.
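The 1-VR statistic quoted above is one minus the ratio of residual (prediction-error) variance to the total variance of the reference values, i.e. a coefficient of determination for the calibration. A minimal sketch with made-up numbers:

```python
def one_minus_vr(reference, predicted):
    """1-VR = 1 - (residual variance / variance of reference values)."""
    n = len(reference)
    mean_ref = sum(reference) / n
    ss_total = sum((r - mean_ref) ** 2 for r in reference)
    ss_resid = sum((r - p) ** 2 for r, p in zip(reference, predicted))
    return 1.0 - ss_resid / ss_total

reference = [10.0, 12.0, 14.0, 16.0, 18.0]   # e.g. wet-chemistry values
predicted = [10.5, 11.5, 14.0, 16.5, 17.5]   # e.g. NIRS model predictions
print(round(one_minus_vr(reference, predicted), 3))  # 0.975
```

A value of 1.0 would mean the calibration explains all variance in the reference data; the paper's 0.64-0.95 range reflects components that are harder or easier to predict from the spectra.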

  4. Prognostic Fusion for Uncertainty Reduction

    DTIC Science & Technology

    2007-02-01

Damage estimates are arrived at using sensor information such as oil debris monitoring data as well as vibration data. The method detects the onset of...

  5. Effects of semantic neighborhood density in abstract and concrete words.

    PubMed

    Reilly, Megan; Desai, Rutvik H

    2017-12-01

    Concrete and abstract words are thought to differ along several psycholinguistic variables, such as frequency and emotional content. Here, we consider another variable, semantic neighborhood density, which has received much less attention, likely because semantic neighborhoods of abstract words are difficult to measure. Using a corpus-based method that creates representations of words that emphasize featural information, the current investigation explores the relationship between neighborhood density and concreteness in a large set of English nouns. Two important observations emerge. First, semantic neighborhood density is higher for concrete than for abstract words, even when other variables are accounted for, especially for smaller neighborhood sizes. Second, the effects of semantic neighborhood density on behavior are different for concrete and abstract words. Lexical decision reaction times are fastest for words with sparse neighborhoods; however, this effect is stronger for concrete words than for abstract words. These results suggest that semantic neighborhood density plays a role in the cognitive and psycholinguistic differences between concrete and abstract words, and should be taken into account in studies involving lexical semantics. Furthermore, the pattern of results with the current feature-based neighborhood measure is very different from that with associatively defined neighborhoods, suggesting that these two methods should be treated as separate measures rather than two interchangeable measures of semantic neighborhoods. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Accuracy Analysis of DSMC Chemistry Models Applied to a Normal Shock Wave

    DTIC Science & Technology

    2012-06-20

Responsible person: A. Ketsdever. The rate coefficient from [4] is assumed to be 2×10⁻¹⁹ m³/s at 5000 K and 7×10⁻¹⁸ m³/s at 10,000 K; the QK prediction using the present VHS collision parameters is 9×10⁻²⁰ m³/s at 5000 K and 2×10⁻¹⁸ m³/s at 10,000 K. Note that the QK for the present work was modified for use with AHO energy levels for consistency

  7. An Annotated Bibliography of Literature Integrating Organizational and Systems Theory

    DTIC Science & Technology

    1985-09-01

believed to be representative of current thinking on the problem as it is defined in this particular effort. 4. Abstracting For abstracting purposes... individual concept or isolated case which defies mathematical description or classical empirical validation) or nomothetic (pertaining to the abstract... and to induce change in organizations - laboratory training. Laboratory training is a method used to promote changes in the learning process itself

  8. Written Language and Writing Abilities: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," July 1979 through June 1980 (Vol. 40 Nos. 1 through 12).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 21 titles deal with the following topics: (1) the adolescent writer's developing sense of audience; (2) the entry skills, methods, and attitudes of intermediate composition students in postsecondary composition programs; (3)…

  9. Conversion of Society for Maternal Fetal Medicine Abstract Presentations to Manuscript Publications

    PubMed Central

    Manuck, Tracy A.; Barbour, Kelli; Janicki, Lindsay; Blackwell, Sean C.; Berghella, Vincenzo

    2015-01-01

    Objective To evaluate the rate of conversion of Society for Maternal Fetal Medicine (SMFM) Annual Meeting abstract presentations to full manuscript publications over time. Methods Full manuscript publications corresponding to all SMFM oral abstracts 2003–2010 inclusive, and SMFM poster abstracts in 2003, 2005, 2007, and 2009 were manually searched in PubMed. An abstract was considered to ‘match’ a full publication if the abstract and publication titles as well as main methods and results were similar and the abstract first author was a publication author. In cases of uncertainty, the abstract-publication match was reviewed by a second physician researcher. Time to publication, publication rates over time, and publication rates among US vs. non-US authors were examined. PMID numbers were also collected to determine if >1 abstract contributed to a manuscript. Data were analyzed using Wilcoxon rank-sum, ANOVA, t-test, and logistic regression. Results 3,281 abstracts presented at SMFM over the study period, including 629 orals (63 main plenary, 64 fellows plenary, 502 concurrent), were reviewed. 1,780/3,281 (54.3%) were published, generating 1,582 unique publications. Oral abstracts had a consistently higher rate of conversion to publications vs. posters (77.1% vs. 48.8%, p<0.001). The median time to publication was 19 (IQR 9–36) months, and was significantly shorter for orals vs. posters (11 vs. 21 months, p<0.001). Over the study period, rates of publication of orals remained constant, but rates of publication of posters were lower in 2007 and 2009 compared to 2003 and 2005. Publications related to SMFM abstracts were published in 194 different journals, most commonly AJOG (39.8%), Obstet Gynecol (9.7%), and J Matern Fetal Neonatal Med (6.5%). Publication rates were higher if the abstract’s first author was affiliated with a non-US institution (64.8% vs. 51.1%, p<0.001) and if the abstract received an award (82.7% vs. 53.3%, p<0.001). 
In regression models, oral presentation at SMFM, first author affiliation with a non-US institution, submission for the AJOG SMFM special issue, and year of abstract presentation at SMFM were associated with full manuscript publication. Conclusions Between 2003 and 2010, full manuscript publication rates of SMFM abstracts were high and consistent, and time to publication decreased across the study period for oral presentations. PMID:25981850

  10. Predictive Coding Strategies for Invariant Object Recognition and Volitional Motion Control in Neuromorphic Agents

    DTIC Science & Technology

    2015-09-02

human behavior. In this project, we hypothesized that visual memory of past motion trajectories may be used for selecting future behavior. In other... "Decoding sequence of actions using fMRI", Society for Neuroscience Annual Meeting, San Diego, CA, USA, Nov 9-13 2013 (only abstract) 3. Hansol Choi, Dae-Shik Kim, "Planning as inference in a Hierarchical Predictive Memory", Proceedings of International Conference on Neural Information Processing

  11. Composite Social Network for Predicting Mobile Apps Installation

    DTIC Science & Technology

    2011-06-02

analysis used by social scientists such as matched sample estimation (Aral, Muchnik, and Sundararajan 2009) are only for identifying network effects and... arXiv:1106.0359v1 [cs.SI] 2 Jun 2011. Composite Social Network for Predicting Mobile Apps Installation. Wei Pan and Nadav Aharony... and Alex (Sandy) Pentland, MIT Media Laboratory, 20 Ames Street, Cambridge, Massachusetts 02139. Abstract: We have carefully instrumented a large portion of

  12. EF5 PET of Tumor Hypoxia: A Predictive Imaging Biomarker of Response to Stereotactic Ablative Radiotherapy (SABR) for Early Lung Cancer

    DTIC Science & Technology

    2014-09-01

EF5 PET of Tumor Hypoxia: A Predictive Imaging Biomarker of Response to Stereotactic Ablative Radiotherapy (SABR) for Early Lung Cancer. PRINCIPAL INVESTIGATOR: Billy W... GRANT NUMBER: W81XWH-12-1-0236. ABSTRACT: Purpose and scope: Stereotactic ablative radiotherapy (SABR) has become a new standard of care for early stage lung

  13. How Accurately Can We Predict Eclipses for Algol? (Poster abstract)

    NASA Astrophysics Data System (ADS)

    Turner, D.

    2016-06-01

    (Abstract only) beta Persei, or Algol, is a very well known eclipsing binary system consisting of a late B-type dwarf that is regularly eclipsed by a GK subgiant every 2.867 days. Eclipses, which last about 8 hours, are regular enough that predictions for times of minima are published in various places, Sky & Telescope magazine and The Observer's Handbook, for example. But eclipse minimum lasts for less than a half hour, whereas subtle mistakes in the current ephemeris for the star can result in predictions that are off by a few hours or more. The Algol system is fairly complex, with the Algol A and Algol B eclipsing system also orbited by Algol C with an orbital period of nearly 2 years. Added to that are complex long-term O-C variations with a periodicity of almost two centuries that, although suggested by Hoffmeister to be spurious, fit the type of light travel time variations expected for a fourth star also belonging to the system. The AB sub-system also undergoes mass transfer events that add complexities to its O-C behavior. Is it actually possible to predict precise times of eclipse minima for Algol months in advance given such complications, or is it better to encourage ongoing observations of the star so that O-C variations can be tracked in real time?
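The published minima the abstract refers to come from a linear ephemeris, T_min = T0 + n·P, and its point is that a small period error accumulates linearly with cycle count. A sketch using the 2.867-day period quoted in the abstract (the epoch and the error values are illustrative placeholders, not an authoritative ephemeris):

```python
# Linear ephemeris: T_min = T0 + n * P (times in days, e.g. HJD).
T0 = 2_445_641.5540   # illustrative epoch of primary minimum (placeholder)
P = 2.867             # orbital period in days, as quoted in the abstract

def next_minimum(t):
    """First predicted eclipse minimum at or after time t."""
    n = max(0, -(-(t - T0) // P))  # ceiling division: whole cycles elapsed
    return T0 + n * P

# A period error dP shifts predictions by n * dP after n cycles:
n_cycles = 12_000                  # ~94 years' worth of 2.867-day cycles
dP = 0.0001                        # ~8.6 s period error (illustrative)
print(round(n_cycles * dP * 24, 1))  # drift in hours: 28.8
```

This is why an error too small to notice over a season can push predictions off by hours over the near-two-century O-C timescales the abstract discusses.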

  14. Prediction of Therapy Tumor-Absorbed Dose Estimates in I-131 Radioimmunotherapy Using Tracer Data Via a Mixed-Model Fit to Time Activity

    PubMed Central

    Koral, Kenneth F.; Avram, Anca M.; Kaminski, Mark S.; Dewaraja, Yuni K.

    2012-01-01

Abstract Background For individualized treatment planning in radioimmunotherapy (RIT), correlations must be established between tracer-predicted and therapy-delivered absorbed doses. The focus of this work was to investigate this correlation for tumors. Methods The study analyzed 57 tumors in 19 follicular lymphoma patients treated with I-131 tositumomab and imaged with SPECT/CT multiple times after tracer and therapy administrations. Instead of the typical least-squares fit to a single tumor's measured time-activity data, estimation was accomplished via a biexponential mixed model in which the curves from multiple subjects were jointly estimated. The tumor-absorbed dose estimates were determined by patient-specific Monte Carlo calculation. Results The mixed model gave realistic tumor time-activity fits that showed the expected uptake and clearance phases even with noisy data or missing time points. Correlation between tracer and therapy tumor-residence times (r=0.98; p<0.0001) and correlation between tracer-predicted and therapy-delivered mean tumor-absorbed doses (r=0.86; p<0.0001) were very high. The predicted and delivered absorbed doses were within ±25% (or within ±75 cGy) for 80% of tumors. Conclusions The mixed-model approach is feasible for fitting tumor time-activity data in RIT treatment planning when individual least-squares fitting is not possible due to inadequate sampling points. The good correlation between predicted and delivered tumor doses demonstrates the potential of using a pretherapy tracer study for tumor dosimetry-based treatment planning in RIT. PMID:22947086
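A biexponential uptake-and-clearance curve of the kind described, A(t) = A0(e^(-λc·t) - e^(-λu·t)), has a closed-form time integral, which is what a residence-time estimate ultimately needs. The sketch below checks the analytic integral against a numerical one; all parameter values are arbitrary illustrations, not the study's fitted values:

```python
import math

def biexp(t, a0, lam_clear, lam_uptake):
    """Biexponential time-activity model: fast uptake, slow clearance."""
    return a0 * (math.exp(-lam_clear * t) - math.exp(-lam_uptake * t))

def integral_analytic(a0, lam_clear, lam_uptake):
    """Closed form of the integral of biexp over t in [0, infinity)."""
    return a0 * (1.0 / lam_clear - 1.0 / lam_uptake)

# Arbitrary illustrative parameters (activity units x hours).
a0, lc, lu = 100.0, 0.01, 0.5

# Trapezoidal check over a long horizon.
dt, horizon = 0.01, 2000.0
steps = int(horizon / dt)
num = sum(0.5 * (biexp(i * dt, a0, lc, lu) + biexp((i + 1) * dt, a0, lc, lu)) * dt
          for i in range(steps))
print(round(integral_analytic(a0, lc, lu), 6))  # 9800.0
print(round(num, 1))                            # 9800.0
```

Jointly estimating λ parameters across subjects, as the mixed model does, stabilizes exactly this integral when one tumor's own time points are too few for an individual fit.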

  15. Clinical Prediction of Functional Outcome after Ischemic Stroke: The Surprising Importance of Periventricular White Matter Disease and Race

    PubMed Central

    Kissela, Brett; Lindsell, Christopher J.; Kleindorfer, Dawn; Alwell, Kathleen; Moomaw, Charles J.; Woo, Daniel; Flaherty, Matthew L.; Air, Ellen; Broderick, Joseph; Tsevat, Joel

    2009-01-01

Background We sought to build models that address questions of interest to patients and families by predicting short- and long-term mortality and functional outcome after ischemic stroke, while allowing for risk re-stratification as comorbid events accumulate. Methods A cohort of 451 ischemic stroke subjects in 1999 were interviewed during hospitalization, at 3 months, and at approximately 4 years. Medical records from the acute hospitalization were abstracted. All hospitalizations for 3 months post-stroke were reviewed to ascertain medical and psychiatric comorbidities, which were categorized for analysis. Multivariable models were derived to predict mortality and functional outcome (modified Rankin Scale) at 3 months and 4 years. Comorbidities were included as modifiers of the 3 month models, and included in 4-year predictions. Results Post-stroke medical and psychiatric comorbidities significantly increased short term post-stroke mortality and morbidity. Severe periventricular white matter disease (PVWMD) was significantly associated with poor functional outcome at 3 months, independent of other factors, such as diabetes and age; inclusion of this imaging variable eliminated other traditional risk factors often found in stroke outcomes models. Outcome at 3 months was a significant predictor of long-term mortality and functional outcome. Black race was a predictor of 4-year mortality. Conclusions We propose that predictive models for stroke outcome, as well as analysis of clinical trials, should include adjustment for comorbid conditions. The effects of PVWMD on short-term functional outcomes and black race on long-term mortality are findings that require confirmation. PMID:19109548

  16. Use of infrared imaging to predict the developmental stages and differences in chicken embryos exposed to different photoperiods

    NASA Astrophysics Data System (ADS)

    Frederick, Rebecca A.; Hsieh, Sheng-Jen; Palomares, Benjamin Giron

    2012-06-01

    Monitoring development of chicken embryos allows determination of when an egg is not developing and when eggs are close to hatching for more efficient production. Research has been conducted on the effects of temperature fluctuations and light exposure on embryo development; similarities between chicken and mammal embryos; and the use of MRI, tomography, and ultrasound to view specific areas and processes within the embryo. However, there has been little exploration of the use of infrared imaging as a non-destructive method for analyzing and predicting embryonic development. In this study, we built an automated loading system for image acquisition. Pilot experiments were conducted to determine the overall scanning time and scanning frequency. A batch of fertilized eggs was scanned each day as the embryos continued to grow. The captured images were analyzed and categorized into three stages: Stage 1 (days 1 to 7), Stage 2 (days 8 to 14), and Stage 3 (days 15 to 21). The temperature data abstracted from the captured images were divided into two groups. Group 1, consisting of two-thirds of the data, was used to construct a model. Group 2, consisting of one-third of the data, was used to evaluate the predictive accuracy of the model. A three-layer artificial neural network model was developed to predict embryo development stage given a temperature profile. Results suggest that the neural network model is sufficient to predict embryo development stage with good accuracy of 75%. Accuracy can likely be improved if more data sets for each development stage are available.
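A three-layer feed-forward network of the kind described maps a fixed-length temperature profile to probabilities over the three development stages; a minimal forward-pass sketch with made-up weights (illustrative only, not the paper's trained model):

```python
import math

def forward(profile, w_hidden, b_hidden, w_out, b_out):
    """Input layer -> one tanh hidden layer -> softmax over 3 stages."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, profile)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    logits = [sum(w * h for w, h in zip(row, hidden)) + b
              for row, b in zip(w_out, b_out)]
    m = max(logits)                          # subtract max for stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up weights: 4 temperature readings -> 2 hidden units -> 3 stages.
w_hidden = [[0.5, -0.2, 0.1, 0.3], [-0.4, 0.6, 0.2, -0.1]]
b_hidden = [0.0, 0.1]
w_out = [[1.0, -1.0], [0.5, 0.5], [-1.0, 1.0]]
b_out = [0.0, 0.0, 0.0]

probs = forward([37.6, 37.8, 38.0, 38.1], w_hidden, b_hidden, w_out, b_out)
print(round(sum(probs), 6))  # 1.0
```

Training the weights on Group 1 and scoring accuracy on the held-out Group 2, as the paper does, would sit on top of a forward pass like this one.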

  17. Great interactions: How binding incorrect partners can teach us about protein recognition and function

    PubMed Central

    Vamparys, Lydie; Laurent, Benoist; Carbone, Alessandra

    2016-01-01

ABSTRACT Protein-protein interactions play a key part in most biological processes and understanding their mechanism is a fundamental problem leading to numerous practical applications. The prediction of protein binding sites in particular is of paramount importance since proteins now represent a major class of therapeutic targets. Amongst other methods, docking simulations between two proteins known to interact can be a useful tool for the prediction of likely binding patches on a protein surface. From the analysis of the protein interfaces generated by a massive cross-docking experiment using the 168 proteins of the Docking Benchmark 2.0, where all possible protein pairs, and not only experimental ones, have been docked together, we show that it is also possible to predict a protein's binding residues without having any prior knowledge regarding its potential interaction partners. Evaluating the performance of cross-docking predictions using the area under the specificity-sensitivity ROC curve (AUC) leads to an AUC value of 0.77 for the complete benchmark (compared to the 0.5 AUC value obtained for random predictions). Furthermore, a new clustering analysis performed on the binding patches that are scattered on the protein surface shows that their distribution and growth depend on the protein's functional group. Finally, in several cases, the binding-site predictions resulting from the cross-docking simulations lead to the identification of an alternate interface, which corresponds to the interaction with a biomolecular partner that is not included in the original benchmark. Proteins 2016; 84:1408–1421. © 2016 The Authors Proteins: Structure, Function, and Bioinformatics Published by Wiley Periodicals, Inc. PMID:27287388

  18. USE OF SCORE AND CEREBROSPINAL FLUID LACTATE DOSAGE IN DIFFERENTIAL DIAGNOSIS OF BACTERIAL AND ASEPTIC MENINGITIS

    PubMed Central

    Pires, Frederico Ribeiro; Franco, Andréia Christine Bonotto Farias; Gilio, Alfredo Elias; Troster, Eduardo Juan

    2017-01-01

ABSTRACT Objective: To evaluate the Bacterial Meningitis Score (BMS) on its own and in association with Cerebrospinal Fluid (CSF) lactate dosage in order to distinguish bacterial from aseptic meningitis. Methods: Children diagnosed with meningitis at a tertiary hospital between January/2011 and December/2014 were selected. All data were obtained upon admission. BMS was applied and included: CSF Gram staining (2 points); CSF neutrophil count ≥1,000 cells/mm3 (1 point); CSF protein ≥80 mg/dL (1 point); peripheral blood neutrophil count ≥10,000 cells/mm3 (1 point) and seizures upon/before arrival (1 point). Cutoff value for CSF lactate was ≥30 mg/dL. Sensitivity, specificity and negative predictive value of several BMS cutoffs and BMS associated with high CSF lactate were evaluated for prediction of bacterial meningitis. Results: Among 439 eligible patients, 94 did not have all data available to complete the score, and 345 patients were included: 7 in the bacterial meningitis group and 338 in the aseptic meningitis group. As predictive factors of bacterial meningitis, BMS ≥1 had 100% sensitivity (95%CI 47.3-100), 64.2% specificity (58.8-100) and 100% negative predictive value (97.5-100); BMS ≥2, or BMS ≥1 associated with high CSF lactate, also showed 100% sensitivity (47.3-100), but with 98.5% specificity (96.6-99.5) and 100% negative predictive value (98.3-100). Conclusions: A 2-point BMS in association with CSF lactate dosage had the same sensitivity and negative predictive value, with increased specificity for diagnosis of bacterial meningitis, when compared with a 1-point BMS. PMID:29185620
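The scoring rule quoted in this abstract is explicit enough to express directly; a sketch of the BMS as described (parameter names are my own):

```python
def bacterial_meningitis_score(csf_gram_positive, csf_neutrophils,
                               csf_protein, blood_neutrophils, seizures):
    """BMS as described: Gram stain 2 points, other criteria 1 point each."""
    score = 0
    if csf_gram_positive:
        score += 2
    if csf_neutrophils >= 1000:     # cells/mm^3
        score += 1
    if csf_protein >= 80:           # mg/dL
        score += 1
    if blood_neutrophils >= 10000:  # cells/mm^3
        score += 1
    if seizures:                    # upon/before arrival
        score += 1
    return score

# In the study, a cutoff of BMS >= 1 already gave 100% sensitivity.
print(bacterial_meningitis_score(False, 200, 40, 8000, False))   # 0
print(bacterial_meningitis_score(True, 1500, 90, 12000, False))  # 5
```

A score of 0 corresponds to the rule's negative prediction, which is where the 100% negative predictive value reported above applies.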

  19. Predicting the Individual Risk of Acute Severe Colitis at Diagnosis

    PubMed Central

    Cesarini, Monica; Collins, Gary S.; Rönnblom, Anders; Santos, Antonieta; Wang, Lai Mun; Sjöberg, Daniel; Parkes, Miles; Keshav, Satish

    2017-01-01

    Abstract Background and Aims: Acute severe colitis [ASC] is associated with major morbidity. We aimed to develop and externally validate an index that predicted ASC within 3 years of diagnosis. Methods: The development cohort included patients aged 16–89 years, diagnosed with ulcerative colitis [UC] in Oxford and followed for 3 years. Primary outcome was hospitalization for ASC, excluding patients admitted within 1 month of diagnosis. Multivariable logistic regression examined the adjusted association of seven risk factors with ASC. Backwards elimination produced a parsimonious model that was simplified to create an easy-to-use index. External validation occurred in separate cohorts from Cambridge, UK, and Uppsala, Sweden. Results: The development cohort [Oxford] included 34/111 patients who developed ASC within a median 14 months [range 1–29]. The final model applied the sum of 1 point each for extensive disease, C-reactive protein [CRP] > 10mg/l, or haemoglobin < 12g/dl F or < 14g/dl M at diagnosis, to give a score from 0/3 to 3/3. This predicted a 70% risk of developing ASC within 3 years [score 3/3]. Validation cohorts included different proportions with ASC [Cambridge = 25/96; Uppsala = 18/298]. Of those scoring 3/3 at diagnosis, 18/18 [Cambridge] and 12/13 [Uppsala] subsequently developed ASC. Discriminant ability [c-index, where 1.0 = perfect discrimination] was 0.81 [Oxford], 0.95 [Cambridge], 0.97 [Uppsala]. Internal validation using bootstrapping showed good calibration, with similar predicted risk across all cohorts. A nomogram predicted individual risk. Conclusions: An index applied at diagnosis reliably predicts the risk of ASC within 3 years in different populations. Patients with a score 3/3 at diagnosis may merit early immunomodulator therapy. PMID:27647858
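The 0/3 to 3/3 index described in this abstract can be expressed directly; a sketch (argument names are my own; thresholds as quoted):

```python
def asc_risk_score(extensive_disease, crp_mg_per_l, haemoglobin_g_per_dl, sex):
    """1 point each at diagnosis: extensive disease, CRP > 10 mg/l,
    and haemoglobin < 12 g/dl (female) or < 14 g/dl (male)."""
    hb_cutoff = 12.0 if sex == "F" else 14.0
    return (int(extensive_disease)
            + int(crp_mg_per_l > 10)
            + int(haemoglobin_g_per_dl < hb_cutoff))

# A 3/3 score predicted a 70% risk of ASC within 3 years in the study.
print(asc_risk_score(True, 25.0, 10.5, "F"))  # 3
print(asc_risk_score(False, 5.0, 13.0, "M"))  # 1
```

The clinical appeal of an index like this is that all three inputs are routinely available at diagnosis, so the score costs nothing extra to compute.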

  20. Prototype Abstraction by Monkeys ("Macaca Mulatta")

    ERIC Educational Resources Information Center

    Smith, J. David; Redford, Joshua S.; Haas, Sarah M.

    2008-01-01

    The authors analyze the shape categorization of rhesus monkeys ("Macaca mulatta") and the role of prototype- and exemplar-based comparison processes in monkeys' category learning. Prototype and exemplar theories make contrasting predictions regarding performance on the Posner-Homa dot-distortion categorization task. Prototype theory--which…

  1. High-throughput screening, predictive modeling and computational embryology - Abstract

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  2. Preliminary evaluation of factors associated with premature trial closure and feasibility of accrual benchmarks in phase III oncology trials

    PubMed Central

    Schroen, Anneke T; Petroni, Gina R; Wang, Hongkun; Gray, Robert; Wang, Xiaofei F; Cronin, Walter; Sargent, Daniel J; Benedetti, Jacqueline; Wickerham, Donald L; Djulbegovic, Benjamin; Slingluff, Craig L

    2014-01-01

    Background A major challenge for randomized phase III oncology trials is the frequent low rates of patient enrollment, resulting in high rates of premature closure due to insufficient accrual. Purpose We conducted a pilot study to determine the extent of trial closure due to poor accrual, feasibility of identifying trial factors associated with sufficient accrual, impact of redesign strategies on trial accrual, and accrual benchmarks designating high failure risk in the clinical trials cooperative group (CTCG) setting. Methods A subset of phase III trials opened by five CTCGs between August 1991 and March 2004 was evaluated. Design elements, experimental agents, redesign strategies, and pretrial accrual assessment supporting accrual predictions were abstracted from CTCG documents. Percent actual/predicted accrual rate averaged per month was calculated. Trials were categorized as having sufficient or insufficient accrual based on reason for trial termination. Analyses included univariate and bivariate summaries to identify potential trial factors associated with accrual sufficiency. Results Among 40 trials from one CTCG, 21 (52.5%) trials closed due to insufficient accrual. In 82 trials from five CTCGs, therapeutic trials accrued sufficiently more often than nontherapeutic trials (59% vs 27%, p = 0.05). Trials including pretrial accrual assessment more often achieved sufficient accrual than those without (67% vs 47%, p = 0.08). Fewer exclusion criteria, shorter consent forms, other CTCG participation, and trial design simplicity were not associated with achieving sufficient accrual. Trials accruing at a rate much lower than predicted (<35% actual/predicted accrual rate) were consistently closed due to insufficient accrual. Limitations This trial subset under-represents certain experimental modalities. Data sources do not allow accounting for all factors potentially related to accrual success. Conclusion Trial closure due to insufficient accrual is common. 
Certain trial design factors appear associated with attaining sufficient accrual. Defining accrual benchmarks for early trial termination or redesign is feasible, but better accrual prediction methods are critically needed. Future studies should focus on identifying trial factors that allow more accurate accrual predictions and strategies that can salvage open trials experiencing slow accrual. PMID:20595245
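The accrual benchmark described reduces to a simple ratio check; a sketch (function names are mine; the ~35% threshold is the one reported in the abstract):

```python
def accrual_ratio_percent(actual_per_month, predicted_per_month):
    """Percent of the predicted monthly accrual actually achieved."""
    return 100.0 * actual_per_month / predicted_per_month

def flag_for_closure(actual_per_month, predicted_per_month, threshold=35.0):
    """Trials accruing below ~35% of prediction consistently closed early."""
    return accrual_ratio_percent(actual_per_month, predicted_per_month) < threshold

print(flag_for_closure(3.0, 10.0))  # True  (30% of predicted)
print(flag_for_closure(6.0, 10.0))  # False (60% of predicted)
```

Such a rule is only as good as the accrual prediction in its denominator, which is exactly the limitation the conclusion highlights.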

  3. Annual Quality Assurance Conference Abstracts by Barbara Marshik

    EPA Pesticide Factsheets

25th Annual Quality Assurance Conference. Abstracts: "Material and Process Conditions for Successful Use of Extractive Sampling Techniques and Certification Methods" and "Errors in the Analysis of NMHC and VOCs in CNG-Based Engine Emissions" by Barbara Marshik

  4. Obtaining correct compile results by absorbing mismatches between data types representations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni

Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  5. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-03-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  6. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-11-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.
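The conversion these records describe can be sketched as a recursive walk over one abstract syntax tree, mapping node types through a conversion table and absorbing unmapped types as special error nodes that retain the offending token for unparsing. Everything below is a hypothetical illustration, not the patented implementation:

```python
# Hypothetical AST: (type, children) tuples; leaves have no children.
CONVERSION_TABLE = {          # first-language type -> second-language type
    "Int32": "int",
    "Float64": "double",
    "Utf8String": "std::string",
}

def convert(node):
    """Map a first-language AST into the second language's AST,
    absorbing unknown types as error nodes that keep the token."""
    ntype, children = node
    if ntype in CONVERSION_TABLE:
        return (CONVERSION_TABLE[ntype], [convert(c) for c in children])
    if children:  # structural node: keep its shape, convert children
        return (ntype, [convert(c) for c in children])
    return ("ERROR", [ntype])  # store the unmapped token for unparsing

def unparse(node):
    """Emit tokens; error nodes output the stored token verbatim."""
    ntype, children = node
    if ntype == "ERROR":
        return children[0]
    if not children:
        return ntype
    return ntype + "(" + ", ".join(unparse(c) for c in children) + ")"

tree = ("Decl", [("Int32", []), ("Quaternion", [])])  # Quaternion unmapped
converted = convert(tree)
print(converted)   # ('Decl', [('int', []), ('ERROR', ['Quaternion'])])
print(unparse(converted))  # Decl(int, Quaternion)
```

The point of the error node is visible in the last line: compilation of the mapped portion proceeds, while the unmapped token survives round-tripping instead of aborting the whole conversion.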

  7. A critical assessment of topologically associating domain prediction tools

    PubMed Central

    Dali, Rola

    2017-01-01

    Abstract Topologically associating domains (TADs) have been proposed to be the basic unit of chromosome folding and have been shown to play key roles in genome organization and gene regulation. Several different tools are available for TAD prediction, but their properties have never been thoroughly assessed. In this manuscript, we compare the output of seven different TAD prediction tools on two published Hi-C data sets. TAD predictions varied greatly between tools in number, size distribution and other biological properties. Assessed against a manual annotation of TADs, individual TAD boundary predictions were found to be quite reliable, but their assembly into complete TAD structures was much less so. In addition, many tools were sensitive to sequencing depth and resolution of the interaction frequency matrix. This manuscript provides users and designers of TAD prediction tools with information that will help guide the choice of tools and the interpretation of their predictions. PMID:28334773

  8. Research dissemination: The art of writing an abstract for conferences.

    PubMed

    Coad, Jane; Devitt, Patric

    2006-03-01

This article aims to assist readers with developing an abstract in order to have a paper accepted for presentation at a conference, whether in a poster or an oral format. This is important because, the authors argue, the use of conferences as a method of disseminating research findings and good practice is expanding each year. Drawing on the authors' experiences, both as members of scientific review panels and as submitters of abstracts, the article offers a practical review of what an abstract is and how to get started, and then breaks down in clear sections what reviewers look for in a good abstract. There are also some key points on the actual process of review, which are helpful in understanding what happens to an abstract following submission.

  9. BRENDA in 2013: integrated reactions, kinetic data, enzyme function data, improved disease classification: new options and contents in BRENDA.

    PubMed

    Schomburg, Ida; Chang, Antje; Placzek, Sandra; Söhngen, Carola; Rother, Michael; Lang, Maren; Munaretto, Cornelia; Ulas, Susanne; Stelzer, Michael; Grote, Andreas; Scheer, Maurice; Schomburg, Dietmar

    2013-01-01

The BRENDA (BRaunschweig ENzyme DAtabase) enzyme portal (http://www.brenda-enzymes.org) is the main information system of functional biochemical and molecular enzyme data and provides access to seven interconnected databases. BRENDA contains 2.7 million manually annotated data on enzyme occurrence, function, kinetics and molecular properties. Each entry is connected to a reference and the source organism. Enzyme ligands are stored with their structures and can be accessed via their names, synonyms or via a structure search. FRENDA (Full Reference ENzyme DAta) and AMENDA (Automatic Mining of ENzyme DAta) are based on text mining methods and represent a complete survey of PubMed abstracts with information on enzymes in different organisms, tissues or organelles. The supplemental database DRENDA provides more than 910 000 new EC number-disease relations in more than 510 000 references from automatic search and a classification of enzyme-disease-related information. KENDA (Kinetic ENzyme DAta), a new amendment, extracts and displays kinetic values from PubMed abstracts. The integration of the EnzymeDetector offers an automatic comparison, evaluation and prediction of enzyme function annotations for prokaryotic genomes. The biochemical reaction database BKM-react contains non-redundant enzyme-catalysed and spontaneous reactions and was developed to facilitate and accelerate the construction of biochemical models.

  10. A Review of Auditory Prediction and Its Potential Role in Tinnitus Perception.

    PubMed

    Durai, Mithila; O'Keeffe, Mary G; Searchfield, Grant D

    2018-06-01

The precise mechanisms underlying tinnitus perception and distress are still not fully understood. A recent proposition is that auditory prediction errors and related memory representations may play a role in driving tinnitus perception. It is of interest to further explore this. To obtain a comprehensive narrative synthesis of current research in relation to auditory prediction and its potential role in tinnitus perception and severity, a narrative review methodological framework was followed. The key words Prediction Auditory, Memory Prediction Auditory, Tinnitus AND Memory, Tinnitus AND Prediction in Article Title, Abstract, and Keywords were extensively searched on four databases: PubMed, Scopus, SpringerLink, and PsychINFO. All study types from 2000-2016 (end of 2016) were selected, with the following exclusion criteria applied: minimum age of participants <18, nonhuman participants, and article not available in English. Reference lists of articles were reviewed to identify any further relevant studies. Articles were shortlisted based on title relevance. After reading the abstracts and with consensus made between coauthors, a total of 114 studies were selected for charting data. The hierarchical predictive coding model based on the Bayesian brain hypothesis, attentional modulation and top-down feedback serves as the fundamental framework in current literature for how auditory prediction may occur. Predictions are integral to speech and music processing, as well as to sequential processing and identification of auditory objects during auditory streaming. Although deviant responses are observable from middle latency time ranges, the mismatch negativity (MMN) waveform is the most commonly studied electrophysiological index of auditory irregularity detection. However, limitations may apply when interpreting findings because of the debatable origin of the MMN and its restricted ability to model real-life, more complex auditory phenomena. 
Cortical oscillatory band activity may act as neurophysiological substrates for auditory prediction. Tinnitus has been modeled as an auditory object which may demonstrate incomplete processing during auditory scene analysis resulting in tinnitus salience and therefore difficulty in habituation. Within the electrophysiological domain, there is currently mixed evidence regarding oscillatory band changes in tinnitus. There are theoretical proposals for a relationship between prediction error and tinnitus but few published empirical studies. American Academy of Audiology.

  11. Interactive and Hands-on Methods for Professional Development of Undergraduate Researchers

    NASA Astrophysics Data System (ADS)

    Pressley, S. N.; LeBeau, J. E.

    2016-12-01

Professional development workshops for undergraduate research programs can cover communicating science (e.g. oral presentations, technical writing, posters), applying for fellowships and scholarships, applying to graduate school, and learning about careers, among other topics. Novel methods of presenting information on the above topics can result in positive outcomes beyond the obvious one of transferring knowledge. Examples of innovative methods to present professional development information include 1) an interactive session on how to write an abstract, where students are given an opportunity to draft an abstract from a short technical article, followed by discussion amongst a group of peers and comparison with the "published" abstract; 2) using the Process Oriented Guided Inquiry Learning (POGIL) method to evaluate and critique a research poster; and 3) inviting "experts", such as a Fulbright scholar graduate student, to present on applying for fellowships and scholarships. These innovative methods of delivery provide more hands-on activities that engage the students and, in some cases (abstract writing), provide practice for the student. The methods also require that students develop teamwork skills, communicate amongst their peers, and develop networks within their cohort. All of these are essential non-technical skills needed for success in any career. Feedback from students on these sessions is positive and, most importantly, the students walk out of the session with a smile on their face saying how much fun it was. Evaluating the impact of these sessions is more challenging and is currently under investigation.

  12. The left inferior frontal gyrus: A neural crossroads between abstract and concrete knowledge.

    PubMed

    Della Rosa, Pasquale Anthony; Catricalà, Eleonora; Canini, Matteo; Vigliocco, Gabriella; Cappa, Stefano F

    2018-07-15

    Evidence from both neuropsychology and neuroimaging suggests that different types of information are necessary for representing and processing concrete and abstract word meanings. Both abstract and concrete concepts, however, conjointly rely on perceptual, verbal and contextual knowledge, with abstract concepts characterized by low values of imageability (IMG) (low sensory-motor grounding) and low context availability (CA) (more difficult to contextualize). Imaging studies supporting differences between abstract and concrete concepts show a greater recruitment of the left inferior frontal gyrus (LIFG) for abstract concepts, which has been attributed either to the representation of abstract-specific semantic knowledge or to the request for more executive control than in the case of concrete concepts. We conducted an fMRI study on 27 participants, using a lexical decision task involving both abstract and concrete words, whose IMG and CA values were explicitly modelled in separate parametric analyses. The LIFG was significantly more activated for abstract than for concrete words, and a conjunction analysis showed a common activation for words with low IMG or low CA only in the LIFG, in the same area reported for abstract words. A regional template map of brain activations was then traced for words with low IMG or low CA, and BOLD regional time-series were extracted and correlated with the specific LIFG neural activity elicited for abstract words. The regions associated to low IMG, which were functionally correlated with LIFG, were mainly in the left hemisphere, while those associated with low CA were in the right hemisphere. Finally, in order to reveal which LIFG-related network increased its connectivity with decreases of IMG or CA, we conducted generalized psychophysiological interaction analyses. 
The connectivity strength values extracted from each region connected with the LIFG were correlated with specific LIFG neural activity for abstract words, and a regression analysis was conducted to highlight which areas recruited by low IMG or low CA predicted the greater activation of the LIFG for abstract concepts. Only the left middle temporal gyrus/angular gyrus, known to be involved in semantic processing, was a significant predictor of LIFG activity differentiating abstract from concrete words. The results show that abstract conceptual processing requires the interplay of multiple brain regions, necessary for both the intrinsic and extrinsic properties of abstract knowledge. The LIFG can thus be identified as the neural crossroads between the different types of information equally necessary for representing, processing and differentiating abstract concepts from concrete ones. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Innovation in the management innovation of planting methods of vegetation in the southwestern edge of Mu Us Sandy land—from a sustainable development view

    NASA Astrophysics Data System (ADS)

    Li, Xing; ZHU, Yan-feng; He, Jianmin; Hou, BingJie

    2017-04-01

    All articles must contain an abstract. The abstract text should be formatted using 10 point Times or Times New Roman and indented 25 mm from the left margin. Leave 10 mm space after the abstract before you begin the main text of your article, starting on the same page as the abstract. The abstract should give readers concise information about the content of the article and indicate the main results obtained and conclusions drawn. The abstract is not part of the text and should be complete in itself; no table numbers, figure numbers, references or displayed mathematical expressions should be included. It should be suitable for direct inclusion in abstracting services and should not normally exceed 200 words in a single paragraph. Since contemporary information-retrieval systems rely heavily on the content of titles and abstracts to identify relevant articles in literature searches, great care should be taken in constructing both.

  14. A systematic review of models to predict recruitment to multicentre clinical trials.

    PubMed

    Barnard, Katharine D; Dent, Louise; Cook, Andrew

    2010-07-06

Less than one third of publicly funded trials managed to recruit according to their original plan, often resulting in requests for additional funding and/or time extensions. The aim was to identify models which might be useful to a major public funder of randomised controlled trials when estimating likely time requirements for recruiting trial participants. The requirements of a useful model were identified as: usability, grounding in experience, the ability to reflect time trends, accounting for centre recruitment, and contribution to a commissioning decision. A systematic review of English language articles was conducted using MEDLINE and EMBASE. Search terms included: randomised controlled trial, patient, accrual, predict, enroll, models, statistical; Bayes Theorem; Decision Theory; Monte Carlo Method and Poisson. Only studies discussing prediction of recruitment to trials using a modelling approach were included. Information was extracted from articles by one author, and checked by a second, using a pre-defined form. Out of 326 identified abstracts, only 8 met all the inclusion criteria. Across these 8 studies, five major classes of model are discussed: the unconditional model, the conditional model, the Poisson model, Bayesian models and Monte Carlo simulation of Markov models. None of these meet all the pre-identified needs of the funder. To meet the needs of a number of research programmes, a new model is required as a matter of importance. Any model chosen should be validated against both retrospective and prospective data, to ensure the predictions it gives are superior to those currently used.
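Of the five model classes the review names, the Poisson model is the simplest to make concrete. The following is a hedged sketch, with invented accrual rates, assuming each centre recruits independently at a constant rate:

```python
import math

def expected_completion_days(target_n, centres, rate_per_centre_per_day):
    """Expected days to reach target_n under a homogeneous Poisson model."""
    return target_n / (centres * rate_per_centre_per_day)

def prob_target_met_by(target_n, centres, rate_per_centre_per_day, days):
    """P(at least target_n recruited by `days`): Poisson survival function."""
    lam = centres * rate_per_centre_per_day * days   # expected total accrual
    cdf = sum(math.exp(-lam) * lam ** k / math.factorial(k)
              for k in range(target_n))              # P(N <= target_n - 1)
    return 1.0 - cdf
```

On its own, such a model fails several of the funder's stated requirements (for instance, a constant rate cannot reflect time trends), which is consistent with the review's conclusion that a new model is needed.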

  15. CA19-9 serum levels predict micrometastases in patients with gastric cancer

    PubMed Central

    Potrc, Stojan; Mis, Katarina; Plankl, Mojca; Mars, Tomaz

    2016-01-01

    Abstract Background We explored the prognostic value of the up-regulated carbohydrate antigen (CA19-9) in node-negative patients with gastric cancer as a surrogate marker for micrometastases. Patients and methods Micrometastases were determined using reverse transcription quantitative polymerase chain reaction (RT-qPCR) for a subgroup of 30 node-negative patients. This group was used to determine the cut-off for preoperative CA19-9 serum levels as a surrogate marker for micrometastases. Then 187 node-negative T1 to T4 patients were selected to validate the predictive value of this CA19-9 threshold. Results Patients with micrometastases had significantly higher preoperative CA19-9 serum levels compared to patients without micrometastases (p = 0.046). CA19-9 serum levels were significantly correlated with tumour site, tumour diameter, and perineural invasion. Although not reaching significance, subgroup analysis showed better five-year survival rates for patients with CA19-9 serum levels below the threshold, compared to patients with CA19-9 serum levels above the cut-off. The cumulative survival for T2 to T4 node-negative patients was significantly better with CA19-9 serum levels below the cut-off (p = 0.04). Conclusions Preoperative CA19-9 serum levels can be used to predict higher risk for haematogenous spread and micrometastases in node-negative patients. However, CA19-9 serum levels lack the necessary sensitivity and specificity to reliably predict micrometastases. PMID:27247553

  16. Automated Assume-Guarantee Reasoning by Abstraction Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2008-01-01

Current automated approaches for compositional model checking in the assume-guarantee style are based on learning assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative, and not necessarily deterministic, abstractions of some of the components, and refines those abstractions using counterexamples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves performance similar to, or better than, a previous learning-based implementation.
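The refinement loop this abstract describes can be caricatured with components modelled as finite sets of traces. This is a toy stand-in for the automata-based algorithm, with all names invented:

```python
# Toy assume-guarantee loop: start from the coarsest abstraction of M2 (the
# whole trace universe), model-check M1 against it, and refine on spurious
# counterexamples. Real implementations work on automata, not trace sets.

def find_counterexample(m1, assumption, bad):
    """A trace of M1 composed with the assumption that violates P, if any."""
    for trace in sorted(m1 & assumption):
        if trace in bad:
            return trace
    return None

def ag_verify(m1, m2, bad, universe):
    assumption = set(universe)        # conservative: contains every M2 trace
    while True:
        cex = find_counterexample(m1, assumption, bad)
        if cex is None:
            return True               # M1 || A satisfies P and M2 is within A
        if cex in m2:
            return False              # counterexample is real in M1 || M2
        assumption.discard(cex)       # spurious: refine the abstraction
```

Because only traces outside M2 are ever discarded, the assumption stays conservative (it always contains M2), which is what makes the assume-guarantee conclusion sound in this toy setting.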

  17. Value of Impedance Cardiography during 6‐Minute Walk Test in Pulmonary Hypertension

    PubMed Central

    Alkukhun, Laith; Arelli, Vineesha; Ramos, José; Newman, Jennie; McCarthy, Kevin; Pichurko, Bohdan; Minai, Omar A.; Dweik, Raed A.

    2013-01-01

    Abstract Background Methods that predict prognosis and response to therapy in pulmonary hypertension (PH) are lacking. We tested whether the noninvasive estimation of hemodynamic parameters during 6‐minute walk test (6MWT) in PH patients provides information that can improve the value of the test. Methods We estimated hemodynamic parameters during the 6MWT using a portable, signal‐morphology‐based, impedance cardiograph (PhysioFlow Enduro) with real‐time wireless monitoring via a bluetooth USB adapter. Results We recruited 48 subjects in the study (30 with PH and 18 healthy controls). PH patients had significantly lower maximum stroke volume (SV) and CI and slower cardiac output (CO) acceleration and decelerations slopes during the test when compared with healthy controls. In PH patients, CI change was associated with total distance walked (R = 0.62; P < 0.001) and percentage of predicted (R = 0.4, P = 0.03), HR recovery at 1 minute (0.57, P < 0.001), 2 minutes (0.65, P < 0.001), and 3 minutes (0.66, P < 0.001). Interestingly, in PH patients CO change during the test was predominantly related to an increase in SV instead of HR. Conclusions Estimation of hemodynamic parameters such as cardiac index during 6‐minute walk test is feasible and may provide useful information in patients with PH. Clin Trans Sci 2013; Volume #: 1–7 PMID:24330692

  18. MONTE CARLO METHODS. A Bibliography covering the Period 1949 to June 1961

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraft, R.; Wensrich, C.J.

    1961-09-11

    A partially annotated bibliography is presented containing 508 references to Monte Carlo methods, covering the period from 1947 to June 1961. The references are arranged alphabetically by author. The sources consulted include: Abstracts of Classified Reports; Applied Science and Technology Index; Armed Services Technical Information Agency; Bibliographic Index; Bibliographie der Fremsprachigen Zeitschrifften Literatur; Mathematical Reviews; Nuclear Science Abstracts; and Operations Research, an Annotated Bibliography. (T.F.H.)

  19. Kinetics of Hydrogen Abstraction and Addition Reactions of 3-Hexene by ȮH Radicals.

    PubMed

    Yang, Feiyu; Deng, Fuquan; Pan, Youshun; Zhang, Yingjia; Tang, Chenglong; Huang, Zuohua

    2017-03-09

Rate coefficients of H atom abstraction and H atom addition reactions of 3-hexene by hydroxyl radicals were determined using both conventional transition-state theory and canonical variational transition-state theory, with the potential energy surface (PES) evaluated at the CCSD(T)/CBS//BHandHLYP/6-311G(d,p) level and quantum mechanical effects corrected by compound methods including the one-dimensional Wigner method, the multidimensional zero-curvature tunneling method, and the small-curvature tunneling method. Results reveal that approximately 70% of the overall H atom abstractions occur at the allylic site, via both direct and indirect channels. The indirect channel, which contains two van der Waals prereactive complexes, exhibits a rate coefficient two times larger than that of the direct one. The OH addition reaction also involves two van der Waals complexes, and its submerged barrier results in negative temperature coefficient behavior at low temperatures. The OH addition pathway dominates only at temperatures below 450 K, whereas the H atom abstraction reactions dominate overwhelmingly at temperatures above 1000 K. All of the rate coefficients, calculated with an uncertainty of a factor of 5, were fitted to a quasi-Arrhenius formula. Analyses of the PES, the minimum reaction path and the Gibbs free energy of activation were also performed in this study.
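The quasi-Arrhenius (modified Arrhenius) form typically used for such fits is k(T) = A T^n exp(-Ea/RT). The sketch below uses invented parameters, not the paper's fitted values, to show how a submerged barrier (a negative effective activation energy) produces the negative temperature coefficient behavior described above:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def k_mod_arrhenius(T, A, n, Ea):
    """Three-parameter modified Arrhenius fit: k(T) = A * T**n * exp(-Ea/(R*T))."""
    return A * T ** n * math.exp(-Ea / (R * T))

# Invented parameters: a negative Ea makes k fall as T rises, mimicking the
# low-temperature behavior reported for the OH addition channel.
k_300 = k_mod_arrhenius(300.0, 1.0e-12, 0.0, -4000.0)
k_600 = k_mod_arrhenius(600.0, 1.0e-12, 0.0, -4000.0)
```

With positive Ea and nonzero n the same function reproduces ordinary Arrhenius-like growth with temperature, which is why a single three-parameter form can fit both regimes.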

  20. Exploring students' perceptions and performance on predict-observe-explain tasks in high school chemistry laboratory

    NASA Astrophysics Data System (ADS)

    Vadapally, Praveen

This study sought to understand the impact of gender and reasoning level on students' perceptions and performances of Predict-Observe-Explain (POE) laboratory tasks in a high school chemistry laboratory. Several literature reviews have reported that students at all levels have not developed the specific knowledge and skills that were expected from their laboratory work. Studies conducted over the last several decades have found that boys tend to be more successful than girls in science and mathematics courses. However, some recent studies have suggested that girls may be reducing this gender gap. This gender difference is the focal point of this research study, which was conducted at a mid-western, rural high school. The participants were 24 boys and 25 girls enrolled in two physical science classes taught by the same teacher. In this mixed methods study, qualitative and quantitative methods were implemented simultaneously over the entire period of the study. MANOVA statistics revealed significant effects due to gender and level of reasoning on the outcome variables, which were POE performances and perceptions of the chemistry laboratory environment. There were no significant interactions between these effects. For the qualitative method, IRB-approved information was collected, coded, grouped, and analyzed. This method was used to derive themes from students' responses on questionnaires and semi-structured interviews. Students with different levels of reasoning and gender were interviewed, and many of them expressed positive themes, which was a clear indication that they had enjoyed participating in the POE learning tasks and had developed positive perceptions towards the POE inquiry laboratory learning environment. When students are capable of formal reasoning, they can use an abstract scientific concept effectively and then relate it to the ideas they generate in their minds. 
Thus, instructors should factor the nature of students' thinking abilities into their instructional strategies and strive to create a learning environment where students are engaged in thinking, learning, and acting in meaningful and beneficial ways. POE learning tasks enhance students' laboratory experiences and can help deepen their understanding of the empirical nature of science. Key words: predict observe explain, gender, science laboratory inquiry, reasoning ability, social constructivism, mixed methods.
