Sample records for present quantitative predictions

  1. Toxicity challenges in environmental chemicals: Prediction of human plasma protein binding through quantitative structure-activity relationship (QSAR) models

    EPA Science Inventory

    The present study explores the merit of utilizing available pharmaceutical data to construct a quantitative structure-activity relationship (QSAR) for prediction of the fraction of a chemical unbound to plasma protein (Fub) in environmentally relevant compounds. Independent model...

  2. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes

    PubMed Central

    Zhang, Hong; Pei, Yun

    2016-01-01

    Quantitative prediction of construction noise is crucial for evaluating construction plans and making decisions to address noise levels. Given the limitations of existing methods for measuring or predicting construction noise, particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting construction noise in terms of the equivalent continuous level. Noise-calculating models for synchronization, propagation, and the equivalent continuous level are presented. A simulation framework is proposed that models the noise-affecting factors and calculates the equivalent continuous noise by incorporating these noise-calculating models into the simulation strategy. An application study demonstrates and justifies the proposed simulation method for predicting the equivalent continuous noise during construction. The study contributes a simulation methodology for quantitatively predicting the equivalent continuous noise of construction while accounting for the relevant uncertainties, dynamics, and interactions. PMID:27529266
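The equivalent continuous level referred to in this record is, in effect, an energy average of time-varying sound levels. A minimal sketch, applying the standard Leq definition to discrete samples over equal time intervals (this is the generic acoustics formula, not the paper's simulation framework):

```python
import math

def equivalent_continuous_level(levels_db):
    """Energy-average a series of sound levels (dB), sampled over equal
    time intervals, into a single equivalent continuous level Leq."""
    mean_energy = sum(10 ** (level / 10) for level in levels_db) / len(levels_db)
    return 10 * math.log10(mean_energy)

# A constant 80 dB over the whole period averages to 80 dB
print(round(equivalent_continuous_level([80, 80, 80]), 1))  # 80.0
```

Because the average is taken over sound energy rather than decibel values, brief loud events dominate: a period split evenly between 90 dB and 70 dB yields an Leq of about 87 dB, not 80 dB.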

  3. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes.

    PubMed

    Zhang, Hong; Pei, Yun

    2016-08-12

    Quantitative prediction of construction noise is crucial for evaluating construction plans and making decisions to address noise levels. Given the limitations of existing methods for measuring or predicting construction noise, particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting construction noise in terms of the equivalent continuous level. Noise-calculating models for synchronization, propagation, and the equivalent continuous level are presented. A simulation framework is proposed that models the noise-affecting factors and calculates the equivalent continuous noise by incorporating these noise-calculating models into the simulation strategy. An application study demonstrates and justifies the proposed simulation method for predicting the equivalent continuous noise during construction. The study contributes a simulation methodology for quantitatively predicting the equivalent continuous noise of construction while accounting for the relevant uncertainties, dynamics, and interactions.

  4. Quantitative prediction of solute strengthening in aluminium alloys.

    PubMed

    Leyson, Gerard Paul M; Curtin, William A; Hector, Louis G; Woodward, Christopher F

    2010-09-01

    Despite significant advances in computational materials science, a quantitative, parameter-free prediction of the mechanical properties of alloys has been difficult to achieve from first principles. Here, we present a new analytic theory that, with input from first-principles calculations, is able to predict the strengthening of aluminium by substitutional solute atoms. Solute-dislocation interaction energies in and around the dislocation core are first calculated using density functional theory and a flexible-boundary-condition method. An analytic model for the strength, or stress to move a dislocation, owing to the random field of solutes, is then presented. The theory, which has no adjustable parameters and is extendable to other metallic alloys, predicts both the energy barriers to dislocation motion and the zero-temperature flow stress, allowing for predictions of finite-temperature flow stresses. Quantitative comparisons with experimental flow stresses at temperature T=78 K are made for Al-X alloys (X=Mg, Si, Cu, Cr) and good agreement is obtained.
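The connection between the zero-temperature flow stress, the energy barrier, and finite-temperature predictions can be sketched with the thermal-activation form commonly used in this class of solute-strengthening models. The function and all parameter values below are illustrative assumptions, not values taken from the paper:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def finite_temperature_stress(tau0, delta_eb, temperature,
                              strain_rate=1e-3, ref_rate=1e4):
    """Finite-temperature flow stress from the zero-temperature stress tau0
    (any stress unit) and a characteristic energy barrier delta_eb (eV),
    using a standard thermal-activation form:
        tau(T) = tau0 * [1 - (kT ln(rate0/rate) / dEb)^(2/3)]
    All numerical parameters here are illustrative, not fitted values."""
    x = K_B * temperature * math.log(ref_rate / strain_rate) / delta_eb
    return tau0 * (1.0 - x ** (2.0 / 3.0))

# Thermal activation lowers the stress needed to move dislocations at T = 78 K
print(round(finite_temperature_stress(50.0, 0.5, 78), 1))
```

At T = 0 the expression reduces to the zero-temperature stress, and the predicted stress falls monotonically as temperature rises, which is the qualitative behavior the abstract describes.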

  5. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    PubMed

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  6. [Influence of sample surface roughness on mathematical model of NIR quantitative analysis of wood density].

    PubMed

    Huang, An-Min; Fei, Ben-Hua; Jiang, Ze-Hui; Hse, Chung-Yun

    2007-09-01

    Near infrared spectroscopy is widely used as a quantitative method, and the main multivariate techniques are regression methods used to build prediction models; however, the accuracy of the results can be affected by many factors. In the present paper, the influence of sample surface roughness on the mathematical model of NIR quantitative analysis of wood density was studied. The experiments showed that when the roughness of the predicted samples was consistent with that of the calibration samples, the results were good; otherwise the error was much higher. The roughness-mixed model was more flexible and adaptable to different sample roughness, and its prediction ability was much better than that of the single-roughness model.

  7. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  8. Trainee and Instructor Task Quantification: Development of Quantitative Indices and a Predictive Methodology.

    ERIC Educational Resources Information Center

    Whaton, George R.; And Others

    As the first step in a program to develop quantitative techniques for prescribing the design and use of training systems, the present study attempted to compile an initial set of quantitative indices, to determine whether these indices could be used to describe a sample of trainee tasks and differentiate among them, to develop a predictive…

  9. Quantitative magnetic resonance imaging in traumatic brain injury.

    PubMed

    Bigler, E D

    2001-04-01

    Quantitative neuroimaging has now become a well-established method for analyzing magnetic resonance imaging in traumatic brain injury (TBI). A general review of studies that have examined quantitative changes following TBI is presented. The consensus of quantitative neuroimaging studies is that most brain structures demonstrate changes in volume or surface area after injury. The patterns of atrophy are consistent with the generalized nature of brain injury and diffuse axonal injury. Various clinical caveats are provided, including how quantitative neuroimaging findings can be used clinically and in predicting rehabilitation outcome. The future of quantitative neuroimaging is also discussed.

  10. Does the Social Working Environment Predict Beginning Teachers' Self-Efficacy and Feelings of Depression?

    ERIC Educational Resources Information Center

    Devos, Christelle; Dupriez, Vincent; Paquay, Leopold

    2012-01-01

    We investigate how the social working environment predicts beginning teachers' self-efficacy and feelings of depression. Two quantitative studies are presented. The results show that the goal structure of the school culture (mastery or performance orientation) predicts both outcomes. Frequent collaborative interactions with colleagues are related…

  11. Benchmarking B-Cell Epitope Prediction with Quantitative Dose-Response Data on Antipeptide Antibodies: Towards Novel Pharmaceutical Product Development

    PubMed Central

    Caoili, Salvador Eugenio C.

    2014-01-01

    B-cell epitope prediction can enable novel pharmaceutical product development. However, a mechanistically framed consensus has yet to emerge on benchmarking such prediction, thus presenting an opportunity to establish standards of practice that circumvent epistemic inconsistencies of casting the epitope prediction task as a binary-classification problem. As an alternative to conventional dichotomous qualitative benchmark data, quantitative dose-response data on antibody-mediated biological effects are more meaningful from an information-theoretic perspective in the sense that such effects may be expressed as probabilities (e.g., of functional inhibition by antibody) for which the Shannon information entropy (SIE) can be evaluated as a measure of informativeness. Accordingly, half-maximal biological effects (e.g., at median inhibitory concentrations of antibody) correspond to maximally informative data while undetectable and maximal biological effects correspond to minimally informative data. This applies to benchmarking B-cell epitope prediction for the design of peptide-based immunogens that elicit antipeptide antibodies with functionally relevant cross-reactivity. Presently, the Immune Epitope Database (IEDB) contains relatively few quantitative dose-response data on such cross-reactivity. Only a small fraction of these IEDB data is maximally informative, and many more of them are minimally informative (i.e., with zero SIE). Nevertheless, the numerous qualitative data in IEDB suggest how to overcome the paucity of informative benchmark data. PMID:24949474
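The informativeness argument in this record can be made concrete: for a biological effect expressed as a probability p, the Shannon information entropy peaks at p = 0.5 (a half-maximal effect) and vanishes at p = 0 or p = 1 (undetectable or maximal effects). A minimal sketch of that measure:

```python
import math

def shannon_entropy(p):
    """Shannon information entropy (bits) of a binary outcome with
    probability p, e.g. the probability of functional inhibition
    by antibody at a given dose."""
    if p in (0.0, 1.0):
        return 0.0  # undetectable or maximal effect: minimally informative
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(shannon_entropy(0.5))  # half-maximal effect: maximally informative -> 1.0
print(shannon_entropy(1.0))  # maximal effect: minimally informative -> 0.0
```

This is why, in the benchmarking scheme described, data near median inhibitory concentrations carry the most information, while floor and ceiling effects carry none.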

  12. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel: V. Predictive absorbability models.

    PubMed

    Langenbucher, Frieder

    2007-08-01

    This paper discusses Excel applications related to the prediction of drug absorbability from physicochemical constants. PHDISSOC provides a generalized model for pH profiles of electrolytic dissociation, water solubility, and partition coefficient. SKMODEL predicts drug absorbability based on a log-log plot of water solubility and O/W partitioning, augmented by additional features such as electrolytic dissociation, melting point, and the dose administered. GIABS presents a mechanistic model of g.i. drug absorption. BIODATCO presents a database compiling relevant drug data to be used for quantitative predictions.

  13. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for quantitative prediction. We then consider several feature combination strategies (direct combination and average-scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average-scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make quantitative predictions directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
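The two core ideas in this record, collapsing a side-effect profile into one weighted score and averaging scores predicted from different feature sets, can be sketched as follows. The effect names and weights are entirely illustrative, not the paper's empirical values:

```python
def quantitative_score(side_effect_profile, weights):
    """Sum a drug's binary side-effect profile with per-effect weights
    to obtain a single quantitative risk score."""
    return sum(w for effect, w in weights.items()
               if side_effect_profile.get(effect))

def average_ensemble(scores):
    """Average-scoring ensemble: mean of the scores predicted from
    different feature sets (e.g. substructures, targets, indications)."""
    return sum(scores) / len(scores)

# Illustrative weights: severe effects weigh more than mild ones
weights = {"nausea": 1.0, "headache": 0.5, "arrhythmia": 5.0}
drug = {"nausea": 1, "arrhythmia": 1}

print(quantitative_score(drug, weights))   # 6.0
print(average_ensemble([5.8, 6.4, 6.0]))   # mean of three feature-specific scores
```

The weighted sum makes drugs with a few severe side effects score higher than drugs with many mild ones, which is the sense in which the score "measures danger."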

  14. Quantitative analysis and predictive engineering of self-rolling of nanomembranes under anisotropic mismatch strain

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Song, Pengfei; Meng, Fanchao; Li, Xiao; Liu, Xinyu; Song, Jun

    2017-12-01

    This work presents a quantitative modeling framework for investigating the self-rolling of nanomembranes under different degrees of lattice mismatch strain anisotropy. The effect of transverse mismatch strain on the roll-up direction and curvature has been systematically studied using both analytical modeling and numerical simulations. The bidirectional nature of the self-rolling of nanomembranes and the critical role of transverse strain in the rolling behavior have been demonstrated. Two fabrication strategies, i.e., third-layer deposition and corner geometry engineering, have been proposed to predictively manipulate the bidirectional rolling competition of strained nanomembranes so as to achieve controlled, unidirectional roll-up. For the corner-engineering strategy in particular, microfabrication experiments have been performed to showcase its practical application and effectiveness. Our study offers new mechanistic knowledge towards understanding and predictive engineering of the self-rolling of nanomembranes with improved roll-up yield.

  15. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472

  16. PREDICTING THE ADSORPTION CAPACITY OF ACTIVATED CARBON FOR EMERGING ORGANIC CONTAMINANTS FROM FUNDAMENTAL ADSORBENT AND ADSORBATE PROPERTIES - PRESENTATION

    EPA Science Inventory

    A quantitative structure-property relationship (QSPR) was developed and combined with the Polanyi-Dubinin-Manes model to predict adsorption isotherms of emerging contaminants on activated carbons with a wide range of physico-chemical properties. Affinity coefficients (βl

  17. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  18. Quantitative Estimation of Plasma Free Drug Fraction in Patients With Varying Degrees of Hepatic Impairment: A Methodological Evaluation.

    PubMed

    Li, Guo-Fu; Yu, Guo; Li, Yanfei; Zheng, Yi; Zheng, Qing-Shan; Derendorf, Hartmut

    2018-07-01

    Quantitative prediction of the unbound drug fraction (fu) is essential for scaling pharmacokinetics through physiologically based approaches. However, few attempts have been made to evaluate the projection of fu values under pathological conditions. The primary objective of this study was to predict fu values (n = 105) of 56 compounds, with or without information on the predominant binding protein, in patients with varying degrees of hepatic insufficiency, by accounting for quantitative changes in the molar concentrations of either the major binding protein or albumin plus α1-acid glycoprotein associated with differing levels of hepatic dysfunction. For the purpose of scaling, data on albumin and α1-acid glycoprotein levels in response to differing degrees of hepatic impairment were systematically collected from 919 adult donors. The results of the present study demonstrate for the first time the feasibility of physiologically based scaling of fu in hepatic dysfunction, verified against experimentally measured data for a wide variety of compounds from individuals with varying degrees of hepatic insufficiency. Furthermore, the high level of predictive accuracy indicates that the inter-relation between the severity of hepatic impairment and these plasma protein levels is physiologically accurate. The present study enhances confidence in predicting fu in hepatic insufficiency, particularly for albumin-bound drugs. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
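The protein-concentration scaling this record describes can be illustrated with the standard relation between fraction unbound and binding-protein concentration, assuming a single binding protein and linear (non-saturable) binding. This is a generic sketch of that textbook relation, not the authors' exact implementation:

```python
def scale_fu(fu_healthy, protein_ratio):
    """Scale the fraction unbound measured in healthy subjects to a patient
    with an altered binding-protein concentration, assuming one dominant
    binding protein and linear (non-saturable) binding.

    protein_ratio = patient protein concentration / healthy concentration
    """
    bound_over_free = (1 - fu_healthy) / fu_healthy
    return 1.0 / (1.0 + protein_ratio * bound_over_free)

# A 50% drop in albumin roughly doubles the free fraction of a highly bound drug
print(round(scale_fu(0.05, 0.5), 3))  # 0.095
```

The qualitative behavior matches the study's premise: as hepatic impairment lowers albumin and α1-acid glycoprotein levels, the predicted free fraction rises, most markedly for highly bound drugs.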

  19. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  20. 3D-quantitative structure-activity relationship studies on benzothiadiazepine hydroxamates as inhibitors of tumor necrosis factor-alpha converting enzyme.

    PubMed

    Murumkar, Prashant R; Giridhar, Rajani; Yadav, Mange Ram

    2008-04-01

    A set of 29 benzothiadiazepine hydroxamates with selective tumor necrosis factor-alpha converting enzyme inhibitory activity was used to compare the quality and predictive power of 3D-quantitative structure-activity relationship, comparative molecular field analysis, and comparative molecular similarity indices models for the atom-based, centroid/atom-based, database, and docked conformer-based alignments. Removal of two outliers from the initial training set of molecules improved the predictivity of the models. Among the 3D-quantitative structure-activity relationship models developed using the above four alignments, the database alignment provided the optimal predictive comparative molecular field analysis model for the training set, with cross-validated r(2) (q(2)) = 0.510, non-cross-validated r(2) = 0.972, standard error of estimate (s) = 0.098, and F = 215.44, and the optimal comparative molecular similarity indices model, with cross-validated r(2) (q(2)) = 0.556, non-cross-validated r(2) = 0.946, standard error of estimate (s) = 0.163, and F = 99.785. These models also showed the best test-set prediction for six compounds, with predictive r(2) values of 0.460 and 0.535, respectively. The contour maps obtained from the 3D-quantitative structure-activity relationship studies were appraised for activity trends in the molecules analyzed. The comparative molecular similarity indices models exhibited good external predictivity compared with that of the comparative molecular field analysis models. The data generated from the present study helped us to further design and report some novel and potent tumor necrosis factor-alpha converting enzyme inhibitors.

  21. Results of Instrument Observations and Adaptive Prediction of Thermoabrasion of Banks of the Vilyui Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velikin, S. A.; Sobol', I. S.; Sobol', S. V.

    2013-11-15

    Quantitative data derived from observations of the reformation of the thermoabrasive banks of the Vilyui Reservoir in Yakutia during the service period from 1972 through 2011 are presented, along with results of analytical prediction of bank reformation over the next 20 years for the purpose of monitoring the ecological safety of this water body.

  22. A quantitative model for transforming reflectance spectra into the Munsell color space using cone sensitivity functions and opponent process weights.

    PubMed

    D'Andrade, Roy G; Romney, A Kimball

    2003-05-13

    This article presents a computational model of the process through which the human visual system transforms reflectance spectra into perceptions of color. Using physical reflectance spectra data and standard human cone sensitivity functions we describe the transformations necessary for predicting the location of colors in the Munsell color space. These transformations include quantitative estimates of the opponent process weights needed to transform cone activations into Munsell color space coordinates. Using these opponent process weights, the Munsell position of specific colors can be predicted from their physical spectra with a mean correlation of 0.989.
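The opponent-process step this record describes is a linear transform of cone activations. A schematic sketch with illustrative weights (the matrix below is a common textbook pattern, not the fitted values from the study):

```python
# Illustrative opponent-process weights; NOT the fitted values from the study
OPPONENT_WEIGHTS = {
    "lightness":   (0.6, 0.4, 0.0),    # L + M (achromatic channel)
    "red_green":   (1.0, -1.0, 0.0),   # L - M
    "blue_yellow": (-0.5, -0.5, 1.0),  # S - (L + M)
}

def opponent_channels(cones):
    """Linearly transform cone activations (L, M, S) into opponent channels."""
    l, m, s = cones
    return {name: wl * l + wm * m + ws * s
            for name, (wl, wm, ws) in OPPONENT_WEIGHTS.items()}

# An equal-energy stimulus lands on the achromatic axis:
# both chromatic channels come out zero
print(opponent_channels((1.0, 1.0, 1.0)))
```

Fitting the actual weight values against Munsell coordinates, as the paper does, is what turns this generic structure into a predictive model.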

  23. Quantitation of cholesterol incorporation into extruded lipid bilayers.

    PubMed

    Ibarguren, Maitane; Alonso, Alicia; Tenchov, Boris G; Goñi, Felix M

    2010-09-01

    Cholesterol incorporation into lipid bilayers, in the form of multilamellar vesicles or extruded large unilamellar vesicles, has been quantitated. To this end, the cholesterol contents of bilayers prepared from phospholipid:cholesterol mixtures containing 33-75 mol% cholesterol were measured and compared with the original mixture before lipid hydration. There is a great diversity of cases, but under most conditions the actual cholesterol proportion present in the extruded bilayers is much lower than predicted. A quantitative analysis of the vesicles is thus required before any experimental study is undertaken. 2010 Elsevier B.V. All rights reserved.

  24. A Physiologically Based Pharmacokinetic Model for Pregnant Women to Predict the Pharmacokinetics of Drugs Metabolized Via Several Enzymatic Pathways.

    PubMed

    Dallmann, André; Ince, Ibrahim; Coboeken, Katrin; Eissing, Thomas; Hempel, Georg

    2017-09-18

    Physiologically based pharmacokinetic modeling is considered a valuable tool for predicting pharmacokinetic changes in pregnancy to subsequently guide in-vivo pharmacokinetic trials in pregnant women. The objective of this study was to extend and verify a previously developed physiologically based pharmacokinetic model for pregnant women for the prediction of pharmacokinetics of drugs metabolized via several cytochrome P450 enzymes. Quantitative information on gestation-specific changes in enzyme activity available in the literature was incorporated in a pregnancy physiologically based pharmacokinetic model and the pharmacokinetics of eight drugs metabolized via one or multiple cytochrome P450 enzymes was predicted. The tested drugs were caffeine, midazolam, nifedipine, metoprolol, ondansetron, granisetron, diazepam, and metronidazole. Pharmacokinetic predictions were evaluated by comparison with in-vivo pharmacokinetic data obtained from the literature. The pregnancy physiologically based pharmacokinetic model successfully predicted the pharmacokinetics of all tested drugs. The observed pregnancy-induced pharmacokinetic changes were qualitatively and quantitatively reasonably well predicted for all drugs. Ninety-seven percent of the mean plasma concentrations predicted in pregnant women fell within a twofold error range and 63% within a 1.25-fold error range. For all drugs, the predicted area under the concentration-time curve was within a 1.25-fold error range. The presented pregnancy physiologically based pharmacokinetic model can quantitatively predict the pharmacokinetics of drugs that are metabolized via one or multiple cytochrome P450 enzymes by integrating prior knowledge of the pregnancy-related effect on these enzymes. This pregnancy physiologically based pharmacokinetic model may thus be used to identify potential exposure changes in pregnant women a priori and to eventually support informed decision making when clinical trials are designed in this special population.
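The twofold and 1.25-fold error criteria used to evaluate such predictions are conventionally computed with a symmetric fold error. A minimal sketch with made-up concentration pairs, not the study's data:

```python
def fold_error(predicted, observed):
    """Symmetric fold error: max of pred/obs and obs/pred (always >= 1),
    so over- and under-prediction are penalized equally."""
    ratio = predicted / observed
    return max(ratio, 1.0 / ratio)

def fraction_within(pairs, fold):
    """Fraction of (predicted, observed) pairs within a given fold-error range."""
    return sum(fold_error(p, o) <= fold for p, o in pairs) / len(pairs)

# Illustrative predicted/observed concentration pairs
pairs = [(1.2, 1.0), (0.6, 1.0), (2.5, 1.0), (1.1, 1.0)]
print(fraction_within(pairs, 2.0))   # 0.75
print(fraction_within(pairs, 1.25))  # 0.5
```

Reporting both thresholds, as this record does (97% within twofold, 63% within 1.25-fold), distinguishes broadly acceptable predictions from tightly accurate ones.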

  25. lazar: a modular predictive toxicology framework

    PubMed Central

    Maunz, Andreas; Gütlein, Martin; Rautenberg, Micha; Vorgrimmler, David; Gebele, Denis; Helma, Christoph

    2013-01-01

    lazar (lazy structure–activity relationships) is a modular framework for predictive toxicology. Similar to the read across procedure in toxicological risk assessment, lazar creates local QSAR (quantitative structure–activity relationship) models for each compound to be predicted. Model developers can choose between a large variety of algorithms for descriptor calculation and selection, chemical similarity indices, and model building. This paper presents a high level description of the lazar framework and discusses the performance of example classification and regression models. PMID:23761761

  26. Position of pelvis in the 3rd month of life predicts further motor development.

    PubMed

    Gajewska, Ewa; Sobieska, Magdalena; Moczko, Jerzy

    2018-06-01

    The aim of the study was to select elements of motor skills assessed at 3 months of age that provide the best prediction of motor development at 9 months. In all children, a physiotherapeutic assessment of quantitative and qualitative development at the age of 3 months was performed in the prone and supine positions, using the quantitative and qualitative assessment sheet of motor development presented in previous papers. The neurological examination at the age of 9 months was based on the Denver Development Screening Test II and the evaluation of reflexes, muscle tone (hypotonia and hypertonia), and symmetry. Particular elements of the motor performance assessment were shown to have distinct predictive value for further motor development (as assessed at 9 months), with pelvis position being the strongest predictive element. Irrespective of symptomatic and anamnestic factors, inappropriate motor performance may already be detected in the 3rd month of life and is predictive of further motor development. The assessment of motor performance should be performed in both supine and prone positions. The proper position of the pelvis reflects the proper positioning of the whole spine and ensures proper further motor development. To our knowledge, the presented motor development assessment sheet allows the earliest prediction of motor disturbances. Copyright © 2018 Elsevier B.V. All rights reserved.

  27. Models of volcanic eruption hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wohletz, K.H.

    1992-01-01

    Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water in eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  8. Models of volcanic eruption hazards

    NASA Astrophysics Data System (ADS)

    Wohletz, K. H.

    Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  9. Direct comparison of low- and mid-frequency Raman spectroscopy for quantitative solid-state pharmaceutical analysis.

    PubMed

    Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J

    2018-02-05

    This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.
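
The model comparison above rests on the root-mean-square error of prediction (RMSEP) over a validation set. As a minimal sketch (not the authors' code, and with invented mixture fractions), RMSEP for predicted versus reference compositions can be computed as:

```python
import math

def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction over a validation set."""
    residuals = [(t - p) ** 2 for t, p in zip(y_true, y_pred)]
    return math.sqrt(sum(residuals) / len(residuals))

# Hypothetical reference vs. predicted mixture fractions (%), for illustration only.
reference = [10.0, 25.0, 50.0, 75.0, 90.0]
predicted = [12.0, 24.0, 53.0, 71.0, 92.0]
print(round(rmsep(reference, predicted), 2))
# → 2.61
```

Lower RMSEP values, as reported for the low-frequency region, indicate a better-calibrated quantitative model.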

  10. Preoperative Cerebral Oxygen Extraction Fraction Imaging Generated from 7T MR Quantitative Susceptibility Mapping Predicts Development of Cerebral Hyperperfusion following Carotid Endarterectomy.

    PubMed

    Nomura, J-I; Uwano, I; Sasaki, M; Kudo, K; Yamashita, F; Ito, K; Fujiwara, S; Kobayashi, M; Ogasawara, K

    2017-12-01

    Preoperative hemodynamic impairment in the affected cerebral hemisphere is associated with the development of cerebral hyperperfusion following carotid endarterectomy. Cerebral oxygen extraction fraction images generated from 7T MR quantitative susceptibility mapping correlate with oxygen extraction fraction images on positron-emission tomography. The present study aimed to determine whether preoperative oxygen extraction fraction imaging generated from 7T MR quantitative susceptibility mapping could identify patients at risk for cerebral hyperperfusion following carotid endarterectomy. Seventy-seven patients with unilateral internal carotid artery stenosis (≥70%) underwent preoperative 3D T2*-weighted imaging using a multiple dipole-inversion algorithm with a 7T MR imager. Quantitative susceptibility mapping images were then obtained, and oxygen extraction fraction maps were generated. Quantitative brain perfusion single-photon emission CT was also performed before and immediately after carotid endarterectomy. ROIs were automatically placed in the bilateral middle cerebral artery territories in all images using a 3D stereotactic ROI template, and affected-to-contralateral ratios in the ROIs were calculated on quantitative susceptibility mapping-oxygen extraction fraction images. Ten patients (13%) showed post-carotid endarterectomy hyperperfusion (cerebral blood flow increases of ≥100% compared with preoperative values in the ROIs on brain perfusion SPECT). Multivariate analysis showed that a high quantitative susceptibility mapping-oxygen extraction fraction ratio was significantly associated with the development of post-carotid endarterectomy hyperperfusion (95% confidence interval, 33.5-249.7; P = .002). 
Sensitivity, specificity, and positive- and negative-predictive values of the quantitative susceptibility mapping-oxygen extraction fraction ratio for the prediction of the development of post-carotid endarterectomy hyperperfusion were 90%, 84%, 45%, and 98%, respectively. Preoperative oxygen extraction fraction imaging generated from 7T MR quantitative susceptibility mapping identifies patients at risk for cerebral hyperperfusion following carotid endarterectomy. © 2017 by American Journal of Neuroradiology.
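
The reported diagnostic values follow from a standard 2×2 confusion matrix. The counts below are back-calculated to be consistent with the abstract's figures (77 patients, 10 with hyperperfusion, 90% sensitivity, 84% specificity); they are illustrative, not taken from the paper's tables:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Counts consistent with the reported cohort of 77 patients, 10 with hyperperfusion.
m = diagnostic_metrics(tp=9, fp=11, tn=56, fn=1)
print({k: round(v, 2) for k, v in m.items()})
# → {'sensitivity': 0.9, 'specificity': 0.84, 'ppv': 0.45, 'npv': 0.98}
```

The low PPV alongside a high NPV reflects the low prevalence of post-operative hyperperfusion in the cohort.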

  11. Enzyme clustering accelerates processing of intermediates through metabolic channeling

    PubMed Central

    Castellana, Michele; Wilson, Maxwell Z.; Xu, Yifan; Joshi, Preeti; Cristea, Ileana M.; Rabinowitz, Joshua D.; Gitai, Zemer; Wingreen, Ned S.

    2015-01-01

We present a quantitative model to demonstrate that coclustering multiple enzymes into compact agglomerates accelerates the processing of intermediates, yielding the same efficiency benefits as direct channeling, a well-known mechanism in which intermediates are funneled between enzyme active sites through a physical tunnel. The model predicts the separation and size of coclusters that maximize metabolic efficiency, and this prediction is in agreement with previously reported spacings between coclusters in mammalian cells. For direct validation, we study a metabolic branch point in Escherichia coli and experimentally confirm the model prediction that enzyme agglomerates can accelerate the processing of a shared intermediate by one branch, and thus regulate steady-state flux division. Our studies establish a quantitative framework to understand coclustering-mediated metabolic channeling and its application to both efficiency improvement and metabolic regulation. PMID:25262299

  12. Improving accuracy of genomic prediction in Brangus cattle by adding animals with imputed low-density SNP genotypes.

    PubMed

    Lopes, F B; Wu, X-L; Li, H; Xu, J; Perkins, T; Genho, J; Ferretti, R; Tait, R G; Bauck, S; Rosa, G J M

    2018-02-01

Reliable genomic prediction of breeding values for quantitative traits requires a sufficient number of animals with genotypes and phenotypes in the training set. As of 31 October 2016, there were 3,797 Brangus animals with genotypes and phenotypes. These Brangus animals were genotyped using different commercial SNP chips. Of them, the largest group consisted of 1,535 animals genotyped with the GGP-LDV4 SNP chip. The remaining 2,262 genotypes were imputed to the SNP content of the GGP-LDV4 chip, so that the number of animals available for training the genomic prediction models was more than doubled. The present study showed that pooling animals with either original or imputed 40K SNP genotypes substantially increased genomic prediction accuracies for the ten traits. By supplementing imputed genotypes, the relative gains in genomic prediction accuracy on estimated breeding values (EBV) were from 12.60% to 31.27%, and the relative gains in genomic prediction accuracy on de-regressed EBV were slightly smaller (0.87%-18.75%). The present study also compared the performance of five genomic prediction models and two cross-validation methods. The five genomic models predicted EBV and de-regressed EBV of the ten traits similarly well. Of the two cross-validation methods, leave-one-out cross-validation maximized the number of animals available for training in genomic prediction. Genomic prediction accuracy (GPA) for the ten quantitative traits was validated in 1,106 newly genotyped Brangus animals based on the SNP effects estimated in the previous set of 3,797 Brangus animals, and it was slightly lower than the GPA in the original data. The present study was the first to leverage currently available genotype and phenotype resources in order to harness genomic prediction in Brangus beef cattle. © 2018 Blackwell Verlag GmbH.
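
Leave-one-out cross-validation, which the study found maximizes the number of animals in training, can be sketched as follows. The mean predictor here is a stand-in for the genomic prediction model, and the phenotypes are invented:

```python
def leave_one_out(values, predict):
    """For each record, train on all the other records and predict the held-out one."""
    predictions = []
    for i in range(len(values)):
        training = values[:i] + values[i + 1:]
        predictions.append(predict(training))
    return predictions

def mean_predictor(training):
    # Stand-in for a genomic prediction model fitted on the training animals.
    return sum(training) / len(training)

# Toy phenotypes standing in for EBV of four animals.
phenotypes = [1.0, 2.0, 3.0, 4.0]
loo_predictions = [round(p, 2) for p in leave_one_out(phenotypes, mean_predictor)]
print(loo_predictions)
# → [3.0, 2.67, 2.33, 2.0]
```

With n animals, each fold trains on n−1 records, so no genotyped animal is excluded from model building.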

  13. Simulation of UV atomic radiation for application in exhaust plume spectrometry

    NASA Astrophysics Data System (ADS)

    Wallace, T. L.; Powers, W. T.; Cooper, A. E.

    1993-06-01

Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for chromium and copper at selected transitions, which indicate a strong dependence upon the broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.
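
The strong dependence on the broadening parameter noted above can be illustrated with a normalized Lorentzian (pressure-broadened) line profile; this is a textbook stand-in, not the OPAD self-absorbing model itself:

```python
import math

def lorentzian(nu, nu0, gamma):
    """Normalized Lorentzian line profile; gamma is the half-width at half-maximum."""
    return (gamma / math.pi) / ((nu - nu0) ** 2 + gamma ** 2)

# Peak height scales as 1/(pi*gamma): doubling the broadening halves the line peak
# while the integrated intensity stays fixed at unity.
peak_narrow = lorentzian(0.0, 0.0, 0.05)
peak_broad = lorentzian(0.0, 0.0, 0.10)
print(round(peak_narrow / peak_broad, 2))
# → 2.0
```

Since inferred free-atom densities follow from the line-shape area and peak, an error in gamma propagates directly into the density estimate.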

  14. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
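
The "algorithmic decoupling via dose iterator pattern" mentioned above can be illustrated in outline: analysis algorithms consume dose values through an iterator interface and never touch the underlying dose-grid storage. The classes below are a hypothetical sketch, not the RTToolbox API:

```python
class DoseIterator:
    """Iterates over dose values, hiding how the dose grid is stored."""

    def __init__(self, dose_grid):
        # Flatten a row-major grid; a different backend could stream from disk.
        self._values = [v for row in dose_grid for v in row]

    def __iter__(self):
        return iter(self._values)

def max_dose(dose_iterator):
    """Analysis algorithm written against the iterator, not the grid layout."""
    return max(dose_iterator)

# A 2x3 dose grid in Gy; the algorithm is unaware of its shape or storage.
grid = [[1.2, 3.4, 2.2], [0.8, 4.1, 2.9]]
print(max_dose(DoseIterator(grid)))
# → 4.1
```

Any statistic (mean dose, dose-volume histogram bins) can then be swapped in without changing the data-access code, which is the point of the pattern.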

  15. Quantitative collision induced mass spectrometry of substituted piperazines - A correlative analysis between theory and experiment

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2017-12-01

The present paper deals with the quantitative kinetics and thermodynamics of collision-induced dissociation (CID) reactions of piperazines under different experimental conditions, together with a systematic description of the effect of counter-ions on common MS fragmentation reactions of piperazines and of the intramolecular effect of quaternary cyclization of substituted piperazines to quaternary salts. Quantitative model equations are discussed for the rate constants and free Gibbs energies of a series of m-independent CID fragmentation processes in the gas phase, which have been evidenced experimentally. Both kinetic and thermodynamic parameters are also predicted by computational density functional theory (DFT) and by ab initio static and dynamic methods. The paper also examines quantitatively the validity of the Maxwell-Boltzmann distribution for non-Boltzmann CID processes. The experiments conducted within this framework show excellent correspondence with theoretical quantum chemical modeling. An important property of the presented model equations of reaction kinetics is their applicability to predicting unknown, and assigning known, mass spectrometric (MS) patterns. The nature of the gas-phase (GP) continuum in the CID-MS measurement scheme with an electrospray ionization (ESI) source is discussed by performing parallel computations in the gas phase and in a polar continuum at different temperatures and ionic strengths. The effect of pressure is also presented. The study contributes significantly to the methodological and phenomenological development of CID-MS and its analytical implementation in quantitative and structural analyses, and it demonstrates the prospects of complementary application of experimental CID-MS and computational quantum chemistry in studying chemical reactivity. To a considerable extent, this work underlines the place of computational quantum chemistry in experimental analytical chemistry, in particular for structural analysis.
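
The link between activation free energies and rate constants discussed above is commonly expressed, in transition-state theory, by the Eyring equation k = (k_B·T/h)·exp(−ΔG‡/RT). A small numerical sketch with standard physical constants; the 80 kJ/mol barrier is an arbitrary illustrative value, not one from the paper:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(delta_g_joules_per_mol, temperature_kelvin):
    """Transition-state-theory rate constant from an activation free energy."""
    prefactor = K_B * temperature_kelvin / H
    return prefactor * math.exp(-delta_g_joules_per_mol / (R * temperature_kelvin))

# An arbitrary 80 kJ/mol barrier at 298 K, for illustration only (~0.06 per second).
k = eyring_rate(80e3, 298.0)
print(f"{k:.3e}")
```

Fitting such an expression to measured CID rate constants is one standard way to extract the free Gibbs energies of fragmentation.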

  16. Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.

    PubMed

    Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans

    2017-01-01

The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs and that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as in increasing the human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.

  17. Another look at retroactive and proactive interference: a quantitative analysis of conversion processes.

    PubMed

    Blank, Hartmut

    2005-02-01

    Traditionally, the causes of interference phenomena were sought in "real" or "hard" memory processes such as unlearning, response competition, or inhibition, which serve to reduce the accessibility of target items. I propose an alternative approach which does not deny the influence of such processes but highlights a second, equally important, source of interference-the conversion (Tulving, 1983) of accessible memory information into memory performance. Conversion is conceived as a problem-solving-like activity in which the rememberer tries to find solutions to a memory task. Conversion-based interference effects are traced to different conversion processes in the experimental and control conditions of interference designs. I present a simple theoretical model that quantitatively predicts the resulting amount of interference. In two paired-associate learning experiments using two different types of memory tests, these predictions were corroborated. Relations of the present approach to traditional accounts of interference phenomena and implications for eyewitness testimony are discussed.

  18. Approaches to developing alternative and predictive toxicology based on PBPK/PD and QSAR modeling.

    PubMed Central

    Yang, R S; Thomas, R S; Gustafson, D L; Campain, J; Benjamin, S A; Verhaar, H J; Mumtaz, M M

    1998-01-01

Systematic toxicity testing, using conventional toxicology methodologies, of single chemicals and chemical mixtures is highly impractical because of the immense numbers of chemicals and chemical mixtures involved and the limited scientific resources. Therefore, the development of unconventional, efficient, and predictive toxicology methods is imperative. Using carcinogenicity as an end point, we present approaches for developing predictive tools for toxicologic evaluation of chemicals and chemical mixtures relevant to environmental contamination. Central to the approaches presented is the integration of physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) and quantitative structure-activity relationship (QSAR) modeling with focused mechanistically based experimental toxicology. In this development, molecular and cellular biomarkers critical to the carcinogenesis process are evaluated quantitatively between different chemicals and/or chemical mixtures. Examples presented include the integration of PBPK/PD and QSAR modeling with a time-course medium-term liver foci assay, molecular biology and cell proliferation studies, Fourier transform infrared spectroscopic analyses of DNA changes, and cancer modeling to assess and attempt to predict the carcinogenicity of the series of 12 chlorobenzene isomers. Also presented is an ongoing effort to develop and apply a similar approach to chemical mixtures using in vitro cell culture (Syrian hamster embryo cell transformation assay and human keratinocytes) methodologies and in vivo studies. The promise and pitfalls of these developments are elaborated. When successfully applied, these approaches may greatly reduce animal usage, personnel, resources, and time required to evaluate the carcinogenicity of chemicals and chemical mixtures. PMID:9860897

  19. A life prediction methodology for encapsulated solar cells

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    This paper presents an approach to the development of a life prediction methodology for encapsulated solar cells which are intended to operate for twenty years or more in a terrestrial environment. Such a methodology, or solar cell life prediction model, requires the development of quantitative intermediate relationships between local environmental stress parameters and the basic chemical mechanisms of encapsulant aging leading to solar cell failures. The use of accelerated/abbreviated testing to develop these intermediate relationships and in revealing failure modes is discussed. Current field and demonstration tests of solar cell arrays and the present laboratory tests to qualify solar module designs provide very little data applicable to predicting the long-term performance of encapsulated solar cells. An approach to enhancing the value of such field tests to provide data for life prediction is described.

  20. Growth and yield predictions for upland oak stands. 10 years after initial thinning

    Treesearch

    Martin E. Dale; Martin E. Dale

    1972-01-01

    The purpose of this paper is to furnish part of the needed information, that is, quantitative estimates of growth and yield 10 years after initial thinning of upland oak stands. All estimates are computed from a system of equations. These predictions are presented here in tabular form for convenient visual inspection of growth and yield trends. The tables show growth...

  1. Meat Authentication via Multiple Reaction Monitoring Mass Spectrometry of Myoglobin Peptides.

    PubMed

    Watson, Andrew D; Gunning, Yvonne; Rigby, Neil M; Philo, Mark; Kemsley, E Kate

    2015-10-20

A rapid multiple reaction monitoring (MRM) mass spectrometric method for the detection and relative quantitation of the adulteration of meat with that of an undeclared species is presented. Our approach uses corresponding proteins from the different species under investigation and corresponding peptides from those proteins, or CPCP. Selected peptide markers can be used for species detection. The use of ratios of MRM transition peak areas for corresponding peptides is proposed for relative quantitation. The approach is introduced using myoglobin from four meats: beef, pork, horse and lamb. Focusing in the present work on species identification, we use predictive tools to determine peptide markers that allow the identification of all four meats and the detection of one meat added to another at levels of 1% (w/w). Candidate corresponding peptide pairs to be used for the relative quantification of one meat added to another have been observed. Preliminary quantitation data presented here are encouraging.

  2. QSAR DataBank - an approach for the digital organization and archiving of QSAR model information

    PubMed Central

    2014-01-01

Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships or Quantitative Structure-Property Relationships produce around one thousand scientific publications annually. All the materials and results are mainly communicated using printed media. Printed media in their present form have obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and large bodies of associated numerical chemical data. They are not supportive of secondary information extraction or reuse efforts, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and relationships between them), and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and can be applied to a wide variety of other endpoints. The work is accompanied by an open source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716

  3. A quantitative ELISA procedure for the measurement of membrane-bound platelet-associated IgG (PAIgG).

    PubMed

    Lynch, D M; Lynch, J M; Howe, S E

    1985-03-01

    A quantitative ELISA assay for the measurement of in vivo bound platelet-associated IgG (PAIgG) using intact patient platelets is presented. The assay requires quantitation and standardization of the number of platelets bound to microtiter plate wells and an absorbance curve using quantitated IgG standards. Platelet-bound IgG was measured using an F(ab')2 peroxidase labeled anti-human IgG and o-phenylenediamine dihydrochloride (OPD) as the substrate. Using this assay, PAIgG for normal individuals was 2.8 +/- 1.6 fg/platelet (mean +/- 1 SD; n = 30). Increased levels were found in 28 of 30 patients with clinical autoimmune thrombocytopenia (ATP) with a range of 7.0-80 fg/platelet. Normal PAIgG levels were found in 26 of 30 patients with nonimmune thrombocytopenia. In the sample population studied, the PAIgG assay showed a sensitivity of 93%, specificity of 90%, a positive predictive value of 0.90, and a negative predictive value of 0.93. The procedure is highly reproducible (CV = 6.8%) and useful in evaluating patients with suspected immune mediated thrombocytopenia.

  4. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.

  5. Quantitative Protein Topography Analysis and High-Resolution Structure Prediction Using Hydroxyl Radical Labeling and Tandem-Ion Mass Spectrometry (MS)*

    PubMed Central

    Kaur, Parminder; Kiselar, Janna; Yang, Sichun; Chance, Mark R.

    2015-01-01

Hydroxyl radical footprinting-based MS for protein structure assessment has the goal of understanding ligand-induced conformational changes and macromolecular interactions, for example, protein tertiary and quaternary structure, but the structural resolution provided by typical peptide-level quantification is limiting. In this work, we present experimental strategies using tandem-MS fragmentation to increase the spatial resolution of the technique to the single-residue level to provide a high-precision tool for molecular biophysics research. Overall, in this study we demonstrated an eightfold increase in structural resolution compared with peptide-level assessments. In addition, to provide a quantitative analysis of residue-based solvent accessibility and protein topography as a basis for high-resolution structure prediction, we illustrate strategies of data transformation using the relative reactivity of side chains as a normalization strategy and predict side-chain surface area from the footprinting data. We tested the methods by examination of Ca(2+)-calmodulin, showing highly significant correlations between surface area and side-chain contact predictions for individual side chains and the crystal structure. Tandem-ion-based hydroxyl radical footprinting-MS provides quantitative high-resolution protein topology information in solution that can fill existing gaps in structure determination for large proteins and macromolecular complexes. PMID:25687570

  6. Elucidating dynamic metabolic physiology through network integration of quantitative time-course metabolomics

    DOE PAGES

    Bordbar, Aarash; Yurkovich, James T.; Paglia, Giuseppe; ...

    2017-04-07

In this study, the increasing availability of metabolomics data necessitates novel methods for deeper data analysis and interpretation. We present a flux balance analysis method that allows for the computation of dynamic intracellular metabolic changes at the cellular scale through integration of time-course absolute quantitative metabolomics. This approach, termed “unsteady-state flux balance analysis” (uFBA), is applied to four cellular systems: three dynamic and one steady-state as a negative control. uFBA and FBA predictions are contrasted, and uFBA is found to be more accurate in predicting dynamic metabolic flux states for red blood cells, platelets, and Saccharomyces cerevisiae. Notably, only uFBA predicts that stored red blood cells metabolize TCA intermediates to regenerate important cofactors, such as ATP, NADH, and NADPH. These pathway usage predictions were subsequently validated through 13C isotopic labeling and metabolic flux analysis in stored red blood cells. Utilizing time-course metabolomics data, uFBA provides an accurate method to predict metabolic physiology at the cellular scale for dynamic systems.
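
The key difference between uFBA and conventional FBA, relaxing the steady-state constraint with measured concentration changes, can be sketched on a one-metabolite toy balance. Here a measured dm/dt from time-course data replaces the usual S·v = 0 assumption; this is an illustration of the idea, not the authors' formulation:

```python
def outflux(v_in, dm_dt):
    """Mass balance for one metabolite: dm/dt = v_in - v_out, so v_out = v_in - dm/dt.
    Classic FBA assumes dm/dt = 0; uFBA uses the measured accumulation or drain."""
    return v_in - dm_dt

v_in = 2.0  # hypothetical measured uptake flux, mmol/gDW/h

# Steady-state FBA: the metabolite pool is assumed constant.
print(outflux(v_in, 0.0))    # → 2.0

# Unsteady-state: time-course metabolomics shows the pool draining at 0.5 mmol/gDW/h,
# so more flux leaves through the downstream pathway than steady-state FBA predicts.
print(outflux(v_in, -0.5))   # → 2.5
```

In the full method this relaxation is applied to every measured metabolite inside a linear-programming flux model; the toy balance only shows why the predicted flux states differ.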

  7. Elucidating dynamic metabolic physiology through network integration of quantitative time-course metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bordbar, Aarash; Yurkovich, James T.; Paglia, Giuseppe

In this study, the increasing availability of metabolomics data necessitates novel methods for deeper data analysis and interpretation. We present a flux balance analysis method that allows for the computation of dynamic intracellular metabolic changes at the cellular scale through integration of time-course absolute quantitative metabolomics. This approach, termed “unsteady-state flux balance analysis” (uFBA), is applied to four cellular systems: three dynamic and one steady-state as a negative control. uFBA and FBA predictions are contrasted, and uFBA is found to be more accurate in predicting dynamic metabolic flux states for red blood cells, platelets, and Saccharomyces cerevisiae. Notably, only uFBA predicts that stored red blood cells metabolize TCA intermediates to regenerate important cofactors, such as ATP, NADH, and NADPH. These pathway usage predictions were subsequently validated through 13C isotopic labeling and metabolic flux analysis in stored red blood cells. Utilizing time-course metabolomics data, uFBA provides an accurate method to predict metabolic physiology at the cellular scale for dynamic systems.

  8. Quantitative proteomic view on secreted, cell surface-associated, and cytoplasmic proteins of the methicillin-resistant human pathogen Staphylococcus aureus under iron-limited conditions.

    PubMed

    Hempel, Kristina; Herbst, Florian-Alexander; Moche, Martin; Hecker, Michael; Becher, Dörte

    2011-04-01

Staphylococcus aureus is capable of colonizing and infecting humans through its arsenal of surface-exposed and secreted proteins. Iron-limited conditions in mammalian body fluids serve as a major environmental signal for bacteria to express virulence determinants. Here we present a comprehensive, gel-free, GeLC-MS/MS-based quantitative proteome profiling of S. aureus under this infection-relevant situation. (14)N/(15)N metabolic labeling and three complementary approaches were combined for relative quantitative analyses of surface-associated proteins. The surface-exposed and secreted proteome profiling approaches comprise trypsin shaving, biotinylation, and precipitation of the supernatant. By analysis of the outer subproteome and cytoplasmic protein fractions, 1210 proteins could be identified, including 221 surface-associated proteins. Thus, access was enabled to 70% of the predicted cell wall-associated proteins, 80% of the predicted sortase substrates, two-thirds of the lipoproteins and more than 50% of the secreted and cytoplasmic proteins. Under iron deficiency, 158 surface-associated proteins were quantified. Twenty-nine proteins were found in altered amounts; in particular, surface-exposed proteins were strongly induced, such as the iron-regulated surface determinant proteins IsdA, IsdB, IsdC and IsdD as well as lipid-anchored iron compound-binding proteins. The work presents a subject crucial to understanding S. aureus pathophysiology, using methods that allow quantitative surface proteome profiling.

  9. Influence factors and prediction of stormwater runoff of urban green space in Tianjin, China: laboratory experiment and quantitative theory model.

    PubMed

    Yang, Xu; You, Xue-Yi; Ji, Min; Nima, Ciren

    2013-01-01

    The effects of limiting factors such as rainfall intensity, rainfall duration, grass type and vegetation coverage on the stormwater runoff of urban green space were investigated in Tianjin. A prediction equation for stormwater runoff was established by quantitative theory using lab experimental data from soil columns. It was validated by three field experiments, with relative errors between predicted and measured stormwater runoff of 1.41, 1.52 and 7.35%, respectively. The results implied that the prediction equation could be used to forecast the stormwater runoff of urban green space. Range and variance analysis indicated the order of limiting factors to be rainfall intensity > grass type > rainfall duration > vegetation coverage. The least runoff in the present study was obtained with the combination of rainfall intensity 60.0 mm/h, duration 60.0 min, grass Festuca arundinacea and vegetation coverage 90.0%. When the intensity and duration of rainfall are 60.0 mm/h and 90.0 min, the predicted volumetric runoff coefficient is 0.23 for Festuca arundinacea at 90.0% vegetation coverage. The present approach indicated that green space is an effective means of reducing stormwater runoff; the conclusions apply mainly to Tianjin and semi-arid areas where precipitation falls chiefly in summer with long intervals between rainfall events.

  10. A Quantitative Structure Activity Relationship for acute oral toxicity of pesticides on rats: Validation, domain of application and prediction.

    PubMed

    Hamadache, Mabrouk; Benkortbi, Othmane; Hanini, Salah; Amrane, Abdeltif; Khaouane, Latifa; Si Moussa, Cherif

    2016-02-13

    Quantitative Structure Activity Relationship (QSAR) models are expected to play an important role in the risk assessment of chemicals for humans and the environment. In this study, we developed a validated QSAR model to predict the acute oral toxicity of 329 pesticides to rats, because few QSAR models have been devoted to predicting the Lethal Dose 50 (LD50) of pesticides in rats. This QSAR model is based on 17 molecular descriptors, and is robust, externally predictive and characterized by a good applicability domain. The best results were obtained with a 17/9/1 Artificial Neural Network model trained with the quasi-Newton back propagation (BFGS) algorithm. The prediction accuracy for the external validation set was estimated by Q(2)ext and the root mean square error (RMSE), which are equal to 0.948 and 0.201, respectively. 98.6% of the external validation set was correctly predicted, and the present model proved to be superior to previously published models. Accordingly, the model developed in this study provides excellent predictions and can be used to predict the acute oral toxicity of pesticides, particularly those that have not been tested, as well as new pesticides. Copyright © 2015 Elsevier B.V. All rights reserved.
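    The external-validation statistics quoted above (Q(2)ext and RMSE) are straightforward to compute; the following numpy sketch uses one common definition of Q(2)ext (PRESS relative to the sum of squares about the training-set mean) with invented toy values, not the paper's data:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error of predictions."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def q2_ext(y_true, y_pred, y_train_mean):
    """External predictive squared correlation (one common variant):
    1 - PRESS / total sum of squares about the training-set mean."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    press = np.sum((y_true - y_pred) ** 2)
    tss = np.sum((y_true - y_train_mean) ** 2)
    return float(1.0 - press / tss)

# Illustrative values only
y_obs = [2.1, 3.0, 1.4, 2.7]
y_hat = [2.0, 3.1, 1.5, 2.6]
print(rmse(y_obs, y_hat), q2_ext(y_obs, y_hat, y_train_mean=2.3))
```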

  11. Quantitative Acoustic Model for Adhesion Evaluation of PMMA/Silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A poly(methyl methacrylate) (PMMA) film on a silicon substrate is a key structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorously quantitative evaluation of interfacial weakness. The model is implemented in the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results predict that variations in the SAW velocity and reflectance are sensitive to the adhesion condition. Experimental results by the v(z) technique and SAW velocity reconstruction verify the prediction.

  12. Predicting ESI/MS Signal Change for Anions in Different Solvents.

    PubMed

    Kruve, Anneli; Kaupmees, Karl

    2017-05-02

    LC/ESI/MS is a technique widely used for qualitative and quantitative analysis in various fields. However, quantification is currently possible only for compounds for which standard substances are available, as the ionization efficiency of different compounds in the ESI source differs by orders of magnitude. In this paper we present an approach for quantitative LC/ESI/MS analysis without standard substances. This approach relies on accurately predicting ionization efficiencies in the ESI source based on a model that uses physicochemical parameters of the analytes. Furthermore, the model has been made transferable between different mobile phases and instrument setups by using a suitable set of calibration compounds. This approach has been validated both in flow injection and chromatographic mode with gradient elution.

  13. Use of Artificial Intelligence and Machine Learning Algorithms with Gene Expression Profiling to Predict Recurrent Nonmuscle Invasive Urothelial Carcinoma of the Bladder.

    PubMed

    Bartsch, Georg; Mitra, Anirban P; Mitra, Sheetal A; Almal, Arpit A; Steven, Kenneth E; Skinner, Donald G; Fry, David W; Lenehan, Peter F; Worzel, William P; Cote, Richard J

    2016-02-01

    Due to the high recurrence risk of nonmuscle invasive urothelial carcinoma it is crucial to distinguish patients at high risk from those with indolent disease. In this study we used a machine learning algorithm to identify the genes in patients with nonmuscle invasive urothelial carcinoma at initial presentation that were most predictive of recurrence. We used the genes in a molecular signature to predict recurrence risk within 5 years after transurethral resection of bladder tumor. Whole genome profiling was performed on 112 frozen nonmuscle invasive urothelial carcinoma specimens obtained at first presentation on Human WG-6 BeadChips (Illumina®). A genetic programming algorithm was applied to evolve classifier mathematical models for outcome prediction. Cross-validation based resampling and gene use frequencies were used to identify the most prognostic genes, which were combined into rules used in a voting algorithm to predict the sample target class. Key genes were validated by quantitative polymerase chain reaction. The classifier set included 21 genes that predicted recurrence. Quantitative polymerase chain reaction was done for these genes in a subset of 100 patients. A 5-gene combined rule incorporating a voting algorithm yielded 77% sensitivity and 85% specificity to predict recurrence in the training set, and 69% and 62%, respectively, in the test set. A singular 3-gene rule was constructed that predicted recurrence with 80% sensitivity and 90% specificity in the training set, and 71% and 67%, respectively, in the test set. Using primary nonmuscle invasive urothelial carcinoma from initial occurrences genetic programming identified transcripts in reproducible fashion, which were predictive of recurrence. These findings could potentially impact nonmuscle invasive urothelial carcinoma management. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
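    The reported sensitivity/specificity figures and the voting rule can be illustrated with a small sketch; the confusion-matrix arithmetic is standard, while the rule outputs below are invented placeholders rather than the study's gene rules:

```python
import numpy as np

def sens_spec(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
    Labels: 1 = recurrence, 0 = no recurrence."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tp / (tp + fn), tn / (tn + fp)

def majority_vote(rule_predictions):
    """Combine per-rule 0/1 calls (rows = rules, cols = samples)."""
    votes = np.asarray(rule_predictions)
    return (votes.mean(axis=0) > 0.5).astype(int)

y_true = [1, 1, 0, 0, 1, 0]
rules = [[1, 1, 0, 0, 0, 0],   # hypothetical per-rule calls
         [1, 0, 0, 1, 1, 0],
         [1, 1, 0, 0, 1, 1]]
y_pred = majority_vote(rules)
print(y_pred, sens_spec(y_true, y_pred))
```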

  14. Predicting low-temperature free energy landscapes with flat-histogram Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Mahynski, Nathan A.; Blanco, Marco A.; Errington, Jeffrey R.; Shen, Vincent K.

    2017-02-01

    We present a method for predicting the free energy landscape of fluids at low temperatures from flat-histogram grand canonical Monte Carlo simulations performed at higher ones. We illustrate our approach for both pure and multicomponent systems using two different sampling methods as a demonstration. This allows us to predict the thermodynamic behavior of systems which undergo both first order and continuous phase transitions upon cooling using simulations performed only at higher temperatures. After surveying a variety of different systems, we identify a range of temperature differences over which the extrapolation of high temperature simulations tends to quantitatively predict the thermodynamic properties of fluids at lower ones. Beyond this range, extrapolation still provides a reasonably well-informed estimate of the free energy landscape; this prediction then requires less computational effort to refine with an additional simulation at the desired temperature than reconstruction of the surface without any initial estimate. In either case, this method significantly increases the computational efficiency of these flat-histogram methods when investigating thermodynamic properties of fluids over a wide range of temperatures. For example, we demonstrate how a binary fluid phase diagram may be quantitatively predicted for many temperatures using only information obtained from a single supercritical state.
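    For flat-histogram grand canonical simulations, the macrostate distribution can be reweighted exactly in chemical potential, which illustrates the kind of extrapolation being leveraged; note that the temperature extrapolation described above additionally requires energy moments collected during sampling. A toy sketch (the distribution values are illustrative):

```python
import numpy as np

def reweight_lnpi(lnpi, n, beta, dmu):
    """Exact reweighting of the particle-number macrostate distribution:
    ln Pi'(N) = ln Pi(N) + beta * dmu * N, then renormalize."""
    new = lnpi + beta * dmu * np.asarray(n, float)
    new -= new.max()                      # guard against overflow
    new -= np.log(np.exp(new).sum())      # normalize: sum_N Pi'(N) = 1
    return new

n = np.arange(5)
lnpi = np.log(np.full(5, 0.2))            # flat toy distribution
new = reweight_lnpi(lnpi, n, beta=1.0, dmu=0.1)
print(np.exp(new))                        # weight shifts toward larger N
```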

  15. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent present in the lungs. Several measures have been introduced for quantifying the extent of disease directly from CT data in order to complement the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and the fractal dimension against visual grade in order to evaluate how well radiologist visual scoring of emphysema from low-dose CT scans can be predicted from quantitative scores, and to determine which measures can serve as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole-lung scans. In addition, a visual grade of each section was given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved when using quantitative score to predict visual grade, rising to 73% if mild and moderate cases are considered as a single class.
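    Several of the densitometric measures named above reduce to simple statistics over the Hounsfield-unit values of segmented lung voxels. A sketch with synthetic data, using the common -950 HU threshold for the emphysema index (the study's exact thresholds may differ):

```python
import numpy as np

# Synthetic lung-voxel HU values; real values come from lung segmentation
rng = np.random.default_rng(0)
hu = rng.normal(-870, 60, size=100_000)

emphysema_index = float(np.mean(hu < -950) * 100)  # % voxels below -950 HU
mean_lung_density = float(hu.mean())               # mean HU over the lung
perc15 = float(np.percentile(hu, 15))              # 15th-percentile density

print(emphysema_index, mean_lung_density, perc15)
```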

  16. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-01-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.
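    One standard functional form for cumulative failure data of this kind is the two-parameter Weibull distribution, whose linearization lets the shape and scale be recovered by least squares; this is a generic illustration, not necessarily the authors' chosen form:

```python
import numpy as np

# Synthetic cumulative failure fractions F(n) after n thermal cycles,
# generated from a Weibull with shape m = 2.0 and scale eta = 1000 cycles.
m_true, eta_true = 2.0, 1000.0
n = np.array([200.0, 400.0, 600.0, 800.0, 1000.0])
F = 1.0 - np.exp(-(n / eta_true) ** m_true)

# Linearization: ln(-ln(1 - F)) = m*ln(n) - m*ln(eta)
x, y = np.log(n), np.log(-np.log(1.0 - F))
m_fit, intercept = np.polyfit(x, y, 1)
eta_fit = np.exp(-intercept / m_fit)
print(m_fit, eta_fit)  # recovers the generating parameters
```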

  17. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-11-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.

  18. Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)

    EPA Science Inventory

    High throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...

  19. Exploring simple, transparent, interpretable and predictive QSAR models for classification and quantitative prediction of rat toxicity of ionic liquids using OECD recommended guidelines.

    PubMed

    Das, Rudra Narayan; Roy, Kunal; Popelier, Paul L A

    2015-11-01

    The present study explores the chemical attributes of diverse ionic liquids responsible for their cytotoxicity in a rat leukemia cell line (IPC-81) by developing predictive classification as well as regression-based mathematical models. Simple and interpretable descriptors derived from a two-dimensional representation of the chemical structures along with quantum topological molecular similarity indices have been used for model development, employing unambiguous modeling strategies that strictly obey the guidelines of the Organization for Economic Co-operation and Development (OECD) for quantitative structure-activity relationship (QSAR) analysis. The structure-toxicity relationships that emerged from both classification and regression-based models were in accordance with the findings of some previous studies. The models suggested that the cytotoxicity of ionic liquids is dependent on the cationic surfactant action, long alkyl side chains, cationic lipophilicity as well as aromaticity, the presence of a dialkylamino substituent at the 4-position of the pyridinium nucleus and a bulky anionic moiety. The models have been transparently presented in the form of equations, thus allowing their easy transferability in accordance with the OECD guidelines. The models have also been subjected to rigorous validation tests proving their predictive potential and can hence be used for designing novel and "greener" ionic liquids. The major strength of the present study lies in the use of a diverse and large dataset, use of simple reproducible descriptors and compliance with the OECD norms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brünner, F.; Parganlija, D.; Rebhan, A.

    We present new results on the decay patterns of scalar and tensor glueballs in the top-down holographic Witten-Sakai-Sugimoto model. This model, which has only one free dimensionless parameter, gives semi-quantitative predictions for the vector meson spectrum, their decay widths, and also a gluon condensate in agreement with SVZ sum rules. The holographic predictions for scalar glueball decay rates are compared with experimental data for the widely discussed glueball candidates f0(1500) and f0(1710).

  1. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has focused almost exclusively on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  2. X-Ray and UV Photoelectron Spectroscopy | Materials Science | NREL

    Science.gov Websites

    X-ray photoelectron spectra of a backsheet material show excellent quantitative agreement between measured and predicted peak area ratios. Subtle differences in polymer functionality are assessed by deviations from stoichiometry.

  3. Weather Prediction Center (WPC) Home Page

    Science.gov Websites

    Quantitative precipitation forecasts, winter weather outlook probability grids, and short- and medium-range forecast products, including the Day 1-3 Quantitative Precipitation Forecast discussion, are available from the NWS Weather Prediction Center.

  4. Predicting Admission of Minorities into Medical School.

    ERIC Educational Resources Information Center

    Lynch, Kathleen Bodisch; Woode, Moses K.

    A study identifying the relationships between quantitative academic characteristics--specifically, grade point average (GPA) and MCAT scores--and admission into medical school for minorities is presented. Explanations are proposed for contradictory findings related to this question that have appeared in literature. Data from 58 minority student…

  5. Towards a Quantitative Endogenous Network Theory of Cancer Genesis and Progression: beyond ``cancer as diseases of genome''

    NASA Astrophysics Data System (ADS)

    Ao, Ping

    2011-03-01

    There has been tremendous progress in cancer research. However, it appears that the currently dominant research framework of regarding cancer as a disease of the genome leads to an impasse. Naturally, questions have been asked as to whether it is possible to develop alternative frameworks that connect both to mutations and other genetic/genomic effects and to environmental factors. Furthermore, such a framework should be quantitative, with experimentally testable predictions. In this talk, I will present a positive answer to this call. I will explain our construction of an endogenous network theory based on molecular-cellular agencies as dynamical variables. Such a cancer theory explicitly demonstrates a profound connection to many fundamental concepts in physics, such as stochastic non-equilibrium processes, ``energy'' landscapes, and metastability. It suggests that beneath cancer's daunting complexity may lie a simplicity that gives grounds for hope. The rationale behind the theory, its predictions, and its initial experimental verifications will be presented. Supported by USA NIH and China NSF.

  6. The Adaptation of the Immigrant Second Generation in America: Theoretical Overview and Recent Evidence

    PubMed Central

    Portes, Alejandro; Fernández-Kelly, Patricia; Haller, William

    2013-01-01

    This paper summarises a research program on the new immigrant second generation initiated in the early 1990s and completed in 2006. The four field waves of the Children of Immigrants Longitudinal Study (CILS) are described and the main theoretical models emerging from it are presented and graphically summarised. After considering critical views of this theory, we present the most recent results from this longitudinal research program in the form of quantitative models predicting downward assimilation in early adulthood and qualitative interviews identifying ways to escape it by disadvantaged children of immigrants. Quantitative results strongly support the predicted effects of exogenous variables identified by segmented assimilation theory and identify the intervening factors during adolescence that mediate their influence on adult outcomes. Qualitative evidence gathered during the last stage of the study points to three factors that can lead to exceptional educational achievement among disadvantaged youths. All three indicate the positive influence of selective acculturation. Implications of these findings for theory and policy are discussed. PMID:23626483

  7. Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.

    2017-02-01

    In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.

  8. Quantitative fetal fibronectin and cervical length to predict preterm birth in asymptomatic women with previous cervical surgery.

    PubMed

    Vandermolen, Brooke I; Hezelgrave, Natasha L; Smout, Elizabeth M; Abbott, Danielle S; Seed, Paul T; Shennan, Andrew H

    2016-10-01

    Quantitative fetal fibronectin testing has demonstrated accuracy for prediction of spontaneous preterm birth in asymptomatic women with a history of preterm birth. Predictive accuracy in women with previous cervical surgery (a potentially different risk mechanism) is not known. We sought to compare the predictive accuracy of cervicovaginal fluid quantitative fetal fibronectin and cervical length testing in asymptomatic women with previous cervical surgery to that in women with 1 previous preterm birth. We conducted a prospective blinded secondary analysis of a larger observational study of cervicovaginal fluid quantitative fetal fibronectin concentration in asymptomatic women measured with a Hologic 10Q system (Hologic, Marlborough, MA). Prediction of spontaneous preterm birth (<30, <34, and <37 weeks) with cervicovaginal fluid quantitative fetal fibronectin concentration in primiparous women who had undergone at least 1 invasive cervical procedure (n = 473) was compared with prediction in women who had previous spontaneous preterm birth, preterm prelabor rupture of membranes, or late miscarriage (n = 821). The relationship with cervical length was also explored. The rate of spontaneous preterm birth <34 weeks in the cervical surgery group was 3%, compared with 9% in the previous spontaneous preterm birth group. Receiver operating characteristic curves comparing quantitative fetal fibronectin for prediction at all 3 gestational end points were comparable between the cervical surgery and previous spontaneous preterm birth groups (34 weeks: area under the curve, 0.78 [95% confidence interval 0.64-0.93] vs 0.71 [95% confidence interval 0.64-0.78]; P = .39). Cervical length offered prediction of spontaneous preterm birth <34 weeks of gestation similar to quantitative fetal fibronectin (area under the curve, 0.88 [95% confidence interval 0.79-0.96] vs 0.77 [95% confidence interval 0.62-0.92], P = .12 in the cervical surgery group; and 0.77 [95% confidence interval 0.70-0.84] vs 0.74 [95% confidence interval 0.67-0.81], P = .32 in the previous spontaneous preterm birth group). Prediction of spontaneous preterm birth using cervicovaginal fluid quantitative fetal fibronectin in asymptomatic women with cervical surgery is valid, and has accuracy comparable to that in women with a history of spontaneous preterm birth. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Modeling ready biodegradability of fragrance materials.

    PubMed

    Ceriani, Lidia; Papa, Ester; Kovarich, Simona; Boethling, Robert; Gramatica, Paola

    2015-06-01

    In the present study, quantitative structure activity relationships were developed for predicting ready biodegradability of approximately 200 heterogeneous fragrance materials. Two classification methods, classification and regression tree (CART) and k-nearest neighbors (kNN), were applied to perform the modeling. The models were validated with multiple external prediction sets, and the structural applicability domain was verified by the leverage approach. The best models had good sensitivity (internal ≥80%; external ≥68%), specificity (internal ≥80%; external 73%), and overall accuracy (≥75%). Results from the comparison with BIOWIN global models, based on group contribution method, show that specific models developed in the present study perform better in prediction than BIOWIN6, in particular for the correct classification of not readily biodegradable fragrance materials. © 2015 SETAC.
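    The kNN half of the modeling can be sketched in plain numpy; the descriptors and class labels below are invented placeholders, not the fragrance dataset:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Plain k-nearest-neighbours majority vote (Euclidean distance),
    as a minimal stand-in for the kNN classifier described above."""
    X_train, X_test = np.asarray(X_train, float), np.asarray(X_test, float)
    y_train = np.asarray(y_train)
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)     # distances to training set
        nearest = y_train[np.argsort(d)[:k]]        # labels of k nearest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])       # majority class
    return np.array(preds)

# Toy descriptors: 1 = readily biodegradable, 0 = not (illustrative only)
X = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15], [0.9, 0.8], [0.8, 0.9], [0.85, 0.85]]
y = [1, 1, 1, 0, 0, 0]
print(knn_predict(X, y, [[0.12, 0.18], [0.88, 0.82]], k=3))
```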

  10. Quantitative interpretation of Great Lakes remote sensing data

    NASA Technical Reports Server (NTRS)

    Shook, D. F.; Salzman, J.; Svehla, R. A.; Gedney, R. T.

    1980-01-01

    The paper discusses the quantitative interpretation of Great Lakes remote sensing water quality data. Remote sensing using color information must take into account (1) the existence of many different organic and inorganic species throughout the Great Lakes, (2) the occurrence of a mixture of species in most locations, and (3) spatial variations in types and concentration of species. The radiative transfer model provides a potential method for an orderly analysis of remote sensing data and a physical basis for developing quantitative algorithms. Predictions and field measurements of volume reflectances are presented which show the advantage of using a radiative transfer model. Spectral absorptance and backscattering coefficients for two inorganic sediments are reported.

  11. Quantitative radiomics studies for tissue characterization: a review of technology and methodological procedures.

    PubMed

    Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter

    2017-02-01

    Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.

  12. A Semiquantitative Framework for Gene Regulatory Networks: Increasing the Time and Quantitative Resolution of Boolean Networks

    PubMed Central

    Kerkhofs, Johan; Geris, Liesbet

    2015-01-01

    Boolean models have been instrumental in predicting general features of gene networks and more recently also as explorative tools in specific biological applications. In this study we introduce a basic quantitative and a limited time resolution to a discrete (Boolean) framework. Quantitative resolution is improved through the use of normalized variables combined with an additive approach. Increased time resolution stems from the introduction of two distinct priority classes. Through the implementation of a previously published chondrocyte network and T helper cell network, we show that this addition of quantitative and time resolution broadens the scope of biological behaviour that can be captured by the models. Specifically, the quantitative resolution readily allows models to discern qualitative differences in dosage response to growth factors. The limited time resolution, in turn, can influence the reachability of attractors, delineating the likely long-term system behaviour. Importantly, the information required for implementation of these features, such as the nature of an interaction, is typically obtainable from the literature. Nonetheless, a trade-off is always present between the additional computational cost of this approach and the likelihood of extending the model’s scope. Indeed, in some cases the inclusion of these features does not yield additional insight. This framework, incorporating increased and readily available time and semi-quantitative resolution, can help in substantiating the litmus test of dynamics for gene networks, firstly by excluding unlikely dynamics and secondly by refining falsifiable predictions on qualitative behaviour. PMID:26067297
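    The effect of priority classes can be seen in a toy network: nodes in the fast class update first, and a slow node only updates when no fast node changes. Here the fast pair oscillates, so the slow node never fires, illustrating how priorities can alter which attractors are reachable (the network itself is invented):

```python
# Minimal Boolean network with two priority classes; fast updates pre-empt
# slow ones, a toy analogue of the limited time resolution described above.
def step(state):
    a, b = state["A"], state["B"]
    fast = {"A": b, "B": a}          # fast class: A <- B, B <- A
    slow = {"C": a and b}            # slow class: C <- A AND B
    new = dict(state)
    changed = {k: v for k, v in fast.items() if v != state[k]}
    if changed:
        new.update(changed)          # priority: apply only fast updates
    else:
        new.update(slow)             # slow class fires only when fast is quiet
    return new

state = {"A": True, "B": False, "C": False}
trace = [state]
for _ in range(4):
    trace.append(step(trace[-1]))
print(trace[-1])  # the fast pair cycles with period 2; C never updates
```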

  13. Quantitative self-assembly prediction yields targeted nanomedicines

    NASA Astrophysics Data System (ADS)

    Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.

    2018-02-01

    Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.

  14. Improved cancer risk stratification and diagnosis via quantitative phase microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Uttam, Shikhar; Pham, Hoa V.; Hartman, Douglas J.

    2017-02-01

    Pathology remains the gold standard for cancer diagnosis and, in some cases, prognosis, in which trained pathologists examine abnormalities in tissue architecture and cell morphology characteristic of cancer cells with a bright-field microscope. The limited resolution of a conventional microscope can result in intra-observer variation, missed early-stage cancers, and indeterminate cases that often result in unnecessary invasive procedures in the absence of cancer. Assessment of nanoscale structural characteristics via quantitative phase represents a promising strategy for identifying pre-cancerous or cancerous cells, owing to its nanoscale sensitivity to optical path length, simple sample preparation (i.e., label-free) and low cost. I will present the development of quantitative phase microscopy systems in transmission and reflection configurations to detect structural changes in nuclear architecture that are not easily identifiable by conventional pathology. Specifically, we will present the use of transmission-mode quantitative phase imaging to improve the diagnostic accuracy of urine cytology, in which the nuclear dry mass correlates progressively with negative, atypical, suspicious and positive cytological diagnoses. In a second application, we will present the use of reflection-mode quantitative phase microscopy for depth-resolved nanoscale nuclear architecture mapping (nanoNAM) of clinically prepared formalin-fixed, paraffin-embedded tissue sections. We demonstrated that the quantitative phase microscopy system detects a gradual increase in the density alteration of nuclear architecture during malignant transformation in animal models of colon carcinogenesis and in human patients with ulcerative colitis, even in tissue that appears histologically normal according to pathologists. We evaluated the ability of nanoNAM to predict "future" cancer progression in patients with ulcerative colitis.

  15. A general nonlinear magnetomechanical model for ferromagnetic materials under a constant weak magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Pengpeng; Zheng, Xiaojing, E-mail: xjzheng@xidian.edu.cn; Jin, Ke

    2016-04-14

    Weak magnetic nondestructive testing (e.g., the metal magnetic memory method) concerns the magnetization variation of ferromagnetic materials due to an applied load and the weak magnetic field surrounding them. One key issue in these nondestructive technologies is the magnetomechanical effect, i.e., the quantitative evaluation of the magnetization state from the stress–strain condition. A representative phenomenological model explaining the magnetomechanical effect was proposed by Jiles in 1995. However, Jiles' model has some deficiencies in quantification; for instance, there is a visible difference between theoretical predictions and experimental measurements on the stress–magnetization curve, especially in the compression case. Based on thermodynamic relations and the approach law of irreversible magnetization, a nonlinear coupled model is proposed to improve the quantitative evaluation of the magnetomechanical effect. Excellent agreement has been achieved between the predictions from the present model and previous experimental results. In comparison with Jiles' model, the prediction accuracy is improved greatly by the present model, particularly for the compression case. A detailed study has also been performed to reveal the effects of initial magnetization status, cyclic loading, and the demagnetization factor on the magnetomechanical effect. Our theoretical model reveals that the stable weak magnetic signals of nondestructive testing after multiple cyclic loads are attributed to the first few cycles eliminating most of the irreversible magnetization. Remarkably, the existence of a demagnetization field can weaken the magnetomechanical effect and therefore significantly reduce the testing capability. This theoretical model can be adopted to quantitatively analyze magnetic memory signals and can then be applied in weak magnetic nondestructive testing.

  16. Quantitative structure-activation barrier relationship modeling for Diels-Alder ligations utilizing quantum chemical structural descriptors.

    PubMed

    Nandi, Sisir; Monesi, Alessandro; Drgan, Viktor; Merzel, Franci; Novič, Marjana

    2013-10-30

    In the present study, we show the correlation of quantum chemical structural descriptors with the activation barriers of Diels-Alder ligations. A set of 72 non-catalysed Diels-Alder reactions was subjected to quantitative structure-activation barrier relationship (QSABR) modelling under the framework of theoretical quantum chemical descriptors calculated solely from the structures of the diene and dienophile reactants. Experimental activation barrier data were obtained from the literature. Descriptors were computed at the Hartree-Fock level with the 6-31G(d) basis set as implemented in the Gaussian 09 software. Variable selection and model development were carried out by stepwise multiple linear regression. Predictive performance of the QSABR model was assessed using the training and test set concept and by calculating leave-one-out cross-validated Q2 and predictive R2 values. The QSABR model can explain and predict 86.5% and 80% of the variance, respectively, in the activation energy barrier data. Alternatively, a neural network model based on back-propagation of errors was developed to assess the nonlinearity of the sought correlations between theoretical descriptors and experimental reaction barriers. A reasonable predictability for the activation barriers of the test set reactions was obtained, which enabled an exploration and interpretation of the significant variables responsible for the Diels-Alder interaction between dienes and dienophiles. Thus, studies in the direction of QSABR modelling that provide efficient and fast prediction of activation barriers of Diels-Alder reactions turn out to be a meaningful alternative to transition-state-theory-based computation.
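
    The leave-one-out cross-validated Q2 used to assess such models can be computed as in the sketch below, here with an ordinary least-squares fit on synthetic data (the two descriptors and the barrier values are invented for illustration; this is not the authors' descriptor set).

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 for an ordinary least-squares model:
    refit with each sample held out, predict the held-out sample, and compare
    the pooled squared prediction errors (PRESS) to the variance of y."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        # Fit OLS (with intercept) on all samples except i.
        A = np.column_stack([np.ones(mask.sum()), X[mask]])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        pred = coef[0] + X[i] @ coef[1:]
        press += (y[i] - pred) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

# Synthetic example: a "barrier" that is almost linear in two descriptors.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = 25.0 + 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=40)
q2 = loo_q2(X, y)   # close to 1 for this nearly noise-free relation
```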

  17. Quantitative CT based radiomics as predictor of resectability of pancreatic adenocarcinoma

    NASA Astrophysics Data System (ADS)

    van der Putten, Joost; Zinger, Svitlana; van der Sommen, Fons; de With, Peter H. N.; Prokop, Mathias; Hermans, John

    2018-02-01

    In current clinical practice, the resectability of pancreatic ductal adenocarcinoma (PDA) is determined subjectively by a physician, which is an error-prone procedure. In this paper, we present a method for automated determination of the resectability of PDA from a routine abdominal CT, to reduce such decision errors. The tumor features are extracted from a group of patients with both hypo- and iso-attenuating tumors, of which 29 were resectable and 21 were not. The tumor contours are supplied by a medical expert. We present an approach that uses intensity, shape, and texture features to determine tumor resectability. The best classification results are obtained with a fine Gaussian SVM and the L0 feature selection algorithm. Compared to expert predictions made on the same dataset, our method achieves better classification results. We obtain significantly better results on correctly predicting non-resectability (+17%) compared to an expert, which is essential for patient treatment (negative predictive value). Moreover, our predictions of resectability exceed expert predictions by approximately 3% (positive predictive value).

  18. Quantitative body fluid proteomics in medicine - A focus on minimal invasiveness.

    PubMed

    Csősz, Éva; Kalló, Gergő; Márkus, Bernadett; Deák, Eszter; Csutak, Adrienne; Tőzsér, József

    2017-02-05

    Identification of new biomarkers specific for various pathological conditions is an important field in medical sciences. Body fluids have emerging potential in biomarker studies, especially those which are continuously available and can be collected by non-invasive means. Changes in the protein composition of body fluids such as tears, saliva, sweat, etc. may provide information on both local and systemic conditions of medical relevance. In this review, our aim is to discuss the quantitative proteomics techniques used in biomarker studies, and to present advances in quantitative body fluid proteomics of non-invasively collectable body fluids with relevance to biomarker identification. The advantages and limitations of the widely used quantitative proteomics techniques are also presented. Based on the reviewed literature, we suggest an ideal pipeline for body fluid analyses aiming at biomarker discovery: starting from identification of biomarker candidates by shotgun quantitative proteomics or protein arrays, through verification of potential biomarkers by targeted mass spectrometry, to the antibody-based validation of biomarkers. The importance of body fluids as a rich source of biomarkers is discussed. Quantitative proteomics is a challenging part of proteomics applications. The body fluids collected by non-invasive means have high relevance in medicine; they are good sources for biomarkers used in establishing a diagnosis, following up disease progression and predicting high-risk groups. The review presents the most widely used quantitative proteomics techniques in body fluid analysis and lists the potential biomarkers identified in tears, saliva, sweat, nasal mucus and urine for local and systemic diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Beta Decay and the Origins of Biological Chirality

    NASA Astrophysics Data System (ADS)

    van House, James Christopher

    1984-06-01

    The amino acids and sugars on which terrestrial life is based show maximal optical activity; that is, with rare exceptions, they are composed of D sugars in RNA and DNA and L-amino acids in proteins. Recent quantitative theoretical calculations suggest that the origin of this asymmetry can be causally explained by asymmetric radiolysis of initially racemic mixtures of D and L molecules by the longitudinally polarized electrons emitted in parity-violating nuclear (beta) decay. These same theories predict an asymmetry in the rate of orthopositronium formation, A_ps, when low-energy positron beams with a net helicity form positronium in optically active molecules, and quantitatively connect A_ps to asymmetric radiolysis. This thesis presents the results of a measurement of A_ps in several D, L, and DL amino acids using a polarized low-energy positron beam. Limits of A_ps < 3 x 10^-4 were set on the amino acids leucine, selenocystine, and thyroxine, sufficient to exclude part of the predicted range of A_ps in the last two molecules. These experimental limits improve previous limits on asymmetric radiolysis by a factor of 10^6. A quantitative discussion of the connection between the above limits and the origin of optical activity in living organisms is presented. Details of the development of the high-intensity, highly polarized slow positron beam used in these measurements, and of the first use of remoderation to form a slow positron beam and provide a timing signal for the beam, are presented in the Appendices.

  20. Some Recent Developments in the Endochronic Theory with Application to Cyclic Histories

    NASA Technical Reports Server (NTRS)

    Valanis, K. C.; Lee, C. F.

    1983-01-01

    Constitutive equations with only two easily determined material constants predict the stress (strain) response of normalized mild steel to a variety of general strain (stress) histories, without a need for special unloading-reloading rules. The equations are derived from the endochronic theory of plasticity of isotropic materials with an intrinsic time scale defined in the plastic strain space. Agreement between theoretical predictions and experiments is quantitatively excellent in cases of various uniaxial constant-amplitude histories, variable uniaxial strain-amplitude histories and cyclic relaxation. The cyclic ratcheting phenomenon is predicted by the present theory.

  1. WPC Quantitative Precipitation Forecasts - Day 1

    Science.gov Websites

    Weather Prediction Center, 5830 University Research Court, College Park, Maryland 20740.

  2. The predictive power of Japanese candlestick charting in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Chen, Shi; Bao, Si; Zhou, Yu

    2016-09-01

    This paper studies the predictive power of four popular pairs of two-day bullish and bearish Japanese candlestick patterns in the Chinese stock market. Based on Morris' study, we give the quantitative details of the definition of a long candlestick, which is important in two-day candlestick pattern recognition but was ignored by several previous studies, and we further give quantitative definitions of these four pairs of two-day candlestick patterns. To test the predictive power of candlestick patterns on short-term price movement, we propose the definition of daily average return to alleviate the impact of correlation among stocks' overlap-time returns in statistical tests. To show the robustness of our results, two methods of trend definition are used for both the medium-market-value and large-market-value sample sets. We use the Step-SPA test to correct for data-snooping bias. Statistical results show that the predictive power differs from pattern to pattern: three of the eight patterns provide both short-term and relatively long-term prediction, another pair provides significant forecasting power only within a very short-term period, while the remaining three patterns present contradictory results for different market-value groups. For all four pairs, the predictive power drops as the prediction horizon increases, and forecasting power is stronger for stocks with medium market value than for those with large market value.
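
    A quantitative two-day pattern test of this kind can be sketched as below; the engulfing definition and the long-body cutoff here are illustrative assumptions, not the paper's exact rules.

```python
def is_long_body(o, c, bodies, k=1.0):
    """A candle is 'long' if its real body exceeds k times the mean body
    size over a reference window (an illustrative quantitative cutoff)."""
    return abs(c - o) > k * (sum(bodies) / len(bodies))

def bullish_engulfing(day1, day2, ref_bodies):
    """day = (open, close). Black day-1 body fully engulfed by a long
    white day-2 body."""
    o1, c1 = day1
    o2, c2 = day2
    return (c1 < o1 and c2 > o2
            and o2 <= c1 and c2 >= o1
            and is_long_body(o2, c2, ref_bodies))

ref = [1.0, 1.2, 0.8, 1.0]   # recent real-body sizes
hit = bullish_engulfing((10.5, 10.0), (9.8, 11.0), ref)   # True
```

    Scanning a price series with such a predicate, then averaging returns over the following days, gives the kind of statistical test of predictive power the paper performs.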

  3. RosettaHoles: rapid assessment of protein core packing for structure prediction, refinement, design, and validation.

    PubMed

    Sheffler, Will; Baker, David

    2009-01-01

    We present a novel method called RosettaHoles for visual and quantitative assessment of underpacking in the protein core. RosettaHoles generates a set of spherical cavity balls that fill the empty volume between atoms in the protein interior. For visualization, the cavity balls are aggregated into contiguous overlapping clusters and small cavities are discarded, leaving an uncluttered representation of the unfilled regions of space in a structure. For quantitative analysis, the cavity ball data are used to estimate the probability of observing a given cavity in a high-resolution crystal structure. RosettaHoles provides excellent discrimination between real and computationally generated structures, is predictive of incorrect regions in models, identifies problematic structures in the Protein Data Bank, and promises to be a useful validation tool for newly solved experimental structures.

  4. RosettaHoles: Rapid assessment of protein core packing for structure prediction, refinement, design, and validation

    PubMed Central

    Sheffler, Will; Baker, David

    2009-01-01

    We present a novel method called RosettaHoles for visual and quantitative assessment of underpacking in the protein core. RosettaHoles generates a set of spherical cavity balls that fill the empty volume between atoms in the protein interior. For visualization, the cavity balls are aggregated into contiguous overlapping clusters and small cavities are discarded, leaving an uncluttered representation of the unfilled regions of space in a structure. For quantitative analysis, the cavity ball data are used to estimate the probability of observing a given cavity in a high-resolution crystal structure. RosettaHoles provides excellent discrimination between real and computationally generated structures, is predictive of incorrect regions in models, identifies problematic structures in the Protein Data Bank, and promises to be a useful validation tool for newly solved experimental structures. PMID:19177366

  5. Identification of line-specific strategies for improving carotenoid production in synthetic maize through data-driven mathematical modeling.

    PubMed

    Comas, Jorge; Benfeitas, Rui; Vilaprinyo, Ester; Sorribas, Albert; Solsona, Francesc; Farré, Gemma; Berman, Judit; Zorrilla, Uxue; Capell, Teresa; Sandmann, Gerhard; Zhu, Changfu; Christou, Paul; Alves, Rui

    2016-09-01

    Plant synthetic biology is still in its infancy. However, synthetic biology approaches have been used to manipulate and improve the nutritional and health value of staple food crops such as rice, potato and maize. With current technologies, production yields of the synthetic nutrients are a result of trial and error, and systematic rational strategies to optimize those yields are still lacking. Here, we present a workflow that combines gene expression and quantitative metabolomics with mathematical modeling to identify strategies for increasing production yields of nutritionally important carotenoids in the seed endosperm synthesized through alternative biosynthetic pathways in synthetic lines of white maize, which is normally devoid of carotenoids. Quantitative metabolomics and gene expression data are used to create and fit parameters of mathematical models that are specific to four independent maize lines. Sensitivity analysis and simulation of each model is used to predict which gene activities should be further engineered in order to increase production yields for carotenoid accumulation in each line. Some of these predictions (e.g. increasing Zmlycb/Gllycb will increase accumulated β-carotenes) are valid across the four maize lines and consistent with experimental observations in other systems. Other predictions are line specific. The workflow is adaptable to any other biological system for which appropriate quantitative information is available. Furthermore, we validate some of the predictions using experimental data from additional synthetic maize lines for which no models were developed. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.

  6. Statistical mechanics of ribbons under bending and twisting torques.

    PubMed

    Sinha, Supurna; Samuel, Joseph

    2013-11-20

    We present an analytical study of ribbons subjected to an external torque. We first describe the elastic response of a ribbon within a purely mechanical framework. We then study the role of thermal fluctuations in modifying its elastic response. We predict the moment-angle relation of bent and twisted ribbons. Such a study is expected to shed light on the role of twist in DNA looping and on bending elasticity of twisted graphene ribbons. Our quantitative predictions can be tested against future single molecule experiments.

  7. Physics and chemistry of antimicrobial behavior of ion-exchanged silver in glass.

    PubMed

    Borrelli, N F; Senaratne, W; Wei, Y; Petzold, O

    2015-02-04

    The results of a comprehensive study involving the antimicrobial activity in a silver ion-exchanged glass are presented. The study includes the glass composition, the method of incorporating silver into the glass, the effective concentration of the silver available at the glass surface, and the effect of the ambient environment. A quantitative kinetic model that includes the above factors in predicting the antimicrobial activity is proposed. Finally, experimental data demonstrating antibacterial activity against Staphylococcus aureus with correlation to the predicted model is shown.

  8. Prediction of rain effects on earth-space communication links operating in the 10 to 35 GHz frequency range

    NASA Technical Reports Server (NTRS)

    Stutzman, Warren L.

    1989-01-01

    This paper reviews the effects of precipitation on earth-space communication links operating in the 10 to 35 GHz frequency range. Emphasis is on the quantitative prediction of rain attenuation and depolarization. Discussions center on the models developed at Virginia Tech. Comments on other models are included, as well as literature references to key works. Also included is the system-level modeling for dual-polarized communication systems, with techniques for calculating antenna and propagation-medium effects. Simple models for the calculation of average annual attenuation and cross-polarization discrimination (XPD) are presented. Calculations of worst-month statistics are also presented.

  9. Prediction based active ramp metering control strategy with mobility and safety assessment

    NASA Astrophysics Data System (ADS)

    Fang, Jie; Tu, Lili

    2018-04-01

    Ramp metering (RM) is one of the most direct and efficient motorway traffic flow management measures for improving traffic conditions. However, owing to the lack of traffic-condition prediction in earlier studies, the impact of the applied RM control on traffic flow dynamics was not quantitatively evaluated. In this study, an RM control algorithm adopting the Model Predictive Control (MPC) framework to predict and assess future traffic conditions is presented, taking both the current traffic conditions and the RM-controlled future traffic states into consideration. The designed RM control algorithm aims at optimizing the network mobility and safety performance. The designed algorithm is evaluated in a field-data-based simulation. By comparing the scenario controlled by the presented algorithm with the uncontrolled scenario, it is shown that the proposed RM control algorithm can effectively relieve network congestion with no significant compromise in safety.
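
    The receding-horizon logic of such an MPC-based RM controller can be sketched with a toy cell model: roll a simple density model forward for each candidate metering rate and keep the rate whose predicted trajectory best respects the critical density. All dynamics and parameters below are illustrative assumptions, not the paper's calibrated traffic model.

```python
def predict_density(rho, inflow, r, horizon, outflow=0.5):
    """Roll a toy conservation model forward: density rises with mainline
    inflow plus the metered ramp rate r, and falls with a fixed outflow."""
    traj = []
    for _ in range(horizon):
        rho = rho + inflow + r - outflow
        traj.append(rho)
    return traj

def best_rate(rho, inflow, rates, rho_crit=1.0, horizon=5):
    """Pick the metering rate whose predicted trajectory stays closest to,
    but below, the critical density -- a crude one-variable MPC step."""
    def cost(r):
        traj = predict_density(rho, inflow, r, horizon)
        return sum((x - rho_crit) ** 2 + (1e6 if x > rho_crit else 0.0)
                   for x in traj)
    return min(rates, key=cost)

r = best_rate(rho=0.8, inflow=0.45, rates=[0.0, 0.02, 0.05, 0.1])
```

    At each control interval only the first step of the chosen plan is applied and the optimization is repeated with fresh measurements, which is what makes the horizon "receding".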

  10. Intermittent Turbulence in the Stable Boundary Layer over Land. Part III: A Classification for Observations during CASES-99.

    NASA Astrophysics Data System (ADS)

    van de Wiel, B. J. H.; Moene, A. F.; Hartogensis, O. K.; de Bruin, H. A. R.; Holtslag, A. A. M.

    2003-10-01

    In this paper a classification of stable boundary layer regimes is presented based on observations of near-surface turbulence during the Cooperative Atmosphere-Surface Exchange Study-1999 (CASES-99). It is found that the different nights can be divided into three subclasses: a turbulent regime, an intermittent regime, and a radiative regime, which confirms the findings of two companion papers that use a simplified theoretical model (it is noted that its simplified structure limits the model generality to near-surface flows). The papers predict the occurrence of stable boundary layer regimes in terms of external forcing parameters such as the (effective) pressure gradient and radiative forcing. The classification in the present work supports these predictions and shows that the predictions are robust in a qualitative sense. As such, it is, for example, shown that intermittent turbulence is most likely to occur in clear-sky conditions with a moderately weak effective pressure gradient. The quantitative features of the theoretical classification are, however, rather sensitive to (often uncertain) local parameter estimations, such as the bulk heat conductance of the vegetation layer. This sensitivity limits the current applicability of the theoretical classification in a strict quantitative sense, apart from its conceptual value.

  11. Prediction Analysis for Measles Epidemics

    NASA Astrophysics Data System (ADS)

    Sumi, Ayako; Ohtomo, Norio; Tanaka, Yukio; Sawamura, Sadashi; Olsen, Lars Folke; Kobayashi, Nobumichi

    2003-12-01

    A newly devised procedure of prediction analysis, a linearized version of the nonlinear least squares method combined with the maximum entropy spectral analysis method, is proposed. This method was applied to time series data of measles case notifications in several communities in the UK, USA and Denmark. The dominant spectral lines observed in each power spectral density (PSD) can be safely assigned as fundamental periods. The optimum least squares fitting (LSF) curve calculated using these fundamental periods can essentially reproduce the underlying variation of the measles data. An extension of the LSF curve can be used to predict measles case notifications quantitatively. A discussion, including the predictability of chaotic time series, is also presented.
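
    The LSF step can be sketched as follows: given fundamental periods taken from the power spectrum, fit a sum of sinusoids by linear least squares and extrapolate the fitted curve forward as the prediction. The series below is synthetic, not the measles notification data.

```python
import numpy as np

def lsf_curve(t, y, periods):
    """Fit y(t) = a0 + sum_k [a_k cos(2*pi*t/T_k) + b_k sin(2*pi*t/T_k)]
    by linear least squares; return a function evaluating the fit."""
    def design(tt):
        cols = [np.ones_like(tt)]
        for T in periods:
            cols += [np.cos(2 * np.pi * tt / T), np.sin(2 * np.pi * tt / T)]
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(design(t), y, rcond=None)
    return lambda tt: design(tt) @ coef

# Synthetic monthly series with a 12-month and a 24-month cycle.
t = np.arange(120.0)
y = 100 + 30 * np.sin(2 * np.pi * t / 12) + 10 * np.cos(2 * np.pi * t / 24)
fit = lsf_curve(t, y, periods=[12, 24])
future = fit(np.arange(120.0, 132.0))   # extrapolated "prediction"
```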

  12. Semi-quantitative prediction of a multiple API solid dosage form with a combination of vibrational spectroscopy methods.

    PubMed

    Hertrampf, A; Sousa, R M; Menezes, J C; Herdling, T

    2016-05-30

    Quality control (QC) in the pharmaceutical industry is a key activity in ensuring medicines have the required quality, safety and efficacy for their intended use. QC departments at pharmaceutical companies are responsible for all release testing of final products but also all incoming raw materials. Near-infrared spectroscopy (NIRS) and Raman spectroscopy are important techniques for fast and accurate identification and qualification of pharmaceutical samples. Tablets containing two different active pharmaceutical ingredients (APIs) [bisoprolol, hydrochlorothiazide] in different commercially available dosages were analysed using Raman and NIR spectroscopy. The goal was to define multivariate models based on each vibrational spectroscopy to discriminate between different dosages (identity) and predict their dosage (semi-quantitative). Furthermore, the combination of spectroscopic techniques was investigated. Therefore, two different multiblock techniques based on PLS have been applied: multiblock PLS (MB-PLS) and sequential-orthogonalised PLS (SO-PLS). NIRS showed better results than Raman spectroscopy for both identification and quantitation. The multiblock techniques investigated showed that each spectroscopy contains information not present or captured with the other spectroscopic technique, thus demonstrating that there is a potential benefit in their combined use for both identification and quantitation purposes. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Predicting Loss-of-Control Boundaries Toward a Piloting Aid

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan; Stepanyan, Vahram; Krishnakumar, Kalmanje

    2012-01-01

    This work presents an approach to predicting loss-of-control with the goal of providing the pilot a decision aid focused on maintaining the pilot's control action within predicted loss-of-control boundaries. The predictive architecture combines quantitative loss-of-control boundaries, a data-based predictive control boundary estimation algorithm and an adaptive prediction method to estimate Markov model parameters in real-time. The data-based loss-of-control boundary estimation algorithm estimates the boundary of a safe set of control inputs that will keep the aircraft within the loss-of-control boundaries for a specified time horizon. The adaptive prediction model generates estimates of the system Markov Parameters, which are used by the data-based loss-of-control boundary estimation algorithm. The combined algorithm is applied to a nonlinear generic transport aircraft to illustrate the features of the architecture.

  14. Evaluation of an ensemble of genetic models for prediction of a quantitative trait.

    PubMed

    Milton, Jacqueline N; Steinberg, Martin H; Sebastiani, Paola

    2014-01-01

    Many genetic markers have been shown to be associated with common quantitative traits in genome-wide association studies. Typically these associated genetic markers have small to modest effect sizes and individually they explain only a small amount of the variability of the phenotype. In order to build a genetic prediction model without fitting a multiple linear regression model with possibly hundreds of genetic markers as predictors, researchers often summarize the joint effect of risk alleles into a genetic score that is used as a covariate in the genetic prediction model. However, the prediction accuracy can be highly variable and selecting the optimal number of markers to be included in the genetic score is challenging. In this manuscript we present a strategy to build an ensemble of genetic prediction models from data and we show that the ensemble-based method makes the challenge of choosing the number of genetic markers more amenable. Using simulated data with varying heritability and number of genetic markers, we compare the predictive accuracy and inclusion of true positive and false positive markers of a single genetic prediction model and our proposed ensemble method. The results show that the ensemble of genetic models tends to include a larger number of genetic variants than a single genetic model and it is more likely to include all of the true genetic markers. This increased sensitivity is obtained at the price of a lower specificity that appears to minimally affect the predictive accuracy of the ensemble.
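
    The genetic-score construction and the ensemble averaging can be sketched as below; the markers, weights and member models are invented for illustration, and the paper's member-selection procedure is not reproduced.

```python
import numpy as np

def genetic_score(genotypes, weights):
    """Weighted risk-allele count: genotypes are 0/1/2 allele counts per
    marker, weights are per-marker effect sizes."""
    return genotypes @ weights

def ensemble_predict(genotypes, models):
    """Each member model is (marker_indices, weights, intercept, slope);
    the ensemble prediction is the mean of the member predictions."""
    preds = [b0 + b1 * genetic_score(genotypes[:, idx], w)
             for idx, w, b0, b1 in models]
    return np.mean(preds, axis=0)

# Two subjects, four markers (allele counts), two hypothetical member models.
g = np.array([[0, 1, 2, 0],
              [2, 2, 0, 1]])
models = [(np.array([0, 1]), np.array([0.3, 0.2]), 1.0, 0.5),
          (np.array([1, 2, 3]), np.array([0.2, 0.1, 0.4]), 0.8, 0.6)]
pred = ensemble_predict(g, models)
```

    Because members see different marker subsets, the union of markers used by the ensemble is larger than that of any single model, which mirrors the higher sensitivity (and lower specificity) reported in the abstract.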

  15. Quantitative Model to Predict Melts on the Ol-Opx Saturation Boundary during Mantle Melting: The Role of H2O

    NASA Astrophysics Data System (ADS)

    Andrews, A. L.; Grove, T. L.

    2014-12-01

    Two quantitative, empirical models are presented that predict mantle melt compositions in equilibrium with olivine (ol) + orthopyroxene (opx) ± spinel (sp) as a function of variable pressure and H2O content. The models consist of multiple linear regressions calibrated using new data from H2O-undersaturated primitive and depleted mantle lherzolite melting experiments as well as experimental literature data. The models investigate the roles of H2O, Pressure, 1-Mg# (1-[XMg/(XMg+XFe)]), NaK# ((Na2O+K2O)/(Na2O+K2O+CaO)), TiO2, and Cr2O3 on mantle melt compositions. Melts are represented by the pseudoternary endmembers Clinopyroxene (Cpx), Olivine (Ol), Plagioclase (Plag), and Quartz (Qz) of Tormey et al. (1987). Model A returns predictive equations for the four endmembers with identical predictor variables, whereas Model B chooses predictor variables for the four compositional endmember equations and temperature independently. We employ the use of Akaike Information Criteria (Akaike, 1974) to determine the best predictor variables from initial variables chosen through thermodynamic reasoning and by previous models. In both Models A and B, the coefficients for H2O show that increasing H2O drives the melt to more Qz normative space, as the Qz component increases by +0.012(3) per 1 wt.% H2O. The other endmember components decrease and are all three times less affected by H2O (Ol: -0.004(2); Cpx: -0.004(2); Plag: -0.004(3)). Consistent with previous models and experimental data, increasing pressure moves melt compositions to more Ol normative space at the expense of the Qz component. The models presented quantitatively determine the influence of H2O, Pressure, 1-Mg#, NaK#, TiO2, and Cr2O3 on mantle melts in equilibrium with ol+opx±sp; the equations presented can be used to predict melts of known mantle source compositions saturated in ol+opx±sp. References Tormey, Grove, & Bryan (1987), doi: 10.1007/BF00375227. Akaike (1974), doi: 10.1109/TAC.1974.1100705.
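
    The linear-regression form of such a model can be sketched using the per-wt.% H2O coefficients quoted above (+0.012 on the Qz component and -0.004 on each of Ol, Cpx and Plag); the anhydrous baseline composition below is invented for illustration.

```python
def melt_components(h2o_wt, base):
    """Shift a baseline pseudoternary melt composition by the per-wt.% H2O
    coefficients quoted in the abstract (all other predictors held fixed)."""
    coeff = {"Qz": 0.012, "Ol": -0.004, "Cpx": -0.004, "Plag": -0.004}
    return {k: base[k] + coeff[k] * h2o_wt for k in base}

# Hypothetical anhydrous baseline composition (components sum to 1).
base = {"Qz": 0.10, "Ol": 0.35, "Cpx": 0.30, "Plag": 0.25}
wet = melt_components(2.0, base)   # melt with 2 wt.% H2O
```

    Note that the quoted coefficients sum to zero (+0.012 - 3 x 0.004), so the four-component total is preserved as H2O drives the melt toward more Qz-normative compositions.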

  16. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, namely single-attribute analysis, multi-attribute analysis and a probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot oil field, Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated against well-log porosity. The results suggest that all three implemented geostatistical methods predict porosity efficiently and reliably, but multi-attribute analysis and the probabilistic neural network provide more accurate, higher-resolution porosity sections. A low-impedance (6000-8000 m/s g/cc) and high-porosity (>15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network combined with model-based inversion is the most efficient for predicting porosity in the inter-well region.

  17. Noninvasive Assessment of Biochemical and Mechanical Properties of Lumbar Discs Through Quantitative Magnetic Resonance Imaging in Asymptomatic Volunteers.

    PubMed

    Foltz, Mary H; Kage, Craig C; Johnson, Casey P; Ellingson, Arin M

    2017-11-01

    Intervertebral disc degeneration is a prevalent phenomenon associated with back pain. It is of critical clinical interest to discriminate disc health and identify early stages of degeneration. Traditional clinical T2-weighted magnetic resonance imaging (MRI), assessed using the Pfirrmann classification system, is subjective and fails to adequately capture initial degenerative changes. Emerging quantitative MRI techniques offer a solution. Specifically, T2* mapping images water mobility in the macromolecular network, and our preliminary ex vivo work shows high predictability of the disc's glycosaminoglycan content (s-GAG) and residual mechanics. The present study expands upon this work to predict the biochemical and biomechanical properties in vivo and assess their relationship with both age and Pfirrmann grade. Eleven asymptomatic subjects (range: 18-62 yrs) were enrolled and imaged using a 3T MRI scanner. T2-weighted images (Pfirrmann grade) and quantitative T2* maps (predict s-GAG and residual stress) were acquired. Surface maps based on the distribution of these properties were generated and integrated to quantify the surface volume. Correlational analyses were conducted to establish the relationship between each metric of disc health derived from the quantitative T2* maps with both age and Pfirrmann grade, where an inverse trend was observed. Furthermore, the nucleus pulposus (NP) signal in conjunction with volumetric surface maps provided the ability to discern differences during initial stages of disc degeneration. This study highlights the ability of T2* mapping to noninvasively assess the s-GAG content, residual stress, and distributions throughout the entire disc, which may provide a powerful diagnostic tool for disc health assessment.

  18. Laboratory measurements of the millimeter-wave spectra of calcium isocyanide

    NASA Astrophysics Data System (ADS)

    Steimle, Timothy C.; Saito, Shuji; Takano, Shuro

    1993-06-01

    The ground state of CaNC is presently characterized by mm-wave spectroscopy, using a standard Hamiltonian linear molecule model to analyze the spectrum. The resulting spectroscopic parameters were used to predict the transition frequencies and Einstein A-coefficients, which should make possible a quantitative astrophysical search for CaNC.
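As a generic illustration of how fitted spectroscopic parameters feed a transition-frequency prediction, the sketch below uses the simplest rigid-rotor-plus-centrifugal-distortion expression for a linear molecule, ignoring fine structure; the constants B and D here are invented placeholders, not the CaNC values determined in the paper.

```python
# Generic linear-rotor sketch, ignoring fine structure:
#     nu(J+1 <- J) = 2B(J+1) - 4D(J+1)^3
# B and D below are hypothetical placeholders, NOT the CaNC constants.
def transition_mhz(J, B=4000.0, D=0.003):  # MHz; invented constants
    """Frequency of the J+1 <- J rotational transition."""
    return 2.0 * B * (J + 1) - 4.0 * D * (J + 1) ** 3

lines = [transition_mhz(J) for J in range(10, 13)]
# successive lines are spaced by slightly less than 2B, because centrifugal
# distortion compresses the rotational ladder at high J
```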

  19. Exploring the Relationship between Self-Efficacy and Retention in Introductory Physics

    ERIC Educational Resources Information Center

    Sawtelle, Vashti; Brewe, Eric; Kramer, Laird H.

    2012-01-01

    The quantitative results of Sources of Self-Efficacy in Science Courses-Physics (SOSESC-P) are presented as a logistic regression predicting the passing of students in introductory Physics with Calculus I, overall as well as disaggregated by gender. Self-efficacy as a theory to explain human behavior change [Bandura [1977] "Psychological…

  20. Calculation of High Angle of Attack Aerodynamics of Fighter Configurations. Volume 1. Steady

    DTIC Science & Technology

    1991-04-01

    patterns are now well known qualitatively for fighter configurations from extensive wind and water tunnel tests. However, development of quantitative ... [Figure residue omitted; recoverable captions: "Illustration of Flow Features Predicted in the Present Method"; "Figure 2. Definition of Airplane Coordinate Systems".]

  1. Lithologic composition and rock weathering potential of forested, glacial-till soils

    Treesearch

    Scott W. Bailey; James W. Hornbeck; James W. Hornbeck

    1992-01-01

    Describes methods for predicting lithologies present in soils developed on glacial till, and the potential weathering contributions from rock particles >2 mm in diameter. The methods are not quantitative in terms of providing weathering rates, but provide information that can further the understanding of forest nutrient cycles, and possibly assist with decisions...

  2. A Quantitative Investigation of Prospective Teachers' Hopes and Their Motivational Forces

    ERIC Educational Resources Information Center

    Eren, Altay; Yesilbursa, Amanda

    2017-01-01

    The present study aimed to investigate the diverse aspects of prospective teachers' dispositional hopes, teaching-specific hopes, and their sources, as well as to explore whether these would significantly predict their preparation for the teaching profession. A total of 851 prospective teachers voluntarily participated in the study. A series of…

  3. Effects of Endogenous Formaldehyde in Nasal Tissues on Inhaled Formaldehyde Dosimetry Predictions in the Rat, Monkey, and Human Nasal Passages

    EPA Science Inventory

    ABSTRACT Formaldehyde, a nasal carcinogen, is also an endogenous compound that is present in all living cells. Due to its high solubility and reactivity, quantitative risk estimates for inhaled formaldehyde rely on internal dose calculations in the upper respiratory tract which ...

  4. Quantitative features in the computed tomography of healthy lungs.

    PubMed Central

    Fromson, B H; Denison, D M

    1988-01-01

    This study set out to determine whether quantitative features of lung computed tomography scans could be identified that would lead to a tightly defined normal range for use in assessing patients. Fourteen normal subjects with apparently healthy lungs were studied. A technique was developed for rapid and automatic extraction of lung field data from the computed tomography scans. The Hounsfield unit histograms were constructed and, when normalised for predicted lung volumes, shown to be consistent in shape for all the subjects. A three dimensional presentation of the data in the form of a "net plot" was devised, and from this a logarithmic relationship between the area of each lung slice and its mean density was derived (r = 0.9, n = 545, p < 0.0001). The residual density, calculated as the difference between measured density and density predicted from the relationship with area, was shown to be normally distributed with a mean of 0 and a standard deviation of 25 Hounsfield units (chi-square test: p < 0.05). A presentation combining this residual density with the net plot is described. PMID:3353883
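The area-density regression described above can be sketched as follows; the data are synthetic, and only the fitted form density = a·log(area) + b and the roughly 25 HU residual spread follow the abstract.

```python
# Sketch of the reported analysis on synthetic data: fit density (HU) against
# log(slice area) by least squares, then check that residual densities center
# on zero. The "true" coefficients and all data are invented for illustration.
import math
import random

random.seed(0)
areas = [random.uniform(50.0, 400.0) for _ in range(200)]   # synthetic slices
dens = [-120.0 * math.log(a) - 200.0 + random.gauss(0, 25) for a in areas]

x = [math.log(a) for a in areas]
n = len(x)
mx, my = sum(x) / n, sum(dens) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, dens))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx

residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, dens)]
mean_residual = sum(residuals) / n  # ~0 by construction of least squares
```

The zero-mean residual is a property of ordinary least squares with an intercept, which is why the paper could characterize normality of the residual density around 0.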

  5. Cocrystals to facilitate delivery of poorly soluble compounds beyond-rule-of-5.

    PubMed

    Kuminek, Gislaine; Cao, Fengjuan; Bahia de Oliveira da Rocha, Alanny; Gonçalves Cardoso, Simone; Rodríguez-Hornedo, Naír

    2016-06-01

    Besides enhancing aqueous solubilities, cocrystals have the ability to fine-tune the solubility advantage over the drug, the supersaturation index, and bioavailability. This review presents important facts about cocrystals that set them apart from other solid-state forms of drugs, and a quantitative set of rules for the selection of additives and solution/formulation conditions that predict cocrystal solubility, supersaturation index, and transition points. Cocrystal eutectic constants are shown to be the most important cocrystal property that can be measured once a cocrystal is discovered, and simple relationships are presented that allow for prediction of cocrystal behavior as a function of pH and drug solubilizing agents. The cocrystal eutectic constant is a stability or supersaturation index that: (a) reflects how close to or far from equilibrium a cocrystal is, (b) establishes transition points, and (c) provides a quantitative scale of the cocrystal's true solubility advantage over the drug. The benefit of this strategy is that a single measurement, requiring little material and time, provides a principled basis to tailor the cocrystal supersaturation index by rational selection of cocrystal formulation, dissolution, and processing conditions. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.

  7. A score model for predicting post-liver transplantation survival in HBV cirrhosis-related hepatocellular carcinoma recipients: a single center 5-year experience.

    PubMed

    Wang, Li-Ying; Zheng, Shu-Sen; Xu, Xiao; Wang, Wei-Lin; Wu, Jian; Zhang, Min; Shen, Yan; Yan, Sheng; Xie, Hai-Yang; Chen, Xin-Hua; Jiang, Tian-An; Chen, Fen

    2015-02-01

    The prognostic prediction of liver transplantation (LT) guides the donor organ allocation. However, there is currently no satisfactory model to predict the recipients' outcome, especially for patients with HBV cirrhosis-related hepatocellular carcinoma (HCC). The present study aimed to develop a quantitative assessment model for predicting the post-LT survival in HBV-related HCC patients. Two hundred and thirty-eight LT recipients at the Liver Transplant Center, First Affiliated Hospital, Zhejiang University School of Medicine between 2008 and 2013 were included in this study. Their post-LT prognosis was recorded and multiple risk factors were analyzed using univariate and multivariate analyses in Cox regression. The score model was as follows: 0.114 × (Child-Pugh score) - 0.002 × (positive HBV DNA detection time) + 0.647 × (number of tumor nodules) + 0.055 × (max diameter of tumor nodules) + 0.231 × ln(AFP) + 0.437 × (tumor differentiation grade). The receiver operating characteristic curve analysis showed that the area under the curve of the scoring model for predicting the post-LT survival was 0.887. The cut-off value was 1.27, which was associated with a sensitivity of 72.5% and a specificity of 90.7%. The quantitative score model for predicting post-LT survival proved to be sensitive and specific.
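Applying the published equation is straightforward arithmetic; the sketch below uses the coefficients and 1.27 cut-off quoted in the abstract, with wholly invented input values for a hypothetical recipient (units for the HBV DNA detection time and tumor diameter are assumptions).

```python
# The published score equation with invented example inputs. Input units
# follow the abstract's wording and are assumptions here.
import math

def post_lt_score(child_pugh, hbv_dna_time, n_nodules, max_diam, afp, diff_grade):
    """Post-LT survival score using the coefficients quoted in the abstract."""
    return (0.114 * child_pugh
            - 0.002 * hbv_dna_time
            + 0.647 * n_nodules
            + 0.055 * max_diam
            + 0.231 * math.log(afp)
            + 0.437 * diff_grade)

# Hypothetical recipient, purely illustrative:
score = post_lt_score(child_pugh=7, hbv_dna_time=30, n_nodules=2,
                      max_diam=3.5, afp=400.0, diff_grade=2)
high_risk = score > 1.27  # cut-off reported in the abstract
```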

  8. Revealing and analyzing networks of environmental systems

    NASA Astrophysics Data System (ADS)

    Eveillard, D.; Bittner, L.; Chaffron, S.; Guidi, L.; Raes, J.; Karsenti, E.; Bowler, C.; Gorsky, G.

    2015-12-01

    Understanding the interactions between microbial communities and their environment well enough to be able to predict diversity on the basis of physicochemical parameters is a fundamental pursuit of microbial ecology that still eludes us. However, modeling microbial communities is a complicated task, because (i) communities are complex, (ii) most are described qualitatively, and (iii) quantitative understanding of the way communities interact with their surroundings remains incomplete. Within this seminar, we will illustrate two complementary approaches that aim to overcome these limitations in different ways. First, we will present a network analysis that focuses on the biological carbon pump in the global ocean. The biological carbon pump is the process by which photosynthesis transforms CO2 into organic carbon, part of which sinks to the deep ocean as particles and is sequestered there. While the intensity of the pump correlates with plankton community composition, the underlying ecosystem structure and interactions driving this process remain largely uncharacterized. Here we use environmental and metagenomic data gathered during the Tara Oceans expedition to improve understanding of these drivers. We show that specific plankton communities correlate with carbon export and highlight unexpected and overlooked taxa such as Radiolaria, alveolate parasites and bacterial pathogens, as well as Synechococcus and their phages, as key players in the biological pump. Additionally, we show that the abundances of just a few bacterial and viral genes predict most of the variability in global ocean carbon export. Together these findings help elucidate ecosystem drivers of the biological carbon pump and present a case study for scaling from genes to ecosystems. Second, we will show preliminary results from a probabilistic model that predicts microbial community structure across observed physicochemical data, from a putative network and partial quantitative knowledge. This model shows that, despite distinct quantitative environmental perturbations, the constraints on community structure can remain stable.

  9. Computing organic stereoselectivity - from concepts to quantitative calculations and predictions.

    PubMed

    Peng, Qian; Duarte, Fernanda; Paton, Robert S

    2016-11-07

    Advances in theory and processing power have established computation as a valuable interpretative and predictive tool in the discovery of new asymmetric catalysts. This tutorial review outlines the theory and practice of modeling stereoselective reactions. Recent examples illustrate how an understanding of the fundamental principles and the application of state-of-the-art computational methods may be used to gain mechanistic insight into organic and organometallic reactions. We highlight the emerging potential of this computational tool-box in providing meaningful predictions for the rational design of asymmetric catalysts. We present an accessible account of the field to encourage future synergy between computation and experiment.

  10. Combining molecular docking and QSAR studies for modeling the anti-tyrosinase activity of aromatic heterocycle thiosemicarbazone analogues

    NASA Astrophysics Data System (ADS)

    Dong, Huanhuan; Liu, Jing; Liu, Xiaoru; Yu, Yanying; Cao, Shuwen

    2018-01-01

    A collection of thirty-six aromatic heterocycle thiosemicarbazone analogues presenting a broad span of anti-tyrosinase activities was designed and obtained. A robust and reliable two-dimensional quantitative structure-activity relationship (QSAR) model, as evidenced by its high q2 and r2 values (0.848 and 0.893, respectively), was built on these analogues to predict the quantitative chemical-biological relationship and to guide new modifications. Inhibitory activities of the compounds were found to depend greatly on molecular shape and orbital energy: substituents that produce large ovality and high values of the highest-occupied-molecular-orbital (HOMO) energy helped to improve the activity of these analogues. The molecular docking results provided visual evidence for the QSAR analysis and the inhibition mechanism. On this basis, two novel tyrosinase inhibitors, O04 and O05, with predicted IC50 values of 0.5384 and 0.8752 nM, were designed and suggested for further research.

  11. Predicting plant biomass accumulation from image-derived parameters

    PubMed Central

    Chen, Dijun; Shi, Rongli; Pape, Jean-Michel; Neumann, Kerstin; Graner, Andreas; Chen, Ming; Klukas, Christian

    2018-01-01

    Background: Image-based high-throughput phenotyping technologies have developed rapidly in plant science recently, and they offer the potential to gain more valuable information than traditional destructive methods. Predicting plant biomass is a key goal for plant breeders and ecologists; however, finding a biomass model that predicts well across experiments is a great challenge. Results: In the present study, we constructed 4 predictive models to examine the quantitative relationship between image-based features and plant biomass accumulation. The methodology was applied to 3 consecutive barley (Hordeum vulgare) experiments with control and stress treatments. The results show that plant biomass can be accurately predicted from image-based parameters using a random forest model. The high prediction accuracy of this model will help relieve the phenotyping bottleneck in biomass measurement in breeding applications, and the prediction performance remains relatively high across experiments under similar conditions. The relative contribution of individual features to predicted biomass was further quantified, revealing new insights into the phenotypic determinants of plant biomass. Furthermore, the methods can be used to identify the image-based features most related to biomass accumulation, which is promising for subsequent genetic mapping to uncover the genetic basis of biomass. Conclusions: We have developed quantitative models to accurately predict plant biomass accumulation from image data. We anticipate that the analysis results will advance our view of the phenotypic determinants of plant biomass, and that the statistical methods can be broadly applied to other plant species. PMID:29346559
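The feature-contribution idea can be illustrated with a small permutation-importance sketch; this is pure-Python on synthetic data, a conceptual stand-in for the paper's random-forest pipeline, and the two image features and all numbers are invented.

```python
# Conceptual sketch, NOT the paper's pipeline: fit a two-feature linear model
# of biomass on synthetic image features, then rank features by how much
# shuffling each one inflates squared error (permutation importance).
import random

random.seed(1)
n = 300
area = [random.uniform(10, 100) for _ in range(n)]    # projected-area feature
height = [random.uniform(5, 50) for _ in range(n)]    # height feature
biomass = [0.8 * a + 0.1 * h + random.gauss(0, 2.0) for a, h in zip(area, height)]

def fit(xs1, xs2, y):
    """Two-predictor least squares via normal equations (no intercept)."""
    s11 = sum(a * a for a in xs1); s22 = sum(b * b for b in xs2)
    s12 = sum(a * b for a, b in zip(xs1, xs2))
    sy1 = sum(a * v for a, v in zip(xs1, y)); sy2 = sum(b * v for b, v in zip(xs2, y))
    det = s11 * s22 - s12 * s12
    return (sy1 * s22 - sy2 * s12) / det, (sy2 * s11 - sy1 * s12) / det

w1, w2 = fit(area, height, biomass)

def mse(xs1, xs2):
    return sum((v - w1 * a - w2 * b) ** 2
               for a, b, v in zip(xs1, xs2, biomass)) / n

base_mse = mse(area, height)
shuffled_area, shuffled_height = area[:], height[:]
random.shuffle(shuffled_area); random.shuffle(shuffled_height)
imp_area = mse(shuffled_area, height) - base_mse      # importance of area
imp_height = mse(area, shuffled_height) - base_mse    # importance of height
# area dominates biomass in this synthetic setup, so shuffling it hurts far more
```

A random forest admits the same permutation test; only the fitted predictor changes.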

  12. Essential Set of Molecular Descriptors for ADME Prediction in Drug and Environmental Chemical Space

    EPA Science Inventory

    Historically, the disciplines of pharmacology and toxicology have embraced quantitative structure-activity relationships (QSAR) and quantitative structure-property relationships (QSPR) to predict ADME properties or biological activities of untested chemicals. The question arises ...

  13. Toward the prediction of class I and II mouse major histocompatibility complex-peptide-binding affinity: in silico bioinformatic step-by-step guide using quantitative structure-activity relationships.

    PubMed

    Hattotuwagama, Channa K; Doytchinova, Irini A; Flower, Darren R

    2007-01-01

    Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches were built around semiqualitative classification methods, but these are now giving way to quantitative regression methods. We review three methods: a 2D-QSAR additive partial least squares (PLS) method, a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method, and an iterative self-consistent (ISC) PLS-based additive method. The first two can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets; the ISC method is a recently developed extension of the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we give a step-by-step guide to building these predictive models and assessing their reliability; the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, are freely available online at http://www.jenner.ac.uk/MHCPred.

  14. Short-term Automated Quantification of Radiologic Changes in the Characterization of Idiopathic Pulmonary Fibrosis Versus Nonspecific Interstitial Pneumonia and Prediction of Long-term Survival.

    PubMed

    De Giacomi, Federica; Raghunath, Sushravya; Karwoski, Ronald; Bartholmai, Brian J; Moua, Teng

    2018-03-01

    Fibrotic interstitial lung diseases presenting with nonspecific and overlapping radiologic findings may be difficult to diagnose without surgical biopsy. We hypothesized that baseline quantifiable radiologic features and their short-term interval change may be predictive of underlying histologic diagnosis as well as long-term survival in idiopathic pulmonary fibrosis (IPF) presenting without honeycombing versus nonspecific interstitial pneumonia (NSIP). Forty biopsy-confirmed IPF and 20 biopsy-confirmed NSIP patients with available high-resolution chest computed tomography 4 to 24 months apart were studied. CALIPER software was used for the automated characterization and quantification of radiologic findings. IPF subjects were older (66 vs. 48; P<0.0001) with lower diffusion capacity for carbon monoxide and higher volumes of baseline reticulation (193 vs. 83 mL; P<0.0001). Over the interval period, compared with NSIP, IPF patients experienced greater functional decline (forced vital capacity, -6.3% vs. -1.7%; P=0.02) and radiologic progression, as noted by greater increase in reticulation volume (24 vs. 1.74 mL; P=0.048), and decrease in normal (-220 vs. -37.7 mL; P=0.045) and total lung volumes (-198 vs. 58.1 mL; P=0.03). Older age, male gender, higher reticulation volumes at baseline, and greater interval decrease in normal lung volumes were predictive of IPF. Both baseline and short-term changes in quantitative radiologic findings were predictive of mortality. Baseline quantitative radiologic findings and assessment of short-term disease progression may help characterize underlying IPF versus NSIP in those with difficult to differentiate clinicoradiologic presentations. Our study supports the possible utility of assessing serial quantifiable high-resolution chest computed tomographic findings for disease differentiation in these 2 entities.

  15. Severe rainfall prediction systems for civil protection purposes

    NASA Astrophysics Data System (ADS)

    Comellas, A.; Llasat, M. C.; Molini, L.; Parodi, A.; Siccardi, F.

    2010-09-01

    One of the most common natural hazards affecting Mediterranean regions is the occurrence of severe weather structures able to produce heavy rainfall. Floods have killed about 1000 people across Europe in the last 10 years. With the aim of mitigating this kind of risk, quantitative precipitation forecasts (QPF) and rain probability forecasts are two tools now available to national meteorological services and institutions responsible for weather forecasting to predict rainfall, using either a deterministic or a probabilistic approach. This study provides an insight into the different approaches used by the Italian (DPC) and Catalonian (SMC) Civil Protection agencies and the results they have achieved with their particular systems for issuing early warnings. For the former, the analysis covers the period 2006-2009, in which the predictive ability of the forecasting system, based on the numerical weather prediction model COSMO-I7, was compared with ground-based observations (from more than 2000 raingauge stations; Molini et al., 2009). The Italian system is mainly focused on regional-scale warnings, providing forecasts for periods never shorter than 18 hours and very often with a 36-hour maximum duration. The information contained in severe weather bulletins is not quantitative and usually refers to specific meteorological phenomena (thunderstorms, wind gales, etc.). Updates and refinements have a usual refresh time of 24 hours. The SMC operates within the Catalonian boundaries and uses a warning system that mixes quantitative and probabilistic information. For each administrative region ("comarca") that Catalonia is divided into, forecasters give an approximate value of the average predicted rainfall and the probability of exceeding that threshold. Warnings are usually re-issued every 6 hours and their duration depends on the predicted time extent of the storm.
    To provide a comprehensive QPF verification, the rainfall predicted by the Mesoscale Model 5 (MM5), the SMC operational forecast model, is compared with the local rain gauge network for the year 2008 (Comellas et al., 2010). This study presents the benefits and drawbacks of both the Italian and the Catalonian systems. Moreover, particular attention is paid to the link between a system's predictive ability and the predicted severe weather type as a function of its space-time development.
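As a hedged illustration of how such a forecast-versus-gauge verification is commonly carried out (the abstract does not name its scores, so the 2x2 contingency metrics, the threshold, and all numbers below are assumptions):

```python
# Illustrative-only verification sketch: dichotomize forecast and observed
# 24-h rainfall at an assumed warning threshold and compute two standard
# contingency-table scores, POD and FAR. All values are invented.
THRESH = 20.0  # mm / 24 h, assumed warning threshold

forecast = [35.0, 5.0, 22.0, 0.0, 18.0, 40.0]   # model QPF at 6 gauge sites
observed = [28.0, 12.0, 3.0, 0.0, 25.0, 31.0]   # rain gauge totals

hits = sum(f >= THRESH and o >= THRESH for f, o in zip(forecast, observed))
false_alarms = sum(f >= THRESH and o < THRESH for f, o in zip(forecast, observed))
misses = sum(f < THRESH and o >= THRESH for f, o in zip(forecast, observed))

pod = hits / (hits + misses)                  # probability of detection
far = false_alarms / (hits + false_alarms)    # false alarm ratio
```

A perfect forecast gives POD = 1 and FAR = 0; comparing such scores across thresholds is one way to contrast deterministic systems like the two described above.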

  16. Biomacromolecular quantitative structure-activity relationship (BioQSAR): a proof-of-concept study on the modeling, prediction and interpretation of protein-protein binding affinity.

    PubMed

    Zhou, Peng; Wang, Congcong; Tian, Feifei; Ren, Yanrong; Yang, Chao; Huang, Jian

    2013-01-01

    Quantitative structure-activity relationship (QSAR) analysis, a regression-modeling methodology that quantitatively relates structural features to the apparent behavior of a series of congeneric molecules, has been widely used to evaluate the activity, toxicity and properties of various small-molecule compounds such as drugs, toxicants and surfactants. It is therefore surprising that such a useful technique has seen only very limited application to biomacromolecules, even though solved atomic-resolution 3D structures of proteins, nucleic acids and their complexes have accumulated rapidly in recent decades. Here, we present a proof-of-concept paradigm for the modeling, prediction and interpretation of the binding affinity of 144 sequence-nonredundant, structure-available and affinity-known protein complexes (Kastritis et al. Protein Sci 20:482-491, 2011) using a biomacromolecular QSAR (BioQSAR) scheme. We demonstrate that the modeling performance and predictive power of BioQSAR are comparable to or even better than those of traditional knowledge-based strategies, mechanism-based methods and empirical scoring algorithms, while BioQSAR possesses additional features such as adaptability, interpretability, deep validation and high efficiency. The BioQSAR scheme could be readily modified to infer the biological behavior and functions of other biomacromolecules whenever their X-ray crystal structures, NMR conformation assemblies or computationally modeled structures are available.

  17. Predicting the performance of fingerprint similarity searching.

    PubMed

    Vogt, Martin; Bajorath, Jürgen

    2011-01-01

    Fingerprints are bit string representations of molecular structure that typically encode structural fragments, topological features, or pharmacophore patterns. Various fingerprint designs are utilized in virtual screening and their search performance essentially depends on three parameters: the nature of the fingerprint, the active compounds serving as reference molecules, and the composition of the screening database. It is of considerable interest and practical relevance to predict the performance of fingerprint similarity searching. A quantitative assessment of the potential that a fingerprint search might successfully retrieve active compounds, if available in the screening database, would substantially help to select the type of fingerprint most suitable for a given search problem. The method presented herein utilizes concepts from information theory to relate the fingerprint feature distributions of reference compounds to screening libraries. If these feature distributions do not sufficiently differ, active database compounds that are similar to reference molecules cannot be retrieved because they disappear in the "background." By quantifying the difference in feature distribution using the Kullback-Leibler divergence and relating the divergence to compound recovery rates obtained for different benchmark classes, fingerprint search performance can be quantitatively predicted.
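The divergence idea above can be sketched for per-bit Bernoulli frequencies; this is a toy illustration in which the bit strings are invented and the smoothing constant is an assumption, whereas real fingerprints have hundreds or thousands of bits.

```python
# Toy sketch of the information-theoretic idea: compare per-bit 'on'
# frequencies of reference actives vs. the screening database using a
# smoothed Kullback-Leibler divergence summed over bits.
import math

def bit_freqs(fps, n_bits, eps=1e-3):
    """Smoothed per-bit frequencies, kept away from exactly 0 or 1."""
    counts = [0] * n_bits
    for fp in fps:
        for i, bit in enumerate(fp):
            counts[i] += bit
    return [min(max(c / len(fps), eps), 1.0 - eps) for c in counts]

def kl_bits(p, q):
    """Sum over bits of the Bernoulli KL divergence D(p_i || q_i)."""
    return sum(pi * math.log(pi / qi) + (1 - pi) * math.log((1 - pi) / (1 - qi))
               for pi, qi in zip(p, q))

refs = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 0, 0]]              # reference actives
db = [[0, 1, 1, 0], [0, 0, 1, 1], [1, 0, 1, 0], [0, 1, 1, 0]]  # screening set

divergence = kl_bits(bit_freqs(refs, 4), bit_freqs(db, 4))
# near-zero divergence predicts that similar actives will vanish into the
# background; larger divergence predicts better expected recovery
```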

  18. Void probability as a function of the void's shape and scale-invariant models

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1991-01-01

    The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  19. PREDICTING TOXICOLOGICAL ENDPOINTS OF CHEMICALS USING QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIPS (QSARS)

    EPA Science Inventory

    Quantitative structure-activity relationships (QSARs) are being developed to predict the toxicological endpoints for untested chemicals similar in structure to chemicals that have known experimental toxicological data. Based on a very large number of predetermined descriptors, a...

  20. Manual physical balance assistance of therapists during gait training of stroke survivors: characteristics and predicting the timing.

    PubMed

    Haarman, Juliet A M; Maartens, Erik; van der Kooij, Herman; Buurke, Jaap H; Reenalda, Jasper; Rietman, Johan S

    2017-12-02

During gait training, physical therapists continuously supervise stroke survivors and provide physical support to the pelvis when they judge that the patient is unable to keep their balance. This paper is the first to provide quantitative data about the corrective forces that therapists use during gait training. It is assumed that changes in the acceleration of a patient's centre of mass (COM) are a good predictor of therapeutic balance assistance during the training sessions. Therefore, this paper provides a method that predicts the timing of therapeutic balance assistance, based on acceleration data of the sacrum. Eight sub-acute stroke survivors and seven therapists were included in this study. Patients were asked to perform straight-line walking as well as slalom walking in a conventional training setting. Acceleration of the sacrum was captured by an inertial magnetic measurement unit. Balance-assisting corrective forces applied by the therapist were collected from two force sensors positioned on both sides of the patient's hips. Measures to characterize the therapeutic balance assistance were the amount of force, the duration, the impulse and the anatomical plane in which the assistance took place. Based on the acceleration data of the sacrum, an algorithm was developed to predict therapeutic balance assistance. To validate the developed algorithm, the events of balance assistance predicted by the algorithm were compared with the therapeutic assistance actually provided. The algorithm was able to predict the actual therapeutic assistance with a Positive Predictive Value of 87% and a True Positive Rate of 81%. Assistance mainly took place along the medio-lateral axis, in which therapists provided corrective forces of about 2% of the patient's body weight (15.9 N (11), median (IQR)). Median duration of balance assistance was 1.1 s (0.6) and median impulse was 9.4 Ns (8.2) (median (IQR)).
Although therapists were specifically instructed to aim for the force sensors on the iliac crest, a different contact location was reported in 22% of the corrections. This paper presents insights into the behavior of therapists regarding their manual physical assistance during gait training. A quantitative dataset was presented, representing therapeutic balance-assisting force characteristics. Furthermore, an algorithm was developed that predicts events at which therapeutic balance assistance was provided. Prediction scores remain high when different therapists and patients were analyzed with the same algorithm settings. Both the quantitative dataset and the developed algorithm can serve as technical input in the development of (robot-controlled) balance supportive devices.
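The validation step — matching algorithm-predicted assistance events to the events therapists actually provided, then scoring the Positive Predictive Value and True Positive Rate — can be sketched like this. The event times and matching tolerance are hypothetical; the paper's exact matching criterion is not reproduced here.

```python
def match_events(predicted, actual, tolerance=0.3):
    """Match predicted event times (s) to actual ones within a tolerance window,
    each actual event matching at most one prediction."""
    matched = set()
    true_pos = 0
    for p in predicted:
        hit = next((a for a in actual
                    if a not in matched and abs(a - p) <= tolerance), None)
        if hit is not None:
            matched.add(hit)
            true_pos += 1
    ppv = true_pos / len(predicted) if predicted else 0.0  # precision
    tpr = true_pos / len(actual) if actual else 0.0        # sensitivity
    return ppv, tpr

# Hypothetical event times (seconds into a walking trial)
predicted = [2.1, 5.4, 9.0, 12.2]
actual    = [2.0, 5.5, 12.3, 15.8]
ppv, tpr = match_events(predicted, actual)
print(ppv, tpr)  # 0.75 0.75: 3 of 4 predictions correct, 3 of 4 events detected
```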

  1. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
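Scenario tree analysis, in essence, propagates probabilities along branching outcome paths and combines them with impact estimates. A minimal sketch follows; all branch probabilities and impact scores are hypothetical illustrations, not values from the paper.

```python
from math import prod

# Toy scenario tree for a field release. Each leaf is the chain of branch
# probabilities along its path, paired with an impact score (hypothetical).
leaves = [
    ((0.9,),           0.0),   # no reversion to virulence
    ((0.1, 0.7),       1.0),   # reversion, but no shedding
    ((0.1, 0.3, 0.5),  5.0),   # reversion + shedding, no spread
    ((0.1, 0.3, 0.5), 20.0),   # reversion + shedding + spread
]

# Likelihood of each outcome is the product of probabilities along its path
likelihoods = [prod(path) for path, _ in leaves]
expected_impact = sum(l * impact for l, (_, impact) in zip(likelihoods, leaves))

assert abs(sum(likelihoods) - 1.0) < 1e-9  # leaves partition the outcome space
print([round(l, 3) for l in likelihoods], round(expected_impact, 3))
```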

  2. Quantitative high-resolution genomic analysis of single cancer cells.

    PubMed

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  3. Climate Dynamics and Experimental Prediction (CDEP) and Regional Integrated Science Assessments (RISA) Programs at NOAA Office of Global Programs

    NASA Astrophysics Data System (ADS)

    Bamzai, A.

    2003-04-01

    This talk will highlight science and application activities of the CDEP and RISA programs at NOAA OGP. CDEP, through a set of Applied Research Centers (ARCs), supports NOAA's program of quantitative assessments and predictions of global climate variability and its regional implications on time scales of seasons to centuries. The RISA program consolidates results from ongoing disciplinary process research under an integrative framework. Examples of joint CDEP-RISA activities will be presented. Future directions and programmatic challenges will also be discussed.

  4. A Research Methodology for Studying What Makes Some Problems Difficult to Solve

    ERIC Educational Resources Information Center

    Gulacar, Ozcan; Fynewever, Herb

    2010-01-01

We present a quantitative model for predicting the level of difficulty subjects will experience with specific problems. The model explicitly accounts for the number of subproblems a problem can be broken into and the difficulty of each subproblem. Although the model builds on previously published models, it is uniquely suited for blending with…

  5. Quantitative analysis applied to contrast medium extravasation by using the computed-tomography number within the region of interest

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Seung; Im, In-Chul; Kim, Moon-Jib; Goo, Eun-Hoe; Kim, Sun-Ju; Kim, Kwang; Kwak, Byung-Joon

    2014-02-01

The present study presents a method for quantitatively analyzing extravasation by measuring the computed tomography (CT) number within a region of interest (ROI) in CT images obtained from patients with suspected extravasation induced by auto-injection of contrast medium. To achieve this, we divided the study subjects into a group of patients who incurred extravasation and a group who underwent routine scans without incurring extravasation. The CT numbers at IV sites were obtained as reference values, and CT numbers at extravasation sites and hepatic portal veins, respectively, were obtained as relative values. From these, the predicted time for extravasation (TEP) and the predicted ratio for extravasation (REP) of an extravasation site were obtained and analyzed quantitatively. In the case of extravasation induced by a dual auto-injector, the CT numbers were confirmed to be lower and the extravasation site to be enlarged when compared to extravasation induced by a single auto-injector. This is because the physiological saline introduced after the injection of the contrast agent diluted the concentration of the extravasated contrast agent. Additionally, the TEP caused by the auto-injector was about 40 seconds, and we could perform a precise quantitative assessment of the site suspected of extravasation. In conclusion, the dual auto-injection method, despite its advantage of reducing the volume of contrast agent and improving image quality for patients with good vascular integrity, was judged likely to increase the risk of extravasation and aggravate outcomes for patients with poor vascular integrity by enlarging extravasation sites.

  6. Projected effects of Climate-change-induced flow alterations on stream macroinvertebrate abundances.

    PubMed

    Kakouei, Karan; Kiesel, Jens; Domisch, Sami; Irving, Katie S; Jähnig, Sonja C; Kail, Jochem

    2018-03-01

Global change has the potential to affect river flow conditions, which are fundamental determinants of physical habitats. Predictions of the effects of flow alterations on aquatic biota have mostly been assessed based on species ecological traits (e.g., current preferences), which are difficult to link to quantitative discharge data. Alternatively, we used empirically derived predictive relationships for species' response to flow to assess the effect of flow alterations due to climate change in two contrasting central European river catchments. Predictive relationships were set up for 294 individual species based on (1) abundance data from 223 sampling sites in the Kinzig lower-mountainous catchment and 67 sites in the Treene lowland catchment, and (2) flow conditions at these sites described by five flow metrics quantifying the duration, frequency, magnitude, timing and rate of flow events using present-day gauging data. Species' abundances were predicted for three periods: (1) baseline (1998-2017), (2) horizon 2050 (2046-2065) and (3) horizon 2090 (2080-2099) based on these empirical relationships and using high-resolution modeled discharge data for the present and future climate conditions. We compared the differences in predicted abundances among periods for individual species at each site, where the percent change served as a proxy to assess the potential species responses to flow alterations. Climate change was predicted to most strongly affect the low-flow conditions, leading to decreases in species abundances of up to 42%. Finally, combining the responses of all species over all metrics indicated increasing overall species-assemblage responses in 98% of the studied river reaches in both projected horizons; responses were significantly larger in the lower-mountainous Kinzig than in the lowland Treene catchment.
Such quantitative analyses of freshwater taxa responses to flow alterations provide valuable tools for predicting potential climate-change impacts on species abundances and can be applied to any stressor, species, or region.

  7. Quantitative Sensory Testing Predicts Pregabalin Efficacy in Painful Chronic Pancreatitis

    PubMed Central

    Olesen, Søren S.; Graversen, Carina; Bouwense, Stefan A. W.; van Goor, Harry; Wilder-Smith, Oliver H. G.; Drewes, Asbjørn M.

    2013-01-01

    Background A major problem in pain medicine is the lack of knowledge about which treatment suits a specific patient. We tested the ability of quantitative sensory testing to predict the analgesic effect of pregabalin and placebo in patients with chronic pancreatitis. Methods Sixty-four patients with painful chronic pancreatitis received pregabalin (150–300 mg BID) or matching placebo for three consecutive weeks. Analgesic effect was documented in a pain diary based on a visual analogue scale. Responders were defined as patients with a reduction in clinical pain score of 30% or more after three weeks of study treatment compared to baseline recordings. Prior to study medication, pain thresholds to electric skin and pressure stimulation were measured in dermatomes T10 (pancreatic area) and C5 (control area). To eliminate inter-subject differences in absolute pain thresholds, an index of sensitivity between stimulation areas was determined (ratio of pain detection thresholds in pancreatic versus control area, ePDT ratio). Pain modulation was recorded by a conditioned pain modulation paradigm. A support vector machine was used to screen sensory parameters for their predictive power of pregabalin efficacy. Results The pregabalin responder group was hypersensitive to electric tetanic stimulation of the pancreatic area (ePDT ratio 1.2 (0.9–1.3)) compared to the non-responder group (ePDT ratio 1.6 (1.5–2.0)) (P = 0.001). The electrical pain detection ratio was predictive for pregabalin effect with a classification accuracy of 83.9% (P = 0.007). The corresponding sensitivity was 87.5% and specificity was 80.0%. No other parameters were predictive of pregabalin or placebo efficacy. Conclusions The present study provides the first evidence that quantitative sensory testing predicts the analgesic effect of pregabalin in patients with painful chronic pancreatitis. 
The method can be used to tailor pain medication based on a patient's individual sensory profile and thus comprises a significant step towards personalized pain medicine. PMID:23469256
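The paper's predictor can be illustrated with a simplified threshold rule on the ePDT ratio. The study itself used a support vector machine; the ratios, labels and cut-off below are made up for illustration, exploiting the reported pattern that responders were hypersensitive, i.e. had lower pancreatic-to-control ratios.

```python
def classify(epdt_ratios, threshold=1.35):
    """Label patients as predicted responders when the ePDT ratio falls
    below a threshold (lower ratio = hypersensitive pancreatic area)."""
    return [r < threshold for r in epdt_ratios]

# Hypothetical ePDT ratios with known responder status
ratios    = [1.1, 1.2, 1.3, 1.5, 1.6, 1.9, 1.25, 1.7]
responder = [True, True, True, False, False, False, True, False]

pred = classify(ratios)
tp = sum(p and a for p, a in zip(pred, responder))
tn = sum((not p) and (not a) for p, a in zip(pred, responder))
sensitivity = tp / sum(responder)
specificity = tn / sum(not a for a in responder)
print(sensitivity, specificity)  # 1.0 1.0 on this toy data
```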

  8. Demand theory of gene regulation. II. Quantitative application to the lactose and maltose operons of Escherichia coli.

    PubMed Central

    Savageau, M A

    1998-01-01

    Induction of gene expression can be accomplished either by removing a restraining element (negative mode of control) or by providing a stimulatory element (positive mode of control). According to the demand theory of gene regulation, which was first presented in qualitative form in the 1970s, the negative mode will be selected for the control of a gene whose function is in low demand in the organism's natural environment, whereas the positive mode will be selected for the control of a gene whose function is in high demand. This theory has now been further developed in a quantitative form that reveals the importance of two key parameters: cycle time C, which is the average time for a gene to complete an ON/OFF cycle, and demand D, which is the fraction of the cycle time that the gene is ON. Here we estimate nominal values for the relevant mutation rates and growth rates and apply the quantitative demand theory to the lactose and maltose operons of Escherichia coli. The results define regions of the C vs. D plot within which selection for the wild-type regulatory mechanisms is realizable, and these in turn provide the first estimates for the minimum and maximum values of demand that are required for selection of the positive and negative modes of gene control found in these systems. The ratio of mutation rate to selection coefficient is the most relevant determinant of the realizable region for selection, and the most influential parameter is the selection coefficient that reflects the reduction in growth rate when there is superfluous expression of a gene. The quantitative theory predicts the rate and extent of selection for each mode of control. It also predicts three critical values for the cycle time. The predicted maximum value for the cycle time C is consistent with the lifetime of the host. The predicted minimum value for C is consistent with the time for transit through the intestinal tract without colonization. 
Finally, the theory predicts an optimum value of C that is in agreement with the observed frequency for E. coli colonizing the human intestinal tract. PMID:9691028

  9. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of "confounding amplification" to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method's requirements and assumptions are met. Previously published data are used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method's estimates (although bootstrapping is one plausible approach). Conclusions: To this author's knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. 
The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
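One plausible reading of the estimation step — dividing the between-model change in the treatment-effect estimate by the estimated degree of amplification, after adjusting for any direct association of the introduced variable with outcome — can be put into a few lines. All numbers are hypothetical, and the precise amplification adjustment used by the method may differ from this sketch.

```python
def residual_confounding(effect_base, effect_amplified,
                         amplification_factor, outcome_adjustment=0.0):
    """Sketch of an ACCE-style estimate: residual confounding inferred from
    how much the effect estimate shifts when confounding is deliberately
    amplified by a known factor (here taken to scale the bias by factor - 1)."""
    delta = effect_amplified - effect_base - outcome_adjustment
    return delta / (amplification_factor - 1.0)

# Hypothetical: the effect estimate moves from 1.20 to 1.32 after adding a
# strong predictor of exposure, with amplification estimated at 1.5x and no
# direct variable-outcome association
print(residual_confounding(1.20, 1.32, 1.5))
```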

  10. Emission Computed Tomography: A New Technique for the Quantitative Physiologic Study of Brain and Heart in Vivo

    DOE R&D Accomplishments Database

    Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.

    1978-01-01

Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This capability, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and which has been well characterized in terms of spatial resolution, sensitivity and signal-to-noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.

  11. National Centers for Environmental Prediction

    Science.gov Websites

    ENSEMBLE PRODUCTS & DATA SOURCES Probabilistic Forecasts of Quantitative Precipitation from the NCEP Predictability Research with Indian Monsoon Examples - PDF - 28 Mar 2005 North American Ensemble Forecast System QUANTITATIVE PRECIPITATION *PQPF* In these charts, the probability that 24-hour precipitation amounts over a

  12. Predicting Children's Reading and Mathematics Achievement from Early Quantitative Knowledge and Domain-General Cognitive Abilities

    PubMed Central

    Chu, Felicia W.; vanMarle, Kristy; Geary, David C.

    2016-01-01

    One hundred children (44 boys) participated in a 3-year longitudinal study of the development of basic quantitative competencies and the relation between these competencies and later mathematics and reading achievement. The children's preliteracy knowledge, intelligence, executive functions, and parental educational background were also assessed. The quantitative tasks assessed a broad range of symbolic and nonsymbolic knowledge and were administered four times across 2 years of preschool. Mathematics achievement was assessed at the end of each of 2 years of preschool, and mathematics and word reading achievement were assessed at the end of kindergarten. Our goals were to determine how domain-general abilities contribute to growth in children's quantitative knowledge and to determine how domain-general and domain-specific abilities contribute to children's preschool mathematics achievement and kindergarten mathematics and reading achievement. We first identified four core quantitative competencies (e.g., knowledge of the cardinal value of number words) that predict later mathematics achievement. The domain-general abilities were then used to predict growth in these competencies across 2 years of preschool, and the combination of domain-general abilities, preliteracy skills, and core quantitative competencies were used to predict mathematics achievement across preschool and mathematics and word reading achievement at the end of kindergarten. Both intelligence and executive functions predicted growth in the four quantitative competencies, especially across the first year of preschool. A combination of domain-general and domain-specific competencies predicted preschoolers' mathematics achievement, with a trend for domain-specific skills to be more strongly related to achievement at the beginning of preschool than at the end of preschool. 
Preschool preliteracy skills, sensitivity to the relative quantities of collections of objects, and cardinal knowledge predicted reading and mathematics achievement at the end of kindergarten. Preliteracy skills were more strongly related to word reading, whereas sensitivity to relative quantity was more strongly related to mathematics achievement. The overall results indicate that a combination of domain-general and domain-specific abilities contribute to development of children's early mathematics and reading achievement. PMID:27252675

  13. Predicting Children's Reading and Mathematics Achievement from Early Quantitative Knowledge and Domain-General Cognitive Abilities.

    PubMed

    Chu, Felicia W; vanMarle, Kristy; Geary, David C

    2016-01-01

    One hundred children (44 boys) participated in a 3-year longitudinal study of the development of basic quantitative competencies and the relation between these competencies and later mathematics and reading achievement. The children's preliteracy knowledge, intelligence, executive functions, and parental educational background were also assessed. The quantitative tasks assessed a broad range of symbolic and nonsymbolic knowledge and were administered four times across 2 years of preschool. Mathematics achievement was assessed at the end of each of 2 years of preschool, and mathematics and word reading achievement were assessed at the end of kindergarten. Our goals were to determine how domain-general abilities contribute to growth in children's quantitative knowledge and to determine how domain-general and domain-specific abilities contribute to children's preschool mathematics achievement and kindergarten mathematics and reading achievement. We first identified four core quantitative competencies (e.g., knowledge of the cardinal value of number words) that predict later mathematics achievement. The domain-general abilities were then used to predict growth in these competencies across 2 years of preschool, and the combination of domain-general abilities, preliteracy skills, and core quantitative competencies were used to predict mathematics achievement across preschool and mathematics and word reading achievement at the end of kindergarten. Both intelligence and executive functions predicted growth in the four quantitative competencies, especially across the first year of preschool. A combination of domain-general and domain-specific competencies predicted preschoolers' mathematics achievement, with a trend for domain-specific skills to be more strongly related to achievement at the beginning of preschool than at the end of preschool. 
Preschool preliteracy skills, sensitivity to the relative quantities of collections of objects, and cardinal knowledge predicted reading and mathematics achievement at the end of kindergarten. Preliteracy skills were more strongly related to word reading, whereas sensitivity to relative quantity was more strongly related to mathematics achievement. The overall results indicate that a combination of domain-general and domain-specific abilities contribute to development of children's early mathematics and reading achievement.

  14. Improved methods for predicting peptide binding affinity to MHC class II molecules.

    PubMed

    Jensen, Kamilla Kjaergaard; Andreatta, Massimo; Marcatili, Paolo; Buus, Søren; Greenbaum, Jason A; Yan, Zhen; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten

    2018-07-01

Major histocompatibility complex class II (MHC-II) molecules are expressed on the surface of professional antigen-presenting cells where they display peptides to T helper cells, which orchestrate the onset and outcome of many host immune responses. Understanding which peptides will be presented by the MHC-II molecule is therefore important for understanding the activation of T helper cells and can be used to identify T-cell epitopes. Here we present updated versions of two MHC-II-peptide binding affinity prediction methods, NetMHCII and NetMHCIIpan. These were constructed using an extended data set of quantitative MHC-peptide binding affinity data obtained from the Immune Epitope Database covering HLA-DR, HLA-DQ, HLA-DP and H-2 mouse molecules. We show that training with this extended data set improved the performance of peptide binding predictions for both methods. Both methods are publicly available at www.cbs.dtu.dk/services/NetMHCII-2.3 and www.cbs.dtu.dk/services/NetMHCIIpan-3.2. © 2018 John Wiley & Sons Ltd.

  15. Quantitative imaging features of pretreatment CT predict volumetric response to chemotherapy in patients with colorectal liver metastases.

    PubMed

    Creasy, John M; Midya, Abhishek; Chakraborty, Jayasree; Adams, Lauryn B; Gomes, Camilla; Gonen, Mithat; Seastedt, Kenneth P; Sutton, Elizabeth J; Cercek, Andrea; Kemeny, Nancy E; Shia, Jinru; Balachandran, Vinod P; Kingham, T Peter; Allen, Peter J; DeMatteo, Ronald P; Jarnagin, William R; D'Angelica, Michael I; Do, Richard K G; Simpson, Amber L

    2018-06-19

This study investigates whether quantitative image analysis of pretreatment CT scans can predict volumetric response to chemotherapy for patients with colorectal liver metastases (CRLM). Patients treated with chemotherapy for CRLM (hepatic artery infusion (HAI) combined with systemic, or systemic alone) were included in the study. Patients were imaged at baseline and approximately 8 weeks after treatment. Response was measured as the percentage change in tumour volume from baseline. Quantitative imaging features were derived from the index hepatic tumour on pretreatment CT, and features statistically significant on univariate analysis were included in a linear regression model to predict volumetric response. The regression model was constructed from 70% of the data, while 30% were reserved for testing. Test data were input into the trained model. Model performance was evaluated with mean absolute prediction error (MAPE) and R². Clinicopathologic factors were assessed for correlation with response. 157 patients were included, split into training (n = 110) and validation (n = 47) sets. MAPE from the multivariate linear regression model was 16.5% (R² = 0.774) and 21.5% in the training and validation sets, respectively. Stratified by HAI utilisation, MAPE in the validation set was 19.6% for HAI and 25.1% for systemic chemotherapy alone. Clinical factors associated with differences in median tumour response were treatment strategy, systemic chemotherapy regimen, age and KRAS mutation status (p < 0.05). Quantitative imaging features extracted from pretreatment CT are promising predictors of volumetric response to chemotherapy in patients with CRLM. Pretreatment predictors of response have the potential to better select patients for specific therapies. • Colorectal liver metastases (CRLM) are downsized with chemotherapy but predicting the patients that will respond to chemotherapy is currently not possible. 
• Heterogeneity and enhancement patterns of CRLM can be measured with quantitative imaging. • Prediction model constructed that predicts volumetric response with 20% error suggesting that quantitative imaging holds promise to better select patients for specific treatments.
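The modelling pipeline — fit a linear model on 70% of cases, predict the percentage volumetric response on the held-out 30%, and score the mean absolute prediction error (MAPE) in percentage points — can be sketched on synthetic data. The single imaging feature and response values below are simulated, not the study's, and the real model used multiple features.

```python
import random

random.seed(0)

def fit_linear(xs, ys):
    """Ordinary least squares for one predictor: y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Synthetic data: a pretreatment imaging feature vs. % volumetric response
features = [random.uniform(0, 1) for _ in range(100)]
response = [-60 * f + 10 + random.gauss(0, 8) for f in features]

# 70/30 train/validation split, mirroring the study design
split = int(0.7 * len(features))
a, b = fit_linear(features[:split], response[:split])
predictions = [a + b * x for x in features[split:]]

# MAPE: mean absolute error in percentage points of volumetric response
errors = [abs(y - p) for y, p in zip(response[split:], predictions)]
mape = sum(errors) / len(errors)
print(round(mape, 1))
```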

  16. Effect of topological patterning on self-rolling of nanomembranes.

    PubMed

    Chen, Cheng; Song, Pengfei; Meng, Fanchao; Ou, Pengfei; Liu, Xinyu; Song, Jun

    2018-08-24

    The effects of topological patterning (i.e., grating and rectangular patterns) on the self-rolling behaviors of heteroepitaxial strained nanomembranes have been systematically studied. An analytical modeling framework, validated through finite-element simulations, has been formulated to predict the resultant curvature of the patterned nanomembrane as the pattern thickness and density vary. The effectiveness of the grating pattern in regulating the rolling direction of the nanomembrane has been demonstrated and quantitatively assessed. Further to the rolling of nanomembranes, a route to achieve predictive design of helical structures has been proposed and showcased. The present study provides new knowledge and mechanistic guidance towards predictive control and tuning of roll-up nanostructures via topological patterning.

  17. Hexatic smectic phase with algebraically decaying bond-orientational order

    NASA Astrophysics Data System (ADS)

    Agosta, Lorenzo; Metere, Alfredo; Dzugutov, Mikhail

    2018-05-01

    The hexatic phase predicted by the theories of two-dimensional melting is characterized by the power-law decay of the orientational correlations, whereas the in-layer bond orientational order in all the hexatic smectic phases observed so far was found to be long range. We report a hexatic smectic phase where the in-layer bond orientational correlations decay algebraically, in quantitative agreement with the hexatic ordering predicted by the theory for two dimensions. The phase was formed in a molecular dynamics simulation of a one-component system of particles interacting via a spherically symmetric potential. The present results thus demonstrate that the theoretically predicted two-dimensional hexatic order can exist in a three-dimensional system.

  18. Experimentally validated quantitative linear model for the device physics of elastomeric microfluidic valves

    NASA Astrophysics Data System (ADS)

    Kartalov, Emil P.; Scherer, Axel; Quake, Stephen R.; Taylor, Clive R.; Anderson, W. French

    2007-03-01

    A systematic experimental study and theoretical modeling of the device physics of polydimethylsiloxane "pushdown" microfluidic valves are presented. The phase space is charted by 1587 dimension combinations and encompasses 45-295 μm lateral dimensions, 16-39 μm membrane thickness, and 1-28 psi closing pressure. Three linear models are developed and tested against the empirical data, and then combined into a fourth-power-polynomial superposition. The experimentally validated final model offers a useful quantitative prediction for a valve's properties as a function of its dimensions. Typical valves (80-150 μm width) are shown to behave like thin springs.
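
    The fitting step behind such a model can be sketched generically. The functional form, coefficients, and data below are hypothetical stand-ins, not the paper's actual model or measurements; the sketch only shows how a linear-in-parameters closing-pressure model is fit to valve dimensions by least squares.

```python
import numpy as np

# Illustrative sketch only: fit a linear-in-parameters model of valve closing
# pressure to valve dimensions. The basis term (thickness/width) and all
# coefficients are invented for illustration, not taken from the paper.
rng = np.random.default_rng(0)
width = rng.uniform(45.0, 295.0, 200)      # lateral dimension, um
thickness = rng.uniform(16.0, 39.0, 200)   # membrane thickness, um

# Hypothetical ground truth: closing pressure rises with membrane thickness
# and falls with valve width (thin-spring-like behavior), plus noise.
pressure = 0.5 + 120.0 * thickness / width + rng.normal(0.0, 0.05, 200)

# Least-squares fit of the assumed basis terms: intercept and thickness/width.
X = np.column_stack([np.ones_like(width), thickness / width])
coef, *_ = np.linalg.lstsq(X, pressure, rcond=None)
```

    Swapping in richer basis terms (e.g. higher powers of the dimension ratios) turns the same least-squares step into a polynomial-superposition fit.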

  19. Pattern Search in Multi-structure Data: A Framework for the Next-Generation Evidence-based Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela C

    With the advent of personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledge-bases) to predict diagnostic risks is fast emerging. Addressing this need, we pose and address the following questions: (i) How can we jointly analyze both qualitative and quantitative data? (ii) Is the fusion of multi-structure data expected to provide better insights than either of them individually? We present experiments on two bio-medical data sets, mammography and traumatic brain studies, to demonstrate architectures and tools for evidence-pattern search.

  20. Synthetic dosage lethality in the human metabolic network is highly predictive of tumor growth and cancer patient survival.

    PubMed

    Megchelenbrink, Wout; Katzir, Rotem; Lu, Xiaowen; Ruppin, Eytan; Notebaart, Richard A

    2015-09-29

    Synthetic dosage lethality (SDL) denotes a genetic interaction between two genes whereby the underexpression of gene A combined with the overexpression of gene B is lethal. SDLs offer a promising way to kill cancer cells by inhibiting the activity of SDL partners of activated oncogenes in tumors, which are often difficult to target directly. As experimental genome-wide SDL screens are still scarce, here we introduce a network-level computational modeling framework that quantitatively predicts human SDLs in metabolism. For each enzyme pair (A, B) we systematically knock out the flux through A combined with a stepwise flux increase through B and search for pairs that reduce cellular growth more than when either enzyme is perturbed individually. The predictive signal of the emerging network of 12,000 SDLs is demonstrated in five different ways. (i) It can be successfully used to predict gene essentiality in shRNA cancer cell line screens. Moving to clinical tumors, we show that (ii) SDLs are significantly underrepresented in tumors. Furthermore, breast cancer tumors with SDLs active (iii) have smaller sizes and (iv) result in increased patient survival, indicating that activation of SDLs increases cancer vulnerability. Finally, (v) patient survival improves when multiple SDLs are present, pointing to a cumulative effect. This study lays the basis for quantitative identification of cancer SDLs in a model-based mechanistic manner. The approach presented can be used to identify SDLs in species and cell types in which "omics" data necessary for data-driven identification are missing.
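
    The pairwise screen described above can be illustrated with a toy growth model. This sketch stands in for the paper's genome-scale metabolic simulations: the growth rule, flux values, and enzyme names are invented, but the screening logic (knock out A, step up B, compare against single perturbations) follows the described procedure.

```python
# Toy sketch of the SDL screen: for each ordered enzyme pair (A, B), knock out
# flux through A, overexpress B, and flag pairs whose combined growth drop
# exceeds that of either single perturbation. The growth rule below is invented
# for illustration; the paper uses a genome-scale metabolic model instead.

def growth(v):
    # v maps enzyme name -> relative flux (1.0 = wild type).
    # Invented rule: overexpressing 'B' makes a toxic intermediate that 'A'
    # clears, so high B is only harmful when A is knocked out.
    toxin = max(0.0, v['B'] - 1.0) * max(0.0, 1.0 - v['A'])
    return max(0.0, 1.0 - 0.8 * toxin)

enzymes = ['A', 'B', 'C']
wild_type = {e: 1.0 for e in enzymes}

sdl_pairs = []
for a in enzymes:
    for b in enzymes:
        if a == b:
            continue
        knockout = dict(wild_type, **{a: 0.0})    # underexpress a
        overexpr = dict(wild_type, **{b: 2.0})    # overexpress b
        combined = dict(knockout, **{b: 2.0})     # both perturbations together
        # SDL: combined perturbation reduces growth well below either alone.
        if growth(combined) < min(growth(knockout), growth(overexpr)) - 0.1:
            sdl_pairs.append((a, b))
```

    In this toy system only the pair (A, B) is flagged: either perturbation alone leaves growth untouched, while the combination is nearly lethal.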

  1. Evaluation of a New Molecular Entity as a Victim of Metabolic Drug-Drug Interactions-an Industry Perspective.

    PubMed

    Bohnert, Tonika; Patel, Aarti; Templeton, Ian; Chen, Yuan; Lu, Chuang; Lai, George; Leung, Louis; Tse, Susanna; Einolf, Heidi J; Wang, Ying-Hong; Sinz, Michael; Stearns, Ralph; Walsky, Robert; Geng, Wanping; Sudsakorn, Sirimas; Moore, David; He, Ling; Wahlstrom, Jan; Keirns, Jim; Narayanan, Rangaraj; Lang, Dieter; Yang, Xiaoqing

    2016-08-01

    Under the guidance of the International Consortium for Innovation and Quality in Pharmaceutical Development (IQ), scientists from 20 pharmaceutical companies formed a Victim Drug-Drug Interactions Working Group. This working group has conducted a review of the literature and the practices of each company on the approaches to clearance pathway identification (fCL) and estimation of the fractional contribution of each metabolizing enzyme toward metabolism (fm), along with modeling- and simulation-aided strategies for predicting the victim drug-drug interaction (DDI) liability due to modulation of drug metabolizing enzymes. Presented in this perspective are the recommendations from this working group on: 1) strategic and experimental approaches to identify fCL and fm; 2) whether those assessments may be quantitative for certain enzymes (e.g., cytochrome P450 (P450) and a limited set of uridine diphosphoglucuronosyltransferase (UGT) enzymes) or qualitative (for most other drug metabolism enzymes), and the impact of the lack of quantitative information on the latter. Multiple decision trees are presented with stepwise approaches to identify the specific enzymes involved in the metabolism of a given drug and to aid the prediction and risk assessment of a drug as a victim in DDI. Modeling and simulation approaches are also discussed to better predict DDI risk in humans. Variability and parameter sensitivity analysis are emphasized when applying modeling and simulation to capture the differences within the population used and to characterize the parameters that have the most influence on the prediction outcome. Copyright © 2016 by The American Society for Pharmacology and Experimental Therapeutics.

  2. Pathological Gleason prediction through gland ring morphometry in immunofluorescent prostate cancer images

    NASA Astrophysics Data System (ADS)

    Scott, Richard; Khan, Faisal M.; Zeineh, Jack; Donovan, Michael; Fernandez, Gerardo

    2016-03-01

    The Gleason score is the most common architectural and morphological assessment of prostate cancer severity and prognosis. Numerous quantitative techniques have been developed to approximate and duplicate the Gleason scoring system, most of them in standard H and E brightfield microscopy. Immunofluorescence (IF) image analysis of tissue pathology has recently proven extremely valuable and robust for developing prognostic assessments of disease, particularly in prostate cancer. There have been significant advances in the literature in quantitative biomarker expression as well as characterization of glandular architectures in discrete gland rings. In this work we leverage a new method of segmenting gland rings in IF images for predicting the pathological Gleason grade, both the clinical and the image-specific grade, which may not necessarily be the same. We combine these measures with nuclear-specific characteristics as assessed by the MST algorithm. Our individual features correlate well univariately with the Gleason grades, and in a multivariate setting have an accuracy of 85% in predicting the Gleason grade. Additionally, these features correlate strongly with clinical progression outcomes (CI of 0.89), significantly outperforming the clinical Gleason grades (CI of 0.78). This work presents the first assessment of morphological gland-unit features from IF images for predicting the Gleason grade.

  3. Predicting Ki67% expression from DCE-MR images of breast tumors using textural kinetic features in tumor habitats

    NASA Astrophysics Data System (ADS)

    Chaudhury, Baishali; Zhou, Mu; Farhidzadeh, Hamidreza; Goldgof, Dmitry B.; Hall, Lawrence O.; Gatenby, Robert A.; Gillies, Robert J.; Weinfurtner, Robert J.; Drukteinis, Jennifer S.

    2016-03-01

    The use of Ki67% expression, a cell proliferation marker, as a predictive and prognostic factor has been widely studied in the literature. Yet its usefulness is limited due to inconsistent cutoff scores for Ki67% expression, subjective differences in its assessment across studies, and spatial variation in expression, which make it difficult to reproduce as a reliable independent prognostic factor. Previous studies have shown that there are significant spatial variations in Ki67% expression, which may limit its clinical prognostic utility after core biopsy. These variations are most evident when examining the periphery of the tumor vs. the core. To date, prediction of Ki67% expression from quantitative image analysis of DCE-MRI is very limited. This work presents a novel computer-aided diagnosis framework that uses textural kinetics to (i) predict the ratio of periphery Ki67% expression to core Ki67% expression, and (ii) predict Ki67% expression from individual tumor habitats. The pilot cohort consists of T1-weighted, fat-saturated DCE-MR images from 17 patients. Support vector regression with a radial basis function kernel was used for predicting the Ki67% expression and ratios. The initial results show that texture features from individual tumor habitats are more predictive of the Ki67% expression ratio and spatial Ki67% expression than features from the whole tumor. The Ki67% expression ratio could be predicted with a root mean square error (RMSE) of 1.67%. Quantitative image analysis of DCE-MRI using textural kinetic habitats has the potential to be used as a non-invasive method for predicting the Ki67% percentage and ratio, thus more accurately reporting high Ki67% expression for patient prognosis.

  4. Quantitative metal magnetic memory reliability modeling for welded joints

    NASA Astrophysics Data System (ADS)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, environmental magnetic fields, and measurement noise make MMM data dispersive and complicate quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens were tested along longitudinal and horizontal lines with a TSC-2M-8 instrument during tensile fatigue experiments, with X-ray testing carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution; K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, a quantitative MMM reliability model is presented for the first time, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases as the residual life ratio T decreases, and the maximal error between the predicted reliability degree R_1 and the verification reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.

  5. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    PubMed Central

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed a theoretical reasoning on the effectiveness of these markers in a trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding. PMID:28729875

  6. Larval transport modeling of deep-sea invertebrates can aid the search for undiscovered populations.

    PubMed

    Yearsley, Jon M; Sigwart, Julia D

    2011-01-01

    Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate 'stepping stone' populations yet to be discovered. We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess.

  7. Larval Transport Modeling of Deep-Sea Invertebrates Can Aid the Search for Undiscovered Populations

    PubMed Central

    Yearsley, Jon M.; Sigwart, Julia D.

    2011-01-01

    Background Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. Principal Findings In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate ‘stepping stone’ populations yet to be discovered. Conclusions/Significance We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess. PMID:21857992

  8. Quantitative prediction of phase transformations in silicon during nanoindentation

    NASA Astrophysics Data System (ADS)

    Zhang, Liangchi; Basak, Animesh

    2013-08-01

    This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge in a nanoindentation are formulated as a function of the pop-out size and the depth of the nanoindentation impression. This simple formula enables fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which had been impossible before.

  9. Teacher Emotions in the Classroom: Associations with Students' Engagement, Classroom Discipline and the Interpersonal Teacher-Student Relationship

    ERIC Educational Resources Information Center

    Hagenauer, Gerda; Hascher, Tina; Volet, Simone E.

    2015-01-01

    The present study explores teacher emotions, in particular how they are predicted by students' behaviour and the interpersonal aspect of the teacher-student relationship (TSR). One hundred thirty-two secondary teachers participated in a quantitative study relying on self-report questionnaire data. Based on the model of teacher emotions by Frenzel…

  10. Can Ambiguity Tolerance, Success in Reading, and Gender Predict the Foreign Language Reading Anxiety?

    ERIC Educational Resources Information Center

    Genç, Gülten

    2016-01-01

    The present study focuses on the relationship between reading anxiety and ambiguity tolerance of 295 Turkish EFL learners of English (180 females, 115 males). Data were collected using the Turkish version of FLRAS and SLTAS in 2015-2016 academic year. The overall design of the study was based on the quantitative research method. Data were…

  11. Predictive Power of Attention and Reading Readiness Variables on Auditory Reasoning and Processing Skills of Six-Year-Old Children

    ERIC Educational Resources Information Center

    Erbay, Filiz

    2013-01-01

    The aim of present research was to describe the relation of six-year-old children's attention and reading readiness skills (general knowledge, word comprehension, sentences, and matching) with their auditory reasoning and processing skills. This was a quantitative study based on scanning model. Research sampling consisted of 204 kindergarten…

  12. Prediction and generation of binary Markov processes: Can a finite-state fox catch a Markov mouse?

    NASA Astrophysics Data System (ADS)

    Ruebeck, Joshua B.; James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2018-01-01

    Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independently and identically distributed.

  13. Quantitative AOP-based predictions for two aromatase inhibitors evaluating the influence of bioaccumulation on prediction accuracy

    EPA Science Inventory

    The adverse outcome pathway (AOP) framework can be used to support the use of mechanistic toxicology data as a basis for risk assessment. For certain risk contexts this includes defining, quantitative linkages between the molecular initiating event (MIE) and subsequent key events...

  14. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  15. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  16. Optical properties of acute kidney injury measured by quantitative phase imaging

    PubMed Central

    Ban, Sungbea; Min, Eunjung; Baek, Songyee; Kwon, Hyug Moo; Popescu, Gabriel

    2018-01-01

    The diagnosis of acute kidney injury (AKI) has been examined mainly by histology, immunohistochemistry and western blot. Though these approaches are widely accepted in the field, they have an inherent limitation due to the lack of high-throughput, quantitative information. For a better understanding of prognosis in AKI, we present a new approach using quantitative phase imaging combined with a wide-field scanning platform. Through the phase-delay information from the tissue, we were able to predict the stage of AKI based on various optical properties such as the light scattering coefficient and anisotropy. These optical parameters quantify the deterioration process in the AKI tissue model. Our device would be a very useful tool when fast feedback on tissue pathology is required or when diseases are related to mechanical properties such as fibrosis. PMID:29541494

  17. Quantitative High-Resolution Genomic Analysis of Single Cancer Cells

    PubMed Central

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A.; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single-cell diagnostics. PMID:22140428

  18. Classical least squares multivariate spectral analysis

    DOEpatents

    Haaland, David M.

    2002-01-01

    An improved classical least squares (CLS) multivariate spectral analysis method that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of prediction-augmented classical least squares (PACLS) is the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
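
    The core idea can be sketched with synthetic spectra. This is a minimal illustration of augmenting the prediction phase with an extra spectral shape, not the patented implementation; the Gaussian peak shapes, concentrations, and noise level are all invented.

```python
import numpy as np

# Minimal PACLS-style sketch: augment the matrix of calibrated pure-component
# spectra with an extra shape describing an un-modeled interferent, so the
# least-squares prediction stays accurate when that interferent appears in an
# unknown sample. All spectra here are synthetic Gaussian peaks.
rng = np.random.default_rng(2)
wavelengths = np.linspace(0.0, 1.0, 200)

def peak(center, width):
    return np.exp(-((wavelengths - center) / width) ** 2)

s1 = peak(0.3, 0.05)            # calibrated pure component 1
s2 = peak(0.6, 0.05)            # calibrated pure component 2
interferent = peak(0.7, 0.07)   # spectral shape absent from the calibration

# Unknown sample: concentrations 2.0 and 1.0 plus the interferent and noise.
y = 2.0 * s1 + 1.0 * s2 + 0.5 * interferent + rng.normal(0.0, 0.001, 200)

K_cls = np.column_stack([s1, s2])                  # plain CLS prediction
K_pacls = np.column_stack([s1, s2, interferent])   # CLS + added spectral shape

c_cls, *_ = np.linalg.lstsq(K_cls, y, rcond=None)
c_pacls, *_ = np.linalg.lstsq(K_pacls, y, rcond=None)
```

    Because the interferent overlaps component 2, the plain CLS fit misestimates that concentration, while the augmented fit recovers both calibrated concentrations.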

  19. Void probability as a function of the void's shape and scale-invariant models. [in studies of spatial galactic distribution]

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  20. Object-oriented Persistent Homology

    PubMed Central

    Wang, Bao; Wei, Guo-Wei

    2015-01-01

    Persistent homology provides a new approach for the topological simplification of big data via measuring the life time of intrinsic topological features in a filtration process and has found its success in scientific and engineering applications. However, such a success is essentially limited to qualitative data classification and analysis. Indeed, persistent homology has rarely been employed for quantitative modeling and prediction. Additionally, the present persistent homology is a passive tool, rather than a proactive technique, for classification and analysis. In this work, we outline a general protocol to construct object-oriented persistent homology methods. By means of differential geometry theory of surfaces, we construct an objective functional, namely, a surface free energy defined on the data of interest. The minimization of the objective functional leads to a Laplace-Beltrami operator which generates a multiscale representation of the initial data and offers an objective oriented filtration process. The resulting differential geometry based object-oriented persistent homology is able to preserve desirable geometric features in the evolutionary filtration and enhances the corresponding topological persistence. The cubical complex based homology algorithm is employed in the present work to be compatible with the Cartesian representation of the Laplace-Beltrami flow. The proposed Laplace-Beltrami flow based persistent homology method is extensively validated. The consistence between Laplace-Beltrami flow based filtration and Euclidean distance based filtration is confirmed on the Vietoris-Rips complex for a large amount of numerical tests. The convergence and reliability of the present Laplace-Beltrami flow based cubical complex filtration approach are analyzed over various spatial and temporal mesh sizes. The Laplace-Beltrami flow based persistent homology approach is utilized to study the intrinsic topology of proteins and fullerene molecules. 
Based on a quantitative model which correlates the topological persistence of fullerene central cavity with the total curvature energy of the fullerene structure, the proposed method is used for the prediction of fullerene isomer stability. The efficiency and robustness of the present method are verified by more than 500 fullerene molecules. It is shown that the proposed persistent homology based quantitative model offers good predictions of total curvature energies for ten types of fullerene isomers. The present work offers the first example to design object-oriented persistent homology to enhance or preserve desirable features in the original data during the filtration process and then automatically detect or extract the corresponding topological traits from the data. PMID:26705370

  1. Evaluation of a quantitative structure-property relationship (QSPR) for predicting mid-visible refractive index of secondary organic aerosol (SOA).

    PubMed

    Redmond, Haley; Thompson, Jonathan E

    2011-04-21

    In this work we describe and evaluate a simple scheme by which the refractive index (λ = 589 nm) of non-absorbing components common to secondary organic aerosols (SOA) may be predicted from molecular formula and density (g cm(-3)). The QSPR approach described is based on three parameters linked to refractive index: molecular polarizability, the ratio of mass density to molecular weight, and degree of unsaturation. After computing these quantities for a training set of 111 compounds common to atmospheric aerosols, multi-linear regression analysis was conducted to establish a quantitative relationship between the parameters and the accepted value of the refractive index. The resulting quantitative relationship can often estimate refractive index to within ±0.01 when averaged across a variety of compound classes. A notable exception is for alcohols, for which the model consistently underestimates refractive index. Homogeneous internal mixtures can conceivably be addressed through use of either the volume or mole fraction mixing rules commonly used in the aerosol community. Predicted refractive indices reconstructed from chemical composition data presented in the literature generally agree with previous reports of SOA refractive index. Additionally, the predicted refractive indices lie near measured values we report for λ = 532 nm for SOA generated from vapors of α-pinene (R.I. 1.49-1.51) and toluene (R.I. 1.49-1.50). We envision the QSPR method may find use in reconstructing the optical scattering of organic aerosols when mass composition data are known. Alternatively, the method could be incorporated into models of organic aerosol formation/phase partitioning to better constrain organic aerosol optical properties.
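
    The regression step described above can be sketched as follows. The descriptor values, coefficients, and noise level are synthetic stand-ins, not the paper's 111-compound training set; the sketch only shows the multi-linear least-squares fit of refractive index against the three descriptor types named in the abstract.

```python
import numpy as np

# Sketch of the QSPR fitting step: multi-linear regression of refractive index
# on three descriptors - a molecular polarizability proxy, the density-to-
# molecular-weight ratio, and degree of unsaturation. All values are synthetic.
rng = np.random.default_rng(1)
n = 111
polarizability = rng.uniform(5.0, 25.0, n)     # hypothetical descriptor values
rho_over_mw = rng.uniform(0.004, 0.012, n)     # density / molecular weight
unsaturation = rng.integers(0, 5, n).astype(float)

# Hypothetical linear relationship generating the synthetic "measured" indices.
ri = (1.33 + 0.004 * polarizability + 8.0 * rho_over_mw
      + 0.01 * unsaturation + rng.normal(0.0, 0.002, n))

# Multi-linear least-squares fit over the training set.
X = np.column_stack([np.ones(n), polarizability, rho_over_mw, unsaturation])
coef, *_ = np.linalg.lstsq(X, ri, rcond=None)
rmse = float(np.sqrt(np.mean((X @ coef - ri) ** 2)))
```

    On data generated this way the fit recovers the refractive index to within a few thousandths, mirroring the ±0.01 accuracy the abstract reports for real compound classes.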

  2. Comparative study evaluating the role of color Doppler sonography and computed tomography in predicting chest wall invasion by lung tumors.

    PubMed

    Sripathi, Smiti; Mahajan, Abhishek

    2013-09-01

    To analyze qualitative and quantitative parameters of lung tumors by color Doppler sonography, determine the role of color Doppler sonography in predicting chest wall invasion by lung tumors using spectral waveform analysis, and compare color Doppler sonography and computed tomography (CT) for predicting chest wall invasion by lung tumors. Between March and September 2007, 55 patients with pleuropulmonary lesions on chest radiography were assessed by grayscale and color Doppler sonography for chest wall invasion. Four patients were excluded from the study because of poor acoustic windows. Quantitative and qualitative sonographic examinations of the lesions were performed using grayscale and color Doppler imaging. The correlation between the color Doppler and CT findings was determined, and the final outcomes were correlated with the histopathologic findings. Of a total of 51 lesions, 32 were malignant. Vascularity was present on color Doppler sonography in 28 lesions, and chest wall invasion was documented in 22 cases. Computed tomography was performed in 24 of 28 evaluable malignant lesions, and the findings were correlated with the color Doppler findings for chest wall invasion. Of the 24 patients who underwent CT, 19 showed chest wall invasion. The correlation between the color Doppler and CT findings revealed that color Doppler sonography had sensitivity of 95.6% and specificity of 100% for assessing chest wall invasion, whereas CT had sensitivity of 85.7% and specificity of 66.7%. Combined qualitative and quantitative color Doppler sonography can predict chest wall invasion by lung tumors with better sensitivity and specificity than CT. Although surgery is the reference standard, color Doppler sonography is a readily available, affordable, and noninvasive in vivo diagnostic imaging modality that is complementary to CT and magnetic resonance imaging for lung cancer staging.

  3. A Quantitative Model of Expert Transcription Typing

    DTIC Science & Technology

    1993-03-08

    side of pure psychology, several researchers have argued that transcription typing is a particularly good activity for the study of human skilled...phenomenon with a quantitative METT prediction. The first, quick and dirty analysis gives a good prediction of the copy span, in fact, it is even...typing, it should be demonstrated that the mechanism of the model does not get in the way of good predictions. If situations occur where the entire

  4. QSAR prediction of additive and non-additive mixture toxicities of antibiotics and pesticide.

    PubMed

    Qin, Li-Tang; Chen, Yu-Han; Zhang, Xin; Mo, Ling-Yun; Zeng, Hong-Hu; Liang, Yan-Peng

    2018-05-01

    Antibiotics and pesticides may exist as mixtures in the real environment. The combined effect of a mixture can be either additive or non-additive (synergistic or antagonistic). However, no effective approach exists for predicting the synergistic and antagonistic toxicities of mixtures. In this study, we developed a quantitative structure-activity relationship (QSAR) model for the toxicities (half effect concentration, EC50) of 45 binary and multi-component mixtures composed of two antibiotics and four pesticides. The acute toxicities of the single compounds and the mixtures toward Aliivibrio fischeri were tested. A genetic algorithm was used to obtain the optimized model with three theoretical descriptors. Various internal and external validation techniques indicated a coefficient of determination of 0.9366 and a root mean square error of 0.1345 for the QSAR model, which predicted the toxicities of the 45 mixtures spanning additive, synergistic, and antagonistic effects. Compared with the traditional concentration addition and independent action models, the QSAR model exhibited an advantage in predicting mixture toxicity. Thus, the presented approach may be able to fill the gap in predicting non-additive toxicities of binary and multi-component mixtures. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Sensitivity and specificity of radiographic methods for predicting insertion torque of dental implants.

    PubMed

    Cortes, Arthur Rodriguez Gonzalez; Eimar, Hazem; Barbosa, Jorge de Sá; Costa, Claudio; Arita, Emiko Saito; Tamimi, Faleh

    2015-05-01

    Subjective radiographic classifications of alveolar bone have been proposed and correlated with implant insertion torque (IT). The present diagnostic study aims to identify quantitative bone features influencing IT and to use these findings to develop an objective radiographic classification for predicting IT. Demographics, panoramic radiographs (taken at the beginning of dental treatment), and cone-beam computed tomographic scans (taken for implant surgical planning) of 25 patients receiving 31 implants were analyzed. Bone samples retrieved from implant sites were assessed with dual x-ray absorptiometry, microcomputed tomography, and histology. Odds ratio, sensitivity, and specificity of all variables to predict high peak IT were assessed. A ridge cortical thickness >0.75 mm and a normal appearance of the inferior mandibular cortex were the most sensitive variables for predicting high peak IT (87.5% and 75%, respectively). A classification based on the combination of both variables presented high sensitivity (90.9%) and specificity (100%) for predicting IT. Within the limitations of this study, the results suggest that it is possible to predict IT accurately based on radiographic findings of the patient. This could be useful in the treatment plan of immediate loading cases.

  6. Chemistry of atmosphere-surface interactions on Venus and Mars

    NASA Astrophysics Data System (ADS)

    Fegley, Bruce, Jr.; Treiman, Allan H.

    Earth-based, earth-orbital, and spacecraft observational data are used in the present evaluation of Venus atmosphere-surface interactions to quantitatively characterize the reactions between C, H, S, Cl, F, and N gases and plausible surface minerals. Calculation results are used to predict stable minerals and mineral assemblages on the Venus surface, in order to ascertain which (if any) of the atmospheric gases are buffered by mineral assemblages. Chemical equilibrium calculations using extant thermodynamic data on scapolite minerals predict that carbonate-bearing scapolite and sulfate meionite are unstable on the surface of Venus, while chloride-bearing scapolite is stable.

  7. Testing hadronic interaction models using a highly granular silicon-tungsten calorimeter

    NASA Astrophysics Data System (ADS)

    Bilki, B.; Repond, J.; Schlereth, J.; Xia, L.; Deng, Z.; Li, Y.; Wang, Y.; Yue, Q.; Yang, Z.; Eigen, G.; Mikami, Y.; Price, T.; Watson, N. K.; Thomson, M. A.; Ward, D. R.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Cârloganu, C.; Chang, S.; Khan, A.; Kim, D. H.; Kong, D. J.; Oh, Y. D.; Blazey, G. C.; Dyshkant, A.; Francis, K.; Lima, J. G. R.; Salcido, P.; Zutshi, V.; Boisvert, V.; Green, B.; Misiejuk, A.; Salvatore, F.; Kawagoe, K.; Miyazaki, Y.; Sudo, Y.; Suehara, T.; Tomita, T.; Ueno, H.; Yoshioka, T.; Apostolakis, J.; Folger, G.; Ivantchenko, V.; Ribon, A.; Uzhinskiy, V.; Cauwenbergh, S.; Tytgat, M.; Zaganidis, N.; Hostachy, J.-Y.; Morin, L.; Gadow, K.; Göttlicher, P.; Günter, C.; Krüger, K.; Lutz, B.; Reinecke, M.; Sefkow, F.; Feege, N.; Garutti, E.; Laurien, S.; Lu, S.; Marchesini, I.; Matysek, M.; Ramilli, M.; Kaplan, A.; Norbeck, E.; Northacker, D.; Onel, Y.; Kim, E. J.; van Doren, B.; Wilson, G. W.; Wing, M.; Bobchenko, B.; Chadeeva, M.; Chistov, R.; Danilov, M.; Drutskoy, A.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Popov, V.; Rusinov, V.; Tarkovsky, E.; Besson, D.; Popova, E.; Gabriel, M.; Kiesling, C.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Amjad, M. S.; Bonis, J.; Callier, S.; Conforti di Lorenzo, S.; Cornebise, P.; Doublet, Ph.; Dulucq, F.; Faucci-Giannelli, M.; Fleury, J.; Frisson, T.; Kégl, B.; van der Kolk, N.; Li, H.; Martin-Chassard, G.; Richard, F.; de La Taille, Ch.; Pöschl, R.; Raux, L.; Rouëné, J.; Seguin-Moreau, N.; Anduze, M.; Balagura, V.; Becheva, E.; Boudry, V.; Brient, J.-C.; Cornat, R.; Frotin, M.; Gastaldi, F.; Magniette, F.; Matthieu, A.; Mora de Freitas, P.; Videau, H.; Augustin, J.-E.; David, J.; Ghislain, P.; Lacour, D.; Lavergne, L.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Jeans, D.; Götze, M.; Calice Collaboration

    2015-09-01

    A detailed study of hadronic interactions is presented using data recorded with the highly granular CALICE silicon-tungsten electromagnetic calorimeter. Approximately 350,000 selected π- events at energies between 2 and 10 GeV have been studied. The predictions of several physics models available within the GEANT4 simulation toolkit are compared to these data. A reasonable overall description of the data is observed; the Monte Carlo predictions are within 20% of the data, and for many observables much closer. The largest quantitative discrepancies are found in the longitudinal and transverse distributions of reconstructed energy.

  8. Quantitative structure-activity relationship modeling of rat acute toxicity by oral exposure.

    PubMed

    Zhu, Hao; Martin, Todd M; Ye, Lin; Sedykh, Alexander; Young, Douglas M; Tropsha, Alexander

    2009-12-01

    Few quantitative structure-activity relationship (QSAR) studies have successfully modeled large, diverse rodent toxicity end points. In this study, a comprehensive data set of 7385 compounds with their most conservative lethal dose (LD50) values has been compiled. A combinatorial QSAR approach has been employed to develop robust and predictive models of acute toxicity in rats caused by oral exposure to chemicals. To enable fair comparison between the predictive power of models generated in this study versus a commercial toxicity predictor, TOPKAT (Toxicity Prediction by Komputer Assisted Technology), a modeling subset of the entire data set was selected that included all 3472 compounds used in TOPKAT's training set. The remaining 3913 compounds, which were not present in the TOPKAT training set, were used as the external validation set. QSAR models of five different types were developed for the modeling set. The prediction accuracy for the external validation set was estimated by the determination coefficient R² of linear regression between actual and predicted LD50 values. The use of the applicability domain threshold implemented in most models generally improved the external prediction accuracy but expectedly led to a decrease in chemical space coverage; depending on the applicability domain threshold, R² ranged from 0.24 to 0.70. Ultimately, several consensus models were developed by averaging the predicted LD50 for every compound using all five models. The consensus models afforded higher prediction accuracy for the external validation data set, with higher coverage, as compared to individual constituent models. The validated consensus LD50 models developed in this study can be used as reliable computational predictors of in vivo acute toxicity.
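
    The consensus step described above amounts to averaging each compound's prediction across the five individual models and scoring accuracy by the determination coefficient of the actual-versus-predicted regression. A minimal numpy sketch with hypothetical values (not the study's data):

```python
import numpy as np

def consensus_prediction(model_preds):
    """Average per-compound predictions across models (rows = models)."""
    return np.mean(model_preds, axis=0)

def r_squared(actual, predicted):
    """Determination coefficient of the linear fit between actual and predicted."""
    return np.corrcoef(actual, predicted)[0, 1] ** 2

# Hypothetical log-LD50 predictions from five models for four compounds.
preds = np.array([
    [2.1, 3.0, 1.2, 4.1],
    [2.3, 2.8, 1.0, 4.3],
    [1.9, 3.2, 1.1, 3.9],
    [2.2, 3.1, 0.9, 4.0],
    [2.0, 2.9, 1.3, 4.2],
])
actual = np.array([2.0, 3.0, 1.1, 4.0])
consensus = consensus_prediction(preds)
print("consensus:", consensus, "R^2 =", round(r_squared(actual, consensus), 3))
```

    Averaging tends to cancel the uncorrelated errors of the constituent models, which is why the abstract reports higher accuracy for the consensus than for any single model.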

  9. A quantitative systems physiology model of renal function and blood pressure regulation: Model description.

    PubMed

    Hallow, K M; Gebremichael, Y

    2017-06-01

    Renal function plays a central role in cardiovascular, kidney, and multiple other diseases, and many existing and novel therapies act through renal mechanisms. Even with decades of accumulated knowledge of renal physiology, pathophysiology, and pharmacology, the dynamics of renal function remain difficult to understand and predict, often resulting in unexpected or counterintuitive therapy responses. Quantitative systems pharmacology modeling of renal function integrates this accumulated knowledge into a quantitative framework, allowing evaluation of competing hypotheses, identification of knowledge gaps, and generation of new experimentally testable hypotheses. Here we present a model of renal physiology and control mechanisms involved in maintaining sodium and water homeostasis. This model represents the core renal physiological processes involved in many research questions in drug development. The model runs in R and the code is made available. In a companion article, we present a case study using the model to explore mechanisms and pharmacology of salt-sensitive hypertension. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  10. Assessment of quantitative structure-activity relationship of toxicity prediction models for Korean chemical substance control legislation

    PubMed Central

    Kim, Kwang-Yon; Shin, Seong Eun; No, Kyoung Tai

    2015-01-01

    Objectives For successful adoption of legislation controlling the registration and assessment of chemical substances, it is important to obtain sufficient toxicological experimental evidence and other related information. It is also essential to obtain a sufficient number of predicted risk and toxicity results. In particular, methods for predicting the toxicities of chemical substances during acquisition of the required data ultimately become an economical means of dealing with new substances in the future. Although the need for such methods is gradually increasing, the required information about their reliability and applicability range has not been systematically provided. Methods There are various representative environmental and human toxicity models based on quantitative structure-activity relationships (QSAR). Here, we secured 10 representative QSAR-based prediction models, and their accompanying information, that can make predictions about substances expected to be regulated. We used models that predict and confirm the usability of the information expected to be collected and submitted according to the legislation. After collecting and evaluating each predictive model and its relevant data, we prepared methods for quantifying scientific validity and reliability, which are essential conditions for using predictive models. Results We calculated predicted values for the models. Furthermore, we deduced and compared the adequacies of the models using the Alternative non-testing method assessed for Registration, Evaluation, Authorization, and Restriction of Chemicals scoring system, and deduced the applicability domains for each model. Additionally, we calculated and compared inclusion rates of substances expected to be regulated, to confirm applicability. Conclusions We evaluated and compared the data, adequacy, and applicability of our selected QSAR-based toxicity prediction models, and included them in a database. 
Based on this data, we aimed to construct a system that can be used with predicted toxicity results. Furthermore, by presenting the suitability of individual predicted results, we aimed to provide a foundation that could be used in actual assessments and regulations. PMID:26206368

  11. Quantitative prediction of perceptual decisions during near-threshold fear detection

    NASA Astrophysics Data System (ADS)

    Pessoa, Luiz; Padmala, Srikanth

    2005-04-01

    A fundamental goal of cognitive neuroscience is to explain how mental decisions originate from basic neural mechanisms. The goal of the present study was to investigate the neural correlates of perceptual decisions in the context of emotional perception. To probe this question, we investigated how fluctuations in functional MRI (fMRI) signals were correlated with behavioral choice during a near-threshold fear detection task. fMRI signals predicted behavioral choice independently of stimulus properties and task accuracy in a network of brain regions linked to emotional processing: posterior cingulate cortex, medial prefrontal cortex, right inferior frontal gyrus, and left insula. We quantified the link between fMRI signals and behavioral choice in a whole-brain analysis by determining choice probabilities by means of signal-detection theory methods. Our results demonstrate that voxel-wise fMRI signals can reliably predict behavioral choice in a quantitative fashion (choice probabilities ranged from 0.63 to 0.78) at levels comparable to neuronal data. We suggest that the conscious decision that a fearful face has been seen is represented across a network of interconnected brain regions that prepare the organism to appropriately handle emotionally challenging stimuli and that regulate the associated emotional response. Keywords: decision making | emotion | functional MRI

  12. Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships (MCBIOS)

    EPA Science Inventory

    Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships Jie Liu1,2, Richard Judson1, Matthew T. Martin1, Huixiao Hong3, Imran Shah1 1National Center for Computational Toxicology (NCCT), US EPA, RTP, NC...

  13. Investigation of a redox-sensitive predictive model of mouse embryonic stem cells differentiation using quantitative nuclease protection assays and glutathione redox status

    EPA Science Inventory

    Investigation of a redox-sensitive predictive model of mouse embryonic stem cell differentiation via quantitative nuclease protection assays and glutathione redox status Chandler KJ,Hansen JM, Knudsen T,and Hunter ES 1. U.S. Environmental Protection Agency, Research Triangl...

  14. SVD compression for magnetic resonance fingerprinting in the time domain.

    PubMed

    McGivney, Debra F; Pierre, Eric; Ma, Dan; Jiang, Yun; Saybasili, Haris; Gulani, Vikas; Griswold, Mark A

    2014-12-01

    Magnetic resonance (MR) fingerprinting is a technique for acquiring and processing MR data that simultaneously provides quantitative maps of different tissue parameters through a pattern recognition algorithm. A predefined dictionary models the possible signal evolutions simulated using the Bloch equations with different combinations of various MR parameters, and pattern recognition is completed by computing the inner product between the observed signal and each of the predicted signals within the dictionary. Though this matching algorithm has been shown to accurately predict the MR parameters of interest, one desires a more efficient method to obtain the quantitative images. We propose to compress the dictionary using the singular value decomposition, which will provide a low-rank approximation. By compressing the size of the dictionary in the time domain, we are able to speed up the pattern recognition algorithm by a factor of between 3.4 and 4.8, without sacrificing the high signal-to-noise ratio of the original scheme presented previously.
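
    The compression scheme described here can be sketched with a toy dictionary. The example below uses simple exponential decays as stand-ins for Bloch-simulated signal evolutions (the curve family, grid sizes, and rank are illustrative assumptions, not the paper's actual dictionary); matching in the SVD-compressed domain recovers the same dictionary entry as full time-domain matching while working with far shorter vectors:

```python
import numpy as np

# Hypothetical dictionary of simulated decay curves, standing in for the
# Bloch-simulated signal evolutions: 300 time points x 150 candidate T2 values.
t = np.linspace(0.01, 1.0, 300)[:, None]
T2 = np.linspace(0.02, 0.5, 150)[None, :]
D = np.exp(-t / T2)
D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary entries

true_idx = 80
rng = np.random.default_rng(0)
signal = D[:, true_idx] + 1e-4 * rng.standard_normal(300)

# Compress: keep the top-k left singular vectors as a low-rank basis.
k = 10
U, s, Vt = np.linalg.svd(D, full_matrices=False)
Uk = U[:, :k]
D_c = Uk.T @ D                          # compressed dictionary, k x 150
sig_c = Uk.T @ signal                   # compressed measurement, length k

# Pattern recognition by inner product, full vs compressed time domain.
match_full = int(np.argmax(signal @ D))
match_comp = int(np.argmax(sig_c @ D_c))
# Both searches land on (or immediately next to) the true entry, but the
# compressed search uses length-10 vectors instead of length-300.
```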

  15. Genotype-phenotype association study via new multi-task learning model

    PubMed Central

    Huo, Zhouyuan; Shen, Dinggang

    2018-01-01

    Research on the associations between genetic variations and imaging phenotypes is developing with the advance in high-throughput genotype and brain image techniques. Regression analysis of single nucleotide polymorphisms (SNPs) and imaging measures as quantitative traits (QTs) has been proposed to identify the quantitative trait loci (QTL) via multi-task learning models. Recent studies consider the interlinked structures within SNPs and imaging QTs through group lasso, e.g. ℓ2,1-norm, leading to better predictive results and insights into SNPs. However, group sparsity is not enough for representing the correlation between multiple tasks, and ℓ2,1-norm regularization is not robust either. In this paper, we propose a new multi-task learning model to analyze the associations between SNPs and QTs. We suppose that low-rank structure is also beneficial to uncover the correlation between genetic variations and imaging phenotypes. Finally, we conduct regression analysis of SNPs and QTs. Experimental results show that our model is more accurate in prediction than compared methods and presents new insights into SNPs. PMID:29218896
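
    The ℓ2,1-norm regularizer mentioned above has a compact definition: the sum of the Euclidean norms of the rows of the weight matrix, which drives entire rows (i.e., whole SNPs across all imaging tasks) to zero together. A minimal sketch, with an illustrative matrix:

```python
import numpy as np

def l21_norm(W):
    """Sum of row-wise Euclidean norms: encourages whole rows of W
    (one row per SNP, one column per imaging QT) to vanish together."""
    return float(np.sum(np.linalg.norm(W, axis=1)))

W = np.array([[3.0, 4.0],    # active SNP: row norm 5
              [0.0, 0.0],    # inactive SNP: contributes nothing
              [0.0, 2.0]])   # row norm 2
print(l21_norm(W))  # 7.0
```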

  16. Importance of Multimodal MRI in Characterizing Brain Tissue and Its Potential Application for Individual Age Prediction.

    PubMed

    Cherubini, Andrea; Caligiuri, Maria Eugenia; Peran, Patrice; Sabatini, Umberto; Cosentino, Carlo; Amato, Francesco

    2016-09-01

    This study presents a voxel-based multiple regression analysis of different magnetic resonance image modalities, including anatomical T1-weighted, T2(*) relaxometry, and diffusion tensor imaging. Quantitative parameters sensitive to complementary brain tissue alterations, including morphometric atrophy, mineralization, microstructural damage, and anisotropy loss, were compared in a linear physiological aging model in 140 healthy subjects (range 20-74 years). The performance of different predictors and the identification of the best biomarker of age-induced structural variation were compared without a priori anatomical knowledge. The best quantitative predictors in several brain regions were iron deposition and microstructural damage, rather than macroscopic tissue atrophy. Age variations were best resolved with a combination of markers, suggesting that multiple predictors better capture age-induced tissue alterations. The results of the linear model were used to predict apparent age in different regions of individual brain. This approach pointed to a number of novel applications that could potentially help highlighting areas particularly vulnerable to disease.

  17. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
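
    The handbook summary does not name the specific reliability model behind the JSC tools, but the quantity it describes, the probability of failure-free operation, is conventionally expressed with a constant-failure-rate exponential model. The sketch below is that textbook baseline under an assumed failure rate, not the handbook's actual implementation:

```python
import math

def reliability(failure_rate, t):
    """Probability of failure-free operation over duration t, assuming
    failures arrive as a Poisson process with a constant rate."""
    return math.exp(-failure_rate * t)

# Hypothetical: 0.002 failures/hour over a 100-hour mission.
r = reliability(0.002, 100)   # exp(-0.2), roughly 0.819
```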

  18. SVD Compression for Magnetic Resonance Fingerprinting in the Time Domain

    PubMed Central

    McGivney, Debra F.; Pierre, Eric; Ma, Dan; Jiang, Yun; Saybasili, Haris; Gulani, Vikas; Griswold, Mark A.

    2016-01-01

    Magnetic resonance fingerprinting is a technique for acquiring and processing MR data that simultaneously provides quantitative maps of different tissue parameters through a pattern recognition algorithm. A predefined dictionary models the possible signal evolutions simulated using the Bloch equations with different combinations of various MR parameters, and pattern recognition is completed by computing the inner product between the observed signal and each of the predicted signals within the dictionary. Though this matching algorithm has been shown to accurately predict the MR parameters of interest, one desires a more efficient method to obtain the quantitative images. We propose to compress the dictionary using the singular value decomposition (SVD), which will provide a low-rank approximation. By compressing the size of the dictionary in the time domain, we are able to speed up the pattern recognition algorithm by a factor of between 3.4 and 4.8, without sacrificing the high signal-to-noise ratio of the original scheme presented previously. PMID:25029380

  19. The Incremental Value of Subjective and Quantitative Assessment of 18F-FDG PET for the Prediction of Pathologic Complete Response to Preoperative Chemoradiotherapy in Esophageal Cancer.

    PubMed

    van Rossum, Peter S N; Fried, David V; Zhang, Lifei; Hofstetter, Wayne L; van Vulpen, Marco; Meijer, Gert J; Court, Laurence E; Lin, Steven H

    2016-05-01

    A reliable prediction of a pathologic complete response (pathCR) to chemoradiotherapy before surgery for esophageal cancer would enable investigators to study the feasibility and outcome of an organ-preserving strategy after chemoradiotherapy. So far no clinical parameters or diagnostic studies are able to accurately predict which patients will achieve a pathCR. The aim of this study was to determine whether subjective and quantitative assessment of baseline and postchemoradiation (18)F-FDG PET can improve the accuracy of predicting pathCR to preoperative chemoradiotherapy in esophageal cancer beyond clinical predictors. This retrospective study was approved by the institutional review board, and the need for written informed consent was waived. Clinical parameters along with subjective and quantitative parameters from baseline and postchemoradiation (18)F-FDG PET were derived from 217 esophageal adenocarcinoma patients who underwent chemoradiotherapy followed by surgery. The associations between these parameters and pathCR were studied in univariable and multivariable logistic regression analysis. Four prediction models were constructed and internally validated using bootstrapping to study the incremental predictive values of subjective assessment of (18)F-FDG PET, conventional quantitative metabolic features, and comprehensive (18)F-FDG PET texture/geometry features, respectively. The clinical benefit of (18)F-FDG PET was determined using decision-curve analysis. A pathCR was found in 59 (27%) patients. A clinical prediction model (corrected c-index, 0.67) was improved by adding (18)F-FDG PET-based subjective assessment of response (corrected c-index, 0.72). This latter model was slightly improved by the addition of 1 conventional quantitative metabolic feature only (i.e., postchemoradiation total lesion glycolysis; corrected c-index, 0.73), and even more by subsequently adding 4 comprehensive (18)F-FDG PET texture/geometry features (corrected c-index, 0.77). 
However, at a decision threshold of 0.9 or higher, representing a clinically relevant predictive value for pathCR at which one may be willing to omit surgery, there was no clear incremental value. Subjective and quantitative assessment of (18)F-FDG PET provides statistical incremental value for predicting pathCR after preoperative chemoradiotherapy in esophageal cancer. However, the discriminatory improvement beyond clinical predictors does not translate into a clinically relevant benefit that could change decision making. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  20. Early prediction of coma recovery after cardiac arrest with blinded pupillometry.

    PubMed

    Solari, Daria; Rossetti, Andrea O; Carteron, Laurent; Miroz, John-Paul; Novy, Jan; Eckert, Philippe; Oddo, Mauro

    2017-06-01

    Prognostication studies on comatose cardiac arrest (CA) patients are limited by lack of blinding, potentially causing overestimation of outcome predictors and self-fulfilling prophecy. Using a blinded approach, we analyzed the value of quantitative automated pupillometry to predict neurological recovery after CA. We examined a prospective cohort of 103 comatose adult patients who were unconscious 48 hours after CA and underwent repeated measurements of quantitative pupillary light reflex (PLR) using the Neurolight-Algiscan device. Clinical examination, electroencephalography (EEG), somatosensory evoked potentials (SSEP), and serum neuron-specific enolase were performed in parallel, as part of standard multimodal assessment. Automated pupillometry results were blinded to clinicians involved in patient care. Cerebral Performance Categories (CPC) at 1 year was the outcome endpoint. Survivors (n = 50 patients; 32 CPC 1, 16 CPC 2, 2 CPC 3) had higher quantitative PLR (median = 20 [range = 13-41] vs 11 [0-55] %, p < 0.0001) and constriction velocity (1.46 [0.85-4.63] vs 0.94 [0.16-4.97] mm/s, p < 0.0001) than nonsurvivors. At 48 hours, a quantitative PLR < 13% had 100% specificity and positive predictive value to predict poor recovery (0% false-positive rate), and provided equal performance to that of EEG and SSEP. Reduced quantitative PLR correlated with higher serum neuron-specific enolase (Spearman r = -0.52, p < 0.0001). Reduced quantitative PLR correlates with postanoxic brain injury and, when compared to standard multimodal assessment, is highly accurate in predicting long-term prognosis after CA. This is the first prognostication study to show the value of automated pupillometry using a blinded approach to minimize self-fulfilling prophecy. Ann Neurol 2017;81:804-810. © 2017 American Neurological Association.

  1. Research on Improved Depth Belief Network-Based Prediction of Cardiovascular Diseases

    PubMed Central

    Zhang, Hongpo

    2018-01-01

    Quantitative analysis and prediction can help to reduce the risk of cardiovascular disease. Quantitative prediction based on traditional models has low accuracy, and the variance of predictions from shallow neural networks is large. In this paper, a cardiovascular disease prediction model based on an improved deep belief network (DBN) is proposed. Using the reconstruction error, the network depth is determined automatically, and unsupervised training is combined with supervised optimization. This ensures the accuracy of model predictions while guaranteeing stability. Thirty experiments were performed independently on the Statlog (Heart) and Heart Disease Database data sets in the UCI database. Experimental results showed that the mean prediction accuracy was 91.26% and 89.78%, respectively. The variance of prediction accuracy was 5.78 and 4.46, respectively. PMID:29854369

  2. Thermodynamic prediction of protein neutrality.

    PubMed

    Bloom, Jesse D; Silberg, Jonathan J; Wilke, Claus O; Drummond, D Allan; Adami, Christoph; Arnold, Frances H

    2005-01-18

    We present a simple theory that uses thermodynamic parameters to predict the probability that a protein retains the wild-type structure after one or more random amino acid substitutions. Our theory predicts that for large numbers of substitutions the probability that a protein retains its structure will decline exponentially with the number of substitutions, with the severity of this decline determined by properties of the structure. Our theory also predicts that a protein can gain extra robustness to the first few substitutions by increasing its thermodynamic stability. We validate our theory with simulations on lattice protein models and by showing that it quantitatively predicts previously published experimental measurements on subtilisin and our own measurements on variants of TEM1 beta-lactamase. Our work unifies observations about the clustering of functional proteins in sequence space, and provides a basis for interpreting the response of proteins to substitutions in protein engineering applications.
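
    The predicted exponential decline can be written down directly: if each random substitution is tolerated independently with an average per-substitution neutrality ν set by the structure's properties, then P(n) = ν^n, which is linear in log-probability. A sketch with a hypothetical ν (the paper derives ν from thermodynamic stability; the value here is illustrative):

```python
import math

def p_retain(n_subs, nu):
    """Probability the protein still folds after n_subs random substitutions,
    assuming each is tolerated independently with probability nu."""
    return nu ** n_subs

nu = 0.6  # hypothetical average neutrality of the structure
probs = [p_retain(n, nu) for n in range(6)]
# log P(n) = n * log(nu): an exponential decline whose slope is set by
# the structure, matching the theory's large-n prediction.
```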

  3. Thermodynamic prediction of protein neutrality

    PubMed Central

    Bloom, Jesse D.; Silberg, Jonathan J.; Wilke, Claus O.; Drummond, D. Allan; Adami, Christoph; Arnold, Frances H.

    2005-01-01

    We present a simple theory that uses thermodynamic parameters to predict the probability that a protein retains the wild-type structure after one or more random amino acid substitutions. Our theory predicts that for large numbers of substitutions the probability that a protein retains its structure will decline exponentially with the number of substitutions, with the severity of this decline determined by properties of the structure. Our theory also predicts that a protein can gain extra robustness to the first few substitutions by increasing its thermodynamic stability. We validate our theory with simulations on lattice protein models and by showing that it quantitatively predicts previously published experimental measurements on subtilisin and our own measurements on variants of TEM1 β-lactamase. Our work unifies observations about the clustering of functional proteins in sequence space, and provides a basis for interpreting the response of proteins to substitutions in protein engineering applications. PMID:15644440

  4. Biochemical methane potential prediction of plant biomasses: Comparing chemical composition versus near infrared methods and linear versus non-linear models.

    PubMed

    Godin, Bruno; Mayer, Frédéric; Agneessens, Richard; Gerin, Patrick; Dardenne, Pierre; Delfosse, Philippe; Delcarte, Jérôme

    2015-01-01

    The reliability of different models to predict the biochemical methane potential (BMP) of various plant biomasses was compared using a multispecies dataset. The most reliable prediction models of the BMP were those based on the near infrared (NIR) spectrum rather than those based on the chemical composition. The NIR predictions of local (specific regression and non-linear) models were able to estimate the BMP quantitatively, rapidly, cheaply, and easily. Such a model could be further used for biomethanation plant management and optimization. The predictions of non-linear models were more reliable than those of linear models. The presentation form (green-dried, silage-dried, and silage-wet) of the biomasses to the NIR spectrometer did not influence the performance of the NIR prediction models. The accuracy of the BMP method should be improved to further enhance the BMP prediction models. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Solar Cycle Predictions

    NASA Technical Reports Server (NTRS)

    Pesnell, William Dean

    2012-01-01

    Solar cycle predictions are needed to plan long-term space missions, just as weather predictions are needed to plan a launch. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information, and all of them are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are among the most important: launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory as the reduced propellant load is consumed more rapidly. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Solar cycle predictions also anticipate the shortwave emissions that cause degradation of solar panels. Testing solar dynamo theories by quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. A summary and analysis of 75 predictions of the amplitude of the upcoming Solar Cycle 24 is presented. The current state of solar cycle predictions and some ways those predictions could be made more accurate in the future are discussed.

  6. Quantitative contrast-enhanced ultrasound evaluation of pathological complete response in patients with locally advanced breast cancer receiving neoadjuvant chemotherapy.

    PubMed

    Wan, Cai-Feng; Liu, Xue-Song; Wang, Lin; Zhang, Jie; Lu, Jin-Song; Li, Feng-Hua

    2018-06-01

    To clarify whether the quantitative parameters of contrast-enhanced ultrasound (CEUS) can be used to predict pathological complete response (pCR) in patients with locally advanced breast cancer receiving neoadjuvant chemotherapy (NAC). Fifty-one patients with histologically proven locally advanced breast cancer scheduled for NAC were enrolled. The quantitative CEUS data and the tumor diameter were collected at baseline and before surgery, and compared with the pathological response. Multiple logistic regression analysis was performed to examine the quantitative CEUS parameters and the tumor diameter as predictors of pCR, with receiver operating characteristic (ROC) curve analysis used as a summary statistic. Multiple logistic regression analysis revealed that PEAK (the maximum intensity of the time-intensity curve during bolus transit), PEAK%, TTP% (time to peak), and diameter% were significant independent predictors of pCR; the area under the ROC curve (Az1) was 0.932, with a sensitivity and specificity for predicting pCR of 93.7% and 80.0%. For the quantitative parameters alone, the area under the ROC curve (Az2) was 0.927, with a sensitivity and specificity of 81.2% and 94.3%. For diameter%, the area under the ROC curve (Az3) was 0.786, with a sensitivity and specificity of 93.8% and 54.3%. Az1 and Az2 were significantly higher than Az3 (P = 0.027 and P = 0.034, respectively), whereas Az1 and Az2 did not differ significantly (P = 0.825). Quantitative analysis of tumor blood perfusion with CEUS is superior to diameter% for predicting pCR and can be used as a functional technique to evaluate tumor response to NAC. Copyright © 2018. Published by Elsevier B.V.
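    The analysis pipeline described above (multiple logistic regression followed by ROC analysis) can be sketched as follows; the data are simulated and the coefficient values are arbitrary assumptions, not the study's fitted model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(42)
n = 200
# Hypothetical standardized CEUS-style predictors and diameter change (%).
peak_pct = rng.normal(0, 1, n)
ttp_pct = rng.normal(0, 1, n)
diam_pct = rng.normal(0, 1, n)
# Simulated pCR outcome, driven mostly by the perfusion parameters.
logit = 1.8 * peak_pct + 1.2 * ttp_pct + 0.4 * diam_pct
pcr = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([peak_pct, ttp_pct, diam_pct])
model = LogisticRegression().fit(X, pcr)
scores = model.predict_proba(X)[:, 1]

auc_combined = roc_auc_score(pcr, scores)    # analogue of Az1 (all predictors)
auc_diameter = roc_auc_score(pcr, diam_pct)  # analogue of Az3 (diameter% alone)

# Youden-index operating point, giving one sensitivity/specificity pair.
fpr, tpr, _ = roc_curve(pcr, scores)
best = np.argmax(tpr - fpr)
sensitivity, specificity = tpr[best], 1 - fpr[best]
```

    As in the study, the combined model's AUC exceeds that of the diameter change alone, because most of the outcome signal sits in the perfusion parameters.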

  7. Development and Use of Numerical and Factual Data Bases

    DTIC Science & Technology

    1983-10-01

    the quantitative description of what has been accomplished by their scientific and technical endeavors. 1-3 overhead charge to the national treasury... Molecular properties calculated with the aid of quantum mechanics or the prediction of solar eclipses using celestial mechanics are examples of theoretical...system under study. Examples include phase diagrams, molecular models, geological maps, metabolic pathways. Symbolic data (F3) are data presented in

  8. Utility of Gene Expression and Ex vivo Steroid Production in a 96 h Assay for Predicting Impacts of Endocrine Active Chemicals on Fish Reproduction.

    EPA Science Inventory

    Development of efficient test methods that can generate reliable data to inform risk assessment is an on-going challenge in the field of ecotoxicology. In the present study we evaluated whether a 96 h in vivo assay focused on a small number of quantitative real-time polymerase ch...

  9. Towards a quantitative description of tunneling conductance of superconductors: Application to LiFeAs

    DOE PAGES

    Kreisel, A.; Nelson, R.; Berlijn, T.; ...

    2016-12-27

    Since the discovery of iron-based superconductors, a number of theories have been put forward to explain the qualitative origin of pairing, but there have been few attempts to make quantitative, material-specific comparisons to experimental results. The spin-fluctuation theory of electronic pairing, based on first-principles electronic structure calculations, makes predictions for the superconducting gap. Within the same framework, the surface wave functions may also be calculated, allowing, e.g., for detailed comparisons between theoretical results and measured scanning tunneling topographs and spectra. We present such a comparison between theory and experiment on the Fe-based superconductor LiFeAs. Our results for the homogeneous surface as well as impurity states are presented as a benchmark test of the theory. For the homogeneous system, we argue that the maxima of topographic image intensity may be located at positions above either the As or Li atoms, depending on tip height and the setpoint current of the measurement. We further report the experimental observation of transitions between As- and Li-registered lattices as functions of both tip height and setpoint bias, in agreement with this prediction. Next, we give a detailed comparison of the simulated scanning tunneling microscopy images of transition-metal defects with experiment. Finally, we discuss possible extensions of the current framework to obtain a theory with true predictive power for scanning tunneling microscopy in Fe-based systems.

  10. Towards a quantitative description of tunneling conductance of superconductors: Application to LiFeAs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreisel, A.; Nelson, R.; Berlijn, T.

    Since the discovery of iron-based superconductors, a number of theories have been put forward to explain the qualitative origin of pairing, but there have been few attempts to make quantitative, material-specific comparisons to experimental results. The spin-fluctuation theory of electronic pairing, based on first-principles electronic structure calculations, makes predictions for the superconducting gap. Within the same framework, the surface wave functions may also be calculated, allowing, e.g., for detailed comparisons between theoretical results and measured scanning tunneling topographs and spectra. We present such a comparison between theory and experiment on the Fe-based superconductor LiFeAs. Our results for the homogeneous surface as well as impurity states are presented as a benchmark test of the theory. For the homogeneous system, we argue that the maxima of topographic image intensity may be located at positions above either the As or Li atoms, depending on tip height and the setpoint current of the measurement. We further report the experimental observation of transitions between As- and Li-registered lattices as functions of both tip height and setpoint bias, in agreement with this prediction. Next, we give a detailed comparison of the simulated scanning tunneling microscopy images of transition-metal defects with experiment. Finally, we discuss possible extensions of the current framework to obtain a theory with true predictive power for scanning tunneling microscopy in Fe-based systems.

  11. Initial CGE Model Results Summary Exogenous and Endogenous Variables Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Brian Keith; Boero, Riccardo; Rivera, Michael Kelly

    The following discussion presents initial results of tests of the most recent version of the National Infrastructure Simulation and Analysis Center Dynamic Computable General Equilibrium (CGE) model developed by Los Alamos National Laboratory (LANL). The intent is to test and assess the model's behavioral properties. The tests evaluated whether the predicted impacts are reasonable from a qualitative perspective, that is, whether each predicted change, be it an increase or decrease in other model variables, is consistent with prior economic intuition and expectations. One purpose of this effort is to determine whether model changes are needed in order to improve the model's behavior qualitatively and quantitatively.

  12. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients who participated in clinical trials testing new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case comprises two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, the CAD scheme computed three quantitative image features: the change in tumor volume, tumor CT number (density) and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, the scheme predicted each patient's 6-month progression-free survival (PFS) using a decision-tree based classifier, and its performance was compared with the RECIST category. The result shows that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, significantly higher than the RECIST prediction, which had an accuracy of 60% (17/30) and a Kappa coefficient of 0.062. This study demonstrated the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods in ovarian cancer patients.
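    Agreement beyond chance is what the Kappa coefficient adds over raw accuracy, which is why the CAD and RECIST comparisons above report both. A small sketch with hypothetical predictions (not the study's data) shows the computation:

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical 6-month PFS outcomes (1 = progression-free) for 30 patients,
# with predictions from an image-feature classifier and a heavily biased
# RECIST-style rule that labels almost everyone progression-free.
truth  = [1] * 15 + [0] * 15
cad    = [1] * 12 + [0] * 3 + [0] * 10 + [1] * 3 + [0] * 2   # balanced errors
recist = [1] * 28 + [0] * 2                                   # biased toward 1

acc_cad = accuracy_score(truth, cad)          # 24/30 = 0.8
kappa_cad = cohen_kappa_score(truth, cad)     # 0.6: well above chance
acc_recist = accuracy_score(truth, recist)    # 17/30 ~= 0.567
kappa_recist = cohen_kappa_score(truth, recist)  # near-chance agreement
```

    The biased rule keeps a superficially reasonable accuracy but collapses in Kappa, mirroring the abstract's 0.493 versus 0.062 contrast.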

  13. Making predictions of mangrove deforestation: a comparison of two methods in Kenya.

    PubMed

    Rideout, Alasdair J R; Joshi, Neha P; Viergever, Karin M; Huxham, Mark; Briers, Robert A

    2013-11-01

    Deforestation of mangroves is of global concern given their importance for carbon storage, biogeochemical cycling and the provision of other ecosystem services, but the links between rates of loss and potential drivers or risk factors are rarely evaluated. Here, we identified key drivers of mangrove loss in Kenya and compared two different approaches to predicting risk. Risk factors tested included various possible predictors of anthropogenic deforestation, related to population, suitability for land use change and accessibility. Two approaches were taken to modelling risk: a quantitative statistical approach and a qualitative categorical ranking approach. A quantitative model linking rates of loss to risk factors was constructed based on generalized least squares regression and using mangrove loss data from 1992 to 2000. Population density, soil type and proximity to roads were the most important predictors. In order to validate this model it was used to generate a map of losses of Kenyan mangroves predicted to have occurred between 2000 and 2010. The qualitative categorical model was constructed using data from the same selection of variables, with the coincidence of different risk factors in particular mangrove areas used in an additive manner to create a relative risk index which was then mapped. Quantitative predictions of loss were significantly correlated with the actual loss of mangroves between 2000 and 2010, and the categorical risk index values were also highly correlated with the quantitative predictions. Hence, in this case the relatively simple categorical modelling approach was of similar predictive value to the more complex quantitative model of mangrove deforestation. The advantages and disadvantages of each approach are discussed, and the implications for mangroves are outlined. © 2013 Blackwell Publishing Ltd.
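    The contrast between the two modelling styles can be sketched as follows; the risk factors, weights and thresholds are invented for illustration and are not the Kenyan data, and a simple linear predictor stands in for the paper's generalized least squares model:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 120
# Hypothetical risk factors per mangrove cell (illustrative only).
pop_density = rng.lognormal(3, 1, n)     # people per km^2
road_dist = rng.uniform(0, 20, n)        # km to nearest road
soil_suit = rng.integers(0, 3, n)        # 0=poor .. 2=good for conversion

# "Quantitative" model: linear predictor of the annual loss rate.
quant_pred = 0.02 * np.log(pop_density) - 0.004 * road_dist + 0.01 * soil_suit

# "Categorical" model: additive index of per-factor tercile ranks (0/1/2).
def rank3(x, reverse=False):
    q1, q2 = np.quantile(x, [1 / 3, 2 / 3])
    r = np.digitize(x, [q1, q2])
    return 2 - r if reverse else r       # reverse: nearer roads = higher risk

risk_index = rank3(pop_density) + rank3(road_dist, reverse=True) + soil_suit

# The paper's observation in miniature: the coarse additive index tracks
# the continuous model's predictions closely in rank order.
rho, pval = spearmanr(risk_index, quant_pred)
```

    Because both models combine the same aligned factors, the cheap categorical index reproduces most of the quantitative model's ordering, which is the paper's practical conclusion.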

  14. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  15. Large-scale label-free quantitative proteomics of the pea aphid-Buchnera symbiosis.

    PubMed

    Poliakov, Anton; Russell, Calum W; Ponnala, Lalit; Hoops, Harold J; Sun, Qi; Douglas, Angela E; van Wijk, Klaas J

    2011-06-01

    Many insects are nutritionally dependent on symbiotic microorganisms that have tiny genomes and are housed in specialized host cells called bacteriocytes. The obligate symbiosis between the pea aphid Acyrthosiphon pisum and the γ-proteobacterium Buchnera aphidicola (only 584 predicted proteins) is particularly amenable to molecular analysis because the genomes of both partners have been sequenced. To better define the symbiotic relationship between this aphid and Buchnera, we used large-scale, high-accuracy tandem mass spectrometry (nanoLC-LTQ-Orbitrap) to identify aphid and Buchnera proteins in the whole aphid body, purified bacteriocytes, isolated Buchnera cells and the residual bacteriocyte fraction. More than 1900 aphid and 400 Buchnera proteins were identified. All enzymes in amino acid metabolism annotated in the Buchnera genome were detected, reflecting the high (68%) coverage of the proteome and supporting the core function of Buchnera in the aphid symbiosis. Transporters mediating the transport of predicted metabolites were present in the bacteriocyte. Label-free spectral counting combined with hierarchical clustering made it possible to define the quantitative distribution of a subset of these proteins across both symbiotic partners, yielding no evidence for the selective transfer of protein between the partners in either direction. This is the first quantitative proteome analysis of bacteriocyte symbiosis, providing a wealth of information about the molecular function of both the host cell and the bacterial symbiont.

  16. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  17. High and low frequency unfolded partial least squares regression based on empirical mode decomposition for quantitative analysis of fuel oil samples.

    PubMed

    Bian, Xihui; Li, Shujuan; Lin, Ligang; Tan, Xiaoyao; Fan, Qingjie; Li, Ming

    2016-06-21

    Accurate prediction is fundamental to the successful analysis of complex samples. To utilize the abundant information embedded in the frequency and time domains, a novel regression model is presented for quantitative analysis of hydrocarbon contents in fuel oil samples. The proposed method, named high and low frequency unfolded PLSR (HLUPLSR), integrates empirical mode decomposition (EMD) and an unfolding strategy with partial least squares regression (PLSR). In the proposed method, the original signals are first decomposed by EMD into a finite number of intrinsic mode functions (IMFs) and a residue. Second, the earlier high-frequency IMFs are summed into a high-frequency matrix, and the later IMFs and the residue are summed into a low-frequency matrix. Finally, the two matrices are unfolded into an extended matrix along the variable dimension, and a PLSR model is built between the extended matrix and the target values. Coupled with ultraviolet (UV) spectroscopy, HLUPLSR has been applied to determine hydrocarbon contents of light gas oil and diesel fuel samples. Compared with single PLSR and other signal processing techniques, the proposed method shows superior prediction ability and better model interpretation. HLUPLSR therefore provides a promising tool for quantitative analysis of complex samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Multi-pathway Kinase Signatures of Multipotent Stromal Cells are Predictive for Osteogenic Differentiation

    PubMed Central

    Platt, Manu O.; Wilder, Catera L.; Wells, Alan; Griffith, Linda G.; Lauffenburger, Douglas A.

    2010-01-01

    Bone marrow-derived multipotent stromal cells (MSCs) offer great promise for regenerating tissue. While certain transcription factors have been identified in association with tendencies toward particular MSC differentiation phenotypes, the regulatory network of key receptor-mediated signaling pathways activated by extracellular ligands that induce various differentiation responses remains poorly understood. Attempts to predict differentiation fate tendencies from individual pathways in isolation are problematic due to the complex pathway interactions inherent in signaling networks. Accordingly, we have undertaken a multivariate systems approach integrating experimental measurement of multiple kinase pathway activities and osteogenic differentiation in MSCs, together with computational analysis to elucidate quantitative combinations of kinase signals predictive of cell behavior across diverse contexts. In particular, for culture on polymeric biomaterials surfaces presenting tethered epidermal growth factor (tEGF), type-I collagen, neither, or both, we have found that a partial least-squares regression model yields successful prediction of phenotypic behavior on the basis of two principal components comprising the weighted sums of 8 intracellular phosphoproteins: p-EGFR, p-Akt, p-ERK1/2, p-Hsp27, p-c-jun, p-GSK3α/β, p-p38, and p-STAT3. This combination provides the strongest predictive capability for 21-day differentiated phenotype status when calculated from day-7 signal measurements (99%); day-4 (88%) and day-14 (89%) signal measurements are also significantly predictive, indicating a broad time-frame during MSC osteogenesis wherein multiple pathways and states of the kinase signaling network are quantitatively integrated to regulate gene expression, cell processes, and ultimately, cell fate. PMID:19750537

  19. Laboratory evolution of the migratory polymorphism in the sand cricket: combining physiology with quantitative genetics.

    PubMed

    Roff, Derek A; Fairbairn, Daphne J

    2007-01-01

    Predicting evolutionary change is the central goal of evolutionary biology because it is the primary means by which we can test evolutionary hypotheses. In this article, we analyze the pattern of evolutionary change in a laboratory population of the wing-dimorphic sand cricket Gryllus firmus resulting from relaxation of selection favoring the migratory (long-winged) morph. Based on a well-characterized trade-off between fecundity and flight capability, we predict that evolution in the laboratory environment should result in a reduction in the proportion of long-winged morphs. We also predict increased fecundity and reduced functionality and weight of the major flight muscles in long-winged females but little change in short-winged (flightless) females. Based on quantitative genetic theory, we predict that the regression equation describing the trade-off between ovary weight and weight of the major flight muscles will show a change in its intercept but not in its slope. Comparisons across generations verify all of these predictions. Further, using values of genetic parameters estimated from previous studies, we show that a quantitative genetic simulation model can account for not only the qualitative changes but also the evolutionary trajectory. These results demonstrate the power of combining quantitative genetic and physiological approaches for understanding the evolution of complex traits.

  20. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled, non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility of quantitatively assessing and predicting the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for simulating the whole inspection procedure, even at the component design stage, so that inspectability can be verified virtually.

  1. New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity.

    PubMed

    Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R

    2004-11-21

    Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have now become a vital component of modern computational immunovaccinology. Historically, such approaches have been built around semi-qualitative classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D(b), H2-K(b) and H2-K(k). As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online (http://www.jenner.ac.uk/MHCPred).
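    The additive method models affinity as a sum of position-specific amino-acid contributions, which reduces fitting to a linear least-squares problem over one-hot-encoded peptides. The sketch below uses simulated, noise-free 9-mer data (the contribution table and dataset are invented), so the recovered model reproduces the affinities exactly:

```python
import numpy as np

rng = np.random.default_rng(11)
n_pep, length, n_aa = 400, 9, 20

# Hypothetical position-specific contribution table (the additive model's
# unknowns); in the real method these are fit to measured binding data.
true_contrib = rng.normal(0, 0.5, size=(length, n_aa))
peptides = rng.integers(0, n_aa, size=(n_pep, length))   # amino acids as indices
affinity = true_contrib[np.arange(length), peptides].sum(axis=1)

# One-hot encode each peptide and fit the additive model by least squares.
X = np.zeros((n_pep, length * n_aa))
for i, pep in enumerate(peptides):
    X[i, np.arange(length) * n_aa + pep] = 1.0
coef, *_ = np.linalg.lstsq(X, affinity, rcond=None)

pred = X @ coef
max_err = np.abs(pred - affinity).max()
```

    With noisy experimental affinities the fit would only be approximate, and the fitted `coef` (reshaped to 9 x 20) would play the role of the method's position-contribution matrix.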

  2. A new rapid quantitative test for fecal calprotectin predicts endoscopic activity in ulcerative colitis.

    PubMed

    Lobatón, Triana; Rodríguez-Moranta, Francisco; Lopez, Alicia; Sánchez, Elena; Rodríguez-Alonso, Lorena; Guardiola, Jordi

    2013-04-01

    Fecal calprotectin (FC) determined by an enzyme-linked immunosorbent assay (ELISA) has been proposed as a promising biomarker of endoscopic activity in ulcerative colitis (UC). However, data on its accuracy in predicting endoscopic activity are scarce. Moreover, FC determined by a quantitative point-of-care test (FC-QPOCT), which provides rapid, individual results, could optimize its use in clinical practice. The aims of our study were to evaluate the ability of FC determined by FC-QPOCT to predict endoscopic activity according to the Mayo score in patients with UC, and to compare it with the ELISA test (FC-ELISA). FC was determined simultaneously by FC-ELISA and FC-QPOCT in patients with UC undergoing colonoscopy. Clinical disease activity and endoscopy were assessed according to the Mayo score. Blood tests were taken to analyze serological biomarkers. A total of 146 colonoscopies were performed on 123 patients with UC. FC-QPOCT correlated more closely with the Mayo endoscopic subscore (Spearman's rank correlation coefficient r = 0.727, P < 0.001) than with clinical activity (r = 0.636, P < 0.001), platelets (r = 0.381, P < 0.001), leucocytes (r = 0.300, P < 0.001), or C-reactive protein (r = 0.291, P = 0.002). The prediction of "endoscopic remission" (Mayo endoscopic subscore ≤1) with FC-QPOCT (280 µg/g) and FC-ELISA (250 µg/g) presented areas under the curve of 0.906 and 0.924, respectively. The interclass correlation index between the two tests was 0.904 (95% confidence interval, 0.864-0.932; P < 0.001). FC determined by QPOCT was an accurate surrogate marker of "endoscopic remission" in UC and correlated well with the FC-ELISA test.
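    The two headline statistics, a Spearman rank correlation against the Mayo endoscopic subscore and an area under the ROC curve for predicting remission, can be sketched on simulated data (the FC distribution and effect sizes below are assumptions, not the study's values):

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 146
# Hypothetical Mayo endoscopic subscores (0-3) and fecal calprotectin (ug/g):
# FC rises steeply, and noisily (log-normally), with endoscopic severity.
mayo = rng.integers(0, 4, n)
fc = np.exp(rng.normal(4.0 + 1.1 * mayo, 0.8))

# Rank correlation between the biomarker and the endoscopic subscore.
rho, _ = spearmanr(fc, mayo)

# "Endoscopic remission" defined as subscore <= 1; lower FC should predict it,
# so the score fed to the ROC analysis is the negated FC value.
remission = (mayo <= 1).astype(int)
auc = roc_auc_score(remission, -fc)
```

    A cut-off such as the study's 280 µg/g would then be chosen from the ROC curve to trade sensitivity against specificity.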

  3. Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification.

    PubMed

    Wang, Shouyi; Bowen, Stephen R; Chaovalitwongse, W Art; Sandison, George A; Grabowski, Thomas J; Kinahan, Paul E

    2014-02-21

    The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUV(peak)) over lesions of interest. Relative differences in SUV(peak) between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUV(peak) values. The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.
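    The validation strategy described above, leave-one-subject-out cross-validation of a multiple-regression model scored with an error-tolerance criterion, can be sketched as follows; the features, coefficients and tolerance are hypothetical, and a fixed tolerance replaces the paper's dynamic one:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(2)
n_patients, n_features = 22, 6
# Hypothetical respiratory-pattern features (six selected) per patient and
# the relative SUVpeak change from gating they are meant to predict.
X = rng.normal(size=(n_patients, n_features))
y = X @ np.array([2.0, -1.5, 1.0, 0.5, 0.0, 0.0]) + 0.3 * rng.normal(size=n_patients)

# Leave-one-subject-out cross-validation of a multiple-regression model.
preds = np.empty(n_patients)
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    preds[test_idx] = model.predict(X[test_idx])

# Score predictions under a tolerance criterion, loosely analogous to the
# paper's error-tolerance range on SUVpeak (the value here is arbitrary).
tolerance = 1.0
accuracy = np.mean(np.abs(preds - y) <= tolerance)
```

    Each patient is predicted by a model that never saw them, which is what makes the reported accuracy an out-of-sample estimate despite the small cohort.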

  4. Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification

    NASA Astrophysics Data System (ADS)

    Wang, Shouyi; Bowen, Stephen R.; Chaovalitwongse, W. Art; Sandison, George A.; Grabowski, Thomas J.; Kinahan, Paul E.

    2014-02-01

    The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUVpeak) over lesions of interest. Relative differences in SUVpeak between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUVpeak values. 
The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.
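The modeling pipeline described above can be sketched in miniature. In this sketch the single respiratory feature, the synthetic data, and the fixed ±1.0 error tolerance are illustrative assumptions standing in for the paper's six selected features and its dynamic error-tolerance criterion.

```python
import statistics

def fit_line(xs, ys):
    # Ordinary least squares for a single predictor: y = a + b*x
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def loso_predictions(xs, ys):
    # Leave-one-subject-out: refit on the other subjects, predict the held-out one
    preds = []
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        preds.append(a + b * xs[i])
    return preds

# Hypothetical respiratory-pattern feature vs. relative SUVpeak change (%)
feature = [0.2, 0.5, 0.9, 1.4, 1.8, 2.3]
suv_change = [1.0, 2.1, 3.9, 5.6, 7.4, 9.0]
preds = loso_predictions(feature, suv_change)
# Fraction of held-out predictions within a fixed ±1.0 tolerance
accuracy = sum(abs(p - y) <= 1.0 for p, y in zip(preds, suv_change)) / len(preds)
```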

  5. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    PubMed

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
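The c-index used to score both survival models (Harrell's concordance index) can be computed as follows; the survival times, event flags, and risk scores below are hypothetical.

```python
def concordance_index(times, events, risk_scores):
    """Harrell's c-index: fraction of comparable pairs in which the patient
    with the higher risk score has the shorter survival time. A pair is
    comparable when the earlier time corresponds to an observed event."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5  # ties get half credit
    return concordant / comparable

# Hypothetical survival months, event flags (1 = death observed), model risk scores
times = [6, 12, 18, 24, 30]
events = [1, 1, 0, 1, 1]
risk = [0.9, 0.7, 0.8, 0.4, 0.2]
c = concordance_index(times, events, risk)
```

A model with no discrimination scores 0.5 on this index; perfect ranking scores 1.0.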

  6. Application of the Refined Integral Method in the mathematical modeling of drug delivery from one-layer torus-shaped devices.

    PubMed

    Helbling, Ignacio M; Ibarra, Juan C D; Luna, Julio A

    2012-02-28

    A mathematical modeling of controlled release of drug from one-layer torus-shaped devices is presented. Analytical solutions based on Refined Integral Method (RIM) are derived. The validity and utility of the model are ascertained by comparison of the simulation results with matrix-type vaginal rings experimental release data reported in the literature. For the comparisons, the pair-wise procedure is used to measure quantitatively the fit of the theoretical predictions to the experimental data. A good agreement between the model prediction and the experimental data is observed. A comparison with a previously reported model is also presented. More accurate results are achieved for small A/C(s) ratios. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. Current subsidence rates due to compaction of Holocene sediments in southern Louisiana

    USGS Publications Warehouse

    Meckel, T.A.; ten Brink, Uri S.; Williams, S.J.

    2006-01-01

Relative contributions of geologic and anthropogenic processes to subsidence of southern Louisiana are vigorously debated. Of these, shallow sediment compaction is often considered dominant, although this has never been directly observed or effectively demonstrated. Quantitative understanding of subsidence is important for predicting relative sea-level rise and storm-surge flooding due to hurricanes, and for successful wetland restoration. Despite many shallow borings, few appropriate stratigraphic and geotechnical data are available for site-specific calculations. We overcome this by determining present compaction rates from Monte Carlo simulations of the incremental sedimentation and compaction of stratigraphies typical of the Holocene of southern Louisiana. This approach generates distributions of present compaction rates, which are generally not expected to exceed 5 mm/yr but may do so locally. Locations with present subsidence rates greater than the predicted maximum probable shallow compaction rates are likely influenced by additional processes.
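A Monte Carlo scheme of this general shape, depositing layers incrementally and compacting each buried layer under its growing overburden, can be sketched as below. The layer thicknesses, the compaction law, and all rate constants are invented for illustration and are not the study's calibrated values.

```python
import random
import statistics

random.seed(42)  # reproducible realizations

def simulate_present_rate(n_layers=50, years_per_layer=100):
    """One Monte Carlo realization of incremental sedimentation and compaction.
    Each buried layer loses a fraction of its thickness proportional to the
    number of layers above it (an invented compaction law)."""
    thickness = []
    last_interval_loss = 0.0
    for _ in range(n_layers):
        thickness.append(random.uniform(0.3, 0.8))  # metres of new sediment
        interval_loss = 0.0
        for i in range(len(thickness) - 1):
            burden = len(thickness) - 1 - i          # layers above layer i
            loss = thickness[i] * 0.0005 * burden
            thickness[i] -= loss
            interval_loss += loss
        last_interval_loss = interval_loss
    # Present rate: compaction during the most recent interval, in mm/yr
    return last_interval_loss * 1000.0 / years_per_layer

rates = sorted(simulate_present_rate() for _ in range(200))
```

Repeating the simulation yields a distribution of present rates rather than a single value, which is the point of the Monte Carlo approach.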

  8. Genome-Scale Screening of Drug-Target Associations Relevant to Ki Using a Chemogenomics Approach

    PubMed Central

    Cao, Dong-Sheng; Liang, Yi-Zeng; Deng, Zhe; Hu, Qian-Nan; He, Min; Xu, Qing-Song; Zhou, Guang-Hua; Zhang, Liu-Xia; Deng, Zi-xin; Liu, Shao

    2013-01-01

The identification of interactions between drugs and target proteins plays a key role in genomic drug discovery. In the present study, the quantitative binding affinities of drug-target pairs are used as the criterion for defining whether a drug interacts with a protein, and a chemogenomics framework using an unbiased set of general integrated features and a random forest (RF) is employed to construct a predictive model that can accurately classify drug-target pairs. The predictability of the model is further investigated and validated by several independent validation sets. The built model is used to predict drug-target associations, some of which were confirmed against experimental data from public biological resources. A drug-target interaction network with high-confidence drug-target pairs was also reconstructed. This network provides further insight into the action of drugs and targets. Finally, a web-based server called PreDPI-Ki was developed to predict drug-target interactions for drug discovery. In addition to providing a high-confidence list of drug-target associations to guide subsequent experimental investigation, these results also contribute to the understanding of drug-target interactions. Quantitative information on drug-target associations could also greatly promote the development of more accurate models. The PreDPI-Ki server is freely available via: http://sdd.whu.edu.cn/dpiki. PMID:23577055
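The first step the abstract describes, turning quantitative binding affinities into binary interaction labels, can be sketched as follows. The 10 μM Ki cutoff, the single combined descriptor, and the one-feature decision stump standing in for the random forest are all assumptions for illustration.

```python
KI_CUTOFF_NM = 10_000  # assumed cutoff: Ki below 10 uM counts as interacting

def label_pairs(ki_values_nm):
    """Turn quantitative binding affinities (Ki, nM) into binary labels."""
    return [1 if ki < KI_CUTOFF_NM else 0 for ki in ki_values_nm]

def train_stump(feature, labels):
    """One-feature decision stump: pick the threshold minimizing training
    errors (a toy stand-in for a random forest over integrated features)."""
    best = (None, float("inf"))
    for t in sorted(set(feature)):
        errs = sum((f >= t) != bool(y) for f, y in zip(feature, labels))
        if errs < best[1]:
            best = (t, errs)
    return best[0]

# Hypothetical pairs: a combined drug+protein descriptor vs. measured Ki (nM)
descriptor = [0.1, 0.3, 0.4, 0.7, 0.8, 0.95]
ki_nm = [90_000, 50_000, 20_000, 900, 40, 5]
labels = label_pairs(ki_nm)
threshold = train_stump(descriptor, labels)
```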

  9. Prediction of acute mammalian toxicity using QSAR methods: a case study of sulfur mustard and its breakdown products.

    PubMed

    Ruiz, Patricia; Begluitti, Gino; Tincher, Terry; Wheeler, John; Mumtaz, Moiz

    2012-07-27

Predicting toxicity quantitatively, using Quantitative Structure Activity Relationships (QSAR), has matured over recent years to the point that the predictions can be used to help identify missing comparison values in a substance's database. In this manuscript we investigate using the lethal dose that kills fifty percent of a test population (LD₅₀) to determine the relative toxicity of a number of substances. In general, the smaller the LD₅₀ value, the more toxic the chemical, and the larger the LD₅₀ value, the lower the toxicity. When systemic toxicity and other specific toxicity data are unavailable for the chemical(s) of interest during emergency responses, LD₅₀ values may be employed to determine the relative toxicity of a series of chemicals. In the present study, a group of chemical warfare agents and their breakdown products were evaluated using four available rat oral QSAR LD₅₀ models. The QSAR analysis predicts the breakdown products of Sulfur Mustard (HD) to be less toxic than the parent compound and than other breakdown products with known toxicities. The QSAR-estimated LD₅₀ values of the breakdown products ranged from 299 mg/kg to 5,764 mg/kg. This evaluation allows the ranking and toxicity estimation of compounds for which little toxicity information existed, thus leading to better risk decision making in the field.
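Ranking a series of chemicals by predicted LD₅₀ is straightforward once the model outputs exist. In this sketch only the endpoints 299 and 5,764 mg/kg echo the abstract's reported range; the compound names and the other values are placeholders.

```python
def rank_by_ld50(predicted_ld50):
    """Order chemicals from most to least acutely toxic:
    a smaller LD50 means higher toxicity."""
    return sorted(predicted_ld50, key=predicted_ld50.get)

# Hypothetical QSAR-predicted rat oral LD50 values (mg/kg); names are placeholders
predictions = {
    "parent agent": 250,
    "breakdown product A": 299,
    "breakdown product B": 1500,
    "breakdown product C": 5764,
}
ranking = rank_by_ld50(predictions)
```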

  10. BIOPEP database and other programs for processing bioactive peptide sequences.

    PubMed

    Minkiewicz, Piotr; Dziuba, Jerzy; Iwaniak, Anna; Dziuba, Marta; Darewicz, Małgorzata

    2008-01-01

    This review presents the potential for application of computational tools in peptide science based on a sample BIOPEP database and program as well as other programs and databases available via the World Wide Web. The BIOPEP application contains a database of biologically active peptide sequences and a program enabling construction of profiles of the potential biological activity of protein fragments, calculation of quantitative descriptors as measures of the value of proteins as potential precursors of bioactive peptides, and prediction of bonds susceptible to hydrolysis by endopeptidases in a protein chain. Other bioactive and allergenic peptide sequence databases are also presented. Programs enabling the construction of binary and multiple alignments between peptide sequences, the construction of sequence motifs attributed to a given type of bioactivity, searching for potential precursors of bioactive peptides, and the prediction of sites susceptible to proteolytic cleavage in protein chains are available via the Internet as are other approaches concerning secondary structure prediction and calculation of physicochemical features based on amino acid sequence. Programs for prediction of allergenic and toxic properties have also been developed. This review explores the possibilities of cooperation between various programs.
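One of the quantitative descriptors the BIOPEP program computes, the frequency of bioactive fragment occurrence, is conventionally defined as A = a/N (occurrence count over protein length). A minimal sketch, with placeholder fragments rather than entries from the real database:

```python
def occurrence_frequency(protein_seq, bioactive_fragments):
    """BIOPEP-style descriptor A = a / N: total count of bioactive fragment
    occurrences (a) divided by protein length in residues (N).
    The fragment list here is a placeholder, not real database entries."""
    a = sum(protein_seq.count(frag) for frag in bioactive_fragments)
    return a / len(protein_seq)

freq = occurrence_frequency("VPPIPPFLQPEV", ["VPP", "IPP", "FLQP"])
```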

  11. Non-constant link tension coefficient in the tumbling-snake model subjected to simple shear

    NASA Astrophysics Data System (ADS)

    Stephanou, Pavlos S.; Kröger, Martin

    2017-11-01

The authors of the present study have recently presented evidence that the tumbling-snake model for polymeric systems has the capacity to predict the appearance of pronounced undershoots in the time-dependent shear viscosity, as well as the absence of equally pronounced undershoots in the two transient normal stress coefficients. The undershoots were found to appear due to the tumbling behavior of the director u when a rotational Brownian diffusion term is considered within the equation of motion of polymer segments, and a theoretical basis was provided for the use of a link tension coefficient given through the nematic order parameter. The current work elaborates on the quantitative predictions of the tumbling-snake model to demonstrate its capacity to predict undershoots in the time-dependent shear viscosity. These predictions compare favorably with experimental rheological data for both polymer melts and solutions, help clarify the microscopic origin of the observed phenomena, and demonstrate in detail why a constant link tension coefficient has to be abandoned.

  12. MO-DE-303-03: Session on quantitative imaging for assessment of tumor response to radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, S.

This session will focus on quantitative imaging for assessment of tumor response to radiation therapy. This is a technically challenging method to translate to practice in radiation therapy. In the new era of precision medicine, however, delivering the right treatment, to the right patient, and at the right time, can positively impact treatment choices and patient outcomes. Quantitative imaging provides the spatial sensitivity required by radiation therapy for precision medicine that is not available by other means. In this Joint ESTRO-AAPM Symposium, three leading-edge investigators will present specific motivations for quantitative imaging biomarkers in radiation therapy of esophageal, head and neck, locally advanced non-small cell lung cancer, and hepatocellular carcinoma. Experiences with the use of dynamic contrast enhanced (DCE) MRI, diffusion-weighted (DW) MRI, PET/CT, and SPECT/CT will be presented. Issues covered will include: response prediction, dose-painting, timing between therapy and imaging, within-therapy biomarkers, confounding effects, normal tissue sparing, dose-response modeling, and association with clinical biomarkers and outcomes. Current information will be presented from investigational studies and clinical practice. Learning Objectives: Learn motivations for the use of quantitative imaging biomarkers for assessment of response to radiation therapy; Review the potential areas of application in cancer therapy; Examine the challenges for translation, including imaging confounds and paucity of evidence to date; Compare exemplary examples of the current state of the art in DCE-MRI, DW-MRI, PET/CT and SPECT/CT imaging for assessment of response to radiation therapy. Van der Heide: Research grants from the Dutch Cancer Society and the European Union (FP7). Bowen: RSNA Scholar grant.

  13. Average intragranular misorientation trends in polycrystalline materials predicted by a viscoplastic self-consistent approach

    DOE PAGES

    Lebensohn, Ricardo A.; Zecevic, Miroslav; Knezevic, Marko; ...

    2015-12-15

Here, this work presents estimations of average intragranular fluctuations of lattice rotation rates in polycrystalline materials, obtained by means of the viscoplastic self-consistent (VPSC) model. These fluctuations give a tensorial measure of the trend of misorientation developing inside each single-crystal grain representing a polycrystalline aggregate. We first report details of the algorithm implemented in the VPSC code to estimate these fluctuations, which are then validated by comparison with corresponding full-field calculations. Next, we present predictions of average intragranular fluctuations of lattice rotation rates for cubic aggregates, which are rationalized by comparison with experimental evidence on annealing textures of fcc and bcc polycrystals deformed in tension and compression, respectively, as well as with measured intragranular misorientation distributions in a Cu polycrystal deformed in tension. The orientation-dependent and micromechanically-based estimations of intragranular misorientations that can be derived from the present implementation are necessary to formulate sound sub-models for the prediction of quantitatively accurate deformation textures, grain fragmentation, and recrystallization textures using the VPSC approach.

  14. Global response of Acidithiobacillus ferrooxidans ATCC 53993 to high concentrations of copper: A quantitative proteomics approach.

    PubMed

    Martínez-Bussenius, Cristóbal; Navarro, Claudio A; Orellana, Luis; Paradela, Alberto; Jerez, Carlos A

    2016-08-11

    Acidithiobacillus ferrooxidans is used in industrial bioleaching of minerals to extract valuable metals. A. ferrooxidans strain ATCC 53993 is much more resistant to copper than other strains of this microorganism and it has been proposed that genes present in an exclusive genomic island (GI) of this strain would contribute to its extreme copper tolerance. ICPL (isotope-coded protein labeling) quantitative proteomics was used to study in detail the response of this bacterium to copper. A high overexpression of RND efflux systems and CusF copper chaperones, both present in the genome and the GI of strain ATCC 53993 was found. Also, changes in the levels of the respiratory system proteins such as AcoP and Rus copper binding proteins and several proteins with other predicted functions suggest that numerous metabolic changes are apparently involved in controlling the effects of the toxic metal on this acidophile. Using quantitative proteomics we overview the adaptation mechanisms that biomining acidophiles use to stand their harsh environment. The overexpression of several genes present in an exclusive genomic island strongly suggests the importance of the proteins coded in this DNA region in the high tolerance of A. ferrooxidans ATCC 53993 to metals. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    PubMed Central

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μmol ml⁻¹/μmol ml⁻¹)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in sap. We conclude that the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174

  16. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μmol ml⁻¹/μmol ml⁻¹)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in sap. We conclude that the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides.

  17. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients with or without bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset of CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment visceral fat areas (VFA) and subcutaneous fat areas (SFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied respectively to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that with all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while no significant association was detected for either PFS or OS in the group of patients not receiving maintenance bevacizumab. This study therefore demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
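Testing whether one adiposity feature is associated with an outcome can be sketched minimally. The visceral-to-subcutaneous ratio feature, the data, and the use of a plain Pearson correlation in place of the paper's three regression models are all assumptions here.

```python
import statistics

def pearson_r(xs, ys):
    # Sample Pearson correlation coefficient
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

# Hypothetical patients: visceral-to-subcutaneous fat ratio vs. PFS (months)
vfa_sfa_ratio = [0.4, 0.6, 0.9, 1.1, 1.5]
pfs_months = [30, 26, 18, 15, 9]
r = pearson_r(vfa_sfa_ratio, pfs_months)  # strongly negative for this toy data
```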

  18. Forecasting the Environmental Impacts of New Energetic Materials

    DTIC Science & Technology

    2010-11-30

Quantitative structure-activity relationships for chemical reductions of organic contaminants. Environmental Toxicology and Chemistry 22(8): 1733-1742. QSARs ...activity relationships [QSARs]) and the use of these properties to predict the chemical's fate with multimedia assessment models. SERDP has recently...has several parts, including the prediction of chemical properties (e.g., with quantitative structure-activity relationships [QSARs]) and the use of

  19. Built-In-Test Equipment Requirements Workshop. Workshop Presentation

    DTIC Science & Technology

    1981-08-01

quantitatively evaluated in test. (2) It is necessary to develop the statistical methods that should be used for predicting and confirming of diagnostic...of different performance levels of BIT peacetime and wartime applications, and the corresponding manpower and other support requirements should be...reports. The scope of the workshop involves the areas of requirements for built-in-test and diagnostics, and the methods of testing to ensure that the

  20. Predictive modeling of neuroanatomic structures for brain atrophy detection

    NASA Astrophysics Data System (ADS)

    Hu, Xintao; Guo, Lei; Nie, Jingxin; Li, Kaiming; Liu, Tianming

    2010-03-01

In this paper, we present an approach of predictive modeling of neuroanatomic structures for the detection of brain atrophy based on cross-sectional MRI images. The underlying premise of applying predictive modeling for atrophy detection is that brain atrophy is defined as significant deviation of part of the anatomy from what the remaining normal anatomy predicts for that part. The steps of predictive modeling are as follows. The central cortical surface under consideration is reconstructed from the brain tissue map, and Regions of Interest (ROIs) on it are predicted from other, reliable anatomies. The pair-wise distance between a predicted vertex and the true one within an abnormal region is expected to be larger than that for vertices in normal brain regions. Change of the white matter/gray matter ratio within a spherical region is used to identify the direction of vertex displacement. In this way, the severity of brain atrophy can be defined quantitatively by the displacements of those vertices. The proposed predictive modeling method has been evaluated using both simulated atrophies and MRI images of Alzheimer's disease.
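The vertex-displacement criterion can be sketched as follows; the coordinates and the 2 mm tolerance are invented for illustration.

```python
import math

def flag_atrophy(predicted, observed, tolerance=2.0):
    """Flag vertices whose observed position deviates from the position the
    surrounding normal anatomy predicts for them by more than a tolerance
    (mm); the tolerance value here is an assumption."""
    return [math.dist(p, o) > tolerance for p, o in zip(predicted, observed)]

# Hypothetical cortical-surface vertices (mm): predicted vs. observed positions
predicted = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
observed = [(0.5, 0, 0), (10, 0.4, 0), (0, 13.5, 0)]
flags = flag_atrophy(predicted, observed)
```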

  1. Predicting MHC-II binding affinity using multiple instance regression

    PubMed Central

    EL-Manzalawy, Yasser; Dobbs, Drena; Honavar, Vasant

    2011-01-01

Reliably predicting the ability of antigen peptides to bind to major histocompatibility complex class II (MHC-II) molecules is an essential step in developing new vaccines. Uncovering the amino acid sequence correlates of the binding affinity of MHC-II binding peptides is important for understanding pathogenesis and immune response. The task of predicting MHC-II binding peptides is complicated by the significant variability in their length. Most existing computational methods for predicting MHC-II binding peptides focus on identifying a nine-amino-acid core region in each binding peptide. We formulate the problems of qualitatively and quantitatively predicting flexible-length MHC-II peptides as multiple instance learning and multiple instance regression problems, respectively. Based on this formulation, we introduce MHCMIR, a novel method for predicting MHC-II binding affinity using multiple instance regression. We present results of experiments using several benchmark datasets that show that MHCMIR is competitive with the state-of-the-art methods for predicting MHC-II binding peptides. An online web server that implements the MHCMIR method for MHC-II binding affinity prediction is freely accessible at http://ailab.cs.iastate.edu/mhcmir. PMID:20855923
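The multiple-instance formulation can be sketched by treating each flexible-length peptide as a bag of its 9-mer windows. The scoring weights and the max-over-instances aggregation below are illustrative assumptions, not MHCMIR's actual regression.

```python
def instance_score(ninemer, weights):
    """Score one candidate 9-residue core as a sum of per-position residue weights."""
    return sum(weights.get((i, aa), 0.0) for i, aa in enumerate(ninemer))

def bag_predict(peptide, weights):
    """Multiple-instance view: a flexible-length peptide is a bag of all its
    9-mer windows; the bag's predicted affinity is its best instance's score."""
    cores = [peptide[i:i + 9] for i in range(len(peptide) - 8)]
    return max(instance_score(c, weights) for c in cores)

# Toy weights favoring 'Y' at position 1 and 'L' at position 4
# (invented anchor preferences, not trained values)
weights = {(1, "Y"): 2.0, (4, "L"): 1.0}
affinity = bag_predict("AAYKQLLNFAAK", weights)
```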

  2. Computational design of a Diels-Alderase from a thermophilic esterase: the importance of dynamics

    NASA Astrophysics Data System (ADS)

    Linder, Mats; Johansson, Adam Johannes; Olsson, Tjelvar S. G.; Liebeschuetz, John; Brinck, Tore

    2012-09-01

A novel computational Diels-Alderase design, based on a relatively rare form of carboxylesterase from Geobacillus stearothermophilus, is presented and theoretically evaluated. The structure was found by mining the PDB for a suitable oxyanion hole-containing structure, followed by a combinatorial approach to find suitable substrates and rational mutations. Four lead designs were selected and thoroughly modeled to obtain realistic estimates of substrate binding and prearrangement. Molecular dynamics simulations and DFT calculations were used to optimize and estimate binding affinity and activation energies. A large quantum chemical model was used to capture the salient interactions in the crucial transition state (TS). Our quantitative estimation of kinetic parameters was validated against four experimentally characterized Diels-Alderases with good results. The final designs in this work are predicted to have rate enhancements of ≈10³-10⁶ and high predicted proficiencies. This work emphasizes the importance of considering protein dynamics in the design approach, and provides a quantitative estimate of how much the TS stabilization observed in most de novo and redesigned enzymes is decreased compared to a minimal, 'ideal' model. The presented design is highly interesting for further optimization and applications since it is based on a thermophilic enzyme (Topt = 70 °C).
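The connection between TS stabilization and rate enhancement follows transition-state theory: lowering the activation barrier by ΔΔG‡ multiplies the rate by exp(ΔΔG‡/RT). A sketch with illustrative barrier reductions chosen to land in the ≈10³-10⁶ range quoted above:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def rate_enhancement(ddg_kcal_per_mol, temp_k=298.15):
    """Rate factor from lowering the activation barrier by ddG (kcal/mol):
    k_enhanced / k_reference = exp(ddG / RT)."""
    ddg_j = ddg_kcal_per_mol * 4184.0  # kcal/mol -> J/mol
    return math.exp(ddg_j / (R * temp_k))

# Illustrative barrier reductions (not the paper's computed values)
factor_low = rate_enhancement(4.1)   # roughly 1e3
factor_high = rate_enhancement(8.2)  # roughly 1e6
```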

  3. Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations

    DTIC Science & Technology

    2010-11-01

from a training set of compounds. Other methods include Quantitative Structure-Activity Relationship (QSAR) and Quantitative Structure-Property...26 28 the development of QSPR/QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve...Quadratic Configuration Interaction Singles Doubles QSAR Quantitative Structure-Activity Relationship QSPR Quantitative Structure-Property

  4. Quantitative Adverse Outcome Pathways and Their Application to Predictive Toxicology

    EPA Science Inventory

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course p...

  5. Fundamentals and techniques of nonimaging optics for solar energy concentration

    NASA Astrophysics Data System (ADS)

    Winston, R.; Gallagher, J. J.

    1980-05-01

The properties of a variety of new and previously known nonimaging optical configurations were investigated. A thermodynamic model was developed that quantitatively explains the enhancement of the effective absorptance of gray-body receivers through cavity effects. The classic method of Liu and Jordan, which allows one to predict diffuse sunlight levels through correlation with the total and direct fractions, was revised, updated, and applied to predict the performance of nonimaging solar collectors. The conceptual design of an optimized solar collector that integrates the techniques of nonimaging concentration with evacuated-tube collector technology was carried out and is presently the basis for a separately funded hardware development project.

  6. Analysis of conditional genetic effects and variance components in developmental genetics.

    PubMed

    Zhu, J

    1995-12-01

A genetic model with additive-dominance effects and genotype × environment interactions is presented for quantitative traits with time-dependent measures. The genetic model for phenotypic means at time t conditional on phenotypic means measured at the previous time (t-1) is defined. Statistical methods are proposed for analyzing conditional genetic effects and conditional genetic variance components. Conditional variances can be estimated by the minimum norm quadratic unbiased estimation (MINQUE) method. An adjusted unbiased prediction (AUP) procedure is suggested for predicting conditional genetic effects. A worked example from cotton fruiting data is given for comparison of unconditional and conditional genetic variances and additive effects.
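The idea of a conditional variance at time t given time t-1 can be illustrated with a regression shortcut: the variance remaining in y(t) after removing its linear dependence on y(t-1). The data below are invented, and the paper's actual estimator is MINQUE, not this least-squares residual.

```python
import statistics

def conditional_variance(y_prev, y_t):
    """Residual variance of y(t) after regressing it on y(t-1): a simple
    stand-in for conditional variance components (the paper uses MINQUE)."""
    mp, mt = statistics.fmean(y_prev), statistics.fmean(y_t)
    b = (sum((p - mp) * (t - mt) for p, t in zip(y_prev, y_t))
         / sum((p - mp) ** 2 for p in y_prev))
    residuals = [t - (mt + b * (p - mp)) for p, t in zip(y_prev, y_t)]
    return statistics.fmean(r * r for r in residuals)

# Hypothetical fruiting measures on five plants at two successive times
week1 = [2.0, 3.0, 4.0, 5.0, 6.0]
week2 = [3.1, 4.9, 7.2, 8.8, 11.0]
v_cond = conditional_variance(week1, week2)
v_uncond = statistics.pvariance(week2)
```

Conditioning on the earlier measurement removes the variation already explained by it, so the conditional variance is smaller than the unconditional one.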

  7. Analysis of Conditional Genetic Effects and Variance Components in Developmental Genetics

    PubMed Central

    Zhu, J.

    1995-01-01

A genetic model with additive-dominance effects and genotype × environment interactions is presented for quantitative traits with time-dependent measures. The genetic model for phenotypic means at time t conditional on phenotypic means measured at the previous time (t - 1) is defined. Statistical methods are proposed for analyzing conditional genetic effects and conditional genetic variance components. Conditional variances can be estimated by the minimum norm quadratic unbiased estimation (MINQUE) method. An adjusted unbiased prediction (AUP) procedure is suggested for predicting conditional genetic effects. A worked example from cotton fruiting data is given for comparison of unconditional and conditional genetic variances and additive effects. PMID:8601500

  8. The Separation and Quantitation of Peptides with and without Oxidation of Methionine and Deamidation of Asparagine Using Hydrophilic Interaction Liquid Chromatography with Mass Spectrometry (HILIC-MS)

    NASA Astrophysics Data System (ADS)

    Badgett, Majors J.; Boyes, Barry; Orlando, Ron

    2017-05-01

    Peptides with deamidated asparagine residues and oxidized methionine residues are often not resolved sufficiently to allow quantitation of their native and modified forms using reversed phase (RP) chromatography. The accurate quantitation of these modifications is vital in protein biotherapeutic analysis because they can affect a protein's function, activity, and stability. We demonstrate here that hydrophilic interaction liquid chromatography (HILIC) adequately and predictably separates peptides with these modifications from their native counterparts. Furthermore, coefficients describing the extent of the hydrophilicity of these modifications have been derived and were incorporated into a previously made peptide retention prediction model that is capable of predicting the retention times of peptides with and without these modifications.
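An additive retention model of this kind can be sketched as follows; the per-residue coefficients and the two modification deltas below are invented, not the published values.

```python
# Invented per-residue HILIC hydrophilicity coefficients and modification
# deltas; the published model's actual coefficients are not reproduced here.
RESIDUE_COEFF = {"A": 0.2, "N": 1.1, "M": 0.3, "K": 1.0, "G": 0.4, "L": -0.1}
MOD_COEFF = {"deamidation(N)": 0.5, "oxidation(M)": 0.8}  # hydrophilic shifts

def predict_retention(sequence, mods=()):
    """Additive retention model: sum residue coefficients, then add a
    positive shift per hydrophilic modification (later elution in HILIC)."""
    rt = sum(RESIDUE_COEFF.get(aa, 0.0) for aa in sequence)
    return rt + sum(MOD_COEFF[m] for m in mods)

native = predict_retention("ANMKGL")
oxidized = predict_retention("ANMKGL", mods=("oxidation(M)",))
```

The modified peptide's predicted retention exceeds the native one by exactly the modification coefficient, which is what makes the separation predictable.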

  9. The Separation and Quantitation of Peptides with and without Oxidation of Methionine and Deamidation of Asparagine Using Hydrophilic Interaction Liquid Chromatography with Mass Spectrometry (HILIC-MS).

    PubMed

    Badgett, Majors J; Boyes, Barry; Orlando, Ron

    2017-05-01

    Peptides with deamidated asparagine residues and oxidized methionine residues are often not resolved sufficiently to allow quantitation of their native and modified forms using reversed phase (RP) chromatography. The accurate quantitation of these modifications is vital in protein biotherapeutic analysis because they can affect a protein's function, activity, and stability. We demonstrate here that hydrophilic interaction liquid chromatography (HILIC) adequately and predictably separates peptides with these modifications from their native counterparts. Furthermore, coefficients describing the extent of the hydrophilicity of these modifications have been derived and were incorporated into a previously made peptide retention prediction model that is capable of predicting the retention times of peptides with and without these modifications.

  10. Time-series analysis of hepatitis A, B, C and E infections in a large Chinese city: application to prediction analysis.

    PubMed

    Sumi, A; Luo, T; Zhou, D; Yu, B; Kong, D; Kobayashi, N

    2013-05-01

    Viral hepatitis is recognized as one of the most frequently reported diseases, and especially in China, acute and chronic liver disease due to viral hepatitis has been a major public health problem. The present study aimed to analyse and predict surveillance data of infections of hepatitis A, B, C and E in Wuhan, China, by the method of time-series analysis (MemCalc, Suwa-Trast, Japan). On the basis of spectral analysis, fundamental modes explaining the underlying variation of the data for the years 2004-2008 were assigned. A model constructed from these fundamental modes reproduced the underlying variation of the data well. An extension of the model to the year 2009 could predict the data quantitatively. Our study suggests that the present method allows the temporal pattern of epidemics of viral hepatitis to be modelled much more effectively than the artificial neural network used previously.
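
    The fit-modes-then-extrapolate idea can be sketched in a few lines. This is a generic least-squares illustration, not the MemCalc algorithm itself; the 12-month period and the synthetic case counts are assumptions for demonstration.

```python
# Represent a surveillance series as a sum of "fundamental modes" (here one
# annual sinusoid plus a mean) fitted by least squares, then extend the
# fitted model beyond the data to predict. Synthetic data, not real counts.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(60)                             # 5 years of monthly data
y = 100 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, size=t.size)

def modes(t):
    """Design matrix: mean plus cos/sin terms of the annual mode."""
    return np.column_stack([np.ones_like(t, dtype=float),
                            np.cos(2 * np.pi * t / 12),
                            np.sin(2 * np.pi * t / 12)])

beta, *_ = np.linalg.lstsq(modes(t), y, rcond=None)

# Extend the model one year ahead (the "prediction" step).
t_future = np.arange(60, 72)
forecast = modes(t_future) @ beta
```
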

  11. Quantifying the Pathway and Predicting Spontaneous Emulsification during Material Exchange in a Two Phase Liquid System.

    PubMed

    Spooner, Stephen; Rahnama, Alireza; Warnett, Jason M; Williams, Mark A; Li, Zushu; Sridhar, Seetharaman

    2017-10-30

    Kinetic restriction of a thermodynamically favourable equilibrium is a common theme in materials processing. The interfacial instability in systems where rate of material exchange is far greater than the mass transfer through respective bulk phases is of specific interest when tracking the transient interfacial area, a parameter integral to short processing times for productivity streamlining in all manufacturing where interfacial reaction occurs. This is even more pertinent in high-temperature systems for energy and cost savings. Here the quantified physical pathway of interfacial area change due to material exchange in liquid metal-molten oxide systems is presented. In addition the predicted growth regime and emulsification behaviour in relation to interfacial tension as modelled using phase-field methodology is shown. The observed in-situ emulsification behaviour links quantitatively the geometry of perturbations as a validation method for the development of simulating the phenomena. Thus a method is presented to both predict and engineer the formation of micro emulsions to a desired specification.

  12. The predictive value of quantitative fibronectin testing in combination with cervical length measurement in symptomatic women.

    PubMed

    Bruijn, Merel M C; Kamphuis, Esme I; Hoesli, Irene M; Martinez de Tejada, Begoña; Loccufier, Anne R; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M; Oudijk, Martijn A; Jacquemyn, Yves; Schulzke, Sven M; Vetter, Grit; Hoste, Griet; Vis, Jolande Y; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan

    2016-12-01

    The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the conventional qualitative test, but there is limited evidence on the combination with cervical length measurement. The purpose of this study was to compare quantitative fetal fibronectin and qualitative fetal fibronectin testing in the prediction of spontaneous preterm birth within 7 days in symptomatic women who undergo cervical length measurement. We performed a European multicenter cohort study in 10 perinatal centers in 5 countries. Women between 24 and 34 weeks of gestation with signs of active labor and intact membranes underwent quantitative fibronectin testing and cervical length measurement. We assessed the risk of preterm birth within 7 days in predefined strata based on fibronectin concentration and cervical length. Of 455 women who were included in the study, 48 women (11%) delivered within 7 days. A combination of cervical length and qualitative fibronectin resulted in the identification of 246 women who were at low risk: 164 women with a cervix between 15 and 30 mm and a negative fibronectin test (<50 ng/mL; preterm birth rate, 2%) and 82 women with a cervix at >30 mm (preterm birth rate, 2%). Use of quantitative fibronectin alone resulted in a predicted risk of preterm birth within 7 days that ranged from 2% in the group with the lowest fibronectin level (<10 ng/mL) to 38% in the group with the highest fibronectin level (>500 ng/mL), with similar accuracy as that of the combination of cervical length and qualitative fibronectin. 
Combining cervical length and quantitative fibronectin resulted in the identification of an additional 19 women at low risk (preterm birth rate, 5%), using a threshold of 10 ng/mL in women with a cervix at <15 mm, and 6 women at high risk (preterm birth rate, 33%) using a threshold of >500 ng/mL in women with a cervix at >30 mm. In women with threatened preterm birth, quantitative fibronectin testing alone performs equal to the combination of cervical length and qualitative fibronectin. Possibly, the combination of quantitative fibronectin testing and cervical length increases this predictive capacity. Cost-effectiveness analysis and the availability of these tests in a local setting should determine the final choice. Copyright © 2016 Elsevier Inc. All rights reserved.
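
    The strata reported above can be encoded as a small decision function. The cut-offs (10, 50 and 500 ng/mL; 15 and 30 mm) are taken from the abstract, but the way they are combined into a single rule here is a simplification for illustration only, not clinical guidance.

```python
# Illustrative encoding of the risk strata discussed in the abstract.
# Thresholds come from the text; the combined rule is a simplification.

def risk_stratum(ffn_ng_ml: float, cervical_length_mm: float) -> str:
    """Classify risk of spontaneous preterm birth within 7 days."""
    if cervical_length_mm > 30:
        # Long cervix: low risk unless fibronectin is very high (>500 ng/mL).
        return "high" if ffn_ng_ml > 500 else "low"
    if cervical_length_mm >= 15:
        # Mid-range cervix: the qualitative 50 ng/mL cut-off applies.
        return "low" if ffn_ng_ml < 50 else "elevated"
    # Short cervix (<15 mm): quantitative testing can still identify a
    # low-risk group below 10 ng/mL.
    return "low" if ffn_ng_ml < 10 else "high"
```
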

  13. Harnessing quantitative genetics and genomics for understanding and improving complex traits in crops

    USDA-ARS?s Scientific Manuscript database

    Classical quantitative genetics aids crop improvement by providing the means to estimate heritability, genetic correlations, and predicted responses to various selection schemes. Genomics has the potential to aid quantitative genetics and applied crop improvement programs via large-scale, high-thro...

  14. Comparative study of contrast-enhanced ultrasound qualitative and quantitative analysis for identifying benign and malignant breast tumor lumps.

    PubMed

    Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting

    2014-01-01

    To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The CEUS qualitative indicator-generated regression equation contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, which demonstrated prediction accuracy for benign and malignant breast tumor lumps of 91.8%; the quantitative indicator-generated regression equation only contained one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for qualitative and quantitative analyses were 91.3% and 75.7%, respectively, which exhibited a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than with quantitative analysis.

  15. A systematic review of quantitative burn wound microbiology in the management of burns patients.

    PubMed

    Halstead, Fenella D; Lee, Kwang Chear; Kwei, Johnny; Dretzke, Janine; Oppenheim, Beryl A; Moiemen, Naiem S

    2018-02-01

    The early diagnosis of infection or sepsis in burns is important for patient care. Globally, a large number of burn centres advocate quantitative cultures of wound biopsies for patient management, since there is assumed to be a direct link between the bioburden of a burn wound and the risk of microbial invasion. Given the conflicting study findings in this area, a systematic review was warranted. Bibliographic databases were searched with no language restrictions to August 2015. Study selection, data extraction and risk of bias assessment were performed in duplicate using pre-defined criteria. Substantial heterogeneity precluded quantitative synthesis, and findings were described narratively, sub-grouped by clinical question. Twenty-six laboratory and/or clinical studies were included. Substantial heterogeneity hampered comparisons across studies and interpretation of findings. Limited evidence suggests that (i) more than one quantitative microbiology sample is required to obtain reliable estimates of bacterial load; (ii) biopsies are more sensitive than swabs in diagnosing or predicting sepsis; (iii) high bacterial loads may predict worse clinical outcomes, and (iv) both quantitative and semi-quantitative culture reports need to be interpreted with caution and in the context of other clinical risk factors. The evidence base for the utility and reliability of quantitative microbiology for diagnosing or predicting clinical outcomes in burns patients is limited and often poorly reported. Consequently future research is warranted. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  16. A rational account of pedagogical reasoning: teaching by, and learning from, examples.

    PubMed

    Shafto, Patrick; Goodman, Noah D; Griffiths, Thomas L

    2014-06-01

    Much of learning and reasoning occurs in pedagogical situations--situations in which a person who knows a concept chooses examples for the purpose of helping a learner acquire the concept. We introduce a model of teaching and learning in pedagogical settings that predicts which examples teachers should choose and what learners should infer given a teacher's examples. We present three experiments testing the model predictions for rule-based, prototype, and causally structured concepts. The model shows good quantitative and qualitative fits to the data across all three experiments, predicting novel qualitative phenomena in each case. We conclude by discussing implications for understanding concept learning and implications for theoretical claims about the role of pedagogy in human learning. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Applications of Genomic Selection in Breeding Wheat for Rust Resistance.

    PubMed

    Ornella, Leonardo; González-Camacho, Juan Manuel; Dreisigacker, Susanne; Crossa, Jose

    2017-01-01

    Many methods have been developed to predict untested phenotypes in schemes commonly used in genomic selection (GS) breeding. The use of GS for predicting disease resistance has its own particularities: (a) most populations show additivity in quantitative adult plant resistance (APR); (b) resistance needs effective combinations of major and minor genes; and (c) the phenotype is commonly expressed in ordinal categorical traits, whereas most parametric applications assume that the response variable is continuous and normally distributed. Machine learning methods (MLM) can take advantage of examples (data) that capture characteristics of interest from an unknown underlying probability distribution (i.e., they are data-driven). We introduce some state-of-the-art MLM capable of predicting rust resistance in wheat. We also present two parametric R packages so that the reader can compare approaches.

  18. Test-Analysis Correlation for Space Shuttle External Tank Foam Impacting RCC Wing Leading Edge Component Panels

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2008-01-01

    The Space Shuttle Columbia Accident Investigation Board recommended that NASA develop, validate, and maintain a modeling tool capable of predicting the damage threshold for debris impacts on the Space Shuttle Reinforced Carbon-Carbon (RCC) wing leading edge and nosecap assembly. The results presented in this paper are one part of a multi-level approach that supported the development of the predictive tool used to recertify the shuttle for flight following the Columbia Accident. The assessment of predictive capability was largely based on test analysis comparisons for simpler component structures. This paper provides comparisons of finite element simulations with test data for external tank foam debris impacts onto 6-in. square RCC flat panels. Both quantitative displacement and qualitative damage assessment correlations are provided. The comparisons show good agreement and provided the Space Shuttle Program with confidence in the predictive tool.

  19. Predicted and measured boundary layer refraction for advanced turboprop propeller noise

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Krejsa, Eugene A.

    1990-01-01

    Currently, boundary layer refraction presents a limitation to the measurement of forward arc propeller noise measured on an acoustic plate in the NASA Lewis 8- by 6-Foot Supersonic Wind Tunnel. The use of a validated boundary layer refraction model to adjust the data could remove this limitation. An existing boundary layer refraction model is used to predict the refraction for cases where boundary layer refraction was measured. In general, the model exhibits the same qualitative behavior as the measured refraction. However, the prediction method does not show quantitative agreement with the data. In general, it overpredicts the amount of refraction for the far forward angles at axial Mach numbers of 0.85 and 0.80 and underpredicts the refraction at axial Mach numbers of 0.75 and 0.70. A more complete propeller source description is suggested as a way to improve the prediction method.

  20. A new simplex chemometric approach to identify olive oil blends with potentially high traceability.

    PubMed

    Semmar, N; Laroussi-Mezghani, S; Grati-Kamoun, N; Hammami, M; Artaud, J

    2016-10-01

    Olive oil blends (OOBs) are complex matrices combining different cultivars at variable proportions. Although qualitative determinations of OOBs have been the subject of several chemometric works, quantitative evaluations of their contents remain poorly developed because of traceability difficulties concerning co-occurring cultivars. Around this question, we recently published an original simplex approach that helps to develop predictive models of the proportions of co-occurring cultivars from chemical profiles of the resulting blends (Semmar & Artaud, 2015). Beyond predictive model construction and validation, this paper presents an extension based on prediction-error analysis to statistically define the blends with the highest predictability among all those that can be made by mixing cultivars at different proportions. This provides an interesting way to identify a priori labeled commercial products with potentially high traceability, taking into account the natural chemical variability of the different constitutive cultivars. Copyright © 2016 Elsevier Ltd. All rights reserved.
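
    The core of such blend modelling is linear unmixing: if each cultivar has a characteristic chemical profile and blends mix linearly, the cultivar proportions can be recovered by least squares. The three cultivar profiles below are synthetic stand-ins, not real olive oil data, and this sketch ignores the measurement noise that drives the prediction-error analysis in the paper.

```python
# Recover cultivar proportions from a blend's chemical profile by least
# squares, assuming linear mixing. Synthetic profiles for illustration.
import numpy as np

# Rows = cultivars, columns = chemical markers (synthetic values).
profiles = np.array([[10.0, 1.0, 5.0],
                     [2.0, 8.0, 3.0],
                     [4.0, 4.0, 9.0]])

true_props = np.array([0.5, 0.3, 0.2])
blend = true_props @ profiles           # observed blend profile

# Solve profiles.T @ p = blend, then project back onto the simplex.
est, *_ = np.linalg.lstsq(profiles.T, blend, rcond=None)
est = np.clip(est, 0, None)
est /= est.sum()
```
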

  1. Ductile failure X-prize.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, James V.; Wellman, Gerald William; Emery, John M.

    2011-09-01

    Fracture or tearing of ductile metals is a pervasive engineering concern, yet accurate prediction of the critical conditions of fracture remains elusive. Sandia National Laboratories has been developing and implementing several new modeling methodologies to address problems in fracture, including both new physical models and new numerical schemes. The present study provides a double-blind quantitative assessment of several computational capabilities including tearing parameters embedded in a conventional finite element code, localization elements, extended finite elements (XFEM), and peridynamics. For this assessment, each of four teams reported blind predictions for three challenge problems spanning crack initiation and crack propagation. After predictions had been reported, the predictions were compared to experimentally observed behavior. The metal alloys for these three problems were aluminum alloy 2024-T3 and precipitation hardened stainless steel PH13-8Mo H950. The predictive accuracies of the various methods are demonstrated, and the potential sources of error are discussed.

  2. Estimation of equivalence ratio distribution in diesel spray using a computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Suzuki, Yasumasa; Tsujimura, Taku; Kusaka, Jin

    2014-08-01

    It is important to understand the mechanism of mixing and atomization in a diesel spray. Computational prediction of the mixing behavior and internal structure of a diesel spray is expected to further that understanding and to support development of the diesel engine, including its fuel-injection devices. In this study, we predicted diesel fuel spray formation with a 3D-CFD code and validated the application by comparison with experimental measurements of spray behavior and of the equivalence ratio visualized by Rayleigh-scatter imaging under a range of ambient, injection and fuel conditions. Using suitable constants in the KH-RT breakup model, the liquid spray length can be predicted quantitatively under various injection, ambient and fuel conditions. In contrast, the changes in vapor penetration and in the fuel mass fraction and equivalence ratio distributions with changing injection and ambient conditions are not captured quantitatively. The 3D-CFD code used in this study overpredicts the spray cone angle and the entrainment of ambient gas; prediction accuracy could therefore be improved by refining the droplet breakup and evaporation models and by quantitative prediction of the spray cone angle.
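
    The quantity being mapped in these experiments, the equivalence ratio φ, is the actual fuel/air mass ratio normalised by its stoichiometric value. A short worked example follows; the stoichiometric fuel/air mass ratio of roughly 1/14.5 used here is an assumed textbook value for a diesel-like fuel, not a number from this study.

```python
# phi = (m_fuel / m_air) / (m_fuel / m_air)_stoich
# phi > 1 means fuel-rich (spray core); phi < 1 means fuel-lean (periphery).

STOICH_FUEL_AIR = 1.0 / 14.5  # assumed stoichiometric fuel/air mass ratio

def equivalence_ratio(m_fuel: float, m_air: float) -> float:
    """Equivalence ratio of a local fuel/air mixture."""
    return (m_fuel / m_air) / STOICH_FUEL_AIR

rich = equivalence_ratio(1.0, 10.0)   # more fuel than stoichiometric -> phi > 1
lean = equivalence_ratio(1.0, 20.0)   # more air than stoichiometric -> phi < 1
```
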

  3. Quantitative computed tomography for the prediction of pulmonary function after lung cancer surgery: a simple method using simulation software.

    PubMed

    Ueda, Kazuhiro; Tanaka, Toshiki; Li, Tao-Sheng; Tanaka, Nobuyuki; Hamano, Kimikazu

    2009-03-01

    The prediction of pulmonary functional reserve is mandatory in therapeutic decision-making for patients with resectable lung cancer, especially those with underlying lung disease. Volumetric analysis in combination with densitometric analysis of the affected lung lobe or segment with quantitative computed tomography (CT) helps to identify residual pulmonary function, although the utility of this modality needs investigation. The subjects of this prospective study were 30 patients with resectable lung cancer. A three-dimensional CT lung model was created with voxels representing normal lung attenuation (-600 to -910 Hounsfield units). Residual pulmonary function was predicted by drawing a boundary line between the lung to be preserved and that to be resected, directly on the lung model. The predicted values were correlated with the postoperative measured values. The predicted and measured values corresponded well (r=0.89, p<0.001). Although the predicted values corresponded with values predicted by simple calculation using a segment-counting method (r=0.98), there were two outliers whose pulmonary functional reserves were predicted more accurately by CT than by segment counting. The measured pulmonary functional reserves were significantly higher than the predicted values in patients with extensive emphysematous areas (<-910 Hounsfield units), but not in patients with chronic obstructive pulmonary disease. Quantitative CT yielded accurate prediction of functional reserve after lung cancer surgery and helped to identify patients whose functional reserves are likely to be underestimated. Hence, this modality should be utilized for patients with marginal pulmonary function.

  4. Universality and predictability in molecular quantitative genetics.

    PubMed

    Nourmohammad, Armita; Held, Torsten; Lässig, Michael

    2013-12-01

    Molecular traits, such as gene expression levels or protein binding affinities, are increasingly accessible to quantitative measurement by modern high-throughput techniques. Such traits measure molecular functions and, from an evolutionary point of view, are important as targets of natural selection. We review recent developments in evolutionary theory and experiments that are expected to become building blocks of a quantitative genetics of molecular traits. We focus on universal evolutionary characteristics: these are largely independent of a trait's genetic basis, which is often at least partially unknown. We show that universal measurements can be used to infer selection on a quantitative trait, which determines its evolutionary mode of conservation or adaptation. Furthermore, universality is closely linked to predictability of trait evolution across lineages. We argue that universal trait statistics extend across a range of cellular scales and open new avenues of quantitative evolutionary systems biology. Copyright © 2013. Published by Elsevier Ltd.

  5. Investigation of the influence of protein corona composition on gold nanoparticle bioactivity using machine learning approaches.

    PubMed

    Papa, E; Doucet, J P; Sangion, A; Doucet-Panaye, A

    2016-07-01

    The understanding of the mechanisms and interactions that occur when nanomaterials enter biological systems is important to improve their future use. The adsorption of proteins from biological fluids in a physiological environment to form a corona on the surface of nanoparticles represents a key step that influences nanoparticle behaviour. In this study, the quantitative description of the composition of the protein corona was used to study the effect on cell association induced by 84 surface-modified gold nanoparticles of different sizes. Quantitative relationships between the protein corona and the activity of the gold nanoparticles were modelled by using several machine learning-based linear and non-linear approaches. Models based on a selection of only six serum proteins had robust and predictive results. The Projection Pursuit Regression method had the best performances (r² = 0.91; Q²LOO = 0.81; r²ext = 0.79). The present study confirmed the utility of protein corona composition to predict the bioactivity of gold nanoparticles and identified the main proteins that act as promoters or inhibitors of cell association. In addition, the comparison of several techniques showed which strategies offer the best results in prediction and could be used to support new toxicological studies on gold-based nanomaterials.
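
    The Q²LOO statistic quoted above comes from leave-one-out cross-validation: refit the model with each sample held out, predict that sample, and compare the predictive residuals to the total variance. The sketch below illustrates the procedure with a one-feature linear model on synthetic data standing in for Projection Pursuit Regression.

```python
# Leave-one-out Q2: Q2 = 1 - PRESS / TSS, where PRESS accumulates squared
# errors of predictions made with each sample excluded from the fit.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 30)
y = 2.0 * x + rng.normal(0, 0.1, 30)   # synthetic descriptor ~ activity

def fit_predict(x_train, y_train, x_new):
    slope, intercept = np.polyfit(x_train, y_train, 1)
    return slope * x_new + intercept

press = 0.0                            # predictive residual sum of squares
for i in range(len(x)):
    mask = np.arange(len(x)) != i
    press += (y[i] - fit_predict(x[mask], y[mask], x[i])) ** 2

q2_loo = 1.0 - press / np.sum((y - y.mean()) ** 2)
```
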

  6. Synthetic cannabinoids: In silico prediction of the cannabinoid receptor 1 affinity by a quantitative structure-activity relationship model.

    PubMed

    Paulke, Alexander; Proschak, Ewgenij; Sommer, Kai; Achenbach, Janosch; Wunder, Cora; Toennes, Stefan W

    2016-03-14

    The number of new synthetic psychoactive compounds increases steadily. Among these psychoactive compounds, the synthetic cannabinoids (SCBs) are the most popular and serve as a substitute for herbal cannabis. More than 600 of these substances already exist. For some SCBs the in vitro cannabinoid receptor 1 (CB1) affinity is known, but for the majority it is unknown. A quantitative structure-activity relationship (QSAR) model was developed, which allows determination of an SCB's affinity for CB1 (expressed as the binding constant, Ki) without reference substances. The chemically advanced template search descriptor was used for vector representation of the compound structures. The similarity between two molecules was calculated using the Feature-Pair Distribution Similarity. The Ki values were calculated using the Inverse Distance Weighting method. The prediction model was validated using a cross-validation procedure. The predicted Ki values of some new SCBs ranged from 20 (considerably higher affinity for CB1 than THC) to 468 (considerably lower affinity for CB1 than THC). The present QSAR model can serve as a simple, fast and cheap tool to get a first hint of the biological activity of new synthetic cannabinoids or of other new psychoactive compounds. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
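
    The prediction scheme described above (similarity between descriptor vectors, inverted into a distance, then inverse-distance-weighted averaging of known Ki values) can be sketched as follows. The feature sets, similarity measure and Ki values below are toy stand-ins, not the actual descriptors or data of the study.

```python
# Inverse Distance Weighting (IDW) over a similarity-derived distance.

def tanimoto(a: set, b: set) -> float:
    """Similarity between two feature sets (1.0 = identical)."""
    return len(a & b) / len(a | b)

def idw_predict(query: set, training: list, power: float = 2.0) -> float:
    """Predict Ki as an inverse-distance-weighted mean over training compounds."""
    num = den = 0.0
    for feats, ki in training:
        d = 1.0 - tanimoto(query, feats)    # distance = 1 - similarity
        if d == 0.0:
            return ki                       # exact match: return its Ki
        w = 1.0 / d ** power
        num += w * ki
        den += w
    return num / den

# Two toy training compounds spanning the reported Ki range of 20 to 468.
training = [({1, 2, 3, 4}, 20.0), ({5, 6, 7, 8}, 468.0)]
pred = idw_predict({1, 2, 3, 9}, training)  # query similar to the first compound
```
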

  7. Unpredictability of escape trajectory explains predator evasion ability and microhabitat preference of desert rodents.

    PubMed

    Moore, Talia Y; Cooper, Kimberly L; Biewener, Andrew A; Vasudevan, Ramanarayan

    2017-09-05

    Mechanistically linking movement behaviors and ecology is key to understanding the adaptive evolution of locomotion. Predator evasion, a behavior that enhances fitness, may depend upon short bursts or complex patterns of locomotion. However, such movements are poorly characterized by existing biomechanical metrics. We present methods based on the entropy measure of randomness from Information Theory to quantitatively characterize the unpredictability of non-steady-state locomotion. We then apply the method by examining sympatric rodent species whose escape trajectories differ in dimensionality. Unlike the speed-regulated gait use of cursorial animals to enhance locomotor economy, bipedal jerboa (family Dipodidae) gait transitions likely enhance maneuverability. In field-based observations, jerboa trajectories are significantly less predictable than those of quadrupedal rodents, likely increasing predator evasion ability. Consistent with this hypothesis, jerboas exhibit lower anxiety in open fields than quadrupedal rodents, a behavior that varies inversely with predator evasion ability. Our unpredictability metric expands the scope of quantitative biomechanical studies to include non-steady-state locomotion in a variety of evolutionary and ecologically significant contexts. Biomechanical understanding of animal gait and maneuverability has primarily been limited to species with more predictable, steady-state movement patterns. Here, the authors develop a method to quantify movement predictability, and apply the method to study escape-related movement in several species of desert rodents.
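
    The entropy idea can be illustrated concretely: discretise the turn angles along a trajectory and score unpredictability as the Shannon entropy of their distribution. The 45° binning and the two example heading sequences below are assumptions for illustration, not the paper's actual protocol or data.

```python
# Shannon entropy of discretised turn angles as an unpredictability score.
import math
from collections import Counter

def shannon_entropy(symbols) -> float:
    """Entropy (bits) of the empirical distribution of a symbol sequence."""
    counts = Counter(symbols)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def discretise_turns(headings, bin_deg=45):
    """Turn angles between successive headings, binned into bin_deg sectors."""
    turns = [(b - a) % 360 for a, b in zip(headings, headings[1:])]
    return [int(t // bin_deg) for t in turns]

straight = [0, 0, 0, 0, 0, 0, 0, 0]            # straight quadrupedal-style run
erratic = [0, 90, 200, 30, 300, 140, 10, 250]  # jerboa-style zig-zag

h_straight = shannon_entropy(discretise_turns(straight))
h_erratic = shannon_entropy(discretise_turns(erratic))
```

    A perfectly straight run has zero entropy (every turn falls in one bin), while an erratic trajectory spreads its turns across bins and scores high, matching the intuition that it is harder for a predator to anticipate.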

  8. Multiple-scale structures: from Faraday waves to soft-matter quasicrystals.

    PubMed

    Savitz, Samuel; Babadi, Mehrtash; Lifshitz, Ron

    2018-05-01

    For many years, quasicrystals were observed only as solid-state metallic alloys, yet current research is now actively exploring their formation in a variety of soft materials, including systems of macromolecules, nanoparticles and colloids. Much effort is being invested in understanding the thermodynamic properties of these soft-matter quasicrystals in order to predict and possibly control the structures that form, and hopefully to shed light on the broader yet unresolved general questions of quasicrystal formation and stability. Moreover, the ability to control the self-assembly of soft quasicrystals may contribute to the development of novel photonics or other applications based on self-assembled metamaterials. Here a path is followed, leading to quantitative stability predictions, that starts with a model developed two decades ago to treat the formation of multiple-scale quasiperiodic Faraday waves (standing wave patterns in vibrating fluid surfaces) and which was later mapped onto systems of soft particles, interacting via multiple-scale pair potentials. The article reviews, and substantially expands, the quantitative predictions of these models, while correcting a few discrepancies in earlier calculations, and presents new analytical methods for treating the models. In so doing, a number of new stable quasicrystalline structures are found with octagonal, octadecagonal and higher-order symmetries, some of which may, it is hoped, be observed in future experiments.

  9. Interpretation of Negative Molecular Test Results in Patients With Suspected or Confirmed Ebola Virus Disease: Report of Two Cases.

    PubMed

    Edwards, Jeffrey K; Kleine, Christian; Munster, Vincent; Giuliani, Ruggero; Massaquoi, Moses; Sprecher, Armand; Chertow, Daniel S

    2015-12-01

    Quantitative reverse-transcription polymerase chain reaction (qRT-PCR) is the most sensitive quantitative diagnostic assay for detection of Ebola virus in multiple body fluids. Despite the strengths of this assay, we present 2 cases of Ebola virus disease (EVD) and highlight the potential for false-negative results during the early and late stages of EVD. The first case emphasizes the low negative-predictive value of qRT-PCR during incubation and the early febrile stage of EVD, and the second case emphasizes the potential for false-negative results during recovery and late neurologic complications of EVD. Careful interpretation of test results is needed to guide difficult admission and discharge decisions in suspected or confirmed EVD.
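
    The caution above comes down to negative predictive value (NPV): among negative tests, the fraction that are truly negative. A quick worked calculation makes the point; the sensitivity, specificity and prevalence figures are hypothetical, chosen only to show how NPV falls when the assay's effective sensitivity is low (e.g. viral load below the detection limit during incubation).

```python
# NPV = P(no disease | negative test), computed via Bayes' rule from
# hypothetical test characteristics and pre-test probability.

def npv(sensitivity: float, specificity: float, prevalence: float) -> float:
    true_neg = specificity * (1 - prevalence)        # truly negative, test negative
    false_neg = (1 - sensitivity) * prevalence       # diseased, but test negative
    return true_neg / (true_neg + false_neg)

# Early incubation: viral load below detection -> low effective sensitivity.
low_sens = npv(sensitivity=0.60, specificity=0.99, prevalence=0.50)
# Peak illness: assay near-perfectly sensitive.
high_sens = npv(sensitivity=0.999, specificity=0.99, prevalence=0.50)
```
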

  10. LOD significance thresholds for QTL analysis in experimental populations of diploid species

    PubMed

    Van Ooijen JW

    1999-11-01

    Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. The lack of an easy way to determine which regions of the genome can be declared statistically significant for containing a gene affecting the quantitative trait of interest hampers control of the false-positive rate. In this paper four tables, obtained by large-scale simulations, are presented that can be used with a simple formula to obtain the false-positive rate for analyses of the standard types of experimental populations of diploid species with any genome size. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.

  11. A quantitative link between microplastic instability and macroscopic deformation behaviors in metallic glasses

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Chen, G. L.; Hui, X. D.; Liu, C. T.; Lin, Y.; Shang, X. C.; Lu, Z. P.

    2009-10-01

    Based on the mechanical instability of individual shear transformation zones (STZs), a quantitative link between microplastic instability and the macroscopic deformation behavior of metallic glasses was proposed. Our analysis confirms that macroscopic metallic glasses comprise a statistical distribution of STZ embryos with distributed values of activation energy, and that the microplastic instability of all the individual STZs dictates the macroscopic deformation behavior of amorphous solids. The statistical model presented in this paper can successfully reproduce the macroscopic stress-strain curves determined experimentally and can readily be used to predict strain-rate effects on the macroscopic responses, given the material parameters at a certain strain rate, offering new insights into the actual deformation mechanism in amorphous solids.

  12. Genome-Wide Tuning of Protein Expression Levels to Rapidly Engineer Microbial Traits.

    PubMed

    Freed, Emily F; Winkler, James D; Weiss, Sophie J; Garst, Andrew D; Mutalik, Vivek K; Arkin, Adam P; Knight, Rob; Gill, Ryan T

    2015-11-20

The reliable engineering of biological systems requires quantitative mapping of predictable and context-independent expression over a broad range of protein expression levels. However, current techniques for modifying expression levels are cumbersome and are not amenable to high-throughput approaches. Here we present major improvements to current techniques through the design and construction of E. coli genome-wide libraries using synthetic DNA cassettes that can tune expression over a ~10^4 range. The cassettes also contain molecular barcodes that are optimized for next-generation sequencing, enabling rapid and quantitative tracking of alleles that have the highest fitness advantage. We show these libraries can be used to determine which genes and expression levels confer greater fitness to E. coli under different growth conditions.
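Barcode-based fitness tracking of the kind described can be illustrated with a small calculation: estimate each allele's fitness from the per-generation log2 change in its barcode frequency between two sequencing time points, normalized to the population median. The helper below, including its pseudocount handling for dropouts, is a hypothetical illustration, not the study's actual pipeline.

```python
import math

def allele_fitness(counts_t0, counts_t1, generations):
    """Relative fitness per barcoded allele from sequencing counts
    before (counts_t0) and after (counts_t1) competitive growth.
    Fitness = per-generation log2 change in barcode frequency,
    normalized so the population median allele scores 0."""
    total0 = sum(counts_t0.values())
    total1 = sum(counts_t1.values())
    raw = {}
    for bc in counts_t0:
        f0 = counts_t0[bc] / total0
        f1 = counts_t1.get(bc, 0.5) / total1  # pseudocount for dropouts
        raw[bc] = math.log2(f1 / f0) / generations
    med = sorted(raw.values())[len(raw) // 2]
    return {bc: s - med for bc, s in raw.items()}

t0 = {"bcA": 100, "bcB": 100, "bcC": 100}
t1 = {"bcA": 400, "bcB": 100, "bcC": 25}
fit = allele_fitness(t0, t1, generations=10)  # bcA enriched, bcC depleted
```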

  13. The quantitative structure-insecticidal activity relationships from plant derived compounds against chikungunya and zika Aedes aegypti (Diptera:Culicidae) vector.

    PubMed

    Saavedra, Laura M; Romanelli, Gustavo P; Rozo, Ciro E; Duchowicz, Pablo R

    2018-01-01

The insecticidal activity of a series of 62 plant derived molecules against the chikungunya, dengue and zika vector, the Aedes aegypti (Diptera:Culicidae) mosquito, is subjected to a Quantitative Structure-Activity Relationships (QSAR) analysis. The Replacement Method (RM) variable subset selection technique based on Multivariable Linear Regression (MLR) proves to be successful for exploring 4885 molecular descriptors calculated with Dragon 6. The predictive capability of the obtained models is confirmed through an external test set of compounds, Leave-One-Out (LOO) cross-validation and Y-Randomization. The present study constitutes a first necessary computational step for designing less toxic insecticides.
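The Replacement Method named above is, at its core, a greedy subset search: start from a random set of d descriptors and repeatedly swap single descriptors for candidates that lower the regression residual, until no swap helps. A minimal sketch on synthetic data (not the authors' implementation, and without their Dragon descriptors) might look like:

```python
import numpy as np

def replacement_method(X, y, d, n_iter=20, seed=0):
    """Greedy RM-style descriptor subset selection for MLR: start from
    a random d-column subset of X and accept any single-column swap
    that reduces the least-squares residual sum of squares (RSS)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape

    def rss(cols):
        beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        r = y - X[:, cols] @ beta
        return float(r @ r)

    subset = list(rng.choice(p, size=d, replace=False))
    best = rss(subset)
    for _ in range(n_iter):
        improved = False
        for i in range(d):
            for j in range(p):
                if j in subset:
                    continue
                trial = subset.copy()
                trial[i] = j
                s = rss(trial)
                if s < best - 1e-12:
                    subset, best, improved = trial, s, True
        if not improved:
            break
    return sorted(subset), best

# Synthetic check: only descriptors 2 and 5 actually drive y.
rng2 = np.random.default_rng(1)
X = rng2.normal(size=(80, 8))
y = 2.0 * X[:, 2] - X[:, 5]
cols, err = replacement_method(X, y, d=2)
```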

  14. Spectroscopic database

    NASA Technical Reports Server (NTRS)

    Husson, N.; Barbe, A.; Brown, L. R.; Carli, B.; Goldman, A.; Pickett, H. M.; Roche, A. E.; Rothman, L. S.; Smith, M. A. H.

    1985-01-01

Several aspects of quantitative atmospheric spectroscopy are considered, using a classification of molecules according to their gas amounts in the stratosphere and upper troposphere, and reviews of quantitative atmospheric high-resolution spectroscopic measurements and field measurement systems are given. Laboratory spectroscopy and spectral analysis and prediction are presented, with a summary of current laboratory spectroscopy capabilities. Spectroscopic data requirements for accurate derivation of atmospheric composition are discussed, with examples from space-based remote sensing experiments of the atmosphere: the ATMOS (Atmospheric Trace Molecule Spectroscopy) and UARS (Upper Atmosphere Research Satellite) experiments. The database is assessed through a review of the basic parameters involved in the data compilations, a summary of information on existing line-parameter compilations, and a summary of current laboratory spectroscopy studies.

  15. Target and Tissue Selectivity Prediction by Integrated Mechanistic Pharmacokinetic-Target Binding and Quantitative Structure Activity Modeling.

    PubMed

    Vlot, Anna H C; de Witte, Wilhelmus E A; Danhof, Meindert; van der Graaf, Piet H; van Westen, Gerard J P; de Lange, Elizabeth C M

    2017-12-04

Selectivity is an important attribute of effective and safe drugs, and prediction of in vivo target and tissue selectivity would likely improve drug development success rates. However, a lack of understanding of the underlying (pharmacological) mechanisms and a lack of directly applicable predictive methods complicate the prediction of selectivity. We explore the value of combining physiologically based pharmacokinetic (PBPK) modeling with quantitative structure-activity relationship (QSAR) modeling to predict the influence of the target dissociation constant (KD) and the target dissociation rate constant (koff) on target and tissue selectivity. The KD values of CB1 ligands in the ChEMBL database are predicted by QSAR random forest (RF) modeling for the CB1 receptor and known off-targets (TRPV1, mGlu5, 5-HT1a). Of these CB1 ligands, rimonabant, CP-55940, and Δ8-tetrahydrocannabinol, one of the active ingredients of cannabis, were selected for simulations of target occupancy for CB1, TRPV1, mGlu5, and 5-HT1a in three brain regions, to illustrate the principles of the combined PBPK-QSAR modeling. Our combined PBPK and target binding modeling demonstrated that the optimal values of KD and koff for target and tissue selectivity depend on target concentration and tissue distribution kinetics. Interestingly, if the target concentration is high and the perfusion of the target site is low, the optimal KD value is often not the lowest KD value, suggesting that optimization toward high drug-target affinity can decrease the benefit-risk ratio. The presented integrative structure-pharmacokinetic-pharmacodynamic modeling provides an improved understanding of tissue and target selectivity.
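The occupancy logic behind the KD trade-off can be shown with the textbook equilibrium relation occupancy = C/(C + KD), here ignoring the target depletion and kinetics that the full PBPK model accounts for; the concentrations and constants below are invented for illustration.

```python
def occupancy(c_free, kd):
    """Equilibrium receptor occupancy for free drug concentration
    c_free and dissociation constant kd (same units, depletion ignored)."""
    return c_free / (c_free + kd)

def selectivity(c_free, kd_on, kd_off):
    """Ratio of on-target to off-target occupancy at one exposure."""
    return occupancy(c_free, kd_on) / occupancy(c_free, kd_off)

# At c = 1 nM with kd_on = 1 nM and kd_off = 100 nM (illustrative):
s = selectivity(1.0, 1.0, 100.0)
```

Raising the exposure (or lowering kd_on once the target is near-saturated) mostly raises off-target occupancy, which is the equilibrium core of the abstract's point that the highest affinity is not always optimal.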

  16. Do psychosocial work conditions predict risk of disability pensioning? An analysis of register-based outcomes using pooled data on 40,554 observations.

    PubMed

    Clausen, Thomas; Burr, Hermann; Borg, Vilhelm

    2014-06-01

To investigate whether high psychosocial job demands (quantitative demands and work pace) and low psychosocial job resources (influence at work and quality of leadership) predicted the risk of disability pensioning among employees in four occupational groups (employees working with customers, employees working with clients, office workers, and manual workers), in line with the propositions of the Job Demands-Resources (JD-R) model. Survey data from 40,554 individuals were linked to the DREAM register, which contains information on payments of disability pension. Using multi-adjusted Cox regression, observations were followed in the DREAM register to assess the risk of disability pensioning. Average follow-up time was 5.9 years (SD=3.0). Low influence at work predicted an increased risk of disability pensioning, and medium levels of quantitative demands predicted a decreased risk, in the study population. We found significant interaction effects between job demands and job resources, as the combination of low quality of leadership and high job demands predicted the highest rate of disability pensioning. Further analyses showed some, but not statistically significant, differences between the four occupational groups in the associations between job demands, job resources and the risk of disability pensioning. The study showed that psychosocial job demands and job resources predicted the risk of disability pensioning. The direction of some of the observed associations ran counter to the expectations of the JD-R model, and the findings of the present study therefore imply that associations between job demands, job resources and adverse labour market outcomes are more complex than conceptualised in the JD-R model.

  17. Machine Learning Meta-analysis of Large Metagenomic Datasets: Tools and Biological Insights.

    PubMed

    Pasolli, Edoardo; Truong, Duy Tin; Malik, Faizan; Waldron, Levi; Segata, Nicola

    2016-07-01

Shotgun metagenomic analysis of the human associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, posing questions about the generality and generalization of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and for quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the "healthy" microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.

  18. Quantitative structure-retention relationships of polycyclic aromatic hydrocarbons gas-chromatographic retention indices.

    PubMed

    Drosos, Juan Carlos; Viola-Rhenals, Maricela; Vivas-Reyes, Ricardo

    2010-06-25

Polycyclic aromatic hydrocarbons (PAHs) are of concern in environmental chemistry and toxicology. In the present work, a QSRR study was performed for 209 previously reported PAHs using quantum-mechanical and other descriptors estimated by different approaches. The B3LYP/6-31G* level of theory was used for geometry optimization and the quantum-mechanical variables. A good linear relationship between gas-chromatographic retention index and electronic or topological descriptors was found by stepwise linear regression analysis. The molecular polarizability (alpha) and the second-order Kier and Hall molecular connectivity index (2χ) showed significant correlation with retention index, with squared coefficients of determination R^2 = 0.950 and 0.962, respectively. A one-variable QSRR model is presented for each descriptor, and both models demonstrate significant predictive capacity as established by leave-many-out (LMO) cross-validation excluding 25% of rows (q^2(LMO-25%) = 0.947 and 0.960, respectively). Furthermore, the physicochemical interpretation of the selected descriptors allowed a detailed explanation of the source of the observed statistical correlation. The model analysis suggests that a single descriptor is sufficient to establish a consistent retention index-structure relationship; only moderate or non-significant improvement in quantitative results or statistical validation parameters was observed when more terms were introduced into the predictive equation. The proposed one-parameter QSRR model offers a consistent scheme for predicting chromatographic properties of PAH compounds.
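Leave-many-out cross-validation of a one-descriptor linear model is easy to make concrete: repeatedly hold out 25% of the rows, refit on the rest, and pool the held-out prediction errors into a q^2 statistic. The resampling details below (number of rounds, random holdouts) are illustrative and may differ from the paper's exact scheme.

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def q2_lmo(xs, ys, frac_out=0.25, n_rounds=200, seed=0):
    """Leave-many-out cross-validated q^2: hold out frac_out of the
    rows, refit on the remainder, pool held-out squared errors (PRESS),
    and compare against the total sum of squares."""
    rng = random.Random(seed)
    n = len(xs)
    my = sum(ys) / n
    k = max(1, int(frac_out * n))
    press, ss_tot = 0.0, sum((y - my) ** 2 for y in ys)
    for _ in range(n_rounds):
        out = set(rng.sample(range(n), k))
        tr = [i for i in range(n) if i not in out]
        a, b = fit_line([xs[i] for i in tr], [ys[i] for i in tr])
        press += sum((ys[i] - (a + b * xs[i])) ** 2 for i in out)
    # scale the total SS to the number of pooled held-out predictions
    return 1.0 - press / (ss_tot * n_rounds * k / n)

# Toy descriptor-retention data: near-linear with small deviations.
xs = list(range(20))
ys = [3 * x + 5 + ((-1) ** x) * 0.5 for x in xs]
q2 = q2_lmo(xs, ys)
```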

  19. An Optimized Transient Dual Luciferase Assay for Quantifying MicroRNA Directed Repression of Targeted Sequences

    PubMed Central

    Moyle, Richard L.; Carvalhais, Lilia C.; Pretorius, Lara-Simone; Nowak, Ekaterina; Subramaniam, Gayathery; Dalton-Morgan, Jessica; Schenk, Peer M.

    2017-01-01

    Studies investigating the action of small RNAs on computationally predicted target genes require some form of experimental validation. Classical molecular methods of validating microRNA action on target genes are laborious, while approaches that tag predicted target sequences to qualitative reporter genes encounter technical limitations. The aim of this study was to address the challenge of experimentally validating large numbers of computationally predicted microRNA-target transcript interactions using an optimized, quantitative, cost-effective, and scalable approach. The presented method combines transient expression via agroinfiltration of Nicotiana benthamiana leaves with a quantitative dual luciferase reporter system, where firefly luciferase is used to report the microRNA-target sequence interaction and Renilla luciferase is used as an internal standard to normalize expression between replicates. We report the appropriate concentration of N. benthamiana leaf extracts and dilution factor to apply in order to avoid inhibition of firefly LUC activity. Furthermore, the optimal ratio of microRNA precursor expression construct to reporter construct and duration of the incubation period post-agroinfiltration were determined. The optimized dual luciferase assay provides an efficient, repeatable and scalable method to validate and quantify microRNA action on predicted target sequences. The optimized assay was used to validate five predicted targets of rice microRNA miR529b, with as few as six technical replicates. The assay can be extended to assess other small RNA-target sequence interactions, including assessing the functionality of an artificial miRNA or an RNAi construct on a targeted sequence. PMID:28979287
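The dual-reporter arithmetic is simple enough to state exactly: each firefly reading is divided by its co-infiltrated Renilla reading, and repression is the ratio of the normalized non-targeted control to the normalized target construct. The numbers and variable names below are invented for illustration.

```python
def repression(firefly_target, renilla_target, firefly_ctrl, renilla_ctrl):
    """Fold repression of a miRNA-targeted firefly reporter relative to
    a non-targeted control, with each firefly reading normalized to its
    co-infiltrated Renilla internal standard."""
    norm_target = firefly_target / renilla_target
    norm_ctrl = firefly_ctrl / renilla_ctrl
    return norm_ctrl / norm_target

# e.g. target construct: FF=2000, REN=1000; control: FF=9000, REN=1500
fold = repression(2000, 1000, 9000, 1500)  # 3-fold repression
```

Normalizing to Renilla is what absorbs replicate-to-replicate variation in infiltration efficiency, which is why the assay scales to many constructs with few technical replicates.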

  20. Three-dimensional structural modelling and calculation of electrostatic potentials of HLA Bw4 and Bw6 epitopes to explain the molecular basis for alloantibody binding: toward predicting HLA antigenicity and immunogenicity.

    PubMed

    Mallon, Dermot H; Bradley, J Andrew; Winn, Peter J; Taylor, Craig J; Kosmoliaptsis, Vasilis

    2015-02-01

    We have previously shown that qualitative assessment of surface electrostatic potential of HLA class I molecules helps explain serological patterns of alloantibody binding. We have now used a novel computational approach to quantitate differences in surface electrostatic potential of HLA B-cell epitopes and applied this to explain HLA Bw4 and Bw6 antigenicity. Protein structure models of HLA class I alleles expressing either the Bw4 or Bw6 epitope (defined by sequence motifs at positions 77 to 83) were generated using comparative structure prediction. The electrostatic potential in 3-dimensional space encompassing the Bw4/Bw6 epitope was computed by solving the Poisson-Boltzmann equation and quantitatively compared in a pairwise, all-versus-all fashion to produce distance matrices that cluster epitopes with similar electrostatics properties. Quantitative comparison of surface electrostatic potential at the carboxyl terminal of the α1-helix of HLA class I alleles, corresponding to amino acid sequence motif 77 to 83, produced clustering of HLA molecules in 3 principal groups according to Bw4 or Bw6 epitope expression. Remarkably, quantitative differences in electrostatic potential reflected known patterns of serological reactivity better than Bw4/Bw6 amino acid sequence motifs. Quantitative assessment of epitope electrostatic potential allowed the impact of known amino acid substitutions (HLA-B*07:02 R79G, R82L, G83R) that are critical for antibody binding to be predicted. We describe a novel approach for quantitating differences in HLA B-cell epitope electrostatic potential. Proof of principle is provided that this approach enables better assessment of HLA epitope antigenicity than amino acid sequence data alone, and it may allow prediction of HLA immunogenicity.
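The all-versus-all electrostatic comparison reduces to building a distance matrix over sampled potentials. A toy version with hand-made potential vectors is below; in the real work these values come from solving the Poisson-Boltzmann equation over modeled structures, and the allele names and numbers here are purely illustrative.

```python
import math

def potential_distance(phi_a, phi_b):
    """Root-mean-square difference between two electrostatic potentials
    sampled at the same grid points over the epitope region."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(phi_a, phi_b))
                     / len(phi_a))

def distance_matrix(potentials):
    """All-versus-all pairwise comparison, as described in the abstract;
    similar epitopes cluster at small distances."""
    names = list(potentials)
    return {(i, j): potential_distance(potentials[i], potentials[j])
            for i in names for j in names}

# Toy potentials (kT/e) at 4 grid points: two Bw4-like, one Bw6-like.
pots = {
    "B*51 (Bw4)": [1.2, 0.9, -0.4, 0.3],
    "B*44 (Bw4)": [1.1, 1.0, -0.5, 0.2],
    "B*07 (Bw6)": [-0.8, -1.1, 0.6, -0.2],
}
d = distance_matrix(pots)
```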

  1. Efficient multiparticle entanglement via asymmetric Rydberg blockade.

    PubMed

    Saffman, M; Mølmer, K

    2009-06-19

    We present an efficient method for producing N particle entangled states using Rydberg blockade interactions. Optical excitation of Rydberg states that interact weakly, yet have a strong coupling to a second control state is used to achieve state dependent qubit rotations in small ensembles. On the basis of quantitative calculations, we predict that an entangled quantum superposition state of eight atoms can be produced with a fidelity of 84% in cold Rb atoms.

  2. Nanoparticle surface characterization and clustering through concentration-dependent surface adsorption modeling.

    PubMed

    Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E

    2014-09-23

    Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.
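The motivation for concentration-dependent descriptors can be seen in the simplest adsorption model: bound density saturates with solute concentration, so a descriptor fitted at one concentration misrepresents others. The Langmuir form below is used only to illustrate that point; the BSAI models' actual functional forms differ.

```python
def adsorption_density(c, k, n_max):
    """Langmuir-form surface adsorption: bound density as a function of
    solute concentration c, with affinity constant k and surface
    capacity n_max (all units illustrative)."""
    return n_max * c / (k + c)

low = adsorption_density(0.1, 1.0, 5.0)    # dilute: nearly linear in c
high = adsorption_density(100.0, 1.0, 5.0) # concentrated: near n_max
```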

  3. Quantifying the relationship between sequence and three-dimensional structure conservation in RNA

    PubMed Central

    2010-01-01

    Background In recent years, the number of available RNA structures has rapidly grown reflecting the increased interest on RNA biology. Similarly to the studies carried out two decades ago for proteins, which gave the fundamental grounds for developing comparative protein structure prediction methods, we are now able to quantify the relationship between sequence and structure conservation in RNA. Results Here we introduce an all-against-all sequence- and three-dimensional (3D) structure-based comparison of a representative set of RNA structures, which have allowed us to quantitatively confirm that: (i) there is a measurable relationship between sequence and structure conservation that weakens for alignments resulting in below 60% sequence identity, (ii) evolution tends to conserve more RNA structure than sequence, and (iii) there is a twilight zone for RNA homology detection. Discussion The computational analysis here presented quantitatively describes the relationship between sequence and structure for RNA molecules and defines a twilight zone region for detecting RNA homology. Our work could represent the theoretical basis and limitations for future developments in comparative RNA 3D structure prediction. PMID:20550657

  4. Quantitative analysis and prediction of G-quadruplex forming sequences in double-stranded DNA

    PubMed Central

    Kim, Minji; Kreig, Alex; Lee, Chun-Ying; Rube, H. Tomas; Calvert, Jacob; Song, Jun S.; Myong, Sua

    2016-01-01

G-quadruplex (GQ) is a four-stranded DNA structure that can be formed in guanine-rich sequences. GQ structures have been proposed to regulate diverse biological processes including transcription, replication, translation and telomere maintenance. Recent studies have demonstrated the existence of GQ DNA in live mammalian cells and a significant number of potential GQ forming sequences in the human genome. We present a systematic and quantitative analysis of GQ folding propensity on a large set of 438 GQ forming sequences in double-stranded DNA by integrating fluorescence measurement, single-molecule imaging and computational modeling. We find that short minimum loop length and the thymine base are two main factors that lead to high GQ folding propensity. Linear and Gaussian process regression models further validate that the GQ folding potential can be predicted with high accuracy based on the loop length distribution and the nucleotide content of the loop sequences. Our study provides important new parameters that can inform the evaluation and classification of putative GQ sequences in the human genome. PMID:27095201
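The two factors the abstract highlights, short loops and thymine content, can be folded into a logistic sketch of folding propensity. The weights and offset below are invented placeholders, not the paper's fitted regression coefficients.

```python
import math

def gq_folding_score(loop_lengths, loop_seqs, w_len=-0.8, w_t=1.5, b=0.5):
    """Logistic sketch of GQ folding propensity: penalize long loops,
    reward thymine content. Weights are illustrative placeholders."""
    mean_len = sum(loop_lengths) / len(loop_lengths)
    total = sum(len(s) for s in loop_seqs)
    t_frac = sum(s.count("T") for s in loop_seqs) / max(1, total)
    z = b + w_len * (mean_len - 1.0) + w_t * t_frac
    return 1.0 / (1.0 + math.exp(-z))

# Short single-T loops vs long A-rich loops (toy sequences):
short_t = gq_folding_score([1, 1, 1], ["T", "T", "T"])
long_a = gq_folding_score([7, 7, 7], ["AACAGGA", "AAAACAA", "AGAAACA"])
```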

  5. Transforming Boolean models to continuous models: methodology and application to T-cell receptor signaling

    PubMed Central

    Wittmann, Dominik M; Krumsiek, Jan; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Klamt, Steffen; Theis, Fabian J

    2009-01-01

    Background The understanding of regulatory and signaling networks has long been a core objective in Systems Biology. Knowledge about these networks is mainly of qualitative nature, which allows the construction of Boolean models, where the state of a component is either 'off' or 'on'. While often able to capture the essential behavior of a network, these models can never reproduce detailed time courses of concentration levels. Nowadays however, experiments yield more and more quantitative data. An obvious question therefore is how qualitative models can be used to explain and predict the outcome of these experiments. Results In this contribution we present a canonical way of transforming Boolean into continuous models, where the use of multivariate polynomial interpolation allows transformation of logic operations into a system of ordinary differential equations (ODE). The method is standardized and can readily be applied to large networks. Other, more limited approaches to this task are briefly reviewed and compared. Moreover, we discuss and generalize existing theoretical results on the relation between Boolean and continuous models. As a test case a logical model is transformed into an extensive continuous ODE model describing the activation of T-cells. We discuss how parameters for this model can be determined such that quantitative experimental results are explained and predicted, including time-courses for multiple ligand concentrations and binding affinities of different ligands. This shows that from the continuous model we may obtain biological insights not evident from the discrete one. Conclusion The presented approach will facilitate the interaction between modeling and experiments. Moreover, it provides a straightforward way to apply quantitative analysis methods to qualitatively described systems. PMID:19785753
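The core transformation can be illustrated for a single Boolean rule. For X := A AND (NOT B), the multilinear interpolation of the truth table is a*(1-b), which agrees with the Boolean values at the corners of the unit cube; feeding the inputs through a sigmoidal (Hill-type) activation and relaxing X toward the rule's output yields an ODE. This is a minimal sketch in the spirit of the method, not the paper's full construction; the Hill parameters are illustrative.

```python
def hill(x, k=0.5, n=3):
    """Normalized Hill function mapping a continuous activity in [0, 1]
    onto a sigmoidal 'on' response, with hill(0) = 0 and hill(1) = 1."""
    return x ** n / (x ** n + k ** n) * (1 + k ** n)

def simulate(x0, inputs, dt=0.01, steps=2000, tau=1.0):
    """Continuous homolog of the Boolean rule X := A AND (NOT B):
        dX/dt = ( hill(A) * (1 - hill(B)) - X ) / tau,
    integrated with forward Euler; multiplication realizes AND and
    (1 - .) realizes NOT, matching the Boolean table at the corners."""
    a, b = inputs
    x = x0
    for _ in range(steps):
        target = hill(a) * (1.0 - hill(b))
        x += dt * (target - x) / tau
    return x

on = simulate(0.0, (1.0, 0.0))   # A on, B off: X rises toward 1
off = simulate(1.0, (1.0, 1.0))  # both on:     X decays toward 0
```

Unlike the Boolean model, the continuous version also produces time courses and intermediate activity levels, which is exactly what makes fitting to quantitative data possible.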

  6. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop QSAR models, and compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q^2 of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from the QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.

  7. Building gene expression signatures indicative of transcription factor activation to predict AOP modulation

    EPA Science Inventory

Adverse outcome pathways (AOPs) are a framework for predicting quantitative relationships between molecular initiatin...

  8. The Dopamine Prediction Error: Contributions to Associative Models of Reward Learning

    PubMed Central

    Nasser, Helen M.; Calu, Donna J.; Schoenbaum, Geoffrey; Sharpe, Melissa J.

    2017-01-01

    Phasic activity of midbrain dopamine neurons is currently thought to encapsulate the prediction-error signal described in Sutton and Barto’s (1981) model-free reinforcement learning algorithm. This phasic signal is thought to contain information about the quantitative value of reward, which transfers to the reward-predictive cue after learning. This is argued to endow the reward-predictive cue with the value inherent in the reward, motivating behavior toward cues signaling the presence of reward. Yet theoretical and empirical research has implicated prediction-error signaling in learning that extends far beyond a transfer of quantitative value to a reward-predictive cue. Here, we review the research which demonstrates the complexity of how dopaminergic prediction errors facilitate learning. After briefly discussing the literature demonstrating that phasic dopaminergic signals can act in the manner described by Sutton and Barto (1981), we consider how these signals may also influence attentional processing across multiple attentional systems in distinct brain circuits. Then, we discuss how prediction errors encode and promote the development of context-specific associations between cues and rewards. Finally, we consider recent evidence that shows dopaminergic activity contains information about causal relationships between cues and rewards that reflect information garnered from rich associative models of the world that can be adapted in the absence of direct experience. In discussing this research we hope to support the expansion of how dopaminergic prediction errors are thought to contribute to the learning process beyond the traditional concept of transferring quantitative value. PMID:28275359
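The Sutton-Barto-style model-free account discussed above can be written down in a few lines for a single cue-reward pairing: the prediction error is delta = reward - V(cue), and V is nudged by a learning rate times delta. This is the standard Rescorla-Wagner/TD update used as a sketch of the quantitative-value transfer, not a model of the additional dopaminergic functions the review covers.

```python
def rescorla_wagner(trials, alpha=0.2):
    """Trial-by-trial prediction errors for one cue repeatedly paired
    with reward: delta = reward - V; V += alpha * delta. Returns the
    final cue value and the per-trial errors."""
    v = 0.0
    deltas = []
    for reward in trials:
        delta = reward - v
        v += alpha * delta
        deltas.append(delta)
    return v, deltas

# 20 rewarded trials: the error is large early and shrinks as the
# cue acquires the value of the reward.
v, deltas = rescorla_wagner([1.0] * 20)
```

The shrinking error is the signature the review starts from: once the cue fully predicts the reward, phasic signaling to the reward itself fades.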

  9. State-of-the-art radiological techniques improve the assessment of postoperative lung function in patients with non-small cell lung cancer.

    PubMed

    Ohno, Yoshiharu; Koyama, Hisanobu; Nogami, Munenobu; Takenaka, Daisuke; Onishi, Yumiko; Matsumoto, Keiko; Matsumoto, Sumiaki; Maniwa, Yoshimasa; Yoshimura, Masahiro; Nishimura, Yoshihiro; Sugimura, Kazuro

    2011-01-01

The purpose of this study was to compare the predictive capabilities for postoperative lung function in non-small cell lung cancer (NSCLC) patients of state-of-the-art radiological methods, including perfusion MRI, quantitative CT and SPECT/CT, with those of the anatomical method (i.e. qualitative CT) and traditional nuclear medicine methods such as planar imaging and SPECT. Perfusion MRI, CT, nuclear medicine studies and measurements of %FEV1 before and after lung resection were performed for 229 NSCLC patients (125 men and 104 women). Postoperative %FEV1 (po%FEV1) was predicted from semi-quantitatively assessed blood volumes within the total and resected lungs for perfusion MRI, from the functional lung volumes within the total and resected lungs for quantitative CT, from the number of segments in the total and resected lungs for qualitative CT, and from uptakes within the total and resected lungs for the nuclear medicine studies. All SPECTs were automatically co-registered with CTs to prepare the SPECT/CTs. Predicted po%FEV1s were then correlated with actual po%FEV1s, i.e. %FEV1s measured after the operation. The limits of agreement were also evaluated. All predicted po%FEV1s showed good correlation with actual po%FEV1s (0.83≤r≤0.88, p<0.0001). Perfusion MRI, quantitative CT and SPECT/CT demonstrated better correlation than the other methods. The limits of agreement of perfusion MRI (4.4±14.2%), quantitative CT (4.7±14.2%) and SPECT/CT (5.1±14.7%) were smaller than those of qualitative CT (6.0±17.4%), planar imaging (5.8±18.2%) and SPECT (5.5±16.8%). State-of-the-art radiological methods can predict postoperative lung function in NSCLC patients more accurately than traditional methods.
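All the compared methods share one prediction structure: scale the preoperative value by the fraction of functional contribution that survives resection. The sketch below uses segment counts (the qualitative-CT measure); perfusion blood volume, functional lung volume, or tracer uptake substitute for the contribution measure in the other methods. The example numbers are invented.

```python
def predicted_po_fev1(pre_fev1_pct, contribution_total, contribution_resected):
    """Predicted postoperative %FEV1:
        po%FEV1 = pre%FEV1 * (1 - resected_contribution / total_contribution)
    where 'contribution' is whichever functional measure the method uses
    (segments, functional volume, blood volume, or uptake)."""
    return pre_fev1_pct * (1.0 - contribution_resected / contribution_total)

# e.g. 4 of 19 segments resected, preoperative %FEV1 = 85%
po = predicted_po_fev1(85.0, 19.0, 4.0)
```

The methods differ only in how well their contribution measure reflects actual regional function, which is why functional measures (perfusion, uptake) beat pure anatomy in the comparison above.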

  10. Nonesterified fatty acid determination for functional lipidomics: comprehensive ultrahigh performance liquid chromatography-tandem mass spectrometry quantitation, qualification, and parameter prediction.

    PubMed

    Hellmuth, Christian; Weber, Martina; Koletzko, Berthold; Peissner, Wolfgang

    2012-02-07

Despite their central importance for lipid metabolism, straightforward quantitative methods for determination of nonesterified fatty acid (NEFA) species are still missing. The protocol presented here provides unbiased quantitation of plasma NEFA species by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simple deproteination of plasma in organic solvent solution yields high accuracy, covering both the unbound and initially protein-bound fractions while avoiding interference from hydrolysis of esterified fatty acids of other lipid classes. Sample preparation is fast and inexpensive, hence well suited for automation and high-throughput applications. Separation of isotopologic NEFA is achieved using ultrahigh-performance liquid chromatography (UPLC) coupled to triple-quadrupole LC-MS/MS detection. In combination with automated liquid handling, total assay time per sample is less than 15 min. The analytical spectrum extends beyond readily available NEFA standard compounds through a regression model that predicts all the relevant analytical parameters (retention time, ion path settings, and response factor) of NEFA species from chain length and number of double bonds. Detection of 50 NEFA species and accurate quantification of 36 NEFA species in human plasma are described, the highest numbers yet reported for an LC-MS application. Accuracy and precision are within widely accepted limits, and the use of qualifier ions supports unequivocal analyte verification.
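The parameter-prediction idea — extrapolating analytical parameters to species without standards from chain length and double-bond count — can be sketched as a linear model. The linear form is the sketch; the coefficients below are invented placeholders, not the published regression.

```python
def predict_rt(chain_len, n_double_bonds, b0=-2.0, b_c=0.65, b_db=-0.85):
    """Sketch of predicted UPLC retention time (min) for a NEFA species:
    longer chains elute later, each double bond elutes earlier.
    Coefficients are illustrative placeholders, not fitted values."""
    return b0 + b_c * chain_len + b_db * n_double_bonds

rt_18_0 = predict_rt(18, 0)  # stearic-acid-like species
rt_16_0 = predict_rt(16, 0)  # palmitic-acid-like species
rt_18_2 = predict_rt(18, 2)  # linoleic-acid-like species
```

The same regression strategy extends to the other instrument parameters the abstract names (ion path settings, response factors), which is what pushes the assay beyond the set of purchasable standards.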

  11. Diffusion rate limitations in actin-based propulsion of hard and deformable particles.

    PubMed

    Dickinson, Richard B; Purich, Daniel L

    2006-08-15

The mechanism by which actin polymerization propels intracellular vesicles and invasive microorganisms remains an open question. Several recent quantitative studies have examined propulsion of biomimetic particles such as polystyrene microspheres, phospholipid vesicles, and oil droplets. In addition to allowing quantitative measurement of parameters such as the dependence of particle speed on size, these systems have also revealed characteristic behaviors such as saltatory motion of hard particles and oscillatory deformation of soft particles. Such measurements and observations provide tests for proposed mechanisms of actin-based motility. In the actoclampin filament end-tracking motor model, particle-surface-bound filament end-tracking proteins carry out load-insensitive processive insertion of actin subunits onto elongating filament plus-ends that are persistently tethered to the surface. In contrast, the tethered-ratchet model assumes that working filaments are untethered and that the free-ended filaments grow as thermal ratchets in a load-sensitive manner. This article presents a model for the diffusion and consumption of actin monomers during actin-based particle propulsion to predict the monomer concentration field around motile particles. The results suggest that the various behaviors of biomimetic particles, including dynamic saltatory motion of hard particles and oscillatory vesicle deformations, can be quantitatively and self-consistently explained by load-insensitive, diffusion-limited elongation of (+)-end-tethered actin filaments, consistent with predictions of the actoclampin filament end-tracking mechanism.
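The diffusion-limited picture can be made concrete with the steady-state point-sink solution of the diffusion equation, under the simplifying assumptions of a spherically symmetric infinite medium and constant consumption rate (a textbook form, not the article's full model):

```python
import math

def monomer_concentration(r, c_inf, consumption_rate, diff_coeff):
    """Steady-state monomer concentration at distance r from a particle
    consuming monomers at rate Q, treated as a point sink in an
    infinite medium:  c(r) = c_inf - Q / (4*pi*D*r).
    Units are illustrative and must simply be kept consistent."""
    return c_inf - consumption_rate / (4.0 * math.pi * diff_coeff * r)

# Depletion is strongest at the particle surface and fades with r,
# so larger (faster-consuming) particles see lower local monomer levels.
near = monomer_concentration(0.5, 10.0, 5.0, 30.0)
far = monomer_concentration(5.0, 10.0, 5.0, 30.0)
```

Because elongation speed tracks the local monomer concentration at the surface, this depletion is the lever by which diffusion limitation couples particle size to speed.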

  12. Electric Potential and Electric Field Imaging with Dynamic Applications & Extensions

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2017-01-01

    The technology and methods for remote quantitative imaging of electrostatic potentials and electrostatic fields in and around objects and in free space are presented. Electric field imaging (EFI) technology may be applied to characterize intrinsic or existing electric potentials and electric fields, or an externally generated electrostatic field may be used for volumes to be inspected with EFI. The baseline sensor technology (e-Sensor) and its construction, optional electric field generation (quasi-static generator), and current e-Sensor enhancements (ephemeral e-Sensor) are discussed. Critical design elements of current linear and real-time two-dimensional (2D) measurement systems are highlighted, and the development of a three-dimensional (3D) EFI system is presented. Demonstrations for structural, electronic, human, and memory applications are shown. Recent work demonstrates that phonons may be used to create and annihilate electric dipoles within structures. Phonon-induced dipoles are ephemeral, and their polarization, strength, and location may be quantitatively characterized by EFI, providing a new subsurface Phonon-EFI imaging technology. Results from real-time imaging of combustion and ion flow, and their measurement complications, will be discussed. Extensions to environmental, space, and subterranean applications will be presented, and initial results for quantitatively characterizing material properties are shown. A wearable EFI system has been developed by using fundamental EFI concepts. These new EFI capabilities are demonstrated to characterize electric charge distribution, creating a new field of study embracing areas of interest including electrostatic discharge (ESD) mitigation, manufacturing quality control, crime scene forensics, design and materials selection for advanced sensors, combustion science, on-orbit space potential, container inspection, remote characterization of electronic circuits and level of activation, dielectric morphology of structures, tether integrity, organic molecular memory, atmospheric science, weather prediction, earthquake prediction, and medical diagnostic and treatment efficacy applications such as cardiac polarization wave propagation and electromyography imaging.

  13. Predicting Future Morphological Changes of Lesions from Radiotracer Uptake in 18F-FDG-PET Images

    PubMed Central

    Bagci, Ulas; Yao, Jianhua; Miller-Jaster, Kirsten; Chen, Xinjian; Mollura, Daniel J.

    2013-01-01

    We introduce a novel computational framework to enable automated identification of texture and shape features of lesions on 18F-FDG-PET images through a graph-based image segmentation method. The proposed framework predicts future morphological changes of lesions with high accuracy. The presented methodology has several benefits over conventional qualitative and semi-quantitative methods, due to its fully quantitative nature and high accuracy in each step of (i) detection, (ii) segmentation, and (iii) feature extraction. To evaluate our proposed computational framework, thirty patients received two 18F-FDG-PET scans (60 scans in total) at two different time points. Metastatic papillary renal cell carcinoma, cerebellar hemangioblastoma, non-small cell lung cancer, neurofibroma, lymphomatoid granulomatosis, lung neoplasm, neuroendocrine tumor, soft tissue thoracic mass, nonnecrotizing granulomatous inflammation, renal cell carcinoma with papillary and cystic features, diffuse large B-cell lymphoma, metastatic alveolar soft part sarcoma, and small cell lung cancer were included in this analysis. The radiotracer accumulation in patients' scans was automatically detected and segmented by the proposed segmentation algorithm. Delineated regions were used to extract shape and textural features, with the proposed adaptive feature extraction framework, as well as standardized uptake values (SUV) of uptake regions, to conduct a broad quantitative analysis. Evaluation of segmentation results indicates that our proposed segmentation algorithm has a mean Dice similarity coefficient of 85.75±1.75%. We found that 28 of 68 extracted imaging features correlated well with SUVmax (p<0.05), and some of the textural features (such as entropy and maximum probability) were superior in predicting morphological changes of radiotracer uptake regions longitudinally, compared to a single intensity feature such as SUVmax.
We also found that integrating textural features with SUV measurements significantly improves the prediction accuracy of morphological changes (Spearman correlation coefficient = 0.8715, p<2e-16). PMID:23431398
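    The Dice similarity coefficient used above to score segmentation quality has a compact definition: twice the size of the intersection of two delineations divided by the sum of their sizes. A minimal sketch on hypothetical voxel sets (the regions below are invented for illustration):

```python
def dice(a, b):
    """Dice similarity coefficient between two sets of voxel coordinates."""
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

# Hypothetical delineations: a 10x10 reference region and an automated result
reference = {(x, y) for x in range(10) for y in range(10)}
automated = {(x, y) for x in range(1, 10) for y in range(10)}
score = dice(reference, automated)
```

    A mean coefficient of 85.75%, as reported above, therefore indicates that the automated delineations recover the large majority of each reference region.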

  14. Quantitative prediction of oral cancer risk in patients with oral leukoplakia.

    PubMed

    Liu, Yao; Li, Yicheng; Fu, Yue; Liu, Tong; Liu, Xiaoyong; Zhang, Xinyan; Fu, Jie; Guan, Xiaobing; Chen, Tong; Chen, Xiaoxin; Sun, Zheng

    2017-07-11

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma. We previously developed an oral cancer risk index using the DNA index value to quantitatively assess cancer risk in patients with oral leukoplakia, but with limited success. In order to improve the performance of the risk index, we collected exfoliative cytology, histopathology, and clinical follow-up data from two independent cohorts of normal, leukoplakia, and cancer subjects (a training set and a validation set). Peaks were defined on the basis of positive first derivatives, and modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Random forest was found to be the best model, with high sensitivity (100%) and specificity (99.2%). Using the Peaks-Random Forest model, we constructed an index (OCRI2) as a quantitative measurement of cancer risk. Among 11 leukoplakia patients with an OCRI2 over 0.5, 4 (36.4%) developed cancer during follow-up (23 ± 20 months), whereas 3 (5.3%) of 57 leukoplakia patients with an OCRI2 below 0.5 developed cancer (32 ± 31 months). OCRI2 outperformed other methods in predicting oral squamous cell carcinoma during follow-up. In conclusion, we have developed an exfoliative cytology-based method for quantitative prediction of cancer risk in patients with oral leukoplakia.
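    The percentages quoted above follow directly from the reported counts; as a quick arithmetic check (the counts come from the abstract, the code is only illustrative):

```python
# Progression-to-cancer counts reported in the abstract, by OCRI2 threshold
high_risk_total, high_risk_progressed = 11, 4   # patients with OCRI2 > 0.5
low_risk_total, low_risk_progressed = 57, 3     # patients with OCRI2 < 0.5

rate_high = 100.0 * high_risk_progressed / high_risk_total  # ~36.4%
rate_low = 100.0 * low_risk_progressed / low_risk_total     # ~5.3%
```

    The roughly sevenfold difference between the two progression rates is what makes the 0.5 cutoff a useful risk stratifier.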

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavlickova, Katarina; Vyskupova, Monika, E-mail: vyskupova@fns.uniba.sk

    Cumulative environmental impact assessment is only occasionally used in practical environmental impact assessment (EIA) processes. The main reasons are the difficulty of cumulative impact identification caused by a lack of data, the inability to measure the intensity and spatial effect of all types of impacts, and the uncertainty of their future evolution. This work presents a proposed method to predict cumulative impacts on the basis of landscape vulnerability evaluation. For this purpose, a qualitative assessment of landscape ecological stability is conducted, and major vulnerability indicators of environmental and socio-economic receptors are specified and valuated. Potential cumulative impacts and the overall impact significance are predicted quantitatively in modified Argonne multiple matrixes while considering the vulnerability of affected landscape receptors and the significance of impacts identified individually. The method was employed in a concrete EIA process conducted in Slovakia. The results obtained in this case study reflect that the methodology is simple to apply, valid for all types of impacts and projects, inexpensive, and not time-consuming. The objectivity of the partial methods used in this procedure is improved by quantitative landscape ecological stability evaluation, assignment of weights to vulnerability indicators based on the detailed characteristics of affected factors, and grading of impact significance. - Highlights: • This paper suggests a method proposal for cumulative impact prediction. • The method includes landscape vulnerability evaluation. • The vulnerability of affected receptors is determined by their sensitivity. • This method can increase the objectivity of impact prediction in the EIA process.

  16. Assessing non-additive effects in GBLUP model.

    PubMed

    Vieira, I C; Dos Santos, J P R; Pires, L P M; Lima, B M; Gonçalves, F M A; Balestre, M

    2017-05-10

    Understanding non-additive effects in the expression of quantitative traits is very important in genotype selection, especially in species whose commercial products are clones or hybrids. The use of molecular markers has allowed the study of non-additive genetic effects at the genomic level, in addition to a better understanding of their importance for quantitative traits. Thus, the purpose of this study was to evaluate the behavior of the GBLUP model under different genetic models and relationship matrices, and their influence on the estimates of genetic parameters. We used real data on circumference at breast height in Eucalyptus spp. and simulated data from an F2 population. Three kinship structures commonly reported in the literature were adopted. The simulation results showed that the inclusion of epistatic kinship improved prediction estimates of genomic breeding values. However, the non-additive effects were not accurately recovered. The Fisher information matrix for the real dataset showed high collinearity among estimates of additive, dominance, and epistatic variance, causing no gain in the prediction of unobserved data as well as convergence problems. Estimates of genetic parameters and correlations differed across the kinship structures. Our results show that the inclusion of non-additive effects can improve the predictive ability, or even the prediction of additive effects. However, the high distortions observed in the variance estimates when the Hardy-Weinberg equilibrium assumption is violated, due to the presence of selection or inbreeding, can lead to zero gain in models that consider epistasis in the genomic kinship.
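    The additive genomic kinship at the heart of GBLUP is commonly computed as VanRaden's G matrix from centered marker genotypes. A minimal sketch on hypothetical 0/1/2 genotype codes (the study's actual kinship structures may differ):

```python
# Hypothetical 0/1/2 genotype codes for four individuals at five markers
genotypes = [
    [0, 1, 2, 1, 0],
    [1, 1, 2, 0, 0],
    [2, 0, 1, 1, 1],
    [0, 2, 2, 1, 0],
]
n, m = len(genotypes), len(genotypes[0])

# Allele frequencies per marker, then centering of the genotype matrix
p = [sum(g[j] for g in genotypes) / (2.0 * n) for j in range(m)]
Z = [[genotypes[i][j] - 2.0 * p[j] for j in range(m)] for i in range(n)]

# Additive genomic relationship matrix G = ZZ' / (2 * sum_j p_j (1 - p_j))
denom = 2.0 * sum(pj * (1.0 - pj) for pj in p)
G = [[sum(Z[i][k] * Z[j][k] for k in range(m)) / denom for j in range(n)] for i in range(n)]
```

    Epistatic (additive-by-additive) kinships are commonly built from G itself, e.g. as its element-wise (Hadamard) square, which is one source of the collinearity among variance components discussed above.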

  17. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has reduced considerably, with the ability to map minor and trace elements very accurately due to the larger detector area and higher count-rate detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship (scatter diagram) creation, elemental ratio mapping, chemical phase mapping (CPM), and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow users to predict and verify where they are likely to have problems in their images, and are especially helpful for examining possible interface artefacts. Post-processing techniques to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.

  18. Skill Assessment of An Hybrid Technique To Estimate Quantitative Precipitation Forecast For Galicia (nw Spain)

    NASA Astrophysics Data System (ADS)

    Lage, A.; Taboada, J. J.

    Precipitation is the most obvious of the weather elements in its effects on normal life. Numerical weather prediction (NWP) is generally used to produce quantitative precipitation forecasts (QPF) beyond the 1-3 h time frame. These models often fail to predict small-scale variations of rain because of spin-up problems and their coarse spatial and temporal resolution (Antolik, 2000). Moreover, there are some uncertainties about the behaviour of NWP models in extreme situations (de Bruijn and Brandsma, 2000). Hybrid techniques, combining the benefits of NWP and statistical approaches in a flexible way, are very useful for achieving a good QPF. In this work, a new QPF technique for Galicia (NW Spain) is presented. This region has a percentage of rainy days per year greater than 50%, with quantities that may cause floods, with human and economic damages. The technique is composed of a NWP model (ARPS) and a statistical downscaling process based on an automated classification scheme of atmospheric circulation patterns for the Iberian Peninsula (J. Ribalaygua and R. Boren, 1995). Results show that QPF for Galicia is improved using this hybrid technique. [1] Antolik, M.S. 2000. "An Overview of the National Weather Service's centralized statistical quantitative precipitation forecasts". Journal of Hydrology, 239, pp. 306-337. [2] de Bruijn, E.I.F. and T. Brandsma. "Rainfall prediction for a flooding event in Ireland caused by the remnants of Hurricane Charley". Journal of Hydrology, 239, pp. 148-161. [3] Ribalaygua, J. and Boren, R. "Clasificación de patrones espaciales de precipitación diaria sobre la España Peninsular" ["Classification of spatial patterns of daily precipitation over peninsular Spain"]. Informes N 3 y 4 del Servicio de Análisis e Investigación del Clima. Instituto Nacional de Meteorología. Madrid. 53 pp.

  19. Quantitative precipitation forecasts in the Alps - an assessment from the Forecast Demonstration Project MAP D-PHASE

    NASA Astrophysics Data System (ADS)

    Ament, F.; Weusthoff, T.; Arpagaus, M.; Rotach, M.

    2009-04-01

    The main aim of the WWRP Forecast Demonstration Project MAP D-PHASE is to demonstrate the performance of today's models in forecasting heavy precipitation and flood events in the Alpine region. To this end, an end-to-end, real-time forecasting system was installed and operated during the D-PHASE Operations Period from June to November 2007. Part of this system is a set of 30 numerical weather prediction models (deterministic as well as ensemble systems) operated by weather services and research institutes, which issue alerts if predicted precipitation accumulations exceed critical thresholds. In addition to the real-time alerts, all relevant model fields of these simulations are stored in a central data archive. This comprehensive data set allows a detailed assessment of today's quantitative precipitation forecast (QPF) performance in the Alpine region. We will present results of QPF verification against Swiss radar and rain gauge data, both from a qualitative point of view, in terms of alerts, and from a quantitative perspective, in terms of precipitation rate. Various influencing factors, such as lead time, accumulation time, selection of warning thresholds, and bias corrections, will be discussed. In addition to traditional verification of area-average precipitation amounts, the ability of the models to predict the correct precipitation statistics without requiring a point-to-point match will be described using modern fuzzy verification techniques. Both analyses reveal significant advantages of deep-convection-resolving models compared to coarser models with parameterized convection. An intercomparison of the model forecasts themselves reveals remarkably high variability between different models, and makes it worthwhile to evaluate the potential of a multi-model ensemble. Various multi-model ensemble strategies will be tested by combining D-PHASE models into virtual ensemble systems.

  20. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and the prioritization of research to provide a technical basis both for the standards and for the analysis related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.

  1. Impact of implementation choices on quantitative predictions of cell-based computational models

    NASA Astrophysics Data System (ADS)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
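    The time-step sensitivity described above is easy to see even in a one-dimensional caricature of the overdamped vertex update x(t+Δt) = x(t) + Δt·F/η used in many vertex-model implementations. The parameters below are hypothetical; the point is only that explicit Euler becomes unstable once Δt exceeds 2η/k for a linear restoring force:

```python
def relax(x0, dt, steps, k=1.0, eta=1.0):
    """Forward-Euler integration of the overdamped relaxation x' = -(k/eta) x."""
    x = x0
    for _ in range(steps):
        x += dt * (-k / eta) * x  # one vertex-model-style force step
    return x

stable = abs(relax(1.0, dt=0.1, steps=200))    # dt well below 2*eta/k: decays toward 0
unstable = abs(relax(1.0, dt=2.5, steps=200))  # dt above 2*eta/k: grows without bound
```

    Effects such as suppressed cell rearrangement at large time steps are subtler than outright divergence, but they stem from the same source: the discrete update no longer tracks the continuous dynamics.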

  2. Validity and validation of expert (Q)SAR systems.

    PubMed

    Hulzebos, E; Sijm, D; Traas, T; Posthumus, R; Maslankiewicz, L

    2005-08-01

    At a recent workshop in Setubal (Portugal) principles were drafted to assess the suitability of (quantitative) structure-activity relationships ((Q)SARs) for assessing the hazards and risks of chemicals. In the present study we applied some of the Setubal principles to test the validity of three (Q)SAR expert systems and validate the results. These principles include a mechanistic basis, the availability of a training set and validation. ECOSAR, BIOWIN and DEREK for Windows have a mechanistic or empirical basis. ECOSAR has a training set for each QSAR. For half of the structural fragments the number of chemicals in the training set is >4. Based on structural fragments and log Kow, ECOSAR uses linear regression to predict ecotoxicity. Validating ECOSAR for three 'valid' classes results in predictivity of ≥64%. BIOWIN uses (non-)linear regressions to predict the probability of biodegradability based on fragments and molecular weight. It has a large training set and predicts non-ready biodegradability well. DEREK for Windows predictions are supported by a mechanistic rationale and literature references. The structural alerts in this program have been developed with a training set of positive and negative toxicity data. However, to support the prediction only a limited number of chemicals in the training set is presented to the user. DEREK for Windows predicts effects by 'if-then' reasoning. The program predicts best for mutagenicity and carcinogenicity. Each structural fragment in ECOSAR and DEREK for Windows needs to be evaluated and validated separately.

  3. CADASTER QSPR Models for Predictions of Melting and Boiling Points of Perfluorinated Chemicals.

    PubMed

    Bhhatarai, Barun; Teetz, Wolfram; Liu, Tao; Öberg, Tomas; Jeliazkova, Nina; Kochev, Nikolay; Pukalov, Ognyan; Tetko, Igor V; Kovarich, Simona; Papa, Ester; Gramatica, Paola

    2011-03-14

    Quantitative structure-property relationship (QSPR) studies of melting point (MP) and boiling point (BP) for per- and polyfluorinated chemicals (PFCs) are presented. The training and prediction chemicals used for developing and validating the models were selected from the Syracuse PhysProp database and the literature. The available experimental data sets were split in two different ways: a) random selection on response value, and b) structural similarity verified by self-organizing map (SOM), in order to propose reliable predictive models, developed only on the training sets and externally verified on the prediction sets. Individual models based on linear and non-linear approaches, developed by different CADASTER partners using 0D-2D Dragon descriptors, E-state descriptors, and fragment-based descriptors, as well as a consensus model and their predictions, are presented. In addition, the predictive performance of the developed models was verified on a blind external validation set (EV-set) prepared using the PERFORCE database, comprising 15 MP and 25 BP values, respectively. This database contains only long-chain perfluoroalkylated chemicals, particularly monitored by regulatory agencies such as US-EPA and EU-REACH. QSPR models with internal and external validation on two different external prediction/validation sets, and a study of the applicability domain highlighting the robustness and high accuracy of the models, are discussed. Finally, MPs for an additional 303 PFCs and BPs for 271 PFCs, for which experimental measurements are unknown, were predicted. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  5. Quantitative chest computed tomography as a means of predicting exercise performance in severe emphysema.

    PubMed

    Crausman, R S; Ferguson, G; Irvin, C G; Make, B; Newell, J D

    1995-06-01

    We assessed the value of quantitative high-resolution computed tomography (CT) as a diagnostic and prognostic tool in smoking-related emphysema. We performed an inception cohort study of 14 patients referred with emphysema. The diagnosis of emphysema was based on a compatible history, physical examination, chest radiograph, CT scan of the lung, and pulmonary physiologic evaluation. As a group, those who underwent exercise testing were hyperinflated (percentage predicted total lung capacity ± standard error of the mean = 133 ± 9%), and there was evidence of air trapping (percentage predicted residual volume = 318 ± 31%) and airflow limitation (forced expiratory volume in 1 sec [FEV1] = 40 ± 7%). The exercise performance of the group was severely limited (maximum achievable workload = 43 ± 6%) and was characterized by prominent ventilatory, gas exchange, and pulmonary vascular abnormalities. The quantitative CT index was markedly elevated in all patients (76 ± 9; n = 14; normal < 4). There were correlations between this quantitative CT index and measures of airflow limitation (FEV1: r2 = .34, p = .09; FEV1/forced vital capacity: r2 = .46, p = .04), and between the CT index and both maximum workload achieved (r2 = .93, p = .0001) and maximum oxygen utilization (r2 = .83, p = .0007). Quantitative chest CT assessment of disease severity is correlated with the degree of airflow limitation and exercise impairment in pulmonary emphysema.
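    The r2 values quoted above are squared Pearson correlations from simple linear regression; a self-contained helper clarifies the computation. The CT-index and workload pairs below are hypothetical illustrations, not the study's data:

```python
def r_squared(x, y):
    """Squared Pearson correlation (coefficient of determination for simple regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical CT-index vs. percent-predicted maximum workload pairs
ct_index = [40, 55, 70, 85, 100]
workload = [70, 58, 49, 35, 26]
r2 = r_squared(ct_index, workload)  # close to 1 for a nearly linear relationship
```

    An r2 of .93, as reported for workload, means 93% of the variance in exercise capacity is explained by the CT index alone.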

  6. Fitness to work of astronauts in conditions of action of the extreme emotional factors

    NASA Astrophysics Data System (ADS)

    Prisniakova, L. M.

    2004-01-01

    A theoretical model for the quantitative determination of the influence of the level of emotional exertion on the success of human activity is presented. Learning curves for fixed words in groups with different levels of emotional exertion are analyzed. The obtained magnitudes of the time constant T, which depend on the type of emotional exertion, provide a quantitative measure of the emotional exertion. These time constants could also be used to predict the fitness to work of an astronaut under extreme conditions. A reversal of the sign of the influence on the efficiency of human activity is also detected. The paper offers a mathematical model of the relation between successful activity and motivation or emotional exertion (the Yerkes-Dodson law). The proposed models can serve as the theoretical basis for quantitative characteristics used in evaluating the activity of astronauts under emotional factors at the selection phase.

  7. Fitness to work of astronauts in conditions of action of the extreme emotional factors.

    PubMed

    Prisniakova, L M

    2004-01-01

    A theoretical model for the quantitative determination of the influence of the level of emotional exertion on the success of human activity is presented. Learning curves for fixed words in groups with different levels of emotional exertion are analyzed. The obtained magnitudes of the time constant T, which depend on the type of emotional exertion, provide a quantitative measure of the emotional exertion. These time constants could also be used to predict the fitness to work of an astronaut under extreme conditions. A reversal of the sign of the influence on the efficiency of human activity is also detected. The paper offers a mathematical model of the relation between successful activity and motivation or emotional exertion (the Yerkes-Dodson law). The proposed models can serve as the theoretical basis for quantitative characteristics used in evaluating the activity of astronauts under emotional factors at the selection phase. Published by Elsevier Ltd on behalf of COSPAR.
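    The time constant T characterizing the learning curves in both versions of this record can be illustrated with a first-order exponential model of recall, y(t) = 1 − exp(−t/T): T is roughly the trial at which recall reaches 1 − 1/e ≈ 63.2% of its asymptote. The group labels and T values below are hypothetical, chosen only to show how a larger T (slower learning) could mark higher exertion:

```python
import math

def learning_curve(T, trials):
    """First-order exponential approach to full recall with time constant T."""
    return [1.0 - math.exp(-t / T) for t in range(1, trials + 1)]

def estimate_T(curve):
    """Time constant ~ first trial at which recall exceeds 1 - 1/e of the asymptote."""
    target = 1.0 - math.exp(-1.0)
    for t, y in enumerate(curve, start=1):
        if y >= target:
            return t
    return None

calm = learning_curve(T=3.0, trials=30)      # hypothetical low-exertion group
stressed = learning_curve(T=8.0, trials=30)  # hypothetical high-exertion group
```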

  8. Calibration of diatom-pH-alkalinity methodology for the interpretation of the sedimentary record in Emerald Lake Integrated watershed study. Final report, 6 May 1985-10 October 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmes, R.W.

    1986-10-10

    The present study was designed to establish quantitative relationships between lake air-equilibrated pH, alkalinity, and diatoms occurring in the surface sediments of high-elevation Sierra Nevada lakes. These relationships provided the necessary information to develop predictive equations relating lake pH to the composition of surface-sediment diatom assemblages in 27 study lakes. Using the Hustedt diatom pH classification system, Index B of Renberg and Hellberg, and multiple linear regression analysis, two equations were developed that predict lake pH from the relative abundance of sediment diatoms occurring in each of four diatom pH groupings.

  9. Ultra-sparse dielectric nanowire grids as wideband reflectors and polarizers.

    PubMed

    Yoon, Jae Woong; Lee, Kyu Jin; Magnusson, Robert

    2015-11-02

    Engaging both theory and experiment, we investigate resonant photonic lattices in which the duty cycle tends to zero. The corresponding dielectric nanowire grids are mostly empty space if operated as membranes in vacuum or air. These grids are shown to be effective wideband reflectors with impressive polarizing properties. We provide computed results predicting nearly complete reflection and attendant polarization extinction in multiple spectral regions. Experimental results with Si nanowire arrays with 10% duty cycle show a ~200-nm-wide band of high reflection for one polarization state and free transmission for the orthogonal state. These results agree quantitatively with theoretical predictions. It is of fundamental significance that the wideband spectral expressions presented can be generated in such minimal systems.

  10. Kernel-based whole-genome prediction of complex traits: a review.

    PubMed

    Morota, Gota; Gianola, Daniel

    2014-01-01

    Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.
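    A concrete example of a non-parametric kernel of the kind reviewed above is the Gaussian (RBF) kernel over marker genotypes; the resulting kernel matrix K then enters, for example, kernel ridge regression or RKHS regression to capture total (including non-additive) genetic variation. The genotypes and bandwidth below are hypothetical:

```python
import math

def rbf_kernel(X, bandwidth=1.0):
    """Gaussian kernel matrix K[i][j] = exp(-||x_i - x_j||^2 / bandwidth)."""
    n = len(X)
    K = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            d2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
            K[i][j] = math.exp(-d2 / bandwidth)
    return K

# Hypothetical 0/1/2 marker genotypes for four individuals
markers = [[0, 1, 2], [1, 1, 2], [2, 0, 1], [0, 2, 2]]
K = rbf_kernel(markers, bandwidth=3.0)
```

    Unlike the linear (additive) kernel, similarity here decays non-linearly with genotypic distance, which is one way such methods implicitly capture interactions among loci.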

  11. Atmospheric Effects of Subsonic Aircraft: Interim Assessment Report of the Advanced Subsonic Technology Program

    NASA Technical Reports Server (NTRS)

    Friedl, Randall R. (Editor)

    1997-01-01

    This first interim assessment of the Subsonic Assessment (SASS) project attempts to summarize concisely the status of our knowledge concerning the impacts of present and future subsonic aircraft fleets. It also highlights the major areas of scientific uncertainty, through review of existing databases and model-based sensitivity studies. In view of the need for substantial improvements in both model formulations and experimental databases, this interim assessment cannot provide confident numerical predictions of aviation impacts. However, a number of quantitative estimates are presented, which provide some guidance to policy makers.

  12. Acquisition and extinction in autoshaping.

    PubMed

    Kakade, Sham; Dayan, Peter

    2002-07-01

    C. R. Gallistel and J. Gibbon (2000) presented quantitative data on the speed with which animals acquire behavioral responses during autoshaping, together with a statistical model of learning intended to account for them. Although this model captures the form of the dependencies among critical variables, its detailed predictions are substantially at variance with the data. In the present article, further key data on the speed of acquisition are used to motivate an alternative model of learning, in which animals can be interpreted as paying different amounts of attention to stimuli according to estimates of their differential reliabilities as predictors.

  13. Near infrared spectroscopy as an on-line method to quantitatively determine glycogen and predict ultimate pH in pre rigor bovine M. longissimus dorsi.

    PubMed

    Lomiwes, D; Reis, M M; Wiklund, E; Young, O A; North, M

    2010-12-01

    The potential of near infrared (NIR) spectroscopy as an on-line method to quantify glycogen and predict ultimate pH (pH(u)) of pre rigor beef M. longissimus dorsi (LD) was assessed. NIR spectra (538 to 1677 nm) of pre rigor LD from steers, cows and bulls were collected early post mortem, and measurements were made of pre rigor glycogen concentration and pH(u). Spectral and measured data were combined to develop models to quantify glycogen and predict the pH(u) of pre rigor LD. Predicted values obtained from the quantitative models were poorly correlated with measured glycogen and pH(u) (r(2)=0.23 and 0.20, respectively). Qualitative models developed to categorize each muscle according to its pH(u) were able to correctly categorize 42% of high-pH(u) samples. Optimum qualitative and quantitative models derived from NIR spectra thus showed low correlation between predicted values and reference measurements. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.

  14. Using metal-ligand binding characteristics to predict metal toxicity: quantitative ion character-activity relationships (QICARs).

    PubMed Central

    Newman, M C; McCloskey, J T; Tatara, C P

    1998-01-01

    Ecological risk assessment can be enhanced with predictive models for metal toxicity. Modeling of published data was done under the simplifying assumption that intermetal trends in toxicity reflect relative metal-ligand complex stabilities. This idea has been invoked successfully since 1904 but has yet to be applied widely in quantitative ecotoxicology. Intermetal trends in toxicity were successfully modeled with ion characteristics reflecting metal binding to ligands for a wide range of effects. Most models were useful for predictive purposes based on an F-ratio criterion and cross-validation, but anomalous predictions did occur if speciation was ignored. In general, models for metals with the same valence (i.e., divalent metals) were better than those combining mono-, di-, and trivalent metals. The softness parameter (σp) and the absolute value of the log of the first hydrolysis constant (|log KOH|) were especially useful in model construction. Also, ΔE0 contributed substantially to several of the two-variable models. In contrast, quantitative attempts to predict metal interactions in binary mixtures based on metal-ligand complex stabilities were not successful. PMID:9860900
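    The one- and two-variable QICAR models described above amount to ordinary least squares regression of a toxicity metric on ion characteristics such as the softness parameter and |log KOH|. A minimal sketch with invented illustrative values (the numbers below are not the paper's data):

```python
import numpy as np

# Hypothetical ion characteristics for six divalent metals
# (illustrative values only, not measured data):
# columns: softness parameter sigma_p, |log K_OH|
ions = np.array([
    [0.10, 11.4],
    [0.13, 10.6],
    [0.21,  9.0],
    [0.26,  7.7],
    [0.30,  7.6],
    [0.38,  4.0],
])
log_toxicity = np.array([-1.2, -0.8, 0.1, 0.6, 0.9, 2.0])  # e.g. log(1/EC50)

# Two-variable QICAR: log_toxicity ~ b0 + b1*sigma_p + b2*|log K_OH|
A = np.column_stack([np.ones(len(ions)), ions])
coef, res, rank, _ = np.linalg.lstsq(A, log_toxicity, rcond=None)
pred = A @ coef
ss_res = ((log_toxicity - pred) ** 2).sum()
ss_tot = ((log_toxicity - log_toxicity.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot   # goodness of fit of the QICAR
```

In practice such a fit would be screened with an F-ratio criterion and cross-validation, as the paper describes, before being used predictively.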

  15. Determination of quantitative trait variants by concordance via application of the a posteriori granddaughter design to the U.S. Holstein population

    USDA-ARS?s Scientific Manuscript database

    Experimental designs that exploit family information can provide substantial predictive power in quantitative trait variant discovery projects. Concordance between quantitative trait locus genotype as determined by the a posteriori granddaughter design and marker genotype was determined for 29 trai...

  16. Qualitative Versus Quantitative Social Support as a Predictor of Depression in the Elderly.

    ERIC Educational Resources Information Center

    Chwalisz, Kathleen D.; And Others

    This study examined the relationship between qualitative and quantitative indicators of social support in the prediction of depression. Quantitative indicators were examined with regard to their direct effects on depression as well as their indirect effects through their relationship to perceived social support. Subjects were 301…

  17. Predictive model for convective flows induced by surface reactivity contrast

    NASA Astrophysics Data System (ADS)

    Davidson, Scott M.; Lammertink, Rob G. H.; Mani, Ali

    2018-05-01

    Concentration gradients in a fluid adjacent to a reactive surface due to contrast in surface reactivity generate convective flows. These flows result from contributions by electro- and diffusio-osmotic phenomena. In this study, we have analyzed reactive patterns that release and consume protons, analogous to bimetallic catalytic conversion of peroxide. Similar systems have typically been studied using either scaling analysis to predict trends or costly numerical simulation. Here, we present a simple analytical model, bridging the gap in quantitative understanding between scaling relations and simulations, to predict the induced potentials and consequent velocities in such systems without the use of any fitting parameters. Our model is tested against direct numerical solutions to the coupled Poisson, Nernst-Planck, and Stokes equations. Predicted slip velocities from the model and simulations agree to within a factor of ≈2 over a multiple order-of-magnitude change in the input parameters. Our analysis can be used to predict enhancement of mass transport and the resulting impact on overall catalytic conversion, and is also applicable to predicting the speed of catalytic nanomotors.

  18. Baseline correction combined partial least squares algorithm and its application in on-line Fourier transform infrared quantitative analysis.

    PubMed

    Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping

    2011-04-01

    In order to eliminate lower-order polynomial interferences, a new quantitative calibration algorithm, "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weight selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirements of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. The BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with an RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.
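    BCC-PLS itself modifies how the PLS weights are chosen; the conventional PLS1 core it builds on can be sketched as follows (NIPALS form over toy "spectra", an illustrative sketch rather than the authors' implementation):

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Conventional single-response PLS (PLS1) via NIPALS deflation.

    Returns the regression vector b and intercept b0, so that
    predictions are Xnew @ b + b0."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)        # weight vector
        t = Xc @ w                    # score vector
        tt = t @ t
        p = Xc.T @ t / tt             # loading
        qa = yc @ t / tt
        Xc = Xc - np.outer(t, p)      # deflate spectra
        yc = yc - qa * t              # deflate response
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)
    b0 = y.mean() - X.mean(axis=0) @ b
    return b, b0

# Toy "spectra": 30 samples x 40 wavenumbers; response driven by 3 directions.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 40))
y = X[:, :3] @ np.array([1.0, -0.5, 0.3])
b, b0 = pls1_fit(X, y, n_components=3)
rmse = np.sqrt(np.mean((X @ b + b0 - y) ** 2))   # training fit
```

The paper's contribution is to constrain the choice of `w` so that baseline polynomials cannot leak into the latent variables; the deflation scheme itself is unchanged.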

  19. A GRID OF THREE-DIMENSIONAL STELLAR ATMOSPHERE MODELS OF SOLAR METALLICITY. I. GENERAL PROPERTIES, GRANULATION, AND ATMOSPHERIC EXPANSION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trampedach, Regner; Asplund, Martin; Collet, Remo

    2013-05-20

    Present grids of stellar atmosphere models are the workhorses in interpreting stellar observations and determining their fundamental parameters. These models rely on greatly simplified models of convection, however, lending less predictive power to such models of late-type stars. We present a grid of improved and more reliable stellar atmosphere models of late-type stars, based on deep, three-dimensional (3D), convective, stellar atmosphere simulations. This grid is to be used in general for interpreting observations and improving stellar and asteroseismic modeling. We solve the Navier-Stokes equations in 3D, concurrently with the radiative transfer equation, for a range of atmospheric parameters covering most of stellar evolution with convection at the surface. We emphasize the use of the best available atomic physics for quantitative predictions and comparisons with observations. We present granulation size, convective expansion of the acoustic cavity, and asymptotic adiabat as functions of atmospheric parameters.

  20. Systematics of strong nuclear amplification of gluon saturation from exclusive vector meson production in high energy electron-nucleus collisions

    NASA Astrophysics Data System (ADS)

    Mäntysaari, Heikki; Venugopalan, Raju

    2018-06-01

    We show that gluon saturation gives rise to a strong modification of the scaling in both the nuclear mass number A and the virtuality Q2 of the vector meson production cross-section in exclusive deep-inelastic scattering off nuclei. We present qualitative analytic expressions for how the scaling exponents are modified as well as quantitative predictions that can be tested at an Electron-Ion Collider.

  1. Quantitative Lymphoscintigraphy to Predict the Possibility of Lymphedema Development After Breast Cancer Surgery: Retrospective Clinical Study.

    PubMed

    Kim, Paul; Lee, Ju Kang; Lim, Oh Kyung; Park, Heung Kyu; Park, Ki Deok

    2017-12-01

    To predict the probability of lymphedema development in breast cancer patients in the early postoperative stage, we investigated the utility of quantitative lymphoscintigraphic assessment. This retrospective study included 201 patients without lymphedema after unilateral breast cancer surgery. Lymphoscintigraphy was performed between 4 and 8 weeks after surgery to evaluate the lymphatic system in the early postoperative stage. Quantitative lymphoscintigraphy was performed using four methods: ratio of the radiopharmaceutical clearance rate of the affected to the normal hand; ratio of the radioactivity of the affected to the normal hand; ratio of the radiopharmaceutical uptake rate of the affected to the normal axilla (RUA); and ratio of the radioactivity of the affected to the normal axilla (RRA). During a 1-year follow-up, patients with a circumferential interlimb difference of 2 cm at any measurement location and a 200-mL interlimb volume difference were diagnosed with lymphedema. We investigated the difference in quantitative lymphoscintigraphic assessment between the non-lymphedema and lymphedema groups. Quantitative lymphoscintigraphic assessment revealed that RUA and RRA were significantly lower in the lymphedema group than in the non-lymphedema group. After adjusting the model for all significant variables (body mass index, N-stage, T-stage, type of surgery, and type of lymph node surgery), RRA was associated with lymphedema (odds ratio=0.14; 95% confidence interval, 0.04-0.46; p=0.001). In patients in the early postoperative stage after unilateral breast cancer surgery, quantitative lymphoscintigraphic assessment can be used to predict the probability of developing lymphedema.

  2. [The role of endotracheal aspirate culture in the diagnosis of ventilator-associated pneumonia: a meta analysis].

    PubMed

    Wang, Fei; He, Bei

    2013-01-01

    To investigate the role of endotracheal aspirate (EA) culture in the diagnosis and antibiotic management of ventilator-associated pneumonia (VAP), we searched the CNKI, Wanfang, PubMed and EMBASE databases for literature published from January 1990 to December 2011 on VAP microbiological diagnostic techniques, including EA and bronchoalveolar lavage fluid (BALF) culture. The following key words were used: ventilator associated pneumonia, diagnosis and adult. Meta-analysis was performed and the sensitivity and specificity of EA for VAP diagnosis were calculated. Our literature search identified 1665 potential articles, 8 of which fulfilled our selection criteria, comprising 561 patients with paired cultures. Using BALF quantitative culture as the reference standard, the sensitivity and specificity of EA were 72% and 71%, respectively. When considering quantitative culture of EA only, the sensitivity and specificity improved to 90% and 65%, while the positive and negative predictive values were 68% and 89%, respectively. However, the sensitivity and specificity of semi-quantitative culture of EA were only 50% and 80%, with a positive predictive value of 77% and a negative predictive value of 58%. EA culture had relatively poor sensitivity and specificity overall, although quantitative culture of EA improved the sensitivity. Initiating therapy on the basis of EA quantitative culture may still result in excessive antibiotic usage. Our data suggest that EA can provide some information for clinical decisions but cannot replace BALF quantitative culture in VAP diagnosis.
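    The predictive values quoted above follow from sensitivity, specificity, and disease prevalence via Bayes' rule. As a sketch, assuming a VAP prevalence of roughly 45% (an illustrative assumption, not stated in the abstract), the quantitative-EA figures are approximately reproduced:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sensitivity * prevalence            # true-positive fraction
    fp = (1 - specificity) * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    tn = specificity * (1 - prevalence)
    return tp / (tp + fp), tn / (tn + fn)

# Quantitative EA culture vs. the BALF reference (sens 90%, spec 65%);
# the ~45% VAP prevalence is an assumption for illustration.
ppv, npv = predictive_values(0.90, 0.65, 0.45)   # ≈ 0.68 and ≈ 0.89
```

This dependence on prevalence is why a test with the same sensitivity and specificity can yield quite different predictive values in different ICU populations.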

  3. Gyrokinetic modeling of impurity peaking in JET H-mode plasmas

    NASA Astrophysics Data System (ADS)

    Manas, P.; Camenen, Y.; Benkadda, S.; Weisen, H.; Angioni, C.; Casson, F. J.; Giroud, C.; Gelfusa, M.; Maslov, M.

    2017-06-01

    Quantitative comparisons are presented between gyrokinetic simulations and experimental values of the carbon impurity peaking factor in a database of JET H-modes during the carbon wall era. These plasmas feature strong NBI heating and hence high values of toroidal rotation and of the corresponding gradient. Furthermore, the carbon profiles present particularly interesting shapes for fusion devices, i.e., hollow in the core and peaked near the edge. Dependencies of the experimental carbon peaking factor (R/LnC) on plasma parameters are investigated via multilinear regressions. A marked correlation between R/LnC and the normalised toroidal rotation gradient is observed in the core, which suggests an important role of the rotation in establishing hollow carbon profiles. The carbon peaking factor is then computed with the gyrokinetic code GKW, using a quasi-linear approach, supported by a few non-linear simulations. The comparison of the quasi-linear predictions to the experimental values at mid-radius reveals two main regimes. At low normalised collisionality ν* and Te/Ti < 1, the gyrokinetic simulations quantitatively recover the experimental carbon density profiles, provided that rotodiffusion is taken into account. In contrast, at higher ν* and Te/Ti > 1, the very hollow experimental carbon density profiles are never predicted by the simulations and the carbon density peaking is systematically overestimated. This points to a possible missing ingredient in this regime.

  4. Assessing exposure to transformation products of soil-applied organic contaminants in surface water: comparison of model predictions and field data.

    PubMed

    Kern, Susanne; Singer, Heinz; Hollender, Juliane; Schwarzenbach, René P; Fenner, Kathrin

    2011-04-01

    Transformation products (TPs) of chemicals released to soil, for example, pesticides, are regularly detected in surface and groundwater with some TPs even dominating observed pesticide levels. Given the large number of TPs potentially formed in the environment, straightforward prioritization methods based on available data and simple, evaluative models are required to identify TPs with a high aquatic exposure potential. While different such methods exist, none of them has so far been systematically evaluated against field data. Using a dynamic multimedia, multispecies model for TP prioritization, we compared the predicted relative surface water exposure potential of pesticides and their TPs with experimental data for 16 pesticides and 46 TPs measured in a small river draining a Swiss agricultural catchment. Twenty TPs were determined quantitatively using solid-phase extraction liquid chromatography mass spectrometry (SPE-LC-MS/MS), whereas the remaining 26 TPs could only be detected qualitatively because of the lack of analytical reference standards. Accordingly, the two sets of TPs were used for quantitative and qualitative model evaluation, respectively. Quantitative comparison of predicted with measured surface water exposure ratios for 20 pairs of TPs and parent pesticides indicated agreement within a factor of 10, except for chloridazon-desphenyl and chloridazon-methyl-desphenyl. The latter two TPs were found to be present in elevated concentrations during baseflow conditions and in groundwater samples across Switzerland, pointing toward high concentrations in exfiltrating groundwater. A simple leaching relationship was shown to qualitatively agree with the observed baseflow concentrations and to thus be useful in identifying TPs for which the simple prioritization model might underestimate actual surface water concentrations. 
Application of the model to the 26 qualitatively analyzed TPs showed that most of those TPs categorized as exhibiting a high aquatic exposure potential could be confirmed to be present in the majority of water samples investigated. On the basis of these results, we propose a generally applicable, model-based approach to identify those TPs of soil-applied organic contaminants that exhibit a high aquatic exposure potential to prioritize them for higher-tier, experimental investigations.

  5. On measures of association among genetic variables

    PubMed Central

    Gianola, Daniel; Manfredi, Eduardo; Simianer, Henner

    2012-01-01

    Systems involving many variables are important in population and quantitative genetics, for example, in multi-trait prediction of breeding values and in exploration of multi-locus associations. We studied departures of the joint distribution of sets of genetic variables from independence. New measures of association based on notions of statistical distance between distributions are presented. These are more general than correlations, which are pairwise measures, and lack a clear interpretation beyond the bivariate normal distribution. Our measures are based on logarithmic (Kullback-Leibler) and on relative ‘distances’ between distributions. Indexes of association are developed and illustrated for quantitative genetics settings in which the joint distribution of the variables is either multivariate normal or multivariate-t, and we show how the indexes can be used to study linkage disequilibrium in a two-locus system with multiple alleles and present applications to systems of correlated beta distributions. Two multivariate beta and multivariate beta-binomial processes are examined, and new distributions are introduced: the GMS-Sarmanov multivariate beta and its beta-binomial counterpart. PMID:22742500
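    The Kullback-Leibler "distance" underlying such indexes has a closed form for multivariate normals. A sketch measuring a bivariate system's departure from independence (the correlation value below is an arbitrary illustration, not from the paper):

```python
import numpy as np

def kl_mvn(m0, S0, m1, S1):
    """Kullback-Leibler divergence KL(N(m0,S0) || N(m1,S1)), closed form."""
    k = len(m0)
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Two bivariate "genetic" systems: independent vs. correlated traits.
m = np.zeros(2)
S_indep = np.eye(2)
S_corr = np.array([[1.0, 0.6], [0.6, 1.0]])
departure = kl_mvn(m, S_corr, m, S_indep)  # departure from independence
```

Unlike a pairwise correlation, this quantity extends directly to any number of variables and to non-normal distributions (via the general KL integral), which is the generality the paper argues for.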

  6. The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)

    NASA Astrophysics Data System (ADS)

    Maxwell, R. M.

    2010-12-01

    Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters and the variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented and a framework used to predict potential, increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is shown subject to both physical and chemical heterogeneity which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity can exhibit both uncertainty and spatial variability in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation will also be discussed.

  7. Quantitative computed tomography versus spirometry in predicting air leak duration after major lung resection for cancer.

    PubMed

    Ueda, Kazuhiro; Kaneda, Yoshikazu; Sudo, Manabu; Mitsutaka, Jinbo; Li, Tao-Sheng; Suga, Kazuyoshi; Tanaka, Nobuyuki; Hamano, Kimikazu

    2005-11-01

    Emphysema is a well-known risk factor for developing air leak or persistent air leak after pulmonary resection. Although quantitative computed tomography (CT) and spirometry are used to diagnose emphysema, it remains controversial whether these tests are predictive of the duration of postoperative air leak. Sixty-two consecutive patients who were scheduled to undergo major lung resection for cancer were enrolled in this prospective study to define the best predictor of postoperative air leak duration. Preoperative factors analyzed included spirometric variables and area of emphysema (proportion of the low-attenuation area) that was quantified in a three-dimensional CT lung model. Chest tubes were removed the day after disappearance of the air leak, regardless of pleural drainage. Univariate and multivariate proportional hazards analyses were used to determine the influence of preoperative factors on chest tube time (air leak duration). By univariate analysis, site of resection (upper, lower), forced expiratory volume in 1 second, predicted postoperative forced expiratory volume in 1 second, and area of emphysema (< 1%, 1% to 10%, > 10%) were significant predictors of air leak duration. By multivariate analysis, site of resection and area of emphysema were the best independent determinants of air leak duration. The results were similar for patients with a smoking history (n = 40), but neither forced expiratory volume in 1 second nor predicted postoperative forced expiratory volume in 1 second were predictive of air leak duration. Quantitative CT is superior to spirometry in predicting air leak duration after major lung resection for cancer. Quantitative CT may aid in the identification of patients, particularly among those with a smoking history, requiring additional preventive procedures against air leak.

  8. Visualization and simulated surgery of the left ventricle in the virtual pathological heart of the Virtual Physiological Human

    PubMed Central

    McFarlane, N. J. B.; Lin, X.; Zhao, Y.; Clapworthy, G. J.; Dong, F.; Redaelli, A.; Parodi, O.; Testi, D.

    2011-01-01

    Ischaemic heart failure remains a significant health and economic problem worldwide. This paper presents a user-friendly software system that will form a part of the virtual pathological heart of the Virtual Physiological Human (VPH2) project, currently being developed under the European Commission Virtual Physiological Human (VPH) programme. VPH2 is an integrated medicine project, which will create a suite of modelling, simulation and visualization tools for patient-specific prediction and planning in cases of post-ischaemic left ventricular dysfunction. The work presented here describes a three-dimensional interactive visualization for simulating left ventricle restoration surgery, comprising the operations of cutting, stitching and patching, and for simulating the elastic deformation of the ventricle to its post-operative shape. This will supply the quantitative measurements required for the post-operative prediction tools being developed in parallel in the same project. PMID:22670207

  9. Understanding bullying and victimization during childhood and adolescence: a mixed methods study.

    PubMed

    Guerra, Nancy G; Williams, Kirk R; Sadek, Shelly

    2011-01-01

    In the present study, quantitative and qualitative data are presented to examine individual and contextual predictors of bullying and victimization and how they vary by age and gender. Two waves of survey data were collected from 2,678 elementary, middle, and high school youth attending 59 schools. In addition, 14 focus groups were conducted with 115 youth who did not participate in the survey. Changes in both bullying and victimization were predicted across gender and age by low self-esteem and negative school climate, with normative beliefs supporting bullying predicting increases in bullying only. Focus group comments provided insights into the dynamics of bullying, highlighting its connection to emergent sexuality and social identity during adolescence. Findings are discussed in terms of their implications for preventive antibullying interventions in schools. © 2011 The Authors. Child Development © 2011 Society for Research in Child Development, Inc.

  10. Dr. Bibbo's Presidential Address on Automation in Cytology: Were Her Predictions Right, Wrong, or Somewhere in the Middle?

    PubMed

    Wilbur, David C

    2017-01-01

    In 1983, Dr. Marluce Bibbo gave the Presidential Address at the Annual Meeting of the American Society of Cytology in Denver, CO, USA. The lecture was entitled "Analytic and Quantitative Cytology," a field in which Dr. Bibbo was intimately involved. In the presentation, she included a summary of 30 years of work already accomplished, the present state of the art, and musings about issues encountered, potential resolutions, progress that needed to be made, and her perception of how the field needed to evolve in order to become ultimately successful as a clinical service. This commentary looks back 34 years, with observations about Dr. Bibbo's predictions and how the field of cytology automation did actually evolve in the decades following her address. New challenges are identified and possible paths forward are discussed. © 2017 S. Karger AG, Basel.

  11. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the need to strengthen and revise theory with empirical data.
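    The approach can be sketched as checking whether a theory-predicted effect size falls inside the confidence interval of the observed effect. The group summaries below are invented for illustration, and the CI uses a standard large-sample approximation rather than the paper's exact procedure:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) with pooled SD."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / sp

def d_confint(d, n1, n2, z=1.96):
    """Approximate large-sample 95% confidence interval for Cohen's d."""
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d - z * se, d + z * se

# Hypothetical: theory predicts d = 0.80 for a construct; observed groups:
d = cohens_d(10.5, 2.0, 120, 9.0, 2.1, 130)
lo, hi = d_confint(d, 120, 130)
confirmed = lo <= 0.80 <= hi   # prediction "confirmed" if inside the CI
```

Counting how many such predictions are confirmed across constructs gives tallies of the "13 of 15" form reported in the abstract.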

  12. An ensemble model of QSAR tools for regulatory risk assessment.

    PubMed

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows a cut-off parameter to be varied to select the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). 
Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa ( κ ): 0.63 and 0.62] for both the datasets. The ROC curves demonstrate the utility of the cut-off feature in the predictive ability of the ensemble model. This feature provides an additional control to the regulators in grading a chemical based on the severity of the toxic endpoint under study.
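    The paper's ensemble is a Bayesian classifier; a much simpler weighted-vote consensus suffices to illustrate the adjustable cut-off. The tool calls and weights below are hypothetical:

```python
import numpy as np

# Hypothetical binary carcinogenicity calls (1 = positive) from four QSAR
# tools for six chemicals; weights could reflect each tool's reliability.
calls = np.array([
    [1, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
])
weights = np.array([0.25, 0.25, 0.25, 0.25])  # equal weighting for the sketch

score = calls @ weights          # consensus score in [0, 1]
cutoff = 0.5                     # lower it to favor sensitivity over specificity
consensus = (score >= cutoff).astype(int)
```

Lowering the cut-off flags more chemicals as positive (fewer false negatives at the cost of more false positives), which is exactly the regulatory trade-off the abstract describes.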

  13. An ensemble model of QSAR tools for regulatory risk assessment

    DOE PAGES

    Pradeep, Prachi; Povinelli, Richard J.; White, Shannon; ...

    2016-09-22

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflictingmore » predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows for varying a cut-off parameter that allows for a selection in the desirable trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). 
Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8% and 80.4%; balanced accuracy: 80.6% and 80.8%) and the highest inter-rater agreement [kappa (κ): 0.63 and 0.62] for both datasets. The ROC curves demonstrate the utility of the cut-off feature in the predictive ability of the ensemble model. In conclusion, this feature provides an additional control to regulators in grading a chemical based on the severity of the toxic endpoint under study.
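The cut-off mechanism can be sketched as a consensus score with an adjustable decision threshold. The sketch below is illustrative only: a simple mean-score vote over invented tool scores, not the paper's Bayesian classifier.

```python
def consensus_predict(tool_scores, cutoff):
    """Label a chemical carcinogenic (1) if the mean tool score >= cutoff."""
    mean_score = sum(tool_scores) / len(tool_scores)
    return 1 if mean_score >= cutoff else 0

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy dataset: per-chemical scores from four hypothetical QSAR tools.
scores = [[0.9, 0.8, 0.7, 0.9],   # true positive
          [0.6, 0.4, 0.5, 0.7],   # true positive, borderline
          [0.2, 0.3, 0.1, 0.4],   # true negative
          [0.5, 0.6, 0.4, 0.5]]   # true negative, borderline
labels = [1, 1, 0, 0]

# Lowering the cut-off raises sensitivity (fewer false negatives) at the
# cost of specificity, the regulatory trade-off the abstract describes.
for cutoff in (0.5, 0.6):
    preds = [consensus_predict(s, cutoff) for s in scores]
    sens, spec = sensitivity_specificity(labels, preds)
    print(cutoff, sens, spec)
```

Sweeping the cut-off over a grid of values and plotting sensitivity against 1 - specificity is exactly what the ROC curves in the abstract summarize.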

  14. The Relevance of a Novel Quantitative Assay to Detect up to 40 Major Streptococcus pneumoniae Serotypes Directly in Clinical Nasopharyngeal and Blood Specimens

    PubMed Central

    Albrich, Werner C.; van der Linden, Mark P. G.; Bénet, Thomas; Chou, Monidarin; Sylla, Mariam; Barreto Costa, Patricia; Richard, Nathalie; Klugman, Keith P.; Endtz, Hubert P.; Paranhos-Baccalà, Gláucia; Telles, Jean-Noël

    2016-01-01

    For epidemiological and surveillance purposes, it is relevant to monitor the distribution and dynamics of Streptococcus pneumoniae serotypes. Conventional serotyping methods do not provide rapid or quantitative information on serotype loads. Quantitative serotyping may enable prediction of the invasiveness of a specific serotype compared to other serotypes carried. Here, we describe a novel, rapid multiplex real-time PCR assay for identification and quantification of the 40 most prevalent pneumococcal serotypes, and its application to pneumonia specimens from emerging and developing countries. Eleven multiplex PCRs to detect 40 serotypes or serogroups were optimized. Quantification was enabled by reference to standard dilutions of known bacterial load. Performance of the assay was evaluated by specifically typing and quantifying S. pneumoniae in nasopharyngeal and blood samples from adult and pediatric patients hospitalized with pneumonia (n = 664) from five different countries. Serogroup 6 was widely represented in nasopharyngeal specimens from all five cohorts. The most frequent serotypes in the French, South African, and Brazilian cohorts were 1 and 7A/F, 3 and 19F, and 14, respectively. When both samples were available, the serotype found in blood was always present as carriage with other serotypes in the nasopharynx. Moreover, the ability of a serotype to invade the bloodstream may be linked to its nasopharyngeal load: the mean nasopharyngeal concentration of serotypes that moved to the blood was 3 log-fold higher than that of serotypes found only in the nasopharynx. This novel, rapid, quantitative assay may help predict the invasiveness of some S. pneumoniae serotypes and support assessment of pneumococcal serotype distribution. PMID:26986831
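Quantification by reference to standard dilutions follows the usual real-time PCR standard-curve logic: fit Ct against log10 load over the calibration dilutions, then invert the fitted line for an unknown specimen. The Ct values below are hypothetical, not from this assay.

```python
def fit_standard_curve(log10_loads, ct_values):
    """Least-squares line Ct = slope * log10(load) + intercept."""
    n = len(log10_loads)
    mx = sum(log10_loads) / n
    my = sum(ct_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_loads, ct_values))
    sxx = sum((x - mx) ** 2 for x in log10_loads)
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the curve: estimated load (copies/mL) for an observed Ct."""
    return 10 ** ((ct - intercept) / slope)

# Ten-fold standard dilutions, 10^3 .. 10^7 copies/mL; an ideal assay shows
# ~3.32 cycles per ten-fold dilution (100% efficiency).
loads = [3, 4, 5, 6, 7]
cts = [33.2, 29.9, 26.6, 23.3, 19.9]  # hypothetical calibration run

slope, intercept = fit_standard_curve(loads, cts)
print(round(quantify(26.6, slope, intercept)))  # close to 1e5 copies/mL
```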

  15. Quantitative myocardial perfusion from static cardiac and dynamic arterial CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Branch, Kelley R.; Alessio, Adam M.

    2018-05-01

    Quantitative myocardial blood flow (MBF) estimation by dynamic contrast enhanced cardiac computed tomography (CT) requires multi-frame acquisition of contrast transit through the blood pool and myocardium to inform the arterial input and tissue response functions. Both the input and the tissue response functions for the entire myocardium are sampled with each acquisition. However, the long breath holds and frequent sampling can result in significant motion artifacts and relatively high radiation dose. To address these limitations, we propose and evaluate a new static cardiac and dynamic arterial (SCDA) quantitative MBF approach where (1) the input function is well sampled using either prediction from pre-scan timing bolus data or measurement from dynamic thin-slice ‘bolus tracking’ acquisitions, and (2) the whole-heart tissue response data are limited to one contrast enhanced CT acquisition. A perfusion model uses the dynamic arterial input function to generate a family of possible myocardial contrast enhancement curves corresponding to a range of MBF values. Combined with the timing of the single whole-heart acquisition, these curves generate a lookup table relating myocardial contrast enhancement to quantitative MBF. We tested the SCDA approach in 28 patients who underwent a full dynamic CT protocol under both rest and vasodilator stress conditions. Using the measured input function plus single (enhanced CT only) or double (enhanced and contrast-free baseline CTs) myocardial acquisitions yielded MBF estimates with root mean square (RMS) errors of 1.2 ml/min/g and 0.35 ml/min/g, and radiation dose reductions of 90% and 83%, respectively. The prediction of the input function based on timing bolus data and the static acquisition had an RMS error of 26.0% relative to the measured input function, which led to MBF estimation errors more than threefold higher than those obtained with the measured input function.
SCDA presents a new, simplified approach for quantitative perfusion imaging with an acquisition strategy offering substantial radiation dose and computational complexity savings over dynamic CT.
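The lookup-table idea can be sketched as follows: simulate myocardial enhancement for a range of candidate MBF values from a measured arterial input function (AIF), then pick the candidate whose predicted enhancement at the acquisition time matches the single whole-heart measurement. The one-compartment, no-outflow tissue model and all numbers below are simplifying assumptions, not the paper's perfusion model.

```python
def tissue_enhancement(mbf, aif, dt, t_acq):
    """Early-phase enhancement ~ MBF * integral of the AIF up to t_acq."""
    n = int(t_acq / dt)
    return mbf * sum(aif[:n]) * dt

def mbf_from_lookup(measured, aif, dt, t_acq, candidates):
    """Pick the candidate MBF whose predicted enhancement is closest."""
    return min(candidates,
               key=lambda m: abs(tissue_enhancement(m, aif, dt, t_acq) - measured))

# Hypothetical AIF sampled every 0.5 s (arbitrary HU-like units).
dt = 0.5
aif = [0, 5, 20, 60, 120, 180, 200, 180, 120, 60, 20, 5, 0, 0]
candidates = [0.5 + 0.05 * i for i in range(61)]   # 0.5 .. 3.5 ml/min/g

# Forward-simulate a "measurement" at 5 s for a known MBF, then invert it.
true_mbf = 1.8
measured = tissue_enhancement(true_mbf, aif, dt, t_acq=5.0)
print(round(mbf_from_lookup(measured, aif, dt, 5.0, candidates), 2))  # 1.8
```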

  16. Modeling and parameterization of photoelectrons emitted in condensed matter by linearly polarized synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Jablonski, A.

    2018-01-01

    Growing availability of synchrotron facilities stimulates an interest in quantitative applications of hard X-ray photoemission spectroscopy (HAXPES) using linearly polarized radiation. An advantage of this approach is the possibility of continuous variation of radiation energy that makes it possible to control the sampling depth for a measurement. Quantitative applications are based on accurate and reliable theory relating the measured spectral features to needed characteristics of the surface region of solids. A major complication in the case of polarized radiation is the involved structure of the photoemission cross-section for hard X-rays. In the present work, details of the relevant formalism are described and algorithms implementing this formalism for different experimental configurations are proposed. The photoelectron signal intensity may be considerably affected by variation in the positioning of the polarization vector with respect to the surface plane. This information is critical for any quantitative application of HAXPES using polarized X-rays. Different quantitative applications based on photoelectrons with energies up to 10 keV are considered here: (i) determination of surface composition, (ii) estimation of sampling depth, and (iii) measurements of an overlayer thickness. Parameters facilitating these applications (mean escape depths, information depths, effective attenuation lengths) were calculated for a number of photoelectron lines in four elemental solids (Si, Cu, Ag and Au) in different experimental configurations and locations of the polarization vector. One of the considered configurations, with the polarization vector located in a plane perpendicular to the surface, was recommended for quantitative applications of HAXPES. In this configuration, the considered parameters were found to vary weakly over the range of photoelectron emission angles from normal emission to about 50° with respect to the surface normal.
The averaged values of the mean escape depth and effective attenuation length were approximated by accurate predictive formulas. The predicted effective attenuation lengths were compared with published values; the major discrepancies observed can be ascribed to possible discontinuities in the structure of the deposited overlayer.

  17. How to make predictions about future infectious disease risks

    PubMed Central

    Woolhouse, Mark

    2011-01-01

    Formal, quantitative approaches are now widely used to make predictions about the likelihood of an infectious disease outbreak, how the disease will spread, and how to control it. Several well-established methodologies are available, including risk factor analysis, risk modelling and dynamic modelling. Even so, predictive modelling is very much the ‘art of the possible’, which tends to drive research effort towards some areas and away from others which may be at least as important. Building on the undoubted success of quantitative modelling of the epidemiology and control of human and animal diseases such as AIDS, influenza, foot-and-mouth disease and BSE, attention needs to be paid to developing a more holistic framework that captures the role of the underlying drivers of disease risks, from demography and behaviour to land use and climate change. At the same time, there is still considerable room for improvement in how quantitative analyses and their outputs are communicated to policy makers and other stakeholders. A starting point would be generally accepted guidelines for ‘good practice’ for the development and the use of predictive models. PMID:21624924

  18. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.

    PubMed

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C

    2016-07-21

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.

  19. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods

    NASA Astrophysics Data System (ADS)

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C. W.; Lipiński, Wojciech; Bischof, John C.

    2016-07-01

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
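At the single-particle level, the quantity being compared in the two records above reduces to a simple relation: generated heat equals the absorption cross-section times the local irradiance. The cross-section below is a rough, assumed magnitude for a small gold sphere near resonance, used only for illustration and not a value from the study.

```python
# q = C_abs * I : heat generated by one particle under plane-wave illumination.
C_abs = 7.0e-16   # absorption cross-section, m^2 (assumed order of magnitude)
I = 1.0e4         # laser irradiance, W/m^2 (i.e., 1 W/cm^2)
q = C_abs * I     # heat generated per particle, W
print(q)
```

Polydispersity enters because C_abs depends strongly on size and shape; averaging q over a measured size/shape distribution rather than using the nominal geometry is what reconciles theory with the GNR measurements in the abstract.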

  20. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2017-06-01

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis using Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this noise with zero mean dominates the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effect of noise and bias error in using CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., lower or higher than an absorbance threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies showed that: (1) the concentration and the analyte type had minimal effect on OTV; and (2) the major factor that influences OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra and methane/toluene gas mixture spectra measured using FT-IR spectrometry and analyzed with CLS, WLS, and SWLS.
The standard error of prediction (SEP), bias of prediction (bias), and residual sum of squares of the errors (RSS) from the three quantitative analyses were compared. In methane gas analysis, SWLS yielded the lowest SEP and RSS among the three methods. In methane/toluene mixture gas analysis, a modification of SWLS is presented to tackle the bias error from other components. Unmodified SWLS gave the lowest SEP in all cases but not the lowest bias and RSS. The modified SWLS reduced the bias and yielded a lower RSS than CLS, especially for small components.
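A minimal single-analyte reading of the selection rule, assuming a Beer-Lambert model A_i = k_i·c + e_i: channels below an absorbance threshold get CLS-style unit weights, channels above it get WLS-style 1/variance weights. The spectra, variances, and threshold are invented, and this is an illustrative interpretation rather than the authors' exact estimator.

```python
def weighted_ls(k, A, w):
    """Solve for c minimizing sum w_i * (A_i - k_i * c)^2."""
    num = sum(wi * ki * ai for wi, ki, ai in zip(w, k, A))
    den = sum(wi * ki * ki for wi, ki in zip(w, k))
    return num / den

def swls(k, A, noise_var, threshold):
    """Selective weighting: CLS below the absorbance threshold, WLS above."""
    w = [1.0 if a < threshold else 1.0 / v for a, v in zip(A, noise_var)]
    return weighted_ls(k, A, w)

# Hypothetical 6-channel spectrum, true concentration c = 2.0. The two
# low-absorbance channels carry a small baseline bias; the high-absorbance
# channels carry heteroscedastic noise.
k = [0.05, 0.10, 0.50, 1.00, 1.50, 2.00]              # pure-component spectrum
A = [0.12, 0.21, 1.02, 1.97, 3.05, 3.96]              # measured absorbances
noise_var = [0.0001, 0.0001, 0.01, 0.04, 0.09, 0.16]  # per-channel variance

cls_est = weighted_ls(k, A, [1.0] * len(k))
wls_est = weighted_ls(k, A, [1.0 / v for v in noise_var])
swls_est = swls(k, A, noise_var, threshold=0.5)
print(round(cls_est, 3), round(wls_est, 3), round(swls_est, 3))
```

In this toy setup WLS over-weights the biased low-absorbance channels and drifts away from c = 2.0, while the selective scheme stays close, mirroring the trade-off the abstract describes.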

  1. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and on clinical quantitative neurologic function are reviewed and assessed. While emphasis is placed on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  2. The cutting edge - Micro-CT for quantitative toolmark analysis of sharp force trauma to bone.

    PubMed

    Norman, D G; Watson, D G; Burnett, B; Fenne, P M; Williams, M A

    2018-02-01

    Toolmark analysis involves examining marks created on an object to identify the likely tool responsible for creating those marks (e.g., a knife). Although a potentially powerful forensic tool, knife mark analysis is still in its infancy and the validation of imaging techniques as well as quantitative approaches is ongoing. This study builds on previous work by simulating real-world stabbings experimentally and statistically exploring quantitative toolmark properties, such as cut mark angle captured by micro-CT imaging, to predict the knife responsible. In Experiment 1 a mechanical stab rig and two knives were used to create 14 knife cut marks on dry pig ribs. The toolmarks were laser and micro-CT scanned to allow for quantitative measurements of numerous toolmark properties. The findings from Experiment 1 demonstrated that the two knives produced statistically different cut mark widths, wall angles, and shapes. Experiment 2 examined knife marks created on fleshed pig torsos under conditions designed to better simulate real-world stabbings. Eight knives were used to generate 64 incision cut marks that were also micro-CT scanned. Statistical exploration of these cut marks suggested that knife type, serrated or plain, can be predicted from cut mark width and wall angle. Preliminary results suggest that knife type can be predicted from cut mark width, and that knife edge thickness correlates with cut mark width. An additional 16 cut mark walls were imaged for striation marks using scanning electron microscopy, with results suggesting that this approach might not be useful for knife mark analysis. Results also indicated that observer judgements of cut mark shape were more consistent when rated from micro-CT images than light microscopy images. The potential to combine micro-CT data, medical grade CT data and photographs to develop highly realistic virtual models for visualisation and 3D printing is also demonstrated.
This is the first study to statistically explore simulated real-world knife marks imaged by micro-CT to demonstrate the potential of quantitative approaches in knife mark analysis. Findings and methods presented in this study are relevant to both forensic toolmark researchers as well as practitioners. Limitations of the experimental methodologies and imaging techniques are discussed, and further work is recommended. Copyright © 2017 Elsevier B.V. All rights reserved.
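The statistical idea of predicting knife class from a single geometric measurement can be sketched as a one-feature decision stump. The widths, labels, and resulting threshold below are invented for illustration, not the study's micro-CT measurements.

```python
def best_stump(widths, labels):
    """Find the width threshold that best separates the two knife classes."""
    best = (0.0, None)
    for t in sorted(set(widths)):
        # Predict "serrated" (True) when the mark is at least t wide.
        acc = sum((w >= t) == l for w, l in zip(widths, labels)) / len(widths)
        if acc > best[0]:
            best = (acc, t)
    return best  # (accuracy, threshold)

widths = [0.18, 0.22, 0.25, 0.31, 0.35, 0.40]       # cut mark widths, mm (toy)
serrated = [False, False, False, True, True, True]  # toy class labels

print(best_stump(widths, serrated))  # perfect split at 0.31 mm on this toy set
```

With two features (width and wall angle), the same exhaustive search extends naturally to a shallow decision tree, which is one common way such "can be predicted from" claims are operationalized.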

  3. Prediction of fermentation index of cocoa beans (Theobroma cacao L.) based on color measurement and artificial neural networks.

    PubMed

    León-Roque, Noemí; Abderrahim, Mohamed; Nuñez-Alejos, Luis; Arribas, Silvia M; Condezo-Hoyos, Luis

    2016-12-01

    Several procedures are currently used to assess the fermentation index (FI) of cocoa beans (Theobroma cacao L.) for quality control. However, all of them present several drawbacks. The aim of the present work was to develop and validate a simple image-based quantitative procedure using color measurement and artificial neural networks (ANNs). ANN models based on color measurements were tested to predict the fermentation index (FI) of fermented cocoa beans. The RGB values were measured from the surface and center region of fermented beans in images obtained by camera and desktop scanner. The FI was defined as the ratio of total free amino acids in fermented versus non-fermented samples. The ANN model that included RGB color measurements of the fermented cocoa surface and the R/G ratio of cocoa bean alkaline extracts was able to predict FI with no statistical difference compared with the experimental values. Performance of the ANN model was evaluated by the coefficient of determination, Bland-Altman plot and Passing-Bablok regression analyses. Moreover, in fermented beans, total sugar content and titratable acidity showed a similar pattern to the total free amino acids predicted through the color-based ANN model. The results of the present work demonstrate that the proposed ANN model can be adopted as a low-cost and in situ procedure to predict FI in fermented cocoa beans through apps developed for mobile devices. Copyright © 2016 Elsevier B.V. All rights reserved.
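The color-to-FI mapping can be illustrated with a single-neuron regression trained by gradient descent. The published model is a larger ANN trained on real surface and extract color measurements, so the single R/G feature, the synthetic calibration data, and the linear relation below are all assumptions made for the sketch.

```python
def train(xs, ys, lr=0.1, epochs=2000):
    """Fit FI ~ w * (R/G) + b by plain gradient descent on squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic calibration set: browner beans (higher R/G) -> higher FI,
# generated from the toy relation FI = 0.8 * (R/G) - 0.38.
rg_ratio = [1.0, 1.2, 1.4, 1.6, 1.8]
fi = [0.42, 0.58, 0.74, 0.90, 1.06]

w, b = train(rg_ratio, fi)
print(round(w * 1.5 + b, 2))  # predicted FI for a bean with R/G = 1.5
```

A real counterpart would add a hidden layer and feed all six RGB channels (surface and extract), but the training loop has the same shape.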

  4. Testing 40 Predictions from the Transtheoretical Model Again, with Confidence

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; Brick, Leslie Ann D.; Fava, Joseph L.; Prochaska, James O.

    2013-01-01

    Testing Theory-based Quantitative Predictions (TTQP) represents an alternative to traditional Null Hypothesis Significance Testing (NHST) procedures and is more appropriate for theory testing. The theory generates explicit effect size predictions and these effect size estimates, with related confidence intervals, are used to test the predictions.…

  5. Microbial-based evaluation of foaming events in full-scale wastewater treatment plants by microscopy survey and quantitative image analysis.

    PubMed

    Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes

    2016-08-01

    Activated sludge systems are prone to be affected by foaming occurrences causing the sludge to rise in the reactor and affecting the wastewater treatment plant (WWTP) performance. Nonetheless, there is currently a knowledge gap hindering the development of foaming event prediction tools, one that may be filled by quantitative monitoring of AS system biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs by quantitative protozoa, metazoa, filamentous bacteria, and sludge characteristics analysis, further used to elucidate the inner relationships between these parameters. In the current study, a conventional activated sludge system (CAS) and an oxidation ditch (OD) were surveyed throughout a period of 2 and 3 months, respectively, regarding their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The obtained data were then processed by principal components analysis (PCA), cross-correlation analysis, and decision trees to assess foaming occurrences and elucidate the inner relationships. It was found that such events were best assessed by the combined use of the relative abundance of testate amoeba and the nocardioform filamentous index, presenting a 92.9 % success rate for overall foaming events, and 87.5 and 100 %, respectively, for persistent and mild events.

  6. Quantitative Correlation of in Vivo Properties with in Vitro Assay Results: The in Vitro Binding of a Biotin–DNA Analogue Modifier with Streptavidin Predicts the in Vivo Avidin-Induced Clearability of the Analogue-Modified Antibody

    PubMed Central

    Dou, Shuping; Virostko, John; Greiner, Dale L.; Powers, Alvin C.; Liu, Guozheng

    2016-01-01

    Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ~95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. 
This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in vitro cell study, and in vivo validation. PMID:26103429

  7. The role of quantitative estrogen receptor status in predicting tumor response at surgery in breast cancer patients treated with neoadjuvant chemotherapy.

    PubMed

    Raphael, Jacques; Gandhi, Sonal; Li, Nim; Lu, Fang-I; Trudeau, Maureen

    2017-07-01

    Estrogen receptor (ER) negative (-) breast cancer (BC) patients have better tumor response rates than ER-positive (+) patients after neoadjuvant chemotherapy (NCT). We conducted a retrospective review using the institutional database "Biomatrix" to assess the value of quantitative ER status in predicting tumor response at surgery and to identify potential predictors of survival outcomes. Univariate followed by multivariable regression analyses were conducted to assess the association between quantitative ER and tumor response, assessed as tumor size reduction and pathologic complete response (pCR). Predictors of recurrence-free survival (RFS) were identified using a Cox proportional hazards (CPH) model. A log-rank test was used to compare RFS between groups if a significant predictor was identified. 304 patients were included, with a median follow-up of 43.3 months (Q1-Q3 28.7-61.1) and a mean age of 49.7 years (SD 10.9). Quantitative ER was inversely associated with tumor size reduction and pCR (OR 0.99, 95% CI 0.99-1.00, p = 0.027 and OR 0.98, 95% CI 0.97-0.99, p < 0.0001, respectively). Cut-offs of 60% and 80% best predicted the association with tumor size reduction and pCR, respectively. pCR was shown to be an independent predictor of RFS (HR 0.17, 95% CI 0.07-0.43, p = 0.0002) in all patients. At 5 years, 93% of patients with pCR and 72% of patients with residual tumor were recurrence-free (p = 0.0012). Quantitative ER status is inversely associated with tumor response in BC patients treated with NCT, with cut-offs of 60% and 80% best predicting the association with tumor size reduction and pCR, respectively. Therefore, patients with an ER status higher than the cut-off might benefit from a neoadjuvant endocrine therapy approach. Patients with pCR had better survival outcomes independently of their tumor phenotype. Further prospective studies are needed to validate the clinical utility of quantitative ER as a predictive marker of tumor response.

  8. Insights into multimodal imaging classification of ADHD

    PubMed Central

    Colby, John B.; Rudie, Jeffrey D.; Brown, Jesse A.; Douglas, Pamela K.; Cohen, Mark S.; Shehzad, Zarrar

    2012-01-01

    Attention deficit hyperactivity disorder (ADHD) currently is diagnosed in children by clinicians via subjective ADHD-specific behavioral instruments and by reports from the parents and teachers. Considering its high prevalence and large economic and societal costs, a quantitative tool that aids in diagnosis by characterizing underlying neurobiology would be extremely valuable. This provided motivation for the ADHD-200 machine learning (ML) competition, a multisite collaborative effort to investigate imaging classifiers for ADHD. Here we present our ML approach, which used structural and functional magnetic resonance imaging data, combined with demographic information, to distinguish individuals with ADHD from typically developing (TD) children across eight different research sites. Structural features included quantitative metrics from 113 cortical and non-cortical regions. Functional features included Pearson correlation functional connectivity matrices, nodal and global graph theoretical measures, nodal power spectra, voxelwise global connectivity, and voxelwise regional homogeneity. We performed feature ranking for each site and modality using the multiple support vector machine recursive feature elimination (SVM-RFE) algorithm, and feature subset selection by optimizing the expected generalization performance of a radial basis function kernel SVM (RBF-SVM) trained across a range of the top features. Site-specific RBF-SVMs using these optimal feature sets from each imaging modality were used to predict the class labels of an independent hold-out test set. A voting approach was used to combine these multiple predictions and assign final class labels. With this methodology we were able to predict diagnosis of ADHD with 55% accuracy (versus a 39% chance level in this sample), 33% sensitivity, and 80% specificity.
This approach also allowed us to evaluate predictive structural and functional features giving insight into abnormal brain circuitry in ADHD. PMID:22912605
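The final voting stage can be sketched directly. The labels and site weights below are invented stand-ins for the trained RBF-SVM outputs, not values from the competition entry.

```python
from collections import Counter

def vote(predictions):
    """Simple majority vote over per-classifier labels."""
    return Counter(predictions).most_common(1)[0][0]

def weighted_vote(predictions, weights):
    """Accuracy-weighted vote: sum each classifier's weight behind its label."""
    scores = {}
    for label, w in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)

# Hypothetical labels from five site/modality-specific models for one subject,
# with hypothetical per-model validation accuracies as weights.
labels = ["TD", "ADHD", "TD", "TD", "ADHD"]
weights = [0.8, 0.6, 0.7, 0.5, 0.9]

print(vote(labels), weighted_vote(labels, weights))
```

Weighting votes by each model's validation accuracy is one common refinement of plain majority voting; with the weights above both schemes agree.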

  9. Using genome-scale metabolic models to compare serovars of the foodborne pathogen Listeria monocytogenes.

    PubMed

    Metz, Zachary P; Ding, Tong; Baumler, David J

    2018-01-01

    Listeria monocytogenes is a microorganism of great concern for the food industry and a cause of human foodborne disease. Therefore, novel methods of control are needed, and systems biology is one such approach to identify them. Using a combination of computational techniques and laboratory methods, genome-scale metabolic models (GEMs) can be created, validated, and used to simulate growth environments and discern metabolic capabilities of microbes of interest, including L. monocytogenes. The objective of the work presented here was to generate GEMs for six different strains of L. monocytogenes, and to both qualitatively and quantitatively validate these GEMs with experimental data to examine the diversity of metabolic capabilities of numerous strains from the three serovar groups most associated with foodborne outbreaks and human disease. Following qualitative validation, 57 of the 95 carbon sources tested experimentally were present in the GEMs, and these were therefore the compounds from which comparisons could be drawn. Of these 57 compounds, agreement between in silico predictions and in vitro results for carbon source utilization ranged from 80.7% to 91.2% between strains. Agreement between in silico predictions and in vitro results was also assessed for numerous nitrogen, phosphorus, and sulfur sources. Additionally, quantitative validation showed that the L. monocytogenes GEMs were able to generate in silico predictions for growth rate and growth yield that were strongly and significantly (p < 0.0013 and p < 0.0015, respectively) correlated with experimental results. These findings are significant because they show that these GEMs for L. monocytogenes are comparable to published GEMs of other organisms in agreement between in silico predictions and in vitro results.
Therefore, as with the other GEMs, namely those for Escherichia coli, Staphylococcus aureus, Vibrio vulnificus, and Salmonella spp., they can be used to determine new methods of growth control and disease treatment.
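    The qualitative validation above reduces to a simple agreement metric over the carbon sources shared between model and assay. A minimal sketch in Python, using invented compounds and growth calls rather than the study's data:

```python
# Toy agreement check between GEM (in silico) and Biolog-style (in vitro)
# carbon-source utilization calls. Compound names and calls are invented.

def utilization_agreement(predicted, observed):
    """Percent agreement over the carbon sources present in both sets."""
    shared = set(predicted) & set(observed)
    if not shared:
        return 0.0, 0
    matches = sum(predicted[c] == observed[c] for c in shared)
    return 100.0 * matches / len(shared), len(shared)

in_silico = {"glucose": True, "maltose": True, "citrate": False, "xylose": False}
in_vitro = {"glucose": True, "maltose": False, "citrate": False, "xylose": False}

pct, n = utilization_agreement(in_silico, in_vitro)
print(f"{pct:.1f}% agreement over {n} shared carbon sources")
```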

  10. Assessment of cancer and virus antigens for cross-reactivity in human tissues.

    PubMed

    Jaravine, Victor; Raffegerst, Silke; Schendel, Dolores J; Frishman, Dmitrij

    2017-01-01

    Cross-reactivity (CR) or invocation of autoimmune side effects in various tissues has important safety implications in adoptive immunotherapy directed against selected antigens. The ability to predict CR (on-target and off-target toxicities) may help in the early selection of safer therapeutically relevant target antigens. We developed a methodology for the calculation of quantitative CR for any defined peptide epitope. Using this approach, we performed assessment of 4 groups of 283 currently known human MHC-class-I epitopes including differentiation antigens, overexpressed proteins, cancer-testis antigens and mutations displayed by tumor cells. In addition, 89 epitopes originating from viral sources were investigated. The natural occurrence of these epitopes in human tissues was assessed based on proteomics abundance data, while the probability of their presentation by MHC-class-I molecules was modelled by the method of Keşmir et al., which combines proteasomal cleavage, TAP affinity and MHC-binding predictions. The results of these analyses for many previously defined peptides are presented as CR indices and tissue profiles. The methodology thus allows for quantitative comparisons of epitopes and is suggested to be suited for the assessment of epitopes of candidate antigens in an early stage of development of adoptive immunotherapy. Our method is implemented as a Java program, with curated datasets stored in a MySQL database. It predicts all naturally possible self-antigens for a given sequence of a therapeutic antigen (or epitope) and, after filtering for predicted immunogenicity, outputs results as an index and profile of CR to the self-antigens in 22 human tissues. The program is implemented as part of the iCrossR webserver, which is publicly available at http://webclu.bio.wzw.tum.de/icrossr/. Contact: d.frishman@wzw.tum.de. Supplementary data are available at Bioinformatics online.

  11. A Crowdsourcing Approach to Developing and Assessing Prediction Algorithms for AML Prognosis

    PubMed Central

    Noren, David P.; Long, Byron L.; Norel, Raquel; Rrhissorrakrai, Kahn; Hess, Kenneth; Hu, Chenyue Wendy; Bisberg, Alex J.; Schultz, Andre; Engquist, Erik; Liu, Li; Lin, Xihui; Chen, Gregory M.; Xie, Honglei; Hunter, Geoffrey A. M.; Norman, Thea; Friend, Stephen H.; Stolovitzky, Gustavo; Kornblau, Steven; Qutub, Amina A.

    2016-01-01

    Acute Myeloid Leukemia (AML) is a fatal hematological cancer. The genetic abnormalities underlying AML are extremely heterogeneous among patients, making prognosis and treatment selection very difficult. While clinical proteomics data has the potential to improve prognosis accuracy, thus far, the quantitative means to do so have yet to be developed. Here we report the results and insights gained from the DREAM 9 Acute Myeloid Leukemia Outcome Prediction Challenge (AML-OPC), a crowdsourcing effort designed to promote the development of quantitative methods for AML prognosis prediction. We identify the most accurate and robust models in predicting patient response to therapy, remission duration, and overall survival. We further investigate patient response to therapy, a clinically actionable prediction, and find that patients who are classified as resistant to therapy are harder to predict than responsive patients across the 31 models submitted to the challenge. The top two performing models, which held a high sensitivity to these patients, substantially utilized the proteomics data to make predictions. Using these models, we also identify which signaling proteins were useful in predicting patient therapeutic response. PMID:27351836

  12. Mantle rheology and satellite signatures from present-day glacial forcings

    NASA Technical Reports Server (NTRS)

    Sabadini, Roberto; Yuen, David A.; Gasperini, Paolo

    1988-01-01

    Changes in the long-wavelength region of the earth's gravity field resulting from both present-day glacial discharges and the possible growth of the Antarctic ice sheet are considered. Significant differences in the responses between the Maxwell and Burgers body rheologies are found for time spans of less than 100 years. The quantitative model for predicting the secular variations of the gravitational potential, and means for incorporating glacial forcings, are described. Results are given for the excitation of the degree two harmonics. It is suggested that detailed satellite monitoring of present-day ice movements in conjunction with geodetic satellite missions may provide a reasonable alternative for the estimation of deep mantle viscosity.

  13. Health Impacts of Increased Physical Activity from Changes in Transportation Infrastructure: Quantitative Estimates for Three Communities

    PubMed Central

    2015-01-01

    Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7–30.6), 0.6 (0.3–0.9), and 4.7 (2.1–7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches. PMID:26504832

  14. Health Impacts of Increased Physical Activity from Changes in Transportation Infrastructure: Quantitative Estimates for Three Communities.

    PubMed

    Mansfield, Theodore J; MacDonald Gibson, Jacqueline

    2015-01-01

    Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7-30.6), 0.6 (0.3-0.9), and 4.7 (2.1-7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches.
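    The caution about static versus dynamic approaches can be illustrated with a toy calculation; all rates, the relative risk, and the cohort size below are invented, not the study's inputs:

```python
# Illustrative contrast between a static (HEAT-style) and a dynamic
# (DYNAMO-HIA-style) accounting of deaths avoided over a long horizon.
# Toy numbers only: a cohort of 10,000, a 1% baseline annual mortality
# rate, and an invented relative risk of 0.9 under the intervention.

def deaths_avoided_static(pop, base_rate, rr, years):
    # Static: the same population and baseline rate every year.
    return pop * base_rate * (1 - rr) * years

def deaths_avoided_dynamic(pop, base_rate, rr, years):
    # Dynamic: the cohort shrinks as deaths occur, so later-year
    # benefits are smaller than the static model assumes.
    avoided, alive = 0.0, float(pop)
    for _ in range(years):
        avoided += alive * base_rate * (1 - rr)
        alive -= alive * base_rate * rr  # survivors under the intervention
    return avoided

s = deaths_avoided_static(10000, 0.01, 0.9, 40)
d = deaths_avoided_dynamic(10000, 0.01, 0.9, 40)
print(f"static: {s:.0f}, dynamic: {d:.0f}")  # the static total is larger
```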

  15. The prognostic value of sleep patterns in disorders of consciousness in the sub-acute phase.

    PubMed

    Arnaldi, Dario; Terzaghi, Michele; Cremascoli, Riccardo; De Carli, Fabrizio; Maggioni, Giorgio; Pistarini, Caterina; Nobili, Flavio; Moglia, Arrigo; Manni, Raffaele

    2016-02-01

    This study aimed to evaluate, through polysomnographic analysis, the prognostic value of sleep patterns, compared to other prognostic factors, in patients with disorders of consciousness (DOCs) in the sub-acute phase. Twenty-seven patients underwent 24-h polysomnography and clinical evaluation 3.5 ± 2 months after brain injury. Their clinical outcome was assessed 18.5 ± 9.9 months later. Polysomnographic recordings were evaluated using visual and quantitative indexes. A general linear model was applied to identify features able to predict clinical outcome. Clinical status at follow-up was analysed as a function of the baseline clinical status, the interval between brain injury and follow-up evaluation, patient age and gender, the aetiology of the injury, the lesion site, and visual and quantitative sleep indexes. A better clinical outcome was predicted by a visual index indicating the presence of sleep integrity (p=0.0006), a better baseline clinical status (p=0.014), and younger age (p=0.031). Addition of the quantitative sleep index strengthened the prediction. More structured sleep emerged as a valuable predictor of a positive clinical outcome in sub-acute DOC patients, even stronger than established predictors (e.g. age and baseline clinical condition). Both visual and quantitative sleep evaluation could be helpful in predicting clinical outcome in sub-acute DOCs.

  16. Quantitative analysis of essential oils in perfume using multivariate curve resolution combined with comprehensive two-dimensional gas chromatography.

    PubMed

    de Godoy, Luiz Antonio Fonseca; Hantao, Leandro Wang; Pedroso, Marcio Pozzobon; Poppi, Ronei Jesus; Augusto, Fabio

    2011-08-05

    The use of multivariate curve resolution (MCR) to build multivariate quantitative models using data obtained from comprehensive two-dimensional gas chromatography with flame ionization detection (GC×GC-FID) is presented and evaluated. The MCR algorithm presents some important features, such as the second order advantage and the recovery of the instrumental response for each pure component after optimization by an alternating least squares (ALS) procedure. A model to quantify the essential oil of rosemary was built using a calibration set containing only known concentrations of the essential oil and cereal alcohol as solvent. A calibration curve correlating the concentration of the essential oil of rosemary and the instrumental response obtained from the MCR-ALS algorithm was obtained, and this calibration model was applied to predict the concentration of the oil in complex samples (mixtures of the essential oil, pineapple essence and commercial perfume). The values of the root mean square error of prediction (RMSEP) and of the root mean square error of the percentage deviation (RMSPD) obtained were 0.4% (v/v) and 7.2%, respectively. Additionally, a second model was built and used to evaluate the accuracy of the method. A model to quantify the essential oil of lemon grass was built and its concentration was predicted in the validation set and real perfume samples. The RMSEP and RMSPD obtained were 0.5% (v/v) and 6.9%, respectively, and the concentration of the essential oil of lemon grass in perfume agreed with the value reported by the manufacturer. The result indicates that the MCR algorithm is adequate to resolve the target chromatogram from the complex sample and to build multivariate models of GC×GC-FID data.
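    The two reported error metrics follow from their standard chemometric definitions (assumed here, since the abstract does not spell them out); the concentrations below are toy values:

```python
import math

# Sketch of the two calibration error metrics for predicted vs reference
# concentrations (toy % v/v values, not the study's rosemary-oil data).

def rmsep(y_true, y_pred):
    """Root mean square error of prediction, in the units of y."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def rmspd(y_true, y_pred):
    """Root mean square percentage deviation, in percent."""
    return math.sqrt(sum(((t - p) / t) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) * 100

ref = [2.0, 4.0, 6.0, 8.0]    # reference oil concentrations (% v/v)
pred = [2.1, 3.8, 6.3, 7.9]   # model predictions
print(f"RMSEP = {rmsep(ref, pred):.2f} %v/v, RMSPD = {rmspd(ref, pred):.1f} %")
```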

  17. New MYC IHC Classifier Integrating Quantitative Architecture Parameters to Predict MYC Gene Translocation in Diffuse Large B-Cell Lymphoma

    PubMed Central

    Dong, Wei-Feng; Canil, Sarah; Lai, Raymond; Morel, Didier; Swanson, Paul E.; Izevbaye, Iyare

    2018-01-01

    A new automated MYC IHC classifier based on bivariate logistic regression is presented. The predictor relies on image analysis developed with the open-source ImageJ platform. From a histologic section immunostained for MYC protein, 2 dimensionless quantitative variables are extracted: (a) relative distance between nuclei positive for MYC IHC based on a euclidean minimum spanning tree graph and (b) coefficient of variation of the MYC IHC stain intensity among MYC IHC-positive nuclei. Distance between positive nuclei is suggested to inversely correlate with MYC gene rearrangement status, whereas coefficient of variation is suggested to inversely correlate with physiological regulation of MYC protein expression. The bivariate classifier was compared with 2 other MYC IHC classifiers (based on percentage of MYC IHC positive nuclei), all tested on 113 lymphomas including mostly diffuse large B-cell lymphomas with known MYC fluorescent in situ hybridization (FISH) status. The bivariate classifier strongly outperformed the “percentage of MYC IHC-positive nuclei” methods to predict MYC+ FISH status with 100% sensitivity (95% confidence interval, 94-100) associated with 80% specificity. The test is rapidly performed and might at a minimum provide primary IHC screening for MYC gene rearrangement status in diffuse large B-cell lymphomas. Furthermore, as this bivariate classifier actually predicts “permanent overexpressed MYC protein status,” it might identify nontranslocation-related chromosomal anomalies missed by FISH. PMID:27093450
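    A rough sketch of the two image-derived features and their bivariate logistic combination. For brevity the euclidean minimum spanning tree is approximated by mean nearest-neighbour distance, and the logistic weights are invented, so this illustrates the idea rather than reproducing the published classifier:

```python
import math
import statistics

# Toy version of the bivariate MYC IHC features. The EMST statistic is
# simplified to mean nearest-neighbour distance between MYC+ nuclei, and
# the weights w0, w1, w2 are made up for illustration.

def mean_nn_distance(points):
    """Mean distance from each MYC+ nucleus to its nearest neighbour."""
    dists = []
    for i, (x1, y1) in enumerate(points):
        nn = min(math.hypot(x1 - x2, y1 - y2)
                 for j, (x2, y2) in enumerate(points) if j != i)
        dists.append(nn)
    return statistics.mean(dists)

def intensity_cv(intensities):
    """Coefficient of variation of stain intensity among MYC+ nuclei."""
    return statistics.stdev(intensities) / statistics.mean(intensities)

def predict_myc_rearranged(points, intensities, w0=4.0, w1=-1.0, w2=-5.0):
    # Low inter-nucleus distance and low CV both push toward a MYC+ call.
    z = w0 + w1 * mean_nn_distance(points) + w2 * intensity_cv(intensities)
    return 1.0 / (1.0 + math.exp(-z))  # probability-like score

pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
p = predict_myc_rearranged(pts, [100, 105, 98, 102])
print(f"score = {p:.2f}")
```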

  18. Projecting technology change to improve space technology planning and systems management

    NASA Astrophysics Data System (ADS)

    Walk, Steven Robert

    2011-04-01

    Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
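    A core QTF operation is fitting a logistic (Fisher-Pry style) growth curve to historic performance or adoption data. A minimal sketch with synthetic data, assuming the saturation level L is known in advance:

```python
import math

# Fit f(t) = L / (1 + exp(-k * (t - t0))) by the standard
# log-linearisation ln(f / (L - f)) = k*t - k*t0, which is a straight
# line in t. Data points below are synthetic, not from any real forecast.

def fit_logistic(ts, fs, L):
    """Least-squares estimates of growth rate k and midpoint t0."""
    ys = [math.log(f / (L - f)) for f in fs]  # requires 0 < f < L
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    k = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
        sum((t - mt) ** 2 for t in ts)
    t0 = mt - my / k
    return k, t0

# Synthetic adoption levels approaching a saturation level of 100.
years = [0, 2, 4, 6, 8]
level = [10.0, 26.9, 50.0, 73.1, 90.0]
k, t0 = fit_logistic(years, level, L=100.0)
print(f"growth rate k = {k:.2f}/yr, midpoint t0 = {t0:.1f}")
```

In practice the saturation level L is itself estimated or taken from physical limits, and competing diffusion or substitution patterns are compared before projecting forward.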

  19. Machine learning for predicting the response of breast cancer to neoadjuvant chemotherapy

    PubMed Central

    Mani, Subramani; Chen, Yukun; Li, Xia; Arlinghaus, Lori; Chakravarthy, A Bapsi; Abramson, Vandana; Bhave, Sandeep R; Levy, Mia A; Xu, Hua; Yankeelov, Thomas E

    2013-01-01

    Objective To employ machine learning methods to predict the eventual therapeutic response of breast cancer patients after a single cycle of neoadjuvant chemotherapy (NAC). Materials and methods Quantitative dynamic contrast-enhanced MRI and diffusion-weighted MRI data were acquired on 28 patients before and after one cycle of NAC. A total of 118 semiquantitative and quantitative parameters were derived from these data and combined with 11 clinical variables. We used Bayesian logistic regression in combination with feature selection using a machine learning framework for predictive model building. Results The best predictive models using feature selection obtained an area under the curve of 0.86 and an accuracy of 0.86, with a sensitivity of 0.88 and a specificity of 0.82. Discussion With the numerous options for NAC available, development of a method to predict response early in the course of therapy is needed. Unfortunately, by the time most patients are found not to be responding, their disease may no longer be surgically resectable, and this situation could be avoided by the development of techniques to assess response earlier in the treatment regimen. The method outlined here is one possible solution to this important clinical problem. Conclusions Predictive modeling approaches based on machine learning using readily available clinical and quantitative MRI data show promise in distinguishing breast cancer responders from non-responders after the first cycle of NAC. PMID:23616206
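    The reported accuracy, sensitivity, and specificity all derive from a confusion matrix over binary response calls; a toy sketch with invented labels, not the study's patients:

```python
# Confusion-matrix metrics for binary responder / non-responder calls.
# Labels and predictions are toy values for illustration only.

def confusion_metrics(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

truth = [1, 1, 1, 1, 0, 0, 0, 0]   # 1 = responder
preds = [1, 1, 1, 0, 0, 0, 0, 1]
m = confusion_metrics(truth, preds)
print(m)
```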

  20. Solubility prediction of naphthalene in carbon dioxide from crystal microstructure

    NASA Astrophysics Data System (ADS)

    Sang, Jiarong; Jin, Junsu; Mi, Jianguo

    2018-03-01

    Crystals dissolved in solvents are ubiquitous in both natural and artificial systems. Due to the complicated structures and asymmetric interactions between the crystal and solvent, it is difficult to interpret the dissolution mechanism and predict solubility using traditional theories and models. Here we use the classical density functional theory (DFT) to describe the crystal dissolution behavior. As an example, naphthalene dissolved in carbon dioxide (CO2) is considered within the DFT framework. The unit cell dimensions and microstructure of crystalline naphthalene are determined by minimizing the free-energy of the crystal. According to the microstructure, the solubilities of naphthalene in CO2 are predicted based on the equality of naphthalene's chemical potential in crystal and solution phases, and the interfacial structures and free-energies between different crystal planes and solution are determined to investigate the dissolution mechanism at the molecular level. The theoretical predictions are in general agreement with the available experimental data, implying that the present model is quantitatively reliable in describing crystal dissolution.

  1. Scattering of sound by atmospheric turbulence predictions in a refractive shadow zone

    NASA Technical Reports Server (NTRS)

    Mcbride, Walton E.; Bass, Henry E.; Raspet, Richard; Gilbert, Kenneth E.

    1990-01-01

    According to ray theory, regions exist in an upward refracting atmosphere where no sound should be present. Experiments show, however, that appreciable sound levels penetrate these so-called shadow zones. Two mechanisms contribute to sound in the shadow zone: diffraction and turbulent scattering of sound. Diffractive effects can be pronounced at lower frequencies but are small at high frequencies. In the short wavelength limit, then, scattering due to turbulence should be the predominant mechanism involved in producing the sound levels measured in shadow zones. No existing analytical method includes turbulence effects in the prediction of sound pressure levels in upward refractive shadow zones. In order to obtain quantitative average sound pressure level predictions, a numerical simulation of the effect of atmospheric turbulence on sound propagation is performed. The simulation is based on scattering from randomly distributed scattering centers ('turbules'). Sound pressure levels are computed for many realizations of a turbulent atmosphere. Predictions from the numerical simulation are compared with existing theories and experimental data.

  2. A Subject-Specific Kinematic Model to Predict Human Motion in Exoskeleton-Assisted Gait.

    PubMed

    Torricelli, Diego; Cortés, Camilo; Lete, Nerea; Bertelsen, Álvaro; Gonzalez-Vargas, Jose E; Del-Ama, Antonio J; Dimbwadyo, Iris; Moreno, Juan C; Florez, Julian; Pons, Jose L

    2018-01-01

    The relative motion between human and exoskeleton is a crucial factor that has remarkable consequences on the efficiency, reliability and safety of human-robot interaction. Unfortunately, its quantitative assessment has been largely overlooked in the literature. Here, we present a methodology that allows predicting the motion of the human joints from the knowledge of the angular motion of the exoskeleton frame. Our method combines a subject-specific skeletal model with a kinematic model of a lower limb exoskeleton (H2, Technaid), imposing specific kinematic constraints between them. To calibrate the model and validate its ability to predict the relative motion in a subject-specific way, we performed experiments on seven healthy subjects during treadmill walking tasks. We demonstrate a prediction accuracy lower than 3.5° globally, and around 1.5° at the hip level, which represents an improvement of up to 66% compared to the traditional approach assuming no relative motion between the user and the exoskeleton.

  3. A Subject-Specific Kinematic Model to Predict Human Motion in Exoskeleton-Assisted Gait

    PubMed Central

    Torricelli, Diego; Cortés, Camilo; Lete, Nerea; Bertelsen, Álvaro; Gonzalez-Vargas, Jose E.; del-Ama, Antonio J.; Dimbwadyo, Iris; Moreno, Juan C.; Florez, Julian; Pons, Jose L.

    2018-01-01

    The relative motion between human and exoskeleton is a crucial factor that has remarkable consequences on the efficiency, reliability and safety of human-robot interaction. Unfortunately, its quantitative assessment has been largely overlooked in the literature. Here, we present a methodology that allows predicting the motion of the human joints from the knowledge of the angular motion of the exoskeleton frame. Our method combines a subject-specific skeletal model with a kinematic model of a lower limb exoskeleton (H2, Technaid), imposing specific kinematic constraints between them. To calibrate the model and validate its ability to predict the relative motion in a subject-specific way, we performed experiments on seven healthy subjects during treadmill walking tasks. We demonstrate a prediction accuracy lower than 3.5° globally, and around 1.5° at the hip level, which represents an improvement of up to 66% compared to the traditional approach assuming no relative motion between the user and the exoskeleton. PMID:29755336

  4. A double-blinded, prospective study to define antigenemia and quantitative real-time polymerase chain reaction cutoffs to start preemptive therapy in low-risk, seropositive, renal transplanted recipients.

    PubMed

    David-Neto, Elias; Triboni, Ana H K; Paula, Flavio J; Vilas Boas, Lucy S; Machado, Clarisse M; Agena, Fabiana; Latif, Acram Z A; Alencar, Cecília S; Pierrotti, Ligia C; Nahas, William C; Caiaffa-Filho, Helio H; Pannuti, Claudio S

    2014-11-27

    Cytomegalovirus (CMV) disease occurs in 16% to 20% of low-risk, CMV-positive renal transplant recipients. The cutoffs for quantitative real-time polymerase chain reaction (qPCR) or phosphoprotein (pp65) antigenemia (pp65emia) for starting preemptive therapy have not been well established. We measured qPCR and pp65emia weekly from day 7 to day 120 after transplantation, in anti-CMV immunoglobulin G–positive donor and recipient pairs. Patients and physicians were blinded to the test results. Suspicion of CMV disease led to the order of new tests. In asymptomatic viremic patients, the highest pp65emia and qPCR values were used, whereas we considered the last value before diagnosis in those with CMV disease. We collected a total of 1,481 blood samples from 102 adult patients. Seventeen patients developed CMV disease, 54 presented at least one episode of viremia that cleared spontaneously, and 31 never presented viremia. Five patients developed CMV disease after the end of the study period. The median (95% confidence interval) pp65emia and qPCR values were higher before CMV disease than during asymptomatic viremia (6 [9–82] vs. 3 [1–14] cells/10(6) cells; P<0.001 and 3,080 [1,263–15,605] vs. 258 [258–1,679] copies/mL; P=0.008, respectively). The receiver operating characteristic curve showed that pp65emia of 4 cells/10(6) cells or greater had a sensitivity and specificity to predict CMV disease of 69% and 81%, respectively (area, 0.769; P=0.001), with a positive predictive value of 37% and a negative predictive value of 93%. For qPCR 2,000 copies/mL or higher, the positive predictive value and negative predictive value were 57% and 91%, respectively (receiver operating characteristic area, 0.782; P<0.001). With these cutoffs, both methods are appropriate for detecting CMV disease.
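    The cutoff-based screening metrics reported above (sensitivity, specificity, PPV, NPV) can be sketched as follows; the viral loads and disease labels are toy values, not the study's measurements:

```python
# Screening metrics for a 'value >= cutoff' decision rule. The study's
# actual cutoffs were pp65emia >= 4 cells/10^6 cells and qPCR >= 2000
# copies/mL; the data below are invented.

def screening_metrics(values, diseased, cutoff):
    """Sensitivity, specificity, PPV, and NPV for calls at a cutoff."""
    tp = sum(v >= cutoff and d for v, d in zip(values, diseased))
    fp = sum(v >= cutoff and not d for v, d in zip(values, diseased))
    fn = sum(v < cutoff and d for v, d in zip(values, diseased))
    tn = sum(v < cutoff and not d for v, d in zip(values, diseased))
    return {"sens": tp / (tp + fn), "spec": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

qpcr = [150, 900, 2500, 4000, 300, 5000, 1200, 80]   # copies/mL (toy)
disease = [False, False, True, True, False, False, True, False]
print(screening_metrics(qpcr, disease, cutoff=2000))
```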

  5. Prediction of Emergent Heart Failure Death by Semi-Quantitative Triage Risk Stratification

    PubMed Central

    Van Spall, Harriette G. C.; Atzema, Clare; Schull, Michael J.; Newton, Gary E.; Mak, Susanna; Chong, Alice; Tu, Jack V.; Stukel, Thérèse A.; Lee, Douglas S.

    2011-01-01

    Objectives Generic triage risk assessments are widely used in the emergency department (ED), but have not been validated for prediction of short-term risk among patients with acute heart failure (HF). Our objective was to evaluate the Canadian Triage Acuity Scale (CTAS) for prediction of early death among HF patients. Methods We included patients presenting with HF to an ED in Ontario from Apr 2003 to Mar 2007. We used the National Ambulatory Care Reporting System and vital statistics databases to examine care and outcomes. Results Among 68,380 patients (76±12 years, 49.4% men), early mortality was stratified with death rates of 9.9%, 1.9%, 0.9%, and 0.5% at 1-day, and 17.2%, 5.9%, 3.8%, and 2.5% at 7-days, for CTAS 1, 2, 3, and 4–5, respectively. Compared to lower acuity (CTAS 4–5) patients, adjusted odds ratios (aOR) for 1-day death were 1.32 (95%CI; 0.93–1.88; p = 0.12) for CTAS 3, 2.41 (95%CI; 1.71–3.40; p<0.001) for CTAS 2, and highest for CTAS 1: 9.06 (95%CI; 6.28–13.06; p<0.001). Predictors of triage-critical (CTAS 1) status included oxygen saturation <90% (aOR 5.92, 95%CI; 3.09–11.81; p<0.001), respiratory rate >24 breaths/minute (aOR 1.96, 95%CI; 1.05–3.67; p = 0.034), and arrival by paramedic (aOR 3.52, 95%CI; 1.70–8.02; p = 0.001). While age/sex-adjusted CTAS score provided good discrimination for ED (c-statistic = 0.817) and 1-day (c-statistic = 0.724) death, mortality prediction was improved further after accounting for cardiac and non-cardiac co-morbidities (c-statistics 0.882 and 0.810, respectively; both p<0.001). Conclusions A semi-quantitative triage acuity scale assigned at ED presentation and based largely on respiratory factors predicted emergent death among HF patients. PMID:21853068
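    The c-statistic used above to grade discrimination is the fraction of (death, survivor) pairs in which the death received the higher risk score; a minimal sketch with invented scores:

```python
from itertools import product

# Concordance (c-statistic) for a risk score against binary outcomes.
# Ties count half. Scores and outcomes below are toy values.

def c_statistic(scores, events):
    cases = [s for s, e in zip(scores, events) if e]
    controls = [s for s, e in zip(scores, events) if not e]
    conc = sum((c > k) + 0.5 * (c == k) for c, k in product(cases, controls))
    return conc / (len(cases) * len(controls))

risk = [0.9, 0.7, 0.4, 0.4, 0.2, 0.1]   # model risk scores (toy)
death = [1, 1, 0, 1, 0, 0]              # 1 = died
print(f"c = {c_statistic(risk, death):.3f}")
```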

  6. Novel Application of Quantitative Single-Photon Emission Computed Tomography/Computed Tomography to Predict Early Response to Methimazole in Graves' Disease

    PubMed Central

    Kim, Hyun Joo; Bang, Ji-In; Kim, Ji-Young; Moon, Jae Hoon; So, Young

    2017-01-01

    Objective Since Graves' disease (GD) is often resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. Materials and Methods A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of 99mTc-pertechnetate (5 mCi). Association between the time to biochemical euthyroidism after MMI treatment and %uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables, was investigated. Results GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only %uptake remained significant in a multivariate Cox regression analysis (p = 0.034). A %uptake cutoff of 5.0% dichotomized the faster responding versus the slower responding GD patients (p = 0.006). Conclusion A novel parameter of thyroid %uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients. PMID:28458607

  7. Testing a lepton quarticity flavor theory of neutrino oscillations with the DUNE experiment

    NASA Astrophysics Data System (ADS)

    Srivastava, Rahul; Ternes, Christoph A.; Tórtola, Mariam; Valle, José W. F.

    2018-03-01

    Oscillation studies play a central role in elucidating at least some aspects of the flavor problem. Here we examine the status of the predictions of a lepton quarticity flavor theory of neutrino oscillations against the existing global sample of oscillation data. By performing quantitative simulations we also determine the potential of the upcoming DUNE experiment in narrowing down the currently ill-measured oscillation parameters θ23 and δCP. We present the expected improved sensitivity on these parameters for different assumptions.

  8. Scaling properties of multitension domain wall networks

    NASA Astrophysics Data System (ADS)

    Oliveira, M. F.; Martins, C. J. A. P.

    2015-02-01

    We study the asymptotic scaling properties of domain wall networks with three different tensions in various cosmological epochs. We discuss the conditions under which a scale-invariant evolution of the network (which is well established for simpler walls) still applies and also consider the limiting case where defects are locally planar and the curvature is concentrated in the junctions. We present detailed quantitative predictions for scaling densities in various contexts, which should be testable by means of future high-resolution numerical simulations.

  9. Simulation of 2D Granular Hopper Flow

    NASA Astrophysics Data System (ADS)

    Li, Zhusong; Shattuck, Mark

    2012-02-01

    Jamming and intermittent granular flow are big problems in industry, and the vertical hopper is a canonical example of these difficulties. We simulate gravity driven flow and jamming of 2D disks in a vertical hopper and compare with identical companion experiments presented in this session. We measure and compare the flow rate and probability for jamming as a function of particle properties and geometry. We evaluate the ability of the standard Hertz-Mindlin contact model to quantitatively predict the experimental flow.

  10. An Interdisciplinary Approach to Predictive Modeling of Structural Adhesive Bonding. Factors Affecting the Durability of Titanium/Epoxy Bonds.

    DTIC Science & Technology

    1987-10-01

durability test at 80°C, 95% r.h. ... SEM photomicrograph at 1600x of E-8385 film spun-coat from a 2 wt% solution onto a ferrotype plate. ... Theoretical ...TiO2 to the high energy side. While Auger line shapes theoretically yield oxidation state information, stoichiometry conclusions from experimental...the justification for the methods chosen in this work. ... Fadley et al. [37] present a detailed theoretical discussion on quantitative XPS

  11. Convection in a nematic liquid crystal with homeotropic alignment and heated from below

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahlers, G.

    Experimental results for convection in a thin horizontal layer of a homeotropically aligned nematic liquid crystal heated from below and in a vertical magnetic field are presented. A subcritical Hopf bifurcation leads to the convecting state. There is quantitative agreement between the measured and the predicted bifurcation line as a function of magnetic field. The nonlinear state near the bifurcation is one of spatio-temporal chaos which seems to be the result of a zig-zag instability of the straight-roll state.

  12. Genomic Prediction for Quantitative Traits Is Improved by Mapping Variants to Gene Ontology Categories in Drosophila melanogaster

    PubMed Central

    Edwards, Stefan M.; Sørensen, Izel F.; Sarup, Pernille; Mackay, Trudy F. C.; Sørensen, Peter

    2016-01-01

Predicting individual quantitative trait phenotypes from high-resolution genomic polymorphism data is important for personalized medicine in humans, plant and animal breeding, and adaptive evolution. However, this is difficult for populations of unrelated individuals when the number of causal variants is low relative to the total number of polymorphisms and causal variants individually have small effects on the traits. We hypothesized that mapping molecular polymorphisms to genomic features such as genes and their gene ontology categories could increase the accuracy of genomic prediction models. We developed a genomic feature best linear unbiased prediction (GFBLUP) model that implements this strategy and applied it to three quantitative traits (startle response, starvation resistance, and chill coma recovery) in the unrelated, sequenced inbred lines of the Drosophila melanogaster Genetic Reference Panel. Our results indicate that subsetting markers based on genomic features increases the predictive ability relative to the standard genomic best linear unbiased prediction (GBLUP) model. Both models use all markers, but GFBLUP allows differential weighting of the individual genetic marker relationships, whereas GBLUP weights the genetic marker relationships equally. Simulation studies show that it is possible to further increase the accuracy of genomic prediction for complex traits using this model, provided the genomic features are enriched for causal variants. Our GFBLUP model using prior information on genomic features enriched for causal variants can increase the accuracy of genomic predictions in populations of unrelated individuals and provides a formal statistical framework for leveraging and evaluating information across multiple experimental studies to provide novel insights into the genetic architecture of complex traits. PMID:27235308
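The GBLUP/GFBLUP distinction above can be sketched in a few lines: GBLUP builds one genomic relationship matrix from all markers weighted equally, while GFBLUP adds a second relationship matrix built only from feature-linked markers so that subset can receive its own variance component. This is a minimal illustration with synthetic genotypes; the marker indices standing in for a GO category are hypothetical, not the paper's annotations.

```python
# Minimal sketch of GBLUP vs. GFBLUP relationship matrices, assuming
# synthetic 0/1/2 genotype data; the "feature" marker subset is hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n_lines, n_markers = 60, 500

# Genotypes coded as allele counts (0/1/2), column-centered
Z = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)
Z -= Z.mean(axis=0)

# GBLUP: a single relationship matrix from all markers, weighted equally
G = Z @ Z.T / n_markers

# GFBLUP: an extra relationship matrix from a feature subset (e.g. markers
# mapped to one GO category), allowing differential weighting via its own
# variance component
feature_idx = np.arange(50)                 # hypothetical GO-linked markers
Zf = Z[:, feature_idx]
G_feature = Zf @ Zf.T / len(feature_idx)

# Single-component BLUP of genetic values, g_hat = G (G + lam*I)^-1 y,
# where lam is the residual-to-genetic variance ratio
y = rng.normal(size=n_lines)
lam = 1.0
g_hat = G @ np.linalg.solve(G + lam * np.eye(n_lines), y)
```

In the full GFBLUP mixed model, G and G_feature each carry their own variance component estimated from the data; the single-component solve here is only the GBLUP special case.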

  13. A Detailed Data-Driven Network Model of Prefrontal Cortex Reproduces Key Features of In Vivo Activity

    PubMed Central

    Hass, Joachim; Hertäg, Loreen; Durstewitz, Daniel

    2016-01-01

The prefrontal cortex is centrally involved in a wide range of cognitive functions and their impairment in psychiatric disorders. Yet, the computational principles that govern the dynamics of prefrontal neural networks, and link their physiological, biochemical and anatomical properties to cognitive functions, are not well understood. Computational models can help to bridge the gap between these different levels of description, provided they are sufficiently constrained by experimental data and capable of predicting key properties of the intact cortex. Here, we present a detailed network model of the prefrontal cortex, based on a simple computationally efficient single neuron model (simpAdEx), with all parameters derived from in vitro electrophysiological and anatomical data. Without additional tuning, this model could be shown to quantitatively reproduce a wide range of measures from in vivo electrophysiological recordings, to a degree where simulated and experimentally observed activities were statistically indistinguishable. These measures include spike train statistics, membrane potential fluctuations, local field potentials, and the transmission of transient stimulus information across layers. We further demonstrate that model predictions are robust against moderate changes in key parameters, and that synaptic heterogeneity is a crucial ingredient to the quantitative reproduction of in vivo-like electrophysiological behavior. Thus, we have produced a quantitatively, physiologically valid yet computationally efficient PFC network model, which helped to identify key properties underlying spike-time dynamics as observed in vivo, and which can be harnessed for in-depth investigation of the links between physiology and cognition. PMID:27203563

  14. Quantitative imaging features: extension of the oncology medical image database

    NASA Astrophysics Data System (ADS)

    Patel, M. N.; Looney, P. T.; Young, K. C.; Halling-Brown, M. D.

    2015-03-01

Radiological imaging is fundamental within the healthcare industry and has become routinely adopted for diagnosis, disease monitoring and treatment planning. With the advent of digital imaging modalities and the rapid growth in both diagnostic and therapeutic imaging, the ability to harness this large influx of data is of paramount importance. The Oncology Medical Image Database (OMI-DB) was created to provide a centralized, fully annotated dataset for research. The database contains both processed and unprocessed images, associated data, annotations, and, where applicable, expert-determined ground truths describing features of interest. Medical imaging provides the ability to detect and localize many changes that are important to determine whether a disease is present or a therapy is effective by depicting alterations in anatomic, physiologic, biochemical or molecular processes. Quantitative imaging features are sensitive, specific, accurate and reproducible imaging measures of these changes. Here, we describe an extension to the OMI-DB whereby a range of imaging features and descriptors are pre-calculated using a high-throughput approach. The ability to calculate multiple imaging features and data from the acquired images is valuable and facilitates further research applications investigating detection, prognosis, and classification. The resultant data store contains more than 10 million quantitative features as well as features derived from CAD predictions. These data can be used to build predictive models to aid image classification and treatment response assessment, as well as to identify prognostic imaging biomarkers.

  15. Comparison of quantitative and qualitative tests for glucose-6-phosphate dehydrogenase deficiency in the neonatal period.

    PubMed

    Keihanian, F; Basirjafari, S; Darbandi, B; Saeidinia, A; Jafroodi, M; Sharafi, R; Shakiba, M

    2017-06-01

Considering the high prevalence of glucose-6-phosphate dehydrogenase (G6PD) deficiency among newborns, different screening methods have been established in various countries. In this study, we aimed to assess the prevalence of G6PD deficiency among newborns in Rasht, Iran, and compare G6PD activity in cord blood samples using quantitative and qualitative tests. This cross-sectional, prospective study was performed at the five largest hospitals in Rasht, Guilan Province, Iran. The screening tests were performed for all newborns referred to these hospitals. Specimens were characterized in terms of G6PD activity under ultraviolet light, using the kinetic method and the qualitative fluorescent spot test (FST). We also determined the sensitivity, specificity, negative predictive value, and positive predictive value of the qualitative assay. Blood samples were collected from 1474 newborns. Overall, 757 (51.4%) subjects were male. As the findings revealed, 1376 (93.4%) newborns showed normal G6PD activity, while 98 (6.6%) had G6PD deficiency. There was a significant difference in the mean G6PD level between males and females (P = 0.0001). Also, a significant relationship was detected between FST results and the mean values obtained in the quantitative test (P < 0.0001). According to the present study, FST showed acceptable sensitivity and specificity for G6PD activity, although it appeared inefficient for diagnostic purposes in some cases.

  16. Development of quantitative screen for 1550 chemicals with GC-MS.

    PubMed

    Bergmann, Alan J; Points, Gary L; Scott, Richard P; Wilson, Glenn; Anderson, Kim A

    2018-05-01

With hundreds of thousands of chemicals in the environment, effective monitoring requires high-throughput analytical techniques. This paper presents a quantitative screening method for 1550 chemicals based on statistical modeling of responses with identification and integration performed using deconvolution reporting software. The method was evaluated with representative environmental samples. We tested biological extracts, low-density polyethylene, and silicone passive sampling devices spiked with known concentrations of 196 representative chemicals. A multiple linear regression (R² = 0.80) was developed with molecular weight, logP, polar surface area, and fractional ion abundance to predict chemical responses within a factor of 2.5. Linearity beyond the calibration had R² > 0.97 for three orders of magnitude. Median limits of quantitation were estimated to be 201 pg/μL (1.9× standard deviation). The number of detected chemicals and the accuracy of quantitation were similar for environmental samples and standard solutions. To our knowledge, this is the most precise method for the largest number of semi-volatile organic chemicals lacking authentic standards. Accessible instrumentation and software make this method cost effective in quantifying a large, customizable list of chemicals. When paired with silicone wristband passive samplers, this quantitative screen will be very useful for epidemiology where binning of concentrations is common. Graphical abstract A multiple linear regression of chemical responses measured with GC-MS allowed quantitation of 1550 chemicals in samples such as silicone wristbands.
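The response model described above, a multiple linear regression on molecular weight, logP, polar surface area, and fractional ion abundance, can be sketched with ordinary least squares. All descriptor values, responses, and coefficients below are synthetic placeholders, not the paper's calibration data.

```python
# Hedged sketch: OLS regression of (log) instrument response on four molecular
# descriptors, mirroring the predictor set named in the abstract. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 40
# Columns: molecular weight, logP, polar surface area, fractional ion abundance
X = np.column_stack([
    rng.uniform(100, 400, n),   # molecular weight
    rng.uniform(0, 6, n),       # logP
    rng.uniform(0, 80, n),      # polar surface area
    rng.uniform(0.1, 1.0, n),   # fractional ion abundance
])
# Synthetic responses generated from an assumed linear relation
true_coef = np.array([0.004, 0.3, -0.01, 1.5])
y = 2.0 + X @ true_coef

# Fit with an intercept column, then predict
A = np.hstack([np.ones((n, 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
```

In practice the fitted coefficients would be estimated from the 196 spiked calibration chemicals and applied to the remaining chemicals lacking authentic standards.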

  17. NetMHCpan, a Method for Quantitative Predictions of Peptide Binding to Any HLA-A and -B Locus Protein of Known Sequence

    PubMed Central

    Nielsen, Morten; Lundegaard, Claus; Blicher, Thomas; Lamberth, Kasper; Harndahl, Mikkel; Justesen, Sune; Røder, Gustav; Peters, Bjoern; Sette, Alessandro; Lund, Ole; Buus, Søren

    2007-01-01

    Background Binding of peptides to Major Histocompatibility Complex (MHC) molecules is the single most selective step in the recognition of pathogens by the cellular immune system. The human MHC class I system (HLA-I) is extremely polymorphic. The number of registered HLA-I molecules has now surpassed 1500. Characterizing the specificity of each separately would be a major undertaking. Principal Findings Here, we have drawn on a large database of known peptide-HLA-I interactions to develop a bioinformatics method, which takes both peptide and HLA sequence information into account, and generates quantitative predictions of the affinity of any peptide-HLA-I interaction. Prospective experimental validation of peptides predicted to bind to previously untested HLA-I molecules, cross-validation, and retrospective prediction of known HIV immune epitopes and endogenous presented peptides, all successfully validate this method. We further demonstrate that the method can be applied to perform a clustering analysis of MHC specificities and suggest using this clustering to select particularly informative novel MHC molecules for future biochemical and functional analysis. Conclusions Encompassing all HLA molecules, this high-throughput computational method lends itself to epitope searches that are not only genome- and pathogen-wide, but also HLA-wide. Thus, it offers a truly global analysis of immune responses supporting rational development of vaccines and immunotherapy. It also promises to provide new basic insights into HLA structure-function relationships. The method is available at http://www.cbs.dtu.dk/services/NetMHCpan. PMID:17726526

  18. Predictive Model of Systemic Toxicity (SOT)

    EPA Science Inventory

    In an effort to ensure chemical safety in light of regulatory advances away from reliance on animal testing, USEPA and L’Oréal have collaborated to develop a quantitative systemic toxicity prediction model. Prediction of human systemic toxicity has proved difficult and remains a ...

  19. Testing Feedback Models with Nearby Star Forming Regions

    NASA Astrophysics Data System (ADS)

    Doran, E.; Crowther, P.

    2012-12-01

The feedback from massive stars plays a crucial role in the evolution of galaxies. Accurate modelling of this feedback is essential in understanding distant star forming regions. Young, nearby, high-mass (>10⁴ M⊙) clusters such as R136 (in the 30 Doradus region) are ideal test beds for population synthesis since they host large numbers of spatially resolved massive stars at a pre-supernova stage. We present a quantitative comparison of empirical calibrations of radiative and mechanical feedback from individual stars in R136, with instantaneous burst predictions from the popular Starburst99 evolution synthesis code. We find that empirical results exceed predictions by factors of ~3-9, as a result of limiting simulations to an upper limit of 100 M⊙. 100-300 M⊙ stars should be incorporated in population synthesis models for high-mass clusters to bring predictions into close agreement with empirical results.

  20. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
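The statistical fatigue curves described above (a mean strain-cycle power law shifted by a scatter distribution so that failure probability becomes a parameter) can be sketched as follows. The power-law constants and lognormal scatter width are hypothetical illustrations, not the paper's calibrated values.

```python
# Hedged sketch of a statistical strain-cycle curve: classical power-law mean
# life combined with lognormal scatter, so cycles-to-failure can be read off
# at a chosen failure probability. All constants are hypothetical.
import math
from statistics import NormalDist

C, k = 1.0e4, 2.0   # hypothetical power law: mean life N = C * strain^(-k)
sigma_log = 0.4      # hypothetical lognormal scatter width (in ln-cycles)

def cycles_to_failure(strain_pct, failure_prob=0.5):
    """Predicted cycles to failure at a strain amplitude (%) and failure probability."""
    n_mean = C * strain_pct ** (-k)
    # Shift the median curve by the normal quantile of the scatter distribution:
    # small failure probabilities give conservative (shorter) predicted lives.
    z = NormalDist().inv_cdf(failure_prob)
    return n_mean * math.exp(sigma_log * z)
```

Evaluating the curve at a low failure probability (say 1%) rather than at the median is what turns the mean fatigue curve into a reliability-design tool.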

  1. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  2. The use of copula functions for predictive analysis of correlations between extreme storm tides

    NASA Astrophysics Data System (ADS)

    Domino, Krzysztof; Błachowicz, Tomasz; Ciupak, Maurycy

    2014-11-01

In this paper we present a method for the quantitative description of weakly predictable extreme hydrological events at an inland sea. Correlations between variations at individual measuring points were investigated using combined statistical methods. As the main tool for this analysis we used a two-dimensional copula function sensitive to correlated extreme effects. Additionally, a newly proposed methodology, based on Detrended Fluctuation Analysis (DFA) and Anomalous Diffusion (AD), was used for the prediction of negative and positive auto-correlations and the associated optimal choice of copula functions. As a practical example we analysed maximum storm tide data recorded at five spatially separated places on the Baltic Sea. For the analysis we used Gumbel, Clayton, and Frank copula functions and introduced the reversed Clayton copula. The application of our research model is associated with modelling the risk of high storm tides and possible storm flooding.
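Two of the copula families named above can be sketched directly: the Clayton copula has a closed-form conditional inverse, so pairs can be sampled by conditional inversion, and the "reversed" (survival) Clayton is obtained by reflecting both margins. The dependence parameter below is illustrative, not fitted to tide data.

```python
# Minimal sketch of sampling from a Clayton copula and its reversed (survival)
# form by conditional inversion; theta is an illustrative dependence parameter.
import numpy as np

def sample_clayton(n, theta, rng):
    """Draw n pairs (u, v) from a Clayton copula with parameter theta > 0."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    # Invert the conditional distribution C(v | u) = w:
    # v = [u^(-theta) * (w^(-theta/(1+theta)) - 1) + 1]^(-1/theta)
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

def sample_reversed_clayton(n, theta, rng):
    """Reversed Clayton: lower-tail dependence reflected into the upper tail."""
    u, v = sample_clayton(n, theta, rng)
    return 1.0 - u, 1.0 - v

rng = np.random.default_rng(7)
u, v = sample_clayton(5000, theta=2.0, rng=rng)
```

The reversed form matters for storm tides because Clayton concentrates dependence in the lower tail; reflecting it moves the dependence to joint extremes of the upper tail.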

  3. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

The effects of the processing parameters pressure, temperature, and time on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented that predict the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression-molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism by which the plies bond to themselves. Theoretical predictions from reptation theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.

  4. Quantifying Reliability - The Next Step for a Rapidly Maturing PV Industry and China's Role

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah

    2015-10-14

PV customers wish to know how long their PV modules will last, but quantitatively predicting service life is difficult because of the large number of ways that a module can fail, the variability of the use environment, the cost of the testing, and the short product development time, especially when compared with the long desired lifetime. China should play a key role in developing international standards because China manufactures most of the world's PV modules. The presentation will describe the steps that need to be taken to create a service life prediction within the context of a defined bill of materials, process window and use environment. Worldwide standards for cost-effective approaches to service-life predictions will be beneficial to both PV customers and manufacturers since the consequences of premature module failure can be disastrous for both.

  5. Quantitative PET Imaging with Novel HER3 Targeted Peptides Selected by Phage Display to Predict Androgen Independent Prostate Cancer Progression

    DTIC Science & Technology

    2017-08-01

AWARD NUMBER: W81XWH-16-1-0447. TITLE: Quantitative PET Imaging with Novel HER3-Targeted Peptides Selected by Phage Display to Predict Androgen-Independent Prostate Cancer Progression. ...MA 02114. REPORT DATE: August 2017. TYPE OF REPORT: Annual. PREPARED FOR: U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland. 1. Introduction: The subject of this research is the design and testing of a PET imaging agent for the detection and...

  6. Quantitative somatosensory testing of the penis: optimizing the clinical neurological examination.

    PubMed

    Bleustein, Clifford B; Eckholdt, Haftan; Arezzo, Joseph C; Melman, Arnold

    2003-06-01

Quantitative somatosensory testing, including vibration, pressure, spatial perception and thermal thresholds of the penis, has demonstrated neuropathy in patients with a history of erectile dysfunction of all etiologies. We evaluated which measurement of neurological function of the penis was best at predicting erectile dysfunction and examined the impact of location on the penis for quantitative somatosensory testing measurements. A total of 107 patients were evaluated. All patients were required to complete the erectile function domain of the International Index of Erectile Function (IIEF) questionnaire, of whom 24 had no complaints of erectile dysfunction and scored within the "normal" range on the IIEF. Patients were subsequently tested on the ventral middle penile shaft, proximal dorsal midline penile shaft and glans penis (with foreskin retracted) for vibration, pressure, spatial perception, and warm and cold thermal thresholds. Mixed-models repeated-measures analysis of variance controlling for age, diabetes and hypertension revealed that the method of measurement (quantitative somatosensory testing) was predictive of IIEF score (F = 209, df = 4,1315, p < 0.001), while the site of measurement on the penis was not. To determine the best method of measurement, we used hierarchical regression, which revealed that warm temperature was the best predictor of erectile dysfunction, with pseudo-R² = 0.19, p < 0.0007. There was no significant improvement in predicting erectile dysfunction when another test was added. Using 37°C and greater as the warm thermal threshold yielded a sensitivity of 88.5%, specificity of 70.0% and positive predictive value of 85.5%. Quantitative somatosensory testing using warm thermal threshold measurements taken at the glans penis can be used alone to assess the neurological status of the penis. Warm thermal thresholds alone offer a quick, noninvasive, accurate method of evaluating penile neuropathy in an office setting.
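The diagnostic-accuracy quantities reported above (sensitivity, specificity, positive predictive value) follow directly from confusion-matrix counts. A small sketch, with made-up counts in the test rather than the study's data:

```python
# Sketch of standard diagnostic-accuracy metrics from confusion-matrix counts.
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value from counts."""
    sensitivity = tp / (tp + fn)   # fraction of true cases detected
    specificity = tn / (tn + fp)   # fraction of non-cases correctly ruled out
    ppv = tp / (tp + fp)           # fraction of positive calls that are true
    return sensitivity, specificity, ppv
```

Note that, unlike sensitivity and specificity, PPV depends on the prevalence of the condition in the tested population, which is why a threshold validated in one clinic population may perform differently in another.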

  7. The Dokuchaev hypothesis as a basis for predictive digital soil mapping (on the 125th anniversary of its publication)

    NASA Astrophysics Data System (ADS)

    Florinsky, I. V.

    2012-04-01

    Predictive digital soil mapping is widely used in soil science. Its objective is the prediction of the spatial distribution of soil taxonomic units and quantitative soil properties via the analysis of spatially distributed quantitative characteristics of soil-forming factors. Western pedometrists stress the scientific priority and principal importance of Hans Jenny's book (1941) for the emergence and development of predictive soil mapping. In this paper, we demonstrate that Vasily Dokuchaev explicitly defined the central idea and statement of the problem of contemporary predictive soil mapping in the year 1886. Then, we reconstruct the history of the soil formation equation from 1899 to 1941. We argue that Jenny adopted the soil formation equation from Sergey Zakharov, who published it in a well-known fundamental textbook in 1927. It is encouraging that this issue was clarified in 2011, the anniversary year for publications of Dokuchaev and Jenny.

  8. Quantitative Predictive Models for Systemic Toxicity (SOT)

    EPA Science Inventory

    Models to identify systemic and specific target organ toxicity were developed to help transition the field of toxicology towards computational models. By leveraging multiple data sources to incorporate read-across and machine learning approaches, a quantitative model of systemic ...

  9. Distribution of voids in field concrete.

    DOT National Transportation Integrated Search

    1978-01-01

    This study was intended to evaluate the air void characteristics of concrete in an attempt to identify, quantitatively or semi-quantitatively, different types of voids and to predict their influence on strength and durability. At the outset, it was a...

  10. General Platform for Systematic Quantitative Evaluation of Small-Molecule Permeability in Bacteria

    PubMed Central

    2015-01-01

The chemical features that impact small-molecule permeability across bacterial membranes are poorly understood, and the resulting lack of tools to predict permeability presents a major obstacle to the discovery and development of novel antibiotics. Antibacterials are known to have vastly different structural and physicochemical properties compared to non-anti-infective drugs, as illustrated herein by principal component analysis (PCA). To understand how these properties influence bacterial permeability, we have developed a systematic approach to evaluate the penetration of diverse compounds into bacteria with distinct cellular envelopes. Intracellular compound accumulation is quantitated using LC-MS/MS; PCA and Pearson pairwise correlations are then used to identify structural and physicochemical parameters that correlate with accumulation. An initial study using 10 sulfonyladenosines in Escherichia coli, Bacillus subtilis, and Mycobacterium smegmatis has identified nonobvious correlations between chemical structure and permeability that differ among the various bacteria. Effects of cotreatment with efflux pump inhibitors were also investigated. This sets the stage for use of this platform in larger prospective analyses of diverse chemotypes to identify global relationships between chemical structure and bacterial permeability that would enable the development of predictive tools to accelerate antibiotic drug discovery. PMID:25198656
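The analysis pattern described above, PCA of a descriptor matrix plus pairwise Pearson correlations of each descriptor with measured accumulation, can be sketched with synthetic data. The descriptor values and accumulation measurements below are placeholders, not the study's LC-MS/MS data.

```python
# Sketch: PCA via SVD of centered descriptors, plus Pearson correlation of each
# descriptor with a measured accumulation value. All data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_compounds, n_descriptors = 30, 6
X = rng.normal(size=(n_compounds, n_descriptors))          # descriptor matrix
# Synthetic accumulation driven mostly by the first descriptor
accumulation = 0.8 * X[:, 0] + rng.normal(scale=0.3, size=n_compounds)

# PCA via SVD of the centered descriptor matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                       # principal-component scores
explained = S**2 / np.sum(S**2)          # variance explained per component

# Pairwise Pearson correlations of descriptors with accumulation
corrs = np.array([np.corrcoef(X[:, j], accumulation)[0, 1]
                  for j in range(n_descriptors)])
```

Descriptors with high |correlation| would be the candidates for chemotype-specific permeability rules; the PCA scores give the low-dimensional view used to contrast antibacterials with other drug classes.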

  11. Quantitative theory of driven nonlinear brain dynamics.

    PubMed

    Roberts, J A; Robinson, P A

    2012-09-01

Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f_α, subharmonic entrainment near 2f_α, and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory.

  12. Target-based drug discovery for β-globin disorders: drug target prediction using quantitative modeling with hybrid functional Petri nets.

    PubMed

    Mehraei, Mani; Bashirov, Rza; Tüzmen, Şükrü

    2016-10-01

Recent molecular studies provide important clues into treatment of β-thalassemia, sickle-cell anaemia and other β-globin disorders, revealing that increased production of fetal hemoglobin, which is normally suppressed in adulthood, can ameliorate the severity of these diseases. In this paper, we present a novel approach for drug prediction for β-globin disorders. Our approach is centered upon quantitative modeling of interactions in the human fetal-to-adult hemoglobin switch network using hybrid functional Petri nets. In accordance with the reverse pharmacology approach, we pose a hypothesis regarding modulation of specific protein targets that induce γ-globin and consequently fetal hemoglobin. Comparison of simulation results for the proposed strategy with the ones obtained for already existing drugs shows that our strategy is optimal, as it leads to the highest level of γ-globin induction and thereby has potential beneficial therapeutic effects on β-globin disorders. Simulation results enable verification of model coherence, demonstrating that it is consistent with qPCR data available for known strategies and/or drugs.

  13. Bayesian data assimilation provides rapid decision support for vector-borne diseases

    PubMed Central

    Jewell, Chris P.; Brown, Richard G.

    2015-01-01

    Predicting the spread of vector-borne diseases in response to incursions requires knowledge of both host and vector demographics in advance of an outbreak. Although host population data are typically available, for novel disease introductions there is a high chance of the pathogen using a vector for which data are unavailable. This presents a barrier to estimating the parameters of dynamical models representing host–vector–pathogen interaction, and hence limits their ability to provide quantitative risk forecasts. The Theileria orientalis (Ikeda) outbreak in New Zealand cattle demonstrates this problem: even though the vector has received extensive laboratory study, a high degree of uncertainty persists over its national demographic distribution. Addressing this, we develop a Bayesian data assimilation approach whereby indirect observations of vector activity inform a seasonal spatio-temporal risk surface within a stochastic epidemic model. We provide quantitative predictions for the future spread of the epidemic, quantifying uncertainty in the model parameters, case infection times and the disease status of undetected infections. Importantly, we demonstrate how our model learns sequentially as the epidemic unfolds and provide evidence for changing epidemic dynamics through time. Our approach therefore provides a significant advance in rapid decision support for novel vector-borne disease outbreaks. PMID:26136225
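The sequential learning described above can be illustrated in miniature with a conjugate Bayesian update: as each batch of surveillance observations arrives, the posterior from the previous batch becomes the prior for the next. This toy Beta-Binomial example is a generic stand-in for the authors' spatio-temporal data assimilation scheme, not their model.

```python
# Toy illustration of sequential Bayesian updating (Beta-Binomial conjugacy):
# each survey round's counts update the posterior on a detection probability.
def update_beta(alpha, beta, positives, negatives):
    """Posterior Beta parameters after observing new counts."""
    return alpha + positives, beta + negatives

alpha, beta = 1.0, 1.0                 # uninformative Beta(1, 1) prior
batches = [(3, 17), (8, 12), (14, 6)]  # hypothetical (detected, undetected) per round
for pos, neg in batches:
    alpha, beta = update_beta(alpha, beta, pos, neg)
posterior_mean = alpha / (alpha + beta)
```

The key property mirrored here is that uncertainty shrinks as data accumulate, so risk forecasts issued mid-epidemic sharpen round by round.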

  14. Quantitative modeling and optimization of magnetic tweezers.

    PubMed

    Lipfert, Jan; Hao, Xiaomin; Dekker, Nynke H

    2009-06-17

Magnetic tweezers are a powerful tool to manipulate single DNA or RNA molecules and to study nucleic acid-protein interactions in real time. Here, we have modeled the magnetic fields of permanent magnets in magnetic tweezers and computed the forces exerted on superparamagnetic beads from first principles. For simple, symmetric geometries the magnetic fields can be calculated semianalytically using the Biot-Savart law. For complicated geometries and in the presence of an iron yoke, we employ a finite-element three-dimensional PDE solver to numerically solve the magnetostatic problem. The theoretical predictions are in quantitative agreement with direct Hall-probe measurements of the magnetic field and with measurements of the force exerted on DNA-tethered beads. Using these predictive theories, we systematically explore the effects of magnet alignment, magnet spacing, magnet size, and of adding an iron yoke to the magnets on the forces that can be exerted on tethered particles. We find that the optimal configuration for maximal stretching forces is a vertically aligned pair of magnets, with a minimal gap between the magnets and minimal flow cell thickness. Following these principles, we present a configuration that allows one to apply ≥40 pN stretching forces on ≈1-μm tethered beads.
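The semianalytic Biot-Savart approach mentioned above can be sketched for the simplest geometry: numerically integrating the Biot-Savart law around a circular current loop and checking the on-axis field against the closed form B_z = μ₀ I R² / (2 (R² + z²)^(3/2)). The loop geometry and current below are illustrative values, not the tweezers' actual magnet configuration.

```python
# Sketch of a first-principles field calculation: Biot-Savart integration for a
# circular current loop, validated against the analytic on-axis result.
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T·m/A)

def loop_field_on_axis(current, radius, z, segments=2000):
    """B_z at height z above the center of a current loop, via Biot-Savart."""
    phi = np.linspace(0.0, 2.0 * np.pi, segments, endpoint=False)
    dphi = 2.0 * np.pi / segments
    # Segment positions and current-direction elements dl around the loop
    pos = np.column_stack([radius * np.cos(phi), radius * np.sin(phi),
                           np.zeros_like(phi)])
    dl = np.column_stack([-radius * np.sin(phi), radius * np.cos(phi),
                          np.zeros_like(phi)]) * dphi
    r_vec = np.array([0.0, 0.0, z]) - pos
    r_mag = np.linalg.norm(r_vec, axis=1)
    # dB = (mu0 I / 4 pi) * dl x r / |r|^3, summed over segments
    dB = MU0 * current / (4.0 * np.pi) * np.cross(dl, r_vec) / r_mag[:, None] ** 3
    return dB.sum(axis=0)[2]

def loop_field_analytic(current, radius, z):
    """Closed-form on-axis field of a circular current loop."""
    return MU0 * current * radius**2 / (2.0 * (radius**2 + z**2) ** 1.5)
```

The same discretized Biot-Savart sum generalizes off-axis and to stacked loops approximating a permanent magnet, which is where the numerical route earns its keep.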

  15. Quantitative Modeling and Optimization of Magnetic Tweezers

    PubMed Central

    Lipfert, Jan; Hao, Xiaomin; Dekker, Nynke H.

    2009-01-01

    Magnetic tweezers are a powerful tool to manipulate single DNA or RNA molecules and to study nucleic acid-protein interactions in real time. Here, we have modeled the magnetic fields of permanent magnets in magnetic tweezers and computed the forces exerted on superparamagnetic beads from first principles. For simple, symmetric geometries the magnetic fields can be calculated semianalytically using the Biot-Savart law. For complicated geometries and in the presence of an iron yoke, we employ a finite-element three-dimensional PDE solver to numerically solve the magnetostatic problem. The theoretical predictions are in quantitative agreement with direct Hall-probe measurements of the magnetic field and with measurements of the force exerted on DNA-tethered beads. Using these predictive theories, we systematically explore the effects of magnet alignment, magnet spacing, magnet size, and of adding an iron yoke to the magnets on the forces that can be exerted on tethered particles. We find that the optimal configuration for maximal stretching forces is a vertically aligned pair of magnets, with a minimal gap between the magnets and minimal flow cell thickness. Following these principles, we present a configuration that allows one to apply ≥40 pN stretching forces on ≈1-μm tethered beads. PMID:19527664

  16. About the National Forecast Chart

    Science.gov Websites

    Describes the National Forecast Chart: general weather coverage and weather type drawn from WPC Quantitative Precipitation Forecasts and the NWS NDFD. Weather Prediction Center, 5830 University Research Court, College Park, Maryland 20740.

  17. Conventional liquid chromatography/triple quadrupole mass spectrometer-based metabolite identification and semi-quantitative estimation approach in the investigation of dabigatran etexilate in vitro metabolism

    PubMed Central

    Hu, Zhe-Yi; Parker, Robert B.; Herring, Vanessa L.; Laizure, S. Casey

    2012-01-01

    Dabigatran etexilate (DABE) is an oral prodrug that is rapidly converted by esterases to dabigatran (DAB), a direct inhibitor of thrombin. To elucidate the esterase-mediated metabolic pathway of DABE, a high-performance liquid chromatography/tandem mass spectrometry (LC-MS/MS)-based metabolite identification and semi-quantitative estimation approach was developed. To overcome the poor full-scan sensitivity of conventional triple quadrupole mass spectrometry, precursor-product ion pairs were predicted to search for potential in vitro metabolites. The detected metabolites were confirmed by the product ion scan. A dilution method was introduced to evaluate the matrix effects of tentatively identified metabolites without chemical standards. Quantitative information on detected metabolites was obtained using ‘metabolite standards’ generated from incubation samples containing a high concentration of metabolite, in combination with a correction factor for mass spectrometry response. Two in vitro metabolites of DABE (M1 and M2) were identified and quantified by the semi-quantitative estimation approach. Notably, CES1 converts DABE to M1, while CES2 mediates the conversion of DABE to M2; M1 (or M2) is further metabolized to DAB by CES2 (or CES1). The approach presented here addresses a bioanalytical need for fast identification and semi-quantitative estimation of CES metabolites in preclinical samples. PMID:23239178

  18. A priori Prediction of Neoadjuvant Chemotherapy Response and Survival in Breast Cancer Patients using Quantitative Ultrasound

    PubMed Central

    Tadayyon, Hadi; Sannachi, Lakshmanan; Gangeh, Mehrdad J.; Kim, Christina; Ghandi, Sonal; Trudeau, Maureen; Pritchard, Kathleen; Tran, William T.; Slodkowska, Elzbieta; Sadeghi-Naini, Ali; Czarnota, Gregory J.

    2017-01-01

    Quantitative ultrasound (QUS) can probe tissue structure and analyze tumour characteristics. Using a 6-MHz ultrasound system, radiofrequency data were acquired from 56 locally advanced breast cancer patients prior to their neoadjuvant chemotherapy (NAC) and QUS texture features were computed from regions of interest in tumour cores and their margins as potential predictive and prognostic indicators. Breast tumour molecular features were also collected and used for analysis. A multiparametric QUS model was constructed, which demonstrated a response prediction accuracy of 88% and ability to predict patient 5-year survival rates (p = 0.01). QUS features demonstrated superior performance in comparison to molecular markers and the combination of QUS and molecular markers did not improve response prediction. This study demonstrates, for the first time, that non-invasive QUS features in the core and margin of breast tumours can indicate breast cancer response to neoadjuvant chemotherapy (NAC) and predict five-year recurrence-free survival. PMID:28401902

  19. A priori Prediction of Neoadjuvant Chemotherapy Response and Survival in Breast Cancer Patients using Quantitative Ultrasound.

    PubMed

    Tadayyon, Hadi; Sannachi, Lakshmanan; Gangeh, Mehrdad J; Kim, Christina; Ghandi, Sonal; Trudeau, Maureen; Pritchard, Kathleen; Tran, William T; Slodkowska, Elzbieta; Sadeghi-Naini, Ali; Czarnota, Gregory J

    2017-04-12

    Quantitative ultrasound (QUS) can probe tissue structure and analyze tumour characteristics. Using a 6-MHz ultrasound system, radiofrequency data were acquired from 56 locally advanced breast cancer patients prior to their neoadjuvant chemotherapy (NAC) and QUS texture features were computed from regions of interest in tumour cores and their margins as potential predictive and prognostic indicators. Breast tumour molecular features were also collected and used for analysis. A multiparametric QUS model was constructed, which demonstrated a response prediction accuracy of 88% and ability to predict patient 5-year survival rates (p = 0.01). QUS features demonstrated superior performance in comparison to molecular markers and the combination of QUS and molecular markers did not improve response prediction. This study demonstrates, for the first time, that non-invasive QUS features in the core and margin of breast tumours can indicate breast cancer response to neoadjuvant chemotherapy (NAC) and predict five-year recurrence-free survival.

  20. Cognitive Predictors of Achievement Growth in Mathematics: A Five Year Longitudinal Study

    PubMed Central

    Geary, David C.

    2011-01-01

    The study's goal was to identify the quantitative competencies, in place at the beginning of first grade, that predict the starting point and growth of mathematics achievement through fifth grade. Measures of number, counting, and arithmetic competencies were administered in early first grade and used to predict mathematics achievement through fifth grade (n = 177), while controlling for intelligence, working memory, and processing speed. Multilevel models revealed that intelligence, processing speed, and the central executive component of working memory predicted achievement or achievement growth in mathematics and, as a contrast domain, word reading. The phonological loop was uniquely predictive of word reading and the visuospatial sketch pad of mathematics. Early fluency in processing and manipulating numerical set size and Arabic numerals, accurate use of sophisticated counting procedures for solving addition problems, and accuracy in making placements on a mathematical number line were uniquely predictive of mathematics achievement. Use of memory-based processes to solve addition problems predicted mathematics and reading achievement, but in different ways. The results identify the early quantitative competencies that uniquely contribute to mathematics learning. PMID:21942667

  1. Statistical Mechanics of Viral Entry

    NASA Astrophysics Data System (ADS)

    Zhang, Yaojun; Dudko, Olga K.

    2015-01-01

    Viruses that have lipid-membrane envelopes infect cells by fusing with the cell membrane to release viral genes. Membrane fusion is known to be hindered by high kinetic barriers associated with drastic structural rearrangements—yet viral infection, which occurs by fusion, proceeds on remarkably short time scales. Here, we present a quantitative framework that captures the principles behind the invasion strategy shared by all enveloped viruses. The key to this strategy—ligand-triggered conformational changes in the viral proteins that pull the membranes together—is treated as a set of concurrent, bias field-induced activated rate processes. The framework results in analytical solutions for experimentally measurable characteristics of virus-cell fusion and enables us to express the efficiency of the viral strategy in quantitative terms. The predictive value of the theory is validated through simulations and illustrated through recent experimental data on influenza virus infection.
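Treating the ligand-triggered conformational changes as concurrent activated rate processes can be sketched as order statistics of exponential clocks: if fusion requires k of N fusion proteins to fire, the mean fusion time has a simple analytic form that Monte Carlo sampling reproduces. The rates and counts below are illustrative assumptions, not the paper's parameters.

```python
import random

def mean_fusion_time(n, k, rate):
    """Analytic mean of the k-th order statistic of n i.i.d. exponential clocks."""
    return sum(1.0 / (rate * (n - i)) for i in range(k))

def simulate_fusion_time(n, k, rate, rng):
    """Draw n exponential completion times; fusion occurs at the k-th."""
    times = sorted(rng.expovariate(rate) for _ in range(n))
    return times[k - 1]

rng = random.Random(1)
n, k, rate = 6, 3, 2.0           # 3 of 6 proteins must switch (invented)
analytic = mean_fusion_time(n, k, rate)
mc = sum(simulate_fusion_time(n, k, rate, rng) for _ in range(20000)) / 20000
print(round(analytic, 3), round(mc, 3))
```

Raising the bias field in the paper's framework corresponds to raising `rate`, which shortens the fusion time in inverse proportion.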

  2. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and entails overwhelming related healthcare costs. In recent work, we have developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over purely ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and the bone's mechanical status from the characteristics of ultrasound propagation. These characteristics include both the Speed of Sound (SOS) and Broadband Ultrasonic Attenuation (BUA). To test the method's quantitative predictions, we constructed a combined ultrasound and photoacoustic setup. Here, we present this dual-modality system and experimentally compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into bone structure and functional status.

  3. Modelling oxygen transfer using dynamic alpha factors.

    PubMed

    Jiang, Lu-Man; Garrido-Baserba, Manel; Nolasco, Daniel; Al-Omari, Ahmed; DeClippeleir, Haydee; Murthy, Sudhir; Rosso, Diego

    2017-11-01

    Due to the importance of wastewater aeration in meeting treatment requirements and due to its elevated energy intensity, it is important to describe the real nature of an aeration system to improve design and specification, performance prediction, energy consumption, and process sustainability. Because organic loadings drive aeration efficiency to its lowest value when the oxygen demand (energy) is the highest, the implications of considering their dynamic nature on energy costs are of utmost importance. A dynamic model aimed at identifying conservation opportunities is presented. The model developed describes the correlation between the COD concentration and the α factor in activated sludge. Using the proposed model, the aeration efficiency is calculated as a function of the organic loading (i.e. COD). This results in predictions of oxygen transfer values that are more realistic than the traditional method of assuming constant α values. The model was applied to two water resource recovery facilities, and was calibrated and validated with time-sensitive databases. Our improved aeration model structure increases the quality of prediction of field data through the recognition of the dynamic nature of the alpha factor (α) as a function of the applied oxygen demand. For the cases presented herein, the model prediction of airflow improved by 20-35% when dynamic α is used. The proposed model offers a quantitative tool for the prediction of energy demand and for minimizing aeration design uncertainty.
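A minimal sketch of the dynamic-α idea: let α decay as the applied COD load rises, then back out the airflow needed to meet a fixed oxygen demand. The exponential form and every constant below are assumptions for illustration, not the paper's calibrated model.

```python
import math

def alpha_factor(cod, alpha_min=0.3, alpha_max=0.65, k=0.01):
    """Assumed exponential decay of the alpha factor with COD (mg/L)."""
    return alpha_min + (alpha_max - alpha_min) * math.exp(-k * cod)

def required_airflow(oxygen_demand, cod, sote_clean=0.35):
    """Airflow proportional to demand / (alpha * clean-water transfer efficiency)."""
    return oxygen_demand / (alpha_factor(cod) * sote_clean)

# Diurnal snapshot: same oxygen demand, rising COD -> more air needed.
for cod in (50, 200, 400):
    print(cod, round(required_airflow(100.0, cod), 1))
```

Holding α constant would miss exactly this effect: the air requirement peaks when the load peaks, which is when α is at its lowest.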

  4. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  5. Predicting the activity of drugs for a group of imidazopyridine anticoccidial compounds.

    PubMed

    Si, Hongzong; Lian, Ning; Yuan, Shuping; Fu, Aiping; Duan, Yun-Bo; Zhang, Kejun; Yao, Xiaojun

    2009-10-01

    Gene expression programming (GEP) is a novel machine learning technique. GEP is used to build a nonlinear quantitative structure-activity relationship model for the prediction of IC(50) values for the imidazopyridine anticoccidial compounds. This model is based on descriptors calculated from the molecular structure. Four descriptors were selected from the descriptor pool by the heuristic method (HM) to build a multivariable linear model. The GEP method produced a nonlinear quantitative model with a correlation coefficient and a mean error of 0.96 and 0.24 for the training set, and 0.91 and 0.52 for the test set, respectively. It is shown that the GEP-predicted results are in good agreement with the experimental ones.
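The heuristic descriptor-selection step can be sketched as ranking candidate descriptors by their correlation with activity before fitting a linear model. The descriptor names and pIC50-style values below are invented for illustration.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Invented descriptor pool for five hypothetical compounds:
descriptors = {
    "logP": [1.2, 2.3, 3.1, 4.0, 5.2],
    "TPSA": [90, 75, 60, 52, 40],
    "MW":   [310, 295, 330, 402, 288],
}
activity = [4.1, 4.9, 5.6, 6.2, 7.0]   # invented pIC50 values

# Heuristic step: rank descriptors by |correlation| with activity.
ranked = sorted(descriptors,
                key=lambda d: abs(pearson(descriptors[d], activity)),
                reverse=True)
best = ranked[0]
print(best, round(pearson(descriptors[best], activity), 3))
```

A real HM run would also screen for missing values and multi-collinearity before the final multivariable fit.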

  6. Predicting epidermal growth factor receptor gene amplification status in glioblastoma multiforme by quantitative enhancement and necrosis features deriving from conventional magnetic resonance imaging.

    PubMed

    Dong, Fei; Zeng, Qiang; Jiang, Biao; Yu, Xinfeng; Wang, Weiwei; Xu, Jingjing; Yu, Jinna; Li, Qian; Zhang, Minming

    2018-05-01

    To study whether some of the quantitative enhancement and necrosis features in preoperative conventional MRI (cMRI) have predictive value for epidermal growth factor receptor (EGFR) gene amplification status in glioblastoma multiforme (GBM). Fifty-five patients with pathologically determined GBMs who underwent cMRI were retrospectively reviewed. The following cMRI features were quantitatively measured and recorded: long and short diameters of the enhanced portion (LDE and SDE), maximum and minimum thickness of the enhanced portion (MaxTE and MinTE), and long and short diameters of the necrotic portion (LDN and SDN). Univariate analysis of each feature and a decision tree model fed with all the features were performed. The area under the receiver operating characteristic (ROC) curve (AUC) was used to assess the performance of individual features, and predictive accuracy was used to assess the performance of the model. Among single features, MinTE showed the best performance in differentiating EGFR-amplification-negative (wild-type, nEGFR) GBM from EGFR-amplification-positive (pEGFR) GBM, achieving an AUC of 0.68 with a cut-off value of 2.6 mm. The decision tree model included the two features MinTE and SDN and achieved an accuracy of 0.83 on the validation dataset. Our results suggest that quantitative measurement of the features MinTE and SDN in preoperative cMRI has high accuracy for predicting EGFR gene amplification status in GBM.

  7. A prospective study of a quantitative PCR ELISA assay for the diagnosis of CMV pneumonia in lung and heart-transplant recipients.

    PubMed

    Barber, L; Egan, J J; Lomax, J; Haider, Y; Yonan, N; Woodcock, A A; Turner, A J; Fox, A J

    2000-08-01

    Qualitative polymerase chain reaction (PCR) for the identification of cytomegalovirus (CMV) infection has a low predictive value for the identification of CMV pneumonia. This study prospectively evaluated the application of a quantitative PCR enzyme-linked immunosorbent assay (ELISA) in 9 lung- and 18 heart-transplant recipients who did not receive ganciclovir prophylaxis. DNA was collected from peripheral blood polymorphonuclear leucocytes (PMNL) posttransplantation. Oligonucleotide primers for the glycoprotein B gene (149 bp) were used in a PCR ELISA assay with an internal standard for quantitation. CMV disease was defined as histological evidence of end-organ damage. The median level of CMV genome equivalents in patients with CMV disease was 2665/2 × 10^5 PMNL (range 1,200 to 61,606), compared with 100/2 × 10^5 PMNL (range 20 to 855) in patients with infection but no CMV disease (p = 0.036). All patients with CMV disease had genome-equivalent levels of >1,200/2 × 10^5 PMNL. A cut-off level of 1,200/2 × 10^5 PMNL had a positive predictive value for CMV disease of 100% and a negative predictive value of 100%. The first detection of CMV genome-equivalent levels above 1,200/2 × 10^5 PMNL occurred at a median of 58 days (range 47 to 147) posttransplant. Quantitative PCR assays for the diagnosis of CMV infection may predict patients at risk of CMV disease and thereby direct preemptive treatment to high-risk patients.
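The diagnostic value of such a cut-off can be sketched by tabulating a threshold rule against disease status and computing the predictive values; the viral-load/disease pairs below are invented, not the study's data.

```python
def predictive_values(tp, fp, tn, fn):
    """Positive/negative predictive value, sensitivity, specificity."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return ppv, npv, sens, spec

# (genome equivalents per 2 x 10^5 PMNL, has CMV disease) - invented pairs
viral_loads = [(1500, True), (2800, True), (900, False), (400, False), (1300, False)]
cutoff = 1200

tp = sum(1 for v, d in viral_loads if v > cutoff and d)
fp = sum(1 for v, d in viral_loads if v > cutoff and not d)
tn = sum(1 for v, d in viral_loads if v <= cutoff and not d)
fn = sum(1 for v, d in viral_loads if v <= cutoff and d)
print(predictive_values(tp, fp, tn, fn))
```

With real cohort counts, the same four-cell tabulation yields the study's reported 100% PPV and NPV at the 1,200 cut-off.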

  8. Quantitative structure-activity relationship (QSAR) for insecticides: development of predictive in vivo insecticide activity models.

    PubMed

    Naik, P K; Singh, T; Singh, H

    2009-07-01

    Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors including topological, spatial, thermodynamic, information content, lead likeness and E-state indices were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing value, zero value, simple correlation and multi-collinearity tests as well as the use of a genetic algorithm allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both organophosphate and carbamate groups revealed good predictability with r(2) values of 0.949 and 0.838 as well as [image omitted] values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data with r(2) of 0.871 and 0.788 for both the organophosphate and carbamate groups, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully from external validation criteria. QSAR models developed in this study should help further design of novel potent insecticides.

  9. An augmented classical least squares method for quantitative Raman spectral analysis against component information loss.

    PubMed

    Zhou, Yan; Cao, Hui

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. The Raman spectral signals with low analyte concentration correlations were selected and used as substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment of analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was used to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and the existing methods. Results indicated that the proposed method is effective at increasing the robust predictive power of the traditional CLS model against component information loss, and that its predictive power is comparable to that of PLS or PCR.
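The CLS core that ACLS builds on can be sketched in a few lines: with pure-component spectra in the rows of K, a mixture spectrum a = K^T c is inverted by least squares; ACLS would additionally augment K with selected spectral signals standing in for unknown components. All numbers below are invented.

```python
def solve2(m, v):
    """Solve a 2x2 linear system m @ x = v by Cramer's rule."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return ((v[0] * m[1][1] - m[0][1] * v[1]) / det,
            (m[0][0] * v[1] - v[0] * m[1][0]) / det)

K = [[1.0, 0.5, 0.2],   # pure spectrum of analyte 1 at 3 Raman shifts (invented)
     [0.1, 0.8, 0.6]]   # pure spectrum of analyte 2 (invented)
c_true = (0.3, 0.7)
mixture = [c_true[0] * K[0][j] + c_true[1] * K[1][j] for j in range(3)]

# Normal equations for least squares: (K K^T) c = K a
G = [[sum(K[i][j] * K[k][j] for j in range(3)) for k in range(2)] for i in range(2)]
rhs = [sum(K[i][j] * mixture[j] for j in range(3)) for i in range(2)]
c_est = solve2(G, rhs)
print([round(c, 3) for c in c_est])
```

When a real component is missing from K, this plain CLS solve degrades; the paper's augmentation step is designed to absorb exactly that lost information.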

  10. Molecular surface area based predictive models for the adsorption and diffusion of disperse dyes in polylactic acid matrix.

    PubMed

    Xu, Suxin; Chen, Jiangang; Wang, Bijia; Yang, Yiqi

    2015-11-15

    Two predictive models are presented for the adsorption affinities and diffusion coefficients of disperse dyes in a polylactic acid matrix. A quantitative structure-sorption behavior relationship not only provides insight into the sorption process, but also enables rational engineering for desired properties. The thermodynamic and kinetic parameters for three disperse dyes were measured. The predictive model for adsorption affinity was based on two linear relationships derived by interpreting the experimental measurements with molecular structural parameters and the compensation effect: ΔH° vs. dye size and ΔS° vs. ΔH°. Similarly, the predictive model for diffusion coefficient was based on two derived linear relationships: activation energy of diffusion vs. dye size and logarithm of the pre-exponential factor vs. activation energy of diffusion. The only required parameters for both models are temperature and the solvent accessible surface area of the dye molecule. These two predictive models were validated by testing the adsorption and diffusion properties of new disperse dyes, and offer fairly good predictive ability. The linkage between structural parameters of disperse dyes and sorption behaviors might be generalized and extended to other similar polymer-penetrant systems.

  11. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing a non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
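The time-series step can be sketched with an AR(1) fit (a minimal stand-in for the ARMA fit used in the paper) to a service's historical response times, producing the one-step forecast that the Petri net model would consume as updated firing rates. The series below is invented.

```python
import statistics

def fit_ar1(series):
    """Least-squares fit of y_t = const + phi * y_{t-1}."""
    xs, ys = series[:-1], series[1:]
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    phi = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
           / sum((x - mx) ** 2 for x in xs))
    const = my - phi * mx
    return const, phi

def forecast(series, const, phi):
    """One-step-ahead AR(1) forecast."""
    return const + phi * series[-1]

rt = [120, 135, 128, 150, 142, 160, 155, 171]  # response times in ms (invented)
const, phi = fit_ar1(rt)
next_rt = forecast(rt, const, phi)
print(round(phi, 3), round(next_rt, 1))
```

The forecast response time converts directly to a predicted firing rate (its reciprocal), which is what the NMSPN consumes.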

  12. Prediction of Hip Failure Load: In Vitro Study of 80 Femurs Using Three Imaging Methods and Finite Element Models-The European Fracture Study (EFFECT).

    PubMed

    Pottecher, Pierre; Engelke, Klaus; Duchemin, Laure; Museyko, Oleg; Moser, Thomas; Mitton, David; Vicaut, Eric; Adams, Judith; Skalli, Wafa; Laredo, Jean Denis; Bousson, Valérie

    2016-09-01

    Purpose To evaluate the performance of three imaging methods (radiography, dual-energy x-ray absorptiometry [DXA], and quantitative computed tomography [CT]) and that of a numerical analysis with finite element modeling (FEM) in the prediction of failure load of the proximal femur and to identify the best densitometric or geometric predictors of hip failure load. Materials and Methods Institutional review board approval was obtained. A total of 40 pairs of excised cadaver femurs (mean patient age at time of death, 82 years ± 12 [standard deviation]) were examined with (a) radiography to measure geometric parameters (lengths, angles, and cortical thicknesses), (b) DXA (reference standard) to determine areal bone mineral densities (BMDs), and (c) quantitative CT with dedicated three-dimensional analysis software to determine volumetric BMDs and geometric parameters (neck axis length, cortical thicknesses, volumes, and moments of inertia), and (d) quantitative CT-based FEM to calculate a numerical value of failure load. The 80 femurs were fractured via mechanical testing, with random assignment of one femur from each pair to the single-limb stance configuration (hereafter, stance configuration) and assignment of the paired femur to the sideways fall configuration (hereafter, side configuration). Descriptive statistics, univariate correlations, and stepwise regression models were obtained for each imaging method and for FEM to enable us to predict failure load in both configurations. Results Statistics reported are for stance and side configurations, respectively. For radiography, the strongest correlation with mechanical failure load was obtained by using a geometric parameter combined with a cortical thickness (r(2) = 0.66, P < .001; r(2) = 0.65, P < .001). For DXA, the strongest correlation with mechanical failure load was obtained by using total BMD (r(2) = 0.73, P < .001) and trochanteric BMD (r(2) = 0.80, P < .001). 
For quantitative CT, in both configurations, the best model combined volumetric BMD and a moment of inertia (r(2) = 0.78, P < .001; r(2) = 0.85, P < .001). FEM explained 87% (P < .001) and 83% (P < .001) of bone strength, respectively. By combining (a) radiography and DXA and (b) quantitative CT and DXA, correlations with mechanical failure load increased to 0.82 (P < .001) and 0.84 (P < .001), respectively, for radiography and DXA, and to 0.80 (P < .001) and 0.86 (P < .001), respectively, for quantitative CT and DXA. Conclusion Quantitative CT-based FEM was the best method with which to predict the experimental failure load; however, combining quantitative CT and DXA yielded a performance as good as that attained with FEM. The quantitative CT-DXA combination may be easier to use in fracture prediction, provided standardized software is developed. These findings also highlight the major influence on femoral failure load, particularly in the trochanteric region, of a densitometric parameter combined with a geometric parameter. © RSNA, 2016. Online supplemental material is available for this article.

  13. Development of a 3D finite element acoustic model to predict the sound reduction index of stud based double-leaf walls

    NASA Astrophysics Data System (ADS)

    Arjunan, A.; Wang, C. J.; Yahiaoui, K.; Mynors, D. J.; Morgan, T.; Nguyen, V. B.; English, M.

    2014-11-01

    Building standards incorporating quantitative acoustical criteria to ensure adequate sound insulation are now being implemented. Engineers are making great efforts to design acoustically efficient double-wall structures. Accordingly, efficient simulation models to predict the acoustic insulation of double-leaf wall structures are needed. This paper presents the development of a numerical tool that can predict the frequency-dependent sound reduction index R of stud-based double-leaf walls across the one-third-octave band frequency range. A fully vibro-acoustic 3D model is presented, consisting of two rooms partitioned by a double-leaf wall and accounting for the coupling between the structure and the acoustic fluid using the existing fluid and structural solvers. The validity of the finite element (FE) model is assessed by comparison with experimental test results carried out in a certified laboratory. Accurate representation of the structural damping matrix to effectively predict the R values is studied. The possibility of minimising the simulation time using a frequency-dependent mesh model was also investigated. The FEA model presented in this work is capable of predicting the weighted sound reduction index Rw, along with the A-weighted pink noise correction C and the A-weighted urban noise correction Ctr, within an error of 1 dB. The model developed can also be used to analyse the acoustically induced frequency-dependent geometrical behaviour of the double-leaf wall components to optimise them for best acoustic performance. The FE modelling procedure reported in this paper can be extended to other building components undergoing fluid-structure interaction (FSI) to evaluate their acoustic insulation.
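As a cheap analytic reference point for what the FE model predicts per band, the field-incidence mass law for a single leaf, R ≈ 20 log10(m f) - 47 dB (m: surface mass in kg/m^2, f: frequency in Hz), can be evaluated across band centres. A stud double-leaf wall deviates substantially from this single-leaf estimate, which is precisely what the FE model is built to capture.

```python
import math

def mass_law_r(surface_mass, freq_hz):
    """Field-incidence mass law estimate of the sound reduction index (dB)."""
    return 20.0 * math.log10(surface_mass * freq_hz) - 47.0

# Subset of one-third-octave band centres (Hz); 10 kg/m^2 is a typical
# plasterboard leaf (illustrative value).
bands = [125, 250, 500, 1000, 2000]
for f in bands:
    print(f, round(mass_law_r(10.0, f), 1))
```

The mass law rises 6 dB per doubling of frequency or surface mass; resonances, coincidence, and stud bridging cause the real wall to depart from that slope.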

  14. Radiomic analysis in prediction of Human Papilloma Virus status.

    PubMed

    Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu

    2017-12-01

    Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status is determined through an invasive laboratory test. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images; quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach analyzing CT images to predict HPV status. Various radiomics features were extracted from CT scans and analyzed using statistical feature selection and prediction methods. Our approach ranked highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis of the most relevant radiomic features distinguishing HPV-positive and HPV-negative subjects suggested that HPV-positive patients usually have smaller and simpler tumors.

  15. Mechanistic modeling to predict the transporter- and enzyme-mediated drug-drug interactions of repaglinide.

    PubMed

    Varma, Manthena V S; Lai, Yurong; Kimoto, Emi; Goosen, Theunis C; El-Kattan, Ayman F; Kumar, Vikas

    2013-04-01

    Quantitative prediction of complex drug-drug interactions (DDIs) is challenging. Repaglinide is mainly metabolized by cytochrome-P-450 (CYP)2C8 and CYP3A4, and is also a substrate of organic anion transporting polypeptide (OATP)1B1. The purpose is to develop a physiologically based pharmacokinetic (PBPK) model to predict the pharmacokinetics and DDIs of repaglinide. In vitro hepatic transport of repaglinide, gemfibrozil and gemfibrozil 1-O-β-glucuronide was characterized using sandwich-culture human hepatocytes. A PBPK model, implemented in Simcyp (Sheffield, UK), was developed utilizing in vitro transport and metabolic clearance data. In vitro studies suggested significant active hepatic uptake of repaglinide. Mechanistic model adequately described repaglinide pharmacokinetics, and successfully predicted DDIs with several OATP1B1 and CYP3A4 inhibitors (<10% error). Furthermore, repaglinide-gemfibrozil interaction at therapeutic dose was closely predicted using in vitro fraction metabolism for CYP2C8 (0.71), when primarily considering reversible inhibition of OATP1B1 and mechanism-based inactivation of CYP2C8 by gemfibrozil and gemfibrozil 1-O-β-glucuronide. This study demonstrated that hepatic uptake is rate-determining in the systemic clearance of repaglinide. The model quantitatively predicted several repaglinide DDIs, including the complex interactions with gemfibrozil. Both OATP1B1 and CYP2C8 inhibition contribute significantly to repaglinide-gemfibrozil interaction, and need to be considered for quantitative rationalization of DDIs with either drug.
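The contribution of simultaneous uptake (OATP1B1) and metabolism (CYP2C8) inhibition can be sketched with a static, non-PBPK model in which the two AUC-ratio terms combine multiplicatively. Only the fraction metabolized (0.71) comes from the abstract; the inhibitor concentration and Ki values below are invented.

```python
def auc_ratio_metabolism(fm, inhibitor_conc, ki):
    """AUC ratio from reversible inhibition of one enzyme handling fraction fm."""
    return 1.0 / (fm / (1.0 + inhibitor_conc / ki) + (1.0 - fm))

def auc_ratio_uptake(inhibitor_conc, ki_uptake):
    """AUC ratio from reversible inhibition of active hepatic uptake."""
    return 1.0 + inhibitor_conc / ki_uptake

fm_cyp2c8 = 0.71                         # fraction metabolized (from the abstract)
r_met = auc_ratio_metabolism(fm_cyp2c8, inhibitor_conc=10.0, ki=5.0)
r_upt = auc_ratio_uptake(inhibitor_conc=10.0, ki_uptake=20.0)
print(round(r_met * r_upt, 2))
```

Even this crude static product shows why both mechanisms must be retained: dropping either term substantially under-predicts the combined interaction, the abstract's central point.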

  16. Assessing deep and shallow learning methods for quantitative prediction of acute chemical toxicity.

    PubMed

    Liu, Ruifeng; Madore, Michael; Glover, Kyle P; Feasel, Michael G; Wallqvist, Anders

    2018-05-02

    Animal-based methods for assessing chemical toxicity are struggling to meet testing demands. In silico approaches, including machine-learning methods, are promising alternatives. Recently, deep neural networks (DNNs) were evaluated and reported to outperform other machine-learning methods for quantitative structure-activity relationship modeling of molecular properties. However, most of the reported performance evaluations relied on global performance metrics, such as the root mean squared error (RMSE) between the predicted and experimental values of all samples, without considering the impact of sample distribution across the activity spectrum. Here, we carried out an in-depth analysis of DNN performance for quantitative prediction of acute chemical toxicity using several datasets. We found that the overall performance of DNN models on datasets of up to 30,000 compounds was similar to that of random forest (RF) models, as measured by the RMSE and correlation coefficients between the predicted and experimental results. However, our detailed analyses demonstrated that global performance metrics are inappropriate for datasets with a highly uneven sample distribution, because they show a strong bias for the most populous compounds along the toxicity spectrum. For highly toxic compounds, DNN and RF models trained on all samples performed much worse than the global performance metrics indicated. Surprisingly, our variable nearest neighbor method, which utilizes only structurally similar compounds to make predictions, performed reasonably well, suggesting that information of close near neighbors in the training sets is a key determinant of acute toxicity predictions.
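The paper's point about global metrics can be illustrated by computing RMSE overall versus stratified by toxicity range on an invented, deliberately unbalanced sample: the global number looks respectable while the sparse high-toxicity end is predicted badly.

```python
import math

def rmse(pairs):
    """Root mean squared error over (predicted, experimental) pairs."""
    return math.sqrt(sum((p - e) ** 2 for p, e in pairs) / len(pairs))

# Invented (predicted, experimental) toxicity values; most samples are
# low-toxicity and well predicted, a few are highly toxic and are not.
low_tox = [(2.1, 2.0), (2.4, 2.5), (1.9, 2.0), (2.2, 2.1), (2.6, 2.5),
           (2.0, 2.2), (2.3, 2.3), (2.5, 2.4)]
high_tox = [(3.1, 4.5), (3.4, 4.8)]

overall = rmse(low_tox + high_tox)
stratified = {"low": rmse(low_tox), "high": rmse(high_tox)}
print(round(overall, 2), {k: round(v, 2) for k, v in stratified.items()})
```

The overall RMSE is dominated by the populous low-toxicity stratum, masking the large errors on exactly the compounds that matter most for hazard assessment.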

  17. Quantitative Structure-Activity Relationship Modeling of Rat Acute Toxicity by Oral Exposure

    EPA Science Inventory

    Background: Few Quantitative Structure-Activity Relationship (QSAR) studies have successfully modeled large, diverse rodent toxicity endpoints. Objective: In this study, a combinatorial QSAR approach has been employed for the creation of robust and predictive models of acute toxi...

  18. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    USDA-ARS?s Scientific Manuscript database

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  19. A hybrid approach to advancing quantitative prediction of tissue distribution of basic drugs in human

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poulin, Patrick, E-mail: patrick-poulin@videotron.ca; Ekins, Sean; Department of Pharmaceutical Sciences, School of Pharmacy, University of Maryland, 20 Penn Street, Baltimore, MD 21201

    A general toxicity of basic drugs is related to phospholipidosis in tissues. Therefore, it is essential to predict the tissue distribution of basic drugs to facilitate an initial estimate of that toxicity. The objective of the present study was to further assess the original prediction method that consisted of using the binding to red blood cells measured in vitro for the unbound drug (RBCu) as a surrogate for tissue distribution, by correlating it to unbound tissue:plasma partition coefficients (Kpu) of several tissues, and finally to predict volume of distribution at steady-state (Vss) in humans under in vivo conditions. This correlation method demonstrated inaccurate predictions of Vss for particular basic drugs that did not follow the original correlation principle. Therefore, the novelty of this study is to provide clarity on the actual hypotheses to identify i) the impact of pharmacological mode of action on the generic RBCu-Kpu correlation, ii) additional mechanisms of tissue distribution for the outlier drugs, iii) molecular features and properties that differentiate compounds as outliers in the original correlation analysis in order to refine its applicability domain alongside the properties already used so far, and finally iv) a novel and refined correlation method that is superior to what has been previously published for the prediction of human Vss of basic drugs. Applying the refined correlation method after identifying outliers would facilitate the prediction of more accurate distribution parameters as key inputs used in physiologically based pharmacokinetic (PBPK) and phospholipidosis models.

  20. Peridynamic theory for modeling three-dimensional damage growth in metallic and composite structures

    NASA Astrophysics Data System (ADS)

    Ochoa-Ricoux, Juan Pedro

    A recently introduced nonlocal peridynamic theory removes the obstacles present in classical continuum mechanics that limit the prediction of crack initiation and growth in materials. It is also applicable at different length scales. This study presents an alternative approach to the derivation of the peridynamic equations of motion based on the principle of virtual work. It also presents solutions for the longitudinal vibration of a bar subjected to an initial stretch, propagation of a pre-existing crack in a plate subjected to velocity boundary conditions, and crack initiation and growth in a plate with a circular cutout. Furthermore, damage growth in composites involves complex and progressive failure modes. Current computational tools are incapable of predicting failure in composite materials, mainly because of the local mathematical structure of their underlying formulations. However, the peridynamic theory removes these obstacles by taking into account nonlocal interactions between material points. Hence, an application of the peridynamic theory to predict how damage propagates in fiber-reinforced composite materials subjected to mechanical and thermal loading conditions is presented. Finally, an analysis approach based on a merger of the finite element method and the peridynamic theory is proposed. Its validity is established through qualitative and quantitative comparisons against test results for a stiffened composite curved panel with a central slot under combined internal pressure and axial tension. The predicted initial and final failure loads, as well as the final failure modes, are in close agreement with the experimental observations. This proposed approach demonstrates the capability of the peridynamic approach to assess the durability of complex composite structures.

  1. Does Rational Selection of Training and Test Sets Improve the Outcome of QSAR Modeling?

    EPA Science Inventory

    Prior to using a quantitative structure activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external dataset, the best way to validate the predictive ability of a model is to perform its s...

  2. Sampling interval analysis and CDF generation for grain-scale gravel bed topography

    USDA-ARS?s Scientific Manuscript database

    In river hydraulics, there is a continuing need for characterizing bed elevations to arrive at quantitative roughness measures that can be used in predicting flow depth and for improved prediction of fine-sediment transport over and through coarse beds. Recently published prediction methods require...

  3. QUANTITATIVE MODELING APPROACHES TO PREDICTING THE ACUTE NEUROTOXICITY OF VOLATILE ORGANIC COMPOUNDS (VOCS).

    EPA Science Inventory

    Lack of complete and appropriate human data requires prediction of the hazards for exposed human populations by extrapolation from available animal and in vitro data. Predictive models for the toxicity of chemicals can be constructed by linking kinetic and mode of action data uti...

  4. Predicting the Emplacement of Improvised Explosive Devices: An Innovative Solution

    ERIC Educational Resources Information Center

    Lerner, Warren D.

    2013-01-01

    In this quantitative correlational study, simulated data were employed to examine artificial-intelligence techniques or, more specifically, artificial neural networks, as they relate to the location prediction of improvised explosive devices (IEDs). An ANN model was developed to predict IED placement, based upon terrain features and objects…

  5. PLS-based quantitative structure-activity relationship for substituted benzamides of clebopride type. Application of experimental design in drug design.

    PubMed

    Norinder, U; Högberg, T

    1992-04-01

    The advantageous approach of using an experimentally designed training set as the basis for establishing a quantitative structure-activity relationship with good predictive capability is described. The training set was selected from a fractional factorial design scheme based on a principal component description of physico-chemical parameters of aromatic substituents. The derived model successfully predicts the activities of additional substituted benzamides of 6-methoxy-N-(4-piperidyl)salicylamide type. The major influence on activity of the 3-substituent is demonstrated.

  6. Clinical evaluation of tuberculosis viability microscopy for assessing treatment response.

    PubMed

    Datta, Sumona; Sherman, Jonathan M; Bravard, Marjory A; Valencia, Teresa; Gilman, Robert H; Evans, Carlton A

    2015-04-15

    It is difficult to determine whether early tuberculosis treatment is effective in reducing the infectiousness of patients' sputum, because culture takes weeks and conventional acid-fast sputum microscopy and molecular tests cannot differentiate live from dead tuberculosis. To assess treatment response, sputum samples (n=124) from unselected patients (n=35) with sputum microscopy-positive tuberculosis were tested pretreatment and after 3, 6, and 9 days of empiric first-line therapy. Tuberculosis quantitative viability microscopy with fluorescein diacetate, quantitative culture, and acid-fast auramine microscopy were all performed in triplicate. Tuberculosis quantitative viability microscopy predicted quantitative culture results such that 76% of results agreed within ±1 logarithm (rS=0.85; P<.0001). In 31 patients with non-multidrug-resistant (MDR) tuberculosis, viability and quantitative culture results approximately halved (both 0.27 log reduction, P<.001) daily. For patients with non-MDR tuberculosis and available data, by treatment day 9 there was a >10-fold reduction in viability in 100% (24/24) of cases and quantitative culture in 95% (19/20) of cases. Four other patients subsequently found to have MDR tuberculosis had no significant changes in viability (P=.4) or quantitative culture (P=.6) results during early treatment. The change in viability and quantitative culture results during early treatment differed significantly between patients with non-MDR tuberculosis and those with MDR tuberculosis (both P<.001). Acid-fast microscopy results changed little during early treatment, and this change was similar for non-MDR tuberculosis vs MDR tuberculosis (P=.6). Tuberculosis quantitative viability microscopy is a simple test that within 1 hour predicted quantitative culture results that became available weeks later, rapidly indicating whether patients were responding to tuberculosis therapy.
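    The reported kinetics support a back-of-the-envelope check: a 0.27 log10 daily reduction compounds to far more than a 10-fold drop by day 9. A minimal sketch using only the values quoted in the abstract:

```python
daily_log_drop = 0.27      # log10 reduction per day reported for non-MDR patients
days = 9                   # treatment day at which >10-fold reduction was observed

total_log_drop = daily_log_drop * days   # cumulative log10 reduction
fold_reduction = 10 ** total_log_drop    # corresponding fold change

print(round(total_log_drop, 2), round(fold_reduction))
```

    Nine days at 0.27 log10/day gives 2.43 logs, roughly a 270-fold reduction, comfortably consistent with the >10-fold threshold the study reports.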

  7. Wavelet modeling and prediction of the stability of states: the Roman Empire and the European Union

    NASA Astrophysics Data System (ADS)

    Yaroshenko, Tatyana Y.; Krysko, Dmitri V.; Dobriyan, Vitalii; Zhigalov, Maksim V.; Vos, Hendrik; Vandenabeele, Peter; Krysko, Vadim A.

    2015-09-01

    How can the stability of a state be quantitatively determined and its future stability predicted? The rise and collapse of empires and states is very complex, and it is exceedingly difficult to understand and predict. Existing theories are usually formulated as verbal models and, consequently, do not yield sharply defined, quantitative predictions that can be unambiguously validated with data. Here we describe a model that determines whether a state is in a stable or chaotic condition and predicts its future condition. The central hypothesis, which we test, is that the growth and collapse of states are reflected in the changes of their territories, populations and budgets. The model was applied to the historical societies of the Roman Empire (400 BC to 400 AD) and the European Union (1957-2007) by using wavelets and analysis of the sign change of the spectrum of Lyapunov exponents. The model matches the historical events well. During wars and crises, the state becomes unstable; this is reflected in the wavelet analysis by a significant increase in the frequency ω(t) and wavelet coefficients W(ω, t), and the sign of the largest Lyapunov exponent becomes positive, indicating chaos. We successfully reconstructed and forecasted time series for the Roman Empire and the European Union by applying an artificial neural network. The proposed model helps to quantitatively determine and forecast the stability of a state.
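    The sign test on the largest Lyapunov exponent (positive implies chaos, negative implies stability) can be illustrated on a toy system. This sketch uses the logistic map, not the authors' wavelet pipeline or historical data:

```python
import math

def largest_lyapunov_logistic(r, x0=0.4, n=10000, burn=100):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| (a toy illustration of
    the sign test described in the abstract)."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1 - 2 * x)))  # log|f'(x)| along the orbit
        x = r * x * (1 - x)
    return acc / n

# Periodic regime (r=3.2): negative exponent. Chaotic regime (r=4): positive.
print(round(largest_lyapunov_logistic(3.2), 3), round(largest_lyapunov_logistic(4.0), 3))
```

    For r = 4 the exact exponent is ln 2 ≈ 0.693 and the estimate lands close to it, while the periodic regime at r = 3.2 yields a negative value; the sign alone is the stability indicator.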

  8. Potential usefulness of a topic model-based categorization of lung cancers as quantitative CT biomarkers for predicting the recurrence risk after curative resection

    NASA Astrophysics Data System (ADS)

    Kawata, Y.; Niki, N.; Ohmatsu, H.; Satake, M.; Kusumoto, M.; Tsuchida, T.; Aokage, K.; Eguchi, K.; Kaneko, M.; Moriyama, N.

    2014-03-01

    In this work, we investigate the potential usefulness of a topic model-based categorization of lung cancers as a quantitative CT biomarker for predicting the recurrence risk after curative resection. Elucidating subcategories of pulmonary nodule types in CT images is an important preliminary step towards developing nodule management strategies that are specific to each patient. We categorize lung cancers by analyzing volumetric distributions of CT values within lung cancers via a topic model such as latent Dirichlet allocation. By applying our scheme to 3D CT images of non-small-cell lung cancer (maximum lesion size of 3 cm), we demonstrate the potential usefulness of the topic model-based categorization of lung cancers as quantitative CT biomarkers.

  9. Calculator Use on the "GRE"® Revised General Test Quantitative Reasoning Measure. ETS GRE® Board Research Report. ETS GRE®-14-02. ETS Research Report. RR-14-25

    ERIC Educational Resources Information Center

    Attali, Yigal

    2014-01-01

    Previous research on calculator use in standardized assessments of quantitative ability focused on the effect of calculator availability on item difficulty and on whether test developers can predict these effects. With the introduction of an on-screen calculator on the Quantitative Reasoning measure of the "GRE"® revised General Test, it…

  10. Theoretical Characterization of the Spectral Density of the Water-Soluble Chlorophyll-Binding Protein from Combined Quantum Mechanics/Molecular Mechanics Molecular Dynamics Simulations.

    PubMed

    Rosnik, Andreana M; Curutchet, Carles

    2015-12-08

    Over the past decade, both experimentalists and theorists have worked to develop methods to describe pigment-protein coupling in photosynthetic light-harvesting complexes in order to understand the molecular basis of quantum coherence effects observed in photosynthesis. Here we present an improved strategy based on the combination of quantum mechanics/molecular mechanics (QM/MM) molecular dynamics (MD) simulations and excited-state calculations to predict the spectral density of electronic-vibrational coupling. We study the water-soluble chlorophyll-binding protein (WSCP) reconstituted with Chl a or Chl b pigments as the system of interest and compare our work with data obtained by Pieper and co-workers from differential fluorescence line-narrowing spectra (Pieper et al. J. Phys. Chem. B 2011, 115 (14), 4042-4052). Our results demonstrate that the use of QM/MM MD simulations where the nuclear positions are still propagated at the classical level leads to a striking improvement of the predicted spectral densities in the middle- and high-frequency regions, where they nearly reach quantitative accuracy. This demonstrates that the so-called "geometry mismatch" problem related to the use of low-quality structures in QM calculations, not the quantum features of the pigments' high-frequency motions, causes the failure of previous studies relying on similar protocols. Thus, this work paves the way toward quantitative predictions of pigment-protein coupling and the comprehension of quantum coherence effects in photosynthesis.

  11. Preliminary Numerical Simulations of Nozzle Formation in the Host Rock of Supersonic Volcanic Jets

    NASA Astrophysics Data System (ADS)

    Wohletz, K. H.; Ogden, D. E.; Glatzmaier, G. A.

    2006-12-01

    Recognizing the difficulty in quantitatively predicting how a vent changes during an explosive eruption, Kieffer (Kieffer, S.W., Rev. Geophys. 27, 1989) developed the theory of fluid-dynamic nozzles for volcanism, utilizing a highly developed predictive scheme used extensively in aerodynamics for the design of jet and rocket nozzles. Kieffer's work shows that explosive eruptions involve flow from subsonic to supersonic conditions through the vent and that these conditions control the erosion of the vent into nozzle shapes and sizes that maximize mass flux. The question remains how to predict the failure and erosion of vent host rocks by a high-speed, multiphase, compressible fluid that represents an eruption column. Clearly, in order to have a quantitative model of vent dynamics one needs a robust computational method for a turbulent, compressible, multiphase fluid. Here we present preliminary simulations of fluid flowing from a high-pressure reservoir through an eroding conduit and into the atmosphere. The eruptive fluid is modeled as an ideal gas, the host rock as a simple incompressible fluid with sandstone properties. Although these simulations do not yet include the multiphase dynamics of the eruptive fluid or the solid mechanics of the host rock, the evolution of the host rock into a supersonic nozzle is clearly seen. Our simulations show shock fronts both above the conduit, where the gas has expanded into the atmosphere, and within the conduit itself, thereby influencing the dynamics of the jet decompression.

  12. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best performing pair of quantitative features were chosen based on their ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.
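    The reported sensitivity and specificity follow from standard confusion-matrix formulas. A minimal sketch; the counts below are hypothetical, chosen only to reproduce the quoted 87%/85% of the quantitative classifier, and are not the study's data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
    Generic diagnostic-test formulas applied to confusion counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for 100 neoplastic and 100 non-neoplastic sites.
sensitivity, specificity = sens_spec(tp=87, fn=13, tn=85, fp=15)
print(sensitivity, specificity)
```

    With these counts the formulas return 0.87 and 0.85, matching the performance the abstract reports for the two-feature textural classifier against histopathology.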

  13. Quantitative Prediction of Systemic Toxicity Points of Departure (OpenTox USA 2017)

    EPA Science Inventory

    Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative models based on chemical structure information, are c...

  14. Predictive microbiology: Quantitative science delivering quantifiable benefits to the meat industry and other food industries.

    PubMed

    McMeekin, T A

    2007-09-01

    Predictive microbiology is considered in the context of the conference theme "chance, innovation and challenge", together with the impact of quantitative approaches on food microbiology, generally. The contents of four prominent texts on predictive microbiology are analysed and the major contributions of two meat microbiologists, Drs. T.A. Roberts and C.O. Gill, to the early development of predictive microbiology are highlighted. These provide a segue into R&D trends in predictive microbiology, including the Refrigeration Index, an example of science-based, outcome-focussed food safety regulation. Rapid advances in technologies and systems for application of predictive models are indicated and measures to judge the impact of predictive microbiology are suggested in terms of research outputs and outcomes. The penultimate section considers the future of predictive microbiology and advances that will become possible when data on population responses are combined with data derived from physiological and molecular studies in a systems biology approach. Whilst the emphasis is on science and technology for food safety management, it is suggested that decreases in foodborne illness will also arise from minimising human error by changing the food safety culture.

  15. Density functional theory fragment descriptors to quantify the reactivity of a molecular family: application to amino acids.

    PubMed

    Senet, P; Aparicio, F

    2007-04-14

    By using the exact density functional theory, one demonstrates that the value of the local electronic softness of a molecular fragment is directly related to the polarization charge (Coulomb hole) induced by a test electron removed from (or added to) the fragment. Our finding generalizes to a chemical group a formal relation between these molecular descriptors recently obtained for an atom in a molecule using an approximate atomistic model [P. Senet and M. Yang, J. Chem. Sci. 117, 411 (2005)]. In addition, a practical ab initio computational scheme for the Coulomb hole and related local descriptors of reactivity of a molecular family having a similar fragment in common is presented. As a blind test, the method is applied to the lateral chains of the 20 isolated amino acids. One demonstrates that the local softness of the lateral chain is a quantitative measure of the similarity of the amino acids. It predicts the separation of amino acids into different biochemical groups (aliphatic, basic, acidic, sulfur-containing, and aromatic). The present approach may find applications in quantitative structure-activity relationship methodology.

  16. Evaluating a linearized Euler equations model for strong turbulence effects on sound propagation.

    PubMed

    Ehrhardt, Loïc; Cheinet, Sylvain; Juvé, Daniel; Blanc-Benon, Philippe

    2013-04-01

    Sound propagation outdoors is strongly affected by atmospheric turbulence. Under strongly perturbed conditions or long propagation paths, the sound fluctuations reach their asymptotic behavior, e.g., the intensity variance progressively saturates. The present study evaluates the ability of a numerical propagation model based on the finite-difference time-domain solving of the linearized Euler equations in quantitatively reproducing the wave statistics under strong and saturated intensity fluctuations. It is the continuation of a previous study where weak intensity fluctuations were considered. The numerical propagation model is presented and tested with two-dimensional harmonic sound propagation over long paths and strong atmospheric perturbations. The results are compared to quantitative theoretical or numerical predictions available on the wave statistics, including the log-amplitude variance and the probability density functions of the complex acoustic pressure. The match is excellent for the evaluated source frequencies and all sound fluctuations strengths. Hence, this model captures these many aspects of strong atmospheric turbulence effects on sound propagation. Finally, the model results for the intensity probability density function are compared with a standard fit by a generalized gamma function.

  17. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose: We hypothesize that (1) anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side, and (2) such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods: Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between (1) the lateral photographic temporal limbus-to-pupil distance ("E") and (2) the lateral photographic temporal limbus-to-cornea distance ("Z"). In the first chronological half of patients (Correlation Series), the E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared to their biometric ACD for agreement analysis. Results: A strong linear correlation was found between EZR and ACD, R = −0.91, R2 = 0.81. Bland-Altman analysis showed good agreement between the ACD predicted by this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions: Lateral digital photography and EZR calculation constitute a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance: The EZ ratio may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications against their risk of triggering acute angle closure glaucoma. Similarly, non-ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496
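    Operationally, the prediction step reduces to evaluating a linear fit of ACD against the E:Z ratio. A minimal sketch; the slope and intercept below are hypothetical placeholders, since the abstract reports the correlation strength (R = −0.91) but not the fitted coefficients:

```python
def predict_acd(e_mm, z_mm, slope=-8.0, intercept=7.0):
    """Predict anterior chamber depth (mm) from the E:Z ratio via a linear
    fit. The slope/intercept defaults are invented for illustration; in
    practice they come from regressing biometric ACD on EZR in the
    Correlation Series. The negative slope reflects the reported inverse
    correlation (deeper chambers give smaller EZR)."""
    ezr = e_mm / z_mm   # E:Z ratio measured from the lateral photograph
    return slope * ezr + intercept

print(round(predict_acd(0.5, 1.0), 2))  # EZR of 0.5 -> 3.0 mm with these toy coefficients
```

    The derived equation would then be validated, as in the study, by Bland-Altman agreement analysis against optical biometry in an independent prediction series.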

  18. Hydrogen Donor-Acceptor Fluctuations from Kinetic Isotope Effects: A Phenomenological Model

    PubMed Central

    Roston, Daniel; Cheatum, Christopher M.; Kohen, Amnon

    2012-01-01

    Kinetic isotope effects (KIEs) and their temperature dependence can probe the structural and dynamic nature of enzyme-catalyzed proton or hydride transfers. The molecular interpretation of their temperature dependence requires expensive and specialized QM/MM calculations to provide a quantitative molecular understanding. Currently available phenomenological models use a non-adiabatic assumption that is not appropriate for most hydride and proton-transfer reactions, while others require more parameters than the experimental data justify. Here we propose a phenomenological interpretation of KIEs based on a simple method to quantitatively link the size and temperature dependence of KIEs to a conformational distribution of the catalyzed reaction. The present model assumes adiabatic hydrogen tunneling, and by fitting experimental KIE data, the model yields a population distribution for fluctuations of the distance between donor and acceptor atoms. Fits to data from a variety of proton and hydride transfers catalyzed by enzymes and their mutants, as well as non-enzymatic reactions, reveal that steeply temperature-dependent KIEs indicate the presence of at least two distinct conformational populations, each with different kinetic behaviors. We present the results of these calculations for several published cases and discuss how the predictions of the calculations might be experimentally tested. The current analysis does not replace molecular quantum mechanics/molecular mechanics (QM/MM) investigations, but it provides a fast and accessible way to quantitatively interpret KIEs in the context of a Marcus-like model. PMID:22857146

  19. A Combination of Geographically Weighted Regression, Particle Swarm Optimization and Support Vector Machine for Landslide Susceptibility Mapping: A Case Study at Wanzhou in the Three Gorges Area, China

    PubMed Central

    Yu, Xianyu; Wang, Yi; Niu, Ruiqing; Hu, Youjian

    2016-01-01

    In this study, a novel coupling model for landslide susceptibility mapping is presented. In practice, environmental factors may have different impacts at a local scale in study areas. To provide better predictions, a geographically weighted regression (GWR) technique is firstly used in our method to segment study areas into a series of prediction regions with appropriate sizes. Meanwhile, a support vector machine (SVM) classifier is exploited in each prediction region for landslide susceptibility mapping. To further improve the prediction performance, the particle swarm optimization (PSO) algorithm is used in the prediction regions to obtain optimal parameters for the SVM classifier. To evaluate the prediction performance of our model, several SVM-based prediction models are utilized for comparison on a study area of the Wanzhou district in the Three Gorges Reservoir. Experimental results, based on three objective quantitative measures and visual qualitative evaluation, indicate that our model can achieve better prediction accuracies and is more effective for landslide susceptibility mapping. For instance, our model can achieve an overall prediction accuracy of 91.10%, which is 7.8%–19.1% higher than the traditional SVM-based models. In addition, the obtained landslide susceptibility map by our model can demonstrate an intensive correlation between the classified very high-susceptibility zone and the previously investigated landslides. PMID:27187430

  20. A Combination of Geographically Weighted Regression, Particle Swarm Optimization and Support Vector Machine for Landslide Susceptibility Mapping: A Case Study at Wanzhou in the Three Gorges Area, China.

    PubMed

    Yu, Xianyu; Wang, Yi; Niu, Ruiqing; Hu, Youjian

    2016-05-11

    In this study, a novel coupling model for landslide susceptibility mapping is presented. In practice, environmental factors may have different impacts at a local scale in study areas. To provide better predictions, a geographically weighted regression (GWR) technique is firstly used in our method to segment study areas into a series of prediction regions with appropriate sizes. Meanwhile, a support vector machine (SVM) classifier is exploited in each prediction region for landslide susceptibility mapping. To further improve the prediction performance, the particle swarm optimization (PSO) algorithm is used in the prediction regions to obtain optimal parameters for the SVM classifier. To evaluate the prediction performance of our model, several SVM-based prediction models are utilized for comparison on a study area of the Wanzhou district in the Three Gorges Reservoir. Experimental results, based on three objective quantitative measures and visual qualitative evaluation, indicate that our model can achieve better prediction accuracies and is more effective for landslide susceptibility mapping. For instance, our model can achieve an overall prediction accuracy of 91.10%, which is 7.8%-19.1% higher than the traditional SVM-based models. In addition, the obtained landslide susceptibility map by our model can demonstrate an intensive correlation between the classified very high-susceptibility zone and the previously investigated landslides.
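    The PSO step amounts to searching the SVM hyperparameter space (typically the penalty C and kernel width gamma) for the settings that minimize validation error within each prediction region. A minimal, self-contained PSO sketch; the quadratic objective is a toy stand-in for the SVM cross-validation error, and the parameter ranges are assumptions, since the abstract gives neither:

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=60, seed=1):
    """Minimal particle swarm optimizer over box bounds.
    Illustrative stand-in for the PSO step that tunes SVM parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy "validation error" surface with a known optimum at C=10, gamma=0.1.
err = lambda p: (p[0] - 10.0) ** 2 + 100.0 * (p[1] - 0.1) ** 2
best, best_err = pso_minimize(err, [(0.1, 100.0), (0.001, 1.0)])
print(round(best[0], 2), round(best[1], 2))
```

    In the paper's pipeline the objective would instead be the SVM's cross-validated misclassification rate in each GWR-derived prediction region, and the returned (C, gamma) would parameterize that region's classifier.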

  1. Remarks on the Phase Transition in QCD

    NASA Astrophysics Data System (ADS)

    Wilczek, Frank

    The significance of the question of the order of the phase transition in QCD, and recent evidence that real-world QCD is probably close to having a single second order transition as a function of temperature, is reviewed. Although this circumstance seems to remove the possibility that the QCD transition during the big bang might have had spectacular cosmological consequences, there is some good news: it allows highly non-trivial yet reliable quantitative predictions to be made for the behavior near the transition. These predictions can be tested in numerical simulations and perhaps even eventually in heavy ion collisions. The present paper is a very elementary discussion of the relevant concepts, meant to be an accessible introduction for those innocent of the renormalization group approach to critical phenomena and/or the details of QCD.

  2. Mechanical critical phenomena and the elastic response of fiber networks

    NASA Astrophysics Data System (ADS)

    Mackintosh, Fred

    The mechanics of cells and tissues are largely governed by scaffolds of filamentous proteins that make up the cytoskeleton, as well as extracellular matrices. Evidence is emerging that such networks can exhibit rich mechanical phase behavior. A classic example of a mechanical phase transition was identified by Maxwell for macroscopic engineering structures: networks of struts or springs exhibit a continuous, second-order phase transition at the isostatic point, where the number of constraints imposed by connectivity just equals the number of mechanical degrees of freedom. We present recent theoretical predictions and experimental evidence for mechanical phase transitions in both synthetic and biopolymer networks. We show, in particular, excellent quantitative agreement between the mechanics of collagen matrices and the predictions of a strain-controlled phase transition in sub-isostatic networks.

  3. Prediction of solvation enthalpy of gaseous organic compounds in propanol

    NASA Astrophysics Data System (ADS)

    Golmohammadi, Hassan; Dashtbozorgi, Zahra

    2016-09-01

    The purpose of this paper is to present a novel way of developing quantitative structure-property relationship (QSPR) models to predict the gas-to-propanol solvation enthalpy (ΔHsolv) of 95 organic compounds. Different kinds of descriptors were calculated for each compound using the Dragon software package. The replacement method (RM) variable selection technique was employed to select the optimal subset of solute descriptors. Our investigation reveals that the relationship between the solvation enthalpy and the physicochemical properties of the solutes is nonlinear, so the linear RM model is unable to predict the solvation enthalpy accurately; a support vector machine (SVM) was therefore used to build a nonlinear model. The results established that the ΔHsolv values calculated by SVM were in good agreement with the experimental ones, and the performance of the SVM models was superior to that of the RM model.
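
    The linear-versus-nonlinear contrast above is easy to demonstrate. In this hedged sketch (synthetic data standing in for the Dragon descriptors; not the authors' actual model), a plain linear regression fails on a deliberately nonlinear structure-property relationship, while an RBF-kernel SVM regressor captures it:

```python
# Illustrative only: linear vs. RBF-SVM regression on a nonlinear
# descriptor-property relationship. Data and hyperparameters are assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(300, 3))           # stand-in molecular descriptors
y = np.sin(X[:, 0]) * X[:, 1] + X[:, 2] ** 2    # deliberately nonlinear "property"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lin_r2 = LinearRegression().fit(X_tr, y_tr).score(X_te, y_te)   # poor fit
svm_r2 = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr).score(X_te, y_te)  # much better
```

    On data like this, the linear model's test R2 hovers near zero while the kernel model recovers most of the variance, mirroring the RM-versus-SVM comparison in the abstract.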

  4. Elastic amplitudes studied with the LHC measurements at 7 and 8 TeV

    NASA Astrophysics Data System (ADS)

    Kohara, A. K.; Ferreira, E.; Kodama, T.; Rangel, M.

    2017-12-01

    Recent measurements of the differential cross sections in the forward region of pp elastic scattering at 7 and 8 TeV show the precise form of the t dependence. We present a detailed analysis of these measurements including the structures of the real and imaginary parts of the scattering amplitude. A good description is achieved, confirming in all experiments the existence of a zero in the real part in the forward region close to the origin, in agreement with the prediction of a theorem by Martin, with an important role in the observed form of dσ /dt. A universal value for the position of this zero and regularity in other features of the amplitudes are found, leading to quantitative predictions for the forward elastic scattering at 13 TeV.

  5. Semianalytic calculation of cosmic microwave background anisotropies from wiggly and superconducting cosmic strings

    NASA Astrophysics Data System (ADS)

    Rybak, I. Yu.; Avgoustidis, A.; Martins, C. J. A. P.

    2017-11-01

    We study how the presence of world-sheet currents affects the evolution of cosmic string networks, and their impact on predictions for the cosmic microwave background (CMB) anisotropies generated by these networks. We provide a general description of string networks with currents and explicitly investigate in detail two physically motivated examples: wiggly and superconducting cosmic string networks. By using a modified version of the CMBact code, we show quantitatively how the relevant network parameters in both of these cases influence the predicted CMB signal. Our analysis suggests that previous studies have overestimated the amplitude of the anisotropies for wiggly strings. For superconducting strings the amplitude of the anisotropies depends on parameters which presently are not well known—but which can be measured in future high-resolution numerical simulations.

  6. Grading of Emphysema Is Indispensable for Predicting Prolonged Air Leak After Lung Lobectomy.

    PubMed

    Murakami, Junichi; Ueda, Kazuhiro; Tanaka, Toshiki; Kobayashi, Taiga; Hamano, Kimikazu

    2018-04-01

    The aim of this study was to assess the utility of quantitative computed tomography-based grading of emphysema for predicting prolonged air leak after thoracoscopic lobectomy. A consecutive series of 284 patients undergoing thoracoscopic lobectomy for lung cancer was retrospectively reviewed. Prolonged air leak was defined as air leaks lasting 7 days or longer. The grade of emphysema (emphysema index) was defined by the proportion of the emphysematous lung volume (less than -910 HU) to the total lung volume (-600 to -1,024 HU) by a computer-assisted histogram analysis of whole-lung computed tomography scans. The mean duration of chest tube drainage was 1.5 days. Fifteen patients (5.3%) presented with prolonged air leak. According to a receiver-operating characteristic (ROC) curve analysis, the emphysema index was the best predictor of prolonged air leak, with an area under the curve of 0.85 (95% confidence interval: 0.73 to 0.98). An emphysema index of 35% or greater was the best cutoff value for predicting prolonged air leak, with a negative predictive value of 0.99. The emphysema index was the only significant predictor of the duration of postoperative chest tube drainage among conventional variables, including the pulmonary function and resected lobe, in both univariate and multivariate analyses. Prolonged air leak resulted in an increased duration of hospitalization (p < 0.001) and was frequently accompanied by pneumonia or empyema (p < 0.001). The grade of emphysema on computed tomography scan is the best predictor of prolonged air leak that adversely influences early postoperative outcomes. We must take new measures against prolonged air leak in quantitative computed tomography-based high-risk patients. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
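
    The emphysema index described above is a simple histogram computation over CT attenuation values. A minimal sketch (the thresholds are taken from the abstract; the function name and toy voxel values are ours):

```python
# Hedged sketch of the CT histogram-based emphysema index: the fraction of
# lung voxels (-1024 to -600 HU) whose attenuation is below -910 HU.
import numpy as np

def emphysema_index(hu, lung_range=(-1024, -600), threshold=-910):
    """Fraction of lung-window voxels below `threshold` HU."""
    lung = hu[(hu >= lung_range[0]) & (hu <= lung_range[1])]
    if lung.size == 0:
        return 0.0
    return float((lung < threshold).sum()) / lung.size

# Toy scan: 5 voxels fall in the lung window, 3 of them below -910 HU.
hu = np.array([-1000.0, -950.0, -920.0, -800.0, -700.0, -100.0, 50.0])
index = emphysema_index(hu)  # → 0.6
```

    In the study, patients with an index of 35% or greater (i.e. `index >= 0.35`) formed the high-risk group for prolonged air leak.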

  7. Genome-Assisted Prediction of Quantitative Traits Using the R Package sommer.

    PubMed

    Covarrubias-Pazaran, Giovanny

    2016-01-01

    Most traits of agronomic importance are quantitative in nature, and genetic markers have been used for decades to dissect such traits. Recently, genomic selection has earned attention as next-generation sequencing technologies became feasible for major and minor crops. Mixed models have become a key tool for fitting genomic selection models, but most current genomic selection software can only include a single variance component other than the error, making hybrid prediction using additive, dominance and epistatic effects unfeasible for species displaying heterotic effects. Moreover, likelihood-based software for fitting mixed models with multiple random effects that allows the user to specify the variance-covariance structure of random effects has not been fully exploited. A new open-source R package called sommer is presented to facilitate the use of mixed models for genomic selection and hybrid prediction purposes using more than one variance component and allowing specification of covariance structures. The use of sommer for genomic prediction is demonstrated through several examples using maize and wheat genotypic and phenotypic data. At its core, the program contains three algorithms for estimating variance components: Average Information (AI), Expectation-Maximization (EM) and Efficient Mixed Model Association (EMMA). Kernels for calculating the additive, dominance and epistatic relationship matrices are included, along with other useful functions for genomic analysis. Results from sommer were comparable to those of other software, but the analysis was faster than Bayesian counterparts by a margin of hours to days. In addition, the ability to deal with missing data, combined with greater flexibility and speed than other REML-based software, was achieved by combining some of the most efficient model-fitting algorithms within an accessible environment such as R.
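
    sommer itself is an R package, but the core idea it implements for genomic prediction, BLUP of breeding values through a genomic relationship matrix, can be sketched in Python. This toy uses synthetic markers and an assumed error-to-genetic variance ratio instead of the AI/EM/EMMA variance component estimation the package performs:

```python
# Hedged GBLUP sketch: predict breeding values from a genomic relationship
# matrix K built from centred markers, with a fixed (assumed) variance ratio.
import numpy as np

rng = np.random.default_rng(2)
n, m = 120, 80
M = rng.integers(0, 3, size=(n, m)).astype(float)   # marker dosages 0/1/2
Mc = M - M.mean(axis=0)                              # centred marker matrix
beta = rng.normal(0, 0.1, size=m)                    # true marker effects
g = Mc @ beta                                        # true breeding values
y = g + rng.normal(0, 0.05, size=n)                  # phenotypes, small noise

K = Mc @ Mc.T / m                                    # genomic relationship matrix
lam = 0.1                                            # assumed sigma_e^2 / sigma_u^2
# BLUP of breeding values: u_hat = K (K + lam I)^(-1) (y - mean(y))
u_hat = K @ np.linalg.solve(K + lam * np.eye(n), y - y.mean())

acc = np.corrcoef(u_hat, g)[0, 1]                    # prediction accuracy
```

    Packages like sommer estimate `lam` from the data via REML and allow several such random terms (additive, dominance, epistatic) at once, which is the capability the abstract highlights.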

  8. Stability and dynamics of the edge pedestal in the low collisionality regime: physics mechanisms for steady-state ELM-free operation

    NASA Astrophysics Data System (ADS)

    Snyder, P. B.; Burrell, K. H.; Wilson, H. R.; Chu, M. S.; Fenstermacher, M. E.; Leonard, A. W.; Moyer, R. A.; Osborne, T. H.; Umansky, M.; West, W. P.; Xu, X. Q.

    2007-08-01

    Understanding the physics of the edge pedestal and edge localized modes (ELMs) is of great importance for ITER and the optimization of the tokamak concept. The peeling-ballooning model has quantitatively explained many observations, including ELM onset and pedestal constraints, in the standard H-mode regime. The ELITE code has been developed to efficiently evaluate peeling-ballooning stability for comparison with observation and predictions for future devices. We briefly review recent progress in the peeling-ballooning model, including experimental validation of ELM onset and pedestal height predictions, and nonlinear 3D simulations of ELM dynamics, which together lead to an emerging understanding of the physics of the onset and dynamics of ELMs in the standard intermediate to high collisionality regime. We also discuss new studies of the apparent power dependence of the pedestal, and studies of the impact of sheared toroidal flow. Recently, highly promising low collisionality regimes without ELMs have been discovered, including the quiescent H-mode (QH) and resonant magnetic perturbation (RMP) regimes. We present recent observations from the DIII-D tokamak of the density, shape and rotation dependence of QH discharges, and studies of the peeling-ballooning stability in this regime. We propose a model of the QH-mode in which the observed edge harmonic oscillation (EHO) is a saturated kink/peeling mode which is destabilized by current and rotation, and drives significant transport, allowing a near steady-state edge plasma. The model quantitatively predicts the observed density dependence and qualitatively predicts observed mode structure, rotation dependence and outer gap dependence. Low density RMP discharges are found to operate in a similar regime, but with the EHO replaced by an applied magnetic perturbation.

  9. Flow, Transport, and Reaction in Porous Media: Percolation Scaling, Critical-Path Analysis, and Effective Medium Approximation

    NASA Astrophysics Data System (ADS)

    Hunt, Allen G.; Sahimi, Muhammad

    2017-12-01

    We describe the most important developments in the application of three theoretical tools to modeling of the morphology of porous media and flow and transport processes in them. One tool is percolation theory. Although it was over 40 years ago that the possibility of using percolation theory to describe flow and transport processes in porous media was first raised, new models and concepts, as well as new variants of the original percolation model, are still being developed for various applications to flow phenomena in porous media. The other two approaches, closely related to percolation theory, are the critical-path analysis, which is applicable when porous media are highly heterogeneous, and the effective medium approximation (poor man's percolation), which provides a simple and, under certain conditions, quantitatively correct description of transport in porous media in which percolation-type disorder is relevant. Applications to topics in geosciences include predictions of the hydraulic conductivity and air permeability, solute and gas diffusion that are particularly important in ecohydrological applications and land-surface interactions, and multiphase flow in porous media, as well as non-Gaussian solute transport, and flow morphologies associated with imbibition into unsaturated fractures. We describe new applications of percolation theory of solute transport to chemical weathering and soil formation, geomorphology, and elemental cycling through the terrestrial Earth surface. Wherever quantitatively accurate predictions of such quantities are relevant, so are the techniques presented here. Whenever possible, the theoretical predictions are compared with the relevant experimental data. In practically all the cases, the agreement between the theoretical predictions and the data is excellent. Also discussed are possible future directions in the application of such concepts to many other phenomena in geosciences.

  10. Gastrointestinal Endogenous Proteins as a Source of Bioactive Peptides - An In Silico Study

    PubMed Central

    Dave, Lakshmi A.; Montoya, Carlos A.; Rutherfurd, Shane M.; Moughan, Paul J.

    2014-01-01

    Dietary proteins are known to contain bioactive peptides that are released during digestion. Endogenous proteins secreted into the gastrointestinal tract represent a quantitatively greater supply of protein to the gut lumen than those of dietary origin. Many of these endogenous proteins are digested in the gastrointestinal tract but the possibility that these are also a source of bioactive peptides has not been considered. An in silico prediction method was used to test if bioactive peptides could be derived from the gastrointestinal digestion of gut endogenous proteins. Twenty-six gut endogenous proteins and seven dietary proteins were evaluated. The peptides present after gastric and intestinal digestion were predicted based on the amino acid sequence of the proteins and the known specificities of the major gastrointestinal proteases. The predicted resultant peptides possessing amino acid sequences identical to those of known bioactive peptides were identified. After gastrointestinal digestion (based on the in silico simulation), the total number of bioactive peptides predicted to be released ranged from 1 (gliadin) to 55 (myosin) for the selected dietary proteins and from 1 (secretin) to 39 (mucin-5AC) for the selected gut endogenous proteins. Within the intact proteins and after simulated gastrointestinal digestion, angiotensin-converting enzyme (ACE)-inhibitory peptide sequences were the most frequently observed in both the dietary and endogenous proteins. Among the dietary proteins, after in silico simulated gastrointestinal digestion, myosin was found to have the highest number of ACE-inhibitory peptide sequences (49 peptides), while for the gut endogenous proteins, mucin-5AC had the greatest number of ACE-inhibitory peptide sequences (38 peptides). Gut endogenous proteins may be an important source of bioactive peptides in the gut particularly since gut endogenous proteins represent a quantitatively large and consistent source of protein. PMID:24901416
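
    The in silico prediction step described above amounts to applying protease cleavage rules to a protein sequence and matching the resulting fragments against a database of known bioactive peptides. A toy sketch using only the classical trypsin rule; the input sequence and the "known bioactive" set below are invented for illustration, not from the study:

```python
# Hedged sketch of in silico digestion: cleave after K or R except when the
# next residue is P (classical trypsin specificity), then match fragments
# against a (hypothetical) bioactive-peptide list.
def trypsin_digest(seq):
    """Return peptide fragments produced by the trypsin cleavage rule."""
    peptides, start = [], 0
    for i, aa in enumerate(seq):
        next_aa = seq[i + 1] if i + 1 < len(seq) else ""
        if aa in "KR" and next_aa != "P":
            peptides.append(seq[start:i + 1])
            start = i + 1
    if start < len(seq):
        peptides.append(seq[start:])   # C-terminal remainder
    return peptides

known_bioactive = {"FAK", "RPGK"}      # invented example set
fragments = trypsin_digest("AKRPGKFAK")
hits = [p for p in fragments if p in known_bioactive]
```

    The study's workflow does the same with the full specificities of pepsin and the pancreatic proteases, applied sequentially to mimic gastric then intestinal digestion.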

  11. Simple estimate of entrainment rate of pollutants from a coastal discharge into the surf zone.

    PubMed

    Wong, Simon H C; Monismith, Stephen G; Boehm, Alexandria B

    2013-10-15

    Microbial pollutants from coastal discharges can increase illness risks for swimmers and cause beach advisories. There is presently no predictive model for estimating the entrainment of pollution from coastal discharges into the surf zone. We present a novel, quantitative framework for estimating surf zone entrainment of pollution at a wave-dominant open beach. Using physical arguments, we identify a dimensionless parameter equal to the quotient of the surf zone width l_sz and the cross-flow length scale of the discharge l_a = M_j^(1/2)/U_sz, where M_j is the discharge's momentum flux and U_sz is a representative alongshore velocity in the surf zone. We conducted numerical modeling of a nonbuoyant discharge at an alongshore uniform beach with constant slope using a wave-resolving hydrodynamic model. Using results from 144 numerical experiments we develop an empirical relationship between the surf zone entrainment rate α and l_sz/l_a. The empirical relationship can reasonably explain seven measurements of surf zone entrainment at three diverse coastal discharges. This predictive relationship can be a useful tool in coastal water quality management and can be used to develop predictive beach water quality models.
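
    The dimensionless parameter above is straightforward to evaluate. A small sketch; the function name and the example values (and the SI units noted in the docstring) are our illustrative assumptions:

```python
# Hedged sketch: the dimensionless parameter l_sz / l_a, with
# l_a = sqrt(M_j) / U_sz, from the surf zone entrainment framework above.
import math

def entrainment_parameter(l_sz, M_j, U_sz):
    """l_sz: surf zone width (m); M_j: discharge momentum flux (m^4/s^2);
    U_sz: representative alongshore surf zone velocity (m/s).
    Returns the dimensionless ratio l_sz / l_a."""
    l_a = math.sqrt(M_j) / U_sz   # cross-flow length scale of the discharge
    return l_sz / l_a

# Example: M_j = 4.0 and U_sz = 0.5 give l_a = 4 m; with l_sz = 40 m
# the dimensionless parameter is 10.
ratio = entrainment_parameter(l_sz=40.0, M_j=4.0, U_sz=0.5)  # → 10.0
```

    The study's empirical relationship then maps this ratio to the entrainment rate α.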

  12. A proteomic analysis identifies candidate early biomarkers to predict ovarian hyperstimulation syndrome in polycystic ovarian syndrome patients.

    PubMed

    Wu, Lan; Sun, Yazhou; Wan, Jun; Luan, Ting; Cheng, Qing; Tan, Yong

    2017-07-01

    Ovarian hyperstimulation syndrome (OHSS) is a potentially life-threatening, iatrogenic complication that occurs during assisted reproduction. Polycystic ovarian syndrome (PCOS) significantly increases the risk of OHSS during controlled ovarian stimulation. Therefore, a more effective early prediction technique is required in PCOS patients. Quantitative proteomic analysis of serum proteins indicates the potential diagnostic value for disease. In the present study, the authors revealed the differentially expressed proteins in OHSS patients with PCOS as candidate diagnostic biomarkers. The promising proteins obtained from liquid chromatography-mass spectrometry were subjected to ELISA and western blotting assay for further confirmation. A total of 57 proteins were identified with significant difference, of which 29 proteins were upregulated and 28 proteins were downregulated in OHSS patients. Haptoglobin, fibrinogen and lipoprotein lipase were selected as candidate biomarkers. Receiver operating characteristic curve analysis demonstrated that all three proteins may have potential as biomarkers to discriminate OHSS in PCOS patients. Haptoglobin, fibrinogen and lipoprotein lipase have never been reported as predictive markers of OHSS in PCOS patients, and their potential roles in OHSS occurrence deserve further studies. The proteomic results reported in the present study may provide deeper insights into the pathophysiology of OHSS.

  13. Gleaning knowledge from data in the intensive care unit.

    PubMed

    Pinsky, Michael R; Dubrawski, Artur

    2014-09-15

    It is often difficult to accurately predict when, why, and which patients develop shock, because signs of shock often occur late, once organ injury is already present. Three levels of aggregation of information can be used to aid the bedside clinician in this task: analysis of derived parameters of existing measured physiologic variables using simple bedside calculations (functional hemodynamic monitoring); prior physiologic data of similar subjects during periods of stability and disease to define quantitative metrics of level of severity; and libraries of responses across large and comprehensive collections of records of diverse subjects whose diagnoses, therapies, and courses are already known to predict not only disease severity, but also the subsequent behavior of the subject if left untreated or treated with one of the many therapeutic options. The problem is in defining the minimal monitoring data set needed to initially identify those patients across all possible processes, and then specifically monitor their responses to targeted therapies known to improve outcome. To address these issues, multivariable models using machine learning data-driven classification techniques can be used to parsimoniously predict cardiorespiratory insufficiency. We briefly describe how these machine learning approaches are presently applied to address earlier identification of cardiorespiratory insufficiency and direct focused, patient-specific management.

  14. US EPA - A*Star Partnership - Accelerating the Acceptance of ...

    EPA Pesticide Factsheets

    The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate and extrapolate experimental data, and rapid characterization and acceptance of these systems and models. The series of presentations will highlight a collaborative effort between the U.S. Environmental Protection Agency (EPA) and the Agency for Science, Technology and Research (A*STAR) that is focused on developing and applying experimental and computational models for predicting chemical-induced liver and kidney toxicity, brain angiogenesis, and blood-brain-barrier formation. In addressing some of these challenges, the U.S. EPA and A*STAR collaboration will provide a glimpse of what chemical risk assessments could look like in the 21st century. Presentation on US EPA – A*STAR Partnership at international symposium on Accelerating the acceptance of next-generation sciences and their application to regulatory risk assessment in Singapore.

  15. Choice theories: What are they good for?

    PubMed Central

    Johnson, Eric J.

    2013-01-01

    Simonson et al. present an ambitious sketch of an integrative theory of context. Provoked by this thoughtful proposal, I discuss what the function of theories of choice will be in the coming decades. Traditionally, choice models and theory have attempted to predict choices as a function of the attributes of options. I argue that to be truly useful, they need to generate specific and quantitative predictions of the effect of the choice environment upon choice probability. To do this, we need to focus on rigorously modeling and measuring the underlying processes causing these effects, and use the Simonson et al. proposal to provide some examples. I also present some examples from research in decision-making and decision neuroscience, and argue that models that fail, and fail spectacularly, are particularly useful. I close with a challenge: How would consumer researchers aid the design of real world choice environments such as the health exchanges under the Patient Protection and Affordable Care Act? PMID:23794793

  16. Chemical graphs, molecular matrices and topological indices in chemoinformatics and quantitative structure-activity relationships.

    PubMed

    Ivanciuc, Ovidiu

    2013-06-01

    Chemical and molecular graphs have fundamental applications in chemoinformatics, quantitative structure-property relationships (QSPR), quantitative structure-activity relationships (QSAR), virtual screening of chemical libraries, and computational drug design. Chemoinformatics applications of graphs include chemical structure representation and coding, database search and retrieval, and physicochemical property prediction. QSPR, QSAR and virtual screening are based on the structure-property principle, which states that the physicochemical and biological properties of chemical compounds can be predicted from their chemical structure. Such structure-property correlations are usually developed from topological indices and fingerprints computed from the molecular graph and from molecular descriptors computed from the three-dimensional chemical structure. We present here a selection of the most important graph descriptors and topological indices, including molecular matrices, graph spectra, spectral moments, graph polynomials, and vertex topological indices. These graph descriptors are used to define several topological indices based on molecular connectivity, graph distance, reciprocal distance, distance-degree, distance-valency, spectra, polynomials, and information theory concepts. The molecular descriptors and topological indices can be developed with a more general approach, based on molecular graph operators, which define a family of graph indices related by a common formula. Graph descriptors and topological indices for molecules containing heteroatoms and multiple bonds are computed with weighting schemes based on atomic properties, such as the atomic number, covalent radius, or electronegativity. The correlation in QSPR and QSAR models can be improved by optimizing some parameters in the formula of topological indices, as demonstrated for structural descriptors based on atomic connectivity and graph distance.
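
    As a concrete example of the distance-based topological indices discussed above, the classical Wiener index sums shortest-path distances over all atom pairs of the hydrogen-suppressed molecular graph. A minimal BFS-based sketch (the adjacency-list representation is our choice):

```python
# Hedged sketch: Wiener index of a molecular graph given as an adjacency
# list, computed with breadth-first search from every vertex.
from collections import deque

def wiener_index(adj):
    """Sum of shortest-path distances over all unordered vertex pairs."""
    n = len(adj)
    total = 0
    for src in range(n):
        dist = [-1] * n
        dist[src] = 0
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[v] == -1:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for d in dist if d > 0)
    return total // 2   # each unordered pair was counted twice

# n-butane as a 4-vertex path graph C1-C2-C3-C4
butane = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
w = wiener_index(butane)  # → 10
```

    Heteroatoms and multiple bonds are handled, as the abstract notes, by replacing the unit edge/vertex weights here with property-based weights.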

  17. Dissociable brain biomarkers of fluid intelligence.

    PubMed

    Paul, Erick J; Larsen, Ryan J; Nikolaidis, Aki; Ward, Nathan; Hillman, Charles H; Cohen, Neal J; Kramer, Arthur F; Barbey, Aron K

    2016-08-15

    Cognitive neuroscience has long sought to understand the biological foundations of human intelligence. Decades of research have revealed that general intelligence is correlated with two brain-based biomarkers: the concentration of the brain biochemical N-acetyl aspartate (NAA) measured by proton magnetic resonance spectroscopy (MRS) and total brain volume measured using structural MR imaging (MRI). However, the relative contribution of these biomarkers in predicting performance on core facets of human intelligence remains to be well characterized. In the present study, we sought to elucidate the role of NAA and brain volume in predicting fluid intelligence (Gf). Three canonical tests of Gf (BOMAT, Number Series, and Letter Sets) and three working memory tasks (Reading, Rotation, and Symmetry span tasks) were administered to a large sample of healthy adults (n=211). We conducted exploratory factor analysis to investigate the factor structure underlying Gf independent from working memory and observed two Gf components (verbal/spatial and quantitative reasoning) and one working memory component. Our findings revealed a dissociation between two brain biomarkers of Gf (controlling for age and sex): NAA concentration correlated with verbal/spatial reasoning, whereas brain volume correlated with quantitative reasoning and working memory. A follow-up analysis revealed that this pattern of findings is observed for males and females when analyzed separately. Our results provide novel evidence that distinct brain biomarkers are associated with specific facets of human intelligence, demonstrating that NAA and brain volume are independent predictors of verbal/spatial and quantitative facets of Gf. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Biomarkers are used to predict quantitative metabolite concentration profiles in human red blood cells

    DOE PAGES

    Yurkovich, James T.; Yang, Laurence; Palsson, Bernhard O.; ...

    2017-03-06

    Deep-coverage metabolomic profiling has revealed a well-defined development of metabolic decay in human red blood cells (RBCs) under cold storage conditions. A set of extracellular biomarkers has been recently identified that reliably defines the qualitative state of the metabolic network throughout this metabolic decay process. Here, we extend the utility of these biomarkers by using them to quantitatively predict the concentrations of other metabolites in the red blood cell. We are able to accurately predict the concentration profile of 84 of the 91 (92%) measured metabolites (p < 0.05) in RBC metabolism using only measurements of these five biomarkers. The median of prediction errors (symmetric mean absolute percent error) across all metabolites was 13%. Furthermore, the ability to predict numerous metabolite concentrations from a simple set of biomarkers offers the potential for the development of a powerful workflow that could be used to evaluate the metabolic state of a biological system using a minimal set of measurements.
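
    The workflow above, predicting a panel of metabolite concentrations from five biomarker measurements and scoring with symmetric mean absolute percent error, can be sketched as follows. The synthetic data and the plain multi-output linear model are stand-ins, not the study's actual pipeline:

```python
# Hedged sketch: regress a metabolite panel on five biomarkers, then score
# held-out predictions with symmetric mean absolute percent error (SMAPE).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_samples, n_biomarkers, n_metabolites = 60, 5, 20
B = rng.uniform(1, 10, size=(n_samples, n_biomarkers))        # biomarker levels
W = rng.uniform(0.1, 1.0, size=(n_biomarkers, n_metabolites)) # hidden mapping
C = B @ W + rng.normal(0, 0.01, size=(n_samples, n_metabolites))  # concentrations

model = LinearRegression().fit(B[:40], C[:40])   # train on 40 samples
pred = model.predict(B[40:])                      # predict the held-out 20
true = C[40:]

# SMAPE (in percent), the error metric quoted in the abstract
smape = 100 * np.mean(2 * np.abs(pred - true) / (np.abs(pred) + np.abs(true)))
```

    On this clean synthetic data the SMAPE is tiny; the 13% median error reported above reflects real biological and measurement variability.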

  19. Biomarkers are used to predict quantitative metabolite concentration profiles in human red blood cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yurkovich, James T.; Yang, Laurence; Palsson, Bernhard O.

    Deep-coverage metabolomic profiling has revealed a well-defined development of metabolic decay in human red blood cells (RBCs) under cold storage conditions. A set of extracellular biomarkers has been recently identified that reliably defines the qualitative state of the metabolic network throughout this metabolic decay process. Here, we extend the utility of these biomarkers by using them to quantitatively predict the concentrations of other metabolites in the red blood cell. We are able to accurately predict the concentration profile of 84 of the 91 (92%) measured metabolites (p < 0.05) in RBC metabolism using only measurements of these five biomarkers. The median of prediction errors (symmetric mean absolute percent error) across all metabolites was 13%. Furthermore, the ability to predict numerous metabolite concentrations from a simple set of biomarkers offers the potential for the development of a powerful workflow that could be used to evaluate the metabolic state of a biological system using a minimal set of measurements.

  20. Prognostic Value of Quantitative Metabolic Metrics on Baseline Pre-Sunitinib FDG PET/CT in Advanced Renal Cell Carcinoma

    PubMed Central

    Minamimoto, Ryogo; Barkhodari, Amir; Harshman, Lauren; Srinivas, Sandy; Quon, Andrew

    2016-01-01

    Purpose The objective of this study was to prospectively evaluate various quantitative metrics on FDG PET/CT for monitoring sunitinib therapy and predicting prognosis in patients with metastatic renal cell cancer (mRCC). Methods Seventeen patients (mean age: 59.0 ± 11.6 years) prospectively underwent a baseline FDG PET/CT and interim PET/CT after 2 cycles (12 weeks) of sunitinib therapy. We measured the highest maximum standardized uptake value (SUVmax) of all identified lesions (highest SUVmax), the sum of SUVmax over up to six lesions (sum of SUVmax), total lesion glycolysis (TLG) and metabolic tumor volume (MTV) from baseline PET/CT and interim PET/CT, and the % decrease in the highest SUVmax (%Δ highest SUVmax), the % decrease in sum of SUVmax, the % decrease in TLG (%ΔTLG) and the % decrease in MTV (%ΔMTV) between baseline and interim PET/CT, and the imaging results were validated by clinical follow-up at 12 months after completion of therapy for progression-free survival (PFS). Results At 12-month follow-up, 6/17 (35.3%) patients achieved PFS, while 11/17 (64.7%) patients were deemed to have progression of disease or recurrence within the previous 12 months. At baseline, PET/CT demonstrated metabolically active cancer in all cases. Using baseline PET/CT alone, all of the quantitative imaging metrics were predictive of PFS. Using interim PET/CT, the %Δ highest SUVmax, %Δ sum of SUVmax, and %ΔTLG were also predictive of PFS. Otherwise, interim PET/CT showed no significant difference between the two survival groups regardless of the quantitative metric utilized, including MTV and TLG. Conclusions Quantitative metabolic measurements on baseline PET/CT appear to be predictive of PFS at 12 months post-therapy in patients scheduled to undergo sunitinib therapy for mRCC. Change between baseline and interim PET/CT also appeared to have prognostic value, but otherwise interim PET/CT after 12 weeks of sunitinib did not appear to be predictive of PFS. PMID:27123976
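
    The interval-change metrics above are simple percent decreases between baseline and interim scans, and TLG combines mean uptake with metabolic tumor volume. A minimal sketch with illustrative values (the numbers below are not from the study):

```python
# Hedged sketch of the quantitative PET metrics named above.
def percent_decrease(baseline, interim):
    """%Δ metric: percent decrease from baseline to interim scan."""
    return 100.0 * (baseline - interim) / baseline

def tlg(suv_mean, mtv_ml):
    """Total lesion glycolysis: SUVmean times metabolic tumor volume (mL)."""
    return suv_mean * mtv_ml

# e.g. highest SUVmax falling from 10.0 at baseline to 6.0 at interim
d_suvmax = percent_decrease(10.0, 6.0)  # → 40.0
example_tlg = tlg(suv_mean=4.0, mtv_ml=25.0)  # → 100.0
```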

  1. [Research on rapid and quantitative detection method for organophosphorus pesticide residue].

    PubMed

    Sun, Yuan-Xin; Chen, Bing-Tai; Yi, Sen; Sun, Ming

    2014-05-01

    Traditional pesticide residue detection relies on physical-chemical inspection methods that require extensive pretreatment and are time-consuming and complicated. In the present study, the authors take chlorpyrifos, which is widely applied in agriculture, as the research object and propose a rapid, quantitative detection method for organophosphorus pesticide residues. First, based on the chemical characteristics of chlorpyrifos and on the chromogenic performance and secondary pollution of several colorimetric reagents, a pretreatment scheme was selected in which chlorpyrifos undergoes a chromogenic reaction with resorcinol in a weakly alkaline environment. Second, analysis of the UV-Vis spectra of chlorpyrifos samples with contents between 0.5 and 400 mg kg-1 confirmed that the characteristic information after the color reaction is mainly concentrated between 360 and 400 nm. Third, a full-spectrum forecasting model was established based on partial least squares (PLS); its correlation coefficient of calibration was 0.9996, its correlation coefficient of prediction reached 0.9956, its root mean square error of calibration (RMSEC) was 2.8147 mg kg-1, and its root mean square error of prediction (RMSEP) was 8.0124 mg kg-1. Fourth, the wavelength region centered at 400 nm was extracted as the characteristic region to build a forecasting model, whose correlation coefficient of calibration was 0.9996, correlation coefficient of prediction reached 0.9993, RMSEC was 2.5667 mg kg-1, and RMSEP was 4.8866 mg kg-1, respectively. Finally, by analyzing the near-infrared spectra of chlorpyrifos samples with contents between 0.5 and 16 mg kg-1, the authors found that although the characteristic chromogenic functional group is not obvious, the absorption peaks of resorcinol itself change in the neighborhood of 5200 cm-1. These experimental results show that the proposed method is effective and feasible for rapid, quantitative detection of organophosphorus pesticide residues. In the method, the information in the full spectrum, especially the UV-Vis spectrum, is strengthened by the chromogenic reaction of a colorimetric reagent, which provides a new route to rapid pesticide-residue detection for agricultural products.
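    The calibration step described above (a PLS model regressing pesticide content on spectra) can be sketched with a minimal NIPALS PLS1 implementation on synthetic absorbance data; the peak shape, noise level, and component count below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1: regress y on spectra X (rows = samples)."""
    X = X - X.mean(axis=0)          # center predictors
    y = y - y.mean()                # center response
    Xk, yk = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)      # weight vector
        t = Xk @ w                  # scores
        p = Xk.T @ t / (t @ t)      # X loadings
        qk = yk @ t / (t @ t)       # y loading
        Xk = Xk - np.outer(t, p)    # deflate X
        yk = yk - qk * t            # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)   # regression vector

# synthetic "spectra": one absorbance peak scales with concentration
rng = np.random.default_rng(0)
conc = np.linspace(0.5, 400, 40)                    # mg/kg
wavelengths = np.linspace(360, 400, 50)             # nm
peak = np.exp(-((wavelengths - 380) / 8) ** 2)      # Gaussian band
X = np.outer(conc, peak) + 0.01 * rng.standard_normal((40, 50))

B = pls1_fit(X, conc, n_components=2)
pred = (X - X.mean(axis=0)) @ B + conc.mean()
rmsec = np.sqrt(np.mean((pred - conc) ** 2))        # calibration error
```

    In practice the component count would be chosen by cross-validation, and RMSEP would be reported on an independent prediction set, as in the abstract.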

  2. Understanding outbreaks of waterborne infectious disease: quantitative microbial risk assessment vs. epidemiology

    USDA-ARS?s Scientific Manuscript database

    Drinking water contaminated with microbial pathogens can cause outbreaks of infectious disease, and these outbreaks are traditionally studied using epidemiologic methods. Quantitative microbial risk assessment (QMRA) can predict – and therefore help prevent – such outbreaks, but it has never been r...

  3. Using Quantitative Structure-Activity Relationship Modeling to Quantitatively Predict the Developmental Toxicity of Halogenated Azole compounds

    EPA Science Inventory

    Developmental toxicity is a relevant endpoint for the comprehensive assessment of human health risk from chemical exposure. However, animal developmental toxicity studies remain unavailable for many environmental contaminants due to the complexity and cost of these types of analy...

  4. An overview of the nonequilibrium behavior of polymer glasses

    NASA Technical Reports Server (NTRS)

    Tant, M. R.; Wilkes, G. L.

    1981-01-01

    It is pointed out that research efforts are at present being directed in two areas, one comprising experimental studies of this phenomenon in various glassy polymer systems and the other involving the development of a quantitative theory capable of satisfactorily predicting aging behavior for a variety of polymer materials under different conditions. Recent work in both these areas is surveyed. The basic principles of nonequilibrium behavior are outlined, with emphasis placed on changes in material properties with annealing below the glass transition temperature. Free volume theory and thermodynamic theory are discussed.

  5. Clogging of Manifolds with Evaporatively Frozen Propellants. Part 2; Analysis

    NASA Technical Reports Server (NTRS)

    Simmon, J. A.; Gift, R. D.; Spurlock, J. M.

    1966-01-01

    The mechanisms of evaporative freezing of leaking propellant and the creation of flow stoppages within injector manifolds are discussed. A quantitative analysis of the conditions for the accumulation of evaporatively frozen propellant, including the existence of minimum and maximum leak rates, is presented. Clogging of the injector manifolds of the Apollo SPS and the Gemini OAMS engines by the freezing of leaking propellant is predicted, and the seriousness of the consequences is discussed. Based on the analysis, a realistic evaluation of selected techniques to eliminate flow stoppages by frozen propellant is made.

  6. Oscillational instabilities in single-mode acoustic levitators

    NASA Technical Reports Server (NTRS)

    Rudnick, Joseph; Barmatz, M.

    1990-01-01

    An extension of standard results for the acoustic force on an object in a single-mode resonant chamber yields predictions for the onset of oscillational instabilities when objects are levitated or positioned in these chambers. The results are consistent with experimental investigations. The present approach accounts for the effect of time delays on the response of a cavity to the motion of an object inside it. Quantitative features of the instabilities are investigated. The experimental conditions required for sample stability, saturation of sample oscillations, hysteretic effects, and the loss of the ability to levitate are discussed.

  7. Theory of cooperation in a micro-organismal snowdrift game

    NASA Astrophysics Data System (ADS)

    Wang, Zhenyu; Goldenfeld, Nigel

    2011-08-01

    We present a mean-field model for the phase diagram of a community of micro-organisms, interacting through their metabolism so that they are, in effect, engaging in a cooperative social game. We show that as a function of the concentration of the nutrients glucose and histidine, the community undergoes a phase transition separating a state in which one strain is dominant from a state characterized by coexisting populations. Our results are in good agreement with recent experimental results, correctly reproducing quantitative trends and predicting the phase diagram.

  8. Simulating Picosecond X-ray Diffraction from shocked crystals by Post-processing Molecular Dynamics Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimminau, G; Nagler, B; Higginbotham, A

    2008-06-19

    Calculations of the x-ray diffraction patterns from shocked crystals derived from the results of Non-Equilibrium-Molecular-Dynamics (NEMD) simulations are presented. The atomic coordinates predicted by the NEMD simulations combined with atomic form factors are used to generate a discrete distribution of electron density. A Fast-Fourier-Transform (FFT) of this distribution provides an image of the crystal in reciprocal space, which can be further processed to produce quantitative simulated data for direct comparison with experiments that employ picosecond x-ray diffraction from laser-irradiated crystalline targets.
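    The pipeline described above (atomic coordinates → discrete electron density → FFT → reciprocal-space intensity) can be illustrated with a toy one-dimensional crystal; the lattice constant, grid size, and point-like form factors are simplifying assumptions, not the NEMD post-processing code itself.

```python
import numpy as np

# Bin atomic positions onto a density grid, FFT, and read off Bragg
# peaks in reciprocal space. A perfect 1-D crystal with lattice
# constant a is used for illustration.
a, n_cells, grid = 4.0, 32, 1024
atoms = a * np.arange(n_cells)                  # atomic coordinates
L = a * n_cells                                 # box length

rho = np.zeros(grid)
idx = np.round(atoms / L * grid).astype(int) % grid
np.add.at(rho, idx, 1.0)                        # point-like electron density

F = np.fft.fft(rho)                             # structure factor F(k)
I = np.abs(F) ** 2                              # diffracted intensity
k = np.fft.fftfreq(grid, d=L / grid) * 2 * np.pi
# Bragg peaks appear only at k = 2*pi*m/a (every 32nd bin here);
# intermediate bins carry essentially zero intensity.
```

    For a shocked crystal, the NEMD coordinates are disordered and strained, so the peaks shift and broaden; the FFT then provides the quantitative pattern compared against picosecond diffraction data.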

  9. Simulation of Decomposition Kinetics of Supercooled Austenite in Powder Steel

    NASA Astrophysics Data System (ADS)

    Tsyganova, M. S.; Ivashko, A. G.; Polyshuk, I. N.; Nabatov, R. I.; Tsyganova, A. I.

    2017-10-01

    To validate heat-treatment modes for steel, quantitative data on austenite decomposition are required, and obtaining these data experimentally is extremely complicated. In the present work, several approaches to simulating the phase transformation process are proposed that take the structural characteristics of powder steels into account. Results of a comparative analysis of these approaches are also given. Prediction of the transformation kinetics by simulation is verified for PK40N2M (0.38% C, 2.10% Ni, 0.40% Mo) steel with 3% porosity and for PK80 (0.80% C) steel with different porosities, using published experimental data.

  10. Influence of study goals on study design and execution.

    PubMed

    Kirklin, J W; Blackstone, E H; Naftel, D C; Turner, M E

    1997-12-01

    From the viewpoint of a clinician who makes recommendations to patients about choosing from the multiple possible management schemes, quantitative information derived from statistical analyses of observational studies is useful. Although random assignment of therapy is optimal, appropriately performed studies in which therapy has been nonrandomly "assigned" are considered acceptable, albeit occasionally with limitations in inferences. The analyses are considered most useful when they generate multivariable equations suitable for predicting time-related outcomes in individual patients. Graphic presentations improve communication with patients and facilitate truly informed consent.

  11. Dataglove measurement of joint angles in sign language handshapes

    PubMed Central

    Eccarius, Petra; Bour, Rebecca; Scheidt, Robert A.

    2012-01-01

    In sign language research, we understand little about articulatory factors involved in shaping phonemic boundaries or the amount (and articulatory nature) of acceptable phonetic variation between handshapes. To date, there exists no comprehensive analysis of handshape based on the quantitative measurement of joint angles during sign production. The purpose of our work is to develop a methodology for collecting and visualizing quantitative handshape data in an attempt to better understand how handshapes are produced at a phonetic level. In this pursuit, we seek to quantify the flexion and abduction angles of the finger joints using a commercial data glove (CyberGlove; Immersion Inc.). We present calibration procedures used to convert raw glove signals into joint angles. We then implement those procedures and evaluate their ability to accurately predict joint angle. Finally, we provide examples of how our recording techniques might inform current research questions. PMID:23997644
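    A calibration of the kind described (converting raw glove signals into joint angles) is often a per-joint linear map fit from reference poses; the sketch below assumes a simple gain/offset model with made-up readings and pose angles, not the authors' actual CyberGlove procedure.

```python
import numpy as np

# Hypothetical calibration: the glove reports a raw sensor value s, and
# calibration maps it to a joint angle via angle = gain * s + offset.
# Gain and offset are fit from poses with known angles (e.g. flat hand
# = 0 deg, fist = 90 deg); all values here are illustrative.
known_angles = np.array([0.0, 30.0, 60.0, 90.0])     # degrees, reference poses
raw_readings = np.array([12.0, 57.0, 103.0, 148.0])  # raw glove units

# least-squares fit of [gain, offset]
A = np.vstack([raw_readings, np.ones_like(raw_readings)]).T
gain, offset = np.linalg.lstsq(A, known_angles, rcond=None)[0]

def raw_to_angle(s):
    """Convert a raw sensor reading to a flexion angle in degrees."""
    return gain * s + offset

angle = raw_to_angle(80.0)   # predict the angle for a new raw reading
```

    Real calibrations must also handle cross-coupling between sensors (e.g. abduction sensors responding to flexion), which is one reason the authors develop a dedicated procedure rather than relying on factory defaults.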

  12. Directed differential connectivity graph of interictal epileptiform discharges

    PubMed Central

    Amini, Ladan; Jutten, Christian; Achard, Sophie; David, Olivier; Soltanian-Zadeh, Hamid; Hossein-Zadeh, Gh. Ali; Kahane, Philippe; Minotti, Lorella; Vercueil, Laurent

    2011-01-01

    In this paper, we study temporal couplings between interictal events of spatially remote regions in order to localize the leading epileptic regions from intracerebral electroencephalogram (iEEG) recordings. We aim to assess whether quantitative epileptic graph analysis during the interictal period may be helpful in predicting the seizure onset zone of ictal iEEG. Using the wavelet transform, the cross-correlation coefficient, and multiple hypothesis testing, we propose a differential connectivity graph (DCG) to represent the connections that change significantly between epileptic and non-epileptic states as defined by the interictal events. Post-processing steps based on mutual information and multi-objective optimization are proposed to localize the leading epileptic regions through the DCG. The suggested approach is applied to iEEG recordings of five patients suffering from focal epilepsy. Quantitative comparisons of the proposed epileptic regions with ictal onset zones detected by visual inspection and by electrically stimulated seizures reveal good performance of the present method. PMID:21156385
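    As a rough illustration of the coupling step, the sketch below connects two channels when the peak normalized cross-correlation over a window of lags exceeds a threshold; the threshold, lag range, and synthetic signals are assumptions, and the authors' full pipeline (wavelet transform, multiple-hypothesis testing, DCG construction) is not reproduced.

```python
import numpy as np

def max_xcorr(x, y, max_lag=50):
    """Peak normalized cross-correlation of x and y over +/- max_lag samples."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    vals = [np.dot(x[max(0, -l):n - max(0, l)],
                   y[max(0, l):n - max(0, -l)]) / n
            for l in range(-max_lag, max_lag + 1)]
    return max(vals)

rng = np.random.default_rng(1)
n = 1000
s = rng.standard_normal(n)
chan_a = s + 0.2 * rng.standard_normal(n)
chan_b = np.roll(s, 7) + 0.2 * rng.standard_normal(n)   # delayed copy of s
chan_c = rng.standard_normal(n)                         # independent channel

coupled = max_xcorr(chan_a, chan_b) > 0.5       # edge A-B (delayed coupling)
independent = max_xcorr(chan_a, chan_c) > 0.5   # no edge A-C
```

    In the paper the significance of each coupling is assessed statistically across interictal events rather than by a fixed threshold, and the graph is built from couplings that *change* between states.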

  13. Quantitative structure-toxicity relationship (QSTR) studies on the organophosphate insecticides.

    PubMed

    Can, Alper

    2014-11-04

    Organophosphate insecticides are the most commonly used pesticides in the world. In this study, quantitative structure-toxicity relationship (QSTR) models were derived for estimating the acute oral toxicity of organophosphate insecticides to male rats. The 20 chemicals of the training set and the seven compounds of the external test set were characterized by molecular descriptors: descriptors for lipophilicity, polarity and molecular geometry, as well as quantum chemical descriptors for energy, were calculated. Model development to predict the toxicity of organophosphate insecticides in different matrices was carried out using multiple linear regression. The model was validated internally and externally. In the present study, a QSTR model was used for the first time to understand the inherent relationships between organophosphate insecticide molecules and their toxicity behavior. Such studies provide mechanistic insight into structure-toxicity relationships and help in the design of less toxic insecticides. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
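    The model-building step (multiple linear regression of toxicity on descriptors, checked on an external test set) can be sketched on synthetic data; the descriptor values and coefficients below are illustrative, with only the 20/7 train/test split taken from the abstract.

```python
import numpy as np

# Synthetic QSTR-style dataset: toxicity modeled as a linear function of
# three descriptor columns (stand-ins for lipophilicity, polarity, etc.).
rng = np.random.default_rng(42)
n_train, n_test, n_desc = 20, 7, 3
X = rng.standard_normal((n_train + n_test, n_desc))   # descriptor matrix
true_coef = np.array([1.5, -0.8, 0.3])                # assumed, for the demo
y = X @ true_coef + 0.1 * rng.standard_normal(n_train + n_test)

Xtr, ytr = X[:n_train], y[:n_train]
Xte, yte = X[n_train:], y[n_train:]

# multiple linear regression: fit intercept + coefficients by least squares
A = np.hstack([np.ones((n_train, 1)), Xtr])
beta = np.linalg.lstsq(A, ytr, rcond=None)[0]

# external validation: R^2 on the held-out 7 compounds
pred = np.hstack([np.ones((n_test, 1)), Xte]) @ beta
r2_ext = 1 - np.sum((yte - pred) ** 2) / np.sum((yte - yte.mean()) ** 2)
```

    Internal validation (e.g. leave-one-out q²) would be computed analogously by refitting on each reduced training set.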

  14. Development and psychometric evaluation of a quantitative measure of "fat talk".

    PubMed

    MacDonald Clarke, Paige; Murnen, Sarah K; Smolak, Linda

    2010-01-01

    Based on her anthropological research, Nichter (2000) concluded that it is normative for many American girls to engage in body self-disparagement in the form of "fat talk." The purpose of the present two studies was to develop a quantitative measure of fat talk. A series of 17 scenarios were created in which "Naomi" is talking with a female friend(s) and there is an expression of fat talk. College women respondents rated the frequency with which they would behave in a similar way as the women in each scenario. A nine-item one-factor scale was determined through principal components analysis and its scores yielded evidence of internal consistency reliability, test-retest reliability over a five-week time period, construct validity, discriminant validity, and incremental validity in that it predicted unique variance in body shame and eating disorder symptoms above and beyond other measures of self-objectification. Copyright 2009 Elsevier Ltd. All rights reserved.

  15. Quantitative structure activity relationships from optimised ab initio bond lengths: steroid binding affinity and antibacterial activity of nitrofuran derivatives

    NASA Astrophysics Data System (ADS)

    Smith, P. J.; Popelier, P. L. A.

    2004-02-01

    The present-day abundance of cheap computing power enables the use of quantum chemical ab initio data in Quantitative Structure-Activity Relationships (QSARs). Optimised bond lengths are a new class of such descriptors, which we have successfully used previously to represent electronic effects in medicinal and ecological QSARs (enzyme inhibitory activity, hydrolysis rate constants and pKa values). Here we use AM1 and HF/3-21G* bond lengths in conjunction with Partial Least Squares (PLS) and a Genetic Algorithm (GA) to predict the Corticosteroid-Binding Globulin (CBG) binding activity of the classic steroid data set, and the antibacterial activity of nitrofuran derivatives. The current procedure, which does not require molecular alignment, produces good r2 and q2 values. Moreover, it highlights regions in the common steroid skeleton deemed relevant to the active regions of the steroids and nitrofuran derivatives.

  16. Quantitative structure-activity relationships of selective antagonists of glucagon receptor using QuaSAR descriptors.

    PubMed

    Manoj Kumar, Palanivelu; Karthikeyan, Chandrabose; Hari Narayana Moorthy, Narayana Subbiah; Trivedi, Piyush

    2006-11-01

    In the present paper, a quantitative structure-activity relationship (QSAR) approach was applied to understand the affinity and selectivity of a novel series of triaryl imidazole derivatives towards the glucagon receptor. Statistically significant and highly predictive QSARs were derived for glucagon receptor inhibition by triaryl imidazoles using QuaSAR descriptors of the molecular operating environment (MOE), employing a computer-assisted multiple regression procedure. The generated QSAR models revealed that factors related to hydrophobicity, molecular shape and geometry predominantly influence the glucagon receptor binding affinity of the triaryl imidazoles, indicating the relevance of shape-specific steric interactions between the molecule and the receptor. Further, QSAR models formulated for selective inhibition of the glucagon receptor over p38 mitogen-activated protein (MAP) kinase for the compounds in the series highlight that the same structural features which influence glucagon receptor affinity also contribute to their selective inhibition.

  17. Quantitative visualization of passive transport across bilayer lipid membranes

    PubMed Central

    Grime, John M. A.; Edwards, Martin A.; Rudd, Nicola C.; Unwin, Patrick R.

    2008-01-01

    The ability to predict and interpret membrane permeation coefficients is of critical importance, particularly because passive transport is crucial for the effective delivery of many pharmaceutical agents to intracellular targets. We present a method for the quantitative measurement of the permeation coefficients of protonophores by using laser confocal scanning microscopy coupled to microelectrochemistry, which is amenable to precise modeling with the finite element method. The technique delivers well defined and high mass transport rates and allows rapid visualization of the entire pH distribution on both the cis and trans side of model bilayer lipid membranes (BLMs). A homologous series of carboxylic acids was investigated as probe molecules for BLMs composed of soybean phosphatidylcholine. Significantly, the permeation coefficient decreased with acyl tail length contrary to previous work and to Overton's rule. The reasons for this difference are considered, and we suggest that the applicability of Overton's rule requires re-evaluation. PMID:18787114

  18. Performance prediction of electrohydrodynamic thrusters by the perturbation method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shibata, H., E-mail: shibata@daedalus.k.u-tokyo.ac.jp; Watanabe, Y.; Suzuki, K.

    2016-05-15

    In this paper, we present a novel method for analyzing electrohydrodynamic (EHD) thrusters. The method is based on a perturbation technique applied to a set of drift-diffusion equations, similar to the one introduced in our previous study on estimating breakdown voltage. The thrust-to-current ratio is generalized to represent the performance of EHD thrusters. We have compared the thrust-to-current ratio obtained theoretically with that obtained from the proposed method under atmospheric air conditions, and we have obtained good quantitative agreement. Also, we have conducted numerical simulations in more complex thruster geometries, such as the dual-stage thruster developed by Masuyama and Barrett [Proc. R. Soc. A 469, 20120623 (2013)]. We quantitatively clarify the fact that if the magnitude of a third electrode voltage is low, the effective gap distance shortens, whereas if the magnitude of the third electrode voltage is sufficiently high, the effective gap distance lengthens.

  19. QR-STEM: Energy and Environment as a Context for Improving QR and STEM Understandings of 6-12 Grade Teachers II. The Quantitative Reasoning

    NASA Astrophysics Data System (ADS)

    Mayes, R.; Lyford, M. E.; Myers, J. D.

    2009-12-01

    The Quantitative Reasoning in STEM (QR STEM) project is a state level Mathematics and Science Partnership Project (MSP) with a focus on the mathematics and statistics that underlies the understanding of complex global scientific issues. This session is a companion session to the QR STEM: The Science presentation. The focus of this session is the quantitative reasoning aspects of the project. As students move from understandings that range from local to global in perspective on issues of energy and environment, there is a significant increase in the need for mathematical and statistical conceptual understanding. These understandings must be accessible to the students within the scientific context, requiring the special understandings that are endemic within quantitative reasoning. The QR STEM project brings together interdisciplinary teams of higher education faculty and middle/high school teachers to explore complex problems in energy and environment. The disciplines include life sciences, physics, chemistry, earth science, statistics, and mathematics. These interdisciplinary teams develop open ended performance tasks to implement in the classroom, based on scientific concepts that underpin energy and environment. Quantitative reasoning is broken down into three components: Quantitative Literacy, Quantitative Interpretation, and Quantitative Modeling. Quantitative Literacy is composed of arithmetic concepts such as proportional reasoning, numeracy, and descriptive statistics. Quantitative Interpretation includes algebraic and geometric concepts that underlie the ability to interpret a model of natural phenomena which is provided for the student. This model may be a table, graph, or equation from which the student is to make predictions or identify trends, or from which they would use statistics to explore correlations or patterns in data. 
Quantitative Modeling is the ability to develop the model from data, including the ability to test hypotheses using statistical procedures. We use the term model very broadly, so it includes visual models such as box models, as well as best-fit equation models and hypothesis testing. One of the powerful outcomes of the project is the conversation which takes place between science teachers and mathematics teachers. First, they realize that though they are teaching concepts that cross their disciplines, the barrier of scientific language within their subjects restricts students from applying the concepts across subjects. Second, the mathematics teachers discover the context of science as a means of providing real-world situations that engage students in the utility of mathematics as a tool for solving problems. Third, the science teachers discover the barrier to understanding science that is presented by poor quantitative reasoning ability. Finally, the students are engaged in exploring energy and environment in a manner which exposes the importance of seeing a problem from multiple interdisciplinary perspectives. The outcome is a democratic citizen capable of making informed decisions, and perhaps a future scientist.

  20. Semiquantitative culture of Gardnerella vaginalis in laboratory determination of nonspecific vaginitis.

    PubMed Central

    Ratnam, S; Fitzgerald, B L

    1983-01-01

    To evaluate the usefulness of quantitative cultures of Gardnerella vaginalis in the laboratory determination of nonspecific vaginitis, the actual and relative numbers of G. vaginalis in genital cultures of a general patient population were assessed semiquantitatively, and the laboratory results were then correlated with the clinical findings. Of the 1,585 women studied, 417 (26.3%) yielded G. vaginalis in culture. Of these, only 113 (27.1%) were found to have symptoms and signs consistent with nonspecific vaginitis. G. vaginalis was obtained in pure or predominant growth from 87 of 100 consecutive cases with nonspecific vaginitis and 32 of 100 consecutive cases without the symptoms or signs of vaginitis (P < 0.001). Hence, the positive predictive value of isolation of G. vaginalis in pure and predominant growths was determined to be 73% (87 of 119). Conversely, G. vaginalis was isolated in mixed or light growth significantly more often from asymptomatic women than from symptomatic patients, i.e., 68 versus 13 cases. Therefore, the negative predictive value of isolation of G. vaginalis in mixed and light growths was found to be 84% (68 of 81). Quantitation of the relative amount of G. vaginalis growth had higher predictive values as compared with the assessment of G. vaginalis growth alone. We conclude that quantitative culture of G. vaginalis is essential to obtain maximum reliability of culture results in the laboratory determination of nonspecific vaginitis. Although quantitated cultures of G. vaginalis have high predictive values, laboratory results must be interpreted in conjunction with the clinical findings. PMID:6604735
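    The predictive values quoted above follow directly from the reported counts; a quick check of the arithmetic:

```python
# Positive predictive value: pure/predominant growth,
# with vaginitis (87) vs. without (32)
tp, fp = 87, 32
ppv = tp / (tp + fp)     # 87 / 119

# Negative predictive value: mixed/light growth,
# asymptomatic (68) vs. symptomatic (13)
tn, fn = 68, 13
npv = tn / (tn + fn)     # 68 / 81

print(round(ppv * 100), round(npv * 100))   # → 73 84
```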

  1. Training Signaling Pathway Maps to Biochemical Data with Constrained Fuzzy Logic: Quantitative Analysis of Liver Cell Responses to Inflammatory Stimuli

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.

    2011-01-01

    Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212
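    A core ingredient of cFL is replacing Boolean gates with graded transfer functions; the sketch below uses a normalized Hill function (f(0)=0, f(1)=1) with min/max gates, which conveys the flavor of the approach. The parameters and the toy two-edge network are assumptions for illustration, not the trained models from the paper.

```python
import numpy as np

def hill(x, n=3.0, k=0.5):
    """Normalized Hill transfer function: maps [0,1] to [0,1] with f(1)=1."""
    return (1 + k ** n) * x ** n / (k ** n + x ** n)

def or_gate(a, b):   # fuzzy OR: a species activated by either input
    return np.maximum(a, b)

def and_gate(a, b):  # fuzzy AND: a species requiring both inputs
    return np.minimum(a, b)

# cytokine input in [0,1] propagated through two parallel edges
# that merge with an OR gate at a downstream protein
stimulus = np.linspace(0.0, 1.0, 5)
node_a = hill(stimulus)                 # receptor-proximal activation
node_b = hill(stimulus, n=2.0, k=0.7)   # parallel pathway, different params
output = or_gate(node_a, node_b)        # graded downstream activity
```

    Training in cFL then amounts to fitting the Hill parameters (and pruning edges) so that simulated steady-state activities match the measured protein data.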

  2. Animal versus human oral drug bioavailability: Do they correlate?

    PubMed Central

    Musther, Helen; Olivares-Morales, Andrés; Hatley, Oliver J.D.; Liu, Bo; Rostami Hodjegan, Amin

    2014-01-01

    Oral bioavailability is a key consideration in the development of drug products, and the use of preclinical species to predict bioavailability in human has long been debated. In order to clarify whether any correlation between human and animal bioavailability exists, an extensive analysis of the published literature data was conducted. Due to the complex nature of bioavailability calculations, inclusion criteria were applied to ensure the integrity of the data. A database of 184 compounds was assembled. Linear regression for the reported compounds indicated no strong or predictive correlations to human data for all species, individually or combined. The lack of correlation in this extended dataset highlights that animal bioavailability is not quantitatively predictive of bioavailability in human. Although qualitative (high/low bioavailability) indications might be possible, models taking into account species-specific factors that may affect bioavailability are recommended for developing quantitative predictions. PMID:23988844

  3. Novel Applications of Multi-task Learning and Multiple Output Regression to Multiple Genetic Trait Prediction

    USDA-ARS?s Scientific Manuscript database

    Given a set of biallelic molecular markers, such as SNPs, with genotype values encoded numerically on a collection of plant, animal or human samples, the goal of genetic trait prediction is to predict the quantitative trait values by simultaneously modeling all marker effects. Genetic trait predicti...

  4. Prediction of Safety Margin and Optimization of Dosing Protocol for a Novel Antibiotic using Quantitative Systems Pharmacology Modeling.

    PubMed

    Woodhead, Jeffrey L; Paech, Franziska; Maurer, Martina; Engelhardt, Marc; Schmitt-Hoffmann, Anne H; Spickermann, Jochen; Messner, Simon; Wind, Mathias; Witschi, Anne-Therese; Krähenbühl, Stephan; Siler, Scott Q; Watkins, Paul B; Howell, Brett A

    2018-06-07

    Elevations of liver enzymes have been observed in clinical trials with BAL30072, a novel antibiotic. In vitro assays have identified potential mechanisms for the observed hepatotoxicity, including electron transport chain (ETC) inhibition and reactive oxygen species (ROS) generation. DILIsym, a quantitative systems pharmacology (QSP) model of drug-induced liver injury, has been used to predict the likelihood that each mechanism explains the observed toxicity. DILIsym was also used to predict the safety margin for a novel BAL30072 dosing scheme; it was predicted to be low. DILIsym was then used to recommend potential modifications to this dosing scheme: weight-adjusted dosing, together with a requirement to assay plasma alanine aminotransferase (ALT) daily and to stop dosing as soon as an ALT increase is observed, improved the predicted safety margin of BAL30072 and decreased the predicted likelihood of severe injury. This research demonstrates a potential application of QSP modeling in improving the safety profile of candidate drugs. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  5. Clinical-Radiological Parameters Improve the Prediction of the Thrombolysis Time Window by Both MRI Signal Intensities and DWI-FLAIR Mismatch.

    PubMed

    Madai, Vince Istvan; Wood, Carla N; Galinovic, Ivana; Grittner, Ulrike; Piper, Sophie K; Revankar, Gajanan S; Martin, Steve Z; Zaro-Weber, Olivier; Moeller-Hartmann, Walter; von Samson-Himmelstjerna, Federico C; Heiss, Wolf-Dieter; Ebinger, Martin; Fiebach, Jochen B; Sobesky, Jan

    2016-01-01

    With regard to acute stroke, patients with unknown time from stroke onset are not eligible for thrombolysis. Quantitative diffusion-weighted imaging (DWI) and fluid-attenuated inversion recovery (FLAIR) MRI relative signal intensity (rSI) biomarkers have been introduced to predict eligibility for thrombolysis, but have shown heterogeneous results in the past. In the present work, we investigated whether the inclusion of easily obtainable clinical-radiological parameters would improve the prediction of the thrombolysis time window by rSIs, and compared their performance to the visual DWI-FLAIR mismatch. In a retrospective study, patients from two centers with proven stroke and onset <12 h were included. The DWI lesion was segmented and overlaid on ADC and FLAIR images. The rSI mean and SD were calculated as (mean ROI value / mean value of the unaffected hemisphere). Additionally, the visual DWI-FLAIR mismatch was evaluated. Prediction of the thrombolysis time window was evaluated by the area under the curve (AUC) derived from receiver operating characteristic (ROC) analysis. The association of age, National Institutes of Health Stroke Scale score, MRI field strength, lesion size, vessel occlusion and Wahlund score with rSI was investigated, and the models were adjusted and stratified accordingly. In 82 patients, the unadjusted rSI measures DWI-mean and DWI-SD showed the highest AUCs (0.86-0.87). Adjustment for clinical-radiological covariates significantly improved the performance of FLAIR-mean (0.91) and DWI-SD (0.91). The best prediction results based on the AUC were found for the final stratified and adjusted models of DWI-SD (0.94) and FLAIR-mean (0.96) and for a multivariable DWI-FLAIR model (0.95). The adjusted visual DWI-FLAIR mismatch did not perform significantly worse (0.89). ADC rSIs showed fair performance in all models.
Quantitative DWI and FLAIR MRI biomarkers as well as the visual DWI-FLAIR mismatch provide excellent prediction of eligibility for thrombolysis in acute stroke, when easily obtainable clinical-radiological parameters are included in the prediction models. © 2016 S. Karger AG, Basel.

  6. Validation metrics for turbulent plasma transport

    DOE PAGES

    Holland, C.

    2016-06-22

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and to the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations are reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  7. Translational Modeling to Guide Study Design and Dose Choice in Obesity Exemplified by AZD1979, a Melanin‐concentrating Hormone Receptor 1 Antagonist

    PubMed Central

    Trägårdh, M; Lindén, D; Ploj, K; Johansson, A; Turnbull, A; Carlsson, B; Antonsson, M

    2017-01-01

    In this study, we present the translational modeling used in the discovery of AZD1979, a melanin‐concentrating hormone receptor 1 (MCHr1) antagonist aimed for treatment of obesity. The model quantitatively connects the relevant biomarkers and thereby closes the scaling path from rodent to man, as well as from dose to effect level. The complexity of individual modeling steps depends on the quality and quantity of data as well as the prior information; from semimechanistic body‐composition models to standard linear regression. Key predictions are obtained by standard forward simulation (e.g., predicting effect from exposure), as well as non‐parametric input estimation (e.g., predicting energy intake from longitudinal body‐weight data), across species. The work illustrates how modeling integrates data from several species, fills critical gaps between biomarkers, and supports experimental design and human dose‐prediction. We believe this approach can be of general interest for translation in the obesity field, and might inspire translational reasoning more broadly. PMID:28556607

  8. Predicting the electronic properties of aqueous solutions from first-principles

    NASA Astrophysics Data System (ADS)

    Schwegler, Eric; Pham, Tuan Anh; Govoni, Marco; Seidel, Robert; Bradforth, Stephen; Galli, Giulia

    Predicting the electronic properties of aqueous liquids has been a long-standing challenge for quantum-mechanical methods. Yet it is a crucial step in understanding and predicting the key role played by aqueous solutions and electrolytes in a wide variety of emerging energy and environmental technologies, including battery and photoelectrochemical cell design. Here we propose an efficient and accurate approach to predict the electronic properties of aqueous solutions, based on the combination of first-principles methods and experimental validation using state-of-the-art spectroscopic measurements. We present results for the photoelectron spectra of a broad range of solvated ions, showing that first-principles molecular dynamics simulations and electronic structure calculations using dielectric hybrid functionals provide a quantitative description of the electronic properties, including excitation energies, of both the solvent and the solutes. The proposed computational framework is general and applicable to other liquids, thereby offering great promise in understanding and engineering solutions and liquid electrolytes for a variety of important energy technologies. Part of this work was performed under the auspices of the U.S. Department of Energy at LLNL under Contract DE-AC52-07NA27344.

  9. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C.

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure and limitations of commonly used global transport model metrics are reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  10. Inductive reasoning about causally transmitted properties.

    PubMed

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.
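A toy version of causal transmission over a food web (our own illustrative sketch, not the paper's actual Bayesian model) propagates a disease from prey to predator with a fixed per-link probability and combines multiple dietary routes with a noisy-OR. The web, species names, and transmission probability are all invented; treating routes as independent is an approximation the full probabilistic model would not need.

```python
T_LINK = 0.7  # hypothetical per-link transmission probability

# Toy food web: each species maps to the prey it eats; `order` lists
# species so that every prey precedes its predators (topological order).
order = ["grass", "insect", "frog", "heron"]
eats = {"insect": ["grass"], "frog": ["insect"], "heron": ["frog", "insect"]}

def infection_prob(order, eats, source, t):
    """Propagate a disease introduced at `source` up the food web.
    Each dietary route transmits independently with probability t, and
    routes combine by noisy-OR: a predator escapes infection only if
    every route fails."""
    prob = {s: 0.0 for s in order}
    prob[source] = 1.0
    for sp in order:
        if sp == source:
            continue
        p_no_route = 1.0
        for prey in eats.get(sp, []):
            p_no_route *= 1.0 - prob[prey] * t
        prob[sp] = 1.0 - p_no_route
    return prob

p = infection_prob(order, eats, "grass", T_LINK)
```

A taxonomic model, by contrast, would score species by tree distance rather than by dietary routes, which is the double dissociation the third experiment probes.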

  11. [Influence of Spectral Pre-Processing on PLS Quantitative Model of Detecting Cu in Navel Orange by LIBS].

    PubMed

    Li, Wen-bing; Yao, Lin-tao; Liu, Mu-hua; Huang, Lin; Yao, Ming-yin; Chen, Tian-bing; He, Xiu-wen; Yang, Ping; Hu, Hui-qin; Nie, Jiang-hui

    2015-05-01

    Cu in navel orange was detected rapidly by laser-induced breakdown spectroscopy (LIBS) combined with partial least squares (PLS) quantitative analysis, and the effect of different spectral data pretreatment methods on the detection accuracy of the model was explored. Spectral data for the 52 Gannan navel orange samples were pretreated by different data smoothing, mean centering and standard normal variate transforms. The 319~338 nm wavelength section containing characteristic spectral lines of Cu was then selected to build PLS models, and the main evaluation indexes of the models, such as the regression coefficient (r), root mean square error of cross validation (RMSECV) and root mean square error of prediction (RMSEP), were compared and analyzed. After 13-point smoothing and mean centering, these three indicators reached 0.9928, 3.43 and 3.4, respectively, and the average relative error of the prediction model was only 5.55%; in short, this model gave the best calibration and prediction quality. The results show that by selecting an appropriate data pre-processing method, the prediction accuracy of PLS quantitative models for fruits and vegetables detected by LIBS can be improved effectively, providing a new method for fast and accurate detection of fruits and vegetables by LIBS.
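The pre-processing steps compared in the study can be sketched as follows; this is a hedged stand-in, with a boxcar smoother in place of the paper's 13-point smoothing and invented toy spectra:

```python
def smooth(spectrum, window=13):
    """Boxcar (moving-average) smoothing; a simpler stand-in for the
    13-point smoothing used in the study (a Savitzky-Golay filter would
    preserve peak shapes better). Edges use a truncated window."""
    half = window // 2
    out = []
    for i in range(len(spectrum)):
        lo, hi = max(0, i - half), min(len(spectrum), i + half + 1)
        out.append(sum(spectrum[lo:hi]) / (hi - lo))
    return out

def mean_center(spectra):
    """Subtract the across-sample mean at each wavelength: the 'mean
    centering' applied before building the PLS model."""
    n = len(spectra)
    means = [sum(s[j] for s in spectra) / n for j in range(len(spectra[0]))]
    return [[s[j] - means[j] for j in range(len(s))] for s in spectra]

# Two tiny invented spectra: smooth each, then mean-center the set.
pretreated = mean_center([smooth(s, window=3)
                          for s in ([0.0, 0.0, 10.0, 0.0, 0.0],
                                    [0.0, 10.0, 0.0, 0.0, 0.0])])
```

The pretreated matrix would then be handed to a PLS regression fitted on the 319~338 nm section.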

  12. The Interrelationship between Promoter Strength, Gene Expression, and Growth Rate

    PubMed Central

    Klesmith, Justin R.; Detwiler, Emily E.; Tomek, Kyle J.; Whitehead, Timothy A.

    2014-01-01

    In exponentially growing bacteria, expression of heterologous protein impedes cellular growth rates. A quantitative understanding of the relationship between expression and growth rate will advance our ability to forward engineer bacteria, which is important for metabolic engineering and synthetic biology applications. A recent study described a scaling model based on optimal allocation of ribosomes for protein translation. This model quantitatively predicts a linear relationship between microbial growth rate and heterologous protein expression with no free parameters. With the aim of validating this model, we have rigorously quantified the fitness cost of gene expression by using a library of synthetic constitutive promoters to drive expression of two separate proteins (eGFP and amiE) in different E. coli strains and growth media. In all cases, we demonstrate that the fitness cost is consistent with the previous findings. We expand upon the previous theory by introducing a simple promoter activity model to quantitatively predict how basal promoter strength relates to growth rate and protein expression. We then estimate the amount of protein expression needed to support high flux through a heterologous metabolic pathway and predict the sizable fitness cost associated with enzyme production. This work has broad implications across the applied biological sciences because it allows prediction of the interplay between promoter strength, protein expression, and the resulting cost to microbial growth rates. PMID:25286161
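The linear scaling relationship being validated can be written as a one-line function. The proteome-fraction cap below is a hypothetical value chosen for illustration, not a number reported in the paper:

```python
def relative_growth_rate(expression_fraction, phi_cap=0.48):
    """Scaling-model prediction: growth rate (relative to an
    expression-free strain) falls linearly with the proteome fraction
    devoted to heterologous protein. phi_cap, the fraction whose loss
    would halt growth, is a hypothetical illustrative value; once it is
    measured, the linear relationship has no remaining free parameters."""
    return 1.0 - expression_fraction / phi_cap

# e.g. devoting 10% of the proteome to a heterologous enzyme cuts growth
# to roughly 79% of the unburdened rate under this assumed phi_cap.
burdened = relative_growth_rate(0.10)
```

The paper's estimate of the expression needed for high pathway flux amounts to reading this line at a large expression fraction and noting how steep the fitness cost becomes.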

  13. Quantitative Assessment of Thermodynamic Constraints on the Solution Space of Genome-Scale Metabolic Models

    PubMed Central

    Hamilton, Joshua J.; Dwivedi, Vivek; Reed, Jennifer L.

    2013-01-01

    Constraint-based methods provide powerful computational techniques to allow understanding and prediction of cellular behavior. These methods rely on physicochemical constraints to eliminate infeasible behaviors from the space of available behaviors. One such constraint is thermodynamic feasibility, the requirement that intracellular flux distributions obey the laws of thermodynamics. The past decade has seen several constraint-based methods that interpret this constraint in different ways, including those that are limited to small networks, rely on predefined reaction directions, and/or neglect the relationship between reaction free energies and metabolite concentrations. In this work, we utilize one such approach, thermodynamics-based metabolic flux analysis (TMFA), to make genome-scale, quantitative predictions about metabolite concentrations and reaction free energies in the absence of prior knowledge of reaction directions, while accounting for uncertainties in thermodynamic estimates. We applied TMFA to a genome-scale network reconstruction of Escherichia coli and examined the effect of thermodynamic constraints on the flux space. We also assessed the predictive performance of TMFA against gene essentiality and quantitative metabolomics data, under both aerobic and anaerobic, and optimal and suboptimal growth conditions. Based on these results, we propose that TMFA is a useful tool for validating phenotypes and generating hypotheses, and that additional types of data and constraints can improve predictions of metabolite concentrations. PMID:23870272
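The relationship between reaction free energies and metabolite concentrations that TMFA exploits is the standard transformed free energy, dG' = dG'° + RT ln Q. A minimal sketch with unit stoichiometry and invented numbers:

```python
import math

R = 8.314e-3   # gas constant, kJ/(mol*K)
TEMP = 298.15  # K

def reaction_dg(dg0_prime, products_m, substrates_m):
    """Transformed reaction free energy dG' = dG'0 + RT*ln(Q), where Q is
    the product of product concentrations over substrate concentrations
    (mol/L; stoichiometric coefficients of one assumed for simplicity).
    A thermodynamic-feasibility constraint requires dG' < 0 for every
    reaction carrying flux in the forward direction."""
    q = 1.0
    for c in products_m:
        q *= c
    for c in substrates_m:
        q /= c
    return dg0_prime + R * TEMP * math.log(q)

# Hypothetical reaction with dG'0 = +5 kJ/mol: it becomes feasible only
# when the substrate is held ~100-fold above the product concentration.
dg = reaction_dg(5.0, products_m=[1e-5], substrates_m=[1e-3])
```

This is why TMFA can bound metabolite concentrations: fixing the sign of dG' for flux-carrying reactions constrains the admissible concentration ratios.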

  14. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km2), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated from different inventories, classifiers and predictors differed in appearance, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. 
However, the highest predictive performances were obtained for maps that explicitly expressed geomorphically implausible relationships, indicating that the predictive performance of a model might be misleading when a predictor systematically relates to a spatially consistent bias of the inventory. Furthermore, we observed that random forest-based maps displayed spatial artifacts. The most plausible susceptibility map of the study area showed smooth prediction surfaces, while the underlying model revealed a high predictive capability and was generated with an accurate landslide inventory and predictors that did not directly describe a bias. However, none of the presented models was found to be completely unbiased. This study showed that high predictive performance cannot be equated with high plausibility and applicability of the resulting landslide susceptibility maps. We suggest that greater emphasis be placed on identifying confounding factors and biases in landslide inventories. A joint discussion in the field between modelers and decision makers of the spatial pattern of the final susceptibility maps might increase their acceptance and applicability.

  15. Assessment of MRI-Based Automated Fetal Cerebral Cortical Folding Measures in Prediction of Gestational Age in the Third Trimester.

    PubMed

    Wu, J; Awate, S P; Licht, D J; Clouchoux, C; du Plessis, A J; Avants, B B; Vossough, A; Gee, J C; Limperopoulos, C

    2015-07-01

    Traditional methods of dating a pregnancy based on history or sonographic assessment have a large variation in the third trimester. We aimed to assess the ability of various quantitative measures of brain cortical folding on MR imaging to determine fetal gestational age in the third trimester. We evaluated 8 different quantitative cortical folding measures to predict gestational age in 33 healthy fetuses by using T2-weighted fetal MR imaging. We compared the accuracy of the prediction of gestational age by these cortical folding measures with the accuracy of prediction by brain volume measurement and by a previously reported semiquantitative visual scale of brain maturity. Regression models were constructed, and measurement biases and variances were determined via a cross-validation procedure. The cortical folding measures are accurate in the estimation and prediction of gestational age (mean of the absolute error, 0.43 ± 0.45 weeks) and perform better (P = .024) than brain volume (mean of the absolute error, 0.72 ± 0.61 weeks) or sonography measures (SDs approximately 1.5 weeks, as reported in the literature). Prediction accuracy is comparable with that of the semiquantitative visual assessment score (mean, 0.57 ± 0.41 weeks). Quantitative cortical folding measures such as global average curvedness can be an accurate and reliable estimator of gestational age and brain maturity for healthy fetuses in the third trimester and have the potential to be an indicator of brain-growth delays for at-risk fetuses and preterm neonates. © 2015 by American Journal of Neuroradiology.
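The regression-with-cross-validation procedure can be sketched minimally: fit gestational age against a single folding measure and report the leave-one-out mean absolute error. This is a simplified, single-predictor stand-in for the paper's models, with no real data:

```python
def fit_line(x, y):
    """Ordinary least squares for one predictor (e.g. a global curvedness
    measure) against gestational age in weeks."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def loo_mean_abs_error(x, y):
    """Leave-one-out cross-validation, mirroring the practice of
    reporting the mean absolute error of predicted gestational age."""
    errs = []
    for i in range(len(x)):
        xi = [a for j, a in enumerate(x) if j != i]
        yi = [b for j, b in enumerate(y) if j != i]
        s, c = fit_line(xi, yi)
        errs.append(abs(y[i] - (s * x[i] + c)))
    return sum(errs) / len(errs)
```

Holding out each fetus before predicting it is what separates the reported ~0.43-week error from an optimistic in-sample fit.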

  16. Predictive Modeling of Chemical Hazard by Integrating Numerical Descriptors of Chemical Structures and Short-term Toxicity Assay Data

    PubMed Central

    Rusyn, Ivan; Sedykh, Alexander; Guyton, Kathryn Z.; Tropsha, Alexander

    2012-01-01

    Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules; however, traditional QSAR models often have limited predictive power due to the lack of data and complexity of in vivo endpoints. Although it has been indeed difficult to obtain experimentally derived toxicity data on a large number of chemicals in the past, the results of quantitative in vitro screening of thousands of environmental chemicals in hundreds of experimental systems are now available and continue to accumulate. In addition, publicly accessible toxicogenomics data collected on hundreds of chemicals provide another dimension of molecular information that is potentially useful for predictive toxicity modeling. These new characteristics of molecular bioactivity arising from short-term biological assays, i.e., in vitro screening and/or in vivo toxicogenomics data can now be exploited in combination with chemical structural information to generate hybrid QSAR–like quantitative models to predict human toxicity and carcinogenicity. Using several case studies, we illustrate the benefits of a hybrid modeling approach, namely improvements in the accuracy of models, enhanced interpretation of the most predictive features, and expanded applicability domain for wider chemical space coverage. PMID:22387746

  17. Non-destructive and fast identification of cotton-polyester blend fabrics by the portable near-infrared spectrometer.

    PubMed

    Li, Wen-xia; Li, Feng; Zhao, Guo-liang; Tang, Shi-jun; Liu, Xiao-ying

    2014-12-01

    A series of 376 cotton-polyester (PET) blend fabrics were studied with a portable near-infrared (NIR) spectrometer. A NIR semi-quantitative-qualitative calibration model was established by the partial least squares (PLS) method combined with a qualitative identification coefficient. In this process, the PLS method in quantitative analysis was used as the correction method, and the qualitative identification coefficient was set according to the content of cotton and polyester in the blend fabrics. Cotton-polyester blend fabrics were identified qualitatively by the model and their relative contents were obtained quantitatively; the model can therefore be used for semi-quantitative identification analysis. In the course of establishing the model, the noise and baseline drift of the spectra were eliminated by a Savitzky-Golay (S-G) derivative. The influence of waveband selection and of different pre-processing methods on the qualitative calibration model was also studied. The major absorption bands of 100% cotton samples were in the 1400~1600 nm region, those of 100% polyester were around 1600~1800 nm, and the absorption intensity increased with increasing cotton or polyester content. Therefore, the major absorption region of cotton-polyester was selected as the base waveband, and the optimal waveband (1100~2500 nm) was found by expanding the waveband in two directions (the correlation coefficient was 0.6, and the number of wave points was 934). The validation samples were predicted by the calibration model; the results showed that the model evaluation parameters were optimal in the 1100~2500 nm region, with the combination of the S-G derivative, multiplicative scatter correction (MSC) and mean centering used as the pre-processing method. 
The RC (correlation coefficient of calibration) was 0.978, the RP (correlation coefficient of prediction) was 0.940, the SEC (standard error of calibration) was 1.264, the SEP (standard error of prediction) was 1.590, and the sample recognition accuracy was up to 93.4%. This showed that cotton-polyester blend fabrics can be predicted by the semi-quantitative-qualitative calibration model.
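The SEP reported above is conventionally the bias-corrected root mean square of the prediction residuals; definitions vary slightly between NIR texts, so treat this as one common form rather than necessarily the one used here:

```python
import math

def sep(y_true, y_pred):
    """Standard error of prediction as commonly reported for NIR
    calibrations: the bias-corrected standard deviation of the
    prediction residuals. A systematic offset inflates the bias term,
    not the SEP, so the two diagnose different problems."""
    n = len(y_true)
    bias = sum(p - t for t, p in zip(y_true, y_pred)) / n
    ss = sum((p - t - bias) ** 2 for t, p in zip(y_true, y_pred))
    return math.sqrt(ss / (n - 1))
```

For example, predictions that are all exactly 1% high give SEP = 0 with a bias of 1, whereas scattered errors of the same magnitude give a nonzero SEP.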

  18. On various metrics used for validation of predictive QSAR models with applications in virtual screening and focused library design.

    PubMed

    Roy, Kunal; Mitra, Indrani

    2011-07-01

    Quantitative structure-activity relationships (QSARs) have important applications in drug discovery research, environmental fate modeling, property prediction, etc. Validation has been recognized as a very important step in QSAR model development. Because one important objective of QSAR modeling is to predict the activity/property/toxicity of new chemicals falling within the domain of applicability of the developed models, and because QSARs are used for regulatory decisions, checking the reliability of the models and the confidence of their predictions is a very important aspect, which can be judged during the validation process. One prime application of a statistically significant QSAR model is virtual screening for molecules with improved potency based on the pharmacophoric features and the descriptors appearing in the QSAR model. Validated QSAR models may also be utilized for the design of focused libraries, which may subsequently be screened for the selection of hits. The present review focuses on various metrics used for validation of predictive QSAR models, together with an overview of the application of QSAR models in the fields of virtual screening and focused library design for diverse series of compounds, with citation of some recent examples.
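Among the external validation metrics such a review covers, the predictive squared correlation Q²_F1 is one of the simplest: it asks whether the model beats a constant prediction at the training-set mean. The sketch below shows that one form; related variants (e.g. Q²_F2, Q²_F3) differ in the denominator:

```python
def q2_f1(y_test, y_pred, y_train_mean):
    """External predictive squared correlation Q^2_F1: 1 minus the ratio
    of the prediction error sum of squares (PRESS) to the deviation of
    the test responses from the *training* mean. A model adds value on
    external data only if it beats the constant prediction y_train_mean,
    i.e. Q^2_F1 > 0."""
    press = sum((t - p) ** 2 for t, p in zip(y_test, y_pred))
    tss = sum((t - y_train_mean) ** 2 for t in y_test)
    return 1.0 - press / tss
```

Using the training mean rather than the test mean in the denominator is deliberate: it keeps the benchmark fixed at what was known when the model was built.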

  19. Exploring the Sequence-based Prediction of Folding Initiation Sites in Proteins.

    PubMed

    Raimondi, Daniele; Orlando, Gabriele; Pancsa, Rita; Khan, Taushif; Vranken, Wim F

    2017-08-18

    Protein folding is a complex process that can lead to disease when it fails. Especially poorly understood are the very early stages of protein folding, which are likely defined by intrinsic local interactions between amino acids close to each other in the protein sequence. We here present EFoldMine, a method that predicts, from the primary amino acid sequence of a protein, which amino acids are likely involved in early folding events. The method is trained on early folding data from hydrogen-deuterium exchange (HDX) NMR pulsed labelling experiments, and uses backbone and sidechain dynamics as well as secondary structure propensities as features. The EFoldMine predictions give insights into the folding process, as illustrated by a qualitative comparison with independent experimental observations. Furthermore, on a quantitative proteome scale, the predicted early folding residues tend to become the residues that interact the most in the folded structure, and they are often residues that display evolutionary covariation. The connection of the EFoldMine predictions with both folding pathway data and the folded protein structure suggests that the initial statistical behavior of the protein chain with respect to local structure formation has a lasting effect on its subsequent states.

  20. Investigating the Validity of Two Widely Used Quantitative Text Tools

    ERIC Educational Resources Information Center

    Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne

    2018-01-01

    In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…

  1. Dual-Enrollment High-School Graduates' College-Enrollment Considerations

    ERIC Educational Resources Information Center

    Damrow, Roberta J.

    2017-01-01

    This quantitative study examined college enrollment considerations of dual-enrollment students enrolling at one Wisconsin credit-granting technical college. A combined college-choice theoretical framework guided this quantitative study that addressed two research questions: To what extent, if any, did the number of dual credits predict likelihood…

  2. Why physics needs mathematics

    NASA Astrophysics Data System (ADS)

    Rohrlich, Fritz

    2011-12-01

    The classical and quantum mechanical sciences are in essential need of mathematics. Only thus can the laws of nature be formulated quantitatively, permitting quantitative predictions. Mathematics also facilitates extrapolations. But the classical and quantum sciences differ in essential ways: they follow different laws of logic, Aristotelian and non-Aristotelian logic, respectively. These are explicated.

  3. ESTIMATION OF MICROBIAL REDUCTIVE TRANSFORMATION RATES FOR CHLORINATED BENZENES AND PHENOLS USING A QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIP APPROACH

    EPA Science Inventory

    A set of literature data was used to derive several quantitative structure-activity relationships (QSARs) to predict the rate constants for the microbial reductive dehalogenation of chlorinated aromatics. Dechlorination rate constants for 25 chloroaromatics were corrected for th...

  4. Multivariate Radiological-Based Models for the Prediction of Future Knee Pain: Data from the OAI

    PubMed Central

    Galván-Tejada, Jorge I.; Celaya-Padilla, José M.; Treviño, Victor; Tamez-Peña, José G.

    2015-01-01

    In this work, the potential of X-ray based multivariate prognostic models to predict the onset of chronic knee pain is presented. Using quantitative X-ray image assessments of joint space width (JSW) and paired semiquantitative central X-ray scores from the Osteoarthritis Initiative (OAI), a case-control study is presented. The pain assessments of the right knee at the baseline and 60-month visits were used to screen for case/control subjects. Scores were analyzed at the time of pain incidence (T-0), the year prior to incidence (T-1), and two years before pain incidence (T-2). Multivariate models were created by a cross-validated elastic-net regularized generalized linear model feature selection tool. Univariate differences between cases and controls were reported by AUC, C-statistics, and odds ratios. Univariate analysis indicated that medial osteophytes were significantly more prevalent in cases than controls: C-stat 0.62, 0.62, and 0.61 at T-0, T-1, and T-2, respectively. The multivariate JSW models significantly predicted pain: AUC = 0.695, 0.623, and 0.620 at T-0, T-1, and T-2, respectively. Semiquantitative multivariate models predicted pain with C-stat = 0.671, 0.648, and 0.645 at T-0, T-1, and T-2, respectively. Multivariate models derived from plain X-ray radiography assessments may be used to predict subjects that are at risk of developing knee pain. PMID:26504490
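The univariate odds ratio used in such case-control comparisons comes from a 2x2 exposure table; a minimal sketch with invented counts:

```python
def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    """Odds ratio from a 2x2 table: the odds of exposure (e.g. a medial
    osteophyte at baseline) among knees that went on to become painful,
    divided by the odds among control knees. OR > 1 marks a candidate
    risk factor."""
    return (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)

# Invented counts: 30 of 50 case knees vs. 25 of 60 control knees exposed.
or_osteophyte = odds_ratio(30, 20, 25, 35)
```

The C-statistic reported alongside is the same quantity as the ROC AUC, so the two summaries answer complementary questions: association strength versus discrimination.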

  5. A predictive framework and review of the ecological impacts of exotic plant invasions on reptiles and amphibians.

    PubMed

    Martin, Leigh J; Murray, Brad R

    2011-05-01

    The invasive spread of exotic plants in native vegetation can pose serious threats to native faunal assemblages. This is of particular concern for reptiles and amphibians because they form a significant component of the world's vertebrate fauna, play a pivotal role in ecosystem functioning and are often neglected in biodiversity research. A framework to predict how exotic plant invasion will affect reptile and amphibian assemblages is imperative for conservation, management and the identification of research priorities. Here, we present a new predictive framework that integrates three mechanistic models. These models are based on exotic plant invasion altering: (1) habitat structure; (2) herbivory and predator-prey interactions; (3) the reproductive success of reptile and amphibian species and assemblages. We present a series of testable predictions from these models that arise from the interplay over time among three exotic plant traits (growth form, area of coverage, taxonomic distinctiveness) and six traits of reptiles and amphibians (body size, lifespan, home range size, habitat specialisation, diet, reproductive strategy). A literature review provided robust empirical evidence of exotic plant impacts on reptiles and amphibians from each of the three model mechanisms. Evidence relating to the role of body size and diet was less clear-cut, indicating the need for further research. The literature provided limited empirical support for many of the other model predictions. This was not, however, because findings contradicted our model predictions but because research in this area is sparse. In particular, the small number of studies specifically examining the effects of exotic plants on amphibians highlights the pressing need for quantitative research in this area. There is enormous scope for detailed empirical investigation of interactions between exotic plants and reptile and amphibian species and assemblages. 
The framework presented here and further testing of predictions will provide a basis for informing and prioritising environmental management and exotic plant control efforts. © 2010 The Authors. Biological Reviews © 2010 Cambridge Philosophical Society.

  6. Analysis of genetic effects of nuclear-cytoplasmic interaction on quantitative traits: genetic model for diploid plants.

    PubMed

    Han, Lide; Yang, Jian; Zhu, Jun

    2007-06-01

    A genetic model was proposed for simultaneously analyzing genetic effects of nuclear, cytoplasm, and nuclear-cytoplasmic interaction (NCI) as well as their genotype by environment (GE) interaction for quantitative traits of diploid plants. In the model, the NCI effects were further partitioned into additive and dominance nuclear-cytoplasmic interaction components. Mixed linear model approaches were used for statistical analysis. On the basis of diallel cross designs, Monte Carlo simulations showed that the genetic model was robust for estimating variance components under several situations without specific effects. Random genetic effects were predicted by an adjusted unbiased prediction (AUP) method. Data on four quantitative traits (boll number, lint percentage, fiber length, and micronaire) in Upland cotton (Gossypium hirsutum L.) were analyzed as a worked example to show the effectiveness of the model.

  7. Comparison of intradermal dilutional testing, skin prick testing, and modified quantitative testing for common allergens.

    PubMed

    Peltier, Jacques; Ryan, Matthew W

    2007-08-01

    To compare and correlate wheal size using the Multi-Test II applicator with the endpoint obtained by intradermal dilutional testing (IDT) for 5 common allergens. To examine the safety of modified quantitative testing (MQT) for determining immunotherapy starting doses. Prospective comparative clinical study. A total of 134 subjects were simultaneously skin tested for immediate hypersensitivity using the Multi-Test II device and IDT. There was a 77% concordance between results from IDT and results from MQT. When there was a difference, MQT predicted a safer endpoint for starting immunotherapy in all but 2 cases. Wheal size by SPT is predictive of endpoint by IDT. MQT is nearly as effective as formal IDT in determining endpoint. Modified quantitative testing appears to be a safe alternative to IDT for determining starting doses for immunotherapy.

  8. Temporal maps and informativeness in associative learning.

    PubMed

    Balsam, Peter D; Gallistel, C Randy

    2009-02-01

    Neurobiological research on learning assumes that temporal contiguity is essential for association formation, but what constitutes temporal contiguity has never been specified. We review evidence that learning depends, instead, on learning a temporal map. Temporal relations between events are encoded even from single experiences. The speed with which an anticipatory response emerges is proportional to the informativeness of the encoded relation between a predictive stimulus or event and the event it predicts. This principle yields a quantitative account of the heretofore undefined, but theoretically crucial, concept of temporal pairing, an account in quantitative accord with surprising experimental findings. The same principle explains the basic results in the cue competition literature, which motivated the Rescorla-Wagner model and most other contemporary models of associative learning. The essential feature of a memory mechanism in this account is its ability to encode quantitative information.
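The informativeness principle summarized above can be made concrete: Balsam and Gallistel quantify informativeness as the ratio of the expected interval between reinforcements in the background (C) to the CS-US interval (T), with anticipatory responding emerging faster as C/T rises. A minimal sketch, where the interval values are illustrative assumptions rather than data from the paper:

```python
# Sketch of the informativeness ratio C/T from Balsam & Gallistel (2009).
# C = average background interval between reinforcements (USs);
# T = CS-US interval. Higher C/T -> faster emergence of anticipation.

def informativeness(us_us_interval, cs_us_interval):
    """Ratio of background US-US interval (C) to CS-US interval (T)."""
    return us_us_interval / cs_us_interval

# Two hypothetical protocols sharing the same CS-US interval: the sparser
# the background USs, the more informative the CS about the upcoming US.
sparse = informativeness(us_us_interval=480.0, cs_us_interval=8.0)  # C/T = 60
dense = informativeness(us_us_interval=48.0, cs_us_interval=8.0)    # C/T = 6

assert sparse > dense
print(sparse, dense)  # 60.0 6.0
```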

  9. Temporal maps and informativeness in associative learning

    PubMed Central

    Balsam, Peter D; Gallistel, C. Randy

    2009-01-01

    Neurobiological research on learning assumes that temporal contiguity is essential for association formation, but what constitutes temporal contiguity has never been specified. We review evidence that learning depends, instead, on learning a temporal map. Temporal relations between events are encoded even from single experiences. The speed with which an anticipatory response emerges is proportional to the informativeness of the encoded relation between a predictive stimulus or event and the event it predicts. This principle yields a quantitative account of the heretofore undefined, but theoretically crucial, concept of temporal pairing, an account in quantitative accord with surprising experimental findings. The same principle explains the basic results in the cue competition literature, which motivated the Rescorla–Wagner model and most other contemporary models of associative learning. The essential feature of a memory mechanism in this account is its ability to encode quantitative information. PMID:19136158

  10. The use of weather data to predict non-recurring traffic congestion

    DOT National Transportation Integrated Search

    2006-08-01

    This project will demonstrate the quantitative relationship between weather patterns and surface traffic conditions. The aviation and maritime industries use weather measurements and predictions as a normal part of operations, and this can be extende...

  11. The second Sandia Fracture Challenge. Predictions of ductile failure under quasi-static and moderate-rate dynamic loading

    DOE PAGES

    Boyce, B. L.; Kramer, S. L. B.; Bosiljevac, T. R.; ...

    2016-03-14

Ductile failure of structural metals is relevant to a wide range of engineering scenarios. Computational methods are employed to anticipate the critical conditions of failure, yet they sometimes provide inaccurate and misleading predictions. Challenge scenarios, such as the one presented in the current work, provide an opportunity to assess the blind, quantitative predictive ability of simulation methods against a previously unseen failure problem. Instead of evaluating the predictions of a single simulation approach, the Sandia Fracture Challenge relied on numerous volunteer teams with expertise in computational mechanics to apply a broad range of computational methods, numerical algorithms, and constitutive models to the challenge. This exercise is intended to evaluate the state of health of technologies available for failure prediction. In the first Sandia Fracture Challenge, a wide range of issues were raised in ductile failure modeling, including a lack of consistency in failure models, the importance of shear calibration data, and difficulties in quantifying the uncertainty of prediction [see Boyce et al. (Int J Fract 186:5–68, 2014) for details of these observations]. This second Sandia Fracture Challenge investigated the ductile rupture of a Ti–6Al–4V sheet under both quasi-static and modest-rate dynamic loading (failure in ~0.1 s). Like the previous challenge, the sheet had an unusual arrangement of notches and holes that added geometric complexity and fostered a competition between tensile- and shear-dominated failure modes. The teams were asked to predict the fracture path and quantitative far-field failure metrics such as the peak force and displacement to cause crack initiation. Fourteen teams contributed blind predictions, and the experimental outcomes were quantified in three independent test labs. This second challenge also revealed shortcomings, such as inconsistency in the application of appropriate boundary conditions, the need for a thermomechanical treatment of the heat generation in the dynamic loading condition, and further difficulties in model calibration based on limited real-world engineering data. As with the prior challenge, this work not only documents the state of the art in computational failure prediction of ductile tearing scenarios but also provides a detailed dataset for non-blind assessment of alternative methods.

  12. The second Sandia Fracture Challenge. Predictions of ductile failure under quasi-static and moderate-rate dynamic loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyce, B. L.; Kramer, S. L. B.; Bosiljevac, T. R.

Ductile failure of structural metals is relevant to a wide range of engineering scenarios. Computational methods are employed to anticipate the critical conditions of failure, yet they sometimes provide inaccurate and misleading predictions. Challenge scenarios, such as the one presented in the current work, provide an opportunity to assess the blind, quantitative predictive ability of simulation methods against a previously unseen failure problem. Instead of evaluating the predictions of a single simulation approach, the Sandia Fracture Challenge relied on numerous volunteer teams with expertise in computational mechanics to apply a broad range of computational methods, numerical algorithms, and constitutive models to the challenge. This exercise is intended to evaluate the state of health of technologies available for failure prediction. In the first Sandia Fracture Challenge, a wide range of issues were raised in ductile failure modeling, including a lack of consistency in failure models, the importance of shear calibration data, and difficulties in quantifying the uncertainty of prediction [see Boyce et al. (Int J Fract 186:5–68, 2014) for details of these observations]. This second Sandia Fracture Challenge investigated the ductile rupture of a Ti–6Al–4V sheet under both quasi-static and modest-rate dynamic loading (failure in ~0.1 s). Like the previous challenge, the sheet had an unusual arrangement of notches and holes that added geometric complexity and fostered a competition between tensile- and shear-dominated failure modes. The teams were asked to predict the fracture path and quantitative far-field failure metrics such as the peak force and displacement to cause crack initiation. Fourteen teams contributed blind predictions, and the experimental outcomes were quantified in three independent test labs. This second challenge also revealed shortcomings, such as inconsistency in the application of appropriate boundary conditions, the need for a thermomechanical treatment of the heat generation in the dynamic loading condition, and further difficulties in model calibration based on limited real-world engineering data. As with the prior challenge, this work not only documents the state of the art in computational failure prediction of ductile tearing scenarios but also provides a detailed dataset for non-blind assessment of alternative methods.

  13. Uniting Cheminformatics and Chemical Theory To Predict the Intrinsic Aqueous Solubility of Crystalline Druglike Molecules

    PubMed Central

    2014-01-01

    We present four models of solution free-energy prediction for druglike molecules utilizing cheminformatics descriptors and theoretically calculated thermodynamic values. We make predictions of solution free energy using physics-based theory alone and using machine learning/quantitative structure–property relationship (QSPR) models. We also develop machine learning models where the theoretical energies and cheminformatics descriptors are used as combined input. These models are used to predict solvation free energy. While direct theoretical calculation does not give accurate results in this approach, machine learning is able to give predictions with a root mean squared error (RMSE) of ∼1.1 log S units in a 10-fold cross-validation for our Drug-Like-Solubility-100 (DLS-100) dataset of 100 druglike molecules. We find that a model built using energy terms from our theoretical methodology as descriptors is marginally less predictive than one built on Chemistry Development Kit (CDK) descriptors. Combining both sets of descriptors allows a further but very modest improvement in the predictions. However, in some cases, this is a statistically significant enhancement. These results suggest that there is little complementarity between the chemical information provided by these two sets of descriptors, despite their different sources and methods of calculation. Our machine learning models are also able to predict the well-known Solubility Challenge dataset with an RMSE value of 0.9–1.0 log S units. PMID:24564264
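The 10-fold cross-validated RMSE reported above is a standard evaluation protocol that can be shown in outline. The sketch below uses synthetic log S values and a trivial mean-value baseline purely to illustrate the protocol; the data and "model" are placeholders, not the paper's QSPR/CDK pipeline:

```python
import math
import random

def kfold_rmse(values, k=10, seed=0):
    """RMSE of a mean-value baseline under k-fold cross-validation."""
    idx = list(range(len(values)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]  # k disjoint held-out folds
    sq_errors = []
    for fold in folds:
        held_out = set(fold)
        train = [values[i] for i in idx if i not in held_out]
        mean = sum(train) / len(train)  # the "model": predict the training mean
        sq_errors += [(values[i] - mean) ** 2 for i in fold]  # held-out errors
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# 100 synthetic log S values standing in for the DLS-100 measurements.
rng = random.Random(42)
log_s = [rng.gauss(-3.0, 1.5) for _ in range(100)]
print(round(kfold_rmse(log_s), 2))
```

In the paper's setting, the mean-value baseline would be replaced by the machine learning model refit on each set of nine training folds.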

  14. Validation Process for LEWICE Coupled by Use of a Navier-Stokes Solver

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    2016-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth for many meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. An extensive quantitative comparison of the results against the database of ice shapes generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will show the differences in ice shape between LEWICE 3.5 and experimental data. In addition, comparisons will be made between the lift and drag calculated on the ice shapes from experiment and those produced by LEWICE. This report will also provide a description of both programs. Quantitative geometric comparisons are shown for horn height, horn angle, icing limit, area and leading edge thickness. Quantitative comparisons of calculated lift and drag will also be shown. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  15. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

    Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), including heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making approaches; however, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different LHZ methods and to compare the results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches likewise play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful for detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advances in geospatial technologies have opened the door to detailed and accurate assessment of landslide hazards.

  16. Global Scale Periodic Responses in Saturn’s Magnetosphere

    NASA Astrophysics Data System (ADS)

    Jia, Xianzhe; Kivelson, Margaret G.

    2017-10-01

    Despite having an axisymmetric internal magnetic field, Saturn’s magnetosphere exhibits periodic modulations in a variety of properties at periods close to the planetary rotation period. While the source of the periodicity remains unidentified, it is evident from Cassini observations that much of Saturn’s magnetospheric structure and dynamics is dominated by global-scale responses to the driving source of the periodicity. We have developed a global MHD model in which a rotating field-aligned current system is introduced by imposing vortical flows in the high-latitude ionosphere in order to simulate the magnetospheric periodicities. The model has been utilized to quantitatively characterize various periodic responses in the magnetosphere, such as the displacement of the magnetopause and bow shock and flapping of the tail plasma sheet, all of which show quantitative agreement with Cassini observations. One of our model predictions is periodic release of plasmoids in the tail that occurs preferentially in the midnight-to-dawn local time sector during each rotation cycle. Here we present detailed analysis of the periodic responses seen in our simulations focusing on the properties of plasmoids predicted by the model, including their spatial distribution, occurrence frequency, and mass loss rate. We will compare these modeled parameters with published Cassini observations, and discuss their implications for interpreting in-situ measurements.

  17. Quantitative assessment of copper proteinates used as animal feed additives using ATR-FTIR spectroscopy and powder X-ray diffraction (PXRD) analysis.

    PubMed

    Cantwell, Caoimhe A; Byrne, Laurann A; Connolly, Cathal D; Hynes, Michael J; McArdle, Patrick; Murphy, Richard A

    2017-08-01

    The aim of the present work was to establish a reliable analytical method to determine the degree of complexation in commercial metal proteinates used as feed additives in the solid state. Two complementary techniques were developed. Firstly, a quantitative attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopic method investigated modifications in vibrational absorption bands of the ligand on complex formation. Secondly, a powder X-ray diffraction (PXRD) method to quantify the amount of crystalline material in the proteinate product was developed. These methods were developed in tandem and cross-validated with each other. Multivariate analysis (MVA) was used to develop validated calibration and prediction models. The FTIR and PXRD calibrations showed excellent linearity (R² > 0.99). The diagnostic model parameters showed that the FTIR and PXRD methods were robust, with a root mean square error of calibration (RMSEC) ≤ 3.39% and a root mean square error of prediction (RMSEP) ≤ 7.17%, respectively. Comparative statistics show excellent agreement between the MVA packages assessed and between the FTIR and PXRD methods. The methods can be used to determine the degree of complexation in complexes of both protein hydrolysates and pure amino acids.
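The RMSEC/RMSEP diagnostics quoted above are simply the root mean square error evaluated on the calibration set versus an independent prediction set. A minimal sketch, with an ordinary least-squares line standing in for the multivariate model and made-up response data (not values from the study):

```python
import math

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def rmse(x, y, slope, intercept):
    """Root mean square error of the fitted line on (x, y)."""
    return math.sqrt(sum((b - (slope * a + intercept)) ** 2
                         for a, b in zip(x, y)) / len(x))

# Hypothetical calibration samples: band intensity vs. % crystallinity.
cal_x = [0.10, 0.25, 0.40, 0.55, 0.70, 0.85]
cal_y = [9.8, 24.6, 41.0, 54.1, 70.9, 84.7]
slope, intercept = fit_line(cal_x, cal_y)

rmsec = rmse(cal_x, cal_y, slope, intercept)    # error on the calibration set
pred_x, pred_y = [0.30, 0.60], [30.5, 61.2]     # independent prediction set
rmsep = rmse(pred_x, pred_y, slope, intercept)
print(round(rmsec, 2), round(rmsep, 2))
```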

  18. Using Data Independent Acquisition (DIA) to Model High-responding Peptides for Targeted Proteomics Experiments*

    PubMed Central

    Searle, Brian C.; Egertson, Jarrett D.; Bollinger, James G.; Stergachis, Andrew B.; MacCoss, Michael J.

    2015-01-01

    Targeted mass spectrometry is an essential tool for detecting quantitative changes in low abundant proteins throughout the proteome. Although selected reaction monitoring (SRM) is the preferred method for quantifying peptides in complex samples, the process of designing SRM assays is laborious. Peptides have widely varying signal responses dictated by sequence-specific physicochemical properties; one major challenge is in selecting representative peptides to target as a proxy for protein abundance. Here we present PREGO, a software tool that predicts high-responding peptides for SRM experiments. PREGO predicts peptide responses with an artificial neural network trained using 11 minimally redundant, maximally relevant properties. Crucial to its success, PREGO is trained using fragment ion intensities of equimolar synthetic peptides extracted from data independent acquisition experiments. Relative peptide responses from data independent acquisition experiments are a suitable substitute for SRM responses: both methods use similar instrumentation and make quantitative measurements from integrated fragment ion chromatograms. Using an SRM experiment containing 12,973 peptides from 724 synthetic proteins, PREGO exhibits a 40–85% improvement over previously published approaches at selecting high-responding peptides. These results also represent a dramatic improvement over the rules-based peptide selection approaches commonly used in the literature. PMID:26100116
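The selection step the abstract describes, ranking each protein's peptides by predicted signal response and targeting the top few as a proxy for protein abundance, can be sketched independently of the neural network itself. The peptide sequences and scores below are invented placeholders, not PREGO outputs:

```python
def top_responders(predicted, n=2):
    """Pick the n peptides with the highest predicted response per protein."""
    chosen = {}
    for protein, peptides in predicted.items():
        ranked = sorted(peptides.items(), key=lambda kv: kv[1], reverse=True)
        chosen[protein] = [pep for pep, _ in ranked[:n]]
    return chosen

# Hypothetical predicted responses (arbitrary units) for two proteins.
predicted = {
    "P1": {"ELVISK": 0.91, "LIVESK": 0.40, "AGAINR": 0.73},
    "P2": {"PEPTIDEK": 0.55, "TARGETR": 0.88},
}
print(top_responders(predicted))
# {'P1': ['ELVISK', 'AGAINR'], 'P2': ['TARGETR', 'PEPTIDEK']}
```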

  19. Arms race between selfishness and policing: two-trait quantitative genetic model for caste fate conflict in eusocial Hymenoptera.

    PubMed

    Dobata, Shigeto

    2012-12-01

    Policing against selfishness is now regarded as the main force maintaining cooperation, by reducing costly conflict in complex social systems. Although policing has been studied extensively in social insect colonies, its coevolution against selfishness has not been fully captured by previous theories. In this study, I developed a two-trait quantitative genetic model of the conflict between selfish immature females (usually larvae) and policing workers in eusocial Hymenoptera over the immatures' propensity to develop into new queens. This model allows for the analysis of coevolution between genomes expressed in immatures and workers that collectively determine the immatures' queen caste fate. The main prediction of the model is that a higher level of polyandry leads to a smaller fraction of queens produced among new females through caste fate policing. The other main prediction of the present model is that, as a result of an arms race, caste fate policing by workers coevolves with exaggerated selfishness of the immatures, which achieve their maximum potential to develop into queens. Moreover, the model can incorporate genetic correlation between traits, which has been largely unexplored in social evolution theory. This study highlights the importance of understanding social traits as influenced by the coevolution of conflicting genomes.

  20. Nonlinear ultrasonics for material state awareness

    NASA Astrophysics Data System (ADS)

    Jacobs, L. J.

    2014-02-01

    Predictive health monitoring of structural components will require the development of advanced sensing techniques capable of providing quantitative information on the damage state of structural materials. By focusing on nonlinear acoustic techniques, it is possible to measure absolute, strength-based material parameters that can then be coupled with uncertainty models to enable accurate and quantitative life prediction. Starting at the material level, this review will present current research that involves a combination of sensing techniques and physics-based models to characterize damage in metallic materials. In metals, these nonlinear ultrasonic measurements can sense material state before the formation of micro- and macro-cracks. Typically, cracks of a measurable size appear quite late in a component's total life, while the material's integrity in terms of toughness and strength gradually decreases due to microplasticity (dislocations) and the associated change in the material's microstructure. This review focuses on second harmonic generation techniques. Since these nonlinear acoustic techniques are acoustic wave based, component interrogation can be performed with bulk, surface and guided waves using the same underlying material physics; these nonlinear ultrasonic techniques provide results which are independent of the wave type used. Recent physics-based models consider the evolution of damage due to dislocations, slip bands, interstitials, and precipitates in the lattice structure, which can lead to localized damage.
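Second harmonic generation measurements of the kind reviewed above commonly reduce to a single relation: the acoustic nonlinearity parameter is estimated as β = 8·A₂ / (A₁²·k²·x), where A₁ and A₂ are the fundamental and second-harmonic displacement amplitudes, k the fundamental wavenumber, and x the propagation distance. A sketch with illustrative numbers; the amplitudes and material values are assumptions, not measured data:

```python
import math

def beta(a1, a2, freq_hz, velocity, distance):
    """Acoustic nonlinearity parameter: beta = 8*A2 / (A1^2 * k^2 * x)."""
    k = 2 * math.pi * freq_hz / velocity  # fundamental wavenumber (rad/m)
    return 8.0 * a2 / (a1 ** 2 * k ** 2 * distance)

# Hypothetical longitudinal-wave measurement in an aluminum-like material.
b = beta(
    a1=1.0e-9,        # fundamental displacement amplitude (m)
    a2=2.0e-13,       # second-harmonic displacement amplitude (m)
    freq_hz=5.0e6,    # fundamental frequency (Hz)
    velocity=6300.0,  # longitudinal wave speed (m/s)
    distance=0.02,    # propagation distance (m)
)
print(round(b, 2))
```

Tracking β over a component's life, rather than waiting for a detectable crack, is what makes the approach sensitive to early microstructural damage.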
