Sample records for quantitative computer model

  1. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2017-01-01

Purpose: The purpose of this paper is to develop a quantitative model of students' problem-solving performance in a computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of students' problem-solving performance that predicts whether students can…
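The abstract identifies regularized logistic regression as the modeling technique. A minimal sketch of the idea, using made-up features (attempt count and hints used) and synthetic labels rather than the study's actual data:

```python
import math

def train_logreg(X, y, lam=0.1, lr=0.1, epochs=5000):
    """L2-regularized logistic regression via batch gradient descent
    (bias term left unpenalized)."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [lam * wj for wj in w], 0.0   # start from the penalty gradient
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))     # predicted success probability
            for j in range(d):
                gw[j] += (p - yi) * xi[j] / n
            gb += (p - yi) / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict_proba(w, b, x):
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features [attempts, hints_used]; label 1 = problem solved
X = [[1, 0], [2, 0], [1, 1], [5, 4], [6, 5], [4, 3]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logreg(X, y)
print(round(predict_proba(w, b, [1, 0]), 2))
```

The L2 penalty (`lam`) keeps the weights from diverging on this tiny separable dataset, which is the point of the "regularized" variant.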

  2. Computer system for definition of the quantitative geometry of musculature from CT images.

    PubMed

    Daniel, Matej; Iglic, Ales; Kralj-Iglic, Veronika; Konvicková, Svatava

    2005-02-01

A computer system for quantitative determination of musculoskeletal geometry from computed tomography (CT) images has been developed. The system processes a series of CT images to obtain a three-dimensional (3D) model of bony structures in which the effective muscle fibres can be interactively defined. The presented computer system has a flexible modular structure and is also suitable for educational purposes.

  3. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
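The Monte Carlo simulation described above can be sketched with a toy model. The driver weights and score distributions below are illustrative placeholders, not the ACSI's actual structure:

```python
import random, statistics

random.seed(42)

def simulate_index(n=10000):
    """Monte Carlo baseline for a satisfaction index: sample assumed driver
    distributions, combine with illustrative weights, clamp to 0-100."""
    scores = []
    for _ in range(n):
        expectations = random.gauss(80, 5)   # assumed driver distributions
        quality = random.gauss(78, 6)
        value = random.gauss(75, 7)
        idx = 0.3 * expectations + 0.4 * quality + 0.3 * value
        scores.append(min(100.0, max(0.0, idx)))
    return scores

scores = simulate_index()
baseline = statistics.mean(scores)
q = statistics.quantiles(scores, n=20)       # 5th..95th percentile cut points
p5, p95 = q[0], q[-1]
print(f"baseline {baseline:.1f}, 90% interval [{p5:.1f}, {p95:.1f}]")
```

Sensitivity analysis then amounts to re-running the simulation while perturbing one driver distribution at a time and watching the baseline move.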

  4. Quantitative Predictive Models for Systemic Toxicity (SOT)

    EPA Science Inventory

    Models to identify systemic and specific target organ toxicity were developed to help transition the field of toxicology towards computational models. By leveraging multiple data sources to incorporate read-across and machine learning approaches, a quantitative model of systemic ...

  5. EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.

    ERIC Educational Resources Information Center

    Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith

    2002-01-01

Introduces the EnviroLand computer program, which features lab simulations of theoretical calculations for quantitative analysis, environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)

  6. A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon

    NASA Technical Reports Server (NTRS)

    Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.

    2017-01-01

The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurement and computational modeling neighborhoods of the Earth-Moon community into ever closer proximity. We are now, however, in a position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.

  7. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.
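The model components named in this abstract (an analytical size function for the primary tumour and a rate function for intravasation events) can be sketched as follows. The Gompertz growth law and all parameter values are illustrative assumptions, not the paper's fitted values:

```python
import math, random

random.seed(1)

def primary_size(t, n0=1e6, n_max=5.9e9, a=0.00286):
    """Gompertz growth of the primary tumour (cells); parameters illustrative."""
    return n_max * math.exp(math.log(n0 / n_max) * math.exp(-a * t))

def count_intravasations(t_end=1000.0, dt=1.0, alpha=1e-9):
    """Count time steps (days) with at least one intravasation event,
    with event rate proportional to tumour size."""
    events, t = 0, 0.0
    while t < t_end:
        rate = alpha * primary_size(t)                    # events per day
        if random.random() < 1.0 - math.exp(-rate * dt):  # P(>=1 event in dt)
            events += 1
        t += dt
    return events

n_events = count_intravasations()
print("steps with at least one intravasation event:", n_events)
```

In a full discrete-event version, each event would spawn a metastasis with its own growth clock; here the point is only the size-dependent event rate.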

  8. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches, discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling, are discussed.

  9. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    2008-04-01

In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches, discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling, are discussed.

  10. The Impact of Three-Dimensional Computational Modeling on Student Understanding of Astronomical Concepts: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas

    2004-01-01

    The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…

  11. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The simulation results obtained are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  12. Impact of implementation choices on quantitative predictions of cell-based computational models

    NASA Astrophysics Data System (ADS)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
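The abstract's central point, that explicit time-stepping choices change quantitative predictions, can be illustrated with a much simpler system than a vertex model: a single edge relaxing toward its rest length under forward Euler, compared against the analytic solution. All values here are illustrative:

```python
import math

def relax(dt, k=5.0, x0=2.0, rest=1.0, t_end=1.0):
    """Forward-Euler relaxation of one edge toward its rest length:
    dx/dt = -k * (x - rest)."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x += dt * (-k * (x - rest))
        t += dt
    return x

# Analytic solution: x(t) = rest + (x0 - rest) * exp(-k * t)
exact = 1.0 + 1.0 * math.exp(-5.0)
for dt in (0.2, 0.01):
    print(f"dt={dt}: error={abs(relax(dt) - exact):.2e}")
```

The large step lands visibly away from the true trajectory; the same mechanism, amplified by cell-rearrangement thresholds, is what makes vertex-model predictions sensitive to implementation parameters.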

  13. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems, all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve this, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models.
The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing aspects of these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html

  14. Modeling with Young Students--Quantitative and Qualitative.

    ERIC Educational Resources Information Center

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  15. Vehicle - Bridge interaction, comparison of two computing models

    NASA Astrophysics Data System (ADS)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A multi-body plane computing model of the vehicle is adopted. The bridge computing models are created in two variants: one represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the second as a lumped-mass model with one degree of freedom. The mid-span dynamic deflections of the bridge are calculated for both computing models, and the results are mutually compared and quantitatively evaluated.

  16. A Quantitative Study of the Relationship between Leadership Practice and Strategic Intentions to Use Cloud Computing

    ERIC Educational Resources Information Center

    Castillo, Alan F.

    2014-01-01

    The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…

  17. A computational model of selection by consequences.

    PubMed

    McDowell, J J

    2004-05-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied over wide ranges in these experiments, and many of the qualitative features of the model also were varied. The digital organism consistently showed a hyperbolic relation between response and reinforcement rates, and this hyperbolic description of the data was consistently better than the description provided by other, similar, function forms. In addition, the parameters of the hyperbola varied systematically with the quantitative, and some of the qualitative, properties of the model in ways that were consistent with findings from biological organisms. These results suggest that the material events responsible for an organism's responding on RI schedules are computationally equivalent to Darwinian selection by consequences. They also suggest that the computational model developed here is worth pursuing further as a possible dynamic account of behavior.
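The hyperbolic relation reported here is Herrnstein's equation, R = k·r/(r + r_e), relating response rate R to reinforcement rate r. A small sketch that generates noise-free synthetic rates from known parameters and recovers them by a least-squares grid search (an illustration, not the paper's fitting procedure):

```python
def hyperbola(r, k, re):
    """Herrnstein's hyperbola: response rate as a function of reinforcement rate r."""
    return k * r / (r + re)

# Noise-free synthetic "observations" from known parameters k=100, re=30
rates = [5, 10, 20, 40, 80, 160]
obs = [hyperbola(r, 100.0, 30.0) for r in rates]

# Recover the parameters by least-squares grid search
best = None
for k in range(50, 151, 5):
    for re in range(5, 101, 5):
        sse = sum((hyperbola(r, k, re) - o) ** 2 for r, o in zip(rates, obs))
        if best is None or sse < best[0]:
            best = (sse, k, re)
print("recovered k, re:", best[1], best[2])  # recovers 100 and 30 exactly
```

With real (noisy) response data, the fitted k and r_e would then be examined for systematic variation with model properties, as the abstract describes.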

  18. Computational Toxicology (S)

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. Th...

  19. Computational modeling approaches to quantitative structure-binding kinetics relationships in drug discovery.

    PubMed

    De Benedetti, Pier G; Fanelli, Francesca

    2018-03-21

    Simple comparative correlation analyses and quantitative structure-kinetics relationship (QSKR) models highlight the interplay of kinetic rates and binding affinity as an essential feature in drug design and discovery. The choice of the molecular series, and their structural variations, used in QSKR modeling is fundamental to understanding the mechanistic implications of ligand and/or drug-target binding and/or unbinding processes. Here, we discuss the implications of linear correlations between kinetic rates and binding affinity constants and the relevance of the computational approaches to QSKR modeling. Copyright © 2018 Elsevier Ltd. All rights reserved.
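The interplay of kinetic rates and binding affinity that this abstract discusses reduces to two standard relations: K_d = k_off/k_on and ΔG = RT·ln(K_d). A small sketch with illustrative rate constants (not values from the paper):

```python
import math

def dissociation_constant(k_on, k_off):
    """K_d (molar) from association (M^-1 s^-1) and dissociation (s^-1) rates."""
    return k_off / k_on

def binding_free_energy(kd, temp=298.15):
    """Standard binding free energy dG = RT * ln(K_d), in kJ/mol."""
    R = 8.314462618e-3  # gas constant, kJ/(mol*K)
    return R * temp * math.log(kd)

kd = dissociation_constant(k_on=1e6, k_off=1e-2)  # illustrative rate constants
dg = binding_free_energy(kd)
print(f"Kd = {kd:.1e} M, dG = {dg:.1f} kJ/mol")
```

The point of QSKR modeling is that two ligands with the same K_d (hence the same ΔG) can have very different residence times (1/k_off), so affinity alone underdetermines the kinetics.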

  20. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The simulation results obtained are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

1. Quantitative ROESY analysis of computational models: structural studies of citalopram and β-cyclodextrin complexes by ¹H-NMR and computational methods.

    PubMed

    Ali, Syed Mashhood; Shamim, Shazia

    2015-07-01

Complexation of racemic citalopram with β-cyclodextrin (β-CD) in aqueous medium was investigated to determine an atom-accurate structure of the inclusion complexes. ¹H-NMR chemical shift change data of β-CD cavity protons in the presence of citalopram confirmed the formation of 1 : 1 inclusion complexes. The ROESY spectrum confirmed the presence of an aromatic ring in the β-CD cavity, but it was not clear whether one of the two rings or both were included. Molecular mechanics and molecular dynamics calculations showed the entry of the fluoro-ring from the wider side of the β-CD cavity as the most favored mode of inclusion. Minimum-energy computational models were analyzed for their accuracy in atomic coordinates by comparison of calculated and experimental intermolecular ROESY peak intensities, which were not found to be in agreement. Several least-energy computational models were refined and analyzed until the calculated and experimental intensities were compatible. The results demonstrate that computational models of CD complexes need to be analyzed for atom accuracy, and that quantitative ROESY analysis is a promising method for doing so. Moreover, the study also validates that the quantitative use of ROESY is feasible even with longer mixing times if peak intensity ratios instead of absolute intensities are used. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Dissecting Embryonic Stem Cell Self-Renewal and Differentiation Commitment from Quantitative Models.

    PubMed

    Hu, Rong; Dai, Xianhua; Dai, Zhiming; Xiang, Qian; Cai, Yanning

    2016-10-01

To model embryonic stem cell (ESC) self-renewal and differentiation quantitatively by computational approaches, we developed a unified mathematical model for gene expression involved in cell fate choices. Our quantitative model comprised ESC master regulators and lineage-specific pivotal genes. It took the factors of multiple pathways as input and computed expression as a function of intrinsic transcription factors, extrinsic cues, epigenetic modifications, and antagonism between ESC master regulators and lineage-specific pivotal genes. In the model, the differential equations for the expression of genes involved in cell fate choices were established from the regulatory relationships according to the transcription and degradation rates. We applied this model to murine ESC self-renewal and differentiation commitment and found that it reproduced the expression patterns with good accuracy. Our model analysis revealed that the murine ESC state was an attractor in culture and that differentiation was predominantly caused by antagonism between ESC master regulators and lineage-specific pivotal genes. Moreover, antagonism among lineages played a critical role in lineage reprogramming. Our results also uncovered that the ordered alteration of ESC master regulator expression over time had a central role in ESC differentiation fates. Our computational framework was generally applicable to most cell-type maintenance and lineage reprogramming.
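The kind of model described here, differential equations built from transcription and degradation rates with antagonism between regulators, can be sketched as a minimal two-gene mutual-repression system integrated by forward Euler. Parameter values are illustrative, not the paper's:

```python
def toggle_step(x, y, dt=0.01, a=10.0, n=2, d=1.0):
    """Forward-Euler step for two mutually repressing genes:
    dx/dt = a / (1 + y^n) - d*x  (repressed transcription minus degradation)."""
    dx = a / (1.0 + y ** n) - d * x
    dy = a / (1.0 + x ** n) - d * y
    return x + dt * dx, y + dt * dy

def settle(x, y, steps=5000):
    """Integrate until the system sits near one of its attractor states."""
    for _ in range(steps):
        x, y = toggle_step(x, y)
    return x, y

hi, lo = settle(5.0, 0.1)  # initial bias toward gene x decides the fate
print(round(hi, 2), round(lo, 2))
```

The two stable fixed points (x high / y low, and the mirror image) are the toy analogue of the abstract's attractor states: antagonism plus degradation makes the initial bias decide which "lineage" wins.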

  3. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  5. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  6. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  7. Quantitative Adverse Outcome Pathways and Their Application to Predictive Toxicology

    EPA Science Inventory

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course p...

  8. Systems Biology in Immunology – A Computational Modeling Perspective

    PubMed Central

    Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.

    2011-01-01

Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data-gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation, and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators, and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182

  9. Modeling of Receptor Tyrosine Kinase Signaling: Computational and Experimental Protocols.

    PubMed

    Fey, Dirk; Aksamitiene, Edita; Kiyatkin, Anatoly; Kholodenko, Boris N

    2017-01-01

The advent of systems biology has convincingly demonstrated that the integration of experiments and dynamic modelling is a powerful approach to understanding cellular network biology. Here we present experimental and computational protocols that are necessary for applying this integrative approach to the quantitative study of receptor tyrosine kinase (RTK) signaling networks. Signaling by RTKs controls multiple cellular processes, including the regulation of cell survival, motility, proliferation, differentiation, glucose metabolism, and apoptosis. We describe methods of model building and training on experimentally obtained quantitative datasets, as well as experimental methods of obtaining quantitative dose-response and temporal dependencies of protein phosphorylation and activities. The presented methods make possible (1) both the fine-grained modeling of complex signaling dynamics and the identification of salient, coarse-grained network structures (such as feedback loops) that bring about intricate dynamics, and (2) experimental validation of dynamic models.

  10. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  11. A computational model of selection by consequences.

    PubMed Central

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied over wide ranges in these experiments, and many of the qualitative features of the model also were varied. The digital organism consistently showed a hyperbolic relation between response and reinforcement rates, and this hyperbolic description of the data was consistently better than the description provided by other, similar, function forms. In addition, the parameters of the hyperbola varied systematically with the quantitative, and some of the qualitative, properties of the model in ways that were consistent with findings from biological organisms. These results suggest that the material events responsible for an organism's responding on RI schedules are computationally equivalent to Darwinian selection by consequences. They also suggest that the computational model developed here is worth pursuing further as a possible dynamic account of behavior. PMID:15357512

  12. Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations

    DTIC Science & Technology

    2010-11-01

…from a training set of compounds. Other methods include Quantitative Structure-Activity Relationship (QSAR) and Quantitative Structure-Property… the development of QSPR/QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve… Quadratic Configuration Interaction Singles Doubles; QSAR, Quantitative Structure-Activity Relationship; QSPR, Quantitative Structure-Property…

  13. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    PubMed Central

    Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan

    2004-01-01

Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335
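The extended-cellular-automaton idea, discrete states updated synchronously so that excitation propagates, can be sketched in one dimension. This toy three-state automaton (rest, excited, refractory) is far simpler than the paper's whole-heart model, but it shows the propagation mechanism:

```python
def step(cells):
    """One synchronous update of a 1-D excitable-medium automaton.
    States: 0 = rest, 1 = excited, 2 = refractory."""
    nxt = []
    for i, s in enumerate(cells):
        if s == 1:
            nxt.append(2)          # excited cells become refractory
        elif s == 2:
            nxt.append(0)          # refractory cells recover to rest
        else:                      # resting cells fire if a neighbour is excited
            left = cells[i - 1] if i > 0 else 0
            right = cells[i + 1] if i < len(cells) - 1 else 0
            nxt.append(1 if 1 in (left, right) else 0)
    return nxt

cells = [1] + [0] * 19             # stimulate the left end
reached = False
for _ in range(25):
    cells = step(cells)
    reached = reached or cells[-1] == 1
print("wave reached far end:", reached)
```

The refractory state behind the wavefront prevents back-propagation, which is exactly the property that lets such automata reproduce travelling excitation waves without solving the governing differential equations.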

  14. Exposure Science and the US EPA National Center for Computational Toxicology

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The...

  15. The effect of in situ/in vitro three-dimensional quantitative computed tomography image voxel size on the finite element model of human vertebral cancellous bone.

    PubMed

    Lu, Yongtao; Engelke, Klaus; Glueer, Claus-C; Morlock, Michael M; Huber, Gerd

    2014-11-01

    Quantitative computed tomography-based finite element modeling technique is a promising clinical tool for the prediction of bone strength. However, quantitative computed tomography-based finite element models were created from image datasets with different image voxel sizes. The aim of this study was to investigate whether there is an influence of image voxel size on the finite element models. In all, 12 thoracolumbar vertebrae were scanned prior to autopsy (in situ) using two different quantitative computed tomography scan protocols, which resulted in image datasets with two different voxel sizes (0.29 × 0.29 × 1.3 mm³ vs. 0.18 × 0.18 × 0.6 mm³). Eight of them were scanned after autopsy (in vitro) and the datasets were reconstructed with two voxel sizes (0.32 × 0.32 × 0.6 mm³ vs. 0.18 × 0.18 × 0.3 mm³). Finite element models with cuboid volume of interest extracted from the vertebral cancellous part were created and inhomogeneous bilinear bone properties were defined. Axial compression was simulated. No effect of voxel size was detected on the apparent bone mineral density for both the in situ and in vitro cases. However, the apparent modulus and yield strength showed significant differences in the two voxel size group pairs (in situ and in vitro). In conclusion, the image voxel size may have to be considered when the finite element voxel modeling technique is used in clinical applications. © IMechE 2014.

  16. Computing Quantitative Characteristics of Finite-State Real-Time Systems

    DTIC Science & Technology

    1994-05-04

    Current methods for verifying real-time systems are essentially decision procedures that establish whether the system model satisfies a given...specification. We present a general method for computing quantitative information about finite-state real-time systems. We have developed algorithms that...our technique can be extended to a more general representation of real-time systems, namely, timed transition graphs. The algorithms presented in this

  17. Computational modeling of the amphibian thyroid axis supported by targeted in vivo testing to advance quantitative adverse outcome pathway development

    EPA Science Inventory

    In vitro screening of chemicals for bioactivity together with computational modeling are beginning to replace animal toxicity testing in support of chemical risk assessment. To facilitate this transition, an amphibian thyroid axis model has been developed to describe thyroid home...

  18. A Detailed Data-Driven Network Model of Prefrontal Cortex Reproduces Key Features of In Vivo Activity

    PubMed Central

    Hass, Joachim; Hertäg, Loreen; Durstewitz, Daniel

    2016-01-01

    The prefrontal cortex is centrally involved in a wide range of cognitive functions and their impairment in psychiatric disorders. Yet, the computational principles that govern the dynamics of prefrontal neural networks, and link their physiological, biochemical and anatomical properties to cognitive functions, are not well understood. Computational models can help to bridge the gap between these different levels of description, provided they are sufficiently constrained by experimental data and capable of predicting key properties of the intact cortex. Here, we present a detailed network model of the prefrontal cortex, based on a simple computationally efficient single neuron model (simpAdEx), with all parameters derived from in vitro electrophysiological and anatomical data. Without additional tuning, this model could be shown to quantitatively reproduce a wide range of measures from in vivo electrophysiological recordings, to a degree where simulated and experimentally observed activities were statistically indistinguishable. These measures include spike train statistics, membrane potential fluctuations, local field potentials, and the transmission of transient stimulus information across layers. We further demonstrate that model predictions are robust against moderate changes in key parameters, and that synaptic heterogeneity is a crucial ingredient to the quantitative reproduction of in vivo-like electrophysiological behavior. Thus, we have produced a physiologically highly valid, in a quantitative sense, yet computationally efficient PFC network model, which helped to identify key properties underlying spike time dynamics as observed in vivo, and can be harvested for in-depth investigation of the links between physiology and cognition. PMID:27203563
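The single-neuron building block referenced above is an adaptive exponential integrate-and-fire (AdEx) variant. A minimal Euler-integrated sketch of the standard AdEx equations follows (textbook-style parameter values; this is not the simpAdEx variant nor the in vitro-fitted parameters of the paper):

```python
import math

# Standard adaptive exponential integrate-and-fire (AdEx) neuron with
# forward-Euler integration. Parameter values are illustrative, NOT the
# simpAdEx model or its fitted parameters.
C, gL, EL = 200.0, 10.0, -70.0       # capacitance (pF), leak (nS), rest (mV)
VT, DT, Vreset, Vspike = -50.0, 2.0, -58.0, 0.0
a, b, tau_w = 2.0, 60.0, 120.0       # subthreshold (nS), spike jump (pA), tau (ms)

def simulate(I, t_max=500.0, dt=0.1):
    """Return spike times (ms) for a constant input current I (pA)."""
    V, w, spikes = EL, 0.0, []
    t = 0.0
    while t < t_max:
        # Membrane equation with exponential spike-initiation term.
        dV = (-gL * (V - EL) + gL * DT * math.exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vspike:              # spike: reset V, jump adaptation current
            spikes.append(t)
            V, w = Vreset, w + b
        t += dt
    return spikes
```

The exponential term models the spike upswing; when V crosses the numerical cutoff, the membrane is reset and the adaptation current w is incremented, which produces the spike-frequency adaptation that such models use to match in vitro traces.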

  19. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  20. Modeling the Learner in Computer-Assisted Instruction

    ERIC Educational Resources Information Center

    Fletcher, J. D.

    1975-01-01

    This paper briefly reviews relevant work in four areas: 1) quantitative models of memory; 2) regression models of performance; 3) automation models of performance; and 4) artificial intelligence. (Author/HB)

  21. Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships (MCBIOS)

    EPA Science Inventory

    Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships Jie Liu1,2, Richard Judson1, Matthew T. Martin1, Huixiao Hong3, Imran Shah1 1National Center for Computational Toxicology (NCCT), US EPA, RTP, NC...

  1. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING

    EPA Science Inventory

    The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...

  2. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
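Dempster's rule of combination, used above to fuse inferences from parallel diagnostic models, can be sketched as follows (the mass assignments over failure hypotheses f1 and f2 are hypothetical, not derived from any parity-equation model):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (masses over frozensets
    of hypotheses) with Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (h1, w1), (h2, w2) in product(m1.items(), m2.items()):
        inter = h1 & h2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2              # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    k = 1.0 - conflict                       # normalization constant
    return {h: w / k for h, w in combined.items()}

# Hypothetical evidence from two diagnostic models about failures f1, f2.
m1 = {frozenset({"f1"}): 0.6, frozenset({"f1", "f2"}): 0.4}
m2 = {frozenset({"f2"}): 0.5, frozenset({"f1", "f2"}): 0.5}
result = dempster_combine(m1, m2)
```

Mass that the two sources place on disjoint hypothesis sets is treated as conflict and renormalized away; here the combined evidence ends up favoring f1, the hypothesis both sources at least partially support.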

  3. Health impact assessment – A survey on quantifying tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org

    Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.

  4. Quantitative Evaluation of a Planetary Renderer for Terrain Relative Navigation

    NASA Astrophysics Data System (ADS)

    Amoroso, E.; Jones, H.; Otten, N.; Wettergreen, D.; Whittaker, W.

    2016-11-01

    A ray-tracing computer renderer tool is presented based on LOLA and LROC elevation models and is quantitatively compared to LRO WAC and NAC images for photometric accuracy. We investigated using rendered images for terrain relative navigation.

  5. Quantitative collision induced mass spectrometry of substituted piperazines - A correlative analysis between theory and experiment

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2017-12-01

    The present paper deals with the quantitative kinetics and thermodynamics of collision-induced dissociation (CID) reactions of piperazines under different experimental conditions, together with a systematic description of the effect of counter-ions on common MS fragment reactions of piperazines and the intra-molecular effect of quaternary cyclization of substituted piperazines yielding quaternary salts. Quantitative model equations are discussed for the rate constants and the free Gibbs energies of a series of m-independent CID fragment processes in the gas phase (GP), which have been evidenced experimentally. Both kinetic and thermodynamic parameters are also predicted by computational density functional theory (DFT) and ab initio methods, both static and dynamic. The paper also examines quantitatively the validity of the Maxwell-Boltzmann distribution for non-Boltzmann CID processes. The experiments conducted within the latter framework show excellent correspondence with theoretical quantum chemical modeling. An important property of the presented model equations of reaction kinetics is their applicability to predicting unknown, and assigning known, mass spectrometric (MS) patterns. The nature of the "GP" continuum in the coupled CID-MS measurement scheme with an electrospray ionization (ESI) source is discussed by performing parallel computations in the gas phase and in a polar continuum at different temperatures and ionic strengths. The effect of pressure is presented. The study contributes significantly to the methodological and phenomenological development of CID-MS and its analytical implementation for quantitative and structural analyses. It also demonstrates the great prospects of a complementary application of experimental CID-MS and computational quantum chemistry in studying chemical reactivity, among others.
To a considerable extent, this work underlines the place of computational quantum chemistry in experimental analytical chemistry, in particular highlighting structural analysis.

  6. Analysis of rocket engine injection combustion processes

    NASA Technical Reports Server (NTRS)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and application of the models to correlation of well-documented hot fire engine databases. These programs are the distributed energy release (DER) model for conventional liquid propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  7. Training Signaling Pathway Maps to Biochemical Data with Constrained Fuzzy Logic: Quantitative Analysis of Liver Cell Responses to Inflammatory Stimuli

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.

    2011-01-01

    Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212
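The graded-activation idea behind cFL can be illustrated with a small sketch (normalized Hill transfer functions are one common choice for fuzzy edges; the gate rules and parameter values below are illustrative assumptions, not the trained models of the paper):

```python
def hill(x, ec50=0.5, n=3):
    """Normalized Hill transfer function: maps a graded input in [0, 1]
    to a graded activation in [0, 1] with hill(1) == 1 (EC50 and Hill
    coefficient are illustrative assumptions)."""
    raw = x ** n / (ec50 ** n + x ** n)
    return raw * (ec50 ** n + 1.0)   # rescale so a fully active input maps to 1

def fuzzy_and(*acts):
    return min(acts)                 # AND gate: limited by the weakest input

def fuzzy_or(*acts):
    return max(acts)                 # OR gate: driven by the strongest input

# Hypothetical fragment of a pathway map: node C requires both A and B.
a_act, b_act = 0.9, 0.4
c_act = hill(fuzzy_and(a_act, b_act))    # graded activation of C
```

In a trained cFL model, the EC50 and Hill coefficient of each edge would be fit to the biochemical measurements rather than fixed by hand, which is what lets the same prior-knowledge network describe graded, context-specific activation.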

  8. Quantitative Diagnosis of Continuous-Valued, Steady-State Systems

    NASA Technical Reports Server (NTRS)

    Rouquette, N.

    1995-01-01

    Quantitative diagnosis involves numerically estimating the values of unobservable parameters that best explain the observed parameter values. We consider quantitative diagnosis for continuous, lumped-parameter, steady-state physical systems because such models are easy to construct and the diagnosis problem is considerably simpler than that for corresponding dynamic models. To further tackle the difficulties of numerically inverting a simulation model to compute a diagnosis, we propose to decompose a physical system model in terms of feedback loops. This decomposition reduces the dimension of the problem and consequently decreases the diagnosis search space. We illustrate this approach on a model of a thermal control system studied in earlier research.

  9. Quantitative relations between fishing mortality, spawning stress mortality and biomass growth rate (computed with numerical model FISHMO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laevastu, T.

    1983-01-01

    The effects of fishing on a given species biomass have been quantitatively evaluated. A constant recruitment is assumed in this study, but the evaluation can be computed on any known age distribution of exploitable biomass. Fishing mortality is assumed to be constant with age; however, spawning stress mortality increases with age. When fishing (mortality) increases, the spawning stress mortality decreases relative to total and exploitable biomasses. These changes are quantitatively shown for two species from the Bering Sea - walleye pollock, Theragra chalcogramma, and yellowfin sole, Limanda aspera.
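The qualitative relationship reported above, that higher fishing mortality lowers equilibrium biomass while spawning stress mortality increases with age, can be reproduced with a toy age-structured projection (all rates and the growth rule are hypothetical; this is not the FISHMO model):

```python
import math

def project_biomass(ages, recruit, F, growth, s0, years):
    """Project exploitable biomass under constant recruitment, constant
    fishing mortality F, and age-dependent spawning stress mortality
    S(a) = s0 * a. All parameter values are hypothetical."""
    biomass = [recruit] * ages
    for _ in range(years):
        # Each cohort survives fishing plus spawning stress, then grows.
        survivors = [b * math.exp(-(F + s0 * a)) * (1.0 + growth)
                     for a, b in enumerate(biomass)]
        biomass = [recruit] + survivors[:-1]   # age cohorts; recruit age 0
    return sum(biomass)

low_f = project_biomass(ages=10, recruit=100.0, F=0.1, growth=0.05,
                        s0=0.02, years=30)
high_f = project_biomass(ages=10, recruit=100.0, F=0.4, growth=0.05,
                         s0=0.02, years=30)
# Higher fishing mortality yields a lower equilibrium biomass.
```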

  10. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing as it offers numerous advantages in terms of economic savings and minimal management effort. Although elasticity and flexibility bring tremendous benefits, cloud computing still raises many information security issues due to its unique characteristic of allowing ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses all the characteristics of cloud computing, which did not appear in the existing models.

  11. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  12. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  13. Augmenting Surgery via Multi-scale Modeling and Translational Systems Biology in the Era of Precision Medicine: A Multidisciplinary Perspective

    PubMed Central

    Kassab, Ghassan S.; An, Gary; Sander, Edward A.; Miga, Michael; Guccione, Julius M.; Ji, Songbai; Vodovotz, Yoram

    2016-01-01

    In this era of tremendous technological capabilities and increased focus on improving clinical outcomes, decreasing costs, and increasing precision, there is a need for a more quantitative approach to the field of surgery. Multiscale computational modeling has the potential to bridge the gap to the emerging paradigms of Precision Medicine and Translational Systems Biology, in which quantitative metrics and data guide patient care through improved stratification, diagnosis, and therapy. Achievements by multiple groups have demonstrated the potential for 1) multiscale computational modeling, at a biological level, of diseases treated with surgery and the surgical procedure process at the level of the individual and the population; along with 2) patient-specific, computationally-enabled surgical planning, delivery, and guidance and robotically-augmented manipulation. In this perspective article, we discuss these concepts, and cite emerging examples from the fields of trauma, wound healing, and cardiac surgery. PMID:27015816

  14. Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience

    PubMed Central

    Kriegeskorte, Nikolaus; Mur, Marieke; Bandettini, Peter

    2008-01-01

    A fundamental challenge for systems neuroscience is to quantitatively relate its three major branches of research: brain-activity measurement, behavioral measurement, and computational modeling. Using measured brain-activity patterns to evaluate computational network models is complicated by the need to define the correspondency between the units of the model and the channels of the brain-activity data, e.g., single-cell recordings or voxels from functional magnetic resonance imaging (fMRI). Similar correspondency problems complicate relating activity patterns between different modalities of brain-activity measurement (e.g., fMRI and invasive or scalp electrophysiology), and between subjects and species. In order to bridge these divides, we suggest abstracting from the activity patterns themselves and computing representational dissimilarity matrices (RDMs), which characterize the information carried by a given representation in a brain or model. Building on a rich psychological and mathematical literature on similarity analysis, we propose a new experimental and data-analytical framework called representational similarity analysis (RSA), in which multi-channel measures of neural activity are quantitatively related to each other and to computational theory and behavior by comparing RDMs. We demonstrate RSA by relating representations of visual objects as measured with fMRI in early visual cortex and the fusiform face area to computational models spanning a wide range of complexities. The RDMs are simultaneously related via second-level application of multidimensional scaling and tested using randomization and bootstrap techniques. We discuss the broad potential of RSA, including novel approaches to experimental design, and argue that these ideas, which have deep roots in psychology and neuroscience, will allow the integrated quantitative analysis of data from all three branches, thus contributing to a more unified systems neuroscience. PMID:19104670
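The core RDM computation of RSA can be sketched in a few lines (synthetic activity patterns stand in for the fMRI and model data; correlation distance and a Pearson comparison of RDMs are common choices, though not the only ones the framework allows):

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between activity patterns (conditions x channels)."""
    return 1.0 - np.corrcoef(patterns)

def compare_rdms(rdm_a, rdm_b):
    """Relate two representations by correlating the upper triangles of
    their RDMs (rank correlation is also common; Pearson for brevity)."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

rng = np.random.default_rng(0)
brain = rng.normal(size=(6, 50))                 # 6 conditions x 50 voxels
model = brain + 0.1 * rng.normal(size=(6, 50))   # a model close to the brain
similarity = compare_rdms(rdm(brain), rdm(model))
```

Because the comparison happens between RDMs rather than between raw patterns, no unit-to-voxel correspondence is needed, which is exactly the bridging property the abstract describes.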

  15. Modeling of electron-specimen interaction in scanning electron microscope for e-beam metrology and inspection: challenges and perspectives

    NASA Astrophysics Data System (ADS)

    Suzuki, Makoto; Kameda, Toshimasa; Doi, Ayumi; Borisov, Sergey; Babin, Sergey

    2018-03-01

    The interpretation of scanning electron microscopy (SEM) images of the latest semiconductor devices is not intuitive and requires comparison with computed images based on theoretical modeling and simulations. For quantitative image prediction and geometrical reconstruction of the specimen structure, the accuracy of the physical model is essential. In this paper, we review the current models of electron-solid interaction and discuss their accuracy. We perform the comparison of the simulated results with our experiments of SEM overlay of under-layer, grain imaging of copper interconnect, and hole bottom visualization by angular selective detectors, and show that our model well reproduces the experimental results. Remaining issues for quantitative simulation are also discussed, including the accuracy of the charge dynamics, treatment of beam skirt, and explosive increase in computing time.

  16. An Analysis of Computer-Mediated Communication between Middle School Students and Scientist Role Models: A Pilot Study.

    ERIC Educational Resources Information Center

    Murfin, Brian

    1994-01-01

    Reports on a study of the effectiveness of computer-mediated communication (CMC) in providing African American and female middle school students with scientist role models. Quantitative and qualitative data gathered by analyzing messages students and scientists posted on a shared electronic bulletin board showed that CMC could be an effective…

  17. Image analysis and modeling in medical image computing. Recent developments and advances.

    PubMed

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. 
Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.

  18. A Quantitative Investigation of Cloud Computing Adoption in Nigeria: Testing an Enhanced Technology Acceptance Model

    ERIC Educational Resources Information Center

    Ishola, Bashiru Abayomi

    2017-01-01

    Cloud computing has recently emerged as a potential alternative to the traditional on-premise computing that businesses can leverage to achieve operational efficiencies. Consequently, technology managers are often tasked with the responsibilities to analyze the barriers and variables critical to organizational cloud adoption decisions. This…

  19. A Quantitative Exploration of Preservice Teachers' Intent to Use Computer-based Technology

    ERIC Educational Resources Information Center

    Kim, Kioh; Jain, Sachin; Westhoff, Guy; Rezabek, Landra

    2008-01-01

    Based on Bandura's (1977) social learning theory, the purpose of this study is to identify the relationship of preservice teachers' perceptions of faculty modeling of computer-based technology and preservice teachers' intent of using computer-based technology in educational settings. There were 92 participants in this study; they were enrolled in…

  20. EPAs Virtual Embryo: Modeling Developmental Toxicity

    EPA Science Inventory

    Embryogenesis is regulated by concurrent activities of signaling pathways organized into networks that control spatial patterning, molecular clocks, morphogenetic rearrangements and cell differentiation. Quantitative mathematical and computational models are needed to better unde...

  1. On the usage of ultrasound computational models for decision making under ambiguity

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
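The two comparison metrics named above, maximum amplitude and power spectral density, can be sketched as follows (synthetic sinusoidal waveforms stand in for simulated and measured ultrasound signals; this is not PNNL's analysis code):

```python
import numpy as np

def max_amplitude_db_diff(sim, meas):
    """Difference in peak amplitude between simulated and measured
    signals, expressed in decibels."""
    return 20.0 * np.log10(np.max(np.abs(sim)) / np.max(np.abs(meas)))

def psd(signal, fs):
    """One-sided power spectral density via the FFT periodogram."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    return freqs, (np.abs(spec) ** 2) / (fs * len(signal))

fs = 1e3                                         # 1 kHz sampling rate
t = np.arange(0, 1, 1 / fs)
measured = np.sin(2 * np.pi * 50 * t)            # 50 Hz "measurement"
simulated = 0.9 * np.sin(2 * np.pi * 50 * t)     # model underpredicts amplitude
diff_db = max_amplitude_db_diff(simulated, measured)
freqs, p = psd(measured, fs)
peak_freq = freqs[np.argmax(p)]                  # dominant spectral component
```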

  2. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    PubMed

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. 
Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  3. Investigating Gender/Ethnicity Heterogeneity in Course Management System Use in Higher Education by Utilizing the MIMIC Model

    ERIC Educational Resources Information Center

    Li, Yi

    2012-01-01

    This study focuses on the issue of learning equity in colleges and universities where teaching and learning have come to depend heavily on computer technologies. The study uses the Multiple Indicators Multiple Causes (MIMIC) latent variable model to quantitatively investigate whether there is a gender/ethnicity difference in using computer-based…

  4. Using the phase-space imager to analyze partially coherent imaging systems: bright-field, phase contrast, differential interference contrast, differential phase contrast, and spiral phase contrast

    NASA Astrophysics Data System (ADS)

    Mehta, Shalin B.; Sheppard, Colin J. R.

    2010-05-01

    Various methods that use a large illumination aperture (i.e. partially coherent illumination) have been developed for making transparent (i.e. phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement; coherent illumination has traditionally been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully due to the lack of a general, physically insightful and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate the five different partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and the resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge-enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for analysis, calibration, and design of partially coherent imaging methods.

  5. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
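At the core of such Monte Carlo workflows is the stochastic simulation algorithm (SSA). As a minimal sketch of the idea (a well-mixed Gillespie simulation of a single production/degradation reaction with illustrative rate constants, not the PyURDME spatial API):

```python
import random

def gillespie_birth_death(k_prod=10.0, k_deg=0.1, x0=0, t_end=50.0, seed=1):
    """Exact SSA for the reactions 0 -> X (rate k_prod) and X -> 0 (rate k_deg * x)."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a_prod, a_deg = k_prod, k_deg * x    # reaction propensities
        a_total = a_prod + a_deg
        t += rng.expovariate(a_total)        # exponential waiting time to next event
        x += 1 if rng.random() * a_total < a_prod else -1
        times.append(t)
        counts.append(x)
    return times, counts

times, counts = gillespie_birth_death()      # copy number fluctuates around k_prod/k_deg
```

Spatial stochastic tools extend this idea by treating diffusion between mesh voxels as additional first-order reaction channels, which is what makes the workflows computationally demanding.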

  6. New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity.

    PubMed

    Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R

    2004-11-21

    Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatic disciplines. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have now become a vital component of modern-day computational immunovaccinology. Historically, such approaches have been built around semi-qualitative, classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D(b), H2-K(b) and H2-K(k). As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online (http://www.jenner.ac.uk/MHCPred).
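The additive method treats affinity as a sum of independent position-specific residue contributions, which reduces prediction to a linear regression over one-hot (position, residue) indicators. A self-contained sketch on synthetic 9-mer data (the peptides, weights, and noise level below are invented for illustration; they are not the authors' training data or models):

```python
import numpy as np

AAS = "ACDEFGHIKLMNPQRSTVWY"

def encode(peptides):
    """One-hot (position, residue) design matrix for equal-length peptides."""
    L = len(peptides[0])
    X = np.zeros((len(peptides), L * len(AAS)))
    for i, p in enumerate(peptides):
        for pos, aa in enumerate(p):
            X[i, pos * len(AAS) + AAS.index(aa)] = 1.0
    return X

rng = np.random.default_rng(0)
true_w = rng.normal(size=9 * len(AAS))                      # synthetic "true" contributions
peptides = ["".join(rng.choice(list(AAS), size=9)) for _ in range(300)]
X = encode(peptides)
y = X @ true_w + rng.normal(scale=0.1, size=len(peptides))  # synthetic affinities

w, *_ = np.linalg.lstsq(X, y, rcond=None)   # fitted additive contributions (min-norm)
r = np.corrcoef(y, X @ w)[0, 1]             # in-sample agreement
```

The design matrix is rank-deficient (the indicators at each position sum to one), so the minimum-norm least-squares solution is used; the fitted contributions are identifiable only up to per-position offsets, while predictions are unaffected.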

  7. Computational Medicine: Translating Models to Clinical Care

    PubMed Central

    Winslow, Raimond L.; Trayanova, Natalia; Geman, Donald; Miller, Michael I.

    2013-01-01

    Because of the inherent complexity of coupled nonlinear biological systems, the development of computational models is necessary for achieving a quantitative understanding of their structure and function in health and disease. Statistical learning is applied to high-dimensional biomolecular data to create models that describe relationships between molecules and networks. Multiscale modeling links networks to cells, organs, and organ systems. Computational approaches are used to characterize anatomic shape and its variations in health and disease. In each case, the purposes of modeling are to capture all that we know about disease and to develop improved therapies tailored to the needs of individuals. We discuss advances in computational medicine, with specific examples in the fields of cancer, diabetes, cardiology, and neurology. Advances in translating these computational methods to the clinic are described, as well as challenges in applying models for improving patient health. PMID:23115356

  8. Towards a neuro-computational account of prism adaptation.

    PubMed

    Petitet, Pierre; O'Reilly, Jill X; O'Shea, Jacinta

    2017-12-14

    Prism adaptation has a long history as an experimental paradigm used to investigate the functional and neural processes that underlie sensorimotor control. In the neuropsychology literature, prism adaptation behaviour is typically explained by reference to a traditional cognitive psychology framework that distinguishes putative functions, such as 'strategic control' versus 'spatial realignment'. This theoretical framework lacks conceptual clarity, quantitative precision and explanatory power. Here, we advocate for an alternative computational framework that offers several advantages: 1) an algorithmic explanatory account of the computations and operations that drive behaviour; 2) expressed in quantitative mathematical terms; 3) embedded within a principled theoretical framework (Bayesian decision theory, state-space modelling); 4) that offers a means to generate and test quantitative behavioural predictions. This computational framework offers a route towards mechanistic neurocognitive explanations of prism adaptation behaviour. Thus, it constitutes a conceptual advance compared to the traditional theoretical framework. In this paper, we illustrate how Bayesian decision theory and state-space models offer principled explanations for a range of behavioural phenomena in the field of prism adaptation (e.g. visual capture, magnitude of visual versus proprioceptive realignment, spontaneous recovery and dynamics of adaptation memory). We argue that this explanatory framework can advance understanding of the functional and neural mechanisms that implement prism adaptation behaviour, by enabling quantitative tests of hypotheses that go beyond merely descriptive mapping claims that 'brain area X is (somehow) involved in psychological process Y'. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
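For concreteness, a minimal sketch of the kind of state-space model referred to here is the generic two-state (fast/slow) error-driven learner; the retention and learning-rate values below are illustrative assumptions, not fitted parameters from the prism-adaptation literature:

```python
import numpy as np

def two_state_adaptation(perturbation, A=(0.92, 0.996), B=(0.2, 0.02)):
    """Two-state error-driven state-space model:
    x_{t+1} = A * x_t + B * e_t, with output y_t = x_fast + x_slow
    and sensory prediction error e_t = p_t - y_t."""
    xf = xs = 0.0
    out = []
    for p in perturbation:
        y = xf + xs
        e = p - y                      # sensory prediction error on this trial
        out.append(y)
        xf = A[0] * xf + B[0] * e      # fast process: learns and forgets quickly
        xs = A[1] * xs + B[1] * e      # slow process: learns slowly, retains
    return np.array(out)

# prism-like 10-deg perturbation for 100 trials, then removed for 50 washout trials
p = np.r_[np.full(100, 10.0), np.zeros(50)]
y = two_state_adaptation(p)
```

The fast process dominates early trials while the slow process retains adaptation after the perturbation is removed, qualitatively reproducing aftereffects and (with error clamping) spontaneous recovery.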

  9. Can the Analytical Hierarchy Process Model Be Effectively Applied in the Prioritization of Information Assurance Defense In-Depth Measures? --A Quantitative Study

    ERIC Educational Resources Information Center

    Alexander, Rodney T.

    2017-01-01

    Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. Business and government often use defense in-depth information assurance measures such as firewalls, intrusion detection…

  10. Examining Student Opinions on Computer Use Based on the Learning Styles in Mathematics Education

    ERIC Educational Resources Information Center

    Ozgen, Kemal; Bindak, Recep

    2012-01-01

    The purpose of this study is to identify the opinions of high school students, who have different learning styles, related to computer use in mathematics education. High school students' opinions on computer use in mathematics education were collected with both qualitative and quantitative approaches in the study conducted with a survey model. For…

  11. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  12. Algorithmic-Reducibility = Renormalization-Group Fixed-Points; ``Noise''-Induced Phase-Transitions (NITs) to Accelerate Algorithmics (``NIT-Picking'') Replacing CRUTCHES!!!: Gauss Modular/Clock-Arithmetic Congruences = Signal X Noise PRODUCTS..

    NASA Astrophysics Data System (ADS)

    Siegel, J.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Cook-Levin computational-"complexity"(C-C) algorithmic-equivalence reduction-theorem reducibility equivalence to renormalization-(semi)-group phase-transitions critical-phenomena statistical-physics universality-classes fixed-points, is exploited with Gauss modular/clock-arithmetic/model congruences = signal X noise PRODUCT reinterpretation. Siegel-Baez FUZZYICS=CATEGORYICS(SON of ``TRIZ''): Category-Semantics(C-S) tabular list-format truth-table matrix analytics predicts and implements "noise"-induced phase-transitions (NITs) to accelerate versus to decelerate Harel [Algorithmics(1987)]-Sipser[Intro. Theory Computation(1997) algorithmic C-C: "NIT-picking" to optimize optimization-problems optimally(OOPO). Versus iso-"noise" power-spectrum quantitative-only amplitude/magnitude-only variation stochastic-resonance, this "NIT-picking" is "noise" power-spectrum QUALitative-type variation via quantitative critical-exponents variation. Computer-"science" algorithmic C-C models: Turing-machine, finite-state-models/automata, are identified as early-days once-workable but NOW ONLY LIMITING CRUTCHES IMPEDING latter-days new-insights!!!

  13. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental study in wet laboratory. In this way, natural biochemical systems can be better understood.
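The quantitative stage, tuning kinetic rates by simulated annealing, can be sketched generically as follows; the toy first-order-decay target and all schedule parameters are illustrative assumptions, not the paper's framework:

```python
import math
import random

def simulated_annealing(loss, x0, step=0.1, t0=1.0, cooling=0.995, iters=2000, seed=0):
    """Generic simulated annealing over a vector of kinetic rates."""
    rng = random.Random(seed)
    x, fx = list(x0), loss(x0)
    best, fbest = list(x), fx
    T = t0
    for _ in range(iters):
        cand = [xi + rng.gauss(0, step) for xi in x]   # perturb candidate rates
        fc = loss(cand)
        # accept improvements always, worsenings with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        T *= cooling                                    # geometric cooling schedule
    return best, fbest

# toy target behaviour: first-order decay with true rate k = 0.7
ts = [i * 0.5 for i in range(20)]
data = [math.exp(-0.7 * t) for t in ts]
loss = lambda k: sum((math.exp(-max(k[0], 1e-9) * t) - d) ** 2 for t, d in zip(ts, data))
k_fit, err = simulated_annealing(loss, [0.1])
```

High temperatures early on let the search escape poor local rate combinations; the cooling schedule then makes it increasingly greedy around the best candidate.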

  14. Cook-Levin Theorem Algorithmic-Reducibility/Completeness = Wilson Renormalization-(Semi)-Group Fixed-Points; ``Noise''-Induced Phase-Transitions (NITs) to Accelerate Algorithmics (``NIT-Picking'') REPLACING CRUTCHES!!!: Models: Turing-machine, finite-state-models, finite-automata

    NASA Astrophysics Data System (ADS)

    Young, Frederic; Siegel, Edward

    Cook-Levin theorem theorem algorithmic computational-complexity(C-C) algorithmic-equivalence reducibility/completeness equivalence to renormalization-(semi)-group phase-transitions critical-phenomena statistical-physics universality-classes fixed-points, is exploited via Siegel FUZZYICS =CATEGORYICS = ANALOGYICS =PRAGMATYICS/CATEGORY-SEMANTICS ONTOLOGY COGNITION ANALYTICS-Aristotle ``square-of-opposition'' tabular list-format truth-table matrix analytics predicts and implements ''noise''-induced phase-transitions (NITs) to accelerate versus to decelerate Harel [Algorithmics (1987)]-Sipser[Intro.Thy. Computation(`97)] algorithmic C-C: ''NIT-picking''(!!!), to optimize optimization-problems optimally(OOPO). Versus iso-''noise'' power-spectrum quantitative-only amplitude/magnitude-only variation stochastic-resonance, ''NIT-picking'' is ''noise'' power-spectrum QUALitative-type variation via quantitative critical-exponents variation. Computer-''science''/SEANCE algorithmic C-C models: Turing-machine, finite-state-models, finite-automata,..., discrete-maths graph-theory equivalence to physics Feynman-diagrams are identified as early-days once-workable valid but limiting IMPEDING CRUTCHES(!!!), ONLY IMPEDE latter-days new-insights!!!

  15. Quantitative computational infrared imaging of buoyant diffusion flames

    NASA Astrophysics Data System (ADS)

    Newale, Ashish S.

    Studies of infrared radiation from turbulent buoyant diffusion flames impinging on structural elements have applications to the development of fire models. A numerical and experimental study of radiation from buoyant diffusion flames with and without impingement on a flat plate is reported. Quantitative images of the radiation intensity from the flames are acquired using a high-speed infrared camera. Large eddy simulations are performed using the Fire Dynamics Simulator (FDS, version 6). The species concentrations and temperatures from the simulations are used in conjunction with a narrow-band radiation model (RADCAL) to solve the radiative transfer equation. The computed infrared radiation intensities are rendered in the form of images and compared with the measurements. The measured and computed radiation intensities reveal necking and bulging with a characteristic frequency of 7.1 Hz, in agreement with previous empirical correlations. The results demonstrate the effects of the stagnation-point boundary layer on the upstream buoyant shear layer. The coupling between these two shear layers presents a model problem for the sub-grid-scale modeling necessary for future large eddy simulations.

  16. From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models

    PubMed Central

    Zhu, Hao

    2017-01-01

    Quantitative Structure Activity Relationship (QSAR) is the most frequently used modeling approach to explore the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to their low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, computational toxicologists have pursued novel modeling protocols based on the rapidly increasing body of toxicity testing data. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the fields of computational drug discovery and environmental chemical risk assessment. PMID:23086837

  17. Consumers' behavior in quantitative microbial risk assessment for pathogens in raw milk: Incorporation of the likelihood of consumption as a function of storage time and temperature.

    PubMed

    Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier

    2016-02-01

    Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer-phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and the second study, respectively. Our results confirm that overlooking the time-temperature dependency may yield an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens.
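The key modification, weighting simulated scenarios by a logistic probability of consumption given storage time and temperature, can be sketched as follows; the coefficients are invented for illustration and are not the fitted values from the sensorial analysis:

```python
import math

def p_consumed(hours, temp_c, b0=6.0, b_time=-0.05, b_temp=-0.15):
    """Logistic probability that a stored raw-milk serving is actually consumed,
    given storage time (hours) and temperature (deg C).
    Coefficients are illustrative placeholders, not fitted values."""
    z = b0 + b_time * hours + b_temp * temp_c
    return 1.0 / (1.0 + math.exp(-z))

fridge = p_consumed(12, 4)    # short, cold storage: almost surely consumed
abuse = p_consumed(96, 25)    # long, warm storage: rarely consumed
```

In a Monte Carlo risk model, each sampled time-temperature scenario would be retained with probability `p_consumed`, so the extreme (long, warm) scenarios that dominate risk estimates under independence are rarely counted.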
Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  18. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
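A simplified regression analogue of such a test (not the extended Kempthorne partition implemented in the EPISNP programs) compares a full two-locus model against a reduced model without the four interaction terms via an F statistic; all data below are synthetic:

```python
import numpy as np

def epistasis_f_test(g1, g2, y):
    """F-test for two-locus epistasis: full model (additive + dominance per locus
    plus the four interaction products) vs. reduced additive + dominance model.
    Genotypes are coded 0/1/2."""
    a1, a2 = g1 - 1.0, g2 - 1.0                # additive codes
    d1 = (g1 == 1).astype(float) - 0.5         # dominance codes
    d2 = (g2 == 1).astype(float) - 0.5
    ones = np.ones_like(y)
    X_red = np.column_stack([ones, a1, d1, a2, d2])
    X_full = np.column_stack([X_red, a1 * a2, a1 * d2, d1 * a2, d1 * d2])
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    df1, df2 = 4, len(y) - X_full.shape[1]
    return ((rss(X_red) - rss(X_full)) / df1) / (rss(X_full) / df2)

rng = np.random.default_rng(2)
n = 500
g1 = rng.integers(0, 3, n).astype(float)
g2 = rng.integers(0, 3, n).astype(float)
y = 0.5 * (g1 - 1) * (g2 - 1) + rng.normal(size=n)   # additive-by-additive epistasis
F = epistasis_f_test(g1, g2, y)
```

The computational bottleneck described in the abstract arises because this test must be repeated for every SNP pair, which is what EPISNPmpi parallelizes.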

  19. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  20. Full 3-D OCT-based pseudophakic custom computer eye model

    PubMed Central

    Sun, M.; Pérez-Merino, P.; Martinez-Enriquez, E.; Velasco-Ocana, M.; Marcos, S.

    2016-01-01

    We compared measured wave aberrations in pseudophakic eyes implanted with aspheric intraocular lenses (IOLs) with simulated aberrations from numerical ray tracing on customized computer eye models, built using quantitative 3-D OCT-based patient-specific ocular geometry. Experimental and simulated aberrations show high correlation (R = 0.93; p<0.0001) and similarity (RMS for high-order aberration discrepancies within 23.58%). This study shows that full OCT-based pseudophakic custom computer eye models allow understanding of the relative contribution of optical geometrical and surgically related factors to image quality, and are an excellent tool for characterizing and improving cataract surgery. PMID:27231608

  1. Comparing the cognitive differences resulting from modeling instruction: Using computer microworld and physical object instruction to model real world problems

    NASA Astrophysics Data System (ADS)

    Oursland, Mark David

    This study compared the modeling achievement of students receiving mathematical modeling instruction using the computer microworld Interactive Physics with that of students receiving instruction using physical objects. Modeling instruction included activities where students applied the (a) linear model to a variety of situations, (b) linear model to two-rate situations with a constant rate, and (c) quadratic model to familiar geometric figures. Both quantitative and qualitative methods were used to analyze achievement differences among students (a) receiving different methods of modeling instruction, (b) with different levels of beginning modeling ability, or (c) with different levels of computer literacy. Student achievement was analyzed quantitatively through a three-factor analysis of variance where modeling instruction, beginning modeling ability, and computer literacy were used as the three independent factors. The SOLO (Structure of the Observed Learning Outcome) assessment framework was used to design written modeling assessment instruments to measure the students' modeling achievement. The same three independent factors were used to collect and analyze the interviews and observations of student behaviors. Both methods of modeling instruction used the data analysis approach to mathematical modeling. The instructional lessons presented problem situations where students were asked to collect data, analyze the data, write a symbolic mathematical equation, and use the equation to solve the problem. The researcher recommends the following practice for modeling instruction based on the conclusions of this study. A variety of activities with a common structure are needed to make explicit the modeling process of applying a standard mathematical model. The modeling process is influenced strongly by prior knowledge of the problem context and previous modeling experiences. 
The conclusions of this study imply that knowledge of the properties of squares improved the students' ability to model a geometric problem more than instruction in data-analysis modeling did. The use of computer microworlds such as Interactive Physics in conjunction with cooperative groups is a viable method of modeling instruction.

  2. EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.

    EPA Science Inventory

    This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the current use of PBPK and BBDR models in risk assessment. The work will include a literature search to identify...

  3. Factors Influencing F/OSS Cloud Computing Software Product Success: A Quantitative Study

    ERIC Educational Resources Information Center

    Letort, D. Brian

    2012-01-01

    Cloud Computing introduces a new business operational model that allows an organization to shift information technology consumption from traditional capital expenditure to operational expenditure. This shift introduces challenges from both the adoption and creation vantage. This study evaluates factors that influence Free/Open Source Software…

  4. Quantitative Model for Choosing Programming Language for Online Instruction

    ERIC Educational Resources Information Center

    Sherman, Steven J.; Shehane, Ronald F.; Todd, Dewey W.

    2018-01-01

    Colleges are increasingly offering online courses, including computer programming courses for business school students. Programming languages that are most useful to students are those that are widely used in the job market. However, the most popular computer languages change at least every three years. Therefore, the language used for instruction…

  5. Dissociation of the Ethyl Radical: An Exercise in Computational Chemistry

    ERIC Educational Resources Information Center

    Nassabeh, Nahal; Tran, Mark; Fleming, Patrick E.

    2014-01-01

    A set of exercises for use in a typical physical chemistry laboratory course are described, modeling the unimolecular dissociation of the ethyl radical to form ethylene and atomic hydrogen. Students analyze the computational results both qualitatively and quantitatively. Qualitative structural changes are compared to approximate predicted values…

  6. Fundamentals and Recent Developments in Approximate Bayesian Computation

    PubMed Central

    Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka

    2017-01-01

    Abstract Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
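The classical rejection algorithm the review begins from can be sketched in a few lines; the Gaussian toy model, tolerance, and prior below are illustrative choices:

```python
import random

def abc_rejection(obs_stat, simulate, prior_sample, eps=0.1, n=5000, seed=0):
    """Rejection ABC: keep prior draws whose simulated summary statistic
    falls within eps of the observed one."""
    rng = random.Random(seed)
    return [theta for theta in (prior_sample(rng) for _ in range(n))
            if abs(simulate(theta, rng) - obs_stat) < eps]

def simulate(theta, rng, m=50):
    """Simulator-only model: summary statistic is the mean of m draws
    from Normal(theta, 1); no likelihood is ever evaluated."""
    return sum(rng.gauss(theta, 1) for _ in range(m)) / m

prior = lambda rng: rng.uniform(-5, 5)       # flat prior over theta
post = abc_rejection(1.3, simulate, prior)   # approximate posterior sample
post_mean = sum(post) / len(post)
```

Shrinking `eps` tightens the approximation at the cost of a lower acceptance rate, which is the trade-off the more recent ABC algorithms reviewed here are designed to improve.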

  7. A novel paradigm for cell and molecule interaction ontology: from the CMM model to IMGT-ONTOLOGY

    PubMed Central

    2010-01-01

    Background Biology is moving fast toward the virtuous circle of other disciplines: from data to quantitative modeling and back to data. Models are usually developed by mathematicians, physicists, and computer scientists to translate qualitative or semi-quantitative biological knowledge into a quantitative approach. To eliminate semantic confusion between biology and other disciplines, it is necessary to have a list of the most important and frequently used concepts coherently defined. Results We propose a novel paradigm for generating new concepts for an ontology, starting from a model rather than from a database. We apply that approach to generate concepts for cell and molecule interaction, starting from an agent-based model. This effort provides a solid infrastructure that is useful for overcoming the semantic ambiguities that arise between biologists and mathematicians, physicists, and computer scientists when they interact in a multidisciplinary field. Conclusions This effort represents the first attempt at linking molecule ontology with cell ontology in IMGT-ONTOLOGY, the well-established ontology in immunogenetics and immunoinformatics, and a paradigm for life science biology. With the increasing use of models in biology and medicine, the need to link different levels, from molecules to cells to tissues and organs, is increasingly important. PMID:20167082

  8. Quantitative, steady-state properties of Catania's computational model of the operant reserve.

    PubMed

    Berg, John P; McDowell, J J

    2011-05-01

    Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Quantitative characterization of surface topography using spectral analysis

    NASA Astrophysics Data System (ADS)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
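
    As a concrete illustration of the PSD machinery the article reviews, a minimal one-dimensional periodogram can be sketched as follows. Assumptions: a uniformly sampled line scan, mean removal, and no window; prefactor conventions for the PSD vary across the literature, so the normalization below is one common choice, not a canonical one.

```python
import numpy as np

def psd_1d(heights, dx):
    """One-sided periodogram estimate of the 1D power spectral density.

    heights : uniformly sampled surface heights along a line scan
    dx      : sample spacing
    Returns (q, C): angular spatial frequencies and the PSD estimate.
    Note: prefactor conventions for the PSD vary across the literature.
    """
    n = len(heights)
    h_q = np.fft.rfft(heights - np.mean(heights))  # remove the mean (q = 0 term)
    C = (np.abs(h_q) ** 2) * dx / n                # periodogram, one common convention
    q = 2 * np.pi * np.fft.rfftfreq(n, d=dx)       # angular spatial frequency
    return q, C
```

    For a sinusoidal profile of wavelength λ, the estimate peaks at q = 2π/λ, which provides a quick sanity check of the scan length and pixel spacing entering the analysis.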

  10. Mapping Bone Mineral Density Obtained by Quantitative Computed Tomography to Bone Volume Fraction

    NASA Technical Reports Server (NTRS)

    Pennline, James A.; Mulugeta, Lealem

    2017-01-01

    Methods for relating or mapping estimates of volumetric Bone Mineral Density (vBMD) obtained by Quantitative Computed Tomography to Bone Volume Fraction (BVF) are outlined mathematically. The methods are based on definitions of bone properties, cited experimental studies, and regression relations derived from them for trabecular bone in the proximal femur. Using an experimental range of values in the intertrochanteric region obtained from male and female human subjects, age 18 to 49, the BVF values calculated from four different methods were compared to the experimental average and numerical range. The BVF values computed from the conversion method used data from two sources. One source provided pre-bed-rest vBMD values in the intertrochanteric region from 24 bed rest subjects who participated in a 70-day study. Another source contained preflight vBMD values from 18 astronauts who spent 4 to 6 months on the ISS. To aid the use of a mapping from BMD to BVF, the discussion includes how to formulate such mappings for purposes of computational modeling. One application of the conversions is to aid in modeling time-varying changes in vBMD as it relates to changes in BVF via bone remodeling and/or modeling.
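
    The simplest form such a mapping can take is a linear scaling of vBMD by an effective density of fully mineralized tissue. The sketch below is purely illustrative: the density value is an assumed placeholder, not one of the regression relations cited in the report.

```python
def bvf_from_vbmd(vbmd_mg_cm3, rho_mineralized_mg_cm3=1200.0):
    """Illustrative linear mapping from volumetric BMD to bone volume fraction.

    BVF = vBMD / rho, where rho is an ASSUMED effective density of fully
    mineralized tissue; the cited studies derive regression relations instead.
    """
    return vbmd_mg_cm3 / rho_mineralized_mg_cm3
```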

  11. Emission Computed Tomography: A New Technique for the Quantitative Physiologic Study of Brain and Heart in Vivo

    DOE R&D Accomplishments Database

    Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.

    1978-01-01

    Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This capability, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and which has been well characterized in terms of spatial resolution, sensitivity, and signal-to-noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools, and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.

  12. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as the Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  13. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  14. A Quantitative Model for Assessing Visual Simulation Software Architecture

    DTIC Science & Technology

    2011-09-01

  15. Multi-scale modeling in cell biology

    PubMed Central

    Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick

    2009-01-01

    Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808

  16. Parameters for Pesticide QSAR and PBPK/PD Models to inform Human Risk Assessments

    EPA Science Inventory

    Physiologically-based pharmacokinetic and pharmacodynamic (PBPK/PD) modeling has emerged as an important computational approach supporting quantitative risk assessment of agrochemicals. However, before complete regulatory acceptance of this tool, an assessment of assets and liabi...

  17. QSAR Modeling: Where Have You Been? Where Are You Going To?

    EPA Science Inventory

    Quantitative structure–activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this...

  18. Application of linear regression analysis in accuracy assessment of rolling force calculations

    NASA Astrophysics Data System (ADS)

    Poliak, E. I.; Shim, M. K.; Kim, G. S.; Choo, W. Y.

    1998-10-01

    Efficient operation of the computational models employed in process control systems requires periodic assessment of the accuracy of their predictions. Linear regression is proposed as a tool that allows systematic and random prediction errors to be separated from those related to measurements. A quantitative characteristic of the model's predictive ability is introduced in addition to standard statistical tests for model adequacy. Rolling force calculations are considered as an example application; however, the outlined approach can be used to assess the performance of any computational model.
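
    A minimal sketch of this kind of assessment, assuming the quantitative characteristic is taken to be the coefficient of determination (a plausible stand-in; the paper's exact statistic is not reproduced here): regress measured forces on predicted forces and inspect slope, intercept, and R². A slope near 1 with intercept near 0 suggests no systematic error.

```python
import numpy as np

def assess_predictions(predicted, measured):
    """Regress measured values on model predictions.

    Slope near 1 and intercept near 0 indicate no systematic bias;
    R^2 is used here as a simple stand-in for predictive ability.
    """
    A = np.column_stack([predicted, np.ones_like(predicted)])
    (slope, intercept), *_ = np.linalg.lstsq(A, measured, rcond=None)
    fitted = slope * np.asarray(predicted) + intercept
    ss_res = float(np.sum((measured - fitted) ** 2))
    ss_tot = float(np.sum((measured - np.mean(measured)) ** 2))
    return slope, intercept, 1.0 - ss_res / ss_tot
```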

  19. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    PubMed

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
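
    The c-index reported above (Harrell's concordance index) can be computed directly from survival times, event indicators, and risk scores. The sketch below is a generic O(n²) implementation run on synthetic inputs, not the authors' texture-analysis pipeline.

```python
def concordance_index(times, events, risk):
    """Harrell's c-index: fraction of usable pairs ordered correctly.

    times  : observed survival/censoring times
    events : 1 if the event was observed, 0 if censored
    risk   : model risk scores (higher = shorter predicted survival)
    """
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is usable if subject i had the event before time j
            if events[i] and times[i] < times[j]:
                den += 1
                if risk[i] > risk[j]:
                    num += 1.0      # concordant pair
                elif risk[i] == risk[j]:
                    num += 0.5      # tied risk counts half
    return num / den
```

    A c-index of 0.5 corresponds to random ordering and 1.0 to perfect concordance, which puts the reported values of 0.69 and 0.74 in context.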

  20. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  1. Effect of Pt Doping on Nucleation and Crystallization in Li2O.2SiO2 Glass: Experimental Measurements and Computer Modeling

    NASA Technical Reports Server (NTRS)

    Narayan, K. Lakshmi; Kelton, K. F.; Ray, C. S.

    1996-01-01

    Heterogeneous nucleation and its effects on the crystallization of lithium disilicate glass containing small amounts of Pt are investigated. Measurements of the nucleation frequencies and induction times with and without Pt are shown to be consistent with predictions based on the classical nucleation theory. A realistic computer model for the transformation is presented. Computed differential thermal analysis data (such as crystallization rates as a function of time and temperature) are shown to be in good agreement with experimental results. This modeling provides a new, more quantitative method for analyzing calorimetric data.

  2. Agent-Based Computational Modeling to Examine How Individual Cell Morphology Affects Dosimetry

    EPA Science Inventory

    Cell-based models utilizing high-content screening (HCS) data have applications for predictive toxicology. Evaluating concentration-dependent effects on cell fate and state response is a fundamental utilization of HCS data.Although HCS assays may capture quantitative readouts at ...

  3. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled application of remote sensing quantitative retrieval. Workflow averts low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.

  4. Quantitative self-assembly prediction yields targeted nanomedicines

    NASA Astrophysics Data System (ADS)

    Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.

    2018-02-01

    Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.

  5. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both demand, using a probability distribution for each type of service request, and enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
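
    A heavily reduced sketch of the demand-plus-resource-constraint idea: a single shared server with exponentially distributed arrivals and service times, simulated request by request via Lindley's recursion. All rates are assumed illustrative values, not the study's workload model.

```python
import random

def mean_response_time(n_requests, arrival_rate, service_rate, seed=0):
    """Single-server FIFO queue: each request waits until the server is
    free, then holds it for its service time.  Returns the mean response
    (waiting + service) time over all simulated requests."""
    rng = random.Random(seed)
    clock = 0.0        # arrival time of the current request
    server_free = 0.0  # time at which the server next becomes idle
    total = 0.0
    for _ in range(n_requests):
        clock += rng.expovariate(arrival_rate)   # next arrival
        start = max(clock, server_free)          # queue if the server is busy
        server_free = start + rng.expovariate(service_rate)
        total += server_free - clock
    return total / n_requests
```

    Raising the arrival rate toward the service rate drives up queueing delay sharply, which is the kind of contention effect a static analysis of a virtualized environment cannot capture.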

  6. Comprehensive computational model for combining fluid hydrodynamics, light transport and biomass growth in a Taylor vortex algal photobioreactor: Lagrangian approach.

    PubMed

    Gao, Xi; Kong, Bo; Vigil, R Dennis

    2017-01-01

    A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter is varied (inner cylinder rotation speed). However, biomass growth curves are consistently over-predicted and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses

    PubMed Central

    Zenker, Sven; Rubin, Jonathan; Clermont, Gilles

    2007-01-01

    The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). 
We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses. We outline possible steps toward translating this computational approach to the bedside, to supplement today's evidence-based medicine with a quantitatively founded model-based medicine that integrates mechanistic knowledge with patient-specific information. PMID:17997590
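
    The multimodality described above arises whenever the forward model is non-injective. A toy grid-based version, with an assumed forward model y = a² standing in for the cardiovascular model: two distinct parameter values explain the same observation equally well, so the posterior has two peaks even under a uniform prior.

```python
import numpy as np

def posterior_grid(y_obs, sigma, lo, hi, n=2001):
    """Grid-based Bayesian posterior for parameter a under the toy forward
    model y = a**2 with Gaussian measurement noise and a uniform prior on
    [lo, hi].  Because the model is non-injective, the posterior is bimodal."""
    a = np.linspace(lo, hi, n)
    log_like = -0.5 * ((y_obs - a**2) / sigma) ** 2
    post = np.exp(log_like - log_like.max())   # subtract max for stability
    return a, post / post.sum()                # normalize on the grid
```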

  8. Two-dimensional heat flow apparatus

    NASA Astrophysics Data System (ADS)

    McDougall, Patrick; Ayars, Eric

    2014-06-01

    We have created an apparatus to quantitatively measure two-dimensional heat flow in a metal plate using a grid of temperature sensors read by a microcontroller. Real-time temperature data are collected from the microcontroller by a computer for comparison with a computational model of the heat equation. The microcontroller-based sensor array allows previously unavailable levels of precision at very low cost, and the combination of measurement and modeling makes for an excellent apparatus for the advanced undergraduate laboratory course.
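
    A computational model of the heat equation of the kind used for comparison can be sketched with an explicit finite-difference update. The grid size, diffusivity, and time step below are assumed values, chosen to satisfy the 2D stability bound α·Δt/Δx² ≤ 1/4.

```python
import numpy as np

def step_heat(T, alpha, dx, dt):
    """One explicit finite-difference step of the 2D heat equation.
    Boundary values are held fixed (Dirichlet conditions)."""
    lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
           - 4.0 * T[1:-1, 1:-1]) / dx**2
    T_new = T.copy()
    T_new[1:-1, 1:-1] += alpha * dt * lap
    return T_new

# Example: a plate with one edge held at 100 relaxes toward steady state
T = np.zeros((20, 20))
T[0, :] = 100.0
for _ in range(500):
    T = step_heat(T, alpha=1e-4, dx=0.01, dt=0.2)  # alpha*dt/dx^2 = 0.2
```

    Comparing such a simulated temperature field against the sensor-grid readings is exactly the measurement-versus-model exercise the apparatus is designed for.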

  9. Computational understanding of Li-ion batteries

    NASA Astrophysics Data System (ADS)

    Urban, Alexander; Seo, Dong-Hwa; Ceder, Gerbrand

    2016-03-01

    Over the last two decades, computational methods have made tremendous advances, and today many key properties of lithium-ion batteries can be accurately predicted by first principles calculations. For this reason, computations have become a cornerstone of battery-related research by providing insight into fundamental processes that are not otherwise accessible, such as ionic diffusion mechanisms and electronic structure effects, as well as a quantitative comparison with experimental results. The aim of this review is to provide an overview of state-of-the-art ab initio approaches for the modelling of battery materials. We consider techniques for the computation of equilibrium cell voltages, 0-Kelvin and finite-temperature voltage profiles, ionic mobility and thermal and electrolyte stability. The strengths and weaknesses of different electronic structure methods, such as DFT+U and hybrid functionals, are discussed in the context of voltage and phase diagram predictions, and we review the merits of lattice models for the evaluation of finite-temperature thermodynamics and kinetics. With such a complete set of methods at hand, first principles calculations of ordered, crystalline solids, i.e., of most electrode materials and solid electrolytes, have become reliable and quantitative. However, the description of molecular materials and disordered or amorphous phases remains an important challenge. We highlight recent exciting progress in this area, especially regarding the modelling of organic electrolytes and solid-electrolyte interfaces.
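
    The equilibrium cell voltage computation mentioned above follows from total energies of the lithiated and delithiated phases and of Li metal: V = -[E(Li_nHost) - E(Host) - n·E(Li)]/n volts per transferred electron (with energies in eV). The energies in the test are made-up numbers for illustration, not DFT results.

```python
def average_voltage(e_lithiated, e_delithiated, n_li, e_li_metal):
    """Average intercalation voltage (V) from total energies in eV:

        V = -[E(Li_n Host) - E(Host) - n * E(Li)] / n

    assuming one electron transferred per Li."""
    return -(e_lithiated - e_delithiated - n_li * e_li_metal) / n_li
```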

  10. Consumer Security Perceptions and the Perceived Influence on Adopting Cloud Computing: A Quantitative Study Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Paquet, Katherine G.

    2013-01-01

    Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…

  11. Stimuli Influencing Small Business Owner Adoption of a Software-as-a-Service Solution: A Quantitative Study

    ERIC Educational Resources Information Center

    Cianciotta, Michael A.

    2016-01-01

    Cloud computing has moved beyond the early adoption phase and recent trends demonstrate encouraging adoption rates. This utility-based computing model offers significant IT flexibility and potential for cost savings for organizations of all sizes, but may be the most attractive to small businesses because of limited capital to fund required…

  12. The Adoption of Grid Computing Technology by Organizations: A Quantitative Study Using Technology Acceptance Model

    ERIC Educational Resources Information Center

    Udoh, Emmanuel E.

    2010-01-01

    Advances in grid technology have enabled some organizations to harness enormous computational power on demand. However, the prediction of widespread adoption of the grid technology has not materialized despite the obvious grid advantages. This situation has encouraged intense efforts to close the research gap in the grid adoption process. In this…

  13. A method for evaluating the murine pulmonary vasculature using micro-computed tomography.

    PubMed

    Phillips, Michael R; Moore, Scott M; Shah, Mansi; Lee, Clara; Lee, Yueh Z; Faber, James E; McLean, Sean E

    2017-01-01

    Significant mortality and morbidity are associated with alterations in the pulmonary vasculature. While techniques have been described for quantitative morphometry of whole-lung arterial trees in larger animals, no methods have been described in mice. We report a method for the quantitative assessment of murine pulmonary arterial vasculature using high-resolution computed tomography scanning. Mice were harvested at 2 weeks, 4 weeks, and 3 months of age. The pulmonary artery vascular tree was pressure perfused to maximal dilation with a radio-opaque casting material with viscosity and pressure set to prevent capillary transit and venous filling. The lungs were fixed and scanned on a specimen computed tomography scanner at 8-μm resolution, and the vessels were segmented. Vessels were grouped into categories based on lumen diameter and branch generation. Robust high-resolution segmentation was achieved, permitting detailed quantitation of pulmonary vascular morphometrics. As expected, postnatal lung development was associated with progressive increase in small-vessel number and arterial branching complexity. These methods for quantitative analysis of the pulmonary vasculature in postnatal and adult mice provide a useful tool for the evaluation of mouse models of disease that affect the pulmonary vasculature. Copyright © 2016 Elsevier Inc. All rights reserved.
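
    Grouping segmented vessels into lumen-diameter categories, as described above, is a simple binning step. The sketch below uses assumed bin edges, since the study's exact category boundaries are not given here.

```python
import numpy as np

def group_by_diameter(diameters_um, edges_um=(50, 100, 200)):
    """Count vessels per lumen-diameter category.

    edges_um are ASSUMED illustrative boundaries (in micrometers); the
    study's actual categories may differ.  Returns one count per bin,
    including the below-first-edge and above-last-edge bins."""
    bins = np.digitize(diameters_um, edges_um)
    return np.bincount(bins, minlength=len(edges_um) + 1)
```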

  14. Mapping Quantitative Traits in Unselected Families: Algorithms and Examples

    PubMed Central

    Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David

    2009-01-01

    Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016

  15. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-06-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach firstly triangulates the two object groups; and then it constructs the Voronoi diagram between the two groups using the triangular network; after this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed; finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
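
    For contrast with the Voronoi-based model described above, the crudest possible alternative reduces each group to its centroid and maps the bearing between centroids to one of eight qualitative directions. This is a deliberately simplified stand-in for illustration only, not the authors' method, which is precisely motivated by the inadequacy of single-point reductions.

```python
import math

def qualitative_direction(group_a, group_b):
    """Qualitative direction of group_b relative to group_a, using only
    the bearing between group centroids (a drastic simplification)."""
    ax = sum(p[0] for p in group_a) / len(group_a)
    ay = sum(p[1] for p in group_a) / len(group_a)
    bx = sum(p[0] for p in group_b) / len(group_b)
    by = sum(p[1] for p in group_b) / len(group_b)
    angle = math.degrees(math.atan2(by - ay, bx - ax)) % 360.0
    names = ["east", "northeast", "north", "northwest",
             "west", "southwest", "south", "southeast"]
    return names[int(((angle + 22.5) % 360.0) // 45.0)]
```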

  16. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups, and then constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and that the results are correct from the point of view of spatial cognition.
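    The pipeline described above (triangulate, find the boundary between the groups, average the edge normals, discretize into a qualitative direction) can be loosely sketched as follows. This is a simplified stand-in, not the authors' algorithm: nearest-neighbour pairs substitute for Voronoi-edge adjacency, and the function and label names are invented for the example.

```python
import numpy as np

LABELS = ["east", "northeast", "north", "northwest",
          "west", "southwest", "south", "southeast"]

def direction_between_groups(group_a, group_b):
    """Qualitative direction of group_b as seen from group_a."""
    a = np.asarray(group_a, float)
    b = np.asarray(group_b, float)
    # For every point of group_b, find its nearest point in group_a;
    # these cross-group pairs roughly play the role of Voronoi edges
    d = np.linalg.norm(b[:, None, :] - a[None, :, :], axis=2)
    nearest = a[d.argmin(axis=1)]
    # Average the cross-boundary direction vectors (the "normals")
    v = (b - nearest).mean(axis=0)
    # Quantitative angle -> qualitative 8-sector direction
    angle = np.degrees(np.arctan2(v[1], v[0])) % 360
    return LABELS[int(((angle + 22.5) % 360) // 45)]
```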

  17. Computer-Aided Drug Design in Epigenetics

    NASA Astrophysics Data System (ADS)

    Lu, Wenchao; Zhang, Rukang; Jiang, Hao; Zhang, Huimin; Luo, Cheng

    2018-03-01

    Epigenetic dysfunction has been widely implicated in several diseases, especially cancers, which highlights the therapeutic potential of chemical interventions in this field. With the rapid development of computational methodologies and high-performance computing resources, computer-aided drug design has emerged as a promising strategy to speed up epigenetic drug discovery. Herein, we give a brief overview of the major computational methods reported in the literature, including druggability prediction, virtual screening, homology modeling, scaffold hopping, pharmacophore modeling, molecular dynamics simulations, quantum chemistry calculations and 3D quantitative structure-activity relationship analysis, that have been successfully applied in the design and discovery of epi-drugs and epi-probes. Finally, we discuss the major limitations of current virtual drug design strategies in epigenetic drug discovery and future directions in this field.

  18. Computer-Aided Drug Design in Epigenetics

    PubMed Central

    Lu, Wenchao; Zhang, Rukang; Jiang, Hao; Zhang, Huimin; Luo, Cheng

    2018-01-01

    Epigenetic dysfunction has been widely implicated in several diseases, especially cancers, which highlights the therapeutic potential of chemical interventions in this field. With the rapid development of computational methodologies and high-performance computing resources, computer-aided drug design has emerged as a promising strategy to speed up epigenetic drug discovery. Herein, we give a brief overview of the major computational methods reported in the literature, including druggability prediction, virtual screening, homology modeling, scaffold hopping, pharmacophore modeling, molecular dynamics simulations, quantum chemistry calculations, and 3D quantitative structure-activity relationship analysis, that have been successfully applied in the design and discovery of epi-drugs and epi-probes. Finally, we discuss the major limitations of current virtual drug design strategies in epigenetic drug discovery and future directions in this field. PMID:29594101

  19. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer, but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures with models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to the prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.
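    The per-SD effect sizes reported above follow from exponentiating a log-relative-risk coefficient scaled by one standard deviation. A minimal sketch of that conversion (the function name is ours; the relative risks 1.59 and 0.61 are simply back-derived from the reported 59% increase and 39% decline):

```python
import math

def percent_risk_change_per_sd(beta_per_unit, sd):
    """Percent change in risk per one-SD increase of a predictor,
    given a log-relative-risk (e.g. Cox or logistic) coefficient."""
    return (math.exp(beta_per_unit * sd) - 1.0) * 100.0
```

    For example, a coefficient of ln(1.59) per SD of dense-tissue area corresponds to a 59% increase, and ln(0.61) per SD of lacunarity to a 39% decline.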

  20. An accurate computational method for an order parameter with a Markov state model constructed using a manifold-learning technique

    NASA Astrophysics Data System (ADS)

    Ito, Reika; Yoshidome, Takashi

    2018-01-01

    Markov state models (MSMs) are a powerful approach for analyzing the long-time behavior of protein motion using molecular dynamics simulation data. However, their quantitative performance with respect to physical quantities is poor. We believe that this poor performance is caused by a failure to classify protein conformations into states appropriately when constructing the MSM. Herein, we show that the quantitative performance of an order parameter improves when a manifold-learning technique is employed for this classification. By contrast, an MSM constructed using the K-center method, which has previously been used for the classification, shows poor quantitative performance.
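    Whatever method is used to classify conformations into states, the downstream MSM machinery is the same: count transitions at a lag time, row-normalize into a transition matrix, and read equilibrium averages of an order parameter off the stationary distribution. A generic sketch (not the authors' code; the state assignment is assumed done elsewhere, e.g. by a manifold-learning step):

```python
import numpy as np

def msm_transition_matrix(state_traj, n_states, lag=1):
    """Estimate a transition matrix from a discrete state trajectory."""
    C = np.zeros((n_states, n_states))
    for i, j in zip(state_traj[:-lag], state_traj[lag:]):
        C[i, j] += 1.0          # transition counts at the chosen lag
    # Row-normalize (assumes every state is visited at least once)
    return C / C.sum(axis=1, keepdims=True)

def stationary_average(T, observable):
    """Equilibrium expectation of an order parameter under the MSM."""
    # Stationary distribution = left eigenvector of T for eigenvalue 1
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()
    return float(pi @ observable)
```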

  1. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries, provides a more precise estimate of the amount of ischemic myocardium, and distinguishes scar from hypoperfused tissue. Because of the large amount of data involved, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusion; comparison of different scintigrams for the same patient. Our software is made up of 4 sections: numeric analysis, descriptive analysis, automatic conclusion, clinical remarks. We introduced into the computer system appropriate information, "logical paths", that use the "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), considering our uptake cutoff and finally obtaining the automatic conclusions. For these reasons, our computer-based system could be considered a real "expert system".
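    The "IF ... THEN" logic described above can be illustrated with a toy rule for a single myocardial region. The 50% uptake cutoff and the labels below are hypothetical placeholders, not the cutoffs used by the authors' system:

```python
def classify_region(stress, redistribution, reinjection, cutoff=50.0):
    """Toy IF ... THEN rules for one region (uptake in % of maximum)."""
    if stress >= cutoff:
        return "normal"                         # adequate uptake under stress
    if redistribution >= cutoff or reinjection >= cutoff:
        return "ischemia (reversible defect)"   # defect fills in later
    return "scar (fixed defect)"                # defect persists in all phases
```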

  2. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation includes evaluation of its effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was previously proposed to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation. This allows a more patient-oriented approach. Additionally, usability was improved by processing in situ on a smartphone instead of a computer. The system has proved reliable, with an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician in decision-making when setting stimulation parameters.

  3. Bringing computational models of bone regeneration to the clinic.

    PubMed

    Carlier, Aurélie; Geris, Liesbet; Lammens, Johan; Van Oosterwyck, Hans

    2015-01-01

    Although the field of bone regeneration has experienced great advancements in the last decades, integrating all the relevant, patient-specific information into a personalized diagnosis and optimal treatment remains a challenging task due to the large number of variables that affect bone regeneration. Computational models have the potential to cope with this complexity, to improve the fundamental understanding of bone regeneration processes, and to predict and optimize patient-specific treatment strategies. However, the current use of computational models in daily orthopedic practice is very limited or nonexistent. We have identified three key hurdles that limit the translation of computational models of bone regeneration from bench to bedside. First, there is a clear mismatch between the scope of the existing models and that of the clinically required ones. Second, most computational models are confronted with limited quantitative information of insufficient quality, thereby hampering the determination of patient-specific parameter values. Third, current computational models are only corroborated with animal models, whereas a thorough (retrospective and prospective) assessment of a computational model will be crucial to convince health care providers of its capabilities. These challenges must be addressed so that computational models of bone regeneration can reach their true potential, resulting in the advancement of individualized care and a reduction of the associated health care costs. © 2015 Wiley Periodicals, Inc.

  4. Predictive and mechanistic multivariate linear regression models for reaction development

    PubMed Central

    Santiago, Celine B.; Guo, Jing-Yao

    2018-01-01

    Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711

  5. A System Computational Model of Implicit Emotional Learning

    PubMed Central

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available in the literature, a computational model of implicit emotional learning based both on prediction-error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance to extinction of traumatic emotional responses, and (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotion modulation. PMID:27378898

  6. A System Computational Model of Implicit Emotional Learning.

    PubMed

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available in the literature, a computational model of implicit emotional learning based both on prediction-error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance to extinction of traumatic emotional responses, and (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotion modulation.

  7. A toolbox for discrete modelling of cell signalling dynamics.

    PubMed

    Paterson, Yasmin Z; Shorthouse, David; Pleijzier, Markus W; Piterman, Nir; Bendtsen, Claus; Hall, Benjamin A; Fisher, Jasmin

    2018-06-18

    In an age where the volume of data regarding biological systems exceeds our ability to analyse it, many researchers are looking towards systems biology and computational modelling to help unravel the complexities of gene and protein regulatory networks. In particular, the use of discrete modelling allows generation of signalling networks in the absence of the full quantitative descriptions of systems that are necessary for ordinary differential equation (ODE) models. In order to make such techniques more accessible to mainstream researchers, tools such as the BioModelAnalyzer (BMA) have been developed to provide a user-friendly graphical interface for discrete modelling of biological systems. Here we use the BMA to build a library of discrete target functions of known canonical molecular interactions, translated from ODEs. We then show that these BMA target functions can be used to reconstruct complex networks, which can correctly predict many known genetic perturbations. This new library supports the accessibility ethos behind the creation of BMA, providing a toolbox for the construction of complex cell signalling models without the need for extensive experience in computer programming or mathematical modelling, and allows for the construction and simulation of complex biological systems with only small amounts of quantitative data.
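    The flavor of such discrete target functions can be sketched as follows. This is a simplified sketch of a BMA-style default update, assuming the common convention of averaging activator and inhibitor inputs and clamping to the variable's granularity; actual BMA target functions can be arbitrary user-defined expressions, and the function names here are ours.

```python
def default_target(activators, inhibitors, n_max):
    """Default-style target: average activation minus average
    inhibition, clamped to the variable's range [0, n_max]."""
    act = sum(activators) / len(activators) if activators else 0
    inh = sum(inhibitors) / len(inhibitors) if inhibitors else 0
    return max(0, min(n_max, round(act - inh)))

def step(value, activators, inhibitors, n_max):
    # Variables move one granularity level per tick towards the target
    t = default_target(activators, inhibitors, n_max)
    return value + (t > value) - (t < value)
```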

  8. From in silica to in silico: retention thermodynamics at solid-liquid interfaces.

    PubMed

    El Hage, Krystel; Bemish, Raymond J; Meuwly, Markus

    2018-06-28

    The dynamics of solvated molecules at the solid/liquid interface is essential for a molecular-level understanding of solution thermodynamics in reversed-phase liquid chromatography (RPLC). The heterogeneous nature of the systems and the competing intermolecular interactions make solute retention in RPLC a surprisingly challenging problem, one which benefits greatly from modelling at atomistic resolution. However, the underlying computational model needs to be sufficiently accurate to provide a realistic description of the energetics and dynamics of the systems, especially for solution-phase simulations. Here, the retention thermodynamics and retention mechanism of a range of benzene derivatives in C18 stationary-phase chains in contact with water/methanol mixtures are studied using point charge (PC) and multipole (MTP) electrostatic models. The results demonstrate that free energy simulations with a faithful MTP representation of the computational model provide quantitative, molecular-level insight into the thermodynamics of adsorption/desorption in chromatographic systems, whereas a conventional PC representation fails to do so. This provides a rational basis for developing more quantitative, validated models for the optimization of separation systems.

  9. Dynamic Redox Regulation of IL-4 Signaling.

    PubMed

    Dwivedi, Gaurav; Gran, Margaret A; Bagchi, Pritha; Kemp, Melissa L

    2015-11-01

    Quantifying the magnitude and dynamics of protein oxidation during cell signaling is technically challenging. Computational modeling provides tractable, quantitative methods to test hypotheses of redox mechanisms that may be simultaneously operative during signal transduction. The interleukin-4 (IL-4) pathway, which has previously been reported to induce reactive oxygen species and oxidation of PTP1B, may be controlled by several other putative mechanisms of redox regulation; widespread proteomic thiol oxidation observed via 2D redox differential gel electrophoresis upon IL-4 treatment suggests more than one redox-sensitive protein implicated in this pathway. Through computational modeling and a model selection strategy that relied on characteristic STAT6 phosphorylation dynamics of IL-4 signaling, we identified reversible protein tyrosine phosphatase (PTP) oxidation as the primary redox regulatory mechanism in the pathway. A systems-level model of IL-4 signaling was developed that integrates synchronous pan-PTP oxidation with ROS-independent mechanisms. The model quantitatively predicts the dynamics of IL-4 signaling over a broad range of new redox conditions, offers novel hypotheses about regulation of JAK/STAT signaling, and provides a framework for interrogating putative mechanisms involving receptor-initiated oxidation.

  10. Dynamic Redox Regulation of IL-4 Signaling

    PubMed Central

    Dwivedi, Gaurav; Gran, Margaret A.; Bagchi, Pritha; Kemp, Melissa L.

    2015-01-01

    Quantifying the magnitude and dynamics of protein oxidation during cell signaling is technically challenging. Computational modeling provides tractable, quantitative methods to test hypotheses of redox mechanisms that may be simultaneously operative during signal transduction. The interleukin-4 (IL-4) pathway, which has previously been reported to induce reactive oxygen species and oxidation of PTP1B, may be controlled by several other putative mechanisms of redox regulation; widespread proteomic thiol oxidation observed via 2D redox differential gel electrophoresis upon IL-4 treatment suggests more than one redox-sensitive protein implicated in this pathway. Through computational modeling and a model selection strategy that relied on characteristic STAT6 phosphorylation dynamics of IL-4 signaling, we identified reversible protein tyrosine phosphatase (PTP) oxidation as the primary redox regulatory mechanism in the pathway. A systems-level model of IL-4 signaling was developed that integrates synchronous pan-PTP oxidation with ROS-independent mechanisms. The model quantitatively predicts the dynamics of IL-4 signaling over a broad range of new redox conditions, offers novel hypotheses about regulation of JAK/STAT signaling, and provides a framework for interrogating putative mechanisms involving receptor-initiated oxidation. PMID:26562652

  11. Simulation of metastatic progression using a computer model including chemotherapy and radiation therapy.

    PubMed

    Bethge, Anja; Schumacher, Udo; Wedemann, Gero

    2015-10-01

    Despite considerable research efforts, the process of metastasis formation is still a subject of intense discussion, and even established models differ considerably in basic details and in the conclusions drawn from them. Mathematical and computational models add a new perspective to the research, as they can quantitatively investigate the processes of metastasis and the effects of treatment. However, existing models look at only one treatment option at a time. We enhanced a previously developed computer model (called CaTSiT), which enables quantitative comparison of different metastasis formation models with clinical and experimental data, to include the effects of chemotherapy, external beam radiation, radioimmunotherapy and radioembolization. CaTSiT is based on a discrete event simulation procedure. The growth of the primary tumor and its metastases is modeled by a piecewise-defined growth function that describes the growth behavior of the primary tumor and metastases during various time intervals. The piecewise-defined growth function is composed of analytical functions describing the growth behavior of the tumor based on characteristics of the tumor, such as dormancy, or the effects of various therapies. The spreading of malignant cells into the blood is modeled by intravasation events, which are generated according to a rate function. Further events in the model describe the behavior of the released malignant cells until the formation of a new metastasis. The model is published under the GNU General Public License version 3. To demonstrate the application of the computer model, a case of a patient with a hepatocellular carcinoma and multiple metastases in the liver was simulated. Besides the untreated case, different treatments were simulated at two time points: one directly after diagnosis of the primary tumor and the other several months later. Except for early-applied radioimmunotherapy, no treatment strategy was able to eliminate all metastases. These results emphasize the importance of early diagnosis and of proceeding with treatment even if no clinically detectable metastases are present at the time of diagnosis of the primary tumor. CaTSiT could be a valuable tool for quantitative investigation of the process of tumor growth and metastasis formation, including the effects of various treatment options. Copyright © 2015 Elsevier Inc. All rights reserved.
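    The core of such a discrete-event simulation (a growth law for the primary tumor plus stochastic intravasation events that found metastases) can be sketched in a few lines. This is a toy illustration, not CaTSiT itself: the Gompertz parameters and the intravasation rate are arbitrary placeholder values.

```python
import numpy as np

def gompertz(t, v0=1e-9, vmax=100.0, a=0.006):
    """Gompertzian primary-tumor volume (cm^3) at time t (days)."""
    return vmax * np.exp(np.log(v0 / vmax) * np.exp(-a * t))

def simulate_metastases(days=1000, rate_per_cm3=5e-4, seed=1):
    """Intravasation as a Poisson process whose daily rate scales
    with the current tumor volume; each event founds one metastasis."""
    rng = np.random.default_rng(seed)
    events = []
    for t in range(days):
        for _ in range(rng.poisson(rate_per_cm3 * gompertz(t))):
            events.append(t)    # day on which a metastasis is seeded
    return events
```

    A treatment could be represented, as in the abstract, by switching to a different growth function over the treated interval.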

  12. Neural and Computational Mechanisms of Action Processing: Interaction between Visual and Motor Representations.

    PubMed

    Giese, Martin A; Rizzolatti, Giacomo

    2015-10-07

    Action recognition has received enormous interest in the field of neuroscience over the last two decades. In spite of this interest, knowledge of the fundamental neural mechanisms that provide constraints for the underlying computations remains rather limited. This fact stands in contrast with the wide variety of speculative theories about how action recognition might work. This review focuses on new fundamental electrophysiological results in monkeys, which provide constraints for the detailed underlying computations. In addition, we review models for action recognition and processing that have concrete mathematical implementations, as opposed to purely conceptual models. We think that only such implemented models can be meaningfully linked quantitatively to physiological data, and that they have the potential to narrow down the many possible computational explanations for action recognition. In addition, only concrete implementations allow one to judge whether postulated computational concepts have a feasible implementation in terms of realistic neural circuits. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Computational Phenotyping in Psychiatry: A Worked Example

    PubMed Central

    2016-01-01

    Computational psychiatry is a rapidly emerging field that uses model-based quantities to infer the behavioral and neuronal abnormalities that underlie psychopathology. If successful, this approach promises key insights into (pathological) brain function as well as a more mechanistic and quantitative approach to psychiatric nosology—structuring therapeutic interventions and predicting response and relapse. The basic procedure in computational psychiatry is to build a computational model that formalizes a behavioral or neuronal process. Measured behavioral (or neuronal) responses are then used to infer the model parameters of a single subject or a group of subjects. Here, we provide an illustrative overview of this process, starting from the modeling of choice behavior in a specific task, simulating data, and then inverting that model to estimate group effects. Finally, we illustrate cross-validation to assess whether between-subject variables (e.g., diagnosis) can be recovered successfully. Our worked example uses a simple two-step maze task and a model of choice behavior based on (active) inference and Markov decision processes. The procedural steps and routines we illustrate are not restricted to a specific field of research or particular computational model but can, in principle, be applied in many domains of computational psychiatry. PMID:27517087

  14. Computational Phenotyping in Psychiatry: A Worked Example.

    PubMed

    Schwartenbeck, Philipp; Friston, Karl

    2016-01-01

    Computational psychiatry is a rapidly emerging field that uses model-based quantities to infer the behavioral and neuronal abnormalities that underlie psychopathology. If successful, this approach promises key insights into (pathological) brain function as well as a more mechanistic and quantitative approach to psychiatric nosology-structuring therapeutic interventions and predicting response and relapse. The basic procedure in computational psychiatry is to build a computational model that formalizes a behavioral or neuronal process. Measured behavioral (or neuronal) responses are then used to infer the model parameters of a single subject or a group of subjects. Here, we provide an illustrative overview of this process, starting from the modeling of choice behavior in a specific task, simulating data, and then inverting that model to estimate group effects. Finally, we illustrate cross-validation to assess whether between-subject variables (e.g., diagnosis) can be recovered successfully. Our worked example uses a simple two-step maze task and a model of choice behavior based on (active) inference and Markov decision processes. The procedural steps and routines we illustrate are not restricted to a specific field of research or particular computational model but can, in principle, be applied in many domains of computational psychiatry.
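    The basic loop described above (simulate behavior from a generative model of choice, then invert the model to recover subject-level parameters) can be miniaturized as below. This is a deliberately simple stand-in for the active-inference/MDP model of the records above: a softmax choice rule with a single inverse-temperature parameter, fitted by grid-search maximum likelihood; all names are ours.

```python
import numpy as np

def softmax(q, beta):
    e = np.exp(beta * (q - q.max()))     # shift for numerical stability
    return e / e.sum()

def simulate_choices(q, beta, n, rng):
    """Generate n choices among options with values q."""
    return rng.choice(len(q), size=n, p=softmax(q, beta))

def fit_beta(q, choices, grid=np.linspace(0.1, 10.0, 200)):
    """Model inversion: maximum-likelihood inverse temperature."""
    ll = [np.log(softmax(q, b)[choices]).sum() for b in grid]
    return float(grid[int(np.argmax(ll))])
```

    Recovered parameters for each subject can then be compared across groups, or cross-validated against between-subject labels such as diagnosis.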

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus) to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  16. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology

    PubMed Central

    Zhang, Wen; Cao, Jieer

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in a well-validated multi-rigid-body (MB) model to compute the value of a weighted injury criterion to quantitatively assess drivers’ overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law at high impact velocities is quite distinct compared with that at low impact velocities. In addition, the coupling effect is elucidated by the fact that the difference in WIC value among the three impact velocities tends to be distinctly higher under smaller impact angles than under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC to different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles. PMID:29240789

  17. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology.

    PubMed

    Zhang, Wen; Cao, Jieer; Xu, Jun

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in a well-validated multi-rigid-body (MB) model to compute the value of a weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law at high impact velocities is quite distinct compared with that at low impact velocities. In addition, the coupling effect is elucidated by the fact that the difference in WIC value among the three impact velocities tends to be distinctly higher under smaller impact angles than under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC to different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles.
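    A weighted injury criterion of the kind used above combines body-region injury measures, normalized by reference limits, into a single scalar. The specific measures, reference values and weights below are illustrative placeholders, not the WIC definition used by the authors:

```python
def weighted_injury_criterion(hic, chest_defl_mm, femur_force_kn,
                              weights=(0.6, 0.3, 0.1)):
    """Weighted sum of normalized injury measures (toy values)."""
    ratios = (hic / 1000.0,            # head injury criterion vs. limit
              chest_defl_mm / 50.0,    # chest deflection vs. limit
              femur_force_kn / 10.0)   # femur axial force vs. limit
    return sum(w * r for w, r in zip(weights, ratios))
```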

  18. Wires in the soup: quantitative models of cell signaling

    PubMed Central

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, and unraveling them requires sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655
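
    The gradual model building described in this review typically starts from a coarse ordinary-differential-equation module. As a purely illustrative sketch (the two-variable reduction and all parameter values below are our assumptions, not the published NF-κB model), a negative-feedback loop of this kind can be integrated with a simple Euler scheme:

```python
# Toy negative-feedback module in the spirit of NF-kB/IkB models:
# nuclear activity N induces its inhibitor I, which in turn shuts N off.
# All parameters are illustrative, not fitted values from the literature.
a, b, c, d = 1.0, 0.5, 0.5, 0.25
N, I = 0.0, 0.0
dt, t_end = 0.01, 100.0

t = 0.0
while t < t_end:
    dN = a / (1.0 + I) - b * N   # activation, saturable by the inhibitor
    dI = c * N - d * I           # inhibitor induced by N, degraded at rate d
    N += dt * dN
    I += dt * dI
    t += dt

print(round(N, 3), round(I, 3))  # damped oscillations settle to steady state
```

    At the steady state the two fluxes balance, so I* = 2 N* for this parameter choice; iterative refinement against data would then replace these guessed rate constants.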

  19. Dynamic inundation mapping of Hurricane Harvey flooding in the Houston metro area using hyper-resolution modeling and quantitative image reanalysis

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Lee, J. H.; Lee, S.; Zhang, Y.; Seo, D. J.

    2017-12-01

    Hurricane Harvey was one of the most extreme weather events in Texas history and left significant damage in the Houston and adjoining coastal areas. To better understand the relative impacts on urban flooding of the extreme amount and spatial extent of rainfall, the unique geography, land use, and storm surge, high-resolution water modeling is necessary so that natural and man-made components are fully resolved. In this presentation, we reconstruct the spatiotemporal evolution of inundation during Hurricane Harvey using hyper-resolution modeling and quantitative image reanalysis. The two-dimensional urban flood model used is based on the dynamic wave approximation and 10 m-resolution terrain data, and is forced by radar-based multisensor quantitative precipitation estimates. The model domain includes Buffalo, Brays, Greens and White Oak Bayous in Houston. The model is simulated using hybrid parallel computing. To evaluate the dynamic inundation mapping, we combine various qualitative crowdsourced images and video footage with LiDAR-based terrain data.

  20. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
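
    The scenario quantification performed over event sequence diagrams reduces, at its core, to multiplying branch probabilities along each path of an event tree and summing the end states of interest. A minimal sketch, with purely hypothetical pivotal events and probabilities (not QRAS data structures or values):

```python
from itertools import product as cartesian

# Hypothetical two-event tree: each pivotal event either succeeds or fails.
# The branch probabilities are illustrative only.
events = {"ignition_contained": 0.99, "backup_power": 0.95}

# Enumerate every path through the tree and multiply branch probabilities.
scenarios = {}
for outcome in cartesian([True, False], repeat=len(events)):
    p = 1.0
    for (name, p_success), ok in zip(events.items(), outcome):
        p *= p_success if ok else (1.0 - p_success)
    scenarios[outcome] = p

# Here any pivotal-event failure is counted as a mission-loss end state.
p_loss = sum(p for o, p in scenarios.items() if not all(o))
print(round(p_loss, 6))
```

    Ranking particular risks, as QRAS does against its stored baseline, then amounts to sorting the end-state probabilities or the sensitivity of `p_loss` to each branch.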

  1. Modeling of power transmission and stress grading for corona protection

    NASA Astrophysics Data System (ADS)

    Zohdi, T. I.; Abali, B. E.

    2017-11-01

    Electrical high voltage (HV) machines are prone to corona discharges, leading to power losses as well as damage to the insulating layer. Many different techniques are applied for corona protection, and computational methods aid in selecting the best design. In this paper we develop a reduced-order 1D model estimating the electric field and temperature distribution of a conductor wrapped with different layers, as is usual for HV machines. Since many assumptions and simplifications are undertaken for this 1D model, we compare its results quantitatively to a direct numerical simulation in 3D. Both models are transient and nonlinear, offering the possibility to quickly estimate in 1D or to fully compute in 3D at a correspondingly higher computational cost. Such tools enable understanding, evaluation, and optimization of corona shielding systems for multilayered coils.
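
    A reduced-order 1D model of this kind treats each insulating layer of the wrapped conductor as a lumped resistance. As a sketch of the thermal part only, with invented geometry and material values (the paper's actual model is transient and nonlinear, this one is steady-state), radial conduction through cylindrical layers gives the temperature rise from the outer surface back to the conductor:

```python
import math

# Hypothetical cylindrical conductor wrapped with two insulation layers.
# Steady 1D radial conduction: each layer is a thermal resistance
# R = ln(r_out/r_in) / (2*pi*k*L); resistances add in series.
layers = [  # (r_in [m], r_out [m], k [W/(m K)]) -- illustrative values
    (0.010, 0.012, 0.25),   # main insulation
    (0.012, 0.013, 0.50),   # stress-grading / protective tape
]
L = 1.0           # axial length [m]
q = 15.0          # heat generated in the conductor [W]
T_surface = 40.0  # outer surface temperature [degC]

T = T_surface
profile = [(layers[-1][1], T)]          # (radius, temperature) pairs
for r_in, r_out, k in reversed(layers):
    R = math.log(r_out / r_in) / (2 * math.pi * k * L)
    T += q * R                           # temperature rises toward the conductor
    profile.append((r_in, T))

T_conductor = T
print(round(T_conductor, 3))
```

    The 3D simulation the paper compares against would resolve the same layers as meshed solids; the lumped version above is what makes the quick 1D estimate cheap.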

  2. Bayesian evidence computation for model selection in non-linear geoacoustic inference problems.

    PubMed

    Dettmer, Jan; Dosso, Stan E; Osler, John C

    2010-12-01

    This paper applies a general Bayesian inference approach, based on Bayesian evidence computation, to geoacoustic inversion of interface-wave dispersion data. Quantitative model selection is carried out by computing the evidence (normalizing constants) for several model parameterizations using annealed importance sampling. The resulting posterior probability density estimate is compared to estimates obtained from Metropolis-Hastings sampling to ensure consistent results. The approach is applied to invert interface-wave dispersion data collected on the Scotian Shelf, off the east coast of Canada, for the sediment shear-wave velocity profile. Results are consistent with previous work on these data but extend the analysis to a rigorous approach including model selection and uncertainty analysis. The results are also consistent with core samples and seismic reflection measurements carried out in the area.
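
    Evidence computation by annealed importance sampling is easiest to see on a toy problem where the normalizing constant is known in closed form. The sketch below is our own minimal AIS, not the paper's geoacoustic implementation: it anneals from the prior to the (pseudo-)posterior through tempered distributions, applying one Metropolis move per temperature, and averages the accumulated importance weights to estimate the evidence Z.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_like(x):
    # Pseudo-likelihood chosen so the evidence is known analytically:
    # Z = integral of N(x; 0, 1) * exp(-x^2/2) dx = 1/sqrt(2)
    return -0.5 * x**2

def ais_evidence(n_chains=2000, n_temps=50, step=0.8):
    betas = np.linspace(0.0, 1.0, n_temps)
    x = rng.standard_normal(n_chains)        # draws from the N(0,1) prior
    log_w = np.zeros(n_chains)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        log_w += (b1 - b0) * log_like(x)     # incremental importance weight
        # one Metropolis move targeting prior(x) * likelihood(x)**b1
        prop = x + step * rng.standard_normal(n_chains)
        log_acc = (-0.5 * prop**2 + b1 * log_like(prop)) \
                  - (-0.5 * x**2 + b1 * log_like(x))
        accept = np.log(rng.random(n_chains)) < log_acc
        x = np.where(accept, prop, x)
    m = log_w.max()                          # log-mean-exp for stability
    return np.exp(m) * np.mean(np.exp(log_w - m))

Z_hat = ais_evidence()
print(Z_hat)  # analytic value is 1/sqrt(2) ~ 0.707
```

    Model selection then compares the evidence estimates across parameterizations, which automatically penalizes over-parameterized models.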

  3. Dynamic visual attention: motion direction versus motion magnitude

    NASA Astrophysics Data System (ADS)

    Bur, A.; Wurtz, P.; Müri, R. M.; Hügli, H.

    2008-02-01

    Defined as an attentive process in the context of visual sequences, dynamic visual attention refers to the selection of the most informative parts of a video sequence. This paper investigates the contribution of motion in dynamic visual attention, and specifically compares computer models designed with the motion component expressed either as the speed magnitude or as the speed vector. Several computer models, including static features (color, intensity and orientation) and motion features (magnitude and vector), are considered. Qualitative and quantitative evaluations are performed by comparing the computer model output with human saliency maps obtained experimentally from eye movement recordings. The model suitability is evaluated in various situations (synthetic and real sequences, acquired with fixed and moving camera perspective), showing the advantages and drawbacks of each method as well as its preferred domain of application.

  4. Radiotherapy and chemotherapy change vessel tree geometry and metastatic spread in a small cell lung cancer xenograft mouse tumor model

    PubMed Central

    Bethge, Anja; Schumacher, Udo

    2017-01-01

    Background Tumor vasculature is critical for tumor growth, the formation of distant metastases and the efficiency of radio- and chemotherapy treatments. However, how the vasculature itself is affected during cancer treatment with regard to metastatic behavior has not been thoroughly investigated. Therefore, the aim of this study was to analyze the influence of hypofractionated radiotherapy and cisplatin chemotherapy on vessel tree geometry and metastasis formation in a small cell lung cancer xenograft mouse tumor model, in order to investigate the spread of malignant cells under different treatment modalities. Methods The biological data gained during these experiments were fed into our previously developed computer model “Cancer and Treatment Simulation Tool” (CaTSiT) to model the growth of the primary tumor, its metastatic deposits and the influence of different therapies. Furthermore, we performed quantitative histology analyses to verify our predictions in the xenograft mouse tumor model. Results According to the computer simulation, the number of engrafting cells must vary considerably to explain the different weights of the primary tumor at the end of the experiment. Once a primary tumor is established, the fractal dimension of its vasculature correlates with the tumor size. Furthermore, the fractal dimension of the tumor vasculature changes during treatment, indicating that the therapy affects the blood vessels’ geometry. We corroborated these findings with a quantitative histological analysis showing that the blood vessel density is depleted during radiotherapy and cisplatin chemotherapy. The CaTSiT computer model reveals that chemotherapy influences the tumor’s therapeutic susceptibility and its metastatic spreading behavior. 
Conclusion Using a systems biology approach in combination with xenograft models and computer simulations revealed that chemotherapy and radiation therapy determine the spreading behavior by changing the blood vessel geometry of the primary tumor. PMID:29107953

  5. Analysis Tools for Interconnected Boolean Networks With Biological Applications.

    PubMed

    Chaves, Madalena; Tournier, Laurent

    2018-01-01

    Boolean networks with asynchronous updates are a class of logical models particularly well adapted to describing the dynamics of biological networks with uncertain measures. The state space of these models can be described by an asynchronous state transition graph, which represents all the possible exits from every single state and gives a global image of all the possible trajectories of the system. In addition, the asynchronous state transition graph can be associated with an absorbing Markov chain, further providing a semi-quantitative framework in which it becomes possible to compute probabilities for the different trajectories. For large networks, however, such direct analyses become computationally intractable, given the exponential size of the graph. Exploiting the general modularity of biological systems, we have introduced the novel concept of the asymptotic graph, computed as an interconnection of several asynchronous transition graphs and recovering all asymptotic behaviors of a large interconnected system from the behavior of its smaller modules. From a modeling point of view, the interconnection of networks is very useful to address, for instance, the interplay between known biological modules and to test different hypotheses on the nature of their mutual regulatory links. This paper develops two new features of this general methodology: a quantitative dimension is added to the asymptotic graph through the computation of relative probabilities for each final attractor, and a companion cross-graph is introduced to complement the method from a theoretical point of view.
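
    The absorbing-Markov-chain reading of an asynchronous state transition graph is easy to reproduce on a toy example. The sketch below uses a hypothetical two-gene mutual-repression module (not the paper's asymptotic-graph machinery): it builds the asynchronous graph, treats the enabled transitions of each state as equiprobable, and solves for the probability of reaching each fixed-point attractor.

```python
import numpy as np
from itertools import product

# Hypothetical 2-gene network: each gene represses the other.
rules = [lambda s: int(not s[1]),   # next value of gene 0
         lambda s: int(not s[0])]   # next value of gene 1

states = list(product([0, 1], repeat=2))
idx = {s: i for i, s in enumerate(states)}

def successors(s):
    """Asynchronous updates: exactly one gene changes per transition."""
    succ = []
    for g, rule in enumerate(rules):
        v = rule(s)
        if v != s[g]:
            t = list(s); t[g] = v
            succ.append(tuple(t))
    return succ

# Markov chain: enabled transitions are chosen uniformly at random.
n = len(states)
P = np.zeros((n, n))
for s in states:
    succ = successors(s)
    if not succ:
        P[idx[s], idx[s]] = 1.0          # fixed point = absorbing state
    else:
        for t in succ:
            P[idx[s], idx[t]] = 1.0 / len(succ)

absorbing = [s for s in states if not successors(s)]
transient = [s for s in states if successors(s)]

# Absorption probabilities B = (I - Q)^-1 R of the absorbing chain.
ti = [idx[s] for s in transient]
ai = [idx[s] for s in absorbing]
Q = P[np.ix_(ti, ti)]
R = P[np.ix_(ti, ai)]
B = np.linalg.solve(np.eye(len(ti)) - Q, R)

for row, s in zip(B, transient):
    print(s, dict(zip(absorbing, row.round(3))))
```

    For this symmetric toggle, each transient state reaches the two antagonistic fixed points with probability 1/2; the paper's contribution is recovering such probabilities for interconnected systems without building the full product graph.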

  6. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling of inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement of current testing methods by using the proposed model, including the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  7. dCLIP: a computational approach for comparative CLIP-seq analyses

    PubMed Central

    2014-01-01

    Although comparison of RNA-protein interaction profiles across different conditions has become increasingly important to understanding the function of RNA-binding proteins (RBPs), few computational approaches have been developed for quantitative comparison of CLIP-seq datasets. Here, we present an easy-to-use command line tool, dCLIP, for quantitative CLIP-seq comparative analysis. The two-stage method implemented in dCLIP, including a modified MA normalization method and a hidden Markov model, is shown to be able to effectively identify differential binding regions of RBPs in four CLIP-seq datasets, generated by HITS-CLIP, iCLIP and PAR-CLIP protocols. dCLIP is freely available at http://qbrc.swmed.edu/software/. PMID:24398258
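
    The first of dCLIP's two stages rests on the MA (log-ratio vs. mean-intensity) representation familiar from microarray analysis. As a generic illustration only (toy counts and a simple median-shift rescaling, not dCLIP's modified MA procedure), per-bin M and A values can be computed and the M values centred so that most bins are treated as unchanged between conditions:

```python
import numpy as np

# Hypothetical binned CLIP-seq read counts for one RBP in two conditions.
x = np.array([12, 40, 7, 90, 33, 15, 60, 22], dtype=float)
y = np.array([30, 95, 20, 210, 80, 35, 150, 50], dtype=float)

pseudo = 1.0                                     # avoids log2(0) in empty bins
M = np.log2((x + pseudo) / (y + pseudo))         # per-bin log-ratio
A = 0.5 * np.log2((x + pseudo) * (y + pseudo))   # per-bin mean log-intensity

# Median-shift normalization: assume most bins are NOT differentially
# bound, so the bulk of the M values should be centred on zero.
offset = np.median(M)
M_norm = M - offset
print("median log-ratio offset:", round(float(offset), 3))
```

    The second stage would then run a hidden Markov model along the normalized M track to segment bins into stronger-in-x, unchanged, and stronger-in-y states.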

  8. Information Processing Research

    DTIC Science & Technology

    1979-06-01

    quantitative shape recovery. For the qualitative shape recovery we use a model of the Origami world (Kanade, 1978), together with edge profiles of...Workshop. Carnegie-Mellon University, Computer Science Department, Pittsburgh, PA, November, 1978. Kanade, T. A theory of origami world. Technical

  9. Hydrologic modeling strategy for the Islamic Republic of Mauritania, Africa

    USGS Publications Warehouse

    Friedel, Michael J.

    2008-01-01

    The government of Mauritania is interested in how to maintain hydrologic balance to ensure a long-term stable water supply for minerals-related, domestic, and other purposes. Because of the many complicating and competing natural and anthropogenic factors, hydrologists will perform quantitative analysis with specific objectives and relevant computer models in mind. Whereas various computer models are available for studying water-resource priorities, the ability of these models to provide reliable predictions largely depends on the adequacy of the model-calibration process. Predictive analysis helps evaluate the accuracy and uncertainty associated with the simulated dependent variables of a calibrated model. In this report, the hydrologic modeling process is reviewed and a strategy summarized for future Mauritanian hydrologic modeling studies.

  10. Integrated computational model of the bioenergetics of isolated lung mitochondria

    PubMed Central

    Zhang, Xiao; Jacobs, Elizabeth R.; Camara, Amadou K. S.; Clough, Anne V.

    2018-01-01

    Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published kinetic data for isolated enzymes and transporters. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. 
In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria. PMID:29889855

  11. Integrated computational model of the bioenergetics of isolated lung mitochondria.

    PubMed

    Zhang, Xiao; Dash, Ranjan K; Jacobs, Elizabeth R; Camara, Amadou K S; Clough, Anne V; Audi, Said H

    2018-01-01

    Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published kinetic data for isolated enzymes and transporters. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. 
In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria.

  12. Computer-aided light sheet flow visualization using photogrammetry

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1994-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
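
    The core of the photogrammetric reconstruction described above is intersecting the viewing ray through each image pixel with the known light-sheet plane. A minimal sketch of that geometry (camera intrinsics, pose, and plane below are invented for the example, and the real pipeline also handles lens and perspective calibration):

```python
import numpy as np

def backproject_to_sheet(pixel, K, cam_center, R, plane_point, plane_normal):
    """Project a 2D image pixel onto the known light-sheet plane.

    pixel: (u, v); K: 3x3 camera intrinsics; R: world-from-camera rotation.
    The viewing ray through the pixel is intersected with the plane.
    """
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    d = R @ np.linalg.solve(K, uv1)   # ray direction in world coordinates
    # Ray: p(t) = cam_center + t*d; plane: dot(n, p - plane_point) = 0.
    t = np.dot(plane_normal, plane_point - cam_center) / np.dot(plane_normal, d)
    return cam_center + t * d

# Hypothetical setup: camera at the origin looking down +z, sheet at z = 2 m.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
P = backproject_to_sheet((320, 240), K, np.zeros(3), np.eye(3),
                         np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, 1.0]))
print(P)   # the principal-axis ray meets the sheet at (0, 0, 2)
```

    Repeating this for every illuminated pixel, for each recorded sheet position, yields the 3D point sets that are then displayed alongside the CFD surface geometry.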

  13. Computer-Aided Light Sheet Flow Visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  14. Computer-aided light sheet flow visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  15. An Evaluation into the Views of Candidate Mathematics Teachers over "Tablet Computers" to be Applied in Secondary Schools

    ERIC Educational Resources Information Center

    Aksu, Hasan Hüseyin

    2014-01-01

    This study aims to investigate, in terms of different variables, the views of prospective Mathematics teachers on tablet computers to be used in schools as an outcome of the Fatih Project, which was initiated by the Ministry of National Education. In the study, the survey (scanning) model, one of the quantitative research methods, was used. In the population…

  16. Ion Channel ElectroPhysiology Ontology (ICEPO) - a case study of text mining assisted ontology development.

    PubMed

    Elayavilli, Ravikumar Komandur; Liu, Hongfang

    2016-01-01

    Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling, as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining into a formal representation that may help in constructing an ontology for ion channel events. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), with the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt at formalizing the extracted quantitative data assertions in a representation that offers the potential to facilitate the integration of text mining into ontological workflows, a novel aspect of this study. This work is a case study in which we created a platform that provides formal interaction between ontology development and text mining. 
We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.

  17. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    PubMed

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and to contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. 
The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.

  18. A Case-Series Test of the Interactive Two-Step Model of Lexical Access: Predicting Word Repetition from Picture Naming

    ERIC Educational Resources Information Center

    Dell, Gary S.; Martin, Nadine; Schwartz, Myrna F.

    2007-01-01

    Lexical access in language production, and particularly pathologies of lexical access, are often investigated by examining errors in picture naming and word repetition. In this article, we test a computational approach to lexical access, the two-step interactive model, by examining whether the model can quantitatively predict the repetition-error…

  19. A COST-EFFECTIVENESS MODEL FOR THE ANALYSIS OF TITLE I ESEA PROJECT PROPOSALS, PART I-VII.

    ERIC Educational Resources Information Center

    ABT, CLARK C.

    SEVEN SEPARATE REPORTS DESCRIBE AN OVERVIEW OF A COST-EFFECTIVENESS MODEL AND FIVE SUBMODELS FOR EVALUATING THE EFFECTIVENESS OF ELEMENTARY AND SECONDARY ACT TITLE I PROPOSALS. THE DESIGN FOR THE MODEL ATTEMPTS A QUANTITATIVE DESCRIPTION OF EDUCATION SYSTEMS WHICH MAY BE PROGRAMED AS A COMPUTER SIMULATION TO INDICATE THE IMPACT OF A TITLE I…

  20. VISUAL and SLOPE: perspective and quantitative representation of digital terrain models.

    Treesearch

    R.J. McGaughey; R.H. Twito

    1988-01-01

    Two computer programs to help timber-harvest planners evaluate terrain for logging operations are presented. The first program, VISUAL, produces three-dimensional perspectives of a digital terrain model. The second, SLOPE, produces map-scaled overlays delineating areas of equal slope, aspect, or elevation. Both programs help planners familiarize themselves with new...

  1. Development of a Model for Some Aspects of University Policy. Technical Report.

    ERIC Educational Resources Information Center

    Goossens, J. L. M.; And Others

    A method to calculate the need for academic staff per faculty from educational programs and numbers of students is described; it is based on quantitative relations between programs, student enrollment, and total budget. The model is described schematically and presented in a mathematical form adapted to computer processing. Its application…

  2. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, building on that, how to address the heavy time consumption in simulation. Results Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time steps in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed, adaptive time step is a practical solution in a cellular automata environment. PMID:15222901
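
    The asynchronous adaptive time step idea can be sketched with an event queue: each cell advances on its own clock, with a step size tied to how fast its state is changing, so quiescent cells cost almost nothing. The example below is our own minimal illustration (two "cells" relaxing toward a target with invented rates, not the paper's language-based system):

```python
import heapq

# Each "cell" follows dy/dt = k * (target - y). Stiff cells (large k)
# get small steps; sluggish cells take big steps, so the total work
# tracks activity instead of the fastest cell everywhere.
def simulate(cells, t_end, tol=0.01):
    heap = [(0.0, i) for i in range(len(cells))]   # (next update time, cell)
    heapq.heapify(heap)
    steps = 0
    while heap:
        t, i = heapq.heappop(heap)
        c = cells[i]
        rate = c["k"] * (c["target"] - c["y"])
        # Adaptive step: limit the per-step change in y to ~tol, and cap
        # k*dt at 0.5 so the explicit Euler update stays stable.
        dt = min(tol / max(abs(rate), 1e-9), 0.5 / c["k"], t_end - t)
        c["y"] += rate * dt          # explicit Euler over the local step
        steps += 1
        t_next = t + dt
        if t < t_next < t_end:
            heapq.heappush(heap, (t_next, i))
    return steps

cells = [{"y": 0.0, "k": 5.0, "target": 1.0},   # fast cell
         {"y": 0.0, "k": 0.1, "target": 1.0}]   # slow cell
n_steps = simulate(cells, t_end=2.0)
print(cells[0]["y"], cells[1]["y"], n_steps)
```

    With a single global time step, the slow cell would be forced onto the fast cell's schedule; here the priority queue lets each cell's step size adapt independently, which is the source of the speedup the abstract reports.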

  3. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes.

    PubMed

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-04-01

    Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a three-dimensional functional-structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed.

  4. A functional–structural model of rice linking quantitative genetic information with morphological development and physiological processes

    PubMed Central

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-01-01

    Background and Aims Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype–phenotype model, we present here a three-dimensional functional–structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. Methods The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Key Results Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. Conclusions We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed. PMID:21247905

  5. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation, one of the research issues of high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study a middleware component of the remote sensing Grid: a dynamic Grid workflow based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow and, following this architecture, constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed a graphical user interface (GUI) tool to compose remote sensing processing Grid workflows, taking aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.

  6. Mesoscopic modelling and simulation of soft matter.

    PubMed

    Schiller, Ulf D; Krüger, Timm; Henrich, Oliver

    2017-12-20

    The deformability of soft condensed matter often requires modelling of hydrodynamical aspects to gain quantitative understanding. This, however, requires specialised methods that can resolve the multiscale nature of soft matter systems. We review a number of the most popular simulation methods that have emerged, such as Langevin dynamics, dissipative particle dynamics, multi-particle collision dynamics, sometimes also referred to as stochastic rotation dynamics, and the lattice-Boltzmann method. We conclude this review with a short glance at current compute architectures for high-performance computing and community codes for soft matter simulation.

  7. Verification of a VRF Heat Pump Computer Model in EnergyPlus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigusse, Bereket; Raustad, Richard

    2013-06-15

This paper provides verification results of the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides quantitative comparison of full- and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual-range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual-range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from the manufacturer's published performance data. The verification compared the simulation output directly to the manufacturer's performance data, and found that the dual-range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to those of equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
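The dual-range bi-quadratic performance curves described in this record have the generic EnergyPlus form y = a + b·x + c·x² + d·z + e·z² + f·x·z, with the coefficient set switched at an outdoor-temperature boundary. A minimal sketch (the coefficient values below are hypothetical placeholders, not taken from any manufacturer's data):

```python
def biquadratic(coeffs, x, z):
    """EnergyPlus-style bi-quadratic curve:
    y = a + b*x + c*x^2 + d*z + e*z^2 + f*x*z,
    with x the indoor (wet-bulb) and z the outdoor (dry-bulb) temperature."""
    a, b, c, d, e, f = coeffs
    return a + b * x + c * x**2 + d * z + e * z**2 + f * x * z

def capacity_ratio(t_indoor, t_outdoor, boundary, coeffs_low, coeffs_high):
    """Dual-range curve: choose the coefficient set by the outdoor
    temperature range, then evaluate the bi-quadratic."""
    coeffs = coeffs_low if t_outdoor <= boundary else coeffs_high
    return biquadratic(coeffs, t_indoor, t_outdoor)

# hypothetical coefficient sets (placeholders, not manufacturer data),
# roughly normalized so the ratio is ~1.0 near rating conditions
low = (0.576882, 0.017447, 0.000583, -0.000069, -0.000017, -0.000230)
high = (0.686736, 0.020763, 0.000545, -0.001622, 0.000000, -0.000339)
ratio = capacity_ratio(19.4, 35.0, boundary=31.0,
                       coeffs_low=low, coeffs_high=high)
```

In practice the coefficient sets are regressed from the manufacturer's published capacity tables, which is what makes the direct comparison in the verification meaningful.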

  8. Computing organic stereoselectivity - from concepts to quantitative calculations and predictions.

    PubMed

    Peng, Qian; Duarte, Fernanda; Paton, Robert S

    2016-11-07

    Advances in theory and processing power have established computation as a valuable interpretative and predictive tool in the discovery of new asymmetric catalysts. This tutorial review outlines the theory and practice of modeling stereoselective reactions. Recent examples illustrate how an understanding of the fundamental principles and the application of state-of-the-art computational methods may be used to gain mechanistic insight into organic and organometallic reactions. We highlight the emerging potential of this computational tool-box in providing meaningful predictions for the rational design of asymmetric catalysts. We present an accessible account of the field to encourage future synergy between computation and experiment.
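One standard quantitative link between computed transition-state energies and observed stereoselectivity (a textbook relation, not specific to this review) is the Boltzmann factor relating the free-energy difference of competing diastereomeric transition states to the product ratio:

```python
import math

R = 8.314462618e-3  # gas constant in kJ/(mol*K)

def ee_from_ddg(ddg_kj_mol, temp_k=298.15):
    """Enantiomeric excess implied by the free-energy difference between
    competing diastereomeric transition states (transition-state-theory
    limit): minor/major rate ratio = exp(-ddG/RT)."""
    ratio = math.exp(-ddg_kj_mol / (R * temp_k))
    return (1.0 - ratio) / (1.0 + ratio)

# e.g. a ~2 kcal/mol (8.4 kJ/mol) preference at room temperature
ee = ee_from_ddg(8.4)
```

The exponential makes the prediction very sensitive to the computed barrier difference: roughly 2 kcal/mol at room temperature already corresponds to an ee above 90%, which is why sub-kcal/mol accuracy matters in this field.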

  9. Explaining Quantitative Variation in the Rate of Optional Infinitive Errors across Languages: A Comparison of MOSAIC and the Variational Learning Model

    ERIC Educational Resources Information Center

Freudenthal, Daniel; Pine, Julian; Gobet, Fernando

    2010-01-01

    In this study, we use corpus analysis and computational modelling techniques to compare two recent accounts of the OI stage: Legate & Yang's (2007) Variational Learning Model and Freudenthal, Pine & Gobet's (2006) Model of Syntax Acquisition in Children. We first assess the extent to which each of these accounts can explain the level of OI errors…

  10. Model Selection in Historical Research Using Approximate Bayesian Computation

    PubMed Central

    Rubio-Campillo, Xavier

    2016-01-01

    Formal Models and History Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to re-evaluate hypotheses formulated decades ago and still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties are based on the complexities of modelling social interaction, and the methodological issues raised by the evaluation of formal models against data with low sample size, high variance and strong fragmentation. Case Study This work examines an alternate approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester’s laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian Computation is then used to infer both parameter values and model selection via Bayes Factors. Impact Results indicate decisive evidence favouring the new fatigue model. The interpretation of both parameter estimations and model selection provides new insights into the factors guiding the evolution of warfare. At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. PMID:26730953
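Rejection-sampling ABC for model selection, of the kind used in this case study, can be sketched with a toy example (the models and data below are invented for illustration): draw a model and its parameters from their priors, simulate a summary statistic, and accept draws that land within a tolerance of the observed statistic; accepted frequencies estimate posterior model probabilities, whose ratio (under a uniform model prior) is the Bayes factor.

```python
import random

def abc_model_selection(data_mean, models, n_draws=20000, eps=0.05, seed=1):
    """Rejection ABC: sample a model uniformly, sample its parameter from
    the model's prior, simulate a summary statistic, and accept the draw
    when the simulated statistic is within eps of the observed one.
    Accepted counts estimate posterior model probabilities."""
    rng = random.Random(seed)
    accepted = {name: 0 for name in models}
    for _ in range(n_draws):
        name = rng.choice(list(models))
        prior, simulate = models[name]
        theta = prior(rng)
        if abs(simulate(rng, theta) - data_mean) < eps:
            accepted[name] += 1
    total = sum(accepted.values()) or 1
    return {name: n / total for name, n in accepted.items()}

# toy example: observed mean 0.7; model A's prior covers it, model B's does not
models = {
    "A": (lambda rng: rng.uniform(0.5, 0.9),        # prior on theta
          lambda rng, th: th + rng.gauss(0, 0.05)), # simulator
    "B": (lambda rng: rng.uniform(0.0, 0.4),
          lambda rng, th: th + rng.gauss(0, 0.05)),
}
posterior = abc_model_selection(0.7, models)
```

The same scheme scales to the battle dataset case: the simulator becomes a Lanchester-equation integration and the summary statistic a casualty ratio, with Bayes factors read off from the accepted-model frequencies.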

  11. Structural Characterization of Cross-Linked Hemoglobins Developed as Potential Transfusion Substitutes

    DTIC Science & Technology

    1990-05-30

…phase HPLC using an IBM Instruments Inc. model LC 9533 ternary liquid chromatograph attached to a model F9522 fixed UV module and a model F9523… acid analyses are done by separation and quantitation of phenylthiocarbamyl amino acid derivatives using a second IBM model LC 9533 ternary liquid… computer which controls the HPLC and an IBM Instruments Inc. model LC 9505 automatic sampler. The hemoglobin present in the effluent from large…

  12. Iterative Refinement of a Binding Pocket Model: Active Computational Steering of Lead Optimization

    PubMed Central

    2012-01-01

    Computational approaches for binding affinity prediction are most frequently demonstrated through cross-validation within a series of molecules or through performance shown on a blinded test set. Here, we show how such a system performs in an iterative, temporal lead optimization exercise. A series of gyrase inhibitors with known synthetic order formed the set of molecules that could be selected for “synthesis.” Beginning with a small number of molecules, based only on structures and activities, a model was constructed. Compound selection was done computationally, each time making five selections based on confident predictions of high activity and five selections based on a quantitative measure of three-dimensional structural novelty. Compound selection was followed by model refinement using the new data. Iterative computational candidate selection produced rapid improvements in selected compound activity, and incorporation of explicitly novel compounds uncovered much more diverse active inhibitors than strategies lacking active novelty selection. PMID:23046104

  13. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space, in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
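One of the characteristics mentioned, the Zipf's law coefficient, can be sketched independently of Quantiprot's actual implementation: rank the n-gram frequencies and fit the slope of log(frequency) against log(rank) by least squares (the sequence below is a hypothetical fragment).

```python
import math
from collections import Counter

def zipf_coefficient(sequence, n=1):
    """Least-squares fit of the Zipf exponent: the slope of
    log(frequency) versus log(rank) for the n-gram frequency
    distribution of a sequence."""
    grams = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
    freqs = sorted(Counter(grams).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den  # negative; magnitude near 1 for Zipfian data

# hypothetical protein fragment
slope = zipf_coefficient("MKVLAAGLLALLALSAKA", n=1)
```

For real analyses the fit would be run over full sequences or whole proteomes; on a fragment this short the slope is only a formal illustration of the computation.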

  14. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  15. 1:1 Computing Programs: An Analysis of the Stages of Concerns of 1:1 Integration, Professional Development Training and Level of Classroom Use by Illinois High School Teachers

    ERIC Educational Resources Information Center

    Detering, Brad

    2017-01-01

    This research study, grounded in the theoretical framework of education change, used the Concerns-Based Adoption Model of change to examine the concerns of Illinois high school teachers and administrators regarding the implementation of 1:1 computing programs. A quantitative study of educators investigated the stages of concern and the mathematics…

  16. Using Perturbation Theory to Reduce Noise in Diffusion Tensor Fields

    PubMed Central

    Bansal, Ravi; Staib, Lawrence H.; Xu, Dongrong; Laine, Andrew F.; Liu, Jun; Peterson, Bradley S.

    2009-01-01

We propose the use of Perturbation theory to reduce noise in Diffusion Tensor (DT) fields. Diffusion Tensor Imaging (DTI) encodes the diffusion of water molecules along different spatial directions in a positive-definite, 3 × 3 symmetric tensor. Eigenvectors and eigenvalues of DTs allow the in vivo visualization and quantitative analysis of white matter fiber bundles across the brain. The validity and reliability of these analyses are limited, however, by the low spatial resolution and low Signal-to-Noise Ratio (SNR) in DTI datasets. Our procedures can be applied to improve the validity and reliability of these quantitative analyses by reducing noise in the tensor fields. We model a tensor field as a three-dimensional Markov Random Field and then compute the likelihood and the prior terms of this model using Perturbation theory. The prior term constrains the tensor field to be smooth, whereas the likelihood term constrains the smoothed tensor field to be similar to the original field. Thus, the proposed method generates a smoothed field that is close in structure to the original tensor field. We evaluate the performance of our method both visually and quantitatively using synthetic and real-world datasets. We quantitatively assess the performance of our method by computing the SNR for eigenvalues and the coherence measures for eigenvectors of DTs across tensor fields. In addition, we quantitatively compare the performance of our procedures with the performance of one method that uses a Riemannian distance to compute the similarity between two tensors, and with another method that reduces noise in tensor fields by anisotropically filtering the diffusion weighted images that are used to estimate diffusion tensors. These experiments demonstrate that our method significantly increases the coherence of the eigenvectors and the SNR of the eigenvalues, while simultaneously preserving the fine structure and boundaries between homogeneous regions, in the smoothed tensor field. PMID:19540791

  17. Notes on quantitative structure-properties relationships (QSPR) (1): A discussion on a QSPR dimensionality paradox (QSPR DP) and its quantum resolution.

    PubMed

    Carbó-Dorca, Ramon; Gallegos, Ana; Sánchez, Angel J

    2009-05-01

Classical quantitative structure-properties relationship (QSPR) statistical techniques unavoidably present an inherent paradoxical computational context. They rely on the definition of a Gram matrix in descriptor spaces, which is afterwards used to reduce the original dimension via several possible kinds of algebraic manipulation. From there, effective models for the computation of unknown properties of known molecular structures are obtained. However, the reduced descriptor dimension causes linear dependence within the set of discrete vector molecular representations, leading to positive semi-definite Gram matrices in molecular spaces. To resolve this QSPR dimensionality paradox (QSPR DP), it is proposed here to adopt as a starting point the quantum QSPR (QQSPR) computational framework, in which density functions act as infinite-dimensional descriptors. The fundamental QQSPR equation, deduced from numerical evaluation of quantum expectation values, can be approximately solved to obtain models free of the QSPR DP. Substituting the quantum similarity matrix with an empirical Gram matrix in molecular spaces, built from the original, unmanipulated discrete molecular descriptor vectors, yields classical QSPR models with the same characteristics as in QQSPR, that is: possessing a certain degree of causality and explicitly independent of the descriptor dimension. © 2008 Wiley Periodicals, Inc.

  18. Emissions Prediction and Measurement for Liquid-Fueled TVC Combustor with and without Water Injection

    NASA Technical Reports Server (NTRS)

    Brankovic, A.; Ryder, R. C., Jr.; Hendricks, R. C.; Liu, N.-S.; Shouse, D. T.; Roquemore, W. M.

    2005-01-01

An investigation is performed to evaluate the performance of a computational fluid dynamics (CFD) tool for the prediction of the reacting flow in a liquid-fueled combustor that uses water injection for control of pollutant emissions. The experiment consists of a multisector, liquid-fueled combustor rig operated at different inlet pressures and temperatures, and over a range of fuel/air and water/fuel ratios. Fuel can be injected directly into the main combustion airstream and into the cavities. Test rig performance is characterized by combustor exit quantities such as temperature and emissions measurements using rakes and overall pressure drop from upstream plenum to combustor exit. Visualization of the flame is performed using gray-scale and color still photographs and high-frame-rate videos. CFD simulations are performed utilizing a methodology that includes computer-aided design (CAD) solid modeling of the geometry, parallel processing over networked computers, and graphical and quantitative post-processing. Physical models include liquid fuel droplet dynamics and evaporation, with combustion modeled using a hybrid finite-rate chemistry model developed for Jet-A fuel. CFD and experimental results are compared for cases with cavity-only fueling, while numerical studies of cavity and main fueling were also performed. Predicted and measured trends in combustor exit temperature, CO and NOx are in general agreement at the different water/fuel loading rates, although quantitative differences exist between the predictions and measurements.

  19. CBB Portal @ PNNL

    Science.gov Websites

… Computational Biology and Bioinformatics … and engineering to transform the data into knowledge. This new quantitative, predictive biology … empirical modeling and physics-based simulations. CBB research seeks to: Understand …

  20. Traffic Noise Measurements at FM 3009 Greenfield Village Subdivision, City of Schertz, Texas

    DOT National Transportation Integrated Search

    1997-08-01

    Objectives of the studies were to: (1) determine by quantitative measurements and computer simulation modeling the effectiveness of the noise barriers; (2) determine the traffic noise intrusion through openings in the barriers at streets entering the...

  1. An Experimental Model for Analyzing Strategies for Financing Higher Education in New York State.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Office of Postsecondary Research, Information Systems, and Institutional Aid.

    Described is an experimental, quantitative model developed by the New York State Education Department to evaluate state-level financing strategies for higher education. It can be used to address a variety of questions and takes into account a host of direct and indirect relationships. It uses computer software and optimization algorithms developed…

  2. Quantitative structure-activation barrier relationship modeling for Diels-Alder ligations utilizing quantum chemical structural descriptors.

    PubMed

    Nandi, Sisir; Monesi, Alessandro; Drgan, Viktor; Merzel, Franci; Novič, Marjana

    2013-10-30

In the present study, we show the correlation of quantum chemical structural descriptors with the activation barriers of Diels-Alder ligations. A set of 72 non-catalysed Diels-Alder reactions was subjected to quantitative structure-activation barrier relationship (QSABR) modelling within a framework of theoretical quantum chemical descriptors calculated solely from the structures of the diene and dienophile reactants. Experimental activation barrier data were obtained from the literature. Descriptors were computed at the Hartree-Fock level with the 6-31G(d) basis set as implemented in Gaussian 09. Variable selection and model development were carried out by stepwise multiple linear regression. Predictive performance of the QSABR model was assessed on training and test sets and by calculating leave-one-out cross-validated Q2 and predictive R2 values. The QSABR model can explain and predict 86.5% and 80% of the variance, respectively, in the activation energy barrier training data. Alternatively, a neural network model based on back-propagation of errors was developed to assess the nonlinearity of the sought correlations between theoretical descriptors and experimental reaction barriers. A reasonable predictability for the activation barriers of the test-set reactions was obtained, which enabled an exploration and interpretation of the significant variables responsible for the Diels-Alder interaction between dienes and dienophiles. Thus, QSABR modelling, which provides efficient and fast prediction of activation barriers of Diels-Alder reactions, turns out to be a meaningful alternative to computation based on transition state theory.
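The leave-one-out cross-validated Q2 used to assess the model is Q² = 1 − PRESS/SS_tot, where each prediction comes from a model refit with that sample held out. A minimal univariate sketch on invented descriptor/barrier data (the paper's model is a stepwise multiple linear regression; this only illustrates the Q² computation):

```python
def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 = 1 - PRESS / SS_tot, refitting
    a univariate least-squares model with each sample held out."""
    n = len(y)
    press = 0.0
    for i in range(n):
        xs = [x for j, x in enumerate(X) if j != i]
        ys = [v for j, v in enumerate(y) if j != i]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        b = (sum((x - mx) * (v - my) for x, v in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        a = my - b * mx
        press += (y[i] - (a + b * X[i])) ** 2   # squared LOO residual
    mean_y = sum(y) / n
    ss_tot = sum((v - mean_y) ** 2 for v in y)
    return 1 - press / ss_tot

# invented descriptor values vs. activation barriers (strongly linear)
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [10.1, 12.0, 13.9, 16.2, 18.0, 19.8]
q2 = loo_q2(X, y)
```

Because every prediction is made on a point excluded from the fit, Q² is always at or below the fitted R² and gives a more honest estimate of predictive power.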

  3. Radiogenomics and radiotherapy response modeling

    NASA Astrophysics Data System (ADS)

    El Naqa, Issam; Kerns, Sarah L.; Coates, James; Luo, Yi; Speers, Corey; West, Catharine M. L.; Rosenstein, Barry S.; Ten Haken, Randall K.

    2017-08-01

    Advances in patient-specific information and biotechnology have contributed to a new era of computational medicine. Radiogenomics has emerged as a new field that investigates the role of genetics in treatment response to radiation therapy. Radiation oncology is currently attempting to embrace these recent advances and add to its rich history by maintaining its prominent role as a quantitative leader in oncologic response modeling. Here, we provide an overview of radiogenomics starting with genotyping, data aggregation, and application of different modeling approaches based on modifying traditional radiobiological methods or application of advanced machine learning techniques. We highlight the current status and potential for this new field to reshape the landscape of outcome modeling in radiotherapy and drive future advances in computational oncology.

  4. Phenotypic models of evolution and development: geometry as destiny.

    PubMed

    François, Paul; Siggia, Eric D

    2012-12-01

    Quantitative models of development that consider all relevant genes typically are difficult to fit to embryonic data alone and have many redundant parameters. Computational evolution supplies models of phenotype with relatively few variables and parameters that allows the patterning dynamics to be reduced to a geometrical picture for how the state of a cell moves. The clock and wavefront model, that defines the phenotype of somitogenesis, can be represented as a sequence of two discrete dynamical transitions (bifurcations). The expression-time to space map for Hox genes and the posterior dominance rule are phenotypes that naturally follow from computational evolution without considering the genetics of Hox regulation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. A flow-simulation model of the tidal Potomac River

    USGS Publications Warehouse

    Schaffranek, Raymond W.

    1987-01-01

A one-dimensional model capable of simulating flow in a network of interconnected channels has been applied to the tidal Potomac River including its major tributaries and embayments between Washington, D.C., and Indian Head, Md. The model can be used to compute water-surface elevations and flow discharges at any of 66 predetermined locations or at any alternative river cross sections definable within the network of channels. In addition, the model can be used to provide tidal-interchange flow volumes and to evaluate tidal excursions and the flushing properties of the riverine system. Comparisons of model-computed results with measured water-surface elevations and discharges demonstrate the validity and accuracy of the model. Tidal-cycle flow volumes computed by the calibrated model have been verified to be within an accuracy of ±10 percent. Quantitative characteristics of the hydrodynamics of the tidal river are identified and discussed. The comprehensive flow data provided by the model can be used to better understand the geochemical, biological, and other processes affecting the river's water quality.

  6. Three-dimensional structural modelling and calculation of electrostatic potentials of HLA Bw4 and Bw6 epitopes to explain the molecular basis for alloantibody binding: toward predicting HLA antigenicity and immunogenicity.

    PubMed

    Mallon, Dermot H; Bradley, J Andrew; Winn, Peter J; Taylor, Craig J; Kosmoliaptsis, Vasilis

    2015-02-01

    We have previously shown that qualitative assessment of surface electrostatic potential of HLA class I molecules helps explain serological patterns of alloantibody binding. We have now used a novel computational approach to quantitate differences in surface electrostatic potential of HLA B-cell epitopes and applied this to explain HLA Bw4 and Bw6 antigenicity. Protein structure models of HLA class I alleles expressing either the Bw4 or Bw6 epitope (defined by sequence motifs at positions 77 to 83) were generated using comparative structure prediction. The electrostatic potential in 3-dimensional space encompassing the Bw4/Bw6 epitope was computed by solving the Poisson-Boltzmann equation and quantitatively compared in a pairwise, all-versus-all fashion to produce distance matrices that cluster epitopes with similar electrostatics properties. Quantitative comparison of surface electrostatic potential at the carboxyl terminal of the α1-helix of HLA class I alleles, corresponding to amino acid sequence motif 77 to 83, produced clustering of HLA molecules in 3 principal groups according to Bw4 or Bw6 epitope expression. Remarkably, quantitative differences in electrostatic potential reflected known patterns of serological reactivity better than Bw4/Bw6 amino acid sequence motifs. Quantitative assessment of epitope electrostatic potential allowed the impact of known amino acid substitutions (HLA-B*07:02 R79G, R82L, G83R) that are critical for antibody binding to be predicted. We describe a novel approach for quantitating differences in HLA B-cell epitope electrostatic potential. Proof of principle is provided that this approach enables better assessment of HLA epitope antigenicity than amino acid sequence data alone, and it may allow prediction of HLA immunogenicity.

  7. Validation of finite element computations for the quantitative prediction of underwater noise from impact pile driving.

    PubMed

    Zampolli, Mario; Nijhof, Marten J J; de Jong, Christ A F; Ainslie, Michael A; Jansen, Erwin H W; Quesson, Benoit A J

    2013-01-01

    The acoustic radiation from a pile being driven into the sediment by a sequence of hammer strikes is studied with a linear, axisymmetric, structural acoustic frequency domain finite element model. Each hammer strike results in an impulsive sound that is emitted from the pile and then propagated in the shallow water waveguide. Measurements from accelerometers mounted on the head of a test pile and from hydrophones deployed in the water are used to validate the model results. Transfer functions between the force input at the top of the anvil and field quantities, such as acceleration components in the structure or pressure in the fluid, are computed with the model. These transfer functions are validated using accelerometer or hydrophone measurements to infer the structural forcing. A modeled hammer forcing pulse is used in the successive step to produce quantitative predictions of sound exposure at the hydrophones. The comparison between the model and the measurements shows that, although several simplifying assumptions were made, useful predictions of noise levels based on linear structural acoustic models are possible. In the final part of the paper, the model is used to characterize the pile as an acoustic radiator by analyzing the flow of acoustic energy.

  8. Vascular endothelial growth factor receptor-2 (VEGFR-2) inhibitors: development and validation of predictive 3-D QSAR models through extensive ligand- and structure-based approaches

    NASA Astrophysics Data System (ADS)

    Ragno, Rino; Ballante, Flavio; Pirolli, Adele; Wickersham, Richard B.; Patsilinakos, Alexandros; Hesse, Stéphanie; Perspicace, Enrico; Kirsch, Gilbert

    2015-08-01

    Vascular endothelial growth factor receptor-2 (VEGFR-2) is a key element in angiogenesis, the process by which new blood vessels are formed, and is thus an important pharmaceutical target. Here, 3-D quantitative structure-activity relationship (3-D QSAR) modeling was used to build a quantitative screening and pharmacophore model of the VEGFR-2 receptor for the design of inhibitors with improved activities. Most of the available experimental data was used as a training set to derive eight optimized, fully cross-validated mono-probe models and one multi-probe quantitative model. Notable is the use of 262 molecules, aligned following both structure-based and ligand-based protocols, as an external test set, confirming the 3-D QSAR models' predictive capability and their usefulness in designing new VEGFR-2 inhibitors. From a survey of the literature, this is the first wide-ranging computational medicinal chemistry application to VEGFR-2 inhibitors.

  9. The effects of nutrition labeling on consumer food choice: a psychological experiment and computational model.

    PubMed

    Helfer, Peter; Shultz, Thomas R

    2014-12-01

    The widespread availability of calorie-dense food is believed to be a contributing cause of an epidemic of obesity and associated diseases throughout the world. One possible countermeasure is to empower consumers to make healthier food choices with useful nutrition labeling. An important part of this endeavor is to determine the usability of existing and proposed labeling schemes. Here, we report an experiment on how four different labeling schemes affect the speed and nutritional value of food choices. We then apply decision field theory, a leading computational model of human decision making, to simulate the experimental results. The psychology experiment shows that quantitative, single-attribute labeling schemes have greater usability than multiattribute and binary ones, and that they remain effective under moderate time pressure. The computational model simulates these psychological results and provides explanatory insights into them. This work shows how experimental psychology and computational modeling can contribute to the evaluation and improvement of nutrition-labeling schemes. © 2014 New York Academy of Sciences.
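
    A minimal accumulator sketch in the spirit of decision field theory, with made-up attribute values and parameters rather than the paper's fitted model: preference states accumulate attribute evaluations under stochastic attention until one option reaches a threshold, yielding both a choice and a decision time.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_choice(values, threshold=5.0, noise=0.3, max_steps=10_000):
    """values: (n_options, n_attributes) array of subjective attribute values."""
    pref = np.zeros(values.shape[0])
    for step in range(1, max_steps + 1):
        attr = rng.integers(values.shape[1])          # attended attribute this step
        pref += values[:, attr] + rng.normal(0, noise, values.shape[0])
        if pref.max() >= threshold:
            return int(np.argmax(pref)), step         # (chosen option, decision time)
    return int(np.argmax(pref)), max_steps

# A single-attribute label (one overall score) vs. a multiattribute label.
single = np.array([[0.6], [0.2]])
multi = np.array([[0.6, -0.1, 0.3], [0.2, 0.4, -0.2]])

choice_s, t_s = simulate_choice(single)
choice_m, t_m = simulate_choice(multi)
print(choice_s, t_s, choice_m, t_m)
```

    Comparing decision times across label formats, averaged over many simulated trials, is the kind of analysis the paper uses the full model for.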

  10. Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Samuel; Baker, Gavin Matthew; Gamell, Marc

    2015-10-01

    Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of these, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor developed sufficiently for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion, and UINTAH) to examine whether these models permit insertion of new programming model elements into an existing code base.

  11. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  12. A new constitutive model for simulation of softening, plateau, and densification phenomena for trabecular bone under compression.

    PubMed

    Lee, Chi-Seung; Lee, Jae-Myung; Youn, BuHyun; Kim, Hyung-Sik; Shin, Jong Ki; Goh, Tae Sik; Lee, Jung Sub

    2017-01-01

    A new type of constitutive model and its computational implementation procedure for the simulation of trabecular bone are proposed in the present study. A yield surface-independent Frank-Brockman elasto-viscoplastic model is introduced to express nonlinear material behaviors such as softening beyond the yield point, plateau, and densification under compressive loads. In particular, hardening- and softening-dominant material functions are introduced and adopted in the plastic multiplier to describe each nonlinear material behavior separately. In addition, the elasto-viscoplastic model is transformed into an implicit discrete model and programmed as a user-defined material subroutine in commercial finite element analysis code. Moreover, the consistent tangent modulus method is proposed to improve computational convergence and to save computational time during finite element analysis. Through the developed material library, the nonlinear stress-strain relationship is analyzed qualitatively and quantitatively, and the simulation results are compared with the results of compression tests on trabecular bone to validate the proposed constitutive model, computational method, and material library. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. An experimental approach to identify dynamical models of transcriptional regulation in living cells

    NASA Astrophysics Data System (ADS)

    Fiore, G.; Menolascina, F.; di Bernardo, M.; di Bernardo, D.

    2013-06-01

    We describe an innovative experimental approach, and a proof of principle investigation, for the application of System Identification techniques to derive quantitative dynamical models of transcriptional regulation in living cells. Specifically, we constructed an experimental platform for System Identification based on a microfluidic device, a time-lapse microscope, and a set of automated syringes all controlled by a computer. The platform allows delivering a time-varying concentration of any molecule of interest to the cells trapped in the microfluidics device (input) and real-time monitoring of a fluorescent reporter protein (output) at a high sampling rate. We tested this platform on the GAL1 promoter in the yeast Saccharomyces cerevisiae driving expression of a green fluorescent protein (Gfp) fused to the GAL1 gene. We demonstrated that the System Identification platform enables accurate measurements of the input (sugars concentrations in the medium) and output (Gfp fluorescence intensity) signals, thus making it possible to apply System Identification techniques to obtain a quantitative dynamical model of the promoter. We explored and compared linear and nonlinear model structures in order to select the most appropriate to derive a quantitative model of the promoter dynamics. Our platform can be used to quickly obtain quantitative models of eukaryotic promoters, currently a complex and time-consuming process.
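
    The System Identification step can be illustrated with a toy first-order ARX fit by least squares, where u stands for the delivered input (e.g., sugar concentration) and y for the measured output (e.g., Gfp fluorescence). The data below are simulated from known parameters, not taken from the GAL1 experiments.

```python
import numpy as np

# Toy system identification: fit a first-order ARX model
#   y[t] = a * y[t-1] + b * u[t-1]
# relating a time-varying input u to a measured output y.
rng = np.random.default_rng(1)
a_true, b_true = 0.9, 0.5
u = rng.uniform(0, 1, 200)               # input signal delivered to the cells
y = np.zeros(200)
for t in range(1, 200):
    y[t] = a_true * y[t-1] + b_true * u[t-1] + rng.normal(0, 0.01)

# Least-squares estimate of (a, b) from the input/output data.
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(a_hat, b_hat)   # close to the true parameters 0.9 and 0.5
```

    Nonlinear model structures, as compared in the paper, would replace the linear regressors with nonlinear functions of y and u before the same fitting step.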

  14. Computational Biochemistry-Enzyme Mechanisms Explored.

    PubMed

    Culka, Martin; Gisdon, Florian J; Ullmann, G Matthias

    2017-01-01

    Understanding enzyme mechanisms is a major task in comprehending how living cells work. Recent advances in biomolecular research provide a huge amount of data on enzyme kinetics and structure. The analysis of diverse experimental results and their combination into an overall picture is, however, often challenging. Microscopic details of enzymatic processes are often anticipated based on several hints from macroscopic experimental data. Computational biochemistry aims at the creation of a computational model of an enzyme in order to explain microscopic details of the catalytic process and reproduce or predict macroscopic experimental findings. Results of such computations are in part complementary to experimental data and provide an explanation of a biochemical process at the microscopic level. In order to evaluate the mechanism of an enzyme, a structural model is constructed which can be analyzed by several theoretical approaches. Several simulation methods can and should be combined to get a reliable picture of the process of interest. Furthermore, abstract models of biological systems can be constructed by combining computational and experimental data. In this review, we discuss structural computational models of enzymatic systems. We first discuss various models to simulate enzyme catalysis. We then review various approaches to characterizing the enzyme mechanism both qualitatively and quantitatively using different modeling approaches. © 2017 Elsevier Inc. All rights reserved.

  15. Correlation of quantitative computed tomographic subchondral bone density and ash density in horses.

    PubMed

    Drum, M G; Les, C M; Park, R D; Norrdin, R W; McIlwraith, C W; Kawcak, C E

    2009-02-01

    The purpose of this study was to compare subchondral bone density obtained using quantitative computed tomography with ash density values from intact equine joints, and to determine if there are measurable anatomic variations in mean subchondral bone density. Five adult equine metacarpophalangeal joints were scanned with computed tomography (CT), disarticulated, and four 1-cm³ regions of interest (ROI) cut from the distal third metacarpal bone. Bone cubes were ashed, and percent mineralization and ash density were recorded. Three-dimensional models were created of the distal third metacarpal bone from CT images. Four ROIs were measured on the distal aspect of the third metacarpal bone at axial and abaxial sites of the medial and lateral condyles for correlation with ash samples. Overall correlations of mean quantitative CT (QCT) density with ash density (r=0.82) and percent mineralization (r=0.93) were strong. There were significant differences between abaxial and axial ROIs for mean QCT density, percent bone mineralization and ash density (p<0.05). QCT appears to be a good measure of bone density in equine subchondral bone. Additionally, differences existed between axial and abaxial subchondral bone density in the equine distal third metacarpal bone.
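
    The reported correlations are plain Pearson coefficients computed over paired ROI measurements. A minimal sketch with made-up density values (not the study's data):

```python
import numpy as np

# Illustrative paired measurements per ROI: QCT density (mg/cm³ equivalent)
# and ash density (g/cm³). Values are fabricated for demonstration only.
qct = np.array([620., 700., 810., 560., 740., 680., 590., 760.])
ash = np.array([0.61, 0.70, 0.83, 0.55, 0.75, 0.66, 0.58, 0.78])

r = np.corrcoef(qct, ash)[0, 1]   # Pearson correlation coefficient
print(round(r, 3))
```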

  16. Optical observables in stars with non-stationary atmospheres. [fireballs and cepheid models

    NASA Technical Reports Server (NTRS)

    Hillendahl, R. W.

    1980-01-01

    Experience gained by use of Cepheid modeling codes to predict the dimensional and photometric behavior of nuclear fireballs is used as a means of validating various computational techniques used in the Cepheid codes. Predicted results from Cepheid models are compared with observations of the continuum and lines in an effort to demonstrate that the atmospheric phenomena in Cepheids are quite complex but that they can be quantitatively modeled.

  17. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows the risk structure of the HCO to be assessed quantitatively by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
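
    A minimal sketch of the claims-based idea: model annual claim counts (Poisson) and claim severities (lognormal), simulate the annual loss distribution by Monte Carlo, and read off risk indexes as the expected loss and a high percentile. All parameter values are illustrative, not estimates from the Lodi dataset.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 20_000                                          # simulated years
counts = rng.poisson(lam=25, size=n_years)                # claims per year
losses = np.array([rng.lognormal(mean=9.0, sigma=1.2, size=c).sum()
                   for c in counts])                      # total annual loss

expected_loss = losses.mean()                             # expected loss
p95 = np.percentile(losses, 95)                           # unexpected-loss percentile
print(round(float(expected_loss)), round(float(p95)))
```

    Stratifying by department or hospital, as in the paper's Bayesian hierarchical model, would amount to estimating separate (or partially pooled) count and severity parameters per stratum before the same simulation.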

  18. Requirements for the formal representation of pathophysiology mechanisms by clinicians

    PubMed Central

    Helvensteijn, M.; Kokash, N.; Martorelli, I.; Sarwar, D.; Islam, S.; Grenon, P.; Hunter, P.

    2016-01-01

    Knowledge of multiscale mechanisms in pathophysiology is the bedrock of clinical practice. If quantitative methods, predicting patient-specific behaviour of these pathophysiology mechanisms, are to be brought to bear on clinical decision-making, the Human Physiome community and Clinical community must share a common computational blueprint for pathophysiology mechanisms. A number of obstacles stand in the way of this sharing—not least the technical and operational challenges that must be overcome to ensure that (i) the explicit biological meanings of the Physiome's quantitative methods to represent mechanisms are open to articulation, verification and study by clinicians, and that (ii) clinicians are given the tools and training to explicitly express disease manifestations in direct contribution to modelling. To this end, the Physiome and Clinical communities must co-develop a common computational toolkit, based on this blueprint, to bridge the representation of knowledge of pathophysiology mechanisms (a) that is implicitly depicted in electronic health records and the literature, with (b) that found in mathematical models explicitly describing mechanisms. In particular, this paper makes use of a step-wise description of a specific disease mechanism as a means to elicit the requirements of representing pathophysiological meaning explicitly. The computational blueprint developed from these requirements addresses the Clinical community goals to (i) organize and manage healthcare resources in terms of relevant disease-related knowledge of mechanisms and (ii) train the next generation of physicians in the application of quantitative methods relevant to their research and practice. PMID:27051514

  19. 20180312 - Mechanistic Modeling of Developmental Defects through Computational Embryology (SOT)

    EPA Science Inventory

    Significant advances in the genome sciences, in automated high-throughput screening (HTS), and in alternative methods for testing enable rapid profiling of chemical libraries for quantitative effects on diverse cellular activities. While a surfeit of HTS data and information is n...

  20. Aerothermal modeling program, phase 1

    NASA Technical Reports Server (NTRS)

    Sturgess, G. J.

    1983-01-01

    The physical modeling embodied in the computational fluid dynamics codes is discussed. The objectives were to identify shortcomings in the models and to provide a program plan to improve the quantitative accuracy. The physical models studied were for: turbulent mass and momentum transport, heat release, liquid fuel spray, and gaseous radiation. The approach adopted was to test the models against appropriate benchmark-quality test cases from experiments in the literature for the constituent flows that together make up the combustor real flow.

  1. Deep Learning for Magnetic Resonance Fingerprinting: A New Approach for Predicting Quantitative Parameter Values from Time Series.

    PubMed

    Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas

    2017-01-01

    The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
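
    For context, the dictionary-matching step that the CNN is trained to replace can be sketched as a maximum normalized inner-product search. The dictionary, parameter values, and measured signal below are random stand-ins rather than Bloch-simulated MRF data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_entries, n_timepoints = 500, 300
dictionary = rng.normal(size=(n_entries, n_timepoints))          # simulated signals
params = rng.uniform([200, 20], [2000, 200], size=(n_entries, 2))  # (T1, T2) per entry

# Simulated "measurement" for one voxel: dictionary entry 123 plus noise.
signal = dictionary[123] + rng.normal(0, 0.1, n_timepoints)

# Match: highest normalized inner product between signal and dictionary.
d_norm = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
best = int(np.argmax(d_norm @ (signal / np.linalg.norm(signal))))
print(best, params[best])   # recovers entry 123 and its (T1, T2) here
```

    This search scales with dictionary size per voxel, which is exactly the cost the trained CNN avoids by mapping a signal directly to parameter values.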

  2. xTract: software for characterizing conformational changes of protein complexes by quantitative cross-linking mass spectrometry.

    PubMed

    Walzthoeni, Thomas; Joachimiak, Lukasz A; Rosenberger, George; Röst, Hannes L; Malmström, Lars; Leitner, Alexander; Frydman, Judith; Aebersold, Ruedi

    2015-12-01

    Chemical cross-linking in combination with mass spectrometry generates distance restraints of amino acid pairs in close proximity on the surface of native proteins and protein complexes. In this study we used quantitative mass spectrometry and chemical cross-linking to quantify differences in cross-linked peptides obtained from complexes in spatially discrete states. We describe a generic computational pipeline for quantitative cross-linking mass spectrometry consisting of modules for quantitative data extraction and statistical assessment of the obtained results. We used the method to detect conformational changes in two model systems: firefly luciferase and the bovine TRiC complex. Our method discovers and explains the structural heterogeneity of protein complexes using only sparse structural information.

  3. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affect the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory for computational modeling studies.
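
    The logic of such a synthetic validity test can be illustrated with a toy model-recovery experiment. The paper uses Bayesian model selection with exceedance probabilities; the sketch below substitutes BIC as a lightweight stand-in and only shows how recovery of the data-generating model improves with the number of data points.

```python
import numpy as np

def bic(y, X):
    """BIC of an ordinary least-squares fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = ((y - X @ beta) ** 2).sum()
    n, k = X.shape
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(11)
wins = {10: 0, 200: 0}              # times the data-generating model is selected
for n in wins:
    for _ in range(50):
        x1, x2 = rng.normal(size=(2, n))
        y = 1.0 * x1 + 0.3 * x2 + rng.normal(0, 1.0, n)   # true model: both predictors
        full = np.column_stack([x1, x2])                   # data-generating model
        reduced = x1[:, None]                              # simpler competitor
        wins[n] += bic(y, full) < bic(y, reduced)
print(wins)   # the data-generating model wins more often with more data
```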

  4. 3D printing PLGA: a quantitative examination of the effects of polymer composition and printing parameters on print resolution

    PubMed Central

    Guo, Ting; Holzberg, Timothy R; Lim, Casey G; Gao, Feng; Gargava, Ankit; Trachtenberg, Jordan E; Mikos, Antonios G; Fisher, John P

    2018-01-01

    In the past few decades, 3D printing has played a significant role in fabricating scaffolds with consistent, complex structure that meet patient-specific needs in future clinical applications. Although many studies have contributed to this emerging field of additive manufacturing, which includes material development and computer-aided scaffold design, current quantitative analyses do not correlate material properties, printing parameters, and printing outcomes to a great extent. A model that correlates these properties has tremendous potential to standardize 3D printing for tissue engineering and biomaterial science. In this study, we printed poly(lactic-co-glycolic acid) (PLGA) utilizing a direct melt extrusion technique without additional ingredients. We investigated PLGA with various lactic acid: glycolic acid (LA:GA) molecular weight ratios and end caps to demonstrate the dependence of the extrusion process on the polymer composition. Micro-computed tomography was then used to evaluate printed scaffolds containing different LA:GA ratios, composed of different fiber patterns, and processed under different printing conditions. We built a statistical model to reveal the correlation and predominant factors that determine printing precision. Our model showed a strong linear relationship between the actual and predicted precision under different combinations of printing conditions and material compositions. This quantitative examination establishes a significant foreground to 3D print biomaterials following a systematic fabrication procedure. Additionally, our proposed statistical models can be applied to couple specific biomaterials and 3D printing applications for patient implants with particular requirements. PMID:28244880

  5. 3D printing PLGA: a quantitative examination of the effects of polymer composition and printing parameters on print resolution.

    PubMed

    Guo, Ting; Holzberg, Timothy R; Lim, Casey G; Gao, Feng; Gargava, Ankit; Trachtenberg, Jordan E; Mikos, Antonios G; Fisher, John P

    2017-04-12

    In the past few decades, 3D printing has played a significant role in fabricating scaffolds with consistent, complex structure that meet patient-specific needs in future clinical applications. Although many studies have contributed to this emerging field of additive manufacturing, which includes material development and computer-aided scaffold design, current quantitative analyses do not correlate material properties, printing parameters, and printing outcomes to a great extent. A model that correlates these properties has tremendous potential to standardize 3D printing for tissue engineering and biomaterial science. In this study, we printed poly(lactic-co-glycolic acid) (PLGA) utilizing a direct melt extrusion technique without additional ingredients. We investigated PLGA with various lactic acid:glycolic acid (LA:GA) molecular weight ratios and end caps to demonstrate the dependence of the extrusion process on the polymer composition. Micro-computed tomography was then used to evaluate printed scaffolds containing different LA:GA ratios, composed of different fiber patterns, and processed under different printing conditions. We built a statistical model to reveal the correlation and predominant factors that determine printing precision. Our model showed a strong linear relationship between the actual and predicted precision under different combinations of printing conditions and material compositions. This quantitative examination establishes a significant foreground to 3D print biomaterials following a systematic fabrication procedure. Additionally, our proposed statistical models can be applied to couple specific biomaterials and 3D printing applications for patient implants with particular requirements.
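
    The kind of statistical model described, print precision regressed on printing conditions, can be sketched with ordinary least squares on simulated data. The factor names, coefficients, and noise level are illustrative placeholders, not fitted values from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 60
pressure = rng.uniform(1, 5, n)        # hypothetical extrusion pressure (bar)
speed = rng.uniform(2, 10, n)          # hypothetical print-head speed (mm/s)
precision = 0.8 - 0.05 * pressure + 0.02 * speed + rng.normal(0, 0.01, n)

# Ordinary least-squares fit of precision on the printing parameters.
X = np.column_stack([np.ones(n), pressure, speed])
beta, *_ = np.linalg.lstsq(X, precision, rcond=None)
predicted = X @ beta

# Strong linear relationship between actual and predicted precision.
r = np.corrcoef(precision, predicted)[0, 1]
print(beta, round(r, 3))
```

    Material composition (e.g., LA:GA ratio) would enter the same model as additional regressors or interaction terms.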

  6. A quantitative model for transforming reflectance spectra into the Munsell color space using cone sensitivity functions and opponent process weights.

    PubMed

    D'Andrade, Roy G; Romney, A Kimball

    2003-05-13

    This article presents a computational model of the process through which the human visual system transforms reflectance spectra into perceptions of color. Using physical reflectance spectra data and standard human cone sensitivity functions we describe the transformations necessary for predicting the location of colors in the Munsell color space. These transformations include quantitative estimates of the opponent process weights needed to transform cone activations into Munsell color space coordinates. Using these opponent process weights, the Munsell position of specific colors can be predicted from their physical spectra with a mean correlation of 0.989.
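
    A toy version of this pipeline, with made-up spectra, cone sensitivity curves, and opponent weights: the paper estimates the opponent weights from Munsell data, and none of the numbers below are its fitted values.

```python
import numpy as np

wavelengths = np.linspace(400, 700, 31)                   # nm
reflectance = np.exp(-((wavelengths - 550) / 60.0) ** 2)  # toy greenish spectrum

def cone(center, width):
    """Toy Gaussian cone sensitivity curve (not a standard observer)."""
    return np.exp(-((wavelengths - center) / width) ** 2)

# Cone activations: sensitivity-weighted integrals of the reflectance spectrum.
dl = wavelengths[1] - wavelengths[0]
activations = np.array([(cone(c, w) * reflectance).sum() * dl
                        for c, w in [(565, 50), (540, 45), (445, 30)]])  # L, M, S

# Toy opponent weights: rows give red-green, blue-yellow, and luminance channels.
W = np.array([[ 1.0, -1.0,  0.0],
              [ 0.5,  0.5, -1.0],
              [ 0.5,  0.5,  0.0]])
opponent = W @ activations
print(opponent)   # coordinates in the (hypothetical) opponent color space
```

    In the paper, the analogous output coordinates are fitted so that predicted positions align with Munsell color space locations.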

  7. Analytical approximations of the firing rate of an adaptive exponential integrate-and-fire neuron in the presence of synaptic noise.

    PubMed

    Hertäg, Loreen; Durstewitz, Daniel; Brunel, Nicolas

    2014-01-01

    Computational models offer a unique tool for understanding the network-dynamical mechanisms which mediate between physiological and biophysical properties, and behavioral function. A traditional challenge in computational neuroscience is, however, that simple neuronal models which can be studied analytically fail to reproduce the diversity of electrophysiological behaviors seen in real neurons, while detailed neuronal models which do reproduce such diversity are intractable analytically and computationally expensive. A number of intermediate models have been proposed whose aim is to capture the diversity of firing behaviors and spike times of real neurons while entailing the simplest possible mathematical description. One such model is the exponential integrate-and-fire neuron with spike rate adaptation (aEIF) which consists of two differential equations for the membrane potential (V) and an adaptation current (w). Despite its simplicity, it can reproduce a wide variety of physiologically observed spiking patterns, can be fit to physiological recordings quantitatively, and, once done so, is able to predict spike times on traces not used for model fitting. Here we compute the steady-state firing rate of aEIF in the presence of Gaussian synaptic noise, using two approaches. The first approach is based on the 2-dimensional Fokker-Planck equation that describes the (V,w)-probability distribution, which is solved using an expansion in the ratio between the time constants of the two variables. The second is based on the firing rate of the EIF model, which is averaged over the distribution of the w variable. These analytically derived closed-form expressions were tested on simulations from a large variety of model cells quantitatively fitted to in vitro electrophysiological recordings from pyramidal cells and interneurons. Theoretical predictions closely agreed with the firing rate of the simulated cells fed with in-vivo-like synaptic noise.
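
    The aEIF model itself is straightforward to simulate directly. The forward-Euler sketch below uses a constant input current rather than Gaussian synaptic noise, with parameter values in the range commonly used for this model (assumptions, not the paper's fitted cells).

```python
import numpy as np

# aEIF: C dV/dt = -gL (V - EL) + gL DT exp((V - VT)/DT) - w + I
#       tau_w dw/dt = a (V - EL) - w;  on spike: V -> Vr, w -> w + b
C, gL, EL, VT, DT = 200.0, 10.0, -70.0, -50.0, 2.0      # pF, nS, mV, mV, mV
a, tau_w, b, Vr, Vpeak = 2.0, 100.0, 50.0, -65.0, 0.0   # nS, ms, pA, mV, mV
I, dt, T = 500.0, 0.05, 1000.0                          # pA, ms, ms

V, w, spikes = EL, 0.0, 0
for _ in range(int(T / dt)):
    dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= Vpeak:                      # spike: reset V, increment adaptation
        V, w, spikes = Vr, w + b, spikes + 1

rate = spikes / (T / 1000.0)
print(rate, "Hz")                        # steady-state firing rate estimate
```

    The paper's analytical expressions approximate exactly this steady-state rate, but with the constant drive replaced by Gaussian synaptic noise.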

  8. Engineering challenges of BioNEMS: the integration of microfluidics, micro- and nanodevices, models and external control for systems biology.

    PubMed

    Wikswo, J P; Prokop, A; Baudenbacher, F; Cliffel, D; Csukas, B; Velkovsky, M

    2006-08-01

    Systems biology, i.e. quantitative, postgenomic, postproteomic, dynamic, multiscale physiology, addresses in an integrative, quantitative manner the shockwave of genetic and proteomic information using computer models that may eventually have 10⁶ dynamic variables with non-linear interactions. Historically, single biological measurements are made over minutes, suggesting the challenge of specifying 10⁶ model parameters. Except for fluorescence and micro-electrode recordings, most cellular measurements have inadequate bandwidth to discern the time course of critical intracellular biochemical events. Micro-array expression profiles of thousands of genes cannot determine quantitative dynamic cellular signalling and metabolic variables. Major gaps must be bridged between the computational vision and experimental reality. The analysis of cellular signalling dynamics and control requires, first, micro- and nano-instruments that measure simultaneously multiple extracellular and intracellular variables with sufficient bandwidth; secondly, the ability to open existing internal control and signalling loops; thirdly, external BioMEMS micro-actuators that provide high bandwidth feedback and externally addressable intracellular nano-actuators; and, fourthly, real-time, closed-loop, single-cell control algorithms. The unravelling of the nested and coupled nature of cellular control loops requires simultaneous recording of multiple single-cell signatures. Externally controlled nano-actuators, needed to effect changes in the biochemical, mechanical and electrical environment both outside and inside the cell, will provide a major impetus for nanoscience.

  9. Predicting perceived visual complexity of abstract patterns using computational measures: The influence of mirror symmetry on complexity perception

    PubMed Central

    Leder, Helmut

    2017-01-01

    Visual complexity is relevant for many areas ranging from improving usability of technical displays or websites up to understanding aesthetic experiences. Therefore, many attempts have been made to relate objective properties of images to perceived complexity in artworks and other images. It has been argued that visual complexity is a multidimensional construct mainly consisting of two dimensions: A quantitative dimension that increases complexity through number of elements, and a structural dimension representing order negatively related to complexity. The objective of this work is to study human perception of visual complexity utilizing two large independent sets of abstract patterns. A wide range of computational measures of complexity was calculated, further combined using linear models as well as machine learning (random forests), and compared with data from human evaluations. Our results confirm the adequacy of existing two-factor models of perceived visual complexity consisting of a quantitative and a structural factor (in our case mirror symmetry) for both of our stimulus sets. In addition, a non-linear transformation of mirror symmetry giving more influence to small deviations from symmetry greatly increased explained variance. Thus, we again demonstrate the multidimensional nature of human complexity perception and present comprehensive quantitative models of the visual complexity of abstract patterns, which might be useful for future experiments and applications. PMID:29099832
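
    As a toy illustration of the two-factor idea (a quantitative term plus a structural, symmetry-based term) and of comparing a linear model against a random forest, the sketch below uses entirely synthetic ratings; the feature names, the power-transform exponent, and the coefficients are invented for the example, not taken from the study:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 500
n_elements = rng.integers(5, 100, n).astype(float)  # quantitative dimension
symmetry = rng.uniform(0.0, 1.0, n)                 # mirror-symmetry score in [0, 1]

# Simulated ratings: complexity grows with element count and falls with order;
# the power transform gives small deviations from perfect symmetry extra weight.
sym_nl = 1.0 - (1.0 - symmetry) ** 0.25
ratings = 0.04 * n_elements - 3.0 * sym_nl + rng.normal(0.0, 0.3, n)

X = np.column_stack([n_elements, symmetry])
lin = LinearRegression().fit(X, ratings)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, ratings)
lin_r2 = r2_score(ratings, lin.predict(X))
rf_r2 = r2_score(ratings, rf.predict(X))
```

    The gap between the two in-sample R^2 values mirrors the abstract's point: a non-linear treatment of near-symmetry captures variance that a purely linear two-factor model misses.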

  10. An empirical generative framework for computational modeling of language acquisition.

    PubMed

    Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-06-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.

  11. Quantitative Radiomics System Decoding the Tumor Phenotype | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Our goal is to construct a publicly available computational radiomics system for the objective and automated extraction of quantitative imaging features that we believe will yield biomarkers of greater prognostic value compared with routinely extracted descriptors of tumor size. We will create a generalized, open, portable, and extensible radiomics platform that is widely applicable across cancer types and imaging modalities and describe how we will use lung and head and neck cancers as models to validate our developments.

  12. Plant hormone signaling during development: insights from computational models.

    PubMed

    Oliva, Marina; Farcot, Etienne; Vernoux, Teva

    2013-02-01

    Recent years have seen an impressive increase in our knowledge of the topology of plant hormone signaling networks. The complexity of these topologies has motivated the development of models for several hormones to aid understanding of how signaling networks process hormonal inputs. Such work has generated essential insights into the mechanisms of hormone perception and of regulation of cellular responses such as transcription in response to hormones. In addition, modeling approaches have contributed significantly to exploring how spatio-temporal regulation of hormone signaling contributes to plant growth and patterning. New tools have also been developed to obtain quantitative information on hormone distribution during development and to test model predictions, opening the way for quantitative understanding of the developmental roles of hormones. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
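
    The contrast between exact case deletion (ECD) and the empirical influence function (EIF) can be illustrated on a far simpler estimator than a QTL variance component: an ordinary least-squares slope. Both diagnostics flag the same planted outlier, and the EIF does so without refitting; everything here is synthetic and only sketches the idea:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
x = rng.normal(size=n)
x[0] = 3.0                       # give the planted outlier high leverage
y = 2.0 * x + rng.normal(scale=0.5, size=n)
y[0] += 6.0                      # plant one outlier in the trait value

def slope(xs, ys):
    """Ordinary least-squares slope (stand-in for the ML parameter estimate)."""
    xc, yc = xs - xs.mean(), ys - ys.mean()
    return (xc @ yc) / (xc @ xc)

full = slope(x, y)

# Exact case deletion (ECD): refit the model n times, once per left-out case.
ecd = np.array([full - slope(np.delete(x, i), np.delete(y, i)) for i in range(n)])

# Empirical influence function (EIF): a one-step approximation computed from
# the full fit alone, avoiding the n refits (hence its large speed advantage).
xc, yc = x - x.mean(), y - y.mean()
eif = xc * (yc - full * xc) / (xc @ xc)

top_ecd = int(np.argmax(np.abs(ecd)))   # index flagged by exact deletion
top_eif = int(np.argmax(np.abs(eif)))   # index flagged by the approximation
```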

  14. HpQTL: a geometric morphometric platform to compute the genetic architecture of heterophylly.

    PubMed

    Sun, Lidan; Wang, Jing; Zhu, Xuli; Jiang, Libo; Gosik, Kirk; Sang, Mengmeng; Sun, Fengsuo; Cheng, Tangren; Zhang, Qixiang; Wu, Rongling

    2017-02-15

    Heterophylly, i.e. morphological changes in leaves along the axis of an individual plant, is regarded as a strategy used by plants to cope with environmental change. However, little is known of the extent to which heterophylly is controlled by genes and how each underlying gene exerts its effect on heterophyllous variation. We described a geometric morphometric model that can quantify heterophylly in plants and further constructed an R-based computing platform by integrating this model into a genetic mapping and association setting. The platform, named HpQTL, allows specific quantitative trait loci mediating heterophyllous variation to be mapped throughout the genome. The statistical properties of HpQTL were examined and validated via computer simulation. Its biological relevance was demonstrated by results from a real data analysis of heterophylly in a woody plant, mei (Prunus mume). HpQTL provides a powerful tool to analyze heterophylly and its underlying genetic architecture in a quantitative manner. It also contributes a new approach for genome-wide association studies aimed at dissecting the programmed regulation of plant development and evolution. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Computer modeling of lung cancer diagnosis-to-treatment process

    PubMed Central

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick

    2015-01-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure for deriving closed formulas to evaluate diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
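
    As a minimal sketch of the Markov-chain approach, the standard absorbing-chain formalism gives the expected time from each care stage to treatment via the fundamental matrix. The stages and the weekly transition probabilities below are hypothetical, chosen only to illustrate the calculation, not values from the paper:

```python
import numpy as np

# Hypothetical four-stage care pathway: Referral -> Diagnosis -> Staging -> Surgery
# (absorbing). Entries are weekly transition probabilities (illustrative only).
P = np.array([
    [0.4, 0.6, 0.0, 0.0],   # Referral: 60% chance of reaching diagnosis each week
    [0.0, 0.5, 0.5, 0.0],   # Diagnosis -> staging
    [0.0, 0.0, 0.3, 0.7],   # Staging -> surgery
    [0.0, 0.0, 0.0, 1.0],   # Surgery (absorbing state)
])

Q = P[:3, :3]                          # transient-to-transient block
N = np.linalg.inv(np.eye(3) - Q)       # fundamental matrix
expected_weeks = N.sum(axis=1)         # expected time to surgery from each stage
```

    Closed formulas of the kind mentioned in the abstract follow the same logic: for the last transient stage here, the expected delay is simply 1/0.7 weeks.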

  16. Quantitative Study on Corrosion of Steel Strands Based on Self-Magnetic Flux Leakage.

    PubMed

    Xia, Runchuan; Zhou, Jianting; Zhang, Hong; Liao, Leng; Zhao, Ruiqiang; Zhang, Zeyu

    2018-05-02

    This paper proposed a new computing method to quantitatively and non-destructively determine the corrosion of steel strands by analyzing the self-magnetic flux leakage (SMFL) signals from them. The magnetic dipole model and three growth models (Logistic model, Exponential model, and Linear model) were proposed to theoretically analyze the characteristic value of SMFL. Then, the experimental study on the corrosion detection by the magnetic sensor was carried out. The setup of the magnetic scanning device and signal collection method were also introduced. The results show that the Logistic Growth model is verified as the optimal model for calculating the magnetic field with good fitting effects. Combined with the experimental data analysis, the amplitudes of the calculated values (B_xL(x,z) curves) agree with the measured values in general. This method provides significant application prospects for the evaluation of the corrosion and the residual bearing capacity of steel strand.
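
    A sketch of fitting a Logistic growth model to a characteristic SMFL value is shown below, using scipy's non-linear least squares. All data are synthetic stand-ins; the paper's actual signals and fitted parameters are not reproduced here:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: characteristic SMFL value as corrosion progresses."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Synthetic stand-in data: corrosion time vs. characteristic SMFL amplitude
t = np.linspace(0.0, 20.0, 30)
rng = np.random.default_rng(0)
data = logistic(t, 5.0, 0.6, 10.0) + rng.normal(0.0, 0.05, t.size)

popt, _ = curve_fit(logistic, t, data, p0=[4.0, 0.5, 8.0])
K, r, t0 = popt                  # recovered saturation level, rate, and midpoint
```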

  17. Agent-Based Computational Modeling of Cell Culture: Understanding Dosimetry In Vitro as Part of In Vitro to In Vivo Extrapolation

    EPA Science Inventory

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software, CompuCell3D (CC3D), to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assu...

  18. Overview: early history of crop growth and photosynthesis modeling.

    PubMed

    El-Sharkawy, Mabrouk A

    2011-02-01

    As in industrial and engineering systems, there is a need to quantitatively study and analyze the many constituents of complex natural biological systems as well as agro-ecosystems via research-based mechanistic modeling. This objective is normally addressed by developing mathematically built descriptions of multilevel biological processes to provide biologists a means to integrate quantitatively experimental research findings that might lead to a better understanding of the whole systems and their interactions with surrounding environments. Aided with the power of computational capacities associated with computer technology then available, pioneering cropping systems simulations took place in the second half of the 20th century by several research groups across continents. This overview summarizes that initial pioneering effort made to simulate plant growth and photosynthesis of crop canopies, focusing on the discovery of gaps that exist in the current scientific knowledge. Examples are given for those gaps where experimental research was needed to improve the validity and application of the constructed models, so that their benefit to mankind was enhanced. Such research necessitates close collaboration among experimentalists and model builders while adopting a multidisciplinary/inter-institutional approach. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Evolution of epigenetic regulation in vertebrate genomes

    PubMed Central

    Lowdon, Rebecca F.; Jang, Hyo Sik; Wang, Ting

    2016-01-01

    Empirical models of sequence evolution have spurred progress in the field of evolutionary genetics for decades. We are now realizing the importance and complexity of the eukaryotic epigenome. While epigenome analysis has been applied to genomes from single cell eukaryotes to human, comparative analyses are still relatively few, and computational algorithms to quantify epigenome evolution remain scarce. Accordingly, a quantitative model of epigenome evolution remains to be established. Here we review the comparative epigenomics literature and synthesize its overarching themes. We also suggest one mechanism, transcription factor binding site turnover, which relates sequence evolution to epigenetic conservation or divergence. Lastly, we propose a framework for how the field can move forward to build a coherent quantitative model of epigenome evolution. PMID:27080453

  20. Field Scale Monitoring and Modeling of Water and Chemical Transfer in the Vadose Zone

    USDA-ARS?s Scientific Manuscript database

    Natural resource systems involve highly complex interactions of soil-plant-atmosphere-management components that are extremely difficult to quantitatively describe. Computer simulations for prediction and management of watersheds, water supply areas, and agricultural fields and farms have become inc...

  1. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE PAGES

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; ...

    2016-08-29

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticles activity and toxicity: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA/Comparative Molecular Similarity Indices Analysis, CoMSIA analysis employed for nanomaterials) have been briefly summarized. Both approaches were compared according to the selected criteria, including: efficiency, type of experimental data, class of nanomaterials, time required for calculations and computational cost, difficulties in the interpretation. Taking into account the advantages and limitations of each method, we provide the recommendations for nano-QSAR modellers and QSAR model users to be able to determine a proper and efficient methodology to investigate biological activity of nanoparticles in order to describe the underlying interactions in the most reliable and useful manner.

  2. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticles activity and toxicity: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA/Comparative Molecular Similarity Indices Analysis, CoMSIA analysis employed for nanomaterials) have been briefly summarized. Both approaches were compared according to the selected criteria, including: efficiency, type of experimental data, class of nanomaterials, time required for calculations and computational cost, difficulties in the interpretation. Taking into account the advantages and limitations of each method, we provide the recommendations for nano-QSAR modellers and QSAR model users to be able to determine a proper and efficient methodology to investigate biological activity of nanoparticles in order to describe the underlying interactions in the most reliable and useful manner.

  3. Exploratory of society

    NASA Astrophysics Data System (ADS)

    Cederman, L.-E.; Conte, R.; Helbing, D.; Nowak, A.; Schweitzer, F.; Vespignani, A.

    2012-11-01

    A huge flow of quantitative social, demographic and behavioral data is becoming available that traces the activities and interactions of individuals, social patterns, transportation infrastructures and travel fluxes. This has caused, together with innovative computational techniques and methods for modeling social actions in hybrid (natural and artificial) societies, a qualitative change in the ways we model socio-technical systems. For the first time, society can be studied in a comprehensive fashion that addresses social and behavioral complexity. In other words we are in the position to envision the development of large data and computational cyber infrastructure defining an exploratory of society that provides quantitative anticipatory, explanatory and scenario analysis capabilities ranging from emerging infectious disease to conflict and crime surges. The goal of the exploratory of society is to provide the basic infrastructure embedding the framework of tools and knowledge needed for the design of forecast/anticipatory/crisis management approaches to socio-technical systems, supporting future decision making procedures by accelerating the scientific cycle that goes from data generation to predictions.

  4. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  5. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE PAGES

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; ...

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  6. Quantitative DFT modeling of product concentration in organometallic reactions: Cu-mediated pentafluoroethylation of benzoic acid chlorides as a case study.

    PubMed

    Jover, Jesús

    2017-11-08

    DFT calculations are widely used for computing properties, reaction mechanisms and energy profiles in organometallic reactions. A qualitative agreement between the experimental and the calculated results seems to usually be enough to validate a computational methodology but recent advances in computation indicate that a nearly quantitative agreement should be possible if an appropriate DFT study is carried out. Final percent product concentrations, often reported as yields, are by far the most commonly reported properties in experimental metal-mediated synthesis studies but reported DFT studies have not focused on predicting absolute product amounts. The recently reported stoichiometric pentafluoroethylation of benzoic acid chlorides (R-C6H4COCl) with [(phen)Cu(PPh3)C2F5] (phen = 1,10-phenanthroline, PPh3 = triphenylphosphine) has been used as a case study to check whether the experimental product concentrations can be reproduced by any of the most popular DFT approaches with high enough accuracy. To this end, the Gibbs energy profile for the pentafluoroethylation of benzoic acid chloride has been computed using 14 different DFT methods. These computed Gibbs energy profiles have been employed to build kinetic models predicting the final product concentration in solution. The best results are obtained with the D3-dispersion corrected B3LYP functional, which has been successfully used afterwards to model the reaction outcomes of other simple (R = o-Me, p-Me, p-Cl, p-F, etc.) benzoic acid chlorides. The product concentrations of more complex reaction networks in which more than one position of the substrate may be activated by the copper catalyst (R = o-Br and p-I) are also predicted appropriately.
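
    The kinetic-modeling step, turning a computed Gibbs-energy barrier into a predicted conversion, can be sketched with the Eyring equation plus first-order kinetics. The 22 kcal/mol barrier and the 1 h reaction time below are illustrative assumptions, not values from the study:

```python
import numpy as np

def eyring_rate(dG_act, T=298.15):
    """Eyring equation: k = (kB*T/h) * exp(-dG_act / (R*T)), dG_act in kcal/mol."""
    kB, h = 1.380649e-23, 6.62607015e-34       # Boltzmann (J/K), Planck (J*s)
    R = 1.987204e-3                            # gas constant, kcal/(mol*K)
    return (kB * T / h) * np.exp(-dG_act / (R * T))

dG_act = 22.0          # hypothetical computed barrier (kcal/mol), not from the paper
k = eyring_rate(dG_act)                        # first-order rate constant, 1/s

# First-order kinetics: fraction of substrate converted to product after time t
t = 3600.0                                     # 1 hour
conversion = 1.0 - np.exp(-k * t)
```

    A full kinetic model of the reaction network would replace the single exponential with a system of coupled rate equations, one per elementary step on the computed profile.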

  7. Toward the prediction of class I and II mouse major histocompatibility complex-peptide-binding affinity: in silico bioinformatic step-by-step guide using quantitative structure-activity relationships.

    PubMed

    Hattotuwagama, Channa K; Doytchinova, Irini A; Flower, Darren R

    2007-01-01

    Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative, classification methods, but these are now giving way to quantitative regression methods. We review three methods. The first two--a 2D-QSAR additive-partial least squares (PLS) method and a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method--can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets. The third is an iterative self-consistent (ISC) PLS-based additive method, a recently developed extension to the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we provide a step-by-step guide to making such predictions; the reliability of the resulting models represents an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online at the URL http://www.jenner.ac.uk/MHCPred.

  8. Bone Health Monitoring in Astronauts: Recommended Use of Quantitative Computed Tomography [QCT] for Clinical and Operational Decisions

    NASA Technical Reports Server (NTRS)

    Sibonga, J. D.; Truskowski, P.

    2010-01-01

    This slide presentation reviews the concerns that astronauts in long duration flights might have a greater risk of bone fracture as they age than the general population. A panel of experts was convened to review the information and recommend mechanisms to monitor the health of bones in astronauts. The use of Quantitative Computed Tomography (QCT) scans for risk surveillance to detect the clinical trigger and to inform countermeasure evaluation is reviewed. An added benefit of QCT is that it facilitates an individualized estimation of bone strength by Finite Element Modeling (FEM), that can inform approaches for bone rehabilitation. The use of FEM is reviewed as a process that arrives at a composite number to estimate bone strength, because it integrates multiple factors.

  9. Internet messenger based smart virtual class learning using ubiquitous computing

    NASA Astrophysics Data System (ADS)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education. IM makes it possible for students to engage in learning and collaborating in smart virtual class learning (SVCL) using ubiquitous computing. However, the model of IM-based smart virtual class learning using ubiquitous computing, and empirical evidence that would favor a broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in a smart class cannot be confirmed, because the majority of the reviewed studies followed instruction paradigms. This article aims to present the model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about the engagement and behavior aspects and their contribution to learning.

  10. Quantitative comparison of hemodynamics in simulated and 3D angiography models of cerebral aneurysms by use of computational fluid dynamics.

    PubMed

    Saho, Tatsunori; Onishi, Hideo

    2015-07-01

    In this study, we evaluated hemodynamics using simulated models and determined how cerebral aneurysms develop in simulated and patient-specific models based on medical images. Computational fluid dynamics (CFD) was analyzed by use of OpenFOAM software. Flow velocity, stream line, and wall shear stress (WSS) were evaluated in a simulated model aneurysm with known geometry and in a three-dimensional angiographic model. The ratio of WSS at the aneurysm compared with that at the basilar artery was 1:10 in simulated model aneurysms with a diameter of 10 mm and 1:18 in the angiographic model, indicating similar tendencies. Vortex flow occurred in both model aneurysms, and the WSS decreased in larger model aneurysms. The angiographic model provided accurate CFD information, and the tendencies of simulated and angiographic models were similar. These findings indicate that hemodynamic effects are involved in the development of aneurysms.

  11. Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies

    PubMed Central

    Quinn, T. Alexander; Kohl, Peter

    2013-01-01

    Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215

  12. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatments. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit of EOC patients with or without receiving bevacizumab-based chemotherapy treatment using multivariate statistical models built on quantitative adiposity image features. A dataset involving CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment visceral fat areas (VFA) and subcutaneous fat areas (SFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were performed respectively to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that, using all 3 statistical models, a statistically significant association was detected between the model-generated results and both of the two clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association for either PFS or OS in the group of patients not receiving maintenance bevacizumab. Therefore, this study demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
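
    Of the three statistical models, the logistic-regression step can be sketched as follows. The fat-area values, the outcome-generating rule, and the three-feature set are synthetic stand-ins invented for this illustration, since the study's CAD-derived features are not available here:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 59                              # cohort size from the study; data below are synthetic
vfa = rng.normal(150.0, 40.0, n)    # visceral fat area, cm^2 (hypothetical values)
sfa = rng.normal(220.0, 60.0, n)    # subcutaneous fat area, cm^2 (hypothetical values)
ratio = vfa / sfa
X = np.column_stack([vfa, sfa, ratio])

# Synthetic outcome rule: a higher visceral-to-subcutaneous ratio lowers the
# odds of clinical benefit (an assumption made only for this illustration).
p = 1.0 / (1.0 + np.exp(-(2.0 - 4.0 * ratio)))
y = (rng.uniform(size=n) < p).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X, y)
accuracy = clf.score(X, y)          # in-sample accuracy of the benefit classifier
```

    The PFS and OS analyses in the study would instead use the Cox proportional hazards model, which handles censored survival times rather than a binary benefit label.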

  13. Introductory life science mathematics and quantitative neuroscience courses.

    PubMed

    Duffus, Dwight; Olifer, Andrei

    2010-01-01

    We describe two sets of courses designed to enhance the mathematical, statistical, and computational training of life science undergraduates at Emory College. The first course is an introductory sequence in differential and integral calculus, modeling with differential equations, probability, and inferential statistics. The second is an upper-division course in computational neuroscience. We provide a description of each course, detailed syllabi, examples of content, and a brief discussion of the main issues encountered in developing and offering the courses.

  14. GUI to Facilitate Research on Biological Damage from Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, Frances A.; Ponomarev, Artem Lvovich

    2010-01-01

    A graphical-user-interface (GUI) computer program has been developed to facilitate research on the damage caused by highly energetic particles and photons impinging on living organisms. The program brings together, into one computational workspace, computer codes that have been developed over the years, plus codes that will be developed during the foreseeable future, to address diverse aspects of radiation damage. These include codes that implement radiation-track models, codes for biophysical models of breakage of deoxyribonucleic acid (DNA) by radiation, pattern-recognition programs for extracting quantitative information from biological assays, and image-processing programs that aid visualization of DNA breaks. The radiation-track models are based on transport models of interactions of radiation with matter and solution of the Boltzmann transport equation by use of both theoretical and numerical models. The biophysical models of breakage of DNA by radiation include biopolymer coarse-grained and atomistic models of DNA, stochastic- process models of deposition of energy, and Markov-based probabilistic models of placement of double-strand breaks in DNA. The program is designed for use in the NT, 95, 98, 2000, ME, and XP variants of the Windows operating system.

  15. Effects of Scan Resolutions and Element Sizes on Bovine Vertebral Mechanical Parameters from Quantitative Computed Tomography-Based Finite Element Analysis

    PubMed Central

    Zhang, Meng; Gao, Jiazi; Huang, Xu; Zhang, Min; Liu, Bei

    2017-01-01

Quantitative computed tomography-based finite element analysis (QCT/FEA) has been developed to predict vertebral strength. However, QCT/FEA models may differ with scan resolution and element size. The aim of this study was to explore the effects of scan resolutions and element sizes on QCT/FEA outcomes. Nine bovine vertebral bodies were scanned using a clinical CT scanner and reconstructed from datasets with two slice thicknesses, that is, 0.6 mm (PA resolution) and 1 mm (PB resolution). There were significantly linear correlations between the predicted and measured principal strains (R2 > 0.7, P < 0.0001), and the predicted vertebral strength and stiffness were modestly correlated with the experimental values (R2 > 0.6, P < 0.05). Two different resolutions and six different element sizes were combined in pairs, and finite element (FE) models of bovine vertebral cancellous bone in the 12 cases were obtained. The mechanical parameters of FE models with the PB resolution were similar to those with the PA resolution. The computational accuracy of FE models with element sizes of 0.41 × 0.41 × 0.6 mm3 and 0.41 × 0.41 × 1 mm3 was higher, as judged by comparing the apparent elastic modulus and yield strength. Therefore, scan resolution and element size should be chosen optimally to improve the accuracy of QCT/FEA. PMID:29065624
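A minimal sketch of the kind of agreement statistic quoted above (R2 between measured and FE-predicted quantities); the strain values below are invented for illustration:

```python
import numpy as np

def r_squared(measured, predicted):
    """Coefficient of determination: how much measured variance the
    predictions explain (1.0 means perfect agreement)."""
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - np.mean(measured)) ** 2)
    return 1.0 - ss_res / ss_tot

measured = np.array([100.0, 150.0, 200.0, 250.0, 300.0])   # e.g. microstrain
predicted = np.array([110.0, 140.0, 210.0, 240.0, 310.0])  # FE predictions
r2 = r_squared(measured, predicted)  # 0.98 for these numbers
```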

  16. Computationally Efficient Multiscale Reactive Molecular Dynamics to Describe Amino Acid Deprotonation in Proteins

    PubMed Central

    2016-01-01

An important challenge in the simulation of biomolecular systems is a quantitative description of the protonation and deprotonation process of amino acid residues. Despite the seeming simplicity of adding or removing a positively charged hydrogen nucleus, simulating the actual protonation/deprotonation process is inherently difficult. It requires both the explicit treatment of the excess proton, including its charge defect delocalization and Grotthuss shuttling through inhomogeneous moieties (water and amino residues), and extensive sampling of coupled condensed phase motions. In a recent paper (J. Chem. Theory Comput. 2014, 10, 2729−2737), a multiscale approach was developed to map high-level quantum mechanics/molecular mechanics (QM/MM) data into a multiscale reactive molecular dynamics (MS-RMD) model in order to describe amino acid deprotonation in bulk water. In this article, we extend the fitting approach (called FitRMD) to create MS-RMD models for ionizable amino acids within proteins. The resulting models are shown to faithfully reproduce the free energy profiles of the reference QM/MM Hamiltonian for PT inside an example protein, the ClC-ec1 H+/Cl– antiporter. Moreover, we show that the resulting MS-RMD models are computationally efficient enough to then characterize more complex 2-dimensional free energy surfaces due to slow degrees of freedom such as water hydration of internal protein cavities that can be inherently coupled to the excess proton charge translocation. The FitRMD method is thus shown to be an effective way to map ab initio level accuracy into a much more computationally efficient reactive MD method in order to explicitly simulate and quantitatively describe amino acid protonation/deprotonation in proteins. PMID:26734942

  17. Computationally Efficient Multiscale Reactive Molecular Dynamics to Describe Amino Acid Deprotonation in Proteins.

    PubMed

    Lee, Sangyun; Liang, Ruibin; Voth, Gregory A; Swanson, Jessica M J

    2016-02-09

    An important challenge in the simulation of biomolecular systems is a quantitative description of the protonation and deprotonation process of amino acid residues. Despite the seeming simplicity of adding or removing a positively charged hydrogen nucleus, simulating the actual protonation/deprotonation process is inherently difficult. It requires both the explicit treatment of the excess proton, including its charge defect delocalization and Grotthuss shuttling through inhomogeneous moieties (water and amino residues), and extensive sampling of coupled condensed phase motions. In a recent paper (J. Chem. Theory Comput. 2014, 10, 2729-2737), a multiscale approach was developed to map high-level quantum mechanics/molecular mechanics (QM/MM) data into a multiscale reactive molecular dynamics (MS-RMD) model in order to describe amino acid deprotonation in bulk water. In this article, we extend the fitting approach (called FitRMD) to create MS-RMD models for ionizable amino acids within proteins. The resulting models are shown to faithfully reproduce the free energy profiles of the reference QM/MM Hamiltonian for PT inside an example protein, the ClC-ec1 H(+)/Cl(-) antiporter. Moreover, we show that the resulting MS-RMD models are computationally efficient enough to then characterize more complex 2-dimensional free energy surfaces due to slow degrees of freedom such as water hydration of internal protein cavities that can be inherently coupled to the excess proton charge translocation. The FitRMD method is thus shown to be an effective way to map ab initio level accuracy into a much more computationally efficient reactive MD method in order to explicitly simulate and quantitatively describe amino acid protonation/deprotonation in proteins.

  18. Combining Experiments and Simulations Using the Maximum Entropy Principle

    PubMed Central

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges. PMID:24586124
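A toy illustration of the maximum entropy reweighting idea discussed above: given per-frame values of one simulated observable, find the minimally perturbed weights whose weighted average matches an experimental value. The single-observable bisection solver and the numbers are illustrative assumptions, not the procedure of any specific paper:

```python
import numpy as np

def maxent_reweight(f, target, lam_lo=-50.0, lam_hi=50.0, tol=1e-10):
    """Maximum entropy correction for one observable: weights
    w_i ∝ exp(-lam * f_i), with lam chosen so the weighted mean hits target."""
    def weighted_mean(lam):
        w = np.exp(-lam * (f - f.mean()))  # shift for numerical stability
        w /= w.sum()
        return w @ f, w
    # The weighted mean decreases monotonically in lam, so bisect on lam.
    for _ in range(200):
        mid = 0.5 * (lam_lo + lam_hi)
        m, w = weighted_mean(mid)
        if m > target:
            lam_lo = mid
        else:
            lam_hi = mid
        if abs(m - target) < tol:
            break
    return w

f = np.array([1.0, 2.0, 3.0, 4.0])   # observable value per simulation frame
w = maxent_reweight(f, target=2.0)   # "experiment" says the average is 2.0
```

The reweighted ensemble reproduces the target average while staying as close as possible (in relative entropy) to the original uniform weights.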

  19. Bone Mass in Boys with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Calarge, Chadi A.; Schlechte, Janet A.

    2017-01-01

To examine bone mass in children and adolescents with autism spectrum disorder (ASD), risperidone-treated 5- to 17-year-old males underwent anthropometric and bone measurements, using dual-energy X-ray absorptiometry and peripheral quantitative computed tomography. Multivariable linear regression analysis models examined whether skeletal outcomes…

  20. Introductory Life Science Mathematics and Quantitative Neuroscience Courses

    ERIC Educational Resources Information Center

    Duffus, Dwight; Olifer, Andrei

    2010-01-01

    We describe two sets of courses designed to enhance the mathematical, statistical, and computational training of life science undergraduates at Emory College. The first course is an introductory sequence in differential and integral calculus, modeling with differential equations, probability, and inferential statistics. The second is an…

  1. We favor formal models of heuristics rather than lists of loose dichotomies: a reply to Evans and Over

    PubMed Central

    Gigerenzer, Gerd

    2009-01-01

    In their comment on Marewski et al. (good judgments do not require complex cognition, 2009) Evans and Over (heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer, 2009) conjectured that heuristics can often lead to biases and are not error free. This is a most surprising critique. The computational models of heuristics we have tested allow for quantitative predictions of how many errors a given heuristic will make, and we and others have measured the amount of error by analysis, computer simulation, and experiment. This is clear progress over simply giving heuristics labels, such as availability, that do not allow for quantitative comparisons of errors. Evans and Over argue that the reason people rely on heuristics is the accuracy-effort trade-off. However, the comparison between heuristics and more effortful strategies, such as multiple regression, has shown that there are many situations in which a heuristic is more accurate with less effort. Finally, we do not see how the fast and frugal heuristics program could benefit from a dual-process framework unless the dual-process framework is made more precise. Instead, the dual-process framework could benefit if its two “black boxes” (Type 1 and Type 2 processes) were substituted by computational models of both heuristics and other processes. PMID:19784854

  2. Computational simulation of extravehicular activity dynamics during a satellite capture attempt.

    PubMed

    Schaffner, G; Newman, D J; Robinson, S K

    2000-01-01

    A more quantitative approach to the analysis of astronaut extravehicular activity (EVA) tasks is needed because of their increasing complexity, particularly in preparation for the on-orbit assembly of the International Space Station. Existing useful EVA computer analyses produce either high-resolution three-dimensional computer images based on anthropometric representations or empirically derived predictions of astronaut strength based on lean body mass and the position and velocity of body joints but do not provide multibody dynamic analysis of EVA tasks. Our physics-based methodology helps fill the current gap in quantitative analysis of astronaut EVA by providing a multisegment human model and solving the equations of motion in a high-fidelity simulation of the system dynamics. The simulation work described here improves on the realism of previous efforts by including three-dimensional astronaut motion, incorporating joint stops to account for the physiological limits of range of motion, and incorporating use of constraint forces to model interaction with objects. To demonstrate the utility of this approach, the simulation is modeled on an actual EVA task, namely, the attempted capture of a spinning Intelsat VI satellite during STS-49 in May 1992. Repeated capture attempts by an EVA crewmember were unsuccessful because the capture bar could not be held in contact with the satellite long enough for the capture latches to fire and successfully retrieve the satellite.

  3. Quantitative Adverse Outcome Pathways and Their ...

    EPA Pesticide Factsheets

A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan
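A schematic sketch of the chained-model idea behind a qAOP, plus a toxic-equivalence (TEQ) conversion. Every functional form and constant here (Hill curve, IC50, relative potency factor, fecundity and survival numbers) is a hypothetical placeholder, not the published model:

```python
def hill_inhibition(conc, ic50, n=1.0):
    """Fractional inhibition of the MIE (e.g. aromatase) as a Hill curve."""
    return conc ** n / (ic50 ** n + conc ** n)

def fecundity(inhibition, f_max=30.0):
    """Toy key-event relationship: eggs/female/day falls with inhibition."""
    return f_max * (1.0 - inhibition)

def population_growth(fec, survival=0.01):
    """Toy fecundity-dependent population growth index."""
    return fec * survival

# Toxic equivalence: express an untested inhibitor through a relative
# potency factor (RPF) against the reference chemical. All numbers invented.
ref_ic50 = 10.0        # hypothetical reference IC50
rpf = 0.1              # hypothetical relative potency of the untested chemical
conc_untested = 50.0
teq = conc_untested * rpf              # reference-equivalent concentration
growth = population_growth(fecundity(hill_inhibition(teq, ref_ic50)))
```

The point is structural: each model's output is the next model's input, so a dose at the MIE propagates to a population-level prediction.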

  4. Estimation of equivalence ratio distribution in diesel spray using a computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Suzuki, Yasumasa; Tsujimura, Taku; Kusaka, Jin

    2014-08-01

It is important to understand the mechanism of mixing and atomization of the diesel spray. In addition, computational prediction of the mixing behavior and internal structure of a diesel spray is expected to promote further understanding of diesel sprays and the development of diesel engines, including fuel injection devices. In this study, we predicted the formation of diesel fuel spray with a 3D-CFD code and validated the application by comparison with experimental results for fuel spray behavior and equivalence ratio, visualized by Rayleigh-scatter imaging, under several ambient, injection, and fuel conditions. Using applicable constants of the KH-RT model, we can predict the liquid length of the spray on a quantitative level under various fuel injection, ambient, and fuel conditions. On the other hand, the changes of vapor penetration, fuel mass fraction, and equivalence ratio distribution with changing fuel injection and ambient conditions could not be predicted quantitatively. The 3D-CFD code used in this study predicts the spray cone angle and the entrainment of ambient gas excessively; therefore, the prediction accuracy could be improved by refining the fuel droplet breakup and evaporation models and by quantitative prediction of the spray cone angle.

  5. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  6. Synthesis, Spectra, and Theoretical Investigations of 1,3,5-Triazines Compounds as Ultraviolet Rays Absorber Based on Time-Dependent Density Functional Calculations and three-Dimensional Quantitative Structure-Property Relationship.

    PubMed

    Wang, Xueding; Xu, Yilian; Yang, Lu; Lu, Xiang; Zou, Hao; Yang, Weiqing; Zhang, Yuanyuan; Li, Zicheng; Ma, Menglin

    2018-03-01

A series of 1,3,5-triazines were synthesized and their UV absorption properties were tested. Computational chemistry methods were used to construct a quantitative structure-property relationship (QSPR), which was used for computer-aided design of new 1,3,5-triazine ultraviolet absorber compounds. The experimental UV absorption data are in good agreement with those predicted using time-dependent density functional theory (TD-DFT) [B3LYP/6-311 + G(d,p)]. A suitable forecasting model (R > 0.8, P < 0.0001) was obtained. A predictive three-dimensional quantitative structure-property relationship (3D-QSPR) model was established using the multifit molecular alignment rule of the Sybyl program, whose conclusions are consistent with the TD-DFT calculations. The exceptional photostability mechanism of such ultraviolet absorber compounds was studied and attributed principally to their ability to undergo excited-state deactivation via ultrafast excited-state intramolecular proton transfer (ESIPT). The intramolecular hydrogen bond (IMHB) of the 1,3,5-triazine compounds is the basis for the excited-state proton transfer, which was explored by IR spectroscopy, UV spectra, structural and energetic aspects of different conformers, and frontier molecular orbital analysis.

  7. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  8. A Computational Model of Linguistic Humor in Puns.

    PubMed

    Kao, Justine T; Levy, Roger; Goodman, Noah D

    2016-07-01

    Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we propose two information-theoretic measures-ambiguity and distinctiveness-derived from a simple model of sentence processing. We test these measures on a set of puns and regular sentences and show that they correlate significantly with human judgments of funniness. Moreover, within a set of puns, the distinctiveness measure distinguishes exceptionally funny puns from mediocre ones. Our work is the first, to our knowledge, to integrate a computational model of general language understanding and humor theory to quantitatively predict humor at a fine-grained level. We present it as an example of a framework for applying models of language processing to understand higher level linguistic and cognitive phenomena. © 2015 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
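The paper derives its ambiguity measure from a model of sentence processing; a schematic, entropy-based analogue (with invented meaning posteriors) is:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Posterior over the two candidate meanings of a sentence: a balanced
# posterior means high ambiguity, a skewed one low ambiguity. The numbers
# are hypothetical, not fitted model outputs.
pun_posterior = [0.5, 0.5]      # pun keeps both meanings alive
plain_posterior = [0.95, 0.05]  # ordinary sentence settles on one meaning
ambiguity_pun = entropy(pun_posterior)      # 1.0 bit
ambiguity_plain = entropy(plain_posterior)  # ~0.29 bit
```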

  9. QSAR Methods.

    PubMed

    Gini, Giuseppina

    2016-01-01

In this chapter, we introduce the basics of computational chemistry and discuss how computational methods have been extended to some biological properties and, in particular, toxicology. For about 20 years, chemical experimentation has been more and more replaced by modeling and virtual experimentation, using a large core of mathematics, chemistry, physics, and algorithms. We then see how animal experiments, aimed at providing a standardized result about a biological property, can be mimicked by new in silico methods. Our emphasis here is on toxicology and on predicting properties through chemical structures. Two main streams of such models are available: models that consider the whole molecular structure to predict a value, namely QSAR (Quantitative Structure-Activity Relationships), and models that find relevant substructures to predict a class, namely SAR. The term in silico discovery is applied to chemical design, to computational toxicology, and to drug discovery. We discuss how experimental practice in biological science is moving more and more toward modeling and simulation. Such virtual experiments confirm hypotheses, provide data for regulation, and help in designing new chemicals.

  10. A computational model of cerebral cortex folding.

    PubMed

    Nie, Jingxin; Guo, Lei; Li, Gang; Faraco, Carlos; Stephen Miller, L; Liu, Tianming

    2010-05-21

    The geometric complexity and variability of the human cerebral cortex have long intrigued the scientific community. As a result, quantitative description of cortical folding patterns and the understanding of underlying folding mechanisms have emerged as important research goals. This paper presents a computational 3D geometric model of cerebral cortex folding initialized by MRI data of a human fetal brain and deformed under the governance of a partial differential equation modeling cortical growth. By applying different simulation parameters, our model is able to generate folding convolutions and shape dynamics of the cerebral cortex. The simulations of this 3D geometric model provide computational experimental support to the following hypotheses: (1) Mechanical constraints of the skull regulate the cortical folding process. (2) The cortical folding pattern is dependent on the global cell growth rate of the whole cortex. (3) The cortical folding pattern is dependent on relative rates of cell growth in different cortical areas. (4) The cortical folding pattern is dependent on the initial geometry of the cortex. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  11. [Computer aided diagnosis model for lung tumor based on ensemble convolutional neural network].

    PubMed

    Wang, Yuanyuan; Zhou, Tao; Lu, Huiling; Wu, Cuiying; Yang, Pengfei

    2017-08-01

The convolutional neural network (CNN) can be used for computer-aided diagnosis of lung tumors with positron emission tomography (PET)/computed tomography (CT); it can provide accurate quantitative analysis to compensate for visual inertia and defects in gray-scale sensitivity, and help doctors diagnose accurately. Firstly, a parameter migration method is used to build three CNNs (CT-CNN, PET-CNN, and PET/CT-CNN) for lung tumor recognition in CT, PET, and PET/CT images, respectively. Then, focusing on CT-CNN, we obtained appropriate model parameters for CNN training by analyzing the influence of model parameters such as epochs, batch size, and image scale on recognition rate and training time. Finally, the three single CNNs are used to construct an ensemble CNN, lung tumor PET/CT recognition is completed through a relative majority vote, and the performance of the ensemble CNN and the single CNNs is compared. The experimental results show that the ensemble CNN is better than a single CNN for computer-aided diagnosis of lung tumors.
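The relative majority vote used to combine the three single CNNs can be sketched as follows; the label strings are hypothetical:

```python
from collections import Counter

def relative_majority_vote(predictions):
    """Combine per-model class labels: the most frequent label wins.
    A relative majority suffices - no absolute majority is required."""
    counts = Counter(predictions)
    return counts.most_common(1)[0][0]

# Hypothetical labels from three single CNNs for one PET/CT sample:
combined = relative_majority_vote(["tumor", "tumor", "normal"])  # "tumor"
```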

  12. Quantitative structure-retention relationships for gas chromatographic retention indices of alkylbenzenes with molecular graph descriptors.

    PubMed

    Ivanciuc, O; Ivanciuc, T; Klein, D J; Seitz, W A; Balaban, A T

    2001-02-01

    Quantitative structure-retention relationships (QSRR) represent statistical models that quantify the connection between the molecular structure and the chromatographic retention indices of organic compounds, allowing the prediction of retention indices of novel, not yet synthesized compounds, solely from their structural descriptors. Using multiple linear regression, QSRR models for the gas chromatographic Kováts retention indices of 129 alkylbenzenes are generated using molecular graph descriptors. The correlational ability of structural descriptors computed from 10 molecular matrices is investigated, showing that the novel reciprocal matrices give numerical indices with improved correlational ability. A QSRR equation with 5 graph descriptors gives the best calibration and prediction results, demonstrating the usefulness of the molecular graph descriptors in modeling chromatographic retention parameters. The sequential orthogonalization of descriptors suggests simpler QSRR models by eliminating redundant structural information.
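A minimal sketch of a multiple-linear-regression QSRR fit of the kind described above; the descriptor matrix and retention indices below are synthetic stand-ins for real graph descriptors and Kováts indices:

```python
import numpy as np

# Synthetic data: 30 "compounds", 5 "graph descriptors", and retention
# indices generated from known coefficients plus small noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))
true_coef = np.array([12.0, -3.0, 7.5, 0.0, 2.0])
y = 800.0 + X @ true_coef + rng.normal(scale=0.5, size=30)

A = np.column_stack([np.ones(30), X])          # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # ordinary least squares
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

With informative descriptors and low noise, the fit recovers the generating coefficients and a calibration R2 near 1, which is the kind of diagnostic a QSRR study reports.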

  13. Quantitative measurement of transverse injector and free stream interaction in a nonreacting SCRAMJET combustor using laser-induced iodine fluorescence

    NASA Technical Reports Server (NTRS)

    Fletcher, D. G.; Mcdaniel, J. C.

    1987-01-01

    A preliminary quantitative study of the compressible flowfield in a steady, nonreacting model SCRAMJET combustor using laser-induced iodine fluorescence (LIIF) is reported. Measurements of density, temperature, and velocity were conducted with the calibrated, nonintrusive, optical technique for two different combustor operating conditions. First, measurements were made in the supersonic flow over a rearward-facing step without transverse injection for comparison with calculated pressure profiles. The second configuration was staged injection behind the rearward-facing step at an injection dynamic pressure ratio of 1.06. These experimental results will be used to validate computational fluid dynamic (CFD) codes being developed to model supersonic combustor flowfields.

  14. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.

  15. Multi-GPU hybrid programming accelerated three-dimensional phase-field model in binary alloy

    NASA Astrophysics Data System (ADS)

    Zhu, Changsheng; Liu, Jieqiong; Zhu, Mingfang; Feng, Li

    2018-03-01

In the simulation of dendritic growth, computational efficiency and problem scale have an extremely important influence on the simulation efficiency of the three-dimensional phase-field model. Thus, seeking a high-performance calculation method to improve computational efficiency and expand the problem scale is of great significance for research on the microstructure of materials. A high-performance calculation method based on the MPI+CUDA hybrid programming model is introduced. Multiple GPUs are used to implement quantitative numerical simulations of the three-dimensional phase-field model in a binary alloy under the condition of coupled multi-physical processes. The acceleration effect of different GPU nodes on different calculation scales is explored. On the foundation of the multi-GPU calculation model that has been introduced, two optimization schemes, non-blocking communication optimization and overlap of MPI and GPU computing optimization, are proposed. The results of the two optimization schemes and the basic multi-GPU model are compared. The calculation results show that the multi-GPU calculation model can obviously improve the computational efficiency of the three-dimensional phase field, by a factor of 13 over a single GPU, and the problem scale has been expanded to 8193. The feasibility of the two optimization schemes is shown, and the overlap of MPI and GPU computing optimization has better performance, 1.7 times that of the basic multi-GPU model, when 21 GPUs are used.

  16. Nicholas Metropolis Award Talk for Outstanding Doctoral Thesis Work in Computational Physics: Computational biophysics and multiscale modeling of blood cells and blood flow in health and disease

    NASA Astrophysics Data System (ADS)

    Fedosov, Dmitry

    2011-03-01

    Computational biophysics is a large and rapidly growing area of computational physics. In this talk, we will focus on a number of biophysical problems related to blood cells and blood flow in health and disease. Blood flow plays a fundamental role in a wide range of physiological processes and pathologies in the organism. To understand and, if necessary, manipulate the course of these processes it is essential to investigate blood flow under realistic conditions including deformability of blood cells, their interactions, and behavior in the complex microvascular network. Using a multiscale cell model we are able to accurately capture red blood cell mechanics, rheology, and dynamics in agreement with a number of single cell experiments. Further, this validated model yields accurate predictions of the blood rheological properties, cell migration, cell-free layer, and hemodynamic resistance in microvessels. In addition, we investigate blood related changes in malaria, which include a considerable stiffening of red blood cells and their cytoadherence to endothelium. For these biophysical problems computational modeling is able to provide new physical insights and capabilities for quantitative predictions of blood flow in health and disease.

  17. A Simple and Computationally Efficient Approach to Multifactor Dimensionality Reduction Analysis of Gene-Gene Interactions for Quantitative Traits

    PubMed Central

    Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane

    2013-01-01

    We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
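    The scoring step that QMDR substitutes for balanced accuracy can be sketched as a Welch t-statistic over trait values in the two genotype groups; the trait values below are hypothetical:

    ```python
    import math

    def t_statistic(a, b):
        """Welch's t comparing trait means of two groups, used (as in QMDR)
        to score how well a genotype grouping separates a quantitative trait."""
        na, nb = len(a), len(b)
        ma, mb = sum(a) / na, sum(b) / nb
        va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
        vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
        return (ma - mb) / math.sqrt(va / na + vb / nb)

    # Hypothetical trait values for carriers of "high-risk" vs "low-risk"
    # multilocus genotype combinations
    high = [5.1, 4.8, 5.5, 5.0]
    low = [3.9, 4.1, 3.8, 4.2]
    score = abs(t_statistic(high, low))   # larger score = better interaction model
    ```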

  18. Systematic comparison of the behaviors produced by computational models of epileptic neocortex.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warlaumont, A. S.; Lee, H. C.; Benayoun, M.

    2010-12-01

    Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified) are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.

  19. In silico method for modelling metabolism and gene product expression at genome scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lerman, Joshua A.; Hyduke, Daniel R.; Latif, Haythem

    2012-07-03

    Transcription and translation use raw materials and energy generated metabolically to create the macromolecular machinery responsible for all cellular functions, including metabolism. A biochemically accurate model of molecular biology and metabolism will facilitate comprehensive and quantitative computations of an organism's molecular constitution as a function of genetic and environmental parameters. Here we formulate a model of metabolism and macromolecular expression. Prototyping it using the simple microorganism Thermotoga maritima, we show our model accurately simulates variations in cellular composition and gene expression. Moreover, through in silico comparative transcriptomics, the model allows the discovery of new regulons and the improvement of genome and transcription unit annotations. Our method presents a framework for investigating molecular biology and cellular physiology in silico and may allow quantitative interpretation of multi-omics data sets in the context of an integrated biochemical description of an organism.

  20. Coulombic Models in Chemical Bonding.

    ERIC Educational Resources Information Center

    Sacks, Lawrence J.

    1986-01-01

    Describes a bonding theory which provides a framework for the description of a wide range of substances and provides quantitative information of remarkable accuracy with far less computational effort than that required of other approaches. Includes applications, such as calculation of bond energies of two binary hydrides (methane and diborane).…

  1. Modelling Conceptual Development: A computational Account of Conservation Learning

    DTIC Science & Technology

    1990-02-28

    6 year-old age range are unable to obtain specific quantitative values for larger sets of objects e.g. (Gelman & Gallistel , 1978). 3. Children who...Piagetian concepts. In J.H. Flavell & E.M. Markman (Eds.), Handbook of Child Psychology(Fourth ed.). New York: Wiley. Gelman, R. & Gallistel , R.C. (1978

  2. Investigation of computational aeroacoustic tools for noise predictions of wind turbine aerofoils

    NASA Astrophysics Data System (ADS)

    Humpf, A.; Ferrer, E.; Munduate, X.

    2007-07-01

    In this work trailing edge noise levels of a research aerofoil have been computed and compared to aeroacoustic measurements using two approaches of different computational cost, yielding quantitative and qualitative results. On the one hand, the semi-empirical noise prediction tool NAFNoise [Moriarty P 2005 NAFNoise User's Guide. Golden, Colorado, July. http://wind.nrel.gov/designcodes/simulators/NAFNoise] was used to directly predict trailing edge noise, taking into consideration the nature of the experiments. On the other hand, aerodynamic and aeroacoustic calculations were performed with the full Navier-Stokes CFD code Fluent [Fluent Inc 2005 Fluent 6.2 Users Guide, Lebanon, NH, USA] on the basis of a steady RANS simulation. Aerodynamic characteristics were computed with the aid of various turbulence models, and the implemented broadband noise source models were used in combination to isolate and determine the trailing edge noise level.

  3. Integrated Computational Materials Engineering for Magnesium in Automotive Body Applications

    NASA Astrophysics Data System (ADS)

    Allison, John E.; Liu, Baicheng; Boyle, Kevin P.; Hector, Lou; McCune, Robert

    This paper provides an overview and progress report for an international collaborative project which aims to develop an ICME infrastructure for magnesium for use in automotive body applications. Quantitative processing-microstructure-property relationships are being developed for extruded Mg alloys, sheet-formed Mg alloys and high pressure die cast Mg alloys. These relationships are captured in computational models which are then linked with manufacturing process simulation and used to provide constitutive models for component performance analysis. The long term goal is to capture this information in efficient computational models and in a web-centered knowledge base. The work is being conducted at leading universities, national labs and industrial research facilities in the US, China and Canada. This project is sponsored by the U.S. Department of Energy, the U.S. Automotive Materials Partnership (USAMP), Chinese Ministry of Science and Technology (MOST) and Natural Resources Canada (NRCan).

  4. 3D electron tomography of pretreated biomass informs atomic modeling of cellulose microfibrils.

    PubMed

    Ciesielski, Peter N; Matthews, James F; Tucker, Melvin P; Beckham, Gregg T; Crowley, Michael F; Himmel, Michael E; Donohoe, Bryon S

    2013-09-24

    Fundamental insights into the macromolecular architecture of plant cell walls will elucidate new structure-property relationships and facilitate optimization of catalytic processes that produce fuels and chemicals from biomass. Here we introduce computational methodology to extract nanoscale geometry of cellulose microfibrils within thermochemically treated biomass directly from electron tomographic data sets. We quantitatively compare the cell wall nanostructure in corn stover following two leading pretreatment strategies: dilute acid with iron sulfate co-catalyst and ammonia fiber expansion (AFEX). Computational analysis of the tomographic data is used to extract mathematical descriptions for longitudinal axes of cellulose microfibrils from which we calculate their nanoscale curvature. These nanostructural measurements are used to inform the construction of atomistic models that exhibit features of cellulose within real, process-relevant biomass. By computational evaluation of these atomic models, we propose relationships between the crystal structure of cellulose Iβ and the nanoscale geometry of cellulose microfibrils.
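    One simple way to estimate the nanoscale curvature of a fitted microfibril axis is the Menger (three-point circumradius) curvature along the polyline. The paper does not specify its exact estimator, so treat this as an illustrative sketch:

    ```python
    import math

    def menger_curvature(p, q, r):
        """Curvature (1/circumradius) of the circle through three consecutive
        axis points; one way to estimate local curvature of a fitted
        microfibril longitudinal axis from its sampled points (2D here)."""
        a = math.dist(q, r)
        b = math.dist(p, r)
        c = math.dist(p, q)
        # |cross product| = twice the triangle area
        area2 = abs((q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0]))
        if area2 == 0:
            return 0.0            # collinear points: zero curvature
        return 2 * area2 / (a * b * c)   # kappa = 4*Area / (abc)

    # sanity check: points sampled from a circle of radius 2
    # should give curvature 1/2
    pts = [(2 * math.cos(t), 2 * math.sin(t)) for t in (0.0, 0.3, 0.6)]
    kappa = menger_curvature(*pts)
    ```

    Sliding this estimator along the extracted axis gives a curvature profile per microfibril, which is the kind of nanostructural measurement used to inform the atomistic models.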

  5. Photopolarimetry of scattering surfaces and their interpretation by computer model

    NASA Technical Reports Server (NTRS)

    Wolff, M.

    1979-01-01

    Wolff's computer model of a rough planetary surface was simplified and revised. Close adherence to the actual geometry of a pitted surface and the inclusion of a function for diffuse light resulted in a quantitative model comparable to observations by planetary satellites and asteroids. A function is also derived to describe diffuse light emitted from a particulate surface. The function is in terms of the indices of refraction of the surface material, particle size, and viewing angles. Computer-generated plots describe the observable and theoretical light components for the Moon, Mercury, Mars and a spectrum of asteroids. Other plots describe the effects of changing surface material properties. Mathematical results are generated to relate the parameters of the negative polarization branch to the properties of surface pitting. An explanation is offered for the polarization of the rings of Saturn, and the average diameter of ring objects is found to be 30 to 40 centimeters.

  6. Permittivity and conductivity parameter estimations using full waveform inversion

    NASA Astrophysics Data System (ADS)

    Serrano, Jheyston O.; Ramirez, Ana B.; Abreo, Sergio A.; Sadler, Brian M.

    2018-04-01

    Full waveform inversion of Ground Penetrating Radar (GPR) data is a promising strategy to estimate quantitative characteristics of the subsurface such as permittivity and conductivity. In this paper, we propose a methodology that uses Full Waveform Inversion (FWI) in the time domain of 2D GPR data to obtain highly resolved images of the permittivity and conductivity parameters of the subsurface. FWI is an iterative method that requires a cost function to measure the misfit between observed and modeled data, a wave propagator to compute the modeled data, and an initial velocity model that is updated at each iteration until an acceptable decrease of the cost function is reached. The use of FWI with GPR is computationally expensive because it is based on the computation of full electromagnetic wave propagation. Also, the commercially available acquisition systems use only one transmitter and one receiver antenna at zero offset, requiring a large number of shots to scan a single line.
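    The structure of the inversion, minimizing a misfit between observed and modeled data over candidate parameter values, can be sketched with a toy single-layer traveltime model. Full waveform inversion compares entire waveforms and uses gradient-based model updates, so the parameter scan below is only a caricature of that loop:

    ```python
    def misfit(eps_r, depth, t_obs, c0=0.3):
        """Least-squares data misfit for a hypothetical single-layer GPR model:
        two-way traveltime t = 2*depth / v with v = c0 / sqrt(eps_r),
        where c0 = 0.3 m/ns is the speed of light in vacuum. A toy stand-in
        for the FWI cost function, which compares full waveforms."""
        v = c0 / eps_r ** 0.5
        t_mod = 2 * depth / v
        return (t_mod - t_obs) ** 2

    # "observed" two-way time for a reflector at 1 m depth in a medium
    # with relative permittivity 9 (about 20 ns)
    t_obs = 2 * 1.0 / (0.3 / 3.0)

    # scan candidate permittivities, mimicking iterative model updates
    best = min(range(1, 26), key=lambda e: misfit(float(e), 1.0, t_obs))
    ```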

  7. Petri-net-based 2D design of DNA walker circuits.

    PubMed

    Gilbert, David; Heiner, Monika; Rohr, Christian

    2018-01-01

    We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges for the design of reliable walker circuits consists in leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison, and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach of DNA walker circuits relies on coloured stochastic Petri nets which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.

  8. Estimating cellular parameters through optimization procedures: elementary principles and applications.

    PubMed

    Kimura, Akatsuki; Celani, Antonio; Nagao, Hiromichi; Stasevich, Timothy; Nakamura, Kazuyuki

    2015-01-01

    Construction of quantitative models is a primary goal of quantitative biology, which aims to understand cellular and organismal phenomena in a quantitative manner. In this article, we introduce optimization procedures to search for parameters in a quantitative model that can reproduce experimental data. The aim of optimization is to minimize the sum of squared errors (SSE) in a prediction or to maximize likelihood. A (local) maximum of likelihood or (local) minimum of the SSE can efficiently be identified using gradient approaches. Addition of a stochastic process enables us to identify the global maximum/minimum without becoming trapped in local maxima/minima. Sampling approaches take advantage of increasing computational power to test numerous sets of parameters in order to determine the optimum set. By combining Bayesian inference with gradient or sampling approaches, we can estimate both the optimum parameters and the form of the likelihood function related to the parameters. Finally, we introduce four examples of research that utilize parameter optimization to obtain biological insights from quantified data: transcriptional regulation, bacterial chemotaxis, morphogenesis, and cell cycle regulation. With practical knowledge of parameter optimization, cell and developmental biologists can develop realistic models that reproduce their observations and thus, obtain mechanistic insights into phenomena of interest.
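    A minimal sketch of SSE minimization by a gradient approach, assuming a hypothetical one-parameter linear model and synthetic data:

    ```python
    def sse(params, data, model):
        """Sum of squared errors between model predictions and observations."""
        return sum((model(x, params) - y) ** 2 for x, y in data)

    def gradient_descent(data, model, p0, lr=0.01, steps=500, h=1e-6):
        """Minimize the SSE by following a central-difference numerical
        gradient: the simplest of the gradient approaches discussed."""
        p = p0
        for _ in range(steps):
            grad = (sse(p + h, data, model) - sse(p - h, data, model)) / (2 * h)
            p -= lr * grad
        return p

    # Hypothetical quantified data roughly following y = 2x
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.1)]
    model = lambda x, k: k * x
    k_hat = gradient_descent(data, model, p0=0.0)
    ```

    Adding stochastic perturbations to the update, or replacing it with a sampling scheme, gives the global and Bayesian variants the article goes on to describe.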

  9. Comparison of Mars Science Laboratory Reaction Control System Jet Computations With Flow Visualization and Velocimetry

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Ashcraft, Scott W.; Novak, Luke A.

    2013-01-01

    Numerical predictions of the Mars Science Laboratory reaction control system jets interacting with a Mach 10 hypersonic flow are compared to experimental nitric oxide planar laser-induced fluorescence data. The steady Reynolds Averaged Navier Stokes equations using the Baldwin-Barth one-equation turbulence model were solved using the OVERFLOW code. The experimental fluorescence data used for comparison consists of qualitative two-dimensional visualization images, qualitative reconstructed three-dimensional flow structures, and quantitative two-dimensional distributions of streamwise velocity. Through modeling of the fluorescence signal equation, computational flow images were produced and directly compared to the qualitative fluorescence data.

  10. Modeling inelastic phonon scattering in atomic- and molecular-wire junctions

    NASA Astrophysics Data System (ADS)

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2005-11-01

    Computationally inexpensive approximations describing electron-phonon scattering in molecular-scale conductors are derived from the nonequilibrium Green’s function method. The accuracy is demonstrated with a first-principles calculation on an atomic gold wire. Quantitative agreement between the full nonequilibrium Green’s function calculation and the newly derived expressions is obtained while simplifying the computational burden by several orders of magnitude. In addition, analytical models provide intuitive understanding of the conductance including nonequilibrium heating and provide a convenient way of parameterizing the physics. This is exemplified by fitting the expressions to the experimentally observed conductances through both an atomic gold wire and a hydrogen molecule.

  11. Evaluation of synthetic linear motor-molecule actuation energetics

    PubMed Central

    Brough, Branden; Northrop, Brian H.; Schmidt, Jacob J.; Tseng, Hsian-Rong; Houk, Kendall N.; Stoddart, J. Fraser; Ho, Chih-Ming

    2006-01-01

    By applying atomic force microscope (AFM)-based force spectroscopy together with computational modeling in the form of molecular force-field simulations, we have determined quantitatively the actuation energetics of a synthetic motor-molecule. This multidisciplinary approach was performed on specifically designed, bistable, redox-controllable [2]rotaxanes to probe the steric and electrostatic interactions that dictate their mechanical switching at the single-molecule level. The fusion of experimental force spectroscopy and theoretical computational modeling has revealed that the repulsive electrostatic interaction, which is responsible for the molecular actuation, is as high as 65 kcal·mol−1, a result that is supported by ab initio calculations. PMID:16735470

  12. Introductory Life Science Mathematics and Quantitative Neuroscience Courses

    PubMed Central

    Olifer, Andrei

    2010-01-01

    We describe two sets of courses designed to enhance the mathematical, statistical, and computational training of life science undergraduates at Emory College. The first course is an introductory sequence in differential and integral calculus, modeling with differential equations, probability, and inferential statistics. The second is an upper-division course in computational neuroscience. We provide a description of each course, detailed syllabi, examples of content, and a brief discussion of the main issues encountered in developing and offering the courses. PMID:20810971

  13. A Computational Model of Liver Iron Metabolism

    PubMed Central

    Mitchell, Simon; Mendes, Pedro

    2013-01-01

    Iron is essential for all known life due to its redox properties; however, these same properties can also lead to its toxicity in overload through the production of reactive oxygen species. Robust systemic and cellular control are required to maintain safe levels of iron, and the liver seems to be where this regulation is mainly located. Iron misregulation is implicated in many diseases, and as our understanding of iron metabolism improves, the list of iron-related disorders grows. Recent developments have resulted in greater knowledge of the fate of iron in the body and have led to a detailed map of its metabolism; however, a quantitative understanding at the systems level of how its components interact to produce tight regulation remains elusive. A mechanistic computational model of human liver iron metabolism, which includes the core regulatory components, is presented here. It was constructed based on known mechanisms of regulation and on their kinetic properties, obtained from several publications. The model was then quantitatively validated by comparing its results with previously published physiological data, and it is able to reproduce multiple experimental findings. A time course simulation following an oral dose of iron was compared to a clinical time course study and the simulation was found to recreate the dynamics and time scale of the system's response to iron challenge. A disease state simulation of haemochromatosis was created by altering a single reaction parameter that mimics a human haemochromatosis gene (HFE) mutation. The simulation provides a quantitative understanding of the liver iron overload that arises in this disease. This model supports and supplements understanding of the role of the liver as an iron sensor and provides a framework for further modelling, including simulations to identify valuable drug targets and design of experiments to improve further our knowledge of this system. PMID:24244122
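    To make the flavor of such a mechanistic model concrete, here is a deliberately tiny two-pool caricature (plasma and liver iron with a crude hepcidin-like feedback on absorption). The pools, rate constants, and feedback form are invented for illustration and are not taken from the published model:

    ```python
    def simulate(days=200, dt=0.01, absorb=1.0):
        """Euler integration of a hypothetical two-pool iron model:
        absorption feeds plasma, the liver takes iron up and slowly
        exports/loses it, and a hepcidin-like feedback term throttles
        absorption as liver stores rise."""
        plasma, liver = 0.0, 0.0
        k_up, k_out, k_loss = 0.5, 0.1, 0.05   # invented rate constants (1/day)
        for _ in range(int(days / dt)):
            feedback = 1.0 / (1.0 + liver)      # stand-in for hepcidin regulation
            d_plasma = absorb * feedback - k_up * plasma + k_out * liver
            d_liver = k_up * plasma - k_out * liver - k_loss * liver
            plasma += dt * d_plasma
            liver += dt * d_liver
        return plasma, liver

    plasma, liver = simulate()
    ```

    Even this caricature reaches a regulated steady state; the published model replaces each invented term with mechanistically grounded kinetics, which is what makes its disease-state simulations (e.g. an HFE-like parameter change) quantitatively meaningful.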

  14. Formulation and Validation of an Efficient Computational Model for a Dilute, Settling Suspension Undergoing Rotational Mixing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Stickel, Jonathan J.; Sitaraman, Hariswaran

    Designing processing equipment for the mixing of settling suspensions is a challenging problem. Achieving low-cost mixing is especially difficult for the application of slowly reacting suspended solids because the cost of impeller power consumption becomes quite high due to the long reaction times (batch mode) or due to large-volume reactors (continuous mode). Further, the usual scale-up metrics for mixing, e.g., constant tip speed and constant power per volume, do not apply well for mixing of suspensions. As an alternative, computational fluid dynamics (CFD) can be useful for analyzing mixing at multiple scales and determining appropriate mixer designs and operating parameters. We developed a mixture model to describe the hydrodynamics of a settling cellulose suspension. The suspension motion is represented as a single velocity field in a computationally efficient Eulerian framework. The solids are represented by a scalar volume-fraction field that undergoes transport due to particle diffusion, settling, fluid advection, and shear stress. A settling model and a viscosity model, both functions of volume fraction, were selected to fit experimental settling and viscosity data, respectively. Simulations were performed with the open-source Nek5000 CFD program, which is based on the high-order spectral-finite-element method. Simulations were performed for the cellulose suspension undergoing mixing in a laboratory-scale vane mixer. The settled-bed heights predicted by the simulations were in semi-quantitative agreement with experimental observations. Further, the simulation results were in quantitative agreement with experimentally obtained torque and mixing-rate data, including a characteristic torque bifurcation. In future work, we plan to couple this CFD model with a reaction-kinetics model for the enzymatic digestion of cellulose, allowing us to predict enzymatic digestion performance for various mixing intensities and novel reactor designs.
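    A common closure for the settling term in such mixture models is a hindered-settling law that vanishes at maximum packing. The Richardson-Zaki-type form below (with a packing cutoff) is one standard choice, offered as a sketch; the paper's fitted functional form may differ:

    ```python
    def settling_velocity(phi, v0=1.0, n=4.65, phi_max=0.64):
        """Hindered-settling velocity vs. solids volume fraction phi:
        v = v0 * (1 - phi/phi_max)**n, a Richardson-Zaki-type closure
        with a maximum-packing cutoff. v0 is the dilute-limit (Stokes)
        settling velocity; n and phi_max are fit to settling data."""
        if phi >= phi_max:
            return 0.0            # settled bed: no further settling flux
        return v0 * (1.0 - phi / phi_max) ** n
    ```

    The settling flux in the volume-fraction transport equation is then phi times this velocity, which is what lets the simulation predict settled-bed heights.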

  15. Quantitative analysis of breast cancer diagnosis using a probabilistic modelling approach.

    PubMed

    Liu, Shuo; Zeng, Jinshu; Gong, Huizhou; Yang, Hongqin; Zhai, Jia; Cao, Yi; Liu, Junxiu; Luo, Yuling; Li, Yuhua; Maguire, Liam; Ding, Xuemei

    2018-01-01

    Breast cancer is the most prevalent cancer in women in most countries of the world. Many computer-aided diagnostic methods have been proposed, but there are few studies on quantitative discovery of probabilistic dependencies among breast cancer data features and identification of the contribution of each feature to breast cancer diagnosis. This study aims to fill this void by utilizing a Bayesian network (BN) modelling approach. A K2 learning algorithm and statistical computation methods are used to construct BN structure and assess the obtained BN model. The data used in this study were collected from a clinical ultrasound dataset derived from a Chinese local hospital and a fine-needle aspiration cytology (FNAC) dataset from UCI machine learning repository. Our study suggested that, in terms of ultrasound data, cell shape is the most significant feature for breast cancer diagnosis, and the resistance index presents a strong probabilistic dependency on blood signals. With respect to FNAC data, bare nuclei are the most important discriminating feature of malignant and benign breast tumours, and uniformity of both cell size and cell shape are tightly interdependent. The BN modelling approach can support clinicians in making diagnostic decisions based on the significant features identified by the model, especially when some other features are missing for specific patients. The approach is also applicable to other healthcare data analytics and data modelling for disease diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. A Review of Hemolysis Prediction Models for Computational Fluid Dynamics.

    PubMed

    Yu, Hai; Engel, Sebastian; Janiga, Gábor; Thévenin, Dominique

    2017-07-01

    Flow-induced hemolysis is a crucial issue for many biomedical applications; in particular, it is an essential issue for the development of blood-transporting devices such as left ventricular assist devices, and other types of blood pumps. In order to estimate red blood cell (RBC) damage in blood flows, many models have been proposed in the past. Most models have been validated by their respective authors. However, the accuracy and the validity range of these models remains unclear. In this work, the most established hemolysis models compatible with computational fluid dynamics of full-scale devices are described and assessed by comparing two selected reference experiments: a simple rheometric flow and a more complex hemodialytic flow through a needle. The quantitative comparisons show very large deviations concerning hemolysis predictions, depending on the model and model parameter. In light of the current results, two simple power-law models deliver the best compromise between computational efficiency and obtained accuracy. Finally, hemolysis has been computed in an axial blood pump. The reconstructed geometry of a HeartMate II shows that hemolysis occurs mainly at the tip and leading edge of the rotor blades, as well as at the leading edge of the diffusor vanes. © 2017 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
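    The power-law models the review refers to have the form HI = C · τ^a · t^b. The constants below are the widely quoted Giersiepen-Wurzinger calibration and should be treated as one calibration among several, not as universal values:

    ```python
    def hemolysis_index(tau, t, C=3.62e-5, a=2.416, b=0.785):
        """Power-law hemolysis index HI (%) as a function of scalar shear
        stress tau (Pa) and exposure time t (s). The default constants are
        the commonly cited Giersiepen-Wurzinger values; recalibrations
        in the literature differ substantially."""
        return C * tau ** a * t ** b
    ```

    In a CFD post-processing step, this correlation is typically integrated along pathlines through the stress field of the device, e.g. across a pump rotor passage, to accumulate a device-level damage estimate.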

  17. Quantitative computational models of molecular self-assembly in systems biology

    PubMed Central

    Thomas, Marcus; Schwartz, Russell

    2017-01-01

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally. PMID:28535149

  18. Quantitative computational models of molecular self-assembly in systems biology.

    PubMed

    Thomas, Marcus; Schwartz, Russell

    2017-05-23

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.
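    The discrete stochastic treatment that self-assembly kinetics often require can be illustrated with a Gillespie simulation of the simplest assembly step, irreversible dimerization; the rate constant and counts are arbitrary:

    ```python
    import random

    def gillespie_dimerization(n_monomers=100, k=0.01, t_end=50.0, seed=1):
        """Gillespie stochastic simulation of A + A -> A2: a minimal example
        of the discrete, stochastic reaction treatment that self-assembly
        models need when copy numbers are small."""
        random.seed(seed)
        a, dimers, t = n_monomers, 0, 0.0
        while a >= 2:
            propensity = k * a * (a - 1) / 2.0   # number of distinct A pairs times k
            t += random.expovariate(propensity)  # exponential waiting time
            if t > t_end:
                break
            a -= 2
            dimers += 1
        return a, dimers

    a, dimers = gillespie_dimerization()
    ```

    Real assembly models chain many such reactions into large intermediate spaces, which is exactly where the combinatorial and stiffness challenges discussed in the review arise.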

  19. A computational model of selection by consequences: log survivor plots.

    PubMed

    Kulubekova, Saule; McDowell, J J

    2008-06-01

    [McDowell, J.J., 2004. A computational model of selection by consequences. J. Exp. Anal. Behav. 81, 297-317] instantiated the principle of selection by consequences in a virtual organism with an evolving repertoire of possible behaviors undergoing selection, reproduction, and mutation over many generations. The process is based on the computational approach, which is non-deterministic and rules-based. The model proposes a causal account for operant behavior. McDowell found that the virtual organism consistently showed a hyperbolic relationship between response and reinforcement rates according to the quantitative law of effect. To continue validation of the computational model, the present study examined its behavior on the molecular level by comparing the virtual organism's interresponse time (IRT) distributions, in the form of log survivor plots, to findings from live organisms. Log survivor plots did not show the "broken-stick" feature indicative of distinct bouts and pauses in responding, although the bend in slope of the plots became more defined at low reinforcement rates. The shape of the virtual organism's log survivor plots was more consistent with the data on reinforced responding in pigeons. These results suggest that log survivor plot patterns of the virtual organism were generally consistent with the findings from live organisms, providing further support for the computational model of selection by consequences as a viable account of operant behavior.
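    The log survivor plots used in this analysis are straightforward to compute from a list of interresponse times: for each observed IRT value, plot the log proportion of IRTs that are longer. Bout-and-pause responding appears as a "broken-stick" bend in such a plot. The IRT values below are hypothetical:

    ```python
    import math

    def log_survivor(irts):
        """Log-survivor coordinates (t, log P(IRT > t)) from a list of
        interresponse times, the representation used to look for
        'broken-stick' bout/pause structure in responding."""
        xs = sorted(irts)
        n = len(xs)
        pts = []
        for i, t in enumerate(xs):
            surv = (n - i - 1) / n      # proportion of IRTs longer than this one
            if surv > 0:                # log(0) is undefined; drop the last point
                pts.append((t, math.log(surv)))
        return pts

    irts = [0.2, 0.5, 0.9, 1.4, 3.0, 7.5]   # hypothetical IRTs in seconds
    pts = log_survivor(irts)
    ```

    A single exponential IRT process yields a roughly straight line in these coordinates; a mixture of fast within-bout and slow between-bout IRTs yields the two-slope "broken stick".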

  20. Effect of low-intensity pulsed ultrasound stimulation on gap healing in a rabbit osteotomy model evaluated by quantitative micro-computed tomography-based cross-sectional moment of inertia.

    PubMed

    Tobita, Kenji; Matsumoto, Takuya; Ohashi, Satoru; Bessho, Masahiko; Kaneko, Masako; Ohnishi, Isao

    2012-07-01

    It has been previously demonstrated that low-intensity pulsed ultrasound stimulation (LIPUS) enhances formation of the medullary canal and cortex in a gap-healing model of the tibia in rabbits, shortens the time required for remodeling, and enhances mineralization of the callus. In the current study, the mechanical integrity of these models was confirmed. In order to do this, the cross-sectional moment of inertia (CSMI) obtained from quantitative micro-computed tomography scans was calculated, and a comparison was made with a four-point bending test. This parameter can be analyzed in any direction, and three directions were selected in order to adopt an XYZ coordinate system (X and Y for bending; Z for torsion). The present results demonstrated that LIPUS improved earlier restoration of bending stiffness at the healing site. In addition, LIPUS was effective not only in the ultrasound-irradiated plane, but also in the other two planes. CSMI may provide the structural as well as compositional determinants to assess fracture healing and would be very useful to replace mechanical testing.
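    CSMI about a chosen bending axis follows directly from the segmented cross-section: sum the pixel areas weighted by squared distance from the section centroid. A minimal sketch on a hypothetical voxelized section, using a midpoint approximation that ignores each pixel's own second moment:

    ```python
    def csmi(voxels, pixel_area=1.0):
        """Cross-sectional moments of inertia of a voxelized bone section
        about its centroid. `voxels` is a list of (x, y) centers of pixels
        classified as bone. Returns (Ix, Iy, Jz): bending about the x and
        y axes, and the polar moment for torsion."""
        n = len(voxels)
        xbar = sum(x for x, _ in voxels) / n
        ybar = sum(y for _, y in voxels) / n
        ix = sum(pixel_area * (y - ybar) ** 2 for _, y in voxels)  # bending about x
        iy = sum(pixel_area * (x - xbar) ** 2 for x, _ in voxels)  # bending about y
        jz = ix + iy                                               # torsion (polar)
        return ix, iy, jz

    # a hypothetical 4-wide by 2-tall solid block of unit bone pixels
    block = [(x + 0.5, y + 0.5) for x in range(4) for y in range(2)]
    ix, iy, jz = csmi(block)
    ```

    Repeating this over slices along the healing gap, in the three chosen directions, gives the CSMI profiles that were compared against four-point bending stiffness.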

  1. Quantitative Thermochronology

    NASA Astrophysics Data System (ADS)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach with a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced throughout the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. • Numerous case studies help geologists to interpret age data and relate it to Earth processes • Essential background material aids understanding and use of thermochronological data • Provides a thorough treatise on numerical modeling of heat transport in the Earth's crust • Supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching

  2. Enhancement of PET Images

    NASA Astrophysics Data System (ADS)

    Davis, Paul B.; Abidi, Mongi A.

    1989-05-01

    PET is the only imaging modality that provides doctors with early analytic and quantitative biochemical assessment and precise localization of pathology. In PET images, boundary information as well as local pixel intensity are both crucial for manual and/or automated feature tracing, extraction, and identification. Unfortunately, present PET technology does not provide the necessary image quality from which such precise analytic and quantitative measurements can be made. PET images suffer from significantly high levels of radial noise, present in the form of streaks, caused by the inexactness of the models used in image reconstruction. In this paper, our objective is to model PET noise and remove it without altering dominant features in the image. The ultimate goal here is to enhance these dominant features to allow for automatic computer interpretation and classification of PET images by developing techniques that take into consideration PET signal characteristics, data collection, and data reconstruction. We have modeled the noise streaks in PET images in both rectangular and polar representations and have shown both analytically and through computer simulation that the noise exhibits consistent mapping patterns. A class of filters was designed and applied successfully. Visual inspection of the filtered images shows clear enhancement over the original images.

  3. Quantitative trait nucleotide analysis using Bayesian model selection.

    PubMed

    Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D

    2005-10-01

    Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
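
Posterior probabilities of effect in this style of analysis come from Bayesian model selection and averaging. As a rough illustration of the idea (not the BQTN implementation in SOLAR), posterior model probabilities can be approximated from BIC scores via the Schwarz approximation, assuming equal prior probabilities:

```python
import math

# Approximate posterior model probabilities from BIC values:
# P(model_i | data) ∝ exp(-BIC_i / 2), assuming equal priors over models.
def bic_weights(bics):
    m = min(bics)                                   # subtract min for stability
    w = [math.exp(-(b - m) / 2.0) for b in bics]
    s = sum(w)
    return [x / s for x in w]

# Hypothetical BIC scores for three competing variant models
bics = [1002.3, 1000.1, 1010.7]
probs = bic_weights(bics)
print([round(p, 3) for p in probs])  # second model carries most of the weight
```

Model-averaged effect estimates then weight each candidate model's estimate by these posterior probabilities.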

  4. Estimation and Identifiability of Model Parameters in Human Nociceptive Processing Using Yes-No Detection Responses to Electrocutaneous Stimulation.

    PubMed

    Yang, Huan; Meijer, Hil G E; Buitenweg, Jan R; van Gils, Stephan A

    2016-01-01

    Healthy or pathological states of nociceptive subsystems determine different stimulus-response relations measured from quantitative sensory testing. In turn, stimulus-response measurements may be used to assess these states. In a recently developed computational model, six model parameters characterize activation of nerve endings and spinal neurons. However, both model nonlinearity and the limited information in yes-no detection responses to electrocutaneous stimuli make it challenging to estimate the model parameters. Here, we address the question of whether and how one can overcome these difficulties for reliable parameter estimation. First, we fit the computational model to experimental stimulus-response pairs by maximizing the likelihood. To evaluate the balance between model fit and complexity, i.e., the number of model parameters, we evaluate the Bayesian Information Criterion, and we find that the computational model strikes a better balance than a conventional logistic model. Second, our theoretical analysis suggests that varying the pulse width among applied stimuli is a necessary condition to prevent structural non-identifiability. In addition, the numerically implemented profile likelihood approach reveals both structural and practical non-identifiability. Our model-based approach, with its integration of psychophysical measurements, can be useful for a reliable assessment of the states of the nociceptive system.
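
A minimal sketch of the likelihood-maximization and BIC comparison described above, using the conventional logistic model (not the six-parameter neurophysiological one) and hypothetical yes-no data; the grid search stands in for a proper optimizer:

```python
import math

# Log-likelihood of yes/no detection data under a conventional logistic
# psychometric function P(yes | x) = 1 / (1 + exp(-(x - a) / b)),
# with threshold a and spread b.
def loglik(a, b, data):
    ll = 0.0
    for x, yes in data:
        p = 1.0 / (1.0 + math.exp(-(x - a) / b))
        p = min(max(p, 1e-12), 1.0 - 1e-12)   # guard the log
        ll += math.log(p) if yes else math.log(1.0 - p)
    return ll

# Hypothetical stimulus amplitudes (mA) with yes (1) / no (0) detections
data = [(0.2, 0), (0.4, 0), (0.5, 0), (0.6, 0),
        (0.8, 1), (0.9, 1), (1.0, 1), (1.2, 1)]

# Crude maximum-likelihood fit by grid search over (a, b)
ll, a_hat, b_hat = max(
    (loglik(a / 100.0, b / 100.0, data), a / 100.0, b / 100.0)
    for a in range(20, 121, 2) for b in range(2, 41, 2))

n, k = len(data), 2
bic = k * math.log(n) - 2.0 * ll   # lower is better: fit vs. complexity
print(round(a_hat, 2), round(b_hat, 2), round(bic, 2))
```

BIC = k·ln(n) − 2·ln(L) charges each extra parameter, so the six-parameter model must buy its additional complexity with a substantially better likelihood.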

  5. Electronic field emission models beyond the Fowler-Nordheim one

    NASA Astrophysics Data System (ADS)

    Lepetit, Bruno

    2017-12-01

    We propose several quantum mechanical models to describe electronic field emission from first principles. These models allow us to correlate quantitatively the electronic emission current with the electrode surface details at the atomic scale. They all rely on electronic potential energy surfaces obtained from three-dimensional density functional theory calculations. They differ in the quantum mechanical methods (exact or perturbative, time dependent or time independent) used to describe tunneling through the electronic potential energy barrier. Comparing these models with one another and with the standard Fowler-Nordheim model in the context of one-dimensional tunneling allows us to assess how the approximations made in each model affect the accuracy of the computed current. Among these methods, the time dependent perturbative one provides a well-balanced trade-off between accuracy and computational cost.
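
For reference, the elementary Fowler-Nordheim law that such models are benchmarked against expresses the emitted current density as J = (a/φ)F² exp(−b φ^{3/2}/F), with field F and work function φ:

```python
import math

# Elementary Fowler-Nordheim law for cold field emission:
#   J = (a / phi) * F^2 * exp(-b * phi^(3/2) / F)
# with J in A/m^2, field F in V/m and work function phi in eV.
A_FN = 1.541434e-6   # first FN constant, A eV V^-2
B_FN = 6.830890e9    # second FN constant, eV^-3/2 V m^-1

def fn_current_density(F, phi):
    return (A_FN / phi) * F ** 2 * math.exp(-B_FN * phi ** 1.5 / F)

# Tungsten-like work function phi = 4.5 eV, applied field of 5 V/nm
J = fn_current_density(5e9, 4.5)
print(f"{J:.3e} A/m^2")
```

The exponential makes the current extraordinarily sensitive to both the local field and the work function, which is why atomic-scale surface details matter for quantitative predictions.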

  6. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.

    PubMed

    Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A

    2017-02-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of the simplified treatment of tissue micro-organization adopted in previous simulators. GPU execution allowed ∼200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to quantitatively infer tissue composition and microstructure.

  7. Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.

    PubMed

    Lee, Won Hee; Bullmore, Ed; Frangou, Sophia

    2017-02-01

    There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states, and we highlight the sensitive dependence of the solutions obtained from simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
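
A minimal Kuramoto simulation of the kind used to generate the simulated functional data can be sketched as follows; the all-to-all toy network and parameters are illustrative, not the 66-node connectome used in the study:

```python
import math, random

# Minimal Kuramoto simulation on a weighted connectivity matrix C:
#   d(theta_i)/dt = omega_i + K * sum_j C[i][j] * sin(theta_j - theta_i)
def simulate_kuramoto(C, omega, K, dt=0.01, steps=2000, seed=0):
    rng = random.Random(seed)
    n = len(C)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        dtheta = [omega[i] + K * sum(C[i][j] * math.sin(theta[j] - theta[i])
                                     for j in range(n)) for i in range(n)]
        theta = [t + dt * d for t, d in zip(theta, dtheta)]   # Euler step
    # Order parameter r in [0, 1] measures global phase synchrony
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)

# Toy all-to-all network of 8 oscillators; strong coupling should synchronize
C = [[0.0 if i == j else 1.0 / 7.0 for j in range(8)] for i in range(8)]
omega = [1.0 + 0.05 * i for i in range(8)]
r = simulate_kuramoto(C, omega, K=2.0)
print(round(r, 3))
```

In the study, pairwise functional connectivity is then computed from the simulated oscillator time series rather than from the summary order parameter shown here.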

  8. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model

    PubMed Central

    Velikina, Julia V.; Block, Walter F.; Kijowski, Richard; Samsonov, Alexey A.

    2017-01-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of the simplified treatment of tissue micro-organization adopted in previous simulators. GPU execution allowed ∼200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to quantitatively infer tissue composition and microstructure. PMID:28113746

  9. Numerical formulation for the prediction of solid/liquid change of a binary alloy

    NASA Technical Reports Server (NTRS)

    Schneider, G. E.; Tiwari, S. N.

    1990-01-01

    A computational model is presented for the prediction of solid/liquid phase change energy transport, including the influence of free convection fluid flow in the liquid phase region. The computational model sets the velocity components of all non-liquid phase change material control volumes to zero but fully solves the coupled mass-momentum problem within the liquid region. The thermal energy model includes the entire domain and uses an enthalpy-like model and a recently developed method for handling the phase change interface nonlinearity. Convergence studies are performed and comparisons made with experimental data for two different problem specifications. The convergence studies indicate that grid independence was achieved, and the comparison with experimental data indicates excellent quantitative prediction of the melt fraction evolution. Qualitative data are also provided in the form of velocity vector diagrams and isotherm plots for selected times in the evolution of both problems. The computational costs incurred are quite low in comparison with previous efforts to solve these problems.

  10. Molecular descriptor subset selection in theoretical peptide quantitative structure-retention relationship model development using nature-inspired optimization algorithms.

    PubMed

    Žuvela, Petar; Liu, J Jay; Macur, Katarzyna; Bączek, Tomasz

    2015-10-06

    In this work, the performance of five nature-inspired optimization algorithms, genetic algorithm (GA), particle swarm optimization (PSO), artificial bee colony (ABC), firefly algorithm (FA), and flower pollination algorithm (FPA), was compared in molecular descriptor selection for the development of quantitative structure-retention relationship (QSRR) models for 83 peptides that originate from eight model proteins. A matrix of 423 descriptors was used as input, and QSRR models based on the selected descriptors were built using partial least squares (PLS), with root mean square error of prediction (RMSEP) as the fitness function for their selection. Three performance criteria, prediction accuracy, computational cost, and the number of selected descriptors, were used to evaluate the developed QSRR models. The results show that all five variable selection methods outperform interval PLS (iPLS), sparse PLS (sPLS), and the full PLS model, with GA superior owing to its lowest computational cost and higher accuracy (RMSEP of 5.534%) with a smaller number of variables (nine descriptors). The GA-QSRR model was validated initially through Y-randomization. In addition, it was successfully validated with an external test set of 102 peptides originating from Bacillus subtilis proteomes (RMSEP of 22.030%). Its applicability domain was defined, from which it was evident that the developed GA-QSRR model exhibited strong robustness. All the sources of the model's error were identified, thus allowing for further application of the developed methodology in proteomics.
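
A compact sketch of GA-based descriptor selection, with ordinary least squares standing in for PLS and a toy data set in place of the 423-descriptor peptide matrix (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for the descriptor matrix: 60 samples, 20 candidate
# descriptors, of which only columns 0, 3 and 7 carry signal.
X = rng.normal(size=(60, 20))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.8 * X[:, 7] + 0.05 * rng.normal(size=60)

def rmse(mask):
    """Root mean square error of a least-squares fit on the selected columns."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return np.inf
    beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    return float(np.sqrt(np.mean((X[:, cols] @ beta - y) ** 2)))

def fitness(mask):
    # Small penalty per selected descriptor favours sparse models
    return rmse(mask) + 0.02 * int(mask.sum())

def ga(pop_size=40, gens=60, p_mut=0.02):
    pop = rng.random((pop_size, 20)) < 0.2        # random initial subsets
    for _ in range(gens):
        fits = np.array([fitness(m) for m in pop])
        children = []
        for _ in range(pop_size):
            i, j, k, l = rng.integers(pop_size, size=4)
            p1 = pop[i] if fits[i] < fits[j] else pop[j]    # tournament selection
            p2 = pop[k] if fits[k] < fits[l] else pop[l]
            child = np.where(rng.random(20) < 0.5, p1, p2)  # uniform crossover
            child = child ^ (rng.random(20) < p_mut)        # bit-flip mutation
            children.append(child)
        pop = np.array(children)
    fits = np.array([fitness(m) for m in pop])
    return pop[int(np.argmin(fits))]

best = ga()
print(sorted(np.flatnonzero(best).tolist()))
```

The per-descriptor penalty here plays the role of the parsimony pressure that, in the paper, comes from evaluating RMSEP on held-out data.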

  11. A Semiquantitative Framework for Gene Regulatory Networks: Increasing the Time and Quantitative Resolution of Boolean Networks

    PubMed Central

    Kerkhofs, Johan; Geris, Liesbet

    2015-01-01

    Boolean models have been instrumental in predicting general features of gene networks and, more recently, have also served as explorative tools in specific biological applications. In this study we introduce a basic quantitative and a limited time resolution to a discrete (Boolean) framework. Quantitative resolution is improved through the use of normalized variables combined with an additive approach. Increased time resolution stems from the introduction of two distinct priority classes. Through the implementation of a previously published chondrocyte network and T helper cell network, we show that this addition of quantitative and time resolution broadens the scope of biological behaviour that can be captured by the models. Specifically, the quantitative resolution readily allows models to discern qualitative differences in dosage response to growth factors. The limited time resolution, in turn, can influence the reachability of attractors, delineating the likely long-term system behaviour. Importantly, the information required for implementation of these features, such as the nature of an interaction, is typically obtainable from the literature. Nonetheless, a trade-off is always present between the additional computational cost of this approach and the likelihood of extending the model's scope. Indeed, in some cases the inclusion of these features does not yield additional insight. This framework, incorporating increased and readily available time and semi-quantitative resolution, can help in substantiating the litmus test of dynamics for gene networks, firstly by excluding unlikely dynamics and secondly by refining falsifiable predictions on qualitative behaviour. PMID:26067297
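
The update scheme described above can be sketched as follows: node activities are normalized to [0, 1], inputs combine additively, and nodes in priority class 1 update before those in class 2 within each step. The three-node network and weights are hypothetical:

```python
def step(state, edges, priority):
    """One synchronous step with two priority classes: class-1 nodes update
    first, then class-2 nodes see the already-updated class-1 values."""
    for pclass in (1, 2):
        new = dict(state)
        for node, inputs in edges.items():
            if priority[node] != pclass or not inputs:
                continue   # input-free nodes are held constant
            total = sum(w * state[src] for src, w in inputs)
            new[node] = min(1.0, max(0.0, total))   # clamp to [0, 1]
        state = new
    return state

# Hypothetical network: A drives B (class 1), A and B jointly drive C (class 2)
edges = {"A": [], "B": [("A", 1.0)], "C": [("A", 0.5), ("B", 0.5)]}
priority = {"A": 1, "B": 1, "C": 2}
state = {"A": 0.6, "B": 0.0, "C": 0.0}
for _ in range(3):
    state = step(state, edges, priority)
print(state)  # the graded input level 0.6 propagates through the network
```

Because C is in the slower priority class, it sees the already-updated value of B within the same step; this is exactly the kind of ordering effect that can change which attractor is reached.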

  12. Use of laser 3D surface digitizer in data collection and 3D modeling of anatomical structures

    NASA Astrophysics Data System (ADS)

    Tse, Kelly; Van Der Wall, Hans; Vu, Dzung H.

    2006-02-01

    A laser digitizer (Konica-Minolta Vivid 910) is used to obtain 3-dimensional surface scans of anatomical structures with a maximum resolution of 0.1 mm. Placing the specimen on a turntable allows multiple scans all around, because the scanner only captures data from the portion facing its lens. A computer model is generated using 3D modeling software such as Geomagic. The 3D model can be manipulated on screen for repeated analysis of anatomical features, a useful capability when the specimens are rare or inaccessible (museum collections, fossils, imprints in rock formations). Because accurate measurements can be performed on the computer model, rather than only on the actual specimens at, for example, an archaeological excavation site, a variety of quantitative data can later be obtained in the laboratory as new ideas come to mind. Our group had previously used a mechanical contact digitizer (Microscribe) for this purpose, but with the surface digitizer we have been obtaining data sets more accurately and more quickly.

  13. Simulation of the M13 life cycle I: Assembly of a genetically-structured deterministic chemical kinetic simulation.

    PubMed

    Smeal, Steven W; Schmitt, Margaret A; Pereira, Ronnie Rodrigues; Prasad, Ashok; Fisk, John D

    2017-01-01

    To expand the quantitative, systems-level understanding and foster the expansion of the biotechnological applications of the filamentous bacteriophage M13, we have unified the accumulated quantitative information on M13 biology into a genetically-structured, experimentally-based computational simulation of the entire phage life cycle. The deterministic chemical kinetic simulation explicitly includes the molecular details of DNA replication, mRNA transcription, protein translation and particle assembly, as well as the competing protein-protein and protein-nucleic acid interactions that control the timing and extent of phage production. The simulation reproduces the holistic behavior of M13, closely matching experimentally reported values of the intracellular levels of phage species and the timing of events in the M13 life cycle. The computational model provides a quantitative description of phage biology, highlights gaps in the present understanding of M13, and offers a framework for exploring alternative mechanisms of regulation in the context of the complete M13 life cycle. Copyright © 2016 Elsevier Inc. All rights reserved.
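
A deterministic chemical-kinetic simulation of this kind integrates coupled rate equations for each molecular species. A minimal sketch in the same spirit, with one mRNA and one protein species and illustrative rate constants (not the M13 model's parameters or species set):

```python
# Toy deterministic chemical kinetics: one mRNA species (m) and one protein
# species (p) with first-order decay, integrated with forward Euler.
# Rate constants are illustrative, not values from the M13 simulation.
def simulate(k_tx=10.0, d_m=0.5, k_tl=20.0, d_p=0.1, dt=0.001, t_end=100.0):
    m = p = 0.0
    for _ in range(int(t_end / dt)):
        dm = k_tx - d_m * m          # transcription minus mRNA decay
        dp = k_tl * m - d_p * p      # translation minus protein decay
        m += dt * dm
        p += dt * dp
    return m, p

m, p = simulate()
# Analytical steady states: m* = k_tx/d_m = 20, p* = k_tl*m*/d_p = 4000
print(round(m, 1), round(p, 1))
```

The full M13 model couples dozens of such equations, including the protein-protein and protein-DNA interactions that make the timing of phage production emerge from the kinetics rather than being imposed.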

  14. Effect of Diffusion Limitations on Multianalyte Determination from Biased Biosensor Response

    PubMed Central

    Baronas, Romas; Kulys, Juozas; Lančinskas, Algirdas; Žilinskas, Antanas

    2014-01-01

    The optimization-based quantitative determination of multianalyte concentrations from biased biosensor responses is investigated under internal and external diffusion-limited conditions. A computational model of a biocatalytic amperometric biosensor utilizing a mono-enzyme-catalyzed (nonspecific) competitive conversion of two substrates was used to generate pseudo-experimental responses to mixtures of compounds. The influence of possible perturbations of the biosensor signal, due to a white noise- and temperature-induced trend, on the precision of the concentration determination has been investigated for different configurations of the biosensor operation. The optimization method was found to be suitable and accurate enough for the quantitative determination of the concentrations of the compounds from a given biosensor transient response. The computational experiments showed a complex dependence of the precision of the concentration estimation on the relative thickness of the outer diffusion layer, as well as on whether the biosensor operates under diffusion- or kinetics-limited conditions. When the biosensor response is affected by the induced exponential trend, the duration of the biosensor action can be optimized for increasing the accuracy of the quantitative analysis. PMID:24608006

  15. Metrics for comparing dynamic earthquake rupture simulations

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
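
One simple example of such a quantitative metric is an RMS difference between two codes' time series, normalized by the amplitude of a reference signal; this is an illustrative metric, not the specific suite defined in the paper:

```python
import math

# Normalized RMS difference between two codes' time series sampled at the
# same times (e.g. slip rate recorded at one fault station).
def normalized_rms_difference(ref, other):
    if len(ref) != len(other):
        raise ValueError("series must share the same time samples")
    num = math.sqrt(sum((a - b) ** 2 for a, b in zip(ref, other)) / len(ref))
    den = math.sqrt(sum(a ** 2 for a in ref) / len(ref))
    return num / den

# Hypothetical slip-rate samples from a reference code and a second code
ref   = [0.0, 0.2, 0.9, 1.4, 1.1, 0.6, 0.3]
other = [0.0, 0.25, 0.85, 1.45, 1.05, 0.6, 0.35]
d = normalized_rms_difference(ref, other)
print(round(d, 4))  # -> 0.0529, i.e. about 5% disagreement
```

In practice codes output at different time steps, so benchmark comparisons first interpolate each series onto a common time grid before computing such metrics.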

  16. Analytic Guided-Search Model of Human Performance Accuracy in Target-Localization Search Tasks

    NASA Technical Reports Server (NTRS)

    Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.

    2000-01-01

    Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require the use of Monte-Carlo simulations, a method that makes quantitatively fitting the model's performance to human data more computationally time-consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.

  17. Knowledge brokering on emissions modelling in Strategic Environmental Assessment of Estonian energy policy with special reference to the LEAP model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuldna, Piret, E-mail: piret.kuldna@seit.ee; Peterson, Kaja; Kuhi-Thalfeldt, Reeli

    Strategic Environmental Assessment (SEA) serves as a platform for bringing together researchers, policy developers and other stakeholders to evaluate and communicate significant environmental and socio-economic effects of policies, plans and programmes. Quantitative computer models can facilitate knowledge exchange between various parties that strive to use scientific findings to guide policy-making decisions. The process of facilitating knowledge generation and exchange, i.e. knowledge brokerage, has been increasingly explored, but there is not much evidence in the literature on how knowledge brokerage activities are used in full cycles of SEAs which employ quantitative models. We report on the SEA process of the national energy plan, with reflections on where and how the Long-range Energy Alternatives Planning (LEAP) model was used for knowledge brokerage on emissions modelling between researchers and policy developers. Our main suggestion is that applying a quantitative model not only in ex ante but also in ex post scenario modelling and the associated impact assessment can facilitate a systematic and inspiring knowledge exchange process on a policy problem and build the capacity of participating actors. - Highlights: • We examine the knowledge brokering on emissions modelling between researchers and policy developers in a full cycle of SEA. • Knowledge exchange process can evolve at any modelling stage within SEA. • Ex post scenario modelling enables systematic knowledge exchange and learning on a policy problem.

  18. Variability of Protein Structure Models from Electron Microscopy.

    PubMed

    Monroe, Lyman; Terashi, Genki; Kihara, Daisuke

    2017-04-04

    An increasing number of biomolecular structures are solved by electron microscopy (EM). However, the quality of the structure models determined from EM maps varies substantially. To understand to what extent structure models are supported by the information embedded in EM maps, we used two computational structure refinement methods to examine how much the structures could be refined, using a dataset of 49 maps with accompanying structure models. The extent of structure modification, as well as the disagreement between the refined models produced by the two computational methods, scaled inversely with the global and local map resolutions. A general quantitative estimate of the deviations of structures at particular map resolutions is provided. Our results indicate that the observed discrepancy between the deposited map and the refined models is due to the lack of structural information present in EM maps, and thus these annotations must be used with caution for further applications. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Atomistic simulations of carbon diffusion and segregation in liquid silicon

    NASA Astrophysics Data System (ADS)

    Luo, Jinping; Alateeqi, Abdullah; Liu, Lijun; Sinno, Talid

    2017-12-01

    The diffusivity of carbon atoms in liquid silicon and their equilibrium distribution between the silicon melt and crystal phases are key, but unfortunately not precisely known, parameters for global models of silicon solidification processes. In this study, we apply a suite of molecular simulation tools, driven by multiple empirical potential models, to compute diffusion and segregation coefficients of carbon at the silicon melting temperature. We generally find good consistency across the potential model predictions, although some exceptions are identified and discussed. We also find good agreement with the range of available experimental measurements of segregation coefficients. However, the carbon diffusion coefficients we compute are significantly lower than the values typically assumed in continuum models of impurity distribution. Overall, we show that currently available empirical potential models may be useful, at least semi-quantitatively, for studying carbon (and possibly other impurity) transport in silicon solidification, especially if a multi-model approach is taken.
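
Diffusion coefficients in such simulations are typically extracted from the mean-squared displacement via the Einstein relation, MSD(t) = 6Dt in three dimensions. A sketch with a simple lattice random walk standing in for the molecular-dynamics trajectories (all parameters illustrative):

```python
import random

# Diffusion coefficient from mean-squared displacement via the Einstein
# relation: in 3-D, MSD(t) = 6 D t, so D = MSD / (6 t).
def msd_diffusion(n_walkers=1000, steps=400, dt=1.0e-3, step_len=0.1, seed=1):
    rng = random.Random(seed)
    msd = 0.0
    for _ in range(n_walkers):
        x = y = z = 0.0
        for _ in range(steps):
            x += rng.choice((-step_len, step_len))
            y += rng.choice((-step_len, step_len))
            z += rng.choice((-step_len, step_len))
        msd += x * x + y * y + z * z
    msd /= n_walkers
    t = steps * dt
    return msd / (6.0 * t)

# For this lattice walk the exact answer is D = 3*step_len^2 / (6*dt) = 5.0
D = msd_diffusion()
print(round(D, 2))
```

In practice D is obtained from the slope of MSD versus t over a window where the motion is diffusive, averaged over many carbon atoms and time origins.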

  20. Solutions of the epidemic of EIAV infection by HPM

    NASA Astrophysics Data System (ADS)

    Balamuralitharan, S.; Geethamalini, S.

    2018-04-01

    In this article, the Homotopy Perturbation Method (HPM) is applied to obtain approximate solutions of a model of the epidemic of Equine Infectious Anemia Virus (EIAV) infection. This technique provides a direct scheme for solving the problem. MATLAB is used to carry out the computations. Graphical results are displayed and discussed quantitatively, illustrating the simplicity of the method.

  1. Innovation from a Computational Social Science Perspective: Analyses and Models

    ERIC Educational Resources Information Center

    Casstevens, Randy M.

    2013-01-01

    Innovation processes are critical for preserving and improving our standard of living. While innovation has been studied by many disciplines, the focus has been on qualitative measures that are specific to a single technological domain. I adopt a quantitative approach to investigate underlying regularities that generalize across multiple domains.…

  2. US EPA - A*Star Partnership - Accelerating the Acceptance of Next-Generation Sciences and Their Application to Regulatory Risk Assessment (A*Star Symposium, Singapore)

    EPA Science Inventory

    The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate...

  3. Non-invasive brain stimulation and computational models in post-stroke aphasic patients: single session of transcranial magnetic stimulation and transcranial direct current stimulation. A randomized clinical trial.

    PubMed

    Santos, Michele Devido Dos; Cavenaghi, Vitor Breseghello; Mac-Kay, Ana Paula Machado Goyano; Serafim, Vitor; Venturi, Alexandre; Truong, Dennis Quangvinh; Huang, Yu; Boggio, Paulo Sérgio; Fregni, Felipe; Simis, Marcel; Bikson, Marom; Gagliardi, Rubens José

    2017-01-01

    Patients undergoing the same neuromodulation protocol may present different responses. Computational models may help in understanding such differences. The aims of this study were, firstly, to compare the performance of aphasic patients in naming tasks before and after one session of transcranial direct current stimulation (tDCS), transcranial magnetic stimulation (TMS) and sham stimulation, and to analyze the results between these neuromodulation techniques; and secondly, through a computational model of the cortex and surrounding tissues, to assess current flow distribution and responses among patients who received tDCS and presented different levels of results on the naming tasks. Prospective, descriptive, qualitative and quantitative, double-blind, randomized and placebo-controlled study conducted at Faculdade de Ciências Médicas da Santa Casa de São Paulo. Patients with aphasia received one session of tDCS, TMS or sham stimulation. The time taken to name pictures and the response time were evaluated before and after neuromodulation. Selected patients from the first intervention underwent a computational model stimulation procedure that simulated tDCS. The results did not indicate any statistically significant differences from before to after the stimulation. The computational models showed different current flow distributions. The present study did not show any statistically significant difference between tDCS, TMS and sham stimulation regarding naming tasks. The patients' responses to the computational model showed different patterns of current distribution.

  4. Bayesian Latent Class Analysis Tutorial.

    PubMed

    Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca

    2018-01-01

This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes' theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied in a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
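The decomposition the tutorial describes (sampling class labels, then class proportions, then item probabilities in turn) can be sketched compactly. The following is a minimal illustration in Python rather than the article's R program, with simulated data, conjugate Beta/Dirichlet priors, and illustrative dimensions:

```python
# Minimal Gibbs sampler for a two-class latent class model with binary
# items, using only NumPy. All names, priors and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Simulate data: N respondents, J binary items, K latent classes.
N, J, K = 200, 4, 2
true_theta = np.array([[0.9, 0.8, 0.9, 0.7],   # class 0 endorsement probs
                       [0.2, 0.3, 0.1, 0.2]])  # class 1 endorsement probs
z_true = rng.integers(0, K, size=N)
X = (rng.random((N, J)) < true_theta[z_true]).astype(int)

def gibbs_lca(X, K=2, iters=300, seed=1):
    rng = np.random.default_rng(seed)
    N, J = X.shape
    pi = np.full(K, 1.0 / K)              # class proportions
    theta = rng.random((K, J))            # item endorsement probabilities
    for _ in range(iters):
        # 1) sample class labels z from their full conditional
        logp = np.log(pi) + X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=row) for row in p])
        # 2) sample pi ~ Dirichlet(1 + class counts)
        counts = np.bincount(z, minlength=K)
        pi = rng.dirichlet(1 + counts)
        # 3) sample theta[k, j] ~ Beta(1 + successes, 1 + failures)
        for k in range(K):
            sk = X[z == k].sum(axis=0)
            theta[k] = rng.beta(1 + sk, 1 + counts[k] - sk)
    return pi, theta

pi_hat, theta_hat = gibbs_lca(X)
```

Each sweep is exactly the "series of simpler calculations" the abstract mentions: one categorical draw per respondent, one Dirichlet draw, and one Beta draw per class and item.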

  5. Impacts of Fluid Dynamics Simulation in Study of Nasal Airflow Physiology and Pathophysiology in Realistic Human Three-Dimensional Nose Models

    PubMed Central

    Lee, Heow Peuh; Gordon, Bruce R.

    2012-01-01

During the past decades, numerous computational fluid dynamics (CFD) studies have simulated airflow in human nasal models constructed from CT or MRI images. As compared to rhinomanometry and acoustic rhinometry, which provide quantitative information only on nasal airflow, resistance, and cross-sectional areas, CFD enables additional measurements of airflow passing through the nasal cavity that help visualize the physiologic impact of alterations in intranasal structures. Therefore, it becomes possible to quantitatively measure, and visually appreciate, the airflow pattern (laminar or turbulent), velocity, pressure, wall shear stress, particle deposition, and temperature changes at different flow rates, in different parts of the nasal cavity. The effects of both existing anatomical factors, as well as post-operative changes, can be assessed. With recent improvements in CFD technology and computing power, there is a promising future for CFD to become a useful tool in planning, predicting, and evaluating outcomes of nasal surgery. This review discusses the possibilities and potential impacts, as well as technical limitations, of using CFD simulation to better understand nasal airflow physiology. PMID:23205221

  6. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    PubMed

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
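The failure mode motivating the method, a standard linear regression that treats censored values as fully observed, is easy to demonstrate with a small simulation. This sketch is not the authors' grouped-regression implementation; the censoring threshold and effect sizes are illustrative assumptions:

```python
# Demonstrates the slope attenuation caused by ignoring right-censoring
# in ordinary least squares on a survival-like trait.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
x = rng.normal(size=n)                               # genotype-like covariate
y = 10.0 + 2.0 * x + rng.normal(scale=1.0, size=n)   # latent age-at-onset

c = 11.0                        # right-censoring threshold (illustrative)
y_obs = np.minimum(y, c)        # what a naive analysis would observe

def ols_slope(x, y):
    """Least-squares slope of y on x."""
    xc = x - x.mean()
    return float((xc * (y - y.mean())).sum() / (xc ** 2).sum())

slope_full = ols_slope(x, y)      # close to the true slope of 2.0
slope_cens = ols_slope(x, y_obs)  # attenuated: censoring shrinks the slope
```

The grouped-regression idea in the abstract addresses exactly this bias by modeling the censored observations rather than treating them as exact.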

  7. Single Cell Genomics: Approaches and Utility in Immunology

    PubMed Central

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-01-01

Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data, discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102

  8. Machine learning methods for classifying human physical activity from on-body accelerometers.

    PubMed

    Mannini, Andrea; Sabatini, Angelo Maria

    2010-01-01

    The use of on-body wearable sensors is widespread in several academic and industrial domains. Of great interest are their applications in ambulatory monitoring and pervasive computing systems; here, some quantitative analysis of human motion and its automatic classification are the main computational tasks to be pursued. In this paper, we discuss how human physical activity can be classified using on-body accelerometers, with a major emphasis devoted to the computational algorithms employed for this purpose. In particular, we motivate our current interest for classifiers based on Hidden Markov Models (HMMs). An example is illustrated and discussed by analysing a dataset of accelerometer time series.
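A minimal version of the HMM idea can be illustrated with discretized acceleration magnitudes and Viterbi decoding. The two states, three magnitude bins, and all probabilities below are illustrative assumptions, not the paper's trained classifier:

```python
# Two-state HMM ("rest" vs "walk") decoding binned accelerometer
# magnitudes with the Viterbi algorithm, in log space for stability.
import numpy as np

states = ["rest", "walk"]
A = np.array([[0.9, 0.1],            # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.8, 0.15, 0.05],     # emission probs over 3 magnitude bins
              [0.1, 0.30, 0.60]])
pi = np.array([0.5, 0.5])            # initial state distribution

def viterbi(obs, pi, A, B):
    """Most likely state path for a sequence of discrete observations."""
    T, K = len(obs), len(pi)
    delta = np.zeros((T, K))           # best log-probability ending in state k
    psi = np.zeros((T, K), dtype=int)  # backpointers
    delta[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + np.log(A)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):     # backtrack
        path[t] = psi[t + 1, path[t + 1]]
    return path

obs = [0, 0, 1, 2, 2, 2, 1, 0]   # binned |acceleration| samples
path = viterbi(obs, pi, A, B)
```

The transition matrix is what distinguishes this from a frame-by-frame classifier: isolated ambiguous samples are smoothed by the temporal model.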

  9. Modeling the filament winding process

    NASA Technical Reports Server (NTRS)

    Calius, E. P.; Springer, G. S.

    1985-01-01

    A model is presented which can be used to determine the appropriate values of the process variables for filament winding a cylinder. The model provides the cylinder temperature, viscosity, degree of cure, fiber position and fiber tension as functions of position and time during the filament winding and subsequent cure, and the residual stresses and strains within the cylinder during and after the cure. A computer code was developed to obtain quantitative results. Sample results are given which illustrate the information that can be generated with this code.
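As a hedged sketch of the kind of quantitative result such a code produces, the degree of cure can be integrated from an nth-order Arrhenius cure-kinetics law, dalpha/dt = A*exp(-E/(R*T))*(1-alpha)^n, at constant temperature. The rate parameters below are illustrative, not the paper's material data:

```python
# Explicit-Euler integration of a generic nth-order cure-kinetics ODE.
import math

def degree_of_cure(T, t_end, A=1.0e5, E=60_000.0, n=1.5, dt=1.0):
    """Degree of cure alpha after t_end seconds at temperature T (K).

    A  : pre-exponential factor, 1/s (illustrative)
    E  : activation energy, J/mol (illustrative)
    n  : reaction order (illustrative)
    """
    R = 8.314                      # gas constant, J/(mol K)
    k = A * math.exp(-E / (R * T)) # Arrhenius rate constant
    alpha, t = 0.0, 0.0
    while t < t_end:
        alpha += dt * k * (1.0 - alpha) ** n
        alpha = min(alpha, 1.0)    # cure cannot exceed completion
        t += dt
    return alpha

a_low = degree_of_cure(370.0, 3600.0)   # one hour at 370 K: partial cure
a_high = degree_of_cure(430.0, 3600.0)  # hotter cure proceeds much further
```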

  10. Quantitative structure-activity relationship: promising advances in drug discovery platforms.

    PubMed

    Wang, Tao; Wu, Mian-Bin; Lin, Jian-Ping; Yang, Li-Rong

    2015-12-01

Quantitative structure-activity relationship (QSAR) modeling is one of the most popular computer-aided tools employed in medicinal chemistry for drug discovery and lead optimization. It is especially powerful in the absence of 3D structures of specific drug targets. QSAR methods have attracted considerable attention since they were first introduced. In this review, the authors provide a brief discussion of the basic principles of QSAR, model development and model validation. They also highlight the current applications of QSAR in different fields, particularly in virtual screening, rational drug design and multi-target QSAR. Finally, in view of recent controversies, the authors detail the challenges faced by QSAR modeling and the relevant solutions. The aim of this review is to show how QSAR modeling can be applied in novel drug discovery, design and lead optimization. QSAR can serve as a powerful tool for fragment-based drug design platforms in the field of drug discovery and design. Although there have been an increasing number of experimentally determined protein structures in recent years, a great number of protein structures cannot be easily obtained (i.e., membrane transport proteins and G-protein coupled receptors). Fragment-based drug discovery, such as QSAR, could be applied further and have a significant role in dealing with these problems. Moreover, along with the development of computer software and hardware, it is believed that QSAR will be increasingly important.

  11. Computational physiology and the Physiome Project.

    PubMed

    Crampin, Edmund J; Halstead, Matthew; Hunter, Peter; Nielsen, Poul; Noble, Denis; Smith, Nicolas; Tawhai, Merryn

    2004-01-01

    Bioengineering analyses of physiological systems use the computational solution of physical conservation laws on anatomically detailed geometric models to understand the physiological function of intact organs in terms of the properties and behaviour of the cells and tissues within the organ. By linking behaviour in a quantitative, mathematically defined sense across multiple scales of biological organization--from proteins to cells, tissues, organs and organ systems--these methods have the potential to link patient-specific knowledge at the two ends of these spatial scales. A genetic profile linked to cardiac ion channel mutations, for example, can be interpreted in relation to body surface ECG measurements via a mathematical model of the heart and torso, which includes the spatial distribution of cardiac ion channels throughout the myocardium and the individual kinetics for each of the approximately 50 types of ion channel, exchanger or pump known to be present in the heart. Similarly, linking molecular defects such as mutations of chloride ion channels in lung epithelial cells to the integrated function of the intact lung requires models that include the detailed anatomy of the lungs, the physics of air flow, blood flow and gas exchange, together with the large deformation mechanics of breathing. Organizing this large body of knowledge into a coherent framework for modelling requires the development of ontologies, markup languages for encoding models, and web-accessible distributed databases. In this article we review the state of the field at all the relevant levels, and the tools that are being developed to tackle such complexity. Integrative physiology is central to the interpretation of genomic and proteomic data, and is becoming a highly quantitative, computer-intensive discipline.

  12. Spatiotemporal dynamics of the Ebola epidemic in Guinea and implications for vaccination and disease elimination: a computational modeling analysis.

    PubMed

    Ajelli, Marco; Merler, Stefano; Fumanelli, Laura; Pastore Y Piontti, Ana; Dean, Natalie E; Longini, Ira M; Halloran, M Elizabeth; Vespignani, Alessandro

    2016-09-07

    Among the three countries most affected by the Ebola virus disease outbreak in 2014-2015, Guinea presents an unusual spatiotemporal epidemic pattern, with several waves and a long tail in the decay of the epidemic incidence. Here, we develop a stochastic agent-based model at the level of a single household that integrates detailed data on Guinean demography, hospitals, Ebola treatment units, contact tracing, and safe burial interventions. The microsimulation-based model is used to assess the effect of each control strategy and the probability of elimination of the epidemic according to different intervention scenarios, including ring vaccination with the recombinant vesicular stomatitis virus-vectored vaccine. The numerical results indicate that the dynamics of the Ebola epidemic in Guinea can be quantitatively explained by the timeline of the implemented interventions. In particular, the early availability of Ebola treatment units and the associated isolation of cases and safe burials helped to limit the number of Ebola cases experienced by Guinea. We provide quantitative evidence of a strong negative correlation between the time series of cases and the number of traced contacts. This result is confirmed by the computational model that suggests that contact tracing effort is a key determinant in the control and elimination of the disease. In data-driven microsimulations, we find that tracing at least 5-10 contacts per case is crucial in preventing epidemic resurgence during the epidemic elimination phase. The computational model is used to provide an analysis of the ring vaccination trial highlighting its potential effect on disease elimination. We identify contact tracing as one of the key determinants of the epidemic's behavior in Guinea, and we show that the early availability of Ebola treatment unit beds helped to limit the number of Ebola cases in Guinea.
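The qualitative role of contact tracing can be illustrated with a branching process far simpler than the agent-based model described above: each case generates a Poisson number of secondary cases, and traced contacts are isolated before transmitting. All parameter values here are illustrative:

```python
# Branching-process sketch of outbreak control by contact tracing.
import numpy as np

def outbreak_size(r0, traced_fraction, seeds=10, max_cases=10_000, seed=0):
    """Total cases in a branching process starting from `seeds` cases.

    r0              : mean secondary cases per untraced case (illustrative)
    traced_fraction : share of contacts isolated before transmitting
    """
    rng = np.random.default_rng(seed)
    active, total = seeds, seeds
    while active and total < max_cases:
        offspring = rng.poisson(r0, size=active)
        # traced contacts are isolated and produce no further cases
        new = int(rng.binomial(offspring, 1.0 - traced_fraction).sum())
        active = new
        total += new
    return total

no_tracing = outbreak_size(1.8, 0.0)     # effective R = 1.8: epidemic grows
with_tracing = outbreak_size(1.8, 0.6)   # effective R = 0.72: dies out
```

Tracing 60% of contacts pushes the effective reproduction number below one, which is the mechanism behind the paper's finding that tracing several contacts per case prevents resurgence.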

  13. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    PubMed

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to Github, (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core, and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
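The object-level parallelization strategy, in which features for each tumor are computed independently and then gathered, can be sketched as a pool mapping a feature function over volumes. This illustration uses a thread pool and a toy feature set for simplicity; QIFE itself is MATLAB code, and a production system might use process-level parallelism instead:

```python
# Object-level parallelism: one independent task per segmented object.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def compute_features(volume):
    """Toy 3D feature vector: mean, standard deviation and voxel count."""
    v = np.asarray(volume, dtype=float)
    return {"mean": float(v.mean()), "std": float(v.std()), "voxels": v.size}

def run_engine(volumes, workers=4):
    # Each object is processed independently, so they map cleanly to a pool.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(compute_features, volumes))

results = run_engine([np.zeros((4, 4, 4)), np.ones((4, 4, 4))])
```

This is the trade-off the abstract notes: more workers reduce wall-clock time at the cost of holding more objects in memory at once.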

  14. Quantitative paleotopography and paleogeography around the Gibraltar Arc (South Spain) during the Messinian Salinity Crisis

    NASA Astrophysics Data System (ADS)

    Elez, Javier; Silva, Pablo G.; Huerta, Pedro; Perucha, M. Ángeles; Civis, Jorge; Roquero, Elvira; Rodríguez-Pascua, Miguel A.; Bardají, Teresa; Giner-Robles, Jorge L.; Martínez-Graña, Antonio

    2016-12-01

The Malaga basin contains an important geological record documenting the complex paleogeographic evolution of the Gibraltar Arc before, during and after the closure and desiccation of the Mediterranean Sea triggered by the "Messinian Salinity crisis" (MSC). Proxy paleo-elevation data, estimated from the stratigraphic and geomorphological records, allow the building of quantitative paleogeoid, paleotopographic and paleogeographic models for the three main paleogeographic stages: pre-MSC (Tortonian-early Messinian), syn-MSC (late Messinian) and post-MSC (early Pliocene). The methodological workflow combines classical contouring procedures used in geology and isobase map models from geomorphometric analyses and proxy data overprinted on present Digital Terrain Models. The resulting terrain quantitative models have been arranged, managed and computed in a GIS environment. The computed terrain models enable the exploration of past landscapes usually beyond the reach of classical geomorphological analyses and strongly improve the paleogeographic and paleotopographic knowledge of the study area. The resulting models suggest the occurrence of a set of uplifted littoral erosive and paleokarstic landforms that evolved during pre-MSC times. These uplifted landform assemblages can explain the origin of key elements of the present landscape, such as the Torcal de Antequera and the large amount of mogote-like relict hills present in the zone, in terms of ancient uplifted tropical islands. The most prominent landform is the extensive erosional platform dominating the Betic frontal zone that represents the relic Atlantic wave-cut platform elaborated during late-Tortonian to early Messinian times. The amount of uplift derived from paleogeoid models suggests that the area rose by about 340 m during the MSC. This points to isostatic uplift triggered by differential erosional unloading (towards the Mediterranean) as the main factor controlling landscape evolution in the area during and after the MSC. Former littoral landscapes in the old emergent axis of the Gibraltar Arc were uplifted to form the main water-divide of the present Betic Cordillera in the zone.

  15. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    PubMed Central

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach dramatically improves the spatial resolution, revealing finer details within a region of interest of a sample larger than the field of view than conventional techniques can resolve. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials. PMID:24763649

  16. In vivo quantitative bioluminescence tomography using heterogeneous and homogeneous mouse models.

    PubMed

    Liu, Junting; Wang, Yabin; Qu, Xiaochao; Li, Xiangsi; Ma, Xiaopeng; Han, Runqiang; Hu, Zhenhua; Chen, Xueli; Sun, Dongdong; Zhang, Rongqing; Chen, Duofang; Chen, Dan; Chen, Xiaoyuan; Liang, Jimin; Cao, Feng; Tian, Jie

    2010-06-07

Bioluminescence tomography (BLT) is a new optical molecular imaging modality, which can monitor both physiological and pathological processes by using bioluminescent light-emitting probes in small living animals. In particular, this technology possesses great potential in drug development, early detection, and therapy monitoring in preclinical settings. In the present study, we developed a dual-modality BLT prototype system with a micro-computed tomography (MicroCT) registration approach, and improved the quantitative reconstruction algorithm based on an adaptive hp finite element method (hp-FEM). Detailed comparisons of source reconstruction between the heterogeneous and homogeneous mouse models were performed. The models include mice with an implanted luminescence source and tumor-bearing mice with a firefly luciferase reporter gene. Our data suggest that reconstruction based on the heterogeneous mouse model is more accurate in localization and quantification than the homogeneous mouse model with appropriate optical parameters, and that BLT allows super-early tumor detection in vivo based on tomographic reconstruction of the heterogeneous mouse model signal.

  17. Quantitative Structure – Property Relationship Modeling of Remote Liposome Loading Of Drugs

    PubMed Central

    Cern, Ahuva; Golbraikh, Alexander; Sedykh, Aleck; Tropsha, Alexander; Barenholz, Yechezkel; Goldblum, Amiram

    2012-01-01

    Remote loading of liposomes by trans-membrane gradients is used to achieve therapeutically efficacious intra-liposome concentrations of drugs. We have developed Quantitative Structure Property Relationship (QSPR) models of remote liposome loading for a dataset including 60 drugs studied in 366 loading experiments internally or elsewhere. Both experimental conditions and computed chemical descriptors were employed as independent variables to predict the initial drug/lipid ratio (D/L) required to achieve high loading efficiency. Both binary (to distinguish high vs. low initial D/L) and continuous (to predict real D/L values) models were generated using advanced machine learning approaches and five-fold external validation. The external prediction accuracy for binary models was as high as 91–96%; for continuous models the mean coefficient R2 for regression between predicted versus observed values was 0.76–0.79. We conclude that QSPR models can be used to identify candidate drugs expected to have high remote loading capacity while simultaneously optimizing the design of formulation experiments. PMID:22154932
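The overall shape of such a workflow (descriptors in, a binary label out, accuracy estimated by five-fold validation) can be sketched with synthetic data. Nothing below reproduces the authors' dataset or machine-learning methods; the "descriptors", the plain logistic model, and all sizes are illustrative:

```python
# Binary QSPR-style classifier with five-fold cross-validation in NumPy.
import numpy as np

rng = np.random.default_rng(7)
n, d = 300, 5
X = rng.normal(size=(n, d))                     # synthetic chemical descriptors
w_true = np.array([1.5, -2.0, 0.0, 1.0, 0.5])
y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(int)  # high vs low D/L

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain gradient-ascent fit of a logistic regression (no intercept)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w += lr * X.T @ (y - p) / len(y)        # log-likelihood gradient
    return w

def cv_accuracy(X, y, folds=5):
    """Mean held-out accuracy over `folds` disjoint validation folds."""
    idx = np.arange(len(y))
    accs = []
    for f in range(folds):
        test = idx % folds == f
        w = fit_logistic(X[~test], y[~test])
        pred = (X[test] @ w > 0).astype(int)
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

acc = cv_accuracy(X, y)
```

The held-out evaluation is the key discipline the abstract emphasizes: the 91-96% figures reported there are external-validation accuracies, not training fit.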

  18. Tractable Experiment Design via Mathematical Surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

This presentation summarizes the development and implementation of quantitative design criteria motivated by targeted inference objectives for identifying new, potentially expensive computational or physical experiments. The first application is concerned with estimating features of quantities of interest arising from complex computational models, such as quantiles or failure probabilities. A sequential strategy is proposed for iterative refinement of the importance distributions used to efficiently sample the uncertain inputs to the computational model. In the second application, effective use of mathematical surrogates is investigated to help alleviate the analytical and numerical intractability often associated with Bayesian experiment design. This approach allows for the incorporation of prior information into the design process without the need for gross simplification of the design criterion. Illustrative examples of both design problems will be presented as an argument for the relevance of these research problems.

  19. Forest management and economics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buongiorno, J.; Gilless, J.K.

    1987-01-01

    This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.

  20. Toward computational crime prediction. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Ferrara, Emilio

    2015-03-01

Containing the spread of crime in modern society is an ongoing battle: our understanding of the dynamics underlying criminal events and the motives of the individuals involved is crucial to designing cost-effective prevention policies and intervention strategies. During recent years we have witnessed various research fields joining forces, sharing models and methods, toward modeling and quantitatively characterizing crime and criminal behavior.

  1. Calculation of thermodynamic functions of aluminum plasma for high-energy-density systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumaev, V. V., E-mail: shumaev@student.bmstu.ru

The results of calculating the degree of ionization, the pressure, and the specific internal energy of aluminum plasma in a wide temperature range are presented. The TERMAG computational code based on the Thomas–Fermi model was used at temperatures T > 10^5 K, and the ionization equilibrium model (Saha model) was applied at lower temperatures. Quantitatively similar results were obtained in the temperature range where both models are applicable. This suggests that the obtained data may be joined to produce a wide-range equation of state.
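For the lower-temperature branch, the Saha ionization-equilibrium calculation for a single ionization stage reduces to a quadratic in the ionization fraction x, via x^2/(1-x) = S(T)/n. A hedged numerical sketch follows, with the statistical-weight ratio set to 1, an illustrative total density, and only the first ionization of aluminum (chi of roughly 5.99 eV) treated:

```python
# Degree of single ionization from the Saha equation for an ideal plasma.
import math

ME = 9.109e-31      # electron mass, kg
KB = 1.381e-23      # Boltzmann constant, J/K
H = 6.626e-34       # Planck constant, J s
EV = 1.602e-19      # joules per electronvolt

def ionization_fraction(T, n_total, chi_ev=5.99, g_ratio=1.0):
    """Solve x^2/(1-x) = S(T)/n_total for the ionization fraction x."""
    S = 2.0 * g_ratio * (2.0 * math.pi * ME * KB * T / H**2) ** 1.5 \
        * math.exp(-chi_ev * EV / (KB * T))
    a = S / n_total
    # x^2 + a*x - a = 0, take the positive root
    return (-a + math.sqrt(a * a + 4.0 * a)) / 2.0

# Ionization rises steeply with temperature at fixed density (n in m^-3).
x_cold = ionization_fraction(5.0e3, 1.0e26)   # nearly neutral
x_hot = ionization_fraction(3.0e4, 1.0e26)    # nearly fully singly ionized
```

A full equation-of-state code would iterate this over successive ionization stages and couple it to pressure and internal-energy sums; this sketch shows only the core equilibrium step.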

  2. Chromatic Image Analysis For Quantitative Thermal Mapping

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  3. Photons Revisited

    NASA Astrophysics Data System (ADS)

    Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg

    2014-06-01

    A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.

  4. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  5. Client-server programs analysis in the EPOCA environment

    NASA Astrophysics Data System (ADS)

    Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano

    1996-09-01

Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer has first to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.

  6. [Modeling continuous scaling of NDVI based on fractal theory].

    PubMed

    Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng

    2013-07-01

Scale effect is one of the most important scientific problems in quantitative remote sensing. The scale effect can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-changing features of retrievals across an entire series of scales; meanwhile, they face serious parameter-correction issues because of the variation of imaging parameters between different sensors, such as geometrical correction, spectral correction, etc. Utilizing a single-sensor image, fractal methodology was employed to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists, and it can be described by a fractal model of continuous scaling; (2) the fractal method is suitable for the validation of NDVI. All of these proved that fractal analysis is an effective methodology for studying the scaling of quantitative remote sensing.
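The up-scaling setup, computing a retrieval at successively aggregated pixel sizes and then fitting a power-law (fractal-type) relation across scales, can be sketched with a synthetic scene. The bands, scales, and log-log fit below are illustrative, not the paper's ETM+ data or its exact fractal model:

```python
# Block-aggregate synthetic red/NIR bands and trace NDVI across scales.
import numpy as np

rng = np.random.default_rng(3)
size = 64
red = rng.uniform(0.05, 0.2, size=(size, size))   # synthetic reflectances
nir = rng.uniform(0.3, 0.6, size=(size, size))

def aggregate(band, s):
    """Mean-aggregate a 2D band over s x s blocks (coarser pixel size)."""
    n = band.shape[0] // s
    return band[:n * s, :n * s].reshape(n, s, n, s).mean(axis=(1, 3))

def ndvi(red, nir):
    return (nir - red) / (nir + red)

# NDVI computed from bands aggregated to each scale (the up-scaling chain).
scales = [1, 2, 4, 8, 16]
means = [float(ndvi(aggregate(red, s), aggregate(nir, s)).mean())
         for s in scales]

# Power-law (fractal-type) fit: log(mean NDVI) against log(scale).
slope, intercept = np.polyfit(np.log(scales), np.log(means), 1)
```

Because NDVI is a nonlinear ratio of the bands, aggregating the bands first and computing NDVI afterwards is not the same as averaging fine-scale NDVI, which is the source of the scale effect the abstract describes.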

  7. On the origin of the electrostatic potential difference at a liquid-vacuum interface.

    PubMed

    Harder, Edward; Roux, Benoît

    2008-12-21

The microscopic origin of the interface potential calculated from computer simulations is elucidated by considering a simple model of molecules near an interface. The model posits that molecules are isotropically oriented and their charge density is Gaussian distributed. Molecules that have a charge density that is more negative toward their interior tend to give rise to a negative interface potential relative to the gaseous phase, while charge densities more positive toward their interior give rise to a positive interface potential. The interface potential for the model is compared to the interface potential computed from molecular dynamics simulations of the nonpolar vacuum-methane system and the polar vacuum-water interface system. The computed vacuum-methane interface potential from a molecular dynamics simulation (-220 mV) is captured with quantitative precision by the model. For the vacuum-water interface system, the model predicts a potential of -400 mV compared to -510 mV, calculated from a molecular dynamics simulation. The physical implications of this isotropic contribution to the interface potential are examined using the example of ion solvation in liquid methane.

  8. Different treatment modalities of fusiform basilar trunk aneurysm: study on computational hemodynamics.

    PubMed

    Wu, Chen; Xu, Bai-Nan; Sun, Zheng-Hui; Wang, Fu-Yu; Liu, Lei; Zhang, Xiao-Jun; Zhou, Ding-Biao

    2012-01-01

    Unclippable fusiform basilar trunk aneurysms are formidable to treat surgically. The aim of this study was to establish a computational model and investigate the hemodynamic characteristics of a fusiform basilar trunk aneurysm. A three-dimensional digital model of a fusiform basilar trunk aneurysm was constructed using the MIMICS, ANSYS and CFX software. Different hemodynamic modalities and boundary conditions were assigned to the model. Thirty points were selected at random on the wall and within the aneurysm. Wall total pressure (WTP), wall shear stress (WSS), and blood flow velocity were calculated at each point, and hemodynamic status was compared between modalities. Quantitative average values over the 30 points on the wall and within the aneurysm were obtained by point-by-point computation. The velocity and WSS in modalities A and B differed from those of the remaining five modalities, and the WTP in modalities A, E and F was higher than in the remaining four modalities. The digital model of a fusiform basilar artery aneurysm is feasible and reliable, and could provide important information for clinical treatment decisions.

  9. [Contribution of X-ray computed tomography in the evaluation of kidney performance].

    PubMed

    Lemoine, Sandrine; Rognant, Nicolas; Collet-Benzaquen, Diane; Juillard, Laurent

    2012-07-01

    X-ray computed tomography (CT) is an imaging method based on the attenuation of X-rays in tissue. This attenuation is proportional to the density of the tissue (without or after contrast media injection) in each pixel of the image. Spiral CT, electron beam computed tomography (EBCT) and multidetector computed tomography allow renal anatomical measurements, such as cortical and medullary volume, as well as the measurement of renal functional parameters, such as regional renal perfusion, renal blood flow and glomerular filtration rate. These functional parameters are extracted by modeling the kinetics of the contrast media concentration in the vascular space and the renal tissue, using two main mathematical models (the gamma-variate model and the Patlak model). Renal functional imaging measures quantitative parameters for each kidney separately and non-invasively, providing significant opportunities in nephrology for both experimental and clinical studies. However, this method uses contrast media that may alter renal function, limiting its use in patients with chronic renal failure. Moreover, the increased irradiation delivered to the patient with multidetector computed tomography (MDCT) should be considered. Copyright © 2011 Association Société de néphrologie. Published by Elsevier SAS. All rights reserved.
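Of the two kinetic models mentioned, the Patlak model reduces to a straight-line fit whose slope estimates the filtration (uptake) rate constant. A self-contained sketch on synthetic curves (the input function, flow value, and vascular fraction are invented for illustration):

```python
import numpy as np

def patlak_fit(t, c_plasma, c_tissue):
    """
    Patlak linearization: C_t(t)/C_p(t) = K * (int_0^t C_p du)/C_p(t) + V0.
    Returns (K, V0) from a least-squares line fit.
    """
    integral = np.concatenate(
        ([0.0], np.cumsum(0.5 * (c_plasma[1:] + c_plasma[:-1]) * np.diff(t))))
    x = integral / c_plasma
    y = c_tissue / c_plasma
    K, V0 = np.polyfit(x, y, 1)
    return K, V0

# Synthetic check: build a tissue curve from known K and V0, then recover them.
t = np.linspace(0.1, 5.0, 50)              # minutes
c_p = 100.0 * np.exp(-0.3 * t)             # plasma concentration (arbitrary units)
K_true, V0_true = 0.25, 0.1                # uptake rate and vascular fraction
integral = np.concatenate(([0.0], np.cumsum(0.5 * (c_p[1:] + c_p[:-1]) * np.diff(t))))
c_t = K_true * integral + V0_true * c_p

K, V0 = patlak_fit(t, c_p, c_t)
print(f"K = {K:.3f}, V0 = {V0:.3f}")
```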

  10. Automatic inference of multicellular regulatory networks using informative priors.

    PubMed

    Sun, Xiaoyun; Hong, Pengyu

    2009-01-01

    To fully understand the mechanisms governing animal development, computational models and algorithms are needed to enable quantitative studies of the underlying regulatory networks. We developed a mathematical model based on dynamic Bayesian networks to model multicellular regulatory networks that govern cell differentiation processes. A machine-learning method was developed to automatically infer such a model from heterogeneous data. We show that the model inference procedure can be greatly improved by incorporating interaction data across species. The proposed approach was applied to C. elegans vulval induction to reconstruct a model capable of simulating C. elegans vulval induction under 73 different genetic conditions.
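A toy illustration of a dynamic Bayesian network's core mechanic (the genes, regulatory rules, and noise level below are hypothetical, far simpler than the C. elegans vulval-induction network): each variable's state at time t+1 is sampled conditionally on the joint state at time t.

```python
import random

random.seed(1)

def step(state, noise=0.1):
    """One transition of a two-gene dynamic Bayesian network."""
    a, b = state
    a_next = b          # gene B activates gene A (hypothetical rule)
    b_next = 1 - a      # gene A represses gene B (hypothetical rule)
    if random.random() < noise:   # transition noise flips A's outcome
        a_next = 1 - a_next
    if random.random() < noise:   # transition noise flips B's outcome
        b_next = 1 - b_next
    return (a_next, b_next)

state = (1, 0)
trajectory = [state]
for _ in range(20):
    state = step(state)
    trajectory.append(state)
print(trajectory[:5])
```

Inference then works in the opposite direction: given observed trajectories (and, as in the paper, priors from cross-species interaction data), estimate the conditional rules that best explain them.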

  11. Quantitative systems toxicology

    PubMed Central

    Bloomingdale, Peter; Housand, Conrad; Apgar, Joshua F.; Millard, Bjorn L.; Mager, Donald E.; Burke, John M.; Shah, Dhaval K.

    2017-01-01

    The overarching goal of modern drug development is to optimize therapeutic benefits while minimizing adverse effects. However, inadequate efficacy and safety concerns remain the major causes of drug attrition in clinical development. For the past 80 years, toxicity testing has consisted of evaluating the adverse effects of drugs in animals to predict human health risks. The U.S. Environmental Protection Agency recognized the need to develop innovative toxicity testing strategies and asked the National Research Council to develop a long-range vision and strategy for toxicity testing in the 21st century. The vision aims to reduce the use of animals and drug development costs through the integration of computational modeling and in vitro experimental methods that evaluate the perturbation of toxicity-related pathways. Towards this vision, collaborative quantitative systems pharmacology and toxicology (QSP/QST) modeling endeavors have been initiated among numerous organizations worldwide. In this article, we discuss how quantitative structure-activity relationship (QSAR), network-based, and pharmacokinetic/pharmacodynamic modeling approaches can be integrated into the framework of QST models. Additionally, we review the application of QST models to predict cardiotoxicity and hepatotoxicity of drugs throughout their development. Cell- and organ-specific QST models are likely to become an essential component of modern toxicity testing, and provide a solid foundation towards determining individualized therapeutic windows to improve patient safety. PMID:29308440

  12. Hybrid model based unified scheme for endoscopic Cerenkov and radio-luminescence tomography: Simulation demonstration

    NASA Astrophysics Data System (ADS)

    Wang, Lin; Cao, Xin; Ren, Qingyun; Chen, Xueli; He, Xiaowei

    2018-05-01

    Cerenkov luminescence imaging (CLI) is an imaging method that uses an optical imaging scheme to probe a radioactive tracer. Application of CLI with clinically approved radioactive tracers has opened an opportunity for translating optical imaging from preclinical to clinical applications. Such translation was further improved by developing an endoscopic CLI system. However, two-dimensional endoscopic imaging cannot identify accurate depth or obtain quantitative information. Here, we present an imaging scheme to retrieve depth and quantitative information from endoscopic Cerenkov luminescence tomography, which can also be applied to endoscopic radio-luminescence tomography. In the scheme, we first constructed a physical model for image collection, and then a mathematical model for characterizing the propagation of luminescent light from the tracer to the endoscopic detector. The mathematical model is a hybrid light transport model that combines the third-order simplified spherical harmonics approximation, diffusion, and radiosity equations to balance accuracy and speed. It integrates finite element discretization, regularization, and primal-dual interior-point optimization to retrieve the depth and quantitative information of the tracer. A heterogeneous-geometry-based numerical simulation was used to explore the feasibility of the unified scheme, and demonstrated that it provides a satisfactory balance between imaging accuracy and computational burden.

  13. Multi-Scale Modeling to Improve Single-Molecule, Single-Cell Experiments

    NASA Astrophysics Data System (ADS)

    Munsky, Brian; Shepherd, Douglas

    2014-03-01

    Single-cell, single-molecule experiments are producing an unprecedented amount of data to capture the dynamics of biological systems. When integrated with computational models, observations of spatial, temporal and stochastic fluctuations can yield powerful quantitative insight. We concentrate on experiments that localize and count individual molecules of mRNA. These high-precision experiments carry large imaging and computational processing costs, and we explore how improved computational analyses can dramatically reduce overall data requirements. In particular, we show how analyses of spatial, temporal and stochastic fluctuations can significantly enhance parameter estimation for small, noisy data sets. We also show how full probability distribution analyses can constrain parameters with far less data than bulk analyses or statistical moment closures. Finally, we discuss how a systematic modeling progression from simple to more complex analyses can reduce total computational costs by orders of magnitude. We illustrate our approach using single-molecule, spatial mRNA measurements of Interleukin 1-alpha mRNA induction in human THP1 cells following stimulation. Our approach could improve the effectiveness of single-molecule gene regulation analyses for many other processes.
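The claim that full-distribution analyses constrain parameters better than bulk averages can be illustrated with a toy bursty-expression model (negative-binomial mRNA copy numbers; the parameter values and sample size are arbitrary assumptions): the mean alone fixes only the product r·p/(1−p), while maximizing the likelihood of the full distribution recovers r and p separately.

```python
import math
import random

random.seed(2)

def nb_pmf(n, r, p):
    """Negative-binomial pmf: P(N=n) for burst frequency r, burst parameter p."""
    return math.comb(n + r - 1, n) * (1 - p) ** r * p ** n

def nb_sample(r, p):
    """Sample as a sum of r geometric 'failure' counts with pmf (1-p)*p^n."""
    return sum(int(math.log(1 - random.random()) / math.log(p)) for _ in range(r))

r_true, p_true = 4, 0.6          # mean copy number r*p/(1-p) = 6
data = [nb_sample(r_true, p_true) for _ in range(200)]

# Maximum likelihood of the full distribution over a small (r, p) grid
p_grid = [i / 20 for i in range(1, 20)]
best = max(((r, p) for r in range(1, 9) for p in p_grid),
           key=lambda rp: sum(math.log(nb_pmf(n, *rp)) for n in data))
print("MLE (r, p):", best)
```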

  14. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    PubMed

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent this relationship mechanistically, based on first principles. Such models are increasingly considered as a clinical research tool, with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel, computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model that incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparison against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms and ECGs at the body surface with high fidelity, and offers vast computational savings greater than three orders of magnitude. Due to their efficiency, R-E models are ideally suited to forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  15. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model

    NASA Astrophysics Data System (ADS)

    Neic, Aurel; Campos, Fernando O.; Prassl, Anton J.; Niederer, Steven A.; Bishop, Martin J.; Vigmond, Edward J.; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent this relationship mechanistically, based on first principles. Such models are increasingly considered as a clinical research tool, with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel, computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model that incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparison against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms and ECGs at the body surface with high fidelity, and offers vast computational savings greater than three orders of magnitude. Due to their efficiency, R-E models are ideally suited to forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
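The eikonal half of a reaction-eikonal model solves |∇T| = 1/v for the activation time T given a conduction velocity field v. A minimal Gauss-Seidel sweeping solver on a uniform 2D grid (a generic textbook scheme, not the authors' implementation; grid size and stimulus site are arbitrary):

```python
import numpy as np

def eikonal_2d(speed, src, h=1.0, n_sweeps=8):
    """
    Sweeping solver for |grad T| = 1/speed on a 2D grid: T is the activation
    time from a stimulus at grid index `src` (first-order upwind updates).
    """
    ny, nx = speed.shape
    T = np.full((ny, nx), np.inf)
    T[src] = 0.0
    for _ in range(n_sweeps):
        for oy, ox in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
            ys = range(ny) if oy == 1 else range(ny - 1, -1, -1)
            xs = range(nx) if ox == 1 else range(nx - 1, -1, -1)
            for i in ys:
                for j in xs:
                    if (i, j) == src:
                        continue
                    a = min(T[i - 1, j] if i > 0 else np.inf,
                            T[i + 1, j] if i < ny - 1 else np.inf)
                    b = min(T[i, j - 1] if j > 0 else np.inf,
                            T[i, j + 1] if j < nx - 1 else np.inf)
                    f = h / speed[i, j]
                    if abs(a - b) >= f or np.isinf(min(a, b)):
                        t = min(a, b) + f       # one-sided upwind update
                    else:
                        t = 0.5 * (a + b + (2 * f * f - (a - b) ** 2) ** 0.5)
                    if t < T[i, j]:
                        T[i, j] = t
    return T

T = eikonal_2d(np.ones((21, 21)), (10, 10))
print(round(float(T[10, 20]), 2))  # roughly 10 grid units at unit speed
```

In an R-E model, the activation times from such a solver then trigger precomputed action-potential (reaction) waveforms, avoiding the fine spatial resolution a full reaction-diffusion bidomain solve requires.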

  16. Building a Database for a Quantitative Model

    NASA Technical Reports Server (NTRS)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does nothing to link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database, but a model does not require a large, enterprise-level database with dedicated developers and administrators; a database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and the database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
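A sketch of the linking-plus-updating pattern described above, in plain Python rather than Excel (the keys, field names, rates, and pseudo-exposure weight are all hypothetical): a metadata key ties each Basic Event to its data source, and a conjugate gamma-Poisson update folds in flight/test experience.

```python
# Hypothetical data-source table, keyed by a unique metadata field
sources = {
    "SRC-017": {"description": "vendor handbook valve failure rate",
                "rate_per_hour": 2.0e-6},
}

# Each Basic Event carries the source key plus its own manipulations
basic_events = {
    "BE-VALVE-FTO": {"source": "SRC-017", "stress_factor": 1.5},
}

def prior_rate(be):
    """Stressed prior failure rate traced back to the Basic Event's source."""
    src = sources[basic_events[be]["source"]]
    return src["rate_per_hour"] * basic_events[be]["stress_factor"]

def bayes_update(rate, failures, exposure_hours, prior_weight_hours=1.0e6):
    """Gamma-Poisson conjugate update: prior mean `rate` with pseudo-exposure weight."""
    alpha = rate * prior_weight_hours + failures
    beta = prior_weight_hours + exposure_hours
    return alpha / beta

lam0 = prior_rate("BE-VALVE-FTO")
lam1 = bayes_update(lam0, failures=1, exposure_hours=2.0e5)
print(f"prior {lam0:.2e}/h  ->  posterior mean {lam1:.2e}/h")
```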

  17. Exploring the Perceptions of College Instructors towards Computer Simulation Software Programs: A Quantitative Study

    ERIC Educational Resources Information Center

    Punch, Raymond J.

    2012-01-01

    The purpose of the quantitative regression study was to explore and to identify relationships between attitudes toward use and perceptions of value of computer-based simulation programs, of college instructors, toward computer based simulation programs. A relationship has been reported between attitudes toward use and perceptions of the value of…

  18. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  19. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  20. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  1. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  2. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  3. Highly Reproducible Label Free Quantitative Proteomic Analysis of RNA Polymerase Complexes*

    PubMed Central

    Mosley, Amber L.; Sardiu, Mihaela E.; Pattenden, Samantha G.; Workman, Jerry L.; Florens, Laurence; Washburn, Michael P.

    2011-01-01

    The use of quantitative proteomics methods to study protein complexes has the potential to provide in-depth information on the abundance of different protein components as well as their modification state under various cellular conditions. To interrogate protein complex quantitation using shotgun proteomic methods, we have focused on the analysis of protein complexes using label-free multidimensional protein identification technology and studied the reproducibility of biological replicates. For these studies, we focused on three highly related and essential multi-protein enzymes, RNA polymerase I, II, and III from Saccharomyces cerevisiae. We found that label-free quantitation using spectral counting is highly reproducible at the protein and peptide level when analyzing RNA polymerase I, II, and III. In addition, we show that peptide sampling does not follow a random sampling model, and demonstrate the need for advanced computational models to predict peptide detection probabilities. To address these issues, we used the APEX protocol to model the expected peptide detectability based on whole cell lysate acquired using the same multidimensional protein identification technology analysis used for the protein complexes. Neither method was able to predict the peptide sampling levels that we observed using replicate multidimensional protein identification technology analyses. In addition to the analysis of the RNA polymerase complexes, our analysis provides quantitative information about several RNAP-associated proteins, including the RNAPII elongation factor complexes DSIF and TFIIF. Our data show that DSIF and TFIIF are the most highly enriched RNAP accessory factors in Rpb3-TAP purifications and demonstrate our ability to measure the abundance of low-level associated proteins across biological replicates. In addition, our quantitative data support a model in which DSIF and TFIIF interact with RNAPII in a dynamic fashion, in agreement with previously published reports.
PMID:21048197
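Spectral-count quantitation is typically normalized for protein length so that abundances are comparable across proteins; one common choice is the Normalized Spectral Abundance Factor (NSAF). A minimal sketch (the counts below are hypothetical, not the paper's data; the subunit lengths are approximate yeast values used only for illustration):

```python
def nsaf(spectral_counts, lengths):
    """NSAF: (SpC / length) per protein, normalized so all proteins sum to 1."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

# Hypothetical spectral counts and sequence lengths for three RNAPII subunits
counts = {"RPB1": 120, "RPB2": 95, "RPB3": 40}
lengths = {"RPB1": 1733, "RPB2": 1224, "RPB3": 318}
shares = nsaf(counts, lengths)
print({p: round(v, 3) for p, v in shares.items()})
```

Comparing such normalized shares across biological replicates is what the reproducibility analysis above quantifies.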

  4. Multialternative drift-diffusion model predicts the relationship between visual fixations and choice in value-based decisions.

    PubMed

    Krajbich, Ian; Rangel, Antonio

    2011-08-16

    How do we make decisions when confronted with several alternatives (e.g., on a supermarket shelf)? Previous work has shown that accumulator models, such as the drift-diffusion model, can provide accurate descriptions of the psychometric data for binary value-based choices, and that the choice process is guided by visual attention. However, the computational processes used to make choices in more complicated situations involving three or more options are unknown. We propose a model of trinary value-based choice that generalizes what is known about binary choice, and test it using an eye-tracking experiment. We find that the model provides a quantitatively accurate description of the relationship between choice, reaction time, and visual fixation data using the same parameters that were estimated in previous work on binary choice. Our findings suggest that the brain uses similar computational processes to make binary and trinary choices.
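The attention-weighted accumulation idea can be sketched with a toy simulator (in the spirit of a multialternative drift-diffusion model, but with illustrative parameter values and one fixation per time step for simplicity, not the paper's fitted model):

```python
import random

random.seed(3)

def simulate_trial(values, theta=0.3, d=0.01, sigma=0.05, barrier=1.0):
    """
    Toy attention-weighted accumulator for trinary choice: the fixated option
    gets full weight, unattended options are discounted by theta, and the
    first accumulator to reach the barrier determines the choice.
    """
    E = [0.0] * len(values)
    t = 0
    while True:
        t += 1
        attended = random.randrange(len(values))  # random fixation each step
        r = [v if i == attended else theta * v for i, v in enumerate(values)]
        for i in range(len(values)):
            rival = max(r[j] for j in range(len(values)) if j != i)
            E[i] += d * (r[i] - rival) + random.gauss(0.0, sigma)
            if E[i] >= barrier:
                return i, t  # (chosen option, reaction time in steps)

choices = [simulate_trial([4.0, 2.0, 1.0])[0] for _ in range(300)]
print("P(choose best item) ≈", round(choices.count(0) / len(choices), 2))
```

Even this crude version reproduces the qualitative signatures the paper tests quantitatively: higher-valued and more-fixated options are chosen more often.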

  5. Beyond textbook illustrations: Hand-held models of ordered DNA and protein structures as 3D supplements to enhance student learning of helical biopolymers.

    PubMed

    Jittivadhna, Karnyupha; Ruenwongsa, Pintip; Panijpan, Bhinyo

    2010-11-01

    Textbook illustrations of 3D biopolymers on printed paper, regardless of how detailed and colorful, suffer from their two-dimensionality. For beginners, on-screen display of skeletal models of biopolymers and their animation usually does not provide the at-a-glance 3D perception and detail that good hand-held models can. Here, we report a study of how our students learned more by using our ordered DNA and protein models, assembled from colored computer printouts on transparency film sheets, which carry useful structural details. Our models (reported in BAMBED 2009), with several distinguishing features, helped our students grasp various aspects of these biopolymers that they usually find difficult. Quantitative and qualitative learning data from this study are reported. Copyright © 2010 International Union of Biochemistry and Molecular Biology, Inc.

  6. QSAR modeling: where have you been? Where are you going to?

    PubMed

    Cherkasov, Artem; Muratov, Eugene N; Fourches, Denis; Varnek, Alexandre; Baskin, Igor I; Cronin, Mark; Dearden, John; Gramatica, Paola; Martin, Yvonne C; Todeschini, Roberto; Consonni, Viviana; Kuz'min, Victor E; Cramer, Richard; Benigni, Romualdo; Yang, Chihae; Rathman, James; Terfloth, Lothar; Gasteiger, Johann; Richard, Ann; Tropsha, Alexander

    2014-06-26

    Quantitative structure-activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this paper, we discuss (i) the development and evolution of QSAR; (ii) the current trends, unsolved problems, and pressing challenges; and (iii) several novel and emerging applications of QSAR modeling. Throughout this discussion, we provide guidelines for QSAR development, validation, and application, which are summarized in best practices for building rigorously validated and externally predictive QSAR models. We hope that this Perspective will help communications between computational and experimental chemists toward collaborative development and use of QSAR models. We also believe that the guidelines presented here will help journal editors and reviewers apply more stringent scientific standards to manuscripts reporting new QSAR studies, as well as encourage the use of high quality, validated QSARs for regulatory decision making.

  7. QSAR Modeling: Where have you been? Where are you going to?

    PubMed Central

    Cherkasov, Artem; Muratov, Eugene N.; Fourches, Denis; Varnek, Alexandre; Baskin, Igor I.; Cronin, Mark; Dearden, John; Gramatica, Paola; Martin, Yvonne C.; Todeschini, Roberto; Consonni, Viviana; Kuz'min, Victor E.; Cramer, Richard; Benigni, Romualdo; Yang, Chihae; Rathman, James; Terfloth, Lothar; Gasteiger, Johann; Richard, Ann; Tropsha, Alexander

    2014-01-01

    Quantitative Structure-Activity Relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this paper, we discuss: (i) the development and evolution of QSAR; (ii) the current trends, unsolved problems, and pressing challenges; and (iii) several novel and emerging applications of QSAR modeling. Throughout this discussion, we provide guidelines for QSAR development, validation, and application, which are summarized in best practices for building rigorously validated and externally predictive QSAR models. We hope that this Perspective will help communications between computational and experimental chemists towards collaborative development and use of QSAR models. We also believe that the guidelines presented here will help journal editors and reviewers apply more stringent scientific standards to manuscripts reporting new QSAR studies, as well as encourage the use of high quality, validated QSARs for regulatory decision making. PMID:24351051

  8. Quantitative Functional Imaging Using Dynamic Positron Computed Tomography and Rapid Parameter Estimation Techniques

    NASA Astrophysics Data System (ADS)

    Koeppe, Robert Allen

    Positron computed tomography (PCT) is a diagnostic imaging technique that provides both three-dimensional imaging capability and quantitative measurements of local tissue radioactivity concentrations in vivo. This allows the development of non-invasive methods that employ the principles of tracer kinetics for determining physiological properties such as mass-specific blood flow, tissue pH, and rates of substrate transport or utilization. A physiologically based, two-compartment tracer kinetic model was derived to mathematically describe the exchange of a radioindicator between blood and tissue. The model was adapted for use with dynamic sequences of data acquired with a positron tomograph. Rapid estimation techniques were implemented to produce functional images of the model parameters by analyzing each individual pixel sequence of the image data. A detailed analysis of the performance characteristics of three different parameter estimation schemes was performed. The analysis included examination of errors caused by statistical uncertainties in the measured data, errors in the timing of the data, and errors caused by violation of various assumptions of the tracer kinetic model. Two specific radioindicators were investigated. (18)F-fluoromethane, an inert freely diffusible gas, was used for local quantitative determinations of both cerebral blood flow and tissue:blood partition coefficient. A method was developed that did not require direct sampling of arterial blood for the absolute scaling of flow values. The arterial input concentration time course was obtained by assuming that the alveolar or end-tidal expired breath radioactivity concentration is proportional to the arterial blood concentration. The scale of the input function was obtained from a series of venous blood concentration measurements. The method of absolute scaling using venous samples was validated in four studies, performed on normal volunteers, in which directly measured arterial concentrations were compared to those predicted from the expired air and venous blood samples. The glucose analog (18)F-3-deoxy-3-fluoro-D-glucose (3-FDG) was used for quantitating the membrane transport rate of glucose. The measured data indicated that the phosphorylation rate of 3-FDG was low enough to allow accurate estimation of the transport rate using a two-compartment model.
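The two-compartment (one-tissue) flow model above can be sketched with a simple forward simulation and a pixel-wise grid-search fit (the input function, flow value, and partition coefficient below are invented for illustration):

```python
import math

def tissue_curve(F, lam, ca, dt):
    """
    Forward Euler solution of the one-tissue model
    dCt/dt = F*Ca(t) - (F/lam)*Ct(t),
    with flow F and tissue:blood partition coefficient lam.
    """
    ct, out = 0.0, []
    for a in ca:
        ct += dt * (F * a - (F / lam) * ct)
        out.append(ct)
    return out

dt = 1.0 / 60.0                                       # 1-s steps, minutes as unit
ca = [math.exp(-0.5 * k * dt) for k in range(300)]    # arterial input (arbitrary units)
data = tissue_curve(0.55, 0.9, ca, dt)                # "measured" tissue curve

# Pixel-wise estimation: grid search for the flow that best reproduces the data
best_F = min((f / 100 for f in range(10, 120)),
             key=lambda f: sum((m - s) ** 2
                               for m, s in zip(data, tissue_curve(f, 0.9, ca, dt))))
print("estimated flow:", best_F)
```

Repeating such a fit for every pixel's time-activity curve is what produces the functional (parametric) images the abstract describes.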

  9. Quantitative relation between server motion and receiver anticipation in tennis: implications of responses to computer-simulated motions.

    PubMed

    Ida, Hirofumi; Fukuhara, Kazunobu; Sawada, Misako; Ishii, Motonobu

    2011-01-01

    The purpose of this study was to determine the quantitative relationships between the server's motion and the receiver's anticipation using computer graphic animations of tennis serves. The test motions were generated by capturing the motion of a model player and computing perturbations produced by modulating the rotation of the player's elbow and forearm joints. Eight experienced and eight novice players rated their anticipation of the speed, direction, and spin of the ball on a visual analogue scale. The experienced players significantly altered some of their anticipatory judgments depending on the percentage of both the forearm and elbow modulations, while the novice players showed no significant changes. Multiple regression analyses, with the racket's kinematic parameters immediately before racket-ball impact as independent variables, showed that the experienced players had a higher coefficient of determination than the novice players in their anticipatory judgment of ball direction. The results have implications for understanding the functional relation between a player's motion and the opponent's anticipatory judgment during real play.

  10. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems available today can implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested on worked solutions to linear algebraic equations in one variable. A total of 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed from the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
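The token-multiset comparison at the heart of such stepwise checking can be sketched as follows (the tokenizer and similarity score are illustrative choices, not the SCCS technique's exact definitions):

```python
from collections import Counter

def tokenize(eq):
    """Split an equation string into coarse tokens (symbol runs and operators)."""
    tokens, run = [], ""
    for ch in eq.replace(" ", ""):
        if ch.isalnum() or ch == ".":
            run += ch
        else:
            if run:
                tokens.append(run)
                run = ""
            tokens.append(ch)
    if run:
        tokens.append(run)
    return tokens

def multiset_similarity(a, b):
    """Bag-of-tokens overlap score in [0, 1]."""
    ca, cb = Counter(tokenize(a)), Counter(tokenize(b))
    overlap = sum((ca & cb).values())
    return 2 * overlap / (sum(ca.values()) + sum(cb.values()))

print(multiset_similarity("2x+4=10", "2x=10-4"))   # structurally close steps
print(multiset_similarity("2x+4=10", "x=3"))       # structurally distant steps
```

Scoring each step's similarity to an expected working line yields the quantitative (rather than merely pass/fail) feedback described above.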

  11. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Differential expression of each protein is determined from a standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with a computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
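A loose sketch of the statistical pipeline: standardize each protein's log fold change by its (posterior) standard deviation, convert to two-sided p-values, and control the false discovery rate with Benjamini-Hochberg. QPROT's actual Empirical Bayes machinery is more involved; the fold changes and standard deviations below are hypothetical.

```python
import math

def bh_fdr(pvals):
    """Benjamini-Hochberg adjusted p-values (monotone step-up procedure)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    prev = 1.0
    for k, i in enumerate(reversed(order)):
        rank = m - k
        prev = min(prev, pvals[i] * m / rank)
        adj[i] = prev
    return adj

def z_to_p(z):
    """Two-sided normal p-value for a standardized statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical per-protein log2 fold changes and posterior standard deviations
lfc = [2.1, -0.1, 0.2, 1.8, -2.5]
sd = [0.4, 0.5, 0.4, 0.5, 0.6]
z = [l / s for l, s in zip(lfc, sd)]
p = [z_to_p(v) for v in z]
q = bh_fdr(p)
print([round(v, 4) for v in q])
```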

  12. Computational Modeling and Simulation of Developmental ...

    EPA Pesticide Factsheets

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA's computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro ToxCast HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produces quantitative predictions.

  13. XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.

    PubMed

    Ching, Daniel J; Gürsoy, Doğa

    2017-03-01

    The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  14. XDesign: An open-source software package for designing X-ray imaging phantoms and experiments

    DOE PAGES

    Ching, Daniel J.; Gürsoy, Doğa

    2017-02-21

    Here, the development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  15. Contextual Fraction as a Measure of Contextuality.

    PubMed

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-04

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.

  16. Contextual Fraction as a Measure of Contextuality

    NASA Astrophysics Data System (ADS)

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-01

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.

  17. Quantitative correlations between collision induced dissociation mass spectrometry coupled with electrospray ionization or atmospheric pressure chemical ionization mass spectrometry - Experiment and theory

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2018-04-01

    This paper treats quantitative correlation models relating the experimental kinetic and thermodynamic parameters of electrospray ionization (ESI) mass spectrometry (MS) or atmospheric pressure chemical ionization (APCI) mass spectrometry coupled with collision induced dissociation mass spectrometry, accounting for the fact that the physical phenomena and mechanisms of ESI- and APCI-ion formation are completely different. Forty-two fragmentation reactions of three analytes are described under independent ESI and APCI measurements. The new quantitative models allow the reaction kinetics and thermodynamics to be studied correlatively by mass spectrometry, whose complementary application with quantum chemical methods provides 3D structural information on the analytes. Both static and dynamic quantum chemical computations are carried out. The objects of analysis are [2,3-dimethyl-4-(4-methyl-benzoyl)-2,3-di-p-tolyl-cyclobutyl]-p-tolyl-methanone (1) and the polycyclic aromatic hydrocarbon derivatives of dibenzoperylene (2) and tetrabenzo[a,c,fg,op]naphthacene (3), respectively. Since (1) is known to be a product of [2π+2π] cycloaddition reactions of chalcone (1,3-di-p-tolyl-propenone), which produce cyclic derivatives with different stereoselectivity, the study provides crucial data on the capability of mass spectrometry to determine the stereoselectivity of the analytes. This work also provides the first quantitative treatment of the relations '3D molecular/electronic structure'-'quantum chemical diffusion coefficient'-'mass spectrometric diffusion coefficient', thus extending the capability of mass spectrometry to determine the exact 3D structure of analytes using independent measurements and computations of the diffusion coefficients.
    The experimental diffusion parameters are determined with the 'current monitoring method', which evaluates the translational diffusion of charged analytes, while the theoretical modelling of MS ions and the computation of theoretical diffusion coefficients are based on the Arrhenius-type behavior of the charged species under ESI and APCI conditions. Although the study offers sound considerations on the quantitative relations between reaction kinetics/thermodynamics and the 3D structure of the analytes, together with the correlations between 3D molecular/electronic structure, quantum chemical diffusion coefficient and mass spectrometric diffusion coefficient, which contribute significantly to structural analytical chemistry, the results are also of importance to other areas such as organic synthesis and catalysis.

  18. Chemical graphs, molecular matrices and topological indices in chemoinformatics and quantitative structure-activity relationships.

    PubMed

    Ivanciuc, Ovidiu

    2013-06-01

    Chemical and molecular graphs have fundamental applications in chemoinformatics, quantitative structure-property relationships (QSPR), quantitative structure-activity relationships (QSAR), virtual screening of chemical libraries, and computational drug design. Chemoinformatics applications of graphs include chemical structure representation and coding, database search and retrieval, and physicochemical property prediction. QSPR, QSAR and virtual screening are based on the structure-property principle, which states that the physicochemical and biological properties of chemical compounds can be predicted from their chemical structure. Such structure-property correlations are usually developed from topological indices and fingerprints computed from the molecular graph and from molecular descriptors computed from the three-dimensional chemical structure. We present here a selection of the most important graph descriptors and topological indices, including molecular matrices, graph spectra, spectral moments, graph polynomials, and vertex topological indices. These graph descriptors are used to define several topological indices based on molecular connectivity, graph distance, reciprocal distance, distance-degree, distance-valency, spectra, polynomials, and information theory concepts. The molecular descriptors and topological indices can be developed with a more general approach, based on molecular graph operators, which define a family of graph indices related by a common formula. Graph descriptors and topological indices for molecules containing heteroatoms and multiple bonds are computed with weighting schemes based on atomic properties, such as the atomic number, covalent radius, or electronegativity. The correlation in QSPR and QSAR models can be improved by optimizing some parameters in the formula of topological indices, as demonstrated for structural descriptors based on atomic connectivity and graph distance.

  19. Genome Scale Modeling in Systems Biology: Algorithms and Resources

    PubMed Central

    Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali

    2014-01-01

    In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks in five sections. We also illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031

  20. Quantitative estimation of pesticide-likeness for agrochemical discovery.

    PubMed

    Avram, Sorin; Funar-Timofei, Simona; Borota, Ana; Chennamaneni, Sridhar Rao; Manchala, Anil Kumar; Muresan, Sorel

    2014-12-01

    The design of chemical libraries, an early step in agrochemical discovery programs, is frequently addressed by means of qualitative physicochemical and/or topological rule-based methods. The aim of this study is to develop quantitative estimates of herbicide- (QEH), insecticide- (QEI), fungicide- (QEF), and, finally, pesticide-likeness (QEP). In the assessment of these definitions, we relied on the concept of desirability functions. We found a simple function, shared by the three classes of pesticides, parameterized individually for six easy-to-compute, independent and interpretable molecular properties: molecular weight, logP, number of hydrogen bond acceptors, number of hydrogen bond donors, number of rotatable bonds and number of aromatic rings. Subsequently, we describe the scoring of each pesticide class by the corresponding quantitative estimate. In a comparative study, we assessed the performance of the scoring functions using extensive datasets of patented pesticides. The quantitative assessment established here can rank compounds whether or not they fail well-established pesticide-likeness rules, and offers an efficient way to prioritize (class-specific) pesticides. These findings are valuable for the efficient estimation of the pesticide-likeness of vast chemical libraries in the field of agrochemical discovery. Graphical Abstract: Quantitative models for pesticide-likeness were derived using the concept of desirability functions parameterized for six easy-to-compute, independent and interpretable molecular properties: molecular weight, logP, number of hydrogen bond acceptors, number of hydrogen bond donors, number of rotatable bonds and number of aromatic rings.
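    A desirability-based score of this kind combines per-property desirability functions into a geometric mean. The sketch below uses illustrative Gaussian desirabilities with made-up centers and widths; the published QEH/QEI/QEF/QEP functions are fitted to patent data and differ from these numbers.

```python
import math

def desirability(x: float, center: float, width: float) -> float:
    """Illustrative Gaussian desirability: 1 at the center, decaying away."""
    return math.exp(-((x - center) / width) ** 2)

# (property key, center, width) -- hypothetical values for illustration only
PROPS = [
    ("mw", 300.0, 150.0), ("logp", 3.0, 2.0), ("hba", 3.0, 3.0),
    ("hbd", 1.0, 2.0), ("rotb", 4.0, 4.0), ("arom", 1.0, 2.0),
]

def qep(props: dict) -> float:
    """Quantitative estimate of pesticide-likeness: geometric mean of the
    six property desirabilities, in (0, 1]."""
    ds = [desirability(props[k], c, w) for k, c, w in PROPS]
    return math.exp(sum(math.log(d) for d in ds) / len(ds))
```

    The geometric mean penalizes a compound strongly if any single property is far from its desirable range, which is the usual motivation for this aggregation choice.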

  1. A Roadmap for the Development of Applied Computational Psychiatry.

    PubMed

    Paulus, Martin P; Huys, Quentin J M; Maia, Tiago V

    2016-09-01

    Computational psychiatry is a burgeoning field that utilizes mathematical approaches to investigate psychiatric disorders, derive quantitative predictions, and integrate data across multiple levels of description. Computational psychiatry has already led to many new insights into the neurobehavioral mechanisms that underlie several psychiatric disorders, but its usefulness from a clinical standpoint is only now starting to be considered. Examples of computational psychiatry are highlighted, and a phase-based pipeline for the development of clinical computational-psychiatry applications is proposed, similar to the phase-based pipeline used in drug development. It is proposed that each phase has unique endpoints and deliverables, which will be important milestones to move tasks, procedures, computational models, and algorithms from the laboratory to clinical practice. Application of computational approaches should be tested on healthy volunteers in Phase I, transitioned to target populations in Phase IB and Phase IIA, and thoroughly evaluated using randomized clinical trials in Phase IIB and Phase III. Successful completion of these phases should be the basis of determining whether computational models are useful tools for prognosis, diagnosis, or treatment of psychiatric patients. A new type of infrastructure will be necessary to implement the proposed pipeline. This infrastructure should consist of groups of investigators with diverse backgrounds collaborating to make computational psychiatry relevant for the clinic.

  2. Novel Uses of In Vitro Data to Develop Quantitative Biological Activity Relationship Models for in Vivo Carcinogenicity Prediction.

    PubMed

    Pradeep, Prachi; Povinelli, Richard J; Merrill, Stephen J; Bozdag, Serdar; Sem, Daniel S

    2015-04-01

    The availability of large in vitro datasets enables better insight into the mode of action of chemicals and better identification of potential mechanism(s) of toxicity. Several studies have shown that not all in vitro assays can contribute as equal predictors of in vivo carcinogenicity for development of hybrid Quantitative Structure Activity Relationship (QSAR) models. We propose two novel approaches for the use of mechanistically relevant in vitro assay data in the identification of relevant biological descriptors and development of Quantitative Biological Activity Relationship (QBAR) models for carcinogenicity prediction. We demonstrate that in vitro assay data can be used to develop QBAR models for in vivo carcinogenicity prediction via two case studies corroborated with firm scientific rationale. The case studies demonstrate the similarities between QBAR and QSAR modeling in: (i) the selection of relevant descriptors to be used in the machine learning algorithm, and (ii) the development of a computational model that maps chemical or biological descriptors to a toxic endpoint. The results of both case studies show: (i) improved accuracy and sensitivity, which are especially desirable under regulatory requirements, and (ii) overall adherence with the OECD/REACH guidelines. Such mechanism-based models can be used along with QSAR models for prediction of mechanistically complex toxic endpoints. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Using ToxCast in vitro Assays in the Hierarchical Quantitative Structure-Activity Relationship (QSAR) Modeling for Predicting in vivo Toxicity of Chemicals

    EPA Science Inventory

    The goal of chemical toxicology research is utilizing short term bioassays and/or robust computational methods to predict in vivo toxicity endpoints for chemicals. The ToxCast program established at the US Environmental Protection Agency (EPA) is addressing this goal by using ca....

  4. A Quantitative Examination of User Experience as an Antecedent to Student Perception in Technology Acceptance Modeling

    ERIC Educational Resources Information Center

    Butler, Rory

    2013-01-01

    Internet-enabled mobile devices have increased the accessibility of learning content for students. Given the ubiquitous nature of mobile computing technology, a thorough understanding of the acceptance factors that impact a learner's intention to use mobile technology as an augment to their studies is warranted. Student acceptance of mobile…

  5. Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.

    PubMed

    Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K

    2011-01-01

    We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Uncertainty estimation of diffusion parameters from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residuals rescaling and cannot be utilized directly for body diffusion parameters uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the Unscented transform to compute the residuals rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the body diffusion parameters uncertainty. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.
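    The wild-bootstrap idea can be illustrated on a deliberately simplified model: a mono-exponential signal decay fitted log-linearly, with Rademacher sign-flipping of the fitted residuals to generate surrogate datasets. This is a sketch of the resampling step only; the paper's unscented residual rescaling and its full (non-mono-exponential) body diffusion model are omitted.

```python
import math
import random

def fit_adc(bvals, signals):
    """Log-linear least-squares fit of S = S0 * exp(-b * ADC); returns (S0, ADC)."""
    ys = [math.log(s) for s in signals]
    n = len(bvals)
    bm, ym = sum(bvals) / n, sum(ys) / n
    slope = sum((b - bm) * (y - ym) for b, y in zip(bvals, ys)) / \
            sum((b - bm) ** 2 for b in bvals)
    return math.exp(ym - slope * bm), -slope

def wild_bootstrap_adc(bvals, signals, n_boot=200, seed=0):
    """Rademacher wild bootstrap of the ADC estimate: flip residual signs
    to build surrogate datasets, refit, and report (ADC, bootstrap SD)."""
    rng = random.Random(seed)
    s0, adc = fit_adc(bvals, signals)
    fitted = [s0 * math.exp(-b * adc) for b in bvals]
    resid = [s - f for s, f in zip(signals, fitted)]
    adcs = []
    for _ in range(n_boot):
        surrogate = [f + rng.choice((-1, 1)) * r for f, r in zip(fitted, resid)]
        adcs.append(fit_adc(bvals, surrogate)[1])
    mean = sum(adcs) / n_boot
    sd = math.sqrt(sum((a - mean) ** 2 for a in adcs) / (n_boot - 1))
    return adc, sd
```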

  6. Agent-based modeling: case study in cleavage furrow models

    PubMed Central

    Mogilner, Alex; Manhart, Angelika

    2016-01-01

    The number of studies in cell biology in which quantitative models accompany experiments has been growing steadily. Roughly, mathematical and computational techniques of these models can be classified as “differential equation based” (DE) or “agent based” (AB). Recently AB models have started to outnumber DE models, but understanding of AB philosophy and methodology is much less widespread than familiarity with DE techniques. Here we use the history of modeling a fundamental biological problem—positioning of the cleavage furrow in dividing cells—to explain how and why DE and AB models are used. We discuss differences, advantages, and shortcomings of these two approaches. PMID:27811328

  7. Petri net modelling of biological networks.

    PubMed

    Chaouiya, Claudine

    2007-07-01

    Mathematical modelling is increasingly used to gain insights into the functioning of complex biological networks. In this context, Petri nets (PNs) have recently emerged as a promising tool among the various methods employed for the modelling and analysis of molecular networks. PNs come with a series of extensions, which allow different abstraction levels, from purely qualitative to more complex quantitative models. Notably, each of these models preserves the underlying graph, which depicts the interactions between the biological components. This article presents the basics of the approach and aims to foster the potential role PNs could play in the development of computational systems biology.
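    A purely qualitative PN is just places holding tokens and transitions that consume and produce them. The minimal `PetriNet` class below is a sketch (names and interface are mine); quantitative PN extensions would attach rates or delays to the transitions.

```python
class PetriNet:
    def __init__(self, transitions, marking):
        """transitions: name -> (consume dict, produce dict);
        marking: place -> token count."""
        self.transitions = transitions
        self.marking = dict(marking)

    def enabled(self, name):
        """A transition is enabled iff every input place holds enough tokens."""
        consume, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in consume.items())

    def fire(self, name):
        """Fire a transition: remove input tokens, add output tokens."""
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        consume, produce = self.transitions[name]
        for p, n in consume.items():
            self.marking[p] -= n
        for p, n in produce.items():
            self.marking[p] = self.marking.get(p, 0) + n

# An enzymatic reaction as a PN: E + S -> ES -> E + P
net = PetriNet(
    transitions={
        "bind":     ({"E": 1, "S": 1}, {"ES": 1}),
        "catalyse": ({"ES": 1},        {"E": 1, "P": 1}),
    },
    marking={"E": 1, "S": 2},
)
net.fire("bind")
net.fire("catalyse")
```

    After the two firings, one substrate token has been converted to product while the enzyme token is returned, mirroring how the PN graph preserves the interaction structure of the network.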

  8. Operation of the computer model for direct atomic oxygen exposure of Earth satellites

    NASA Technical Reports Server (NTRS)

    Bourassa, R. J.; Gruenbaum, P. E.; Gillis, J. R.; Hargraves, C. R.

    1995-01-01

    One of the primary causes of material degradation in low Earth orbit (LEO) is exposure to atomic oxygen. When atomic oxygen atoms collide with an orbiting spacecraft, the relative velocity is 7 to 8 km/sec and the collision energy is 4 to 5 eV per atom. Under these conditions, atomic oxygen may initiate a number of chemical and physical reactions with exposed materials. These reactions contribute to material degradation, surface erosion, and contamination. Interpretation of these effects on materials and the design of space hardware to withstand on-orbit conditions require quantitative knowledge of the atomic oxygen exposure environment. Atomic oxygen flux is a function of orbit altitude, the orientation of the orbit plane to the Sun, solar and geomagnetic activity, and the angle between exposed surfaces and the spacecraft heading. We have developed a computer model to predict the atomic oxygen exposure of spacecraft in low Earth orbit. The application of this computer model is discussed.
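    The quoted collision energy follows directly from the kinetic energy of an oxygen atom at orbital relative velocity, E = 1/2 m v². A quick check using standard physical constants:

```python
def collision_energy_ev(velocity_m_s: float, mass_amu: float = 15.999) -> float:
    """Kinetic energy (in eV) of a particle of the given mass striking a
    surface at the given relative velocity: E = 1/2 m v^2."""
    AMU_KG = 1.66054e-27    # kilograms per atomic mass unit
    J_PER_EV = 1.60218e-19  # joules per electronvolt
    return 0.5 * mass_amu * AMU_KG * velocity_m_s ** 2 / J_PER_EV
```

    At 7-8 km/s an oxygen atom carries roughly 4-5 eV, consistent with the figure quoted in the abstract.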

  9. A Computational Framework for Bioimaging Simulation.

    PubMed

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between cell models and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  10. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    PubMed

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  11. Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.

    PubMed

    Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse

    2017-01-01

    Quantitative texture analysis has been proposed to extract robust features from ultrasound images and detect subtle changes in image texture. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the gestational age estimated by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images which correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.

  12. Laboratory volcano geodesy

    NASA Astrophysics Data System (ADS)

    Færøvik Johannessen, Rikke; Galland, Olivier; Mair, Karen

    2014-05-01

    Magma transport in volcanic plumbing systems induces surface deformation, which can be monitored by geodetic techniques, such as GPS and InSAR. These geodetic signals are commonly analyzed through geodetic models in order to constrain the shape of, and the pressure in, magma plumbing systems. These models, however, suffer critical limitations: (1) the modelled magma conduit shapes cannot be compared with the real conduits, so the geodetic models cannot be tested or validated; (2) the modelled conduit shapes are too simplistic; (3) most geodetic models only account for elasticity of the host rock, whereas substantial plastic deformation is known to occur. To overcome these limitations, one needs a physical system in which (1) both the surface deformation and the shape of, and pressure in, the underlying conduit are known, and (2) the mechanical properties of the host material are controlled and well known. In this contribution, we present novel quantitative laboratory results of shallow magma emplacement. Fine-grained silica flour represents the brittle crust, and low-viscosity vegetable oil is an analogue for the magma. The melting temperature of the oil is 31°C; the oil solidifies in the models after the end of the experiments. At the time of injection the oil temperature is 50°C. The oil is pumped from a reservoir using a volumetric pump into the silica flour through a circular inlet at the bottom of a 40x40 cm square box. The silica flour is cohesive, such that the oil intrudes it by fracturing it, producing typical sheet intrusions (dykes, cone sheets, etc.). During oil intrusion, the model surface deforms, mostly by doming. These movements are measured by an advanced photogrammetry method, which uses 4 synchronized fixed cameras that periodically image the surface of the model from different angles. We apply a particle-tracking method to compute the 3D ground deformation pattern through time.
After solidification of the oil, the intrusion can be excavated and photographed from several angles to compute its 3D shape with the same photogrammetry method. Then, the surface deformation pattern can be directly compared with the shape of underlying intrusion. This quantitative dataset is essential to quantitatively test and validate classical volcano geodetic models.

  13. qPIPSA: Relating enzymatic kinetic parameters and interaction fields

    PubMed Central

    Gabdoulline, Razif R; Stein, Matthias; Wade, Rebecca C

    2007-01-01

    Background: The simulation of metabolic networks in quantitative systems biology requires the assignment of enzymatic kinetic parameters. Experimentally determined values are often not available and therefore computational methods to estimate these parameters are needed. It is possible to use the three-dimensional structure of an enzyme to perform simulations of a reaction and derive kinetic parameters. However, this is computationally demanding and requires detailed knowledge of the enzyme mechanism. We have therefore sought to develop a general, simple and computationally efficient procedure to relate protein structural information to enzymatic kinetic parameters that allows consistency between the kinetic and structural information to be checked and estimation of kinetic constants for structurally and mechanistically similar enzymes. Results: We describe qPIPSA: quantitative Protein Interaction Property Similarity Analysis. In this analysis, molecular interaction fields, for example, electrostatic potentials, are computed from the enzyme structures. Differences in molecular interaction fields between enzymes are then related to the ratios of their kinetic parameters. This procedure can be used to estimate unknown kinetic parameters when enzyme structural information is available and kinetic parameters have been measured for related enzymes or were obtained under different conditions. The detailed interaction of the enzyme with substrate or cofactors is not modeled and is assumed to be similar for all the proteins compared. The protein structure modeling protocol employed ensures that differences between models reflect genuine differences between the protein sequences, rather than random fluctuations in protein structure.
Conclusion: Provided that the experimental conditions and the protein structural models refer to the same protein state or conformation, correlations between interaction fields and kinetic parameters can be established for sets of related enzymes. Outliers may arise due to variation in the importance of different contributions to the kinetic parameters, such as protein stability and conformational changes. The qPIPSA approach can assist in the validation as well as estimation of kinetic parameters, and provide insights into enzyme mechanism. PMID:17919319
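The core comparison in qPIPSA, relating the similarity of molecular interaction fields to ratios of kinetic parameters, can be illustrated with a Hodgkin-type similarity index computed over matched grid points of two electrostatic potentials. This is a hedged sketch: the function name and the toy data are illustrative, not part of the qPIPSA software.

```python
import numpy as np

def hodgkin_index(phi_a, phi_b):
    """Hodgkin similarity index between two interaction-field grids.

    Returns 1.0 for identical fields and -1.0 for sign-inverted fields.
    """
    a = np.asarray(phi_a, dtype=float).ravel()
    b = np.asarray(phi_b, dtype=float).ravel()
    return 2.0 * np.dot(a, b) / (np.dot(a, a) + np.dot(b, b))

# Two toy potential "grids" sampled at the same set of points
rng = np.random.default_rng(0)
phi1 = rng.normal(size=100)
phi2 = phi1 + 0.1 * rng.normal(size=100)   # a closely related enzyme

print(hodgkin_index(phi1, phi1))           # identical fields -> 1.0
print(hodgkin_index(phi1, phi2))           # perturbed field -> close to 1
```

Across a family of related enzymes, differences in such similarity indices can then be regressed against ratios of measured kinetic parameters to estimate unknown ones, which is the spirit of the qPIPSA analysis.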

  14. Quantitative assessment of cervical vertebral maturation using cone beam computed tomography in Korean girls.

    PubMed

    Byun, Bo-Ram; Kim, Yong-Il; Yamaguchi, Tetsutaro; Maki, Koutaro; Son, Woo-Sung

    2015-01-01

    This study aimed to examine the correlation between skeletal maturation status and parameters from the odontoid process/body of the second cervical vertebra and the bodies of the third and fourth cervical vertebrae, and to build multiple regression models for estimating skeletal maturation status in Korean girls. Hand-wrist radiographs and cone beam computed tomography (CBCT) images were obtained from 74 Korean girls (6-18 years of age). CBCT-generated cervical vertebral maturation (CVM) was used to demarcate the odontoid process and the body of the second cervical vertebra, based on the dentocentral synchondrosis. Correlation coefficient analysis and multiple linear regression analysis were applied to each parameter of the cervical vertebrae (P < 0.05). Forty-seven of 64 parameters from CBCT-generated CVM (independent variables) exhibited statistically significant correlations (P < 0.05). The multiple regression model with the greatest R² had six parameters (PH2/W2, UW2/W2, (OH+AH2)/LW2, UW3/LW3, D3, and H4/W4) as independent variables, each with a variance inflation factor (VIF) of <2. CBCT-generated CVM was thus able to include parameters from both the second cervical vertebral body and the odontoid process in the multiple regression models. This suggests that quantitative analysis might be used to estimate skeletal maturation status.
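The VIF screening reported above (retaining predictors with VIF < 2) can be reproduced with ordinary least squares alone. A minimal sketch assuming only numpy; the design matrix here is synthetic, not the cervical-vertebra data.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of design matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (with an intercept). VIF near 1 means the
    predictor is nearly independent of the others; large VIF signals
    multicollinearity.
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)
```

In a model-selection loop like the one in the study, candidate predictors whose VIF exceeds the chosen threshold would be dropped before refitting the regression.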

  15. Quantitative Analysis of Intracellular Motility Based on Optical Flow Model

    PubMed Central

    Li, Heng

    2017-01-01

    Analysis of cell mobility is a key issue for abnormality identification and classification in cell biology research. However, since cell deformation induced by various biological processes is random and cell protrusion is irregular, it is difficult to measure cell morphology and motility in microscopic images. To address this dilemma, we propose an improved variational optical flow model for quantitative analysis of intracellular motility, which not only extracts intracellular motion fields effectively but also handles the optical flow computation problem at the cell border by taking advantage of formulations based on the L1 and L2 norms, respectively. In the energy functional of our proposed optical flow model, the data term takes the form of an L2 norm, while the smoothness term adapts to regional features through an adaptive parameter, using the L1 norm near the edge of the cell and the L2 norm away from the edge. We further extract histograms of oriented optical flow (HOOF) after the optical flow field of intracellular motion is computed. Distances between different HOOFs are then calculated as intracellular motion features to grade the intracellular motion. Experimental results show that the features extracted from HOOFs provide new insights into the relationship between cell motility and particular pathological conditions. PMID:29065574
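The HOOF descriptor and a distance between descriptors can be sketched as follows. This is a simplified version (plain orientation bins, magnitude weighting, chi-squared distance); the published HOOF formulation additionally folds bins about the vertical axis for mirror invariance, and the authors' exact binning and distance may differ.

```python
import numpy as np

def hoof(u, v, n_bins=8):
    """Histogram of oriented optical flow (simplified HOOF).

    Flow vectors (u, v) are binned by orientation, weighted by their
    magnitude, and the histogram is normalised to sum to 1.
    """
    u = np.asarray(u, dtype=float).ravel()
    v = np.asarray(v, dtype=float).ravel()
    mag = np.hypot(u, v)
    ang = np.arctan2(v, u) % (2 * np.pi)          # orientation in [0, 2*pi)
    idx = np.minimum((ang / (2 * np.pi) * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(idx, weights=mag, minlength=n_bins)
    total = hist.sum()
    return hist / total if total > 0 else hist

def hoof_distance(h1, h2):
    """Chi-squared distance between two HOOF descriptors."""
    h1, h2 = np.asarray(h1, float), np.asarray(h2, float)
    denom = h1 + h2
    mask = denom > 0
    return 0.5 * np.sum((h1[mask] - h2[mask]) ** 2 / denom[mask])
```

Grading intracellular motion then reduces to computing `hoof` on successive flow fields and comparing the resulting descriptors with `hoof_distance`.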

  16. Three-dimensional quantitative structure-activity relationship analysis for human pregnane X receptor for the prediction of CYP3A4 induction in human hepatocytes: structure-based comparative molecular field analysis.

    PubMed

    Handa, Koichi; Nakagome, Izumi; Yamaotsu, Noriyuki; Gouda, Hiroaki; Hirono, Shuichi

    2015-01-01

    The pregnane X receptor [PXR (NR1I2)] induces the expression of xenobiotic metabolism genes and transporter genes. In this study, we aimed to establish a computational method for quantifying the enzyme-inducing potencies of different compounds via their ability to activate PXR, for application in drug discovery and development. To this end, we developed a three-dimensional quantitative structure-activity relationship (3D-QSAR) model using comparative molecular field analysis (CoMFA) for predicting enzyme-inducing potencies, based on computational ligand docking to multiple PXR protein structures sampled from the trajectory of a molecular dynamics simulation. Molecular mechanics-generalized Born/surface area scores representing the ligand-protein binding free energies were calculated for each ligand. The enzyme-inducing potencies predicted by the CoMFA model were in good agreement with the experimental values. We therefore conclude that this 3D-QSAR model has the potential to predict the enzyme-inducing potencies of novel compounds with high precision and has valuable applications in the early stages of the drug discovery process. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  17. Computational design of a Diels-Alderase from a thermophilic esterase: the importance of dynamics

    NASA Astrophysics Data System (ADS)

    Linder, Mats; Johansson, Adam Johannes; Olsson, Tjelvar S. G.; Liebeschuetz, John; Brinck, Tore

    2012-09-01

    A novel computational Diels-Alderase design, based on a relatively rare form of carboxylesterase from Geobacillus stearothermophilus, is presented and theoretically evaluated. The structure was found by mining the PDB for a suitable oxyanion hole-containing structure, followed by a combinatorial approach to find suitable substrates and rational mutations. Four lead designs were selected and thoroughly modeled to obtain realistic estimates of substrate binding and prearrangement. Molecular dynamics simulations and DFT calculations were used to optimize and estimate binding affinity and activation energies. A large quantum chemical model was used to capture the salient interactions in the crucial transition state (TS). Our quantitative estimation of kinetic parameters was validated against four experimentally characterized Diels-Alderases with good results. The final designs in this work are predicted to have rate enhancements of ≈10³-10⁶ and high predicted proficiencies. This work emphasizes the importance of considering protein dynamics in the design approach, and provides a quantitative estimate of how the TS stabilization observed in most de novo and redesigned enzymes is decreased compared to a minimal, 'ideal' model. The presented design is highly interesting for further optimization and application since it is based on a thermophilic enzyme (Topt = 70 °C).

  18. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences

    PubMed Central

    Rudd, Michael E.

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4. PMID:25202253
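The edge-integration step itself, summing weighted directed steps in log luminance along a path from the background to the target, can be sketched in a few lines. The function is an illustrative reduction of the model, not Rudd's implementation; with unit weights the sum telescopes to the log luminance ratio of target to background.

```python
import numpy as np

def edge_integrated_lightness(luminances, weights=None):
    """Integrate directed log-luminance steps along a path of patches.

    `luminances` is the sequence of luminance values encountered along
    a path from the background to the target; the target's lightness
    signal is the weighted sum of the log-luminance steps across edges.
    """
    L = np.asarray(luminances, dtype=float)
    steps = np.diff(np.log(L))              # directed edge signals
    if weights is None:
        weights = np.ones_like(steps)
    return float(np.dot(weights, steps))
```

For a path background(10) → ring(40) → target(20) with unit weights the result is log 2, the log luminance ratio of target to background; non-uniform edge weights are where the gain changes attributed to grouping and attention would enter.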

  19. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences.

    PubMed

    Rudd, Michael E

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4.

  20. Systems cell biology

    PubMed Central

    Mast, Fred D.; Ratushny, Alexander V.

    2014-01-01

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336

  1. Dimensionality of Motion and Binding Valency Govern Receptor-Ligand Kinetics As Revealed by Agent-Based Modeling.

    PubMed

    Lehnert, Teresa; Figge, Marc Thilo

    2017-01-01

    Mathematical modeling and computer simulations have become an integral part of modern biological research. The strength of theoretical approaches is in the simplification of complex biological systems. We here consider the general problem of receptor-ligand binding in the context of antibody-antigen binding. On the one hand, we establish a quantitative mapping between macroscopic binding rates of a deterministic differential equation model and their microscopic equivalents as obtained from simulating the spatiotemporal binding kinetics by stochastic agent-based models. On the other hand, we investigate the impact of various properties of B cell-derived receptors (such as their dimensionality of motion, morphology, and binding valency) on the receptor-ligand binding kinetics. To this end, we implemented an algorithm that simulates antigen binding by B cell-derived receptors with a Y-shaped morphology that can move in different dimensionalities, i.e., either as membrane-anchored receptors or as soluble receptors. The mapping of the macroscopic and microscopic binding rates allowed us to quantitatively compare different agent-based model variants for the different types of B cell-derived receptors. Our results indicate that the dimensionality of motion governs the binding kinetics and that this predominant impact is quantitatively compensated by the bivalency of these receptors.
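The macroscopic side of the mapping described above is a deterministic mass-action model of receptor-ligand binding. A minimal sketch (variable names are mine, not the authors' code) integrating dC/dt = kon·R·L - koff·C with forward Euler and comparing against the closed-form equilibrium:

```python
import numpy as np

def bind_ode(r0, l0, kon, koff, dt=1e-3, t_end=50.0):
    """Deterministic receptor-ligand binding, dC/dt = kon*R*L - koff*C,
    with R = r0 - C and L = l0 - C, integrated by forward Euler."""
    c = 0.0
    for _ in range(int(t_end / dt)):
        c += dt * (kon * (r0 - c) * (l0 - c) - koff * c)
    return c

def equilibrium_complex(r0, l0, kon, koff):
    """Closed-form equilibrium of the same mass-action model:
    the physical root of c^2 - (r0 + l0 + Kd)c + r0*l0 = 0."""
    kd = koff / kon
    b = r0 + l0 + kd
    return (b - np.sqrt(b * b - 4.0 * r0 * l0)) / 2.0
```

In the paper's setting, the microscopic counterparts of `kon` and `koff` would instead be extracted from the stochastic agent-based simulation and mapped onto this macroscopic model.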

  2. Dimensionality of Motion and Binding Valency Govern Receptor–Ligand Kinetics As Revealed by Agent-Based Modeling

    PubMed Central

    Lehnert, Teresa; Figge, Marc Thilo

    2017-01-01

    Mathematical modeling and computer simulations have become an integral part of modern biological research. The strength of theoretical approaches is in the simplification of complex biological systems. We here consider the general problem of receptor–ligand binding in the context of antibody–antigen binding. On the one hand, we establish a quantitative mapping between macroscopic binding rates of a deterministic differential equation model and their microscopic equivalents as obtained from simulating the spatiotemporal binding kinetics by stochastic agent-based models. On the other hand, we investigate the impact of various properties of B cell-derived receptors—such as their dimensionality of motion, morphology, and binding valency—on the receptor–ligand binding kinetics. To this end, we implemented an algorithm that simulates antigen binding by B cell-derived receptors with a Y-shaped morphology that can move in different dimensionalities, i.e., either as membrane-anchored receptors or as soluble receptors. The mapping of the macroscopic and microscopic binding rates allowed us to quantitatively compare different agent-based model variants for the different types of B cell-derived receptors. Our results indicate that the dimensionality of motion governs the binding kinetics and that this predominant impact is quantitatively compensated by the bivalency of these receptors. PMID:29250071

  3. The effective application of a discrete transition model to explore cell-cycle regulation in yeast

    PubMed Central

    2013-01-01

    Background: Bench biologists often do not take part in the development of computational models for their systems, and therefore they frequently employ them as "black boxes". Our aim was to construct and test a model that does not depend on the availability of quantitative data and can be used directly without a need for an intensive computational background. Results: We present a discrete transition model. We used the cell cycle in budding yeast as a paradigm for a complex network, demonstrating phenomena such as sequential protein expression and activity, and cell-cycle oscillation. The structure of the network was validated by its response to computational perturbations such as mutations, and by its response to mating pheromone or nitrogen depletion. The model has a strong predictive capability, demonstrating how the activity of a specific transcription factor, Hcm1, is regulated, and what determines commitment of cells to enter and complete the cell cycle. Conclusion: The model presented herein is intuitive, yet is expressive enough to elucidate the intrinsic structure and qualitative behavior of large and complex regulatory networks. Moreover, our model allowed us to examine multiple hypotheses in a simple and intuitive manner, giving rise to testable predictions. This methodology can be easily integrated as a useful approach for the study of networks, enriching experimental biology with computational insights. PMID:23915717
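A discrete transition model of this kind can be illustrated with a synchronous Boolean network. The three-node repressor ring below is a hypothetical toy, not the yeast cell-cycle network of the paper; it shows how purely qualitative update rules, with no kinetic parameters, already produce sustained oscillation.

```python
def step(state, rules):
    """One synchronous update of a Boolean network.

    `state` maps node -> 0/1; `rules` maps node -> a function of the
    full current state returning the node's next value.
    """
    return {node: int(rule(state)) for node, rule in rules.items()}

# Hypothetical three-node negative-feedback ring:
# each node represses the next one around the ring.
rules = {
    "A": lambda s: 1 - s["C"],
    "B": lambda s: 1 - s["A"],
    "C": lambda s: 1 - s["B"],
}

state = {"A": 1, "B": 0, "C": 0}
trajectory = [dict(state)]
for _ in range(6):
    state = step(state, rules)
    trajectory.append(dict(state))
```

Starting from (A, B, C) = (1, 0, 0), the synchronous dynamics visit six distinct states and return to the start, i.e., a period-6 oscillation, which is the qualitative analogue of cell-cycle oscillation in such models.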

  4. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights into complex, living systems, because images capture the overall dynamics as a "whole." Therefore, extraction of key quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and different instrumentation focused on molecular image processing and analysis. Quantitative data analysis through various open source software packages and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight predictable future trends regarding methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical expressions underlying in silico systems biology models.

  5. Neutron multiplicity counting: Confidence intervals for reconstruction parameters

    DOE PAGES

    Verbeke, Jerome M.

    2016-03-09

    From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.
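The sequential Bayesian idea, updating a posterior after each dwell period so that a credible region is available in real time, can be sketched on a one-parameter toy problem: inferring a Poisson count rate on a discretised grid. This is not the fission-chain model of the paper, just the same inference pattern with invented names.

```python
import numpy as np

def sequential_posterior(counts, dwell, rates):
    """Grid posterior over a source rate from successive Poisson counts.

    Starts from a flat prior on `rates` (all > 0) and applies Bayes'
    rule after each dwell period, so the credible region can be
    reported in real time as data accumulate.
    """
    log_post = np.zeros_like(rates, dtype=float)       # flat prior
    for n in counts:
        mu = rates * dwell
        log_post += n * np.log(mu) - mu                # Poisson log-likelihood
        log_post -= log_post.max()                     # guard against overflow
    post = np.exp(log_post)
    return post / post.sum()

def credible_interval(rates, post, level=0.95):
    """Central credible interval from a discretised posterior."""
    cdf = np.cumsum(post)
    lo = rates[np.searchsorted(cdf, (1.0 - level) / 2.0)]
    hi = rates[np.searchsorted(cdf, 1.0 - (1.0 - level) / 2.0)]
    return lo, hi
```

In a portal-monitoring loop, one would call `credible_interval` after each dwell period and stop counting once the interval excludes the threat region to the preset confidence.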

  6. In vivo classification of human skin burns using machine learning and quantitative features captured by optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Singla, Neeru; Srivastava, Vishal; Singh Mehta, Dalip

    2018-02-01

    We report the first fully automated detection of human skin burn injuries in vivo, with the goal of automatic surgical margin assessment based on optical coherence tomography (OCT) images. Our proposed automated procedure entails building a machine-learning-based classifier by extracting quantitative features from normal and burn tissue images recorded by OCT. In this study, 56 samples (28 normal, 28 burned) were imaged by OCT and eight features were extracted. A linear model classifier was trained using 34 samples and 22 samples were used to test the model. Sensitivity of 91.6% and specificity of 90% were obtained. Our results demonstrate the capability of a computer-aided technique for accurately and automatically identifying burn tissue resection margins during surgical treatment.
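The classification pipeline (features → linear model → sensitivity/specificity) can be sketched with a nearest-centroid classifier, which is linear in the features. The stand-in classifier and the synthetic 8-feature data are assumptions for illustration; the paper's actual linear model and OCT-derived features are not reproduced here.

```python
import numpy as np

def fit_centroids(X, y):
    """Nearest-centroid classifier: remember each class mean."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, X):
    """Assign each row of X to the class with the nearest centroid."""
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

def sensitivity_specificity(y_true, y_pred, positive=1):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    pos, neg = y_true == positive, y_true != positive
    sens = float(np.mean(y_pred[pos] == positive)) if pos.any() else float("nan")
    spec = float(np.mean(y_pred[neg] != positive)) if neg.any() else float("nan")
    return sens, spec
```

With well-separated synthetic feature clusters (28 "normal" and 28 "burned" samples, mirroring the study's sample counts), this pipeline recovers high sensitivity and specificity; on real OCT features the separation, and hence the numbers, would of course differ.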

  7. Single-Cell Genomics: Approaches and Utility in Immunology.

    PubMed

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-02-01

    Single-cell genomics offers powerful tools for studying immune cells, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single-cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single-cell RNA-sequencing data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Image analysis and machine learning in digital pathology: Challenges and opportunities.

    PubMed

    Madabhushi, Anant; Lee, George

    2016-10-01

    With the rise in whole slide scanner technology, large numbers of tissue slides are being scanned and represented and archived digitally. While digital pathology has substantial implications for telepathology, second opinions, and education, there are also huge research opportunities in image computing with this new source of "big data". It is well known that there is fundamental prognostic data embedded in pathology images. The ability to mine "sub-visual" image features from digital pathology slide images, features that may not be visually discernible by a pathologist, offers the opportunity for better quantitative modeling of disease appearance and hence possibly improved prediction of disease aggressiveness and patient outcome. However, the compelling opportunities in precision medicine offered by big digital pathology data come with their own set of computational challenges. Image analysis and computer assisted detection and diagnosis tools previously developed in the context of radiographic images are woefully inadequate to deal with the data density in high resolution digitized whole slide images. Additionally, there has been recent substantial interest in combining and fusing radiologic imaging and proteomics- and genomics-based measurements with features extracted from digital pathology images for better prognostic prediction of disease aggressiveness and patient outcome. Again, there is a paucity of powerful tools for combining disease specific features that manifest across multiple different length scales. The purpose of this review is to discuss developments in computational image analysis tools for predictive modeling of digital pathology images from a detection, segmentation, feature extraction, and tissue classification perspective. We discuss the emergence of new handcrafted feature approaches for improved predictive modeling of tissue appearance and also review the emergence of deep learning schemes for both object detection and tissue classification. We also briefly review some of the state of the art in fusion of radiology and pathology images and in combining digital pathology derived image measurements with molecular "omics" features for better predictive modeling. The review ends with a brief discussion of some of the technical and computational challenges to be overcome and reflects on future opportunities for the quantitation of histopathology. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3 ki/ki, Isl2-EphA3 ki/+, ephrin-A2,A3,A5 triple knock-out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  10. A two-dimensional model of water: Theory and computer simulations

    NASA Astrophysics Data System (ADS)

    Urbič, T.; Vlachy, V.; Kalyuzhnyi, Yu. V.; Southall, N. T.; Dill, K. A.

    2000-02-01

    We develop an analytical theory for a simple model of liquid water. We apply Wertheim's thermodynamic perturbation theory (TPT) and integral equation theory (IET) for associative liquids to the MB model, which is among the simplest models of water. Water molecules are modeled as 2-dimensional Lennard-Jones disks with three hydrogen bonding arms arranged symmetrically, resembling the Mercedes-Benz (MB) logo. The MB model qualitatively predicts both the anomalous properties of pure water and the anomalous solvation thermodynamics of nonpolar molecules. IET is based on the orientationally averaged version of the Ornstein-Zernike equation. This is one of the main approximations in the present work. IET correctly predicts the pair correlation function of the model water at high temperatures. Both TPT and IET are in semi-quantitative agreement with the Monte Carlo values of the molar volume, isothermal compressibility, thermal expansion coefficient, and heat capacity. A major advantage of these theories is that they require orders of magnitude less computer time than the Monte Carlo simulations.
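The flavour of the MB pair potential can be sketched as a Lennard-Jones disk term plus a hydrogen-bond term that is strongest when the separation equals the hydrogen-bond distance and one arm of each disk points along the line between centres. Parameter values and the exact functional form here are illustrative, loosely following the model's published Gaussian form, not a faithful reimplementation.

```python
import numpy as np

def lj(r, eps=0.1, sigma=0.7):
    """Lennard-Jones disk-disk term of the sketched MB pair potential."""
    x = (sigma / r) ** 6
    return 4.0 * eps * (x * x - x)

def gauss(x, width=0.085):
    """Unnormalised Gaussian used for both distance and angle terms."""
    return np.exp(-x * x / (2.0 * width * width))

def mb_pair_energy(r, theta_i, theta_j, eps_hb=-1.0, r_hb=1.0):
    """Pair energy of two MB disks a distance r apart.

    Arms sit at angles theta + 2*pi*k/3, k = 0, 1, 2. The bond vector
    is taken along angle 0, so the hydrogen-bond term rewards one arm
    of disk i pointing at +x and one arm of disk j pointing at -x.
    """
    arms_i = theta_i + 2.0 * np.pi * np.arange(3) / 3.0
    arms_j = theta_j + 2.0 * np.pi * np.arange(3) / 3.0
    best = max(gauss(np.cos(a) - 1.0) * gauss(np.cos(b) + 1.0)
               for a in arms_i for b in arms_j)
    return lj(r) + eps_hb * gauss(r - r_hb) * best
```

The strongly orientation- and distance-dependent hydrogen-bond term is what gives the MB model its water-like anomalies; rotating one disk away from alignment, or stretching the bond, quickly removes most of the attraction.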

  11. Validation of vibration-dissociation coupling models in hypersonic non-equilibrium separated flows

    NASA Astrophysics Data System (ADS)

    Shoev, G.; Oblapenko, G.; Kunova, O.; Mekhonoshina, M.; Kustova, E.

    2018-03-01

    The validation of recently developed models of vibration-dissociation coupling is discussed in application to numerical solutions of the Navier-Stokes equations in a two-temperature approximation for a binary N2/N flow. Vibrational-translational relaxation rates are computed using the Landau-Teller formula generalized for strongly non-equilibrium flows, obtained in the framework of the Chapman-Enskog method. Dissociation rates are calculated using the modified Treanor-Marrone model, taking into account the dependence of the model parameter on the vibrational state. The solutions are compared to those obtained using the traditional Landau-Teller and Treanor-Marrone models, and it is shown that for high-enthalpy flows the traditional and recently developed models can give significantly different results. The computed heat flux and pressure on the surface of a double cone are in good agreement with experimental data available in the literature for low-enthalpy flow with strong thermal non-equilibrium. The computed heat flux on a double wedge qualitatively agrees with available data for high-enthalpy non-equilibrium flows. Different contributions to the heat flux, calculated using rigorous kinetic theory methods, are evaluated. Quantitative discrepancies between numerical and experimental data are discussed.
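The Landau-Teller relaxation underlying the vibrational-translational rates has a simple canonical form, de_v/dt = (e_v_eq - e_v)/tau. A sketch comparing its closed-form solution with the forward-Euler update a flow solver would apply to the source term (symbols are mine, and the generalized non-equilibrium corrections of the paper are not included):

```python
import numpy as np

def landau_teller_relaxation(e0, e_eq, tau, t):
    """Closed-form Landau-Teller relaxation of vibrational energy:
    de_v/dt = (e_eq - e_v)/tau  ->  e_v(t) = e_eq + (e0 - e_eq)*exp(-t/tau)."""
    return e_eq + (e0 - e_eq) * np.exp(-t / tau)

def relax_euler(e0, e_eq, tau, t_end, dt=1e-3):
    """The same relaxation advanced with forward Euler, as a CFD solver
    would integrate the vibrational energy source term."""
    e = e0
    for _ in range(int(round(t_end / dt))):
        e += dt * (e_eq - e) / tau
    return e
```

In the two-temperature setting, `e_eq` would be evaluated at the local translational temperature and `tau` from the (generalized) Landau-Teller relaxation time, so the numerical update must track the exponential approach to equilibrium accurately.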

  12. Recent advances in modeling and simulation of the exposure and response of tungsten to fusion energy conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marian, Jaime; Becquart, Charlotte S.; Domain, Christophe

    2017-06-09

    Under the anticipated operating conditions for demonstration magnetic fusion reactors beyond ITER, structural materials will be exposed to unprecedented conditions of irradiation, heat flux, and temperature. While such extreme environments remain inaccessible experimentally, computational modeling and simulation can provide qualitative and quantitative insights into materials response and complement the available experimental measurements with carefully validated predictions. For plasma facing components such as the first wall and the divertor, tungsten (W) has been selected as the best candidate material due to its superior high-temperature and irradiation properties. In this paper we provide a review of recent efforts in computational modeling of W, both as a plasma-facing material exposed to He deposition and as a bulk structural material subjected to fast neutron irradiation. We use a multiscale modeling approach, commonly used as the materials modeling paradigm, to define the outline of the paper and highlight recent advances using several classes of techniques and their interconnection. We highlight several of the most salient findings obtained via computational modeling and point out a number of remaining challenges and future research directions.

  13. Computer Model Used to Help Customize Medicine

    NASA Technical Reports Server (NTRS)

    Stauber, Laurel J.; Veris, Jenise

    2001-01-01

    Dr. Radhakrishnan, a researcher at the NASA Glenn Research Center, in collaboration with biomedical researchers at the Case Western Reserve University School of Medicine and Rainbow Babies and Children's Hospital, is developing computational models of human physiology that quantitate metabolism and its regulation, in both healthy and pathological states. These models can help predict the effects of stresses or interventions, such as drug therapies, and contribute to the development of customized medicine. Customized medical treatment protocols can give more comprehensive evaluations and lead to more specific and effective treatments for patients, reducing treatment time and cost. Commercial applications of this research may help the pharmaceutical industry identify therapeutic needs and predict drug-drug interactions. Researchers will be able to study human metabolic reactions to particular treatments while in different environments as well as establish more definite blood metabolite concentration ranges in normal and pathological states. These computational models may help NASA provide the background for developing strategies to monitor and safeguard the health of astronauts and civilians in space stations and colonies. They may also help to develop countermeasures that ameliorate the effects of both acute and chronic space exposure.

  14. [The implementation of computer model in research of dynamics of proliferation of cells of thyroid gland follicle].

    PubMed

    Abduvaliev, A A; Gil'dieva, M S; Khidirov, B N; Saĭdalieva, M; Khasanov, A A; Musaeva, Sh N; Saatov, T S

    2012-04-01

    The article presents the results of computational experiments on the dynamics of cell proliferation in the thyroid gland follicle, both in the normal condition and in malignant neoplasm. The model studies demonstrated that a chronic increase in the proliferation parameter of thyroid follicle cells results in abnormal behavior of the cell cenosis of the follicle: the stationary state breaks down, auto-oscillations arise and pass into irregular oscillations with unpredictable cell proliferation, and finally into the "black hole" effect. It is shown that the available biomedical experimental data and theoretical propositions concerning the structural and functional organization of the thyroid gland at the cell level make it possible to develop mathematical models for quantitative analysis of the cell cenosis of the thyroid follicle in normal conditions. The modeling technique was based on the regulatory mechanisms of living systems and on equations of cell cenosis regulation.

  15. Numerical study of base pressure characteristic curve for a four-engine clustered nozzle configuration

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    1993-01-01

    The objective of this study is to benchmark a four-engine clustered nozzle base flowfield with a computational fluid dynamics (CFD) model. The CFD model is a three-dimensional, pressure-based, viscous flow formulation. An adaptive upwind scheme is employed for the spatial discretization. The upwind scheme is based on second- and fourth-order central differencing with adaptive artificial dissipation. Qualitative base flow features such as the reverse jet, wall jet, recompression shock, and plume-plume impingement have been captured. The computed quantitative flow properties, such as the radial base pressure distribution, model centerline Mach number and static pressure variation, and base pressure characteristic curve, agreed reasonably well with the measurements. A parametric study of the effects of grid resolution, turbulence model, inlet boundary condition, and the difference scheme for the convective terms was performed. The results showed that grid resolution had a strong influence on the accuracy of the base flowfield prediction.

  16. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    PubMed

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.

  17. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel: V. Predictive absorbability models.

    PubMed

    Langenbucher, Frieder

    2007-08-01

    This paper discusses Excel applications related to the prediction of drug absorbability from physicochemical constants. PHDISSOC provides a generalized model for pH profiles of electrolytic dissociation, water solubility, and partition coefficient. SKMODEL predicts drug absorbability based on a log-log plot of water solubility and O/W partitioning, augmented by additional features such as electrolytic dissociation, melting point, and the administered dose. GIABS presents a mechanistic model of g.i. drug absorption. BIODATCO is a database compiling relevant drug data for use in quantitative predictions.

  18. Computer-aided design of liposomal drugs: In silico prediction and experimental validation of drug candidates for liposomal remote loading.

    PubMed

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-10

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.
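    The kNN classification step used in such QSPR screening can be sketched in a few lines; the descriptor vectors, labels, and queries below are invented toy data, not the paper's dataset or descriptors.

```python
# Hand-rolled k-nearest-neighbors classifier: label a query compound by
# majority vote among its k closest training compounds in descriptor space.
import math

def knn_predict(train, labels, query, k=3):
    nearest = sorted(range(len(train)),
                     key=lambda i: math.dist(train[i], query))
    votes = [labels[i] for i in nearest[:k]]
    return max(set(votes), key=votes.count)

# Toy training set of normalized descriptors (hypothetical values);
# class 1 = efficient remote loading, class 0 = inefficient.
train = [(0.10, 0.90, 0.20), (0.20, 0.80, 0.30), (0.15, 0.85, 0.25),
         (0.90, 0.10, 0.80), (0.80, 0.20, 0.90), (0.85, 0.15, 0.85)]
labels = [1, 1, 1, 0, 0, 0]

pred_high = knn_predict(train, labels, (0.18, 0.82, 0.28))  # near class 1
pred_low = knn_predict(train, labels, (0.88, 0.12, 0.90))   # near class 0
```

In a virtual screen, the same vote would be repeated for every compound in the database, and the top-ranked loaders carried forward to experimental validation.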

  19. Computer-aided design of liposomal drugs: in silico prediction and experimental validation of drug candidates for liposomal remote loading

    PubMed Central

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-01

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs’ structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., Journal of Controlled Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-nearest neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. PMID:24184343

  20. Leveraging unsupervised training sets for multi-scale compartmentalization in renal pathology

    NASA Astrophysics Data System (ADS)

    Lutnick, Brendon; Tomaszewski, John E.; Sarder, Pinaki

    2017-03-01

    Clinical pathology relies on manual compartmentalization and quantification of biological structures, which is time consuming and often error-prone. Application of computer vision segmentation algorithms to histopathological image analysis, in contrast, can offer fast, reproducible, and accurate quantitative analysis to aid pathologists. Algorithms tunable to different biologically relevant structures can allow accurate, precise, and reproducible estimates of disease states. In this direction, we have developed a fast, unsupervised computational method for simultaneously separating all biologically relevant structures from histopathological images at multiple scales. Segmentation is achieved by solving an energy optimization problem. Representing the image as a graph, nodes (pixels) are grouped by minimizing a Potts model Hamiltonian, adopted from theoretical physics, where it models interacting electron spins. Pixel relationships (modeled as edges) are used to update the energy of the partitioned graph. By iteratively improving the clustering, the optimal number of segments is revealed. To reduce computational time, the graph is simplified using a Cantor pairing function to intelligently reduce the number of included nodes. The classified nodes are then used to train a multiclass support vector machine to apply the segmentation over the full image. Accurate segmentations of images with as many as 10^6 pixels can be completed in only 5 s, allowing for attainable multi-scale visualization. To establish clinical potential, we employed our method in renal biopsies to quantitatively visualize, for the first time, scale-variant compartments of heterogeneous intra- and extraglomerular structures simultaneously. The utility of our method extends to fields such as oncology and genomics, as well as to non-biological problems.
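    The Potts-model clustering idea can be sketched on a toy image. The energy below (a data term plus a penalty beta for each disagreeing 4-neighbor, minimized by iterated conditional relabeling) is a deliberate simplification for illustration, not the authors' exact Hamiltonian, graph simplification, or SVM step.

```python
# Potts-style segmentation sketch: relabel each pixel to minimize
# (intensity - class mean)^2 + beta * (number of 4-neighbors with a
# different label), sweeping the image a few times until stable.
def potts_segment(img, means, beta=0.5, sweeps=5):
    h, w = len(img), len(img[0])
    # Initialize labels by the data term alone (nearest class mean).
    lab = [[min(range(len(means)), key=lambda c: (img[y][x] - means[c]) ** 2)
            for x in range(w)] for y in range(h)]
    for _ in range(sweeps):
        for y in range(h):
            for x in range(w):
                def energy(c):
                    e = (img[y][x] - means[c]) ** 2
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and lab[ny][nx] != c:
                            e += beta
                    return e
                lab[y][x] = min(range(len(means)), key=energy)
    return lab

# Toy 3x4 grayscale "image" with a dark left half and bright right half.
img = [[0.1, 0.1, 0.9, 0.9],
       [0.1, 0.2, 0.8, 0.9],
       [0.1, 0.1, 0.9, 1.0]]
seg = potts_segment(img, means=[0.1, 0.9])
```

The neighbor penalty smooths out isolated noisy pixels; raising `beta` trades boundary fidelity for larger, more coherent segments.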

  1. A new method of morphological comparison for bony reconstructive surgery: maxillary reconstruction using scapular tip bone

    NASA Astrophysics Data System (ADS)

    Chan, Harley; Gilbert, Ralph W.; Pagedar, Nitin A.; Daly, Michael J.; Irish, Jonathan C.; Siewerdsen, Jeffrey H.

    2010-02-01

    Esthetic appearance is one of the most important factors in reconstructive surgery. Current practice in maxillary reconstruction uses radial forearm, fibula, or iliac crest osteocutaneous flaps to recreate the complex three-dimensional structures of the palate and maxilla. However, these bone flaps lack shape similarity to the palate and yield a less satisfactory esthetic result. Considering shape similarity and vasculature advantages, reconstructive surgeons have recently explored the use of scapular tip myo-osseous free flaps to restore the excised site. We have developed a new method that quantitatively evaluates the morphological similarity of the scapular tip bone and the palate based on diagnostic volumetric computed tomography (CT) images. This quantitative result is further interpreted as a color map rendered on the surface of a three-dimensional computer model. For surgical planning, this color interpretation could potentially assist the surgeon in orienting the bone flap for the best fit at the reconstruction site. With approval from the Research Ethics Board (REB) of the University Health Network, we conducted a retrospective analysis of CT images obtained from 10 patients. Each patient had CT scans covering the maxilla and chest on the same day. Based on this image set, we simulated total, subtotal, and hemi-palate reconstruction. The simulation procedure included volume segmentation, conversion of the segmented volume to a stereolithography (STL) model, manual registration, and computation of minimum geometric distances and curvature between STL models. Across the 10 patients' data, we found an overall root-mean-square (RMS) conformance of 3.71 +/- 0.16 mm.
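    The RMS conformance metric reduces to the root-mean-square of nearest-point distances between two surfaces. A minimal sketch on toy point clouds (hypothetical coordinates, standing in for sampled STL vertices):

```python
import math

def rms_conformance(surface_a, surface_b):
    """RMS over points of A of the distance to the nearest point of B --
    a brute-force stand-in for the STL-to-STL minimum-distance map."""
    sq = [min(math.dist(p, q) for q in surface_b) ** 2 for p in surface_a]
    return math.sqrt(sum(sq) / len(sq))

# Toy point clouds (hypothetical): B is A shifted by 0.5 along z, so every
# nearest-point distance is exactly 0.5.
a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
b = [(0.0, 0.0, 0.5), (1.0, 0.0, 0.5), (2.0, 0.0, 0.5)]
rms = rms_conformance(a, b)  # 0.5
```

Real STL meshes would use a spatial index (e.g. a k-d tree) instead of the brute-force inner minimum, and the per-point distances, rather than their RMS, would drive the surface color map.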

  2. GHRS observations and theoretical modeling of early type stars in R136a

    NASA Astrophysics Data System (ADS)

    de Koter, A.; Heap, S.; Hubeny, I.; Lanz, T.; Hutchings, J.; Lamers, H. J. G. L. M.; Maran, S.; Schmutz, W.

    1994-05-01

    We present the first spectroscopic observations of individual stars in R136a, the most dense part of the starburst cluster 30 Doradus in the LMC. Spectra of two stars are scheduled to be obtained with the GHRS on board the HST: R136a5, the brightest of the complex and R136a2, a Wolf-Rayet star of type WN. The 30 Doradus cluster is the only starburst region in which individual stars can be studied. Therefore, quantitative knowledge of the basic stellar parameters will yield valuable insight into the formation of massive stars in starbursts and into their subsequent evolution. Detailed modeling of the structure of the atmosphere and wind of these stars will also lead to a better understanding of the mechanism(s) that govern their dynamics. We present the first results of our detailed quantitative spectral analysis using state-of-the-art non-LTE model atmospheres for stars with extended and expanding atmospheres. The models are computed using the Improved-Sobolev Approximation wind code (ISA-WIND) of de Koter, Schmutz & Lamers (1993, A&A 277, 561), which has been extended to include C, N and Si. Our model computations are not based on the core-halo approximation, but use a unified treatment of the photosphere and wind. This approach is essential for Wolf-Rayet stars. Our synthetic spectra, dominated by the P Cygni profiles of the UV resonance lines, also account for the numerous weak metal lines of photospheric origin.

  3. Blinded Prospective Evaluation of Computer-Based Mechanistic Schizophrenia Disease Model for Predicting Drug Response

    PubMed Central

    Geerts, Hugo; Spiros, Athan; Roberts, Patrick; Twyman, Roy; Alphs, Larry; Grace, Anthony A.

    2012-01-01

    The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published ‘Quantitative Systems Pharmacology’ computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively the clinical outcome in a blinded fashion of two experimental antipsychotic drugs; JNJ37822681, a highly selective low-affinity dopamine D2 antagonist and ocaperidone, a very high affinity dopamine D2 antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects and can be a helpful tool for drug discovery and development. PMID:23251349

  4. A case for human systems neuroscience.

    PubMed

    Gardner, J L

    2015-06-18

    Can the human brain itself serve as a model for a systems neuroscience approach to understanding the human brain? After all, how the brain is able to create the richness and complexity of human behavior is still largely mysterious. What better choice to study that complexity than to study it in humans? However, measurements of brain activity typically need to be made non-invasively which puts severe constraints on what can be learned about the internal workings of the brain. Our approach has been to use a combination of psychophysics in which we can use human behavioral flexibility to make quantitative measurements of behavior and link those through computational models to measurements of cortical activity through magnetic resonance imaging. In particular, we have tested various computational hypotheses about what neural mechanisms could account for behavioral enhancement with spatial attention (Pestilli et al., 2011). Resting both on quantitative measurements and considerations of what is known through animal models, we concluded that weighting of sensory signals by the magnitude of their response is a neural mechanism for efficient selection of sensory signals and consequent improvements in behavioral performance with attention. While animal models have many technical advantages over studying the brain in humans, we believe that human systems neuroscience should endeavor to validate, replicate and extend basic knowledge learned from animal model systems and thus form a bridge to understanding how the brain creates the complex and rich cognitive capacities of humans. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  5. Cloud computing approaches for prediction of ligand binding poses and pathways.

    PubMed

    Lawrenz, Morgan; Shukla, Diwakar; Pande, Vijay S

    2015-01-22

    We describe an innovative protocol for ab initio prediction of ligand crystallographic binding poses and highly effective analysis of large datasets generated for protein-ligand dynamics. We include a procedure for setup and performance of distributed molecular dynamics simulations on cloud computing architectures, a model for efficient analysis of simulation data, and a metric for evaluation of model convergence. We give accurate binding pose predictions for five ligands ranging in affinity from 7 nM to > 200 μM for the immunophilin protein FKBP12, for expedited results in cases where experimental structures are difficult to produce. Our approach goes beyond single, low energy ligand poses to give quantitative kinetic information that can inform protein engineering and ligand design.

  6. Remote control missile model test

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.; Shaw, David S.; Sawyer, Wallace C.

    1989-01-01

    An extremely large, systematic, axisymmetric body/tail fin data base was gathered through tests of an innovative missile model design which is described herein. These data were originally obtained for incorporation into a missile aerodynamics code based on engineering methods (Program MISSILE3), but can also be used as diagnostic test cases for developing computational methods because of the individual-fin data included in the data base. Detailed analysis of four sample cases from these data are presented to illustrate interesting individual-fin force and moment trends. These samples quantitatively show how bow shock, fin orientation, fin deflection, and body vortices can produce strong, unusual, and computationally challenging effects on individual fin loads. Comparisons between these data and calculations from the SWINT Euler code are also presented.

  7. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of a numerical simulation, by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.

  8. Computational aeroelasticity using a pressure-based solver

    NASA Astrophysics Data System (ADS)

    Kamakoti, Ramji

    A computational methodology for performing fluid-structure interaction computations for three-dimensional elastic wing geometries is presented. The flow solver used is based on an unsteady Reynolds-Averaged Navier-Stokes (RANS) model. A well-validated k-ε turbulence model with wall-function treatment for the near-wall region was used to perform turbulent flow calculations. The relative merits of alternative flow solvers were investigated. The predictor-corrector-based Pressure-Implicit with Splitting of Operators (PISO) algorithm was found to be computationally economical for unsteady flow computations. The wing structure was modeled using Bernoulli-Euler beam theory. A fully implicit time-marching scheme (using the Newmark integration method) was used to integrate the equations of motion for the structure. Bilinear interpolation and linear extrapolation techniques were used to transfer the necessary information between the fluid and structure solvers. Geometry deformation was accounted for by using a moving-boundary module. The moving-grid capability was based on a master/slave concept and transfinite interpolation techniques. Since computations were performed on a moving mesh system, the geometric conservation law must be preserved; this is achieved by appropriately evaluating the Jacobian values associated with each cell. Accurate computation of contravariant velocities for unsteady flows using the momentum interpolation method on collocated, curvilinear grids was also addressed. Flutter computations were performed for the AGARD 445.6 wing at subsonic, transonic and supersonic Mach numbers. Unsteady computations were performed at various dynamic pressures to predict the flutter boundary. Results showed favorable agreement with experimental data and previous numerical results. The computational methodology exhibited the capability to predict both qualitative and quantitative features of aeroelasticity.
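    The Newmark time-integration scheme mentioned for the structural solver can be sketched for a single-degree-of-freedom undamped oscillator (average-acceleration variant, beta = 1/4, gamma = 1/2); the parameters below are illustrative, not the AGARD 445.6 structural model.

```python
# Newmark-beta integration of m*x'' + k*x = 0. At each step the predicted
# displacement is corrected implicitly using the new acceleration, which
# for this linear system can be solved for in closed form.
def newmark(m, k, x0, v0, dt, steps, beta=0.25, gamma=0.5):
    x, v = x0, v0
    a = -k * x / m                      # initial acceleration from the ODE
    for _ in range(steps):
        x_pred = x + dt * v + dt * dt * (0.5 - beta) * a
        v_pred = v + dt * (1.0 - gamma) * a
        # Solve m*a_new + k*(x_pred + beta*dt^2*a_new) = 0 for a_new.
        a_new = -k * x_pred / (m + k * beta * dt * dt)
        x = x_pred + beta * dt * dt * a_new
        v = v_pred + gamma * dt * a_new
        a = a_new
    return x, v

# One full period of a unit oscillator returns close to the initial state,
# since the average-acceleration variant is unconditionally stable and
# conserves energy for linear systems.
x, v = newmark(m=1.0, k=1.0, x0=1.0, v0=0.0, dt=0.01, steps=628)
```

In the aeroelastic solver the same recurrence is applied to the full beam mass and stiffness matrices, with the aerodynamic loads entering as the forcing term.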

  9. Revealing Neurocomputational Mechanisms of Reinforcement Learning and Decision-Making With the hBayesDM Package

    PubMed Central

    Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei

    2017-01-01

    Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of coding. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of and interactions between multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. 
In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
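    hBayesDM itself is an R package built on Stan's MCMC machinery, but the core idea of hierarchical estimation, individual and group parameters constraining each other, can be shown with a conjugate toy model. The sketch below is an illustrative simplification, not the package's actual sampler:

```python
# Partial pooling under a normal-normal hierarchical model: the posterior
# mean of an individual's parameter is a precision-weighted average of the
# individual estimate x_i and the group mean mu. Noisier individual data
# (larger sigma2) shrink the estimate more strongly toward the group.
def shrink(x_i, sigma2, mu, tau2):
    w = (1.0 / sigma2) / (1.0 / sigma2 + 1.0 / tau2)
    return w * x_i + (1.0 - w) * mu

# Equal individual and group precision: the estimate moves halfway to mu.
est = shrink(x_i=2.0, sigma2=1.0, mu=0.0, tau2=1.0)  # 1.0
```

This shrinkage is what the abstract means by "mutually constraining" estimation: extreme individual parameter estimates are regularized by the group-level posterior.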

  10. In silico designing of power conversion efficient organic lead dyes for solar cells using today's innovative approaches to assure renewable energy for the future

    NASA Astrophysics Data System (ADS)

    Kar, Supratik; Roy, Juganta K.; Leszczynski, Jerzy

    2017-06-01

    Advances in solar cell technology require the design of new organic dye sensitizers for dye-sensitized solar cells with high power conversion efficiency, to circumvent the disadvantages of silicon-based solar cells. In silico studies, including quantitative structure-property relationship analysis combined with quantum chemical analysis, were employed to understand the primary electron transfer mechanism and photo-physical properties of 273 arylamine organic dyes from 11 diverse chemical families, specific to the iodine electrolyte. The direct quantitative structure-property relationship models enable identification of the essential electronic and structural attributes necessary for quantifying the molecular prerequisites of the 11 classes of arylamine organic dyes responsible for high power conversion efficiency of dye-sensitized solar cells. Tetrahydroquinoline, N,N'-dialkylaniline and indoline are among the least explored classes of arylamine organic dyes for dye-sensitized solar cells. Therefore, the properties identified from the corresponding quantitative structure-property relationship models of these classes were employed in the design of "lead dyes". Subsequently, a series of electrochemical and photo-physical parameters was computed for the designed dyes to verify the variables required for electron flow in dye-sensitized solar cells. The combined computational techniques yielded seven promising lead dyes for each of the three chemical classes considered. Significant increases (130, 183, and 46%) in predicted power conversion efficiency, relative to the existing dye with the highest experimental power conversion efficiency, were observed for tetrahydroquinoline, N,N'-dialkylaniline and indoline, respectively, while maintaining the required electrochemical parameters.

  11. Quantitative computed tomography for the prediction of pulmonary function after lung cancer surgery: a simple method using simulation software.

    PubMed

    Ueda, Kazuhiro; Tanaka, Toshiki; Li, Tao-Sheng; Tanaka, Nobuyuki; Hamano, Kimikazu

    2009-03-01

    The prediction of pulmonary functional reserve is mandatory in therapeutic decision-making for patients with resectable lung cancer, especially those with underlying lung disease. Volumetric analysis in combination with densitometric analysis of the affected lung lobe or segment with quantitative computed tomography (CT) helps to identify residual pulmonary function, although the utility of this modality needs investigation. The subjects of this prospective study were 30 patients with resectable lung cancer. A three-dimensional CT lung model was created with voxels representing normal lung attenuation (-600 to -910 Hounsfield units). Residual pulmonary function was predicted by drawing a boundary line between the lung to be preserved and that to be resected, directly on the lung model. The predicted values were correlated with the postoperative measured values. The predicted and measured values corresponded well (r=0.89, p<0.001). Although the predicted values corresponded with values predicted by simple calculation using a segment-counting method (r=0.98), there were two outliers whose pulmonary functional reserves were predicted more accurately by CT than by segment counting. The measured pulmonary functional reserves were significantly higher than the predicted values in patients with extensive emphysematous areas (<-910 Hounsfield units), but not in patients with chronic obstructive pulmonary disease. Quantitative CT yielded accurate prediction of functional reserve after lung cancer surgery and helped to identify patients whose functional reserves are likely to be underestimated. Hence, this modality should be utilized for patients with marginal pulmonary function.
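    The prediction principle, counting voxels of normal attenuation (-910 to -600 HU) in the preserved lung versus the whole lung, can be sketched as follows; the voxel values, preservation mask, and scaling of preoperative FEV1 are illustrative assumptions, not the authors' software:

```python
# Predicted postoperative function = preoperative function scaled by the
# preserved fraction of normally attenuated lung voxels. Voxels outside
# the [-910, -600] HU window (e.g. emphysematous regions below -910 HU)
# do not count as functioning lung.
def predicted_postop(fev1_pre, voxels_hu, preserve_mask, lo=-910, hi=-600):
    normal_total = normal_kept = 0
    for hu, keep in zip(voxels_hu, preserve_mask):
        if lo <= hu <= hi:
            normal_total += 1
            if keep:
                normal_kept += 1
    return fev1_pre * normal_kept / normal_total

# Toy data: six voxels; the two below -910 HU are excluded from the count
# even when they lie in the preserved region.
hu_values = [-700, -650, -920, -800, -750, -950]
keep_mask = [True, True, True, False, False, False]
pred = predicted_postop(2.0, hu_values, keep_mask)  # 2.0 * 2/4 = 1.0
```

This illustrates why the CT-based prediction can outperform segment counting: an anatomically preserved segment contributes nothing if its voxels fall outside the normal attenuation window.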

  12. Soft Tissue Structure Modelling for Use in Orthopaedic Applications and Musculoskeletal Biomechanics

    NASA Astrophysics Data System (ADS)

    Audenaert, E. A.; Mahieu, P.; van Hoof, T.; Pattyn, C.

    2009-12-01

    We present our methodology for the three-dimensional anatomical and geometrical description of soft tissues, relevant for orthopaedic surgical applications and musculoskeletal biomechanics. The technique involves the segmentation and geometrical description of muscles and neurovascular structures from high-resolution computer tomography scanning for the reconstruction of generic anatomical models. These models can be used for quantitative interpretation of anatomical and biomechanical aspects of different soft tissue structures. This approach should allow the use of these data in other application fields, such as musculoskeletal modelling, simulations for radiation therapy, and databases for use in minimally invasive, navigated and robotic surgery.

  13. Internet-based system for simulation-based medical planning for cardiovascular disease.

    PubMed

    Steele, Brooke N; Draney, Mary T; Ku, Joy P; Taylor, Charles A

    2003-06-01

    Current practice in vascular surgery utilizes only diagnostic and empirical data to plan treatments, which does not enable quantitative a priori prediction of the outcomes of interventions. We have previously described simulation-based medical planning methods to model blood flow in arteries and plan medical treatments based on physiologic models. An important consideration for the design of these patient-specific modeling systems is the accessibility to physicians with modest computational resources. We describe a simulation-based medical planning environment developed for the World Wide Web (WWW) using the Virtual Reality Modeling Language (VRML) and the Java programming language.

  14. On agent-based modeling and computational social science.

    PubMed

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed, focusing on the role of generative theories, which aim to explain phenomena by growing them. After a brief analysis of the field's major strengths, some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and overshadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest in Computational Social Science (CSS) is examined, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM and reconciles it with the quantitative variant, is proposed as a fundamental requirement for a new program of CSS.

  15. On agent-based modeling and computational social science

    PubMed Central

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed, focusing on the role of generative theories, which aim to explain phenomena by growing them. After a brief analysis of the field's major strengths, some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and overshadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest in Computational Social Science (CSS) is examined, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM and reconciles it with the quantitative variant, is proposed as a fundamental requirement for a new program of CSS. PMID:25071642

  16. A Computational Model of the Ionic Currents, Ca2+ Dynamics and Action Potentials Underlying Contraction of Isolated Uterine Smooth Muscle

    PubMed Central

    Tong, Wing-Chiu; Choi, Cecilia Y.; Karche, Sanjay; Holden, Arun V.; Zhang, Henggui; Taggart, Michael J.

    2011-01-01

    Uterine contractions during labor are discretely regulated by rhythmic action potentials (AP) of varying duration and form that serve to determine calcium-dependent force production. We have employed a computational biology approach to develop a fuller understanding of the complexity of excitation-contraction (E-C) coupling of uterine smooth muscle cells (USMC). Our overall aim is to establish a mathematical platform of sufficient biophysical detail to quantitatively describe known uterine E-C coupling parameters and thereby inform future empirical investigations of physiological and pathophysiological mechanisms governing normal and dysfunctional labors. From published and unpublished data we construct mathematical models for fourteen ionic currents of USMCs: Ca2+ currents (L- and T-type), an Na+ current, a hyperpolarization-activated current, three voltage-gated K+ currents, two Ca2+-activated K+ currents, a Ca2+-activated Cl- current, a non-specific cation current, the Na+-Ca2+ exchanger, the Na+-K+ pump, and a background current. The magnitudes and kinetics of each current system in a spindle-shaped single cell with a specified surface area:volume ratio are described by differential equations, in terms of maximal conductances, electrochemical gradients, voltage-dependent activation/inactivation gating variables, and temporal changes in intracellular Ca2+ computed from known fluxes. These quantifications are validated by the reconstruction of the individual experimental ionic currents obtained under voltage-clamp. Phasic contraction is modeled in relation to the time constant of the changing intracellular Ca2+. This integrated model is validated by its reconstruction of the different USMC AP configurations (spikes, plateau, and bursts of spikes), of the change from bursting to plateau-type AP produced by estradiol, and of simultaneous experimental recordings of spontaneous AP and phasic force.
In summary, our advanced mathematical model provides a powerful tool to investigate the physiological ionic mechanisms underlying the genesis of uterine electrical E-C coupling of labor and parturition. This will furnish the evolution of descriptive and predictive quantitative models of myometrial electrogenesis at the whole cell and tissue levels. PMID:21559514
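The gating formalism summarized above (voltage-dependent activation variables relaxing toward their steady state, driving an ionic current) can be sketched in a few lines. The rate functions and conductance below are illustrative placeholders, not the fitted USMC parameters:

```python
import math

def simulate_gated_current(v_step_mV, g_max=0.6, e_rev_mV=-83.0, dt=0.01, t_end_ms=50.0):
    """Forward-Euler voltage-clamp sketch of one Hodgkin-Huxley-style current,
    I = g_max * n * (V - E_rev), with a single activation gate obeying
    dn/dt = (n_inf(V) - n) / tau_n(V).  All parameter values are illustrative,
    not those of the published uterine smooth muscle cell model."""
    n_inf = lambda v: 1.0 / (1.0 + math.exp(-(v + 25.0) / 9.0))        # steady-state activation
    tau_n = lambda v: 1.0 + 5.0 / (1.0 + math.exp((v + 30.0) / 20.0))  # time constant (ms)
    n = n_inf(-60.0)                      # equilibrated at the holding potential
    current = []
    for _ in range(int(t_end_ms / dt)):
        n += dt * (n_inf(v_step_mV) - n) / tau_n(v_step_mV)            # gate relaxes toward n_inf
        current.append(g_max * n * (v_step_mV - e_rev_mV))             # ionic current
    return current
```

During a depolarizing step the gate opens and the simulated current rises toward g_max * n_inf(V) * (V - E_rev), mirroring the voltage-clamp traces the model is validated against.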

  17. Mechanistic Computational Model of Steroidogenesis in H295R Cells: Role of Oxysterols and Cell Proliferation to Improve Predictability of Biochemical Response to Endocrine Active Chemical Metyrapone

    EPA Science Inventory

    The human adrenocortical carcinoma cell line H295R is being used as an in vitro steroidogenesis screening assay to assess the impact of endocrine active chemicals (EACs) capable of altering steroid biosynthesis. To enhance the interpretation and quantitative application of measur...

  18. Theoretical Framework for Interaction Game Design

    DTIC Science & Technology

    2016-05-19

    modeling. We take a data-driven quantitative approach to understand conversational behaviors by measuring conversational behaviors using advanced sensing...current state of the art, human computing is considered to be a reasonable approach to break through the current limitation. To solicit high quality and...proper resources in conversation to enable smooth and effective interaction. The last technique is about conversation measurement, analysis, and

  19. Computational models of aortic coarctation in hypoplastic left heart syndrome: considerations on validation of a detailed 3D model.

    PubMed

    Biglino, Giovanni; Corsini, Chiara; Schievano, Silvia; Dubini, Gabriele; Giardini, Alessandro; Hsia, Tain-Yen; Pennati, Giancarlo; Taylor, Andrew M

    2014-05-01

    Reliability of computational models for cardiovascular investigations strongly depends on their validation against physical data. This study aims to experimentally validate a computational model of complex congenital heart disease (i.e., surgically palliated hypoplastic left heart syndrome with aortic coarctation) thus demonstrating that hemodynamic information can be reliably extrapolated from the model for clinically meaningful investigations. A patient-specific aortic arch model was tested in a mock circulatory system and the same flow conditions were re-created in silico, by setting an appropriate lumped parameter network (LPN) attached to the same three-dimensional (3D) aortic model (i.e., multi-scale approach). The model included a modified Blalock-Taussig shunt and coarctation of the aorta. Different flow regimes were tested as well as the impact of uncertainty in viscosity. Computational flow and pressure results were in good agreement with the experimental signals, both qualitatively, in terms of the shape of the waveforms, and quantitatively (mean aortic pressure 62.3 vs. 65.1 mmHg, 4.8% difference; mean aortic flow 28.0 vs. 28.4% inlet flow, 1.4% difference; coarctation pressure drop 30.0 vs. 33.5 mmHg, 10.4% difference), proving the reliability of the numerical approach. It was observed that substantial changes in fluid viscosity or using a turbulent model in the numerical simulations did not significantly affect flows and pressures of the investigated physiology. Results highlighted how the non-linear fluid dynamic phenomena occurring in vitro must be properly described to ensure satisfactory agreement. This study presents methodological considerations for using experimental data to preliminarily set up a computational model, and then simulate a complex congenital physiology using a multi-scale approach.

  20. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
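The reassessment factor described above, the coefficient of variation of the coating mass, lends itself to a minimal risk-ranking sketch. The setting names and mass data below are hypothetical, not taken from the study:

```python
import statistics

def coefficient_of_variation(masses):
    """CV = sample standard deviation / mean; a lower CV means more
    uniform coating mass across tablets."""
    return statistics.stdev(masses) / statistics.mean(masses)

def rank_settings_by_risk(results):
    """results maps a process-parameter setting (hypothetical names) to the
    coating masses observed in its simulation run; settings are returned
    from highest CV (highest risk priority) to lowest."""
    return sorted(results, key=lambda name: coefficient_of_variation(results[name]),
                  reverse=True)
```

Applied to two simulated runs, the setting with the more scattered coating masses is ranked as the higher-risk failure mode.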

  1. Computer Modeling of Non-Isothermal Crystallization

    NASA Technical Reports Server (NTRS)

    Kelton, K. F.; Narayan, K. Lakshmi; Levine, L. E.; Cull, T. C.; Ray, C. S.

    1996-01-01

    A realistic computer model for simulating isothermal and non-isothermal phase transformations proceeding by homogeneous and heterogeneous nucleation and interface-limited growth is presented. A new treatment for particle size effects on the crystallization kinetics is developed and is incorporated into the numerical model. Time-dependent nucleation rates, size-dependent growth rates, and surface crystallization are also included. Model predictions are compared with experimental measurements of DSC/DTA peak parameters for the crystallization of lithium disilicate glass as a function of particle size, Pt doping levels, and water content. The quantitative agreement that is demonstrated indicates that the numerical model can be used to extract key kinetic data from easily obtained calorimetric data. The model can also be used to probe nucleation and growth behavior in regimes that are otherwise inaccessible. Based on a fit to data, an earlier prediction that the time-dependent nucleation rate in a DSC/DTA scan can rise above the steady-state value at a temperature higher than the peak in the steady-state rate is demonstrated.
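As a baseline for the nucleation-and-growth kinetics discussed above, the classical isothermal Kolmogorov-Johnson-Mehl-Avrami (KJMA) expression, which the paper's numerical model generalizes to time-dependent nucleation and finite particle sizes, can be written directly:

```python
import math

def jmak_fraction(t, k, n):
    """Classical KJMA transformed fraction for isothermal crystallization:
    X(t) = 1 - exp(-(k t)^n), with rate constant k and Avrami exponent n.
    A sketch of the textbook limit only; the paper's model treats
    time-dependent nucleation, size effects and surface crystallization."""
    return 1.0 - math.exp(-(k * t) ** n)
```

At t = 1/k the transformed fraction is 1 - 1/e regardless of the exponent, a convenient sanity check for fitted DSC/DTA peak parameters.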

  2. Computational study of nonlinear plasma waves. [plasma simulation model applied to electrostatic waves in collisionless plasma

    NASA Technical Reports Server (NTRS)

    Matsuda, Y.

    1974-01-01

    A low-noise plasma simulation model is developed and applied to a series of linear and nonlinear problems associated with electrostatic wave propagation in a one-dimensional, collisionless, Maxwellian plasma, in the absence of magnetic field. It is demonstrated that use of the hybrid simulation model allows economical studies to be carried out in both the linear and nonlinear regimes with better quantitative results, for comparable computing time, than can be obtained by conventional particle simulation models, or direct solution of the Vlasov equation. The characteristics of the hybrid simulation model itself are first investigated, and it is shown to be capable of verifying the theoretical linear dispersion relation at wave energy levels as low as 10^-6 of the plasma thermal energy. Having established the validity of the hybrid simulation model, it is then used to study the nonlinear dynamics of monochromatic wave, sideband instability due to trapped particles, and satellite growth.
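For Langmuir (electrostatic) waves in a Maxwellian plasma, the theoretical linear dispersion relation that such a simulation would be checked against is the Bohm-Gross relation; a one-line sketch in normalized units:

```python
def bohm_gross_omega(k, omega_p, v_th):
    """Bohm-Gross dispersion for Langmuir waves in a Maxwellian plasma:
    omega^2 = omega_p^2 + 3 k^2 v_th^2, with plasma frequency omega_p and
    electron thermal speed v_th.  (Weak-damping limit; Landau damping is
    not represented in this algebraic relation.)"""
    return (omega_p**2 + 3.0 * k**2 * v_th**2) ** 0.5
```

At long wavelengths (k -> 0) the oscillation frequency reduces to the plasma frequency, the limit a low-noise code must reproduce.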

  3. Quantitative interpretations of Visible-NIR reflectance spectra of blood.

    PubMed

    Serebrennikova, Yulia M; Smith, Jennifer M; Huffman, Debra E; Leparc, German F; García-Rubio, Luis H

    2008-10-27

    This paper illustrates the implementation of a new theoretical model for rapid quantitative analysis of the Vis-NIR diffuse reflectance spectra of blood cultures. This new model is based on the photon diffusion theory and Mie scattering theory that have been formulated to account for multiple scattering populations and absorptive components. This study stresses the significance of the thorough solution of the scattering and absorption problem in order to accurately resolve the optically relevant parameters of blood culture components. With advantages of being calibration-free and computationally fast, the new model has two basic requirements. First, wavelength-dependent refractive indices of the basic chemical constituents of blood culture components are needed. Second, multi-wavelength measurements, or at least measurements at a number of characteristic wavelengths equal to the degrees of freedom, i.e., the number of optically relevant parameters, of the blood culture system, are required. The blood culture analysis model was tested with a large number of diffuse reflectance spectra of blood culture samples characterized by an extensive range of the relevant parameters.

  4. Development of a hydraulic model of the human systemic circulation

    NASA Technical Reports Server (NTRS)

    Sharp, M. K.; Dharmalingham, R. K.

    1999-01-01

    Physical and numeric models of the human circulation are constructed for a number of objectives, including studies and training in physiologic control, interpretation of clinical observations, and testing of prosthetic cardiovascular devices. For many of these purposes it is important to quantitatively validate the dynamic response of the models in terms of the input impedance (Z = oscillatory pressure/oscillatory flow). To address this need, the authors developed an improved physical model. Using a computer study, the authors first identified the configuration of lumped parameter elements in a model of the systemic circulation; the result was a good match with human aortic input impedance with a minimum number of elements. Design, construction, and testing of a hydraulic model analogous to the computer model followed. Numeric results showed that a three element model with two resistors and one compliance produced reasonable matching without undue complication. The subsequent analogous hydraulic model included adjustable resistors incorporating a sliding plate to vary the flow area through a porous material and an adjustable compliance consisting of a variable-volume air chamber. The response of the hydraulic model compared favorably with other circulation models.
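The selected three-element configuration (two resistors and one compliance) is the classic Windkessel; its input impedance can be sketched directly. The element values below are arbitrary illustrations, not the fitted ones:

```python
import math

def windkessel3_impedance(freq_hz, Rc, Rp, C):
    """Input impedance of the three-element Windkessel: proximal
    (characteristic) resistance Rc in series with the parallel combination
    of peripheral resistance Rp and compliance C:
        Z(w) = Rc + Rp / (1 + j w Rp C),  w = 2*pi*f.
    Element values passed in are illustrative, not fitted human values."""
    w = 2.0 * math.pi * freq_hz
    return Rc + Rp / (1.0 + 1j * w * Rp * C)
```

At zero frequency the impedance is the total resistance Rc + Rp; at high frequency the compliance shorts out Rp and only Rc remains, which is what lets this three-element network approximate measured aortic input impedance.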

  5. Automatic Fitting of Spiking Neuron Models to Electrophysiological Recordings

    PubMed Central

    Rossant, Cyrille; Goodman, Dan F. M.; Platkiewicz, Jonathan; Brette, Romain

    2010-01-01

    Spiking models can accurately predict the spike trains produced by cortical neurons in response to somatically injected currents. Since the specific characteristics of the model depend on the neuron, a computational method is required to fit models to electrophysiological recordings. The fitting procedure can be very time consuming both in terms of computer simulations and in terms of code writing. We present algorithms to fit spiking models to electrophysiological data (time-varying input and spike trains) that can run in parallel on graphics processing units (GPUs). The model fitting library is interfaced with Brian, a neural network simulator in Python. If a GPU is present it uses just-in-time compilation to translate model equations into optimized code. Arbitrary models can then be defined at script level and run on the graphics card. This tool can be used to obtain empirically validated spiking models of neurons in various systems. We demonstrate its use on public data from the INCF Quantitative Single-Neuron Modeling 2009 competition by comparing the performance of a number of neuron spiking models. PMID:20224819
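A minimal example of the kind of spiking model being fitted, a leaky integrate-and-fire neuron driven by an injected current, can be sketched without Brian or a GPU. This is not Brian's API, and the parameter values are illustrative:

```python
def lif_spike_times(I, dt=0.1, tau=10.0, v_rest=-70.0, v_thresh=-54.0,
                    v_reset=-70.0, R=10.0):
    """Leaky integrate-and-fire sketch: tau dV/dt = -(V - v_rest) + R*I(t),
    with spike-and-reset at threshold.  I is a list of current samples at
    step dt (ms); returns spike times (ms).  Illustrative parameters only."""
    v = v_rest
    spikes = []
    for i, amp in enumerate(I):
        v += dt / tau * (-(v - v_rest) + R * amp)   # membrane integration
        if v >= v_thresh:
            spikes.append(i * dt)                   # record spike time
            v = v_reset                             # reset after spike
    return spikes
```

A fitting procedure of the kind described would adjust parameters such as tau, R and v_thresh until the model's spike times match the recorded ones, scoring candidates by a spike-coincidence measure.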

  6. Modular modelling with Physiome standards

    PubMed Central

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

    Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks, but relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples, including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. 
To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole‐cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233

  7. Visual salience metrics for image inpainting

    NASA Astrophysics Data System (ADS)

    Ardis, Paul A.; Singhal, Amit

    2009-01-01

    Quantitative metrics for successful image inpainting currently do not exist, with researchers instead relying upon qualitative human comparisons to evaluate their methodologies and techniques. In an attempt to rectify this situation, we propose two new metrics to capture the notions of noticeability and visual intent in order to evaluate inpainting results. The proposed metrics use a quantitative measure of visual salience based upon a computational model of human visual attention. We demonstrate how these two metrics repeatably correlate with qualitative opinion in a human observer study, correctly identify the optimum uses for exemplar-based inpainting (as specified in the original publication), and match qualitative opinion in published examples.

  8. Quantitative Prediction of Computational Quality (so the S and C Folks will Accept it)

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Luckring, James M.; Morrison, Joseph H.

    2004-01-01

    Our choice of title may seem strange but we mean each word. In this talk, we are not going to be concerned with computations made "after the fact", i.e. those for which data are available and which are being conducted for explanation and insight. Here we are interested in preventing S&C design problems by finding them through computation before data are available. For such a computation to have any credibility with those who absorb the risk, it is necessary to quantitatively PREDICT the quality of the computational results.

  9. Systems cell biology.

    PubMed

    Mast, Fred D; Ratushny, Alexander V; Aitchison, John D

    2014-09-15

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.

  10. A New Tool for Local Manipulation of Neuronal Micro-Circuitry with Ions and Force

    DTIC Science & Technology

    2017-02-07

    lifesci.ucsb.edu Final Report 7/30/2015-9/30/2016 neurons, memory, connectivity, microcircuitry 2/7/2017 Our goal is to compute the complete functional... memory trace. To date no functional connectivity map exists for living neurons at the resolution proposed here. In fact, a quantitative model of the...propagation signals are also present in cultures of human iPS-derived neurons and thus could be used to study axonal physiology in human disease models.

  11. Parallel FEM Simulation of Electromechanics in the Heart

    NASA Astrophysics Data System (ADS)

    Xia, Henian; Wong, Kwai; Zhao, Xiaopeng

    2011-11-01

    Cardiovascular disease is the leading cause of death in America. Computer simulation of the complicated dynamics of the heart could provide valuable quantitative guidance for diagnosis and treatment of heart problems. In this paper, we present an integrated numerical model which encompasses the interaction of cardiac electrophysiology, electromechanics, and mechano-electric feedback. The model is solved by the finite element method on a Linux cluster and the Cray XT5 supercomputer Kraken. Dynamical influences between the effects of electromechanical coupling and mechano-electric feedback are shown.

  12. Diagnostic accuracy of semi-automatic quantitative metrics as an alternative to expert reading of CT myocardial perfusion in the CORE320 study.

    PubMed

    Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C

    To determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics as continuous values were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by area under the receiver operating characteristic curve (AUC). Of the total 377 included patients, 66% were male, median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.90), respectively. In vessel-based analyses the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative metrics is as accurate as CTA-CTP expert reading to detect hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
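The AUC comparison above can be reproduced in miniature: the area under the ROC curve equals the Mann-Whitney probability that a randomly chosen positive case outscores a randomly chosen negative one. A sketch on toy scores, not the CORE320 data:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive case scores above a random
    negative case, counting ties as half.  labels are 1 (disease) / 0."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A combined metric score (e.g., the logistic-regression output described above) would be passed in as `scores`, with the reference-standard CAD status as `labels`.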

  13. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    PubMed

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. 
While the field of computational cell biology has advanced rapidly with increasing availability of new data and powerful simulation techniques, a quantitative orientation is still lacking in life sciences education to make efficient use of these new tools to implement the new toxicity testing paradigm. A revamped undergraduate curriculum in the biological sciences including compulsory courses in mathematics and analysis of dynamical systems is required to address this gap. In parallel, dissemination of computational systems biology techniques and other analytical tools among practicing toxicologists and risk assessment professionals will help accelerate implementation of the new toxicity testing vision.
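One of the motifs named above, negative feedback, can be sketched as a one-variable ODE whose steady state is buffered against changes in the input, the essence of homeostasis. The Hill coefficient and rate constants below are illustrative:

```python
def steady_state_x(signal, k_syn=1.0, K=1.0, n=4, k_deg=1.0, dt=0.01, steps=50000):
    """Negative-autoregulation motif sketch: X inhibits its own synthesis,
        dX/dt = k_syn * signal / (1 + (X/K)^n) - k_deg * X,
    integrated to steady state by forward Euler.  Illustrative parameters;
    not a model of any specific toxicity pathway."""
    x = 0.0
    for _ in range(steps):
        x += dt * (k_syn * signal / (1.0 + (x / K) ** n) - k_deg * x)
    return x
```

Raising the input tenfold moves the steady state only about twofold here, a simple quantitative signature of the homeostatic (and hence nonlinear) dose-response behavior such motifs produce.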

  14. PEPIS: A Pipeline for Estimating Epistatic Effects in Quantitative Trait Locus Mapping and Genome-Wide Association Studies.

    PubMed

    Zhang, Wenchao; Dai, Xinbin; Wang, Qishan; Xu, Shizhong; Zhao, Patrick X

    2016-05-01

    The term epistasis refers to interactions between multiple genetic loci. Genetic epistasis is important in regulating biological function and is considered to explain part of the 'missing heritability,' which involves marginal genetic effects that cannot be accounted for in genome-wide association studies. Thus, the study of epistasis is of great interest to geneticists. However, estimating epistatic effects for quantitative traits is challenging due to the large number of interaction effects that must be estimated, thus significantly increasing computing demands. Here, we present a new web server-based tool, the Pipeline for estimating EPIStatic genetic effects (PEPIS), for analyzing polygenic epistatic effects. The PEPIS software package is based on a new linear mixed model that has been used to predict the performance of hybrid rice. The PEPIS includes two main sub-pipelines: the first for kinship matrix calculation, and the second for polygenic component analyses and genome scanning for main and epistatic effects. To accommodate the demand for high-performance computation, the PEPIS utilizes C/C++ for mathematical matrix computing. In addition, the modules for kinship matrix calculations and main and epistatic-effect genome scanning employ parallel computing technology that effectively utilizes multiple computer nodes across our networked cluster, thus significantly improving the computational speed. For example, when analyzing the same immortalized F2 rice population genotypic data examined in a previous study, the PEPIS returned results identical to those of the original prototype R code at each analysis step, but the computational time was reduced from more than one month to about five minutes. These advances will help overcome the bottleneck frequently encountered in genome-wide epistatic genetic effect analysis and enable accommodation of the high computational demand. The PEPIS is publicly available at http://bioinfo.noble.org/PolyGenic_QTL/.
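The kinship-matrix step of such a pipeline is, in its simplest marker-based form, a centered cross-product of the genotype matrix. A pure-Python sketch; PEPIS itself uses C/C++ with parallel computing, and its exact kinship definitions (including the epistatic kinship matrices) may differ:

```python
def kinship_matrix(genotypes):
    """Marker-based additive kinship sketch: K = Z Z^T / m, where Z is the
    column-centered n x m genotype matrix (individuals x markers, coded
    0/1/2 by minor-allele count).  Illustrative definition only."""
    n, m = len(genotypes), len(genotypes[0])
    means = [sum(row[j] for row in genotypes) / n for j in range(m)]
    Z = [[row[j] - means[j] for j in range(m)] for row in genotypes]
    return [[sum(Z[i][k] * Z[j][k] for k in range(m)) / m
             for j in range(n)] for i in range(n)]
```

The resulting matrix is symmetric, with larger entries for genetically similar individuals; it parameterizes the covariance structure of the linear mixed model.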

  15. Verification methodology for fault-tolerant, fail-safe computers applied to maglev control computer systems. Final report, July 1991-May 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lala, J.H.; Nagle, G.A.; Harper, R.E.

    1993-05-01

    The Maglev control computer system should be designed to verifiably possess high reliability and safety as well as high availability to make Maglev a dependable and attractive transportation alternative to the public. A Maglev control computer system has been designed using a design-for-validation methodology developed earlier under NASA and SDIO sponsorship for real-time aerospace applications. The present study starts by defining the maglev mission scenario and ends with the definition of a maglev control computer architecture. Key intermediate steps included definitions of functional and dependability requirements, synthesis of two candidate architectures, development of qualitative and quantitative evaluation criteria, and analytical modeling of the dependability characteristics of the two architectures. Finally, the applicability of the design-for-validation methodology was also illustrated by applying it to the German Transrapid TR07 maglev control system.

  16. Human judgment vs. quantitative models for the management of ecological resources.

    PubMed

    Holden, Matthew H; Ellner, Stephen P

    2016-07-01

    Despite major advances in quantitative approaches to natural resource management, there has been resistance to using these tools in the actual practice of managing ecological populations. Given a managed system and a set of assumptions, translated into a model, optimization methods can be used to solve for the most cost-effective management actions. However, when the underlying assumptions are not met, such methods can potentially lead to decisions that harm the environment and economy. Managers who develop decisions based on past experience and judgment, without the aid of mathematical models, can potentially learn about the system and develop flexible management strategies. However, these strategies are often based on subjective criteria and equally invalid and often unstated assumptions. Given the drawbacks of both methods, it is unclear whether simple quantitative models improve environmental decision making over expert opinion. In this study, we explore how well students, using their experience and judgment, manage simulated fishery populations in an online computer game and compare their management outcomes to the performance of model-based decisions. We consider harvest decisions generated using four different quantitative models: (1) the model used to produce the simulated population dynamics observed in the game, with the values of all parameters known (as a control), (2) the same model, but with unknown parameter values that must be estimated during the game from observed data, (3) models that are structurally different from those used to simulate the population dynamics, and (4) a model that ignores age structure. Humans on average performed much worse than the models in cases 1-3, but in a small minority of scenarios, models produced worse outcomes than those resulting from students making decisions based on experience and judgment. 
When the models ignored age structure, they generated poorly performing management decisions, but still outperformed students using experience and judgment 66% of the time. © 2016 by the Ecological Society of America.
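The flavor of the simulated fishery task can be sketched with a Schaefer (logistic surplus-production) model and a fixed annual quota; the parameters below are illustrative, not those used in the game:

```python
def simulate_fishery(quota, r=0.5, K=100.0, N0=50.0, years=200):
    """Schaefer surplus-production sketch: each year the stock grows by
    r*N*(1 - N/K) and a fixed quota is removed.  Returns the total catch;
    over-harvesting drives the stock extinct and ends the catch stream.
    Illustrative parameters, not those of the online game."""
    N, total = N0, 0.0
    for _ in range(years):
        N += r * N * (1.0 - N / K)     # logistic surplus production
        catch = min(quota, N)          # cannot catch more than the stock
        N -= catch
        total += catch
        if N <= 0.0:
            break                      # stock collapse: no further yield
    return total
```

For this model the maximum sustainable yield is r*K/4 per year; a quota at that level is sustained indefinitely, while a larger one yields more briefly and then collapses the stock, the kind of assumption-sensitivity the study probes.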

  17. Multi-scale Modeling of Chromosomal DNA in Living Cells

    NASA Astrophysics Data System (ADS)

    Spakowitz, Andrew

    The organization and dynamics of chromosomal DNA play a pivotal role in a range of biological processes, including gene regulation, homologous recombination, replication, and segregation. Establishing a quantitative theoretical model of DNA organization and dynamics would be valuable in bridging the gap between the molecular-level packaging of DNA and genome-scale chromosomal processes. Our research group utilizes analytical theory and computational modeling to establish a predictive theoretical model of chromosomal organization and dynamics. In this talk, I will discuss our efforts to develop multi-scale polymer models of chromosomal DNA that are both sufficiently detailed to address specific protein-DNA interactions while capturing experimentally relevant time and length scales. I will demonstrate how these modeling efforts are capable of quantitatively capturing aspects of behavior of chromosomal DNA in both prokaryotic and eukaryotic cells. This talk will illustrate that capturing dynamical behavior of chromosomal DNA at various length scales necessitates a range of theoretical treatments that accommodate the critical physical contributions that are relevant to in vivo behavior at these disparate length and time scales. National Science Foundation, Physics of Living Systems Program (PHY-1305516).

  18. Tracer kinetics of forearm endothelial function: comparison of an empirical method and a quantitative modeling technique.

    PubMed

    Zhao, Xueli; Arsenault, Andre; Lavoie, Kim L; Meloche, Bernard; Bacon, Simon L

    2007-01-01

    Forearm Endothelial Function (FEF) is a marker that has been shown to discriminate patients with cardiovascular disease (CVD). FEF has been assessed using several parameters: the Rate of Uptake Ratio (RUR), EWUR (Elbow-to-Wrist Uptake Ratio) and EWRUR (Elbow-to-Wrist Relative Uptake Ratio). However, the modeling functions of FEF require more robust models. The present study was designed to compare an empirical method with quantitative modeling techniques to better estimate the physiological parameters and understand the complex dynamic processes. The fitted time activity curves of the forearms, estimating blood and muscle components, were assessed using both an empirical method and a two-compartment model. Although correlational analyses suggested a good correlation between the methods for RUR (r=.90) and EWUR (r=.79), but not EWRUR (r=.34), Bland-Altman plots found poor agreement between the methods for all three parameters. These results indicate that there is a large discrepancy between the empirical and computational methods for FEF. Further work is needed to establish the physiological and mathematical validity of the two modeling methods.
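The two-compartment formulation referred to above can be sketched as a pair of coupled ODEs exchanging tracer between blood and muscle; the rate constants and bolus shape below are illustrative, not fitted values:

```python
def two_compartment_uptake(k_in=0.8, k_bm=0.3, k_mb=0.05, dt=0.01, t_end=60.0):
    """Two-compartment tracer sketch (blood B <-> muscle M):
        dB/dt = input(t) - k_bm*B + k_mb*M
        dM/dt =            k_bm*B - k_mb*M
    with a brief input bolus, integrated by forward Euler.  Illustrative
    rate constants; not the study's fitted model."""
    B = M = 0.0
    for step in range(int(t_end / dt)):
        inflow = k_in if step * dt < 1.0 else 0.0   # 1-s input bolus
        dB = inflow - k_bm * B + k_mb * M           # blood compartment
        dM = k_bm * B - k_mb * M                    # muscle compartment
        B += dt * dB
        M += dt * dM
    return B, M
```

Because the exchange terms cancel, total tracer is conserved after the bolus, and the slow washout constant (small k_mb here) shifts activity into the muscle compartment, the behavior a fit to forearm time-activity curves must capture.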

  19. Quantitative structure-property relationship modeling of remote liposome loading of drugs.

    PubMed

    Cern, Ahuva; Golbraikh, Alexander; Sedykh, Aleck; Tropsha, Alexander; Barenholz, Yechezkel; Goldblum, Amiram

    2012-06-10

    Remote loading of liposomes by trans-membrane gradients is used to achieve therapeutically efficacious intra-liposome concentrations of drugs. We have developed Quantitative Structure Property Relationship (QSPR) models of remote liposome loading for a data set of 60 drugs studied in 366 loading experiments performed in-house or reported elsewhere. Both experimental conditions and computed chemical descriptors were employed as independent variables to predict the initial drug/lipid ratio (D/L) required to achieve high loading efficiency. Both binary (to distinguish high vs. low initial D/L) and continuous (to predict real D/L values) models were generated using advanced machine learning approaches and 5-fold external validation. The external prediction accuracy for binary models was as high as 91-96%; for continuous models the mean coefficient of determination R(2) for regression between predicted and observed values was 0.76-0.79. We conclude that QSPR models can be used to identify candidate drugs expected to have high remote loading capacity while simultaneously optimizing the design of formulation experiments. Copyright © 2011 Elsevier B.V. All rights reserved.
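    The 5-fold external validation protocol described above can be sketched as follows. This is a hedged toy illustration: a nearest-centroid classifier on a synthetic one-dimensional descriptor stands in for the paper's machine learning models, and all descriptor values and labels are invented.

```python
# Toy sketch of 5-fold external validation for a binary (high vs. low D/L)
# classifier. The classifier and data are illustrative, not the paper's.

def five_fold_indices(n, k=5):
    """Yield (train, test) index lists for k non-overlapping external folds."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

def nearest_centroid_predict(x_train, y_train, x_test):
    """Assign each test point to the class with the nearest training centroid."""
    cents = {}
    for label in set(y_train):
        pts = [x for x, y in zip(x_train, y_train) if y == label]
        cents[label] = sum(pts) / len(pts)
    return [min(cents, key=lambda c: abs(x - cents[c])) for x in x_test]

# Synthetic 1-D "descriptor": low values -> class 0 (low D/L), high -> class 1
xs = [0.1, 0.2, 0.15, 0.3, 0.25, 0.9, 1.0, 0.95, 1.1, 0.85]
ys = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

correct = total = 0
for train, test in five_fold_indices(len(xs)):
    preds = nearest_centroid_predict([xs[i] for i in train],
                                     [ys[i] for i in train],
                                     [xs[i] for i in test])
    correct += sum(p == ys[i] for p, i in zip(preds, test))
    total += len(test)
accuracy = correct / total  # pooled external prediction accuracy
```

The key property mirrored here is that every compound is scored exactly once by a model that never saw it during training.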

  20. Designer drugs: the evolving science of drug discovery.

    PubMed

    Wanke, L A; DuBose, R F

    1998-07-01

    Drug discovery and design are fundamental to drug development. Until recently, most drugs were discovered through random screening or developed through molecular modification. New technologies are revolutionizing this phase of drug development. Rational drug design, using powerful computers and computational chemistry and employing X-ray crystallography, nuclear magnetic resonance spectroscopy, and three-dimensional quantitative structure-activity relationship analysis, is creating highly specific, biologically active molecules by virtual reality modeling. Sophisticated screening technologies are eliminating all but the most active lead compounds. These new technologies promise more efficacious, safe, and cost-effective medications, while minimizing drug development time and maximizing profits.

  1. The use of virtual environments for percentage view analysis.

    PubMed

    Schofield, Damian; Cox, Christopher J B

    2005-09-01

    It is recognised that Visual Impact Assessment (VIA), unlike many other aspects of Environmental Impact Assessments (EIA), relies less upon measurement than upon experience and judgement. Hence, a more structured and consistent approach to VIA is needed to reduce bias and subjectivity. For proposed developments, there are very few quantitative techniques for the evaluation of visibility, and these existing methods can be highly inaccurate and time-consuming. Percentage view changes are one of the few quantitative techniques, and the use of computer technology can reduce the inaccuracy and the time spent evaluating the visibility of either existing or proposed developments. For over 10 years, research work undertaken by the authors at the University of Nottingham has employed Computer Graphics (CG) and Virtual Reality (VR) in civilian and industrial contexts for environmental planning, design visualisation, accident reconstruction, risk analysis, data visualisation and training simulators. This paper describes a method to quantitatively assess the visual impact of proposed developments on the landscape using CG techniques. This method allows the determination of accurate percentage view changes with the use of a computer-generated model of the environment and the application of specialist software that has been developed at the University of Nottingham. The principles are easy to understand and therefore planners, authorisation agencies and members of the public can use and understand the results. A case study is shown to demonstrate the application and the capabilities of the technology.

  2. An affinity-structure database of helix-turn-helix: DNA complexes with a universal coordinate system.

    PubMed

    AlQuraishi, Mohammed; Tang, Shengdong; Xia, Xide

    2015-11-19

    Molecular interactions between proteins and DNA molecules underlie many cellular processes, including transcriptional regulation, chromosome replication, and nucleosome positioning. Computational analyses of protein-DNA interactions rely on experimental data characterizing known protein-DNA interactions structurally and biochemically. While many databases exist that contain either structural or biochemical data, few integrate these two data sources in a unified fashion. Such integration is becoming increasingly critical with the rapid growth of structural and biochemical data, and the emergence of algorithms that rely on the synthesis of multiple data types to derive computational models of molecular interactions. We have developed an integrated affinity-structure database in which the experimental and quantitative DNA binding affinities of helix-turn-helix proteins are mapped onto the crystal structures of the corresponding protein-DNA complexes. This database provides access to: (i) protein-DNA structures, (ii) quantitative summaries of protein-DNA binding affinities using position weight matrices, and (iii) raw experimental data of protein-DNA binding instances. Critically, this database establishes a correspondence between experimental structural data and quantitative binding affinity data at the single basepair level. Furthermore, we present a novel alignment algorithm that structurally aligns the protein-DNA complexes in the database and creates a unified residue-level coordinate system for comparing the physico-chemical environments at the interface between complexes. Using this unified coordinate system, we compute the statistics of atomic interactions at the protein-DNA interface of helix-turn-helix proteins. We provide an interactive website for visualization, querying, and analyzing this database, and a downloadable version to facilitate programmatic analysis. 
This database will facilitate the analysis of protein-DNA interactions and the development of programmatic computational methods that capitalize on integration of structural and biochemical datasets. The database can be accessed at http://ProteinDNA.hms.harvard.edu.

  3. Relationship Between Coronary Contrast-Flow Quantitative Flow Ratio and Myocardial Ischemia Assessed by SPECT MPI.

    PubMed

    Smit, Jeff M; Koning, Gerhard; van Rosendael, Alexander R; Dibbets-Schneider, Petra; Mertens, Bart J; Jukema, J Wouter; Delgado, Victoria; Reiber, Johan H C; Bax, Jeroen J; Scholte, Arthur J

    2017-10-01

    A new method has been developed to calculate fractional flow reserve (FFR) from invasive coronary angiography, the so-called "contrast-flow quantitative flow ratio (cQFR)". Recently, cQFR was compared to invasive FFR in intermediate coronary lesions showing an overall diagnostic accuracy of 85%. The purpose of this study was to investigate the relationship between cQFR and myocardial ischemia assessed by single-photon emission computed tomography myocardial perfusion imaging (SPECT MPI). Patients who underwent SPECT MPI and coronary angiography within 3 months were included. The cQFR computation was performed offline, using dedicated software. The cQFR computation was based on 3-dimensional quantitative coronary angiography (QCA) and computational fluid dynamics. The standard 17-segment model was used to determine the vascular territories. Myocardial ischemia was defined as a summed difference score ≥2 in a vascular territory. A cQFR of ≤0.80 was considered abnormal. Two hundred and twenty-four coronary arteries were analysed in 85 patients. Overall accuracy of cQFR to detect ischemia on SPECT MPI was 90%. In multivariable analysis, cQFR was independently associated with ischemia on SPECT MPI (OR per 0.01 decrease of cQFR: 1.10; 95% CI 1.04-1.18, p = 0.002), whereas clinical and QCA parameters were not. Furthermore, cQFR showed incremental value for the detection of ischemia compared to clinical and QCA parameters (global chi square 48.7 to 62.6; p <0.001). A good relationship between cQFR and SPECT MPI was found. cQFR was independently associated with ischemia on SPECT MPI and showed incremental value to detect ischemia compared to clinical and QCA parameters.

  4. Computational modeling of radiofrequency ablation: evaluation on ex vivo data using ultrasound monitoring

    NASA Astrophysics Data System (ADS)

    Audigier, Chloé; Kim, Younsu; Dillow, Austin; Boctor, Emad M.

    2017-03-01

    Radiofrequency ablation (RFA) is the most widely used minimally invasive ablative therapy for liver cancer, but it is challenged by a lack of patient-specific monitoring. Inter-patient tissue variability and the presence of blood vessels make the outcome of RFA difficult to predict. A monitoring tool that can be personalized for a given patient during the intervention would help achieve complete tumor ablation. However, clinicians do not have access to such a tool, which results in incomplete treatment and a large number of recurrences. Computational models can simulate the phenomena and mechanisms governing this therapy. The temperature evolution as well as the resulting ablation can be modeled. When combined with intraoperative measurements, computational modeling becomes an accurate and powerful tool to gain quantitative understanding and to enable improvements in the ongoing clinical settings. This paper shows how computational models of RFA can be evaluated using intra-operative measurements. First, simulations are used to demonstrate the feasibility of the method, which is then evaluated on two ex vivo datasets. RFA is simulated on a simplified geometry to generate realistic longitudinal temperature maps and the resulting necrosis. Computed temperatures are compared with the temperature evolution recorded by thermometers, and with temperatures monitored by ultrasound (US) in a 2D plane containing the ablation tip. Two ablations were performed on two cadaveric bovine livers; we achieved an average error of 2.2 °C between the computed and thermistor temperatures, and average errors of 1.4 °C and 2.7 °C between the computed and US-monitored temperatures at two different time points during the ablation (t = 240 s and t = 900 s).

  5. Computer Simulation of Embryonic Systems: What can a ...

    EPA Pesticide Factsheets

    (1) Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro ToxCast HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produces quantitative predictions.

  6. Reliably Discriminating Stock Structure with Genetic Markers:Mixture Models with Robust and Fast Computation.

    PubMed

    Foster, Scott D; Feutry, Pierre; Grewe, Peter M; Berry, Oliver; Hui, Francis K C; Davies, Campbell R

    2018-06-26

    Delineating naturally occurring and self-sustaining sub-populations (stocks) of a species is an important task, especially for species harvested from the wild. Despite its central importance to natural resource management, analytical methods used to delineate stocks are often, and increasingly, borrowed from superficially similar analytical tasks in human genetics, even though models specifically for stock identification have been previously developed. Unfortunately, the analytical tasks in resource management and human genetics are not identical: questions about humans are typically aimed at inferring ancestry (often referred to as 'admixture') rather than breeding stocks. In this article, we argue, and show through simulation experiments and an analysis of yellowfin tuna data, that ancestral analysis methods are not always appropriate for stock delineation. In this work, we advocate a variant of a previously introduced and simpler model that identifies stocks directly. We also highlight that the computational aspects of the analysis, irrespective of the model, are difficult. We introduce some alternative computational methods and quantitatively compare these methods to each other and to established methods. We also present a method for quantifying uncertainty in model parameters and in assignment probabilities. In doing so, we demonstrate that point estimates can be misleading. One of the computational strategies presented here, based on an expectation-maximisation algorithm with judiciously chosen starting values, is robust and has a modest computational cost. This article is protected by copyright. All rights reserved.
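    The computational strategy the abstract highlights, expectation-maximisation run from judiciously chosen starting values, can be illustrated with a minimal mixture-model sketch. This is not the authors' model: it fits a two-component one-dimensional Gaussian mixture with unit variances to invented data, runs EM from several starts, and keeps the solution with the best log-likelihood.

```python
# Minimal EM sketch: two-component 1-D Gaussian mixture, unit variances,
# multiple starting values, best log-likelihood kept. Data are synthetic.
import math

def em_two_gaussians(data, mu0, mu1, iters=200):
    """EM for a two-component mixture with unit variances (simplified)."""
    w = 0.5  # weight of component 0
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        resp = []
        for x in data:
            p0 = w * math.exp(-0.5 * (x - mu0) ** 2)
            p1 = (1 - w) * math.exp(-0.5 * (x - mu1) ** 2)
            resp.append(p1 / (p0 + p1))
        # M-step: update the weight and the two means
        r1 = sum(resp)
        w = 1 - r1 / len(data)
        mu0 = sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - r1)
        mu1 = sum(r * x for r, x in zip(resp, data)) / r1
    ll = sum(math.log(w * math.exp(-0.5 * (x - mu0) ** 2)
                      + (1 - w) * math.exp(-0.5 * (x - mu1) ** 2))
             for x in data)
    return ll, (mu0, mu1)

data = [-2.1, -1.9, -2.0, -1.8, 2.0, 1.9, 2.2, 2.1]
# Judiciously chosen starts (well-separated, data extremes) plus a poor one:
best = max(em_two_gaussians(data, s0, s1)
           for s0, s1 in [(-2.0, 2.0), (0.0, 0.1), (min(data), max(data))])
ll, (mu0, mu1) = best
```

Running from several starts and ranking by likelihood is what guards against EM stalling near a poor local optimum, which is the robustness property the abstract emphasizes.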

  7. Modeling and Diagnostic Software for Liquefying-Fuel Rockets

    NASA Technical Reports Server (NTRS)

    Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann

    2005-01-01

    A report presents a study of five modeling and diagnostic computer programs considered for use in an integrated vehicle health management (IVHM) system during testing of liquefying-fuel hybrid rocket engines in the Hybrid Combustion Facility (HCF) at NASA Ames Research Center. Three of the programs -- TEAMS, L2, and RODON -- are model-based reasoning (or diagnostic) programs. The other two programs -- ICS and IMS -- do not attempt to isolate the causes of failures but can be used for detecting faults. In the study, qualitative models (in TEAMS and L2) and quantitative models (in RODON) having varying scope and completeness were created. Each of the models captured the structure and behavior of the HCF as a physical system. It was noted that in the cases of the qualitative models, the temporal aspects of the behavior of the HCF and the abstraction of sensor data are handled outside of the models, and it is necessary to develop additional code for this purpose. A need for additional code was also noted in the case of the quantitative model, though the amount of development effort needed was found to be less than that for the qualitative models.

  8. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals from plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. 
Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning, and that there were clear species-dependent differences based upon physiological variance between rats and humans. Ongoing efforts are focused on extending this modeling strategy to an in vitro salivary acinar cell-based system that will be utilized to experimentally determine and computationally predict salivary gland uptake and clearance for a broad range of xenobiotics. Hence, it is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of both environmental and occupational exposure in human populations using saliva.
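    The sensitivity of partitioning to pKa and protein binding can be illustrated with classical pH-partition theory. The sketch below is a simplified, hypothetical variant in the spirit of (but not identical to) the modified Schmitt approach described above: only the unbound, un-ionized fraction of a weak base is assumed to diffuse, so the saliva:plasma ratio follows Henderson-Hasselbalch ion trapping scaled by the unbound plasma fraction. All parameter values are invented for illustration.

```python
# Illustrative pH-partition sketch for a monoprotic weak base (hypothetical
# parameters; a simplified stand-in for the modified Schmitt algorithm).

def unionized_fraction_base(pka, ph):
    """Fraction of a monoprotic weak base that is un-ionized at a given pH."""
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

def saliva_to_plasma_ratio(pka, fu_plasma, ph_plasma=7.4, ph_saliva=6.8):
    """Steady-state saliva:plasma concentration ratio (pH-partition theory)."""
    fi_p = unionized_fraction_base(pka, ph_plasma)
    fi_s = unionized_fraction_base(pka, ph_saliva)
    # Unbound, un-ionized concentrations equilibrate across the membrane:
    # fu * Cp * fi_p = Cs * fi_s  =>  Cs/Cp = fu * fi_p / fi_s
    return fu_plasma * fi_p / fi_s

# A weak base (hypothetical pKa 8.0) is ion-trapped in the more acidic
# saliva, but plasma protein binding (fu = 0.2) limits what can transfer.
ratio = saliva_to_plasma_ratio(pka=8.0, fu_plasma=0.2)
```

Varying `fu_plasma` or `pka` in this toy model shows exactly the sensitivity pattern the abstract reports: both parameters move the predicted ratio strongly.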

  9. A finite element model of myocardial infarction using a composite material approach.

    PubMed

    Haddad, Seyyed M H; Samani, Abbas

    2018-01-01

    Computational models are effective tools to study cardiac mechanics under normal and pathological conditions. They can be used to gain insight into the physiology of the heart under these conditions while they are adaptable to computer-assisted patient-specific clinical diagnosis and therapeutic procedures. Realistic cardiac mechanics models incorporate tissue active/passive response in conjunction with hyperelasticity and anisotropy. Conventional formulation of such models leads to mathematically complex problems usually solved by custom-developed non-linear finite element (FE) codes. With a few exceptions, such codes are not available to the research community. This article describes a computational cardiac mechanics model developed such that it can be implemented using off-the-shelf FE solvers while tissue pathologies can be introduced in the model in a straightforward manner. The model takes into account myocardial hyperelasticity, anisotropy, and active contraction forces. It follows a composite tissue modeling approach where the cardiac tissue is decomposed into two major parts: background and myofibers. The latter are modelled as rebars under initial stresses mimicking the contraction forces. The model was applied in silico to study the mechanics of the infarcted left ventricle (LV) of a canine. End-systolic strain components, ejection fraction, and stress distribution attained using this LV model were compared quantitatively and qualitatively to corresponding data obtained from measurements as well as to other corresponding LV mechanics models. This comparison showed very good agreement.

  10. Correlation of quantitative dual-energy computed tomography iodine maps and abdominal computed tomography perfusion measurements: are single-acquisition dual-energy computed tomography iodine maps more than a reduced-dose surrogate of conventional computed tomography perfusion?

    PubMed

    Stiller, Wolfram; Skornitzke, Stephan; Fritz, Franziska; Klauss, Miriam; Hansen, Jens; Pahn, Gregor; Grenacher, Lars; Kauczor, Hans-Ulrich

    2015-10-01

    Study objectives were the quantitative evaluation of whether conventional abdominal computed tomography (CT) perfusion measurements mathematically correlate with quantitative single-acquisition dual-energy CT (DECT) iodine concentration maps, the determination of the optimum time of acquisition for achieving maximum correlation, and the estimation of the potential for radiation exposure reduction when replacing conventional CT perfusion by single-acquisition DECT iodine concentration maps. Dual-energy CT perfusion sequences were dynamically acquired over 51 seconds (34 acquisitions every 1.5 seconds) in 24 patients with histologically verified pancreatic carcinoma using dual-source DECT at tube potentials of 80 kVp and 140 kVp. Using software developed in-house, perfusion maps were calculated from 80-kVp image series using the maximum slope model after deformable motion correction. In addition, quantitative iodine maps were calculated for each of the 34 DECT acquisitions per patient. Within a manual segmentation of the pancreas, voxel-by-voxel correlation between the perfusion map and each of the iodine maps was calculated for each patient to determine the optimum time of acquisition topt defined as the acquisition time of the iodine map with the highest correlation coefficient. Subsequently, regions of interest were placed inside the tumor and inside healthy pancreatic tissue, and correlation between mean perfusion values and mean iodine concentrations within these regions of interest at topt was calculated for the patient sample. The mean (SD) topt was 31.7 (5.4) seconds after the start of contrast agent injection. The mean (SD) perfusion values for healthy pancreatic and tumor tissues were 67.8 (26.7) mL per 100 mL/min and 43.7 (32.2) mL per 100 mL/min, respectively. At topt, the mean (SD) iodine concentrations were 2.07 (0.71) mg/mL in healthy pancreatic and 1.69 (0.98) mg/mL in tumor tissue, respectively. 
Overall, the correlation between perfusion values and iodine concentrations was high (0.77), with correlation of 0.89 in tumor and of 0.56 in healthy pancreatic tissue at topt. Comparing radiation exposure associated with a single DECT acquisition at topt (0.18 mSv) to that of an 80 kVp CT perfusion sequence (2.96 mSv) indicates that an average reduction of Deff by 94% could be achieved by replacing conventional CT perfusion with a single-acquisition DECT iodine concentration map. Quantitative iodine concentration maps obtained with DECT correlate well with conventional abdominal CT perfusion measurements, suggesting that quantitative iodine maps calculated from a single DECT acquisition at an organ-specific and patient-specific optimum time of acquisition might be able to replace conventional abdominal CT perfusion measurements if the time of acquisition is carefully calibrated. This could lead to large reductions of radiation exposure to the patients while offering quantitative perfusion data for diagnosis.
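    The 94% figure quoted above follows directly from the two effective-dose values in the abstract; the arithmetic can be checked in a couple of lines (values taken from the abstract, rounding assumed):

```python
# Relative reduction in effective dose when a single DECT acquisition at
# t_opt replaces the dynamic CT perfusion series (values from the abstract).
dose_perfusion_msv = 2.96    # 80 kVp dynamic CT perfusion sequence
dose_single_dect_msv = 0.18  # single DECT acquisition at t_opt
reduction = 1 - dose_single_dect_msv / dose_perfusion_msv  # ~0.94
```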

  11. Computational studies of novel chymase inhibitors against cardiovascular and allergic diseases: mechanism and inhibition.

    PubMed

    Arooj, Mahreen; Thangapandian, Sundarapandian; John, Shalini; Hwang, Swan; Park, Jong K; Lee, Keun W

    2012-12-01

    To provide a new idea for drug design, a computational investigation was performed on chymase and its novel 1,4-diazepane-2,5-dione inhibitors that explores the crucial molecular features contributing to binding specificity. Molecular docking studies of inhibitors within the active site of chymase were carried out to rationalize the inhibitory properties of these compounds and understand their inhibition mechanism. The density functional theory method was used to optimize molecular structures, with subsequent analysis of highest occupied molecular orbital, lowest unoccupied molecular orbital, and molecular electrostatic potential maps, which revealed that negative potentials near the 1,4-diazepane-2,5-dione ring are essential for effective binding of inhibitors at the active site of the enzyme. The Bayesian model, with a receiver operating characteristic (ROC) statistic of 0.82, also identified arylsulfonyl and aminocarbonyl as the molecular features favoring and disfavoring inhibition of chymase, respectively. Moreover, genetic function approximation was applied to construct 3D quantitative structure-activity relationship models. Two models (genetic function approximation model 1, r(2) = 0.812, and genetic function approximation model 2, r(2) = 0.783) performed better in terms of correlation coefficients and cross-validation analysis. In general, this study serves as an example of how the combined use of 2D/3D quantitative structure-activity relationship modeling techniques, molecular docking, frontier molecular orbital density fields (highest occupied and lowest unoccupied molecular orbitals), and molecular electrostatic potential analysis may be useful to gain insight into the binding mechanism between an enzyme and its inhibitors. © 2012 John Wiley & Sons A/S.

  12. Establishment of quantitative retention-activity model by optimized microemulsion liquid chromatography.

    PubMed

    Xu, Liyuan; Gao, Haoshi; Li, Liangxing; Li, Yinnong; Wang, Liuyun; Gao, Chongkai; Li, Ning

    2016-12-23

    The effective permeability coefficient is of theoretical and practical importance in evaluating the bioavailability of drug candidates. However, most methods currently used to measure this coefficient are expensive and time-consuming. In this paper, we address these problems by proposing a new measurement method based on microemulsion liquid chromatography. First, the parallel artificial membrane permeability assay (PAMPA) model was used to determine the effective permeability of drugs so that quantitative retention-activity relationships could be established and used to optimize the microemulsion liquid chromatography. The most effective microemulsion system used a mobile phase of 6.0% (w/w) Brij35, 6.6% (w/w) butanol, 0.8% (w/w) octanol, and 86.6% (w/w) phosphate buffer (pH 7.4). Next, support vector machines and back-propagation neural networks were employed to develop a quantitative retention-activity relationship model associated with the optimal microemulsion system and to improve the prediction ability. Finally, an adequate correlation between experimental and predicted values was computed to verify the performance of the optimal model. The results indicate that microemulsion liquid chromatography can serve as a possible alternative to the PAMPA method for high-throughput permeability determination and simulation of biological processes. Copyright © 2016. Published by Elsevier B.V.
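    The core idea of a quantitative retention-activity relationship, mapping chromatographic retention to measured permeability, can be sketched with ordinary least squares. This is a deliberately simplified stand-in: the paper's actual models (support vector machines, back-propagation networks) are nonlinear, and the retention and permeability values below are synthetic.

```python
# Minimal retention-activity sketch: fit log Pe (PAMPA permeability) as a
# linear function of log k (microemulsion LC retention). Synthetic values.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

log_k = [0.2, 0.5, 0.8, 1.1, 1.4]        # retention in the microemulsion system
log_pe = [-5.8, -5.3, -4.9, -4.4, -4.0]  # PAMPA effective permeability

a, b = fit_line(log_k, log_pe)
predicted = a * 1.0 + b  # predicted log Pe for a compound with log k = 1.0
```

Once such a relationship is calibrated, a single chromatographic run yields a permeability estimate without running the assay, which is the throughput advantage the abstract claims.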

  13. Computational model, method, and system for kinetically-tailoring multi-drug chemotherapy for individuals

    DOEpatents

    Gardner, Shea Nicole

    2007-10-23

    A method and system for tailoring treatment regimens to individual patients with diseased cells exhibiting evolution of resistance to such treatments. A mathematical model is provided which models rates of population change of proliferating and quiescent diseased cells using cell kinetics and evolution of resistance of the diseased cells, and pharmacokinetic and pharmacodynamic models. Cell kinetic parameters are obtained from an individual patient and applied to the mathematical model to solve for a plurality of treatment regimens, each having a quantitative efficacy value associated therewith. A treatment regimen may then be selected from the plurality of treatment regimens based on the efficacy value.

  14. Nondestructive Evaluation for Aerospace Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara; Cramer, Elliott; Perey, Daniel

    2015-01-01

    Nondestructive evaluation (NDE) techniques are important for enabling NASA's missions in space exploration and aeronautics. The expanded and continued use of composite materials for aerospace components and vehicles leads to a need for advanced NDE techniques capable of quantitatively characterizing damage in composites. Quantitative damage detection techniques help to ensure safety, reliability and durability of space and aeronautic vehicles. This presentation will give a broad outline of NASA's range of technical work and an overview of the NDE research performed in the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center. The presentation will focus on ongoing research in the development of NDE techniques for composite materials and structures, including development of automated data processing tools to turn NDE data into quantitative location and sizing results. Composites focused NDE research in the areas of ultrasonics, thermography, X-ray computed tomography, and NDE modeling will be discussed.

  15. Statistical molecular design of balanced compound libraries for QSAR modeling.

    PubMed

    Linusson, A; Elofsson, M; Andersson, I E; Dahlgren, M K

    2010-01-01

    A fundamental step in preclinical drug development is the computation of quantitative structure-activity relationship (QSAR) models, i.e. models that link chemical features of compounds with activities towards a target macromolecule associated with the initiation or progression of a disease. QSAR models are computed by combining information on the physicochemical and structural features of a library of congeneric compounds, typically assembled from two or more building blocks, and biological data from one or more in vitro assays. Since the models provide information on features affecting the compounds' biological activity, they can be used as guides for further optimization. However, in order for a QSAR model to be relevant to the targeted disease, and drug development in general, the compound library used must contain molecules with balanced variation of the features spanning the chemical space believed to be important for interaction with the biological target. In addition, the assays used must be robust and deliver high quality data that are directly related to the function of the biological target and the associated disease state. In this review, we discuss and exemplify the concept of statistical molecular design (SMD) in the selection of building blocks and final synthetic targets (i.e. compounds to synthesize) to generate information-rich, balanced libraries for biological testing and computation of QSAR models.

  16. Invasion emerges from cancer cell adaptation to competitive microenvironments: Quantitative predictions from multiscale mathematical models

    PubMed Central

    Rejniak, Katarzyna A.; Gerlee, Philip

    2013-01-01

    Summary In this review we summarize our recent efforts using mathematical modeling and computation to simulate cancer invasion, with a special emphasis on the tumor microenvironment. We consider cancer progression as a complex multiscale process and approach it with three single-cell based mathematical models that examine the interactions between tumor microenvironment and cancer cells at several scales. The models exploit distinct mathematical and computational techniques, yet they share core elements and can be compared and/or related to each other. The overall aim of using mathematical models is to uncover the fundamental mechanisms that lend cancer progression its direction towards invasion and metastasis. The models effectively simulate various modes of cancer cell adaptation to the microenvironment in a growing tumor. All three point to a general mechanism underlying cancer invasion: competition for adaptation between distinct cancer cell phenotypes, driven by a tumor microenvironment with scarce resources. These theoretical predictions pose an intriguing experimental challenge: test the hypothesis that invasion is an emergent property of cancer cell populations adapting to selective microenvironment pressure, rather than culmination of cancer progression producing cells with the “invasive phenotype”. In broader terms, we propose that fundamental insights into cancer can be achieved by experimentation interacting with theoretical frameworks provided by computational and mathematical modeling. PMID:18524624

  17. Comprehensive, Quantitative Risk Assessment of CO2 Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO2 capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. 
Comprehensive, quantitative risk assessments were conducted on three (3) sites using the QFMEA model: (1) SACROC Northern Platform CO2-EOR Site in the Permian Basin, Scurry County, TX, (2) Pump Canyon CO2-ECBM Site in the San Juan Basin, San Juan County, NM, and (3) Farnsworth Unit CO2-EOR Site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
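The ranking step described in this abstract follows the classic FMEA pattern of multiplying ordinal risk scores. A minimal sketch, assuming illustrative 1-10 scales and hypothetical failure modes (the QFMEA model's actual scales, fatality weighting, and cost factors are not reproduced here):

```python
def risk_priority(probability, severity, detectability):
    """FMEA-style risk priority number (RPN): the product of ordinal
    scores (e.g. 1-10) for probability of failure, severity of failure,
    and difficulty of early failure detection."""
    return probability * severity * detectability

# Hypothetical failure modes and scores, for illustration only.
modes = {
    "wellbore leakage": risk_priority(4, 9, 6),
    "caprock fracture": risk_priority(2, 10, 8),
    "pipeline rupture": risk_priority(3, 7, 3),
}
ranked = sorted(modes, key=modes.get, reverse=True)  # highest priority first
```

The spreadsheet implementation described above would compute an analogous score per row and sort on it to answer "Which risks should be given highest priority for resolution?".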

  18. Computer Aided Enzyme Design and Catalytic Concepts

    PubMed Central

    Frushicheva, Maria P.; Mills, Matthew J. L.; Schopf, Patrick; Singh, Manoj K.; Warshel, Arieh

    2014-01-01

    Gaining a deeper understanding of enzyme catalysis is of great practical and fundamental importance. Over the years it has become clear that despite advances made in experimental mutational studies, a quantitative understanding of enzyme catalysis will not be possible without the use of computer modeling approaches. While we believe that electrostatic preorganization is by far the most important catalytic factor, convincing the wider scientific community of this may require the demonstration of effective rational enzyme design. Here we make the point that the main current advances in enzyme design are basically advances in directed evolution and that computer aided enzyme design must involve approaches that can reproduce catalysis in well-defined test cases. Such an approach is provided by the empirical valence bond method. PMID:24814389

  19. Combustion Processes in Hybrid Rocket Engines

    NASA Technical Reports Server (NTRS)

    Venkateswaran, S.; Merkle, C. L.

    1996-01-01

    In recent years, there has been a resurgence of interest in the development of hybrid rocket engines for advanced launch vehicle applications. Hybrid propulsion systems use a solid fuel such as hydroxyl-terminated polybutadiene (HTPB) along with a gaseous/liquid oxidizer. The performance of hybrid combustors depends on the convective and radiative heat fluxes to the fuel surface, the rate of pyrolysis in the solid phase, and the turbulent combustion processes in the gaseous phases. These processes in combination specify the regression rates of the fuel surface and thereby the utilization efficiency of the fuel. In this paper, we employ computational fluid dynamics (CFD) techniques in order to gain a quantitative understanding of the physical trends in hybrid rocket combustors. The computational modeling is tailored to ongoing experiments at Penn State that employ a two-dimensional slab burner configuration. The coordinated computational/experimental effort enables model validation while providing an understanding of the experimental observations. Computations to date have included the full-length geometry with and without the aft nozzle section as well as shorter-length domains for extensive parametric characterization. HTPB is used as the fuel, with 1,3-butadiene taken as the gaseous product of the pyrolysis. Pure gaseous oxygen is taken as the oxidizer. The fuel regression rate is specified using an Arrhenius rate expression, while the fuel surface temperature is given by an energy balance involving gas-phase convection and radiation as well as thermal conduction in the solid phase. For the gas-phase combustion, a two-step global reaction is used. The standard kappa-epsilon model is used for turbulence closure. Radiation is presently treated using a simple diffusion approximation, which is valid for large optical path lengths, representative of radiation from soot particles. 
Computational results are obtained to determine the trends in the fuel burning or regression rates as a function of the head-end oxidizer mass flux, G=rho(e)U(e), and the chamber pressure. Furthermore, computation of the full slab burner configuration has also been obtained for various stages of the burn. Comparisons with available experimental data from small scale tests conducted by General Dynamics-Thiokol-Rocketdyne suggest reasonable agreement in the predicted regression rates. Future work will include: (1) a model for soot generation in the flame for more quantitative radiative transfer modelling, (2) a parametric study of combustion efficiency, and (3) transient calculations to help determine the possible mechanisms responsible for combustion instability in hybrid rocket motors.
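The Arrhenius pyrolysis law named above has the standard form rdot = A exp(-E_a/(R T_s)). A minimal sketch with placeholder coefficients (the pre-exponential factor and activation energy below are illustrative assumptions, not fitted HTPB kinetics):

```python
import math

def regression_rate(T_s, A=3.5e3, E_a=2.0e5, R=8.314):
    """Arrhenius-type surface pyrolysis law: rdot = A * exp(-E_a/(R*T_s)).
    A (pre-exponential factor) and E_a (activation energy, J/mol) are
    illustrative placeholders, not measured HTPB kinetics."""
    return A * math.exp(-E_a / (R * T_s))

# The regression rate rises steeply with surface temperature T_s (K),
# which is why the surface energy balance closes the coupled problem.
r_800, r_1000 = regression_rate(800.0), regression_rate(1000.0)
```

In the CFD model described above, T_s is not prescribed but emerges from the energy balance between gas-phase heating and solid-phase conduction.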

  20. Electronic transport in VO2—Experimentally calibrated Boltzmann transport modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinaci, Alper; Kado, Motohisa; Rosenmann, Daniel

    2015-12-28

    Materials that undergo metal-insulator transitions (MITs) are under intense study because the transition is scientifically fascinating and technologically promising for various applications. Among these materials, VO2 has served as a prototype due to its favorable transition temperature. While the physical underpinnings of the transition have been heavily investigated experimentally and computationally, quantitative modeling of electronic transport in the two phases has yet to be undertaken. In this work, we establish a density-functional-theory (DFT)-based approach to model electronic transport properties in VO2 in the semiconducting and metallic regimes, focusing on band transport using the Boltzmann transport equations. We synthesized high quality VO2 films and measured the transport quantities across the transition, in order to calibrate the free parameters in the model. We find that the experimental calibration of the Hubbard correction term can efficiently and adequately model the metallic and semiconducting phases, allowing for further computational design of MIT materials for desirable transport properties.

  1. Development of a General Form CO2 and Brine Flux Input Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansoor, K.; Sun, Y.; Carroll, S.

    2014-08-01

    The National Risk Assessment Partnership (NRAP) project is developing a science-based toolset for the quantitative analysis of the potential risks associated with changes in groundwater chemistry from CO2 injection. In order to address uncertainty probabilistically, NRAP is developing efficient, reduced-order models (ROMs) as part of its approach. These ROMs are built from detailed, physics-based process models to provide confidence in the predictions over a range of conditions. The ROMs are designed to reproduce accurately the predictions from the computationally intensive process models at a fraction of the computational time, thereby allowing the utilization of Monte Carlo methods to probe variability in key parameters. This report presents the procedures used to develop a generalized model for CO2 and brine leakage fluxes based on the output of a numerical wellbore simulation. The resulting generalized parameters and ranges reported here will be used for the development of third-generation groundwater ROMs.
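The ROM-plus-Monte-Carlo workflow described above can be sketched in miniature: tabulate an expensive process model once, replace it with a cheap interpolant, and sample the interpolant. Everything below is illustrative (the closed-form "process model" and permeability range are assumptions, not NRAP's wellbore simulator):

```python
import random

def process_model(k):
    """Stand-in for an expensive physics-based simulation: leakage flux
    as a function of permeability k (illustrative closed form)."""
    return 1e-3 * k ** 0.5

# Build the reduced-order model: tabulate the process model once...
grid = [(k, process_model(k)) for k in range(1, 101)]

def rom(k):
    """...then linearly interpolate.  The ROM is cheap enough to call
    inside a Monte Carlo loop."""
    lo = max(p for p in grid if p[0] <= k)
    hi = min(p for p in grid if p[0] >= k)
    if lo[0] == hi[0]:
        return lo[1]
    w = (k - lo[0]) / (hi[0] - lo[0])
    return lo[1] + w * (hi[1] - lo[1])

# Monte Carlo over the uncertain input using only the ROM.
random.seed(0)
samples = [rom(random.uniform(1, 100)) for _ in range(10_000)]
mean_flux = sum(samples) / len(samples)
```

The real ROMs are of course multidimensional and fitted to simulator output, but the division of labor (expensive model offline, cheap surrogate inside the sampling loop) is the same.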

  2. Computational modeling of hypertensive growth in the human carotid artery

    NASA Astrophysics Data System (ADS)

    Sáez, Pablo; Peña, Estefania; Martínez, Miguel Angel; Kuhl, Ellen

    2014-06-01

    Arterial hypertension is a chronic medical condition associated with an elevated blood pressure. Chronic arterial hypertension initiates a series of events, which are known to collectively initiate arterial wall thickening. However, the correlation between macrostructural mechanical loading, microstructural cellular changes, and macrostructural adaptation remains unclear. Here, we present a microstructurally motivated computational model for chronic arterial hypertension through smooth muscle cell growth. To model growth, we adopt a classical concept based on the multiplicative decomposition of the deformation gradient into an elastic part and a growth part. Motivated by clinical observations, we assume that the driving force for growth is the stretch sensed by the smooth muscle cells. We embed our model into a finite element framework, where growth is stored locally as an internal variable. First, to demonstrate the features of our model, we investigate the effects of hypertensive growth in a real human carotid artery. Our results agree nicely with experimental data reported in the literature both qualitatively and quantitatively.
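The growth kinematics named in this abstract can be written compactly. The decomposition is the classical one; the specific stretch-driven evolution law below is an illustrative choice (a Macaulay bracket activating growth above a critical stretch), not necessarily the paper's exact form:

```latex
% Multiplicative decomposition of the deformation gradient
F = F^{e}\,F^{g}, \qquad J = \det F = J^{e}\,J^{g}
% Illustrative stretch-driven evolution of the growth multiplier:
% growth proceeds once the smooth muscle cell stretch \lambda exceeds
% a critical baseline \lambda^{\mathrm{crit}}
\dot{\vartheta} = k(\vartheta)\,\bigl\langle \lambda - \lambda^{\mathrm{crit}} \bigr\rangle
```

Storing \vartheta locally as an internal variable at each integration point is what embeds this law into the finite element framework mentioned above.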

  3. Continuum and discrete approach in modeling biofilm development and structure: a review.

    PubMed

    Mattei, M R; Frunzo, L; D'Acunto, B; Pechaud, Y; Pirozzi, F; Esposito, G

    2018-03-01

    The scientific community has recognized that almost 99% of the microbial life on earth is represented by biofilms. Considering the impacts of their sessile lifestyle on both natural and human activities, extensive experimental activity has been carried out to understand how biofilms grow and interact with the environment. Many mathematical models have also been developed to simulate and elucidate the main processes characterizing the biofilm growth. Two main mathematical approaches for biomass representation can be distinguished: continuum and discrete. This review is aimed at exploring the main characteristics of each approach. Continuum models can simulate the biofilm processes in a quantitative and deterministic way. However, they require a multidimensional formulation to take into account the biofilm spatial heterogeneity, which makes the models quite complicated, requiring significant computational effort. Discrete models are more recent and can represent the typical multidimensional structural heterogeneity of biofilm reflecting the experimental expectations, but they generate computational results including elements of randomness and introduce stochastic effects into the solutions.

  4. Numerical analysis of base flowfield at high altitude for a four-engine clustered nozzle configuration

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    1993-01-01

    The objective of this study is to benchmark a four-engine clustered nozzle base flowfield with a computational fluid dynamics (CFD) model. The CFD model is a pressure based, viscous flow formulation. An adaptive upwind scheme is employed for the spatial discretization. The upwind scheme is based on second and fourth order central differencing with adaptive artificial dissipation. Qualitative base flow features such as the reverse jet, wall jet, recompression shock, and plume-plume impingement have been captured. The computed quantitative flow properties such as the radial base pressure distribution, model centerline Mach number and static pressure variation, and base pressure characteristic curve agreed reasonably well with those of the measurement. Parametric study on the effect of grid resolution, turbulence model, inlet boundary condition and difference scheme on convective terms has been performed. The results showed that grid resolution and turbulence model are two primary factors that influence the accuracy of the base flowfield prediction.

  5. Atomistic and coarse-grained computer simulations of raft-like lipid mixtures.

    PubMed

    Pandit, Sagar A; Scott, H Larry

    2007-01-01

    Computer modeling can provide insights into the existence, structure, size, and thermodynamic stability of localized raft-like regions in membranes. However, the challenges in the construction and simulation of accurate models of heterogeneous membranes are great. The primary obstacle in modeling the lateral organization within a membrane is the relatively slow lateral diffusion rate for lipid molecules. Microsecond or longer time-scales are needed to fully model the formation and stability of a raft in a membrane. Atomistic simulations currently are not able to reach this scale, but they do provide quantitative information on the intermolecular forces and correlations that are involved in lateral organization. In this chapter, the steps needed to carry out and analyze atomistic simulations of hydrated lipid bilayers having heterogeneous composition are outlined. It is then shown how the data from a molecular dynamics simulation can be used to construct a coarse-grained model for the heterogeneous bilayer that can predict the lateral organization and stability of rafts at up to millisecond time-scales.

  6. Computer simulation of a pilot in V/STOL aircraft control loops

    NASA Technical Reports Server (NTRS)

    Vogt, William G.; Mickle, Marlin H.; Zipf, Mark E.; Kucuk, Senol

    1989-01-01

    The objective was to develop a computerized adaptive pilot model for the computer model of the research aircraft, the Harrier II AV-8B V/STOL, with special emphasis on propulsion control. In fact, two versions of the adaptive pilot are given. The first, simply called the Adaptive Control Model (ACM) of a pilot, includes a parameter estimation algorithm for the parameters of the aircraft and an adaptation scheme based on the root locus of the poles of the pilot-controlled aircraft. The second, called the Optimal Control Model of the pilot (OCM), includes an adaptation algorithm and an optimal control algorithm. These computer simulations were developed as a part of the ongoing research program in pilot model simulation supported by NASA Lewis from April 1, 1985 to August 30, 1986 under NASA Grant NAG 3-606 and from September 1, 1986 through November 30, 1988 under NASA Grant NAG 3-729. Once installed, these pilot models permitted the computer simulation of the pilot model to close all of the control loops normally closed by a pilot actually manipulating the control variables. The current version has permitted a baseline comparison of various qualitative and quantitative performance indices for propulsion control, the control loops and the workload on the pilot. Actual data furnished by NASA for an aircraft flown by a human pilot were compared to the outputs furnished by the computerized pilot and found to be favorable.

  7. Single-exposure quantitative phase imaging in color-coded LED microscopy.

    PubMed

    Lee, Wonchan; Jung, Daeseong; Ryu, Suho; Joo, Chulmin

    2017-04-03

    We demonstrate single-shot quantitative phase imaging (QPI) in a platform of color-coded LED microscopy (cLEDscope). The light source in a conventional microscope is replaced by a circular LED pattern that is trisected into subregions with equal area, assigned to red, green, and blue colors. Image acquisition with a color image sensor and subsequent computation based on weak object transfer functions allow for the QPI of a transparent specimen. We also provide a correction method for color-leakage, which may be encountered in implementing our method with consumer-grade LEDs and image sensors. Most commercially available LEDs and image sensors do not provide spectrally isolated emissions and pixel responses, generating significant error in phase estimation in our method. We describe the correction scheme for this color-leakage issue, and demonstrate improved phase measurement accuracy. The computational model and single-exposure QPI capability of our method are presented by showing images of calibrated phase samples and cellular specimens.

  8. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  9. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose: We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures: The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results: Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions: We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  10. Generalized likelihood ratios for quantitative diagnostic test scores.

    PubMed

    Tandberg, D; Deely, J J; O'Malley, A J

    1997-11-01

    The reduction of quantitative diagnostic test scores to the dichotomous case is a wasteful and unnecessary simplification in the era of high-speed computing. Physicians could make better use of the information embedded in quantitative test results if modern generalized curve estimation techniques were applied to the likelihood functions of Bayes' theorem. Hand calculations could be completely avoided and computed graphical summaries provided instead. Graphs showing posttest probability of disease as a function of pretest probability with confidence intervals (POD plots) would enhance acceptance of these techniques if they were immediately available at the computer terminal when test results were retrieved. Such constructs would also provide immediate feedback to physicians when a valueless test had been ordered.
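The Bayes-theorem arithmetic this abstract builds on is easiest in odds form: posttest odds = pretest odds × likelihood ratio. A minimal sketch; the logistic score-to-LR mapping and its parameters are hypothetical stand-ins for the generalized curve estimates the authors propose:

```python
import math

def posttest_probability(pretest_p, lr):
    """Bayes' theorem in odds form: posttest odds = pretest odds * LR."""
    odds = pretest_p / (1.0 - pretest_p)
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

def score_to_lr(score, midpoint=50.0, slope=0.1):
    """Map a continuous test score to a likelihood ratio.  This toy
    exponential mapping (hypothetical parameters) replaces the
    dichotomous cutoff with a smooth curve."""
    return math.exp(slope * (score - midpoint))

# A score of 70 raises a 30% pretest probability substantially;
# a score at the midpoint (LR = 1) leaves it unchanged.
p_high = posttest_probability(0.30, score_to_lr(70.0))
p_same = posttest_probability(0.30, score_to_lr(50.0))
```

A POD plot as described above would simply evaluate `posttest_probability` over a grid of pretest probabilities for the observed score, with confidence bands around the estimated LR curve.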

  11. BioModels Database: An enhanced, curated and annotated resource for published quantitative kinetic models

    PubMed Central

    2010-01-01

    Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely-accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. 
It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. PMID:20587024

  12. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    PubMed

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null hypothesis, which avoids the costly computation of matrix inversions. Additional tricks are a rearrangement of the iteration order through the different "omics" layers and the implementation of the algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/ .
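The correlation shortcut mentioned above rests on a standard identity: for a simple regression, the slope's t statistic can be computed from the correlation coefficient alone, t = r·sqrt(n−2)/sqrt(1−r²), with no design matrix to build or invert. A simplified sketch (pulver additionally residualizes out the main effects before correlating the interaction regressor; that step is omitted here):

```python
import math
import random

def pearson_r(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_from_r(r, n):
    """t statistic of a simple-regression slope, from the correlation
    alone: t = r * sqrt(n-2) / sqrt(1 - r^2)."""
    return r * math.sqrt(n - 2) / math.sqrt(1.0 - r * r)

# Toy screen: correlate the product x*z (interaction regressor) with y.
random.seed(1)
x = [random.gauss(0.0, 1.0) for _ in range(200)]
z = [random.gauss(0.0, 1.0) for _ in range(200)]
y = [0.3 * xi * zi + random.gauss(0.0, 1.0) for xi, zi in zip(x, z)]
xz = [xi * zi for xi, zi in zip(x, z)]
t_inter = t_from_r(pearson_r(xz, y), len(y))
```

Because each candidate pair reduces to one correlation, millions of models can be screened with simple vector arithmetic, which is what makes the C++ inner loop so fast.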

  13. Assessing the impact of the Lebanese National Polio Immunization Campaign using a population-based computational model.

    PubMed

    Alawieh, Ali; Sabra, Zahraa; Langley, E Farris; Bizri, Abdul Rahman; Hamadeh, Randa; Zaraket, Fadi A

    2017-11-25

    After the re-introduction of poliovirus to Syria in 2013, Lebanon was considered at high transmission risk due to its proximity to Syria and the high number of Syrian refugees. However, after a large-scale national immunization initiative, Lebanon was able to prevent a potential outbreak of polio among nationals and refugees. In this work, we used a computational individual-simulation model to assess the risk of poliovirus threat to Lebanon before and after the immunization campaign, and to quantitatively assess the healthcare impact of the campaign and the required standards that need to be maintained nationally to prevent a future outbreak. Acute poliomyelitis surveillance in Lebanon, along with the design and coverage rate of the recent national polio immunization campaign, was reviewed from the records of the Lebanese Ministry of Public Health. Lebanese population demographics, including Syrian and Palestinian refugees, were reviewed to design individual-based models that predict the consequences of polio spread to Lebanon and evaluate the outcome of immunization campaigns. The model takes into account geographic, demographic and health-related features. Our simulations confirmed the high risk of polio outbreaks in Lebanon within 10 days of case introduction prior to the immunization campaign, and showed that the current immunization campaign significantly reduced the speed of the infection in the event poliomyelitis cases enter the country. A minimum of 90% national immunization coverage was found to be required to prevent exponential propagation of potential transmission. Both surveillance and immunization efforts should be maintained at high standards in Lebanon and other countries in the area to detect and limit any potential outbreak. The use of computational population simulation models can provide a quantitative approach to assess the impact of immunization campaigns and the burden of infectious diseases even in the context of population migration.
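The 90% coverage finding can be put next to the textbook herd-immunity arithmetic: transmission dies out when the effective reproduction number R0·(1 − coverage·efficacy) drops below 1. A minimal sketch; the R0 value of 6 is an illustrative assumption for poliovirus, not the study's calibrated parameter:

```python
def critical_coverage(r0, vaccine_efficacy=1.0):
    """Classic herd-immunity threshold: the immune fraction needed to
    push the effective reproduction number below 1."""
    return (1.0 - 1.0 / r0) / vaccine_efficacy

def effective_r(r0, coverage, vaccine_efficacy=1.0):
    """Effective reproduction number under partial immunization."""
    return r0 * (1.0 - coverage * vaccine_efficacy)

# With an assumed R0 of 6, ~83% coverage is the bare threshold; a 90%
# target leaves a margin for imperfect uptake and vaccine efficacy.
threshold = critical_coverage(6.0)
r_at_90 = effective_r(6.0, 0.90)
```

The individual-based model in the study refines this mean-field estimate with geographic and demographic structure, which is why an explicit margin above the naive threshold matters.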

  14. Computational Study of Thrombus Formation and Clotting Factor Effects under Venous Flow Conditions

    DTIC Science & Technology

    2016-04-26

    Telemedicine and Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland. ABSTRACT: A comprehensive... experimental study. The model allowed us to identify the distinct patterns characterizing the spatial distributions of thrombin, platelets, and fibrin...time, elevated fibrinogen levels may contribute to the development of thrombosis (4,6,12). Quantitative knowledge about the interactions between fibrin

  15. Studies of Peptide-Mineral Interactions and Biosilicification

    DTIC Science & Technology

    2010-07-16

    His). The effect of zinc oxide-binding peptides (ZnO-BPs) on the morphology and formation of ZnO were studied using G-12 (GLHVMHKVAPPR) and EM-12...interactions with silica and zinc oxide. Detailed quantitative experimental studies together with molecular modeling studies have shown that G12 (GLHVMHKVAPPR...studies of a primitive 15. SUBJECT TERMS: Peptides, zinc oxide, silica, silver, peptide-mineral interactions, computational chemistry, molecular

  16. Biomacromolecular quantitative structure-activity relationship (BioQSAR): a proof-of-concept study on the modeling, prediction and interpretation of protein-protein binding affinity.

    PubMed

    Zhou, Peng; Wang, Congcong; Tian, Feifei; Ren, Yanrong; Yang, Chao; Huang, Jian

    2013-01-01

    Quantitative structure-activity relationship (QSAR), a regression modeling methodology that quantitatively establishes statistical correlations between structural features and apparent behavior for a series of congeneric molecules, has been widely used to evaluate the activity, toxicity and properties of various small-molecule compounds such as drugs, toxicants and surfactants. However, it is surprising that such a useful technique has seen only very limited application to biomacromolecules, even though solved atomic-resolution 3D structures of proteins, nucleic acids and their complexes have accumulated rapidly over the past decades. Here, we present a proof-of-concept paradigm for the modeling, prediction and interpretation of the binding affinity of 144 sequence-nonredundant, structure-available and affinity-known protein complexes (Kastritis et al. Protein Sci 20:482-491, 2011) using a biomacromolecular QSAR (BioQSAR) scheme. We demonstrate that the modeling performance and predictive power of BioQSAR are comparable to or even better than those of traditional knowledge-based strategies, mechanism-type methods and empirical scoring algorithms, while BioQSAR possesses certain additional features compared to the traditional methods, such as adaptability, interpretability, deep-validation and high-efficiency. The BioQSAR scheme could be readily modified to infer the biological behavior and functions of other biomacromolecules, if their X-ray crystal structures, NMR conformation assemblies or computationally modeled structures are available.

  17. Emerging systems biology approaches in nanotoxicology: Towards a mechanism-based understanding of nanomaterial hazard and risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costa, Pedro M.; Fadeel, Bengt, E-mail: Bengt.Fade

    Engineered nanomaterials are being developed for a variety of technological applications. However, the increasing use of nanomaterials in society has led to concerns about their potential adverse effects on human health and the environment. During the first decade of nanotoxicological research, the realization has emerged that effective risk assessment of the multitudes of new nanomaterials would benefit from a comprehensive understanding of their toxicological mechanisms, which is difficult to achieve with traditional, low-throughput, single end-point oriented approaches. Therefore, systems biology approaches are being progressively applied within the nano(eco)toxicological sciences. This novel paradigm implies that the study of biological systems should be integrative, resulting in quantitative and predictive models of nanomaterial behaviour in a biological system. To this end, global ‘omics’ approaches with which to assess changes in genes, proteins, metabolites, etc. are deployed, allowing for computational modelling of the biological effects of nanomaterials. Here, we highlight omics and systems biology studies in nanotoxicology, aiming towards the implementation of a systems nanotoxicology and mechanism-based risk assessment of nanomaterials. - Highlights: • Systems nanotoxicology is a multi-disciplinary approach to quantitative modelling. • Transcriptomics, proteomics and metabolomics remain the most common methods. • Global “omics” techniques should be coupled to computational modelling approaches. • The discovery of nano-specific toxicity pathways and biomarkers is a prioritized goal. • Overall, experimental nanosafety research must strive for reproducibility and relevance.

  18. Sensitivity analysis of Repast computational ecology models with R/Repast.

    PubMed

    Prestes García, Antonio; Rodríguez-Patón, Alfonso

    2016-12-01

    Computational ecology is an emerging interdisciplinary discipline founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, individual-based modeling is particularly well suited for capturing the complex temporal and spatial dynamics, as well as the nonlinearities, arising in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights into the local mechanisms generating some observed global dynamics. Of course, no conclusions about model results can be taken seriously if they are based on a single model execution or are not analyzed carefully. Therefore, a sound methodology should always be used to underpin the interpretation of model results. Sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty on the simulation output, and it should be a mandatory part of every work based on an in silico experimental setup. In this article, we present R/Repast, a GNU R package for running and analyzing Repast Simphony models, accompanied by two worked examples of how to perform global sensitivity analysis and how to interpret the results.
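
    The kind of global sensitivity analysis such packages automate can be illustrated with a first-order Sobol index estimate. This sketch substitutes a trivial analytic function for a Repast Simphony simulation; the pick-freeze estimator and sample sizes are illustrative assumptions, not the package's actual implementation.

```python
import numpy as np

def model(x1, x2):
    # Stand-in "simulation": output variance = 4*Var(x1) + Var(x2),
    # so the true first-order indices are S1 = 0.8 and S2 = 0.2.
    return 2.0 * x1 + x2

rng = np.random.default_rng(1)
n = 200_000
A = rng.uniform(size=(n, 2))
B = rng.uniform(size=(n, 2))
yA = model(A[:, 0], A[:, 1])

def first_order(i):
    # "Pick-freeze": copy B but take column i from A, so the two runs
    # share only input i; the covariance then isolates its contribution.
    C = B.copy()
    C[:, i] = A[:, i]
    yC = model(C[:, 0], C[:, 1])
    return (np.mean(yA * yC) - np.mean(yA) * np.mean(yC)) / np.var(yA)

print(round(first_order(0), 2), round(first_order(1), 2))
```

    With an agent-based model, each call to `model` would be a full (and expensive) simulation run, which is why sound experimental design around such estimators matters.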

  19. Quantitative Assessment of Cervical Vertebral Maturation Using Cone Beam Computed Tomography in Korean Girls

    PubMed Central

    Byun, Bo-Ram; Kim, Yong-Il; Maki, Koutaro; Son, Woo-Sung

    2015-01-01

    This study aimed to examine the correlation between skeletal maturation status and parameters from the odontoid process/body of the second cervical vertebra and the bodies of the third and fourth cervical vertebrae, and to build multiple regression models able to estimate skeletal maturation status in Korean girls. Hand-wrist radiographs and cone beam computed tomography (CBCT) images were obtained from 74 Korean girls (6–18 years of age). CBCT-generated cervical vertebral maturation (CVM) was used to demarcate the odontoid process and the body of the second cervical vertebra, based on the dentocentral synchondrosis. Correlation coefficient analysis and multiple linear regression analysis were used for each parameter of the cervical vertebrae (P < 0.05). Forty-seven of 64 parameters from CBCT-generated CVM (independent variables) exhibited statistically significant correlations (P < 0.05). The multiple regression model with the greatest R² had six parameters (PH2/W2, UW2/W2, (OH+AH2)/LW2, UW3/LW3, D3, and H4/W4) as independent variables, each with a variance inflation factor (VIF) of <2. CBCT-generated CVM was able to include parameters from both the second cervical vertebral body and the odontoid process in the multiple regression models. This suggests that quantitative analysis might be used to estimate skeletal maturation status. PMID:25878721
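
    The collinearity screen used above can be reproduced in outline: the variance inflation factor of predictor j is VIF_j = 1/(1 - R_j²), where R_j² comes from regressing predictor j on the remaining predictors. The data below are synthetic; the vertebral parameters themselves are not used.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the design matrix X."""
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.delete(X, j, axis=1)
        Z1 = np.column_stack([np.ones(len(Z)), Z])   # add intercept
        coef, *_ = np.linalg.lstsq(Z1, y, rcond=None)
        resid = y - Z1 @ coef
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(2)
a = rng.normal(size=100)
b = rng.normal(size=100)
# Column 2 is nearly a copy of column 0, so both get a large VIF.
X = np.column_stack([a, b, a + 0.1 * rng.normal(size=100)])
vifs = vif(X)
print([round(v, 1) for v in vifs])
```

    Requiring VIF < 2, as in the study, rejects models whose predictors largely duplicate each other's information.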

  20. Skeletal adaptations associated with pre-pubertal gymnastics participation as determined by DXA and pQCT: a systematic review and meta-analysis.

    PubMed

    Burt, Lauren A; Greene, David A; Ducher, Gaele; Naughton, Geraldine A

    2013-05-01

    Participation in gymnastics prior to puberty offers an intriguing and unique model, particularly in girls: both the upper and lower limbs are exposed to high mechanical loading through year-long intensive training programs initiated at a young age. Studying this model and the associated changes in musculoskeletal health during growth is an area of specific interest. Previous reviews of gymnastics participation and bone health have been broad and not limited to a particular maturation period, such as pre-puberty. Objective: to determine the difference in skeletal health between pre-pubertal girls participating in gymnastics and non-gymnasts. Design: meta-analysis. Following a systematic search, 17 studies were included in this meta-analysis. All studies used dual-energy X-ray absorptiometry to assess bone mineral density and bone mineral content; in addition, two studies included peripheral quantitative computed tomography. Using a random effects model, gymnasts were found to have greater bone properties than non-gymnasts. The largest difference in bone health between gymnasts and non-gymnasts was observed in peripheral quantitative computed tomography-derived volumetric bone mineral density at the distal radius (d=1.06). Participation in gymnastics during pre-pubertal growth was associated with skeletal health benefits, particularly to the upper body. Copyright © 2012 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  1. Safe uses of Hill's model: an exact comparison with the Adair-Klotz model

    PubMed Central

    2011-01-01

    Background The Hill function and the related Hill model are used frequently to study processes in the living cell. There are very few studies investigating the situations in which the model can be safely used. For example, it has been shown, at the mean-field level, that the dose-response curve obtained from a Hill model agrees well with the dose-response curves obtained from a more complicated Adair-Klotz model, provided that the parameters of the Adair-Klotz model describe strongly cooperative binding. However, it has not been established whether such findings can be extended to other properties and to non-mean-field (stochastic) versions of the same, or other, models. Results In this work a rather generic quantitative framework for approaching such a problem is suggested. The main idea is to compare the particle number distribution functions of the Hill and Adair-Klotz models instead of investigating a particular property (e.g. the dose-response curve). The approach is valid for any model that can be mathematically related to the Hill model, and the Adair-Klotz model is used to illustrate the technique. One main and two auxiliary similarity measures were introduced to compare the distributions in a quantitative way, and both the time-dependent and equilibrium properties of these measures were studied. Conclusions A strongly cooperative Adair-Klotz model can be replaced by a suitable Hill model in such a way that any property computed from the two models, even one describing stochastic features, is approximately the same. The quantitative analysis showed that the boundaries of the regions in parameter space where the models behave in the same way exhibit a rather rich structure. PMID:21521501
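
    At the mean-field level, the agreement described in the Background can be checked directly. This sketch (not the paper's stochastic, distribution-based framework) compares the dose-response curve of a two-site Adair-Klotz model under strong positive cooperativity with a Hill function whose coefficient equals the number of sites; all parameter values are illustrative.

```python
import numpy as np

def hill(x, K, n):
    # Hill dose-response: fractional saturation x^n / (K^n + x^n)
    return x**n / (K**n + x**n)

def adair_klotz2(x, K1, K2):
    # Fractional saturation of a two-site receptor (Adair-Klotz form)
    num = K1 * x + 2 * K1 * K2 * x**2
    den = 2 * (1 + K1 * x + K1 * K2 * x**2)
    return num / den

x = np.linspace(0.01, 10, 500)
y_ak = adair_klotz2(x, K1=0.1, K2=10.0)  # K2 >> K1: strong cooperativity
y_h = hill(x, K=1.0, n=2)                # Hill coefficient = number of sites
max_gap = np.max(np.abs(y_ak - y_h))
print(round(max_gap, 3))
```

    With these parameters the two curves share the same half-saturation point, and the worst-case discrepancy stays small, which is the mean-field agreement the paper takes as its starting point.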

  2. CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis

    PubMed Central

    Choi, Kyoungah; Lee, Impyeong

    2015-01-01

    We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces a surveillance coverage index, newly defined in this study as a quantitative measure of surveillance performance. This index indicates the proportion of the space being monitored with sufficient resolution relative to the entire space of the target area. It is determined by computing the surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with the CCTV system. We present a full mathematical derivation of the resolution, which depends on the location and orientation of the object as well as the geometric model of the camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative analysis results, enabling us to examine the design of a CCTV system prior to its installation and to understand the surveillance capability of an existing CCTV system. PMID:26389909
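
    A toy version of such a coverage index can be computed on a grid of sample points. The pinhole-style resolution formula below (pixels per metre ≈ f_px·cosθ/d) and all camera parameters are simplifying assumptions for illustration, not the paper's full derivation.

```python
import numpy as np

cam_pos = np.array([0.0, 0.0, 3.0])        # camera 3 m above the origin
cam_dir = np.array([1.0, 0.0, -0.5])
cam_dir = cam_dir / np.linalg.norm(cam_dir)
f_px = 1000.0                              # focal length in pixels (assumed)
half_fov = np.deg2rad(45)                  # half field of view (assumed)
req = 50.0                                 # required resolution, px/m (assumed)

# Sample the floor of a 10 m x 10 m target area.
xs, ys = np.meshgrid(np.linspace(0, 10, 50), np.linspace(-5, 5, 50))
pts = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

vec = pts - cam_pos
d = np.linalg.norm(vec, axis=1)
cos_off = (vec @ cam_dir) / d              # cosine of angle off optical axis
res = f_px * cos_off / d                   # simplified surveillance resolution
covered = (cos_off >= np.cos(half_fov)) & (res >= req)
coverage = covered.mean()                  # coverage index in [0, 1]
print(round(coverage, 3))
```

    Points outside the field of view or too far (or too obliquely viewed) to meet the resolution threshold are excluded, so the index falls strictly between 0 and 1 for this geometry.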

  3. Easy calculations of lod scores and genetic risks on small computers.

    PubMed Central

    Lathrop, G M; Lalouel, J M

    1984-01-01

    A computer program that calculates lod scores and genetic risks for a wide variety of both qualitative and quantitative genetic traits is discussed. An illustration is given of the joint use of a genetic marker, affection status, and quantitative information in counseling situations regarding Duchenne muscular dystrophy. PMID:6585139
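
    For the simplest case (phase-known meioses), the two-point lod score reduces to a one-line formula: lod(θ) = log10[θ^k (1-θ)^(n-k) / 0.5^n] for k recombinants among n informative meioses. The sketch below is a generic textbook calculation, not the program discussed in the paper.

```python
import math

def lod(theta, k, n):
    """Two-point lod score for k recombinants in n phase-known meioses."""
    return math.log10(theta**k * (1 - theta)**(n - k) / 0.5**n)

# Example: 1 recombinant in 10 meioses, evaluated over a grid of theta values.
scores = {t: round(lod(t, 1, 10), 2) for t in (0.05, 0.1, 0.2, 0.3, 0.4)}
best = max(scores, key=scores.get)
print(best, scores[best])
```

    The score peaks at θ = k/n (here 0.1), and a maximum lod above 3 is the conventional threshold for declaring linkage.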

  4. Quantum phase transitions in a two-dimensional quantum XYX model: ground-state fidelity and entanglement.

    PubMed

    Li, Bo; Li, Sheng-Hao; Zhou, Huan-Qiang

    2009-06-01

    A systematic analysis is performed for quantum phase transitions in a two-dimensional anisotropic spin-1/2 antiferromagnetic XYX model in an external magnetic field. With the help of an innovative tensor network algorithm, we compute the fidelity per lattice site to demonstrate that the field-induced quantum phase transition is unambiguously characterized by a pinch point on the fidelity surface, marking a continuous phase transition. We also compute an entanglement estimator, defined as a ratio between the one-tangle and the sum of squared concurrences, to identify both the factorizing field and the critical point, resulting in a quantitative agreement with quantum Monte Carlo simulation. In addition, the local order parameter is "derived" from the tensor network representation of the system's ground-state wave functions.

  5. [Computer simulation of thyroid regulatory mechanisms in health and malignancy].

    PubMed

    Abduvaliev, A A; Gil'dieva, M S; Khidirov, B N; Saĭdalieva, M; Saatov, T S

    2010-07-01

    The paper describes a computer model of the regulation of the number of thyroid follicular cells in health and malignancy. The authors' computer program for mathematical simulation of the regulatory mechanisms of a thyroid follicular cell community cannot yet be considered a finished commercial product: commercialization would require relating the corrected values introduced by the model directly to actually measured normal values, such as peripheral blood concentrations of thyroid hormones or mean values of endocrine tissue mitotic activity. Nevertheless, the program has been used by our scientific group in the study of thyroid cancer. The available experimental data and theoretical understanding of thyroid structural and functional organization at the cellular level allow the construction of mathematical models for quantitative analysis of the regulation of the size of the cellular community of a thyroid follicle in health and disease, using methods for simulating the regulatory mechanisms of living systems and the corresponding cellular-community regulatory equations.

  6. The structure and timescales of heat perception in larval zebrafish.

    PubMed

    Haesemeyer, Martin; Robson, Drew N; Li, Jennifer M; Schier, Alexander F; Engert, Florian

    2015-11-25

    Avoiding temperatures outside the physiological range is critical for animal survival, but how temperature dynamics are transformed into behavioral output is largely not understood. Here, we used an infrared laser to challenge freely swimming larval zebrafish with "white-noise" heat stimuli and built quantitative models relating external sensory information and internal state to behavioral output. These models revealed that larval zebrafish integrate temperature information over a time window of 400 ms preceding a swim bout and that swimming is suppressed immediately after the end of a bout. Our results suggest that larval zebrafish compute both an integral and a derivative of heat over time to guide their next movement. Our models put important constraints on the type of computations that occur in the nervous system and reveal principles of how somatosensory temperature information is processed to guide behavioral decisions, with sensitivity to both absolute levels of and changes in stimulation.
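
    The proposed computation (an integral over a 400 ms window plus a derivative) can be mimicked with a simple linear filter. The time step, stimulus, and weights below are arbitrary illustrative choices, not the fitted model parameters.

```python
import numpy as np

dt = 0.01                                  # 10 ms time step (assumed)
t = np.arange(0, 2, dt)
temp = 25 + 2 * (t > 1.0)                  # step increase in heat at t = 1 s

win = round(0.4 / dt)                      # 400 ms integration window
kernel = np.ones(win) / win
integral = np.convolve(temp, kernel, mode="same")  # windowed mean temperature
deriv = np.gradient(temp, dt)              # rate of temperature change

w_i, w_d = 0.05, 0.02                      # illustrative weights
drive = w_i * (integral - 25) + w_d * deriv  # behavioural drive signal

idx = int(round(1.5 / dt))                 # 0.5 s after the step
print(round(drive[idx], 2))
```

    After the transient, the derivative term vanishes and the drive settles at the integral term alone, illustrating how such a model responds to both changes and sustained levels of heat.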

  7. Computational Modeling of a Mechanized Benchtop Apparatus for Leading-Edge Slat Noise Treatment Device Prototypes

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Moore, James B.; Long, David L.

    2017-01-01

    Airframe noise is a growing concern in the vicinity of airports because of population growth and gains in engine noise reduction that have rendered the airframe an equal contributor during the approach and landing phases of flight for many transport aircraft. The leading-edge slat of a typical high-lift system for transport aircraft is a prominent source of airframe noise. Two technologies have significant potential for slat noise reduction: the slat-cove filler (SCF) and the slat-gap filler (SGF). Previous work on a 2D section of a transport-aircraft wing demonstrated the implementation feasibility of these concepts, and benchtop hardware was developed in that work for qualitative parametric study. The benchtop models were then mechanized for quantitative measurements of performance. Computational models of the mechanized benchtop apparatus for the SCF were developed, and the performance of the system is demonstrated for five different SCF assemblies.

  8. Issues associated with modelling of proton exchange membrane fuel cell by computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Bednarek, Tomasz; Tsotridis, Georgios

    2017-03-01

    The objective of the current study is to highlight possible limitations and difficulties associated with Computational Fluid Dynamics in single PEM fuel cell modelling. It is shown that an appropriate convergence methodology should be applied for steady-state solutions, due to inherent numerical instabilities. A single-channel fuel cell model is taken as the numerical example, and results are evaluated from quantitative as well as qualitative points of view. The contribution to the polarization curve of the different fuel cell components, such as bi-polar plates, gas diffusion layers, catalyst layers and membrane, was investigated via their effects on the overpotentials. Furthermore, the potential losses corresponding to reaction kinetics, ohmic resistance and mass transport limitations, together with the effects of the exchange current density and open circuit voltage, were also investigated. It is highlighted that the lack of reliable and robust input data is one of the main obstacles to obtaining accurate results.
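
    The decomposition of the polarization curve into overpotential contributions can be illustrated with a zero-dimensional sketch: cell voltage = open-circuit voltage minus activation (Tafel), ohmic, and mass-transport losses. All parameter values are assumed round numbers for illustration and are unrelated to the paper's CFD model.

```python
import math

def cell_voltage(i, ocv=1.0, b=0.06, i0=1e-4, r=0.1, i_lim=1.5):
    """Cell voltage (V) at current density i (A/cm^2); all values assumed."""
    act = b * math.log10(i / i0)              # Tafel activation loss
    ohm = r * i                               # ohmic loss
    conc = -0.05 * math.log(1 - i / i_lim)    # mass-transport loss
    return ocv - act - ohm - conc

volts = [round(cell_voltage(i), 3) for i in (0.01, 0.5, 1.0, 1.4)]
print(volts)  # voltages fall monotonically along the polarization curve
```

    Activation losses dominate at low current density, ohmic losses in the middle, and the mass-transport term blows up as i approaches the limiting current, reproducing the characteristic shape of a polarization curve.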

  9. Respectful Modeling: Addressing Uncertainty in Dynamic System Models for Molecular Biology.

    PubMed

    Tsigkinopoulou, Areti; Baker, Syed Murtuza; Breitling, Rainer

    2017-06-01

    Although there is still some skepticism in the biological community regarding the value and significance of quantitative computational modeling, important steps are continually being taken to enhance its accessibility and predictive power. We view these developments as essential components of an emerging 'respectful modeling' framework which has two key aims: (i) respecting the models themselves and facilitating the reproduction and update of modeling results by other scientists, and (ii) respecting the predictions of the models and rigorously quantifying the confidence associated with the modeling results. This respectful attitude will guide the design of higher-quality models and facilitate the use of models in modern applications such as engineering and manipulating microbial metabolism by synthetic biology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    PubMed

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into it as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
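
    The simulated-annealing component can be sketched generically. Here an analytic loss stands in for the expensive stochastic-simulation statistic, and the cooling schedule and proposal width are arbitrary choices; the paper's actual algorithm additionally couples annealing with sequential hypothesis testing and statistical model checking.

```python
import math
import random

random.seed(0)

def loss(k):
    # Distance between a model summary (here simply 3*k, a stand-in for a
    # statistic of a stochastic simulation) and the observed value 6.0.
    return (3.0 * k - 6.0) ** 2

k, best_k = 10.0, 10.0
T = 1.0
for step in range(5000):
    cand = k + random.gauss(0, 0.1)          # local proposal
    d = loss(cand) - loss(k)
    if d < 0 or random.random() < math.exp(-d / T):
        k = cand                             # accept downhill, or uphill w.p. e^(-d/T)
    if loss(k) < loss(best_k):
        best_k = k
    T *= 0.999                               # geometric cooling schedule

print(round(best_k, 1))
```

    Early on, the high temperature lets the search escape poor regions; as T decays, the walk concentrates around the best-fitting parameter value (here k ≈ 2).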

  11. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in living cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are present in all bioimaging systems and hinder quantitative comparison between cell models and bioimages, and computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  12. A quantitative three-dimensional dose attenuation analysis around Fletcher-Suit-Delclos due to stainless steel tube for high-dose-rate brachytherapy by Monte Carlo calculations.

    PubMed

    Parsai, E Ishmael; Zhang, Zhengdong; Feldmeier, John J

    2009-01-01

    The brachytherapy treatment-planning systems commercially available today usually neglect the attenuation effect of the stainless steel (SS) tube when a Fletcher-Suit-Delclos (FSD) applicator is used in the treatment of cervical and endometrial cancers. This can lead to inaccuracies in computed dwell times and dose distributions. A more accurate analysis quantifying the level of attenuation for a high-dose-rate (HDR) iridium-192 ((192)Ir) source is presented through Monte Carlo simulation verified by measurement. In this investigation, the general-purpose Monte Carlo N-Particle (MCNP) transport code was used to construct a typical FSD geometry and to compare the doses delivered to point A of the Manchester System with and without the SS tubing, providing a quantitative assessment of delivered vs. computed dose. The investigation was also expanded to examine the attenuation-corrected radial and anisotropy dose functions in a form parallel to the updated AAPM Task Group No. 43 Report (AAPM TG-43) formalism, delineating quantitatively the inaccuracies in dose distributions in three-dimensional space. The changes in dose deposition and distribution caused by the increased attenuation due to the presence of SS were quantified using MCNP simulations with coupled photon/electron transport. The source geometry was that of the VariSource wire model VS2000, and the FSD was that of the Varian medical system; in this model, the bending angles of the tandem and colpostats are 15 degrees and 120 degrees, respectively. Ten dwell positions were assigned to the tandem and 4 dwell positions to the right and left colpostats (ovoids) to represent a typical treatment case, with the dose delivered to point A determined according to the Manchester dosimetry system. Based on our computations, the reduction of dose to point A was at least 3%, so the effect of SS FSD systems on patient dose is of concern.
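
    A back-of-the-envelope check of the reported magnitude: primary-photon transmission through a thin stainless-steel wall follows I/I0 = exp(-μt). The attenuation coefficient and wall thickness below are assumed round numbers for the ~0.38 MeV mean photon energy of (192)Ir, not the paper's MCNP inputs, and this ignores scatter, which a full Monte Carlo calculation accounts for.

```python
import math

mu_ss = 0.85   # 1/cm, assumed linear attenuation coefficient of SS (~0.38 MeV)
wall = 0.05    # cm, assumed tube wall thickness
transmitted = math.exp(-mu_ss * wall)
print(f"{(1 - transmitted) * 100:.1f}% primary-photon attenuation per wall")
```

    Even this crude estimate lands in the low single-digit percent range, consistent with the ≥3% dose reduction at point A that motivates correcting for the applicator.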

  13. Generalizing the dynamic field theory of spatial cognition across real and developmental time scales

    PubMed Central

    Simmering, Vanessa R.; Spencer, John P.; Schutte, Anne R.

    2008-01-01

    Within cognitive neuroscience, computational models are designed to provide insights into the organization of behavior while adhering to neural principles. These models should provide sufficient specificity to generate novel predictions while maintaining the generality needed to capture behavior across tasks and/or time scales. This paper presents one such model, the Dynamic Field Theory (DFT) of spatial cognition, showing new simulations that provide a demonstration proof that the theory generalizes across developmental changes in performance in four tasks—the Piagetian A-not-B task, a sandbox version of the A-not-B task, a canonical spatial recall task, and a position discrimination task. Model simulations demonstrate that the DFT can accomplish both specificity—generating novel, testable predictions—and generality—spanning multiple tasks across development with a relatively simple developmental hypothesis. Critically, the DFT achieves generality across tasks and time scales with no modification to its basic structure and with a strong commitment to neural principles. The only change necessary to capture development in the model was an increase in the precision of the tuning of receptive fields as well as an increase in the precision of local excitatory interactions among neurons in the model. These small quantitative changes were sufficient to move the model through a set of quantitative and qualitative behavioral changes that span the age range from 8 months to 6 years and into adulthood. We conclude by considering how the DFT is positioned in the literature, the challenges on the horizon for our framework, and how a dynamic field approach can yield new insights into development from a computational cognitive neuroscience perspective. PMID:17716632

  14. Superimposition of 3-dimensional cone-beam computed tomography models of growing patients

    PubMed Central

    Cevidanes, Lucia H. C.; Heymann, Gavin; Cornelis, Marie A.; DeClerck, Hugo J.; Tulloch, J. F. Camilla

    2009-01-01

    Introduction The objective of this study was to evaluate a new method for superimposition of 3-dimensional (3D) models of growing subjects. Methods Cone-beam computed tomography scans were taken before and after Class III malocclusion orthopedic treatment with miniplates. Three observers independently constructed 18 3D virtual surface models from cone-beam computed tomography scans of 3 patients. Separate 3D models were constructed for soft-tissue, cranial base, maxillary, and mandibular surfaces. The anterior cranial fossa was used to register the 3D models of before and after treatment (about 1 year of follow-up). Results Three-dimensional overlays of superimposed models and 3D color-coded displacement maps allowed visual and quantitative assessment of growth and treatment changes. The range of interobserver errors for each anatomic region was 0.4 mm for the zygomatic process of maxilla, chin, condyles, posterior border of the rami, and lower border of the mandible, and 0.5 mm for the anterior maxilla soft-tissue upper lip. Conclusions Our results suggest that this method is a valid and reproducible assessment of treatment outcomes for growing subjects. This technique can be used to identify maxillary and mandibular positional changes and bone remodeling relative to the anterior cranial fossa. PMID:19577154

  15. Strategy generalization across orientation tasks: testing a computational cognitive model.

    PubMed

    Gunzelmann, Glenn

    2008-07-08

    Humans use their spatial information processing abilities flexibly to facilitate problem solving and decision making in a variety of tasks. This article explores the question of whether a general strategy can be adapted for performing two different spatial orientation tasks by testing the predictions of a computational cognitive model. Human performance was measured on an orientation task requiring participants to identify the location of a target either on a map (find-on-map) or within an egocentric view of a space (find-in-scene). A general strategy instantiated in a computational cognitive model of the find-on-map task, based on the results from Gunzelmann and Anderson (2006), was adapted to perform both tasks and used to generate performance predictions for a new study. The qualitative fit of the model to the human data supports the view that participants were able to tailor a general strategy to the requirements of particular spatial tasks. The quantitative differences between the predictions of the model and the performance of human participants in the new experiment expose individual differences in sample populations. The model provides a means of accounting for those differences and a framework for understanding how human spatial abilities are applied to naturalistic spatial tasks that involve reasoning with maps. 2008 Cognitive Science Society, Inc.

  16. Integrated model of the shallow and deep hydrothermal systems in the East Mesa area, Imperial Valley, California

    USGS Publications Warehouse

    Riney, T. David; Pritchett, J.W.; Rice, L.F.

    1982-01-01

    Geological, geophysical, thermal, petrophysical and hydrological data available for the East Mesa hydrothermal system that are pertinent to the construction of a computer model of the natural flow of heat and fluid mass within the system are assembled and correlated. A conceptual model of the full system is developed and a subregion selected for quantitative modeling. By invoking the Boussinesq approximation, valid for describing the natural flow of heat and mass in a liquid hydrothermal system, it is found practical to carry computer simulations far enough in time to ensure that steady-state conditions are obtained. Initial calculations for an axisymmetric model approximating the system demonstrate that the vertical formation permeability of the deep East Mesa system must be very low (kv ~ 0.25 to 0.5 md). Since subsurface temperature and surface heat flow data exhibit major deviations from the axisymmetric approximation, exploratory three-dimensional calculations are performed to assess the effects of various mechanisms which might operate to produce the observed asymmetries. A three-dimensional model evolves from this iterative data synthesis and computer analysis; it includes a hot fluid convective source distributed along a leaky fault radiating northward from the center of the hot spot, together with realistic variations in the reservoir formation properties.

  17. Computational modeling of human oral bioavailability: what will be next?

    PubMed

    Cabrera-Pérez, Miguel Ángel; Pham-The, Hai

    2018-06-01

    The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property, and its accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of the chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling, the integrity of the pharmacokinetic data, the number of input parameters, the complexity of the statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been used independently, the tendency to combine QSPR and PBPK in hybrid approaches, together with the exploration of ensemble and deep-learning systems for QSPR modeling, has opened new avenues for developing promising tools for oral bioavailability prediction.

  18. Model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.

    2017-02-01

    Work is reported on model-based defect characterization in CFRP composites. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing multi-ply impact-induced delamination, with application in this paper focusing on ultrasound. A companion paper in these proceedings summarizes corresponding activity in thermography. Inversion of ultrasound data is demonstrated showing the quantitative extraction of damage properties.

  19. Applications of the hybrid coordinate method to the TOPS autopilot

    NASA Technical Reports Server (NTRS)

    Fleischer, G. E.

    1978-01-01

    Preliminary results are presented from the application of the hybrid coordinate method to modeling TOPS (thermoelectric outer planet spacecraft) structural dynamics. Computer-simulated responses of the vehicle are included which illustrate the interaction of relatively flexible appendages with an autopilot control system. Comparisons were made between simplified single-axis models of the control loop, with spacecraft flexibility represented by hinged rigid bodies, and a very detailed three-axis spacecraft model whose flexible portions are described by modal coordinates. While single-axis root loci provided reasonable qualitative indications of stability margins in this case, they were quantitatively optimistic when matched against responses of the detailed model.

  20. Computer simulation of the probability that endangered whales will interact with oil spills, Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, M.; Jayko, K.; Bowles, A.

    1986-10-01

    A numerical model system was developed to assess quantitatively the probability that endangered bowhead and gray whales will encounter spilled oil in Alaskan waters. Bowhead and gray whale migration diving-surfacing models, and an oil-spill-trajectory model comprise the system. The migration models were developed from conceptual considerations, then calibrated with and tested against observations. The distribution of animals is represented in space and time by discrete points, each of which may represent one or more whales. The movement of a whale point is governed by a random-walk algorithm which stochastically follows a migratory pathway.
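
    The random-walk migration scheme can be sketched as follows: each whale point draws a heading around a preferred migratory bearing at every step, so the population drifts along the pathway on average while individual tracks scatter. All parameters (heading noise, speed, counts) are invented for illustration and are not the calibrated values from the report.

```python
import math
import random

random.seed(42)

def migrate(n_whales=100, n_steps=200, bearing=math.radians(90), speed=1.0):
    """Advance each whale point by n_steps stochastic random-walk steps."""
    out = []
    for _ in range(n_whales):
        x = y = 0.0
        for _ in range(n_steps):
            theta = random.gauss(bearing, 0.5)  # heading noise around pathway
            x += speed * math.cos(theta)
            y += speed * math.sin(theta)
        out.append((x, y))
    return out

pts = migrate()
mean_y = sum(p[1] for p in pts) / len(pts)  # net displacement along pathway
print(round(mean_y, 1))
```

    In the full model system, whale positions evolving this way would be intersected with oil-spill trajectories to estimate encounter probabilities.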
