Sample records for interactive computer model

  1. Modeling Human-Computer Decision Making with Covariance Structure Analysis.

    ERIC Educational Resources Information Center

    Coovert, Michael D.; And Others

    Arguing that sufficient theory exists about the interplay between human information processing, computer systems, and the demands of various tasks to construct useful theories of human-computer interaction, this study presents a structural model of human-computer interaction and reports the results of various statistical analyses of this model.…

  2. Cyberpsychology: a human-interaction perspective based on cognitive modeling.

    PubMed

    Emond, Bruno; West, Robert L

    2003-10-01

    This paper argues for the relevance of cognitive modeling and cognitive architectures to cyberpsychology. From a human-computer interaction point of view, cognitive modeling can have benefits both for theory and model building, and for the design and evaluation of sociotechnical systems usability. Cognitive modeling research applied to human-computer interaction has two complementary objectives: (1) to develop theories and computational models of human interactive behavior with information and collaborative technologies, and (2) to use the computational models as building blocks for the design, implementation, and evaluation of interactive technologies. From the perspective of building theories and models, cognitive modeling offers the possibility to anchor cyberpsychology theories and models in cognitive architectures. From the perspective of the design and evaluation of sociotechnical systems, cognitive models can provide the basis for simulated users, which can play an important role in usability testing. As an example of applying cognitive modeling to technology design, the paper presents a simulation of interactive behavior with five different adaptive menu algorithms: random, fixed, stacked, frequency based, and activation based. Results of the simulation indicate that fixed menu positions seem to offer the best support for classification-like tasks such as filing e-mails. This research is part of the Human-Computer Interaction and Broadband Visual Communication research programs at the National Research Council of Canada, in collaboration with the Carleton Cognitive Modeling Lab at Carleton University.
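
    A deliberately simplified sketch (not the cognitive-architecture simulation from the paper) of why stable menu positions can beat adaptive reordering in classification-like tasks: selection cost is modeled as serial search depth, except that an item found in the same slot as on the previous trial is recalled at a small fixed cost. The item frequencies, the cost model, and all parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n_items, n_trials, memory_cost = 10, 2000, 1.0

# Zipf-like usage frequencies over menu items (e.g. e-mail folders)
p = 1.0 / np.arange(1, n_items + 1)
p = p / p.sum()

def run(policy):
    order = np.arange(n_items)        # order[slot] = item currently shown in that slot
    counts = np.zeros(n_items)
    last_slot = {}                    # item -> slot where it was last found
    total = 0.0
    for _ in range(n_trials):
        target = rng.choice(n_items, p=p)
        slot = int(np.where(order == target)[0][0])
        # serial search down the menu, unless the item is where it was last time
        total += memory_cost if last_slot.get(target) == slot else slot + 1
        last_slot[target] = slot
        counts[target] += 1
        if policy == "random":
            rng.shuffle(order)
        elif policy == "frequency":   # re-sort the menu, most-used items first
            order = np.argsort(-counts, kind="stable")
        # "fixed": the layout never changes
    return total / n_trials

for policy in ("fixed", "frequency", "random"):
    print(f"{policy:9s} mean selection cost per trial: {run(policy):.2f}")
```

    Under these toy assumptions the fixed layout accrues the largest location-memory benefit, loosely mirroring the paper's finding that fixed positions best support e-mail filing.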

  3. Symmetry structure in discrete models of biochemical systems: natural subsystems and the weak control hierarchy in a new model of computation driven by interactions.

    PubMed

    Nehaniv, Chrystopher L; Rhodes, John; Egri-Nagy, Attila; Dini, Paolo; Morris, Eric Rothstein; Horváth, Gábor; Karimi, Fariba; Schreckling, Daniel; Schilstra, Maria J

    2015-07-28

    Interaction computing is inspired by the observation that cell metabolic/regulatory systems construct order dynamically, through constrained interactions between their components and based on a wide range of possible inputs and environmental conditions. The goals of this work are to (i) identify and understand mathematically the natural subsystems and hierarchical relations in natural systems enabling this and (ii) use the resulting insights to define a new model of computation based on interactions that is useful for both biology and computation. The dynamical characteristics of the cellular pathways studied in systems biology relate, mathematically, to the computational characteristics of automata derived from them, and their internal symmetry structures to computational power. Finite discrete automata models of biological systems such as the lac operon, the Krebs cycle and p53-mdm2 genetic regulation constructed from systems biology models have canonically associated algebraic structures (their transformation semigroups). These contain permutation groups (local substructures exhibiting symmetry) that correspond to 'pools of reversibility'. These natural subsystems are related to one another in a hierarchical manner by the notion of 'weak control'. We present natural subsystems arising from several biological examples and their weak control hierarchies in detail. Finite simple non-Abelian groups are found in biological examples and can be harnessed to realize finitary universal computation. This allows ensembles of cells to achieve any desired finitary computational transformation, depending on external inputs, via suitably constrained interactions. Based on this, interaction machines that grow and change their structure recursively are introduced and applied, providing a natural model of computation driven by interactions.
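
    As a concrete, purely illustrative companion to the algebraic terminology above, the sketch below computes the transformation semigroup generated by a tiny hand-made automaton: each input letter acts as a map on the finite state set, the maps are closed under composition, and the invertible elements form the permutation groups ("pools of reversibility") the abstract refers to. The automaton itself is an assumption, not one of the biological models from the paper.

```python
from itertools import product

states = range(3)
# each input letter is a transformation state -> state, written as a tuple
generators = {
    "a": (1, 2, 0),   # a cyclic permutation: a local "pool of reversibility"
    "b": (0, 0, 2),   # a non-invertible (collapsing) transformation
}

def compose(f, g):
    """Apply f, then g (both are tuples indexed by state)."""
    return tuple(g[f[s]] for s in states)

# close the generators under composition to obtain the full semigroup
semigroup = set(generators.values())
frontier = set(semigroup)
while frontier:
    new = {compose(f, g) for f, g in product(frontier, semigroup)} \
        | {compose(f, g) for f, g in product(semigroup, frontier)}
    frontier = new - semigroup
    semigroup |= frontier

permutations = {f for f in semigroup if len(set(f)) == len(states)}
print(f"semigroup size: {len(semigroup)}, of which {len(permutations)} are permutations")
```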

  4. Dynamic interactions in neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arbib, M.A.; Amari, S.

    The study of neural networks is enjoying a great renaissance, both in computational neuroscience, the development of information processing models of living brains, and in neural computing, the use of neurally inspired concepts in the construction of intelligent machines. This volume presents models and data on the dynamic interactions occurring in the brain, and exhibits the dynamic interactions between research in computational neuroscience and in neural computing. The authors present current research, future trends and open problems.

  5. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a PDF, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier, allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future land use or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a priori or initial data are derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document, maintaining all interactive features of the model and data. The Computable Catchment concept represents one application of Geoscience Papers of the Future, an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  6. Metaphors for the Nature of Human-Computer Interaction in an Empowering Environment: Interaction Style Influences the Manner of Human Accomplishment.

    ERIC Educational Resources Information Center

    Weller, Herman G.; Hartson, H. Rex

    1992-01-01

    Describes human-computer interface needs for empowering environments in computer usage in which the machine handles the routine mechanics of problem solving while the user concentrates on its higher order meanings. A closed-loop model of interaction is described, interface as illusion is discussed, and metaphors for human-computer interaction are…

  7. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    NASA Astrophysics Data System (ADS)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

    The focus of the research is a teaching module that incorporates manufacturing, mechanical design planning, control through microprocessor technology, and maneuverability of the robot. Computer interactive learning and computer assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activities. The research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: Define, Design, Develop, and Disseminate. It was conducted as design-based development research with the objective of producing a learning tool in the form of intelligent robot modules and a kit based on Computer Interactive Learning and Computer Assisted Learning. Data from the Indonesia Robot Contest over the period 2009-2015 indicate that the developed modules satisfy the fourth stage of the development method, the Disseminate stage. The developed modules guide students in producing an intelligent robot tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Student responses also showed positive feedback on the robotics module and computer-based interactive learning.

  8. Computing by physical interaction in neurons.

    PubMed

    Aur, Dorian; Jog, Mandar; Poznanski, Roman R

    2011-12-01

    The electrodynamics of action potentials represents the fundamental level where information is integrated and processed in neurons. The Hodgkin-Huxley model cannot explain the non-stereotyped spatial charge density dynamics that occur during action potential propagation. Revealed in experiments as spike directivity, the non-uniform charge density dynamics within neurons carry meaningful information and suggest that fragments of information regarding our memories are endogenously stored in structural patterns at a molecular level and are revealed only during spiking activity. The main conceptual idea is that under the influence of electric fields, efficient computation by interaction occurs between charge densities embedded within molecular structures and the transient developed flow of electrical charges. This process of computation underlying electrical interactions and molecular mechanisms at the subcellular level is dissimilar from spiking neuron models that are completely devoid of physical interactions. Computation by interaction describes a more powerful continuous model of computation than the one that consists of discrete steps as represented in Turing machines.

  9. Interactive computer graphics and its role in control system design of large space structures

    NASA Technical Reports Server (NTRS)

    Reddy, A. S. S. R.

    1985-01-01

    This paper attempts to show the relevance of interactive computer graphics in the design of control systems to maintain attitude and shape of large space structures to accomplish the required mission objectives. The typical phases of control system design, starting from the physical model such as modeling the dynamics, modal analysis, and control system design methodology are reviewed and the need of the interactive computer graphics is demonstrated. Typical constituent parts of large space structures such as free-free beams and free-free plates are used to demonstrate the complexity of the control system design and the effectiveness of the interactive computer graphics.

  10. Computer modeling and simulation of human movement. Applications in sport and rehabilitation.

    PubMed

    Neptune, R R

    2000-05-01

    Computer modeling and simulation of human movement plays an increasingly important role in sport and rehabilitation, with applications ranging from sport equipment design to understanding pathologic gait. The complex dynamic interactions within the musculoskeletal and neuromuscular systems make analyzing human movement with existing experimental techniques difficult, but computer modeling and simulation allows for the identification of these complex interactions and causal relationships between input and output variables. This article provides an overview of computer modeling and simulation and presents an example application in the field of rehabilitation.

  11. Integrated computer-aided design using minicomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1980-01-01

    Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational database management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum transfer rate of 4800 bits/sec to a computer. The system is portable and provides interactive graphics, which permit the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large-area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides its design, drafting, and finite element analysis capabilities, CAD/CAM provides options to produce automatic tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  12. The influence of computational assumptions on analysing abdominal aortic aneurysm haemodynamics.

    PubMed

    Ene, Florentina; Delassus, Patrick; Morris, Liam

    2014-08-01

    The variation in computational assumptions for analysing abdominal aortic aneurysm haemodynamics can influence the desired output results and computational cost. Such assumptions for abdominal aortic aneurysm modelling include static/transient pressures, steady/transient flows and rigid/compliant walls. Six computational methods and these various assumptions were simulated and compared within a realistic abdominal aortic aneurysm model with and without intraluminal thrombus. A full transient fluid-structure interaction was required to analyse the flow patterns within the compliant abdominal aortic aneurysms models. Rigid wall computational fluid dynamics overestimates the velocity magnitude by as much as 40%-65% and the wall shear stress by 30%-50%. These differences were attributed to the deforming walls which reduced the outlet volumetric flow rate for the transient fluid-structure interaction during the majority of the systolic phase. Static finite element analysis accurately approximates the deformations and von Mises stresses when compared with transient fluid-structure interaction. Simplifying the modelling complexity reduces the computational cost significantly. In conclusion, the deformation and von Mises stress can be approximately found by static finite element analysis, while for compliant models a full transient fluid-structure interaction analysis is required for acquiring the fluid flow phenomenon. © IMechE 2014.

  13. Continued use of an interactive computer game-based visual perception learning system in children with developmental delay.

    PubMed

    Lin, Hsien-Cheng; Chiu, Yu-Hsien; Chen, Yenming J; Wuang, Yee-Pay; Chen, Chiu-Ping; Wang, Chih-Chung; Huang, Chien-Ling; Wu, Tang-Meng; Ho, Wen-Hsien

    2017-11-01

    This study developed an interactive computer game-based visual perception learning system for special education children with developmental delay. To investigate whether perceived interactivity affects continued use of the system, this study developed a theoretical model of the process in which learners decide whether to continue using an interactive computer game-based visual perception learning system. The technology acceptance model, which considers perceived ease of use, perceived usefulness, and perceived playfulness, was extended by integrating perceived interaction (i.e., learner-instructor interaction and learner-system interaction) and then analyzing the effects of these perceptions on satisfaction and continued use. Data were collected from 150 participants (rehabilitation therapists, medical paraprofessionals, and parents of children with developmental delay) recruited from a single medical center in Taiwan. Structural equation modeling and partial-least-squares techniques were used to evaluate relationships within the model. The modeling results indicated that both perceived ease of use and perceived usefulness were positively associated with both learner-instructor interaction and learner-system interaction. However, perceived playfulness only had a positive association with learner-system interaction and not with learner-instructor interaction. Moreover, satisfaction was positively affected by perceived ease of use, perceived usefulness, and perceived playfulness. Thus, satisfaction positively affects continued use of the system. The data obtained by this study can be applied by researchers, designers of computer game-based learning systems, special education workers, and medical professionals. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Development and application of the GIM code for the Cyber 203 computer

    NASA Technical Reports Server (NTRS)

    Stalnaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.

    1982-01-01

    The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code was used to compute a number of example cases. Turbulence models, both algebraic and differential-equation based, were added to the basic viscous code. An equilibrium reacting chemistry model and an implicit finite difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.

  15. Specifying and Refining a Measurement Model for a Computer-Based Interactive Assessment

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…

  16. Development of an Interactive Computer-Based Learning Strategy to Assist in Teaching Water Quality Modelling

    ERIC Educational Resources Information Center

    Zigic, Sasha; Lemckert, Charles J.

    2007-01-01

    The following paper presents a computer-based learning strategy to assist in introducing and teaching water quality modelling to undergraduate civil engineering students. As part of the learning strategy, an interactive computer-based instructional (CBI) aid was specifically developed to assist students to set up, run and analyse the output from a…

  17. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  18. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  19. Development of Interactive Computer Programs To Help Students Transfer Basic Skills to College Level Science and Behavioral Science Courses.

    ERIC Educational Resources Information Center

    Mikulecky, Larry

    Interactive computer programs, developed at Indiana University's Learning Skills Center, were designed to model effective strategies for reading biology and psychology textbooks. For each subject area, computer programs and textbook passages were used to instruct and model for students how to identify key concepts, compare and contrast concepts,…

  20. Interactive activation and mutual constraint satisfaction in perception and cognition.

    PubMed

    McClelland, James L; Mirman, Daniel; Bolger, Donald J; Khaitan, Pranav

    2014-08-01

    In a seminal 1977 article, Rumelhart argued that perception required the simultaneous use of multiple sources of information, allowing perceivers to optimally interpret sensory information at many levels of representation in real time as information arrives. Building on Rumelhart's arguments, we present the Interactive Activation hypothesis-the idea that the mechanism used in perception and comprehension to achieve these feats exploits an interactive activation process implemented through the bidirectional propagation of activation among simple processing units. We then examine the interactive activation model of letter and word perception and the TRACE model of speech perception, as early attempts to explore this hypothesis, and review the experimental evidence relevant to their assumptions and predictions. We consider how well these models address the computational challenge posed by the problem of perception, and we consider how consistent they are with evidence from behavioral experiments. We examine empirical and theoretical controversies surrounding the idea of interactive processing, including a controversy that swirls around the relationship between interactive computation and optimal Bayesian inference. Some of the implementation details of early versions of interactive activation models caused deviation from optimality and from aspects of human performance data. More recent versions of these models, however, overcome these deficiencies. Among these is a model called the multinomial interactive activation model, which explicitly links interactive activation and Bayesian computations. We also review evidence from neurophysiological and neuroimaging studies supporting the view that interactive processing is a characteristic of the perceptual processing machinery in the brain. In sum, we argue that a computational analysis, as well as behavioral and neuroscience evidence, all support the Interactive Activation hypothesis. The evidence suggests that contemporary versions of models based on the idea of interactive activation continue to provide a basis for efforts to achieve a fuller understanding of the process of perception. Copyright © 2014 Cognitive Science Society, Inc.
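
    A minimal sketch of the interactive activation idea described above: bottom-up evidence excites consistent word units, word units compete and feed activation back down to their letters, and the network settles on an interpretation. The two-word lexicon, the update rule, and all parameters are illustrative assumptions, not those of the original letter/word or TRACE models.

```python
import numpy as np

words = ["cat", "car"]
letters = "acrt"
L = {c: i for i, c in enumerate(letters)}

# word -> letter connectivity (1 where the word contains the letter)
W = np.zeros((len(words), len(letters)))
for w_i, w in enumerate(words):
    for c in w:
        W[w_i, L[c]] = 1.0

letter_act = np.zeros(len(letters))
word_act = np.zeros(len(words))

# ambiguous bottom-up input: clear 'c' and 'a', weak evidence favoring 't' over 'r'
external = np.array([0.5, 0.5, 0.1, 0.2])   # order: a, c, r, t

rate, decay, inhibition = 0.2, 0.1, 0.2
for _ in range(50):
    # bottom-up: letters excite consistent words; words inhibit each other laterally
    word_in = W @ letter_act - inhibition * (word_act.sum() - word_act)
    # top-down: words feed activation back to their constituent letters
    letter_in = external + W.T @ word_act
    word_act   = np.clip(word_act   + rate * word_in   - decay * word_act,   0.0, 1.0)
    letter_act = np.clip(letter_act + rate * letter_in - decay * letter_act, 0.0, 1.0)

for w, a in zip(words, word_act):
    print(f"{w}: {a:.2f}")   # the word consistent with the stronger evidence wins
```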

  1. Improving science and mathematics education with computational modelling in interactive engagement environments

    NASA Astrophysics Data System (ADS)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  2. Computing the non-Markovian coarse-grained interactions derived from the Mori-Zwanzig formalism in molecular systems: Application to polymer melts

    NASA Astrophysics Data System (ADS)

    Li, Zhen; Lee, Hee Sun; Darve, Eric; Karniadakis, George Em

    2017-01-01

    Memory effects are often introduced during coarse-graining of a complex dynamical system. In particular, a generalized Langevin equation (GLE) for the coarse-grained (CG) system arises in the context of Mori-Zwanzig formalism. Upon a pairwise decomposition, GLE can be reformulated into its pairwise version, i.e., non-Markovian dissipative particle dynamics (DPD). GLE models the dynamics of a single coarse particle, while DPD considers the dynamics of many interacting CG particles, with both CG systems governed by non-Markovian interactions. We compare two different methods for the practical implementation of the non-Markovian interactions in GLE and DPD systems. More specifically, a direct evaluation of the non-Markovian (NM) terms is performed in LE-NM and DPD-NM models, which requires the storage of historical information that significantly increases computational complexity. Alternatively, we use a few auxiliary variables in LE-AUX and DPD-AUX models to replace the non-Markovian dynamics with a Markovian dynamics in a higher dimensional space, leading to a much reduced memory footprint and computational cost. In our numerical benchmarks, the GLE and non-Markovian DPD models are constructed from molecular dynamics (MD) simulations of star-polymer melts. Results show that a Markovian dynamics with auxiliary variables successfully generates equivalent non-Markovian dynamics consistent with the reference MD system, while maintaining a tractable computational cost. Also, transient subdiffusion of the star-polymers observed in the MD system can be reproduced by the coarse-grained models. The non-interacting particle models, LE-NM/AUX, are computationally much cheaper than the interacting particle models, DPD-NM/AUX. However, the pairwise models with momentum conservation are more appropriate for correctly reproducing the long-time hydrodynamics characterised by an algebraic decay in the velocity autocorrelation function.
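
    A minimal sketch of the auxiliary-variable (Markovian embedding) idea that the abstract contrasts with direct storage of history: a generalized Langevin equation with an exponential memory kernel K(t) = (gamma/tau)·exp(-t/tau) is rewritten as a Markovian system in an extended (v, s) space, so no history of past velocities has to be kept. The single-particle setting and all parameter values are illustrative assumptions, not the star-polymer coarse-grained models of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

m, gamma, tau, kBT = 1.0, 5.0, 0.5, 1.0   # mass, friction, memory time, temperature
dt, n_steps = 1.0e-3, 200_000

v, s = 0.0, 0.0          # velocity and auxiliary "memory force"
v2 = np.empty(n_steps)
for i in range(n_steps):
    v += (s / m) * dt
    # Ornstein-Uhlenbeck dynamics for s reproduces both the exponential friction
    # kernel and the matching (fluctuation-dissipation consistent) colored noise
    s += -((s + gamma * v) / tau) * dt \
         + (np.sqrt(2.0 * gamma * kBT) / tau) * np.sqrt(dt) * rng.normal()
    v2[i] = v * v

# equipartition check: <v^2> should approach kBT / m
print("<v^2> =", v2[n_steps // 2:].mean(), " expected:", kBT / m)
```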

  3. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
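
    For orientation only, here is a well-mixed Gillespie stochastic simulation of a birth-death process; it illustrates the kind of Monte Carlo kernel that spatial stochastic tools build on, but it deliberately omits the spatial (reaction-diffusion) dimension and the distributed execution that PyURDME and MOLNs provide, and it does not use the PyURDME API. The rates and species are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# simple birth-death system:  0 -> A  (rate k1),  A -> 0  (rate k2 * A)
k1, k2 = 10.0, 0.1
A, t, t_end = 0, 0.0, 100.0

times, counts = [t], [A]
while t < t_end:
    a1 = k1
    a2 = k2 * A
    a0 = a1 + a2
    t += rng.exponential(1.0 / a0)      # waiting time to the next reaction event
    if rng.random() * a0 < a1:          # choose which reaction fires
        A += 1
    else:
        A -= 1
    times.append(t)
    counts.append(A)

print("mean copy number over the trajectory:", np.mean(counts))
print("analytical steady-state mean k1/k2  :", k1 / k2)
```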

  4. Digital Immersive Virtual Environments and Instructional Computing

    ERIC Educational Resources Information Center

    Blascovich, Jim; Beall, Andrew C.

    2010-01-01

    This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…

  5. Methodical and technological aspects of creation of interactive computer learning systems

    NASA Astrophysics Data System (ADS)

    Vishtak, N. M.; Frolov, D. A.

    2017-01-01

    The article presents a methodology for the development of an interactive computer training system for training power plant personnel. The methods used in the work are a generalization of scientific and methodological sources on the use of computer-based training systems in vocational education, methods of system analysis, and methods of structural and object-oriented modeling of information systems. The relevance of developing interactive computer training systems for personnel preparation in educational and training centers is demonstrated. Development stages of computer training systems are identified, and factors in the efficient use of an interactive computer training system are analysed. An algorithm for the work performed at each development stage is proposed, which makes it possible to optimize the time, financial and labor expenditure required to create an interactive computer training system.

  6. Efficient quantum circuits for one-way quantum computing.

    PubMed

    Tanamoto, Tetsufumi; Liu, Yu-Xi; Hu, Xuedong; Nori, Franco

    2009-03-13

    While Ising-type interactions are ideal for implementing controlled phase flip gates in one-way quantum computing, natural interactions between solid-state qubits are most often described by either the XY or the Heisenberg models. We show an efficient way of generating cluster states directly using either the imaginary SWAP (iSWAP) gate for the XY model, or the sqrt[SWAP] gate for the Heisenberg model. Our approach thus makes one-way quantum computing more feasible for solid-state devices.
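
    For concreteness, the two entangling gates named in the abstract are written out below as 4x4 unitaries in the computational basis {|00>, |01>, |10>, |11>}, in a standard textbook form (phase conventions can differ between papers), together with quick numerical sanity checks.

```python
import numpy as np

iSWAP = np.array([
    [1, 0,  0,  0],
    [0, 0,  1j, 0],
    [0, 1j, 0,  0],
    [0, 0,  0,  1],
], dtype=complex)

sqrt_SWAP = np.array([
    [1, 0,            0,            0],
    [0, (1 + 1j) / 2, (1 - 1j) / 2, 0],
    [0, (1 - 1j) / 2, (1 + 1j) / 2, 0],
    [0, 0,            0,            1],
], dtype=complex)

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

# sanity checks: both gates are unitary, and sqrt(SWAP) squared gives SWAP
assert np.allclose(iSWAP @ iSWAP.conj().T, np.eye(4))
assert np.allclose(sqrt_SWAP @ sqrt_SWAP.conj().T, np.eye(4))
assert np.allclose(sqrt_SWAP @ sqrt_SWAP, SWAP)
print("iSWAP and sqrt(SWAP) verified unitary; sqrt(SWAP)^2 == SWAP")
```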

  7. Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2013-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development. Currently there is no fully coupled computational tool to analyze this fluid/structure interaction process. The objective of this study was to develop a fully coupled aeroelastic modeling capability to describe the fluid/structure interaction process during the transient nozzle operations. The aeroelastic model composes of three components: the computational fluid dynamics component based on an unstructured-grid, pressure-based computational fluid dynamics formulation, the computational structural dynamics component developed in the framework of modal analysis, and the fluid-structural interface component. The developed aeroelastic model was applied to the transient nozzle startup process of the Space Shuttle Main Engine at sea level. The computed nozzle side loads and the axial nozzle wall pressure profiles from the aeroelastic nozzle are compared with those of the published rigid nozzle results, and the impact of the fluid/structure interaction on nozzle side loads is interrogated and presented.

  8. Prediction of the structure of fuel sprays in gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Shuen, J. S.

    1985-01-01

    The structure of fuel sprays in a combustion chamber is theoretically investigated using computer models of current interest. Three representative spray models are considered: (1) a locally homogeneous flow (LHF) model, which assumes infinitely fast interphase transport rates; (2) a deterministic separated flow (DSF) model, which considers finite rates of interphase transport but ignores effects of droplet/turbulence interactions; and (3) a stochastic separated flow (SSF) model, which considers droplet/turbulence interactions using random sampling for turbulence properties in conjunction with random-walk computations for droplet motion and transport. Two flow conditions are studied to investigate the influence of swirl on droplet life histories and the effects of droplet/turbulence interactions on flow properties. Comparison of computed results with the experimental data show that general features of the flow structure can be predicted with reasonable accuracy using the two separated flow models. In contrast, the LHF model overpredicts the rate of development of the flow. While the SSF model provides better agreement with measurements than the DSF model, definitive evaluation of the significance of droplet/turbulence interaction is not achieved due to uncertainties in the spray initial conditions.
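
    A toy sketch of the stochastic separated flow (SSF) ingredient described above: a droplet relaxes toward the gas velocity it currently "sees", and that velocity is resampled from the turbulence statistics whenever the droplet outlives its current eddy. The Stokes-drag form, the fixed eddy lifetime, and all numerical values are illustrative assumptions, not the models or operating conditions of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

U_gas   = 10.0    # mean gas velocity, m/s
k_turb  = 1.5     # turbulence kinetic energy, m^2/s^2
t_eddy  = 2.0e-3  # assumed eddy lifetime, s
tau_p   = 1.0e-3  # droplet velocity relaxation (response) time, s
dt      = 1.0e-4
n_steps = 400

x, v = 0.0, 0.0
sigma = np.sqrt(2.0 * k_turb / 3.0)    # rms of one fluctuating velocity component

u_seen = U_gas + sigma * rng.normal()  # instantaneous gas velocity seen by the droplet
t_in_eddy = 0.0
for _ in range(n_steps):
    # resample the fluctuation when the droplet has outlived the current eddy
    if t_in_eddy > t_eddy:
        u_seen = U_gas + sigma * rng.normal()
        t_in_eddy = 0.0
    v += (u_seen - v) / tau_p * dt     # linear (Stokes) drag toward the local gas velocity
    x += v * dt
    t_in_eddy += dt

print(f"droplet after {n_steps * dt * 1e3:.1f} ms: x = {x:.3f} m, v = {v:.2f} m/s")
```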

  9. Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)

    2002-01-01

    SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.

  10. Viscous-inviscid interaction method including wake effects for three-dimensional wing-body configurations

    NASA Technical Reports Server (NTRS)

    Streett, C. L.

    1981-01-01

    A viscous-inviscid interaction method has been developed by using a three-dimensional integral boundary-layer method which produces results in good agreement with a finite-difference method in a fraction of the computer time. The integral method is stable and robust and incorporates a model for computation in a small region of streamwise separation. A locally two-dimensional wake model, accounting for thickness and curvature effects, is also included in the interaction procedure. Computation time spent in converging an interacted result is, many times, only slightly greater than that required to converge an inviscid calculation. Results are shown from the interaction method, run at experimental angle of attack, Reynolds number, and Mach number, on a wing-body test case for which viscous effects are large. Agreement with experiment is good; in particular, the present wake model improves prediction of the spanwise lift distribution and lower surface cove pressure.

  11. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review

    PubMed Central

    McClelland, James L.

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
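
    A small numerical check of the specific claim above that softmax units compute exact Bayesian posteriors when their biases and weighted inputs are logarithms of the appropriate probabilities; the prior and likelihood values below are arbitrary.

```python
import numpy as np

prior = np.array([0.7, 0.2, 0.1])            # P(hypothesis), e.g. P(word)
likelihood = np.array([0.05, 0.30, 0.20])    # P(observed features | hypothesis)

# direct Bayes rule
posterior = prior * likelihood
posterior /= posterior.sum()

# softmax over (bias + input) with bias = log prior and input = log likelihood
logits = np.log(prior) + np.log(likelihood)
softmax = np.exp(logits - logits.max())      # subtract max for numerical stability
softmax /= softmax.sum()

print("Bayes   :", posterior)
print("softmax :", softmax)
assert np.allclose(posterior, softmax)
```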

  12. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review.

    PubMed

    McClelland, James L

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered.

  13. Computational modelling of biomaterial surface interactions with blood platelets and osteoblastic cells for the prediction of contact osteogenesis.

    PubMed

    Amor, N; Geris, L; Vander Sloten, J; Van Oosterwyck, H

    2011-02-01

    Surface microroughness can induce contact osteogenesis (bone formation initiated at the implant surface) around oral implants, which may result from different mechanisms, such as blood platelet-biomaterial interactions and/or interaction with (pre-)osteoblast cells. We have developed a computational model of implant endosseous healing that takes into account these interactions. We hypothesized that the initial attachment and growth factor release from activated platelets are crucial in achieving contact osteogenesis. In order to investigate this, a computational model was applied to an animal experiment [7] that looked at the effect of surface microroughness on endosseous healing. Surface-specific model parameters were implemented based on in vitro data (Lincks et al., Biomaterials 1998;19:2219-32). The predicted spatio-temporal patterns of bone formation correlated with the histological data. It was found that contact osteogenesis could not be predicted if only the osteogenic response of cells was up-regulated by surface microroughness. This could only be achieved if platelet-biomaterial interactions were sufficiently up-regulated as well. These results confirmed our hypothesis and demonstrate the added value of the computational model to study the importance of surface-mediated events for peri-implant endosseous healing. Copyright © 2010 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  14. On Roles of Models in Information Systems

    NASA Astrophysics Data System (ADS)

    Sølvberg, Arne

    The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.

  15. Brain-Computer Interfaces: A Neuroscience Paradigm of Social Interaction? A Matter of Perspective

    PubMed Central

    Mattout, Jérémie

    2012-01-01

    A number of recent studies have put human subjects in true social interactions, with the aim of better identifying the psychophysiological processes underlying social cognition. Interestingly, this emerging Neuroscience of Social Interactions (NSI) field brings up challenges which resemble important ones in the field of Brain-Computer Interfaces (BCI). Importantly, these challenges go beyond common objectives such as the eventual use of BCI and NSI protocols in the clinical domain or common interests pertaining to the use of online neurophysiological techniques and algorithms. Common fundamental challenges are now apparent and one can argue that a crucial one is to develop computational models of brain processes relevant to human interactions with an adaptive agent, whether human or artificial. Coupled with neuroimaging data, such models have proved promising in revealing the neural basis and mental processes behind social interactions. Similar models could help BCI to move from well-performing but offline static machines to reliable online adaptive agents. This emphasizes a social perspective to BCI, which is not limited to a computational challenge but extends to all questions that arise when studying the brain in interaction with its environment. PMID:22675291

  16. A COMPUTER MODELING STUDY OF BINDING PROPERTIES OF CHIRAL NUCLEOPEPTIDE FOR BIOMEDICAL APPLICATIONS.

    PubMed

    Pirtskhalava, M; Egoyan, A; Mirtskhulava, M; Roviello, G

    2017-12-01

    Nucleopeptides often show interesting properties of molecular binding that render them good candidates for development of innovative drugs for anticancer and antiviral therapies. In this work we present results of computer modeling of interactions between the molecules of hexathymine nucleopeptide (T6) and poly rA RNA (A18). The results of geometry optimization, calculated using Hyperchem software and our own computer program for molecular docking, show that the molecules form stable complexes due to complementary nucleobase interactions and the electrostatic interaction between the negatively charged phosphate groups of poly rA and the positively charged residues present in the cationic nucleopeptide structure. Computer modeling makes it possible to find the optimal binding configuration of the molecules of a nucleopeptide and poly rA RNA and to estimate the binding energy between the molecules.

  17. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  18. Climate Models

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.

    2012-01-01

    Climate models is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric "climate models" to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation, atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.

  19. Experiment and simulation for CSI: What are the missing links?

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith; Park, K. C.

    1989-01-01

    Viewgraphs on experiment and simulation for control structure interaction (CSI) are presented. Topics covered include: control structure interaction; typical control/structure interaction system; CSI problem classification; actuator/sensor models; modeling uncertainty; noise models; real-time computations; and discrete versus continuous.

  20. Flow and Turbulence Modeling and Computation of Shock Buffet Onset for Conventional and Supercritical Airfoils

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.

    1998-01-01

    Flow and turbulence models applied to the problem of shock buffet onset are studied. The accuracy of the interactive boundary layer and the thin-layer Navier-Stokes equations solved with recent upwind techniques using similar transport field equation turbulence models is assessed for standard steady test cases, including conditions having significant shock separation. The two methods are found to compare well in the shock buffet onset region of a supercritical airfoil that involves strong trailing-edge separation. A computational analysis using the interactive-boundary layer has revealed a Reynolds scaling effect in the shock buffet onset of the supercritical airfoil, which compares well with experiment. The methods are next applied to a conventional airfoil. Steady shock-separated computations of the conventional airfoil with the two methods compare well with experiment. Although the interactive boundary layer computations in the shock buffet region compare well with experiment for the conventional airfoil, the thin-layer Navier-Stokes computations do not. These findings are discussed in connection with possible mechanisms important in the onset of shock buffet and the constraints imposed by current numerical modeling techniques.

  1. Application of a global solar wind/planetary obstacle interaction computational model: Earth, Venus, Mars, Jupiter and Saturn studies

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.

    1984-01-01

    The investigations undertaken in this report relate to studies of various solar wind interaction phenomena with Venus, Earth, Mars, Jupiter and Saturn. A computational model is developed for the determination of the detailed plasma and magnetic field properties associated with various planetary obstacles throughout the solar system.

  2. Metabolic Network Modeling for Computer-Aided Design of Microbial Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Hyun-Seob; Nelson, William C.; Lee, Joon-Yong

    Interest in applying microbial communities to biotechnology continues to increase. Successful engineering of microbial communities requires a fundamental shift in focus from enhancing metabolic capabilities in individual organisms to promoting synergistic interspecies interactions. This goal necessitates in silico tools that provide a predictive understanding of how microorganisms interact with each other and their environments. In this regard, we highlight a need for a new concept that we have termed biological computer-aided design of interactions (BioCADi). We ground this discussion within the context of metabolic network modeling.

  3. Tertiary structure-based analysis of microRNA–target interactions

    PubMed Central

    Gan, Hin Hark; Gunsalus, Kristin C.

    2013-01-01

    Current computational analysis of microRNA interactions is based largely on primary and secondary structure analysis. Computationally efficient tertiary structure-based methods are needed to enable more realistic modeling of the molecular interactions underlying miRNA-mediated translational repression. We incorporate algorithms for predicting duplex RNA structures, ionic strength effects, duplex entropy and free energy, and docking of duplex–Argonaute protein complexes into a pipeline to model and predict miRNA–target duplex binding energies. To ensure modeling accuracy and computational efficiency, we use an all-atom description of RNA and a continuum description of ionic interactions using the Poisson–Boltzmann equation. Our method predicts the conformations of two constructs of Caenorhabditis elegans let-7 miRNA–target duplexes to within ∼3.8 Å root-mean-square distance of their NMR structures. We also show that the computed duplex formation enthalpies, entropies, and free energies for eight miRNA–target duplexes agree with titration calorimetry data. Analysis of duplex–Argonaute docking shows that structural distortions arising from single-base-pair mismatches in the seed region influence the activity of the complex by destabilizing both duplex hybridization and its association with Argonaute. Collectively, these results demonstrate that tertiary structure-based modeling of miRNA interactions can reveal structural mechanisms not accessible with current secondary structure-based methods. PMID:23417009

  4. Highly Efficient Computation of the Basal kon using Direct Simulation of Protein-Protein Association with Flexible Molecular Models.

    PubMed

    Saglam, Ali S; Chong, Lillian T

    2016-01-14

    An essential baseline for determining the extent to which electrostatic interactions enhance the kinetics of protein-protein association is the "basal" kon, which is the rate constant for association in the absence of electrostatic interactions. However, since such association events occur beyond the millisecond timescale, it has not been practical to compute the basal kon by directly simulating the association with flexible models. Here, we computed the basal kon for barnase and barstar, two of the most rapidly associating proteins, using highly efficient, flexible molecular simulations. These simulations involved (a) pseudoatomic protein models that reproduce the molecular shapes and the electrostatic and diffusion properties of all-atom models, and (b) application of the weighted ensemble path sampling strategy, which enhanced the efficiency of generating association events by >130-fold. We also examined the extent to which the computed basal kon is affected by inclusion of intermolecular hydrodynamic interactions in the simulations.
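
    A toy sketch of the weighted ensemble idea mentioned above: many weighted walkers are propagated in parallel and periodically split or merged within bins of a progress coordinate, so rare progress toward the "bound" state is retained rather than lost. The 1D dynamics, the equal-weight-per-bin resampling rule, and all parameters are illustrative assumptions, not the protein models or the exact split/merge protocol of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins, walkers_per_bin, n_cycles = 20, 4, 200
bin_edges = np.linspace(0.0, 1.0, n_bins + 1)

def propagate(x):
    # toy diffusion with a weak drift away from the target state at x = 1
    return np.clip(x + rng.normal(-0.002, 0.02, size=x.shape), 0.0, 1.0)

# start all walkers at x = 0 with equal weights summing to 1
xs = np.zeros(n_bins * walkers_per_bin)
ws = np.full_like(xs, 1.0 / xs.size)

for _ in range(n_cycles):
    xs = propagate(xs)
    new_x, new_w = [], []
    which = np.digitize(xs, bin_edges[1:-1])
    for b in np.unique(which):
        sel = np.where(which == b)[0]
        w_bin = ws[sel].sum()
        # split/merge: draw a fixed number of walkers per occupied bin,
        # proportionally to weight, and share the bin weight equally among them
        picks = rng.choice(sel, size=walkers_per_bin, p=ws[sel] / w_bin)
        new_x.extend(xs[picks])
        new_w.extend([w_bin / walkers_per_bin] * walkers_per_bin)
    xs, ws = np.array(new_x), np.array(new_w)

# total statistical weight that has reached the last bin (a stand-in for "association")
print("weight near the target state:", ws[xs > bin_edges[-2]].sum())
```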

  5. Computer-Aided Geometry Modeling

    NASA Technical Reports Server (NTRS)

    Shoosmith, J. N. (Compiler); Fulton, R. E. (Compiler)

    1984-01-01

    Techniques in computer-aided geometry modeling and their application are addressed. Mathematical modeling, solid geometry models, management of geometric data, development of geometry standards, and interactive and graphic procedures are discussed. The applications include aeronautical and aerospace structures design, fluid flow modeling, and gas turbine design.

  6. Design Science in Human-Computer Interaction: A Model and Three Examples

    ERIC Educational Resources Information Center

    Prestopnik, Nathan R.

    2013-01-01

    Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

  7. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  8. An accurate binding interaction model in de novo computational protein design of interactions: if you build it, they will bind.

    PubMed

    London, Nir; Ambroggio, Xavier

    2014-02-01

    Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Biomolecular electrostatics and solvation: a computational perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Pengyu; Chun, Jaehun; Thomas, Dennis G.

    2012-11-01

    An understanding of molecular interactions is essential for insight into biological systems at the molecular scale. Among the various components of molecular interactions, electrostatics are of special importance because of their long-range nature and their influence on polar or charged molecules, including water, aqueous ions, proteins, nucleic acids, carbohydrates, and membrane lipids. In particular, robust models of electrostatic interactions are essential for understanding the solvation properties of biomolecules and the effects of solvation upon biomolecular folding, binding, enzyme catalysis and dynamics. Electrostatics, therefore, are of central importance to understanding biomolecular structure and modeling interactions within and among biological molecules. This review discusses the solvation of biomolecules with a computational biophysics view towards describing the phenomenon. While our main focus lies on the computational aspect of the models, we summarize the common characteristics of biomolecular solvation (e.g., solvent structure, polarization, ion binding, and nonpolar behavior) in order to provide reasonable backgrounds to understand the solvation models.

  10. Optimal structure and parameter learning of Ising models

    DOE PAGES

    Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant; ...

    2018-03-16

    Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. Here, the efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.
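
    For context, the per-spin objective minimized by the interaction screening estimator can be written as follows (a standard statement of the estimator in our own notation, typically augmented with an l1 penalty on the couplings):

        \mathcal{S}_i(\boldsymbol{\theta}_i) = \frac{1}{M} \sum_{m=1}^{M} \exp\!\left( -\sigma_i^{(m)} \Big( \sum_{j \neq i} \theta_{ij} \, \sigma_j^{(m)} + \theta_i \Big) \right)

    where the M binary samples \sigma^{(m)} are the observed spin configurations and \boldsymbol{\theta}_i collects the couplings and local field of spin i; minimizing this convex objective independently for each spin recovers that spin's neighborhood.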

  11. Optimal structure and parameter learning of Ising models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant

    Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. Here, the efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.

  12. Biomolecular electrostatics and solvation: a computational perspective

    PubMed Central

    Ren, Pengyu; Chun, Jaehun; Thomas, Dennis G.; Schnieders, Michael J.; Marucho, Marcelo; Zhang, Jiajing; Baker, Nathan A.

    2012-01-01

    An understanding of molecular interactions is essential for insight into biological systems at the molecular scale. Among the various components of molecular interactions, electrostatics are of special importance because of their long-range nature and their influence on polar or charged molecules, including water, aqueous ions, proteins, nucleic acids, carbohydrates, and membrane lipids. In particular, robust models of electrostatic interactions are essential for understanding the solvation properties of biomolecules and the effects of solvation upon biomolecular folding, binding, enzyme catalysis, and dynamics. Electrostatics, therefore, are of central importance to understanding biomolecular structure and modeling interactions within and among biological molecules. This review discusses the solvation of biomolecules with a computational biophysics view towards describing the phenomenon. While our main focus lies on the computational aspect of the models, we provide an overview of the basic elements of biomolecular solvation (e.g., solvent structure, polarization, ion binding, and nonpolar behavior) in order to provide a background to understand the different types of solvation models. PMID:23217364

  13. Biomolecular electrostatics and solvation: a computational perspective.

    PubMed

    Ren, Pengyu; Chun, Jaehun; Thomas, Dennis G; Schnieders, Michael J; Marucho, Marcelo; Zhang, Jiajing; Baker, Nathan A

    2012-11-01

    An understanding of molecular interactions is essential for insight into biological systems at the molecular scale. Among the various components of molecular interactions, electrostatics are of special importance because of their long-range nature and their influence on polar or charged molecules, including water, aqueous ions, proteins, nucleic acids, carbohydrates, and membrane lipids. In particular, robust models of electrostatic interactions are essential for understanding the solvation properties of biomolecules and the effects of solvation upon biomolecular folding, binding, enzyme catalysis, and dynamics. Electrostatics, therefore, are of central importance to understanding biomolecular structure and modeling interactions within and among biological molecules. This review discusses the solvation of biomolecules with a computational biophysics view toward describing the phenomenon. While our main focus lies on the computational aspect of the models, we provide an overview of the basic elements of biomolecular solvation (e.g. solvent structure, polarization, ion binding, and non-polar behavior) in order to provide a background to understand the different types of solvation models.

  14. Exploring Biomolecular Recognition by Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Wade, Rebecca

    2007-12-01

    Biomolecular recognition is complex. The balance between the different molecular properties that contribute to molecular recognition, such as shape, electrostatics, dynamics and entropy, varies from case to case. This, along with the extent of experimental characterization, influences the choice of appropriate computational approaches to study biomolecular interactions. I will present computational studies in which we aim to make concerted use of bioinformatics, biochemical network modeling and molecular simulation techniques to study protein-protein and protein-small molecule interactions and to facilitate computer-aided drug design.

  15. The study of early human embryos using interactive 3-dimensional computer reconstructions.

    PubMed

    Scarborough, J; Aiton, J F; McLachlan, J C; Smart, S D; Whiten, S C

    1997-07-01

    Tracings of serial histological sections from 4 human embryos at different Carnegie stages were used to create 3-dimensional (3D) computer models of the developing heart. The models were constructed using commercially available software developed for graphic design and the production of computer generated virtual reality environments. They are available as interactive objects which can be downloaded via the World Wide Web. This simple method of 3D reconstruction offers significant advantages for understanding important events in morphological sciences.

  16. Vehicle - Bridge interaction, comparison of two computing models

    NASA Astrophysics Data System (ADS)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A multi-body plane computing model of the vehicle is adopted. The bridge computing models are created in two variants: one represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the other represents the bridge as a lumped-mass model with 1 degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models, and the results are mutually compared and quantitatively evaluated.
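
    In their simplest textbook forms, the two bridge idealizations being compared reduce to a Bernoulli-Euler beam under a moving load and a single-degree-of-freedom lumped-mass oscillator (generic equations given for orientation, not the authors' exact formulation):

        EI \, \frac{\partial^{4} w(x,t)}{\partial x^{4}} + \mu \, \frac{\partial^{2} w(x,t)}{\partial t^{2}} = P \, \delta(x - vt),
        \qquad
        m \, \ddot{w}(t) + k \, w(t) = F(t)

    with EI the bending stiffness, \mu the mass per unit length, P the moving load travelling at speed v, and m, k the lumped mass and stiffness of the one-degree-of-freedom model.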

  17. Topological phases in the Haldane model with spin–spin on-site interactions

    NASA Astrophysics Data System (ADS)

    Rubio-García, A.; García-Ripoll, J. J.

    2018-04-01

    Ultracold atom experiments allow the study of topological insulators, such as the non-interacting Haldane model. In this work we study a generalization of the Haldane model with spin–spin on-site interactions that can be implemented on such experiments. We focus on measuring the winding number, a topological invariant, of the ground state, which we compute using a mean-field calculation that effectively captures long-range correlations and a matrix product state computation in a lattice with 64 sites. Our main result is that we show how the topological phases present in the non-interacting model survive until the interactions are comparable to the kinetic energy. We also demonstrate the accuracy of our mean-field approach in efficiently capturing long-range correlations. Based on state-of-the-art ultracold atom experiments, we propose an implementation of our model that can give information about the topological phases.
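
    For a two-band lattice Hamiltonian of the form H(\mathbf{k}) = \mathbf{d}(\mathbf{k}) \cdot \boldsymbol{\sigma}, such as the non-interacting Haldane model, the topological invariant referred to above is commonly evaluated as (standard expression, not specific to this paper):

        \nu = \frac{1}{4\pi} \int_{\mathrm{BZ}} d^{2}k \;\, \hat{\mathbf{d}}(\mathbf{k}) \cdot \left( \partial_{k_x} \hat{\mathbf{d}}(\mathbf{k}) \times \partial_{k_y} \hat{\mathbf{d}}(\mathbf{k}) \right)

    where \hat{\mathbf{d}} = \mathbf{d}/|\mathbf{d}| and the integral runs over the Brillouin zone.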

  18. A Computer Model for Red Blood Cell Chemistry

    DTIC Science & Technology

    1996-10-01

    There is a growing need for interactive computational tools for medical education and research. The most exciting … paradigm for interactive education is simulation. Fluid Mod is a simulation-based computational tool developed in the late sixties and early seventies at … to a modern Windows, object-oriented interface. This development will provide students with a useful computational tool for learning. More important … [abstract truncated]

  19. Intelligent Context-Aware and Adaptive Interface for Mobile LBS

    PubMed Central

    Liu, Yanhong

    2015-01-01

    Context-aware user interfaces play an important role in many human-computer interaction tasks of location based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment where location based services users are impeded by device limitations. Better context-aware human-computer interaction models of mobile location based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes at work in spatial queries, which will in turn inform the detailed design of better user interfaces in mobile location based services. In this study, a context-aware adaptive model for the mobile location based services interface is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model, we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model handles users' demands in a complicated environment and demonstrate its feasibility through experimental results. PMID:26457077

  20. Computational biology of RNA interactions.

    PubMed

    Dieterich, Christoph; Stadler, Peter F

    2013-01-01

    The biodiversity of the RNA world has been underestimated for decades. RNA molecules are key building blocks, sensors, and regulators of modern cells. The biological function of RNA molecules cannot be separated from their ability to bind to and interact with a wide space of chemical species, including small molecules, nucleic acids, and proteins. Computational chemists, physicists, and biologists have developed a rich tool set for modeling and predicting RNA interactions. These interactions are to some extent determined by the binding conformation of the RNA molecule. RNA binding conformations are approximated with often acceptable accuracy by sequence and secondary structure motifs. Secondary structure ensembles of a given RNA molecule can be efficiently computed in many relevant situations by employing a standard energy model for base pair interactions and dynamic programming techniques. The case of bi-molecular RNA-RNA interactions can be seen as an extension of this approach. However, unbiased transcriptome-wide scans for local RNA-RNA interactions are computationally challenging yet become efficient if the binding motif/mode is known and other external information can be used to confine the search space. Computational methods are less developed for proteins and small molecules, which bind to RNA with very high specificity. Binding descriptors of proteins are usually determined by in vitro high-throughput assays (e.g., microarrays or sequencing). Intriguingly, recent experimental advances, which are mostly based on light-induced cross-linking of binding partners, render in vivo binding patterns accessible yet require new computational methods for careful data interpretation. The grand challenge is to model the in vivo situation where a complex interplay of RNA binders competes for the same target RNA molecule. Evidently, bioinformaticians are just catching up with the impressive pace of these developments. Copyright © 2012 John Wiley & Sons, Ltd.
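
    The dynamic-programming idea behind such secondary-structure computations can be illustrated with a minimal Nussinov-style maximum base-pairing recursion; the Python sketch below is a textbook simplification (production tools of the kind discussed in the review use full nearest-neighbour energy models and partition functions rather than pair counts):

        # Nussinov-style dynamic program: maximum number of nested base pairs.
        # Textbook sketch for illustration; not a production RNA-folding tool.
        def can_pair(a, b):
            return {a, b} in ({'A', 'U'}, {'G', 'C'}, {'G', 'U'})

        def nussinov(seq, min_loop=3):
            n = len(seq)
            N = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):        # fill by increasing subsequence length
                for i in range(n - span):
                    j = i + span
                    best = max(N[i + 1][j], N[i][j - 1])       # i or j left unpaired
                    if can_pair(seq[i], seq[j]):
                        best = max(best, N[i + 1][j - 1] + 1)  # i pairs with j
                    for k in range(i + 1, j):                  # bifurcation into two substructures
                        best = max(best, N[i][k] + N[k + 1][j])
                    N[i][j] = best
            return N[0][n - 1]

        print(nussinov("GGGAAAUCCC"))  # maximum nested pairs for a toy hairpin sequence

    The same table-filling structure, with free energies in place of pair counts and partition-function sums in place of maxima, underlies the ensemble calculations mentioned above.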

  1. Bounding the electrostatic free energies associated with linear continuum models of molecular solvation.

    PubMed

    Bardhan, Jaydeep P; Knepley, Matthew G; Anitescu, Mihai

    2009-03-14

    The importance of electrostatic interactions in molecular biology has driven extensive research toward the development of accurate and efficient theoretical and computational models. Linear continuum electrostatic theory has been surprisingly successful, but the computational costs associated with solving the associated partial differential equations (PDEs) preclude the theory's use in most dynamical simulations. Modern generalized-Born models for electrostatics can reproduce PDE-based calculations to within a few percent and are extremely computationally efficient but do not always faithfully reproduce interactions between chemical groups. Recent work has shown that a boundary-integral-equation formulation of the PDE problem leads naturally to a new approach called boundary-integral-based electrostatics estimation (BIBEE) to approximate electrostatic interactions. In the present paper, we prove that the BIBEE method can be used to rigorously bound the actual continuum-theory electrostatic free energy. The bounds are validated using a set of more than 600 proteins. Detailed numerical results are presented for structures of the peptide met-enkephalin taken from a molecular-dynamics simulation. These bounds, in combination with our demonstration that the BIBEE methods accurately reproduce pairwise interactions, suggest a new approach toward building a highly accurate yet computationally tractable electrostatic model.
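
    The generalized-Born approximation mentioned above is typically written in the following Still-type form, quoted here for orientation rather than taken from the paper:

        \Delta G_{\mathrm{elec}} \approx -\frac{1}{2} \left( \frac{1}{\epsilon_{\mathrm{in}}} - \frac{1}{\epsilon_{\mathrm{out}}} \right) \sum_{i,j} \frac{q_i q_j}{f_{\mathrm{GB}}(r_{ij})},
        \qquad
        f_{\mathrm{GB}}(r_{ij}) = \sqrt{ r_{ij}^{2} + R_i R_j \exp\!\left( -\frac{r_{ij}^{2}}{4 R_i R_j} \right) }

    where q_i are atomic partial charges, R_i effective Born radii, and \epsilon_{\mathrm{in}}, \epsilon_{\mathrm{out}} the solute and solvent dielectric constants.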

  2. Bounding the electrostatic free energies associated with linear continuum models of molecular solvation

    NASA Astrophysics Data System (ADS)

    Bardhan, Jaydeep P.; Knepley, Matthew G.; Anitescu, Mihai

    2009-03-01

    The importance of electrostatic interactions in molecular biology has driven extensive research toward the development of accurate and efficient theoretical and computational models. Linear continuum electrostatic theory has been surprisingly successful, but the computational costs associated with solving the associated partial differential equations (PDEs) preclude the theory's use in most dynamical simulations. Modern generalized-Born models for electrostatics can reproduce PDE-based calculations to within a few percent and are extremely computationally efficient but do not always faithfully reproduce interactions between chemical groups. Recent work has shown that a boundary-integral-equation formulation of the PDE problem leads naturally to a new approach called boundary-integral-based electrostatics estimation (BIBEE) to approximate electrostatic interactions. In the present paper, we prove that the BIBEE method can be used to rigorously bound the actual continuum-theory electrostatic free energy. The bounds are validated using a set of more than 600 proteins. Detailed numerical results are presented for structures of the peptide met-enkephalin taken from a molecular-dynamics simulation. These bounds, in combination with our demonstration that the BIBEE methods accurately reproduce pairwise interactions, suggest a new approach toward building a highly accurate yet computationally tractable electrostatic model.

  3. Users matter : multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.

  4. Analysis of whisker-toughened CMC structural components using an interactive reliability model

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Palko, Joseph L.

    1992-01-01

    Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic Willam-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.

  5. Aerodynamic Interaction Effects of a Helicopter Rotor and Fuselage

    NASA Technical Reports Server (NTRS)

    Boyd, David D., Jr.

    1999-01-01

    A three-year effort, conducted under Cooperative Research Agreements made in each of the three years between the Subsonic Aerodynamics Branch of the NASA Langley Research Center and the Virginia Polytechnic Institute and State University (Va. Tech), has been completed. This document presents results from this three-year endeavor. The goal of creating an efficient method to compute unsteady interactional effects between a helicopter rotor and fuselage has been accomplished. This paper also includes appendices to support these findings. The topics are: 1) Rotor-Fuselage Interactions Aerodynamics: An Unsteady Rotor Model; and 2) Rotor/Fuselage Unsteady Interactional Aerodynamics: A New Computational Model.

  6. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer aided design program used to perform systems level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design, analysis of integrated spacecraft, and automatic spacecraft modeling for lattice structures. Capabilities and performance of the multidiscipline applications modules, the executive and data management software, and graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run-time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  7. Education, Information Technology and Cognitive Science.

    ERIC Educational Resources Information Center

    Scaife, M.

    1989-01-01

    Discusses information technology and its effects on developmental psychology and children's education. Topics discussed include a theory of child-computer interaction (CCI); programing; communication and computers, including electronic mail; cognitive science; artificial intelligence; modeling the user-system interaction; and the future of…

  8. Computationally Efficient Multiconfigurational Reactive Molecular Dynamics

    PubMed Central

    Yamashita, Takefumi; Peng, Yuxing; Knight, Chris; Voth, Gregory A.

    2012-01-01

    It is a computationally demanding task to explicitly simulate the electronic degrees of freedom in a system to observe the chemical transformations of interest, while at the same time sampling the time and length scales required to converge statistical properties and thus reduce artifacts due to initial conditions, finite-size effects, and limited sampling. One solution that significantly reduces the computational expense consists of molecular models in which effective interactions between particles govern the dynamics of the system. If the interaction potentials in these models are developed to reproduce calculated properties from electronic structure calculations and/or ab initio molecular dynamics simulations, then one can calculate accurate properties at a fraction of the computational cost. Multiconfigurational algorithms model the system as a linear combination of several chemical bonding topologies to simulate chemical reactions, also sometimes referred to as “multistate”. These algorithms typically utilize energy and force calculations already found in popular molecular dynamics software packages, thus facilitating their implementation without significant changes to the structure of the code. However, the evaluation of energies and forces for several bonding topologies per simulation step can lead to poor computational efficiency if redundancy is not efficiently removed, particularly with respect to the calculation of long-ranged Coulombic interactions. This paper presents accurate approximations (effective long-range interaction and resulting hybrid methods) and multiple-program parallelization strategies for the efficient calculation of electrostatic interactions in reactive molecular simulations. PMID:25100924

  9. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    ERIC Educational Resources Information Center

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  10. A New Computational Tool for Understanding Light-Matter Interactions

    DTIC Science & Technology

    2016-02-11

    Plasmonic resonance of a metallic nanostructure results from coherent motion of its conduction electrons driven by … Keywords: plasmonics, light-matter interaction, time-dependent density functional theory, modeling and … Final Report: A New Computational Tool For Understanding Light-Matter Interactions. [report documentation page fragments; abstract truncated]

  11. An experimental and numerical investigation of shock-wave induced turbulent boundary-layer separation at hypersonic speeds

    NASA Technical Reports Server (NTRS)

    Marvin, J. G.; Horstman, C. C.; Rubesin, M. W.; Coakley, T. J.; Kussoy, M. I.

    1975-01-01

    An experiment designed to test and guide computations of the interaction of an impinging shock wave with a turbulent boundary layer is described. Detailed mean flow-field and surface data are presented for two shock strengths which resulted in attached and separated flows, respectively. Numerical computations, employing the complete time-averaged Navier-Stokes equations along with algebraic eddy-viscosity and turbulent Prandtl number models to describe shear stress and heat flux, are used to illustrate the dependence of the computations on the particulars of the turbulence models. Models appropriate for zero-pressure-gradient flows predicted the overall features of the flow fields, but were deficient in predicting many of the details of the interaction regions. Improvements to the turbulence model parameters were sought through a combination of detailed data analysis and computer simulations which tested the sensitivity of the solutions to model parameter changes. Computer simulations using these improvements are presented and discussed.

  12. Spectral method for a kinetic swarming model

    DOE PAGES

    Gamba, Irene M.; Haack, Jeffrey R.; Motsch, Sebastien

    2015-04-28

    Here we present the first numerical method for a kinetic description of the Vicsek swarming model. The kinetic model poses a unique challenge, as there is a distribution dependent collision invariant to satisfy when computing the interaction term. We use a spectral representation linked with a discrete constrained optimization to compute these interactions. To test the numerical scheme we investigate the kinetic model at different scales and compare the solution with the microscopic and macroscopic descriptions of the Vicsek model. Lastly, we observe that the kinetic model captures key features such as vortex formation and traveling waves.
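
    The microscopic Vicsek rule that such kinetic descriptions coarse-grain is, in its standard discrete-time form (included for context; this is the particle model, not the kinetic equation studied in the paper):

        \theta_i(t+\Delta t) = \mathrm{Arg}\!\left[ \sum_{j:\, |\mathbf{x}_j - \mathbf{x}_i| < r} e^{\, \mathrm{i}\theta_j(t)} \right] + \eta_i(t),
        \qquad
        \mathbf{x}_i(t+\Delta t) = \mathbf{x}_i(t) + v_0 \, \Delta t \, \big( \cos\theta_i(t+\Delta t), \; \sin\theta_i(t+\Delta t) \big)

    where each particle moves at fixed speed v_0, aligns its heading with the average heading of neighbours within radius r, and \eta_i is uniform angular noise.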

  13. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  14. Human-computer interaction in multitask situations

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.

    1977-01-01

    Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.

  15. Systems Concepts and Computer-Managed Instruction: An Implementation and Validation Study.

    ERIC Educational Resources Information Center

    Dick, Walter; Gallagher, Paul

    The Florida State model of computer-managed instruction (CMI) differs from other such models in that it assumes a student will achieve his maximum performance level by interacting directly with the computer in order to evaluate his learning experience. In this system the computer plays the role of real-time diagnostician and prescriber for the…

  16. Evaluation of Ground Vibrations Induced by Military Noise Sources

    DTIC Science & Technology

    2006-08-01

    Task 2: Determine the acoustic-to-seismic coupling coefficients C1 and C2. Task 3: Computational modeling of acoustically induced ground motion, including a simple model of blast sound interaction with the ground under varying ground conditions. [table-of-contents fragments; remainder of abstract truncated]

  17. AA9int: SNP Interaction Pattern Search Using Non-Hierarchical Additive Model Set.

    PubMed

    Lin, Hui-Yi; Huang, Po-Yu; Chen, Dung-Tsa; Tung, Heng-Yuan; Sellers, Thomas A; Pow-Sang, Julio; Eeles, Rosalind; Easton, Doug; Kote-Jarai, Zsofia; Amin Al Olama, Ali; Benlloch, Sara; Muir, Kenneth; Giles, Graham G; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher A; Schleutker, Johanna; Nordestgaard, Børge G; Travis, Ruth C; Hamdy, Freddie; Neal, David E; Pashayan, Nora; Khaw, Kay-Tee; Stanford, Janet L; Blot, William J; Thibodeau, Stephen N; Maier, Christiane; Kibel, Adam S; Cybulski, Cezary; Cannon-Albright, Lisa; Brenner, Hermann; Kaneva, Radka; Batra, Jyotsna; Teixeira, Manuel R; Pandha, Hardev; Lu, Yong-Jie; Park, Jong Y

    2018-06-07

    The use of single nucleotide polymorphism (SNP) interactions to predict complex diseases is getting more attention during the past decade, but related statistical methods are still immature. We previously proposed the SNP Interaction Pattern Identifier (SIPI) approach to evaluate 45 SNP interaction patterns. SIPI is statistically powerful but suffers from a large computation burden. For large-scale studies, it is necessary to use a powerful and computation-efficient method. The objective of this study is to develop an evidence-based mini-version of SIPI as a screening tool or for solitary use and to evaluate the impact of inheritance mode and model structure on detecting SNP-SNP interactions. We tested two candidate approaches: the 'Five-Full' and 'AA9int' methods. The Five-Full approach is composed of the five full interaction models considering three inheritance modes (additive, dominant and recessive). The AA9int approach is composed of nine interaction models by considering non-hierarchical model structure and the additive mode. Our simulation results show that AA9int has similar statistical power compared to SIPI and is superior to the Five-Full approach, and that the impact of the non-hierarchical model structure is greater than that of the inheritance mode in detecting SNP-SNP interactions. In summary, AA9int is recommended as a powerful tool to be used either alone or as the screening stage of a two-stage approach (AA9int+SIPI) for detecting SNP-SNP interactions in large-scale studies. The 'AA9int' and 'parAA9int' functions (standard and parallel computing versions) are added in the SIPI R package, which is freely available at https://linhuiyi.github.io/LinHY_Software/. hlin1@lsuhsc.edu. Supplementary data are available at Bioinformatics online.

  18. CFD simulation of flow through heart: a perspective review.

    PubMed

    Khalafvand, S S; Ng, E Y K; Zhong, L

    2011-01-01

    The heart is an organ that pumps blood around the body by contraction of its muscular wall. The heart contains a coupled system in which the motion of the wall and the motion of the blood must be computed simultaneously, which makes biological computational fluid dynamics (CFD) difficult. The wall of the heart is not rigid, and hence proper boundary conditions are essential for CFD modelling; fluid-wall interaction is very important for realistic CFD modelling. Many assumptions made in CFD simulation of the heart keep it far from a real model. A realistic fluid-structure interaction approach, modelling the structure by the finite element method and the fluid flow by CFD, uses more realistic coupling algorithms. This type of method is powerful enough to resolve the complex properties of the cardiac structure and the sensitive interaction of fluid and structure. The final goal of heart modelling is to simulate total heart function by integrating cardiac anatomy, electrical activation, mechanics, metabolism and fluid mechanics in a single computational framework.

  19. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    PubMed

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understand the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with computer request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  20. Interactive computer modeling of combustion chemistry and coalescence-dispersion modeling of turbulent combustion

    NASA Technical Reports Server (NTRS)

    Pratt, D. T.

    1984-01-01

    An interactive computer code for simulation of a high-intensity turbulent combustor as a single-point inhomogeneous stirred reactor was developed from an existing batch-processing computer code, CDPSR. The interactive CDPSR code was used as a guide for interpretation and direction of DOE-sponsored companion experiments utilizing a xenon tracer with optical laser diagnostic techniques to experimentally determine the appropriate mixing frequency, and for validation of CDPSR as a mixing-chemistry model for a laboratory jet-stirred reactor. The coalescence-dispersion model for finite-rate mixing was incorporated into an existing interactive code, AVCO-MARK I, to enable simulation of a combustor as a modular array of stirred-flow and plug-flow elements, each having a prescribed finite mixing frequency, or axial distribution of mixing frequency, as appropriate. The speed and reliability of the batch kinetics integrator code CREKID were further increased by rewriting it in vectorized form for execution on a vector or parallel processor, and by incorporating numerical techniques that enhance execution speed by permitting specification of a very low accuracy tolerance.

  1. A neuron-astrocyte transistor-like model for neuromorphic dressed neurons.

    PubMed

    Valenza, G; Pioggia, G; Armato, A; Ferro, M; Scilingo, E P; De Rossi, D

    2011-09-01

    Experimental evidence on the role of the synaptic glia as an active partner, together with the synapse, in neuronal signaling and the dynamics of neural tissue strongly suggests investigating a more realistic neuron-glia model for better understanding of human brain processing. Among the glial cells, the astrocytes play a crucial role in the tripartite synapse, i.e. the dressed neuron. A well-known two-way astrocyte-neuron interaction can be found in the literature, completely revising the purely supportive role of the glia. The aim of this study is to provide a computationally efficient model for neuron-glia interaction. The neuron-glia interactions were simulated by implementing the Li-Rinzel model for an astrocyte and the Izhikevich model for a neuron. Assuming the dressed-neuron dynamics to be similar to the nonlinear input-output characteristics of a bipolar junction transistor, we derived our computationally efficient model. This model may represent the fundamental computational unit for the development of real-time artificial neuron-glia networks, opening new perspectives in pattern recognition systems and in brain neurophysiology. Copyright © 2011 Elsevier Ltd. All rights reserved.
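
    For the neuronal half of such a model, the published Izhikevich neuron can be sketched in a few lines of Python; the astrocytic Li-Rinzel calcium dynamics and the transistor-like neuron-glia coupling proposed in the paper are not reproduced here:

        # Izhikevich spiking-neuron model (standard published form), Euler integration.
        # Default parameters a, b, c, d correspond to a regular-spiking neuron.
        def izhikevich(I, T=1000.0, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
            v, u, spikes = c, b * c, []
            for step in range(int(T / dt)):
                v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)  # membrane potential
                u += dt * a * (b * v - u)                            # recovery variable
                if v >= 30.0:                # spike: record time and reset
                    spikes.append(step * dt)
                    v, u = c, u + d
            return spikes

        print(len(izhikevich(I=10.0)), "spikes in 1 s of simulated time")

    In a dressed-neuron setting, astrocytic calcium dynamics would modulate the input current I (or the model parameters) as a slow feedback on synaptic activity.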

  2. Examining the Effects of Field Dependence-Independence on Learners' Problem-Solving Performance and Interaction with a Computer Modeling Tool: Implications for the Design of Joint Cognitive Systems

    ERIC Educational Resources Information Center

    Angeli, Charoula

    2013-01-01

    An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…

  3. InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.

    PubMed

    Schenkelberg, Christian D; Bystroff, Christopher

    2015-12-15

    Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. How Levels of Interactivity in Tutorials Affect Students' Learning of Modeling Transportation Problems in a Spreadsheet

    ERIC Educational Resources Information Center

    Seal, Kala Chand; Przasnyski, Zbigniew H.; Leon, Linda A.

    2010-01-01

    Do students learn to model OR/MS problems better by using computer-based interactive tutorials and, if so, does increased interactivity in the tutorials lead to better learning? In order to determine the effect of different levels of interactivity on student learning, we used screen capture technology to design interactive support materials for…

  5. An interactive program for pharmacokinetic modeling.

    PubMed

    Lu, D R; Mao, F

    1993-05-01

    A computer program, PharmK, was developed for pharmacokinetic modeling of experimental data. The program was written in the C language for the Macintosh operating system, taking advantage of its high-level user interface; the intention was to provide a user-friendly tool for users of Macintosh computers. An interactive algorithm based on the exponential stripping method is used for the initial parameter estimation. Nonlinear pharmacokinetic model fitting is based on the maximum likelihood estimation method and is performed by the Levenberg-Marquardt method using the χ2 criterion. Several methods are available to aid the evaluation of the fitting results. Pharmacokinetic data sets have been examined with the PharmK program, and the results are comparable with those obtained with other programs that are currently available for IBM PC-compatible and other types of computers.
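
    The kind of nonlinear fit such a program performs can be sketched with SciPy, whose default unconstrained optimizer is the same Levenberg-Marquardt method; the biexponential model, the concentration-time data, and the starting values below are invented for illustration and are not part of PharmK:

        # Hedged sketch: least-squares fit of a biexponential disposition curve.
        import numpy as np
        from scipy.optimize import curve_fit

        def biexponential(t, A, alpha, B, beta):
            """C(t) = A*exp(-alpha*t) + B*exp(-beta*t), a two-compartment disposition model."""
            return A * np.exp(-alpha * t) + B * np.exp(-beta * t)

        t = np.array([0.25, 0.5, 1, 2, 4, 8, 12, 24])            # time, h (example data)
        c = np.array([9.1, 7.8, 6.0, 4.1, 2.6, 1.2, 0.6, 0.15])  # concentration, mg/L (example data)

        # Crude starting values; an exponential-stripping step would normally refine these.
        p0 = [8.0, 1.0, 3.0, 0.1]
        params, cov = curve_fit(biexponential, t, c, p0=p0)       # Levenberg-Marquardt by default
        print(dict(zip(["A", "alpha", "B", "beta"], np.round(params, 3))))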

  6. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    USGS Publications Warehouse

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flow and water-quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and a listing of the FORTRAN source code are attachments to the report. (USGS)
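
    The downstream accumulation of flow and conservative constituent load described above can be pictured with a small Python sketch; the node names, incremental values, and units are invented for illustration (the actual model obtains incremental flows and concentrations from regression equations and then applies water-use accounting):

        # Illustrative downstream accounting of flow and a conservative constituent load.
        nodes = ["headwater", "midbasin", "outlet"]                              # upstream -> downstream
        incremental_flow = {"headwater": 12.0, "midbasin": 7.5, "outlet": 3.0}   # m3/s (assumed)
        incremental_load = {"headwater": 4.0, "midbasin": 9.0, "outlet": 2.0}    # kg/s (assumed)

        flow = 0.0
        load = 0.0
        for node in nodes:
            flow += incremental_flow[node]   # streamflow accumulates downstream
            load += incremental_load[node]   # conservative loads accumulate as well
            concentration = load / flow
            print(f"{node}: Q = {flow:.1f} m3/s, load = {load:.1f} kg/s, C = {concentration:.2f} kg/m3")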

  7. Computer display and manipulation of biological molecules

    NASA Technical Reports Server (NTRS)

    Coeckelenbergh, Y.; Macelroy, R. D.; Hart, J.; Rein, R.

    1978-01-01

    This paper describes a computer model that was designed to investigate the conformation of molecules, macromolecules and subsequent complexes. Utilizing an advanced 3-D dynamic computer display system, the model is sufficiently versatile to accommodate a large variety of molecular input and to generate data for multiple purposes such as visual representation of conformational changes, and calculation of conformation and interaction energy. Molecules can be built on the basis of several levels of information. These include the specification of atomic coordinates and connectivities and the grouping of building blocks and duplicated substructures using symmetry rules found in crystals and polymers such as proteins and nucleic acids. Called AIMS (Ames Interactive Molecular modeling System), the model is now being used to study pre-biotic molecular evolution toward life.

  8. A Cognitive Model of How Interactive Multimedia Authoring Facilitates Conceptual Understanding of Object-Oriented Programming in Novices

    ERIC Educational Resources Information Center

    Yuen, Timothy; Liu, Min

    2011-01-01

    This paper presents a cognitive model of how interactive multimedia authoring (IMA) affects novices' cognition in object-oriented programming. This model was generated through an empirical study of first-year computer science students at the university level engaged in interactive multimedia authoring of a role-playing game. Clinical…

  9. A conceptual network model of the air transportation system. the basic level 1 model.

    DOT National Transportation Integrated Search

    1971-04-01

    A basic conceptual model of the entire Air Transportation System is being developed to serve as an analytical tool for studying the interactions among the system elements. The model is being designed to function in an interactive computer graphics en...

  10. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  11. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE PAGES

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; ...

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  12. MIX: a computer program to evaluate interaction between chemicals

    Treesearch

    Jacqueline L. Robertson; Kimberly C. Smith

    1989-01-01

    A computer program, MIX, was designed to identify pairs of chemicals whose interaction results in a response that departs significantly from the model predicated on the assumption of independent, uncorrelated joint action. This report describes the MIX program, its statistical basis, and instructions for its use.
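
    A common baseline for "independent, uncorrelated joint action" of two chemicals is the following response-addition expression (a generic formulation included for orientation; the exact statistical model used by MIX is specified in the report itself):

        p_{AB} = p_A + p_B - p_A \, p_B

    where p_A and p_B are the proportions responding to each chemical alone and p_{AB} is the expected proportion responding to the mixture under independence; an interaction is flagged when the observed joint response departs significantly from this prediction.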

  13. An application of interactive computer graphics technology to the design of dispersal mechanisms

    NASA Technical Reports Server (NTRS)

    Richter, B. J.; Welch, B. H.

    1977-01-01

    Interactive computer graphics technology is combined with a general purpose mechanisms computer code to study the operational behavior of three guided bomb dispersal mechanism designs. These studies illustrate the use of computer graphics techniques to discover operational anomalies, to assess the effectiveness of design improvements, to reduce the time and cost of the modeling effort, and to provide the mechanism designer with a visual understanding of the physical operation of such systems.

  14. Facing the challenges of multiscale modelling of bacterial and fungal pathogen–host interactions

    PubMed Central

    Schleicher, Jana; Conrad, Theresia; Gustafsson, Mika; Cedersund, Gunnar; Guthke, Reinhard

    2017-01-01

    Abstract Recent and rapidly evolving progress on high-throughput measurement techniques and computational performance has led to the emergence of new disciplines, such as systems medicine and translational systems biology. At the core of these disciplines lies the desire to produce multiscale models: mathematical models that integrate multiple scales of biological organization, ranging from molecular, cellular and tissue models to organ, whole-organism and population scale models. Using such models, hypotheses can systematically be tested. In this review, we present state-of-the-art multiscale modelling of bacterial and fungal infections, considering both the pathogen and host as well as their interaction. Multiscale modelling of the interactions of bacteria, especially Mycobacterium tuberculosis, with the human host is quite advanced. In contrast, models for fungal infections are still in their infancy, in particular regarding infections with the most important human pathogenic fungi, Candida albicans and Aspergillus fumigatus. We reflect on the current availability of computational approaches for multiscale modelling of host–pathogen interactions and point out current challenges. Finally, we provide an outlook for future requirements of multiscale modelling. PMID:26857943

  15. Current Status on the use of Parallel Computing in Turbulent Reacting Flow Computations Involving Sprays, Monte Carlo PDF and Unstructured Grids. Chapter 4

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    The state of the art in multidimensional combustor modeling, as evidenced by the level of sophistication employed in terms of modeling and numerical accuracy considerations, is also dictated by the available computer memory and turnaround times afforded by present-day computers. With the aim of advancing the current multi-dimensional computational tools used in the design of advanced technology combustors, a solution procedure is developed that combines the novelty of the coupled CFD/spray/scalar Monte Carlo PDF (Probability Density Function) computations on unstructured grids with the ability to run on parallel architectures. In this approach, the mean gas-phase velocity and turbulence fields are determined from a standard turbulence model, the joint composition of species and enthalpy from the solution of a modeled PDF transport equation, and a Lagrangian-based dilute spray model is used for the liquid-phase representation. Gas-turbine combustor flows are often characterized by a complex interplay of physical processes: the interaction between the liquid and gas phases, droplet vaporization, turbulent mixing, heat release associated with chemical kinetics, and radiative heat transfer associated with highly absorbing and radiating species, among others. The rate-controlling processes often interact with each other at disparate time and length scales. In particular, turbulence plays an important role in determining the rates of mass and heat transfer, chemical reactions, and liquid-phase evaporation in many practical combustion devices.

  16. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    ERIC Educational Resources Information Center

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-01-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these…

  17. A Computational Model of Active Vision for Visual Search in Human-Computer Interaction

    DTIC Science & Technology

    2010-08-01

    … processors that interact with the production rules to produce behavior, and (c) parameters that constrain the behavior of the model (e.g., the … velocity of a saccadic eye movement). While the parameters can be task-specific, the majority of the parameters are usually fixed across a wide variety of … previously estimated durations. Hooge and Erkelens (1996) review these four explanations of fixation duration control. A variety of research … [abstract excerpt truncated]

  18. Speech Perception as a Cognitive Process: The Interactive Activation Model.

    ERIC Educational Resources Information Center

    Elman, Jeffrey L.; McClelland, James L.

    Research efforts to model speech perception in terms of a processing system in which knowledge and processing are distributed over large numbers of highly interactive--but computationally primitive--elements are described in this report. After discussing the properties of speech that demand a parallel interactive processing system, the report…

  19. Cognitive Architectures and Human-Computer Interaction. Introduction to Special Issue.

    ERIC Educational Resources Information Center

    Gray, Wayne D.; Young, Richard M.; Kirschenbaum, Susan S.

    1997-01-01

    In this introduction to a special issue on cognitive architectures and human-computer interaction (HCI), editors and contributors provide a brief overview of cognitive architectures. The following four architectures represented by articles in this issue are: Soar; LICAI (linked model of comprehension-based action planning and instruction taking);…

  20. An Interactive Graphical Modeling Game for Teaching Musical Concepts.

    ERIC Educational Resources Information Center

    Lamb, Martin

    1982-01-01

    Describes an interactive computer game in which players compose music at a computer screen. They experiment with pitch and melodic shape and the effects of transposition, augmentation, diminution, retrograde, and inversion. The user interface is simple enough for children to use and powerful enough for composers to work with. (EAO)

  1. Technology Enhanced Elementary and Middle School Science (TEEMSS). What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2012

    2012-01-01

    "Technology Enhanced Elementary and Middle School Science" ("TEEMSS") is a physical science curriculum for grades 3-8 that utilizes computers, sensors, and interactive models to support investigations of real-world phenomena. Through 15 inquiry-based instructional units, students interact with computers, gather and analyze…

  2. On the Solution of the Three-Dimensional Flowfield About a Flow-Through Nacelle. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Compton, William Bernard

    1985-01-01

    The solution of the three-dimensional flow field for a flow-through nacelle was studied. Both inviscid and viscous-inviscid interacting solutions were examined. Inviscid solutions were obtained with two different computational procedures for solving the three-dimensional Euler equations. The first procedure employs an alternating-direction implicit numerical algorithm and required the development of a complete computational model for the nacelle problem. The second computational technique employs a fourth-order Runge-Kutta numerical algorithm, which was modified to fit the nacelle problem. Viscous effects on the flow field were evaluated with a viscous-inviscid interacting computational model. This model was constructed by coupling the explicit Euler solution procedure with a lag-entrainment boundary-layer solution procedure in a global iteration scheme. The computational techniques were used to compute the flow field for a long-duct turbofan engine nacelle at free-stream Mach numbers of 0.80 and 0.94 and angles of attack of 0 and 4 deg.

  3. The GPRIME approach to finite element modeling

    NASA Technical Reports Server (NTRS)

    Wallace, D. R.; Mckee, J. H.; Hurwitz, M. M.

    1983-01-01

    GPRIME, an interactive modeling system, runs on the CDC 6000 computers and the DEC VAX 11/780 minicomputer. This system includes three components: (1) GPRIME, a user-friendly geometric language and a processor to translate that language into geometric entities; (2) GGEN, an interactive data generator for 2-D models; and (3) SOLIDGEN, a 3-D solid modeling program. Each component has a command-driven user interface with an extensive command set. All of these programs make use of a comprehensive B-spline mathematics subroutine library, which can be used for a wide variety of interpolation problems and other geometric calculations. Many other user aids, such as automatic saving of the geometric and finite element databases and hidden-line removal, are available. This interactive finite element modeling capability can produce a complete finite element model and writes an output file of grid and element data.
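
    To illustrate the kind of interpolation such a B-spline subroutine library supports, here is a minimal sketch of the Cox-de Boor recursion for evaluating B-spline basis functions and a curve point. This is a generic textbook formulation in Python, not GPRIME's actual routines; the control points, degree, and knot vector are illustrative.

        def bspline_basis(i, p, t, knots):
            """Evaluate the i-th B-spline basis function of degree p at parameter t
            (Cox-de Boor recursion)."""
            if p == 0:
                return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
            left = 0.0
            if knots[i + p] != knots[i]:
                left = (t - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, t, knots)
            right = 0.0
            if knots[i + p + 1] != knots[i + 1]:
                right = (knots[i + p + 1] - t) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, t, knots)
            return left + right

        def bspline_curve_point(control_points, degree, knots, t):
            """Evaluate a B-spline curve at parameter t as a weighted sum of control points."""
            return [sum(bspline_basis(i, degree, t, knots) * cp[d] for i, cp in enumerate(control_points))
                    for d in range(len(control_points[0]))]

        # Example: a clamped cubic B-spline with four 2-D control points.
        ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
        knots = [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
        print(bspline_curve_point(ctrl, 3, knots, 0.5))   # point on the curve at t = 0.5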

  4. Occupational stress in human computer interaction.

    PubMed

    Smith, M J; Conway, F T; Karsh, B T

    1999-04-01

    There have been a variety of research approaches that have examined the stress issues related to human computer interaction including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

  5. Computational Modeling of Arc-Slag Interaction in DC Furnaces

    NASA Astrophysics Data System (ADS)

    Reynolds, Quinn G.

    2017-02-01

    The plasma arc is central to the operation of the direct-current arc furnace, a unit operation commonly used in high-temperature processing of both primary ores and recycled metals. The arc is a high-velocity, high-temperature jet of ionized gas created and sustained by interactions among the thermal, momentum, and electromagnetic fields resulting from the passage of electric current. In addition to being the primary source of thermal energy, the arc jet also couples mechanically with the bath of molten process material within the furnace, causing substantial splashing and stirring in the region in which it impinges. The arc's interaction with the molten bath inside the furnace is studied through use of a multiphase, multiphysics computational magnetohydrodynamic model developed in the OpenFOAM® framework. Results from the computational solver are compared with empirical correlations that account for arc-slag interaction effects.

  6. Integrating Mathematical Modeling for Undergraduate Pre-Service Science Education Learning and Instruction in Middle School Classrooms

    ERIC Educational Resources Information Center

    Carrejo, David; Robertson, William H.

    2011-01-01

    Computer-based mathematical modeling in physics is a process of constructing models of concepts and the relationships between them that characterize scientific work. In this manner, computer-based modeling integrates the interactions of natural phenomena through the use of models, which provide structure for theories and a base for…

  7. Fluid-Structure Interaction Modeling of Parachutes with Disreefing and Modified Geometric Porosity and Separation Aerodynamics of a Cover Jettisoned to the Spacecraft Wake

    NASA Astrophysics Data System (ADS)

    Fritze, Matthew D.

    Fluid-structure interaction (FSI) modeling of spacecraft parachutes involves a number of computational challenges. The canopy complexity created by the hundreds of gaps and slits and design-related modification of that geometric porosity by removal of some of the sails and panels are among the formidable challenges. Disreefing from one stage to another when the parachute is used in multiple stages is another formidable challenge. This thesis addresses the computational challenges involved in disreefing of spacecraft parachutes and fully-open and reefed stages of the parachutes with modified geometric porosity. The special techniques developed to address these challenges are described and the FSI computations are reported. The thesis also addresses the modeling and computation challenges involved in the very early stages, where the sudden separation of a cover jettisoned to the spacecraft wake needs to be modeled. Higher-order temporal representations used in modeling the separation motion are described, and the computed separation and wake-induced forces acting on the cover are reported.

  8. Embedded Process Modeling, Analogy-Based Option Generation and Analytical Graphic Interaction for Enhanced User-Computer Interaction: An Interactive Storyboard of Next Generation User-Computer Interface Technology. Phase 1

    DTIC Science & Technology

    1988-03-01

    structure of the interface is a mapping from the physical world [for example, the use of icons, which have inherent meaning to users but represent...design alternatives. Mechanisms for linking the user to the computer include physical devices (keyboards), actions taken with the devices (keystrokes...

  9. Informing Mechanistic Toxicology with Computational Molecular Models

    EPA Science Inventory

    Computational molecular models of chemicals interacting with biomolecular targets provide toxicologists a valuable, affordable, and sustainable source of in silico molecular-level information that augments, enriches, and complements in vitro and in vivo effo...

  10. Ontogenetic ritualization of primate gesture as a case study in dyadic brain modeling.

    PubMed

    Gasser, Brad; Cartmill, Erica A; Arbib, Michael A

    2014-01-01

    This paper introduces dyadic brain modeling - the simultaneous, computational modeling of the brains of two interacting agents - to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the 'dyad'). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuro-primatology.

  11. A computational approach to climate science education with CLIMLAB

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for i/o and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
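
    The flavor of this hierarchical, process-oriented approach can be conveyed with a zero-dimensional energy balance model; the sketch below is a generic illustration in plain Python, not CLIMLAB's actual API, and the parameter values are textbook-style assumptions:

        # Zero-dimensional energy balance: C dT/dt = (1 - albedo) * S / 4 - (A + B * T)
        S = 1365.0         # solar constant, W/m^2 (assumed)
        albedo = 0.3       # planetary albedo (assumed)
        A, B = 210.0, 2.0  # linearized outgoing longwave radiation, W/m^2 and W/m^2/K (assumed)
        C = 4.0e8          # heat capacity of a ~100 m ocean mixed layer, J/m^2/K (assumed)

        def step(T, dt):
            """Advance surface temperature T (deg C) by one explicit time step of size dt (s)."""
            absorbed = (1.0 - albedo) * S / 4.0
            outgoing = A + B * T
            return T + dt * (absorbed - outgoing) / C

        T, dt = 10.0, 86400.0         # initial temperature and a one-day step
        for _ in range(365 * 30):     # integrate ~30 years toward equilibrium
            T = step(T, dt)
        print(round(T, 1))            # approaches ((1 - albedo) * S / 4 - A) / B ~ 14.4 C

    Component-by-component construction of this kind (radiation, convection, surface processes) is exactly what a process-oriented toolkit lets students mix and match interactively.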

  12. Simulations of noble gases adsorbed on graphene

    NASA Astrophysics Data System (ADS)

    Maiga, Sidi; Gatica, Silvina

    2014-03-01

    We present results of Grand Canonical Monte Carlo simulations of adsorption of Kr, Ar and Xe on a suspended graphene sheet. We compute the adsorbate-adsorbate interaction by a Lennard-Jones potential. We adopt a hybrid model for the graphene-adsorbate force; in the hybrid model, the potential interaction with the nearest carbon atoms (within a distance r_nn) is computed with an atomistic pair potential Ua; for the atoms at r > r_nn, we compute the interaction energy as a continuous integration over a uniform carbon sheet with the density of graphene. For the atomistic potential Ua, we assume the anisotropic LJ potential adapted from the graphite-He interaction proposed by Cole et al. This interaction includes the anisotropy of the C atoms on graphene, which originates in the anisotropic π-bonds. The adsorption isotherms, energy and structure of the layer are obtained and compared with experimental results. We also compare with the adsorption on graphite and carbon nanotubes. This research was supported by NSF/PRDM (Howard University) and NSF (DMR 1006010).
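
    A minimal sketch of the hybrid gas-surface potential described above is given below, written in Python with an isotropic 12-6 Lennard-Jones pair term standing in for the anisotropic Cole-type potential, plus the standard '10-4' potential obtained by integrating the LJ interaction over a uniform carbon sheet; the parameter values and crossover radius are illustrative, not those of the study:

        import numpy as np

        eps, sigma = 0.00253, 3.36   # illustrative LJ parameters, eV and Angstrom (assumed)
        theta = 0.382                # areal density of carbon atoms in graphene, atoms/A^2
        r_nn = 10.0                  # atomistic/continuum crossover radius, A (assumed)

        def lj_pair(r):
            """Isotropic 12-6 Lennard-Jones pair potential (stand-in for the anisotropic form)."""
            x = (sigma / r) ** 6
            return 4.0 * eps * (x * x - x)

        def continuum_10_4(z):
            """LJ potential integrated over an infinite uniform sheet at height z."""
            return 2.0 * np.pi * eps * theta * sigma**2 * (0.4 * (sigma / z)**10 - (sigma / z)**4)

        def hybrid_energy(adatom, carbons):
            """Sum nearby carbons atomistically; approximate the rest by the continuum sheet."""
            r = np.linalg.norm(carbons - adatom, axis=1)
            return lj_pair(r[r < r_nn]).sum() + continuum_10_4(adatom[2])

        carbons = np.array([[x, y, 0.0] for x in np.arange(-12, 13, 2.46)
                                        for y in np.arange(-12, 13, 2.46)])
        print(hybrid_energy(np.array([0.0, 0.0, 3.5]), carbons))

    In the actual hybrid scheme the continuum term excludes the region already covered by the atomistic sum; the sketch keeps the full sheet only to show the structure of the decomposition.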

  13. Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model.

    PubMed

    Nicholson, Daren T; Chalk, Colin; Funnell, W Robert J; Daniel, Sam J

    2006-11-01

    The use of computer-generated 3-dimensional (3-D) anatomical models to teach anatomy has proliferated. However, there is little evidence that these models are educationally effective. The purpose of this study was to test the educational effectiveness of a computer-generated 3-D model of the middle and inner ear. We reconstructed a fully interactive model of the middle and inner ear from a magnetic resonance imaging scan of a human cadaver ear. To test the model's educational usefulness, we conducted a randomised controlled study in which 28 medical students completed a Web-based tutorial on ear anatomy that included the interactive model, while a control group of 29 students took the tutorial without exposure to the model. At the end of the tutorials, both groups were asked a series of 15 quiz questions to evaluate their knowledge of 3-D relationships within the ear. The intervention group's mean score on the quiz was 83%, while that of the control group was 65%. This difference in means was highly significant (P < 0.001). Our findings stand in contrast to the handful of previous randomised controlled trials that evaluated the effects of computer-generated 3-D anatomical models on learning. The equivocal and negative results of these previous studies may be due to the limitations of these studies (such as small sample size) as well as the limitations of the models that were studied (such as a lack of full interactivity). Given our positive results, we believe that further research is warranted concerning the educational effectiveness of computer-generated anatomical models.

  14. Designing Interactive Learning Systems.

    ERIC Educational Resources Information Center

    Barker, Philip

    1990-01-01

    Describes multimedia, computer-based interactive learning systems that support various forms of individualized study. Highlights include design models; user interfaces; design guidelines; media utilization paradigms, including hypermedia and learner-controlled models; metaphors and myths; authoring tools; optical media; workstations; four case…

  15. A Unified Approach to Modeling Multidisciplinary Interactions

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Bhatia, Kumar G.

    2000-01-01

    There are a number of existing methods to transfer information among various disciplines. For a multidisciplinary application with n disciplines, the traditional methods may be required to model (n^2 - n) interactions. This paper presents a unified three-dimensional approach that reduces the number of interactions from (n^2 - n) to 2n by using a computer-aided design model. The proposed modeling approach unifies the interactions among various disciplines. The approach is independent of specific discipline implementation, and a number of existing methods can be reformulated in the context of the proposed unified approach. This paper provides an overview of the proposed unified approach and reformulations for two existing methods. The unified approach is specially tailored for application environments where the geometry is created and managed through a computer-aided design system. Results are presented for a blended-wing body and a high-speed civil transport.
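
    For a concrete sense of the reduction: with n = 5 disciplines, the traditional pairwise approach may require up to n^2 - n = 20 interface models, whereas routing every discipline through the central computer-aided design model requires only 2n = 10 (one mapping to and one from the geometry per discipline); with n = 10 the comparison is 90 versus 20.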

  16. A roadmap to computational social neuroscience.

    PubMed

    Tognoli, Emmanuelle; Dumas, Guillaume; Kelso, J A Scott

    2018-02-01

    To complement experimental efforts toward understanding human social interactions at both neural and behavioral levels, two computational approaches are presented: (1) a fully parameterizable mathematical model of a social partner, the Human Dynamic Clamp, which, by virtue of experimentally controlled interactions between Virtual Partners and real people, allows for emergent behaviors to be studied; and (2) a multiscale neurocomputational model of social coordination that enables exploration of social self-organization at all levels, from neuronal patterns to people interacting with each other. These complementary frameworks and the cross product of their analysis aim at understanding the fundamental principles governing social behavior.

  17. Several examples where turbulence models fail in inlet flow field analysis

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.

    1993-01-01

    Computational uncertainties in turbulence modeling for three dimensional inlet flow fields include flows approaching separation, strength of secondary flow field, three dimensional flow predictions of vortex liftoff, and influence of vortex-boundary layer interactions; computational uncertainties in vortex generator modeling include representation of generator vorticity field and the relationship between generator and vorticity field. The objectives of the inlet flow field studies presented in this document are to advance the understanding, prediction, and control of intake distortion and to study the basic interactions that influence this design problem.

  18. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  19. A novel Arg H52/Tyr H33 conservative motif in antibodies: A correlation between sequence of antibodies and antigen binding.

    PubMed

    Petrov, Artem; Arzhanik, Vladimir; Makarov, Gennady; Koliasnikov, Oleg

    2016-08-01

    Antibodies are a family of proteins responsible for antigen recognition. Computational modeling of the interaction between an antigen and an antibody is very important when a crystallographic structure is unavailable. In this research, we have discovered a correlation between the amino acid sequence of an antibody and its specific binding characteristics, using the example of a novel conservative binding motif that consists of four residues: Arg H52, Tyr H33, Thr H59, and Glu H61. These residues are specifically oriented in the binding site and interact with each other in a specific manner. The residues of the binding motif interact strictly with negatively charged groups of antigens and form a binding complex. The mechanism of interaction and the characteristics of the complex were also determined. The results of this research can be used to increase the accuracy of computational antibody-antigen interaction modeling and for post-modeling quality control of the modeled structures.

  20. A Pulsatile Cardiovascular Computer Model for Teaching Heart-Blood Vessel Interaction.

    ERIC Educational Resources Information Center

    Campbell, Kenneth; And Others

    1982-01-01

    Describes a model which gives realistic predictions of pulsatile pressure, flow, and volume events in the cardiovascular system. Includes computer oriented laboratory exercises for veterinary and graduate students; equations of the dynamic and algebraic models; and a flow chart for the cardiovascular teaching program. (JN)

  1. Modeling and simulation in biomedicine.

    PubMed Central

    Aarts, J.; Möller, D.; van Wijk van Brievingh, R.

    1991-01-01

    A group of researchers and educators in The Netherlands, Germany and Czechoslovakia have developed and adapted mathematical computer models of phenomena in the field of physiology and biomedicine for use in higher education. The models are graphical and highly interactive, and are all written in TurboPascal or the mathematical simulation language PSI. An educational shell has been developed to launch the models. The shell allows students to interact with the models and teachers to edit the models, to add new models and to monitor the achievements of the students. The models and the shell have been implemented on a MS-DOS personal computer. This paper describes the features of the modeling package and presents the modeling and simulation of the heart muscle as an example. PMID:1807745

  2. Role of pseudo-turbulent stresses in shocked particle clouds and construction of surrogate models for closure

    NASA Astrophysics Data System (ADS)

    Sen, O.; Gaul, N. J.; Davis, S.; Choi, K. K.; Jacobs, G.; Udaykumar, H. S.

    2018-05-01

    Macroscale models of shock-particle interactions require closure terms for unresolved solid-fluid momentum and energy transfer. These comprise the effects of mean as well as fluctuating fluid-phase velocity fields in the particle cloud. Mean drag and Reynolds stress equivalent terms (also known as pseudo-turbulent terms) appear in the macroscale equations. Closure laws for the pseudo-turbulent terms are constructed in this work from ensembles of high-fidelity mesoscale simulations. The computations are performed over a wide range of Mach numbers ( M) and particle volume fractions (φ ) and are used to explicitly compute the pseudo-turbulent stresses from the Favre average of the velocity fluctuations in the flow field. The computed stresses are then used as inputs to a Modified Bayesian Kriging method to generate surrogate models. The surrogates can be used as closure models for the pseudo-turbulent terms in macroscale computations of shock-particle interactions. It is found that the kinetic energy associated with the velocity fluctuations is comparable to that of the mean flow—especially for increasing M and φ . This work is a first attempt to quantify and evaluate the effect of velocity fluctuations for problems of shock-particle interactions.
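
    For reference, the pseudo-turbulent (Reynolds-stress-equivalent) contribution referred to above is typically built from the Favre average of the fluid-velocity fluctuations; a standard form, with notation chosen here rather than copied from the paper, is

        \tilde{u}_i = \frac{\langle \rho u_i \rangle}{\langle \rho \rangle}, \qquad u_i'' = u_i - \tilde{u}_i, \qquad R_{ij} = \frac{\langle \rho\, u_i'' u_j'' \rangle}{\langle \rho \rangle},

    where the angle brackets denote averaging over the ensemble of mesoscale realizations; it is this R_ij, computed over a range of M and φ, that serves as training data for the Kriging surrogates.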

  3. Role of pseudo-turbulent stresses in shocked particle clouds and construction of surrogate models for closure

    NASA Astrophysics Data System (ADS)

    Sen, O.; Gaul, N. J.; Davis, S.; Choi, K. K.; Jacobs, G.; Udaykumar, H. S.

    2018-02-01

    Macroscale models of shock-particle interactions require closure terms for unresolved solid-fluid momentum and energy transfer. These comprise the effects of mean as well as fluctuating fluid-phase velocity fields in the particle cloud. Mean drag and Reynolds stress equivalent terms (also known as pseudo-turbulent terms) appear in the macroscale equations. Closure laws for the pseudo-turbulent terms are constructed in this work from ensembles of high-fidelity mesoscale simulations. The computations are performed over a wide range of Mach numbers (M) and particle volume fractions (φ ) and are used to explicitly compute the pseudo-turbulent stresses from the Favre average of the velocity fluctuations in the flow field. The computed stresses are then used as inputs to a Modified Bayesian Kriging method to generate surrogate models. The surrogates can be used as closure models for the pseudo-turbulent terms in macroscale computations of shock-particle interactions. It is found that the kinetic energy associated with the velocity fluctuations is comparable to that of the mean flow—especially for increasing M and φ . This work is a first attempt to quantify and evaluate the effect of velocity fluctuations for problems of shock-particle interactions.

  4. Kenny Gruchalla | NREL

    Science.gov Websites

    feature extraction, human-computer interaction, and physics-based modeling. Ph.D., computer science, University of Colorado at Boulder; M.S., computer science, University of Colorado at Boulder; B.S., computer science, New Mexico Institute of Mining and Technology.

  5. Simulation of glioblastoma multiforme (GBM) tumor cells using the Ising model on the Creutz Cellular Automaton

    NASA Astrophysics Data System (ADS)

    Züleyha, Artuç; Ziya, Merdan; Selçuk, Yeşiltaş; Kemal, Öztürk M.; Mesut, Tez

    2017-11-01

    Computational models of tumors are difficult to build because of the complexity of tumor biology and the limited capacities of computational tools; nevertheless, these models provide insight into the interactions between a tumor and its microenvironment. Moreover, computational models have the potential to inform strategies for individualized cancer treatment. To study a solid brain tumor, glioblastoma multiforme (GBM), we present a two-dimensional Ising model applied on the Creutz cellular automaton (CCA). The aim of this study is to analyze avascular spherical solid tumor growth, treating transitions between non-tumor cells and cancer cells as analogous to phase transitions in a physical system. The Ising-model-on-CCA algorithm provides a deterministic approach, with discrete time steps and local interactions in position space, for viewing tumor growth as a function of time. Our simulation results are given for a fixed tumor radius and are compatible with theoretical and clinical data.
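
    A minimal sketch of the microcanonical (Creutz 'demon') update that underlies an Ising-on-CCA simulation of this kind is shown below; it is a generic two-dimensional implementation in Python, not the tumor-specific model of the paper, and the lattice size, coupling, and demon energy cap are illustrative:

        import numpy as np

        L, J, E_MAX = 64, 1.0, 8.0                   # lattice size, coupling, demon cap (assumed)
        rng = np.random.default_rng(0)
        spins = rng.choice([-1, 1], size=(L, L))     # +1 ~ tumor-like cell, -1 ~ normal cell
        demons = np.zeros((L, L))                    # per-site demon energy reservoir

        def local_field(s, i, j):
            """Sum of the four nearest-neighbour spins with periodic boundaries."""
            return s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]

        def sweep(s, d):
            """One Creutz sweep: a spin flips only if its demon can absorb or supply the energy change."""
            for i in range(L):
                for j in range(L):
                    dE = 2.0 * J * s[i, j] * local_field(s, i, j)
                    if 0.0 <= d[i, j] - dE <= E_MAX:
                        s[i, j] *= -1
                        d[i, j] -= dE

        for _ in range(100):
            sweep(spins, demons)
        print((spins == 1).mean())                   # fraction of tumor-like sites after 100 sweeps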

  6. Proposed standards for peer-reviewed publication of computer code

    USDA-ARS?s Scientific Manuscript database

    Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...

  7. 20180129 - Computational Embryology: Translational Tools for Modeling in vitro Data (Toxicology Forum)

    EPA Science Inventory

    Integrative models are needed to "decode the toxicological blueprint of active substances that interact with living systems" (Systems toxicology). Computational biology is uniquely positioned to capture this connectivity and help shift decision-making to mechanistic pre...

  8. Coarse-grained modeling of RNA 3D structure.

    PubMed

    Dawson, Wayne K; Maciejczyk, Maciej; Jankowska, Elzbieta J; Bujnicki, Janusz M

    2016-07-01

    Functional RNA molecules depend on three-dimensional (3D) structures to carry out their tasks within the cell. Understanding how these molecules interact to carry out their biological roles requires a detailed knowledge of RNA 3D structure and dynamics as well as thermodynamics, which strongly governs the folding of RNA and RNA-RNA interactions as well as a host of other interactions within the cellular environment. Experimental determination of these properties is difficult, and various computational methods have been developed to model the folding of RNA 3D structures and their interactions with other molecules. However, computational methods also have their limitations, especially when the biological effects demand computation of the dynamics beyond a few hundred nanoseconds. For the researcher confronted with such challenges, a more amenable approach is to resort to coarse-grained modeling to reduce the number of data points and computational demand to a more tractable size, while sacrificing as little critical information as possible. This review presents an introduction to the topic of coarse-grained modeling of RNA 3D structures and dynamics, covering both high- and low-resolution strategies. We discuss how physics-based approaches compare with knowledge based methods that rely on databases of information. In the course of this review, we discuss important aspects in the reasoning process behind building different models and the goals and pitfalls that can result. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  10. An Integrated Crustal Dynamics Simulator

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Mora, P.

    2007-12-01

    Numerical modelling offers an outstanding opportunity to gain an understanding of crustal dynamics and complex crustal system behaviour. This presentation describes our long-term, ongoing effort in finite element based computational model and software development to simulate interacting fault systems for earthquake forecasting. An R-minimum-strategy-based finite element computational model and software tool, PANDAS, for modelling 3-dimensional nonlinear frictional contact behaviour between multiple deformable bodies with an arbitrarily-shaped contact element strategy has been developed by the authors. It builds up a virtual laboratory to simulate interacting fault systems, including crustal boundary conditions and various nonlinearities (e.g. from frictional contact, materials, geometry and thermal coupling). It has been successfully applied to large-scale computing of complex nonlinear phenomena in non-continuum media involving nonlinear frictional instability, multiple material properties and complex geometries on supercomputers, such as the South Australia (SA) interacting fault system, the South California fault model and the Sumatra subduction model. It has also been extended to simulate the hot fractured rock (HFR) geothermal reservoir system, in collaboration with Geodynamics Ltd, which is constructing the first geothermal reservoir system in Australia, and to model tsunami generation induced by earthquakes. Both are supported by the Australian Research Council.

  11. How Model Can Help Inquiry--A Qualitative Study of Model Based Inquiry Learning (Mobile) in Engineering Education

    ERIC Educational Resources Information Center

    Gong, Yu

    2017-01-01

    This study investigates how students can use "interactive example models" in inquiry activities to develop their conceptual knowledge about an engineering phenomenon like electromagnetic fields and waves. An interactive model, for example a computational model, could be used to develop and teach principles of dynamic complex systems, and…

  12. Characterization of Aerodynamic Interactions with the Mars Science Laboratory Reaction Control System Using Computation and Experiment

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of the MSL's flight data from entry showed that the capsule flew much as predicted. This paper will describe how the MSL aerodynamics team used engineering analyses, computational codes and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided. Then, a brief description is presented of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin to validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.

  13. Sculpting proteins interactively: continual energy minimization embedded in a graphical modeling system.

    PubMed

    Surles, M C; Richardson, J S; Richardson, D C; Brooks, F P

    1994-02-01

    We describe a new paradigm for modeling proteins in interactive computer graphics systems--continual maintenance of a physically valid representation, combined with direct user control and visualization. This is achieved by a fast algorithm for energy minimization, capable of real-time performance on all atoms of a small protein, plus graphically specified user tugs. The modeling system, called Sculpt, rigidly constrains bond lengths, bond angles, and planar groups (similar to existing interactive modeling programs), while it applies elastic restraints to minimize the potential energy due to torsions, hydrogen bonds, and van der Waals and electrostatic interactions (similar to existing batch minimization programs), and user-specified springs. The graphical interface can show bad and/or favorable contacts, and individual energy terms can be turned on or off to determine their effects and interactions. Sculpt finds a local minimum of the total energy that satisfies all the constraints using an augmented Lagrange-multiplier method; calculation time increases only linearly with the number of atoms because the matrix of constraint gradients is sparse and banded. On a 100-MHz MIPS R4000 processor (Silicon Graphics Indigo), Sculpt achieves 11 updates per second on a 20-residue fragment and 2 updates per second on an 80-residue protein, using all atoms except non-H-bonding hydrogens, and without electrostatic interactions. Applications of Sculpt are described: to reverse the direction of bundle packing in a designed 4-helix bundle protein, to fold up a 2-stranded beta-ribbon into an approximate beta-barrel, and to design the sequence and conformation of a 30-residue peptide that mimics one partner of a protein subunit interaction. Computer models that are both interactive and physically realistic (within the limitations of a given force field) have 2 significant advantages: (1) they make feasible the modeling of very large changes (such as needed for de novo design), and (2) they help the user understand how different energy terms interact to stabilize a given conformation. The Sculpt paradigm combines many of the best features of interactive graphical modeling, energy minimization, and actual physical models, and we propose it as an especially productive way to use current and future increases in computer speed.
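
    The augmented-Lagrange-multiplier formulation mentioned above has the generic textbook form (symbols chosen here, not taken from the paper)

        \mathcal{L}(\mathbf{x}, \boldsymbol{\lambda}) = E(\mathbf{x}) + \sum_k \lambda_k\, c_k(\mathbf{x}) + \frac{\mu}{2} \sum_k c_k(\mathbf{x})^2,

    where E is the restrained potential energy, the constraints c_k(x) = 0 encode the rigid bond lengths, bond angles, and planar groups, the multipliers are updated as \lambda_k \leftarrow \lambda_k + \mu\, c_k(\mathbf{x}), and \mu is a penalty parameter; the sparse, banded constraint-gradient matrix is what gives the linear scaling of cost with atom count noted in the abstract.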

  14. User's instructions for the cardiovascular Walters model

    NASA Technical Reports Server (NTRS)

    Croston, R. C.

    1973-01-01

    The model is a combined, steady-state cardiovascular and thermal model. It was originally developed for interactive use but was converted to batch-mode simulation for the Sigma 3 computer. Its purpose is to compute steady-state circulatory and thermal variables in response to exercise workloads and environmental factors. During a computer simulation run, several selected variables are printed at each time step. End conditions are also printed at the completion of the run.

  15. Long Range Debye-Hückel Correction for Computation of Grid-based Electrostatic Forces Between Biomacromolecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mereghetti, Paolo; Martinez, M.; Wade, Rebecca C.

    Brownian dynamics (BD) simulations can be used to study very large molecular systems, such as models of the intracellular environment, using atomic-detail structures. Such simulations require strategies to contain the computational costs, especially for the computation of interaction forces and energies. A common approach is to compute interaction forces between macromolecules by precomputing their interaction potentials on three-dimensional discretized grids. For long-range interactions, such as electrostatics, grid-based methods are subject to finite size errors. We describe here the implementation of a Debye-Hückel correction to the grid-based electrostatic potential used in the SDA BD simulation software that was applied to simulate solutions of bovine serum albumin and of hen egg white lysozyme.
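
    The long-range correction has the familiar Debye-Hückel (screened Coulomb) form; the sketch below evaluates that pair energy and the inverse Debye length for a 1:1 electrolyte using generic formulas and illustrative parameters, not SDA's actual implementation:

        import math

        e, eps0 = 1.602176634e-19, 8.8541878128e-12   # elementary charge (C), vacuum permittivity (F/m)
        kB, NA = 1.380649e-23, 6.02214076e23           # Boltzmann constant (J/K), Avogadro's number

        def kappa(ionic_strength_molar, eps_r=78.5, T=298.15):
            """Inverse Debye length (1/m) for a 1:1 electrolyte of given ionic strength (mol/L)."""
            n = ionic_strength_molar * 1000.0 * NA     # number density of each ion species, 1/m^3
            return math.sqrt(2.0 * n * e**2 / (eps_r * eps0 * kB * T))

        def screened_coulomb(q1, q2, r, ionic_strength_molar, eps_r=78.5, T=298.15):
            """Debye-Hückel energy (J) of point charges q1, q2 (in units of e) at separation r (m)."""
            k = kappa(ionic_strength_molar, eps_r, T)
            return q1 * q2 * e**2 * math.exp(-k * r) / (4.0 * math.pi * eps_r * eps0 * r)

        print(1e9 / kappa(0.15))                                   # Debye length ~ 0.8 nm at 150 mM
        print(screened_coulomb(1, 1, 3e-9, 0.15) / (kB * 298.15))  # pair energy in units of kT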

  16. Development of a recursion RNG-based turbulence model

    NASA Technical Reports Server (NTRS)

    Zhou, YE; Vahala, George; Thangam, S.

    1993-01-01

    Reynolds stress closure models based on the recursion renormalization group theory are developed for the prediction of turbulent separated flows. The proposed model uses a finite wavenumber truncation scheme to account for the spectral distribution of energy. In particular, the model incorporates effects of both local and nonlocal interactions. The nonlocal interactions are shown to yield a contribution identical to that from the epsilon-renormalization group (RNG), while the local interactions introduce higher-order dispersive effects. A formal analysis of the model is presented and its ability to accurately predict separated flows is analyzed from a combined theoretical and computational standpoint. Turbulent flow past a backward-facing step is chosen as a test case and the results obtained based on detailed computations demonstrate that the proposed recursion-RNG model with finite cut-off wavenumber can yield very good predictions for the backstep problem.

  17. Modeling Creep-Fatigue-Environment Interactions in Steam Turbine Rotor Materials for Advanced Ultra-supercritical Coal Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Chen

    2014-04-01

    The goal of this project is to model creep-fatigue-environment interactions in steam turbine rotor materials (such as Alloy 282) for advanced ultra-supercritical (A-USC) coal power plants, to develop and demonstrate computational algorithms for alloy property predictions, and to determine and model key mechanisms that contribute to the damage caused by creep-fatigue-environment interactions.

  18. DNS and modeling of the interaction between turbulent premixed flames and walls

    NASA Technical Reports Server (NTRS)

    Poinsot, T. J.; Haworth, D. C.

    1992-01-01

    The interaction between turbulent premixed flames and walls is studied using a two-dimensional full Navier-Stokes solver with simple chemistry. The effects of wall distance on the local and global flame structure are investigated. Quenching distances and maximum wall heat fluxes during quenching are computed in laminar cases and are found to be comparable to experimental and analytical results. For turbulent cases, it is shown that quenching distances and maximum heat fluxes remain of the same order as for laminar flames. Based on simulation results, a 'law-of-the-wall' model is derived to describe the interaction between a turbulent premixed flame and a wall. This model is constructed to provide reasonable behavior of flame surface density near a wall under the assumption that flame-wall interaction takes place at scales smaller than the computational mesh. It can be implemented in conjunction with any of several recent flamelet models based on a modeled surface density equation, with no additional constraints on mesh size or time step.

  19. A Computer Model of the Cardiovascular System for Effective Learning.

    ERIC Educational Resources Information Center

    Rothe, Carl F.

    1980-01-01

    Presents a model of the cardiovascular system which solves a set of interacting, possibly nonlinear, differential equations. Figures present a schematic diagram of the model and printouts that simulate normal conditions, exercise, hemorrhage, reduced contractility. The nine interacting equations used to describe the system are described in the…

  20. Computational Flow Modeling of Human Upper Airway Breathing

    NASA Astrophysics Data System (ADS)

    Mylavarapu, Goutham

    Computational modeling of biological systems has gained considerable interest in biomedical research in the recent past. This thesis focuses on the application of computational simulations to study airflow dynamics in the human upper respiratory tract. With advancements in medical imaging, patient-specific geometries of anatomically accurate respiratory tracts can now be reconstructed from Magnetic Resonance Imaging (MRI) or Computed Tomography (CT) scans, with better and more accurate detail than traditional cadaver cast models. Computational studies using these individualized geometrical models have the advantages of non-invasiveness, ease of use, minimal patient interaction, and improved accuracy over experimental and clinical studies. Numerical simulations can provide detailed flow fields, including velocities, flow rates, airway wall pressure, shear stresses, and turbulence in an airway. Interpretation of these physical quantities will enable the development of efficient treatment procedures, medical devices, targeted drug delivery, etc. The hypothesis of this research is that computational modeling can predict the outcomes of a surgical intervention or a treatment plan prior to its application and will guide the physician in providing better treatment to patients. In the current work, three different computational approaches, Computational Fluid Dynamics (CFD), Flow-Structure Interaction (FSI), and particle-flow simulations, were used to investigate flow in airway geometries. The CFD approach assumes the airway wall is rigid and is relatively easy to simulate, compared with the more challenging FSI approach, in which interactions of airway wall deformations with the flow are also accounted for. The CFD methodology, using different turbulence models, is validated against experimental measurements in an airway phantom. Two case studies using CFD are demonstrated: one quantifying a pre- and post-operative airway, and another performing virtual surgery to determine the best possible surgery for a constricted airway. The unsteady Large Eddy Simulation (LES) and steady Reynolds-Averaged Navier-Stokes (RANS) approaches in CFD modeling are discussed. The more challenging FSI approach is modeled first in a simple two-dimensional anatomical geometry, then extended to a simplified three-dimensional geometry, and finally to three-dimensionally accurate geometries. The concepts of virtual surgery and the differences from CFD are discussed. Finally, the influence of various drug delivery parameters on particle deposition efficiency in airway anatomy is investigated through particle-flow simulations in a nasal airway model.

  1. LEOrbit: A program to calculate parameters relevant to modeling Low Earth Orbit spacecraft-plasma interaction

    NASA Astrophysics Data System (ADS)

    Marchand, R.; Purschke, D.; Samson, J.

    2013-03-01

    Understanding the physics of interaction between satellites and the space environment is essential in planning and exploiting space missions. Several computer models have been developed over the years to study this interaction. In all cases, simulations are carried out in the reference frame of the spacecraft and effects such as charging, the formation of electrostatic sheaths and wakes are calculated for given conditions of the space environment. In this paper we present a program used to compute magnetic fields and a number of space plasma and space environment parameters relevant to Low Earth Orbits (LEO) spacecraft-plasma interaction modeling. Magnetic fields are obtained from the International Geophysical Reference Field (IGRF) and plasma parameters are obtained from the International Reference Ionosphere (IRI) model. All parameters are computed in the spacecraft frame of reference as a function of its six Keplerian elements. They are presented in a format that can be used directly in most spacecraft-plasma interaction models. Catalogue identifier: AENY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AENY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 270308 No. of bytes in distributed program, including test data, etc.: 2323222 Distribution format: tar.gz Programming language: FORTRAN 90. Computer: Non specific. Operating system: Non specific. RAM: 7.1 MB Classification: 19, 4.14. External routines: IRI, IGRF (included in the package). Nature of problem: Compute magnetic field components, direction of the sun, sun visibility factor and approximate plasma parameters in the reference frame of a Low Earth Orbit satellite. Solution method: Orbit integration, calls to IGRF and IRI libraries and transformation of coordinates from geocentric to spacecraft frame reference. Restrictions: Low Earth orbits, altitudes between 150 and 2000 km. Running time: Approximately two seconds to parameterize a full orbit with 1000 points.
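
    To illustrate the orbit-parameterization step that turns Keplerian elements into spacecraft positions, a minimal sketch of solving Kepler's equation by Newton iteration is shown below; these are generic celestial-mechanics formulas, not LEOrbit's actual routines:

        import math

        def eccentric_anomaly(M, e, tol=1e-12):
            """Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E."""
            E = M if e < 0.8 else math.pi
            for _ in range(50):
                dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
                E -= dE
                if abs(dE) < tol:
                    break
            return E

        def orbital_radius(a_km, e, M):
            """Distance from Earth's centre (km) for semi-major axis a, eccentricity e, mean anomaly M."""
            return a_km * (1.0 - e * math.cos(eccentric_anomaly(M, e)))

        # Example: a near-circular LEO orbit, a = 6778 km (~400 km altitude), e = 0.001
        print(orbital_radius(6778.0, 0.001, math.radians(90.0)))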

  2. High-performance biocomputing for simulating the spread of contagion over large contact networks

    PubMed Central

    2012-01-01

    Background Many important biological problems can be modeled as contagion diffusion processes over interaction networks. This article shows how the EpiSimdemics interaction-based simulation system can be applied to the general contagion diffusion problem. Two specific problems, computational epidemiology and human immune system modeling, are given as examples. We then show how the graphics processing unit (GPU) within each compute node of a cluster can effectively be used to speed up the execution of these types of problems. Results We show that a single GPU can accelerate the EpiSimdemics computation kernel by a factor of 6 and the entire application by a factor of 3.3, compared to the execution time on a single core. When 8 CPU cores and 2 GPU devices are utilized, the speed-up of the computational kernel increases to 9.5. When combined with effective techniques for inter-node communication, excellent scalability can be achieved without significant loss of accuracy in the results. Conclusions We show that interaction-based simulation systems can be used to model disparate and highly relevant problems in biology. We also show that offloading some of the work to GPUs in distributed interaction-based simulations can be an effective way to achieve increased intra-node efficiency. PMID:22537298
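
    The reported kernel and whole-application speed-ups are mutually consistent under an Amdahl-style decomposition (this back-of-the-envelope estimate is ours, not the paper's): if a fraction f of the single-core runtime is spent in the accelerated kernel, then

        1 / 3.3 \approx (1 - f) + f / 6  \quad\Rightarrow\quad  f \approx 0.84,

    i.e. roughly 84% of the original runtime lies in the GPU-accelerated kernel, and the remaining ~16% bounds the achievable whole-application speed-up.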

  3. Investigation of Molecule-Surface Interactions With Overtone Absorption Spectroscopy and Computational Methods

    DTIC Science & Technology

    2010-11-01

    method at a fraction of the computational cost. The overtone frequency serves as the bridge between the molecule-surface interaction model and...the computational cost of utilizing higher levels of theory such as MP2. The second task is the calculation of absorption frequencies as a function...the methyl C-H bonds, and the carbon and hydrogen atomic masses, respectively. The calculation of the fundamental and overtone

  4. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
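
    A minimal sketch of the object-oriented style described here is given below: molecular entities are represented as objects whose local interaction rules are executed step by step, so that aggregate behaviour emerges from the simulation rather than being prescribed. The classes, rates, and species are purely illustrative and are not taken from the paper:

        import random

        class Receptor:
            """A cell-surface receptor that can bind one ligand object."""
            def __init__(self):
                self.bound = None
            def try_bind(self, ligand, p_on=0.05):
                if self.bound is None and random.random() < p_on:
                    self.bound = ligand

        class Cell:
            """A cell owning a population of receptors; 'activation' emerges from binding events."""
            def __init__(self, n_receptors=100):
                self.receptors = [Receptor() for _ in range(n_receptors)]
            def step(self, n_ligands):
                for _ in range(n_ligands):
                    random.choice(self.receptors).try_bind(object())
            def fraction_bound(self):
                return sum(r.bound is not None for r in self.receptors) / len(self.receptors)

        cell = Cell()
        for _ in range(200):              # 200 time steps of ligand exposure
            cell.step(n_ligands=20)
        print(cell.fraction_bound())      # emergent receptor occupancy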

  5. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  6. Laser interferometer skin-friction measurements of crossing-shock-wave/turbulent-boundary-layer interactions

    NASA Technical Reports Server (NTRS)

    Garrison, T. J.; Settles, G. S.; Narayanswami, N.; Knight, D. D.

    1994-01-01

    Wall shear stress measurements beneath crossing-shock-wave/turbulent boundary-layer interactions have been made for three interactions of different strengths. The interactions are generated by two sharp fins at symmetric angles of attack mounted on a flat plate. The shear stress measurements were made for fin angles of 7 and 11 deg at Mach 3 and 15 deg at Mach 3.85. The measurements were made using a laser interferometer skin-friction meter, a device that determines the wall shear by optically measuring the time rate of thinning of an oil film placed on the test model surface. Results of the measurements reveal high skin-friction coefficients in the vicinity of the fin/plate junction and the presence of quasi-two-dimensional flow separation on the interaction centerline. Additionally, two Navier-Stokes computations, one using a Baldwin-Lomax turbulence model and one using a k-epsilon model, are compared with the experimental results for the Mach 3.85, 15-deg interaction case. Although the k-epsilon model did a reasonable job of predicting the overall trend in portions of the skin-friction distribution, neither computation fully captured the physics of the near-surface flow in this complex interaction.
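
    The oil-film technique referred to above rests on lubrication theory: for a thin film of viscosity \mu driven by the wall shear stress, the film height h(x, t) obeys, in its simplest form (a standard thin-oil-film relation quoted here for context, neglecting pressure-gradient and gravity terms, and not taken from the paper),

        \frac{\partial h}{\partial t} + \frac{\partial}{\partial x}\!\left(\frac{\tau_w h^2}{2\mu}\right) = 0,

    so the interferometrically measured rate of thinning of the film yields the local wall shear stress \tau_w.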

  7. Laser Interferometer Skin-Friction measurements of crossing-shock wave/turbulent boundary-layer interactions

    NASA Technical Reports Server (NTRS)

    Garrison, T. J.; Settles, G. S.

    1993-01-01

    Wall shear stress measurements beneath crossing-shock-wave/turbulent boundary-layer interactions have been made for three interactions of different strengths. The interactions are generated by two sharp fins at symmetric angles of attack mounted on a flat plate. The shear stress measurements were made for fin angles of 7 and 11 degrees at Mach 3 and 15 degrees at Mach 4. The measurements were made using a Laser Interferometer Skin Friction (LISF) meter, a device that determines the wall shear by optically measuring the time rate of thinning of an oil film placed on the test model surface. Results of the measurements reveal high skin friction coefficients in the vicinity of the fin/plate junction and the presence of quasi-two-dimensional flow separation on the interaction centerline. Additionally, two Navier-Stokes computations, one using a Baldwin-Lomax turbulence model and one using a k-epsilon model, are compared to the experimental results for the Mach 4, 15-degree interaction case. While the k-epsilon model did a reasonable job of predicting the overall trend in portions of the skin friction distribution, neither computation fully captured the physics of the near-surface flow in this complex interaction.

  8. Computational Social Creativity.

    PubMed

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  9. Computational Analysis and Simulation of Empathic Behaviors: A Survey of Empathy Modeling with Behavioral Signal Processing Framework

    PubMed Central

    Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.

    2017-01-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830

  10. An image-based reaction field method for electrostatic interactions in molecular dynamics simulations of aqueous solutions

    NASA Astrophysics Data System (ADS)

    Lin, Yuchun; Baumketner, Andrij; Deng, Shaozhong; Xu, Zhenli; Jacobs, Donald; Cai, Wei

    2009-10-01

    In this paper, a new solvation model is proposed for simulations of biomolecules in aqueous solutions that combines the strengths of explicit and implicit solvent representations. Solute molecules are placed in a spherical cavity filled with explicit water, thus providing microscopic detail where it is most needed. Solvent outside of the cavity is modeled as a dielectric continuum whose effect on the solute is treated through the reaction field corrections. With this explicit/implicit model, the electrostatic potential represents a solute molecule in an infinite bath of solvent, thus avoiding unphysical interactions between periodic images of the solute commonly used in the lattice-sum explicit solvent simulations. For improved computational efficiency, our model employs an accurate and efficient multiple-image charge method to compute reaction fields together with the fast multipole method for the direct Coulomb interactions. To minimize the surface effects, periodic boundary conditions are employed for nonelectrostatic interactions. The proposed model is applied to study liquid water. The effect of model parameters, which include the size of the cavity, the number of image charges used to compute reaction field, and the thickness of the buffer layer, is investigated in comparison with the particle-mesh Ewald simulations as a reference. An optimal set of parameters is obtained that allows for a faithful representation of many structural, dielectric, and dynamic properties of the simulated water, while maintaining manageable computational cost. With controlled and adjustable accuracy of the multiple-image charge representation of the reaction field, it is concluded that the employed model achieves convergence with only one image charge in the case of pure water. Future applications to pKa calculations, conformational sampling of solvated biomolecules and electrolyte solutions are briefly discussed.
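
    For orientation, the simplest (single) image-charge construction for a spherical dielectric cavity is often written as below; this is the generic Friedman-type leading-order approximation, quoted here for illustration with a the cavity radius, epsilon_i and epsilon_o the inner and outer permittivities, and r_s the distance of the source charge q from the center. It is not intended to reproduce the multiple-image formulas used in the cited work.

        \[
        q' \approx \frac{\epsilon_i - \epsilon_o}{\epsilon_i + \epsilon_o}\,\frac{a}{r_s}\, q,
        \qquad
        r_K = \frac{a^2}{r_s},
        \]

    i.e., the reaction field acting on charges inside the cavity is approximated by a single image charge q' placed at the Kelvin point r_K outside the cavity; the multiple-image method refines this picture with additional image charges along the same radial line.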

  11. Turbulent reacting flow computations including turbulence-chemistry interactions

    NASA Technical Reports Server (NTRS)

    Narayan, J. R.; Girimaji, S. S.

    1992-01-01

    A two-equation (k-epsilon) turbulence model has been extended to be applicable for compressible reacting flows. A compressibility correction model based on modeling the dilatational terms in the Reynolds stress equations has been used. A turbulence-chemistry interaction model is outlined. In this model, the effects of temperature and species mass concentrations fluctuations on the species mass production rates are decoupled. The effect of temperature fluctuations is modeled via a moment model, and the effect of concentration fluctuations is included using an assumed beta-pdf model. Preliminary results obtained using this model are presented. A two-dimensional reacting mixing layer has been used as a test case. Computations are carried out using the Navier-Stokes solver SPARK using a finite rate chemistry model for hydrogen-air combustion.
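
    For context, an assumed beta-pdf closure of the kind mentioned above is conventionally parameterized by the mean and variance of the fluctuating scalar; the expressions below give that standard form (with c the reaction-progress or concentration variable), and the symbols are introduced here for illustration rather than drawn from the report.

        \[
        P(c) = \frac{c^{\,a-1}(1-c)^{\,b-1}}{B(a,b)},
        \qquad
        a = \bar{c}\left(\frac{\bar{c}(1-\bar{c})}{\overline{c'^2}} - 1\right),
        \qquad
        b = a\,\frac{1-\bar{c}}{\bar{c}},
        \]

    and the mean chemical source term is then evaluated as \(\overline{\dot{\omega}} = \int_0^1 \dot{\omega}(c)\,P(c)\,dc\).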

  12. ComPLuS Model: A New Insight in Pupils' Collaborative Talk, Actions and Balance during a Computer-Mediated Music Task

    ERIC Educational Resources Information Center

    Nikolaidou, Georgia N.

    2012-01-01

    This exploratory work describes and analyses the collaborative interactions that emerge during computer-based music composition in the primary school. The study draws on socio-cultural theories of learning, originated within Vygotsky's theoretical context, and proposes a new model, namely Computer-mediated Praxis and Logos under Synergy (ComPLuS).…

  13. Computing Support for Basic Research in Perception and Cognition

    DTIC Science & Technology

    1988-12-07

    hearing aids and cochlear implants, this suggests that certain types of proposed coding schemes, specifically those employing periodicity tuning in... developing a computer model of the interaction of declarative and procedural knowledge in skill acquisition. In the Visual Psychophysics Laboratory... Psycholinguistics Laboratory a computer model of text comprehension and recall has been constructed and several experiments have been completed that verify basic

  14. Objectively Determining the Educational Potential of Computer and Video-Based Courseware; or, Producing Reliable Evaluations Despite the Dog and Pony Show.

    ERIC Educational Resources Information Center

    Barrett, Andrew J.; And Others

    The Center for Interactive Technology, Applications, and Research at the College of Engineering of the University of South Florida (Tampa) has developed objective and descriptive evaluation models to assist in determining the educational potential of computer and video courseware. The computer-based courseware evaluation model and the video-based…

  15. Computer animation of modal and transient vibrations

    NASA Technical Reports Server (NTRS)

    Lipman, Robert R.

    1987-01-01

    An interactive computer graphics processor is described that is capable of generating input to animate modal and transient vibrations of finite element models on an interactive graphics system. The results from NASTRAN can be postprocessed such that a three dimensional wire-frame picture, in perspective, of the finite element mesh is drawn on the graphics display. Modal vibrations of any mode shape or transient motions over any range of steps can be animated. The finite element mesh can be color-coded by any component of displacement. Viewing parameters and the rate of vibration of the finite element model can be interactively updated while the structure is vibrating.
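
    As a toy illustration of the animation idea described above (not the NASTRAN postprocessor itself), a mode shape can be animated by scaling the eigenvector with a sinusoid at the modal frequency and redrawing the mesh each frame. The mesh, mode shape, and frequency below are hypothetical.

        import numpy as np
        import matplotlib.pyplot as plt
        from matplotlib.animation import FuncAnimation

        # Hypothetical 1D cantilever mesh: undeformed node positions and one mode shape
        x_nodes = np.linspace(0.0, 1.0, 21)           # undeformed axial coordinates
        phi = 1.0 - np.cos(np.pi * x_nodes / 2.0)     # illustrative first bending mode
        omega = 2.0 * np.pi * 5.0                     # modal frequency [rad/s], assumed

        fig, ax = plt.subplots()
        line, = ax.plot(x_nodes, np.zeros_like(x_nodes), "o-")
        ax.set_ylim(-1.5, 1.5)

        def update(frame):
            t = frame / 30.0
            # Transverse displacement = mode shape scaled by a sinusoid at omega
            line.set_ydata(phi * np.sin(omega * t))
            return (line,)

        anim = FuncAnimation(fig, update, frames=300, interval=33, blit=True)
        plt.show()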

  16. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

    This paper presents a project to develop a web platform, WebNLO, for computer modeling of nonlinear optics phenomena. We discuss a general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogue in the field of nonlinear optics and is being created for the first time; by giving researchers access to high-performance computing resources, it is expected to significantly reduce the cost of the research and development process.

  17. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  18. Hierarchical, parallel computing strategies using component object model for process modelling responses of forest plantations to interacting multiple stresses

    Treesearch

    J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech

    2000-01-01

    Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...

  19. Nearly Supersymmetric Dark Atoms

    DOE PAGES

    Behbahani, Siavosh R.; Jankowiak, Martin; Rube, Tomas; ...

    2011-01-01

    Theories of dark matter that support bound states are an intriguing possibility for the identity of the missing mass of the Universe. This article proposes a class of models of supersymmetric composite dark matter where the interactions with the Standard Model communicate supersymmetry breaking to the dark sector. In these models, supersymmetry breaking can be treated as a perturbation on the spectrum of bound states. Using a general formalism, the spectrum with leading supersymmetry effects is computed without specifying the details of the binding dynamics. The interactions of the composite states with the Standard Model are computed, and several benchmark models are described. General features of nonrelativistic supersymmetric bound states are emphasized.

  20. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science

    PubMed Central

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-01-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, “Interdisciplinary Insights into Group and Team Dynamics,” which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges. PMID:29249891

  1. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    PubMed

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  2. Stochastic analog neutron transport with TRIPOLI-4 and FREYA: Bayesian uncertainty quantification for neutron multiplicity counting

    DOE PAGES

    Verbeke, J. M.; Petit, O.

    2016-06-01

    From nuclear safeguards to homeland security applications, the need for the better modeling of nuclear interactions has grown over the past decades. Current Monte Carlo radiation transport codes compute average quantities with great accuracy and performance; however, performance and averaging come at the price of limited interaction-by-interaction modeling. These codes often lack the capability of modeling interactions exactly: for a given collision, energy is not conserved, energies of emitted particles are uncorrelated, and multiplicities of prompt fission neutrons and photons are uncorrelated. Many modern applications require more exclusive quantities than averages, such as the fluctuations in certain observables (e.g., the neutron multiplicity) and correlations between neutrons and photons. In an effort to meet this need, the radiation transport Monte Carlo code TRIPOLI-4® was modified to provide a specific mode that models nuclear interactions in a full analog way, replicating as much as possible the underlying physical process. Furthermore, the computational model FREYA (Fission Reaction Event Yield Algorithm) was coupled with TRIPOLI-4 to model complete fission events. As a result, FREYA automatically includes fluctuations as well as correlations resulting from conservation of energy and momentum.

  3. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    ERIC Educational Resources Information Center

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  4. A Guide to Analyzing Message-Response Sequences and Group Interaction Patterns in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Jeong, Allan

    2005-01-01

    This paper proposes a set of methods and a framework for evaluating, modeling, and predicting group interactions in computer-mediated communication. The method of sequential analysis is described along with specific software tools and techniques to facilitate the analysis of message-response sequences. In addition, the Dialogic Theory and its…

  5. Visual Debugging of Object-Oriented Systems With the Unified Modeling Language

    DTIC Science & Technology

    2004-03-01

    to be “the systematic and imaginative use of the technology of interactive computer graphics and the disciplines of graphic design, typography... Graphics, volume 23, no. 6, pp. 893-901, 1999. [SHN98] Shneiderman, B. Designing the User Interface: Strategies for Effective Human-Computer Interaction... System Design Objectives... System Architecture

  6. Pair mobility functions for rigid spheres in concentrated colloidal dispersions: Stresslet and straining motion couplings

    NASA Astrophysics Data System (ADS)

    Su, Yu; Swan, James W.; Zia, Roseanna N.

    2017-03-01

    Accurate modeling of particle interactions arising from hydrodynamic, entropic, and other microscopic forces is essential to understanding and predicting particle motion and suspension behavior in complex and biological fluids. The long-range nature of hydrodynamic interactions can be particularly challenging to capture. In dilute dispersions, pair-level interactions are sufficient and can be modeled in detail by analytical relations derived by Jeffrey and Onishi [J. Fluid Mech. 139, 261-290 (1984)] and Jeffrey [Phys. Fluids A 4, 16-29 (1992)]. In more concentrated dispersions, analytical modeling of many-body hydrodynamic interactions quickly becomes intractable, leading to the development of simplified models. These include mean-field approaches that smear out particle-scale structure and essentially assume that long-range hydrodynamic interactions are screened by crowding, as particle mobility decays at high concentrations. Toward the development of an accurate and simplified model for the hydrodynamic interactions in concentrated suspensions, we recently computed a set of effective pair hydrodynamic functions coupling particle motion to a hydrodynamic force and torque at volume fractions up to 50% utilizing accelerated Stokesian dynamics and a fast stochastic sampling technique [Zia et al., J. Chem. Phys. 143, 224901 (2015)]. We showed that the hydrodynamic mobility in suspensions of colloidal spheres is not screened, and the power law decay of the hydrodynamic functions persists at all concentrations studied. In the present work, we extend these mobility functions to include the couplings of particle motion and straining flow to the hydrodynamic stresslet. The couplings computed in these two articles constitute a set of orthogonal coupling functions that can be utilized to compute equilibrium properties in suspensions at arbitrary concentration and are readily applied to solve many-body hydrodynamic interactions analytically.

  7. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.

  8. Smart Swarms of Bacteria-Inspired Agents with Performance Adaptable Interactions

    PubMed Central

    Shklarsh, Adi; Ariel, Gil; Schneidman, Elad; Ben-Jacob, Eshel

    2011-01-01

    Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment – by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots. PMID:21980274

  9. Smart swarms of bacteria-inspired agents with performance adaptable interactions.

    PubMed

    Shklarsh, Adi; Ariel, Gil; Schneidman, Elad; Ben-Jacob, Eshel

    2011-09-01

    Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment--by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots.
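
    A minimal sketch of the performance-adaptable interaction rule described above might look like the following; the weighting scheme, neighborhood radius, and "beneficial direction" function are illustrative assumptions, not the authors' actual update equations.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 50
        pos = rng.uniform(0.0, 10.0, size=(n, 2))                 # agent positions
        vel = rng.normal(size=(n, 2))
        vel /= np.linalg.norm(vel, axis=1, keepdims=True)

        def beneficial_direction(p):
            """Hypothetical gradient: unit vector toward a food source at (8, 8)."""
            d = np.array([8.0, 8.0]) - p
            return d / (np.linalg.norm(d) + 1e-9)

        def step(pos, vel, dt=0.1):
            new_vel = np.empty_like(vel)
            for i in range(n):
                grad = beneficial_direction(pos[i])
                improving = np.dot(vel[i], grad) > 0.0            # heading in a beneficial direction?
                w_peers = 0.2 if improving else 0.8               # adapt the peers' influence accordingly
                near = np.linalg.norm(pos - pos[i], axis=1) < 2.0
                peer_dir = vel[near].mean(axis=0)                 # alignment with nearby agents
                d = (1.0 - w_peers) * grad + w_peers * peer_dir
                new_vel[i] = d / (np.linalg.norm(d) + 1e-9)
            return pos + dt * new_vel, new_vel

        for _ in range(100):
            pos, vel = step(pos, vel)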

  10. Reconstruction of an Immune Dynamic Model to Simulate the Contrasting Role of Auxin and Cytokinin in Plant Immunity.

    PubMed

    Kaltdorf, Martin; Dandekar, Thomas; Naseem, Muhammad

    2017-01-01

    In order to increase our understanding of biological dependencies in plant immune signaling pathways, the known interactions involved in plant immune networks are modeled. This allows computational analysis to predict the functions of growth-related hormones in plant-pathogen interaction. The SQUAD (Standardized Qualitative Dynamical Systems) algorithm first determines stable system states in the network and then uses them to compute continuous dynamical system states. Our reconstructed Boolean model, encompassing hormone immune networks of Arabidopsis thaliana (Arabidopsis) and pathogenicity factors injected by the model pathogen Pseudomonas syringae pv. tomato DC3000 (Pst DC3000), can be exploited to determine the impact of growth hormones in plant immunity. We describe a detailed working protocol for using the modified SQUAD package, exemplified by the contrasting effects of auxin and cytokinins in shaping plant-pathogen interaction.
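
    To make the Boolean-to-continuous idea concrete, a much-simplified stand-in for a SQUAD-style update is sketched below: each node's continuous state relaxes toward a steep sigmoid of the weighted sum of its regulators. The three-node network, weights, gain, and decay rate are hypothetical, and the right-hand side is not the exact SQUAD equation.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical 3-node network: +1 for activation, -1 for inhibition
        W = np.array([[ 0.0, -1.0,  0.0],    # node 0 is inhibited by node 1
                      [ 1.0,  0.0, -1.0],    # node 1 is activated by 0, inhibited by 2
                      [ 0.0,  1.0,  0.0]])   # node 2 is activated by node 1
        h, gamma = 10.0, 1.0                 # sigmoid gain and decay rate (assumed)

        def rhs(t, x):
            w = W @ x                                        # regulatory input to each node
            act = 1.0 / (1.0 + np.exp(-h * (w - 0.5)))       # steep, normalized sigmoid
            return act - gamma * x                           # relax toward the activation level

        sol = solve_ivp(rhs, (0.0, 20.0), [0.9, 0.1, 0.1])
        print(sol.y[:, -1])                                  # approximate steady state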

  11. Numerical prediction of fiber orientation in injection-molded short-fiber/thermoplastic composite parts with experimental validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thi, Thanh Binh Nguyen; Morioka, Mizuki; Yokoyama, Atsushi

    Numerical prediction of the fiber orientation in short-glass-fiber (GF) reinforced polyamide 6 (PA6) composites with fiber weight concentrations of 30%, 50%, and 70%, manufactured by the injection molding process, is presented. The fiber orientation was also directly observed and measured through X-ray computed tomography. During the injection molding process of a short-fiber/thermoplastic composite, the fiber orientation is produced by the flow states and the fiber-fiber interaction. The Folgar-Tucker equation is well known for modeling the fiber orientation in a concentrated suspension; it adds to Jeffery's equation a diffusive term with a phenomenological coefficient that accounts for the fiber-fiber interaction. Our model for the fiber-fiber interaction was proposed by modifying the rotary diffusion term of the Folgar-Tucker equation. This model was presented in a conference paper of the 29th International Conference of the Polymer Processing Society published in an AIP conference proceeding. For modeling fiber interaction, a fiber dynamic simulation was introduced in order to obtain a global fiber interaction coefficient, which is a function of the fiber concentration, aspect ratio, and angular velocity. The fiber orientation is predicted by using the proposed fiber interaction model incorporated into the computer-aided engineering simulation package C-Mold. An experimental program has been carried out in which the fiber orientation distribution was measured in a 100 x 100 x 2 mm injection-molded plate and a 100 x 80 x 2 mm injection-molded weld, analyzed with a high-resolution 3D X-ray computed tomography system (XVA-160α) and calculated by X-ray computed tomography imaging. The numerical prediction shows good agreement with the experimental validation, and the complex fiber orientation in the injection-molded weld was investigated.

  12. Numerical prediction of fiber orientation in injection-molded short-fiber/thermoplastic composite parts with experimental validation

    NASA Astrophysics Data System (ADS)

    Thi, Thanh Binh Nguyen; Morioka, Mizuki; Yokoyama, Atsushi; Hamanaka, Senji; Yamashita, Katsuhisa; Nonomura, Chisato

    2015-05-01

    Numerical prediction of the fiber orientation in short-glass-fiber (GF) reinforced polyamide 6 (PA6) composites with fiber weight concentrations of 30%, 50%, and 70%, manufactured by the injection molding process, is presented. The fiber orientation was also directly observed and measured through X-ray computed tomography. During the injection molding process of a short-fiber/thermoplastic composite, the fiber orientation is produced by the flow states and the fiber-fiber interaction. The Folgar-Tucker equation is well known for modeling the fiber orientation in a concentrated suspension; it adds to Jeffery's equation a diffusive term with a phenomenological coefficient that accounts for the fiber-fiber interaction. Our model for the fiber-fiber interaction was proposed by modifying the rotary diffusion term of the Folgar-Tucker equation. This model was presented in a conference paper of the 29th International Conference of the Polymer Processing Society published in an AIP conference proceeding. For modeling fiber interaction, a fiber dynamic simulation was introduced in order to obtain a global fiber interaction coefficient, which is a function of the fiber concentration, aspect ratio, and angular velocity. The fiber orientation is predicted by using the proposed fiber interaction model incorporated into the computer-aided engineering simulation package C-Mold. An experimental program has been carried out in which the fiber orientation distribution was measured in a 100 x 100 x 2 mm injection-molded plate and a 100 x 80 x 2 mm injection-molded weld, analyzed with a high-resolution 3D X-ray computed tomography system (XVA-160α) and calculated by X-ray computed tomography imaging. The numerical prediction shows good agreement with the experimental validation, and the complex fiber orientation in the injection-molded weld was investigated.
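
    For reference, the Folgar-Tucker evolution equation for the second-order orientation tensor A is commonly quoted in the form below (W and D are the vorticity and rate-of-deformation tensors, xi a particle shape factor, A4 the fourth-order orientation tensor, C_I the interaction coefficient, and gamma-dot the scalar shear rate). This is the textbook form; the modified rotary-diffusion term developed by the authors is not reproduced here.

        \[
        \frac{DA}{Dt} = \left(W\cdot A - A\cdot W\right)
        + \xi\left(D\cdot A + A\cdot D - 2\,A_4{:}D\right)
        + 2\,C_I\,\dot{\gamma}\left(I - 3A\right),
        \]

    where the last term models fiber-fiber interactions as an isotropic rotary diffusion.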

  13. Computational Modeling of Fluid–Structure–Acoustics Interaction during Voice Production

    PubMed Central

    Jiang, Weili; Zheng, Xudong; Xue, Qian

    2017-01-01

    The paper presents a three-dimensional, first-principles-based fluid–structure–acoustics interaction computer model of voice production, which employs more realistic human laryngeal and vocal tract geometries. Self-sustained vibrations, the important convergent–divergent vibration pattern of the vocal folds, and entrainment of the two dominant vibratory modes were captured. Voice quality-associated parameters including the frequency, open quotient, skewness quotient, and flow rate of the glottal flow waveform were found to be well within the normal physiological ranges. The analogy between the vocal tract and a quarter-wave resonator was demonstrated. The acoustic perturbed flux and pressure inside the glottis were found to be of the same order as their incompressible counterparts, suggesting strong source–filter interactions during voice production. Such a high-fidelity computational model will be useful for investigating a variety of pathological conditions that involve complex vibrations, such as vocal fold paralysis, vocal nodules, and vocal polyps. The model is also an important step toward a patient-specific surgical planning tool that can serve as a no-risk trial-and-error platform for different procedures, such as injection of biomaterials and thyroplastic medialization. PMID:28243588

  14. Flowfield analysis for successive oblique shock wave-turbulent boundary layer interactions

    NASA Technical Reports Server (NTRS)

    Sun, C. C.; Childs, M. E.

    1976-01-01

    A computation procedure is described for predicting the flowfields which develop when successive interactions between oblique shock waves and a turbulent boundary layer occur. Such interactions may occur, for example, in engine inlets for supersonic aircraft. Computations are carried out for axisymmetric internal flows at M 3.82 and 2.82. The effect of boundary layer bleed is considered for the M 2.82 flow. A control volume analysis is used to predict changes in the flow field across the interactions. Two bleed flow models have been considered. A turbulent boundary layer program is used to compute changes in the boundary layer between the interactions. The results given are for flows with two shock wave interactions and for bleed at the second interaction site. In principle the method described may be extended to account for additional interactions. The predicted results are compared with measured results and are shown to be in good agreement when the bleed flow rate is low (on the order of 3% of the boundary layer mass flow), or when there is no bleed. As the bleed flow rate is increased, differences between the predicted and measured results become larger. Shortcomings of the bleed flow models at higher bleed flow rates are discussed.

  15. Modeling the effect of nano-sized polymer particles on the properties of lipid membranes

    NASA Astrophysics Data System (ADS)

    Rossi, Giulia; Monticelli, Luca

    2014-12-01

    The interaction between polymers and biological membranes has recently gained significant interest in several research areas. On the biomedical side, dendrimers, linear polyelectrolytes, and neutral copolymers find application as drug and gene delivery agents, as biocidal agents, and as platforms for biological sensors. On the environmental side, plastic debris is often disposed of in the oceans and gets degraded into small particles; concern is therefore rising about the interaction of small plastic particles with living organisms. From both perspectives, it is crucial to understand the processes driving the interaction between polymers and cell membranes. In recent times, progress in computer technology and simulation methods has enabled computational predictions of the molecular mechanisms of interaction between polymeric materials and lipid membranes. Here we review the computational studies on the interaction between lipid membranes and different classes of polymers: dendrimers, linear charged polymers, polyethylene glycol (PEG) and its derivatives, polystyrene, and some generic models of polymer chains. We conclude by discussing some of the technical challenges in this area and future developments.

  16. LATIS3D: The Goal Standard for Laser-Tissue-Interaction Modeling

    NASA Astrophysics Data System (ADS)

    London, R. A.; Makarewicz, A. M.; Kim, B. M.; Gentile, N. A.; Yang, T. Y. B.

    2000-03-01

    The goal of this LDRD project has been to create LATIS3D, the world's premier computer program for laser-tissue interaction modeling. The development was based on recent experience with the 2D LATIS code and the ASCI code, KULL. With LATIS3D, important applications in laser medical therapy were researched, including dynamical calculations of tissue emulsification and ablation, photothermal therapy, and photon transport for photodynamic therapy. This project also enhanced LLNL's core competency in laser-matter interactions and high-energy-density physics by pushing simulation codes into new parameter regimes and by attracting external expertise. This will benefit both existing LLNL programs such as ICF and SBSS and emerging programs in medical technology and other laser applications. The purpose of this project was to develop and apply a computer program for laser-tissue interaction modeling to aid in the development of new instruments and procedures in laser medicine.

  17. Dyadic brain modelling, mirror systems and the ontogenetic ritualization of ape gesture

    PubMed Central

    Arbib, Michael; Ganesh, Varsha; Gasser, Brad

    2014-01-01

    The paper introduces dyadic brain modelling, offering both a framework for modelling the brains of interacting agents and a general framework for simulating and visualizing the interactions generated when the brains (and the two bodies) are each coded up in computational detail. It models selected neural mechanisms in ape brains supportive of social interactions, including putative mirror neuron systems inspired by macaque neurophysiology but augmented by increased access to proprioceptive state. Simulation results for a reduced version of the model show ritualized gesture emerging from interactions between a simulated child and mother ape. PMID:24778382

  18. Dyadic brain modelling, mirror systems and the ontogenetic ritualization of ape gesture.

    PubMed

    Arbib, Michael; Ganesh, Varsha; Gasser, Brad

    2014-01-01

    The paper introduces dyadic brain modelling, offering both a framework for modelling the brains of interacting agents and a general framework for simulating and visualizing the interactions generated when the brains (and the two bodies) are each coded up in computational detail. It models selected neural mechanisms in ape brains supportive of social interactions, including putative mirror neuron systems inspired by macaque neurophysiology but augmented by increased access to proprioceptive state. Simulation results for a reduced version of the model show ritualized gesture emerging from interactions between a simulated child and mother ape.

  19. A Computer Model of the Cardiovascular System for Effective Learning.

    ERIC Educational Resources Information Center

    Rothe, Carl F.

    1979-01-01

    Described is a physiological model which solves a set of interacting, possibly nonlinear, differential equations through numerical integration on a digital computer. Sample printouts are supplied and explained for effects on the components of a cardiovascular system when exercise, hemorrhage, and cardiac failure occur. (CS)

  20. Computational Psychotherapy Research: Scaling up the evaluation of patient-provider interactions

    PubMed Central

    Imel, Zac E.; Steyvers, Mark; Atkins, David C.

    2014-01-01

    In psychotherapy, the patient-provider interaction contains the treatment’s active ingredients. However, the technology for analyzing the content of this interaction has not fundamentally changed in decades, limiting both the scale and specificity of psychotherapy research. New methods are required in order to “scale up” to larger evaluation tasks and “drill down” into the raw linguistic data of patient-therapist interactions. In the current paper we demonstrate the utility of statistical text analysis models called topic models for discovering the underlying linguistic structure in psychotherapy. Topic models identify semantic themes (or topics) in a collection of documents (here, transcripts). We used topic models to summarize and visualize 1,553 psychotherapy and drug therapy (i.e., medication management) transcripts. Results showed that topic models identified clinically relevant content, including affective, content, and intervention related topics. In addition, topic models learned to identify specific types of therapist statements associated with treatment related codes (e.g., different treatment approaches, patient-therapist discussions about the therapeutic relationship). Visualizations of semantic similarity across sessions indicate that topic models identify content that discriminates between broad classes of therapy (e.g., cognitive behavioral therapy vs. psychodynamic therapy). Finally, predictive modeling demonstrated that topic model derived features can classify therapy type with a high degree of accuracy. Computational psychotherapy research has the potential to scale up the study of psychotherapy to thousands of sessions at a time, and we conclude by discussing the implications of computational methods such as topic models for the future of psychotherapy research and practice. PMID:24866972

  1. Computational psychotherapy research: scaling up the evaluation of patient-provider interactions.

    PubMed

    Imel, Zac E; Steyvers, Mark; Atkins, David C

    2015-03-01

    In psychotherapy, the patient-provider interaction contains the treatment's active ingredients. However, the technology for analyzing the content of this interaction has not fundamentally changed in decades, limiting both the scale and specificity of psychotherapy research. New methods are required to "scale up" to larger evaluation tasks and "drill down" into the raw linguistic data of patient-therapist interactions. In the current article, we demonstrate the utility of statistical text analysis models called topic models for discovering the underlying linguistic structure in psychotherapy. Topic models identify semantic themes (or topics) in a collection of documents (here, transcripts). We used topic models to summarize and visualize 1,553 psychotherapy and drug therapy (i.e., medication management) transcripts. Results showed that topic models identified clinically relevant content, including affective, relational, and intervention related topics. In addition, topic models learned to identify specific types of therapist statements associated with treatment-related codes (e.g., different treatment approaches, patient-therapist discussions about the therapeutic relationship). Visualizations of semantic similarity across sessions indicate that topic models identify content that discriminates between broad classes of therapy (e.g., cognitive-behavioral therapy vs. psychodynamic therapy). Finally, predictive modeling demonstrated that topic model-derived features can classify therapy type with a high degree of accuracy. Computational psychotherapy research has the potential to scale up the study of psychotherapy to thousands of sessions at a time. We conclude by discussing the implications of computational methods such as topic models for the future of psychotherapy research and practice.
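
    As a hedged illustration of the kind of topic-modeling workflow described above (not the authors' pipeline), fitting an LDA topic model to a collection of transcripts with scikit-learn looks roughly like this; the toy transcripts, topic count, and vocabulary size are placeholders.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Placeholder corpus: in practice, one string per session transcript
        transcripts = [
            "therapist and patient discuss feelings of anxiety and homework exercises",
            "patient describes childhood memories and the therapist reflects emotions",
            "medication dosage side effects and sleep are reviewed with the provider",
        ]

        vectorizer = CountVectorizer(stop_words="english", max_features=5000)
        counts = vectorizer.fit_transform(transcripts)            # document-term matrix

        lda = LatentDirichletAllocation(n_components=5, random_state=0)
        doc_topics = lda.fit_transform(counts)                    # per-transcript topic weights

        # Inspect the top words of each inferred topic
        vocab = vectorizer.get_feature_names_out()
        for k, weights in enumerate(lda.components_):
            top = weights.argsort()[-5:][::-1]
            print(f"topic {k}:", ", ".join(vocab[i] for i in top))

    Features such as doc_topics could then be passed to a standard classifier to predict therapy type, in the spirit of the predictive modeling described above.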

  2. Reduced-Order Biogeochemical Flux Model for High-Resolution Multi-Scale Biophysical Simulations

    NASA Astrophysics Data System (ADS)

    Smith, Katherine; Hamlington, Peter; Pinardi, Nadia; Zavatarelli, Marco

    2017-04-01

    Biogeochemical tracers and their interactions with upper ocean physical processes such as submesoscale circulations and small-scale turbulence are critical for understanding the role of the ocean in the global carbon cycle. These interactions can cause small-scale spatial and temporal heterogeneity in tracer distributions that can, in turn, greatly affect carbon exchange rates between the atmosphere and interior ocean. For this reason, it is important to take into account small-scale biophysical interactions when modeling the global carbon cycle. However, explicitly resolving these interactions in an earth system model (ESM) is currently infeasible due to the enormous associated computational cost. As a result, understanding and subsequently parameterizing how these small-scale heterogeneous distributions develop and how they relate to larger resolved scales is critical for obtaining improved predictions of carbon exchange rates in ESMs. In order to address this need, we have developed the reduced-order, 17 state variable Biogeochemical Flux Model (BFM-17) that follows the chemical functional group approach, which allows for non-Redfield stoichiometric ratios and the exchange of matter through units of carbon, nitrate, and phosphate. This model captures the behavior of open-ocean biogeochemical systems without substantially increasing computational cost, thus allowing the model to be combined with computationally-intensive, fully three-dimensional, non-hydrostatic large eddy simulations (LES). In this talk, we couple BFM-17 with the Princeton Ocean Model and show good agreement between predicted monthly-averaged results and Bermuda testbed area field data (including the Bermuda-Atlantic Time-series Study and Bermuda Testbed Mooring). Through these tests, we demonstrate the capability of BFM-17 to accurately model open-ocean biochemistry. Additionally, we discuss the use of BFM-17 within a multi-scale LES framework and outline how this will further our understanding of turbulent biophysical interactions in the upper ocean.

  3. Reduced-Order Biogeochemical Flux Model for High-Resolution Multi-Scale Biophysical Simulations

    NASA Astrophysics Data System (ADS)

    Smith, K.; Hamlington, P.; Pinardi, N.; Zavatarelli, M.; Milliff, R. F.

    2016-12-01

    Biogeochemical tracers and their interactions with upper ocean physical processes such as submesoscale circulations and small-scale turbulence are critical for understanding the role of the ocean in the global carbon cycle. These interactions can cause small-scale spatial and temporal heterogeneity in tracer distributions which can, in turn, greatly affect carbon exchange rates between the atmosphere and interior ocean. For this reason, it is important to take into account small-scale biophysical interactions when modeling the global carbon cycle. However, explicitly resolving these interactions in an earth system model (ESM) is currently infeasible due to the enormous associated computational cost. As a result, understanding and subsequently parametrizing how these small-scale heterogeneous distributions develop and how they relate to larger resolved scales is critical for obtaining improved predictions of carbon exchange rates in ESMs. In order to address this need, we have developed the reduced-order, 17 state variable Biogeochemical Flux Model (BFM-17). This model captures the behavior of open-ocean biogeochemical systems without substantially increasing computational cost, thus allowing the model to be combined with computationally-intensive, fully three-dimensional, non-hydrostatic large eddy simulations (LES). In this talk, we couple BFM-17 with the Princeton Ocean Model and show good agreement between predicted monthly-averaged results and Bermuda testbed area field data (including the Bermuda-Atlantic Time Series and Bermuda Testbed Mooring). Through these tests, we demonstrate the capability of BFM-17 to accurately model open-ocean biochemistry. Additionally, we discuss the use of BFM-17 within a multi-scale LES framework and outline how this will further our understanding of turbulent biophysical interactions in the upper ocean.

  4. Visual interaction: models, systems, prototypes. The Pictorial Computing Laboratory at the University of Rome La Sapienza.

    PubMed

    Bottoni, Paolo; Cinque, Luigi; De Marsico, Maria; Levialdi, Stefano; Panizzi, Emanuele

    2006-06-01

    This paper reports on the research activities performed by the Pictorial Computing Laboratory at the University of Rome, La Sapienza, during the last 5 years. Such work, essentially based on the study of human-computer interaction, spans from metamodels of interaction down to prototypes of interactive systems for synchronous multimedia communication and groupwork and annotation systems for web pages, and also encompasses theoretical and practical issues of visual languages and environments, including pattern recognition algorithms. Some applications are also considered, such as e-learning and collaborative work.

  5. Computational Aerodynamic Modeling of Small Quadcopter Vehicles

    NASA Technical Reports Server (NTRS)

    Yoon, Seokkwan; Ventura Diaz, Patricia; Boyd, D. Douglas; Chan, William M.; Theodore, Colin R.

    2017-01-01

    High-fidelity computational simulations have been performed which focus on rotor-fuselage and rotor-rotor aerodynamic interactions of small quad-rotor vehicle systems. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, low Mach number preconditioning, and hybrid turbulence modeling. Computational results for isolated rotors are shown to compare well with available experimental data. Computational results in hover reveal the differences between a conventional configuration where the rotors are mounted above the fuselage and an unconventional configuration where the rotors are mounted below the fuselage. Complex flow physics in forward flight is investigated. The goal of this work is to demonstrate that understanding of interactional aerodynamics can be an important factor in design decisions regarding rotor and fuselage placement for next-generation multi-rotor drones.

  6. Human Factors Considerations in System Design

    NASA Technical Reports Server (NTRS)

    Mitchell, C. M. (Editor); Vanbalen, P. M. (Editor); Moe, K. L. (Editor)

    1983-01-01

    Human factors considerations in systems design were examined. Human factors in automated command and control, in the efficiency of the human-computer interface, and in system effectiveness are outlined. The following topics are discussed: human factors aspects of control room design; design of interactive systems; human-computer dialogue, interaction tasks and techniques; guidelines on ergonomic aspects of control rooms and highly automated environments; system engineering for control by humans; conceptual models of information processing; information display and interaction in real-time environments.

  7. Promoter-enhancer interactions identified from Hi-C data using probabilistic models and hierarchical topological domains.

    PubMed

    Ron, Gil; Globerson, Yuval; Moran, Dror; Kaplan, Tommy

    2017-12-21

    Proximity-ligation methods such as Hi-C allow us to map physical DNA-DNA interactions along the genome, and reveal its organization into topologically associating domains (TADs). As Hi-C data accumulate, computational methods have been developed for identifying domain borders in multiple cell types and organisms. Here, we present PSYCHIC, a computational approach for analyzing Hi-C data and identifying promoter-enhancer interactions. We use a unified probabilistic model to segment the genome into domains, which we then merge hierarchically and fit using a local background model, allowing us to identify over-represented DNA-DNA interactions across the genome. By analyzing the published Hi-C data sets in human and mouse, we identify hundreds of thousands of putative enhancers and their target genes, and compile an extensive genome-wide catalog of gene regulation in human and mouse. As we show, our predictions are highly enriched for ChIP-seq and DNA accessibility data, evolutionary conservation, eQTLs and other DNA-DNA interaction data.
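
    To illustrate the general notion of calling over-represented DNA-DNA interactions against a background model (a simplification, not PSYCHIC's hierarchical model), one can score each observed contact count against its expected value with a one-sided Poisson test; the count and expectation arrays below are hypothetical.

        import numpy as np
        from scipy.stats import poisson

        # Hypothetical observed Hi-C contact counts and background-model expectations
        observed = np.array([12, 3, 45, 7, 30])
        expected = np.array([4.0, 3.5, 10.0, 6.0, 29.0])

        # One-sided p-value for over-representation: P(X >= observed | expected)
        p_values = poisson.sf(observed - 1, expected)

        for o, e, p in zip(observed, expected, p_values):
            flag = "candidate interaction" if p < 1e-3 else ""
            print(f"obs={o:3d}  exp={e:5.1f}  p={p:.2e}  {flag}")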

  8. Interactive collision detection for deformable models using streaming AABBs.

    PubMed

    Zhang, Xinyu; Kim, Young J

    2007-01-01

    We present an interactive and accurate collision detection algorithm for deformable, polygonal objects based on the streaming computational model. Our algorithm can detect all possible pairwise primitive-level intersections between two severely deforming models at highly interactive rates. In our streaming computational model, we consider a set of axis-aligned bounding boxes (AABBs) that bound each of the given deformable objects as an input stream and perform massively-parallel pairwise overlap tests on the incoming streams. As a result, we are able to prevent performance stalls in the streaming pipeline that can be caused by the expensive indexing mechanism required by bounding volume hierarchy-based streaming algorithms. At runtime, as the underlying models deform over time, we employ a novel streaming algorithm to update the geometric changes in the AABB streams. Moreover, in order to get only the computed result (i.e., collision results between AABBs) without reading back the entire output streams, we propose a streaming en/decoding strategy that can be performed in a hierarchical fashion. After determining overlapped AABBs, we perform a primitive-level (e.g., triangle) intersection check on a serial computational model such as CPUs. We implemented the entire pipeline of our algorithm using off-the-shelf graphics processors (GPUs), such as nVIDIA GeForce 7800 GTX, for streaming computations, and Intel Dual Core 3.4G processors for serial computations. We benchmarked our algorithm with different models of varying complexities, ranging from 15K up to 50K triangles, under various deformation motions, and the timings were obtained as approximately 30-100 FPS depending on the complexity of models and their relative configurations. Finally, we made comparisons with a well-known GPU-based collision detection algorithm, CULLIDE [4], and observed about a threefold performance improvement over the earlier approach. We also made comparisons with a SW-based AABB culling algorithm [2] and observed about a twofold improvement.
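
    The core primitive in the pipeline above is the pairwise AABB overlap test. The minimal (CPU, NumPy) all-pairs sketch below is given purely to fix ideas; it does not reproduce the GPU streaming, encoding, or update machinery of the paper.

        import numpy as np

        def aabb_overlaps(boxes_a, boxes_b):
            """All-pairs overlap test between two AABB streams.

            Each box is (min_x, min_y, min_z, max_x, max_y, max_z); the result is a
            boolean matrix of shape (len(boxes_a), len(boxes_b)).
            """
            a_min, a_max = boxes_a[:, None, :3], boxes_a[:, None, 3:]
            b_min, b_max = boxes_b[None, :, :3], boxes_b[None, :, 3:]
            # Two AABBs overlap iff their intervals overlap on every axis
            return np.all((a_min <= b_max) & (b_min <= a_max), axis=-1)

        # Hypothetical streams of two boxes each
        A = np.array([[0, 0, 0, 1, 1, 1], [5, 5, 5, 6, 6, 6]], dtype=float)
        B = np.array([[0.5, 0.5, 0.5, 2, 2, 2], [9, 9, 9, 10, 10, 10]], dtype=float)
        print(aabb_overlaps(A, B))   # [[ True False] [False False]]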

  9. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  10. Combining computer modelling and cardiac imaging to understand right ventricular pump function.

    PubMed

    Walmsley, John; van Everdingen, Wouter; Cramer, Maarten J; Prinzen, Frits W; Delhaas, Tammo; Lumens, Joost

    2017-10-01

    Right ventricular (RV) dysfunction is a strong predictor of outcome in heart failure and is a key determinant of exercise capacity. Despite these crucial findings, the RV remains understudied in the clinical, experimental, and computer modelling literature. This review outlines how recent advances in using computer modelling and cardiac imaging synergistically help to understand RV function in health and disease. We begin by highlighting the complexity of interactions that make modelling the RV both challenging and necessary, and then summarize the multiscale modelling approaches used to date to simulate RV pump function in the context of these interactions. We go on to demonstrate how these modelling approaches in combination with cardiac imaging have improved understanding of RV pump function in pulmonary arterial hypertension, arrhythmogenic right ventricular cardiomyopathy, dyssynchronous heart failure and cardiac resynchronization therapy, hypoplastic left heart syndrome, and repaired tetralogy of Fallot. We conclude with a perspective on key issues to be addressed by computational models of the RV in the near future.

  11. Space-Time Fluid-Structure Interaction Computation of Flapping-Wing Aerodynamics

    DTIC Science & Technology

    2013-12-01

    SST-VMST." The structural mechanics computations are based on the Kirchhoff-Love shell model. We use a sequential coupling technique, which is... mechanics computations are based on the Kirchhoff-Love shell model. We use a sequential coupling technique, which is applicable to some classes of FSI... we use the ST-VMS method in combination with the ST-SUPS method. The structural mechanics computations are mostly based on the Kirchhoff-Love shell

  12. Spin wave Feynman diagram vertex computation package

    NASA Astrophysics Data System (ADS)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and have another means to check the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model where non-collinear terms contribute to the vertex interactions.
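
    For background, vertex derivations of this kind typically start from the Holstein-Primakoff expansion of spin operators in boson operators; the leading-order relations are quoted below as standard results, not as output of the package described.

        \[
        S_i^{z} = S - a_i^{\dagger} a_i,
        \qquad
        S_i^{+} \approx \sqrt{2S}\, a_i,
        \qquad
        S_i^{-} \approx \sqrt{2S}\, a_i^{\dagger},
        \]

    and substituting these into a Heisenberg Hamiltonian and retaining terms beyond quadratic order generates the magnon-magnon interaction vertices that such a package evaluates symbolically.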

  13. A Parametric Study of Unsteady Rotor-Stator Interaction in a Simplified Francis Turbine

    NASA Astrophysics Data System (ADS)

    Wouden, Alex; Cimbala, John; Lewis, Bryan

    2011-11-01

    CFD analysis is becoming a critical stage in the design of hydroturbines. However, its capability to represent unsteady flow interactions between the rotor and stator (which requires a 360-degree, mesh-refined model of the turbine passage) is hindered. For CFD to become a more effective tool in predicting the performance of a hydroturbine, the key interactions between the rotor and stator need to be understood using current numerical methods. As a first step towards evaluating this unsteady behavior without the burden of a computationally expensive domain, the stator and Francis-type rotor blades are reduced to flat plates. Local and global variables are compared using periodic, semi-periodic, and 360-degree geometric models and various turbulence models (k-omega, k-epsilon, and Spalart-Allmaras). The computations take place within the OpenFOAM® environment and utilize a general grid interface (GGI) between the rotor and stator computational domains. The rotor computational domain is capable of dynamic rotation. The results demonstrate some of the strengths and limitations of utilizing CFD for hydroturbine analysis. These case studies will also serve as tutorials to help others learn how to use CFD for turbomachinery. This research is funded by a grant from the DOE.

  14. Characterizing Topology of Probabilistic Biological Networks.

    PubMed

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-09-06

    Biological interactions are often uncertain events that may or may not take place with some probability. Existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. Here, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. We develop a method that accurately describes the degree distribution of such networks. We also extend our method to accurately compute the joint degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. It also helps us find an adequate mathematical model using maximum likelihood estimation. Our results demonstrate that power law and log-normal models best describe degree distributions for probabilistic networks. The inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with a large number of interactions prefer to interact with those with a small number of interactions more frequently than expected.
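
    As a concrete illustration of the polynomial-time computation the abstract refers to, the sketch below computes the exact degree distribution of a single node whose incident edges are independent uncertain interactions (a Poisson-binomial distribution). The edge probabilities are illustrative and the dynamic-programming scheme is a standard one, not necessarily the authors' implementation.

```python
# A minimal sketch (illustrative probabilities, not the authors' implementation): the exact
# degree distribution of one node whose incident edges are independent uncertain interactions
# is a Poisson-binomial distribution, computable in O(k^2) time by dynamic programming.

def degree_distribution(edge_probs):
    """Return a list where entry d is P(degree = d), for d = 0..len(edge_probs)."""
    dist = [1.0]                                  # with no edges considered yet, degree is 0
    for p in edge_probs:
        new = [0.0] * (len(dist) + 1)
        for d, prob in enumerate(dist):
            new[d] += prob * (1.0 - p)            # edge absent: degree unchanged
            new[d + 1] += prob * p                # edge present: degree grows by one
        dist = new
    return dist

if __name__ == "__main__":
    # a hypothetical node with four uncertain interactions
    print(degree_distribution([0.9, 0.5, 0.5, 0.1]))
```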

  15. Towards a more efficient and robust representation of subsurface hydrological processes in Earth System Models

    NASA Astrophysics Data System (ADS)

    Rosolem, R.; Rahman, M.; Kollet, S. J.; Wagener, T.

    2017-12-01

    Understanding the impacts of land cover and climate changes on terrestrial hydrometeorology is important across a range of spatial and temporal scales. Earth System Models (ESMs) provide a robust platform for evaluating these impacts. However, current ESMs generally lack the representation of key hydrological processes (e.g., preferential water flow and direct interactions with aquifers). The typical "free drainage" conceptualization of land models can misrepresent the magnitude of those interactions, consequently affecting the exchange of energy and water at the surface as well as estimates of groundwater recharge. Recent studies show the benefits of explicitly simulating the interactions between subsurface and surface processes in similar models. However, such parameterizations are often computationally demanding, resulting in limited application for large/global-scale studies. Here, we take a different approach in developing a novel parameterization for groundwater dynamics. Instead of directly adding another complex process to an established land model, we examine a set of comprehensive experimental scenarios using a very robust and established three-dimensional hydrological model to develop a simpler parameterization that represents the aquifer to land surface interactions. The main goal of our developed parameterization is to simultaneously maximize the computational gain (i.e., "efficiency") while minimizing simulation errors in comparison to the full 3D model (i.e., "robustness") to allow for easy implementation in ESMs globally. Our study focuses primarily on understanding the dynamics of both groundwater recharge and discharge. Preliminary results show that our proposed approach significantly reduces the computational demand, while model deviations from the full 3D model are small for these processes.

  16. A model-based approach to monitor complex road-vehicle interactions through first principles

    NASA Astrophysics Data System (ADS)

    Chakravarty, T.; Srinivasarengan, K.; Roy, S.; Bilal, S.; Balamuralidhar, P.

    2013-02-01

    The increasing availability of portable computing devices and their interaction with physical systems calls for compact models and simulations to understand and characterize such interactions. For instance, monitoring a road's grade using an accelerometer stationed inside a moving ground vehicle is an emerging trend in city administration. Typically, the focus has largely been on developing algorithms to extract meaning from the measurements, but experimentation alone cannot provide an exhaustive analysis of all scenarios and their characteristics. We propose an approach for modeling these interactions between physical systems and such devices through first principles, in a compact manner that focuses on a limited number of interactions. We derive a model of the vehicle's interaction with a pothole on a road, a specific case, but allow for selectable car parameters such as damped natural frequency and tire size, thus generalizing it. Different road profiles are also created to represent rough roads with sharp irregularities. These act as excitations to the moving vehicle, and the interaction is computed to determine the vertical/lateral vibration of the system (i.e., the vehicle with sensors) using joint time-frequency signal analysis methods. The simulation is compared with experimental data for validation. We show how simulation of such models can reveal different characteristics of the interaction through analysis of their frequency spectrum. It is envisioned that the proposed models will be enriched further as large sets of real-life data are captured and appropriate sensitivity analyses are performed.
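
    The following is a minimal sketch of the kind of first-principles vehicle-pothole interaction described above: a quarter-car (sprung/unsprung mass) model driven over a half-sine pothole. All parameter values are illustrative assumptions, not those used in the paper.

```python
# A minimal sketch (all parameter values are assumptions, not the paper's): a quarter-car
# model with sprung and unsprung masses driven over a half-sine pothole, integrated with SciPy.
# The sprung-mass vertical acceleration approximates what an in-vehicle accelerometer records.
import numpy as np
from scipy.integrate import solve_ivp

ms, mu = 300.0, 40.0                 # sprung / unsprung mass [kg]
ks, cs = 20e3, 1.5e3                 # suspension stiffness [N/m] and damping [N s/m]
kt = 180e3                           # tyre stiffness [N/m]
v, depth, length = 10.0, 0.05, 0.5   # vehicle speed [m/s], pothole depth [m] and length [m]

def road(t):
    """Road elevation under the tyre: a half-sine dip of the given depth and length."""
    x = v * t
    return -depth * np.sin(np.pi * x / length) if 0.0 <= x <= length else 0.0

def rhs(t, y):
    zs, vs, zu, vu = y                                  # body and wheel displacement/velocity
    f_susp = ks * (zs - zu) + cs * (vs - vu)            # suspension force
    f_tyre = kt * (road(t) - zu)                        # tyre spring force from the road input
    return [vs, -f_susp / ms, vu, (f_susp + f_tyre) / mu]

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
body_acc = np.gradient(sol.y[1], sol.t)                 # sprung-mass vertical acceleration
print("peak body acceleration [m/s^2]:", np.abs(body_acc).max())
```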

  17. Implicational Markedness and Frequency in Constraint-Based Computational Models of Phonological Learning

    ERIC Educational Resources Information Center

    Jarosz, Gaja

    2010-01-01

    This study examines the interacting roles of implicational markedness and frequency from the joint perspectives of formal linguistic theory, phonological acquisition and computational modeling. The hypothesis that child grammars are rankings of universal constraints, as in Optimality Theory (Prince & Smolensky, 1993/2004), that learning involves a…
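
    To make the constraint-ranking hypothesis concrete, the sketch below evaluates Optimality Theory candidates under a ranking of violable constraints; the constraints (*CODA, MAX) and candidates are hypothetical and not drawn from the study.

```python
# A minimal sketch (hypothetical constraints and candidates, not drawn from the study): in
# Optimality Theory a grammar is a ranking of violable constraints, and the winning output is
# the candidate whose violation profile is best when constraints are read in ranked order.

def ot_winner(candidates, ranking):
    """candidates: {form: {constraint: violations}}; ranking: constraints, highest first."""
    def profile(form):
        return tuple(candidates[form].get(c, 0) for c in ranking)
    return min(candidates, key=profile)          # lexicographic comparison of violations

candidates = {
    "pat": {"*CODA": 1, "MAX": 0},   # faithful candidate violates the markedness constraint
    "pa":  {"*CODA": 0, "MAX": 1},   # deleting the final consonant violates faithfulness
}
print(ot_winner(candidates, ["*CODA", "MAX"]))   # *CODA dominates MAX  -> "pa"
print(ot_winner(candidates, ["MAX", "*CODA"]))   # MAX dominates *CODA -> "pat"
```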

  18. A Model for Intelligent Computer-Aided Education Systems.

    ERIC Educational Resources Information Center

    Du Plessis, Johan P.; And Others

    1995-01-01

    Proposes a model for intelligent computer-aided education systems that is based on cooperative learning, constructive problem-solving, object-oriented programming, interactive user interfaces, and expert system techniques. Future research is discussed, and a prototype for teaching mathematics to 10- to 12-year-old students is appended. (LRW)

  19. A Computational Model of Linguistic Humor in Puns

    ERIC Educational Resources Information Center

    Kao, Justine T.; Levy, Roger; Goodman, Noah D.

    2016-01-01

    Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we…

  20. The Effectiveness of Interactive Computer Assisted Modeling in Teaching Study Strategies and Concept Mapping of College Textbook Material.

    ERIC Educational Resources Information Center

    Mikulecky, Larry

    A study evaluated the effectiveness of a series of print materials and interactive computer-guided study programs designed to lead undergraduate students to apply basic textbook reading and concept mapping strategies to the study of science and social science textbooks. Following field testing with 25 learning skills students, 50 freshman biology…

  1. A High Performance Computing Approach to the Simulation of Fluid Solid Interaction Problems with Rigid and Flexible Components (Open Access Publisher’s Version)

    DTIC Science & Technology

    2014-08-01

    Keywords: high performance computing, smoothed particle hydrodynamics, rigid body dynamics, flexible body dynamics. Authors: Arman Pazouki, Radu Serban, Dan Negrut. This work outlines a unified … are implemented to model rigid and flexible multibody dynamics. The two-way coupling of the fluid and solid phases is supported through use of …
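
    As background for the fluid side of the coupling, the sketch below shows the smoothed particle hydrodynamics density summation with the standard 3D cubic-spline kernel; it is a generic illustration, not code from the paper, and the particle cloud and smoothing length are arbitrary.

```python
# A minimal sketch (generic illustration, not code from the paper): the SPH density summation
# with the standard 3D cubic-spline kernel, the basic operation behind the fluid phase of such
# fluid-solid interaction solvers. The particle cloud and smoothing length are arbitrary.
import numpy as np

def cubic_spline_w(r, h):
    """Standard 3D cubic-spline smoothing kernel."""
    sigma = 1.0 / (np.pi * h**3)
    q = r / h
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def sph_density(positions, masses, h):
    """Density at each particle as a kernel-weighted sum over all particles."""
    n = len(positions)
    rho = np.zeros(n)
    for i in range(n):
        for j in range(n):
            r = np.linalg.norm(positions[i] - positions[j])
            rho[i] += masses[j] * cubic_spline_w(r, h)
    return rho

if __name__ == "__main__":
    pts = np.random.default_rng(0).random((50, 3)) * 0.1   # a small cloud of fluid particles
    print(sph_density(pts, np.full(50, 1e-3), h=0.02)[:5])
```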

  2. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1983-01-01

    Adequate computer methods, based on interactions between discrete particles, provide information leading to an atomic level understanding of various physical processes. The success of these simulation methods, however, is related to the accuracy of the potential energy function representing the interactions among the particles. The development of a potential energy function for crystalline SiO2 forms that can be employed in lengthy computer modelling procedures was investigated. In many of the simulation methods which deal with discrete particles, semiempirical two-body potentials were employed to analyze energy and structure related properties of the system. Many-body interactions are required for a proper representation of the total energy for many systems. Many-body interactions for simulations based on discrete particles are discussed.
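
    To illustrate the distinction the abstract draws between two-body and many-body terms, the sketch below sums a Lennard-Jones pair potential and a simple Stillinger-Weber-style angular three-body term over a small particle set. The functional forms and parameters are generic placeholders, not the SiO2 potential developed in the study.

```python
# A minimal sketch (generic functional forms and placeholder parameters, not the SiO2 potential
# developed in the study): total energy as a sum of Lennard-Jones two-body terms plus a simple
# Stillinger-Weber-style three-body angular term, the kind of many-body contribution discussed.
import itertools
import numpy as np

def pair_energy(r, eps=1.0, sigma=1.0):
    """Two-body Lennard-Jones term."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 ** 2 - s6)

def three_body_energy(ri, rj, rk, lam=1.0, cos0=-1.0 / 3.0):
    """Angular penalty for the j-i-k angle about central atom i (tetrahedral reference)."""
    a, b = rj - ri, rk - ri
    cos_t = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return lam * (cos_t - cos0) ** 2

def total_energy(pos):
    e = sum(pair_energy(np.linalg.norm(pos[i] - pos[j]))
            for i, j in itertools.combinations(range(len(pos)), 2))
    for i in range(len(pos)):
        others = [m for m in range(len(pos)) if m != i]
        e += sum(three_body_energy(pos[i], pos[j], pos[k])
                 for j, k in itertools.combinations(others, 2))
    return e

print(total_energy(np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [0.0, 1.1, 0.0]])))
```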

  3. Atmospheric numerical modeling resource enhancement and model convective parameterization/scale interaction studies

    NASA Technical Reports Server (NTRS)

    Cushman, Paula P.

    1993-01-01

    Research will be undertaken in this contract in the area of Modeling Resource and Facilities Enhancement to include computer, technical, and educational support to NASA investigators to facilitate model implementation, execution, and analysis of output; to provide facilities linking USRA and the NASA/EADS Computer System as well as resident work stations in ESAD; and to provide a centralized location for documentation, archival, and dissemination of modeling information pertaining to NASA's program. Additional research will be undertaken in the area of Numerical Model Scale Interaction/Convective Parameterization Studies, including comparison of cloud and rain systems and convective-scale processes between model simulations and observations, and incorporation of these and related research findings in at least two refereed journal articles.

  4. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-05-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface.

  5. Multimodal approaches for emotion recognition: a survey

    NASA Astrophysics Data System (ADS)

    Sebe, Nicu; Cohen, Ira; Gevers, Theo; Huang, Thomas S.

    2004-12-01

    Recent technological advances have enabled human users to interact with computers in ways previously unimaginable. Beyond the confines of the keyboard and mouse, new modalities for human-computer interaction such as voice, gesture, and force-feedback are emerging. Despite important advances, one necessary ingredient for natural interaction is still missing: emotions. Emotions play an important role in human-to-human communication and interaction, allowing people to express themselves beyond the verbal domain. The ability to understand human emotions is desirable for the computer in several applications. This paper explores new ways of human-computer interaction that enable the computer to be more aware of the user's emotional and attentional expressions. We present the basic research in the field and the recent advances in emotion recognition from facial, voice, and physiological signals, where the different modalities are treated independently. We then describe the challenging problem of multimodal emotion recognition and we advocate the use of probabilistic graphical models when fusing the different modalities. We also discuss the difficult issues of obtaining reliable affective data, obtaining ground truth for emotion recognition, and the use of unlabeled data.

  7. First-Principles Molecular Dynamics Studies of Organometallic Complexes and Homogeneous Catalytic Processes.

    PubMed

    Vidossich, Pietro; Lledós, Agustí; Ujaque, Gregori

    2016-06-21

    Computational chemistry is a valuable aid to complement experimental studies of organometallic systems and their reactivity. It allows probing mechanistic hypotheses and investigating molecular structures, shedding light on the behavior and properties of molecular assemblies at the atomic scale. When approaching a chemical problem, the computational chemist has to decide on the theoretical approach needed to describe electron/nuclear interactions and the composition of the model used to approximate the actual system. Both factors determine the reliability of the modeling study. The community dedicated much effort to developing and improving the performance and accuracy of theoretical approaches for electronic structure calculations, on which the description of (inter)atomic interactions relies. Here, the importance of the model system used in computational studies is highlighted through examples from our recent research focused on organometallic systems and homogeneous catalytic processes. We show how the inclusion of explicit solvent allows the characterization of molecular events that would otherwise not be accessible in reduced model systems (clusters). These include the stabilization of nascent charged fragments via microscopic solvation (notably, hydrogen bonding), transfer of charge (protons) between distant fragments mediated by solvent molecules, and solvent coordination to unsaturated metal centers. Furthermore, when weak interactions are involved, we show how conformational and solvation properties of organometallic complexes are also affected by the explicit inclusion of solvent molecules. Such extended model systems may be treated under periodic boundary conditions, thus removing the cluster/continuum (or vacuum) boundary, and require a statistical mechanics simulation technique to sample the accessible configurational space. First-principles molecular dynamics, in which atomic forces are computed from electronic structure calculations (namely, density functional theory), is certainly the technique of choice to investigate chemical events in solution. This methodology is well established and, thanks to advances in both algorithms and computational resources, simulation times required for the modeling of chemical events are nowadays accessible, though the computational requirements are usually high. Specific applications reviewed here include mechanistic studies of the Shilov and Wacker processes, speciation in Pd chemistry, hydrogen bonding to metal centers, and the dynamics of agostic interactions.

  8. Free energy and phase transition of the matrix model on a plane wave

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadizadeh, Shirin; Ramadanovic, Bojan; Semenoff, Gordon W.

    2005-03-15

    It has recently been observed that the weakly coupled plane-wave matrix model has a density of states which grows exponentially at high energy. This implies that the model has a phase transition. The transition appears to be of first order. However, its exact nature is sensitive to interactions. In this paper, we analyze the effect of interactions by computing the relevant parts of the effective potential for the Polyakov loop operator in the finite temperature plane-wave matrix model to three-loop order. We show that the phase transition is indeed of first order. We also compute the correction to the Hagedorn temperature to order two loops.

  9. Interaction between Phonological and Semantic Representations: Time Matters

    ERIC Educational Resources Information Center

    Chen, Qi; Mirman, Daniel

    2015-01-01

    Computational modeling and eye-tracking were used to investigate how phonological and semantic information interact to influence the time course of spoken word recognition. We extended our recent models (Chen & Mirman, 2012; Mirman, Britt, & Chen, 2013) to account for new evidence that competition among phonological neighbors influences…

  10. Analyses of track shift under high-speed vehicle-track interaction : safety of high speed ground transportation systems

    DOT National Transportation Integrated Search

    1997-06-01

    This report describes analysis tools to predict track shift under high-speed vehicle-track interaction. The analysis approach is based on two fundamental models developed (as part of this research); the first model computes the track lateral residua...

  11. Practical Issues in Interactive Multimedia Design.

    ERIC Educational Resources Information Center

    James, Jeff

    This paper describes a range of computer assisted learning software models--linear, unstructured, and ideal--and discusses issues such as control, interactivity, and ease-of-programming. It also introduces a "compromise model" used for a package currently under development at the Hong Kong Polytechnic University, which is intended to…

  12. Modeling molecule-plasmon interactions using quantized radiation fields within time-dependent electronic structure theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nascimento, Daniel R.; DePrince, A. Eugene, E-mail: deprince@chem.fsu.edu

    2015-12-07

    We present a combined cavity quantum electrodynamics/ab initio electronic structure approach for simulating plasmon-molecule interactions in the time domain. The simple Jaynes-Cummings-type model Hamiltonian typically utilized in such simulations is replaced with one in which the molecular component of the coupled system is treated in a fully ab initio way, resulting in a computationally efficient description of general plasmon-molecule interactions. Mutual polarization effects are easily incorporated within a standard ground-state Hartree-Fock computation, and time-dependent simulations carry the same formal computational scaling as real-time time-dependent Hartree-Fock theory. As a proof of principle, we apply this generalized method to the emergence of a Fano-like resonance in coupled molecule-plasmon systems; this feature is quite sensitive to the nanoparticle-molecule separation and the orientation of the molecule relative to the polarization of the external electric field.
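
    For reference, the sketch below builds the simple Jaynes-Cummings-type Hamiltonian that this approach replaces: a two-level emitter coupled to a single truncated bosonic mode. It is an illustrative baseline with arbitrary parameters, not the ab initio method of the paper.

```python
# A minimal sketch (arbitrary parameters; the baseline model that the paper replaces, not its
# ab initio method): a Jaynes-Cummings Hamiltonian for a two-level emitter coupled to one
# truncated bosonic (cavity/plasmon) mode, diagonalized to expose the vacuum Rabi splitting.
import numpy as np

n_ph = 10                                        # photon-number cutoff for the mode
wc, wa, g = 1.0, 1.0, 0.05                       # mode frequency, transition frequency, coupling

a = np.diag(np.sqrt(np.arange(1, n_ph)), 1)      # truncated annihilation operator
sm = np.array([[0.0, 1.0], [0.0, 0.0]])          # atomic lowering operator
I_ph, I_at = np.eye(n_ph), np.eye(2)

H = (wc * np.kron(a.conj().T @ a, I_at)                          # mode energy
     + wa * np.kron(I_ph, sm.conj().T @ sm)                      # emitter energy
     + g * (np.kron(a, sm.conj().T) + np.kron(a.conj().T, sm)))  # exchange coupling

evals = np.linalg.eigvalsh(H)
print("splitting of the first excited doublet:", evals[2] - evals[1])   # approximately 2*g
```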

  13. Laboratory modeling and analysis of aircraft-lightning interactions

    NASA Technical Reports Server (NTRS)

    Turner, C. D.; Trost, T. F.

    1982-01-01

    Modeling studies of the interaction of a delta-wing aircraft with direct lightning strikes were carried out using an approximate scale model of an F-106B. The model, which is three feet in length, is subjected to direct injection of fast current pulses supplied by wires, which simulate the lightning channel and are attached at various locations on the model. Measurements are made of the resulting transient electromagnetic fields using time-derivative sensors. The sensor outputs are sampled and digitized by computer. The noise level is reduced by averaging the sensor output from ten input pulses at each sample time. Computer analysis of the measured fields includes Fourier transformation and the computation of transfer functions for the model. Prony analysis is also used to determine the natural frequencies of the model. Comparisons of model natural frequencies extracted by Prony analysis with those for in-flight direct strike data usually show lower damping in the in-flight case. This is indicative of either a lightning channel with a higher impedance than the wires on the model, only one attachment point, or short streamers instead of a long channel.
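
    The sketch below illustrates classical Prony analysis of the kind mentioned above, recovering natural frequencies and damping rates from a transient signal; the signal here is synthetic and the model order is assumed known, unlike in the measured-field case.

```python
# A minimal sketch (synthetic signal, known model order; not the flight or laboratory data):
# classical Prony analysis, fitting a transient to damped exponentials and reading natural
# frequencies and damping rates from the roots of a linear-prediction polynomial.
import numpy as np

dt, order = 1e-3, 4
t = np.arange(0.0, 0.2, dt)
x = (np.exp(-5 * t) * np.cos(2 * np.pi * 120 * t)
     + 0.5 * np.exp(-20 * t) * np.cos(2 * np.pi * 310 * t))

# Step 1: linear-prediction coefficients, x[n] = -sum_k a_k x[n-k].
N = len(x)
A = np.column_stack([x[order - k - 1:N - k - 1] for k in range(order)])
a = np.linalg.lstsq(A, -x[order:], rcond=None)[0]

# Step 2: poles are roots of the characteristic polynomial; map them to the s-plane.
z = np.roots(np.concatenate(([1.0], a)))
s = np.log(z) / dt
print("frequencies [Hz]:", sorted(set(np.round(np.abs(s.imag) / (2 * np.pi), 1))))  # ~[120, 310]
print("damping rates [1/s]:", sorted(set(np.round(-s.real, 1))))                    # ~[5, 20]
```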

  14. Computational procedures for probing interactions in OLS and logistic regression: SPSS and SAS implementations.

    PubMed

    Hayes, Andrew F; Matthes, Jörg

    2009-08-01

    Researchers often hypothesize moderated effects, in which the effect of an independent variable on an outcome variable depends on the value of a moderator variable. Such an effect reveals itself statistically as an interaction between the independent and moderator variables in a model of the outcome variable. When an interaction is found, it is important to probe the interaction, for theories and hypotheses often predict not just interaction but a specific pattern of effects of the focal independent variable as a function of the moderator. This article describes the familiar pick-a-point approach and the much less familiar Johnson-Neyman technique for probing interactions in linear models and introduces macros for SPSS and SAS to simplify the computations and facilitate the probing of interactions in ordinary least squares and logistic regression. A script version of the SPSS macro is also available for users who prefer a point-and-click user interface rather than command syntax.
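
    As a minimal illustration of the pick-a-point approach described above (using simulated data rather than the article's SPSS/SAS macros), the sketch below estimates an OLS model with an interaction and reports the conditional effect of the focal predictor at chosen moderator values.

```python
# A minimal sketch (simulated data, not the article's SPSS/SAS macros): the pick-a-point
# approach for an OLS interaction, giving the conditional effect of X on Y, and its standard
# error, at chosen values of the moderator M.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X, M = rng.normal(size=n), rng.normal(size=n)
Y = 0.5 * X + 0.3 * M + 0.4 * X * M + rng.normal(size=n)

design = sm.add_constant(np.column_stack([X, M, X * M]))   # columns: const, X, M, X*M
fit = sm.OLS(Y, design).fit()
b, V = fit.params, fit.cov_params()

for m0 in (-1.0, 0.0, 1.0):                                # the "points" picked on M
    effect = b[1] + b[3] * m0                              # conditional slope of X at M = m0
    se = np.sqrt(V[1, 1] + 2 * m0 * V[1, 3] + m0**2 * V[3, 3])
    print(f"M = {m0:+.1f}: effect of X = {effect:.3f} (SE = {se:.3f})")
```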

  15. Substituent Effects in the Benzene Dimer are Due to Direct Interactions of the Substituents with the Unsubstituted Benzene

    PubMed Central

    Wheeler, Steven E.; Houk, K. N.

    2009-01-01

    The prevailing views of substituent effects in the sandwich configuration of the benzene dimer are flawed. For example, in the polar/π model of Cozzi and co-workers (J. Am. Chem. Soc. 1992, 114, 5729), electron-withdrawing substituents enhance binding in the benzene dimer by withdrawing electron density from the π-cloud of the substituted ring, reducing the repulsive electrostatic interaction with the non-substituted benzene. Conversely, electron-donating substituents donate excess electrons into the π-system and diminish the π-stacking interaction. We present computed interaction energies for the sandwich configuration of the benzene dimer and 24 substituted dimers, as well as sandwich complexes of substituted benzenes with perfluorobenzene. While the computed interaction energies correlate well with σm values for the substituents, interaction energies for related model systems demonstrate that this trend is independent of the substituted ring. Instead, the observed trends are consistent with direct electrostatic and dispersive interactions of the substituents with the unsubstituted ring. PMID:18652453

  16. KASCADE-Grande energy reconstruction based on the lateral density distribution using the QGSJet-II.04 interaction model

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertania, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2017-06-01

    The charged particle densities obtained from CORSIKA-simulated EAS using the QGSJet-II.04 hadronic interaction model are used for primary energy reconstruction. Simulated data are reconstructed by using Lateral Energy Correction Functions computed with a new realistic model of the Grande stations implemented in Geant4.10.

  17. Computational Tools for Probing Interactions in Multiple Linear Regression, Multilevel Modeling, and Latent Curve Analysis

    ERIC Educational Resources Information Center

    Preacher, Kristopher J.; Curran, Patrick J.; Bauer, Daniel J.

    2006-01-01

    Simple slopes, regions of significance, and confidence bands are commonly used to evaluate interactions in multiple linear regression (MLR) models, and the use of these techniques has recently been extended to multilevel or hierarchical linear modeling (HLM) and latent curve analysis (LCA). However, conducting these tests and plotting the…

  18. Application of interactive computer graphics in wind-tunnel dynamic model testing

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Hammond, C. E.

    1975-01-01

    The computer-controlled data-acquisition system recently installed for use with a transonic dynamics tunnel is described, including a discussion of the hardware/software features of the system. A subcritical-response damping technique for use in wind-tunnel-model flutter testing, called the combined randomdec/moving-block method, which has been implemented on the data-acquisition system, is described in some detail. Some results using the method are presented, and the importance of using interactive graphics in applying the technique in near real time during wind-tunnel test operations is discussed.

  19. Social Protocols for Agile Virtual Teams

    NASA Astrophysics Data System (ADS)

    Picard, Willy

    Despite many works on collaborative networked organizations (CNOs), CSCW, groupware, workflow systems and social networks, computer support for virtual teams is still insufficient, especially support for agility, i.e. the capability of virtual team members to rapidly and cost-efficiently adapt the way they interact in response to change. In this paper, requirements for computer support for agile virtual teams are presented. Next, an extension of the concept of social protocol is proposed as a novel model supporting agile interactions within virtual teams. The extended concept of social protocol consists of an extended social network and a workflow model.

  20. Computations of Torque-Balanced Coaxial Rotor Flows

    NASA Technical Reports Server (NTRS)

    Yoon, Seokkwan; Chan, William M.; Pulliam, Thomas H.

    2017-01-01

    Interactional aerodynamics has been studied for counter-rotating coaxial rotors in hover. The effects of torque balancing on the performance of coaxial-rotor systems have been investigated. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, and a hybrid turbulence model. Computational results for an experimental model are compared to available data. The results for a coaxial quadcopter vehicle with and without torque balancing are discussed. Understanding interactions in coaxial-rotor flows would help improve the design of next-generation autonomous drones.

  1. Numerical simulation of supersonic flow using a new analytical bleed boundary condition

    NASA Technical Reports Server (NTRS)

    Harloff, G. J.; Smith, G. E.

    1995-01-01

    A new analytical bleed boundary condition is used to compute flowfields for a strong oblique shock wave/boundary layer interaction with a baseline and three bleed rates at a freestream Mach number of 2.47 with an 8 deg shock generator. The computational results are compared to experimental Pitot pressure profiles and wall static pressures through the interaction region. An algebraic turbulence model is employed for the bleed and baseline cases, and a one-equation model is also used for the baseline case where the boundary layer is separated.

  2. Heterogeneous nucleation of polymorphs on polymer surfaces: polymer-molecule interactions using a Coulomb and van der Waals model.

    PubMed

    Wahlberg, Nanna; Madsen, Anders Ø; Mikkelsen, Kurt V

    2018-06-09

    The nucleation processes of acetaminophen on poly(methyl methacrylate) and poly(vinyl acetate) have been investigated and the mechanisms of the processes studied. This is achieved by a combination of theoretical models and computational investigations within the framework of a modified QM/MM method, a Coulomb-van der Waals model. We have combined quantum mechanical computations and electrostatic models at the atomistic level to investigate the stability of different orientations of acetaminophen on the polymer surfaces. Based on the Coulomb-van der Waals model, we have determined the most stable orientation to be a flat one, with the strongest interaction seen between poly(vinyl acetate) and the molecule in a flat orientation in vacuum.

  3. Tangible Landscape: Cognitively Grasping the Flow of Water

    NASA Astrophysics Data System (ADS)

    Harmon, B. A.; Petrasova, A.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2016-06-01

    Complex spatial forms like topography can be challenging to understand, much less intentionally shape, given the heavy cognitive load of visualizing and manipulating 3D form. Spatiotemporal processes like the flow of water over a landscape are even more challenging to understand and intentionally direct as they are dependent upon their context and require the simulation of forces like gravity and momentum. This cognitive work can be offloaded onto computers through 3D geospatial modeling, analysis, and simulation. Interacting with computers, however, can also be challenging, often requiring training and highly abstract thinking. Tangible computing - an emerging paradigm of human-computer interaction in which data is physically manifested so that users can feel it and directly manipulate it - aims to offload this added cognitive work onto the body. We have designed Tangible Landscape, a tangible interface powered by an open source geographic information system (GRASS GIS), so that users can naturally shape topography and interact with simulated processes with their hands in order to make observations, generate and test hypotheses, and make inferences about scientific phenomena in a rapid, iterative process. Conceptually Tangible Landscape couples a malleable physical model with a digital model of a landscape through a continuous cycle of 3D scanning, geospatial modeling, and projection. We ran a flow modeling experiment to test whether tangible interfaces like this can effectively enhance spatial performance by offloading cognitive processes onto computers and our bodies. We used hydrological simulations and statistics to quantitatively assess spatial performance. We found that Tangible Landscape enhanced 3D spatial performance and helped users understand water flow.

  4. Coupled multipolar interactions in small-particle metallic clusters.

    PubMed

    Pustovit, Vitaly N; Sotelo, Juan A; Niklasson, Gunnar A

    2002-03-01

    We propose a new formalism for computing the optical properties of small clusters of particles. It is a generalization of the coupled dipole-dipole particle-interaction model and allows one in principle to take into account all multipolar interactions in the long-wavelength limit. The method is illustrated by computations of the optical properties of N = 6 particle clusters for different multipolar approximations. We examine the effect of separation between particles and compare the optical spectra with the discrete-dipole approximation and the generalized Mie theory.
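
    The point of departure for the formalism above is the coupled dipole-dipole interaction model; the sketch below solves that baseline (dipole-only, quasi-static) problem for a small particle cluster. The polarizability, geometry, and units are illustrative assumptions, and the multipolar generalization of the paper is not included.

```python
# A minimal sketch (dipole-only, quasi-static, illustrative polarizability and geometry; the
# paper's formalism generalizes this to higher multipoles): solve the coupled-dipole equations
# p_i = alpha * (E0 + sum_{j != i} G_ij p_j) for a small particle cluster.
import numpy as np

def coupled_dipoles(positions, alpha, E0):
    """Return the dipole moment of each particle (Gaussian-style units)."""
    n = len(positions)
    A = np.zeros((3 * n, 3 * n), dtype=complex)
    b = np.tile(E0, n).astype(complex)
    for i in range(n):
        A[3 * i:3 * i + 3, 3 * i:3 * i + 3] = np.eye(3) / alpha
        for j in range(n):
            if i == j:
                continue
            r = positions[i] - positions[j]
            d = np.linalg.norm(r)
            rhat = r / d
            G = (3.0 * np.outer(rhat, rhat) - np.eye(3)) / d**3   # quasi-static dipole field
            A[3 * i:3 * i + 3, 3 * j:3 * j + 3] = -G
    return np.linalg.solve(A, b).reshape(n, 3)

# hypothetical ring of 6 particles (30 nm radius), driven by a unit field along x
angles = np.linspace(0.0, 2.0 * np.pi, 6, endpoint=False)
ring = 30.0 * np.column_stack([np.cos(angles), np.sin(angles), np.zeros(6)])
print(coupled_dipoles(ring, alpha=200.0 + 50.0j, E0=np.array([1.0, 0.0, 0.0])))
```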

  5. The Human-Computer Interface and Information Literacy: Some Basics and Beyond.

    ERIC Educational Resources Information Center

    Church, Gary M.

    1999-01-01

    Discusses human/computer interaction research, human/computer interface, and their relationships to information literacy. Highlights include communication models; cognitive perspectives; task analysis; theory of action; problem solving; instructional design considerations; and a suggestion that human/information interface may be a more appropriate…

  6. The Computer as Adaptive Instructional Decision Maker.

    ERIC Educational Resources Information Center

    Kopstein, Felix F.; Seidel, Robert J.

    The computer's potential for education, and most particularly for instruction, is contingent on the development of a class of instructional decision models (formal instructional strategies) that interact with the student through appropriate peripheral equipment (man-machine interfaces). Computer hardware and software by themselves should not be…

  7. Instructional Variables in Meaningful Learning of Computer Programming.

    ERIC Educational Resources Information Center

    Mayer, Richard E.

    Some 120 undergraduate students participated in experiments to learn how novice computer programmers learn to interact with the computer. Two instructional booklets were used: a "rule" booklet consisted of definitions and examples of seven modified FORTRAN statements and appropriate grammar rules; the "model" booklet was…

  8. Why E-Business Must Evolve beyond Market Orientation: Applying Human Interaction Models to Computer-Mediated Corporate Communications.

    ERIC Educational Resources Information Center

    Johnston, Kevin McCullough

    2001-01-01

    Considers the design of corporate communications for electronic business and discusses the increasing importance of corporate interaction as companies work in virtual environments. Compares sociological and psychological theories of human interaction and relationship formation with organizational interaction theories of corporate relationship…

  9. Computer-assisted visual interactive recognition and its prospects of implementation over the Internet

    NASA Astrophysics Data System (ADS)

    Zou, Jie; Gattani, Abhishek

    2005-01-01

    When completely automated systems don't yield acceptable accuracy, many practical pattern recognition systems involve the human either at the beginning (pre-processing) or towards the end (handling rejects). We believe that it may be more useful to involve the human throughout the recognition process rather than just at the beginning or end. We describe a methodology of interactive visual recognition for human-centered low-throughput applications, Computer Assisted Visual InterActive Recognition (CAVIAR), and discuss the prospects of implementing CAVIAR over the Internet. The novelty of CAVIAR is image-based interaction through a domain-specific parameterized geometrical model, which reduces the semantic gap between humans and computers. The user may interact with the computer anytime that she considers its response unsatisfactory. The interaction improves the accuracy of the classification features by improving the fit of the computer-proposed model. The computer makes subsequent use of the parameters of the improved model to refine not only its own statistical model-fitting process, but also its internal classifier. The CAVIAR methodology was applied to implement a flower recognition system. The principal conclusions from the evaluation of the system include: 1) the average recognition time of the CAVIAR system is significantly shorter than that of the unaided human; 2) its accuracy is significantly higher than that of the unaided machine; 3) it can be initialized with as few as one training sample per class and still achieve high accuracy; and 4) it demonstrates a self-learning ability. We have also implemented a Mobile CAVIAR system, where a pocket PC, as a client, connects to a server through wireless communication. The motivation behind a mobile platform for CAVIAR is to apply the methodology in a human-centered pervasive environment, where the user can seamlessly interact with the system for classifying field-data. Deploying CAVIAR to a networked mobile platform poses the challenge of classifying field images and programming under constraints of display size, network bandwidth, processor speed, and memory size. Editing of the computer-proposed model is performed on the handheld while statistical model fitting and classification take place on the server. The possibility that the user can easily take several photos of the object poses an interesting information fusion problem. The advantage of the Internet is that the patterns identified by different users can be pooled together to benefit all peer users. When users identify patterns with CAVIAR in a networked setting, they also collect training samples and provide opportunities for machine learning from their intervention. CAVIAR implemented over the Internet provides a perfect test bed for, and extends, the concept of Open Mind Initiative proposed by David Stork. Our experimental evaluation focuses on human time, machine and human accuracy, and machine learning. We devoted much effort to evaluating the use of our image-based user interface and on developing principles for the evaluation of interactive pattern recognition system. The Internet architecture and Mobile CAVIAR methodology have many applications. We are exploring in the directions of teledermatology, face recognition, and education.

  10. CADRE-SS, an in Silico Tool for Predicting Skin Sensitization Potential Based on Modeling of Molecular Interactions.

    PubMed

    Kostal, Jakub; Voutchkova-Kostal, Adelina

    2016-01-19

    Using computer models to accurately predict toxicity outcomes is considered to be a major challenge. However, state-of-the-art computational chemistry techniques can now be incorporated in predictive models, supported by advances in mechanistic toxicology and the exponential growth of computing resources witnessed over the past decade. The CADRE (Computer-Aided Discovery and REdesign) platform relies on quantum-mechanical modeling of molecular interactions that represent key biochemical triggers in toxicity pathways. Here, we present an external validation exercise for CADRE-SS, a variant developed to predict the skin sensitization potential of commercial chemicals. CADRE-SS is a hybrid model that evaluates skin permeability using Monte Carlo simulations, assigns reactive centers in a molecule and possible biotransformations via expert rules, and determines reactivity with skin proteins via quantum-mechanical modeling. The results were promising with an overall very good concordance of 93% between experimental and predicted values. Comparison to performance metrics yielded by other tools available for this endpoint suggests that CADRE-SS offers distinct advantages for first-round screenings of chemicals and could be used as an in silico alternative to animal tests where permissible by legislative programs.

  11. Computer system for definition of the quantitative geometry of musculature from CT images.

    PubMed

    Daniel, Matej; Iglic, Ales; Kralj-Iglic, Veronika; Konvicková, Svatava

    2005-02-01

    A computer system for quantitative determination of musculoskeletal geometry from computed tomography (CT) images has been developed. The computer system processes series of CT images to obtain a three-dimensional (3D) model of bony structures in which the effective muscle fibres can be interactively defined. The presented computer system has a flexible modular structure and is also suitable for educational purposes.

  12. Fluid–Structure Interaction Analysis of Papillary Muscle Forces Using a Comprehensive Mitral Valve Model with 3D Chordal Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toma, Milan; Jensen, Morten Ø.; Einstein, Daniel R.

    2015-07-17

    Numerical models of native heart valves are being used to study valve biomechanics to aid design and development of repair procedures and replacement devices. These models have evolved from simple two-dimensional approximations to complex three-dimensional, fully coupled fluid-structure interaction (FSI) systems. Such simulations are useful for predicting the mechanical and hemodynamic loading on implanted valve devices. A current challenge for improving the accuracy of these predictions is choosing and implementing modeling boundary conditions. In order to address this challenge, we are utilizing an advanced in-vitro system to validate FSI conditions for the mitral valve system. Explanted ovine mitral valves were mounted in an in vitro setup, and structural data for the mitral valve was acquired with µCT. Experimental data from the in-vitro ovine mitral valve system were used to validate the computational model. As the valve closes, the hemodynamic data, high speed leaflet dynamics, and force vectors from the in-vitro system were compared to the results of the FSI simulation computational model. The total force of 2.6 N per papillary muscle is matched by the computational model. In vitro and in vivo force measurements are important in validating and adjusting material parameters in computational models. The simulations can then be used to answer questions that are otherwise not possible to investigate experimentally. This work is important to maximize the validity of computational models of not just the mitral valve, but any biomechanical aspect using computational simulation in designing medical devices.

  13. Fluid-Structure Interaction Analysis of Papillary Muscle Forces Using a Comprehensive Mitral Valve Model with 3D Chordal Structure.

    PubMed

    Toma, Milan; Jensen, Morten Ø; Einstein, Daniel R; Yoganathan, Ajit P; Cochran, Richard P; Kunzelman, Karyn S

    2016-04-01

    Numerical models of native heart valves are being used to study valve biomechanics to aid design and development of repair procedures and replacement devices. These models have evolved from simple two-dimensional approximations to complex three-dimensional, fully coupled fluid-structure interaction (FSI) systems. Such simulations are useful for predicting the mechanical and hemodynamic loading on implanted valve devices. A current challenge for improving the accuracy of these predictions is choosing and implementing modeling boundary conditions. In order to address this challenge, we are utilizing an advanced in vitro system to validate FSI conditions for the mitral valve system. Explanted ovine mitral valves were mounted in an in vitro setup, and structural data for the mitral valve was acquired with µCT. Experimental data from the in vitro ovine mitral valve system were used to validate the computational model. As the valve closes, the hemodynamic data, high speed leaflet dynamics, and force vectors from the in vitro system were compared to the results of the FSI simulation computational model. The total force of 2.6 N per papillary muscle is matched by the computational model. In vitro and in vivo force measurements enable validating and adjusting material parameters to improve the accuracy of computational models. The simulations can then be used to answer questions that are otherwise not possible to investigate experimentally. This work is important to maximize the validity of computational models of not just the mitral valve, but any biomechanical aspect using computational simulation in designing medical devices.

  14. A computational model of in vitro angiogenesis based on extracellular matrix fibre orientation.

    PubMed

    Edgar, Lowell T; Sibole, Scott C; Underwood, Clayton J; Guilkey, James E; Weiss, Jeffrey A

    2013-01-01

    Recent interest in the process of vascularisation within the biomedical community has motivated numerous new research efforts focusing on the process of angiogenesis. Although the role of chemical factors during angiogenesis has been well documented, the role of mechanical factors, such as the interaction between angiogenic vessels and the extracellular matrix, remains poorly understood. In vitro methods for studying angiogenesis exist; however, measurements available using such techniques often suffer from limited spatial and temporal resolutions. For this reason, computational models have been extensively employed to investigate various aspects of angiogenesis. This paper outlines the formulation and validation of a simple and robust computational model developed to accurately simulate angiogenesis based on length, branching and orientation morphometrics collected from vascularised tissue constructs. Microvessels were represented as a series of connected line segments. The morphology of the vessels was determined by a linear combination of the collagen fibre orientation, the vessel density gradient and a random walk component. Excellent agreement was observed between computational and experimental morphometric data over time. Computational predictions of microvessel orientation within an anisotropic matrix correlated well with experimental data. The accuracy of this modelling approach makes it a valuable platform for investigating the role of mechanical interactions during angiogenesis.
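
    The growth rule summarized above (fibre orientation plus density gradient plus random walk) lends itself to a compact sketch; the weights, the sign convention on the density gradient, and the input values below are assumptions for illustration, not the published parameter values.

```python
# A minimal sketch (weights, sign conventions, and inputs are illustrative assumptions, not the
# published parameter values): a vessel-tip growth direction formed as a weighted combination of
# local collagen fibre orientation, the vessel density gradient, and a random-walk component.
import numpy as np

def unit(v):
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def growth_direction(fibre_dir, density_grad, prev_dir,
                     w_fibre=0.5, w_grad=0.3, w_rand=0.2, rng=None):
    rng = rng or np.random.default_rng()
    fibre = fibre_dir if fibre_dir @ prev_dir >= 0 else -fibre_dir   # fibres have no polarity
    sparse_dir = -density_grad           # assumption: tips grow toward less-vascularized regions
    step = (w_fibre * unit(fibre) + w_grad * unit(sparse_dir) + w_rand * unit(rng.normal(size=3)))
    return unit(step)

tip = growth_direction(np.array([1.0, 0.0, 0.0]),    # local fibre orientation
                       np.array([0.0, 1.0, 0.0]),    # local vessel density gradient
                       np.array([1.0, 0.0, 0.0]))    # direction of the previous segment
print(tip)
```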

  15. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  16. User Interaction Modeling and Profile Extraction in Interactive Systems: A Groupware Application Case Study †

    PubMed Central

    Tîrnăucă, Cristina; Duque, Rafael; Montaña, José L.

    2017-01-01

    A relevant goal in human–computer interaction is to produce applications that are easy to use and well-adjusted to their users’ needs. To address this problem it is important to know how users interact with the system. This work constitutes a methodological contribution capable of identifying the context of use in which users perform interactions with a groupware application (synchronous or asynchronous) and provides, using machine learning techniques, generative models of how users behave. Additionally, these models are transformed into a text that describes in natural language the main characteristics of the interaction of the users with the system. PMID:28726762

  17. Long range Debye-Hückel correction for computation of grid-based electrostatic forces between biomacromolecules

    PubMed Central

    2014-01-01

    Background: Brownian dynamics (BD) simulations can be used to study very large molecular systems, such as models of the intracellular environment, using atomic-detail structures. Such simulations require strategies to contain the computational costs, especially for the computation of interaction forces and energies. A common approach is to compute interaction forces between macromolecules by precomputing their interaction potentials on three-dimensional discretized grids. For long-range interactions, such as electrostatics, grid-based methods are subject to finite size errors. We describe here the implementation of a Debye-Hückel correction to the grid-based electrostatic potential used in the SDA BD simulation software that was applied to simulate solutions of bovine serum albumin and of hen egg white lysozyme. Results: We found that the inclusion of the long-range electrostatic correction increased the accuracy of both the protein-protein interaction profiles and the protein diffusion coefficients at low ionic strength. Conclusions: An advantage of this method is the low additional computational cost required to treat long-range electrostatic interactions in large biomacromolecular systems. Moreover, the implementation described here for BD simulations of protein solutions can also be applied in implicit solvent molecular dynamics simulations that make use of gridded interaction potentials. PMID:25045516
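
    The analytical form behind such a long-range correction is the screened (Debye-Hückel) Coulomb interaction; the sketch below evaluates it for a hypothetical pair of net charges at low ionic strength. It is a generic illustration, not the SDA implementation, and the logic for switching between grid and analytical terms is not shown.

```python
# A minimal sketch (generic constants and an illustrative protein pair; not the SDA
# implementation): the screened Debye-Hueckel pair energy used as a long-range correction
# when two molecules lie beyond the range of their precomputed interaction grids.
import numpy as np

EPS0 = 8.8541878128e-12          # vacuum permittivity [F/m]
E_CHARGE = 1.602176634e-19       # elementary charge [C]
KB = 1.380649e-23                # Boltzmann constant [J/K]
AVOGADRO = 6.02214076e23         # [1/mol]

def debye_length(ionic_strength_molar, eps_r=78.5, temp=298.15):
    """Debye screening length [m] for a 1:1 electrolyte."""
    n_ions = ionic_strength_molar * 1e3 * AVOGADRO              # ions per m^3
    return np.sqrt(EPS0 * eps_r * KB * temp / (2.0 * n_ions * E_CHARGE**2))

def dh_energy(q1_e, q2_e, r, ionic_strength_molar, eps_r=78.5, temp=298.15):
    """Screened Coulomb energy [J] between net charges (in units of e) at distance r [m]."""
    kappa = 1.0 / debye_length(ionic_strength_molar, eps_r, temp)
    return ((q1_e * E_CHARGE) * (q2_e * E_CHARGE) * np.exp(-kappa * r)
            / (4.0 * np.pi * EPS0 * eps_r * r))

# e.g. two proteins with net charges +8e and -6e, 15 nm apart, at 5 mM ionic strength
print(dh_energy(8, -6, 15e-9, 0.005) / (KB * 298.15), "kT")
```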

  18. Automated Tutoring in Interactive Environments: A Task-Centered Approach.

    ERIC Educational Resources Information Center

    Wolz, Ursula; And Others

    1989-01-01

    Discusses tutoring and consulting functions in interactive computer environments. Tutoring strategies are considered, the expert model and the user model are described, and GENIE (Generated Informative Explanations)--an answer generating system for the Berkeley Unix Mail system--is explained as an example of an automated consulting system. (33…

  19. Modeling Interactions in Small Groups

    ERIC Educational Resources Information Center

    Heise, David R.

    2013-01-01

    A new theory of interaction within small groups posits that group members initiate actions when tension mounts between the affective meanings of their situational identities and impressions produced by recent events. Actors choose partners and behaviors so as to reduce the tensions. A computer model based on this theory, incorporating reciprocal…

  20. A Suggested Model for a Working Cyberschool.

    ERIC Educational Resources Information Center

    Javid, Mahnaz A.

    2000-01-01

    Suggests a model for a working cyberschool based on a case study of Kamiak Cyberschool (Washington), a technology-driven public high school. Topics include flexible hours; one-to-one interaction with teachers; a supportive school environment; use of computers, interactive media, and online resources; and self-paced, project-based learning.…

  1. A study of modelling simplifications in ground vibration predictions for railway traffic at grade

    NASA Astrophysics Data System (ADS)

    Germonpré, M.; Degrande, G.; Lombaert, G.

    2017-10-01

    Accurate computational models are required to predict ground-borne vibration due to railway traffic. Such models generally require a substantial computational effort. Therefore, much research has focused on developing computationally efficient methods, by either exploiting the regularity of the problem geometry in the direction along the track or assuming a simplified track structure. This paper investigates the modelling errors caused by commonly made simplifications of the track geometry. A case study is presented investigating a ballasted track in an excavation. The soil underneath the ballast is stiffened by a lime treatment. First, periodic track models with different cross sections are analyzed, revealing that a prediction of the rail receptance only requires an accurate representation of the soil layering directly underneath the ballast. A much more detailed representation of the cross sectional geometry is required, however, to calculate vibration transfer from track to free field. Second, simplifications in the longitudinal track direction are investigated by comparing 2.5D and periodic track models. This comparison shows that the 2.5D model slightly overestimates the track stiffness, while the transfer functions between track and free field are well predicted. Using a 2.5D model to predict the response during a train passage leads to an overestimation of both train-track interaction forces and free field vibrations. A combined periodic/2.5D approach is therefore proposed in this paper. First, the dynamic axle loads are computed by solving the train-track interaction problem with a periodic model. Next, the vibration transfer to the free field is computed with a 2.5D model. This combined periodic/2.5D approach only introduces small modelling errors compared to an approach in which a periodic model is used in both steps, while significantly reducing the computational cost.

  2. Interactive computer simulations of knee-replacement surgery.

    PubMed

    Gunther, Stephen B; Soto, Gabriel E; Colman, William W

    2002-07-01

    Current surgical training programs in the United States are based on an apprenticeship model. This model is outdated because it does not provide conceptual scaffolding, promote collaborative learning, or offer constructive reinforcement. Our objective was to create a more useful approach by preparing students and residents for operative cases using interactive computer simulations of surgery. Total-knee-replacement surgery (TKR) is an ideal procedure to model on the computer because there is a systematic protocol for the procedure. Also, this protocol is difficult to learn by the apprenticeship model because of the multiple instruments that must be used in a specific order. We designed an interactive computer tutorial to teach medical students and residents how to perform knee-replacement surgery. We also aimed to reinforce the specific protocol of the operative procedure. Our final goal was to provide immediate, constructive feedback. We created a computer tutorial by generating three-dimensional wire-frame models of the surgical instruments. Next, we applied a surface to the wire-frame models using three-dimensional modeling. Finally, the three-dimensional models were animated to simulate the motions of an actual TKR. The tutorial is a step-by-step tutorial that teaches and tests the correct sequence of steps in a TKR. The student or resident must select the correct instruments in the correct order. The learner is encouraged to learn the stepwise surgical protocol through repetitive use of the computer simulation. Constructive feedback is acquired through a grading system, which rates the student's or resident's ability to perform the task in the correct order. The grading system also accounts for the time required to perform the simulated procedure. We evaluated the efficacy of this teaching technique by testing medical students who learned by the computer simulation and those who learned by reading the surgical protocol manual. Both groups then performed TKR on manufactured bone models using real instruments. Their technique was graded with the standard protocol. The students who learned on the computer simulation performed the task in a shorter time and with fewer errors than the control group. They were also more engaged in the learning process. Surgical training programs generally lack a consistent approach to preoperative education related to surgical procedures. This interactive computer tutorial has allowed us to make a quantum leap in medical student and resident teaching in our orthopedic department because the students actually participate in the entire process. Our technique provides a linear, sequential method of skill acquisition and direct feedback, which is ideally suited for learning stepwise surgical protocols. Since our initial evaluation has shown the efficacy of this program, we have implemented this teaching tool into our orthopedic curriculum. Our plans for future work with this simulator include modeling procedures involving other anatomic areas of interest, such as the hip and shoulder.

  3. A Mixed Model for Real-Time, Interactive Simulation of a Cable Passing Through Several Pulleys

    NASA Astrophysics Data System (ADS)

    García-Fernández, Ignacio; Pla-Castells, Marta; Martínez-Durá, Rafael J.

    2007-09-01

    A model of a cable and pulleys is presented that can be used in Real Time Computer Graphics applications. The model is formulated by the coupling of a damped spring and a variable coefficient wave equation, and can be integrated in more complex mechanical models of lift systems, such as cranes, elevators, etc. with a high degree of interactivity.
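
    A minimal sketch of the damped-spring half of such a model is given below in Python (the variable-coefficient wave equation for the transverse cable motion is omitted, and all parameter values are invented). Semi-implicit Euler is used because it stays stable at the large, fixed time steps typical of interactive graphics.

      # A load of mass m hangs from a cable segment treated as a damped spring.
      def step_damped_spring(x, v, dt, m=500.0, k=2.0e5, c=2.0e3, g=9.81):
          force = -k * x - c * v - m * g
          v = v + dt * force / m        # semi-implicit (symplectic) Euler
          x = x + dt * v
          return x, v

      x, v = 0.0, 0.0
      for _ in range(600):              # 10 s at 60 frames per second
          x, v = step_damped_spring(x, v, dt=1.0 / 60.0)
      print(x)                          # settles near the static extension -m*g/k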

  4. A novel pH-responsive interpolyelectrolyte hydrogel complex for the oral delivery of levodopa. Part I. IPEC modeling and synthesis.

    PubMed

    Ngwuluka, Ndidi C; Choonara, Yahya E; Kumar, Pradeep; du Toit, Lisa C; Khan, Riaz A; Pillay, Viness

    2015-03-01

    This study was undertaken to synthesize an interpolyelectrolyte complex (IPEC) of polymethacrylate (E100) and sodium carboxymethylcellulose (NaCMC) to form a polymeric hydrogel material for application in specialized oral drug delivery of sensitive levodopa. Computational modeling was employed to provide insight into the interactions between the polymers. In addition, the reactional profile of NaCMC and polymethacrylate was elucidated using molecular mechanics energy relationships (MMER) and molecular dynamics simulations (MDS) by exploring the spatial disposition of NaCMC and E100 with respect to each other. Computational modeling revealed that the formation of the IPEC was due to strong ionic associations, hydrogen bonding, and hydrophilic interactions. The computational results agreed well with the experimental and analytical data.

  5. Application of advanced computational procedures for modeling solar-wind interactions with Venus: Theory and computer code

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Klenke, D.; Trudinger, B. C.; Spreiter, J. R.

    1980-01-01

    Computational procedures are developed and applied to the prediction of solar wind interaction with nonmagnetic terrestrial planet atmospheres, with particular emphasis to Venus. The theoretical method is based on a single fluid, steady, dissipationless, magnetohydrodynamic continuum model, and is appropriate for the calculation of axisymmetric, supersonic, super-Alfvenic solar wind flow past terrestrial planets. The procedures, which consist of finite difference codes to determine the gasdynamic properties and a variety of special purpose codes to determine the frozen magnetic field, streamlines, contours, plots, etc. of the flow, are organized into one computational program. Theoretical results based upon these procedures are reported for a wide variety of solar wind conditions and ionopause obstacle shapes. Plasma and magnetic field comparisons in the ionosheath are also provided with actual spacecraft data obtained by the Pioneer Venus Orbiter.

  6. A System for Modelling Cell–Cell Interactions during Plant Morphogenesis

    PubMed Central

    Dupuy, Lionel; Mackenzie, Jonathan; Rudge, Tim; Haseloff, Jim

    2008-01-01

    Background and aims: During the development of multicellular organisms, cells are capable of interacting with each other through a range of biological and physical mechanisms. A description of these networks of cell–cell interactions is essential for an understanding of how cellular activity is co-ordinated in regionalized functional entities such as tissues or organs. The difficulty of experimenting on living tissues has been a major limitation to describing such systems, and computer modelling appears particularly helpful to characterize the behaviour of multicellular systems. The experimental difficulties inherent to the multitude of parallel interactions that underlie cellular morphogenesis have led to the need for computer models. Methods: A new generic model of plant cellular morphogenesis is described that expresses interactions amongst cellular entities explicitly: the plant is described as a multi-scale structure, and interactions between distinct entities are established through a topological neighbourhood. Tissues are represented as 2D biphasic systems where the cell wall responds to turgor pressure through viscous yielding. Key results: This principle was used in the development of the CellModeller software, a generic tool dedicated to the analysis and modelling of plant morphogenesis. The system was applied to three contrasting study cases illustrating genetic, hormonal and mechanical factors involved in plant morphogenesis. Conclusions: Plant morphogenesis is fundamentally a cellular process and the CellModeller software, through its underlying generic model, provides an advanced research tool to analyse coupled physical and biological morphogenetic mechanisms. PMID:17921524
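
    The viscous-yielding idea in the Methods section can be caricatured with a Lockhart-type growth law: the wall extends irreversibly only when turgor exceeds a yield threshold, at a rate proportional to the excess. The one-dimensional Python sketch below illustrates that principle only, with invented parameter values; it is not the CellModeller formulation.

      def grow_wall(length, turgor, yield_p=0.3, extensibility=0.05, dt=0.1):
          # Irreversible extension only above the yield threshold (pressures in MPa).
          strain_rate = extensibility * max(turgor - yield_p, 0.0)
          return length * (1.0 + strain_rate * dt)

      length = 10.0                        # arbitrary units
      for _ in range(100):
          length = grow_wall(length, turgor=0.6)
      print(length)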

  7. An experimental-computational platform for investigating microbial interactions and dynamics in communities with two codependent species

    NASA Astrophysics Data System (ADS)

    Fuentes-Cabrera, Miguel; Anderson, John D.; Wilmoth, Jared; Ginovart, Marta; Prats, Clara; Portell-Canal, Xavier; Retterer, Scott

    Microbial interactions are critical for governing community behavior and structure in natural environments. Examination of microbial interactions in the lab involves growth under ideal conditions in batch culture; conditions that occur in nature are, however, characterized by disequilibrium. Of particular interest is the role that system variables play in shaping cell-to-cell interactions and organization at ultrafine spatial scales. We seek to use experiments and agent-based modeling to help discover mechanisms relevant to microbial dynamics and interactions in the environment. Currently, we are using an agent-based model to simulate microbial growth, dynamics and interactions that occur on a microwell-array device developed in our lab. Bacterial cells growing in the microwells of this platform can be studied with high-throughput and high-content image analyses using brightfield and fluorescence microscopy. The agent-based model is written in the language NetLogo, which in turn is "plugged into" a computational framework that allows submitting many calculations in parallel for different initial parameters; visualizing the outcomes in an interactive phase-like diagram; and searching, with a genetic algorithm, for the parameters that lead to the optimal simulation outcome.
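
    The parameter-sweep layer of such a framework can be sketched in a few lines of Python; the snippet below is illustrative only, with a placeholder function standing in for the call that would actually drive the NetLogo model, and it omits the genetic-algorithm search.

      import itertools
      from multiprocessing import Pool

      def run_simulation(params):
          growth_rate, diffusion = params
          # Placeholder: a real run would launch the NetLogo model with these
          # parameters and return a summary statistic (e.g., final abundances).
          return {"growth_rate": growth_rate, "diffusion": diffusion,
                  "outcome": growth_rate / (1.0 + diffusion)}

      if __name__ == "__main__":
          grid = list(itertools.product([0.1, 0.2, 0.4], [0.01, 0.05, 0.1]))
          with Pool(processes=4) as pool:
              results = pool.map(run_simulation, grid)
          for r in results:
              print(r)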

  8. Modeling low-temperature geochemical processes: Chapter 2

    USGS Publications Warehouse

    Nordstrom, D. Kirk; Campbell, Kate M.

    2014-01-01

    This chapter provides an overview of geochemical modeling that applies to water–rock interactions under ambient conditions of temperature and pressure. Topics include modeling definitions, historical background, issues of activity coefficients, popular codes and databases, examples of modeling common types of water–rock interactions, and issues of model reliability. Examples include speciation, microbial redox kinetics and ferrous iron oxidation, calcite dissolution, pyrite oxidation, combined pyrite and calcite dissolution, dedolomitization, seawater–carbonate groundwater mixing, reactive-transport modeling in streams, modeling catchments, and evaporation of seawater. The chapter emphasizes limitations to geochemical modeling: that a proper understanding and ability to communicate model results well are as important as completing a set of useful modeling computations and that greater sophistication in model and code development is not necessarily an advancement. If the goal is to understand how a particular geochemical system behaves, it is better to collect more field data than rely on computer codes.

  9. Teaching Concept Mapping and University Level Study Strategies Using Computers.

    ERIC Educational Resources Information Center

    Mikulecky, Larry; And Others

    1989-01-01

    Assesses the utility and effectiveness of three interactive computer programs and associated print materials in instructing and modeling for undergraduates how to comprehend and reconceptualize scientific textbook material. Finds that "how to" reading strategies can be taught via computer and transferred to new material. (RS)

  10. Engaging Undergraduate Math Majors in Geoscience Research using Interactive Simulations and Computer Art

    NASA Astrophysics Data System (ADS)

    Matott, L. S.; Hymiak, B.; Reslink, C. F.; Baxter, C.; Aziz, S.

    2012-12-01

    As part of the NSF-sponsored 'URGE (Undergraduate Research Group Experiences) to Compute' program, Dr. Matott has been collaborating with talented Math majors to explore the design of cost-effective systems to safeguard groundwater supplies from contaminated sites. Such activity is aided by a combination of groundwater modeling, simulation-based optimization, and high-performance computing - disciplines largely unfamiliar to the students at the outset of the program. To help train and engage the students, a number of interactive and graphical software packages were utilized. Examples include: (1) a tutorial for exploring the behavior of evolutionary algorithms and other heuristic optimizers commonly used in simulation-based optimization; (2) an interactive groundwater modeling package for exploring alternative pump-and-treat containment scenarios at a contaminated site in Billings, Montana; (3) the R software package for visualizing various concepts related to subsurface hydrology; and (4) a job visualization tool for exploring the behavior of numerical experiments run on a large distributed computing cluster. Further engagement and excitement in the program was fostered by entering (and winning) a computer art competition run by the Coalition for Academic Scientific Computation (CASC). The winning submission visualizes an exhaustively mapped optimization cost surface and dramatically illustrates the phenomena of artificial minima - valley locations that correspond to designs whose costs are only partially optimal.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busbey, A.B.

    Seismic Processing Workshop, a program by Parallel Geosciences of Austin, TX, is discussed in this column. The program is a high-speed, interactive seismic processing and computer analysis system for the Apple Macintosh II family of computers. Also reviewed in this column are three products from Wilkerson Associates of Champaign, IL. SubSide is an interactive program for basin subsidence analysis; MacFault and MacThrustRamp are programs for modeling faults.

  12. Use of computer modeling to investigate a dynamic interaction problem in the Skylab TACS quad-valve package

    NASA Technical Reports Server (NTRS)

    Hesser, R. J.; Gershman, R.

    1975-01-01

    A valve opening-response problem encountered during development of a control valve for the Skylab thruster attitude control system (TACS) is described. The problem involved effects of dynamic interaction among valves in the quad-redundant valve package. Also described is a detailed computer simulation of the quad-valve package which was helpful in resolving the problem.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Oishik, E-mail: oishik-sen@uiowa.edu; Gaul, Nicholas J., E-mail: nicholas-gaul@ramdosolutions.com; Choi, K.K., E-mail: kyung-choi@uiowa.edu

    Macro-scale computations of shocked particulate flows require closure laws that model the exchange of momentum/energy between the fluid and particle phases. Closure laws are constructed in this work in the form of surrogate models derived from highly resolved mesoscale computations of shock-particle interactions. The mesoscale computations are performed to calculate the drag force on a cluster of particles for different values of Mach number and particle volume fraction. Two Kriging-based methods, viz. the Dynamic Kriging Method (DKG) and the Modified Bayesian Kriging Method (MBKG), are evaluated for their ability to construct surrogate models with sparse data, i.e. using the least number of mesoscale simulations. It is shown that if the input data are noise-free, the DKG method converges monotonically; convergence is less robust in the presence of noise. The MBKG method converges monotonically even with noisy input data and is therefore more suitable for surrogate model construction from numerical experiments. This work is the first step towards a full multiscale modeling of the interaction of shocked particle-laden flows.
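
    For readers unfamiliar with Kriging surrogates, the Python sketch below fits a Gaussian-process model of drag as a function of Mach number and particle volume fraction from a small synthetic data set. It uses scikit-learn's generic GaussianProcessRegressor as a stand-in; it is not the DKG or MBKG method evaluated in the paper, and the training data are fabricated for illustration.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      X = rng.uniform([1.2, 0.05], [3.0, 0.4], size=(20, 2))    # (Mach, volume fraction)
      y = 1.5 * X[:, 0] * (1.0 + 4.0 * X[:, 1]) + 0.05 * rng.normal(size=20)  # synthetic drag

      kernel = RBF(length_scale=[0.5, 0.1]) + WhiteKernel(noise_level=1e-2)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

      mean, std = gp.predict(np.array([[2.0, 0.2]]), return_std=True)
      print(mean, std)                  # surrogate prediction and its uncertainty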

  14. On the quasi-conical flowfield structure of the swept shock wave-turbulent boundary layer interaction

    NASA Technical Reports Server (NTRS)

    Knight, Doyle D.; Badekas, Dias

    1991-01-01

    The swept oblique shock-wave/turbulent-boundary-layer interaction generated by a 20-deg sharp fin at Mach 4 and Reynolds number 21,000 is investigated via a series of computations using both conical and three-dimensional Reynolds-averaged Navier-Stokes equations with turbulence incorporated through the algebraic turbulent eddy viscosity model of Baldwin-Lomax. Results are compared with known experimental data, and it is concluded that the computed three-dimensional flowfield is quasi-conical (in agreement with the experimental data), the computed three-dimensional and conical surface pressure and surface flow direction are in good agreement with the experiment, and the three-dimensional and conical flows significantly underpredict the peak experimental skin friction. It is pointed out that most of the features of the conical flowfield model in the experiment are observed in the conical computation which also describes the complete conical streamline pattern not included in the model of the experiment.

  15. Computational substrates of social value in interpersonal collaboration.

    PubMed

    Fareri, Dominic S; Chang, Luke J; Delgado, Mauricio R

    2015-05-27

    Decisions to engage in collaborative interactions require enduring considerable risk, yet provide the foundation for building and maintaining relationships. Here, we investigate the mechanisms underlying this process and test a computational model of social value to predict collaborative decision making. Twenty-six participants played an iterated trust game and chose to invest more frequently with their friends compared with a confederate or computer despite equal reinforcement rates. This behavior was predicted by our model, which posits that people receive a social value reward signal from reciprocation of collaborative decisions conditional on the closeness of the relationship. This social value signal was associated with increased activity in the ventral striatum and medial prefrontal cortex, which significantly predicted the reward parameters from the social value model. Therefore, we demonstrate that the computation of social value drives collaborative behavior in repeated interactions and provide a mechanistic account of reward circuit function instantiating this process.
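
    A toy version of the social-value idea (not the fitted model from the paper) is sketched below in Python: the reward felt on reciprocation is scaled by relationship closeness, and an expected value per partner is updated with a simple delta rule. Closeness values, the learning rate, and the update rule are all assumptions made for the example.

      def update_expected_value(v, outcome, closeness, lr=0.2):
          # outcome: 1.0 if the partner reciprocated the investment, 0.0 otherwise.
          social_reward = outcome * (1.0 + closeness)     # closeness in [0, 1]
          return v + lr * (social_reward - v)

      partners = {"friend": 0.9, "confederate": 0.1, "computer": 0.0}
      values = {name: 0.0 for name in partners}
      outcomes = [1, 0, 1, 1, 0, 1]                        # identical reinforcement for all partners
      for o in outcomes:
          for name, closeness in partners.items():
              values[name] = update_expected_value(values[name], float(o), closeness)
      print(values)                     # the friend accrues the highest expected social value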

  16. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    PubMed

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.

  18. qPIPSA: Relating enzymatic kinetic parameters and interaction fields

    PubMed Central

    Gabdoulline, Razif R; Stein, Matthias; Wade, Rebecca C

    2007-01-01

    Background: The simulation of metabolic networks in quantitative systems biology requires the assignment of enzymatic kinetic parameters. Experimentally determined values are often not available and therefore computational methods to estimate these parameters are needed. It is possible to use the three-dimensional structure of an enzyme to perform simulations of a reaction and derive kinetic parameters. However, this is computationally demanding and requires detailed knowledge of the enzyme mechanism. We have therefore sought to develop a general, simple and computationally efficient procedure to relate protein structural information to enzymatic kinetic parameters that allows consistency between the kinetic and structural information to be checked and estimation of kinetic constants for structurally and mechanistically similar enzymes. Results: We describe qPIPSA: quantitative Protein Interaction Property Similarity Analysis. In this analysis, molecular interaction fields, for example, electrostatic potentials, are computed from the enzyme structures. Differences in molecular interaction fields between enzymes are then related to the ratios of their kinetic parameters. This procedure can be used to estimate unknown kinetic parameters when enzyme structural information is available and kinetic parameters have been measured for related enzymes or were obtained under different conditions. The detailed interaction of the enzyme with substrate or cofactors is not modeled and is assumed to be similar for all the proteins compared. The protein structure modeling protocol employed ensures that differences between models reflect genuine differences between the protein sequences, rather than random fluctuations in protein structure. Conclusion: Provided that the experimental conditions and the protein structural models refer to the same protein state or conformation, correlations between interaction fields and kinetic parameters can be established for sets of related enzymes. Outliers may arise due to variation in the importance of different contributions to the kinetic parameters, such as protein stability and conformational changes. The qPIPSA approach can assist in the validation as well as estimation of kinetic parameters, and provide insights into enzyme mechanism. PMID:17919319
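
    The core comparison in this kind of analysis can be illustrated with a Hodgkin-type similarity index between two potentials sampled on matching grid points, and a linear map from the resulting distance to a log-ratio of kinetic parameters. The Python sketch below uses random numbers in place of real potentials, and the calibration slope is a hypothetical constant; it is meant only to convey the shape of the calculation, not the published protocol.

      import numpy as np

      def hodgkin_index(p1, p2):
          return 2.0 * np.dot(p1, p2) / (np.dot(p1, p1) + np.dot(p2, p2))

      def estimate_log_ratio(p_query, p_reference, slope=1.0):
          # slope would be calibrated on enzymes with known kinetic parameters.
          distance = np.sqrt(max(0.0, 2.0 - 2.0 * hodgkin_index(p_query, p_reference)))
          return slope * distance

      rng = np.random.default_rng(1)
      p_ref = rng.normal(size=500)                      # potential values on a comparison grid
      p_qry = p_ref + 0.2 * rng.normal(size=500)        # a structurally similar enzyme
      print(hodgkin_index(p_qry, p_ref), estimate_log_ratio(p_qry, p_ref))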

  19. Generalization through the Recurrent Interaction of Episodic Memories: A Model of the Hippocampal System

    ERIC Educational Resources Information Center

    Kumaran, Dharshan; McClelland, James L.

    2012-01-01

    In this article, we present a perspective on the role of the hippocampal system in generalization, instantiated in a computational model called REMERGE (recurrency and episodic memory results in generalization). We expose a fundamental, but neglected, tension between prevailing computational theories that emphasize the function of the hippocampus…

  20. Subgroup Discovery with User Interaction Data: An Empirically Guided Approach to Improving Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Poitras, Eric G.; Lajoie, Susanne P.; Doleck, Tenzin; Jarrell, Amanda

    2016-01-01

    Learner modeling, a challenging and complex endeavor, is an important and oft-studied research theme in computer-supported education. From this perspective, Educational Data Mining (EDM) research has focused on modeling and comprehending various dimensions of learning in computer-based learning environments (CBLE). Researchers and designers are…

  1. Modelling the Landing of a Plane in a Calculus Lab

    ERIC Educational Resources Information Center

    Morante, Antonio; Vallejo, Jose A.

    2012-01-01

    We exhibit a simple model of a plane landing that involves only basic concepts of differential calculus, so it is suitable for a first-year calculus lab. We use the computer algebra system Maxima and the interactive geometry software GeoGebra to do the computations and graphics. (Contains 5 figures and 1 note.)
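
    The calculation behind such a lab is easy to reproduce; a minimal Python version (rather than Maxima/GeoGebra) is sketched below, under the usual assumptions of this exercise: a cubic altitude profile y(x) meeting y(0)=0, y'(0)=0 at touchdown and y(L)=h, y'(L)=0 at the start of descent. The numbers are illustrative.

      import numpy as np

      def landing_path(h, L):
          # The four boundary conditions give y(x) = h * (3*(x/L)**2 - 2*(x/L)**3).
          def y(x):
              s = x / L
              return h * (3.0 * s**2 - 2.0 * s**3)
          return y

      h, L = 1000.0, 20000.0            # metres
      y = landing_path(h, L)
      print(y(np.linspace(0.0, L, 5)))  # altitude profile from touchdown back to cruise
      # Steepest descent occurs at x = L/2, where |y'| = 3*h/(2*L).
      print(np.degrees(np.arctan(3.0 * h / (2.0 * L))))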

  2. Close-packed structure dynamics with finite-range interaction: computational mechanics with individual layer interaction.

    PubMed

    Rodriguez-Horta, Edwin; Estevez-Rams, Ernesto; Lora-Serrano, Raimundo; Neder, Reinhard

    2017-09-01

    This is the second contribution in a series of papers dealing with dynamical models in equilibrium theories of polytypism. A Hamiltonian introduced by Ahmad & Khan [Phys. Status Solidi B (2000), 218, 425-430] avoids the unphysical assignment of interaction terms to fictitious entities given by spins in the Hägg coding of the stacking arrangement. In this paper an analysis of polytype generation and disorder in close-packed structures is made for such a Hamiltonian. Results are compared with a previous analysis using the Ising model. Computational mechanics is the framework under which the analysis is performed. The competing effects of disorder and structure, as given by entropy density and excess entropy, respectively, are discussed. It is argued that the Ahmad & Khan model is simpler and predicts a larger set of polytypes than previous treatments.

  3. Computational challenges of structure-based approaches applied to HIV.

    PubMed

    Forli, Stefano; Olson, Arthur J

    2015-01-01

    Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.

  4. Flexible configuration-interaction shell-model many-body solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Calvin W.; Ormand, W. Erich; McElvain, Kenneth S.

    BIGSTICK is a flexible, open-source configuration-interaction shell-model code for the many-fermion problem in a shell-model (occupation representation) framework. BIGSTICK can generate energy spectra, static and transition one-body densities, and expectation values of scalar operators. Using the built-in Lanczos algorithm, one can compute transition probability distributions and decompose wave functions into components defined by group theory.
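
    As a reminder of what the Lanczos step does, the Python sketch below builds a small tridiagonal matrix from repeated matrix-vector products and reads off the lowest eigenvalue; a dense random symmetric matrix stands in for the shell-model Hamiltonian, and no re-orthogonalization is performed (adequate for this illustration, not for production use).

      import numpy as np

      def lanczos_lowest(apply_h, dim, n_iter=50, seed=0):
          rng = np.random.default_rng(seed)
          v = rng.normal(size=dim)
          v /= np.linalg.norm(v)
          v_prev = np.zeros(dim)
          alphas, betas, beta = [], [], 0.0
          for _ in range(n_iter):
              w = apply_h(v) - beta * v_prev
              alpha = np.dot(v, w)
              w -= alpha * v
              beta = np.linalg.norm(w)
              alphas.append(alpha)
              if beta < 1e-12:
                  break
              betas.append(beta)
              v_prev, v = v, w / beta
          k = len(alphas)
          t = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
          return np.linalg.eigvalsh(t)[0]

      n = 200
      a = np.random.default_rng(1).normal(size=(n, n))
      h = (a + a.T) / 2.0                               # stand-in for the Hamiltonian
      print(lanczos_lowest(lambda x: h @ x, n), np.linalg.eigvalsh(h)[0])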

  5. A Case-Series Test of the Interactive Two-Step Model of Lexical Access: Predicting Word Repetition from Picture Naming

    ERIC Educational Resources Information Center

    Dell, Gary S.; Martin, Nadine; Schwartz, Myrna F.

    2007-01-01

    Lexical access in language production, and particularly pathologies of lexical access, are often investigated by examining errors in picture naming and word repetition. In this article, we test a computational approach to lexical access, the two-step interactive model, by examining whether the model can quantitatively predict the repetition-error…

  6. Regularization method for large eddy simulations of shock-turbulence interactions

    NASA Astrophysics Data System (ADS)

    Braun, N. O.; Pullin, D. I.; Meiron, D. I.

    2018-05-01

    The rapid change in scales over a shock has the potential to introduce unique difficulties in Large Eddy Simulations (LES) of compressible shock-turbulence flows if the governing model does not sufficiently capture the spectral distribution of energy in the upstream turbulence. A method for the regularization of LES of shock-turbulence interactions is presented that enforces a k^(-5/3) decay of the energy content in the highest resolved wavenumbers and is computed locally in physical space at low computational cost. The application of the regularization to an existing subgrid scale model is shown to remove high wavenumber errors while maintaining agreement with Direct Numerical Simulations (DNS) of forced and decaying isotropic turbulence. Linear interaction analysis is implemented to model the interaction of a shock with isotropic turbulence from LES. Comparisons to analytical models suggest that the regularization significantly improves the ability of the LES to predict amplifications in subgrid terms over the modeled shockwave. LES and DNS of decaying, modeled post-shock turbulence are also considered, and inclusion of the regularization in shock-turbulence LES is shown to improve agreement with lower Reynolds number DNS.

  7. Assessment of Turbulent Shock-Boundary Layer Interaction Computations Using the OVERFLOW Code

    NASA Technical Reports Server (NTRS)

    Oliver, A. B.; Lillard, R. P.; Schwing, A. M.; Blaisdell, G. A.; Lyrintzis, A. S.

    2007-01-01

    The performance of two popular turbulence models, the Spalart-Allmaras model and Menter's SST model, and one relatively new model, Olsen & Coakley's Lag model, is evaluated using the OVERFLOW code. Turbulent shock-boundary layer interaction predictions are evaluated with three different experimental datasets: a series of 2D compression ramps at Mach 2.87, a series of 2D compression ramps at Mach 2.94, and an axisymmetric cone-flare at Mach 11. The experimental datasets include flows with no separation, moderate separation, and significant separation, and use several different experimental measurement techniques (including laser Doppler velocimetry (LDV), pitot-probe measurement, inclined hot-wire probe measurement, Preston-tube skin friction measurement, and surface pressure measurement). Additionally, the OVERFLOW solutions are compared to the solutions of a second CFD code, DPLR. The predictions for weak shock-boundary layer interactions are in reasonable agreement with the experimental data. For strong shock-boundary layer interactions, all of the turbulence models overpredict the separation size and fail to predict the correct skin friction recovery distribution. In most cases, surface pressure predictions show too much upstream influence; however, including the tunnel side-wall boundary layers in the computation improves the separation predictions.

  8. Color Algebras

    NASA Technical Reports Server (NTRS)

    Mulligan, Jeffrey B.

    2017-01-01

    A color algebra refers to a system for computing sums and products of colors, analogous to additive and subtractive color mixtures. The difficulty addressed here is the fact that, because of metamerism, we cannot know with certainty the spectrum that produced a particular color solely on the basis of sensory data. Knowledge of the spectrum is not required to compute additive mixture of colors, but is critical for subtractive (multiplicative) mixture. Therefore, we cannot predict with certainty the multiplicative interactions between colors based solely on sensory data. There are two potential applications of a color algebra: first, to aid modeling phenomena of human visual perception, such as color constancy and transparency; and, second, to provide better models of the interactions of lights and surfaces for computer graphics rendering.
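
    A toy numerical illustration of the point about metamerism: two spectra with identical sensor responses give identical additive mixtures with a third light, but generally different multiplicative (filter-like) mixtures. In the Python sketch below the sensor sensitivities, spectra, and filter are made-up arrays, not real colour-matching functions.

      import numpy as np

      wavelengths = np.linspace(400.0, 700.0, 31)
      # Three hypothetical broadband sensor sensitivities (one row per sensor).
      S = np.stack([np.exp(-((wavelengths - c) / 60.0) ** 2) for c in (600.0, 550.0, 450.0)])

      spec_a = 1.0 + 0.3 * np.sin(wavelengths / 40.0)
      # Build a metamer of spec_a by adding a component from the null space of S.
      _, _, vt = np.linalg.svd(S)
      spec_b = spec_a + 0.5 * vt[-1]                    # S @ vt[-1] is (numerically) zero

      filt = np.clip((wavelengths - 450.0) / 250.0, 0.0, 1.0)   # a transmissive filter

      print(S @ (spec_a + filt), S @ (spec_b + filt))   # additive mixtures agree
      print(S @ (spec_a * filt), S @ (spec_b * filt))   # multiplicative mixtures differ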

  9. A framework for analyzing the cognitive complexity of computer-assisted clinical ordering.

    PubMed

    Horsky, Jan; Kaufman, David R; Oppenheim, Michael I; Patel, Vimla L

    2003-01-01

    Computer-assisted provider order entry is a technology that is designed to expedite medical ordering and to reduce the frequency of preventable errors. This paper presents a multifaceted cognitive methodology for the characterization of cognitive demands of a medical information system. Our investigation was informed by the distributed resources (DR) model, a novel approach designed to describe the dimensions of user interfaces that introduce unnecessary cognitive complexity. This method evaluates the relative distribution of external (system) and internal (user) representations embodied in system interaction. We conducted an expert walkthrough evaluation of a commercial order entry system, followed by a simulated clinical ordering task performed by seven clinicians. The DR model was employed to explain variation in user performance and to characterize the relationship of resource distribution and ordering errors. The analysis revealed that the configuration of resources in this ordering application placed unnecessarily heavy cognitive demands on the user, especially on those who lacked a robust conceptual model of the system. The resources model also provided some insight into clinicians' interactive strategies and patterns of associated errors. Implications for user training and interface design based on the principles of human-computer interaction in the medical domain are discussed.

  10. Tools for building a comprehensive modeling system for virtual screening under real biological conditions: The Computational Titration algorithm.

    PubMed

    Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea

    2006-05-01

    Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.

  11. Efficient coarse simulation of a growing avascular tumor

    PubMed Central

    Kavousanakis, Michail E.; Liu, Ping; Boudouvis, Andreas G.; Lowengrub, John; Kevrekidis, Ioannis G.

    2013-01-01

    The subject of this work is the development and implementation of algorithms which accelerate the simulation of early stage tumor growth models. Among the different computational approaches used for the simulation of tumor progression, discrete stochastic models (e.g., cellular automata) have been widely used to describe processes occurring at the cell and subcell scales (e.g., cell-cell interactions and signaling processes). To describe macroscopic characteristics (e.g., morphology) of growing tumors, large numbers of interacting cells must be simulated. However, the high computational demands of stochastic models make the simulation of large-scale systems impractical. Alternatively, continuum models, which can describe behavior at the tumor scale, often rely on phenomenological assumptions in place of rigorous upscaling of microscopic models. This limits their predictive power. In this work, we circumvent the derivation of closed macroscopic equations for the growing cancer cell populations; instead, we construct, based on the so-called “equation-free” framework, a computational superstructure, which wraps around the individual-based cell-level simulator and accelerates the computations required for the study of the long-time behavior of systems involving many interacting cells. The microscopic model, e.g., a cellular automaton, which simulates the evolution of cancer cell populations, is executed for relatively short time intervals, at the end of which coarse-scale information is obtained. These coarse variables evolve on slower time scales than each individual cell in the population, enabling the application of forward projection schemes, which extrapolate their values at later times. This technique is referred to as coarse projective integration. Increasing the ratio of projection times to microscopic simulator execution times enhances the computational savings. Crucial accuracy issues arising for growing tumors with radial symmetry are addressed by applying the coarse projective integration scheme in a cotraveling (cogrowing) frame. As a proof of principle, we demonstrate that the application of this scheme yields highly accurate solutions, while preserving the computational savings of coarse projective integration. PMID:22587128
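
    The projection step itself is simple enough to sketch generically. In the Python snippet below, noisy logistic growth stands in for the cell-level simulator; burst lengths, the projection horizon, and all parameters are invented, and no cotraveling-frame correction is included.

      import numpy as np

      def micro_burst(state, n_steps, dt, rng):
          # Placeholder for the stochastic microscopic simulator (coarse variable
          # here: total cell count with noisy logistic growth).
          for _ in range(n_steps):
              state += dt * 0.5 * state * (1.0 - state / 1e4) + rng.normal(0.0, 1.0)
          return state

      rng = np.random.default_rng(0)
      dt, burst, projection = 0.01, 100, 3.0            # projection leap > burst duration
      x, t = 100.0, 0.0
      for _ in range(20):
          x_mid = micro_burst(x, burst // 2, dt, rng)
          x_end = micro_burst(x_mid, burst - burst // 2, dt, rng)
          slope = (x_end - x_mid) / ((burst - burst // 2) * dt)   # coarse time derivative
          x = x_end + projection * slope                          # forward projection
          t += burst * dt + projection
      print(t, x)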

  12. Evaluating Pharmacokinetic and Pharmacodynamic Interactions with Computational Models in Cumulative Risk Assessment

    EPA Science Inventory

    Simultaneous or sequential exposure to multiple chemicals may cause interactions in the pharmacokinetics (PK) and/or pharmacodynamics (PD) of the individual chemicals. Such interactions can cause modification of the internal or target dose/response of one chemical in the mixture ...

  13. Performance estimation of a Venturi scrubber using a computational model for capturing dust particles with liquid spray.

    PubMed

    Pak, S I; Chang, K S

    2006-12-01

    A Venturi scrubber contains a dispersed three-phase flow of gas, dust, and liquid. Atomization of the liquid jet and interaction between the phases have a large effect on the performance of Venturi scrubbers. In this study, a computational model for the interactive three-phase flow in a Venturi scrubber has been developed to estimate pressure drop and collection efficiency. The Eulerian-Lagrangian method is used to solve the model numerically. Gas flow is solved using the Eulerian approach with the Navier-Stokes equations, and the motion of dust and liquid droplets, described by the Basset-Boussinesq-Oseen (B-B-O) equation, is solved using the Lagrangian approach. This model includes interaction between gas and droplets, atomization of a liquid jet, droplet deformation, breakup and collision of droplets, and capture of dust by droplets. A circular Pease-Anthony Venturi scrubber was simulated numerically with this new model. The numerical results were compared with earlier experimental data for pressure drop and collection efficiency, and showed good agreement.
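
    For a sense of the Lagrangian side of such a model, the Python sketch below advances a single droplet in a gas stream with the standard Schiller-Naumann drag correlation; it omits atomization, breakup, collisions, and dust capture, and the drag law is used here as a generic stand-in rather than as the paper's exact closure. Values are illustrative.

      def drag_coefficient(re):
          # Schiller-Naumann correlation, valid up to Re of about 1000.
          return 24.0 / re * (1.0 + 0.15 * re ** 0.687) if re > 1e-8 else 0.0

      def step_droplet(v_d, v_g, dt, d=50e-6, rho_d=1000.0, rho_g=1.2, mu_g=1.8e-5):
          rel = v_g - v_d
          re = rho_g * abs(rel) * d / mu_g
          accel = 0.75 * drag_coefficient(re) * rho_g * abs(rel) * rel / (rho_d * d)
          return v_d + dt * accel       # drag acceleration only

      v_droplet, v_gas = 0.0, 60.0      # m/s, throat-like gas velocity
      for _ in range(2000):
          v_droplet = step_droplet(v_droplet, v_gas, dt=1e-5)
      print(v_droplet)                  # droplet accelerates toward the gas speed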

  14. A Cohomological Perspective on Algebraic Quantum Field Theory

    NASA Astrophysics Data System (ADS)

    Hawkins, Eli

    2018-05-01

    Algebraic quantum field theory is considered from the perspective of the Hochschild cohomology bicomplex. This is a framework for studying deformations and symmetries. Deformation is a possible approach to the fundamental challenge of constructing interacting QFT models. Symmetry is the primary tool for understanding the structure and properties of a QFT model. This perspective leads to a generalization of the algebraic quantum field theory framework, as well as a more general definition of symmetry. This means that some models may have symmetries that were not previously recognized or exploited. To first order, a deformation of a QFT model is described by a Hochschild cohomology class. A deformation could, for example, correspond to adding an interaction term to a Lagrangian. The cohomology class for such an interaction is computed here. However, the result is more general and does not require the undeformed model to be constructed from a Lagrangian. This computation leads to a more concrete version of the construction of perturbative algebraic quantum field theory.
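
    For reference, the algebraic statement behind "described by a Hochschild cohomology class" is the standard one, quoted here in its purely algebraic form (without the functional-analytic and net-indexed refinements an AQFT treatment requires). The Hochschild coboundary of an n-cochain f on an algebra A is

      \[
      (\delta f)(a_1,\dots,a_{n+1}) = a_1\, f(a_2,\dots,a_{n+1})
        + \sum_{i=1}^{n} (-1)^i\, f(a_1,\dots,a_i a_{i+1},\dots,a_{n+1})
        + (-1)^{n+1} f(a_1,\dots,a_n)\, a_{n+1},
      \]

    with \(\delta^2 = 0\). A first-order deformation of the product, \(a \cdot_\hbar b = ab + \hbar\,\mu_1(a,b) + O(\hbar^2)\), is associative to first order exactly when \(\delta\mu_1 = 0\), and deformations differing by a coboundary can be absorbed by a change of variables; in this sense first-order deformations are classified by second-cohomology classes.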

  15. Method of performing computational aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Silva, Walter A. (Inventor)

    2011-01-01

    Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.

  16. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, the antenna primary beam, and attitude control requirements.

  17. Virtual imprinting as a tool to design efficient MIPs for photosynthesis-inhibiting herbicides.

    PubMed

    Breton, Florent; Rouillon, Regis; Piletska, Elena V; Karim, Kal; Guerreiro, Antonio; Chianella, Iva; Piletsky, Sergey A

    2007-04-15

    Molecular modelling and computational screening were used to identify functional monomers capable of interacting with several different photosynthesis-inhibiting herbicides. The process involved the design of a virtual library of molecular models of functional monomers containing polymerizable residues and residues able to interact with the template through electrostatic, hydrophobic, Van der Waals forces and dipole-dipole interactions. Each of the entries in the virtual library was probed for its possible interactions with molecular models of the template molecules. It was anticipated that the monomers giving the highest binding score would represent good candidates for the preparation of affinity polymers. Strong interactions were computationally determined between acidic functional monomers like methacrylic acid (MAA) or itaconic acid (IA) with triazines, and between vinylimidazole with bentazone and bromoxynil. Nevertheless, weaker interactions were seen with phenylureas. The corresponding blank polymers were prepared using the selected monomers and tested in the solid phase extraction (SPE) of herbicides from chloroform solutions. A good correlation was found between the binding score of the monomers and the affinities of the corresponding polymers. The use of computationally designed blanks can potentially eliminate the need for molecular imprinting, (adding a template to the monomer mixture to create specific binding sites). Data also showed that some monomers have a natural selectivity for some herbicides, which can be further enhanced by imprinting. Thus, in regard to retention on the blank polymer, we can estimate if the resulting imprinted polymer will be effective or not.

  18. The MV model of the color glass condensate for a finite number of sources including Coulomb interactions

    DOE PAGES

    McLerran, Larry; Skokov, Vladimir V.

    2016-09-19

    We modify the McLerran–Venugopalan model to include only a finite number of sources of color charge. In the effective action for such a system of a finite number of sources, there is a point-like interaction and a Coulombic interaction. The point interaction generates the standard fluctuation term in the McLerran–Venugopalan model. The Coulomb interaction generates the charge screening originating from well known evolution in x. Such a model may be useful for computing angular harmonics of flow measured in high energy hadron collisions for small systems. In this study we provide a basic formulation of the problem on a lattice.

  19. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges.
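
    Of the four approaches listed, discrete-event simulation is perhaps the easiest to sketch. The Python snippet below simulates a single-stream emergency department with exponential interarrival and service times and reports the mean wait; the arrival rate, service rate, and number of providers are invented for the example.

      import heapq
      import random

      def simulate_ed(arrival_rate=6.0, service_rate=1.0, providers=8,
                      horizon=1000.0, seed=0):
          rng = random.Random(seed)
          events = [(rng.expovariate(arrival_rate), "arrival")]
          free, queue, waits = providers, [], []
          while events:
              t, kind = heapq.heappop(events)
              if t > horizon:
                  break
              if kind == "arrival":
                  heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
                  queue.append(t)
              else:                         # a provider becomes free
                  free += 1
              while free > 0 and queue:     # start treating waiting patients
                  waits.append(t - queue.pop(0))
                  free -= 1
                  heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
          return sum(waits) / len(waits) if waits else 0.0

      print(simulate_ed())                  # mean wait (hours per patient)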

  20. I Use the Computer to ADVANCE Advances in Comprehension-Strategy Research.

    ERIC Educational Resources Information Center

    Blohm, Paul J.

    Merging the instructional implications drawn from theory and research in the interactive reading model, schemata, and metacognition with computer based instruction seems a natural approach for actively involving students' participation in reading and learning from text. Computer based graphic organizers guide students' preview or review of lengthy…

  1. Modeling Mendel's Laws on Inheritance in Computational Biology and Medical Sciences

    ERIC Educational Resources Information Center

    Singh, Gurmukh; Siddiqui, Khalid; Singh, Mankiran; Singh, Satpal

    2011-01-01

    The current research article is based on a simple and practical way of employing the computational power of widely available, versatile software MS Excel 2007 to perform interactive computer simulations for undergraduate/graduate students in biology, biochemistry, biophysics, microbiology, medicine in college and university classroom setting. To…
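
    A Python analogue of that kind of classroom simulation (the article itself used MS Excel) is shown below: a stochastic monohybrid Aa x Aa cross whose fraction of dominant phenotypes approaches the Mendelian 3:1 ratio. The sample size and seed are arbitrary.

      import random

      def cross(parent1="Aa", parent2="Aa", n_offspring=10000, seed=0):
          rng = random.Random(seed)
          dominant = 0
          for _ in range(n_offspring):
              genotype = rng.choice(parent1) + rng.choice(parent2)
              if "A" in genotype:           # 'A' dominant over 'a'
                  dominant += 1
          return dominant / n_offspring

      print(cross())                        # close to 0.75, the classic 3:1 phenotype ratio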

  2. Evaluative methodology for comprehensive water quality management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, H. L.

    Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.

  3. A queueing model of pilot decision making in a multi-task flight management situation

    NASA Technical Reports Server (NTRS)

    Walden, R. S.; Rouse, W. B.

    1977-01-01

    Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.

  4. Development of a computational model for predicting solar wind flows past nonmagnetic terrestrial planets

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Spreiter, J. R.

    1983-01-01

    A computational model for the determination of the detailed plasma and magnetic field properties of the global interaction of the solar wind with nonmagnetic terrestrial planetary obstacles is described. The theoretical method is based on an established single fluid, steady, dissipationless, magnetohydrodynamic continuum model, and is appropriate for the calculation of supersonic, super-Alfvenic solar wind flow past terrestrial ionospheres.

  5. Documentation and user guides for SPBLOB: a computer simulation model of the joint population dynamics for loblolly pine and the southern pine beetle

    Treesearch

    John Bishir; James Roberds; Brian Strom; Xiaohai Wan

    2009-01-01

    SPBLOB is a computer simulation model for the interaction between loblolly pine (Pinus taeda L.), the economically most important forest crop in the United States, and the southern pine beetle (SPB: Dendroctonus frontalis Zimm.), the major insect pest for this species. The model simulates loblolly pine stands from time of planting...

  6. Multiscale Space-Time Computational Methods for Fluid-Structure Interactions

    DTIC Science & Technology

    2015-09-13

    ... prescribed fully or partially, is from an actual locust, extracted from high-speed, multi-camera video recordings of the locust in a wind tunnel. We use ... With creative methods for coupling the fluid and structure, we can increase the scope and efficiency of the FSI modeling. Multiscale methods, which now ... play an important role in computational mathematics, can also increase the accuracy and efficiency of the computer modeling techniques. The main ...

  7. Estimation and Mapping of Clouds and Rainfall Areas with an Interactive Computer.

    DTIC Science & Technology

    1982-12-01

    The recoverable fragments of this record describe test procedures for the SPADS Cloud Model, beginning with a step to capture imagery, and an appendix listing the cloud-model computer program, which analyzes visible (VIS) and infrared (IR) imagery together (Naval Postgraduate School, Monterey, CA; C. A. Nelson, December 1982).

  8. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  9. Computer constructed imagery of distant plasma interaction boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grenstadt, E.W.; Schurr, H.D.; Tsugawa, R.K.

    1982-01-01

    Computer-constructed sketches of plasma boundaries arising from the interaction between the solar wind and the magnetosphere can serve as both didactic and research tools. In particular, the structure of the earth's bow shock can be represented as a nonuniform surface according to the instantaneous orientation of the IMF, and temporal changes in structural distribution can be modeled as a sequence of sketches based on observed sequences of spacecraft-based measurements. Viewed rapidly, such a sequence of sketches can be the basis for representation of plasma processes by computer animation.

  10. Model-Based Design of Air Traffic Controller-Automation Interaction

    NASA Technical Reports Server (NTRS)

    Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.

  11. NASTRAN data generation of helicopter fuselages using interactive graphics. [preprocessor system for finite element analysis using IBM computer

    NASA Technical Reports Server (NTRS)

    Sainsbury-Carter, J. B.; Conaway, J. H.

    1973-01-01

    The development and implementation of a preprocessor system for the finite element analysis of helicopter fuselages is described. The system utilizes interactive graphics for the generation, display, and editing of NASTRAN data for fuselage models. It is operated from an IBM 2250 cathode ray tube (CRT) console driven by an IBM 370/145 computer. Real time interaction plus automatic data generation reduces the nominal 6 to 10 week time for manual generation and checking of data to a few days. The interactive graphics system consists of a series of satellite programs operated from a central NASTRAN Systems Monitor. Fuselage structural models including the outer shell and internal structure may be rapidly generated. All numbering systems are automatically assigned. Hard copy plots of the model labeled with GRID or elements ID's are also available. General purpose programs for displaying and editing NASTRAN data are included in the system. Utilization of the NASTRAN interactive graphics system has made possible the multiple finite element analysis of complex helicopter fuselage structures within design schedules.

  12. Reliable prediction of three-body intermolecular interactions using dispersion-corrected second-order Møller-Plesset perturbation theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yuanhang; Beran, Gregory J. O., E-mail: gregory.beran@ucr.edu

    2015-07-28

    Three-body and higher intermolecular interactions can play an important role in molecular condensed phases. Recent benchmark calculations found problematic behavior for many widely used density functional approximations in treating 3-body intermolecular interactions. Here, we demonstrate that the combination of second-order Møller-Plesset (MP2) perturbation theory plus short-range damped Axilrod-Teller-Muto (ATM) dispersion accurately describes 3-body interactions with reasonable computational cost. The empirical damping function used in the ATM dispersion term compensates both for the absence of higher-order dispersion contributions beyond the triple-dipole ATM term and non-additive short-range exchange terms which arise in third-order perturbation theory and beyond. Empirical damping enables this simple model to out-perform a non-expanded coupled Kohn-Sham dispersion correction for 3-body intermolecular dispersion. The MP2 plus ATM dispersion model approaches the accuracy of O(N^6) methods like MP2.5 or even spin-component-scaled coupled cluster models for 3-body intermolecular interactions with only O(N^5) computational cost.

  13. Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

    2012-01-01

    We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.

  14. An Interactive Multimedia Learning Environment for VLSI Built with COSMOS

    ERIC Educational Resources Information Center

    Angelides, Marios C.; Agius, Harry W.

    2002-01-01

    This paper presents Bigger Bits, an interactive multimedia learning environment that teaches students about VLSI within the context of computer electronics. The system was built with COSMOS (Content Oriented semantic Modelling Overlay Scheme), which is a modelling scheme that we developed for enabling the semantic content of multimedia to be used…

  15. An Interactive Simulation System for Modeling Stands, Harvests, and Machines

    Treesearch

    Jingxin Wang; W. Dale Greene

    1999-01-01

    An interactive computer simulation program models stand, harvest, and machine factors and evaluates their interactions while performing felling, skidding, or forwarding activities. A stand generator allows the user to generate either natural or planted stands. Felling with chainsaws, drive-to-tree feller-bunchers, or harvesters and extraction with grapple skidders or...

  16. Low Speed Rotor/Fuselage Interactional Aerodynamics

    NASA Technical Reports Server (NTRS)

    Barnwell, Richard W.; Prichard, Devon S.

    2003-01-01

    This report presents work performed under a Cooperative Research Agreement between Virginia Tech and the NASA Langley Research Center. The work involved development of computational techniques for modeling helicopter rotor/airframe aerodynamic interaction. A brief overview of the problem is presented, the modeling techniques are described, and selected example calculations are briefly discussed.

  17. Exploring Classroom Interaction with Dynamic Social Network Analysis

    ERIC Educational Resources Information Center

    Bokhove, Christian

    2018-01-01

    This article reports on an exploratory project in which technology and dynamic social network analysis (SNA) are used for modelling classroom interaction. SNA focuses on the links between social actors, draws on graphic imagery to reveal and display the patterning of those links, and develops mathematical and computational models to describe and…

  18. A multiple-time-scale turbulence model based on variable partitioning of turbulent kinetic energy spectrum

    NASA Technical Reports Server (NTRS)

    Kim, S.-W.; Chen, C.-P.

    1987-01-01

    A multiple-time-scale turbulence model of a single point closure and a simplified split-spectrum method is presented. In the model, the effect of the ratio of the production rate to the dissipation rate on eddy viscosity is modeled by use of the multiple-time-scales and a variable partitioning of the turbulent kinetic energy spectrum. The concept of a variable partitioning of the turbulent kinetic energy spectrum and the rest of the model details are based on the previously reported algebraic stress turbulence model. Example problems considered include: a fully developed channel flow, a plane jet exhausting into a moving stream, a wall jet flow, and a weakly coupled wake-boundary layer interaction flow. The computational results compared favorably with those obtained by using the algebraic stress turbulence model as well as experimental data. The present turbulence model, as well as the algebraic stress turbulence model, yielded significantly improved computational results for the complex turbulent boundary layer flows, such as the wall jet flow and the wake boundary layer interaction flow, compared with available computational results obtained by using the standard kappa-epsilon turbulence model.

  19. A multiple-time-scale turbulence model based on variable partitioning of the turbulent kinetic energy spectrum

    NASA Technical Reports Server (NTRS)

    Kim, S.-W.; Chen, C.-P.

    1989-01-01

    A multiple-time-scale turbulence model of a single point closure and a simplified split-spectrum method is presented. In the model, the effect of the ratio of the production rate to the dissipation rate on eddy viscosity is modeled by use of the multiple-time-scales and a variable partitioning of the turbulent kinetic energy spectrum. The concept of a variable partitioning of the turbulent kinetic energy spectrum and the rest of the model details are based on the previously reported algebraic stress turbulence model. Example problems considered include: a fully developed channel flow, a plane jet exhausting into a moving stream, a wall jet flow, and a weakly coupled wake-boundary layer interaction flow. The computational results compared favorably with those obtained by using the algebraic stress turbulence model as well as experimental data. The present turbulence model, as well as the algebraic stress turbulence model, yielded significantly improved computational results for the complex turbulent boundary layer flows, such as the wall jet flow and the wake boundary layer interaction flow, compared with available computational results obtained by using the standard kappa-epsilon turbulence model.

  20. Impact of computational structure-based methods on drug discovery.

    PubMed

    Reynolds, Charles H

    2014-01-01

    Structure-based drug design has become an indispensable tool in drug discovery. The emergence of structure-based design is due to gains in structural biology that have provided exponential growth in the number of protein crystal structures, new computational algorithms and approaches for modeling protein-ligand interactions, and the tremendous growth of raw computer power in the last 30 years. Computer modeling and simulation have made major contributions to the discovery of many groundbreaking drugs in recent years. Examples are presented that highlight the evolution of computational structure-based design methodology, and the impact of that methodology on drug discovery.

  1. Validation of coupled atmosphere-fire behavior models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.

    1998-12-31

    Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation in chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  2. Taking Venus models to new dimensions.

    NASA Astrophysics Data System (ADS)

    Murawski, K.

    1997-11-01

    Space plasma physicists in Poland and Japan have gained new insights into the interaction between the solar wind and Venus. Computer simulations of this 3D global interaction between the solar wind and nonmagnetized bodies have enabled greater understanding of the large-scale processes involved in such phenomena. A model that offers improved understanding of the solar wind interaction with Venus (as well as other nonmagnetized bodies impacted by the solar wind) has been developed. In this model, the interaction of the solar wind with the ionosphere of Venus is studied by calculating numerical solutions of the 3D MHD equations for two-component, chemically reactive plasma. The author describes the innovative model.

  3. Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems

    NASA Technical Reports Server (NTRS)

    Bujorianu, Marius C.; Bujorianu, Manuela L.

    2009-01-01

    In this paper, we sketch a framework for interdisciplinary modeling of space systems, by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of art consists of collaborating multi-engineering efforts that prompt for an adequate formal foundation. To achieve this, we propose a leveraging of the traditional content of formal modeling by a co-engineering process.

  4. Adiabatic quantum computation with neutral atoms via the Rydberg blockade

    NASA Astrophysics Data System (ADS)

    Goyal, Krittika; Deutsch, Ivan

    2011-05-01

    We study a trapped-neutral-atom implementation of the adiabatic model of quantum computation whereby the Hamiltonian of a set of interacting qubits is changed adiabatically so that its ground state evolves to the desired output of the algorithm. We employ the ``Rydberg blockade interaction,'' which previously has been used to implement two-qubit entangling gates in the quantum circuit model. Here it is employed via off-resonant virtual dressing of the excited levels, so that atoms always remain in the ground state. The resulting dressed-Rydberg interaction is insensitive to the distance between the atoms within a certain blockade radius, making this process robust to temperature and vibrational fluctuations. Single qubit interactions are implemented with global microwaves and atoms are locally addressed with light shifts. With these ingredients, we study a protocol to implement the two-qubit Quadratic Unconstrained Binary Optimization (QUBO) problem. We model atom trapping, addressing, coherent evolution, and decoherence. We also explore collective control of the many-atom system and generalize the QUBO problem to multiple qubits. We acknowledge funding from the AQUARIUS project, Sandia National Laboratories.
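
    The sketch below illustrates the generic adiabatic-interpolation idea behind such a protocol for a two-qubit QUBO instance: a transverse driver is slowly replaced by a diagonal problem Hamiltonian whose ground state encodes the QUBO minimizer. The matrix construction and QUBO coefficients are illustrative and are not the dressed-Rydberg Hamiltonian of the study.

        import numpy as np

        I2 = np.eye(2)
        sx = np.array([[0.0, 1.0], [1.0, 0.0]])
        sz = np.array([[1.0, 0.0], [0.0, -1.0]])

        def two_qubit(a, b):
            return np.kron(a, b)

        # Illustrative QUBO: minimize Q00*x0 + Q11*x1 + Q01*x0*x1 over x in {0,1}^2.
        Q = np.array([[1.0, -2.0],
                      [0.0,  1.0]])

        # Encode x_i = (1 - sz_i)/2 to build a diagonal problem Hamiltonian.
        n0 = (two_qubit(I2, I2) - two_qubit(sz, I2)) / 2.0
        n1 = (two_qubit(I2, I2) - two_qubit(I2, sz)) / 2.0
        H_problem = Q[0, 0] * n0 + Q[1, 1] * n1 + Q[0, 1] * (n0 @ n1)

        # Transverse driver whose ground state is the uniform superposition.
        H_drive = -(two_qubit(sx, I2) + two_qubit(I2, sx))

        # Sweep the interpolation parameter and watch the instantaneous spectral gap.
        for s in np.linspace(0.0, 1.0, 5):
            evals = np.linalg.eigvalsh((1.0 - s) * H_drive + s * H_problem)
            print(f"s = {s:.2f}   gap = {evals[1] - evals[0]:.3f}")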

  5. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
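
    By way of illustration, the snippet below evaluates two textbook redundancy-scheme reliability formulas of the kind such a repository interrelates: triple modular redundancy with a perfect voter, and cold-standby sparing with exponential failures. These are standard results used for illustration, not CARE's actual equation set.

        import math

        def tmr_reliability(r):
            """Triple modular redundancy with a perfect voter: at least 2 of 3 modules work."""
            return 3.0 * r**2 - 2.0 * r**3

        def standby_spare_reliability(lam, t, spares):
            """One active unit plus cold spares, exponential failures, ideal switching."""
            return math.exp(-lam * t) * sum((lam * t) ** k / math.factorial(k)
                                            for k in range(spares + 1))

        print(tmr_reliability(0.95))                          # ~0.993
        print(standby_spare_reliability(1e-4, 1000.0, 1))     # ~0.995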

  6. Damsel: A Data Model Storage Library for Exascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koziol, Quincey

    The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. We will accomplish this through three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community.

  7. Integrating interactive computational modeling in biology curricula.

    PubMed

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
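
    As a toy illustration of the kind of executable model students build in such a platform, the snippet below updates a three-node Boolean network synchronously. The regulatory rules are invented for illustration and are not drawn from any Cell Collective model.

        # Hypothetical three-gene Boolean network, updated synchronously.
        def step(state):
            a, b, c = state["A"], state["B"], state["C"]
            return {
                "A": not c,           # A is repressed by C
                "B": a and not c,     # B requires A and absence of C
                "C": a or b,          # C is activated by A or B
            }

        state = {"A": True, "B": False, "C": False}
        for t in range(6):
            print(t, state)
            state = step(state)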

  8. Heat transfer, velocity-temperature correlation, and turbulent shear stress from Navier-Stokes computations of shock wave/turbulent boundary layer interaction flows

    NASA Technical Reports Server (NTRS)

    Wang, C. R.; Hingst, W. R.; Porro, A. R.

    1991-01-01

    The properties of 2-D shock wave/turbulent boundary layer interaction flows were calculated by using a compressible turbulent Navier-Stokes numerical computational code. Interaction flows caused by oblique shock wave impingement on the turbulent boundary layer flow were considered. The oblique shock waves were induced with shock generators at angles of attack less than 10 degs in supersonic flows. The surface temperatures were kept at near-adiabatic (ratio of wall static temperature to free stream total temperature) and cold wall (ratio of wall static temperature to free stream total temperature) conditions. The computational results were studied for the surface heat transfer, velocity temperature correlation, and turbulent shear stress in the interaction flow fields. Comparisons of the computational results with existing measurements indicated that (1) the surface heat transfer rates and surface pressures could be correlated with Holden's relationship, (2) the mean flow streamwise velocity components and static temperatures could be correlated with Crocco's relationship if flow separation did not occur, and (3) the Baldwin-Lomax turbulence model should be modified for turbulent shear stress computations in the interaction flows.

  9. Information Interaction: Providing a Framework for Information Architecture.

    ERIC Educational Resources Information Center

    Toms, Elaine G.

    2002-01-01

    Discussion of information architecture focuses on a model of information interaction that bridges the gap between human and computer and between information behavior and information retrieval. Illustrates how the process of information interaction is affected by the user, the system, and the content. (Contains 93 references.) (LRW)

  10. The Design of Hand Gestures for Human-Computer Interaction: Lessons from Sign Language Interpreters.

    PubMed

    Rempel, David; Camilleri, Matt J; Lee, David L

    2015-10-01

    The design and selection of 3D modeled hand gestures for human-computer interaction should follow principles of natural language combined with the need to optimize gesture contrast and recognition. The selection should also consider the discomfort and fatigue associated with distinct hand postures and motions, especially for common commands. Sign language interpreters have extensive and unique experience forming hand gestures and many suffer from hand pain while gesturing. Professional sign language interpreters (N=24) rated discomfort for hand gestures associated with 47 characters and words and 33 hand postures. Clear associations of discomfort with hand postures were identified. In a nominal logistic regression model, high discomfort was associated with gestures requiring a flexed wrist, discordant adjacent fingers, or extended fingers. These and other findings should be considered in the design of hand gestures to optimize the relationship between human cognitive and physical processes and computer gesture recognition systems for human-computer input.

  11. Variance-based interaction index measuring heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Ito, Keiichi; Couckuyt, Ivo; Poles, Silvia; Dhaene, Tom

    2016-06-01

    This work is motivated by the need to deal with models with high-dimensional input spaces of real variables. One way to tackle high-dimensional problems is to identify interaction or non-interaction among input parameters. We propose a new variance-based sensitivity interaction index that can detect and quantify interactions among the input variables of mathematical functions and computer simulations. The computation is very similar to first-order sensitivity indices by Sobol'. The proposed interaction index can quantify the relative importance of input variables in interaction. Furthermore, detection of non-interaction for screening can be done with as low as 4 n + 2 function evaluations, where n is the number of input variables. Using the interaction indices based on heteroscedasticity, the original function may be decomposed into a set of lower dimensional functions which may then be analyzed separately.
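
    For orientation, the sketch below shows the classical Monte Carlo estimator for first-order Sobol' indices that the abstract compares against; the heteroscedasticity-based interaction index itself is not reproduced here. The test function and sample size are arbitrary.

        import numpy as np

        def first_order_sobol(f, n_vars, n_samples=20000, seed=0):
            """Saltelli-style Monte Carlo estimate of first-order Sobol' indices."""
            rng = np.random.default_rng(seed)
            A = rng.random((n_samples, n_vars))
            B = rng.random((n_samples, n_vars))
            fA, fB = f(A), f(B)
            var = np.var(np.concatenate([fA, fB]))
            s = []
            for i in range(n_vars):
                ABi = A.copy()
                ABi[:, i] = B[:, i]            # resample only column i
                s.append(np.mean(fB * (f(ABi) - fA)) / var)
            return np.array(s)

        def ishigami(X):
            """Ishigami-like test function with a strong x0-x2 interaction."""
            x = 2.0 * np.pi * (X - 0.5)
            return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2 + 0.1 * x[:, 2]**4 * np.sin(x[:, 0])

        print(first_order_sobol(ishigami, n_vars=3))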

  12. NASA LeRC/Akron University Graduate Cooperative Fellowship Program and Graduate Student Researchers Program

    NASA Technical Reports Server (NTRS)

    Fertis, D. G.; Simon, A. L.

    1981-01-01

    The requisite methodology to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine is developed. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the man and computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with interactive graphical display for pre- and postprocessing capability.

  13. The visible ear simulator: a public PC application for GPU-accelerated haptic 3D simulation of ear surgery based on the visible ear data.

    PubMed

    Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter

    2009-06-01

    Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data in which image quality is limited by the lack of detail (maximum, approximately 50 voxels/mm3), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and volume rendered in real time as a 3D model with high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm3) and true color, in 2D or optional anaglyph stereoscopic 3D, was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm3) and realistic interaction with deformable soft tissue components such as skin, tympanic membrane, dura, and cholesteatomas; some of these features are not possible with computed tomography- or magnetic resonance imaging-based systems.

  14. Computer simulation and experimental study of the polysaccharide-polysaccharide interaction in the bacteria Azospirillum brasilense Sp245

    NASA Astrophysics Data System (ADS)

    Arefeva, Oksana A.; Kuznetsov, Pavel E.; Tolmachev, Sergey A.; Kupadze, Machammad S.; Khlebtsov, Boris N.; Rogacheva, Svetlana M.

    2003-09-01

    We have studied the conformational properties and molecular dynamics of polysaccharides by using molecular modeling methods. Theoretical and experimental results of polysaccharide-polysaccharide interactions are described.

  15. An Interactive Computer Model for Improved Student Understanding of Random Particle Motion and Osmosis

    ERIC Educational Resources Information Center

    Kottonau, Johannes

    2011-01-01

    Effectively teaching the concepts of osmosis to college-level students is a major obstacle in biological education. Therefore, a novel computer model is presented that allows students to observe the random nature of particle motion simultaneously with the seemingly directed net flow of water across a semipermeable membrane during osmotic…

  16. Virtual Reality Anatomy: Is It Comparable with Traditional Methods in the Teaching of Human Forearm Musculoskeletal Anatomy?

    ERIC Educational Resources Information Center

    Codd, Anthony M.; Choudhury, Bipasha

    2011-01-01

    The use of cadavers to teach anatomy is well established, but limitations with this approach have led to the introduction of alternative teaching methods. One such method is the use of three-dimensional virtual reality computer models. An interactive, three-dimensional computer model of human forearm anterior compartment musculoskeletal anatomy…

  17. Modeling of fatigue crack induced nonlinear ultrasonics using a highly parallelized explicit local interaction simulation approach

    NASA Astrophysics Data System (ADS)

    Shen, Yanfeng; Cesnik, Carlos E. S.

    2016-04-01

    This paper presents a parallelized modeling technique for the efficient simulation of nonlinear ultrasonics introduced by the wave interaction with fatigue cracks. The elastodynamic wave equations with contact effects are formulated using an explicit Local Interaction Simulation Approach (LISA). The LISA formulation is extended to capture the contact-impact phenomena during the wave-damage interaction based on the penalty method. A Coulomb friction model is integrated into the computation procedure to capture the stick-slip contact shear motion. The LISA procedure is coded using the Compute Unified Device Architecture (CUDA), which enables highly parallelized supercomputing on powerful graphics cards. Both the explicit contact formulation and the parallel feature facilitate LISA's superb computational efficiency over the conventional finite element method (FEM). The theoretical formulation based on the penalty method is introduced and a guideline for the proper choice of the contact stiffness is given. The convergence behavior of the solution under various contact stiffness values is examined. A numerical benchmark problem is used to investigate the new LISA formulation and results are compared with a conventional contact finite element solution. Various nonlinear ultrasonic phenomena are successfully captured using this contact LISA formulation, including the generation of nonlinear higher harmonic responses. Nonlinear mode conversion of guided waves at fatigue cracks is also studied.
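
    A one-dimensional toy version of the penalty contact and Coulomb friction ideas is sketched below; the stiffness and friction values are placeholders and the sketch is not the LISA/CUDA implementation.

        def normal_contact_force(gap, k_penalty=1.0e12):
            """Penalty method: push back only when the crack faces interpenetrate (gap < 0)."""
            return -k_penalty * gap if gap < 0.0 else 0.0

        def coulomb_friction(tangential_trial, normal_force, mu=0.3):
            """Stick if the trial traction lies inside the friction cone, otherwise slip."""
            limit = mu * abs(normal_force)
            if abs(tangential_trial) <= limit:
                return tangential_trial                              # stick
            return limit if tangential_trial > 0 else -limit         # slip

        fn = normal_contact_force(gap=-1.0e-9)
        print(fn, coulomb_friction(tangential_trial=0.5 * fn, normal_force=fn))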

  18. Figure-ground organization and object recognition processes: an interactive account.

    PubMed

    Vecera, S P; O'Reilly, R C

    1998-04-01

    Traditional bottom-up models of visual processing assume that figure-ground organization precedes object recognition. This assumption seems logically necessary: How can object recognition occur before a region is labeled as figure? However, some behavioral studies find that familiar regions are more likely to be labeled figure than less familiar regions, a problematic finding for bottom-up models. An interactive account is proposed in which figure-ground processes receive top-down input from object representations in a hierarchical system. A graded, interactive computational model is presented that accounts for behavioral results in which familiarity effects are found. The interactive model offers an alternative conception of visual processing to bottom-up models.

  19. Multiphysics modeling of non-linear laser-matter interactions for optically active semiconductors

    NASA Astrophysics Data System (ADS)

    Kraczek, Brent; Kanp, Jaroslaw

    Development of photonic devices for sensors and communications devices has been significantly enhanced by computational modeling. We present a new computational method for modelling laser propagation in optically-active semiconductors within the paraxial wave approximation (PWA). Light propagation is modeled using the Streamline-upwind/Petrov-Galerkin finite element method (FEM). Material response enters through the non-linear polarization, which serves as the right-hand side of the FEM calculation. Maxwell's equations for classical light propagation within the PWA can be written solely in terms of the electric field, producing a wave equation that is a form of the advection-diffusion-reaction equations (ADREs). This allows adaptation of the computational machinery developed for solving ADREs in fluid dynamics to light-propagation modeling. The non-linear polarization is incorporated using a flexible framework to enable the use of multiple methods for carrier-carrier interactions (e.g. relaxation-time-based or Monte Carlo) to enter through the non-linear polarization, as appropriate to the material type. We demonstrate using a simple carrier-carrier model approximating the response of GaN. Supported by ARL Materials Enterprise.

  20. Local rules simulation of the kinetics of virus capsid self-assembly.

    PubMed

    Schwartz, R; Shor, P W; Prevelige, P E; Berger, B

    1998-12-01

    A computer model is described for studying the kinetics of the self-assembly of icosahedral viral capsids. Solution of this problem is crucial to an understanding of the viral life cycle, which currently cannot be adequately addressed through laboratory techniques. The abstract simulation model employed to address this is based on the local rules theory (Proc. Natl. Acad. Sci. USA 91:7732-7736). It is shown that the principle of local rules, generalized with a model of kinetics and other extensions, can be used to simulate complicated problems in self-assembly. This approach allows for a computationally tractable molecular dynamics-like simulation of coat protein interactions while retaining many relevant features of capsid self-assembly. Three simple simulation experiments are presented to illustrate the use of this model. These show the dependence of growth and malformation rates on the energetics of binding interactions, the tolerance of errors in binding positions, and the concentration of subunits in the examples. These experiments demonstrate a tradeoff within the model between growth rate and fidelity of assembly for the three parameters. A detailed discussion of the computational model is also provided.

  1. Model Reduction of Computational Aerothermodynamics for Multi-Discipline Analysis in High Speed Flows

    NASA Astrophysics Data System (ADS)

    Crowell, Andrew Rippetoe

    This dissertation describes model reduction techniques for the computation of aerodynamic heat flux and pressure loads for multi-disciplinary analysis of hypersonic vehicles. NASA and the Department of Defense have expressed renewed interest in the development of responsive, reusable hypersonic cruise vehicles capable of sustained high-speed flight and access to space. However, an extensive set of technical challenges have obstructed the development of such vehicles. These technical challenges are partially due to both the inability to accurately test scaled vehicles in wind tunnels and to the time intensive nature of high-fidelity computational modeling, particularly for the fluid using Computational Fluid Dynamics (CFD). The aim of this dissertation is to develop efficient and accurate models for the aerodynamic heat flux and pressure loads to replace the need for computationally expensive, high-fidelity CFD during coupled analysis. Furthermore, aerodynamic heating and pressure loads are systematically evaluated for a number of different operating conditions, including: simple two-dimensional flow over flat surfaces up to three-dimensional flows over deformed surfaces with shock-shock interaction and shock-boundary layer interaction. An additional focus of this dissertation is on the implementation and computation of results using the developed aerodynamic heating and pressure models in complex fluid-thermal-structural simulations. Model reduction is achieved using a two-pronged approach. One prong focuses on developing analytical corrections to isothermal, steady-state CFD flow solutions in order to capture flow effects associated with transient spatially-varying surface temperatures and surface pressures (e.g., surface deformation, surface vibration, shock impingements, etc.). The second prong is focused on minimizing the computational expense of computing the steady-state CFD solutions by developing an efficient surrogate CFD model. The developed two-pronged approach is found to exhibit balanced performance in terms of accuracy and computational expense, relative to several existing approaches. This approach enables CFD-based loads to be implemented into long duration fluid-thermal-structural simulations.

  2. Theoretical, Experimental, and Computational Evaluation of Disk-Loaded Circular Wave Guides

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Qureshi, A. Haq

    1994-01-01

    A disk-loaded circular wave guide structure and test fixture were fabricated. The dispersion characteristics were found by theoretical analysis, experimental testing, and computer simulation using the codes ARGUS and SOS. Interaction impedances were computed based on the corresponding dispersion characteristics. Finally, an equivalent circuit model for one period of the structure was chosen using equivalent circuit models for cylindrical wave guides of different radii. Optimum values for the discrete capacitors and inductors describing discontinuities between cylindrical wave guides were found using the computer code TOUCHSTONE.

  3. Towards accurate modeling of noncovalent interactions for protein rigidity analysis.

    PubMed

    Fox, Naomi; Streinu, Ileana

    2013-01-01

    Protein rigidity analysis is an efficient computational method for extracting flexibility information from static, X-ray crystallography protein data. Atoms and bonds are modeled as a mechanical structure and analyzed with a fast graph-based algorithm, producing a decomposition of the flexible molecule into interconnected rigid clusters. The result depends critically on noncovalent atomic interactions, primarily on how hydrogen bonds and hydrophobic interactions are computed and modeled. Ongoing research points to the stringent need for benchmarking rigidity analysis software systems, towards the goal of increasing their accuracy and validating their results, either against each other and against biologically relevant (functional) parameters. We propose two new methods for modeling hydrogen bonds and hydrophobic interactions that more accurately reflect a mechanical model, without being computationally more intensive. We evaluate them using a novel scoring method, based on the B-cubed score from the information retrieval literature, which measures how well two cluster decompositions match. To evaluate the modeling accuracy of KINARI, our pebble-game rigidity analysis system, we use a benchmark data set of 20 proteins, each with multiple distinct conformations deposited in the Protein Data Bank. Cluster decompositions for them were previously determined with the RigidFinder method from Gerstein's lab and validated against experimental data. When KINARI's default tuning parameters are used, an improvement of the B-cubed score over a crude baseline is observed in 30% of this data. With our new modeling options, improvements were observed in over 70% of the proteins in this data set. We investigate the sensitivity of the cluster decomposition score with case studies on pyruvate phosphate dikinase and calmodulin. To substantially improve the accuracy of protein rigidity analysis systems, thorough benchmarking must be performed on all current systems and future extensions. We have measured the gain in performance by comparing different modeling methods for noncovalent interactions. We showed that new criteria for modeling hydrogen bonds and hydrophobic interactions can significantly improve the results. The two new methods proposed here have been implemented and made publicly available in the current version of KINARI (v1.3), together with the benchmarking tools, which can be downloaded from our software's website, http://kinari.cs.umass.edu.
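
    For reference, a simplified per-item version of the B-cubed comparison between two cluster decompositions is sketched below; the actual scoring used in KINARI's benchmarking may weight items differently.

        from collections import defaultdict

        def bcubed(gold, pred):
            """B-cubed precision, recall and F1 for two clusterings given as item -> label dicts."""
            gold_clusters, pred_clusters = defaultdict(set), defaultdict(set)
            for item, label in gold.items():
                gold_clusters[label].add(item)
            for item, label in pred.items():
                pred_clusters[label].add(item)
            precisions, recalls = [], []
            for item in gold:
                g = gold_clusters[gold[item]]
                p = pred_clusters[pred[item]]
                overlap = len(g & p)
                precisions.append(overlap / len(p))
                recalls.append(overlap / len(g))
            P = sum(precisions) / len(precisions)
            R = sum(recalls) / len(recalls)
            return P, R, 2.0 * P * R / (P + R)

        gold = {"a": 0, "b": 0, "c": 1, "d": 1}
        pred = {"a": 0, "b": 1, "c": 1, "d": 1}
        print(bcubed(gold, pred))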

  4. Towards accurate modeling of noncovalent interactions for protein rigidity analysis

    PubMed Central

    2013-01-01

    Background Protein rigidity analysis is an efficient computational method for extracting flexibility information from static, X-ray crystallography protein data. Atoms and bonds are modeled as a mechanical structure and analyzed with a fast graph-based algorithm, producing a decomposition of the flexible molecule into interconnected rigid clusters. The result depends critically on noncovalent atomic interactions, primarily on how hydrogen bonds and hydrophobic interactions are computed and modeled. Ongoing research points to the stringent need for benchmarking rigidity analysis software systems, towards the goal of increasing their accuracy and validating their results, either against each other and against biologically relevant (functional) parameters. We propose two new methods for modeling hydrogen bonds and hydrophobic interactions that more accurately reflect a mechanical model, without being computationally more intensive. We evaluate them using a novel scoring method, based on the B-cubed score from the information retrieval literature, which measures how well two cluster decompositions match. Results To evaluate the modeling accuracy of KINARI, our pebble-game rigidity analysis system, we use a benchmark data set of 20 proteins, each with multiple distinct conformations deposited in the Protein Data Bank. Cluster decompositions for them were previously determined with the RigidFinder method from Gerstein's lab and validated against experimental data. When KINARI's default tuning parameters are used, an improvement of the B-cubed score over a crude baseline is observed in 30% of this data. With our new modeling options, improvements were observed in over 70% of the proteins in this data set. We investigate the sensitivity of the cluster decomposition score with case studies on pyruvate phosphate dikinase and calmodulin. Conclusion To substantially improve the accuracy of protein rigidity analysis systems, thorough benchmarking must be performed on all current systems and future extensions. We have measured the gain in performance by comparing different modeling methods for noncovalent interactions. We showed that new criteria for modeling hydrogen bonds and hydrophobic interactions can significantly improve the results. The two new methods proposed here have been implemented and made publicly available in the current version of KINARI (v1.3), together with the benchmarking tools, which can be downloaded from our software's website, http://kinari.cs.umass.edu. PMID:24564209

  5. Aortic dissection simulation models for clinical support: fluid-structure interaction vs. rigid wall models.

    PubMed

    Alimohammadi, Mona; Sherwood, Joseph M; Karimpour, Morad; Agu, Obiekezie; Balabani, Stavroula; Díaz-Zuccarini, Vanessa

    2015-04-15

    The management and prognosis of aortic dissection (AD) is often challenging and the use of personalised computational models is being explored as a tool to improve clinical outcome. Including vessel wall motion in such simulations can provide more realistic and potentially accurate results, but requires significant additional computational resources, as well as expertise. With clinical translation as the final aim, trade-offs between complexity, speed and accuracy are inevitable. The present study explores whether modelling wall motion is worth the additional expense in the case of AD, by carrying out fluid-structure interaction (FSI) simulations based on a sample patient case. Patient-specific anatomical details were extracted from computed tomography images to provide the fluid domain, from which the vessel wall was extrapolated. Two-way fluid-structure interaction simulations were performed, with coupled Windkessel boundary conditions and hyperelastic wall properties. The blood was modelled using the Carreau-Yasuda viscosity model and turbulence was accounted for via a shear stress transport model. A simulation without wall motion (rigid wall) was carried out for comparison purposes. The displacement of the vessel wall was comparable to reports from imaging studies in terms of intimal flap motion and contraction of the true lumen. Analysis of the haemodynamics around the proximal and distal false lumen in the FSI model showed complex flow structures caused by the expansion and contraction of the vessel wall. These flow patterns led to significantly different predictions of wall shear stress, particularly its oscillatory component, which were not captured by the rigid wall model. Through comparison with imaging data, the results of the present study indicate that the fluid-structure interaction methodology employed herein is appropriate for simulations of aortic dissection. Regions of high wall shear stress were not significantly altered by the wall motion, however, certain collocated regions of low and oscillatory wall shear stress which may be critical for disease progression were only identified in the FSI simulation. We conclude that, if patient-tailored simulations of aortic dissection are to be used as an interventional planning tool, then the additional complexity, expertise and computational expense required to model wall motion is indeed justified.
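
    The Carreau-Yasuda viscosity model mentioned above has the closed form mu(gamma) = mu_inf + (mu_0 - mu_inf) * [1 + (lambda*gamma)^a]^((n-1)/a). The sketch below evaluates it with representative literature values for blood; these parameters are assumptions and not necessarily those used in the cited study.

        import numpy as np

        def carreau_yasuda(gamma_dot, mu0=0.056, mu_inf=0.00345, lam=3.313, n=0.3568, a=2.0):
            """Apparent viscosity in Pa*s as a function of shear rate in 1/s."""
            return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * gamma_dot) ** a) ** ((n - 1.0) / a)

        for gd in [0.1, 1.0, 10.0, 100.0, 1000.0]:
            print(f"shear rate {gd:7.1f} 1/s -> viscosity {carreau_yasuda(gd):.5f} Pa*s")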

  6. Efficient Geometric Sound Propagation Using Visibility Culling

    NASA Astrophysics Data System (ADS)

    Chandak, Anish

    2011-07-01

    Simulating propagation of sound can improve the sense of realism in interactive applications such as video games and can lead to better designs in engineering applications such as architectural acoustics. In this thesis, we present geometric sound propagation techniques which are faster than prior methods and map well to upcoming parallel multi-core CPUs. We model specular reflections by using the image-source method and model finite-edge diffraction by using the well-known Biot-Tolstoy-Medwin (BTM) model. We accelerate the computation of specular reflections by applying novel visibility algorithms, FastV and AD-Frustum, which compute visibility from a point. We accelerate finite-edge diffraction modeling by applying a novel visibility algorithm which computes visibility from a region. Our visibility algorithms are based on frustum tracing and exploit recent advances in fast ray-hierarchy intersections, data-parallel computations, and scalable, multi-core algorithms. The AD-Frustum algorithm adapts its computation to the scene complexity and allows small errors in computing specular reflection paths for higher computational efficiency. FastV and our visibility algorithm from a region are general, object-space, conservative visibility algorithms that together significantly reduce the number of image sources compared to other techniques while preserving the same accuracy. Our geometric propagation algorithms are an order of magnitude faster than prior approaches for modeling specular reflections and two to ten times faster for modeling finite-edge diffraction. Our algorithms are interactive, scale almost linearly on multi-core CPUs, and can handle large, complex, and dynamic scenes. We also compare the accuracy of our sound propagation algorithms with other methods. Once sound propagation is performed, it is desirable to listen to the propagated sound in interactive and engineering applications. We can generate smooth, artifact-free output audio signals by applying efficient audio-processing algorithms. We also present the first efficient audio-processing algorithm for scenarios with simultaneously moving source and moving receiver (MS-MR) which incurs less than 25% overhead compared to static source and moving receiver (SS-MR) or moving source and static receiver (MS-SR) scenario.
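
    The core of the image-source method for a single specular reflection is small enough to sketch directly: mirror the source across the reflecting plane and measure the distance from the mirrored source to the receiver. The geometry below is an arbitrary example (an infinite floor), not one of the thesis scenes, and the visibility test against finite wall extents is omitted.

        import numpy as np

        def image_source(source, plane_point, plane_normal):
            """Mirror a point source across a reflecting plane."""
            n = np.asarray(plane_normal, float)
            n = n / np.linalg.norm(n)
            d = np.dot(np.asarray(source, float) - np.asarray(plane_point, float), n)
            return np.asarray(source, float) - 2.0 * d * n

        def specular_path_length(source, receiver, plane_point, plane_normal):
            """Length of the first-order specular path via the plane."""
            img = image_source(source, plane_point, plane_normal)
            return float(np.linalg.norm(np.asarray(receiver, float) - img))

        # Source and receiver 1 m above the floor z = 0, 4 m apart: direct path 4 m,
        # floor reflection sqrt(4^2 + 2^2) ~ 4.47 m.
        print(specular_path_length([0, 0, 1], [4, 0, 1], plane_point=[0, 0, 0], plane_normal=[0, 0, 1]))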

  7. Routine Discovery of Complex Genetic Models using Genetic Algorithms

    PubMed Central

    Moore, Jason H.; Hahn, Lance W.; Ritchie, Marylyn D.; Thornton, Tricia A.; White, Bill C.

    2010-01-01

    Simulation studies are useful in various disciplines for a number of reasons including the development and evaluation of new computational and statistical methods. This is particularly true in human genetics and genetic epidemiology where new analytical methods are needed for the detection and characterization of disease susceptibility genes whose effects are complex, nonlinear, and partially or solely dependent on the effects of other genes (i.e. epistasis or gene-gene interaction). Despite this need, the development of complex genetic models that can be used to simulate data is not always intuitive. In fact, only a few such models have been published. We have previously developed a genetic algorithm approach to discovering complex genetic models in which two single nucleotide polymorphisms (SNPs) influence disease risk solely through nonlinear interactions. In this paper, we extend this approach for the discovery of high-order epistasis models involving three to five SNPs. We demonstrate that the genetic algorithm is capable of routinely discovering interesting high-order epistasis models in which each SNP influences risk of disease only through interactions with the other SNPs in the model. This study opens the door for routine simulation of complex gene-gene interactions among SNPs for the development and evaluation of new statistical and computational approaches for identifying common, complex multifactorial disease susceptibility genes. PMID:20948983
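
    A minimal example of the kind of purely epistatic two-SNP model such simulations rely on is sketched below: an XOR-like penetrance table in which neither SNP has a marginal effect, used to generate case-control data. The table values and allele frequency are illustrative, not one of the paper's evolved models.

        import numpy as np

        # Rows: genotype of SNP1 (0, 1, 2 minor alleles); columns: genotype of SNP2.
        penetrance = np.array([
            [0.00, 0.10, 0.00],
            [0.10, 0.00, 0.10],
            [0.00, 0.10, 0.00],
        ])

        def simulate_case_control(n, maf=0.5, seed=1):
            rng = np.random.default_rng(seed)
            g1 = rng.binomial(2, maf, size=n)      # genotypes under Hardy-Weinberg equilibrium
            g2 = rng.binomial(2, maf, size=n)
            affected = rng.random(n) < penetrance[g1, g2]
            return g1, g2, affected.astype(int)

        g1, g2, y = simulate_case_control(10000)
        print("prevalence:", y.mean())             # ~0.05, with no marginal SNP effects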

  8. Improvements in the Scalability of the NASA Goddard Multiscale Modeling Framework for Hurricane Climate Studies

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Tao, Wei-Kuo; Chern, Jiun-Dar

    2007-01-01

    Improving our understanding of hurricane inter-annual variability and the impact of climate change (e.g., doubling CO2 and/or global warming) on hurricanes brings both scientific and computational challenges to researchers. As hurricane dynamics involves multiscale interactions among synoptic-scale flows, mesoscale vortices, and small-scale cloud motions, an ideal numerical model suitable for hurricane studies should demonstrate its capabilities in simulating these interactions. The newly-developed multiscale modeling framework (MMF, Tao et al., 2007) and the substantial computing power by the NASA Columbia supercomputer show promise in pursuing the related studies, as the MMF inherits the advantages of two NASA state-of-the-art modeling components: the GEOS4/fvGCM and 2D GCEs. This article focuses on the computational issues and proposes a revised methodology to improve the MMF's performance and scalability. It is shown that this prototype implementation enables 12-fold performance improvements with 364 CPUs, thereby making it more feasible to study hurricane climate.

  9. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

    NASA Astrophysics Data System (ADS)

    Oprisan, Sorinel Adrian; Oprisan, Ana

    2005-03-01

    Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well- and smoothly defined benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrosis. The second considered sub-system consists of cytotoxic active (effector) cells (EC), with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutical, and radiotherapeutical treatments.

  10. Quantum simulation of transverse Ising models with Rydberg atoms

    NASA Astrophysics Data System (ADS)

    Schauss, Peter

    2018-04-01

    Quantum Ising models are canonical models for the study of quantum phase transitions (Sachdev 1999, Quantum Phase Transitions (Cambridge: Cambridge University Press)) and are the underlying concept for many analogue quantum computing and quantum annealing ideas (Tanaka et al., Quantum Spin Glasses, Annealing and Computation (Cambridge: Cambridge University Press)). Here we focus on the implementation of finite-range interacting Ising spin models, which are barely tractable numerically. Recent experiments with cold atoms have reached the interaction-dominated regime in quantum Ising magnets via optical coupling of trapped neutral atoms to Rydberg states. This approach allows for the tunability of all relevant terms in an Ising spin Hamiltonian with 1/r^6 interactions in transverse and longitudinal fields. This review summarizes the recent progress of these implementations in Rydberg lattices with site-resolved detection. Strong correlations in quantum Ising models have been observed in several experiments, starting from a single excitation in the superatom regime up to the point of crystallization. The rapid progress in this field makes spin systems based on Rydberg atoms a promising platform for quantum simulation because of the unmatched flexibility and strength of interactions combined with high control and good isolation from the environment.
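
    For a handful of atoms the corresponding spin Hamiltonian can be built and diagonalized directly, as in the sketch below: spin-1/2 operators with transverse (Omega) and longitudinal (delta) fields plus C6/r^6 Ising couplings. The operator convention and parameter values are illustrative assumptions rather than a specific experiment's Hamiltonian.

        import numpy as np
        from functools import reduce

        I2 = np.eye(2)
        sx = 0.5 * np.array([[0.0, 1.0], [1.0, 0.0]])
        sz = 0.5 * np.array([[1.0, 0.0], [0.0, -1.0]])

        def op_on(site_op, site, n):
            """Embed a single-site operator at position `site` in an n-spin register."""
            ops = [I2] * n
            ops[site] = site_op
            return reduce(np.kron, ops)

        def ising_hamiltonian(positions, c6=10.0, omega=1.0, delta=0.5):
            """Transverse/longitudinal-field Ising model with 1/r^6 couplings."""
            n = len(positions)
            H = np.zeros((2**n, 2**n))
            for i in range(n):
                H += omega * op_on(sx, i, n) - delta * op_on(sz, i, n)
                for j in range(i + 1, n):
                    r = abs(positions[i] - positions[j])
                    H += (c6 / r**6) * (op_on(sz, i, n) @ op_on(sz, j, n))
            return H

        H = ising_hamiltonian([0.0, 1.0, 2.0])
        print(np.linalg.eigvalsh(H)[:3])   # lowest few energy levels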

  11. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  12. Some foundational aspects of quantum computers and quantum robots.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benioff, P.; Physics

    1998-01-01

    This paper addresses foundational issues related to quantum computing. The need for a universally valid theory such as quantum mechanics to describe to some extent its own validation is noted. This includes quantum mechanical descriptions of systems that do theoretical calculations (i.e. quantum computers) and systems that perform experiments. Quantum robots interacting with an environment are a small first step in this direction. Quantum robots are described here as mobile quantum systems with on-board quantum computers that interact with environments. Included are discussions on the carrying out of tasks and the division of tasks into computation and action phases. Specific models based on quantum Turing machines are described. Differences and similarities between quantum robots plus environments and quantum computers are discussed.

  13. A collision scheme for hybrid fluid-particle simulation of plasmas

    NASA Astrophysics Data System (ADS)

    Nguyen, Christine; Lim, Chul-Hyun; Verboncoeur, John

    2006-10-01

    Desorption phenomena at the wall of a tokamak can lead to the introduction of impurities at the edge of a thermonuclear plasma. In particular, the use of carbon as a constituent of the tokamak wall, as planned for ITER, requires the study of carbon and hydrocarbon transport in the plasma, including understanding of collisional interaction with the plasma. These collisions can result in new hydrocarbons, hydrogen, secondary electrons and so on. Computational modeling is a primary tool for studying these phenomena. XOOPIC [1] and OOPD1 are widely used computer modeling tools for the simulation of plasmas. Both are particle-type codes. Particle simulation gives more kinetic information than fluid simulation, but more computation time is required. In order to reduce this disadvantage, hybrid simulation has been developed, and applied to the modeling of collisions. Present particle simulation tools such as XOOPIC and OOPD1 employ a Monte Carlo model for the collisions between particle species and a neutral background gas defined by its temperature and pressure. In fluid-particle hybrid plasma models, collisions include combinations of particle and fluid interactions categorized by projectile-target pairing: particle-particle, particle-fluid, and fluid-fluid. For verification of this hybrid collision scheme, we compare simulation results to analytic solutions for classical plasma models. [1] Verboncoeur et al. Comput. Phys. Comm. 87, 199 (1995).
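
    The per-step collision decision in such a Monte Carlo model reduces to comparing a uniform random number against 1 - exp(-n*sigma*v*dt). The sketch below checks that probability empirically for made-up gas density, cross section, speed, and time-step values.

        import numpy as np

        def collides(speed, n_gas, sigma, dt, rng):
            """True if the particle collides with the neutral background during one step."""
            p = 1.0 - np.exp(-n_gas * sigma * speed * dt)
            return rng.random() < p

        rng = np.random.default_rng(0)
        trials = 100000
        hits = sum(collides(1.0e5, n_gas=1.0e20, sigma=1.0e-19, dt=1.0e-9, rng=rng)
                   for _ in range(trials))
        # Analytic value: 1 - exp(-1e-3) ~ 0.001
        print("empirical collision probability:", hits / trials)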

  14. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments.

    PubMed

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    The new tendency of artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. Under this perspective, we present early results with the application of reservoir computing as an efficient tool to understand how behaviour emerges from interaction. Specifically, we present reservoir computing models, that are inspired by imitation learning designs, to extract the essential components of behaviour that results from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.
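
    A minimal echo state network, the standard reservoir computing building block, is sketched below on a toy delayed-signal task. Reservoir size, spectral radius and the ridge penalty are arbitrary choices, not the settings of the robot experiments.

        import numpy as np

        class EchoStateNetwork:
            def __init__(self, n_in, n_res=200, spectral_radius=0.9, seed=0):
                rng = np.random.default_rng(seed)
                self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
                W = rng.uniform(-0.5, 0.5, (n_res, n_res))
                # Rescale so the largest eigenvalue magnitude matches the desired spectral radius.
                self.W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))
                self.W_out = None

            def _states(self, U):
                x = np.zeros(self.W.shape[0])
                states = []
                for u in U:
                    x = np.tanh(self.W_in @ u + self.W @ x)   # reservoir update
                    states.append(x.copy())
                return np.array(states)

            def fit(self, U, Y, ridge=1e-6):
                X = self._states(U)
                # Ridge-regression readout.
                self.W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

            def predict(self, U):
                return self._states(U) @ self.W_out

        t = np.linspace(0.0, 20.0 * np.pi, 2000)
        U = np.sin(t)[:, None]
        Y = np.roll(U, 10, axis=0)                  # target: a delayed copy of the input
        esn = EchoStateNetwork(n_in=1)
        esn.fit(U[:1500], Y[:1500])
        print("test MSE:", np.mean((esn.predict(U[1500:]) - Y[1500:]) ** 2))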

  15. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments

    NASA Astrophysics Data System (ADS)

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    The new tendency of artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. Under this perspective, we present early results with the application of reservoir computing as an efficient tool to understand how behaviour emerges from interaction. Specifically, we present reservoir computing models, that are inspired by imitation learning designs, to extract the essential components of behaviour that results from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.

  16. A Systematic Prediction of Drug-Target Interactions Using Molecular Fingerprints and Protein Sequences.

    PubMed

    Huang, Yu-An; You, Zhu-Hong; Chen, Xing

    2018-01-01

    Drug-Target Interactions (DTI) play a crucial role in discovering new drug candidates and finding new proteins to target for drug development. Although the number of detected DTI obtained by high-throughput techniques has been increasing, the number of known DTI is still limited. On the other hand, the experimental methods for detecting the interactions among drugs and proteins are costly and inefficient. Therefore, computational approaches for predicting DTI have drawn increasing attention in recent years. In this paper, we report a novel computational model for predicting DTI using an extremely randomized trees model and protein amino acid information. More specifically, the protein sequence is represented as a Pseudo Substitution Matrix Representation (Pseudo-SMR) descriptor in which the influence of biological evolutionary information is retained. For the representation of drug molecules, a novel fingerprint feature vector is utilized to describe its substructure information. Then the DTI pair is characterized by concatenating the two vector spaces of protein sequence and drug substructure. Finally, the proposed method is explored for predicting the DTI on four benchmark datasets: Enzyme, Ion Channel, GPCRs and Nuclear Receptor. The experimental results demonstrate that this method achieves promising prediction accuracies of 89.85%, 87.87%, 82.99% and 81.67%, respectively. For further evaluation, we compared the performance of the extremely randomized trees model with that of the state-of-the-art Support Vector Machine classifier. We also compared the proposed model with existing computational models, and confirmed 15 potential drug-target interactions by searching existing databases. The experimental results show that the proposed method is feasible and promising for predicting drug-target interactions for new drug candidate screening based on sizeable features.
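
    The overall classification setup can be sketched with scikit-learn: concatenate a protein feature vector and a drug fingerprint per pair and train an extremely randomized trees classifier. The data below are random stand-ins (so the score is near chance); real Pseudo-SMR descriptors and substructure fingerprints would replace them.

        import numpy as np
        from sklearn.ensemble import ExtraTreesClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_pairs, n_protein_feats, n_drug_feats = 500, 400, 256
        protein_feats = rng.random((n_pairs, n_protein_feats))                   # stand-in for Pseudo-SMR
        drug_feats = rng.integers(0, 2, (n_pairs, n_drug_feats)).astype(float)   # stand-in fingerprint

        X = np.hstack([protein_feats, drug_feats])   # characterize each drug-target pair
        y = rng.integers(0, 2, n_pairs)              # 1 = interacting, 0 = non-interacting (random here)

        clf = ExtraTreesClassifier(n_estimators=300, random_state=0)
        print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())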

  17. Fluid-Structure Interaction Modeling of the Reefed Stages of the Orion Spacecraft Main Parachutes

    NASA Astrophysics Data System (ADS)

    Boswell, Cody W.

    Spacecraft parachutes are typically used in multiple stages, starting with a "reefed" stage where a cable along the parachute skirt constrains the diameter to be less than the diameter in the subsequent stage. After a certain period of time during the descent, the cable is cut and the parachute "disreefs" (i.e., expands) to the next stage. Computing the parachute shape at the reefed stage and fluid-structure interaction (FSI) modeling during the disreefing involve computational challenges beyond those encountered in FSI modeling of fully open spacecraft parachutes. These additional challenges are created by the increased geometric complexities and by the rapid changes in the parachute geometry. The computational challenges are further increased by the added geometric porosity of the latest design, where the "windows" created by the removal of panels and the wider gaps created by the removal of sails compound the geometric and flow complexity. Orion spacecraft main parachutes will have three stages, with computation of the Stage 1 shape and FSI modeling of disreefing from Stage 1 to Stage 2 being the most challenging. We present the special modeling techniques we devised to address the computational challenges and the results from the computations carried out. We also present the methods we devised to calculate, for a parachute gore, the radius of curvature in the circumferential direction. The curvature values are intended for quick and simple engineering analysis in estimating the structural stresses.

  18. The Bilingual Language Interaction Network for Comprehension of Speech

    ERIC Educational Resources Information Center

    Shook, Anthony; Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can…

  19. Interaction Network Estimation: Predicting Problem-Solving Diversity in Interactive Environments

    ERIC Educational Resources Information Center

    Eagle, Michael; Hicks, Drew; Barnes, Tiffany

    2015-01-01

    Intelligent tutoring systems and computer aided learning environments aimed at developing problem solving produce large amounts of transactional data which make it a challenge for both researchers and educators to understand how students work within the environment. Researchers have modeled student-tutor interactions using complex networks in…

  20. Imagining Garage Start-Ups: Interactive Effects of Imaginative Capacities on Entrepreneurial Intention

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Yao, Shu-Nung; Chen, Shi-An; King, Jung-Tai; Liang, Chaoyun

    2016-01-01

    This article describes a structural examination of the interaction among different imaginative capacities and the entrepreneurial intention of electrical and computer engineering students. Two studies were combined to confirm the factor structure of survey items and test the hypothesised interaction model. The results indicated that imaginative…

  1. Distance Learning and Cloud Computing: "Just Another Buzzword or a Major E-Learning Breakthrough?"

    ERIC Educational Resources Information Center

    Romiszowski, Alexander J.

    2012-01-01

    "Cloud computing is a model for the enabling of ubiquitous, convenient, and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and other services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." This…

  2. Dynamic electronic institutions in agent oriented cloud robotic systems.

    PubMed

    Nagrath, Vineet; Morel, Olivier; Malik, Aamir; Saad, Naufal; Meriaudeau, Fabrice

    2015-01-01

    The dot-com bubble burst in the year 2000 and was followed by a swift movement towards resource virtualization and the cloud computing business model. Cloud computing emerged not as a new form of computing or network technology but as a remoulding of existing technologies to suit a new business model. Cloud robotics is understood as the adaptation of cloud computing ideas to robotic applications. Current efforts in cloud robotics stress developing robots that utilize the computing and service infrastructure of the cloud, without debating the underlying business model. HTM5 is an OMG MDA-based meta-model for agent-oriented development of cloud robotic systems. The trade-view of HTM5 promotes peer-to-peer trade amongst software agents. HTM5 agents represent various cloud entities and implement their business logic on cloud interactions. Trade in a peer-to-peer cloud robotic system is based on relationships and contracts amongst several agent subsets. Electronic Institutions are associations of heterogeneous intelligent agents which interact with each other following predefined norms. In Dynamic Electronic Institutions, the process of formation, reformation and dissolution of institutions is automated, leading to run-time adaptations in groups of agents. DEIs in agent-oriented cloud robotic ecosystems bring order and group intellect. This article presents DEI implementations through the HTM5 methodology.

  3. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva.

    PubMed

    Pun, Thierry; Alecu, Teodor Iulian; Chanel, Guillaume; Kronegg, Julien; Voloshynovskiy, Sviatoslav

    2006-06-01

    This paper describes the work being conducted in the domain of brain-computer interaction (BCI) at the Multimodal Interaction Group, Computer Vision and Multimedia Laboratory, University of Geneva, Geneva, Switzerland. The application focus of this work is on multimodal interaction rather than on rehabilitation, that is, how to augment classical interaction by means of physiological measurements. Three main research topics are addressed. The first concerns the more general problem of brain source activity recognition from EEGs. In contrast with classical deterministic approaches, we studied iterative, robust, stochastically based reconstruction procedures that model source and noise statistics, to overcome known limitations of current techniques. We also developed procedures for optimal electroencephalogram (EEG) sensor system design in terms of placement and number of electrodes. The second topic is the study of BCI protocols and performance from an information-theoretic point of view. Various information rate measurements have been compared for assessing BCI abilities. The third research topic concerns the use of EEG and other physiological signals for assessing a user's emotional status.

  4. GPU-accelerated FDTD modeling of radio-frequency field-tissue interactions in high-field MRI.

    PubMed

    Chi, Jieru; Liu, Feng; Weber, Ewald; Li, Yu; Crozier, Stuart

    2011-06-01

    The analysis of high-field RF field-tissue interactions requires high-performance finite-difference time-domain (FDTD) computing. Conventional CPU-based FDTD calculations offer limited computing performance in a PC environment. This study presents a graphics processing unit (GPU)-based parallel-computing framework, producing substantially boosted computing efficiency (a two-order-of-magnitude speedup) at a PC-level cost. Specific details of implementing the FDTD method on a GPU architecture are presented, and the new computational strategy has been successfully applied to the design of a novel 8-element transceive RF coil system at 9.4 T. Facilitated by the powerful GPU-FDTD computing, the new RF coil array offers optimized fields (averaging 25% improvement in sensitivity and 20% reduction in loop coupling compared with conventional array structures of the same size) for small animal imaging with a robust RF configuration. The GPU-enabled acceleration paves the way for FDTD to be applied for both detailed forward modeling and inverse design of MRI coils, which were previously impractical.
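
    The record does not include code, but the core FDTD update that such GPU frameworks parallelize is a simple leapfrog stencil. The sketch below is a minimal 1D Yee update in normalized units; the grid size, source, and constants are illustrative, and the actual solver is a full 3D bioelectromagnetic code with tissue models and absorbing boundaries.

```python
# Minimal 1D FDTD (Yee scheme) sketch in NumPy, illustrating the update pattern
# that a GPU implementation parallelizes across grid cells. All quantities are
# in normalized units and are illustrative only.
import numpy as np

nx, nt = 400, 1000
ez = np.zeros(nx)        # electric field
hy = np.zeros(nx - 1)    # magnetic field (staggered grid)
courant = 0.5            # normalized time step (stability requires <= 1 in 1D)

for t in range(nt):
    # Update H from the curl of E, then E from the curl of H (leapfrog in time).
    hy += courant * (ez[1:] - ez[:-1])
    ez[1:-1] += courant * (hy[1:] - hy[:-1])
    # Soft source: a Gaussian pulse injected at the centre of the grid.
    ez[nx // 2] += np.exp(-((t - 100) / 20.0) ** 2)
```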

  5. The computational future for climate and Earth system models: on the path to petaflop and beyond.

    PubMed

    Washington, Warren M; Buja, Lawrence; Craig, Anthony

    2009-03-13

    The development of climate and Earth system models has had a long history, starting with the building of individual atmospheric, ocean, sea ice, land vegetation, biogeochemical, glacial and ecological model components. The early researchers were well aware of the long-term goal of building Earth system models that would go beyond what is usually included in climate models by adding interactive biogeochemical processes. In the early days, progress was limited by computer capability, as well as by our knowledge of the physical and chemical processes. Over the last few decades, there has been much improved knowledge, better observations for validation and more powerful supercomputer systems that are increasingly meeting the new challenges of comprehensive models. Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.

  6. Computational models for predicting interactions with membrane transporters.

    PubMed

    Xu, Y; Shen, Q; Liu, X; Lu, J; Li, S; Luo, C; Gong, L; Luo, X; Zheng, M; Jiang, H

    2013-01-01

    Membrane transporters, comprising two major families, the ATP-binding cassette (ABC) transporters and the solute carrier (SLC) transporters, are proteins that play important roles in facilitating the movement of molecules into and out of cells. Consequently, these transporters can be major determinants of the therapeutic efficacy, toxicity and pharmacokinetics of a variety of drugs. Considering the time and expense that biological experiments take, and that research should be driven by evaluation of efficacy and safety, computational methods have arisen as a complementary choice. In this article, we provide an overview of the contributions that computational methods have made to the transporter field in the past decades. At the beginning, we present a brief introduction to the structure and function of the major members of the two transporter families. In the second part, we focus on widely used computational methods in different aspects of transporter research. In the absence of a high-resolution structure for most transporters, homology modeling is a useful tool to interpret experimental data and potentially guide experimental studies, and we summarize reported homology models in this review. Research in computational methods covers the major members of the transporter families and a variety of topics, including the classification of substrates and/or inhibitors, prediction of protein-ligand interactions, constitution of the binding pocket, phenotypes of non-synonymous single-nucleotide polymorphisms, and conformational analysis that tries to explain the mechanism of action. As an example, one of the most important transporters, P-gp, is elaborated to explain the differences and advantages of various computational models. In the third part, the challenges of developing computational methods that give reliable predictions, as well as potential future directions in transporter-related modeling, are discussed.

  7. Do Responses to Different Anthropogenic Forcings Add Linearly in Climate Models?

    NASA Technical Reports Server (NTRS)

    Marvel, Kate; Schmidt, Gavin A.; Shindell, Drew; Bonfils, Celine; LeGrande, Allegra N.; Nazarenko, Larissa; Tsigaridis, Kostas

    2015-01-01

    Many detection and attribution and pattern scaling studies assume that the global climate response to multiple forcings is additive: that the response over the historical period is statistically indistinguishable from the sum of the responses to individual forcings. Here, we use the NASA Goddard Institute for Space Studies (GISS) and National Center for Atmospheric Research Community Climate System Model (CCSM) simulations from the CMIP5 archive to test this assumption for multi-year trends in global-average, annual-average temperature and precipitation at multiple timescales. We find that responses in models forced by pre-computed aerosol and ozone concentrations are generally additive across forcings; however, we demonstrate that there are significant nonlinearities in precipitation responses to different forcings in a configuration of the GISS model that interactively computes these concentrations from precursor emissions. We attribute these to differences in ozone forcing arising from interactions between forcing agents. Our results suggest that attribution to specific forcings may be complicated in a model with fully interactive chemistry and may provide motivation for other modeling groups to conduct further single-forcing experiments.

  8. Do responses to different anthropogenic forcings add linearly in climate models?

    DOE PAGES

    Marvel, Kate; Schmidt, Gavin A.; Shindell, Drew; ...

    2015-10-14

    Many detection and attribution and pattern scaling studies assume that the global climate response to multiple forcings is additive: that the response over the historical period is statistically indistinguishable from the sum of the responses to individual forcings. Here, we use the NASA Goddard Institute for Space Studies (GISS) and National Center for Atmospheric Research Community Climate System Model (CCSM4) simulations from the CMIP5 archive to test this assumption for multi-year trends in global-average, annual-average temperature and precipitation at multiple timescales. We find that responses in models forced by pre-computed aerosol and ozone concentrations are generally additive across forcings. However, we demonstrate that there are significant nonlinearities in precipitation responses to different forcings in a configuration of the GISS model that interactively computes these concentrations from precursor emissions. We attribute these to differences in ozone forcing arising from interactions between forcing agents. Lastly, our results suggest that attribution to specific forcings may be complicated in a model with fully interactive chemistry and may provide motivation for other modeling groups to conduct further single-forcing experiments.

  9. Generalized theoretical method for the interaction between arbitrary nonuniform electric field and molecular vibrations: Toward near-field infrared spectroscopy and microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwasa, Takeshi, E-mail: tiwasa@mail.sci.hokudai.ac.jp; Takenaka, Masato; Taketsugu, Tetsuya

    A theoretical method to compute infrared absorption spectra when a molecule is interacting with an arbitrary nonuniform electric field such as near-fields is developed and numerically applied to simple model systems. The method is based on the multipolar Hamiltonian, where the light-matter interaction is described by a spatial integral of the inner product of the molecular polarization and the applied electric field. The computation scheme is developed under the harmonic approximation for the molecular vibrations and within the framework of modern electronic structure calculations such as density functional theory. Infrared reflection absorption and near-field infrared absorption are considered as model systems. The obtained IR spectra successfully reflect the spatial structure of the applied electric field and the corresponding vibrational modes, demonstrating the applicability of the present method to the analysis of modern nanovibrational spectroscopy using near-fields. The present method can use arbitrary electric fields and can thus bridge two fields, computational chemistry and electromagnetics.

  10. Generalized theoretical method for the interaction between arbitrary nonuniform electric field and molecular vibrations: Toward near-field infrared spectroscopy and microscopy.

    PubMed

    Iwasa, Takeshi; Takenaka, Masato; Taketsugu, Tetsuya

    2016-03-28

    A theoretical method to compute infrared absorption spectra when a molecule is interacting with an arbitrary nonuniform electric field such as near-fields is developed and numerically applied to simple model systems. The method is based on the multipolar Hamiltonian, where the light-matter interaction is described by a spatial integral of the inner product of the molecular polarization and the applied electric field. The computation scheme is developed under the harmonic approximation for the molecular vibrations and within the framework of modern electronic structure calculations such as density functional theory. Infrared reflection absorption and near-field infrared absorption are considered as model systems. The obtained IR spectra successfully reflect the spatial structure of the applied electric field and the corresponding vibrational modes, demonstrating the applicability of the present method to the analysis of modern nanovibrational spectroscopy using near-fields. The present method can use arbitrary electric fields and can thus bridge two fields, computational chemistry and electromagnetics.
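
    The key quantity in the method above is a spatial integral of the inner product between the molecular polarization of a vibrational mode and the applied nonuniform field. The sketch below discretizes that overlap over atomic sites; the geometry, dipole derivatives, and "tip" field are random or assumed stand-ins for quantities an electronic-structure code would provide, so this is a schematic, not the authors' implementation.

```python
# Schematic sketch (not the published implementation): the light-matter coupling
# for one normal mode as a discretized spatial overlap between the mode's
# dipole-derivative distribution and an arbitrary nonuniform applied field.
import numpy as np

rng = np.random.default_rng(1)
n_atoms = 6

atom_positions = rng.normal(scale=1.5, size=(n_atoms, 3))    # stand-in geometry
dipole_derivatives = rng.normal(size=(n_atoms, 3))           # stand-in dP/dQ per atom for one mode

def near_field(r):
    """Illustrative nonuniform field: z-polarized, decaying with distance from a 'tip' at the origin."""
    d = np.linalg.norm(r) + 0.5
    return np.array([0.0, 0.0, 1.0]) / d**3

# Discretized coupling integral: sum over atomic sites of (dP/dQ) . E(r_i).
coupling = sum(np.dot(dipole_derivatives[i], near_field(atom_positions[i]))
               for i in range(n_atoms))
print("relative IR intensity of this mode:", coupling**2)
```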

  11. Manual for obscuration code with space station applications

    NASA Technical Reports Server (NTRS)

    Marhefka, R. J.; Takacs, L.

    1986-01-01

    The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the cast shadow of an antenna in a complex environment onto the far-zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multi-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-BASIC Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.

  12. IFEMS, an Interactive Finite Element Modeling System Using a CAD/CAM System

    NASA Technical Reports Server (NTRS)

    Mckellip, S.; Schuman, T.; Lauer, S.

    1980-01-01

    A method of coupling a CAD/CAM system with a general purpose finite element mesh generator is described. The three computer programs which make up the interactive finite element graphics system are discussed.

  13. Integrative multicellular biological modeling: a case study of 3D epidermal development using GPU algorithms

    PubMed Central

    2010-01-01

    Background Simulation of sophisticated biological models requires considerable computational power. These models typically integrate together numerous biological phenomena such as spatially-explicit heterogeneous cells, cell-cell interactions, cell-environment interactions and intracellular gene networks. The recent advent of programming for graphical processing units (GPU) opens up the possibility of developing more integrative, detailed and predictive biological models while at the same time decreasing the computational cost to simulate those models. Results We construct a 3D model of epidermal development and provide a set of GPU algorithms that executes significantly faster than sequential central processing unit (CPU) code. We provide a parallel implementation of the subcellular element method for individual cells residing in a lattice-free spatial environment. Each cell in our epidermal model includes an internal gene network, which integrates cellular interaction of Notch signaling together with environmental interaction of basement membrane adhesion, to specify cellular state and behaviors such as growth and division. We take a pedagogical approach to describing how modeling methods are efficiently implemented on the GPU including memory layout of data structures and functional decomposition. We discuss various programmatic issues and provide a set of design guidelines for GPU programming that are instructive to avoid common pitfalls as well as to extract performance from the GPU architecture. Conclusions We demonstrate that GPU algorithms represent a significant technological advance for the simulation of complex biological models. We further demonstrate with our epidermal model that the integration of multiple complex modeling methods for heterogeneous multicellular biological processes is both feasible and computationally tractable using this new technology. We hope that the provided algorithms and source code will be a starting point for modelers to develop their own GPU implementations, and encourage others to implement their modeling methods on the GPU and to make that code available to the wider community. PMID:20696053

  14. Predictive modeling of multicellular structure formation by using Cellular Particle Dynamics simulations

    NASA Astrophysics Data System (ADS)

    McCune, Matthew; Shafiee, Ashkan; Forgacs, Gabor; Kosztin, Ioan

    2014-03-01

    Cellular Particle Dynamics (CPD) is an effective computational method for describing and predicting the time evolution of biomechanical relaxation processes of multicellular systems. A typical example is the fusion of spheroidal bioink particles during post-bioprinting structure formation. In CPD, cells are modeled as an ensemble of cellular particles (CPs) that interact via short-range contact interactions, characterized by an attractive (adhesive interaction) and a repulsive (excluded volume interaction) component. The time evolution of the spatial conformation of the multicellular system is determined by following the trajectories of all CPs through integration of their equations of motion. CPD was successfully applied to describe and predict the fusion of 3D tissue constructs involving identical spherical aggregates. Here, we demonstrate that CPD can also predict tissue formation involving uneven spherical aggregates whose volumes decrease during the fusion process. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
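
    To make the CPD picture concrete, the sketch below advances a handful of cellular particles under a generic short-range pair potential with a repulsive core and an adhesive well, using an overdamped update with thermal noise. The force law, parameters, and noise amplitude are illustrative assumptions, not the published CPD force field.

```python
# Minimal CPD-style update, assuming a generic Lennard-Jones-like pair force
# (repulsive excluded-volume core + adhesive well) and overdamped dynamics.
import numpy as np

rng = np.random.default_rng(2)
n_cp, dt, steps = 50, 1e-3, 200
sigma, epsilon, cutoff = 1.0, 1.0, 2.5       # particle size, interaction strength, range
positions = rng.uniform(0, 5, size=(n_cp, 3))

def pair_forces(pos):
    """Short-range pairwise forces between cellular particles."""
    forces = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            rij = pos[i] - pos[j]
            r = max(np.linalg.norm(rij), 0.8 * sigma)   # clamp to keep the demo stable
            if r < cutoff:
                # -dU/dr for U = 4*eps*((sigma/r)^12 - (sigma/r)^6)
                f_mag = 24 * epsilon * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) / r
                forces[i] += f_mag * rij / r
                forces[j] -= f_mag * rij / r
    return forces

for _ in range(steps):
    # Overdamped (Brownian-like) step: drift along the force plus thermal noise.
    positions += dt * pair_forces(positions) + np.sqrt(dt) * 0.05 * rng.normal(size=positions.shape)
```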

  15. Use of computer programs STLK1 and STWT1 for analysis of stream-aquifer hydraulic interaction

    USGS Publications Warehouse

    Desimone, Leslie A.; Barlow, Paul M.

    1999-01-01

    Quantifying the hydraulic interaction of aquifers and streams is important in the analysis of stream base flow, flood-wave effects, and contaminant transport between surface- and ground-water systems. This report describes the use of two computer programs, STLK1 and STWT1, to analyze the hydraulic interaction of streams with confined, leaky, and water-table aquifers during periods of stream-stage fluctuations and uniform, areal recharge. The computer programs are based on analytical solutions to the ground-water-flow equation in stream-aquifer settings and calculate ground-water levels, seepage rates across the stream-aquifer boundary, and bank storage that result from arbitrarily varying stream stage or recharge. Analysis of idealized, hypothetical stream-aquifer systems is used to show how aquifer type, aquifer boundaries, and aquifer and streambank hydraulic properties affect aquifer response to stresses. Published data from alluvial and stratified-drift aquifers in Kentucky, Massachusetts, and Iowa are used to demonstrate application of the programs to field settings. Analytical models of these three stream-aquifer systems are developed on the basis of available hydrogeologic information. Stream-stage fluctuations and recharge are applied to the systems as hydraulic stresses. The models are calibrated by matching ground-water levels calculated with computer program STLK1 or STWT1 to measured ground-water levels. The analytical models are used to estimate hydraulic properties of the aquifer, aquitard, and streambank; to evaluate hydrologic conditions in the aquifer; and to estimate seepage rates and bank-storage volumes resulting from flood waves and recharge. Analysis of field examples demonstrates the accuracy and limitations of the analytical solutions and programs when applied to actual ground-water systems and the potential uses of the analytical methods as alternatives to numerical modeling for quantifying stream-aquifer interactions.
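
    Programs of this kind typically build the aquifer response to an arbitrarily varying stream stage by superposing (convolving) stage increments with an analytical unit-step response. The sketch below illustrates that idea for the textbook case of a semi-infinite confined aquifer bounded by a fully penetrating stream; the response function, parameter values, and flood wave are assumptions for illustration, and STLK1/STWT1 handle more general geometries, leakage, and recharge.

```python
# Superposition sketch: head response at a well to a varying stream stage,
# built by convolving stage increments with a unit-step response function.
import numpy as np
from scipy.special import erfc

T, S = 500.0, 1e-3            # transmissivity (m^2/d), storativity (illustrative)
x = 100.0                     # distance from stream to observation well (m)
dt = 0.25                     # time step (d)
stage = np.concatenate([np.zeros(8), np.linspace(0, 1.5, 12), np.full(60, 1.5)])  # flood wave (m)

def unit_step_response(t):
    """Head rise at distance x due to a unit step rise in stream stage at t = 0."""
    t = np.maximum(t, 1e-12)
    return erfc(x * np.sqrt(S / (4.0 * T * t)))

times = np.arange(1, len(stage) + 1) * dt
d_stage = np.diff(stage, prepend=0.0)         # stage increments per step
# Discrete convolution (superposition) of increments with the step response.
head = np.array([np.sum(d_stage[:k + 1] * unit_step_response(times[k] - times[:k + 1] + dt))
                 for k in range(len(stage))])
print("peak head rise at the well: %.3f m" % head.max())
```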

  16. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  17. Simulating Microbial Community Patterning Using Biocellion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, Seung-Hwa; Kahan, Simon H.; Momeni, Babak

    2014-04-17

    Mathematical modeling and computer simulation are important tools for understanding complex interactions between cells and their biotic and abiotic environment: similarities and differences between modeled and observed behavior provide the basis for hypothesis formation. Momeni et al. [5] investigated pattern formation in communities of yeast strains engaging in different types of ecological interactions, comparing the predictions of mathematical modeling and simulation to actual patterns observed in wet-lab experiments. However, simulations of millions of cells in a three-dimensional community are extremely time-consuming. One simulation run in MATLAB may take a week or longer, inhibiting exploration of the vast space of parameter combinations and assumptions. Improving the speed, scale, and accuracy of such simulations facilitates hypothesis formation and expedites discovery. Biocellion is a high performance software framework for accelerating discrete agent-based simulation of biological systems with millions to trillions of cells. Simulations of comparable scale and accuracy to those taking a week of computer time using MATLAB require just hours using Biocellion on a multicore workstation. Biocellion further accelerates large scale, high resolution simulations using cluster computers by partitioning the work to run on multiple compute nodes. Biocellion targets computational biologists who have mathematical modeling backgrounds and basic C++ programming skills. This chapter describes the necessary steps to adapt Momeni et al.'s original model to the Biocellion framework as a case study.

  18. Augmented Computer Mouse Would Measure Applied Force

    NASA Technical Reports Server (NTRS)

    Li, Larry C. H.

    1993-01-01

    Proposed computer mouse measures force of contact applied by user. Adds another dimension to two-dimensional-position-measuring capability of conventional computer mouse; force measurement designated to represent any desired continuously variable function of time and position, such as control force, acceleration, velocity, or position along axis perpendicular to computer video display. Proposed mouse enhances sense of realism and intuition in interaction between operator and computer. Useful in such applications as three-dimensional computer graphics, computer games, and mathematical modeling of dynamics.

  19. Wind-US Code Contributions to the First AIAA Shock Boundary Layer Interaction Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Vyas, Manan A.; Yoder, Dennis A.

    2013-01-01

    This report discusses the computations of a set of shock wave/turbulent boundary layer interaction (SWTBLI) test cases using the Wind-US code, as part of the 2010 American Institute of Aeronautics and Astronautics (AIAA) shock/boundary layer interaction workshop. The experiments involve supersonic flows in wind tunnels with a shock generator that directs an oblique shock wave toward the boundary layer along one of the walls of the wind tunnel. The Wind-US calculations utilized structured grid computations performed in Reynolds-averaged Navier-Stokes mode. Four turbulence models were investigated: the Spalart-Allmaras one-equation model, the Menter Baseline and Shear Stress Transport k-omega two-equation models, and an explicit algebraic stress k-omega formulation. Effects of grid resolution and upwinding scheme were also considered. The results from the CFD calculations are compared to particle image velocimetry (PIV) data from the experiments. As expected, turbulence model effects dominated the accuracy of the solutions with upwinding scheme selection indicating minimal effects.

  20. Additive and Interactive Effects of Stimulus Degradation: No Challenge for CDP+

    ERIC Educational Resources Information Center

    Ziegler, Johannes C.; Perry, Conrad; Zorzi, Marco

    2009-01-01

    S. O'Malley and D. Besner (2008) showed that additive effects of stimulus degradation and word frequency in reading aloud occur in the presence of nonwords but not in pure word lists. They argued that this dissociation presents a major challenge to interactive computational models of reading aloud and claimed that no currently implemented model is…

  1. Time-Filtered Navier-Stokes Approach and Emulation of Turbulence-Chemistry Interaction

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Wey, Thomas; Shih, Tsan-Hsing

    2013-01-01

    This paper describes the time-filtered Navier-Stokes approach capable of capturing unsteady flow structures important for turbulent mixing and an accompanying subgrid model directly accounting for the major processes in turbulence-chemistry interaction. They have been applied to the computation of two-phase turbulent combustion occurring in a single-element lean-direct-injection combustor. Some of the preliminary results from this computational effort are presented in this paper.

  2. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolations from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  3. Modeling of electron-specimen interaction in scanning electron microscope for e-beam metrology and inspection: challenges and perspectives

    NASA Astrophysics Data System (ADS)

    Suzuki, Makoto; Kameda, Toshimasa; Doi, Ayumi; Borisov, Sergey; Babin, Sergey

    2018-03-01

    The interpretation of scanning electron microscopy (SEM) images of the latest semiconductor devices is not intuitive and requires comparison with computed images based on theoretical modeling and simulations. For quantitative image prediction and geometrical reconstruction of the specimen structure, the accuracy of the physical model is essential. In this paper, we review the current models of electron-solid interaction and discuss their accuracy. We compare the simulated results with our experiments on SEM overlay of under-layers, grain imaging of copper interconnects, and hole-bottom visualization by angular-selective detectors, and show that our model reproduces the experimental results well. Remaining issues for quantitative simulation are also discussed, including the accuracy of the charge dynamics, treatment of the beam skirt, and the explosive increase in computing time.

  4. Modeling Trait Anxiety: From Computational Processes to Personality

    PubMed Central

    Raymond, James G.; Steele, J. Douglas; Seriès, Peggy

    2017-01-01

    Computational methods are increasingly being applied to the study of psychiatric disorders. Often, this involves fitting models to the behavior of individuals with subclinical character traits that are known vulnerability factors for the development of psychiatric conditions. Anxiety disorders can be examined with reference to the behavior of individuals high in “trait” anxiety, which is a known vulnerability factor for the development of anxiety and mood disorders. However, it is not clear how this self-report measure relates to neural and behavioral processes captured by computational models. This paper reviews emerging computational approaches to the study of trait anxiety, specifying how interacting processes susceptible to analysis using computational models could drive a tendency to experience frequent anxious states and promote vulnerability to the development of clinical disorders. Existing computational studies are described in the light of this perspective and appropriate targets for future studies are discussed. PMID:28167920

  5. Modeling Trait Anxiety: From Computational Processes to Personality.

    PubMed

    Raymond, James G; Steele, J Douglas; Seriès, Peggy

    2017-01-01

    Computational methods are increasingly being applied to the study of psychiatric disorders. Often, this involves fitting models to the behavior of individuals with subclinical character traits that are known vulnerability factors for the development of psychiatric conditions. Anxiety disorders can be examined with reference to the behavior of individuals high in "trait" anxiety, which is a known vulnerability factor for the development of anxiety and mood disorders. However, it is not clear how this self-report measure relates to neural and behavioral processes captured by computational models. This paper reviews emerging computational approaches to the study of trait anxiety, specifying how interacting processes susceptible to analysis using computational models could drive a tendency to experience frequent anxious states and promote vulnerability to the development of clinical disorders. Existing computational studies are described in the light of this perspective and appropriate targets for future studies are discussed.

  6. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk, and makes iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This 2nd generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) Decrease the time to compute comparison statistics and plots from minutes to seconds; (2) Allow for interactive exploration of time-series properties over seasons and years; (3) Decrease the time for satellite data ingestion into RCMES to hours; (4) Allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) Move RCMES into a near real time decision-making platform. We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory & disk usage.
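
    The map-reduce pattern that SciSpark builds on can be illustrated with a few lines of PySpark: distribute time slices of paired observation/model grids, map each to a bias grid, and reduce by summation. The synthetic arrays below are placeholders, not the SciSpark API or RCMES data readers.

```python
# Minimal PySpark sketch of parallel comparison statistics: mean model-minus-
# observation bias over many time slices. Synthetic data stands in for the
# satellite and model grids SciSpark would load from HDF/netCDF files.
import numpy as np
from pyspark import SparkContext

sc = SparkContext("local[*]", "bias-sketch")

n_times, ny, nx = 120, 90, 180
rng = np.random.default_rng(3)
# One (observation, model) grid pair per time step.
pairs = [(rng.normal(290, 5, (ny, nx)), rng.normal(291, 5, (ny, nx))) for _ in range(n_times)]

# Distribute the time slices, compute a per-slice bias grid, and average them.
rdd = sc.parallelize(pairs)
bias_sum = rdd.map(lambda p: p[1] - p[0]).reduce(lambda a, b: a + b)
mean_bias = bias_sum / n_times
print("domain-average model bias:", float(mean_bias.mean()))
sc.stop()
```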

  7. Integrating the Computer into Language Arts in a Fifth Grade Classroom: A Developing Instructional Model.

    ERIC Educational Resources Information Center

    Lund, David M.; Hildreth, Donna

    A case study investigated an instructional model that incorporated the personal computer and Hyperstudio (tm) software into an assignment to write and illustrate an interactive, multimedia story. Subjects were 21 students in a fifth-grade homeroom in a public school (with a state-mandated minimum 45% ratio of minority students achieved by busing…

  8. Computer prediction of insecticide efficacy for western spruce budworm and Douglas-fir tussock moth

    Treesearch

    Jacqueline L. Robertson; Molly W. Stock

    1986-01-01

    A generalized interactive computer model that simulates and predicts insecticide efficacy, over seasonal development of western spruce budworm and Douglas-fir tussock moth, is described. This model can be used for any insecticide for which the user has laboratory-based concentration-response data. The program has four options, is written in BASIC, and can be operated...

  9. Effect of Speed on Tire-Soil Interaction and Development of Towed Pneumatic Tire-Soil Model

    DTIC Science & Technology

    1974-10-01

    … rigid wheels were performed by several researchers under laboratory conditions (Refs. 20 through 22) using the flash X-ray technique. [Remainder of the scanned abstract is table-of-contents residue: Towed Tire-Soil Model; Conclusions and Recommendations; References; Velocity Fields; Appendix B - Computer Program Chart for Computation of Tire Performance.]

  10. Characterizing the topology of probabilistic biological networks.

    PubMed

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Biological interactions are often uncertain events that may or may not take place with some probability. This uncertainty leads to a massive number of alternative interaction topologies for each such network. Existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. In this paper, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. Using our mathematical representation, we develop a method that can accurately describe the degree distribution of such networks. We then take one more step and extend our method to accurately compute the joint-degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. Our method works quickly even for entire protein-protein interaction (PPI) networks. It also helps us find an adequate mathematical model using MLE. We perform a comparative study of node-degree and joint-degree distributions in two types of biological networks: the classical deterministic networks and the more flexible probabilistic networks. Our results confirm that power-law and log-normal models best describe degree distributions for both probabilistic and deterministic networks. Moreover, the inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with a large number of interactions prefer to interact with those with a small number of interactions more frequently than expected. We also show that probabilistic networks are more robust for node-degree distribution computation than deterministic ones. All the data sets used, the software implemented, and the alignments found in this paper are available at http://bioinformatics.cise.ufl.edu/projects/probNet/.
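
    The polynomial-time claim can be illustrated directly: with independent probabilistic edges, a node's degree follows a Poisson-binomial distribution, computable by a small dynamic program over its incident edge probabilities rather than by enumerating topologies. The sketch below shows that computation; the edge probabilities are made up, and the paper's method additionally handles joint-degree distributions and model fitting.

```python
# Exact degree distribution of a node with independent probabilistic edges
# (Poisson-binomial), computed by dynamic programming in O(m^2) for m edges.
import numpy as np

def degree_distribution(edge_probs):
    """Return P(degree = k) for k = 0..m given independent edge probabilities."""
    dist = np.array([1.0])                     # P(degree = 0) = 1 over zero edges
    for p in edge_probs:
        new = np.zeros(len(dist) + 1)
        new[:-1] += dist * (1 - p)             # this edge absent
        new[1:] += dist * p                    # this edge present
        dist = new
    return dist

# Example: a protein with five uncertain interactions.
print(degree_distribution([0.9, 0.7, 0.5, 0.2, 0.1]))
```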

  11. A transported probability density function/photon Monte Carlo method for high-temperature oxy-natural gas combustion with spectral gas and wall radiation

    NASA Astrophysics Data System (ADS)

    Zhao, X. Y.; Haworth, D. C.; Ren, T.; Modest, M. F.

    2013-04-01

    A computational fluid dynamics model for high-temperature oxy-natural gas combustion is developed and exercised. The model features detailed gas-phase chemistry and radiation treatments (a photon Monte Carlo method with line-by-line spectral resolution for gas and wall radiation - PMC/LBL) and a transported probability density function (PDF) method to account for turbulent fluctuations in composition and temperature. The model is first validated for a 0.8 MW oxy-natural gas furnace, and the level of agreement between model and experiment is found to be at least as good as any that has been published earlier. Next, simulations are performed with systematic model variations to provide insight into the roles of individual physical processes and their interplay in high-temperature oxy-fuel combustion. This includes variations in the chemical mechanism and the radiation model, and comparisons of results obtained with versus without the PDF method to isolate and quantify the effects of turbulence-chemistry interactions and turbulence-radiation interactions. In this combustion environment, it is found to be important to account for the interconversion of CO and CO2, and radiation plays a dominant role. The PMC/LBL model allows the effects of molecular gas radiation and wall radiation to be clearly separated and quantified. Radiation and chemistry are tightly coupled through the temperature, and correct temperature prediction is required for correct prediction of the CO/CO2 ratio. Turbulence-chemistry interactions influence the computed flame structure and mean CO levels. Strong local effects of turbulence-radiation interactions are found in the flame, but the net influence of TRI on computed mean temperature and species profiles is small. The ultimate goal of this research is to simulate high-temperature oxy-coal combustion, where accurate treatments of chemistry, radiation and turbulence-chemistry-particle-radiation interactions will be even more important.

  12. Influence of nonelectrostatic ion-ion interactions on double-layer capacitance

    NASA Astrophysics Data System (ADS)

    Zhao, Hui

    2012-11-01

    Recently a Poisson-Helmholtz-Boltzmann (PHB) model [Bohinc et al., Phys. Rev. E 85, 031130 (2012)] was developed by accounting for solvent-mediated nonelectrostatic ion-ion interactions. Nonelectrostatic interactions are described by a Yukawa-like pair potential. In the present work, we modify the PHB model by adding steric effects (finite ion size) into the free energy to derive the governing equations. The modified PHB model is capable of capturing both ion specificity and ion crowding. This modified model is then employed to study the capacitance of the double layer. More specifically, we focus on the influence of nonelectrostatic ion-ion interactions on the charging of a double layer near a flat surface in the presence of steric effects. We numerically compute the differential capacitance as a function of the voltage under various conditions. At small voltages and low salt concentrations (dilute solution), we find that the predictions from the modified PHB model are the same as those from the classical Poisson-Boltzmann theory, indicating that nonelectrostatic ion-ion interactions and steric effects are negligible. At moderate voltages, nonelectrostatic ion-ion interactions play an important role in determining the differential capacitance. Generally speaking, nonelectrostatic interactions decrease the capacitance because of additional nonelectrostatic repulsion among excess counterions inside the double layer. However, increasing the voltage gradually favors steric effects, which induce a condensed layer of crowded counterions near the electrode. Accordingly, the predictions from the modified PHB model collapse onto those computed by the modified Poisson-Boltzmann theory considering steric effects alone. Finally, theoretical predictions are compared with and favorably agree with experimental data, in particular in concentrated solutions, leading one to conclude that the modified PHB model adequately predicts the diffuse-charge dynamics of the double layer with ion specificity and steric effects.
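
    For context, the sketch below evaluates the two limiting capacitance curves the modified PHB model is compared against in the text: the classical Gouy-Chapman (Poisson-Boltzmann) differential capacitance and a steric-modified (Bikerman-type) expression that saturates at large voltage. The Yukawa-type nonelectrostatic correction itself requires solving the modified governing equations numerically and is not reproduced here; the concentration and packing fraction below are illustrative assumptions.

```python
# Differential capacitance of a flat double layer: classical Gouy-Chapman vs.
# a steric-modified (Bikerman-type) expression. Parameters are illustrative.
import numpy as np

eps = 78.5 * 8.854e-12        # permittivity of water (F/m)
kT = 1.381e-23 * 298.0        # thermal energy (J)
e = 1.602e-19                 # elementary charge (C)
c0 = 0.1 * 1e3 * 6.022e23     # bulk concentration: 0.1 M in ions/m^3
nu = 0.05                     # steric packing fraction (assumed)
kappa = np.sqrt(2 * c0 * e**2 / (eps * kT))   # inverse Debye length for a 1:1 salt (1/m)

psi = np.linspace(-0.3, 0.3, 300)             # surface potential (V), avoids exactly zero
u = e * psi / kT

c_gc = eps * kappa * np.cosh(u / 2)           # Gouy-Chapman differential capacitance (F/m^2)
s = 2 * nu * np.sinh(u / 2) ** 2
c_steric = eps * kappa * np.abs(np.sinh(u)) / ((1 + s) * np.sqrt((2 / nu) * np.log1p(s)))

# Steric effects cap the capacitance at large voltage instead of letting it diverge.
print("C at +0.3 V: GC = %.2f, steric = %.2f (F/m^2)" % (c_gc[-1], c_steric[-1]))
```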

  13. Footwear Physics.

    ERIC Educational Resources Information Center

    Blaser, Mark; Larsen, Jamie

    1996-01-01

    Presents five interactive, computer-based activities that mimic scientific tests used by sport researchers to help companies design high-performance athletic shoes, including impact tests, flexion tests, friction tests, video analysis, and computer modeling. Provides a platform for teachers to build connections between chemistry (polymer science),…

  14. Characterization of the interface of the bone marrow stromal cell antigen 2-Vpu protein complex via computational chemistry.

    PubMed

    Zhou, Jinming; Zhang, Zhixin; Mi, Zeyun; Wang, Xin; Zhang, Quan; Li, Xiaoyu; Liang, Chen; Cen, Shan

    2012-02-14

    Bone marrow stromal cell antigen 2 (BST-2) inhibits the release of enveloped viruses from the cell surface. Various viral countermeasures have been discovered that allow viruses to escape BST-2 restriction. Human immunodeficiency virus type 1 (HIV-1) encodes viral protein U (Vpu), which interacts with BST-2 through their transmembrane domains and causes the downregulation of cell surface BST-2. In this study, we used a computer modeling method to establish a molecular model to investigate the binding interface of the transmembrane domains of BST-2 and Vpu. The model predicts that the interface is composed of Vpu residues I6, A10, A14, A18, V25, and W22 and BST-2 residues L23, I26, V30, I34, V35, L41, I42, and T45. Introduction of mutations that have been previously reported to disrupt the Vpu-BST-2 interaction led to a higher calculated binding free energy (MMGBSA), which supports our molecular model. A pharmacophore was also generated on the basis of this model. Our results provide a precise model that predicts the detailed interaction occurring between the transmembrane domains of Vpu and BST-2 and should facilitate the design of anti-HIV agents that are able to disrupt this interaction.

  15. Improving Online Interactions: Lessons from an Online Anatomy Course with a Laboratory for Undergraduate Students.

    PubMed

    Attardi, Stefanie M; Barbeau, Michele L; Rogers, Kem A

    2018-03-01

    An online section of a face-to-face (F2F) undergraduate (bachelor's level) anatomy course with a prosection laboratory was offered in 2013-2014. Lectures for F2F students (353) were broadcast to online students (138) using Blackboard Collaborate (BBC) virtual classroom. Online laboratories were offered using BBC and three-dimensional (3D) anatomical computer models. This iteration of the course was modified from the previous year to improve online student-teacher and student-student interactions. Students were divided into laboratory groups that rotated through virtual breakout rooms, giving them the opportunity to interact with three instructors. The objectives were to assess student performance outcomes, perceptions of student-teacher and student-student interactions, methods of peer interaction, and helpfulness of the 3D computer models. Final grades were statistically identical between the online and F2F groups. There were strong, positive correlations between incoming grade average and final anatomy grade in both groups, suggesting prior academic performance, and not delivery format, predicts anatomy grades. Quantitative student perception surveys (273 F2F; 101 online) revealed that both groups agreed they were engaged by teachers, could interact socially with teachers and peers, and ask them questions in both the lecture and laboratory sessions, though agreement was significantly greater for the F2F students in most comparisons. The most common methods of peer communication were texting, Facebook, and meeting F2F. The perceived helpfulness of the 3D computer models improved from the previous year. While virtual breakout rooms can be used to adequately replace traditional prosection laboratories and improve interactions, they are not equivalent to F2F laboratories.

  16. Modeling wildlife populations with HexSim

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications including population viability analysis for on...

  17. On the concept of the interactive information and simulation system for gas dynamics and multiphysics problems

    NASA Astrophysics Data System (ADS)

    Bessonov, O.; Silvestrov, P.

    2017-02-01

    This paper describes the general idea and the first implementation of the Interactive information and simulation system - an integrated environment that combines computational modules for modeling the aerodynamics and aerothermodynamics of re-entry space vehicles with a large collection of information materials on this topic. The internal organization and the composition of the system are described and illustrated. Examples of the computational and information output are presented. The system has a unified implementation for the Windows and Linux operating systems and can be deployed on any modern high-performance personal computer.

  18. Design and Evaluation of Fusion Approach for Combining Brain and Gaze Inputs for Target Selection

    PubMed Central

    Évain, Andéol; Argelaguet, Ferran; Casiez, Géry; Roussel, Nicolas; Lécuyer, Anatole

    2016-01-01

    Gaze-based interfaces and Brain-Computer Interfaces (BCIs) allow for hands-free human–computer interaction. In this paper, we investigate the combination of gaze and BCIs. We propose a novel selection technique for 2D target acquisition based on input fusion. This new approach combines the probabilistic models for each input in order to better estimate the intent of the user. We evaluated its performance against existing gaze and brain–computer interaction techniques. Twelve participants took part in our study, in which they had to search for and select 2D targets with each of the evaluated techniques. Our fusion-based hybrid interaction technique was found to be more reliable than previous gaze and BCI hybrid interaction techniques for 10 of the 12 participants, while being 29% faster on average. However, similarly to what has been observed in hybrid gaze-and-speech interaction, the gaze-only interaction technique still provides the best performance. Our results should encourage the use of input fusion, as opposed to sequential interaction, in order to design better hybrid interfaces. PMID:27774048
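
    A minimal version of the input-fusion idea is to treat each modality as producing a probability distribution over candidate targets and to combine them by a normalized product (a naive-Bayes combination under an independence assumption). The sketch below does exactly that with toy numbers; the paper's probabilistic models of gaze and BCI evidence are more elaborate.

```python
# Toy input fusion for target selection: normalized product of per-modality
# probability distributions over candidate targets.
import numpy as np

targets = ["A", "B", "C", "D"]

# Per-target evidence from each modality (already normalized to sum to 1).
p_gaze = np.array([0.50, 0.30, 0.15, 0.05])   # e.g., from fixation distance to each target
p_bci = np.array([0.20, 0.45, 0.20, 0.15])    # e.g., from classifier confidence on EEG features

fused = p_gaze * p_bci
fused /= fused.sum()

best = targets[int(np.argmax(fused))]
print("fused distribution:", dict(zip(targets, fused.round(3))), "-> select", best)
```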

  19. Connectionist Interaction Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    2003-01-01

    Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…

  20. 20170312 - In Silico Dynamics: computer simulation in a ...

    EPA Pesticide Factsheets

    Abstract: Utilizing cell biological information to predict higher order biological processes is a significant challenge in predictive toxicology. This is especially true for highly dynamical systems such as the embryo, where morphogenesis, growth and differentiation require precisely orchestrated interactions between diverse cell populations. In patterning the embryo, genetic signals set up spatial information that cells then translate into a coordinated biological response. This can be modeled as ‘biowiring diagrams’ representing genetic signals and responses. Because the hallmark of multicellular organization resides in the ability of cells to interact with one another via well-conserved signaling pathways, multiscale computational (in silico) models that enable these interactions provide a platform to translate cellular-molecular lesions/perturbations into higher order predictions. Just as ‘the Cell’ is the fundamental unit of biology, so too should it be the computational unit (‘Agent’) for modeling embryogenesis. As such, we constructed multicellular agent-based models (ABM) with ‘CompuCell3D’ (www.compucell3d.org) to simulate kinematics of complex cell signaling networks and enable critical tissue events for use in predictive toxicology. Seeding the ABMs with HTS/HCS data from ToxCast demonstrated the potential to predict, quantitatively, the higher order impacts of chemical disruption at the cellular or biochemical level.

  1. In Silico Dynamics: computer simulation in a Virtual Embryo ...

    EPA Pesticide Factsheets

    Abstract: Utilizing cell biological information to predict higher order biological processes is a significant challenge in predictive toxicology. This is especially true for highly dynamical systems such as the embryo, where morphogenesis, growth and differentiation require precisely orchestrated interactions between diverse cell populations. In patterning the embryo, genetic signals set up spatial information that cells then translate into a coordinated biological response. This can be modeled as ‘biowiring diagrams’ representing genetic signals and responses. Because the hallmark of multicellular organization resides in the ability of cells to interact with one another via well-conserved signaling pathways, multiscale computational (in silico) models that enable these interactions provide a platform to translate cellular-molecular lesions/perturbations into higher order predictions. Just as ‘the Cell’ is the fundamental unit of biology, so too should it be the computational unit (‘Agent’) for modeling embryogenesis. As such, we constructed multicellular agent-based models (ABM) with ‘CompuCell3D’ (www.compucell3d.org) to simulate kinematics of complex cell signaling networks and enable critical tissue events for use in predictive toxicology. Seeding the ABMs with HTS/HCS data from ToxCast demonstrated the potential to predict, quantitatively, the higher order impacts of chemical disruption at the cellular or biochemical level. This is demonstrated…

  2. Comparing the cognitive differences resulting from modeling instruction: Using computer microworld and physical object instruction to model real world problems

    NASA Astrophysics Data System (ADS)

    Oursland, Mark David

    This study compared the modeling achievement of students receiving mathematical modeling instruction using the computer microworld, Interactive Physics, and students receiving instruction using physical objects. Modeling instruction included activities where students applied the (a) linear model to a variety of situations, (b) linear model to two-rate situations with a constant rate, (c) quadratic model to familiar geometric figures. Both quantitative and qualitative methods were used to analyze achievement differences between students (a) receiving different methods of modeling instruction, (b) with different levels of beginning modeling ability, or (c) with different levels of computer literacy. Student achievement was analyzed quantitatively through a three-factor analysis of variance where modeling instruction, beginning modeling ability, and computer literacy were used as the three independent factors. The SOLO (Structure of the Observed Learning Outcome) assessment framework was used to design written modeling assessment instruments to measure the students' modeling achievement. The same three independent factors were used to collect and analyze the interviews and observations of student behaviors. Both methods of modeling instruction used the data analysis approach to mathematical modeling. The instructional lessons presented problem situations where students were asked to collect data, analyze the data, write a symbolic mathematical equation, and use the equation to solve the problem. The researcher recommends the following practice for modeling instruction based on the conclusions of this study. A variety of activities with a common structure are needed to make explicit the modeling process of applying a standard mathematical model. The modeling process is influenced strongly by prior knowledge of the problem context and previous modeling experiences. The conclusions of this study imply that knowledge of the properties of squares improved the students' ability to model a geometric problem more than instruction in data analysis modeling. The use of computer microworlds such as Interactive Physics in conjunction with cooperative groups is a viable method of modeling instruction.
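
    As a concrete illustration of the data-analysis approach to modeling described above (collect data, fit a symbolic equation, use the equation to solve the problem), a minimal sketch with invented data follows; the numbers and the use of numpy are assumptions for illustration, not materials from the study.

    ```python
    import numpy as np

    # Hypothetical data a student might collect in an Interactive Physics-style
    # activity: elapsed time (s) vs. distance travelled (m) for a cart moving at
    # a roughly constant rate plus a constant head start.
    time = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    distance = np.array([0.5, 2.6, 4.4, 6.5, 8.4, 10.6])

    # Analyze the data by fitting the linear model distance = rate * time + intercept.
    rate, intercept = np.polyfit(time, distance, deg=1)
    print(f"fitted model: d(t) = {rate:.2f} * t + {intercept:.2f}")

    # Use the symbolic equation to answer a question about the situation,
    # e.g. how long until the cart has travelled 20 m.
    target = 20.0
    t_needed = (target - intercept) / rate
    print(f"time to reach {target} m: {t_needed:.1f} s")
    ```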

  3. A Micro-Level Data-Calibrated Agent-Based Model: The Synergy between Microsimulation and Agent-Based Modeling.

    PubMed

    Singh, Karandeep; Ahn, Chang-Won; Paik, Euihyun; Bae, Jang Won; Lee, Chun-Hee

    2018-01-01

    Artificial life (ALife) examines systems related to natural life, its processes, and its evolution, using simulations with computer models, robotics, and biochemistry. In this article, we focus on the computer modeling, or "soft," aspects of ALife and prepare a framework for scientists and modelers to be able to support such experiments. The framework is designed and built to be a parallel as well as distributed agent-based modeling environment, and does not require end users to have expertise in parallel or distributed computing. Furthermore, we use this framework to implement a hybrid model using microsimulation and agent-based modeling techniques to generate an artificial society. We leverage this artificial society to simulate and analyze population dynamics using Korean population census data. The agents in this model derive their decisional behaviors from real data (microsimulation feature) and interact among themselves (agent-based modeling feature) to proceed in the simulation. The behaviors, interactions, and social scenarios of the agents are varied to perform an analysis of population dynamics. We also estimate the future cost of pension policies based on the future population structure of the artificial society. The proposed framework and model demonstrate how ALife techniques can be used by researchers in relation to social issues and policies.
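
    A minimal, hypothetical sketch of the hybrid idea described above: agents draw demographic transitions from data-calibrated rate tables (the microsimulation feature), while births arise from pairwise agent interaction (the agent-based feature). All rates and rules below are invented placeholders, not values or logic from the Korean census model.

    ```python
    import random

    # Placeholder age-specific annual rates standing in for rates that would be
    # estimated from census micro-data. All numbers are illustrative only.
    DEATH_RATE = {"0-19": 0.001, "20-64": 0.004, "65+": 0.05}
    BIRTH_RATE = {"20-64": 0.06}

    def age_group(age):
        return "0-19" if age < 20 else "20-64" if age < 65 else "65+"

    def step(population):
        """Advance the artificial society by one year."""
        survivors = []
        for agent in population:
            agent["age"] += 1
            if random.random() >= DEATH_RATE[age_group(agent["age"])]:
                survivors.append(agent)
        # Agent interaction: births require a pair of adult agents; this toy
        # pairing rule is a stand-in for partnership formation.
        adults = [a for a in survivors if age_group(a["age"]) == "20-64"]
        random.shuffle(adults)
        newborns = []
        for _parent_a, _parent_b in zip(adults[::2], adults[1::2]):
            if random.random() < BIRTH_RATE["20-64"]:
                newborns.append({"age": 0})
        return survivors + newborns

    population = [{"age": random.randint(0, 90)} for _ in range(10_000)]
    for year in range(10):
        population = step(population)
    print("population after 10 years:", len(population))
    ```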

  4. Universal Adiabatic Quantum Computing using Double Quantum Dot Charge Qubits

    NASA Astrophysics Data System (ADS)

    Ryan-Anderson, Ciaran; Jacobson, N. Tobias; Landahl, Andrew

    Adiabatic quantum computation (AQC) provides one path to achieving universal quantum computing in experiment. Computation in the AQC model occurs by starting with an easy-to-prepare ground state of some simple Hamiltonian and then adiabatically evolving the Hamiltonian to obtain the ground state of a final, more complex Hamiltonian. It has been shown that the circuit model can be mapped to AQC Hamiltonians and, thus, AQC can be made universal. Further, these Hamiltonians can be made planar and two-local. We propose using double quantum dot charge qubits (DQDs) to implement such universal AQC Hamiltonians. However, the geometry and restricted set of interactions of DQDs make the application of even these 2-local planar Hamiltonians non-trivial. We present a construction tailored to DQDs to overcome the geometric and interaction constraints and allow for universal AQC. These constraints are dealt with in this construction by making use of perturbation gadgets, which introduce ancillary qubits to mediate interactions. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
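
    A small numpy sketch of the generic AQC interpolation H(s) = (1 - s) H_init + s H_final for a toy two-qubit problem, tracking the spectral gap that determines how slowly the sweep must proceed; this is only the textbook AQC schedule under assumed toy Hamiltonians, not the DQD construction or the perturbation gadgets discussed above.

    ```python
    import numpy as np

    # Pauli matrices and the single-qubit identity.
    X = np.array([[0, 1], [1, 0]], dtype=float)
    Z = np.array([[1, 0], [0, -1]], dtype=float)
    I2 = np.eye(2)

    # Initial Hamiltonian: transverse field, with an easy product ground state.
    H_init = -(np.kron(X, I2) + np.kron(I2, X))
    # Final Hamiltonian: a small classical Ising cost function (toy problem).
    H_final = -np.kron(Z, Z) + 0.5 * np.kron(Z, I2)

    # Adiabatic interpolation H(s) = (1 - s) H_init + s H_final; the required
    # runtime grows as the inverse square of the minimum gap along the path.
    gaps = []
    for s in np.linspace(0.0, 1.0, 201):
        evals = np.linalg.eigvalsh((1 - s) * H_init + s * H_final)
        gaps.append(evals[1] - evals[0])
    print(f"minimum spectral gap along the sweep: {min(gaps):.4f}")
    ```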

  5. A heterogeneous system based on GPU and multi-core CPU for real-time fluid and rigid body simulation

    NASA Astrophysics Data System (ADS)

    da Silva Junior, José Ricardo; Gonzalez Clua, Esteban W.; Montenegro, Anselmo; Lage, Marcos; Dreux, Marcelo de Andrade; Joselli, Mark; Pagliosa, Paulo A.; Kuryla, Christine Lucille

    2012-03-01

    Computational fluid dynamics in simulation has become an important field not only for physics and engineering areas but also for simulation, computer graphics, virtual reality and even video game development. Many efficient models have been developed over the years, but when many contact interactions must be processed, most models present difficulties or cannot achieve real-time results when executed. The advent of parallel computing has enabled the development of many strategies for accelerating the simulations. Our work proposes a new system which uses some successful algorithms already proposed, as well as a data structure organisation based on a heterogeneous architecture using CPUs and GPUs, in order to process the simulation of the interaction of fluids and rigid bodies. This successfully results in a two-way interaction between them and their surrounding objects. As far as we know, this is the first work that presents a computational collaborative environment which makes use of two different paradigms of hardware architecture for this specific kind of problem. Since our method achieves real-time results, it is suitable for virtual reality, simulation and video game fluid simulation problems.

  6. An efficient approach to the analysis of rail surface irregularities accounting for dynamic train-track interaction and inelastic deformations

    NASA Astrophysics Data System (ADS)

    Andersson, Robin; Torstensson, Peter T.; Kabo, Elena; Larsson, Fredrik

    2015-11-01

    A two-dimensional computational model for assessment of rolling contact fatigue induced by discrete rail surface irregularities, especially in the context of so-called squats, is presented. Dynamic excitation in a wide frequency range is considered in computationally efficient time-domain simulations of high-frequency dynamic vehicle-track interaction accounting for transient non-Hertzian wheel-rail contact. Results from dynamic simulations are mapped onto a finite element model to resolve the cyclic, elastoplastic stress response in the rail. Ratcheting under multiple wheel passages is quantified. In addition, low cycle fatigue impact is quantified using the Jiang-Sehitoglu fatigue parameter. The functionality of the model is demonstrated by numerical examples.

  7. The human factors of workstation telepresence

    NASA Technical Reports Server (NTRS)

    Smith, Thomas J.; Smith, Karl U.

    1990-01-01

    The term workstation telepresence has been introduced to describe human-telerobot compliance, which enables the human operator to effectively project his/her body image and behavioral skills to control of the telerobot itself. Major human-factors considerations for establishing high fidelity workstation telepresence during human-telerobot operation are discussed. Telerobot workstation telepresence is defined by the proficiency and skill with which the operator is able to control sensory feedback from direct interaction with the workstation itself, and from workstation-mediated interaction with the telerobot. Numerous conditions influencing such control have been identified. This raises the question as to what specific factors most critically influence the realization of high fidelity workstation telepresence. The thesis advanced here is that perturbations in sensory feedback represent a major source of variability in human performance during interactive telerobot operation. Perturbed sensory feedback research over the past three decades has established that spatial transformations or temporal delays in sensory feedback engender substantial decrements in interactive task performance, which training does not completely overcome. A recently developed social cybernetic model of human-computer interaction can be used to guide this approach, based on computer-mediated tracking and control of sensory feedback. How the social cybernetic model can be employed for evaluating the various modes, patterns, and integrations of interpersonal, team, and human-computer interactions which play a central role in workstation telepresence is discussed.

  8. Using affinity capillary electrophoresis and computational models for binding studies of heparinoids with p-selectin and other proteins.

    PubMed

    Mozafari, Mona; Balasupramaniam, Shantheya; Preu, Lutz; El Deeb, Sami; Reiter, Christian G; Wätzig, Hermann

    2017-06-01

    A fast and precise affinity capillary electrophoresis (ACE) method has been developed and applied for the investigation of the binding interactions between P-selectin and heparinoids as potential P-selectin inhibitors in the presence and absence of calcium ions. Furthermore, model proteins and vitronectin were used to appraise the binding behavior of P-selectin. The normalized mobility ratios (ΔR/Rf), which provided information about the binding strength and the overall charge of the protein-ligand complex, were used to evaluate the binding affinities. It was found that P-selectin interacts more strongly with heparinoids in the presence of calcium ions. P-selectin was affected by heparinoids at the concentration of 3 mg/L. In addition, the results of the ACE experiments showed that among other investigated proteins, albumins and vitronectin exhibited strong interactions with heparinoids. Especially with P-selectin and vitronectin, the interaction may additionally induce conformational changes. Subsequently, computational models were applied to interpret the ACE experiments. Docking experiments explained that the binding of heparinoids on P-selectin is promoted by calcium ions. These docking models proved to be particularly well suited to investigate the interaction of charged compounds, and are therefore complementary to ACE experiments. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Interactive Visualization of Infrared Spectral Data: Synergy of Computation, Visualization, and Experiment for Learning Spectroscopy

    NASA Astrophysics Data System (ADS)

    Lahti, Paul M.; Motyka, Eric J.; Lancashire, Robert J.

    2000-05-01

    A straightforward procedure is described to combine computation of molecular vibrational modes using commonly available molecular modeling programs with visualization of the modes using advanced features of the MDL Information Systems Inc. Chime World Wide Web browser plug-in. Minor editing of experimental spectra that are stored in the JCAMP-DX format allows linkage of IR spectral frequency ranges to Chime molecular display windows. The spectra and animation files can be combined by Hypertext Markup Language programming to allow interactive linkage between experimental spectra and computationally generated vibrational displays. Both the spectra and the molecular displays can be interactively manipulated to allow the user maximum control of the objects being viewed. This procedure should be very valuable not only for aiding students through visual linkage of spectra and various vibrational animations, but also by assisting them in learning the advantages and limitations of computational chemistry by comparison to experiment.

  10. Evaluation of synthetic linear motor-molecule actuation energetics

    PubMed Central

    Brough, Branden; Northrop, Brian H.; Schmidt, Jacob J.; Tseng, Hsian-Rong; Houk, Kendall N.; Stoddart, J. Fraser; Ho, Chih-Ming

    2006-01-01

    By applying atomic force microscope (AFM)-based force spectroscopy together with computational modeling in the form of molecular force-field simulations, we have determined quantitatively the actuation energetics of a synthetic motor-molecule. This multidisciplinary approach was performed on specifically designed, bistable, redox-controllable [2]rotaxanes to probe the steric and electrostatic interactions that dictate their mechanical switching at the single-molecule level. The fusion of experimental force spectroscopy and theoretical computational modeling has revealed that the repulsive electrostatic interaction, which is responsible for the molecular actuation, is as high as 65 kcal·mol⁻¹, a result that is supported by ab initio calculations. PMID:16735470

  11. AMP and adenosine are both ligands for adenosine 2B receptor signaling.

    PubMed

    Holien, Jessica K; Seibt, Benjamin; Roberts, Veena; Salvaris, Evelyn; Parker, Michael W; Cowan, Peter J; Dwyer, Karen M

    2018-01-15

    Adenosine is considered the canonical ligand for the adenosine 2B receptor (A2BR). A2BR is upregulated following kidney ischemia augmenting post ischemic blood flow and limiting tubular injury. In this context the beneficial effect of A2BR signaling has been attributed to an increase in the pericellular concentration of adenosine. However, following renal ischemia both kidney adenosine monophosphate (AMP) and adenosine levels are substantially increased. Using computational modeling and calcium mobilization assays, we investigated whether AMP could also be a ligand for A2BR. The computational modeling suggested that AMP interacts with more favorable energy to A2BR compared with adenosine. Furthermore, AMPαS, a non-hydrolyzable form of AMP, increased calcium uptake by Chinese hamster ovary (CHO) cells expressing the human A2BR, indicating preferential signaling via the Gq pathway. Therefore, a putative AMP-A2BR interaction is supported by the computational modeling data and the biological results suggest this interaction involves preferential Gq activation. These data provide further insights into the role of purinergic signaling in the pathophysiology of renal IRI. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Computational structure analysis of biomacromolecule complexes by interface geometry.

    PubMed

    Mahdavi, Sedigheh; Salehzadeh-Yazdi, Ali; Mohades, Ali; Masoudi-Nejad, Ali

    2013-12-01

    The ability to analyze and compare protein-nucleic acid and protein-protein interaction interfaces is of critical importance in understanding the biological function and essential processes occurring in the cells. Since high-resolution three-dimensional (3D) structures of biomacromolecule complexes are available, computational characterization of the interface geometry has become an important research topic in the field of molecular biology. In this study, the interfaces of a set of 180 protein-nucleic acid and protein-protein complexes are computed to understand the principles of their interactions. The weighted Voronoi diagram of the atoms and the Alpha complex has provided an accurate description of the interface atoms. Our method is implemented in the presence and absence of water molecules. A comparison among the three types of interaction interfaces shows that RNA-protein complexes have the largest interfaces. The results show a high correlation coefficient between our method and the PISA server in the presence and absence of water molecules in the Voronoi model and the traditional model based on solvent accessibility and the high validation parameters in comparison to the classical model. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Validation of a 3D computational fluid-structure interaction model simulating flow through an elastic aperture.

    PubMed

    Quaini, A; Canic, S; Glowinski, R; Igo, S; Hartley, C J; Zoghbi, W; Little, S

    2012-01-10

    This work presents a validation of a fluid-structure interaction computational model simulating the flow conditions in an in vitro mock heart chamber modeling mitral valve regurgitation during the ejection phase, during which the trans-valvular pressure drop and valve displacement are not as large. The mock heart chamber was developed to study the use of 2D and 3D color Doppler techniques in imaging the clinically relevant complex intra-cardiac flow events associated with mitral regurgitation. Computational models are expected to play an important role in supporting, refining, and reinforcing the emerging 3D echocardiographic applications. We have developed a 3D computational fluid-structure interaction algorithm based on a semi-implicit, monolithic method, combined with an arbitrary Lagrangian-Eulerian approach to capture the fluid domain motion. The mock regurgitant mitral valve, corresponding to an elastic plate with a geometric orifice, was modeled using 3D elasticity, while the blood flow was modeled using the 3D Navier-Stokes equations for an incompressible, viscous fluid. The two are coupled via the kinematic and dynamic conditions describing the two-way coupling. The pressure, the flow rate, and orifice plate displacement were measured and compared with numerical simulation results. An in-line flow meter was used to measure the flow, pressure transducers were used to measure the pressure, and a Doppler method developed by one of the authors was used to measure the axial displacement of the orifice plate. The maximum recorded difference between experiment and numerical simulation for the flow rate was 4%, the pressure 3.6%, and for the orifice displacement 15%, showing excellent agreement between the two. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

    The IDEAS computer program of NASA is a tool for interactive preliminary design and analysis of LSS (Large Space System). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  15. An eLearning Standard Approach for Supporting PBL in Computer Engineering

    ERIC Educational Resources Information Center

    Garcia-Robles, R.; Diaz-del-Rio, F.; Vicente-Diaz, S.; Linares-Barranco, A.

    2009-01-01

    Problem-based learning (PBL) has proved to be a highly successful pedagogical model in many fields, although it is not that common in computer engineering. PBL goes beyond the typical teaching methodology by promoting student interaction. This paper presents a PBL trial applied to a course in a computer engineering degree at the University of…

  16. Numerical Optimization Using Desktop Computers

    DTIC Science & Technology

    1980-09-11

    concentrating compound parabolic trough solar collector. Thermophysical, geophysical, optical and economic analyses were used to compute a life-cycle...third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical...concentrating compound parabolic trough Solar Collector. C. OBJECTIVE: The objective of this thesis was to develop a system of interactive programs for the Hewlett

  17. PEO Integration Acronym Book

    DTIC Science & Technology

    2011-02-01

    Command CASE Computer Aided Software Engineering CASEVAC Casualty Evacuation CASTFOREM Combined Arms And Support Task Force Evaluation Model CAT Center For Advanced Technologies CAT Civil Affairs Team CAT Combined Arms Training CAT Crew Integration CAT Crisis Action Team CATIA Computer-Aided Three-Dimensional Interactive Application CATOX Catalytic Oxidation CATS Combined Arms Training Strategy CATT Combined Arms Tactical Trainer CATT Computer

  18. Assessing the Purpose and Importance University Students Attribute to Current ICT Applications

    ERIC Educational Resources Information Center

    DiGiuseppe, Maurice; Partosoedarso, Elita

    2014-01-01

    In this study we surveyed students in a mid-sized university in Ontario, Canada to explore various aspects associated with their use of computer-based applications. For the purpose of analysis, the computer applications under study were categorized according to the Human-Computer-Human Interaction (HCHI) model of Desjardins (2005) in which…

  19. A Novel Numerical Approach for Generation and Propagation of Rotor-Stator Interaction Noise

    NASA Astrophysics Data System (ADS)

    Patel, Krishna

    As turbofan engine designs move towards bypass ratios ≥12 and corresponding low pressure ratios, fan rotor blade tip Mach numbers are reduced, leading to rotor-stator interaction becoming an important contributor to tonal fan noise. For future aircraft configurations employing boundary layer ingestion, non-uniform flow enters the fan. The impact of such non-uniform flows on the generation and propagation of rotor-stator interaction tones has yet to be assessed. In this thesis, a novel approach is proposed to numerically predict the generation and propagation of rotor-stator interaction noise with distorted inflow. The approach enables a 42% reduction in computational cost compared to traditional approaches employing a sliding interface between the rotor and stator. Such an interface may distort rotor wakes and can cause non-physical acoustic wave reflections if time steps are not sufficiently small. Computational costs are reduced by modelling the rotor using distributed, volumetric body forces. This eliminates the need for a sliding interface and thus allows a larger time step size. The force model responds to local flow conditions and thus can capture the effects of long-wavelength flow distortions. Since interaction noise is generated by the incidence of the rotor wakes onto the stator vanes, the key challenge is to produce the wakes using a body force field since the rotor blades are not directly modelled. It is shown that such an approach can produce wakes by concentrating the viscous forces along streamtubes in the last 15% chord. The new approach to rotor wake generation is assessed on the GE R4 fan from NASA's Source Diagnostic Test, for which the computed overall aerodynamic performance matches the experiment to within 1%. The rotor blade wakes are generated with widths in excellent agreement and depths in fair agreement with the experiment. An assessment of modal sound power levels computed in the exhaust duct indicates that this approach can be used for predicting downstream propagating interaction noise.

  20. A DGS Gesture Dictionary for Modelling on Mobile Devices

    ERIC Educational Resources Information Center

    Isotani, Seiji; Reis, Helena M.; Alvares, Danilo; Brandão, Anarosa A. F.; Brandão, Leônidas O.

    2018-01-01

    Interactive or Dynamic Geometry System (DGS) is a tool that help to teach and learn geometry using a computer-based interactive environment. Traditionally, the interaction with DGS is based on keyboard and mouse events where the functionalities are accessed using a menu of icons. Nevertheless, recent findings suggest that such a traditional model…

  1. Some Technical Implications of Distributed Cognition on the Design of Interactive Learning Environments.

    ERIC Educational Resources Information Center

    Dillenbourg, Pierre

    1996-01-01

    Maintains that diagnosis, explanation, and tutoring, the functions of an interactive learning environment, are collaborative processes. Examines how human-computer interaction can be improved using a distributed cognition framework. Discusses situational and distributed knowledge theories and provides a model on how they can be used to redesign…

  2. Multipartite interacting scalar dark matter in the light of updated LUX data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Subhaditya; Ghosh, Purusottam; Poulose, Poulose, E-mail: subhab@iitg.ernet.in, E-mail: p.ghosh@iitg.ernet.in, E-mail: poulose@iitg.ernet.in

    2017-04-01

    We explore constraints on a multipartite dark matter (DM) framework composed of singlet scalar DM interacting with the Standard Model (SM) through Higgs portal coupling. We compute relic density and direct search constraints including the updated LUX bound for a two component scenario with non-zero interactions between two DM components in the Z₂ × Z₂′ framework in comparison with the one having O(2) symmetry. We point out the availability of a significantly large region of parameter space of such a multipartite model with DM-DM interactions.

  3. Long-ranged contributions to solvation free energies from theory and short-ranged models

    PubMed Central

    Remsing, Richard C.; Liu, Shule; Weeks, John D.

    2016-01-01

    Long-standing problems associated with long-ranged electrostatic interactions have plagued theory and simulation alike. Traditional lattice sum (Ewald-like) treatments of Coulomb interactions add significant overhead to computer simulations and can produce artifacts from spurious interactions between simulation cell images. These subtle issues become particularly apparent when estimating thermodynamic quantities, such as free energies of solvation in charged and polar systems, to which long-ranged Coulomb interactions typically make a large contribution. In this paper, we develop a framework for determining very accurate solvation free energies of systems with long-ranged interactions from models that interact with purely short-ranged potentials. Our approach is generally applicable and can be combined with existing computational and theoretical techniques for estimating solvation thermodynamics. We demonstrate the utility of our approach by examining the hydration thermodynamics of hydrophobic and ionic solutes and the solvation of a large, highly charged colloid that exhibits overcharging, a complex nonlinear electrostatic phenomenon whereby counterions from the solvent effectively overscreen and locally invert the integrated charge of the solvated object. PMID:26929375
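
    One common way to realize the separation described above, shown here only as a generic illustration with a smoothing length σ (not necessarily the authors' notation), is to split the Coulomb potential into a rapidly decaying short-ranged piece that is simulated explicitly and a slowly varying long-ranged remainder treated analytically or at a mean-field level:

    ```latex
    \frac{1}{r} \;=\; \underbrace{\frac{\operatorname{erfc}(r/\sigma)}{r}}_{\text{short-ranged, simulated explicitly}}
    \;+\; \underbrace{\frac{\operatorname{erf}(r/\sigma)}{r}}_{\text{long-ranged, slowly varying}}
    ```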

  4. Computational and mathematical methods in brain atlasing.

    PubMed

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  5. Virtual Interactive Musculoskeletal System (VIMS) in orthopaedic research, education and clinical patient care.

    PubMed

    Chao, Edmund Y S; Armiger, Robert S; Yoshida, Hiroaki; Lim, Jonathan; Haraguchi, Naoki

    2007-03-08

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the "Virtual Human" reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of these unique database and simulation technology. This integrated system, model library and database will impact on orthopaedic education, basic research, device development and application, and clinical patient care related to musculoskeletal joint system reconstruction, trauma management, and rehabilitation.

  6. Virtual interactive musculoskeletal system (VIMS) in orthopaedic research, education and clinical patient care

    PubMed Central

    Chao, Edmund YS; Armiger, Robert S; Yoshida, Hiroaki; Lim, Jonathan; Haraguchi, Naoki

    2007-01-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the "Virtual Human" reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of these unique database and simulation technology. This integrated system, model library and database will impact on orthopaedic education, basic research, device development and application, and clinical patient care related to musculoskeletal joint system reconstruction, trauma management, and rehabilitation. PMID:17343764

  7. Simulation of flexible appendage interactions with Mariner Venus/Mercury attitude control and science platform pointing

    NASA Technical Reports Server (NTRS)

    Fleischer, G. E.

    1973-01-01

    A new computer subroutine, which solves the attitude equations of motion for any vehicle idealized as a topological tree of hinge-connected rigid bodies, is used to simulate and analyze science instrument pointing control interaction with a flexible Mariner Venus/Mercury (MVM) spacecraft. The subroutine's user options include linearized or partially linearized hinge-connected models whose computational advantages are demonstrated for the MVM problem. Results of the pointing control/flexible vehicle interaction simulations, including imaging experiment pointing accuracy predictions and implications for MVM science sequence planning, are described in detail.

  8. Fluid-structure interaction including volumetric coupling with homogenised subdomains for modeling respiratory mechanics.

    PubMed

    Yoshihara, Lena; Roth, Christian J; Wall, Wolfgang A

    2017-04-01

    In this article, a novel approach is presented for combining standard fluid-structure interaction with additional volumetric constraints to model fluid flow into and from homogenised solid domains. The proposed algorithm is particularly interesting for investigations in the field of respiratory mechanics as it enables the mutual coupling of airflow in the conducting part and local tissue deformation in the respiratory part of the lung by means of a volume constraint. In combination with a classical monolithic fluid-structure interaction approach, a comprehensive model of the human lung can be established that will be useful to gain new insights into respiratory mechanics in health and disease. To illustrate the validity and versatility of the novel approach, three numerical examples including a patient-specific lung model are presented. The proposed algorithm proves its capability of computing clinically relevant airflow distribution and tissue strain data at a level of detail that is not yet achievable, neither with current imaging techniques nor with existing computational models. Copyright © 2016 John Wiley & Sons, Ltd.

  9. 3D Fluid-Structure Interaction Simulation of Aortic Valves Using a Unified Continuum ALE FEM Model.

    PubMed

    Spühler, Jeannette H; Jansson, Johan; Jansson, Niclas; Hoffman, Johan

    2018-01-01

    Due to advances in medical imaging, computational fluid dynamics algorithms and high performance computing, computer simulation is developing into an important tool for understanding the relationship between cardiovascular diseases and intraventricular blood flow. The field of cardiac flow simulation is challenging and highly interdisciplinary. We apply a computational framework for automated solutions of partial differential equations using Finite Element Methods where any mathematical description directly can be translated to code. This allows us to develop a cardiac model where specific properties of the heart such as fluid-structure interaction of the aortic valve can be added in a modular way without extensive efforts. In previous work, we simulated the blood flow in the left ventricle of the heart. In this paper, we extend this model by placing prototypes of both a native and a mechanical aortic valve in the outflow region of the left ventricle. Numerical simulation of the blood flow in the vicinity of the valve offers the possibility to improve the treatment of aortic valve diseases as aortic stenosis (narrowing of the valve opening) or regurgitation (leaking) and to optimize the design of prosthetic heart valves in a controlled and specific way. The fluid-structure interaction and contact problem are formulated in a unified continuum model using the conservation laws for mass and momentum and a phase function. The discretization is based on an Arbitrary Lagrangian-Eulerian space-time finite element method with streamline diffusion stabilization, and it is implemented in the open source software Unicorn which shows near optimal scaling up to thousands of cores. Computational results are presented to demonstrate the capability of our framework.

  10. 3D Fluid-Structure Interaction Simulation of Aortic Valves Using a Unified Continuum ALE FEM Model

    PubMed Central

    Spühler, Jeannette H.; Jansson, Johan; Jansson, Niclas; Hoffman, Johan

    2018-01-01

    Due to advances in medical imaging, computational fluid dynamics algorithms and high performance computing, computer simulation is developing into an important tool for understanding the relationship between cardiovascular diseases and intraventricular blood flow. The field of cardiac flow simulation is challenging and highly interdisciplinary. We apply a computational framework for automated solutions of partial differential equations using Finite Element Methods where any mathematical description directly can be translated to code. This allows us to develop a cardiac model where specific properties of the heart such as fluid-structure interaction of the aortic valve can be added in a modular way without extensive efforts. In previous work, we simulated the blood flow in the left ventricle of the heart. In this paper, we extend this model by placing prototypes of both a native and a mechanical aortic valve in the outflow region of the left ventricle. Numerical simulation of the blood flow in the vicinity of the valve offers the possibility to improve the treatment of aortic valve diseases as aortic stenosis (narrowing of the valve opening) or regurgitation (leaking) and to optimize the design of prosthetic heart valves in a controlled and specific way. The fluid-structure interaction and contact problem are formulated in a unified continuum model using the conservation laws for mass and momentum and a phase function. The discretization is based on an Arbitrary Lagrangian-Eulerian space-time finite element method with streamline diffusion stabilization, and it is implemented in the open source software Unicorn which shows near optimal scaling up to thousands of cores. Computational results are presented to demonstrate the capability of our framework. PMID:29713288

  11. A computer program for modeling non-spherical eclipsing binary star systems

    NASA Technical Reports Server (NTRS)

    Wood, D. B.

    1972-01-01

    The accurate analysis of eclipsing binary light curves is fundamental to obtaining information on the physical properties of stars. The model described accounts for the important geometric and photometric distortions such as rotational and tidal distortion, gravity brightening, and reflection effect. This permits a more accurate analysis of interacting eclipsing star systems. The model is designed to be useful to anyone with moderate computing resources. The programs, written in FORTRAN 4 for the IBM 360, consume about 80k bytes of core. The FORTRAN program listings are provided, and the computational aspects are described in some detail.

  12. A model of transverse fuel injection applied to the computation of supersonic combustor flow

    NASA Technical Reports Server (NTRS)

    Rogers, R. C.

    1979-01-01

    A two-dimensional, nonreacting flow model of the aerodynamic interaction of a transverse hydrogen jet within a supersonic mainstream has been developed. The model assumes profile shapes of mass flux, pressure, flow angle, and hydrogen concentration and produces downstream profiles of the other flow parameters under the constraints of the integrated conservation equations. These profiles are used as starting conditions for an existing finite difference parabolic computer code for the turbulent supersonic combustion of hydrogen. Integrated mixing and flow profile results obtained from the computer code compare favorably with existing data for the supersonic combustion of hydrogen.

  13. Dynamic Interaction of Long Suspension Bridges with Running Trains

    NASA Astrophysics Data System (ADS)

    XIA, H.; XU, Y. L.; CHAN, T. H. T.

    2000-10-01

    This paper presents an investigation of dynamic interaction of long suspension bridges with running trains. A three-dimensional finite element model is used to represent a long suspension bridge. Each 4-axle vehicle in a train is modelled by a 27-degrees-of-freedom dynamic system. The dynamic interaction between the bridge and train is realized through the contact forces between the wheels and track. By applying a mode superposition technique to the bridge only and taking the measured track irregularities as known quantities, the number of degrees of freedom (d.o.f.) of the bridge-train system is significantly reduced and the coupled equations of motion are efficiently solved. The proposed formulation and the associated computer program are then applied to a real long suspension bridge carrying a railway within the bridge deck. The dynamic response of the bridge-train system and the derail and offload factors related to the running safety of the train are computed. The results show that the formulation presented in this paper can well predict dynamic behaviors of both bridge and train with reasonable computation efforts. Dynamic interaction between the long suspension bridge and train is not significant.

  14. Development of hybrid computer plasma models for different pressure regimes

    NASA Astrophysics Data System (ADS)

    Hromadka, Jakub; Ibehej, Tomas; Hrach, Rudolf

    2016-09-01

    With the increased performance of contemporary computers during the last decades, numerical simulations have become a very powerful tool, applicable also in plasma physics research. Plasma is generally an ensemble of mutually interacting particles that is out of thermodynamic equilibrium, and for this reason fluid computer plasma models give results with only limited accuracy. On the other hand, much more precise particle models are often limited to 2D problems because of their huge demands on computer resources. Our contribution is devoted to hybrid modelling techniques that combine the advantages of both approaches, particularly to their so-called iterative version. The study is focused on the mutual relations between fluid and particle models, demonstrated by calculations of the sheath structure of a low-temperature argon plasma near a cylindrical Langmuir probe for medium and higher pressures. Results of a simple iterative hybrid plasma computer model are also given. The authors acknowledge the support of the Grant Agency of Charles University in Prague (project 220215).

  15. Biocellion: accelerating computer simulation of multicellular biological system models.

    PubMed

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
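
    Biocellion users supply model specifics by filling in pre-defined model routines in the framework's own interface; independently of that interface, the toy Python sketch below illustrates the kind of discrete agent-based rule such frameworks execute, using cell sorting by differential adhesion (one of the case studies above) on a one-dimensional lattice. The contact energies and the zero-temperature swap rule are invented for illustration and are unrelated to Biocellion's actual API.

    ```python
    import random

    # Toy cell-sorting model: two cell types ("A", "B") swap with a random
    # neighbour if the swap does not raise a simple adhesion energy.
    # Contact energies are illustrative placeholders only.
    CONTACT = {("A", "A"): -2.0, ("B", "B"): -2.0, ("A", "B"): -1.0, ("B", "A"): -1.0}

    def energy(cells):
        return sum(CONTACT[(cells[i], cells[i + 1])] for i in range(len(cells) - 1))

    cells = ["A", "B"] * 50
    random.shuffle(cells)

    for _ in range(20_000):
        i = random.randrange(len(cells) - 1)
        before = energy(cells)
        cells[i], cells[i + 1] = cells[i + 1], cells[i]   # trial swap
        if energy(cells) > before:                        # reject if energy rises
            cells[i], cells[i + 1] = cells[i + 1], cells[i]

    print("".join(cells))  # like cells cluster together as the system relaxes
    ```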

  16. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  17. Simulation of hypersonic shock wave - laminar boundary layer interactions

    NASA Astrophysics Data System (ADS)

    Kianvashrad, N.; Knight, D.

    2017-06-01

    The capability of the Navier-Stokes equations with a perfect gas model for simulation of hypersonic shock wave - laminar boundary layer interactions is assessed. The configuration is a hollow cylinder flare. The experimental data were obtained by Calspan-University of Buffalo (CUBRC) for total enthalpies ranging from 5.07 to 21.85 MJ/kg. Comparison of the computed and experimental surface pressure and heat transfer is performed and the computed flowfield structure is analyzed.

  18. Drug-Target Interaction Prediction through Label Propagation with Linear Neighborhood Information.

    PubMed

    Zhang, Wen; Chen, Yanlin; Li, Dingfang

    2017-11-25

    Interactions between drugs and target proteins provide important information for drug discovery. Currently, experiments have identified only a small number of drug-target interactions. Therefore, the development of computational methods for drug-target interaction prediction is an urgent task of theoretical interest and practical significance. In this paper, we propose a label propagation method with linear neighborhood information (LPLNI) for predicting unobserved drug-target interactions. Firstly, we calculate drug-drug linear neighborhood similarity in the feature spaces, by considering how to reconstruct data points from neighbors. Then, we take similarities as the manifold of drugs, and assume the manifold unchanged in the interaction space. At last, we predict unobserved interactions between known drugs and targets by using drug-drug linear neighborhood similarity and known drug-target interactions. The experiments show that LPLNI can utilize only known drug-target interactions to make high-accuracy predictions on four benchmark datasets. Furthermore, we consider incorporating chemical structures into LPLNI models. Experimental results demonstrate that the model with integrated information (LPLNI-II) can produce improved performances, better than other state-of-the-art methods. The known drug-target interactions are an important information source for computational predictions. The usefulness of the proposed method is demonstrated by cross validation and the case study.
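
    A hedged numpy sketch of the two ingredients described above: LLE-style reconstruction weights over nearest neighbours standing in for the drug-drug linear neighborhood similarity, followed by label propagation F = (1 - α)(I - αW)⁻¹Y. The non-negativity constraint on the weights, the chemical-structure integration (LPLNI-II), and all data below are omitted or invented; this is not the authors' implementation.

    ```python
    import numpy as np

    def neighborhood_weights(X, k=3, reg=1e-3):
        """LLE-style weights: reconstruct each drug's feature vector from its
        k nearest neighbours; the weights serve as a drug-drug similarity W."""
        n = X.shape[0]
        W = np.zeros((n, n))
        for i in range(n):
            dists = np.linalg.norm(X - X[i], axis=1)
            nbrs = np.argsort(dists)[1:k + 1]
            Z = X[nbrs] - X[i]                      # centred neighbours
            G = Z @ Z.T + reg * np.eye(k)           # regularised local Gram matrix
            w = np.linalg.solve(G, np.ones(k))
            W[i, nbrs] = w / w.sum()                # weights sum to one
        return W

    def label_propagation(W, Y, alpha=0.5):
        """Propagate known drug-target interactions Y over the drug manifold."""
        n = W.shape[0]
        return (1 - alpha) * np.linalg.solve(np.eye(n) - alpha * W, Y)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 8))                    # toy drug feature vectors
    Y = (rng.random((30, 5)) < 0.1).astype(float)   # toy known interaction matrix
    scores = label_propagation(neighborhood_weights(X), Y)
    print(scores.shape)                             # predicted interaction scores
    ```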

  19. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  20. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
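
    A minimal sketch of a first-order (memoryless) birth/death process of the kind the program models: the next population size depends only on the current one. The per-capita rates are placeholders, and the original program's interactive and error-checking features are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Placeholder per-capita probabilities of a birth or a death in one time step.
    P_BIRTH, P_DEATH = 0.02, 0.015

    n = 1000
    history = [n]
    for _ in range(200):
        births = rng.binomial(n, P_BIRTH)
        deaths = rng.binomial(n, P_DEATH)
        n = max(n + births - deaths, 0)
        history.append(n)

    print("final population:", history[-1])
    ```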

  1. Sharp Interface Tracking in Rotating Microflows of Solvent Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glimm, James; Almeida, Valmor de; Jiao, Xiangmin

    2013-01-08

    The objective of this project is to develop a specialized sharp interface tracking simulation capability for predicting interaction of micron-sized drops and bubbles in rotating flows relevant to optimized design of contactor devices used in solvent extraction processes of spent nuclear fuel reprocessing. The primary outcomes of this project include the capability to resolve drops and bubbles micro-hydrodynamics in solvent extraction contactors, determining from first principles continuum fluid mechanics how micro-drops and bubbles interact with each other and the surrounding shearing fluid for realistic flows. In the near term, this effort will play a central role in providing parameters and insight into the flow dynamics of models that average over coarser scales, say at the millimeter unit length. In the longer term, it will prove to be the platform to conduct full-device, detailed simulations as parallel computing power reaches the exaflop level. The team will develop an accurate simulation tool for flows containing interacting droplets and bubbles with sharp interfaces under conditions that mimic those found in realistic contactor operations. The main objective is to create an off-line simulation capability to model drop and bubble interactions in a domain representative of the averaged length scale. The technical approach is to combine robust interface tracking software, subgrid modeling, validation quality experiments, powerful computational hardware, and a team with simulation modeling, physical modeling and technology integration experience. Simulations will then fully resolve the microflow of drops and bubbles at the microsecond time scale. This approach is computationally intensive but very accurate in treating important coupled physical phenomena in the vicinity of interfaces. The method makes it possible to resolve spatial scales smaller than the typical distance between bubbles and to model some non-equilibrium thermodynamic features such as finite critical tension in cavitating liquids.

  2. Sentence-Based Attentional Mechanisms in Word Learning: Evidence from a Computational Model

    PubMed Central

    Alishahi, Afra; Fazly, Afsaneh; Koehne, Judith; Crocker, Matthew W.

    2012-01-01

    When looking for the referents of novel nouns, adults and young children are sensitive to cross-situational statistics (Yu and Smith, 2007; Smith and Yu, 2008). In addition, the linguistic context that a word appears in has been shown to act as a powerful attention mechanism for guiding sentence processing and word learning (Landau and Gleitman, 1985; Altmann and Kamide, 1999; Kako and Trueswell, 2000). Koehne and Crocker (2010, 2011) investigate the interaction between cross-situational evidence and guidance from the sentential context in an adult language learning scenario. Their studies reveal that these learning mechanisms interact in a complex manner: they can be used in a complementary way when context helps reduce referential uncertainty; they influence word learning about equally strongly when cross-situational and contextual evidence are in conflict; and contextual cues block aspects of cross-situational learning when both mechanisms are independently applicable. To address this complex pattern of findings, we present a probabilistic computational model of word learning which extends a previous cross-situational model (Fazly et al., 2010) with an attention mechanism based on sentential cues. Our model uses a framework that seamlessly combines the two sources of evidence in order to study their emerging pattern of interaction during the process of word learning. Simulations of the experiments of (Koehne and Crocker, 2010, 2011) reveal an overall pattern of results that are in line with their findings. Importantly, we demonstrate that our model does not need to explicitly assign priority to either source of evidence in order to produce these results: learning patterns emerge as a result of a probabilistic interaction between the two clue types. Moreover, using a computational model allows us to examine the developmental trajectory of the differential roles of cross-situational and sentential cues in word learning. PMID:22783211
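
    A minimal sketch of the cross-situational ingredient of such models: word-referent co-occurrence counts accumulated over ambiguous trials gradually single out the correct mappings. The trials below are invented, and the sentential-cue mechanism is only hinted at as an optional attention weight; this is not the Fazly et al. model itself.

    ```python
    from collections import defaultdict

    # Each trial pairs an utterance (words) with a scene (candidate referents);
    # the learner never sees which word maps to which referent (toy data).
    trials = [
        (["ball", "dog"], {"BALL", "DOG"}),
        (["ball", "cup"], {"BALL", "CUP"}),
        (["dog", "cup"], {"DOG", "CUP"}),
        (["ball", "dog"], {"BALL", "DOG"}),
    ]

    counts = defaultdict(lambda: defaultdict(float))
    for words, referents in trials:
        for w in words:
            for r in referents:
                # An attention weight from sentential context could scale this
                # increment; here every referent is equally attended (weight 1).
                counts[w][r] += 1.0

    for w, refs in counts.items():
        total = sum(refs.values())
        best = max(refs, key=refs.get)
        print(w, {r: round(c / total, 2) for r, c in refs.items()}, "->", best)
    ```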

  3. Entangled spin chain

    NASA Astrophysics Data System (ADS)

    Salberger, Olof; Korepin, Vladimir

    We introduce a new model of interacting spin 1/2. It describes interactions of three nearest neighbors. The Hamiltonian can be expressed in terms of Fredkin gates. The Fredkin gate (also known as the controlled swap gate) is a computational circuit suitable for reversible computing. Our construction generalizes the model presented by Peter Shor and Ramis Movassagh to half-integer spins. Our model can be solved by means of Catalan combinatorics in the form of random walks on the upper half plane of a square lattice (Dyck walks). Each Dyck path can be mapped on a wave function of spins. The ground state is an equally weighted superposition of Dyck walks (instead of Motzkin walks). We can also express it as a matrix product state. We further construct a model of interacting spins 3/2 and greater half-integer spins. The models with higher spins require coloring of Dyck walks. We construct a SU(k) symmetric model (where k is the number of colors). The leading term of the entanglement entropy is then proportional to the square root of the length of the lattice (like in the Shor-Movassagh model). The gap closes as a high power of the length of the lattice [5, 11].
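
    A small sketch, assuming only the combinatorial description above: it enumerates Dyck walks, reads each as a spin-1/2 configuration, and forms the equal-weight ground-state superposition; the number of basis states is the Catalan number. The colored walks and higher-spin generalization are not covered.

    ```python
    from math import comb, sqrt

    def dyck_paths(n):
        """All balanced walks of n up-steps and n down-steps that never dip
        below the starting height."""
        paths = []
        def extend(path, ups, downs, height):
            if ups == downs == 0:
                paths.append(path)
                return
            if ups:
                extend(path + "U", ups - 1, downs, height + 1)
            if downs and height > 0:
                extend(path + "D", ups, downs - 1, height - 1)
        extend("", n, n, 0)
        return paths

    n = 4
    paths = dyck_paths(n)
    catalan = comb(2 * n, n) // (n + 1)
    assert len(paths) == catalan

    # Ground state: equal-weight superposition over Dyck walks, each walk read
    # as a spin-1/2 configuration (U = spin up, D = spin down).
    amplitude = 1 / sqrt(len(paths))
    ground_state = {p.replace("U", "1").replace("D", "0"): amplitude for p in paths}
    print(len(ground_state), "basis states, amplitude", round(amplitude, 4))
    ```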

  4. Quantum Vertex Model for Reversible Classical Computing

    NASA Astrophysics Data System (ADS)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic with one direction corresponding to computational time, and with transverse boundaries storing the computation's input and output. The model displays no finite temperature phase transitions, including no glass transitions, independent of circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, "annealing with learning", to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.
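
    As a toy illustration of the idea above (a classical circuit encoded in the ground state of a spin energy function, reached by annealing), the sketch below anneals the link bits of a three-gate reversible circuit with clamped input boundary; the lattice geometry, vertex weights, and the "annealing with learning" heuristic of the paper are not reproduced.

    ```python
    import math
    import random

    random.seed(1)

    # Tiny reversible circuit: three CNOT "vertices" acting on bit-carrying links.
    # Each gate contributes zero energy only if out = control XOR target.
    gates = [(0, 1, 4), (2, 3, 5), (0, 5, 6)]   # (control_in, target_in, target_out)
    n_bits = 7
    clamped = {0: 1, 1: 0, 2: 1, 3: 1}          # circuit input (boundary links)

    def energy(bits):
        return sum(0 if bits[out] == bits[t] ^ bits[c] else 1 for c, t, out in gates)

    bits = [clamped.get(i, random.randint(0, 1)) for i in range(n_bits)]
    T = 2.0
    while T > 0.01:
        i = random.choice([j for j in range(n_bits) if j not in clamped])
        old = energy(bits)
        bits[i] ^= 1
        if energy(bits) > old and random.random() > math.exp((old - energy(bits)) / T):
            bits[i] ^= 1                        # Metropolis rejection of the flip
        T *= 0.999                              # cool slowly toward the ground state

    print("energy:", energy(bits), "output bits:", bits[4:])
    ```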

  5. Computationally efficient methods for modelling laser wakefield acceleration in the blowout regime

    NASA Astrophysics Data System (ADS)

    Cowan, B. M.; Kalmykov, S. Y.; Beck, A.; Davoine, X.; Bunkers, K.; Lifschitz, A. F.; Lefebvre, E.; Bruhwiler, D. L.; Shadwick, B. A.; Umstadter, D. P.; Umstadter

    2012-08-01

    Electron self-injection and acceleration until dephasing in the blowout regime is studied for a set of initial conditions typical of recent experiments with 100-terawatt-class lasers. Two different approaches to computationally efficient, fully explicit, 3D particle-in-cell modelling are examined. First, the Cartesian code vorpal (Nieter, C. and Cary, J. R. 2004 VORPAL: a versatile plasma simulation code. J. Comput. Phys. 196, 538) using a perfect-dispersion electromagnetic solver precisely describes the laser pulse and bubble dynamics, taking advantage of coarser resolution in the propagation direction, with a proportionally larger time step. Using third-order splines for macroparticles helps suppress the sampling noise while keeping the usage of computational resources modest. The second way to reduce the simulation load is using reduced-geometry codes. In our case, the quasi-cylindrical code calder-circ (Lifschitz, A. F. et al. 2009 Particle-in-cell modelling of laser-plasma interaction using Fourier decomposition. J. Comput. Phys. 228(5), 1803-1814) uses decomposition of fields and currents into a set of poloidal modes, while the macroparticles move in the Cartesian 3D space. Cylindrical symmetry of the interaction allows using just two modes, reducing the computational load to roughly that of a planar Cartesian simulation while preserving the 3D nature of the interaction. This significant economy of resources allows using fine resolution in the direction of propagation and a small time step, making numerical dispersion vanishingly small, together with a large number of particles per cell, enabling good particle statistics. Quantitative agreement of two simulations indicates that these are free of numerical artefacts. Both approaches thus retrieve the physically correct evolution of the plasma bubble, recovering the intrinsic connection of electron self-injection to the nonlinear optical evolution of the driver.

  6. Aerothermal modeling program, phase 2. Element B: Flow interaction experiment

    NASA Technical Reports Server (NTRS)

    Nikjooy, M.; Mongia, H. C.; Murthy, S. N. B.; Sullivan, J. P.

    1986-01-01

    The design process was improved and the efficiency, life, and maintenance costs of the turbine engine hot section were enhanced. Recently, there has been much emphasis on the need for improved numerical codes for the design of efficient combustors. For the development of improved computational codes, there is a need for an experimentally obtained database to be used as test cases for assessing the accuracy of the computations. The purpose of Element B is to establish benchmark-quality velocity and scalar measurements of the flow interaction of circular jets with swirling flow typical of that in the dome region of an annular combustor. In addition to the detailed experimental effort, extensive computations of the swirling flows are to be compared with the measurements for the purpose of assessing the accuracy of current and advanced turbulence and scalar transport models.

  7. A turbulence model for iced airfoils and its validation

    NASA Technical Reports Server (NTRS)

    Shin, Jaiwon; Chen, Hsun H.; Cebeci, Tuncer

    1992-01-01

    A turbulence model based on the extension of the algebraic eddy viscosity formulation of Cebeci and Smith developed for two dimensional flows over smooth and rough surfaces is described for iced airfoils and validated for computed ice shapes obtained for a range of total temperatures varying from 28 to -15 F. The validation is made with an interactive boundary layer method which uses a panel method to compute the inviscid flow and an inverse finite difference boundary layer method to compute the viscous flow. The interaction between inviscid and viscous flows is established by the use of the Hilbert integral. The calculated drag coefficients compare well with recent experimental data taken at the NASA-Lewis Icing Research Tunnel (IRT) and show that, in general, the drag increase due to ice accretion can be predicted well and efficiently.

  8. A Physics-driven Neural Networks-based Simulation System (PhyNNeSS) for multimodal interactive virtual environments involving nonlinear deformable objects

    PubMed Central

    De, Suvranu; Deo, Dhannanjay; Sankaranarayanan, Ganesh; Arikatla, Venkata S.

    2012-01-01

    Background: While an update rate of 30 Hz is considered adequate for real time graphics, a much higher update rate of about 1 kHz is necessary for haptics. Physics-based modeling of deformable objects, especially when large nonlinear deformations and complex nonlinear material properties are involved, at these very high rates is one of the most challenging tasks in the development of real time simulation systems. While some specialized solutions exist, there is no general solution for arbitrary nonlinearities. Methods: In this work we present PhyNNeSS - a Physics-driven Neural Networks-based Simulation System - to address this long-standing technical challenge. The first step is an off-line pre-computation step in which a database is generated by applying carefully prescribed displacements to each node of the finite element models of the deformable objects. In the next step, the data is condensed into a set of coefficients describing neurons of a Radial Basis Function network (RBFN). During real-time computation, these neural networks are used to reconstruct the deformation fields as well as the interaction forces. Results: We present realistic simulation examples from interactive surgical simulation with real time force feedback. As an example, we have developed a deformable human stomach model and a Penrose-drain model used in the Fundamentals of Laparoscopic Surgery (FLS) training tool box. Conclusions: A unique computational modeling system has been developed that is capable of simulating the response of nonlinear deformable objects in real time. The method distinguishes itself from previous efforts in that a systematic physics-based pre-computational step allows training of neural networks which may be used in real time simulations. We show, through careful error analysis, that the scheme is scalable, with the accuracy being controlled by the number of neurons used in the simulation. PhyNNeSS has been integrated into SoFMIS (Software Framework for Multimodal Interactive Simulation) for general use. PMID:22629108
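    As a rough illustration of the pre-computation/reconstruction split described in the Methods, the sketch below fits a Gaussian radial basis function network to synthetic displacement-force pairs and evaluates it with a single matrix product at query time. The data, centers, and widths are invented for illustration and are not drawn from PhyNNeSS.

```python
# Minimal sketch of the idea behind an RBF-based surrogate (illustrative only):
# fit a radial basis function network to pre-computed displacement -> force
# data and evaluate it cheaply at "run time".
import numpy as np

def rbf_features(x, centers, width):
    # Gaussian RBF activations for each sample in x (n_samples, n_dims).
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)

# Offline "pre-computation": synthetic (displacement, force) pairs standing in
# for the finite element database described in the abstract.
X = rng.uniform(-1.0, 1.0, size=(500, 3))           # prescribed displacements
F = np.sin(X).sum(axis=1, keepdims=True) + 0.5 * X   # surrogate nonlinear response

centers = X[rng.choice(len(X), size=40, replace=False)]
width = 0.5
Phi = rbf_features(X, centers, width)
W, *_ = np.linalg.lstsq(Phi, F, rcond=None)           # train: linear solve

# "Real-time" evaluation: one matrix product per query.
x_query = np.array([[0.1, -0.2, 0.3]])
f_pred = rbf_features(x_query, centers, width) @ W
print(f_pred)
```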

  9. Improving compound-protein interaction prediction by building up highly credible negative samples.

    PubMed

    Liu, Hui; Sun, Jianjiang; Guan, Jihong; Zheng, Jie; Zhou, Shuigeng

    2015-06-15

    Computational prediction of compound-protein interactions (CPIs) is of great importance for drug design and development, as genome-scale experimental validation of CPIs is not only time-consuming but also prohibitively expensive. With the availability of an increasing number of validated interactions, the performance of computational prediction approaches is severely impeded by the lack of reliable negative CPI samples. A systematic method of screening reliable negative samples becomes critical to improving the performance of in silico prediction methods. This article aims at building up a set of highly credible negative samples of CPIs via an in silico screening method. As most existing computational models assume that similar compounds are likely to interact with similar target proteins and achieve remarkable performance, it is rational to identify potential negative samples based on the converse negative proposition that the proteins dissimilar to every known/predicted target of a compound are unlikely to be targeted by the compound, and vice versa. We integrated various resources, including chemical structures, chemical expression profiles and side effects of compounds, amino acid sequences, protein-protein interaction network and functional annotations of proteins, into a systematic screening framework. We first tested the screened negative samples on six classical classifiers, and all these classifiers achieved remarkably higher performance on our negative samples than on randomly generated negative samples for both human and Caenorhabditis elegans. We then verified the negative samples on three existing prediction models, including bipartite local model, Gaussian kernel profile and Bayesian matrix factorization, and found that the performances of these models are also significantly improved on the screened negative samples. Moreover, we validated the screened negative samples on a drug bioactivity dataset. Finally, we derived two sets of new interactions by training a support vector machine classifier on the positive interactions annotated in DrugBank and our screened negative interactions. The screened negative samples and the predicted interactions provide the research community with a useful resource for identifying new drug targets and a helpful supplement to the current curated compound-protein databases. Supplementary files are available at: http://admis.fudan.edu.cn/negative-cpi/. © The Author 2015. Published by Oxford University Press.
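    The converse-proposition screening idea can be sketched in a few lines: for each compound, proteins whose similarity to every known target falls below a threshold are marked as credible negatives. The function name, threshold, and toy matrices below are assumptions for illustration, not the paper's integrated framework.

```python
# Illustrative sketch (names and thresholds are assumptions, not the paper's
# pipeline): screen credible negative compound-protein pairs by keeping only
# proteins that are dissimilar to every known target of a compound.
import numpy as np

def credible_negatives(interactions, protein_sim, sim_threshold=0.3):
    """interactions: binary matrix (n_compounds, n_proteins) of known CPIs.
    protein_sim:  similarity matrix (n_proteins, n_proteins) in [0, 1].
    Returns a boolean matrix marking screened negative pairs."""
    n_compounds, _ = interactions.shape
    negatives = np.zeros_like(interactions, dtype=bool)
    for c in range(n_compounds):
        targets = np.flatnonzero(interactions[c])
        if targets.size == 0:
            continue
        # Max similarity of each protein to any known target of compound c.
        max_sim = protein_sim[:, targets].max(axis=1)
        negatives[c] = (interactions[c] == 0) & (max_sim < sim_threshold)
    return negatives

# Toy example: 2 compounds x 4 proteins.
cpi = np.array([[1, 0, 0, 0],
                [0, 1, 0, 0]])
sim = np.array([[1.0, 0.8, 0.2, 0.1],
                [0.8, 1.0, 0.3, 0.2],
                [0.2, 0.3, 1.0, 0.6],
                [0.1, 0.2, 0.6, 1.0]])
print(credible_negatives(cpi, sim))
```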

  10. Multi-scale modeling in cell biology

    PubMed Central

    Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick

    2009-01-01

    Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808

  11. Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Dupnick, E.; Wiggins, D.

    1980-01-01

    An interactive computer program for automatically generating traffic models for the Space Transportation System (STS) is presented. Information concerning run stream construction, input data, and output data is provided. The flow of the interactive data stream is described. Error messages are specified, along with suggestions for remedial action. In addition, formats and parameter definitions for the payload data set (payload model), feasible combination file, and traffic model are documented.

  12. Interactive Activation Model of Speech Perception.

    DTIC Science & Technology

    1984-11-01

    contract. Elman, J. L., & McClelland, J. L. Speech perception as a cognitive process: The interactive activation model of speech perception. In... attempts to provide a machine solution to the problem of speech perception. A second kind of model, growing out of Cognitive Psychology, attempts to... architectures to cognitive and perceptual problems. We also owe a debt to what we might call the computational connectionists -- those who have applied highly

  13. How Do Tides and Tsunamis Interact in a Highly Energetic Channel? The Case of Canal Chacao, Chile

    NASA Astrophysics Data System (ADS)

    Winckler, Patricio; Sepúlveda, Ignacio; Aron, Felipe; Contreras-López, Manuel

    2017-12-01

    This study aims at understanding the role of tidal level, speed, and direction in tsunami propagation in highly energetic tidal channels. The main goal is to comprehend whether tide-tsunami interactions enhance/reduce elevation, current speeds, and arrival times, when compared to pure tsunami models and to simulations in which tides and tsunamis are linearly superimposed. We designed various numerical experiments to compute the tsunami propagation along Canal Chacao, a highly energetic channel in the Chilean Patagonia lying on a subduction margin prone to megathrust earthquakes. Three modeling approaches were implemented under the same seismic scenario: a tsunami model with a constant tide level, a series of six composite models in which independent tide and tsunami simulations are linearly superimposed, and a series of six tide-tsunami nonlinear interaction models (full models). We found that hydrodynamic patterns differ significantly among approaches, with the composite and full models being sensitive to both the tidal phase at which the tsunami is triggered and the local depth of the channel. When compared to full models, composite models adequately predicted the maximum surface elevation, but largely overestimated currents. The amplitude and arrival time of the tsunami-leading wave computed with the full model were found to be strongly dependent on the direction of the tidal current and less responsive to the tide level and the tidal current speed. These outcomes emphasize the importance of addressing more carefully the interactions of tides and tsunamis in hazard assessment studies.
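    A minimal post-processing sketch of the comparison between the composite (linearly superimposed) and full (nonlinearly coupled) approaches is given below, using synthetic tide and tsunami time series; the stand-in "full model" modulation term is purely illustrative and does not reproduce the study's hydrodynamics.

```python
# Post-processing sketch (synthetic data, not the study's model output):
# compare a "composite" tide + tsunami superposition against a stand-in for a
# fully coupled simulation of surface elevation at one location in a channel.
import numpy as np

t = np.linspace(0.0, 12.0, 1441)                    # hours
tide = 1.5 * np.sin(2 * np.pi * t / 12.42)          # semi-diurnal tide (m)
tsunami = 2.0 * np.exp(-((t - 3.0) / 0.5) ** 2)     # idealized tsunami pulse (m)

composite = tide + tsunami                          # linear superposition
# Stand-in for the nonlinear full model: interaction slightly modulates the
# tsunami amplitude depending on the tidal phase (purely illustrative).
full = tide + tsunami * (1.0 - 0.15 * np.sign(np.gradient(tide, t)))

print("max elevation, composite model:", composite.max())
print("max elevation, full model:     ", full.max())
print("difference (m):", abs(composite.max() - full.max()))
```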

  14. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349

  15. Leveraging modeling approaches: reaction networks and rules.

    PubMed

    Blinov, Michael L; Moraru, Ion I

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high-resolution and/or high-throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatiotemporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks - the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks.
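    The contrast between explicit enumeration and rule-based specification can be sketched as follows: a single phosphorylation rule applied over species patterns expands into the full set of concrete reactions. The syntax and species names below are illustrative and are not BioNetGen or Virtual Cell notation.

```python
# Minimal sketch of the rule-based idea (toy notation): one reaction rule,
# applied to species patterns, expands into many concrete reactions without
# writing each one by hand.
from itertools import product

# Species are tuples of (name, state of site 'y'): 'u' = unphosphorylated.
kinases = ["KinA", "KinB"]
substrates = [("Sub1", "u"), ("Sub2", "u"), ("Sub3", "u")]

def phosphorylation_rule(kinase, substrate):
    # Rule: any kinase phosphorylates site 'y' of any unmodified substrate.
    name, state = substrate
    if state != "u":
        return None
    return f"{kinase} + {name}(y~u) -> {kinase} + {name}(y~p)"

# Rule expansion: the cross product of matching reactants yields the network.
reactions = [r for k, s in product(kinases, substrates)
             if (r := phosphorylation_rule(k, s)) is not None]
for rxn in reactions:
    print(rxn)
```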

  16. The Evolution of Integrated Assessment and Emerging Challenges in the Assessment of Human and Natural System Interactions

    NASA Astrophysics Data System (ADS)

    Clarke, L.

    2017-12-01

    Integrated assessment (IA) modeling and research has a long history, spanning over 30 years since its inception and addressing a wide range of contemporary issues along the way. Over the last decade, IA modeling and research has emerged as one of the primary analytical methods for understanding the complex interactions between human and natural systems, from the interactions between energy, water, and land/food systems to the interplay between health, climate, and air pollution. IA modeling and research is particularly well-suited for the analysis of these interactions because it is a discipline that strives to integrate representations of multiple systems into consistent computational platforms or frameworks. In doing so, it explicitly confronts the many tradeoffs that are frequently necessary to manage complexity and computational cost while still representing the most important interactions and overall, coupled system behavior. This talk explores the history of IA modeling and research as a means to better understand its role in the assessment of contemporary issues at the confluence of human and natural systems. It traces the evolution of IA modeling and research from initial exploration of long-term emissions pathways, to the role of technology in the global evolution of the energy system, to the key linkages between land and energy systems and, more recently, the linkages with water, air pollution, and other key systems and issues. It discusses the advances in modeling that have emerged over this evolution and the biggest challenges that still present themselves as we strive to better understand the most important interactions between human and natural systems and the implications of these interactions for human welfare and decision making.

  17. Numerical assessment of fore-and-aft suspension performance to reduce whole-body vibration of wheel loader drivers

    NASA Astrophysics Data System (ADS)

    Fleury, Gérard; Mistrot, Pierre

    2006-12-01

    While driving off-road vehicles, operators are exposed to whole-body vibration acting in the fore-and-aft direction. Seat manufacturers supply products equipped with fore-and-aft suspension but only a few studies report on their performance. This work proposes a computational approach to design fore-and-aft suspensions for wheel loader seats. Field tests were conducted in a quarry to analyse the nature of vibration to which the driver was exposed. Typical input signals were recorded to be reproduced in the laboratory. Technical specifications are defined for the suspension. In order to evaluate the suspension vibration attenuation performance, a model of a sitting human body was developed and coupled to a seat model. The seat model combines the models of each suspension component. A linear two-degree-of-freedom model is used to describe the dynamic behaviour of the sitting driver. Model parameters are identified by fitting the computed apparent mass frequency response functions to the measured values. Model extensions are proposed to investigate postural effects involving variations in hands and feet positions and interaction of the driver's back with the backrest. Suspension design parameters are firstly optimized by computing the seat/man model response to sinusoidal acceleration. Four criteria including transmissibility, interaction force between the driver's back and the backrest and relative maximal displacement of the suspension are computed. A new suspension design with optimized features is proposed. Its performance is checked from calculations of the response of the seat/man model subjected to acceleration measured on the wheel loader during real work conditions. On the basis of the computed values of the SEAT factors, it is found possible to design a suspension that would increase the attenuation provided by the seat by a factor of two.
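    A minimal sketch of a linear two-degree-of-freedom seat/occupant model under base excitation is shown below; it computes the displacement transmissibility from the floor to the occupant. The masses, stiffnesses, and damping values are illustrative assumptions, not the parameters identified in the paper.

```python
# Sketch of a linear two-degree-of-freedom seat/occupant model under base
# excitation (illustrative parameter values): compute the displacement
# transmissibility from floor motion to occupant motion.
import numpy as np

m1, k1, c1 = 15.0, 2.0e4, 800.0    # suspension/seat mass, stiffness, damping
m2, k2, c2 = 60.0, 4.5e4, 1200.0   # occupant mass, cushion stiffness, damping

freqs = np.linspace(0.1, 20.0, 400)           # Hz
transmissibility = np.empty_like(freqs)

for i, f in enumerate(freqs):
    w = 2 * np.pi * f
    # Frequency-domain equations of motion for unit base displacement X0 = 1.
    A = np.array([[-w**2 * m1 + 1j*w*(c1 + c2) + (k1 + k2), -(1j*w*c2 + k2)],
                  [-(1j*w*c2 + k2), -w**2 * m2 + 1j*w*c2 + k2]])
    b = np.array([1j*w*c1 + k1, 0.0])          # base motion enters via k1, c1
    x = np.linalg.solve(A, b)                  # [X1, X2] per unit base motion
    transmissibility[i] = abs(x[1])

peak = freqs[np.argmax(transmissibility)]
print(f"peak occupant transmissibility {transmissibility.max():.2f} at {peak:.2f} Hz")
```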

  18. Implicit prosody mining based on the human eye image capture technology

    NASA Astrophysics Data System (ADS)

    Gao, Pei-pei; Liu, Feng

    2013-08-01

    Eye tracking has become a principal method for analyzing recognition issues in human-computer interaction, and human eye image capture is the key problem in eye tracking. Building on this, a new human-computer interaction method is introduced to enrich the forms of speech synthesis. We propose a method of implicit prosody mining based on human eye image capture: parameters extracted from images of the eyes during reading control and drive prosody generation in speech synthesis, establishing a prosodic model with high simulation accuracy. The duration model is a key issue for prosody generation. For this model, the paper puts forward a new approach to obtaining gaze duration during reading from captured eye images and synchronously controlling this duration and the pronunciation duration in speech synthesis. Eye movement during reading is a multi-factor interactive process involving fixations, saccades, and regressions; therefore, the appropriate information to extract from eye images must be identified, and the gaze regularities of reading must be obtained as references for modeling. Based on an analysis of three current eye movement control models and the characteristics of implicit prosody in reading, the relative independence of the text speech-processing system and the eye movement control system is discussed. It is shown that, under the same text familiarity condition, gaze duration during reading and internal (subvocal) pronunciation duration are synchronous. An eye gaze duration model based on the prosodic structure of Chinese is presented, replacing earlier machine learning and probabilistic forecasting methods, capturing the reader's real internal reading rhythm, and synthesizing speech with a personalized rhythm. This research enriches the forms of human-computer interaction and has practical significance and application prospects for assistive speech interaction for people with disabilities. Experiments show that implicit prosody mining based on human eye image capture gives the synthesized speech more flexible expression.
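    The synchronization of gaze duration and pronunciation duration described above can be illustrated with a toy mapping: each prosodic word's synthesized duration is scaled by the reader's relative gaze duration on that word. All numbers in the sketch below are invented.

```python
# Toy sketch of the coupling described above (all numbers invented): scale the
# synthesized duration of each prosodic word by the reader's relative gaze
# duration on that word, so internal reading rhythm drives output rhythm.
def scaled_durations(base_durations_ms, gaze_durations_ms):
    """base_durations_ms: default synthesis duration per prosodic word.
    gaze_durations_ms:  measured gaze duration per word from eye tracking."""
    mean_gaze = sum(gaze_durations_ms) / len(gaze_durations_ms)
    return [base * (gaze / mean_gaze)
            for base, gaze in zip(base_durations_ms, gaze_durations_ms)]

base = [220, 180, 260, 240]      # default word durations (ms)
gaze = [250, 150, 400, 200]      # eye-tracking gaze durations (ms)
print(scaled_durations(base, gaze))
```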

  19. Computational modeling of joint U.S.-Russian experiments relevant to magnetic compression/magnetized target fusion (MAGO/MTF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheehey, P.T.; Faehl, R.J.; Kirkpatrick, R.C.

    1997-12-31

    Magnetized Target Fusion (MTF) experiments, in which a preheated and magnetized target plasma is hydrodynamically compressed to fusion conditions, present some challenging computational modeling problems. Recently, joint experiments relevant to MTF (Russian acronym MAGO, for Magnitnoye Obzhatiye, or magnetic compression) have been performed by Los Alamos National Laboratory and the All-Russian Scientific Research Institute of Experimental Physics (VNIIEF). Modeling of target plasmas must accurately predict plasma densities, temperatures, fields, and lifetime; dense plasma interactions with wall materials must be characterized. Modeling of magnetically driven imploding solid liners, for compression of target plasmas, must address issues such as Rayleigh-Taylor instability growth in the presence of material strength, and glide plane-liner interactions. Proposed experiments involving liner-on-plasma compressions to fusion conditions will require integrated target plasma and liner calculations. Detailed comparison of the modeling results with experiment will be presented.

  20. Low-Dimensional Models for Physiological Systems: Nonlinear Coupling of Gas and Liquid Flows

    NASA Astrophysics Data System (ADS)

    Staples, A. E.; Oran, E. S.; Boris, J. P.; Kailasanath, K.

    2006-11-01

    Current computational models of biological organisms focus on the details of a specific component of the organism. For example, very detailed models of the human heart, an aorta, a vein, or part of the respiratory or digestive system, are considered either independently from the rest of the body, or as interacting simply with other systems and components in the body. In actual biological organisms, these components and systems are strongly coupled and interact in complex, nonlinear ways leading to complicated global behavior. Here we describe a low-order computational model of two physiological systems, based loosely on a circulatory and respiratory system. Each system is represented as a one-dimensional fluid system with an interconnected series of mass sources, pumps, valves, and other network components, as appropriate, representing different physical organs and system components. Preliminary results from a first version of this model system are presented.

  1. EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.

    PubMed

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

  2. Interactions of Organics within Hydrated Selective Layer of Reverse Osmosis Desalination Membrane: A Combined Experimental and Computational Study.

    PubMed

    Ghoufi, Aziz; Dražević, Emil; Szymczyk, Anthony

    2017-03-07

    In this work we have examined a computational approach to predicting the interactions between uncharged organic solutes and polyamide membranes. We used three model organic molecules with identical molecular weights (100.1 g/mol), 4-aminopiperidine, 3,3-dimethyl-2-butanone (pinacolone) and methylisobutyl ketone, for which we obtained experimental data on partitioning, diffusion and separation on a typical seawater reverse osmosis (RO) membrane. The interaction energy between the solutes and the membrane phase (fully aromatic polyamide) was computed from molecular dynamics (MD) simulations and the resulting sequence was found to correlate well with the experimental rejections and sorption data. Sorption of the different organic solutes within the membrane skin layer determined from attenuated total reflection Fourier transform infrared spectroscopy (ATR-FTIR) nicely agreed with interaction energies computed from molecular simulations. Qualitative information about solute diffusivity inside the membrane was also extracted from MD simulations, while ATR-FTIR experiments indicated strongly hindered diffusion with diffusion coefficients in the membrane of about 10⁻¹⁵ m²/s. The computational approach presented here could be a first step toward predicting rejection trends of, for example, hormones and pharmaceuticals by RO dense membranes.

  3. Testing the Two-Layer Model for Correcting Near Cloud Reflectance Enhancement Using LES SHDOM Simulated Radiances

    NASA Technical Reports Server (NTRS)

    Wen, Guoyong; Marshak, Alexander; Varnai, Tamas; Levy, Robert

    2016-01-01

    A transition zone exists between cloudy skies and clear sky, such that clouds scatter solar radiation into clear-sky regions. From a satellite perspective, it appears that clouds enhance the radiation nearby. We seek a simple method to estimate this enhancement, since it is so computationally expensive to account for all three-dimensional (3-D) scattering processes. In previous studies, we developed a simple two-layer model (2LM) that estimated the radiation scattered via cloud-molecular interactions. Here we have developed a new model to account for cloud-surface interaction (CSI). We test the models by comparing to calculations provided by full 3-D radiative transfer simulations of realistic cloud scenes. For these scenes, the Moderate Resolution Imaging Spectroradiometer (MODIS)-like radiance fields were computed from the Spherical Harmonic Discrete Ordinate Method (SHDOM), based on a large number of cumulus fields simulated by the University of California, Los Angeles (UCLA) large eddy simulation (LES) model. We find that the original 2LM model that estimates cloud-air molecule interactions accounts for 64% of the total reflectance enhancement and the new model (2LM+CSI) that also includes cloud-surface interactions accounts for nearly 80%. We discuss the possibility of accounting for cloud-aerosol radiative interactions in 3-D cloud-induced reflectance enhancement, which may explain the remaining 20% of the enhancement. Because these are simple models, these corrections can be applied to global satellite observations (e.g., MODIS) and help to reduce biases in aerosol and other clear-sky retrievals.

  4. Modeling and evaluating user behavior in exploratory visual analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.

    Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
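    A minimal sketch of the underlying bookkeeping is shown below: given a coded sequence of user states, it estimates the Markov chain transition matrix between mental, interaction, and computational states. The state labels and example session are assumptions for illustration, not the paper's coding scheme.

```python
# Illustrative sketch (state labels and session are assumptions): estimate the
# transition matrix of a Markov chain over mental, interaction, and
# computational states from a coded sequence of user events.
from collections import Counter

STATES = ["mental", "interaction", "computational"]

def transition_matrix(sequence):
    counts = Counter(zip(sequence, sequence[1:]))
    matrix = {}
    for s in STATES:
        total = sum(counts[(s, t)] for t in STATES)
        matrix[s] = {t: (counts[(s, t)] / total if total else 0.0)
                     for t in STATES}
    return matrix

# Example coded session (e.g., derived from logs and think-aloud transcripts).
session = ["mental", "interaction", "computational", "mental",
           "interaction", "interaction", "computational", "mental"]
for s, row in transition_matrix(session).items():
    print(s, row)
```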

  5. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas), continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling are discussed.

  6. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    2008-04-01

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas), continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling are discussed.
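    As a minimal illustration of the continuum approach contrasted above, the sketch below integrates a one-dimensional Fisher-KPP reaction-diffusion model of tumor cell density; the diffusion and proliferation parameters are illustrative, not fitted to glioma data.

```python
# Minimal continuum-approach sketch (1D reaction-diffusion growth with
# illustrative parameters): the kind of model contrasted with discrete,
# cell-based approaches.
import numpy as np

D, rho = 0.01, 0.5          # diffusion (cm^2/day) and proliferation (1/day)
nx, dx, dt, steps = 200, 0.05, 0.01, 2000

c = np.zeros(nx)
c[nx // 2] = 1.0            # initial tumor cell density at the domain center

for _ in range(steps):
    lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    c = c + dt * (D * lap + rho * c * (1.0 - c))   # Fisher-KPP update
    c[0] = c[-1] = 0.0                              # absorbing boundaries

print("invaded fraction of domain:", float((c > 0.1).mean()))
```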

  7. Tree-Structured Digital Organisms Model

    NASA Astrophysics Data System (ADS)

    Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo

    Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence model may not be the only way to describe a digital organism, though it is very simple for a computer-based model. Thus we propose a new digital organism model based on a tree structure, which is rather similar to genetic programming. With our model, a life process is a combination of various functions, much as life in the real world is. This implies that our model can easily describe the hierarchical structure of life, and it can simulate evolutionary computation through mutual interaction of functions. We verified through simulations that our model can be regarded as a digital organism model according to its definitions. Our model even succeeded in creating species such as viruses and parasites.
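    A toy version of such a tree-structured genome is sketched below: an organism is an expression tree of functions that can be evaluated and mutated by replacing random subtrees. The function set and mutation rates are illustrative assumptions, not the authors' implementation.

```python
# Toy sketch of a tree-structured "genome" (illustrative only): an organism is
# an expression tree of functions, evaluated and mutated by replacing subtrees.
import random

FUNCTIONS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}

def random_tree(depth, rng):
    if depth <= 0 or rng.random() < 0.3:
        return rng.uniform(-1, 1)                   # terminal node
    name = rng.choice(list(FUNCTIONS))
    return (name, random_tree(depth - 1, rng), random_tree(depth - 1, rng))

def evaluate(tree):
    if not isinstance(tree, tuple):
        return tree
    name, left, right = tree
    return FUNCTIONS[name](evaluate(left), evaluate(right))

def mutate(tree, depth, rng):
    # Replace the whole subtree or recurse into one child.
    if not isinstance(tree, tuple) or rng.random() < 0.2:
        return random_tree(depth, rng)
    name, left, right = tree
    if rng.random() < 0.5:
        return (name, mutate(left, depth - 1, rng), right)
    return (name, left, mutate(right, depth - 1, rng))

rng = random.Random(1)
organism = random_tree(3, rng)
print("fitness proxy:", evaluate(organism))
print("mutant fitness proxy:", evaluate(mutate(organism, 3, rng)))
```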

  8. A Perspective on the Role of Computational Models in Immunology.

    PubMed

    Chakraborty, Arup K

    2017-04-26

    This is an exciting time for immunology because the future promises to be replete with exciting new discoveries that can be translated to improve health and treat disease in novel ways. Immunologists are attempting to answer increasingly complex questions concerning phenomena that range from the genetic, molecular, and cellular scales to that of organs, whole animals or humans, and populations of humans and pathogens. An important goal is to understand how the many different components involved interact with each other within and across these scales for immune responses to emerge, and how aberrant regulation of these processes causes disease. To aid this quest, large amounts of data can be collected using high-throughput instrumentation. The nonlinear, cooperative, and stochastic character of the interactions between components of the immune system as well as the overwhelming amounts of data can make it difficult to intuit patterns in the data or a mechanistic understanding of the phenomena being studied. Computational models are increasingly important in confronting and overcoming these challenges. I first describe an iterative paradigm of research that integrates laboratory experiments, clinical data, computational inference, and mechanistic computational models. I then illustrate this paradigm with a few examples from the recent literature that make vivid the power of bringing together diverse types of computational models with experimental and clinical studies to fruitfully interrogate the immune system.

  9. Toward a 24/7 Learning Community.

    ERIC Educational Resources Information Center

    Revenaugh, Mickey

    2000-01-01

    Although nearly two-thirds of family households have computers and 46 percent have Internet connections, troubling income-related gaps persist. Parents want interactive connections with teachers, homework hotlines, and tutoring services more than school web sites. Web-quest models, laptops, and computer donation programs are promising…

  10. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
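    The interaction pattern between a client and a WPS-based model service can be sketched as two key-value-pair requests, shown below; the endpoint URL, process identifier, and input name are placeholders for illustration, not the actual services or wetland models described above.

```python
# Hedged sketch: build the key-value-pair requests a client would send to an
# OGC Web Processing Service (WPS) exposing a shared geospatial model. The
# endpoint URL, process identifier, and input are placeholders.
from urllib.parse import urlencode

ENDPOINT = "https://example.org/wps"           # placeholder WPS endpoint

# Step 1: ask the service which processes (models) it offers.
get_capabilities = ENDPOINT + "?" + urlencode({
    "service": "WPS",
    "version": "1.0.0",
    "request": "GetCapabilities",
})

# Step 2: run a hypothetical wetland hydrology process with one input.
execute = ENDPOINT + "?" + urlencode({
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "WetlandWaterBalance",        # hypothetical process name
    "datainputs": "precipitation_mm=35.2",      # hypothetical model input
})

# A client (or another model) would issue HTTP GETs against these URLs and
# parse the returned XML responses.
print(get_capabilities)
print(execute)
```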

  11. Computer-aided roll pass design in rolling of airfoil shapes

    NASA Technical Reports Server (NTRS)

    Akgerman, N.; Lahoti, G. D.; Altan, T.

    1980-01-01

    This paper describes two computer-aided design (CAD) programs developed for modeling the shape rolling process for airfoil sections. The first program, SHPROL, uses a modular upper-bound method of analysis and predicts the lateral spread, elongation, and roll torque. The second program, ROLPAS, predicts the stresses, roll separating force, the roll torque and the details of metal flow by simulating the rolling process, using the slab method of analysis. ROLPAS is an interactive program; it offers graphic display capabilities and allows the user to interact with the computer via a keyboard, CRT, and a light pen. The accuracy of the computerized models was evaluated by (a) rolling a selected airfoil shape at room temperature from 1018 steel and isothermally at high temperature from Ti-6Al-4V, and (b) comparing the experimental results with computer predictions. The comparisons indicated that the CAD systems, described here, are useful for practical engineering purposes and can be utilized in roll pass design and analysis for airfoil and similar shapes.

  12. The Design of Hand Gestures for Human-Computer Interaction: Lessons from Sign Language Interpreters

    PubMed Central

    Rempel, David; Camilleri, Matt J.; Lee, David L.

    2015-01-01

    The design and selection of 3D modeled hand gestures for human-computer interaction should follow principles of natural language combined with the need to optimize gesture contrast and recognition. The selection should also consider the discomfort and fatigue associated with distinct hand postures and motions, especially for common commands. Sign language interpreters have extensive and unique experience forming hand gestures and many suffer from hand pain while gesturing. Professional sign language interpreters (N=24) rated discomfort for hand gestures associated with 47 characters and words and 33 hand postures. Clear associations of discomfort with hand postures were identified. In a nominal logistic regression model, high discomfort was associated with gestures requiring a flexed wrist, discordant adjacent fingers, or extended fingers. These and other findings should be considered in the design of hand gestures to optimize the relationship between human cognitive and physical processes and computer gesture recognition systems for human-computer input. PMID:26028955
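    The kind of model described, relating posture features to discomfort, can be sketched with a logistic regression on synthetic ratings; the data below are generated, not the interpreters' ratings, and the feature coding is an assumption.

```python
# Illustrative sketch (synthetic ratings, not the study's data): relate hand
# posture features to high vs. low discomfort with a logistic regression, in
# the spirit of the nominal logistic model described above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
# Binary posture features: flexed wrist, discordant adjacent fingers,
# extended fingers (the factors the abstract associates with discomfort).
X = rng.integers(0, 2, size=(n, 3))
logit = -1.5 + 1.2 * X[:, 0] + 1.0 * X[:, 1] + 0.8 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # high discomfort

model = LogisticRegression().fit(X, y)
print("feature coefficients:", model.coef_[0])
print("P(high discomfort | flexed wrist only):",
      model.predict_proba([[1, 0, 0]])[0, 1])
```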

  13. Validation of the Concurrent Atomistic-Continuum Method on Screw Dislocation/Stacking Fault Interactions

    DOE PAGES

    Xu, Shuozhi; Xiong, Liming; Chen, Youping; ...

    2017-04-26

    Dislocation/stacking fault interactions play an important role in the plastic deformation of metallic nanocrystals and polycrystals. These interactions have been explored in atomistic models, which are limited in scale length by high computational cost. In contrast, multiscale material modeling approaches have the potential to simulate the same systems at a fraction of the computational cost. In this paper, we validate the concurrent atomistic-continuum (CAC) method on the interactions between a lattice screw dislocation and a stacking fault (SF) in three face-centered cubic metallic materials—Ni, Al, and Ag. Two types of SFs are considered: intrinsic SF (ISF) and extrinsic SF (ESF). For the three materials at different strain levels, two screw dislocation/ISF interaction modes (annihilation of the ISF and transmission of the dislocation across the ISF) and three screw dislocation/ESF interaction modes (transformation of the ESF into a three-layer twin, transformation of the ESF into an ISF, and transmission of the dislocation across the ESF) are identified. Here, our results show that CAC is capable of accurately predicting the dislocation/SF interaction modes with greatly reduced DOFs compared to fully-resolved atomistic simulations.

  14. Validation of the Concurrent Atomistic-Continuum Method on Screw Dislocation/Stacking Fault Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Shuozhi; Xiong, Liming; Chen, Youping

    Dislocation/stacking fault interactions play an important role in the plastic deformation of metallic nanocrystals and polycrystals. These interactions have been explored in atomistic models, which are limited in scale length by high computational cost. In contrast, multiscale material modeling approaches have the potential to simulate the same systems at a fraction of the computational cost. In this paper, we validate the concurrent atomistic-continuum (CAC) method on the interactions between a lattice screw dislocation and a stacking fault (SF) in three face-centered cubic metallic materials—Ni, Al, and Ag. Two types of SFs are considered: intrinsic SF (ISF) and extrinsic SF (ESF). For the three materials at different strain levels, two screw dislocation/ISF interaction modes (annihilation of the ISF and transmission of the dislocation across the ISF) and three screw dislocation/ESF interaction modes (transformation of the ESF into a three-layer twin, transformation of the ESF into an ISF, and transmission of the dislocation across the ESF) are identified. Here, our results show that CAC is capable of accurately predicting the dislocation/SF interaction modes with greatly reduced DOFs compared to fully-resolved atomistic simulations.

  15. Computational simulation of the creep-rupture process in filamentary composite materials

    NASA Technical Reports Server (NTRS)

    Slattery, Kerry T.; Hackett, Robert M.

    1991-01-01

    A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.

  16. Economic models for management of resources in peer-to-peer and grid computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development, and usage models in these environments are complex undertakings. This is due to the geographic distribution of resources that are owned by different organizations or peers. The resource owners of each of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real world market, there exist various economic models for setting the price for goods based on supply-and-demand and their value to the user. They include commodity market, posted price, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
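    A highly simplified sketch of deadline- and cost-constrained brokering is given below: jobs are assigned to the cheapest resources that still allow the batch to finish by the deadline. This illustrates the economic idea only and is not the actual Nimrod/G scheduling algorithm; resource names, rates, and prices are invented.

```python
# Simplified sketch (not the actual Nimrod/G scheduler): allocate independent
# jobs to priced resources so that the batch finishes before a deadline at the
# lowest cost, illustrating cost-based brokering in a computational economy.
def schedule(n_jobs, resources, deadline):
    """resources: list of dicts with 'rate' (jobs/hour) and 'price' ($/job)."""
    # Take resources from cheapest to most expensive, adding capacity until
    # the remaining jobs can be completed within the deadline.
    chosen, remaining = [], n_jobs
    for r in sorted(resources, key=lambda res: res["price"]):
        if remaining <= 0:
            break
        capacity = int(r["rate"] * deadline)     # jobs this resource can finish
        assigned = min(capacity, remaining)
        if assigned > 0:
            chosen.append((r["name"], assigned, assigned * r["price"]))
            remaining -= assigned
    if remaining > 0:
        raise ValueError("deadline cannot be met with the available resources")
    return chosen

resources = [
    {"name": "cluster-A", "rate": 10, "price": 0.05},
    {"name": "cluster-B", "rate": 40, "price": 0.20},
    {"name": "cluster-C", "rate": 100, "price": 0.50},
]
for name, jobs, cost in schedule(n_jobs=500, resources=resources, deadline=4.0):
    print(f"{name}: {jobs} jobs, ${cost:.2f}")
```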

  17. Airline flight planning - The weather connection

    NASA Technical Reports Server (NTRS)

    Steinberg, R.

    1981-01-01

    The history of airline flight planning is briefly reviewed. Over half a century ago, when scheduled airline services began, weather data were almost nonexistent. By the early 1950's a reliable synoptic network provided upper air reports. The next 15 years saw a rapid growth in commercial aviation, and airlines introduced computer techniques to flight planning. The 1970's saw the development of weather satellites. The current state of flight planning activities is analyzed. It is found that accurate flight planning will require meteorological information on a finer scale than can be provided by a synoptic forecast. Opportunities for a new approach are examined, giving attention to the available options, a mesoscale numerical weather prediction model, limited area fine mesh models, man-computer interactive display systems, the use of interactive techniques with the present upper air data base, and the implementation of interactive techniques.

  18. a Cultural Market Model

    NASA Astrophysics Data System (ADS)

    Herdağdelen, Amaç; Bingöl, Haluk

    Social interactions and personal tastes shape our consumption behavior of cultural products. In this study, we present a computational model of a cultural market and we aim to analyze the behavior of the consumer population as an emergent phenomena. Our results suggest that the final market shares of cultural products dramatically depend on consumer heterogeneity and social interaction pressure. Furthermore, the relation between the resulting market shares and social interaction is robust with respect to a wide range of variation in the parameter values and the type of topology.
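    A toy version of such a cultural market is sketched below: agents repeatedly choose among products by mixing private taste with the products' current popularity, and market shares emerge from those choices. The utility form and all parameters are assumptions for illustration, not the authors' model.

```python
# Toy sketch of a cultural-market model (parameters invented): agents choose
# among products under a mix of personal taste and social pressure; market
# shares emerge from repeated choices.
import random

def simulate(n_agents=1000, n_products=5, social_pressure=0.7,
             steps=50, seed=0):
    rng = random.Random(seed)
    tastes = [[rng.random() for _ in range(n_products)] for _ in range(n_agents)]
    choices = [rng.randrange(n_products) for _ in range(n_agents)]
    for _ in range(steps):
        shares = [choices.count(p) / n_agents for p in range(n_products)]
        for a in range(n_agents):
            # Utility mixes private taste with the product's current popularity.
            utilities = [(1 - social_pressure) * tastes[a][p]
                         + social_pressure * shares[p]
                         for p in range(n_products)]
            choices[a] = max(range(n_products), key=utilities.__getitem__)
    return [choices.count(p) / n_agents for p in range(n_products)]

print(simulate(social_pressure=0.1))   # heterogeneous tastes dominate
print(simulate(social_pressure=0.9))   # winner-take-most market
```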

  19. Calculating Henry’s Constants of Charged Molecules Using SPARC

    EPA Science Inventory

    SPARC Performs Automated Reasoning in Chemistry is a computer program designed to model physical and chemical properties of molecules solely based on their chemical structure. SPARC uses a toolbox of mechanistic perturbation models to model intermolecular interactions. SPARC has ...

  20. Distributed and collaborative synthetic environments

    NASA Technical Reports Server (NTRS)

    Bajaj, Chandrajit L.; Bernardini, Fausto

    1995-01-01

    Fast graphics workstations and increased computing power, together with improved interface technologies, have created new and diverse possibilities for developing and interacting with synthetic environments. A synthetic environment system is generally characterized by input/output devices that constitute the interface between the human senses and the synthetic environment generated by the computer; and a computation system running a real-time simulation of the environment. A basic need of a synthetic environment system is that of giving the user a plausible reproduction of the visual aspect of the objects with which he is interacting. The goal of our Shastra research project is to provide a substrate of geometric data structures and algorithms which allow the distributed construction and modification of the environment, efficient querying of objects attributes, collaborative interaction with the environment, fast computation of collision detection and visibility information for efficient dynamic simulation and real-time scene display. In particular, we address the following issues: (1) A geometric framework for modeling and visualizing synthetic environments and interacting with them. We highlight the functions required for the geometric engine of a synthetic environment system. (2) A distribution and collaboration substrate that supports construction, modification, and interaction with synthetic environments on networked desktop machines.

  1. Reference interaction site model with hydrophobicity induced density inhomogeneity: An analytical theory to compute solvation properties of large hydrophobic solutes in the mixture of polyatomic solvent molecules.

    PubMed

    Cao, Siqin; Sheong, Fu Kit; Huang, Xuhui

    2015-08-07

    Reference interaction site model (RISM) has recently become a popular approach in the study of thermodynamical and structural properties of the solvent around macromolecules. On the other hand, it was widely suggested that there exists water density depletion around large hydrophobic solutes (>1 nm), and this may pose a great challenge to the RISM theory. In this paper, we develop a new analytical theory, the Reference Interaction Site Model with Hydrophobicity induced density Inhomogeneity (RISM-HI), to compute solvent radial distribution function (RDF) around large hydrophobic solute in water as well as its mixture with other polyatomic organic solvents. To achieve this, we have explicitly considered the density inhomogeneity at the solute-solvent interface using the framework of the Yvon-Born-Green hierarchy, and the RISM theory is used to obtain the solute-solvent pair correlation. In order to efficiently solve the relevant equations while maintaining reasonable accuracy, we have also developed a new closure called the D2 closure. With this new theory, the solvent RDFs around a large hydrophobic particle in water and different water-acetonitrile mixtures could be computed, which agree well with the results of the molecular dynamics simulations. Furthermore, we show that our RISM-HI theory can also efficiently compute the solvation free energy of solute with a wide range of hydrophobicity in various water-acetonitrile solvent mixtures with a reasonable accuracy. We anticipate that our theory could be widely applied to compute the thermodynamic and structural properties for the solvation of hydrophobic solute.

  2. Interactive visualization of Earth and Space Science computations

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  3. Modeling the Effect of Fluid-Structure Interaction on the Impact Dynamics of Pressurized Tank Cars

    DOT National Transportation Integrated Search

    2009-11-13

    This paper presents a computational framework that analyzes the effect of fluid-structure interaction (FSI) on the impact dynamics of pressurized commodity tank cars using the nonlinear dynamic finite element code ABAQUS/Explicit. There exist...

  4. Epistasis analysis for quantitative traits by functional regression model.

    PubMed

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor ability. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type 1 error rates and a much better ability to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10(-10)) in the ESP, and 11 were replicated in the CHARGE-S study. © 2014 Zhang et al.; Published by Cold Spring Harbor Laboratory Press.

  5. Modeling Cross-Situational Word–Referent Learning: Prior Questions

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2013-01-01

    Both adults and young children possess powerful statistical computation capabilities—they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of associative learning. This article describes a series of simulation studies and analyses designed to understand the different learning mechanisms posited by the 2 classes of models and their relation to each other. Variants of a hypothesis-testing model and a simple or dumb associative mechanism were examined under different specifications of information selection, computation, and decision. Critically, these 3 components of the models interact in complex ways. The models illustrate a fundamental tradeoff between amount of data input and powerful computations: With the selection of more information, dumb associative models can mimic the powerful learning that is accomplished by hypothesis-testing models with fewer data. However, because of the interactions among the component parts of the models, the associative model can mimic various hypothesis-testing models, producing the same learning patterns but through different internal components. The simulations argue for the importance of a compositional approach to human statistical learning: the experimental decomposition of the processes that contribute to statistical learning in human learners and models with the internal components that can be evaluated independently and together. PMID:22229490
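
    The difference between hypothesis testing and "dumb" association can be made concrete with a minimal associative learner (a generic illustration, not the simulation code used in the article; the class name and toy vocabulary are invented): it simply accumulates word-referent co-occurrence counts across individually ambiguous trials and guesses the highest-count referent.

      from collections import defaultdict

      class AssociativeLearner:
          def __init__(self):
              self.counts = defaultdict(lambda: defaultdict(float))

          def observe(self, words, referents):
              # Every word on a trial is associated equally with every referent present.
              for w in words:
                  for r in referents:
                      self.counts[w][r] += 1.0

          def guess(self, word):
              assoc = self.counts[word]
              return max(assoc, key=assoc.get) if assoc else None

      # Each trial is ambiguous; only aggregation across trials isolates "dog" -> d.
      learner = AssociativeLearner()
      learner.observe(["dog", "ball"], ["d", "b"])
      learner.observe(["dog", "cup"], ["d", "c"])
      print(learner.guess("dog"))   # 'd' wins with count 2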

  6. Visualization and Interaction in Research, Teaching, and Scientific Communication

    NASA Astrophysics Data System (ADS)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices provide those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how a little programming ability can free scientists from the constraints of existing tools and can foster a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available on desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but they advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.

  7. SPH Numerical Modeling for the Wave-Thin Structure Interaction

    NASA Astrophysics Data System (ADS)

    Ren, Xi-feng; Sun, Zhao-chen; Wang, Xing-gang; Liang, Shu-xiu

    2018-04-01

    In this paper, a numerical model of 2D weakly compressible smoothed particle hydrodynamics (WCSPH) is developed to simulate the interaction between waves and thin structures. A new color domain particle (CDP) technique is proposed to overcome the difficulty of applying the ghost particle method to thin solid boundaries; the new technique can handle zero-thickness structures. To apply it, the computational fluid domain is divided into sub-domains, i.e., boundary domains and internal domains. A color value is assigned to each particle and encodes both the domain the particle belongs to and the particles it is allowed to interact with. A particle near a thin boundary is thus prevented from interacting with particles on the other side of the structure that it should not interact with. This makes it possible to model thin structures, or structures of negligible thickness. The proposed WCSPH module is validated for a still water tank divided by a thin plate at the middle section, with different water levels in the sub-domains, and is applied to simulate the interaction between regular waves and a perforated vertical plate. Finally, the computation is carried out for the interaction between waves and a submerged twin-horizontal plate. The numerical results agree well with experimental data in terms of the pressure distribution, pressure time series, and wave transmission.
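
    The color-domain idea can be sketched as a neighbor-pair filter (an assumption-laden Python illustration, not the paper's WCSPH solver; the positions, colors, and compatibility table are invented): each particle carries a domain color, and a compatibility set decides which pairs may interact, so fluid on opposite sides of a zero-thickness plate never couples.

      import numpy as np

      def neighbor_pairs(pos, h):
          """Brute-force neighbor pairs within the smoothing length h."""
          n = len(pos)
          return [(i, j) for i in range(n) for j in range(i + 1, n)
                  if np.linalg.norm(pos[i] - pos[j]) < h]

      def filter_by_color(pairs, color, compatible):
          """Keep only pairs whose domain colors are allowed to interact."""
          return [(i, j) for i, j in pairs if (color[i], color[j]) in compatible]

      # Two fluid sub-domains: 0 = left of a thin plate, 1 = right of it.
      pos = np.array([[0.0, 0.0], [0.05, 0.0], [0.12, 0.0]])
      color = [0, 0, 1]
      compatible = {(0, 0), (1, 1)}      # cross-plate pairs (0, 1)/(1, 0) are excluded
      print(filter_by_color(neighbor_pairs(pos, h=0.1), color, compatible))
      # [(0, 1)] -- particle 2, on the far side of the plate, is screened off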

  8. Modeling and computational simulation and the potential of virtual and augmented reality associated to the teaching of nanoscience and nanotechnology

    NASA Astrophysics Data System (ADS)

    Ribeiro, Allan; Santos, Helen

    With the advent of new information and communication technologies (ICTs), communicative interaction changes how people act and, at the same time, changes how work related to education is carried out. Among the possibilities provided by advances in computational resources, virtual reality (VR) and augmented reality (AR) stand out as new forms of information visualization in computer applications. While VR allows user interaction with an entirely computer-generated virtual environment, AR inserts virtual images into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies can express representations of reality or of the imagination, such as nanoscale and low-dimensional systems, making it imperative to explore, across the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents virtual and augmented reality computer applications developed with the use of computational modeling and simulation of topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.

  9. Yoink: An interaction-based partitioning API.

    PubMed

    Zheng, Min; Waller, Mark P

    2018-05-15

    Herein, we describe the implementation details of our interaction-based partitioning API (application programming interface) called Yoink for QM/MM modeling and fragment-based quantum chemistry studies. Interactions are detected by computing density descriptors such as reduced density gradient, density overlap regions indicator, and single exponential decay detector. Only molecules having an interaction with a user-definable QM core are added to the QM region of a hybrid QM/MM calculation. Moreover, a set of molecule pairs having density-based interactions within a molecular system can be computed in Yoink, and an interaction graph can then be constructed. Standard graph clustering methods can then be applied to construct fragments for further quantum chemical calculations. The Yoink API is licensed under Apache 2.0 and can be accessed via yoink.wallerlab.org. © 2018 Wiley Periodicals, Inc.
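
    The pair-list-to-fragments step can be pictured with a short sketch (not Yoink's own API; the pair list is invented and the graph step assumes the NetworkX library, with connected components standing in for a proper clustering method):

      import networkx as nx

      # Hypothetical output of a density-based interaction detector: molecule-id pairs.
      interacting_pairs = [("wat1", "wat2"), ("wat2", "lig"), ("wat5", "wat6")]

      g = nx.Graph()
      g.add_edges_from(interacting_pairs)

      # Simplest possible "clustering": connected components become fragments.
      fragments = [sorted(c) for c in nx.connected_components(g)]
      print(fragments)   # e.g. [['lig', 'wat1', 'wat2'], ['wat5', 'wat6']]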

  10. An overview of bioinformatics methods for modeling biological pathways in yeast

    PubMed Central

    Hou, Jie; Acharya, Lipi; Zhu, Dongxiao

    2016-01-01

    The advent of high-throughput genomics techniques, along with the completion of genome sequencing projects, identification of protein–protein interactions and reconstruction of genome-scale pathways, has accelerated the development of systems biology research in the yeast organism Saccharomyces cerevisiae. In particular, discovery of biological pathways in yeast has become an important forefront in systems biology, which aims to understand the interactions among molecules within a cell leading to certain cellular processes in response to a specific environment. While the existing theoretical and experimental approaches enable the investigation of well-known pathways involved in metabolism, gene regulation and signal transduction, bioinformatics methods offer new insights into computational modeling of biological pathways. A wide range of computational approaches has been proposed in the past for reconstructing biological pathways from high-throughput datasets. Here we review selected bioinformatics approaches for modeling biological pathways in S. cerevisiae, including metabolic pathways, gene-regulatory pathways and signaling pathways. We start with reviewing the research on biological pathways followed by discussing key biological databases. In addition, several representative computational approaches for modeling biological pathways in yeast are discussed. PMID:26476430

  11. Simulation of Rotary-Wing Near-Wake Vortex Structures Using Navier-Stokes CFD Methods

    NASA Technical Reports Server (NTRS)

    Kenwright, David; Strawn, Roger; Ahmad, Jasim; Duque, Earl; Warmbrodt, William (Technical Monitor)

    1997-01-01

    This paper will use high-resolution Navier-Stokes computational fluid dynamics (CFD) simulations to model the near-wake vortex roll-up behind rotor blades. The locations and strengths of the trailing vortices will be determined from newly-developed visualization and analysis software tools applied to the CFD solutions. Computational results for rotor nearwake vortices will be used to study the near-wake vortex roll up for highly-twisted tiltrotor blades. These rotor blades typically have combinations of positive and negative spanwise loading and complex vortex wake interactions. Results of the computational studies will be compared to vortex-lattice wake models that are frequently used in rotorcraft comprehensive codes. Information from these comparisons will be used to improve the rotor wake models in the Tilt-Rotor Acoustic Code (TRAC) portion of NASA's Short Haul Civil Transport program (SHCT). Accurate modeling of the rotor wake is an important part of this program and crucial to the successful design of future civil tiltrotor aircraft. The rotor wake system plays an important role in blade-vortex interaction noise, a major problem for all rotorcraft including tiltrotors.

  12. 76 FR 28821 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... exclusively through computer software-based models or applications termed under the rule as "interactive Web..." conducted through an interactive Web site in accordance with the rule (17 CFR 275.203A-2(f)). Included in rule 203A-2(f) is a limited exception to the interactive Web site requirement which allows these...

  13. Automatic Sound Generation for Spherical Objects Hitting Straight Beams Based on Physical Models.

    ERIC Educational Resources Information Center

    Rauterberg, M.; And Others

    Sounds are the result of one or several interactions between one or several objects at a certain place and in a certain environment; the attributes of every interaction influence the generated sound. The following factors influence users in human/computer interaction: the organization of the learning environment, the content of the learning tasks,…

  14. Getting off the Straight and Narrow: Exploiting Non-Linear, Interactive Narrative Structures in Digital Stories for Language Teaching

    ERIC Educational Resources Information Center

    Prosser, Andrew

    2014-01-01

    Digital storytelling is already used extensively in language education. Web documentaries, particularly in terms of design and narrative structure, provide an extension of the digital storytelling concept, specifically in terms of increased interactivity. Using a model of interactive, non-linear storytelling, originally derived from computer game…

  15. Mathematical and computational approaches can complement experimental studies of host-pathogen interactions.

    PubMed

    Kirschner, Denise E; Linderman, Jennifer J

    2009-04-01

    In addition to traditional and novel experimental approaches to study host-pathogen interactions, mathematical and computer modelling have recently been applied to address open questions in this area. These modelling tools not only offer an additional avenue for exploring disease dynamics at multiple biological scales, but also complement and extend knowledge gained via experimental tools. In this review, we outline four examples where modelling has complemented current experimental techniques in a way that can or has already pushed our knowledge of host-pathogen dynamics forward. Two of the modelling approaches presented go hand in hand with articles in this issue exploring fluorescence resonance energy transfer and two-photon intravital microscopy. Two others explore virtual or 'in silico' deletion and depletion as well as a new method to understand and guide studies in genetic epidemiology. In each of these examples, the complementary nature of modelling and experiment is discussed. We further note that multi-scale modelling may allow us to integrate information across length (molecular, cellular, tissue, organism, population) and time (e.g. seconds to lifetimes). In sum, when combined, these compatible approaches offer new opportunities for understanding host-pathogen interactions.

  16. Rapid Human-Computer Interactive Conceptual Design of Mobile and Manipulative Robot Systems

    DTIC Science & Technology

    2015-05-19

    algorithm based on Age-Fitness Pareto Optimization (AFPO) [9] with an additional user preference objective and a neural network-based user model, we... greater than 40, which is about 5 times further than any robot traveled in our experiments. ... The algorithm uses a client-server computational... architecture. The client here is an interactive program which takes a pair of controllers as input, simulates two copies of the robot with...

  17. Image-Based Patient-Specific Ventricle Models with Fluid-Structure Interaction for Cardiac Function Assessment and Surgical Design Optimization

    PubMed Central

    Tang, Dalin; Yang, Chun; Geva, Tal; del Nido, Pedro J.

    2010-01-01

    Recent advances in medical imaging technology and computational modeling techniques are making it possible to construct patient-specific computational ventricle models and use them to test surgical hypotheses, replacing empirical and often risky clinical experimentation when examining the efficiency and suitability of various reconstructive procedures in diseased hearts. In this paper, we provide a brief review of recent developments in ventricle modeling and its potential application in surgical planning and management of tetralogy of Fallot (ToF) patients. Aspects of data acquisition, model selection and construction, tissue material properties, ventricle layer structure and tissue fiber orientations, pressure conditions, model validation, and virtual surgery procedures (changing patient-specific ventricle data and performing computer simulations) were reviewed. Results from a case study using patient-specific cardiac magnetic resonance (CMR) imaging and a right/left ventricle and patch (RV/LV/Patch) combination model with fluid-structure interactions (FSI) were reported. The models were used to evaluate and optimize the pulmonary valve replacement/insertion (PVR) surgical procedure and patch design, and to test the surgical hypothesis that PVR with a small patch and aggressive scar-tissue trimming may lead to improved recovery of RV function and reduced stress/strain conditions in the patch area. PMID:21344066

  18. The 3-D CFD modeling of gas turbine combustor-integral bleed flow interaction

    NASA Technical Reports Server (NTRS)

    Chen, D. Y.; Reynolds, R. S.

    1993-01-01

    An advanced 3-D Computational Fluid Dynamics (CFD) model was developed to analyze the flow interaction between a gas turbine combustor and an integral bleed plenum. In this model, the elliptic governing equations of continuity, momentum and the k-ε turbulence model were solved on a boundary-fitted, curvilinear, orthogonal grid system. The model was first validated against test data from public literature and then applied to a gas turbine combustor with integral bleed. The model predictions agreed well with data from combustor rig testing. The model predictions also indicated strong flow interaction between the combustor and the integral bleed. Integral bleed flow distribution was found to have a great effect on the pressure distribution around the gas turbine combustor.

  19. The Electric Propulsion Interactions Code (EPIC)

    NASA Technical Reports Server (NTRS)

    Mikellides, I. G.; Mandell, M. J.; Kuharski, R. A.; Davis, V. A.; Gardner, B. M.; Minor, J.

    2004-01-01

    Science Applications International Corporation is currently developing the Electric Propulsion Interactions Code, EPIC, as part of a project sponsored by the Space Environments and Effects Program at the NASA Marshall Space Flight Center. Now in its second year of development, EPIC is an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of a variety of interactions between its subsystems and the plume from an electric thruster. These interactions may include erosion of surfaces due to sputtering and re-deposition of sputtered materials, surface heating, torque on the spacecraft, and changes in surface properties due to erosion and deposition. This paper describes the overall capability of EPIC and provides an outline of the physics and algorithms that comprise many of its computational modules.

  20. 'Tagger' - a Mac OS X Interactive Graphical Application for Data Inference and Analysis of N-Dimensional Datasets in the Natural Physical Sciences.

    NASA Astrophysics Data System (ADS)

    Morse, P. E.; Reading, A. M.; Lueg, C.

    2014-12-01

    Pattern recognition in scientific data is not only a computational problem but a human-observer problem as well. Human observation of, and interaction with, data visualization software can augment, select, interrupt, and modify computational routines and facilitate processes of pattern and significant-feature recognition for subsequent human analysis, machine learning, expert and artificial intelligence systems. 'Tagger' is a Mac OS X interactive data visualisation tool that facilitates human-computer interaction for the recognition of patterns and significant structures. It is a graphical application developed using the Quartz Composer framework. 'Tagger' follows a Model-View-Controller (MVC) software architecture: the application problem domain (the Model) is to facilitate novel ways of abstractly representing data to a human interlocutor, presenting these via different viewer modalities (e.g. chart representations, particle systems, parametric geometry) to the user (View), and enabling interaction with the data (Controller) via a variety of Human Interface Devices (HIDs). The software enables the user to create an arbitrary array of tags that may be appended to the visualised data and are then saved into output files as a form of semantic metadata. Three fundamental problems that are not strongly supported by conventional scientific visualisation software are addressed: (1) how to visually animate data over time; (2) how to rapidly deploy unconventional, parametrically driven data visualisations; (3) how to construct and explore novel interaction models that capture the activity of the end-user as semantic metadata that can be used to computationally enhance subsequent interrogation. Saved tagged data files may be loaded into Tagger, so that tags may themselves be tagged, if desired. Recursion opens up the possibility of refining or overlapping different types of tags, tagging a variety of different POIs or types of events, and capturing different types of specialist observations of important or noticeable events. Other visualisations and modes of interaction will also be demonstrated, with the aim of discovering knowledge in large datasets in the natural, physical sciences. (Fig. 1 caption: Wave height data from an oceanographic Wave Rider Buoy; colors/radii are driven by wave height data.)

  1. Advanced computational simulations of water waves interacting with wave energy converters

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
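
    As context for the pressure Poisson step mentioned above, a deliberately small single-node sketch (an illustration under stated assumptions, not the authors' MPI solver; a Jacobi preconditioner stands in for multigrid, and the grid size and right-hand side are arbitrary):

      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      n = 32                                  # grid points per direction (toy size)
      h = 1.0 / (n + 1)
      T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
      A = (sp.kronsum(T, T) / h**2).tocsr()   # 2D Laplacian, Dirichlet boundaries

      rhs = np.ones(n * n)                    # stand-in for the divergence source term

      # Diagonal (Jacobi) preconditioner; production codes would use multigrid here.
      M = spla.LinearOperator(A.shape, matvec=lambda x: x / A.diagonal())

      p, info = spla.cg(A, rhs, M=M)
      print("converged" if info == 0 else "not converged",
            np.linalg.norm(A @ p - rhs))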

  2. Computational Workbench for Multibody Dynamics

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2007-01-01

    PyCraft is a computer program that provides an interactive, workbench-like computing environment for developing and testing algorithms for multibody dynamics. Examples of multibody dynamic systems amenable to analysis with the help of PyCraft include land vehicles, spacecraft, robots, and molecular models. PyCraft is based on the Spatial-Operator-Algebra (SOA) formulation for multibody dynamics. The SOA operators enable construction of simple and compact representations of complex multibody dynamical equations. Within the PyCraft computational workbench, users can, essentially, use the high-level SOA operator notation to represent a variety of dynamical quantities and algorithms and to perform computations interactively. PyCraft provides a Python-language interface to underlying C++ code. Working with SOA concepts, a user can create and manipulate Python-level operator classes in order to implement and evaluate new dynamical quantities and algorithms. During use of PyCraft, virtually all SOA-based algorithms are available for computational experiments.

  3. High-pressure melting curve of hydrogen.

    PubMed

    Davis, Sergio M; Belonoshko, Anatoly B; Johansson, Börje; Skorodumova, Natalia V; van Duin, Adri C T

    2008-11-21

    The melting curve of hydrogen was computed for pressures up to 200 GPa, using molecular dynamics. The inter- and intramolecular interactions were described by the reactive force field (ReaxFF) model. The model describes the pressure-volume equation of state of solid hydrogen in good agreement with experiment up to pressures over 150 GPa; however, the corresponding equation of state for the liquid deviates considerably from density functional theory calculations. Because of this, the computed melting curve, although it shares most of the known features, yields considerably lower melting temperatures than extrapolations of the available diamond anvil cell data. This failure of the ReaxFF model, which can reproduce many physical and chemical properties (including chemical reactions in hydrocarbons) of solid hydrogen, hints at an important change in the mechanism of interaction of hydrogen molecules in the liquid state.

  4. A comparison of experimental and computer model results on the charge-exchange plasma flow from a 30 cm mercury ion thruster

    NASA Technical Reports Server (NTRS)

    Gabriel, S. B.; Kaufman, H. R.

    1982-01-01

    Ion thrusters can be used in a variety of primary and auxiliary space-propulsion applications. A thruster produces a charge-exchange plasma which can interact with various systems on the spacecraft. The propagation of the charge-exchange plasma is crucial in determining the interaction of that plasma with the spacecraft. This paper compares experimental measurements with computer model predictions of the propagation of the charge-exchange plasma from a 30 cm mercury ion thruster. Plasma potentials, ion densities, and directed energies are discussed. Good agreement is found in a region upstream of, and close to, the ion thruster optics. Outside of this region the agreement is reasonable in view of the modeling difficulties.

  5. Recent developments of NASTRAN pre- and post-processors: Response spectrum analysis (RESPAN) and interactive graphics (GIFTS)

    NASA Technical Reports Server (NTRS)

    Hirt, E. F.; Fox, G. L.

    1982-01-01

    Two specific NASTRAN preprocessors and postprocessors are examined: a postprocessor for dynamic analysis and a graphical interactive package for model generation and review of results. A computer program that provides response spectrum analysis capability based on data from a NASTRAN finite element model is described, and the GIFTS system, a graphics processor that augments NASTRAN, is introduced.

  6. VASA: Interactive Computational Steering of Large Asynchronous Simulation Pipelines for Societal Infrastructure.

    PubMed

    Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S

    2014-12-01

    We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.

  7. Modeling adsorption with lattice Boltzmann equation

    PubMed Central

    Guo, Long; Xiao, Lizhi; Shan, Xiaowen; Zhang, Xiaoling

    2016-01-01

    The research of adsorption theory has recently gained renewed attention due to its critical relevance to a number of trending industrial applications, hydrogen storage and shale gas exploration for instance. The existing theoretical foundation, laid mostly in the early twentieth century, was largely based on simple heuristic molecular interaction models and static interaction potentials which, although insightful in illuminating the fundamental mechanisms, are insufficient for computations with realistic adsorbent structure and adsorbate hydrodynamics, both critical for real-life applications. Here we present and validate a novel lattice Boltzmann model incorporating both adsorbate-adsorbate and adsorbate-adsorbent interactions with hydrodynamics which, for the first time, allows adsorption to be computed with real-life details. Connection with the classic Ono-Kondo lattice theory is established, and various adsorption isotherms, both within and beyond the IUPAC classification, are observed as a pseudo-potential is varied. This new approach not only enables an important physical process to be simulated for real-life applications, but also provides an enabling theoretical framework within which the fundamentals of adsorption can be studied. PMID:27256325
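
    The role of the pseudo-potential can be illustrated with a generic Shan-Chen-style force term on a D2Q9 lattice (a sketch under stated assumptions, not necessarily the authors' formulation; the potential form psi, the coupling constant G, and the grid are illustrative):

      import numpy as np

      # D2Q9 lattice vectors and weights.
      e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                    [1, 1], [-1, 1], [-1, -1], [1, -1]])
      w = np.array([4/9] + [1/9]*4 + [1/36]*4)

      def psi(rho, rho0=1.0):
          """A common pseudo-potential choice; other forms tune the isotherm shape."""
          return rho0 * (1.0 - np.exp(-rho / rho0))

      def interaction_force(rho, G=-1.0):
          """F(x) = -G * psi(x) * sum_i w_i * psi(x + e_i) * e_i on a periodic grid."""
          p = psi(rho)
          F = np.zeros(rho.shape + (2,))
          for ei, wi in zip(e, w):
              shifted = np.roll(np.roll(p, ei[0], axis=0), ei[1], axis=1)
              F += wi * shifted[..., None] * ei
          return -G * p[..., None] * F

      rho = np.ones((16, 16)) + 0.1 * np.random.rand(16, 16)
      print(interaction_force(rho).shape)   # (16, 16, 2): one force vector per node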

  8. Does Cation Size Affect Occupancy and Electrostatic Screening of the Nucleic Acid Ion Atmosphere?

    PubMed Central

    2016-01-01

    Electrostatics are central to all aspects of nucleic acid behavior, including their folding, condensation, and binding to other molecules, and the energetics of these processes are profoundly influenced by the ion atmosphere that surrounds nucleic acids. Given the highly complex and dynamic nature of the ion atmosphere, understanding its properties and effects will require synergy between computational modeling and experiment. Prior computational models and experiments suggest that cation occupancy in the ion atmosphere depends on the size of the cation. However, the computational models have not been independently tested, and the experimentally observed effects were small. Here, we evaluate a computational model of ion size effects by experimentally testing a blind prediction made from that model, and we present additional experimental results that extend our understanding of the ion atmosphere. Giambasu et al. developed and implemented a three-dimensional reference interaction site (3D-RISM) model for monovalent cations surrounding DNA and RNA helices, and this model predicts that Na+ would outcompete Cs+ by 1.8–2.1-fold; i.e., with Cs+ in 2-fold excess of Na+ the ion atmosphere would contain an equal number of each cation (Nucleic Acids Res.2015, 43, 8405). However, our ion counting experiments indicate that there is no significant preference for Na+ over Cs+. There is an ∼25% preferential occupancy of Li+ over larger cations in the ion atmosphere but, counter to general expectations from existing models, no size dependence for the other alkali metal ions. Further, we followed the folding of the P4–P6 RNA and showed that differences in folding with different alkali metal ions observed at high concentration arise from cation–anion interactions and not cation size effects. Overall, our results provide a critical test of a computational prediction, fundamental information about ion atmosphere properties, and parameters that will aid in the development of next-generation nucleic acid computational models. PMID:27479701

  9. The Bilingual Language Interaction Network for Comprehension of Speech

    PubMed Central

    Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can examine how cross-linguistic interaction affects language processing in a controlled, simulated environment. Here we present a connectionist model of bilingual language processing, the Bilingual Language Interaction Network for Comprehension of Speech (BLINCS), wherein interconnected levels of processing are created using dynamic, self-organizing maps. BLINCS can account for a variety of psycholinguistic phenomena, including cross-linguistic interaction at and across multiple levels of processing, cognate facilitation effects, and audio-visual integration during speech comprehension. The model also provides a way to separate two languages without requiring a global language-identification system. We conclude that BLINCS serves as a promising new model of bilingual spoken language comprehension. PMID:24363602
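
    Since BLINCS is built from dynamic, self-organizing maps, a single SOM update step may help fix ideas (a generic sketch with arbitrary map size, learning rate, and neighbourhood radius, not code from the BLINCS model):

      import numpy as np

      rng = np.random.default_rng(0)
      weights = rng.random((10, 10, 3))        # 10x10 map of 3-D feature prototypes

      def som_step(weights, x, eta=0.1, radius=2.0):
          """Move the best-matching unit and its neighbours toward the input x."""
          dist = np.linalg.norm(weights - x, axis=2)
          bi, bj = np.unravel_index(np.argmin(dist), dist.shape)
          ii, jj = np.meshgrid(np.arange(weights.shape[0]),
                               np.arange(weights.shape[1]), indexing="ij")
          h = np.exp(-((ii - bi)**2 + (jj - bj)**2) / (2 * radius**2))
          return weights + eta * h[..., None] * (x - weights)

      weights = som_step(weights, np.array([0.2, 0.9, 0.4]))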

  10. Bayesian Action–Perception Computational Model: Interaction of Production and Recognition of Cursive Letters

    PubMed Central

    Gilet, Estelle; Diard, Julien; Bessière, Pierre

    2011-01-01

    In this paper, we study the collaboration of perception and action representations involved in cursive letter recognition and production. We propose a mathematical formulation for the whole perception–action loop, based on probabilistic modeling and Bayesian inference, which we call the Bayesian Action–Perception (BAP) model. Being a model of both perception and action processes, the purpose of this model is to study the interaction of these processes. More precisely, the model includes a feedback loop from motor production, which implements an internal simulation of movement. Motor knowledge can therefore be involved during perception tasks. In this paper, we formally define the BAP model and show how it solves the following six varied cognitive tasks using Bayesian inference: i) letter recognition (purely sensory), ii) writer recognition, iii) letter production (with different effectors), iv) copying of trajectories, v) copying of letters, and vi) letter recognition (with internal simulation of movements). We present computer simulations of each of these cognitive tasks, and discuss experimental predictions and theoretical developments. PMID:21674043
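
    The Bayesian-inference backbone of such tasks can be shown in miniature (purely illustrative; the one-dimensional feature, the per-letter templates, and the noise level are invented, and the actual BAP model is far richer):

      import numpy as np

      letters = ["a", "c", "e"]
      template = {"a": 0.8, "c": 0.3, "e": 0.5}   # hypothetical motor/visual templates
      sigma = 0.15                                 # assumed sensory noise
      prior = {l: 1.0 / len(letters) for l in letters}

      def posterior(x):
          """P(letter | observed feature x) with Gaussian likelihoods."""
          like = {l: np.exp(-(x - template[l])**2 / (2 * sigma**2)) for l in letters}
          z = sum(prior[l] * like[l] for l in letters)
          return {l: prior[l] * like[l] / z for l in letters}

      print(posterior(0.75))   # probability mass concentrates on 'a'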

  11. Interface for the documentation and compilation of a library of computer models in physiology.

    PubMed Central

    Summers, R. L.; Montani, J. P.

    1994-01-01

    A software interface for the documentation and compilation of a library of computer models in physiology was developed. The interface is an interactive program built within a word processing template in order to provide ease and flexibility of documentation. A model editor within the interface directs the model builder as to standardized requirements for incorporating models into the library and provides the user with an index to the levels of documentation. The interface and accompanying library are intended to facilitate model development, preservation and distribution and will be available for public use. PMID:7950046

  12. Fluid-Structure Interaction Modeling of Modified-Porosity Parachutes and Parachute Clusters

    NASA Astrophysics Data System (ADS)

    Boben, Joseph J.

    To increase aerodynamic performance, the geometric porosity of a ringsail spacecraft parachute canopy is sometimes increased, beyond the "rings" and "sails" with hundreds of "ring gaps" and "sail slits." This creates extra computational challenges for fluid-structure interaction (FSI) modeling of clusters of such parachutes, beyond those created by the lightness of the canopy structure, geometric complexities of hundreds of gaps and slits, and the contact between the parachutes of the cluster. In FSI computation of parachutes with such "modified geometric porosity," the flow through the "windows" created by the removal of the panels and the wider gaps created by the removal of the sails cannot be accurately modeled with the Homogenized Modeling of Geometric Porosity (HMGP), which was introduced to deal with the hundreds of gaps and slits. The flow needs to be actually resolved. All these computational challenges need to be addressed simultaneously in FSI modeling of clusters of spacecraft parachutes with modified geometric porosity. The core numerical technology is the Stabilized Space-Time FSI (SSTFSI) technique, and the contact between the parachutes is handled with the Surface-Edge-Node Contact Tracking (SENCT) technique. In the computations reported here, in addition to the SSTFSI and SENCT techniques and HMGP, we use the special techniques we have developed for removing the numerical spinning component of the parachute motion and for restoring the mesh integrity without a remesh. We present results for 2- and 3-parachute clusters with two different payload models. We also present the FSI computations we carried out for a single, subscale modified-porosity parachute.

  13. Causal Reasoning in Medicine: Analysis of a Protocol.

    ERIC Educational Resources Information Center

    Kuipers, Benjamin; Kassirer, Jerome P.

    1984-01-01

    Describes the construction of a knowledge representation from the identification of the problem (nephrotic syndrome) to a running computer simulation of causal reasoning to provide a vertical slice of the construction of a cognitive model. Interactions between textbook knowledge, observations of human experts, and computational requirements are…

  14. Three-dimensional vector modeling and restoration of flat finite wave tank radiometric measurements

    NASA Technical Reports Server (NTRS)

    Truman, W. M.; Balanis, C. A.

    1977-01-01

    The three-dimensional vector interaction between a microwave radiometer and a wave tank was modeled. Computer programs for predicting the response of the radiometer to the brightness temperature characteristics of the surroundings were developed, along with a computer program that can invert (restore) the radiometer measurements. It is shown that the computer programs can be used to simulate the viewing of large bodies of water and are applicable to radiometer measurements received from satellites monitoring the ocean. The water temperature, salinity, and wind speed can be determined.

  15. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: A first year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  16. Computations of Lifshitz-van der Waals interaction energies between irregular particles and surfaces at all separations for resuspension modelling

    NASA Astrophysics Data System (ADS)

    Priye, Aashish; Marlow, William H.

    2013-10-01

    The phenomenon of particle resuspension plays a vital role in numerous fields. Among the many aspects of particle resuspension dynamics, a dominant concern is the accurate description and formulation of the van der Waals (vdW) interactions between the particle and the substrate. Current models treat adhesion by incorporating a material-dependent Hamaker constant, which relies on Hamaker's heuristic pairwise two-body interactions. However, this assumption of pairwise summation of interaction energies can lead to significant errors in condensed matter, as it does not take into account many-body interaction and retardation effects. To address these issues, an approach based on the Lifshitz continuum theory of vdW interactions has been developed to calculate the principal many-body interactions between arbitrary geometries at all separation distances to a high degree of accuracy. We have applied this numerical implementation to calculate the many-body vdW interactions between spherical particles and surfaces with a sinusoidally varying roughness profile, and also for non-spherical particles (cubes, cylinders, tetrahedra, etc.) oriented differently with respect to the surface. Our calculations revealed that increasing the surface roughness amplitude decreases the adhesion force, and that non-spherical particles adhere to surfaces more strongly when their flatter sides are oriented towards the surface. Such practical shapes and structures of particle-surface systems have not previously been considered in resuspension models, and this rigorous treatment of vdW interactions provides more realistic adhesion forces between the particle and the surface, which can then be coupled with computational fluid dynamics models to improve the predictive capabilities of particle resuspension dynamics.
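
    For orientation, the pairwise-summation (Hamaker) baseline that the Lifshitz treatment goes beyond has a simple closed form in the non-retarded sphere-plane limit, F = A*R / (6*D^2) for separation D much smaller than the particle radius R (the numerical values below are illustrative only):

      def hamaker_sphere_plane_force(A, R, D):
          """Textbook attractive vdW force magnitude A*R/(6*D**2); SI inputs give newtons."""
          return A * R / (6.0 * D**2)

      # e.g. A ~ 1e-19 J, a 1-micron sphere, 0.4 nm contact separation:
      print(hamaker_sphere_plane_force(1e-19, 1e-6, 4e-10))   # ~1e-7 N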

  17. Quantification of the transferability of a designed protein specificity switch reveals extensive epistasis in molecular recognition

    DOE PAGES

    Melero, Cristina; Ollikainen, Noah; Harwood, Ian; ...

    2014-10-13

    Re-engineering protein–protein recognition is an important route to dissecting and controlling complex interaction networks. Experimental approaches have used the strategy of "second-site suppressors," where a functional interaction is inferred between two proteins if a mutation in one protein can be compensated by a mutation in the second. Mimicking this strategy, computational design has been applied successfully to change protein recognition specificity by predicting such sets of compensatory mutations in protein–protein interfaces. To extend this approach, it would be advantageous to be able to "transplant" existing engineered and experimentally validated specificity changes to other homologous protein–protein complexes. Here, we test this strategy by designing a pair of mutations that modulates peptide recognition specificity in the Syntrophin PDZ domain, confirming the designed interaction biochemically and structurally, and then transplanting the mutations into the context of five related PDZ domain–peptide complexes. We find a wide range of energetic effects of identical mutations in structurally similar positions, revealing a dramatic context dependence (epistasis) of designed mutations in homologous protein–protein interactions. To better understand the structural basis of this context dependence, we apply a structure-based computational model that recapitulates these energetic effects, and we use this model to make and validate forward predictions. The context dependence of these mutations is captured by computational predictions; our results both highlight the considerable difficulties in designing protein–protein interactions and provide challenging benchmark cases for the development of improved protein modeling and design methods that accurately account for context.

  18. Quantification of the transferability of a designed protein specificity switch reveals extensive epistasis in molecular recognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melero, Cristina; Ollikainen, Noah; Harwood, Ian

    Re-engineering protein–protein recognition is an important route to dissecting and controlling complex interaction networks. Experimental approaches have used the strategy of "second-site suppressors," where a functional interaction is inferred between two proteins if a mutation in one protein can be compensated by a mutation in the second. Mimicking this strategy, computational design has been applied successfully to change protein recognition specificity by predicting such sets of compensatory mutations in protein–protein interfaces. To extend this approach, it would be advantageous to be able to "transplant" existing engineered and experimentally validated specificity changes to other homologous protein–protein complexes. Here, we test this strategy by designing a pair of mutations that modulates peptide recognition specificity in the Syntrophin PDZ domain, confirming the designed interaction biochemically and structurally, and then transplanting the mutations into the context of five related PDZ domain–peptide complexes. We find a wide range of energetic effects of identical mutations in structurally similar positions, revealing a dramatic context dependence (epistasis) of designed mutations in homologous protein–protein interactions. To better understand the structural basis of this context dependence, we apply a structure-based computational model that recapitulates these energetic effects, and we use this model to make and validate forward predictions. The context dependence of these mutations is captured by computational predictions; our results both highlight the considerable difficulties in designing protein–protein interactions and provide challenging benchmark cases for the development of improved protein modeling and design methods that accurately account for context.

  19. Construction of Interaction Layer on Socio-Environmental Simulation

    NASA Astrophysics Data System (ADS)

    Torii, Daisuke; Ishida, Toru

    In this study, we propose a method to construct a system based on a legacy socio-environmental simulator which makes it possible to design more realistic interaction models in socio-environmental simulations. First, to provide a computational model suitable for agent interactions, an interaction layer is constructed and connected from outside the legacy socio-environmental simulator. Next, to configure the agents' ability to interact, a connection description for controlling the flow of information in the connection area is provided. As a concrete example, we implemented an interaction layer in Q, a scenario description language, and connected it to CORMAS, a socio-environmental simulator. Finally, we discuss the capability of our method using this system in the Fire-Fighter domain.

  20. Safety Analysis of FMS/CTAS Interactions During Aircraft Arrivals

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1998-01-01

    This grant funded research on human-computer interaction design and analysis techniques, using future ATC environments as a testbed. The basic approach was to model the nominal behavior of both the automated and human procedures and then to apply safety analysis techniques to these models. Our previous modeling language, RSML, had been used to specify the system requirements for TCAS II for the FAA. Using the lessons learned from this experience, we designed a new modeling language that (among other things) incorporates features to assist in designing less error-prone human-computer interactions and interfaces and in detecting potential HCI problems, such as mode confusion. The new language, SpecTRM-RL, uses "intent" abstractions, based on Rasmussen's abstraction hierarchy, and includes both informal (English and graphical) specifications and formal, executable models for specifying various aspects of the system. One of the goals for our language was to highlight the system modes and mode changes to assist in identifying the potential for mode confusion. Three published papers resulted from this research. The first builds on the work of Degani on mode confusion to identify aspects of the system design that could lead to potential hazards. We defined and modeled modes differently than Degani and also defined design criteria for SpecTRM-RL models. Our design criteria include the Degani criteria but extend them to include more potential problems. In a second paper, Leveson and Palmer showed how the criteria for indirect mode transitions could be applied to a mode confusion problem found in several ASRS reports for the MD-88. In addition, we defined a visual task modeling language that can be used by system designers to model human-computer interaction. The visual models can be translated into SpecTRM-RL models, and then the SpecTRM-RL suite of analysis tools can be used to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system. We had hoped to be able to apply these modeling languages and analysis tools to a TAP air/ground trajectory negotiation scenario, but the development of the tools took more time than we anticipated.
