Sample records for computer models aim

  1. Investigating College and Graduate Students' Multivariable Reasoning in Computational Modeling

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai; Wu, Pai-Hsing; Zhang, Wen-Xin; Hsu, Ying-Shao

    2013-01-01

    Drawing upon the literature in computational modeling, multivariable reasoning, and causal attribution, this study aims at characterizing multivariable reasoning practices in computational modeling and revealing the nature of understanding about multivariable causality. We recruited two freshmen, two sophomores, two juniors, two seniors, four…

  2. Thermal management of closed computer modules utilizing high density circuitry [in Airborne Information Management System]

    NASA Technical Reports Server (NTRS)

    Hoadley, A. W.; Porter, A. J.

    1990-01-01

    This paper presents data on a preliminary analysis of the thermal dynamic characteristics of the Airborne Information Management System (AIMS), which is a continuing design project at NASA Dryden. The analysis established the methods which will be applied to the actual AIMS boards as they become available. The paper also describes the AIMS liquid cooling system design and presents a thermodynamic computer model of the AIMS cooling system, together with an experimental validation of this model.

  3. Tutorial: Asteroseismic Stellar Modelling with AIMS

    NASA Astrophysics Data System (ADS)

    Lund, Mikkel N.; Reese, Daniel R.

    The goal of AIMS (Asteroseismic Inference on a Massive Scale) is to estimate stellar parameters and credible intervals/error bars in a Bayesian manner from a set of asteroseismic frequency data and so-called classical constraints. To achieve reliable parameter estimates and computational efficiency, it searches through a grid of pre-computed models using an MCMC algorithm; interpolation within the grid of models is performed by first tessellating the grid using a Delaunay triangulation and then doing a linear barycentric interpolation on matching simplexes. Inputs for the modelling consist of individual frequencies from peak-bagging, which can be complemented with classical spectroscopic constraints. AIMS is mostly written in Python with a modular structure to facilitate contributions from the community. Only a few computationally intensive parts have been rewritten in Fortran in order to speed up calculations.
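
    The grid-plus-interpolation machinery described above maps directly onto standard scientific-Python tools. The sketch below is not AIMS code; it is a minimal illustration, on a hypothetical two-parameter model grid, of the interpolation step the abstract describes: tessellate the grid with a Delaunay triangulation, locate the simplex containing a query point, and blend the vertex models with linear barycentric weights (scipy's LinearNDInterpolator packages the same procedure).

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    # Hypothetical 2-D model grid: each point could be, e.g., (mass,
    # metallicity); each value a precomputed observable. Not the AIMS grid.
    rng = np.random.default_rng(0)
    points = rng.uniform(size=(200, 2))
    values = np.sin(points[:, 0]) + points[:, 1] ** 2

    tri = Delaunay(points)  # tessellate the grid once, reuse for every query

    def interpolate(x):
        s = int(tri.find_simplex(x[None, :])[0])  # simplex containing x
        if s == -1:
            raise ValueError("query point lies outside the model grid")
        T = tri.transform[s]                 # affine map to barycentric coords
        b = T[:2] @ (x - T[2])
        bary = np.append(b, 1.0 - b.sum())   # third coordinate closes the sum
        return bary @ values[tri.simplices[s]]  # linear barycentric blend

    print(interpolate(np.array([0.4, 0.6])))
    ```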

  4. Fast and accurate computation of system matrix for area integral model-based algebraic reconstruction technique

    NASA Astrophysics Data System (ADS)

    Zhang, Shunli; Zhang, Dinghua; Gong, Hao; Ghasemalizadeh, Omid; Wang, Ge; Cao, Guohua

    2014-11-01

    Iterative algorithms, such as the algebraic reconstruction technique (ART), are popular for image reconstruction. For iterative reconstruction, the area integral model (AIM) is more accurate, and hence yields better reconstruction quality, than the line integral model (LIM). However, the computation of the system matrix for AIM is more complex and time-consuming than that for LIM. Here, we propose a fast and accurate method to compute the system matrix for AIM. First, we calculate the intersection of each boundary line of a narrow fan-beam with pixels in a recursive and efficient manner. Then, by grouping the beam-pixel intersection areas into six types according to the slopes of the two boundary lines, we analytically compute the intersection area of the narrow fan-beam with the pixels in a simple algebraic fashion. Overall, experimental results show that our method is about three times faster than the Siddon algorithm and about two times faster than the distance-driven model (DDM) in computing the system matrix. Per iteration, the reconstruction speed of our AIM-based ART is also faster than that of LIM-based ART using the Siddon algorithm and of DDM-based ART. The fast reconstruction speed of our method was accomplished without compromising image quality.
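
    The paper's contribution is the fast assembly of the system matrix (the beam-pixel intersection areas); the ART sweep that consumes the matrix is the standard Kaczmarz-style row update. As a hedged reference point only, a minimal dense-matrix version of that sweep looks like this (a real reconstruction would use a sparse matrix filled with AIM, LIM or DDM weights):

    ```python
    import numpy as np

    def art(A, b, n_sweeps=10, lam=1.0):
        """Kaczmarz-style ART: x <- x + lam * (b_i - a_i.x) / ||a_i||^2 * a_i,
        sweeping over the rows a_i of the system matrix A."""
        x = np.zeros(A.shape[1])
        for _ in range(n_sweeps):
            for i in range(A.shape[0]):
                a = A[i]
                norm2 = a @ a
                if norm2 > 0.0:  # skip rays that intersect no pixel
                    x += lam * (b[i] - a @ x) / norm2 * a
        return x
    ```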

  5. Academic Information Management--A New Synthesis.

    ERIC Educational Resources Information Center

    Drummond, Marshall Edward

    The design concept and initial development phases of academic information management (AIM) are discussed. The AIM concept is an attempt to serve three segments of academic management with data and models to support decision making. AIM is concerned with management and evaluation of instructional computing in areas other than direct computing (data…

  6. Vectorial Representations of Meaning for a Computational Model of Language Comprehension

    ERIC Educational Resources Information Center

    Wu, Stephen Tze-Inn

    2010-01-01

    This thesis aims to define and extend a line of computational models for text comprehension that are humanly plausible. Since natural language is human by nature, computational models of human language will always be just that--models. To the degree that they miss out on information that humans would tap into, they may be improved by considering…

  7. Finite element analysis of TAVI: Impact of native aortic root computational modeling strategies on simulation outcomes.

    PubMed

    Finotello, Alice; Morganti, Simone; Auricchio, Ferdinando

    2017-09-01

    In the last few years, several studies, each with a different aim and level of modeling detail, have been proposed to investigate transcatheter aortic valve implantation (TAVI) with finite elements. The present work focuses on the patient-specific finite element modeling of the aortic valve complex. In particular, we aim at investigating how different modeling strategies in terms of material models/properties and discretization procedures can impact analysis results. Four different choices both for the mesh size (from ~20k elements to ~200k elements) and for the material model (from rigid to hyperelastic anisotropic) are considered. Different approaches for modeling calcifications are also taken into account. Post-operative CT data of the real implant are used as reference solution with the aim of outlining a trade-off between computational model complexity and reliability of the results. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. An overview of computer viruses in a research environment

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1991-01-01

    The threat of attack by computer viruses is in reality a very small part of a much more general threat, specifically threats aimed at subverting computer security. Here, computer viruses are examined as a form of malicious logic in a research and development environment. A relation is drawn between the viruses and various models of security and integrity. Current research techniques aimed at controlling the threats posed to computer systems by computer viruses in particular, and malicious logic in general, are examined. Finally, a brief examination of the vulnerabilities of research and development systems that malicious logic and computer viruses may exploit is undertaken.

  9. Modeling Physiological Systems in the Human Body as Networks of Quasi-1D Fluid Flows

    NASA Astrophysics Data System (ADS)

    Staples, Anne

    2008-11-01

    Extensive research has been done on modeling human physiology. Most of this work has been aimed at developing detailed, three-dimensional models of specific components of physiological systems, such as a cell, a vein, a molecule, or a heart valve. While efforts such as these are invaluable to our understanding of human biology, if we were to construct a global model of human physiology with this level of detail, computing even a nanosecond in this computational being's life would certainly be prohibitively expensive. With this in mind, we derive the Pulsed Flow Equations, a set of coupled one-dimensional partial differential equations, specifically designed to capture two-dimensional viscous, transport, and other effects, and aimed at providing accurate and fast-to-compute global models for physiological systems represented as networks of quasi one-dimensional fluid flows. Our goal is to be able to perform faster-than-real time simulations of global processes in the human body on desktop computers.
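
    The abstract does not reproduce the Pulsed Flow Equations themselves. As general background (not the paper's own equations), quasi one-dimensional network models of this kind typically start from the cross-sectionally averaged mass and momentum balances, to which corrections for two-dimensional viscous and transport effects are then added:

    ```latex
    \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0,
    \qquad
    \frac{\partial Q}{\partial t}
      + \frac{\partial}{\partial x}\!\left(\alpha\,\frac{Q^{2}}{A}\right)
      + \frac{A}{\rho}\,\frac{\partial p}{\partial x}
      = -\,K_{R}\,\frac{Q}{A},
    ```

    where A(x, t) is the cross-sectional area, Q(x, t) the volumetric flow rate, p the pressure, alpha a momentum-flux correction factor, and K_R a viscous friction coefficient.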

  10. Multicellular Models of Morphogenesis

    EPA Science Inventory

    EPA’s Virtual Embryo project (v-Embryo™), in collaboration with developers of CompuCell3D, aims to create computer models of morphogenesis that can be used to address the effects of chemical perturbation on embryo development at the cellular level. Such computational (in silico) ...

  11. Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review

    PubMed Central

    Ngoepe, Malebogo N.; Frangi, Alejandro F.; Byrne, James V.; Ventikos, Yiannis

    2018-01-01

    Thrombosis is a condition closely related to cerebral aneurysms and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge with developing computational models for thrombosis in disease cases is that there exists a wide body of literature that addresses various aspects of the clotting process, but it may not be obvious what information is of direct consequence for what modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present the information so it will be of benefit to the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes, in a simplified yet appropriate manner. The paper begins by explaining current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting. This is then followed by a review of computational models of clotting in cerebral aneurysms, specifically. Even though some progress has been made towards computational predictions of thrombosis following device placement in cerebral aneurysms, many gaps still remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities. PMID:29670533

  12. A Guide to Computer Simulations of Three Adaptive Instructional Models for the Advanced Instructional System Phases II and III. Final Report.

    ERIC Educational Resources Information Center

    Hansen, Duncan N.; And Others

    Computer simulations of three individualized adaptive instructional models (AIM) were undertaken to determine if these models function as prescribed in Air Force technical training programs. In addition, the project sought to develop a user's guide for effective understanding of adaptive models during field implementation. Successful simulations…

  13. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.

  14. Validating the ACE Model for Evaluating Student Performance Using a Teaching-Learning Process Based on Computational Modeling Systems

    ERIC Educational Resources Information Center

    Louzada, Alexandre Neves; Elia, Marcos da Fonseca; Sampaio, Fábio Ferrentini; Vidal, Andre Luiz Pestana

    2014-01-01

    The aim of this work is to adapt and test, in a Brazilian public school, the ACE model proposed by Borkulo for evaluating student performance as a teaching-learning process based on computational modeling systems. The ACE model is based on different types of reasoning involving three dimensions. In addition to adapting the model and introducing…

  15. Developmental Changes in Learning: Computational Mechanisms and Social Influences

    PubMed Central

    Bolenz, Florian; Reiter, Andrea M. F.; Eppinger, Ben

    2017-01-01

    Our ability to learn from the outcomes of our actions and to adapt our decisions accordingly changes over the course of the human lifespan. In recent years, there has been an increasing interest in using computational models to understand developmental changes in learning and decision-making. Moreover, extensions of these models are currently applied to study socio-emotional influences on learning in different age groups, a topic that is of great relevance for applications in education and health psychology. In this article, we aim to provide an introduction to basic ideas underlying computational models of reinforcement learning and focus on parameters and model variants that might be of interest to developmental scientists. We then highlight recent attempts to use reinforcement learning models to study the influence of social information on learning across development. The aim of this review is to illustrate how computational models can be applied in developmental science, what they can add to our understanding of developmental mechanisms and how they can be used to bridge the gap between psychological and neurobiological theories of development. PMID:29250006
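
    To make the "parameters and model variants" above concrete, the following is a generic two-armed-bandit learner (an illustration, not a model from the article): a delta-rule value update with learning rate alpha and a softmax choice rule with inverse temperature beta, the two parameters most often compared across age groups.

    ```python
    import numpy as np

    def simulate_learner(n_trials=200, alpha=0.3, beta=3.0,
                         p_reward=(0.8, 0.2), seed=0):
        """Two-armed bandit: delta-rule updates plus softmax action selection."""
        rng = np.random.default_rng(seed)
        q = np.zeros(2)                              # learned action values
        choices = []
        for _ in range(n_trials):
            p = np.exp(beta * q)
            p /= p.sum()                             # softmax choice probabilities
            c = rng.choice(2, p=p)
            r = float(rng.random() < p_reward[c])    # probabilistic reward
            q[c] += alpha * (r - q[c])               # prediction-error update
            choices.append(c)
        return q, choices

    q, _ = simulate_learner()
    print("learned values:", q)
    ```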

  16. Integration of Computational Geometry, Finite Element, and Multibody System Algorithms for the Development of New Computational Methodology for High-Fidelity Vehicle Systems Modeling and Simulation. ADDENDUM

    DTIC Science & Technology

    2013-11-12

    Contractor: Computational Dynamics Inc. (CDI). Technical point of contact: Dr. Paramsothy Jayakumar, TARDEC. Project summary (truncated in the source record): this project aims at addressing and remedying the serious... Reference: Shabana, A.A., Jayakumar, P., and Letherwood, M., "Soil Models and Vehicle System Dynamics", Applied Mechanics Reviews, Vol. 65(4), 2013.

  17. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    ERIC Educational Resources Information Center

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  18. Comparing Computer-Supported Dynamic Modeling and "Paper & Pencil" Concept Mapping Technique in Students' Collaborative Activity

    ERIC Educational Resources Information Center

    Komis, Vassilis; Ergazaki, Marida; Zogza, Vassiliki

    2007-01-01

    This study aims at highlighting the collaborative activity of two high school students (age 14) in the cases of modeling the complex biological process of plant growth with two different tools: the "paper & pencil" concept mapping technique and the computer-supported educational environment "ModelsCreator". Students' shared activity in both cases…

  19. Modeling the Contribution of Phonotactic Cues to the Problem of Word Segmentation

    ERIC Educational Resources Information Center

    Blanchard, Daniel; Heinz, Jeffrey; Golinkoff, Roberta

    2010-01-01

    How do infants find the words in the speech stream? Computational models help us understand this feat by revealing the advantages and disadvantages of different strategies that infants might use. Here, we outline a computational model of word segmentation that aims both to incorporate cues proposed by language acquisition researchers and to…

  1. The effects of a FRAX revision for the US

    USDA-ARS?s Scientific Manuscript database

    The aim of this study was to determine the effects of a revision of the epidemiological data used to compute fracture probabilities in the US with FRAX®. Models were constructed to compute fracture probabilities based on updated epidemiology of fracture in the US. The models c...

  2. The Impact of Iranian Teachers' Cultural Values on Computer Technology Acceptance

    ERIC Educational Resources Information Center

    Sadeghi, Karim; Saribagloo, Javad Amani; Aghdam, Samad Hanifepour; Mahmoudi, Hojjat

    2014-01-01

    This study was conducted with the aim of testing the technology acceptance model and the impact of Hofstede cultural values (masculinity/femininity, uncertainty avoidance, individualism/collectivism, and power distance) on computer technology acceptance among teachers at Urmia city (Iran) using the structural equation modeling approach. From among…

  3. Analysing Students' Shared Activity while Modeling a Biological Process in a Computer-Supported Educational Environment

    ERIC Educational Resources Information Center

    Ergazaki, M.; Zogza, V.; Komis, V.

    2007-01-01

    This paper reports on a case study with three dyads of high school students (age 14 years) each collaborating on a plant growth modeling task in the computer-supported educational environment "ModelsCreator". Following a qualitative line of research, the present study aims at highlighting the ways in which the collaborating students as well as the…

  4. A Computational Model of Early Argument Structure Acquisition

    ERIC Educational Resources Information Center

    Alishahi, Afra; Stevenson, Suzanne

    2008-01-01

    How children go about learning the general regularities that govern language, as well as keeping track of the exceptions to them, remains one of the challenging open questions in the cognitive science of language. Computational modeling is an important methodology in research aimed at addressing this issue. We must determine appropriate learning…

  5. Choosing Learning Methods Suitable for Teaching and Learning in Computer Science

    ERIC Educational Resources Information Center

    Taylor, Estelle; Breed, Marnus; Hauman, Ilette; Homann, Armando

    2013-01-01

    Our aim is to determine which teaching methods students in Computer Science and Information Systems prefer. There are in total 5 different paradigms (behaviorism, cognitivism, constructivism, design-based and humanism) with 32 models between them. Each model is unique and states different learning methods. Recommendations are made on methods that…

  6. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  7. Integrating Computer-Assisted Language Learning in Saudi Schools: A Change Model

    ERIC Educational Resources Information Center

    Alresheed, Saleh; Leask, Marilyn; Raiker, Andrea

    2015-01-01

    Computer-assisted language learning (CALL) technology and pedagogy have gained recognition globally for their success in supporting second language acquisition (SLA). In Saudi Arabia, the government aims to provide most educational institutions with computers and networking for integrating CALL into classrooms. However, the recognition of CALL's…

  8. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    PubMed Central

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  9. Periodicity computation of generalized mathematical biology problems involving delay differential equations.

    PubMed

    Jasim Mohammed, M; Ibrahim, Rabha W; Ahmad, M Z

    2017-03-01

    In this paper, we consider a low initial population model. Our aim is to study the periodicity computation of this model by using neutral differential equations, which are recognized in various studies including biology. We generalize the neutral Rayleigh equation for the third-order by exploiting the model of fractional calculus, in particular the Riemann-Liouville differential operator. We establish the existence and uniqueness of a periodic computational outcome. The technique depends on the continuation theorem of the coincidence degree theory. Besides, an example is presented to demonstrate the finding.
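
    The abstract invokes the Riemann-Liouville operator without stating it. For reference, its textbook definition (not quoted from the paper) is

    ```latex
    \left(D^{\alpha}_{a^{+}} f\right)(t)
      = \frac{1}{\Gamma(n-\alpha)}\,\frac{d^{n}}{dt^{n}}
        \int_{a}^{t} \frac{f(s)}{(t-s)^{\alpha-n+1}}\,ds,
    \qquad n-1 < \alpha \le n,\; n \in \mathbb{N},
    ```

    which reduces to the ordinary n-th derivative when alpha = n.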

  10. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent Base Models (ABM) to optimize large scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper either use computational algorithms or procedural implementations developed in Matlab to simulate agent-based models, running on clusters that serve as a high-performance computing platform for parallel execution. In either case, a model is defined as a compilation of structures and processes assumed to underlie the behavior of a network system.

  11. Content Analysis in Computer-Mediated Communication: Analyzing Models for Assessing Critical Thinking through the Lens of Social Constructivism

    ERIC Educational Resources Information Center

    Buraphadeja, Vasa; Dawson, Kara

    2008-01-01

    This article reviews content analysis studies aimed to assess critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…

  12. Pedagogy and Processes for a Computer Programming Outreach Workshop--The Bridge to College Model

    ERIC Educational Resources Information Center

    Tangney, Brendan; Oldham, Elizabeth; Conneely, Claire; Barrett, Stephen; Lawlor, John

    2010-01-01

    This paper describes a model for computer programming outreach workshops aimed at second-level students (ages 15-16). Participants engage in a series of programming activities based on the Scratch visual programming language, and a very strong group-based pedagogy is followed. Participants are not required to have any prior programming experience.…

  13. Determination of the Computer Self-Efficacy Perception of Students and Metaphors Related to "Computer Ownership"

    ERIC Educational Resources Information Center

    Gecer, Aynur

    2013-01-01

    The aim of this research is to determine the computer self-efficacy perception of second grade primary school students and their opinions regarding computer ownership through metaphors. The research applied the scanning model and was conducted during the 2011-2012 academic year among seven primary schools of the Ministry of National Education in…

  14. Parallelizing a peanut butter sandwich

    NASA Astrophysics Data System (ADS)

    Quenette, S. M.

    2005-12-01

    This poster aims to demonstrate, in a novel way, why contemporary computational code development seems hard to a geodynamics modeler (i.e. a non-computer-scientist). For example, to utilise contemporary computer hardware, parallelisation is required. But why do we choose the explicit approach (MPI) over an implicit (OpenMP) one? How does this relate to typical geodynamics codes? And do we face the same style of problem in everyday life? We aim to demonstrate that a little complexity, forethought and effort is worthwhile.
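
    To make the explicit-versus-implicit contrast concrete: with MPI the modeler decomposes the domain by hand and exchanges boundary ("halo") values through explicit messages, whereas OpenMP parallelizes loops over shared memory. Below is a minimal, hypothetical halo exchange using Python's mpi4py bindings (the poster names no toolkit; mpi4py is an assumption made here for illustration):

    ```python
    # Run with, e.g.: mpiexec -n 4 python halo.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each rank owns one slab of a hypothetical 1-D periodic model domain.
    local = np.full(100, float(rank))
    left, right = (rank - 1) % size, (rank + 1) % size

    # Explicit message passing: send my right edge, receive my left halo.
    halo = np.empty(1)
    comm.Sendrecv(local[-1:], dest=right, recvbuf=halo, source=left)
    print(f"rank {rank}: left halo value {halo[0]} came from rank {left}")
    ```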

  15. A conceptual and computational model of moral decision making in human and artificial agents.

    PubMed

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we will elucidate a process whereby an agent can work through an ethical problem to reach a solution that takes account of ethically relevant factors. Copyright © 2010 Cognitive Science Society, Inc.

  16. Local spatio-temporal analysis in vision systems

    NASA Astrophysics Data System (ADS)

    Geisler, Wilson S.; Bovik, Alan; Cormack, Lawrence; Ghosh, Joydeep; Gildeen, David

    1994-07-01

    The aims of this project are the following: (1) develop a physiologically and psychophysically based model of low-level human visual processing (a key component of which are local frequency coding mechanisms); (2) develop image models and image-processing methods based upon local frequency coding; (3) develop algorithms for performing certain complex visual tasks based upon local frequency representations; (4) develop models of human performance in certain complex tasks based upon our understanding of low-level processing; and (5) develop a computational testbed for implementing, evaluating and visualizing the proposed models and algorithms, using a massively parallel computer. Progress has been substantial on all aims. The highlights include the following: (1) completion of a number of psychophysical and physiological experiments revealing new, systematic and exciting properties of the primate (human and monkey) visual system; (2) further development of image models that can accurately represent the local frequency structure in complex images; (3) near completion of the construction of the Texas Active Vision Testbed; (4) development and testing of several new computer vision algorithms dealing with shape-from-texture, shape-from-stereo, and depth-from-focus; (5) implementation and evaluation of several new models of human visual performance; and (6) evaluation, purchase and installation of a MasPar parallel computer.

  17. Evaluation of a Computational Model of Situational Awareness

    NASA Technical Reports Server (NTRS)

    Burdick, Mark D.; Shively, R. Jay; Rutkewski, Michael (Technical Monitor)

    2000-01-01

    Although the use of the psychological construct of situational awareness (SA) assists researchers in creating a flight environment that is safer and more predictable, its true potential remains untapped until a valid means of predicting SA a priori becomes available. Previous work proposed a computational model of SA (CSA) that sought to fill that void. The current line of research is aimed at validating that model. The results show that the model accurately predicted SA in a piloted simulation.

  18. Structure of the Brazilian Sign Language (Libras) for Computational Tools: Citizenship and Social Inclusion

    NASA Astrophysics Data System (ADS)

    Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez

    This work presents a computational model (XML) of the Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a base of videos whose aim is to support the development of tools to support genuine social inclusion of the deaf.

  19. Exploring Biomolecular Recognition by Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Wade, Rebecca

    2007-12-01

    Biomolecular recognition is complex. The balance between the different molecular properties that contribute to molecular recognition, such as shape, electrostatics, dynamics and entropy, varies from case to case. This, along with the extent of experimental characterization, influences the choice of appropriate computational approaches to study biomolecular interactions. I will present computational studies in which we aim to make concerted use of bioinformatics, biochemical network modeling and molecular simulation techniques to study protein-protein and protein-small molecule interactions and to facilitate computer-aided drug design.

  20. From good intentions to healthy habits: towards integrated computational models of goal striving and habit formation.

    PubMed

    Pirolli, Peter

    2016-08-01

    Computational models were developed in the ACT-R neurocognitive architecture to address some aspects of the dynamics of behavior change. The simulations aim to address the day-to-day goal achievement data available from mobile health systems. The models refine current psychological theories of self-efficacy, intended effort, and habit formation, and provide an account for the mechanisms by which goal personalization, implementation intentions, and remindings work.

  1. Multi-scale modeling in cell biology

    PubMed Central

    Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick

    2009-01-01

    Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808

  2. dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    This report introduces publications reporting the results of a project that aimed to design a computational framework enabling computational experimentation at scale while supporting the model of “submit locally, compute globally”. The project focuses on estimating application resource needs, finding the appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during the run.

  3. Development of three-dimensional patient face model that enables real-time collision detection and cutting operation for a dental simulator.

    PubMed

    Yamaguchi, Satoshi; Yamada, Yuya; Yoshida, Yoshinori; Noborio, Hiroshi; Imazato, Satoshi

    2012-01-01

    The virtual reality (VR) simulator is a useful tool for developing dental hand skills. However, VR simulations that include patient reactions have only limited computational time in which to reproduce a face model. Our aim was to develop a patient face model that enables real-time collision detection and cutting operations by using stereolithography (STL) and deterministic finite automaton (DFA) data files. We evaluated how the computational cost depends on the way the STL and DFA data files are combined, constructed the patient face model using the optimum combination, and assessed the computational costs of the do-nothing, collision, cutting, and combined collision-and-cutting operations. The face model was successfully constructed with low computational costs of 11.3, 18.3, 30.3, and 33.5 ms for do-nothing, collision, cutting, and collision and cutting, respectively. The patient face model could be useful for developing dental hand skills with VR.
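
    As a rough sketch of the geometric test that a real-time collision check against an STL mesh reduces to (generic geometry, not the authors' DFA-based implementation), a spherical tool tip collides with a facet when the point-to-triangle distance drops below the tool radius:

    ```python
    import numpy as np

    def point_segment_dist(p, a, b):
        ab = b - a
        t = np.clip((p - a) @ ab / (ab @ ab), 0.0, 1.0)
        return np.linalg.norm(p - (a + t * ab))

    def point_triangle_dist(p, a, b, c):
        """Distance from p to triangle (a, b, c); assumes a non-degenerate facet."""
        n = np.cross(b - a, c - a)
        n = n / np.linalg.norm(n)
        q = p - ((p - a) @ n) * n             # project p onto the facet plane
        v0, v1, v2 = b - a, c - a, q - a      # barycentric containment test
        d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
        d20, d21 = v2 @ v0, v2 @ v1
        denom = d00 * d11 - d01 * d01
        v = (d11 * d20 - d01 * d21) / denom
        w = (d00 * d21 - d01 * d20) / denom
        if v >= 0.0 and w >= 0.0 and v + w <= 1.0:
            return abs((p - a) @ n)           # projection falls inside the facet
        return min(point_segment_dist(p, a, b),
                   point_segment_dist(p, b, c),
                   point_segment_dist(p, c, a))

    def tool_hits_mesh(center, radius, facets):
        """facets: (N, 3, 3) array of STL triangles, one row of 3 vertices each."""
        return any(point_triangle_dist(center, *f) <= radius for f in facets)
    ```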

  4. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
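
    The article's worked examples are in MATLAB and R; the sketch below is a Python analogue (an adaptation, not code from the article) of the embarrassingly parallel pattern it teaches: independent Monte Carlo replications of a risk model fanned out across worker processes.

    ```python
    import multiprocessing as mp
    import random

    def one_replication(seed):
        """One independent simulation run of a placeholder loss model."""
        rng = random.Random(seed)
        return sum(rng.gauss(1.0, 0.3) for _ in range(1000))

    if __name__ == "__main__":
        with mp.Pool() as pool:               # one worker per CPU core by default
            losses = sorted(pool.map(one_replication, range(10_000)))
        print("mean loss:", sum(losses) / len(losses))
        print("95th percentile:", losses[int(0.95 * len(losses))])
    ```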

  5. Supporting Scientific Modeling Practices in Atmospheric Sciences: Intended and Actual Affordances of a Computer-Based Modeling Tool

    ERIC Educational Resources Information Center

    Wu, Pai-Hsing; Wu, Hsin-Kai; Kuo, Che-Yu; Hsu, Ying-Shao

    2015-01-01

    Computer-based learning tools include design features to enhance learning but learners may not always perceive the existence of these features and use them in desirable ways. There might be a gap between what the tool features are designed to offer (intended affordance) and what they are actually used (actual affordance). This study thus aims at…

  6. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  7. Computational models and motor learning paradigms: Could they provide insights for neuroplasticity after stroke? An overview.

    PubMed

    Kiper, Pawel; Szczudlik, Andrzej; Venneri, Annalena; Stozek, Joanna; Luque-Moreno, Carlos; Opara, Jozef; Baba, Alfonc; Agostini, Michela; Turolla, Andrea

    2016-10-15

    Computational approaches for modelling the central nervous system (CNS) aim to develop theories on processes occurring in the brain that allow the transformation of all information needed for the execution of motor acts. Computational models have been proposed in several fields, to interpret not only the CNS functioning, but also its efferent behaviour. Computational model theories can provide insights into neuromuscular and brain function allowing us to reach a deeper understanding of neuroplasticity. Neuroplasticity is the process occurring in the CNS that is able to permanently change both structure and function due to interaction with the external environment. To understand such a complex process several paradigms related to motor learning and computational modeling have been put forward. These paradigms have been explained through several internal model concepts, and supported by neurophysiological and neuroimaging studies. Therefore, it has been possible to make theories about the basis of different learning paradigms according to known computational models. Here we review the computational models and motor learning paradigms used to describe the CNS and neuromuscular functions, as well as their role in the recovery process. These theories have the potential to provide a way to rigorously explain all the potential of CNS learning, providing a basis for future clinical studies. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Initial draft of CSE-UCLA evaluation model based on weighted product in order to optimize digital library services in computer college in Bali

    NASA Astrophysics Data System (ADS)

    Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.

    2018-01-01

    The aim of this research was to create an initial design of the CSE-UCLA evaluation model, modified with Weighted Product, for evaluating the digital library service at a Computer College in Bali. The research used a developmental research method following Borg and Gall's model design. The result obtained from the research conducted earlier this month was a rough sketch of the Weighted Product based CSE-UCLA evaluation model; the design was already able to provide a general overview of the stages of the Weighted Product based CSE-UCLA evaluation model used to optimize the digital library services at the Computer Colleges in Bali.
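
    The abstract leaves the Weighted Product step implicit. In its standard multi-criteria form, each alternative is scored as the product of its criterion ratings raised to the criterion weights; a minimal sketch with hypothetical criteria and ratings:

    ```python
    import numpy as np

    # Rows: alternatives (e.g., digital library service options); columns:
    # hypothetical evaluation criteria. All values are illustrative only.
    ratings = np.array([[4.0, 3.0, 5.0],
                        [5.0, 4.0, 2.0],
                        [3.0, 5.0, 4.0]])
    weights = np.array([0.5, 0.3, 0.2])       # benefit weights, summing to 1

    scores = np.prod(ratings ** weights, axis=1)   # S_i = prod_j x_ij ** w_j
    print("scores:", scores, "-> best alternative:", int(scores.argmax()))
    ```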

  9. A Multidimensional Approach to Determinants of Computer Use in Primary Education: Teacher and School Characteristics

    ERIC Educational Resources Information Center

    Tondeur, J.; Valcke, M.; van Braak, J.

    2008-01-01

    The central aim of this study was to test a model that integrates determinants of educational computer use. In particular, the article examines teacher and school characteristics that are associated with different types of computer use by primary school teachers. A survey was set up, involving 527 teachers from 68 primary schools in Flanders. A…

  10. Biology Teacher and Expert Opinions about Computer Assisted Biology Instruction Materials: A Software Entitled Nucleic Acids and Protein Synthesis

    ERIC Educational Resources Information Center

    Hasenekoglu, Ismet; Timucin, Melih

    2007-01-01

    The aim of this study is to collect and evaluate opinions of CAI experts and biology teachers about a high school level Computer Assisted Biology Instruction Material presenting computer-made modelling and simulations. It is a case study. A material covering "Nucleic Acids and Protein Synthesis" topic was developed as the…

  11. The Virtual Liver: Modeling Chemical-Induced Liver Toxicity

    EPA Science Inventory

    The US EPA Virtual Liver (v-Liver) project is aimed at modeling chemical-induced processes in hepatotoxicity and simulating their dose-dependent perturbations. The v-Liver embodies an emerging field of research in computational tissue modeling that integrates molecular and cellul...

  12. Integrative approaches to computational biomedicine

    PubMed Central

    Coveney, Peter V.; Diaz-Zuccarini, Vanessa; Graf, Norbert; Hunter, Peter; Kohl, Peter; Tegner, Jesper; Viceconti, Marco

    2013-01-01

    The new discipline of computational biomedicine is concerned with the application of computer-based techniques and particularly modelling and simulation to human health. Since 2007, this discipline has been synonymous, in Europe, with the name given to the European Union's ambitious investment in integrating these techniques with the eventual aim of modelling the human body as a whole: the virtual physiological human. This programme and its successors are expected, over the next decades, to transform the study and practice of healthcare, moving it towards the priorities known as ‘4P's’: predictive, preventative, personalized and participatory medicine.

  13. Improving science and mathematics education with computational modelling in interactive engagement environments

    NASA Astrophysics Data System (ADS)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  14. PROTO-PLASM: parallel language for adaptive and scalable modelling of biosystems.

    PubMed

    Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto

    2008-09-13

    This paper discusses the design goals and the first developments of PROTO-PLASM, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the PROTO-PLASM platform is still in its infancy. Its computational framework--language, model library, integrated development environment and parallel engine--intends to provide patient-specific computational modelling and simulation of organs and biosystem, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. PROTO-PLASM may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a PROTO-PLASM program. Here we exemplify the basic functionalities of PROTO-PLASM, by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions.

  15. Teacher Effects, Value-Added Models, and Accountability

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2014-01-01

    Background: In the last decade, the effects of teachers on student performance (typically manifested as state-wide standardized tests) have been re-examined using statistical models that are known as value-added models. These statistical models aim to compute the unique contribution of the teachers in promoting student achievement gains from grade…
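
    The abstract does not spell the models out. A common value-added specification (general background, not necessarily the article's model) regresses a student's current score on prior achievement and covariates, with the teacher contribution as its own term:

    ```latex
    y_{ijt} = \mu + \beta\, y_{ij,t-1}
              + \mathbf{x}_{ij}^{\top}\boldsymbol{\gamma}
              + \theta_{j} + \varepsilon_{ijt},
    ```

    where y_{ijt} is the achievement of student i taught by teacher j in year t, x_{ij} collects student covariates, and theta_j is the estimated teacher effect.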

  16. Other Cosmic Ray Links

    Science.gov Websites

    curriculum for its course Physics In and Through Cosmology. The Distributed Observatory aims to become the world's largest cosmic ray telescope, using the distributed sensing and computing power of the world's cell phones. Modeled after the distributed computing efforts of SETI@Home and Folding@Home, the

  17. How Computer-Assisted Teaching in Physics Can Enhance Student Learning

    ERIC Educational Resources Information Center

    Karamustafaoglu, O.

    2012-01-01

    Simple harmonic motion (SHM) is an important topic for physics or science students and has wide applications all over the world. Computer simulations are applications of special interest in physics teaching because they support powerful modeling environments involving physics concepts. This article is aimed to compare the effect of…

  18. Inquiry-Based Learning of Molecular Phylogenetics

    ERIC Educational Resources Information Center

    Campo, Daniel; Garcia-Vazquez, Eva

    2008-01-01

    Reconstructing phylogenies from nucleotide sequences is a challenge for students because it strongly depends on evolutionary models and computer tools that are frequently updated. We present here an inquiry-based course aimed at learning how to trace a phylogeny based on sequences existing in public databases. Computer tools are freely available…

  19. Computer display and manipulation of biological molecules

    NASA Technical Reports Server (NTRS)

    Coeckelenbergh, Y.; Macelroy, R. D.; Hart, J.; Rein, R.

    1978-01-01

    This paper describes a computer model that was designed to investigate the conformation of molecules, macromolecules and subsequent complexes. Utilizing an advanced 3-D dynamic computer display system, the model is sufficiently versatile to accommodate a large variety of molecular input and to generate data for multiple purposes such as visual representation of conformational changes, and calculation of conformation and interaction energy. Molecules can be built on the basis of several levels of information. These include the specification of atomic coordinates and connectivities and the grouping of building blocks and duplicated substructures using symmetry rules found in crystals and polymers such as proteins and nucleic acids. Called AIMS (Ames Interactive Molecular modeling System), the model is now being used to study pre-biotic molecular evolution toward life.

  1. Transonic Blunt Body Aerodynamic Coefficients Computation

    NASA Astrophysics Data System (ADS)

    Sancho, Jorge; Vargas, M.; Gonzalez, Ezequiel; Rodriguez, Manuel

    2011-05-01

    In the framework of EXPERT (European Experimental Re-entry Test-bed), accurate transonic aerodynamic coefficients are of paramount importance for correct trajectory assessment and parachute deployment. A strategy combining CFD (Computational Fluid Dynamics) modelling with an experimental campaign was selected to obtain accurate coefficients. A preliminary set of coefficients was obtained by inviscid Euler CFD computation. An experimental campaign was then performed at the DNW facilities at NLR. A thorough review of the CFD modelling, informed by the WTT results, was carried out, aimed at obtaining reliable values of the coefficients in the future (especially the pitching moment). The study includes different turbulence models and a mesh sensitivity analysis. Comparison with the WTT results is explored, and lessons learnt are collected.

  2. The neuroscience of vision-based grasping: a functional review for computational modeling and bio-inspired robotics.

    PubMed

    Chinellato, Eris; Del Pobil, Angel P

    2009-06-01

    The topic of vision-based grasping is being widely studied in humans and in other primates using various techniques and with different goals. The fundamental related findings are reviewed in this paper, with the aim of providing researchers from different fields, including intelligent robotics and neural computation, a comprehensive but accessible view on the subject. A detailed description of the principal sensorimotor processes and the brain areas involved is provided following a functional perspective, in order to make this survey especially useful for computational modeling and bio-inspired robotic applications.

  3. Modelling and Optimizing Mathematics Learning in Children

    ERIC Educational Resources Information Center

    Käser, Tanja; Busetto, Alberto Giovanni; Solenthaler, Barbara; Baschera, Gian-Marco; Kohn, Juliane; Kucian, Karin; von Aster, Michael; Gross, Markus

    2013-01-01

    This study introduces a student model and control algorithm, optimizing mathematics learning in children. The adaptive system is integrated into a computer-based training system for enhancing numerical cognition aimed at children with developmental dyscalculia or difficulties in learning mathematics. The student model consists of a dynamic…

  4. Mathematical and computational modeling simulation of solar drying Systems

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling of solar drying systems has the primary aim of predicting the required drying time for a given commodity, dryer type, and environment. Both fundamental (Fickian diffusion) and semi-empirical drying models have been applied to the solar drying of a variety of agricultural commo...
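
    For reference, representative forms of the two model families the abstract names (standard drying-literature equations, not necessarily those used in this manuscript) are Fick's second law for moisture diffusion and the semi-empirical Page thin-layer model:

    ```latex
    \frac{\partial M}{\partial t} = D\,\nabla^{2} M,
    \qquad
    MR = \frac{M_t - M_e}{M_0 - M_e} = e^{-k\,t^{\,n}},
    ```

    where M is the moisture content, D the effective diffusivity, MR the moisture ratio (with current, equilibrium and initial moisture M_t, M_e, M_0), and k, n fitted drying constants.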

  5. Application of the rapid update cycle (RUC) to aircraft flight simulation.

    DOT National Transportation Integrated Search

    2008-01-01

    An aircraft flight simulation model under development aims to provide a computer simulation tool to investigate aircraft flight performance during en route flight and landing under various atmospheric conditions [1]. Within this model, the ai...

  6. Integration of Computational Geometry, Finite Element, and Multibody System Algorithms for the Development of New Computational Methodology for High-Fidelity Vehicle Systems Modeling and Simulation

    DTIC Science & Technology

    2013-04-11

    ...vehicle dynamics. Project Summary: This project aims at addressing and...applications. This literature review is being summarized and incorporated into the paper. The commentary provided by Dr. Jayakumar was addressed and...

  7. Emotion-affected decision making in human simulation.

    PubMed

    Zhao, Y; Kang, J; Wright, D K

    2006-01-01

    Human modelling is an interdisciplinary research field. The topic of emotion-affected decision making was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the emotion process with decision making. A computational emotion model is proposed, and the initial framework of this model in virtual human simulation within the platform of Virtools is presented.

  8. The European computer model for optronic system performance prediction (ECOMOS)

    NASA Astrophysics Data System (ADS)

    Repasi, Endre; Bijl, Piet; Labarre, Luc; Wittenstein, Wolfgang; Bürsing, Helge

    2017-05-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. This includes two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is presented. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given as well as a short outlook on validation tests and the future potential of simulation for sensor assessment.

  9. Phase II, improved work zone design guidelines and enhanced model of traffic delays in work zones : final report, March 2009.

    DOT National Transportation Integrated Search

    2009-03-01

    This project contains three major parts. In the first part, a digital computer simulation model was developed with the aim of modeling traffic through a freeway work zone. The model was based on the Arena simulation software and used cumula...

  10. Phase II, improved work zone design guidelines and enhanced model of traffic delays in work zones : executive summary report.

    DOT National Transportation Integrated Search

    2009-03-01

    This project contains three major parts. In the first part, a digital computer simulation model was developed with the aim of modeling traffic through a freeway work zone. The model was based on the Arena simulation software and used cumula...

  11. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to it. Thus, a number of variables affect the validity of computational modeling studies; data quality and the number of data points are among the most relevant factors, and the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. Synthetic validity tests are recommended for future applications. Beyond that, we propose making the demonstration of sufficient validity through adequate simulations mandatory for computational modeling studies.
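
    The synthetic-validity logic described above is easy to reproduce in miniature. The sketch below substitutes a simple BIC comparison for the study's Bayesian exceedance-probability machinery and checks whether model selection recovers a known data-generating model at different noise levels and sample sizes; all predictors and constants are illustrative, not the paper's.

    ```python
    # Minimal synthetic validity test: simulate single-trial amplitudes from a
    # known "data-generating" model, then see whether model selection recovers
    # it. BIC stands in for the Bayesian model selection used in the study.
    import numpy as np

    rng = np.random.default_rng(0)

    def bic(y, X):
        """BIC of an ordinary least-squares fit of y on design matrix X."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        n, k = X.shape
        return n * np.log(np.mean(resid**2)) + k * np.log(n)

    for n_trials in (20, 200):
        for noise_sd in (0.5, 3.0):
            t = np.arange(n_trials)
            const = np.ones(n_trials)           # predictor of the simplest model
            decay = np.exp(-t / 50.0)           # predictor of the generating model
            y = 2.0 * decay + rng.normal(0.0, noise_sd, n_trials)
            b_simple = bic(y, const[:, None])
            b_full = bic(y, np.column_stack([const, decay]))
            winner = "generating" if b_full < b_simple else "simplest"
            print(f"n={n_trials:3d}  noise={noise_sd}:  winner = {winner}")
    ```

    At small sample sizes and high noise the simpler model tends to win, reproducing in miniature the paper's point that data quantity and quality bound the validity of model identification.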

  12. Comparisons for ESTA-Task3: ASTEC, CESAM and CLÉS

    NASA Astrophysics Data System (ADS)

    Christensen-Dalsgaard, J.

    The ESTA activity under the CoRoT project aims at testing the tools for computing stellar models and oscillation frequencies that will be used in the analysis of asteroseismic data from CoRoT and other large-scale upcoming asteroseismic projects. Here I report results of comparisons between calculations using the Aarhus code (ASTEC) and two other codes, for models that include diffusion and settling. It is found that there are likely deficiencies, requiring further study, in the ASTEC computation of models including convective cores.

  13. Computational Materials Research

    NASA Technical Reports Server (NTRS)

    Hinkley, Jeffrey A. (Editor); Gates, Thomas S. (Editor)

    1996-01-01

    Computational Materials aims to model and predict thermodynamic, mechanical, and transport properties of polymer matrix composites. This workshop, the second coordinated by NASA Langley, reports progress in measurements and modeling at a number of length scales: atomic, molecular, nano, and continuum. Assembled here are presentations on quantum calculations for force field development, molecular mechanics of interfaces, molecular weight effects on mechanical properties, molecular dynamics applied to poling of polymers for electrets, Monte Carlo simulation of aromatic thermoplastics, thermal pressure coefficients of liquids, ultrasonic elastic constants, group additivity predictions, bulk constitutive models, and viscoplasticity characterization.

  14. Does Perceived Ease of Use Mitigate Computer Anxiety and Stimulate Self-Regulated Learning for Pre-Service Teacher Students?

    ERIC Educational Resources Information Center

    Schlag, Myriam; Imhof, Margarete

    2017-01-01

    The aim of this study is to contribute to a better understanding of challenges and factors which influence learning efficiency with electronic-portfolios. Based on the "Technology Acceptance Model" (TAM; Davis, Bagozzi, & Warshaw, 1989) we analyzed "external variables" (e.g., computer-anxiety) that influence technology…

  15. Effects of Educational Beliefs on Attitudes towards Using Computer Technologies

    ERIC Educational Resources Information Center

    Onen, Aysem Seda

    2012-01-01

    This study, which aims to determine the relationship between pre-service teachers' beliefs about education and their attitudes towards using computers and the internet, is a descriptive study based on the survey model. The sample consisted of 270 pre-service teachers. The potential relationship between the beliefs of pre-service teachers about…

  16. Computational Modeling and Mathematics Applied to the Physical Sciences.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    One aim of this report is to show and emphasize that in the computational approaches to most of today's pressing and challenging scientific and technological problems, the mathematical aspects cannot and should not be considered in isolation. Following an introductory chapter, chapter 2 discusses a number of typical problems leading to…

  17. Learning Vocabulary in a Foreign Language: A Computer Software Based Model Attempt

    ERIC Educational Resources Information Center

    Yelbay Yilmaz, Yasemin

    2015-01-01

    This study aimed at devising vocabulary learning software that would help learners learn and retain vocabulary items effectively. Foundational linguistics and learning theories have been adapted to the foreign language vocabulary learning context using computer software named Parole that was designed exclusively for this study. Experimental…

  18. Impact of Collaborative Work on Technology Acceptance: A Case Study from Virtual Computing

    ERIC Educational Resources Information Center

    Konak, Abdullah; Kulturel-Konak, Sadan; Nasereddin, Mahdi; Bartolacci, Michael R.

    2017-01-01

    Aim/Purpose: This paper utilizes the Technology Acceptance Model (TAM) to examine the extent to which acceptance of Remote Virtual Computer Laboratories (RVCLs) is affected by students' technological backgrounds and the role of collaborative work. Background: RVCLs are widely used in information technology and cyber security education to provide…

  19. Development and Validation of a Computer Adaptive EFL Test

    ERIC Educational Resources Information Center

    He, Lianzhen; Min, Shangchao

    2017-01-01

    The first aim of this study was to develop a computer adaptive EFL test (CALT) that assesses test takers' listening and reading proficiency in English with dichotomous items and polytomous testlets. We reported in detail on the development of the CALT, including item banking, determination of suitable item response theory (IRT) models for item…

  20. The Relation between Students' Epistemological Understanding of Computer Models and Their Cognitive Processing on a Modelling Task

    ERIC Educational Resources Information Center

    Sins, Patrick H. M.; Savelsbergh, Elwin R.; van Joolingen, Wouter R.; van Hout-Wolters, Bernadette H. A. M.

    2009-01-01

    While many researchers in science education have argued that students' epistemological understanding of models and of modelling processes would influence their cognitive processing on a modelling task, there has been little direct evidence for such an effect. Therefore, this study aimed to investigate the relation between students' epistemological…

  1. Computational Flow Modeling of Hydrodynamics in Multiphase Trickle-Bed Reactors

    NASA Astrophysics Data System (ADS)

    Lopes, Rodrigo J. G.; Quinta-Ferreira, Rosa M.

    2008-05-01

    This study aims to incorporate the most recent multiphase models in order to investigate the hydrodynamic behavior of a TBR in terms of pressure drop and liquid holdup. Taking into account transport phenomena such as mass and heat transfer, an Eulerian k-fluid model was developed from the volume averaging of the continuity and momentum equations and solved for a 3D representation of the catalytic bed. The computational fluid dynamics (CFD) model predicts hydrodynamic parameters quite well, provided that good closures for fluid/fluid and fluid/particle interactions are incorporated in the multiphase model. Moreover, catalytic performance is investigated with the catalytic wet oxidation of a phenolic pollutant.

  2. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    PubMed

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-04-30

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  3. An inlet analysis for the NASA hypersonic research engine aerothermodynamic integration model

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Russell, J. W.; Mackley, E. A.; Simmonds, A. L.

    1974-01-01

    A theoretical analysis for the inlet of the NASA Hypersonic Research Engine (HRE) Aerothermodynamic Integration Model (AIM) has been undertaken by use of a method-of-characteristics computer program. The purpose of the analysis was to obtain pretest information on the full-scale HRE inlet in support of the experimental AIM program (completed May 1974). Mass-flow-ratio and additive-drag-coefficient schedules were obtained that well defined the range expected in the AIM tests. Mass-weighted average inlet total-pressure recovery, kinetic energy efficiency, and throat Mach numbers were obtained.

  4. Using Artificial Neural Networks in Educational Research: Some Comparisons with Linear Statistical Models.

    ERIC Educational Resources Information Center

    Everson, Howard T.; And Others

    This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIMs) for use in educational measurement. ANN and AIM methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…

  5. Optimal control strategy for a novel computer virus propagation model on scale-free networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chunming; Huang, Haitao

    2016-06-01

    This paper aims to study the combined impact of system reinstallation and network topology on the spread of computer viruses over the Internet. Based on a scale-free network, the paper proposes a novel computer virus propagation model, the SLBOS model. A systematic analysis of this new model shows that the virus-free equilibrium is globally asymptotically stable when the spreading threshold is less than one; conversely, the viral equilibrium is proved to be permanent if the spreading threshold is greater than one. The impacts of different model parameters on the spreading threshold are then analyzed. Next, an optimally controlled SLBOS epidemic model on complex networks is studied, and an optimal control is proved to exist for the control problem. Finally, some numerical simulations are given to illustrate the main results.
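
    The abstract does not reproduce the SLBOS equations, so the sketch below illustrates only the threshold phenomenon it describes, using the classical degree-based mean-field SIS approximation on a scale-free degree distribution, where the threshold in this approximation is lambda_c = <k>/<k^2>. All parameters are illustrative.

    ```python
    # Degree-based mean-field SIS model on a scale-free degree distribution,
    # shown as a stand-in for the SLBOS threshold behavior (not the authors'
    # exact model). Prevalence vanishes below lambda_c and persists above it.
    import numpy as np

    kmax = 200
    k = np.arange(1, kmax + 1)
    pk = k**-3.0
    pk /= pk.sum()                        # scale-free degree distribution P(k) ~ k^-3
    k_mean, k2_mean = (pk * k).sum(), (pk * k**2).sum()
    lam_c = k_mean / k2_mean              # mean-field spreading threshold

    def steady_prevalence(lam, iters=2000):
        rho_k = np.full(kmax, 1e-3)       # infection probability per degree class
        for _ in range(iters):
            theta = (pk * k * rho_k).sum() / k_mean  # prob. a link reaches an infected node
            rho_k = lam * k * theta / (1.0 + lam * k * theta)
        return (pk * rho_k).sum()

    for lam in (0.5 * lam_c, 1.5 * lam_c, 3.0 * lam_c):
        print(f"lambda/lambda_c = {lam/lam_c:.1f}: prevalence = {steady_prevalence(lam):.4f}")
    ```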

  6. Simulation of glioblastoma multiforme (GBM) tumor cells using the Ising model on the Creutz Cellular Automaton

    NASA Astrophysics Data System (ADS)

    Züleyha, Artuç; Ziya, Merdan; Selçuk, Yeşiltaş; Kemal, Öztürk M.; Mesut, Tez

    2017-11-01

    Computational models of tumors are challenging to construct, owing to the complexity of tumor biology and the limits of computational tools; nevertheless, such models provide insight into the interactions between a tumor and its microenvironment, and they have the potential to inform strategies for individualized cancer treatment. To observe a solid brain tumor, glioblastoma multiforme (GBM), we present a two-dimensional Ising model applied on a Creutz cellular automaton (CCA). The aim of this study is to analyze avascular spherical solid tumor growth, treating transitions between non-tumor cells and cancer cells as analogous to phase transitions in a physical system. The Ising model on the CCA algorithm provides a deterministic approach, with discrete time steps and local interactions in position space, to view tumor growth as a function of time. Our simulation results are given for a fixed tumor radius and are compatible with theoretical and clinical data.
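
    A minimal version of the underlying machinery can be sketched as a 2D Ising lattice updated with Creutz's microcanonical demon rule, in which a spin flip is accepted only if a "demon" energy reservoir can pay for (or absorb) the energy change. This is the generic algorithm only, not the paper's tumor-specific transition rules.

    ```python
    # Minimal 2D Ising model with Creutz's demon (microcanonical) updates.
    # Generic illustration: spins here are not mapped to tumor/non-tumor cells.
    import numpy as np

    rng = np.random.default_rng(1)
    L, J = 32, 1.0
    spins = rng.choice([-1, 1], size=(L, L))
    demon = 10.0 * J                              # demon energy reservoir

    def delta_e(s, i, j):
        nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
              + s[i, (j + 1) % L] + s[i, (j - 1) % L])
        return 2.0 * J * s[i, j] * nb             # energy cost of flipping s[i, j]

    for sweep in range(100):
        for _ in range(L * L):
            i, j = rng.integers(L), rng.integers(L)
            dE = delta_e(spins, i, j)
            if dE <= demon:                       # demon pays for (or absorbs) the flip
                spins[i, j] *= -1
                demon -= dE

    print("magnetization per spin:", spins.mean(), " demon energy:", demon)
    ```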

  7. Computational Biochemistry-Enzyme Mechanisms Explored.

    PubMed

    Culka, Martin; Gisdon, Florian J; Ullmann, G Matthias

    2017-01-01

    Understanding enzyme mechanisms is a major task to achieve in order to comprehend how living cells work. Recent advances in biomolecular research provide huge amounts of data on enzyme kinetics and structure. The analysis of diverse experimental results and their combination into an overall picture is, however, often challenging. Microscopic details of enzymatic processes are often anticipated based on several hints from macroscopic experimental data. Computational biochemistry aims at creating a computational model of an enzyme in order to explain microscopic details of the catalytic process and to reproduce or predict macroscopic experimental findings. Results of such computations are in part complementary to experimental data and provide an explanation of a biochemical process at the microscopic level. In order to evaluate the mechanism of an enzyme, a structural model is constructed which can be analyzed by several theoretical approaches. Several simulation methods can and should be combined to get a reliable picture of the process of interest. Furthermore, abstract models of biological systems can be constructed by combining computational and experimental data. In this review, we discuss structural computational models of enzymatic systems. We first discuss various models to simulate enzyme catalysis, and then review various approaches to characterizing the enzyme mechanism, both qualitatively and quantitatively, using different modeling approaches.

  8. Development of a computer-based training (CBT) course for the FSUTMS comprehensive modeling workshop.

    DOT National Transportation Integrated Search

    2008-09-01

    FSUTMS training is a major activity of the Systems Planning Office of the Florida Department of Transportation (FDOT). The training aims to establish and maintain quality assurance for consistent statewide modeling standards and provide up-to-dat...

  9. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  10. Knowledge Representation and Ontologies

    NASA Astrophysics Data System (ADS)

    Grimm, Stephan

    Knowledge representation and reasoning aims at designing computer systems that reason about a machine-interpretable representation of the world. Knowledge-based systems have a computational model of some domain of interest in which symbols serve as surrogates for real world domain artefacts, such as physical objects, events, relationships, etc. [1]. The domain of interest can cover any part of the real world or any hypothetical system about which one desires to represent knowledge for computational purposes. A knowledge-based system maintains a knowledge base, which stores the symbols of the computational model in the form of statements about the domain, and it performs reasoning by manipulating these symbols. Applications can base their decisions on answers to domain-relevant questions posed to a knowledge base.
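
    A toy illustration of this symbol-manipulation view, with a hypothetical domain and a single transitivity rule, might look as follows: facts are stored as tuples and reasoning is forward chaining until no new facts can be derived.

    ```python
    # A toy knowledge-based system: facts are symbol tuples, a rule derives new
    # facts from matched premises, and reasoning is forward chaining to a fixed
    # point. The "isa" predicate and the example domain are hypothetical.
    facts = {("isa", "tweety", "bird"), ("isa", "bird", "animal")}

    def transitivity_rule(fs):
        """isa(x, y) and isa(y, z) => isa(x, z)."""
        new = set()
        isa_facts = [f for f in fs if f[0] == "isa"]
        for (_, x, y) in isa_facts:
            for (_, y2, z) in isa_facts:
                if y == y2:
                    new.add(("isa", x, z))
        return new

    # Forward chaining: apply the rule until no new facts are derived.
    changed = True
    while changed:
        derived = transitivity_rule(facts)
        changed = not derived <= facts
        facts |= derived

    # Answer a domain-relevant question posed to the knowledge base.
    print(("isa", "tweety", "animal") in facts)   # True
    ```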

  11. Computational Modeling of Airway and Pulmonary Vascular Structure and Function: Development of a “Lung Physiome”

    PubMed Central

    Tawhai, M. H.; Clark, A. R.; Donovan, G. M.; Burrowes, K. S.

    2011-01-01

    Computational models of lung structure and function necessarily span multiple spatial and temporal scales, i.e., dynamic molecular interactions give rise to whole organ function, and the link between these scales cannot be fully understood if only molecular or organ-level function is considered. Here, we review progress in constructing multiscale finite element models of lung structure and function that are aimed at providing a computational framework for bridging the spatial scales from molecular to whole organ. These include structural models of the intact lung, embedded models of the pulmonary airways that couple to model lung tissue, and models of the pulmonary vasculature that account for distinct structural differences at the extra- and intra-acinar levels. Biophysically based functional models for tissue deformation, pulmonary blood flow, and airway bronchoconstriction are also described. The development of these advanced multiscale models has led to a better understanding of complex physiological mechanisms that govern regional lung perfusion and emergent heterogeneity during bronchoconstriction. PMID:22011236

  12. A comparative approach to computer aided design model of a dog femur.

    PubMed

    Turamanlar, O; Verim, O; Karabulut, A

    2016-01-01

    Computer-assisted technologies offer new opportunities in medical imaging and rapid prototyping in biomechanical engineering. Three-dimensional (3D) modelling of soft tissues and bones is becoming more important. The accuracy of the analysis in modelling processes depends on the outline of the tissues derived from medical images. The aim of this study is to evaluate the accuracy of 3D models of a dog femur derived from computed tomography data, using the point cloud method and the boundary line method in several modelling software packages. Solidworks, Rapidform and 3DSMax software were used to create the 3D models, and the outcomes were evaluated statistically. The most accurate 3D prototype of the dog femur was created with the stereolithography method using a rapid prototyping device. Furthermore, the linearity of the volumes was investigated between the software and the constructed models. The difference between the software and real models reflects the sensitivity of the software and the devices used.

  13. Development of a Diagnostic and Remedial Learning System Based on an Enhanced Concept--Effect Model

    ERIC Educational Resources Information Center

    Panjaburees, Patcharin; Triampo, Wannapong; Hwang, Gwo-Jen; Chuedoung, Meechoke; Triampo, Darapond

    2013-01-01

    With the rapid advances in computer technology during recent years, researchers have demonstrated the pivotal influences of computer-assisted diagnostic systems on student learning performance improvement. This research aims to develop a Diagnostic and Remedial Learning System (DRLS) for an algebra course in a Thai lower secondary school context…

  14. Comparing Students' Scratch Skills with Their Computational Thinking Skills in Terms of Different Variables

    ERIC Educational Resources Information Center

    Oluk, Ali; Korkmaz, Özgen

    2016-01-01

    This study aimed to compare 5th graders' scores obtained from Scratch projects developed in the framework of Information Technologies and Software classes via the Dr Scratch web tool with their scores on the Computational Thinking Levels Scale, and to examine this comparison in terms of different variables. Correlational research model was…

  15. Design and Development Computer-Based E-Learning Teaching Material for Improving Mathematical Understanding Ability and Spatial Sense of Junior High School Students

    NASA Astrophysics Data System (ADS)

    Nurjanah; Dahlan, J. A.; Wibisono, Y.

    2017-02-01

    This paper aims to design and develop computer-based e-learning teaching materials for improving the mathematical understanding ability and spatial sense of junior high school students. The particular aims are (1) producing a teaching material design, an evaluation model, and instruments to measure the mathematical understanding ability and spatial sense of junior high school students; (2) conducting trials of the computer-based e-learning teaching material model, assessment, and instruments; (3) completing the teaching material models of computer-based e-learning and assessment for developing mathematical understanding ability and spatial sense; and (4) producing the final research product, computer-based e-learning teaching materials in the form of an interactive learning disc. The research method used in this study is developmental research, conducted by thought experiment and instruction experiment. The results showed that the teaching materials could be used very well, based on validation by five multimedia experts. The face and content validity judgements of the five validators agree for each item of the tests of mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests are 0.929 and 0.939, respectively, which is very high, while the validity of both tests meets high and very high criteria.

  16. A Comparative Evaluation of Mixed Dentition Analysis on Reliability of Cone Beam Computed Tomography Image Compared to Plaster Model

    PubMed Central

    Gowd, Snigdha; Shankar, T; Dash, Samarendra; Sahoo, Nivedita; Chatterjee, Suravi; Mohanty, Pritam

    2017-01-01

    Aims and Objective: The aim of the study was to evaluate the reliability of cone beam computed tomography (CBCT) images compared with plaster models for the assessment of mixed dentition analysis. Materials and Methods: Thirty CBCT-derived images and thirty plaster models were retrieved from the dental archives, and Moyer's and Tanaka-Johnston analyses were performed. The data obtained were analyzed statistically using SPSS 10.0/PC (SPSS Inc., Chicago, IL, USA). Descriptive and analytical analyses, along with Student's t-test, were performed to evaluate the data, and P < 0.05 was considered statistically significant. Results: Statistically significant differences were found between CBCT-derived images and plaster models; the means for Moyer's analysis in the left and right lower arch were 21.2 mm and 21.1 mm for CBCT versus 22.5 mm and 22.5 mm for the plaster models, respectively. Conclusion: CBCT-derived images were less reliable than data obtained directly from plaster models for mixed dentition analysis. PMID:28852639
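
    For readers unfamiliar with the analysis, the Tanaka-Johnston prediction in its commonly quoted form estimates the combined mesiodistal width of the unerupted canine and premolars in one quadrant from the widths of the four mandibular incisors. A minimal sketch, with hypothetical measurements, follows.

    ```python
    # Tanaka-Johnston mixed-dentition prediction in its commonly quoted form:
    # predicted canine + premolar width per quadrant =
    #   (sum of the four mandibular incisor widths) / 2
    #   + 10.5 mm (mandibular arch) or + 11.0 mm (maxillary arch).
    def tanaka_johnston(sum_mand_incisors_mm: float, arch: str = "mandibular") -> float:
        offset = 10.5 if arch == "mandibular" else 11.0
        return sum_mand_incisors_mm / 2.0 + offset

    # Example: incisor widths measured on a plaster model (hypothetical values).
    incisors = [5.4, 5.5, 5.9, 6.0]
    print(f"predicted width: {tanaka_johnston(sum(incisors)):.1f} mm")
    ```

    In a study like the one above, predictions computed this way from CBCT measurements and from plaster-model measurements would then be compared with a paired t-test.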

  17. A Computational Model of Human Table Tennis for Robot Application

    NASA Astrophysics Data System (ADS)

    Mülling, Katharina; Peters, Jan

    Table tennis is a difficult motor skill which requires all basic components of a general motor skill learning system. In order to get a step closer to such a generic approach to the automatic acquisition and refinement of table tennis, we study table tennis from a human motor control point of view. We make use of the basic models of discrete human movement phases, virtual hitting points, and the operational timing hypothesis. Using these components, we create a computational model which is aimed at reproducing human-like behavior. We verify the functionality of this model in a physically realistic simulation of a Barrett WAM.

  18. Computational approaches to computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    The various techniques by which the goal of computational aeroacoustics (the computation of a fluctuating fluid flow and the prediction of the noise it generates) may be achieved are reviewed. The governing equations for compressible fluid flow are presented. The direct numerical simulation approach is shown to be computationally intensive for high Reynolds number viscous flows. Therefore, other approaches, such as the acoustic analogy, vortex models, and various perturbation techniques that aim to break the analysis into a viscous part and an acoustic part, are presented. The choice of approach is shown to be problem dependent.

  19. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

    Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale and computationally complex, with uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing the hydrogeological characteristics of the site. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions, and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified by applying it to several computationally intensive examples. Having this framework at hand enables hydrogeologists to choose the optimum physical and statistical resolutions that minimize the error for a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated. We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
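
    The trade-off can be made concrete with a toy error model: a discretization term C1*h**p plus a Monte Carlo sampling term C2/sqrt(N), minimized over grid spacing h and realization count N under a fixed cost budget proportional to N*h**(-d). All constants below are hypothetical placeholders, not the paper's calibrated values.

    ```python
    # Toy version of the joint physical/statistical resolution trade-off:
    # overall error ~ C1*h**p (discretization) + C2/sqrt(N) (Monte Carlo),
    # with cost ~ N * h**(-d) capped by a budget. Constants are hypothetical.
    import numpy as np

    C1, C2, p, d, budget = 1.0, 1.0, 2.0, 3.0, 1e8

    best = None
    for h in np.logspace(-3, -0.5, 60):           # candidate grid spacings
        n_max = int(budget * h**d)                # realizations affordable at this h
        if n_max < 2:
            continue
        err = C1 * h**p + C2 / np.sqrt(n_max)     # overall error model
        if best is None or err < best[0]:
            best = (err, h, n_max)

    err, h, n = best
    print(f"optimal spacing h = {h:.4f}, realizations N = {n}, error = {err:.4e}")
    ```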

  20. Modeling of an intelligent pressure sensor using functional link artificial neural networks.

    PubMed

    Patra, J C; van den Bos, A

    2000-01-01

    A capacitive pressure sensor (CPS) is modeled for accurate readout of applied pressure using a novel artificial neural network (ANN). The proposed functional link ANN (FLANN) is a computationally efficient nonlinear network capable of complex nonlinear mapping between its input and output pattern space. The nonlinearity is introduced into the FLANN by passing the input pattern through a functional expansion unit. Three different polynomial expansions (Chebyshev, Legendre, and power series) have been employed in the FLANN. The FLANN offers a computational advantage over a multilayer perceptron (MLP) for similar performance in modeling the CPS. The prime aim of the present paper is to develop an intelligent model of the CPS involving less computational complexity, so that its implementation can be economical and robust. It is shown that, over a wide temperature range from -50 to 150 degrees C, the maximum error of pressure estimation remains within +/- 3%. With the help of computer simulation, the performance of the three types of FLANN models has been compared to that of an MLP-based model.
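
    The functional-expansion idea is compact enough to sketch: the scalar input is passed through a Chebyshev expansion unit and a single linear layer is fit on the expanded features. Batch least squares here stands in for the LMS-style training typically used with FLANNs, and the sensor curve is a hypothetical stand-in for the CPS response.

    ```python
    # FLANN-style functional expansion: expand the input through Chebyshev
    # polynomials, then fit one linear layer. No hidden layers are needed.
    import numpy as np
    from numpy.polynomial import chebyshev

    rng = np.random.default_rng(2)
    x = np.linspace(-1, 1, 200)                        # normalized sensor readout
    y = 1.0 / (1.2 - x) + rng.normal(0, 0.01, x.size)  # hypothetical nonlinear CPS curve

    Phi = chebyshev.chebvander(x, deg=6)               # functional expansion unit
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)        # train the single linear layer
    y_hat = Phi @ w

    print("max absolute fit error:", np.max(np.abs(y_hat - y)))
    ```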

  1. Final Report Collaborative Project. Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long-term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.

  2. ThinkerTools. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2012

    2012-01-01

    "ThinkerTools" is a computer-based program that aims to develop students' understanding of physics and scientific modeling. The program is composed of two curricula for middle school students, "ThinkerTools Inquiry" and "Model-Enhanced ThinkerTools". "ThinkerTools Inquiry" allows students to explore the…

  3. MODEL DEVELOPMENT AND APPLICATION FOR ASSESSING HUMAN EXPOSURE AND DOSE TO TOXIC CHEMICALS AND POLLUTANTS

    EPA Science Inventory

    This project aims to strengthen the general scientific foundation of EPA's exposure and risk assessment processes by developing state-of-the-art exposure to dose computational models. This research will produce physiologically-based pharmacokinetic (PBPK) and pharmacodynamic (PD)...

  4. Overview of the status of predictive computer models for skin sensitization (JRC Expert meeting on pre- and pro-haptens )

    EPA Science Inventory

    No abstract was prepared or requested. This is a short presentation aiming to summarize the status of existing in silico models and approaches for predicting skin sensitization potential and/or potency.

  5. Multi-objective reverse logistics model for integrated computer waste management.

    PubMed

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
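
    For a problem this small, the cost-risk trade-off can be traced by direct enumeration; the paper itself formulates the problem as an integer linear program with Monte Carlo handling of uncertain waste quantities. All facility numbers below are hypothetical.

    ```python
    # Toy cost-risk trade-off: assign each waste source to one facility with a
    # unit (cost, risk) pair, enumerate all assignments, and keep the
    # non-dominated (Pareto-optimal) ones. Data are illustrative only.
    from itertools import product

    facilities = {"recycle": (4.0, 1.0), "treat": (2.5, 2.0), "landfill": (1.0, 5.0)}
    waste_tonnes = [10.0, 25.0, 7.0]               # three hypothetical sources

    points = []
    for assign in product(facilities, repeat=len(waste_tonnes)):
        cost = sum(w * facilities[f][0] for w, f in zip(waste_tonnes, assign))
        risk = sum(w * facilities[f][1] for w, f in zip(waste_tonnes, assign))
        points.append((cost, risk, assign))

    # Keep only assignments not dominated in both cost and risk.
    pareto = [p for p in points
              if not any(q[0] <= p[0] and q[1] <= p[1] and q[:2] != p[:2]
                         for q in points)]
    for cost, risk, assign in sorted(pareto):
        print(f"cost={cost:7.1f}  risk={risk:6.1f}  {assign}")
    ```

    The printed front makes the paper's observation concrete: a marginal increase in allowable cost can buy a large reduction in risk.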

  6. OpenFOAM: Open source CFD in research and industry

    NASA Astrophysics Data System (ADS)

    Jasak, Hrvoje

    2009-12-01

    The current focus of development in industrial Computational Fluid Dynamics (CFD) is the integration of CFD into Computer-Aided product development, geometrical optimisation, robust design and similar. On the other hand, CFD research aims to extend the boundaries of practical engineering use in "non-traditional" areas. The requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components, and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object-oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of the partial differential equation in software, with code functionality provided in library form. The Open Source deployment and development model allows the user to achieve the desired versatility in physical modeling without sacrificing complex geometry support or execution efficiency.

  7. Parameterized reduced-order models using hyper-dual numbers.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fike, Jeffrey A.; Brake, Matthew Robert

    2013-10-01

    The goal of most computational simulations is to accurately predict the behavior of a real, physical system. Accurate predictions often require very computationally expensive analyses and so reduced order models (ROMs) are commonly used. ROMs aim to reduce the computational cost of the simulations while still providing accurate results by including all of the salient physics of the real system in the ROM. However, real, physical systems often deviate from the idealized models used in simulations due to variations in manufacturing or other factors. One approach to this issue is to create a parameterized model in order to characterize the effect of perturbations from the nominal model on the behavior of the system. This report presents a methodology for developing parameterized ROMs, which is based on Craig-Bampton component mode synthesis and the use of hyper-dual numbers to calculate the derivatives necessary for the parameterization.
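
    Hyper-dual arithmetic itself is simple to sketch: numbers carry two infinitesimal parts eps1 and eps2 with eps1**2 = eps2**2 = 0 but eps1*eps2 != 0, so a single function evaluation yields the value, two first derivatives, and an exact second derivative. The minimal Python illustration below is a generic sketch, not the report's implementation.

    ```python
    # Minimal hyper-dual numbers: one evaluation of f gives f, f', and f''
    # exactly, with no truncation or subtractive cancellation error.
    import math

    class HyperDual:
        def __init__(self, f, e1=0.0, e2=0.0, e12=0.0):
            self.f, self.e1, self.e2, self.e12 = f, e1, e2, e12

        def __mul__(self, o):
            return HyperDual(self.f * o.f,
                             self.f * o.e1 + self.e1 * o.f,
                             self.f * o.e2 + self.e2 * o.f,
                             self.f * o.e12 + self.e1 * o.e2
                             + self.e2 * o.e1 + self.e12 * o.f)

    def sin_hd(x):
        s, c = math.sin(x.f), math.cos(x.f)
        return HyperDual(s, c * x.e1, c * x.e2,
                         c * x.e12 - s * x.e1 * x.e2)

    x = HyperDual(0.7, e1=1.0, e2=1.0)       # seed both infinitesimal directions
    y = sin_hd(x) * sin_hd(x)                # f(x) = sin(x)**2
    print(y.f, y.e1, y.e12)                  # value, f'(x) = sin(2x), f''(x) = 2*cos(2x)
    ```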

  8. Editorial: Computational Creativity, Concept Invention, and General Intelligence

    NASA Astrophysics Data System (ADS)

    Besold, Tarek R.; Kühnberger, Kai-Uwe; Veale, Tony

    2015-12-01

    Over the last decade, computational creativity as a field of scientific investigation and computational systems engineering has seen growing popularity. Still, the levels of development of projects aiming at systems for artistic production or performance and of endeavours addressing creative problem-solving or models of creative cognitive capacities are diverging. While the former have already seen several great successes, the latter still remain in their infancy. This volume collects reports on work trying to close the accrued gap.

  9. Computational Fluid Dynamics. [numerical methods and algorithm development

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This collection of papers was presented at the Computational Fluid Dynamics (CFD) Conference held at Ames Research Center in California on March 12 through 14, 1991. It is an overview of CFD activities at NASA Lewis Research Center. The main thrust of computational work at Lewis is aimed at propulsion systems. Specific issues related to propulsion CFD and associated modeling will also be presented. Examples of results obtained with the most recent algorithm development will also be presented.

  10. Project Photofly: New 3d Modeling Online Web Service (case Studies and Assessments)

    NASA Astrophysics Data System (ADS)

    Abate, D.; Furini, G.; Migliori, S.; Pierattini, S.

    2011-09-01

    During summer 2010, Autodesk released a still ongoing project called Project Photofly, freely downloadable from the AutodeskLab web site until August 1, 2011. Project Photofly, based on computer-vision and photogrammetric principles and exploiting the power of cloud computing, is a web service able to convert collections of photographs into 3D models. The aim of our research was to evaluate Project Photofly, through different case studies, for the 3D modeling of cultural heritage monuments and objects, mostly to identify the goals and objects for which it is suitable. The analysis focuses mainly on the automatic approach.

  11. An accurate binding interaction model in de novo computational protein design of interactions: if you build it, they will bind.

    PubMed

    London, Nir; Ambroggio, Xavier

    2014-02-01

    Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration.

  12. Parallel Markov chain Monte Carlo - bridging the gap to high-performance Bayesian computation in animal breeding and genetics.

    PubMed

    Wu, Xiao-Lin; Sun, Chuanyu; Beissinger, Timothy M; Rosa, Guilherme Jm; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2012-09-25

    Most Bayesian models for the analysis of complex traits are not analytically tractable and inferences are based on computationally intensive techniques. This is true of Bayesian models for genome-enabled selection, which use whole-genome molecular data to predict the genetic merit of candidate animals for breeding purposes. In this regard, parallel computing can overcome the bottlenecks that can arise from serial computing. Hence, a major goal of the present study is to bridge the gap to high-performance Bayesian computation in the context of animal breeding and genetics. Parallel Monte Carlo Markov chain algorithms and strategies are described in the context of animal breeding and genetics. Parallel Monte Carlo algorithms are introduced as a starting point including their applications to computing single-parameter and certain multiple-parameter models. Then, two basic approaches for parallel Markov chain Monte Carlo are described: one aims at parallelization within a single chain; the other is based on running multiple chains, yet some variants are discussed as well. Features and strategies of the parallel Markov chain Monte Carlo are illustrated using real data, including a large beef cattle dataset with 50K SNP genotypes. Parallel Markov chain Monte Carlo algorithms are useful for computing complex Bayesian models, which not only leads to a dramatic speedup in computing but can also be used to optimize model parameters in complex Bayesian models. Hence, we anticipate that use of parallel Markov chain Monte Carlo will have a profound impact on revolutionizing the computational tools for genomic selection programs.
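
    Of the two approaches, the multiple-chains one is the easier to sketch: independent Metropolis samplers run in separate processes and their post-burn-in draws are pooled. The standard-normal target below is a stand-in for a real genomic-selection posterior.

    ```python
    # Multiple-chains parallel MCMC: independent random-walk Metropolis
    # samplers run in separate processes; draws are pooled after burn-in.
    import math
    import random
    from multiprocessing import Pool

    def run_chain(seed, n=20000):
        rng = random.Random(seed)
        log_p = lambda v: -0.5 * v * v          # log density of the N(0, 1) target
        x, draws = 0.0, []
        for _ in range(n):
            prop = x + rng.gauss(0.0, 1.0)      # random-walk proposal
            if math.log(rng.random()) < log_p(prop) - log_p(x):
                x = prop                        # Metropolis accept
            draws.append(x)
        return draws[n // 2:]                   # discard burn-in

    if __name__ == "__main__":
        with Pool(4) as pool:
            chains = pool.map(run_chain, [1, 2, 3, 4])
        pooled = [v for chain in chains for v in chain]
        mean = sum(pooled) / len(pooled)
        var = sum((v - mean) ** 2 for v in pooled) / len(pooled)
        print(f"pooled mean = {mean:.3f}, variance = {var:.3f}")  # ~0 and ~1
    ```

    Within-chain parallelization, the other approach the paper describes, instead distributes the work of each individual update (e.g., the likelihood evaluation) across workers.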

  13. Parallel Markov chain Monte Carlo - bridging the gap to high-performance Bayesian computation in animal breeding and genetics

    PubMed Central

    2012-01-01

    Background Most Bayesian models for the analysis of complex traits are not analytically tractable and inferences are based on computationally intensive techniques. This is true of Bayesian models for genome-enabled selection, which use whole-genome molecular data to predict the genetic merit of candidate animals for breeding purposes. In this regard, parallel computing can overcome the bottlenecks that can arise from serial computing. Hence, a major goal of the present study is to bridge the gap to high-performance Bayesian computation in the context of animal breeding and genetics. Results Parallel Monte Carlo Markov chain algorithms and strategies are described in the context of animal breeding and genetics. Parallel Monte Carlo algorithms are introduced as a starting point including their applications to computing single-parameter and certain multiple-parameter models. Then, two basic approaches for parallel Markov chain Monte Carlo are described: one aims at parallelization within a single chain; the other is based on running multiple chains, yet some variants are discussed as well. Features and strategies of the parallel Markov chain Monte Carlo are illustrated using real data, including a large beef cattle dataset with 50K SNP genotypes. Conclusions Parallel Markov chain Monte Carlo algorithms are useful for computing complex Bayesian models, which not only leads to a dramatic speedup in computing but can also be used to optimize model parameters in complex Bayesian models. Hence, we anticipate that use of parallel Markov chain Monte Carlo will have a profound impact on revolutionizing the computational tools for genomic selection programs. PMID:23009363

  14. The European computer model for optronic system performance prediction (ECOMOS)

    NASA Astrophysics Data System (ADS)

    Keßler, Stefan; Bijl, Piet; Labarre, Luc; Repasi, Endre; Wittenstein, Wolfgang; Bürsing, Helge

    2017-10-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. This includes two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is presented. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given as well as a short discussion of validation tests and an outlook on the future potential of simulation for sensor assessment.

  15. An Opening Chapter of the First Generation of Artificial Intelligence in Medicine: The First Rutgers AIM Workshop, June 1975

    PubMed Central

    2015-01-01

    Summary The first generation of Artificial Intelligence (AI) in Medicine methods was developed in the early 1970s, drawing on insights about problem solving in AI. These methods introduced new ways of representing structured expert knowledge about clinical and biomedical problems using causal, taxonomic, associational, rule, and frame-based models. By 1975, several prototype systems had been developed and clinically tested, and the Rutgers Research Resource on Computers in Biomedicine hosted the first in a series of workshops on AI in Medicine that helped researchers and clinicians share their ideas, demonstrate their models, and comment on the prospects for the field. These developments and the workshops themselves benefited considerably from Stanford's SUMEX-AIM pioneering experiment in biomedical computer networking. This paper focuses on discussions about issues at the intersection of medicine and artificial intelligence that took place during the presentations and panels at the First Rutgers AIM Workshop in New Brunswick, New Jersey from June 14 to 17, 1975. PMID:26123911

  16. An Opening Chapter of the First Generation of Artificial Intelligence in Medicine: The First Rutgers AIM Workshop, June 1975.

    PubMed

    Kulikowski, C A

    2015-08-13

    The first generation of Artificial Intelligence (AI) in Medicine methods was developed in the early 1970s, drawing on insights about problem solving in AI. These methods introduced new ways of representing structured expert knowledge about clinical and biomedical problems using causal, taxonomic, associational, rule, and frame-based models. By 1975, several prototype systems had been developed and clinically tested, and the Rutgers Research Resource on Computers in Biomedicine hosted the first in a series of workshops on AI in Medicine that helped researchers and clinicians share their ideas, demonstrate their models, and comment on the prospects for the field. These developments and the workshops themselves benefited considerably from Stanford's SUMEX-AIM pioneering experiment in biomedical computer networking. This paper focuses on discussions about issues at the intersection of medicine and artificial intelligence that took place during the presentations and panels at the First Rutgers AIM Workshop in New Brunswick, New Jersey from June 14 to 17, 1975.

  17. The Effects of Computer Algebra System on Undergraduate Students' Spatial Visualization Skills in a Calculus Course

    ERIC Educational Resources Information Center

    Karakus, Fatih; Aydin, Bünyamin

    2017-01-01

    This study aimed at determining the effects of using a computer algebra system (CAS) on undergraduate students' spatial visualization skills in a calculus course. This study used an experimental design. The "one group pretest-posttest design" was the research model. The participants were 41 sophomore students (26 female and 15 male)…

  18. Partnership in Being a Specialist Mathematics and Computing College--Who Gains What, How and Why?

    ERIC Educational Resources Information Center

    Sinkinson, Anne J.

    2007-01-01

    The research took place in a mathematics and computing specialist school. The article reports on part of a case study of the mathematics department's experience of being a major contributor to the requirements of being a specialist school. This article aims to explore and describe one model of partnership within the "community" remit of…

  19. Filling the gap between biology and computer science

    PubMed Central

    Aguilar-Ruiz, Jesús S; Moore, Jason H; Ritchie, Marylyn D

    2008-01-01

    This editorial introduces BioData Mining, a new journal which publishes research articles related to advances in computational methods and techniques for the extraction of useful knowledge from heterogeneous biological data. We outline the aims and scope of the journal, introduce the publishing model and describe the open peer review policy, which fosters interaction within the research community. PMID:18822148

  20. Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games

    ERIC Educational Resources Information Center

    Nilsson, Elisabet M.; Jakobsson, Anders

    2011-01-01

    The empirical study, in this article, involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is…

  1. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early-career engineers of today are very proficient in the use of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To counter these trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  2. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to members of the public. Assessment of risk is routed through this dose computation. Dose computation depends on the underlying dose assessment model and the exposure pathways, one of which is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Because the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed by the belief and plausibility measures.
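
    The belief and plausibility measures referred to here are straightforward to compute from a basic probability assignment (mass function) over a frame of discernment; the masses below are illustrative, not the paper's.

    ```python
    # Belief and plausibility in Dempster-Shafer evidence theory:
    # Bel(A) sums the mass of all subsets of A; Pl(A) sums the mass of all
    # sets that intersect A. [Bel(A), Pl(A)] bounds the probability of A.
    def belief(mass, hypothesis):
        return sum(m for s, m in mass.items() if s <= hypothesis)

    def plausibility(mass, hypothesis):
        return sum(m for s, m in mass.items() if s & hypothesis)

    frame = frozenset({"low", "medium", "high"})       # risk levels
    mass = {                                           # illustrative mass assignment
        frozenset({"low"}): 0.4,
        frozenset({"low", "medium"}): 0.3,
        frame: 0.3,                                    # mass on total ignorance
    }

    A = frozenset({"low", "medium"})
    print("Bel =", belief(mass, A), " Pl =", plausibility(mass, A))
    # [Bel, Pl] = [0.7, 1.0] bounds the probability that the risk is at most medium.
    ```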

  3. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open-source solution for grid computing over the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with at least one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour, and to compare it with electroencephalographic signals measured in real brains.
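
    A minimal sketch of the kind of probabilistic cellular-automaton network described above, with excitatory and inhibitory nodes activating or deactivating random neighbors; the network size, probabilities, and reset rule are illustrative assumptions, not the project's actual settings:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, K = 10_000, 10                    # neurons, random neighbors per neuron (assumed)
    p_exc, p_act, p_inh = 0.8, 0.15, 0.1 # excitatory fraction and update probabilities
    neighbors = rng.integers(0, N, size=(N, K))   # random directed network
    excitatory = rng.random(N) < p_exc            # node type
    state = rng.random(N) < 0.01                  # initially active fraction

    activity = []
    for t in range(200):
        new_state = state.copy()
        active = np.flatnonzero(state)
        for i in active:
            targets = neighbors[i]
            if excitatory[i]:   # activate neighbors with probability p_act
                new_state[targets[rng.random(K) < p_act]] = True
            else:               # inhibitory: deactivate neighbors
                new_state[targets[rng.random(K) < p_inh]] = False
        new_state[active] = False   # integrate-and-fire style reset after firing
        state = new_state
        activity.append(state.mean())
    print("mean activity over run:", float(np.mean(activity)))
    ```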

  4. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    PubMed Central

    2011-01-01

    Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355
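
    As an illustration of the compartmental machinery such an engine integrates, here is a minimal single-population stochastic SEIR sketch; the rates are invented for the example and are not GLEaMviz defaults, and the real framework adds metapopulation structure and mobility coupling:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 1_000_000
    S, E, I, R = N - 10, 0, 10, 0
    beta, sigma, gamma = 0.5, 1/3, 1/5   # transmission, incubation, recovery rates (assumed)

    history = []
    for day in range(180):
        # Daily transition counts drawn from binomials on per-capita hazards
        new_exposed    = rng.binomial(S, 1 - np.exp(-beta * I / N))
        new_infectious = rng.binomial(E, 1 - np.exp(-sigma))
        new_recovered  = rng.binomial(I, 1 - np.exp(-gamma))
        S -= new_exposed
        E += new_exposed - new_infectious
        I += new_infectious - new_recovered
        R += new_recovered
        history.append((day, S, E, I, R))
    print("peak infectious:", max(h[3] for h in history))
    ```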

  5. High-resolution Modeling Assisted Design of Customized and Individualized Transcranial Direct Current Stimulation Protocols

    PubMed Central

    Bikson, Marom; Rahman, Asif; Datta, Abhishek; Fregni, Felipe; Merabet, Lotfi

    2012-01-01

    Objectives Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that delivers low-intensity currents facilitating or inhibiting spontaneous neuronal activity. tDCS is attractive since dose is readily adjustable by simply changing electrode number, position, size, shape, and current. In the recent past, computational models have been developed with increased precision with the goal to help customize tDCS dose. The aim of this review is to discuss the incorporation of high-resolution patient-specific computer modeling to guide and optimize tDCS. Methods In this review, we discuss the following topics: (i) The clinical motivation and rationale for models of transcranial stimulation is considered pivotal in order to leverage the flexibility of neuromodulation; (ii) The protocols and the workflow for developing high-resolution models; (iii) The technical challenges and limitations of interpreting modeling predictions, and (iv) Real cases merging modeling and clinical data illustrating the impact of computational models on the rational design of rehabilitative electrotherapy. Conclusions Though modeling for non-invasive brain stimulation is still in its development phase, it is predicted that with increased validation, dissemination, simplification and democratization of modeling tools, computational forward models of neuromodulation will become useful tools to guide the optimization of clinical electrotherapy. PMID:22780230

  6. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun

    2017-12-01

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
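
    The following is a hedged sketch of an adaptive-design loop in the spirit of TEAD: the score mixes a first-order Taylor-disagreement term (exploitation) with a distance-to-design term (exploration). The exact TEAD score, surrogate, and stopping criterion are defined in the paper; everything below is an illustrative stand-in:

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist

    def model(x):                       # expensive model (toy 1-D stand-in)
        return np.sin(3 * x) + 0.5 * x

    X = np.array([[0.0], [0.5], [1.0]]) # small initial design
    y = model(X[:, 0])

    rng = np.random.default_rng(0)
    for it in range(20):
        cand = rng.random((500, 1))     # candidate pool
        dist = cdist(cand, X)
        d = dist.min(axis=1)            # exploration: distance to nearest sample
        j = dist.argmin(axis=1)
        # Exploitation: disagreement between a first-order Taylor extrapolation
        # from the nearest sample (finite-difference gradient, an assumption)
        # and the current piecewise-linear surrogate.
        grad = np.gradient(y, X[:, 0])
        taylor = np.abs(y[j] + grad[j] * (cand[:, 0] - X[j, 0])
                        - np.interp(cand[:, 0], X[:, 0], y))
        score = d / d.max() + taylor / (taylor.max() + 1e-12)
        x_new = cand[score.argmax()]
        X = np.vstack([X, x_new])
        y = np.append(y, model(x_new[0]))
        order = X[:, 0].argsort()       # keep design sorted for interp/gradient
        X, y = X[order], y[order]
    print(len(X), "design points selected")
    ```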

  7. Deriving Points of Departure and Performance Baselines for Predictive Modeling of Systemic Toxicity using ToxRefDB (SOT)

    EPA Science Inventory

    A primary goal of computational toxicology is to generate predictive models of toxicity. An elusive target of alternative test methods and models has been the accurate prediction of systemic toxicity points of departure (PoD). We aim not only to provide a large and valuable resou...

  8. Theories for Deep Change in Affect-sensitive Cognitive Machines: A Constructivist Model.

    ERIC Educational Resources Information Center

    Kort, Barry; Reilly, Rob

    2002-01-01

    There is an interplay between emotions and learning, but this interaction is far more complex than previous learning theories have articulated. This article proffers a novel model by which to regard the interplay of emotions upon learning and discusses the larger practical aim of crafting computer-based models that will recognize a learner's…

  9. A Computer-Assisted Learning Model Based on the Digital Game Exponential Reward System

    ERIC Educational Resources Information Center

    Moon, Man-Ki; Jahng, Surng-Gahb; Kim, Tae-Yong

    2011-01-01

    The aim of this research was to construct a motivational model which would stimulate voluntary and proactive learning using digital game methods offering players more freedom and control. The theoretical framework of this research lays the foundation for a pedagogical learning model based on digital games. We analyzed the game reward system, which…

  10. A Framework for Image-Based Modeling of Acute Myocardial Ischemia Using Intramurally Recorded Extracellular Potentials.

    PubMed

    Burton, Brett M; Aras, Kedar K; Good, Wilson W; Tate, Jess D; Zenger, Brian; MacLeod, Rob S

    2018-05-21

    The biophysical basis for electrocardiographic evaluation of myocardial ischemia stems from the notion that ischemic tissues develop, with relative uniformity, along the endocardial aspects of the heart. These injured regions of subendocardial tissue give rise to intramural currents that lead to ST segment deflections within electrocardiogram (ECG) recordings. The concept of subendocardial ischemic regions is often used in clinical practice, providing a simple and intuitive description of ischemic injury; however, such a model grossly oversimplifies the presentation of ischemic disease-inadvertently leading to errors in ECG-based diagnoses. Furthermore, recent experimental studies have brought into question the subendocardial ischemia paradigm suggesting instead a more distributed pattern of tissue injury. These findings come from experiments and so have both the impact and the limitations of measurements from living organisms. Computer models have often been employed to overcome the constraints of experimental approaches and have a robust history in cardiac simulation. To this end, we have developed a computational simulation framework aimed at elucidating the effects of ischemia on measurable cardiac potentials. To validate our framework, we simulated, visualized, and analyzed 226 experimentally derived acute myocardial ischemic events. Simulation outcomes agreed both qualitatively (feature comparison) and quantitatively (correlation, average error, and significance) with experimentally obtained epicardial measurements, particularly under conditions of elevated ischemic stress. Our simulation framework introduces a novel approach to incorporating subject-specific, geometric models and experimental results that are highly resolved in space and time into computational models. We propose this framework as a means to advance the understanding of the underlying mechanisms of ischemic disease while simultaneously putting in place the computational infrastructure necessary to study and improve ischemia models aimed at reducing diagnostic errors in the clinic.

  11. Fitting Planetary Orbits with a Spreadsheet.

    ERIC Educational Resources Information Center

    Bridges, Richard

    1995-01-01

    Describes how to fit binocular observations of the planets to a theoretical model of circular orbits using a modern computer spreadsheet, from which fundamental data about the solar system may be deduced. (AIM)

  12. The Virtual Liver Project: Modeling Tissue Response To Chemicals Through Multiscale Simulation

    EPA Science Inventory

    The US EPA Virtual Liver Project is aimed at simulating the risk of toxic effects from environmental chemicals in silico. The computational systems model of organ injury due to chronic chemical exposure is based on: (i) the dynamics of perturbed molecular pathways, (ii) their lin...

  13. Computer simulation of stair falls to investigate scenarios in child abuse.

    PubMed

    Bertocci, G E; Pierce, M C; Deemer, E; Aguel, F

    2001-09-01

    To demonstrate the usefulness of computer simulation techniques in the investigation of pediatric stair falls. Since stair falls are a common falsely reported injury scenario in child abuse, our specific aim was to investigate the influence of stair characteristics on injury biomechanics of pediatric stair falls by using a computer simulation model. Our long-term goal is to use knowledge of biomechanics to aid in distinguishing between accidents and abuse. A computer simulation model of a 3-year-old child falling down stairs was developed using commercially available simulation software. This model was used to investigate the influence that stair characteristics have on biomechanical measures associated with injury risk. Since femur fractures occur in unintentional and abuse scenarios, biomechanical measures were focused on the lower extremities. The number and slope of steps and stair surface friction and elasticity were found to affect biomechanical measures associated with injury risk. Computer simulation techniques are useful for investigating the biomechanics of stair falls. Using our simulation model, we determined that stair characteristics have an effect on potential for lower extremity injuries. Although absolute values of biomechanical measures should not be relied on in an unvalidated model such as this, relationships between accident-environment factors and biomechanical measures can be studied through simulation. Future efforts will focus on model validation.

  14. Soft computing techniques toward modeling the water supplies of Cyprus.

    PubMed

    Iliadis, L; Maris, F; Tachos, S

    2011-10-01

    This research effort aims at the application of soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machines (ε-RSVM) and fuzzy weighted ε-RSVM models have been developed that accept five input parameters. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
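
    A minimal sketch of an ε-SVR estimator with 5-fold cross validation, in the spirit of the models above; the five inputs and targets are simulated placeholders, and the paper's fuzzy weighting and partition schemes are not reproduced:

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(42)
    X = rng.random((120, 5))   # five hydrometeorological inputs (assumed)
    y = X @ np.array([0.8, 0.3, -0.2, 0.5, 0.1]) + 0.05 * rng.standard_normal(120)

    # Scale features, then fit an epsilon-insensitive RBF support vector regressor
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", epsilon=0.05, C=10.0))
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_root_mean_squared_error")
    print("5-fold RMSE:", -scores.mean())
    ```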

  15. An empirical generative framework for computational modeling of language acquisition.

    PubMed

    Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-06-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.

  16. A 3D Model to Compute Lightning and HIRF Coupling Effects on Avionic Equipment of an Aircraft

    NASA Astrophysics Data System (ADS)

    Perrin, E.; Tristant, F.; Guiffaut, C.; Terrade, F.; Reineix, A.

    2012-05-01

    This paper describes the 3D FDTD model of an aircraft developed to compute the lightning and HIRF (High Intensity Radiated Fields) coupling effects on avionic equipment and all the associated wire harnesses. This virtual prototype aims at assisting the aircraft manufacturer during the lightning and HIRF certification processes. The model presented here covers a frequency range from the lightning spectrum to the low-frequency HIRF domain, i.e. 0 to 100 MHz. Moreover, the entire aircraft, including the frame, the skin, the wire harnesses and the equipment, is taken into account in a single model. Results obtained are compared to measurements on a real aircraft.

  17. Speedup computation of HD-sEMG signals using a motor unit-specific electrical source model.

    PubMed

    Carriou, Vincent; Boudaoud, Sofiane; Laforet, Jeremy

    2018-01-23

    Nowadays, bio-reliable modeling of muscle contraction is becoming more accurate and complex. This increasing complexity induces a significant increase in computation time, which prevents the use of such models in certain applications and studies. Accordingly, the aim of this work is to significantly reduce the computation time of high-density surface electromyogram (HD-sEMG) generation. This is done through a new model of motor unit (MU)-specific electrical source based on the fibers composing the MU. In order to assess the efficiency of this approach, we computed the normalized root mean square error (NRMSE) between several simulations of single generated MU action potentials (MUAPs) using the usual fiber electrical sources and the MU-specific electrical source. This NRMSE was computed for five different simulation sets wherein hundreds of MUAPs are generated and summed into HD-sEMG signals. The obtained results display less than 2% error on the generated signals compared to the same signals generated with fiber electrical sources. Moreover, the computation time of the HD-sEMG signal generation model is reduced by about 90% compared to the fiber electrical source model. Using this model with MU electrical sources, we can simulate HD-sEMG signals of a physiological muscle (hundreds of MUs) in less than an hour on a classical workstation. Graphical Abstract Overview of the simulation of HD-sEMG signals using the fiber scale and the MU scale. Upscaling the electrical source to the MU scale reduces the computation time by 90%, inducing only a small deviation in the same simulated HD-sEMG signals.
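
    A toy illustration of the upscaling idea: replace the sum of per-fiber action potentials by one MU-level source and quantify the deviation with an NRMSE. The waveform, fiber count, and jitter below are invented; the paper's MU-specific electrical source is derived differently:

    ```python
    import numpy as np

    t = np.linspace(0, 0.05, 500)   # 50 ms window

    def fiber_ap(t0):               # crude biphasic fiber waveform (assumed shape)
        return np.exp(-((t - t0) / 0.002) ** 2) * np.sin(2 * np.pi * (t - t0) / 0.004)

    rng = np.random.default_rng(3)
    offsets = 0.02 + 0.0005 * rng.standard_normal(200)   # per-fiber arrival jitter

    fiber_sum = sum(fiber_ap(t0) for t0 in offsets)      # fiber-scale source (200 fibers)
    mu_source = len(offsets) * fiber_ap(offsets.mean())  # single MU-scale source

    nrmse = np.sqrt(np.mean((fiber_sum - mu_source) ** 2)) / np.ptp(fiber_sum)
    print(f"NRMSE = {100 * nrmse:.1f} %")
    ```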

  18. Role of Statistical Random-Effects Linear Models in Personalized Medicine.

    PubMed

    Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose

    2012-03-01

    Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization.
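
    As a sketch of how a random-effects linear model supports individualization, the following fits a mixed model to simulated dose-concentration data and reads off a per-patient empirical-Bayes (BLUP) estimate; the variable names and data are hypothetical:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    patients = np.repeat(np.arange(30), 4)               # 30 patients, 4 visits each
    dose = rng.uniform(50, 200, size=patients.size)
    patient_effect = rng.normal(0, 0.3, 30)[patients]    # between-patient variation
    conc = 0.02 * dose * np.exp(patient_effect) + rng.normal(0, 0.2, patients.size)

    df = pd.DataFrame({"conc": conc, "dose": dose, "patient": patients})
    # Random intercept per patient; fixed effect of dose
    fit = sm.MixedLM.from_formula("conc ~ dose", groups="patient", data=df).fit()
    print(fit.fe_params)            # population-level (fixed) effects
    print(fit.random_effects[0])    # empirical-Bayes (BLUP) estimate for patient 0
    ```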

  19. Space mapping method for the design of passive shields

    NASA Astrophysics Data System (ADS)

    Sergeant, Peter; Dupré, Luc; Melkebeek, Jan

    2006-04-01

    The aim of the paper is to find the optimal geometry of a passive shield for the reduction of the magnetic stray field of an axisymmetric induction heater. For the optimization, a space mapping algorithm is used that requires two models. The first is an accurate model with a high computational effort as it contains finite element models. The second is less accurate, but it has a low computational effort as it uses an analytical model: the shield is replaced by a number of mutually coupled coils. The currents in the shield are found by solving an electrical circuit. Space mapping combines both models to obtain the optimal passive shield fast and accurately. The presented optimization technique is compared with gradient, simplex, and genetic algorithms.
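
    A minimal space-mapping sketch under stated assumptions: `fine` stands in for the expensive finite-element model and `coarse` for the fast analytical one, both toy 1-D functions rather than the paper's shield models. Each iteration aligns the coarse model's slope to the fine model at the current point, then re-optimizes the cheap corrected model:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def fine(x):    # expensive, accurate response (toy)
        return (x - 1.8) ** 2 + 0.1 * np.sin(5 * x)

    def coarse(x):  # cheap, misaligned response (toy)
        return (x - 1.5) ** 2

    def d(f, x, h=1e-6):            # finite-difference slope
        return (f(x + h) - f(x - h)) / (2 * h)

    x = minimize_scalar(coarse).x   # start at the coarse-model optimum
    for k in range(10):             # only a few fine-model evaluations in total
        slope_gap = d(fine, x) - d(coarse, x)
        # First-order output correction so the cheap model matches the fine slope
        corrected = lambda z, x=x, g=slope_gap: coarse(z) + g * (z - x)
        x = minimize_scalar(corrected).x
    print("space-mapped optimum:", round(x, 4), "fine'(x) ~", round(d(fine, x), 6))
    ```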

  20. DGSA: A Matlab toolbox for distance-based generalized sensitivity analysis of geoscientific computer experiments

    NASA Astrophysics Data System (ADS)

    Park, Jihoon; Yang, Guang; Satija, Addy; Scheidt, Céline; Caers, Jef

    2016-12-01

    Sensitivity analysis plays an important role in geoscientific computer experiments, whether for forecasting, data assimilation or model calibration. In this paper we focus on an extension of a method of regionalized sensitivity analysis (RSA) to applications typical in the Earth Sciences. Such applications involve the building of large complex spatial models, the application of computationally extensive forward modeling codes and the integration of heterogeneous sources of model uncertainty. The aim of this paper is to be practical: 1) provide a Matlab code, 2) provide novel visualization methods to aid users in getting a better understanding of the sensitivity, 3) provide a method based on kernel principal component analysis (KPCA) and self-organizing maps (SOM) to account for spatial uncertainty typical in Earth Science applications and 4) provide an illustration on a real field case where the above-mentioned complexities present themselves. We present methods that extend the original RSA method in several ways. First we present the calculation of conditional effects, defined as the sensitivity of a parameter given a level of another parameter. Second, we show how this conditional effect can be used to choose nominal values or ranges to fix insensitive parameters, aiming to minimally affect uncertainty in the response. Third, we develop a method based on KPCA and SOM to assign a rank to spatial models in order to calculate the sensitivity to spatial variability in the models. A large oil/gas reservoir case is used as illustration of these ideas.

  1. An Evaluation into the Views of Candidate Mathematics Teachers over "Tablet Computers" to be Applied in Secondary Schools

    ERIC Educational Resources Information Center

    Aksu, Hasan Hüseyin

    2014-01-01

    This study aims to investigate, in terms of different variables, the views of prospective Mathematics teachers on tablet computers to be used in schools as an outcome of the Fatih Project, which was initiated by the Ministry of National Education. In the study, the scanning model, one of the quantitative research methods, was used. In the population…

  2. Towards a Framework to Improve the Quality of Teaching and Learning: Consciousness and Validation in Computer Engineering Science, UCT

    ERIC Educational Resources Information Center

    Lévano, Marcos; Albornoz, Andrea

    2016-01-01

    This paper aims to propose a framework to improve the quality of teaching and learning in order to develop good practices to train professionals in the career of computer engineering science. To demonstrate the progress and achievements, our work is based on two principles for the formation of professionals, one based on the model of learning…

  3. Software for Brain Network Simulations: A Comparative Study

    PubMed Central

    Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.

    2017-01-01

    Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database, namely NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations on the characteristics of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability for high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if the model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators. Finally, we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package. The results indicate that BRIAN provides the most concise language for both cases considered. Furthermore, as expected, NEST mostly favors large network models, while NEURON is better suited for detailed models. Overall, the case studies reinforce our general observation that simulators have a bias in the computational performance toward specific types of brain network models. PMID:28775687

  4. State-Transition Structures in Physics and in Computation

    NASA Astrophysics Data System (ADS)

    Petri, C. A.

    1982-12-01

    In order to establish close connections between physical and computational processes, it is assumed that the concepts of “state” and of “transition” are acceptable both to physicists and to computer scientists, at least in an informal way. The aim of this paper is to propose formal definitions of state and transition elements on the basis of very low level physical concepts in such a way that (1) all physically possible computations can be described as embedded in physical processes; (2) the computational aspects of physical processes can be described on a well-defined level of abstraction; (3) the gulf between the continuous models of physics and the discrete models of computer science can be bridged by simple mathematical constructs which may be given a physical interpretation; (4) a combinatorial, nonstatistical definition of “information” can be given on low levels of abstraction which may serve as a basis to derive higher-level concepts of information, e.g., by a statistical or probabilistic approach. Conceivable practical consequences are discussed.

  5. Internet messenger based smart virtual class learning using ubiquitous computing

    NASA Astrophysics Data System (ADS)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education; IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, the model of IM-based smart virtual class learning using ubiquitous computing and empirical evidence that would favor a broad application to improve engagement and behavior are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in smart classes cannot be confirmed because the majority of the reviewed studies followed instruction paradigms. This article aims to present the model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about the engagement and behavior aspects and their contribution to learning.

  6. The National Cancer Informatics Program (NCIP) Annotation and Image Markup (AIM) Foundation model.

    PubMed

    Mongkolwat, Pattanasak; Kleper, Vladimir; Talbot, Skip; Rubin, Daniel

    2014-12-01

    Knowledge contained within in vivo imaging annotated by human experts or computer programs is typically stored as unstructured text and separated from other associated information. The National Cancer Informatics Program (NCIP) Annotation and Image Markup (AIM) Foundation information model is an evolution of the National Institutes of Health's (NIH) National Cancer Institute's (NCI) cancer Biomedical Informatics Grid (caBIG®) AIM model. The model applies to various image types created by various techniques and disciplines. It has evolved in response to the feedback and changing demands from the imaging community at NCI. The foundation model serves as a base for other imaging disciplines that want to extend the type of information the model collects. The model captures physical entities and their characteristics, imaging observation entities and their characteristics, markups (two- and three-dimensional), AIM statements, calculations, image source, inferences, annotation role, task context or workflow, audit trail, AIM creator details, equipment used to create AIM instances, subject demographics, and adjudication observations. An AIM instance can be stored as a Digital Imaging and Communications in Medicine (DICOM) structured reporting (SR) object or Extensible Markup Language (XML) document for further processing and analysis. An AIM instance consists of one or more annotations and associated markups of a single finding along with other ancillary information in the AIM model. An annotation describes information about the meaning of pixel data in an image. A markup is a graphical drawing placed on the image that depicts a region of interest. This paper describes fundamental AIM concepts and how to use and extend AIM for various imaging disciplines.

  7. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.

  8. Computational models of music perception and cognition II: Domain-specific music processing

    NASA Astrophysics Data System (ADS)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  9. Implementation of Biogas Stations into Smart Heating and Cooling Network

    NASA Astrophysics Data System (ADS)

    Milčák, P.; Konvička, J.; Jasenská, M.

    2016-10-01

    The paper describes the implementation of a biogas station in the software environment for "Smart Heating and Cooling Networks". The aim of this project is the creation of a software tool for planning the operation and optimizing the supply of heat and cold in small regions. In this case, the biogas station represents a kind of renewable energy source which, however, has its own operational specifics that need to be taken into account when creating an implementation project. For a specific biogas station, a detailed computational model was elaborated, which is parameterized in particular to optimize the total computational time.

  10. Analysis of a Multi-Fidelity Surrogate for Handling Real Gas Equations of State

    NASA Astrophysics Data System (ADS)

    Ouellet, Frederick; Park, Chanyoung; Rollin, Bertrand; Balachandar, S.

    2017-06-01

    The explosive dispersal of particles is a complex multiphase and multi-species fluid flow problem. In these flows, the detonation products of the explosive must be treated as real gas while the ideal gas equation of state is used for the surrounding air. As the products expand outward from the detonation point, they mix with ambient air and create a mixing region where both state equations must be satisfied. One of the most accurate, yet computationally expensive, methods to handle this problem is an algorithm that iterates between both equations of state until pressure and thermal equilibrium are achieved inside of each computational cell. This work aims to use a multi-fidelity surrogate model to replace this process. A Kriging model is used to produce a curve fit which interpolates selected data from the iterative algorithm using Bayesian statistics. We study the model performance with respect to the iterative method in simulations using a finite volume code. The model's (i) computational speed, (ii) memory requirements and (iii) computational accuracy are analyzed to show the benefits of this novel approach. Also, optimizing the combination of model accuracy and computational speed through the choice of sampling points is explained. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program as a Cooperative Agreement under the Predictive Science Academic Alliance Program under Contract No. DE-NA0002378.
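
    A sketch of the surrogate idea: a Kriging (Gaussian-process) fit interpolating a few evaluations of a stand-in for the iterative equilibrium solver; the function, sample ranges, and kernel below are assumptions for illustration, not the paper's models:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def expensive_eos_solver(rho, e):   # placeholder for the iterative scheme
        return rho * e * (1.0 + 0.1 * np.sin(rho) * np.cos(e))

    rng = np.random.default_rng(5)
    X = rng.uniform([0.1, 0.1], [5.0, 5.0], size=(200, 2))  # (density, energy) samples
    y = expensive_eos_solver(X[:, 0], X[:, 1])

    # Kriging surrogate: anisotropic RBF kernel, Bayesian interpolation of the data
    kernel = ConstantKernel(1.0) * RBF(length_scale=[1.0, 1.0])
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    p_pred, p_std = gp.predict(np.array([[2.5, 1.2]]), return_std=True)
    print(f"pressure ~ {p_pred[0]:.4f} +/- {p_std[0]:.4f}")
    ```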

  11. Computational model of gamma irradiation room at ININ

    NASA Astrophysics Data System (ADS)

    Rodríguez-Romo, Suemi; Patlan-Cardoso, Fernando; Ibáñez-Orozco, Oscar; Vergara Martínez, Francisco Javier

    2018-03-01

    In this paper, we present a model of the gamma irradiation room at the National Institute of Nuclear Research (ININ is its acronym in Spanish) in Mexico to improve the use of physics in dosimetry for human protection. We deal with air-filled ionization chambers and in-house scientific computing framed in both the GEANT4 scheme and our analytical approach to characterize the irradiation room. This room is the only secondary dosimetry facility in Mexico. Our aim is to optimize its experimental designs, facilities, and industrial applications of physical radiation. The computational results provided by our model are supported by all the known experimental data regarding the performance of the ININ gamma irradiation room and allow us to predict the values of the main variables related to this fully enclosed space to within an acceptable margin of error.

  12. Parallel Computing for Brain Simulation.

    PubMed

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and is therefore one of its greatest mysteries. It provides human beings with extraordinary abilities. However, it is not yet understood how and why most of these abilities are produced. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data in a more efficient way than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have made it possible to create the first simulation with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models and hybrid models. This review includes the current applications of these works, as well as future trends. It is focused on various works that look for advanced progress in Neuroscience and still others which seek new discoveries in Computer Science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  13. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  14. Developing predictive approaches to characterize adaptive responses of the reproductive endocrine axis to aromatase inhibition: I. Data generation in a small fish model

    EPA Science Inventory

    Adaptive or compensatory responses to chemical exposure can significantly influence in vivo concentration-duration-response relationships. The aim of this study was to provide data to support development of a computational dynamic model of the hypothalamic-pituitary-gonadal axis ...

  15. QSAR Methods.

    PubMed

    Gini, Giuseppina

    2016-01-01

    In this chapter, we introduce the basics of computational chemistry and discuss how computational methods have been extended to some biological properties and, in particular, toxicology. For about 20 years, chemical experimentation has increasingly been replaced by modeling and virtual experimentation, drawing on a large core of mathematics, chemistry, physics, and algorithms. We then see how animal experiments, aimed at providing a standardized result about a biological property, can be mimicked by new in silico methods. Our emphasis here is on toxicology and on predicting properties from chemical structures. Two main streams of such models are available: models that consider the whole molecular structure to predict a value, namely QSAR (Quantitative Structure Activity Relationships), and models that find relevant substructures to predict a class, namely SAR. The term in silico discovery is applied to chemical design, to computational toxicology, and to drug discovery. We discuss how experimental practice in biological science is moving more and more toward modeling and simulation. Such virtual experiments confirm hypotheses, provide data for regulation, and help in designing new chemicals.
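
    A minimal QSAR sketch: whole-molecule descriptors regressed against an activity endpoint. The SMILES strings and activity values are invented, and a real QSAR study would use a curated dataset, many descriptors, and proper validation:

    ```python
    import numpy as np
    from rdkit import Chem
    from rdkit.Chem import Descriptors
    from sklearn.linear_model import LinearRegression

    smiles = ["CCO", "CCCO", "CCCCO", "CCCCCO", "c1ccccc1O"]
    activity = [0.2, 0.5, 0.9, 1.3, 1.1]   # hypothetical endpoint values

    def descriptors(smi):
        """Whole-molecule descriptors: molecular weight, logP, polar surface area."""
        mol = Chem.MolFromSmiles(smi)
        return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
                Descriptors.TPSA(mol)]

    X = np.array([descriptors(s) for s in smiles])
    model = LinearRegression().fit(X, activity)
    print("R^2 on training data:", model.score(X, activity))
    ```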

  16. [Computer aided diagnosis model for lung tumor based on ensemble convolutional neural network].

    PubMed

    Wang, Yuanyuan; Zhou, Tao; Lu, Huiling; Wu, Cuiying; Yang, Pengfei

    2017-08-01

    The convolutional neural network (CNN) can be used for computer-aided diagnosis of lung tumors with positron emission tomography (PET)/computed tomography (CT); it provides accurate quantitative analysis to compensate for visual inertia and defects in gray-scale sensitivity, and helps doctors diagnose accurately. Firstly, a parameter migration method is used to build three CNNs (CT-CNN, PET-CNN, and PET/CT-CNN) for lung tumor recognition in CT, PET, and PET/CT images, respectively. Then, using CT-CNN, we obtained appropriate model parameters for CNN training by analyzing the influence of model parameters such as epochs, batch size and image scale on recognition rate and training time. Finally, the three single CNNs are used to construct an ensemble CNN, lung tumor PET/CT recognition is completed through a relative majority vote, and the performance of the ensemble CNN and the single CNNs is compared. The experimental results show that the ensemble CNN is better than a single CNN for computer-aided diagnosis of lung tumors.
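
    The fusion step can be sketched as a relative majority vote over the three single-CNN label outputs; the labels below are placeholders rather than actual model predictions:

    ```python
    import numpy as np

    def relative_majority_vote(*label_arrays):
        """Per-sample majority vote across classifiers (ties go to the lowest label)."""
        stacked = np.stack(label_arrays, axis=0)   # (n_classifiers, n_samples)
        n_classes = stacked.max() + 1
        counts = np.apply_along_axis(np.bincount, 0, stacked, minlength=n_classes)
        return counts.argmax(axis=0)

    ct_pred    = np.array([0, 1, 1, 0])   # CT-CNN labels (hypothetical)
    pet_pred   = np.array([0, 1, 0, 0])   # PET-CNN labels (hypothetical)
    petct_pred = np.array([1, 1, 0, 0])   # PET/CT-CNN labels (hypothetical)
    print(relative_majority_vote(ct_pred, pet_pred, petct_pred))  # -> [0 1 0 0]
    ```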

  17. Non-invasive brain stimulation and computational models in post-stroke aphasic patients: single session of transcranial magnetic stimulation and transcranial direct current stimulation. A randomized clinical trial.

    PubMed

    Santos, Michele Devido Dos; Cavenaghi, Vitor Breseghello; Mac-Kay, Ana Paula Machado Goyano; Serafim, Vitor; Venturi, Alexandre; Truong, Dennis Quangvinh; Huang, Yu; Boggio, Paulo Sérgio; Fregni, Felipe; Simis, Marcel; Bikson, Marom; Gagliardi, Rubens José

    2017-01-01

    Patients undergoing the same neuromodulation protocol may present different responses. Computational models may help in understanding such differences. The aims of this study were, firstly, to compare the performance of aphasic patients in naming tasks before and after one session of transcranial direct current stimulation (tDCS), transcranial magnetic stimulation (TMS) and sham stimulation, and to analyze the results between these neuromodulation techniques; and secondly, through computational modeling of the cortex and surrounding tissues, to assess current flow distribution and responses among patients who received tDCS and presented different levels of results in naming tasks. Prospective, descriptive, qualitative and quantitative, double-blind, randomized and placebo-controlled study conducted at Faculdade de Ciências Médicas da Santa Casa de São Paulo. Patients with aphasia received one session of tDCS, TMS or sham stimulation. The time taken to name pictures and the response time were evaluated before and after neuromodulation. Selected patients from the first intervention underwent a computational model stimulation procedure that simulated tDCS. The results did not indicate any statistically significant differences from before to after the stimulation. The computational models showed different current flow distributions. The present study did not show any statistically significant difference between tDCS, TMS and sham stimulation regarding naming tasks. The patients' responses to the computational model showed different patterns of current distribution.

  18. Application of computational aero-acoustics to real world problems

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    The application of computational aeroacoustics (CAA) to real-world problems is discussed, with the aim of assessing the capabilities of the various techniques. It is considered that applications are limited by the inability of computational resources to resolve the large range of scales involved in high Reynolds number flows. Possible simplifications are discussed. Problems remain to be solved in relation to the efficient use of the power of parallel computers and in the development of turbulence modeling schemes. The goal of CAA is stated as the implementation of acoustic design studies on a computer terminal with reasonable run times.

  19. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE PAGES

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...

    2017-12-27

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  20. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  1. The relationship between venture capital investment and macro economic variables via statistical computation method

    NASA Astrophysics Data System (ADS)

    Aygunes, Gunes

    2017-07-01

    The objective of this paper is to survey and determine the macroeconomic factors affecting the level of venture capital (VC) investments in a country. The literature focuses on venture capitalists' quality and countries' venture capital investments. The aim of this paper is to give the relationship between venture capital investment and macroeconomic variables via a statistical computation method. We investigate the countries and macroeconomic variables. By using a statistical computation method, we derive the correlation between venture capital investments and macroeconomic variables. According to the logistic regression model (logit regression or logit model), the macroeconomic variables are correlated with each other in three groups. Venture capitalists regard these correlations as an indicator. Finally, we give the correlation matrix of our results.
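
    A hedged sketch of the kind of computation described: a correlation matrix of macroeconomic variables followed by a logit model of whether VC investment is high. The data are simulated and the variable names are assumptions, not the paper's dataset:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 60   # country-year observations (hypothetical)
    df = pd.DataFrame({
        "gdp_growth": rng.normal(2.5, 1.5, n),
        "interest":   rng.normal(4.0, 2.0, n),
        "market_cap": rng.normal(60, 20, n),
    })
    print(df.corr())   # correlation structure used to group the variables

    # Binary outcome: is VC investment "high"? (simulated from the same variables)
    vc_high = (0.4 * df.gdp_growth - 0.2 * df.interest
               + 0.02 * df.market_cap + rng.logistic(size=n) > 1.5).astype(int)
    logit = sm.Logit(vc_high, sm.add_constant(df)).fit(disp=0)
    print(logit.params)
    ```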

  2. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action

    PubMed Central

    Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745

  3. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action.

    PubMed

    Pawlik, Aleksandra; van Gelder, Celia W G; Nenadic, Aleksandra; Palagi, Patricia M; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  4. A neuron-astrocyte transistor-like model for neuromorphic dressed neurons.

    PubMed

    Valenza, G; Pioggia, G; Armato, A; Ferro, M; Scilingo, E P; De Rossi, D

    2011-09-01

    Experimental evidence on the role of synaptic glia as an active partner, together with the synapse, in neuronal signaling and the dynamics of neural tissue strongly suggests investigating more realistic neuron-glia models for a better understanding of human brain processing. Among the glial cells, astrocytes play a crucial role in the tripartite synapse, i.e. the dressed neuron. A well-known two-way astrocyte-neuron interaction can be found in the literature, completely revising the purely supportive role of the glia. The aim of this study is to provide a computationally efficient model of neuron-glia interaction. The neuron-glia interactions were simulated by implementing the Li-Rinzel model for an astrocyte and the Izhikevich model for a neuron. Assuming dressed-neuron dynamics similar to the nonlinear input-output characteristics of a bipolar junction transistor, we derived our computationally efficient model. This model may represent the fundamental computational unit for the development of real-time artificial neuron-glia networks, opening new perspectives in pattern recognition systems and in brain neurophysiology. Copyright © 2011 Elsevier Ltd. All rights reserved.
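
    A minimal sketch of the Izhikevich neuron that forms the neuronal half of the coupled model (regular-spiking parameters); the Li-Rinzel astrocyte dynamics and the transistor-like coupling from the paper are not reproduced here:

    ```python
    a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking parameter set
    dt, T = 0.25, 1000.0                 # time step and duration in ms
    v, u = -65.0, b * -65.0              # membrane potential and recovery variable
    spikes = []
    for step in range(int(T / dt)):
        I = 10.0 if step * dt > 100 else 0.0       # step input current after 100 ms
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                               # spike: record and reset state
            spikes.append(step * dt)
            v, u = c, u + d
    print(f"{len(spikes)} spikes; first at {spikes[0]:.1f} ms")
    ```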

  5. Distributed and Lumped Parameter Models for the Characterization of High Throughput Bioreactors

    PubMed Central

    Conoscenti, Gioacchino; Cutrì, Elena; Tuan, Rocky S.; Raimondi, Manuela T.; Gottardi, Riccardo

    2016-01-01

    Next generation bioreactors are being developed to generate multiple human cell-based tissue analogs within the same fluidic system, to better recapitulate the complexity and interconnection of human physiology [1, 2]. The effective development of these devices requires a solid understanding of their interconnected fluidics, to predict the transport of nutrients and waste through the constructs and improve the design accordingly. In this work, we focus on a specific model of bioreactor, with multiple inputs/outputs, aimed at generating osteochondral constructs, i.e., a biphasic construct in which one side is cartilaginous in nature, while the other is osseous. We next develop a general computational approach to model the microfluidics of a multi-chamber, interconnected system that may be applied to human-on-chip devices. This objective requires overcoming several challenges at the level of computational modeling. The main one consists of addressing the multi-physics nature of the problem that combines free flow in channels with hindered flow in porous media. Fluid dynamics is also coupled with advection-diffusion-reaction equations that model the transport of biomolecules throughout the system and their interaction with living tissues and cell constructs. Ultimately, we aim at providing a predictive approach useful for the general organ-on-chip community. To this end, we have developed a lumped parameter approach that allows us to analyze the behavior of multi-unit bioreactor systems with modest computational effort, provided that the behavior of a single unit can be fully characterized. PMID:27669413

  6. Distributed and Lumped Parameter Models for the Characterization of High Throughput Bioreactors.

    PubMed

    Iannetti, Laura; D'Urso, Giovanna; Conoscenti, Gioacchino; Cutrì, Elena; Tuan, Rocky S; Raimondi, Manuela T; Gottardi, Riccardo; Zunino, Paolo

    Next generation bioreactors are being developed to generate multiple human cell-based tissue analogs within the same fluidic system, to better recapitulate the complexity and interconnection of human physiology [1, 2]. The effective development of these devices requires a solid understanding of their interconnected fluidics, to predict the transport of nutrients and waste through the constructs and improve the design accordingly. In this work, we focus on a specific model of bioreactor, with multiple inputs/outputs, aimed at generating osteochondral constructs, i.e., a biphasic construct in which one side is cartilaginous in nature, while the other is osseous. We next develop a general computational approach to model the microfluidics of a multi-chamber, interconnected system that may be applied to human-on-chip devices. This objective requires overcoming several challenges at the level of computational modeling. The main one consists of addressing the multi-physics nature of the problem that combines free flow in channels with hindered flow in porous media. Fluid dynamics is also coupled with advection-diffusion-reaction equations that model the transport of biomolecules throughout the system and their interaction with living tissues and cell constructs. Ultimately, we aim at providing a predictive approach useful for the general organ-on-chip community. To this end, we have developed a lumped parameter approach that allows us to analyze the behavior of multi-unit bioreactor systems with modest computational effort, provided that the behavior of a single unit can be fully characterized.

  7. Optimization of a hydrodynamic separator using a multiscale computational fluid dynamics approach.

    PubMed

    Schmitt, Vivien; Dufresne, Matthieu; Vazquez, Jose; Fischer, Martin; Morin, Antoine

    2013-01-01

    This article deals with the optimization of a hydrodynamic separator working on the tangential separation mechanism along a screen. The aim of this study is to optimize the shape of the device to avoid clogging. A multiscale approach is used. This methodology combines measurements and computational fluid dynamics (CFD). A local model enables us to observe the different phenomena occurring at the orifice scale, which shows the potential of expanded metal screens. A global model is used to simulate the flow within the device using a conceptual model of the screen (porous wall). After validation against the experimental measurements, the global model was used to investigate the influence of deflectors and disk plates in the structure.

  8. ScienceCast 121: The Effects of Space Weather on Aviation

    NASA Image and Video Library

    2013-10-25

    Ordinary air travelers can be exposed to significant doses of radiation during solar storms. A new computer model developed by NASA aims to help protect the public by predicting space weather hazards to aviation.

  9. Fine-Tuning Tomato Agronomic Properties by Computational Genome Redesign

    PubMed Central

    Carrera, Javier; Fernández del Carmen, Asun; Fernández-Muñoz, Rafael; Rambla, Jose Luis; Pons, Clara; Jaramillo, Alfonso; Elena, Santiago F.; Granell, Antonio

    2012-01-01

    Considering cells as biofactories, we aimed to optimize their internal processes by using the same engineering principles that large industries are implementing nowadays: lean manufacturing. We have applied reverse-engineering computational methods to transcriptomic, metabolomic and phenomic data obtained from a collection of tomato recombinant inbred lines to formulate a kinetic and constraint-based model that efficiently describes the cellular metabolism from expression of a minimal core of genes. Based on predicted metabolic profiles, a close association with agronomic and organoleptic properties of the ripe fruit was revealed with high statistical confidence. Inspired by a synthetic biology approach, the model was used to explore the landscape of all possible local transcriptional changes with the aim of engineering tomato fruits with fine-tuned biotechnological properties. The method was validated by the ability of the proposed genomes, engineered for modified desired agronomic traits, to recapitulate experimental correlations between associated metabolites. PMID:22685389

  10. Development of Parallel Code for the Alaska Tsunami Forecast Model

    NASA Astrophysics Data System (ADS)

    Bahng, B.; Knight, W. R.; Whitmore, P.

    2014-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes and other means in both the Pacific and Atlantic Oceans. At the U.S. National Tsunami Warning Center (NTWC), the model is mainly used in a pre-computed fashion. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communications between domains of each parent-child pair as waves get closer to coastal waters. Even with the pre-computation the task becomes non-trivial as sub-grid resolution gets finer. Currently, the finest resolution Digital Elevation Models (DEM) used by ATFM are 1/3 arc-seconds. With a serial code, large or multiple areas of very high resolution can produce run-times that are unrealistic even in a pre-computed approach. One way to increase the model performance is code parallelization used in conjunction with a multi-processor computing environment. NTWC developers have undertaken an ATFM code-parallelization effort to streamline the creation of the pre-computed database of results with the long term aim of tsunami forecasts from source to high resolution shoreline grids in real time. Parallelization will also permit timely regeneration of the forecast model database with new DEMs; and, will make possible future inclusion of new physics such as the non-hydrostatic treatment of tsunami propagation. The purpose of our presentation is to elaborate on the parallelization approach and to show the compute speed increase on various multi-processor systems.
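
    The record above describes parallelizing a nested-grid shallow-water code. A minimal sketch of the underlying pattern, assuming an MPI-style domain decomposition with ghost-cell (halo) exchange via mpi4py, is given below; it is illustrative only and not the ATFM implementation.

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    N = 1000                         # global number of grid cells (illustrative)
    local_n = N // size
    # local water-height field with one ghost cell on each side
    h = np.zeros(local_n + 2)
    h[1:-1] = rank                   # dummy initial condition

    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    # exchange ghost cells with neighbours (repeated every time step in a
    # real solver before the finite-difference update)
    comm.Sendrecv(sendbuf=h[1:2], dest=left, recvbuf=h[-1:], source=right)
    comm.Sendrecv(sendbuf=h[-2:-1], dest=right, recvbuf=h[0:1], source=left)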

  11. Spring assisted cranioplasty: A patient specific computational model.

    PubMed

    Borghi, Alessandro; Rodriguez-Florez, Naiara; Rodgers, Will; James, Gregory; Hayward, Richard; Dunaway, David; Jeelani, Owase; Schievano, Silvia

    2018-03-01

    Implantation of spring-like distractors in the treatment of sagittal craniosynostosis is a novel technique that has proven functionally and aesthetically effective in correcting skull deformities; however, final shape outcomes remain moderately unpredictable due to an incomplete understanding of the skull-distractor interaction. The aim of this study was to create a patient specific computational model of spring assisted cranioplasty (SAC) that can help predict the individual overall final head shape. Pre-operative computed tomography images of a SAC patient were processed to extract a 3D model of the infant skull anatomy and simulate spring implantation. The distractors were modeled based on mechanical experimental data. Viscoelastic bone properties from the literature were tuned using the specific patient procedural information recorded during surgery and from x-ray measurements at follow-up. The model accurately captured spring expansion on-table (within 9% of the measured values), as well as at first and second follow-ups (within 8% of the measured values). Comparison between immediate post-operative 3D head scanning and numerical results for this patient proved that the model could successfully predict the final overall head shape. This preliminary work showed the potential application of computational modeling to study SAC, to support pre-operative planning and guide novel distractor design. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.

  12. Analysis on trust influencing factors and trust model from multiple perspectives of online Auction

    NASA Astrophysics Data System (ADS)

    Yu, Wang

    2017-10-01

    Current reputation models lack research on online auction trading, so they cannot fully reflect the reputation status of users and may cause operability problems. To evaluate user trust in online auctions correctly, a trust computing model based on multiple influencing factors is established. It aims at overcoming the inefficiency of current trust computing methods and the limitations of traditional theoretical trust models. The improved model comprehensively considers the trust evaluation factors of three types of participants, according to the different participation modes of online auctioneers, to improve the accuracy, effectiveness and robustness of the trust degree. Experiments test the efficiency and performance of our model under different scales of malicious users, in environments like eBay and the Sporas model. The analysis of the experimental results shows that the model proposed in this paper makes up for the deficiencies of existing models and has better feasibility.

  13. Three-Dimensional Mechanical Model of the Human Spine and the Versatility of its Use

    NASA Astrophysics Data System (ADS)

    Sokol, Milan; Velísková, Petra; Rehák, Ľuboš; Žabka, Martin

    2014-03-01

    The work is oriented towards the simulation and modeling of the lumbar and thoracic human spine as a load-bearing 3D system in a computer program (ANSYS). The human spine model includes a determination of the geometry based on X-ray pictures of frontal and lateral projections. For this reason, another computer code, BMPCOORDINATES, was developed as an aid to obtain the most precise and realistic model of the spine. Various positions, deformations, scoliosis, rotation and torsion can be modelled. Once the geometry is done, external loading on different spinal segments is entered; consequently, the response can be analysed. This can contribute much to medical practice as a tool for diagnosis and for developing implants or other artificial instruments for fixing the spine.

  14. Design Of Computer Based Test Using The Unified Modeling Language

    NASA Astrophysics Data System (ADS)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through interest and talent search (PMDK), the joint selection admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng) was conducted using paper-based tests (PBT). The paper-based test model has some weaknesses: it wastes paper, the questions can leak to the public, and test results can be manipulated. This research aimed to create a computer-based test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, attention was paid to protecting the test questions with passwords before they were shown, through encryption and decryption; the RSA cryptography algorithm was used for this. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle. The network architecture used for the computer-based test application was a client-server model on a Local Area Network (LAN). The result of the design was a computer-based test application for admission selection at Politeknik Negeri Bengkalis.
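
    For reference, the Fisher-Yates shuffle mentioned above is short enough to sketch in full; this Python version, with a hypothetical question bank, is an illustration rather than the application's code.

    import random

    def fisher_yates_shuffle(items, rng=random):
        """In-place Fisher-Yates shuffle: every permutation is equally likely."""
        for i in range(len(items) - 1, 0, -1):
            j = rng.randint(0, i)            # uniform index in [0, i], inclusive
            items[i], items[j] = items[j], items[i]
        return items

    questions = [f"Q{k}" for k in range(1, 11)]  # hypothetical question bank
    print(fisher_yates_shuffle(questions))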

  15. A roadmap to computational social neuroscience.

    PubMed

    Tognoli, Emmanuelle; Dumas, Guillaume; Kelso, J A Scott

    2018-02-01

    To complement experimental efforts toward understanding human social interactions at both neural and behavioral levels, two computational approaches are presented: (1) a fully parameterizable mathematical model of a social partner, the Human Dynamic Clamp which, by virtue of experimentally controlled interactions between Virtual Partners and real people, allows for emergent behaviors to be studied; and (2) a multiscale neurocomputational model of social coordination that enables exploration of social self-organization at all levels-from neuronal patterns to people interacting with each other. These complementary frameworks and the cross product of their analysis aim at understanding the fundamental principles governing social behavior.

  16. Model Reduction of Computational Aerothermodynamics for Multi-Discipline Analysis in High Speed Flows

    NASA Astrophysics Data System (ADS)

    Crowell, Andrew Rippetoe

    This dissertation describes model reduction techniques for the computation of aerodynamic heat flux and pressure loads for multi-disciplinary analysis of hypersonic vehicles. NASA and the Department of Defense have expressed renewed interest in the development of responsive, reusable hypersonic cruise vehicles capable of sustained high-speed flight and access to space. However, an extensive set of technical challenges have obstructed the development of such vehicles. These technical challenges are partially due to both the inability to accurately test scaled vehicles in wind tunnels and to the time intensive nature of high-fidelity computational modeling, particularly for the fluid using Computational Fluid Dynamics (CFD). The aim of this dissertation is to develop efficient and accurate models for the aerodynamic heat flux and pressure loads to replace the need for computationally expensive, high-fidelity CFD during coupled analysis. Furthermore, aerodynamic heating and pressure loads are systematically evaluated for a number of different operating conditions, including: simple two-dimensional flow over flat surfaces up to three-dimensional flows over deformed surfaces with shock-shock interaction and shock-boundary layer interaction. An additional focus of this dissertation is on the implementation and computation of results using the developed aerodynamic heating and pressure models in complex fluid-thermal-structural simulations. Model reduction is achieved using a two-pronged approach. One prong focuses on developing analytical corrections to isothermal, steady-state CFD flow solutions in order to capture flow effects associated with transient spatially-varying surface temperatures and surface pressures (e.g., surface deformation, surface vibration, shock impingements, etc.). The second prong is focused on minimizing the computational expense of computing the steady-state CFD solutions by developing an efficient surrogate CFD model. The developed two-pronged approach is found to exhibit balanced performance in terms of accuracy and computational expense, relative to several existing approaches. This approach enables CFD-based loads to be implemented into long duration fluid-thermal-structural simulations.
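
    The second prong above, a surrogate for steady-state CFD solutions, can be illustrated with a minimal sketch: an interpolant trained on a handful of pre-computed evaluations stands in for the expensive solver. The toy heat-flux function and the flight-condition ranges below are assumptions for the sketch, not the dissertation's models.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def expensive_cfd_heat_flux(x):
        # stand-in for a costly CFD evaluation of surface heat flux [W/m^2]
        mach, alpha = x[:, 0], x[:, 1]
        return 1e4 * mach**3 * (1 + 0.1 * alpha)**2

    rng = np.random.default_rng(1)
    train = np.column_stack([rng.uniform(5, 8, 40),     # Mach number
                             rng.uniform(0, 5, 40)])    # angle of attack [deg]
    surrogate = RBFInterpolator(train, expensive_cfd_heat_flux(train))

    query = np.array([[6.5, 2.0]])
    print(surrogate(query), expensive_cfd_heat_flux(query))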

  17. Role of Statistical Random-Effects Linear Models in Personalized Medicine

    PubMed Central

    Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose

    2012-01-01

    Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization. PMID:23467392
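
    As a minimal sketch of the modeling machinery described above (not the authors' code), the following fits a random-effects linear model to simulated dose-concentration data with statsmodels; the per-patient random effects it returns are the empirical Bayes quantities on which dosage individualization builds.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_patients, n_obs = 30, 5
    patient = np.repeat(np.arange(n_patients), n_obs)
    dose = rng.uniform(50, 200, n_patients * n_obs)      # mg/day (simulated)
    intercepts = rng.normal(0, 2, n_patients)[patient]   # patient-level effects
    conc = 1.0 + 0.05 * dose + intercepts + rng.normal(0, 1, len(dose))

    df = pd.DataFrame({"conc": conc, "dose": dose, "patient": patient})
    model = smf.mixedlm("conc ~ dose", df, groups=df["patient"]).fit()
    print(model.summary())
    # model.random_effects holds the empirical Bayes estimate of each
    # patient's deviation, the quantity used to individualize the dosage.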

  18. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  19. Personalized, relevance-based Multimodal Robotic Imaging and augmented reality for Computer Assisted Interventions.

    PubMed

    Navab, Nassir; Hennersperger, Christoph; Frisch, Benjamin; Fürst, Bernhard

    2016-10-01

    In the last decade, many researchers in medical image computing and computer assisted interventions across the world focused on the development of the Virtual Physiological Human (VPH), aiming at changing the practice of medicine from classification and treatment of diseases to that of modeling and treating patients. These projects resulted in major advancements in segmentation, registration, morphological, physiological and biomechanical modeling based on state-of-the-art medical imaging as well as other sensory data. However, a major issue that has not yet come into focus is the personalization of intra-operative imaging, allowing for optimal treatment. In this paper, we discuss the personalization of the imaging and visualization process, with particular focus on satisfying the challenging requirements of computer assisted interventions. We discuss such requirements and review a series of scientific contributions made by our research team to tackle some of these major challenges. Copyright © 2016. Published by Elsevier B.V.

  20. A Parametric Computational Analysis into Galvanic Coupling Intrabody Communication.

    PubMed

    Callejon, M Amparo; Del Campo, P; Reina-Tosina, Javier; Roa, Laura M

    2017-08-02

    Intrabody Communication (IBC) uses the human body tissues as transmission media for electrical signals to interconnect personal health devices in wireless body area networks. The main goal of this work is to conduct a computational analysis covering some bioelectric issues that still have not been fully explained, such as the modeling of the skin-electrode impedance, the differences associated to the use of constant voltage or current excitation modes, or the influence on attenuation of the subject's anthropometrical and bioelectric properties. With this aim, a computational finite element model has been developed, allowing the IBC channel attenuation as well as the electric field and current density through arm tissues to be computed as a function of these parameters. As a conclusion, this parametric analysis has in turn permitted us to disclose some knowledge about the causes and effects of the above-mentioned issues, thus explaining and complementing previous results reported in the literature.
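
    The study itself uses a finite element model; as a far simpler illustration of how channel attenuation falls out of a tissue/electrode description, the sketch below solves a lumped resistor approximation of a galvanic-coupled link by nodal analysis. All component values are assumptions for the sketch.

    import numpy as np

    # Lumped-circuit approximation of a galvanic-coupled IBC channel
    # (illustrative values; the paper uses a 3D finite element arm model).
    Ze = 10e3      # skin-electrode impedance [ohm]
    R_in = 1e3     # transverse tissue resistance at the TX port
    R_l = 2e3      # longitudinal tissue resistance TX -> RX
    R_out = 1e3    # transverse tissue resistance at the RX port
    Zl = 1e6       # receiver input impedance

    G_in, G_l, G_o = 1 / R_in, 1 / R_l, 1 / R_out
    G_load = 1 / (2 * Ze + Zl)      # RX electrodes and receiver in series

    # Nodal analysis with the TX- electrode node grounded; unknowns v1, v3, v4.
    G = np.array([
        [G_in + G_l, -G_l, 0.0],
        [-G_l, G_l + G_o + G_load, -(G_o + G_load)],
        [0.0, -(G_o + G_load), G_l + G_o + G_load],
    ])
    I = np.array([1.0, 0.0, 0.0])   # 1 A test current into the TX+ node
    v1, v3, v4 = np.linalg.solve(G, I)

    v_tx = v1                                   # voltage across the TX pair
    v_rx = (v3 - v4) * Zl / (2 * Ze + Zl)       # voltage seen by the receiver
    print(f"channel attenuation: {20 * np.log10(abs(v_rx / v_tx)):.1f} dB")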

  1. Efficient Conservative Reformulation Schemes for Lithium Intercalation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urisanga, PC; Rife, D; De, S

    Porous electrode theory coupled with transport and reaction mechanisms is a widely used technique to model Li-ion batteries, employing an appropriate discretization or approximation for solid phase diffusion within electrode particles. One of the major difficulties in simulating Li-ion battery models is the need to account for solid phase diffusion in a second radial dimension r, which increases the computation time/cost to a great extent. Various methods that reduce the computational cost have been introduced to treat this phenomenon, but most of them do not guarantee mass conservation. The aim of this paper is to introduce an inherently mass conserving yet computationally efficient method for solid phase diffusion based on Lobatto IIIA quadrature. This paper also presents coupling of the new solid phase reformulation scheme with a macro-homogeneous porous electrode theory based pseudo-2D model for a Li-ion battery. (C) The Author(s) 2015. Published by ECS. All rights reserved.

  2. Integrated Computational Materials Engineering for Magnesium in Automotive Body Applications

    NASA Astrophysics Data System (ADS)

    Allison, John E.; Liu, Baicheng; Boyle, Kevin P.; Hector, Lou; McCune, Robert

    This paper provides an overview and progress report for an international collaborative project which aims to develop an ICME infrastructure for magnesium for use in automotive body applications. Quantitative processing-microstructure-property relationships are being developed for extruded Mg alloys, sheet-formed Mg alloys and high-pressure die-cast Mg alloys. These relationships are captured in computational models which are then linked with manufacturing process simulation and used to provide constitutive models for component performance analysis. The long-term goal is to capture this information in efficient computational models and in a web-centered knowledge base. The work is being conducted at leading universities, national labs and industrial research facilities in the US, China and Canada. This project is sponsored by the U.S. Department of Energy, the U.S. Automotive Materials Partnership (USAMP), the Chinese Ministry of Science and Technology (MOST) and Natural Resources Canada (NRCan).

  3. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects; and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  4. Petri-net-based 2D design of DNA walker circuits.

    PubMed

    Gilbert, David; Heiner, Monika; Rohr, Christian

    2018-01-01

    We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges for the design of reliable walker circuits consists in leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison, and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach of DNA walker circuits relies on coloured stochastic Petri nets which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.
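
    A toy stochastic sketch (not the paper's coloured Petri net) of the leakage idea: at each junction of a binary decision tree the correct step races a leakage transition, and circuit reliability is the chance of reaching the intended leaf. The rates below are assumed for illustration.

    import random

    k_step, k_leak = 1.0, 0.05      # assumed transition rates [1/s]

    def run_walker(depth=3, rng=random):
        """Return True if the walker reaches the intended leaf without leaking."""
        for _ in range(depth):
            # exponential race between the correct step and a leakage jump
            t_step = rng.expovariate(k_step)
            t_leak = rng.expovariate(k_leak)
            if t_leak < t_step:
                return False        # walker jumped into a wrong branch
        return True

    n = 100_000
    ok = sum(run_walker() for _ in range(n))
    print(f"estimated probability of a correct output: {ok / n:.3f}")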

  5. Thermodynamic model effects on the design and optimization of natural gas plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz, S.; Zabaloy, M.; Brignole, E.A.

    1999-07-01

    The design and optimization of natural gas plants is carried out on the basis of process simulators. The physical property package is generally based on cubic equations of state. Phase equilibrium conditions, thermodynamic functions, equilibrium phase separations, work and heat are computed by rigorous thermodynamics. The aim of this work is to analyze the NGL turboexpansion process and identify the process computations that are most sensitive to the accuracy of model predictions. Three equations of state, PR, SRK and the Peneloux modification, are used to study the effect of property predictions on process calculations and plant optimization. It is shown that turboexpander plants have moderate sensitivity with respect to phase equilibrium computations, but higher accuracy is required for the prediction of enthalpy and turboexpansion work. The effect of modeling CO2 solubility is also critical in mixtures with high CO2 content in the feed.
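
    As a pointer to what such property packages compute, the sketch below solves the Peng-Robinson (PR) cubic for the compressibility factor of pure methane; the critical constants are textbook values and the code is an illustration, not the simulator's package.

    import numpy as np

    R = 8.314                                # J/(mol K)
    Tc, Pc, omega = 190.6, 4.599e6, 0.011    # methane critical constants

    def pr_z(T, P):
        """Vapour-phase compressibility factor from the Peng-Robinson EOS."""
        kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
        alpha = (1 + kappa * (1 - np.sqrt(T / Tc)))**2
        a = 0.45724 * R**2 * Tc**2 / Pc * alpha
        b = 0.07780 * R * Tc / Pc
        A, B = a * P / (R * T)**2, b * P / (R * T)
        # cubic in Z: Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
        roots = np.roots([1, -(1 - B), A - 3 * B**2 - 2 * B,
                          -(A * B - B**2 - B**3)])
        real = roots[np.isreal(roots)].real
        return real.max()                    # largest real root: vapour phase

    print(f"Z(250 K, 50 bar) = {pr_z(250.0, 50e5):.4f}")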

  6. Electric Conduction in Solids: a Pedagogical Approach Supported by Laboratory Measurements and Computer Modelling Environments

    NASA Astrophysics Data System (ADS)

    Bonura, A.; Capizzo, M. C.; Fazio, C.; Guastella, I.

    2008-05-01

    In this paper we present a pedagogic approach aimed at modeling electric conduction in semiconductors, built by using NetLogo, a programmable modeling environment for building and exploring multi-agent systems. `Virtual experiments' are implemented to confront predictions of different microscopic models with real measurements of electric properties of matter, such as resistivity. The relations between these electric properties and other physical variables, like temperature, are, then, analyzed.

  7. Tutoring at a Distance: Modelling as a Tool to Control Chaos

    ERIC Educational Resources Information Center

    Bertin, Jean-Claude; Narcy-Combes, Jean-Paul

    2012-01-01

    This article builds on a previous article published in 2007, which aimed at clarifying the concept of tutoring. Based on a new epistemological stance (emergentism) the authors will here show how the various components of the computer-assisted language learning situation form a complex chaotic system. They advocate that modelling is a way of…

  8. Development and Evaluation of the Effectiveness of Computer-Assisted Physics Instruction

    ERIC Educational Resources Information Center

    Rahman, Mohd. Jasmy Abd; Ismail, Mohd. Arif. Hj.; Nasir, Muhammad

    2014-01-01

    This study aims to design and develop interactive software for teaching and learning physics about motion and vector analysis. It also assesses the software's effectiveness in the classroom and the learning motivation of SMA Pekanbaru's students. The software is developed using the ADDIE Model design and the Life Cycle Model and built using the…

  9. Interaction of Simple Ions with Water: Theoretical Models for the Study of Ion Hydration

    ERIC Educational Resources Information Center

    Gancheff, Jorge S.; Kremer, Carlos; Ventura, Oscar N.

    2009-01-01

    A computational experiment aimed to create and systematically analyze models of simple cation hydrates is presented. The changes in the structure (bond distances and angles) and the electronic density distribution of the solvent and the thermodynamic parameters of the hydration process are calculated and compared with the experimental data. The…

  10. Computational Modeling of Morphological Effects in Bangla Visual Word Recognition

    ERIC Educational Resources Information Center

    Dasgupta, Tirthankar; Sinha, Manjira; Basu, Anupam

    2015-01-01

    In this paper we aim to model the organization and processing of Bangla polymorphemic words in the mental lexicon. Our objective is to determine whether the mental lexicon accesses a polymorphemic word as a whole or decomposes the word into its constituent morphemes and then recognize them accordingly. To address this issue, we adopted two…

  11. A production planning model considering uncertain demand using two-stage stochastic programming in a fresh vegetable supply chain context.

    PubMed

    Mateo, Jordi; Pla, Lluis M; Solsona, Francesc; Pagès, Adela

    2016-01-01

    Production planning models are attracting more interest for use in the primary sector of the economy. The proposed model relies on the formulation of a location model representing a set of farms susceptible of being selected by a grocery shop brand to supply local fresh products under seasonal contracts. The main aim is to minimize overall procurement costs and meet future demand. This kind of problem is rather common in fresh vegetable supply chains where producers are located in proximity either to processing plants or retailers. The proposed two-stage stochastic model determines which suppliers should be selected for production contracts to ensure high quality products and minimal time from farm to table. Moreover, Lagrangian relaxation and parallel computing algorithms are proposed to solve these instances efficiently in a reasonable computational time. The results obtained show computational gains from our algorithmic proposals compared with the plain CPLEX solver. Furthermore, the results confirm the competitive advantage that purchase managers in the fresh vegetables industry can gain by using the proposed model.
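
    The structure of such a model can be illustrated with a tiny deterministic equivalent: choose a seasonal contract quantity now, then buy dearer spot recourse after demand is revealed. The numbers below are invented for the sketch, which uses scipy rather than the CPLEX/Lagrangian machinery of the paper.

    import numpy as np
    from scipy.optimize import linprog

    c_contract, c_spot = 1.0, 1.8                 # unit costs (spot is dearer)
    demand = np.array([80.0, 100.0, 130.0])       # demand per scenario [t]
    prob = np.array([0.3, 0.5, 0.2])              # scenario probabilities

    # variables: [x, y1, y2, y3]; minimize c_contract*x + sum_s p_s*c_spot*y_s
    c = np.concatenate(([c_contract], c_spot * prob))
    # first-stage contract plus recourse must cover each scenario's demand:
    # x + y_s >= d_s  ->  -x - y_s <= -d_s
    A_ub = np.hstack([-np.ones((3, 1)), -np.eye(3)])
    b_ub = -demand

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
    x, y = res.x[0], res.x[1:]
    print(f"contract {x:.0f} t now; spot purchases per scenario: {y.round(0)}")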

  12. Dayglow and night glow of the Venusian upper atmosphere. Modelling and observations

    NASA Astrophysics Data System (ADS)

    Gronoff, G.; Lilensten, J.; Simon, C.; Barthélemy, M.; Leblanc, F.

    2007-08-01

    Aims. We present the modelling of the production of excited states of O, CO and N2 in the Venusian upper atmosphere, which allows us to compute the nightglow emissions. On the dayside, we also compute several emissions, taking advantage of the small influence of resonant scattering for forbidden transitions. Methods. We compute the photoionisation and photodissociation mechanisms, and thus the photoelectron production. We compute electron impact excitation and ionisation through a multi-stream stationary kinetic transport code. Finally, we compute the ion recombination with a stationary chemical model. Results. We predict altitude density profiles for the O(1S) and O(1D) states and the emissions corresponding to their different transitions. They are found to be very comparable to the observations without the need for stratospheric emissions. On the nightside, we highlight the role of the N + O2+ reaction in the creation of the O(1S) state. This reaction was suggested by Rees in 1975 (Frederick, 1976). It has been discussed several times since and, in spite of different studies, is still controversial. However, when we take it into consideration for Venus, it is shown to be the cause of almost 90% of the state production. We calculate the production intensities of the O(3S) and O(5S) states, which are needed for radiative transfer models. For CO we compute the Cameron band and fourth positive band emissions. For N2 we compute the LBH, first and second positive bands. All these values are successfully compared to experiment when data are available. Conclusions. For the first time, a comprehensive model is proposed to compute dayglow and nightglow emissions of the Venusian upper atmosphere. It relies on previous works with noticeable improvements, both on the transport and on the chemical aspects. In the near future, a radiative transfer model will be used to compute optically thick lines in the dayglow, and a fluid model will be added to compute ionosphere densities and temperatures. We will present the first observational results from the Pic du Midi telescope in June 2007, in order to compare with our modelling.

  13. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    PubMed

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition. Copyright © 2015 Cognitive Science Society, Inc.

  14. Current Status on the use of Parallel Computing in Turbulent Reacting Flow Computations Involving Sprays, Monte Carlo PDF and Unstructured Grids. Chapter 4

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    The state of the art in multidimensional combustor modeling, as evidenced by the level of sophistication employed in terms of modeling and numerical accuracy considerations, is also dictated by the available computer memory and turnaround times afforded by present-day computers. With the aim of advancing the current multi-dimensional computational tools used in the design of advanced technology combustors, a solution procedure is developed that combines the novelty of the coupled CFD/spray/scalar Monte Carlo PDF (Probability Density Function) computations on unstructured grids with the ability to run on parallel architectures. In this approach, the mean gas-phase velocity and turbulence fields are determined from a standard turbulence model, the joint composition of species and enthalpy from the solution of a modeled PDF transport equation, and a Lagrangian-based dilute spray model is used for the liquid-phase representation. Gas-turbine combustor flows are often characterized by a complex interaction between various physical processes associated with the interaction between the liquid and gas phases, droplet vaporization, turbulent mixing, heat release associated with chemical kinetics, and radiative heat transfer associated with highly absorbing and radiating species, among others. The rate-controlling processes often interact with each other at various disparate time and length scales. In particular, turbulence plays an important role in determining the rates of mass and heat transfer, chemical reactions, and liquid-phase evaporation in many practical combustion devices.

  15. Making Cloud Computing Available For Researchers and Innovators (Invited)

    NASA Astrophysics Data System (ADS)

    Winsor, R.

    2010-12-01

    High Performance Computing (HPC) facilities exist in most academic institutions but are almost invariably over-subscribed. Access is allocated based on academic merit, the only practical method of assigning valuable finite compute resources. Cloud computing on the other hand, and particularly commercial clouds, draw flexibly on an almost limitless resource as long as the user has sufficient funds to pay the bill. How can the commercial cloud model be applied to scientific computing? Is there a case to be made for a publicly available research cloud and how would it be structured? This talk will explore these themes and describe how Cybera, a not-for-profit non-governmental organization in Alberta Canada, aims to leverage its high speed research and education network to provide cloud computing facilities for a much wider user base.

  16. The Effect of a Computer Program Based on Analysis, Design, Development, Implementation and Evaluation (ADDIE) in Improving Ninth Graders' Listening and Reading Comprehension Skills in English in Jordan

    ERIC Educational Resources Information Center

    Alodwan, Talal; Almosa, Mosaab

    2018-01-01

    The study aimed to assess the effectiveness of a computer program based on the Analysis, Design, Development, Implementation and Evaluation (ADDIE) model on ninth graders' achievement in listening and reading comprehension skills in English. The study sample comprised 70 ninth graders during the second semester of the academic year 2016/2017. The…

  17. A Formal Valuation Framework for Emotions and Their Control.

    PubMed

    Huys, Quentin J M; Renz, Daniel

    2017-09-15

    Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  18. Using the CIFIST grid of CO5BOLD 3D model atmospheres to study the effects of stellar granulation on photometric colours. I. Grids of 3D corrections in the UBVRI, 2MASS, HIPPARCOS, Gaia, and SDSS systems

    NASA Astrophysics Data System (ADS)

    Bonifacio, P.; Caffau, E.; Ludwig, H.-G.; Steffen, M.; Castelli, F.; Gallagher, A. J.; Kučinskas, A.; Prakapavičius, D.; Cayrel, R.; Freytag, B.; Plez, B.; Homeier, D.

    2018-03-01

    Context. The atmospheres of cool stars are temporally and spatially inhomogeneous due to the effects of convection. The influence of this inhomogeneity, referred to as granulation, on colours has never been investigated over a large range of effective temperatures and gravities. Aim. We aim to study, in a quantitative way, the impact of granulation on colours. Methods: We use the CIFIST (Cosmological Impact of the FIrst Stars) grid of CO5BOLD (COnservative COde for the COmputation of COmpressible COnvection in a BOx of L Dimensions, L = 2, 3) hydrodynamical models to compute emerging fluxes. These in turn are used to compute theoretical colours in the UBV RI, 2MASS, HIPPARCOS, Gaia and SDSS systems. Every CO5BOLD model has a corresponding one dimensional (1D) plane-parallel LHD (Lagrangian HydroDynamics) model computed for the same atmospheric parameters, which we used to define a "3D correction" that can be applied to colours computed from fluxes computed from any 1D model atmosphere code. As an example, we illustrate these corrections applied to colours computed from ATLAS models. Results: The 3D corrections on colours are generally small, of the order of a few hundredths of a magnitude, yet they are far from negligible. We find that ignoring granulation effects can lead to underestimation of Teff by up to 200 K and overestimation of gravity by up to 0.5 dex, when using colours as diagnostics. We have identified a major shortcoming in how scattering is treated in the current version of the CIFIST grid, which could lead to offsets of the order 0.01 mag, especially for colours involving blue and UV bands. We have investigated the Gaia and HIPPARCOS photometric systems and found that the (G - Hp), (BP - RP) diagram is immune to the effects of granulation. In addition, we point to the potential of the RVS photometry as a metallicity diagnostic. Conclusions: Our investigation shows that the effects of granulation should not be neglected if one wants to use colours as diagnostics of the stellar parameters of F, G, K stars. A limitation is that scattering is treated as true absorption in our current computations, thus our 3D corrections are likely an upper limit to the true effect. We are already computing the next generation of the CIFIST grid, using an approximate treatment of scattering. The appendix tables are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/611/A68

  19. A rapid parallelization of cone-beam projection and back-projection operator based on texture fetching interpolation

    NASA Astrophysics Data System (ADS)

    Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao

    2015-03-01

    Projection and back-projection are the most computationally consuming parts of Computed Tomography (CT) reconstruction, and parallelization strategies using GPU computing techniques have been introduced for them. In this paper we present a new parallelization scheme for both projection and back-projection, based on the CUDA technology from NVIDIA Corporation. Instead of building a complex model, we aimed at optimizing the existing algorithm and making it suitable for CUDA implementation so as to gain fast computation speed. Besides making use of the texture fetching operation, which provides faster interpolation, we fixed the number of samples in the computation of the projection to ensure the synchronization of blocks and threads, thus preventing the latency caused by inconsistent computation complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.
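
    A CPU analogue of the two ideas above - interpolated sampling (here bilinear map_coordinates stands in for hardware texture fetching) and a fixed number of samples per ray so every "thread" does equal work - is sketched below on a toy phantom; it is illustrative, not the CUDA implementation.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def forward_project(image, src, det_pixels, n_samples=256):
        """Approximate line integrals from a point source to each detector pixel."""
        sums = np.zeros(len(det_pixels))
        t = np.linspace(0.0, 1.0, n_samples)
        for k, det in enumerate(det_pixels):
            # fixed sampling positions along the src -> detector ray
            pts = src[None, :] + t[:, None] * (det - src)[None, :]
            vals = map_coordinates(image, pts.T, order=1, mode="constant")
            step = np.linalg.norm(det - src) / (n_samples - 1)
            sums[k] = vals.sum() * step
        return sums

    image = np.zeros((128, 128))
    image[48:80, 48:80] = 1.0                  # toy square phantom
    src = np.array([-100.0, 64.0])
    dets = np.stack([np.full(64, 228.0), np.linspace(0, 127, 64)], axis=1)
    print(forward_project(image, src, dets)[:5])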

  20. Rapid prototyping modelling in oral and maxillofacial surgery: A two year retrospective study.

    PubMed

    Suomalainen, Anni; Stoor, Patricia; Mesimäki, Karri; Kontio, Risto K

    2015-12-01

    The use of rapid prototyping (RP) models in medicine to construct bony models is increasing. The aim of the study was to evaluate retrospectively the indications for the use of RP models in oral and maxillofacial surgery at Helsinki University Central Hospital during 2009-2010. The computed tomography (CT) examination method used - multislice CT (MSCT) or cone beam CT (CBCT) - was also evaluated. In total, 114 RP models were fabricated for 102 patients. The mean age of the patients at the time of the production of the model was 50.4 years. The indications for the modelling included malignant lesions (29%), secondary reconstruction (25%), prosthodontic treatment (22%), orthognathic surgery or asymmetry (13%), benign lesions (8%), and TMJ disorders (4%). MSCT examination was used in 92 cases and CBCT examination in 22 cases. Most of the models (75%) were conventional hard tissue models. Models with coloured tumour or other structure(s) of interest were ordered in 24% of cases. Two of the 114 models were soft tissue models. The main benefit of the models was in treatment planning and in connection with the production of pre-bent plates or custom-made implants. The RP models both facilitate and improve treatment planning and intraoperative efficiency. Keywords: rapid prototyping, radiology, computed tomography, cone beam computed tomography.

  1. An Overview of Spray Modeling With OpenNCC and its Application to Emissions Predictions of a LDI Combustor at High Pressure

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    2016-01-01

    The open national combustion code (OpenNCC) is developed with the aim of advancing the current multi-dimensional computational tools used in the design of advanced technology combustors. In this paper we provide an overview of the spray module, LSPRAY-V, developed as a part of this effort. The spray solver is mainly designed to predict the flow, thermal, and transport properties of a rapidly evaporating multi-component liquid spray. The modeling approach is applicable over a wide range of evaporating conditions (normal, superheated, and supercritical) and is based on several well-established atomization, vaporization, and wall/droplet impingement models. It facilitates large-scale combustor computations through the use of massively parallel computers, with the ability to perform the computations on either structured or unstructured grids. The spray module has multi-liquid and multi-injector capability, and can be used in both steady and unsteady computations. We conclude the paper by providing results for a reacting spray generated by a single injector element with 60° axially swept swirler vanes. It is a configuration based on the next-generation lean-direct injection (LDI) combustor concept. The results include comparisons for both combustor exit temperature and EINOx at three different fuel/air ratios.

  2. Blood flow in intracranial aneurysms treated with Pipeline embolization devices: computational simulation and verification with Doppler ultrasonography on phantom models

    PubMed Central

    2015-01-01

    Purpose: The aim of this study was to validate a computational fluid dynamics (CFD) simulation of flow-diverter treatment through Doppler ultrasonography measurements in patient-specific models of intracranial bifurcation and side-wall aneurysms. Methods: Computational and physical models of patient-specific bifurcation and sidewall aneurysms were constructed from computed tomography angiography with use of stereolithography, a three-dimensional printing technology. Flow dynamics parameters before and after flow-diverter treatment were measured with pulse-wave and color Doppler ultrasonography, and then compared with CFD simulations. Results: CFD simulations showed drastic flow reduction after flow-diverter treatment in both aneurysms. The mean volume flow rate decreased by 90% and 85% for the bifurcation aneurysm and the side-wall aneurysm, respectively. Velocity contour plots from computer simulations before and after flow diversion closely resembled the patterns obtained by color Doppler ultrasonography. Conclusion: The CFD estimation of flow reduction in aneurysms treated with a flow-diverting stent was verified by Doppler ultrasonography in patient-specific phantom models of bifurcation and side-wall aneurysms. The combination of CFD and ultrasonography may constitute a feasible and reliable technique in studying the treatment of intracranial aneurysms with flow-diverting stents. PMID:25754367

  3. Perspective: Ab initio force field methods derived from quantum mechanics

    NASA Astrophysics Data System (ADS)

    Xu, Peng; Guidez, Emilie B.; Bertoni, Colleen; Gordon, Mark S.

    2018-03-01

    It is often desirable to accurately and efficiently model the behavior of large molecular systems in the condensed phase (thousands to tens of thousands of atoms) over long time scales (from nanoseconds to milliseconds). In these cases, ab initio methods are difficult due to the increasing computational cost with the number of electrons. A more computationally attractive alternative is to perform the simulations at the atomic level using a parameterized function to model the electronic energy. Many empirical force fields have been developed for this purpose. However, the functions that are used to model interatomic and intermolecular interactions contain many fitted parameters obtained from selected model systems, and such classical force fields cannot properly simulate important electronic effects. Furthermore, while such force fields are computationally affordable, they are not reliable when applied to systems that differ significantly from those used in their parameterization. They also cannot provide the information necessary to analyze the interactions that occur in the system, making the systematic improvement of the functional forms that are used difficult. Ab initio force field methods aim to combine the merits of both types of methods. The ideal ab initio force fields are built on first principles and require no fitted parameters. Ab initio force field methods surveyed in this perspective are based on fragmentation approaches and intermolecular perturbation theory. This perspective summarizes their theoretical foundation, key components in their formulation, and discusses key aspects of these methods such as accuracy and formal computational cost. The ab initio force fields considered here were developed for different targets, and this perspective also aims to provide a balanced presentation of their strengths and shortcomings. Finally, this perspective suggests some future directions for this actively developing area.

  4. A 3-states magnetic model of binary decisions in sociophysics

    NASA Astrophysics Data System (ADS)

    Fernandez, Miguel A.; Korutcheva, Elka; de la Rubia, F. Javier

    2016-11-01

    We study a diluted Blume-Capel model with three-state sites in an attempt to understand how social processes such as cooperation or organization happen. To this aim, we study the effect of complex network topology on the equilibrium properties of the model, focusing on three different substrates: random graph, Watts-Strogatz and Newman substrates. Our computer simulations are in good agreement with the corresponding analytical results.
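
    A minimal Metropolis sketch of such a model, with three-state spins on an Erdős–Rényi random graph, is given below; the Hamiltonian H = -J Σ s_i s_j + D Σ s_i² and all parameter values are illustrative assumptions rather than the paper's setup.

    import numpy as np

    rng = np.random.default_rng(0)
    N, p = 200, 0.05                 # sites and edge probability
    J, D, T = 1.0, 0.5, 1.0          # coupling, crystal field, temperature

    adj = rng.random((N, N)) < p     # random graph, symmetric, no self-loops
    adj = np.triu(adj, 1)
    adj = adj | adj.T
    s = rng.choice([-1, 0, 1], size=N)

    def local_energy(i, si):
        # energy contribution of site i if its spin were si
        return -J * si * s[adj[i]].sum() + D * si**2

    for sweep in range(300):
        for _ in range(N):
            i = rng.integers(N)
            new = rng.choice([-1, 0, 1])
            dE = local_energy(i, new) - local_energy(i, s[i])
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i] = new           # Metropolis acceptance
    print("magnetization per site:", s.mean())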

  5. Computational reacting gas dynamics

    NASA Technical Reports Server (NTRS)

    Lam, S. H.

    1993-01-01

    In the study of high speed flows at high altitudes, such as those encountered by re-entry spacecraft, the interaction of chemical reactions and other non-equilibrium processes in the flow field with the gas dynamics is crucial. Generally speaking, problems of this level of complexity must resort to numerical methods for solutions, using sophisticated computational fluid dynamics (CFD) codes. The difficulties introduced by reacting gas dynamics can be classified under three distinct headings: (1) the usually inadequate knowledge of the reaction rate coefficients in the non-equilibrium reaction system; (2) the vastly larger number of unknowns involved in the computation and the expected stiffness of the equations; and (3) the interpretation of the detailed reacting CFD numerical results. The research performed accepts the premise that reacting flows of practical interest in the future will in general be too complex or 'untractable' for traditional analytical developments. The power of modern computers must be exploited. However, instead of focusing solely on the construction of numerical solutions of full-model equations, attention is also directed to the 'derivation' of the simplified model from the given full-model. In other words, the present research aims to utilize computations to do tasks which have traditionally been done by skilled theoreticians: to reduce an originally complex full-model system into an approximate but otherwise equivalent simplified model system. The tacit assumption is that once the appropriate simplified model is derived, the interpretation of the detailed reacting CFD numerical results will become much easier. The approach of the research is called computational singular perturbation (CSP).

  6. Indonesia’s Electricity Demand Dynamic Modelling

    NASA Astrophysics Data System (ADS)

    Sulistio, J.; Wirabhuana, A.; Wiratama, M. G.

    2017-06-01

    Electricity system modelling is one of the emerging areas in global energy policy studies. The system dynamics approach and computer simulation have become common methods for energy system planning and evaluation under many conditions. On the other hand, Indonesia is experiencing several major issues in its electricity system, such as fossil fuel domination, demand-supply imbalances, distribution inefficiency, and bio-devastation. This paper aims to explain the development of system dynamics modelling approaches and computer simulation techniques for representing and predicting electricity demand in Indonesia. In addition, this paper describes the typical characteristics and relationships of the commercial business sector, the industrial sector, and the family/domestic sector as electricity subsystems in Indonesia. Moreover, it presents direct structure, behavioural, and statistical tests as model validation approaches, and ends with conclusions.
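
    The stock-and-flow flavour of such a model can be sketched in a few lines: one demand stock per sector, each integrated forward with Euler steps. The sector growth rates below are assumptions for illustration, not Indonesian data.

    import numpy as np

    years = np.arange(2017, 2031)
    dt = 1.0
    demand = {"domestic": 90.0, "industrial": 60.0, "commercial": 40.0}   # TWh
    growth = {"domestic": 0.06, "industrial": 0.045, "commercial": 0.07}  # 1/yr

    total = []
    for _ in years:
        total.append(sum(demand.values()))
        for sector in demand:                # Euler integration of each stock
            demand[sector] += growth[sector] * demand[sector] * dt

    for y, d in zip(years, total):
        print(y, round(d, 1), "TWh")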

  7. Analyzing musculoskeletal neck pain, measured as present pain and periods of pain, with three different regression models: a cohort study.

    PubMed

    Grimby-Ekman, Anna; Andersson, Eva M; Hagberg, Mats

    2009-06-19

    In the literature there are discussions on the choice of outcome and the need for more longitudinal studies of musculoskeletal disorders. The general aim of this longitudinal study was to analyze musculoskeletal neck pain in a group of young adults. Specific aims were to determine whether psychosocial factors, computer use, high work/study demands, and lifestyle are long-term or short-term factors for musculoskeletal neck pain, and whether these factors are important for developing or ongoing musculoskeletal neck pain. Three regression models were used to analyze the different outcomes. Pain at present was analyzed with a marginal logistic model, number of years with pain with a Poisson regression model, and developing and ongoing pain with a logistic model. Presented results are odds ratios and proportion ratios (logistic models) and rate ratios (Poisson model). The material consisted of web-based questionnaires answered by 1204 Swedish university students from a prospective cohort recruited in 2002. Perceived stress was a risk factor for pain at present (PR = 1.6), for developing pain (PR = 1.7), and for number of years with pain (RR = 1.3). High work/study demands were associated with pain at present (PR = 1.6), and with number of years with pain when the demands negatively affect home life (RR = 1.3). Computer use pattern (number of times/week with a computer session ≥ 4 h, without break) was a risk factor for developing pain (PR = 1.7), but was also associated with pain at present (PR = 1.4) and number of years with pain (RR = 1.2). Among lifestyle factors, smoking (PR = 1.8) was found to be associated with pain at present. The difference between men and women in prevalence of musculoskeletal pain was confirmed in this study. It was smallest for the outcome ongoing pain (PR = 1.4) compared to pain at present (PR = 2.4) and developing pain (PR = 2.5). By using different regression models, different aspects of the neck pain pattern could be addressed and the risk factors' impact on the pain pattern identified. Short-term risk factors were perceived stress, high work/study demands, and computer use pattern (break pattern). These were also long-term risk factors. For developing pain, perceived stress and computer use pattern were risk factors.
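
    A minimal sketch of the three model types named above, using Python's statsmodels on a hypothetical long-format data set; the column names (pain, stress, demands, computer_breaks, smoker, years_pain, developed_pain, subject) are illustrative stand-ins, not the study's actual variables.

    ```python
    # Hedged sketch of the three regression models described in the abstract.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("cohort.csv")  # assumed long-format panel: one row per subject-year

    # 1) Pain at present: marginal logistic model (GEE, exchangeable correlation)
    gee = smf.gee("pain ~ stress + demands + computer_breaks + smoker",
                  groups="subject", data=df,
                  family=sm.families.Binomial(),
                  cov_struct=sm.cov_struct.Exchangeable()).fit()

    # 2) Number of years with pain: Poisson regression
    pois = smf.glm("years_pain ~ stress + demands + computer_breaks",
                   data=df, family=sm.families.Poisson()).fit()

    # 3) Developing (or ongoing) pain: ordinary logistic regression
    logit = smf.logit("developed_pain ~ stress + computer_breaks", data=df).fit()

    print(gee.summary(), pois.summary(), logit.summary(), sep="\n")
    ```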

  8. Analyzing musculoskeletal neck pain, measured as present pain and periods of pain, with three different regression models: a cohort study

    PubMed Central

    Grimby-Ekman, Anna; Andersson, Eva M; Hagberg, Mats

    2009-01-01

    Background In the literature there are discussions on the choice of outcome and the need for more longitudinal studies of musculoskeletal disorders. The general aim of this longitudinal study was to analyze musculoskeletal neck pain in a group of young adults. Specific aims were to determine whether psychosocial factors, computer use, high work/study demands, and lifestyle are long-term or short-term factors for musculoskeletal neck pain, and whether these factors are important for developing or ongoing musculoskeletal neck pain. Methods Three regression models were used to analyze the different outcomes. Pain at present was analyzed with a marginal logistic model, number of years with pain with a Poisson regression model, and developing and ongoing pain with a logistic model. Presented results are odds ratios and proportion ratios (logistic models) and rate ratios (Poisson model). The material consisted of web-based questionnaires answered by 1204 Swedish university students from a prospective cohort recruited in 2002. Results Perceived stress was a risk factor for pain at present (PR = 1.6), for developing pain (PR = 1.7), and for number of years with pain (RR = 1.3). High work/study demands were associated with pain at present (PR = 1.6), and with number of years with pain when the demands negatively affect home life (RR = 1.3). Computer use pattern (number of times/week with a computer session ≥ 4 h, without break) was a risk factor for developing pain (PR = 1.7), but was also associated with pain at present (PR = 1.4) and number of years with pain (RR = 1.2). Among lifestyle factors, smoking (PR = 1.8) was found to be associated with pain at present. The difference between men and women in prevalence of musculoskeletal pain was confirmed in this study. It was smallest for the outcome ongoing pain (PR = 1.4) compared to pain at present (PR = 2.4) and developing pain (PR = 2.5). Conclusion By using different regression models, different aspects of the neck pain pattern could be addressed and the risk factors' impact on the pain pattern identified. Short-term risk factors were perceived stress, high work/study demands, and computer use pattern (break pattern). These were also long-term risk factors. For developing pain, perceived stress and computer use pattern were risk factors. PMID:19545386

  9. Patient-Specific Simulation of Cardiac Blood Flow From High-Resolution Computed Tomography.

    PubMed

    Lantz, Jonas; Henriksson, Lilian; Persson, Anders; Karlsson, Matts; Ebbers, Tino

    2016-12-01

    Cardiac hemodynamics can be computed from medical imaging data, and the results could potentially aid cardiac diagnosis and treatment optimization. However, simulations are often based on simplified geometries, ignoring features such as papillary muscles and trabeculae because of their complex shape, limitations in image acquisition, and challenges in computational modeling. This severely hampers the use of computational fluid dynamics in clinical practice. The overall aim of this study was to develop a novel numerical framework that incorporates these geometrical features. The model included the left atrium, ventricle, ascending aorta, and heart valves. The framework used image registration to obtain patient-specific wall motion, automatic remeshing to handle topological changes due to the complex trabecular motion, and a fast interpolation routine to obtain intermediate meshes during the simulations. Velocity fields and residence time were evaluated, and they indicated that the papillary muscles and trabeculae interact strongly with the blood, which could not be observed in a simplified model. The framework resulted in a model with outstanding geometrical detail, demonstrating the feasibility and importance of a framework capable of simulating blood flow in physiologically realistic hearts.

  10. Direct Methods for Predicting Movement Biomechanics Based Upon Optimal Control Theory with Implementation in OpenSim.

    PubMed

    Porsa, Sina; Lin, Yi-Chung; Pandy, Marcus G

    2016-08-01

    The aim of this study was to compare the computational performances of two direct methods for solving large-scale, nonlinear, optimal control problems in human movement. Direct shooting and direct collocation were implemented on an 8-segment, 48-muscle model of the body (24 muscles on each side) to compute the optimal control solution for maximum-height jumping. Both algorithms were executed on a freely available musculoskeletal modeling platform called OpenSim. Direct collocation converged to essentially the same optimal solution up to 249 times faster than direct shooting when the same initial guess was assumed (3.4 h of CPU time for direct collocation vs. 35.3 days for direct shooting). The model predictions were in good agreement with the time histories of joint angles, ground reaction forces and muscle activation patterns measured for subjects jumping to their maximum achievable heights. Both methods converged to essentially the same solution when started from the same initial guess, but computation time was sensitive to the initial guess assumed. Direct collocation demonstrates exceptional computational performance and is well suited to performing predictive simulations of movement using large-scale musculoskeletal models.
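
    The essence of direct collocation is to turn the dynamics into algebraic defect constraints on a time grid and hand the whole problem to a nonlinear programming solver. The sketch below is a deliberately tiny stand-in, not the paper's OpenSim musculoskeletal model or its solver: a point mass with a bounded actuator, maximizing final height, solved with SciPy's SLSQP.

    ```python
    # Minimal direct-collocation sketch with trapezoidal defect constraints.
    import numpy as np
    from scipy.optimize import minimize

    N, T, g, umax = 20, 1.0, 9.81, 30.0
    h = T / N

    def unpack(z):                  # decision vector = states and controls on the grid
        x, v, u = np.split(z, 3)
        return x, v, u

    def defects(z):                 # enforce x' = v, v' = u - g between grid points
        x, v, u = unpack(z)
        dx = x[1:] - x[:-1] - 0.5 * h * (v[1:] + v[:-1])
        dv = v[1:] - v[:-1] - 0.5 * h * ((u[1:] - g) + (u[:-1] - g))
        return np.concatenate([dx, dv, [x[0], v[0]]])   # start at rest at height 0

    res = minimize(lambda z: -unpack(z)[0][-1],          # maximize final height
                   x0=np.zeros(3 * N),
                   constraints={"type": "eq", "fun": defects},
                   bounds=[(None, None)] * 2 * N + [(0.0, umax)] * N,
                   method="SLSQP")
    x, v, u = unpack(res.x)
    print("final height:", x[-1])
    ```

    The key contrast with direct shooting is that the state trajectory is a decision variable rather than the output of a forward integration, which is what makes the problem sparse and well conditioned for the solver.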

  11. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU.

    PubMed

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces a possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. Experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation in group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis.
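
    As a hedged illustration of the offloading pattern (the paper itself calls CUDA kernels from MATLAB via external functions), the Python/CuPy sketch below moves the dense linear algebra of an EM-style iteration to the GPU and crosses the host/device boundary once per call; the update shown is a generic stand-in, not the actual DCM neural-mass integration.

    ```python
    # Generic GPU-offload pattern for an EM-style loop, using CuPy.
    import numpy as np
    import cupy as cp

    def em_step_gpu(Y, theta):
        """One illustrative update; in DCM for ERP the expensive inner step
        integrates a neural mass model, which is what the GPU kernels target."""
        Yg = cp.asarray(Y)                      # host -> device, once per call
        P  = cp.asarray(theta)
        E  = Yg @ P                             # stand-in for the costly E-step
        M  = cp.linalg.pinv(Yg.T @ Yg) @ (Yg.T @ E)
        return cp.asnumpy(M)                    # device -> host

    theta = np.random.rand(64, 64)
    Y = np.random.rand(10000, 64)
    for _ in range(10):                         # EM iterations on the host
        theta = em_step_gpu(Y, theta)
    ```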

  12. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    PubMed Central

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces a possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. Experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation in group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507

  13. Performance Analysis and Optimization on the UCLA Parallel Atmospheric General Circulation Model Code

    NASA Technical Reports Server (NTRS)

    Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos

    1996-01-01

    An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code, aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and single-node code performance, are discussed.

  14. The Combined Use of Video Modeling and Social Stories in Teaching Social Skills for Individuals with Intellectual Disability

    ERIC Educational Resources Information Center

    Gül, Seray Olçay

    2016-01-01

    There are many studies in the literature in which individuals with intellectual disabilities exhibit social skills deficits and which show the need for teaching these skills systematically. This study aims to investigate the effects of an intervention package consisting of computer-presented video modeling and Social Stories on individuals with…

  15. Computer Assisted Educational Material Preparation for Fourth Grade Primary School Students' English Language Class in Teaching Numbers

    ERIC Educational Resources Information Center

    Yüzen, Abdulkadir; Karamete, Aysen

    2016-01-01

    In this study, using the ADDIE instructional design model, the aim is to prepare English-language educational material for 4th-grade primary students to teach them numbers. At the same time, the attention, relevance, and satisfaction phases of the ARCS model of motivation are also taken into consideration. This study also comprises Design Based Research…

  16. Finite Element Study into the effect of footwear temperature on the Forces transmitted to the foot during quasi- static compression loading

    NASA Astrophysics Data System (ADS)

    Shariatmadari, M. R.; English, R.; Rothwell, G.

    2010-06-01

    The determination of plantar stresses using computational footwear models which include temperature effects is crucial to predict foam performance in service and to aid material development and product design. The Finite Element Method (FEM) provides an efficient computational framework to investigate the foot-footwear interaction. The aim of this research is to use FEM to investigate the effect of varying footwear temperature on plantar stresses. The results obtained will provide data which can be used to help optimise shoe design in terms of minimising damaging stresses in the foot, particularly for individuals with diabetes who are susceptible to lower extremity complications. The FE simulation results showed significant reductions in foot stresses with the modifications from FE model (1) without footwear, to model (2) with midsole only, and to model (3) with midsole and insole. In summary, insole and midsole layers made from various foam materials aim to reduce the Ground Reaction Forces (GRFs) and foot stresses considerably, and temperature variation can affect their cushioning and consequently their shock attenuation properties. The loss of footwear cushioning can have important clinical implications for individuals with a history of lower limb overuse injuries or diabetes.

  17. Different treatment modalities of fusiform basilar trunk aneurysm: study on computational hemodynamics.

    PubMed

    Wu, Chen; Xu, Bai-Nan; Sun, Zheng-Hui; Wang, Fu-Yu; Liu, Lei; Zhang, Xiao-Jun; Zhou, Ding-Biao

    2012-01-01

    Unclippable fusiform basilar trunk aneurysm is a formidable condition for surgical treatment. The aim of this study was to establish a computational model and to investigate the hemodynamic characteristics of a fusiform basilar trunk aneurysm. The three-dimensional digital model of a fusiform basilar trunk aneurysm was constructed using MIMICS, ANSYS and CFX software. Different hemodynamic modalities and boundary conditions were assigned to the model. Thirty points were selected randomly on the wall and within the aneurysm. Wall total pressure (WTP), wall shear stress (WSS), and blood flow velocity at each point were calculated, and the hemodynamic status was compared between modalities. The quantitative average values of the 30 points on the wall and within the aneurysm were obtained by computational calculation, point by point. The velocity and WSS in modalities A and B differed from those of the remaining 5 modalities, and the WTP in modalities A, E and F was higher than in the remaining 4 modalities. The digital model of a fusiform basilar artery aneurysm is feasible and reliable. This model could provide important information for clinical treatment options.

  18. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  19. A combined three-dimensional in vitro–in silico approach to modelling bubble dynamics in decompression sickness

    PubMed Central

    Stride, E.; Cheema, U.

    2017-01-01

    The growth of bubbles within the body is widely believed to be the cause of decompression sickness (DCS). Dive computer algorithms that aim to prevent DCS by mathematically modelling bubble dynamics and tissue gas kinetics are challenging to validate. This is due to a lack of understanding regarding the mechanism(s) leading from bubble formation to DCS. In this work, a biomimetic in vitro tissue phantom and a three-dimensional computational model, comprising a hyperelastic strain-energy density function to model tissue elasticity, were combined to investigate key areas of bubble dynamics. A sensitivity analysis indicated that the diffusion coefficient was the most influential material parameter. Comparison of computational and experimental data revealed the diffusion coefficient at the bubble surface to be 30 times smaller than that in the bulk tissue and dependent on the bubble's surface area. The initial size, size distribution and proximity of bubbles within the tissue phantom were also shown to influence their subsequent dynamics, highlighting the importance of modelling bubble nucleation and bubble–bubble interactions in order to develop more accurate dive algorithms. PMID:29263127

  20. Numerical model for healthy and injured ankle ligaments.

    PubMed

    Forestiero, Antonella; Carniel, Emanuele Luigi; Fontanella, Chiara Giulia; Natali, Arturo Nicola

    2017-06-01

    The aim of this work is to provide a computational tool for the investigation of ankle mechanics under different loading conditions. Attention is focused on the biomechanical role of the ankle ligaments, which are fundamental for joint stability. A finite element model of the human foot is developed starting from Computed Tomography and Magnetic Resonance Imaging, paying particular attention to the definition of the ankle ligaments. A refined fiber-reinforced visco-hyperelastic constitutive model is assumed to characterize the mechanical response of the ligaments. Numerical analyses that reproduce the anterior drawer and talar tilt tests reported in the literature are performed. The numerical results are in agreement with the range of values obtained by experimental tests, confirming the accuracy of the procedure adopted. The increase in ankle range of motion after rupture of some ligaments is also evaluated, demonstrating the capability of the numerical models to represent damage conditions. The developed computational model provides a tool for the investigation of foot and ankle functionality in terms of tissue stress-strain and ankle motion, considering different types of damage to the ankle ligaments.

  1. Computer-assisted virtual preoperative planning in orthopedic surgery for acetabular fractures based on actual computed tomography data.

    PubMed

    Wang, Guang-Ye; Huang, Wen-Jun; Song, Qi; Qin, Yun-Tian; Liang, Jin-Feng

    2016-12-01

    Acetabular fractures have always been very challenging for orthopedic surgeons; therefore, appropriate preoperative evaluation and planning are particularly important. This study aimed to explore the application methods and clinical value of preoperative computer simulation (PCS) in treating pelvic and acetabular fractures. Spiral computed tomography (CT) was performed on 13 patients with pelvic and acetabular fractures, and Digital Imaging and Communications in Medicine (DICOM) data were then imported into Mimics software to reconstruct three-dimensional (3D) models of the actual pelvic and acetabular fractures for preoperative simulated reduction and fixation, and to simulate each surgical procedure. The times needed for virtual surgical modeling and for reduction and fixation were also recorded. The average fracture-modeling time was 45 min (30-70 min), and the average time for bone reduction and fixation was 28 min (16-45 min). Among the surgical approaches planned for these 13 patients, 12 were finally adopted; 12 cases used the simulated surgical fixation, and only 1 case used a partially planned fixation method. PCS can provide accurate surgical plans and data support for actual surgeries.
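
    A rough sketch of the first step of such a pipeline, assuming pydicom and scikit-image in place of the authors' Mimics workflow: load a CT series, convert to Hounsfield units, threshold bone, and extract a surface mesh. The directory name and the HU threshold are assumptions for illustration.

    ```python
    # Illustrative CT-to-mesh sketch (not the authors' actual toolchain).
    import numpy as np
    import pydicom
    from pathlib import Path
    from skimage import measure

    slices = sorted((pydicom.dcmread(f) for f in Path("ct_series").glob("*.dcm")),
                    key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array * float(s.RescaleSlope) + float(s.RescaleIntercept)
                       for s in slices])           # convert to Hounsfield units

    bone = volume > 250                             # rough cortical-bone threshold
    verts, faces, normals, _ = measure.marching_cubes(bone.astype(np.float32), level=0.5)
    print(f"{len(verts)} vertices, {len(faces)} faces")
    ```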

  2. Using three-dimensional computational modeling to compare the geometrical fitness of two kinds of proximal femoral intramedullary nail for Chinese femur.

    PubMed

    Zhang, Sheng; Zhang, Kairui; Wang, Yimin; Feng, Wei; Wang, Bowei; Yu, Bin

    2013-01-01

    The aim of this study was to use three-dimensional (3D) computational modeling to compare the geometric fitness of two kinds of proximal femoral intramedullary nails in Chinese femurs. Computed tomography (CT) scans of 120 normal adult Chinese cadaveric femurs were collected for analysis. Using 3D computational technology, the anatomical fitness between nail and bone was quantified by the impingement incidence and by the maximum thicknesses and lengths by which the nail protruded into the cortex in the virtual bone model, at the proximal, middle, and distal portions of the implant in the femur, respectively. The results showed that the PFNA-II may fit the Chinese proximal femur better than the InterTan, while the distal portion of the InterTan may perform better than that of the PFNA-II; the anatomic fitness of both nails for Chinese patients may not be very satisfactory. As a result, both implants need further modification to meet the needs of the Chinese population.
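
    One of the fit metrics described above, protrusion of the nail into the cortex, can be approximated as a nearest-neighbor signed-distance query. The sketch below uses SciPy's cKDTree; the mesh files and the inward-normal sign convention are assumptions made for illustration, not the paper's method.

    ```python
    # Approximate implant-bone protrusion via nearest-neighbor signed distance.
    import numpy as np
    from scipy.spatial import cKDTree

    nail_pts  = np.load("nail_vertices.npy")    # (N, 3) vertices of the nail surface
    canal_pts = np.load("canal_vertices.npy")   # (M, 3) points on the medullary canal
    canal_nrm = np.load("canal_normals.npy")    # (M, 3) inward canal surface normals

    dist, idx = cKDTree(canal_pts).query(nail_pts)
    # positive = nail vertex lies outside the canal wall (impingement/protrusion)
    penetration = np.einsum("ij,ij->i", nail_pts - canal_pts[idx], -canal_nrm[idx])
    print("impingement:", bool((penetration > 0).any()),
          "max protrusion thickness (mm):", penetration.max())
    ```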

  3. Modeling of near wall turbulence and modeling of bypass transition

    NASA Technical Reports Server (NTRS)

    Yang, Z.

    1992-01-01

    The objectives of this project are as follows. (1) Modeling of near-wall turbulence: We aim to develop a second-order closure for near-wall turbulence. As a first step, we try to develop a k-epsilon model for near-wall turbulence. We require the resulting model to be able to handle both near-wall turbulence and turbulent flows away from the wall, to be computationally robust, and to be applicable to complex flow situations (flows with separation, for example). (2) Modeling of bypass transition: We aim to develop a bypass transition model which contains the effect of intermittency, so that the model can be used for both transitional and turbulent boundary layers. We require the resulting model to give a good prediction of momentum and heat transfer within the transitional boundary layer, and a good prediction of the effect of freestream turbulence on transitional boundary layers.

  4. OpenWorm: an open-science approach to modeling Caenorhabditis elegans.

    PubMed

    Szigeti, Balázs; Gleeson, Padraig; Vella, Michael; Khayrulin, Sergey; Palyanov, Andrey; Hokanson, Jim; Currie, Michael; Cantarelli, Matteo; Idili, Giovanni; Larson, Stephen

    2014-01-01

    OpenWorm is an international collaboration with the aim of understanding how the behavior of Caenorhabditis elegans (C. elegans) emerges from its underlying physiological processes. The project has developed a modular simulation engine to create computational models of the worm. The modularity of the engine makes it possible to easily modify the model, incorporate new experimental data and test hypotheses. The modeling framework incorporates both biophysical neuronal simulations and a novel fluid-dynamics-based soft-tissue simulation for physical environment-body interactions. The project's open-science approach is aimed at overcoming the difficulties of integrative modeling within a traditional academic environment. In this article the rationale is presented for creating the OpenWorm collaboration, the tools and resources developed thus far are outlined and the unique challenges associated with the project are discussed.

  5. Learning the Task Management Space of an Aircraft Approach Model

    NASA Technical Reports Server (NTRS)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  6. An Automatic Registration Algorithm for 3D Maxillofacial Model

    NASA Astrophysics Data System (ADS)

    Qiu, Luwen; Zhou, Zhongwei; Guo, Jixiang; Lv, Jiancheng

    2016-09-01

    3D image registration aims at aligning two 3D data sets in a common coordinate system, and has been widely used in computer vision, pattern recognition and computer-assisted surgery. One challenging problem in 3D registration is that point-wise correspondences between two point sets are often unknown a priori. In this work, we develop an automatic algorithm for registering 3D maxillofacial models, including facial surface models and skull models. The proposed registration algorithm achieves a good alignment between partial and whole maxillofacial models in spite of ambiguous matching, and has a potential application in oral and maxillofacial reparative and reconstructive surgery. The proposed algorithm includes three steps: (1) 3D-SIFT feature extraction and FPFH descriptor construction; (2) feature matching using SAC-IA; (3) coarse rigid alignment and refinement by ICP. Experiments on facial surfaces and mandible skull models demonstrate the efficiency and robustness of our algorithm.
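
    The same three-step pipeline can be sketched with Open3D as a stand-in: FPFH features, RANSAC-based coarse alignment (SAC-IA is the PCL name for this style of feature-correspondence alignment), then ICP refinement. The voxel size, thresholds, and file names below are guesses for illustration, not the paper's settings.

    ```python
    # Feature matching + coarse alignment + ICP refinement, sketched in Open3D.
    import open3d as o3d

    def register(source, target, voxel=2.0):
        def preprocess(pcd):
            down = pcd.voxel_down_sample(voxel)
            down.estimate_normals(
                o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
            fpfh = o3d.pipelines.registration.compute_fpfh_feature(
                down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
            return down, fpfh

        s, s_f = preprocess(source)
        t, t_f = preprocess(target)
        # Coarse alignment from feature correspondences (RANSAC, in place of SAC-IA)
        coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
            s, t, s_f, t_f, True, 1.5 * voxel,
            o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
            3, [],
            o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
        # Refinement by point-to-plane ICP, seeded with the coarse transform
        fine = o3d.pipelines.registration.registration_icp(
            s, t, voxel, coarse.transformation,
            o3d.pipelines.registration.TransformationEstimationPointToPlane())
        return fine.transformation

    source = o3d.io.read_point_cloud("face_partial.ply")   # hypothetical inputs
    target = o3d.io.read_point_cloud("skull_whole.ply")
    print(register(source, target))
    ```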

  7. The effective application of a discrete transition model to explore cell-cycle regulation in yeast

    PubMed Central

    2013-01-01

    Background Bench biologists often do not take part in the development of computational models for their systems, and therefore they frequently employ them as “black boxes”. Our aim was to construct and test a model that does not depend on the availability of quantitative data, and can be used directly without a need for an intensive computational background. Results We present a discrete transition model. We used the cell cycle in budding yeast as a paradigm for a complex network, demonstrating phenomena such as sequential protein expression and activity, and cell-cycle oscillation. The structure of the network was validated by its response to computational perturbations such as mutations, and by its response to mating pheromone or nitrogen depletion. The model has a strong predictive capability, demonstrating how the activity of a specific transcription factor, Hcm1, is regulated, and what determines the commitment of cells to enter and complete the cell cycle. Conclusion The model presented herein is intuitive, yet is expressive enough to elucidate the intrinsic structure and qualitative behavior of large and complex regulatory networks. Moreover, our model allowed us to examine multiple hypotheses in a simple and intuitive manner, giving rise to testable predictions. This methodology can be easily integrated as a useful approach for the study of networks, enriching experimental biology with computational insights. PMID:23915717
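
    A minimal sketch of a discrete transition model in this spirit: every node is ON or OFF and all nodes update synchronously from Boolean rules, so no quantitative rate data are needed. The three-node toy network below is invented for illustration and is not the paper's yeast cell-cycle network.

    ```python
    # Synchronous Boolean (discrete transition) network, minimal illustration.
    rules = {
        "CLN": lambda s: s["SBF"] and not s["PHEROMONE"],   # cyclin needs its TF
        "SBF": lambda s: not s["CLN"],                       # negative feedback
        "PHEROMONE": lambda s: s["PHEROMONE"],               # external input, held fixed
    }

    def step(state):
        # All nodes read the *old* state, so the update is synchronous.
        return {node: int(rule(state)) for node, rule in rules.items()}

    state = {"CLN": 0, "SBF": 1, "PHEROMONE": 0}
    for t in range(6):                  # the CLN/SBF pair settles into an oscillation
        print(t, state)
        state = step(state)
    ```

    A perturbation such as a mutation is simulated by pinning a node's rule to a constant, mirroring how the paper probes the network's structure.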

  8. Blade loss transient dynamics analysis, volume 1. Task 1: Survey and perspective. [aircraft gas turbine engines

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.

  9. Computational models of aortic coarctation in hypoplastic left heart syndrome: considerations on validation of a detailed 3D model.

    PubMed

    Biglino, Giovanni; Corsini, Chiara; Schievano, Silvia; Dubini, Gabriele; Giardini, Alessandro; Hsia, Tain-Yen; Pennati, Giancarlo; Taylor, Andrew M

    2014-05-01

    Reliability of computational models for cardiovascular investigations strongly depends on their validation against physical data. This study aims to experimentally validate a computational model of complex congenital heart disease (i.e., surgically palliated hypoplastic left heart syndrome with aortic coarctation) thus demonstrating that hemodynamic information can be reliably extrapolated from the model for clinically meaningful investigations. A patient-specific aortic arch model was tested in a mock circulatory system and the same flow conditions were re-created in silico, by setting an appropriate lumped parameter network (LPN) attached to the same three-dimensional (3D) aortic model (i.e., multi-scale approach). The model included a modified Blalock-Taussig shunt and coarctation of the aorta. Different flow regimes were tested as well as the impact of uncertainty in viscosity. Computational flow and pressure results were in good agreement with the experimental signals, both qualitatively, in terms of the shape of the waveforms, and quantitatively (mean aortic pressure 62.3 vs. 65.1 mmHg, 4.8% difference; mean aortic flow 28.0 vs. 28.4% inlet flow, 1.4% difference; coarctation pressure drop 30.0 vs. 33.5 mmHg, 10.4% difference), proving the reliability of the numerical approach. It was observed that substantial changes in fluid viscosity or using a turbulent model in the numerical simulations did not significantly affect flows and pressures of the investigated physiology. Results highlighted how the non-linear fluid dynamic phenomena occurring in vitro must be properly described to ensure satisfactory agreement. This study presents methodological considerations for using experimental data to preliminarily set up a computational model, and then simulate a complex congenital physiology using a multi-scale approach.

  10. Designers workbench: toward real-time immersive modeling

    NASA Astrophysics Data System (ADS)

    Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu

    2000-05-01

    This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology or 'digital gap' experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps of finite element analysis tasks applied to the created models.

  11. Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing.

    PubMed

    Shatil, Anwar S; Younas, Sohail; Pourreza, Hossein; Figley, Chase R

    2015-01-01

    With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications.

  12. Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing

    PubMed Central

    Shatil, Anwar S.; Younas, Sohail; Pourreza, Hossein; Figley, Chase R.

    2015-01-01

    With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications. PMID:27279746

  13. Systems Toxicology: Real World Applications and Opportunities.

    PubMed

    Hartung, Thomas; FitzGerald, Rex E; Jennings, Paul; Mirams, Gary R; Peitsch, Manuel C; Rostami-Hodjegan, Amin; Shah, Imran; Wilks, Martin F; Sturla, Shana J

    2017-04-17

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams ("big data"), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity.

  14. Systems Toxicology: Real World Applications and Opportunities

    PubMed Central

    2017-01-01

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams (“big data”), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity. PMID:28362102

  15. Rational design of liposomal drug delivery systems, a review: Combined experimental and computational studies of lipid membranes, liposomes and their PEGylation.

    PubMed

    Bunker, Alex; Magarkar, Aniket; Viitala, Tapani

    2016-10-01

    Combined experimental and computational studies of lipid membranes and liposomes, with the aim to attain mechanistic understanding, result in a synergy that makes possible the rational design of liposomal drug delivery system (LDS) based therapies. The LDS is the leading form of nanoscale drug delivery platform, an avenue in drug research, known as "nanomedicine", that holds the promise to transcend the current paradigm of drug development that has led to diminishing returns. Unfortunately this field of research has, so far, been far more successful in generating publications than new drug therapies. This partly results from the trial and error based methodologies used. We discuss experimental techniques capable of obtaining mechanistic insight into LDS structure and behavior. Insight obtained purely experimentally is, however, limited; computational modeling using molecular dynamics simulation can provide insight not otherwise available. We review computational research, that makes use of the multiscale modeling paradigm, simulating the phospholipid membrane with all atom resolution and the entire liposome with coarse grained models. We discuss in greater detail the computational modeling of liposome PEGylation. Overall, we wish to convey the power that lies in the combined use of experimental and computational methodologies; we hope to provide a roadmap for the rational design of LDS based therapies. Computational modeling is able to provide mechanistic insight that explains the context of experimental results and can also take the lead and inspire new directions for experimental research into LDS development. This article is part of a Special Issue entitled: Biosimulations edited by Ilpo Vattulainen and Tomasz Róg.

  16. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science

    PubMed Central

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-01-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, “Interdisciplinary Insights into Group and Team Dynamics,” which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges. PMID:29249891

  17. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    PubMed

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  18. Hybrid supply chain model for material requirement planning under financial constraints: A case study

    NASA Astrophysics Data System (ADS)

    Curci, Vita; Dassisti, Michele; Josefa, Mula Bru; Manuel, Díaz Madroñero

    2014-10-01

    Supply chain models (SCM) are potentially capable of integrating different aspects of decision-making support for enterprise management tasks. The aim of this paper is to propose a hybrid mathematical programming model for the optimization of production requirements resources planning. The preliminary model was conceived bottom-up from the analysis of a real industrial case, oriented to maximizing cash flow. Despite the intense computational effort required to converge to a solution, the optimization produced good results for the objective function.
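
    A toy version of such a financially constrained production-planning model can be written as a linear program in a few lines; the sketch below uses PuLP with invented demand, cost, and cash-limit figures that do not come from the paper's case study.

    ```python
    # Toy financially constrained MRP as a linear program, maximizing cash flow.
    import pulp

    periods = range(4)
    demand, price, unit_cost, cash_limit = [50, 80, 60, 90], 12.0, 7.0, 500.0

    m = pulp.LpProblem("mrp_cash_flow", pulp.LpMaximize)
    prod = pulp.LpVariable.dicts("prod", periods, lowBound=0)
    inv  = pulp.LpVariable.dicts("inv", periods, lowBound=0)

    # Objective: revenue from met demand minus production spending (cash flow)
    m += pulp.lpSum(price * demand[t] - unit_cost * prod[t] for t in periods)
    for t in periods:
        prev = inv[t - 1] if t > 0 else 0
        m += prev + prod[t] - demand[t] == inv[t]    # inventory balance
        m += unit_cost * prod[t] <= cash_limit       # per-period financial constraint

    m.solve(pulp.PULP_CBC_CMD(msg=False))
    print([prod[t].value() for t in periods], pulp.value(m.objective))
    ```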

  19. Toward a Model for Intercultural Communication in Simulations

    ERIC Educational Resources Information Center

    Wiggins, Bradley E.

    2012-01-01

    The growing need for intercultural literacy in an increasingly interconnected and computer-mediated world contrasts with the dearth of investigation in best practices when designing simulations aimed at improving intercultural communication. Synthetic cultures inspired by real-world cultural traits, problem-based learning, and a social…

  20. A Model for Training Range Planning Data.

    DTIC Science & Technology

    1984-04-01

    Thermal Imaging System: for day and night target acquisition and aiming; Digital Ballistic Computer. [Remainder of record: OCR-damaged distribution list.]

  1. Principles and Overview of Sampling Methods for Modeling Macromolecular Structure and Dynamics

    PubMed Central

    Moffatt, Ryan; Ma, Buyong; Nussinov, Ruth

    2016-01-01

    Investigation of macromolecular structure and dynamics is fundamental to understanding how macromolecules carry out their functions in the cell. Significant advances have been made toward this end in silico, with a growing number of computational methods proposed yearly to study and simulate various aspects of macromolecular structure and dynamics. This review aims to provide an overview of recent advances, focusing primarily on methods proposed for exploring the structure space of macromolecules in isolation and in assemblies for the purpose of characterizing equilibrium structure and dynamics. In addition to surveying recent applications that showcase current capabilities of computational methods, this review highlights state-of-the-art algorithmic techniques proposed to overcome challenges posed in silico by the disparate spatial and time scales accessed by dynamic macromolecules. This review is not meant to be exhaustive, as such an endeavor is impossible, but rather aims to balance breadth and depth of strategies for modeling macromolecular structure and dynamics for a broad audience of novices and experts. PMID:27124275

  2. Principles and Overview of Sampling Methods for Modeling Macromolecular Structure and Dynamics.

    PubMed

    Maximova, Tatiana; Moffatt, Ryan; Ma, Buyong; Nussinov, Ruth; Shehu, Amarda

    2016-04-01

    Investigation of macromolecular structure and dynamics is fundamental to understanding how macromolecules carry out their functions in the cell. Significant advances have been made toward this end in silico, with a growing number of computational methods proposed yearly to study and simulate various aspects of macromolecular structure and dynamics. This review aims to provide an overview of recent advances, focusing primarily on methods proposed for exploring the structure space of macromolecules in isolation and in assemblies for the purpose of characterizing equilibrium structure and dynamics. In addition to surveying recent applications that showcase current capabilities of computational methods, this review highlights state-of-the-art algorithmic techniques proposed to overcome challenges posed in silico by the disparate spatial and time scales accessed by dynamic macromolecules. This review is not meant to be exhaustive, as such an endeavor is impossible, but rather aims to balance breadth and depth of strategies for modeling macromolecular structure and dynamics for a broad audience of novices and experts.

  3. Partially-Averaged Navier-Stokes (PANS) approach for study of fluid flow and heat transfer characteristics in Czochralski melt

    NASA Astrophysics Data System (ADS)

    Verma, Sudeep; Dewan, Anupam

    2018-01-01

    The Partially-Averaged Navier-Stokes (PANS) approach is applied for the first time to model turbulent flow and heat transfer in an idealized Czochralski setup with realistic boundary conditions. This method provides a variable level of resolution, ranging from Reynolds-Averaged Navier-Stokes (RANS) modelling to Direct Numerical Simulation (DNS), controlled by the filter parameter. For the present case, a low-Re PANS model has been developed for the Czochralski melt flow, which includes the effects of Coriolis, centrifugal, buoyancy and surface-tension-induced forces. The aim of the present study is to assess the improvement in results on switching from the unsteady RANS (URANS) approach to PANS modelling on the same computational mesh. The PANS results were found to be in good agreement with the reported experimental, DNS and Large Eddy Simulation (LES) data. A clear improvement in computational accuracy is observed in switching from the URANS approach to the PANS methodology, and the computed results improve further as the PANS filter width is reduced. The capability of the PANS model to capture key characteristics of Czochralski crystal growth is also highlighted: the model was able to resolve the three-dimensional turbulent nature of the melt, the characteristic flow structures arising from flow instabilities, and the generation of thermal plumes and vortices in the Czochralski melt.
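
    For reference, in Girimaji's PANS formulation (the abstract itself does not spell this out), the filter control parameters are the unresolved-to-total ratios of kinetic energy and dissipation, and they enter the unresolved-dissipation equation through a modified coefficient:

    ```latex
    % PANS filter parameters and the modified dissipation coefficient
    f_k = \frac{k_u}{k}, \qquad
    f_\varepsilon = \frac{\varepsilon_u}{\varepsilon}, \qquad
    C_{\varepsilon 2}^{*} = C_{\varepsilon 1}
        + \frac{f_k}{f_\varepsilon}\left(C_{\varepsilon 2} - C_{\varepsilon 1}\right)
    ```

    Setting f_k = 1 recovers the parent URANS model, while reducing f_k (and hence the filter width) resolves a larger fraction of the turbulent scales, consistent with the trend reported above.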

  4. Rapid prototyping modelling in oral and maxillofacial surgery: A two year retrospective study

    PubMed Central

    Stoor, Patricia; Mesimäki, Karri; Kontio, Risto K.

    2015-01-01

    Background The use of rapid prototyping (RP) models in medicine to construct bony models is increasing. Material and Methods The aim of the study was to evaluate retrospectively the indications for the use of RP models in oral and maxillofacial surgery at Helsinki University Central Hospital during 2009-2010. The computed tomography (CT) examination method used, multislice CT (MSCT) or cone beam CT (CBCT), was also evaluated. Results In total, 114 RP models were fabricated for 102 patients. The mean age of the patients at the time of production of the model was 50.4 years. The indications for the modelling included malignant lesions (29%), secondary reconstruction (25%), prosthodontic treatment (22%), orthognathic surgery or asymmetry (13%), benign lesions (8%), and TMJ disorders (4%). MSCT examination was used in 92 cases and CBCT examination in 22 cases. Most of the models (75%) were conventional hard tissue models; models with a colored tumour or other structure(s) of interest were ordered in 24% of cases, and two of the 114 models were soft tissue models. Conclusions The main benefit of the models was in treatment planning and in connection with the production of pre-bent plates or custom-made implants. RP models both facilitate and improve treatment planning and intraoperative efficiency. Key words: Rapid prototyping, radiology, computed tomography, cone beam computed tomography. PMID:26644837

  5. Computational helioseismology in the frequency domain: acoustic waves in axisymmetric solar models with flows

    NASA Astrophysics Data System (ADS)

    Gizon, Laurent; Barucq, Hélène; Duruflé, Marc; Hanson, Chris S.; Leguèbe, Michael; Birch, Aaron C.; Chabassier, Juliette; Fournier, Damien; Hohage, Thorsten; Papini, Emanuele

    2017-04-01

    Context. Local helioseismology has so far relied on semi-analytical methods to compute the spatial sensitivity of wave travel times to perturbations in the solar interior. These methods are cumbersome and lack flexibility. Aims: Here we propose a convenient framework for numerically solving the forward problem of time-distance helioseismology in the frequency domain. The fundamental quantity to be computed is the cross-covariance of the seismic wavefield. Methods: We choose sources of wave excitation that enable us to relate the cross-covariance of the oscillations to the Green's function in a straightforward manner. We illustrate the method by considering the 3D acoustic wave equation in an axisymmetric reference solar model, ignoring the effects of gravity on the waves. The symmetry of the background model around the rotation axis implies that the Green's function can be written as a sum of longitudinal Fourier modes, leading to a set of independent 2D problems. We use a high-order finite-element method to solve the 2D wave equation in frequency space. The computation is embarrassingly parallel, with each frequency and each azimuthal order solved independently on a computer cluster. Results: We compute travel-time sensitivity kernels in spherical geometry for flows, sound speed, and density perturbations under the first Born approximation. Convergence tests show that travel times can be computed with a numerical precision better than one millisecond, as required by the most precise travel-time measurements. Conclusions: The method presented here is computationally efficient and will be used to interpret travel-time measurements in order to infer, e.g., the large-scale meridional flow in the solar convection zone. It allows the implementation of (full-waveform) iterative inversions, whereby the axisymmetric background model is updated at each iteration.
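
    The decomposition that reduces the 3D problem to a set of independent 2D problems can be written explicitly; in the notation below (an assumption, since the abstract gives no symbols), G is the Green's function at angular frequency omega and m is the azimuthal order:

    ```latex
    % Azimuthal Fourier decomposition of the Green's function in an
    % axisymmetric background; each order m yields an independent 2D
    % problem in the meridional plane (r, theta).
    G(r,\theta,\phi;\omega) = \sum_{m=-\infty}^{+\infty}
        G_m(r,\theta;\omega)\, e^{\mathrm{i} m \phi}
    ```

    Because the modes decouple, each (omega, m) pair can be solved on a separate node, which is what makes the computation embarrassingly parallel as stated above.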

  6. An embedded multi-core parallel model for real-time stereo imaging

    NASA Astrophysics Data System (ADS)

    He, Wenjing; Hu, Jian; Niu, Jingyu; Li, Chuanrong; Liu, Guangyu

    2018-04-01

    Real-time processing based on embedded systems will enhance the application capability of stereo imaging for LiDAR and hyperspectral sensors. Research on task partitioning and scheduling strategies for embedded multiprocessor systems started relatively late compared with that for PC platforms. In this paper, a parallel model for stereo imaging on an embedded multi-core processing platform is studied and verified. After analyzing the computational load, throughput capacity, and buffering requirements, a two-stage pipeline parallel model based on message transmission is established. This model can be applied to fast stereo imaging for airborne sensors with various characteristics. To demonstrate the feasibility and effectiveness of the parallel model, parallel software was designed using test flight data, based on the 8-core DSP processor TMS320C6678. The results indicate that the design performed well in workload distribution and achieved a speed-up ratio of up to 6.4.
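    As a rough illustration of a two-stage pipeline parallel model based on message transmission, the following Python sketch wires two processing stages together with bounded message queues; the per-stage workloads are toy stand-ins, and the real system ran on an 8-core TMS320C6678 DSP rather than Python threads.

    ```python
    # Minimal two-stage pipeline: stage 1 preprocesses frames, stage 2
    # consumes them; bounded queues model the message-passing channels.
    import queue
    import threading

    SENTINEL = None  # end-of-stream marker

    def stage1(inbox: queue.Queue, outbox: queue.Queue):
        """Stage 1: preprocess each incoming frame and forward it."""
        while True:
            frame = inbox.get()
            if frame is SENTINEL:
                outbox.put(SENTINEL)
                return
            outbox.put([2 * x for x in frame])  # stand-in for geometric correction

    def stage2(inbox: queue.Queue, results: list):
        """Stage 2: reduce each preprocessed frame to an output value."""
        while True:
            frame = inbox.get()
            if frame is SENTINEL:
                return
            results.append(sum(frame))          # stand-in for stereo matching

    if __name__ == "__main__":
        q01, q12, out = queue.Queue(maxsize=4), queue.Queue(maxsize=4), []
        t1 = threading.Thread(target=stage1, args=(q01, q12))
        t2 = threading.Thread(target=stage2, args=(q12, out))
        t1.start(); t2.start()
        for i in range(10):                     # feed ten toy "frames"
            q01.put([i, i + 1, i + 2])
        q01.put(SENTINEL)
        t1.join(); t2.join()
        print(out)
    ```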

  7. Support Vector Machines Model of Computed Tomography for Assessing Lymph Node Metastasis in Esophageal Cancer with Neoadjuvant Chemotherapy.

    PubMed

    Wang, Zhi-Long; Zhou, Zhi-Guo; Chen, Ying; Li, Xiao-Ting; Sun, Ying-Shi

    The aim of this study was to diagnose lymph node metastasis of esophageal cancer using a support vector machine model based on computed tomography (CT). A total of 131 esophageal cancer patients treated with preoperative chemotherapy and radical surgery were included. Various indicators (tumor thickness, tumor length, tumor CT value, total number of lymph nodes, and long and short axis sizes of the largest lymph node) on CT images before and after neoadjuvant chemotherapy were recorded. A support vector machine model based on these CT indicators was built to predict lymph node metastasis. The support vector machine model diagnosed lymph node metastasis better than the preoperative short axis size of the largest lymph node on CT: the areas under the receiver operating characteristic curves were 0.887 and 0.705, respectively. The support vector machine model of CT images can help diagnose lymph node metastasis in esophageal cancer treated with preoperative chemotherapy.
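    A hedged sketch of this kind of classifier is shown below, using scikit-learn's SVC on synthetic stand-ins for the six CT indicators; the data, labels, and hyperparameters are illustrative, not the study's.

    ```python
    # Toy SVM classifier on six synthetic "CT indicator" features,
    # evaluated with the area under the ROC curve as in the abstract.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(131, 6))  # 131 patients, 6 CT indicators (synthetic)
    y = (X[:, 5] + 0.5 * rng.normal(size=131) > 0).astype(int)  # toy label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"AUC on held-out patients: {auc:.3f}")
    ```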

  8. Computer Modeling of Violent Intent: A Content Analysis Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent in order to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, which have proved to serve as effective channels in furthering sociopolitical change.

  9. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation

    PubMed Central

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840
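    Among the non-intrusive propagation methods reviewed, plain Monte Carlo sampling is the simplest: treat the finite element model as a black box and re-evaluate it for each sampled input. A minimal sketch follows, with a toy beam-deflection formula standing in for an FE solve; all distributions and values are assumptions.

    ```python
    # Non-intrusive Monte Carlo uncertainty propagation: sample uncertain
    # inputs, re-run the black-box model, summarize the output distribution.
    import numpy as np

    def model(E, load):
        """Black-box stand-in for an FE simulation: beam tip deflection."""
        L, I = 1.0, 1e-8                 # fixed geometry (assumption)
        return load * L**3 / (3.0 * E * I)

    rng = np.random.default_rng(42)
    n = 10_000
    E_samples = rng.normal(70e9, 5e9, n)        # uncertain Young's modulus
    load_samples = rng.normal(100.0, 10.0, n)   # uncertain applied load

    deflections = model(E_samples, load_samples)  # vectorized re-evaluation
    print(f"mean = {deflections.mean():.4e} m, std = {deflections.std():.4e} m")
    print("95% interval:", np.percentile(deflections, [2.5, 97.5]))
    ```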

  10. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    PubMed

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.

  11. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models, from desktop to high performance computers.

  12. Modeling the Relationship between Transportation-Related Carbon Dioxide Emissions and Hybrid-Online Courses at a Large Urban University

    ERIC Educational Resources Information Center

    Little, Matthew; Cordero, Eugene

    2014-01-01

    Purpose: This paper aims to investigate the relationship between hybrid classes (where a per cent of the class meetings are online) and transportation-related CO[subscript 2] emissions at a commuter campus similar to San José State University (SJSU). Design/methodology/approach: A computer model was developed to calculate the number of trips to…

  13. Color appearance for photorealistic image synthesis

    NASA Astrophysics Data System (ADS)

    Marini, Daniele; Rizzi, Alessandro; Rossi, Maurizio

    2000-12-01

    Photorealistic image synthesis is a relevant research and application field in computer graphics, whose aim is to produce synthetic images that are indistinguishable from real ones. Photorealism is based upon accurate computational models of light-material interaction, which allow us to compute the spectral intensity light field of a geometrically described scene. The fundamental methods are ray tracing and radiosity. While radiosity allows us to compute the diffuse component of the emitted and reflected light, applying ray tracing in a two-pass solution lets us also cope with the non-diffuse properties of the model surfaces. Both methods can be implemented to generate an accurate photometric distribution of light in the simulated environment. A still open problem is the visualization phase, whose purpose is to display the final result of the simulated model on a monitor screen or on printed paper. The tone reproduction problem consists of finding the best way to compress the extended dynamic range of the computed light field into the limited range of displayable colors. Recently some scholars have addressed this problem by considering the perception stage of image formation, thereby including a model of the human visual system in the visualization process. In this paper we present a working hypothesis for solving the tone reproduction problem in synthetic image generation, integrating the Retinex perception model into the photorealistic image synthesis context.
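    For concreteness, the sketch below applies one standard global tone-mapping operator (a Reinhard-style compression toward middle grey) to a toy high-dynamic-range luminance field; this is a common baseline for the tone reproduction problem, not the Retinex-based method the authors integrate.

    ```python
    # Global tone mapping: scale scene luminance by its log-average,
    # then compress highlights into the displayable [0, 1] range.
    import numpy as np

    def tone_map(luminance: np.ndarray, key: float = 0.18) -> np.ndarray:
        eps = 1e-6
        log_avg = np.exp(np.mean(np.log(luminance + eps)))  # scene log-average
        scaled = key * luminance / log_avg                  # map to middle grey
        return scaled / (1.0 + scaled)                      # compress highlights

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        hdr = rng.lognormal(mean=0.0, sigma=2.0, size=(4, 4))  # toy HDR field
        print(tone_map(hdr))
    ```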

  14. Welcome to health information science and systems.

    PubMed

    Zhang, Yanchun

    2013-01-01

    Health Information Science and Systems is an exciting, new, multidisciplinary journal that aims to use computer-science technologies to assist in disease diagnosis, treatment, prediction and monitoring through the modeling, design, development, visualization, integration and management of health-related information. These computer-science technologies include information systems, web technologies, data mining, image processing, user interaction and interfaces, and sensors and wireless networking, and are applicable to a wide range of health-related information, including medical data, biomedical data, bioinformatics data and public health data.

  15. Real Time Text Analysis

    NASA Astrophysics Data System (ADS)

    Senthilkumar, K.; Ruchika Mehra Vijayan, E.

    2017-11-01

    This paper aims to illustrate real-time analysis of large-scale data. For practical implementation, we perform sentiment analysis on live Twitter feeds for each individual tweet. To analyze sentiments, we train our data model on SentiWordNet, a polarity-annotated WordNet sample from Princeton University. Our main objective is to efficiently analyze large-scale data on the fly using distributed computation. The Apache Spark and Apache Hadoop ecosystem is used as the distributed computation platform, with Java as the development language.

  16. Design of Linear Accelerator (LINAC) tanks for proton therapy via Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellano, T.; De Palma, L.; Laneve, D.

    2015-07-01

    A homemade computer code for designing a Side-Coupled Linear Accelerator (SCL) is presented. It integrates a simplified model of SCL tanks with the Particle Swarm Optimization (PSO) algorithm. The main aim of the computer code is to obtain useful guidelines for the design of Linear Accelerator (LINAC) resonant cavities. The design procedure assisted by this approach seems very promising, allowing future improvements towards the optimization of actual accelerating geometries. (authors)
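    A minimal PSO loop of the kind such a code couples to a cavity model might look like the sketch below; the objective function is a toy stand-in for the SCL tank figure of merit, and all coefficients are generic textbook values.

    ```python
    # Textbook particle swarm optimization: each particle tracks its
    # personal best, and the swarm shares a global best position.
    import numpy as np

    def objective(x):
        return np.sum(x**2, axis=1)      # stand-in for the cavity design merit

    rng = np.random.default_rng(0)
    n_particles, dim, iters = 30, 4, 100
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration weights

    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), objective(x)
    gbest = pbest[np.argmin(pbest_val)]

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = objective(x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)]

    print("best design parameters:", gbest, "merit:", pbest_val.min())
    ```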

  17. Clinical Pilot Study and Computational Modeling of Bitemporal Transcranial Direct Current Stimulation, and Safety of Repeated Courses of Treatment, in Major Depression.

    PubMed

    Ho, Kerrie-Anne; Bai, Siwei; Martin, Donel; Alonzo, Angelo; Dokos, Socrates; Loo, Colleen K

    2015-12-01

    This study aimed to examine a bitemporal (BT) transcranial direct current stimulation (tDCS) electrode montage for the treatment of depression through a clinical pilot study and computational modeling. The safety of repeated courses of stimulation was also examined. Four participants with depression who had previously received multiple courses of tDCS received a 4-week course of BT tDCS. Mood and neuropsychological function were assessed. The results were compared with previous courses of tDCS given to the same participants using different electrode montages. Computational modeling examined the electric field maps produced by the different montages. Three participants showed clinical improvement with BT tDCS (mean [SD] improvement, 49.6% [33.7%]). There were no adverse neuropsychological effects. Computational modeling showed that the BT montage activates the anterior cingulate cortices and brainstem, deep brain regions that are important in depression. However, a fronto-extracephalic montage stimulated these areas more effectively. No adverse effects were found in participants receiving up to six courses of tDCS. Bitemporal tDCS was safe and led to clinically meaningful efficacy in three of four participants. However, computational modeling suggests that the BT montage may not activate key brain regions in depression more effectively than another novel montage, fronto-extracephalic tDCS. There is also preliminary evidence to support the safety of up to six repeated courses of tDCS.

  18. Reliably Discriminating Stock Structure with Genetic Markers:Mixture Models with Robust and Fast Computation.

    PubMed

    Foster, Scott D; Feutry, Pierre; Grewe, Peter M; Berry, Oliver; Hui, Francis K C; Davies, Campbell R

    2018-06-26

    Delineating naturally occurring and self-sustaining sub-populations (stocks) of a species is an important task, especially for species harvested from the wild. Despite its central importance to natural resource management, the analytical methods used to delineate stocks are often, and increasingly, borrowed from superficially similar analytical tasks in human genetics, even though models specifically for stock identification have been developed previously. Unfortunately, the analytical tasks in resource management and human genetics are not identical: questions about humans are typically aimed at inferring ancestry (often referred to as 'admixture') rather than breeding stocks. In this article, we argue, and show through simulation experiments and an analysis of yellowfin tuna data, that ancestral analysis methods are not always appropriate for stock delineation. We advocate a variant of a previously introduced and simpler model that identifies stocks directly. We also highlight that the computational aspects of the analysis, irrespective of the model, are difficult. We introduce some alternative computational methods, quantitatively compare them to each other and to established methods, and present a method for quantifying uncertainty in model parameters and in assignment probabilities. In doing so, we demonstrate that point estimates can be misleading. One of the computational strategies presented here, based on an expectation-maximisation algorithm with judiciously chosen starting values, is robust and has a modest computational cost.
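    The robustness strategy highlighted above, EM with carefully chosen (here, simply multiple random) starting values, can be sketched as follows with a Gaussian mixture standing in for the genetic stock-identification model; the data and the choice of scikit-learn's GaussianMixture are illustrative assumptions.

    ```python
    # EM fit of a finite mixture with several starting points, keeping the
    # highest-likelihood solution and reporting soft assignment probabilities.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)
    # two "stocks" with different marker means (synthetic)
    data = np.vstack([rng.normal(-2.0, 1.0, (200, 5)),
                      rng.normal(+2.0, 1.0, (150, 5))])

    best = None
    for seed in range(10):                   # ten different EM starting points
        gm = GaussianMixture(n_components=2, init_params="random",
                             random_state=seed, max_iter=200).fit(data)
        if best is None or gm.lower_bound_ > best.lower_bound_:
            best = gm                        # keep the highest-likelihood fit

    probs = best.predict_proba(data)         # soft assignment probabilities
    print("log-likelihood bound:", best.lower_bound_)
    print("first five assignment probabilities:\n", probs[:5].round(3))
    ```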

  19. A Computational Framework for Bioimaging Simulation.

    PubMed

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between cell models and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
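    The core forward-simulation idea, degrading a known ground truth with the optics and photon statistics of the instrument, can be illustrated with the toy sketch below; the Gaussian point-spread function, background level, and image content are assumptions, not the authors' physics models.

    ```python
    # Forward-simulate a synthetic fluorescence image: blur a "true"
    # fluorophore map with a PSF, then apply Poisson photon-counting noise.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(7)
    truth = np.zeros((64, 64))
    truth[20:25, 30:35] = 50.0                 # a bright fluorescent cluster
    truth[40, 10] = 200.0                      # a single bright molecule

    optics = gaussian_filter(truth, sigma=2.0)  # stand-in for the microscope PSF
    image = rng.poisson(optics + 0.5)           # shot noise + uniform background

    print("expected photons:", optics.sum().round(1), "| detected:", image.sum())
    ```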

  20. Kinetic barriers in the isomerization of substituted ureas: implications for computer-aided drug design.

    PubMed

    Loeffler, Johannes R; Ehmki, Emanuel S R; Fuchs, Julian E; Liedl, Klaus R

    2016-05-01

    Urea derivatives are ubiquitously found in many chemical disciplines. N,N'-substituted ureas may show different conformational preferences depending on their substitution pattern. The high energetic barrier for isomerization between the cis and trans states poses additional challenges for computational simulation techniques aiming at reproducing the biological properties of urea derivatives. Herein, we investigate the energetics of urea conformations and their interconversion using a broad spectrum of methodologies, ranging from data mining, via quantum chemistry, to molecular dynamics simulation and free energy calculations. We find that the inversion of urea conformations is inherently slow and beyond the time scale of typical simulation protocols. Therefore, extra care needs to be taken by computational chemists to work with appropriate model systems. We find that both knowledge-driven approaches and physics-based methods may guide molecular modelers towards accurate starting structures for expensive calculations, to ensure that conformations of urea derivatives are modeled as adequately as possible.

  1. Proactive action preparation: seeing action preparation as a continuous and proactive process.

    PubMed

    Pezzulo, Giovanni; Ognibene, Dimitri

    2012-07-01

    In this paper, we aim to elucidate the processes that occur during action preparation from both a conceptual and a computational point of view. We first introduce the traditional, serial model of goal-directed action and discuss from a computational viewpoint its subprocesses occurring during the two phases of covert action preparation and overt motor control. Then, we discuss recent evidence indicating that these subprocesses are highly intertwined at representational and neural levels, which undermines the validity of the serial model and points instead to a parallel model of action specification and selection. Within the parallel view, we analyze the case of delayed choice, arguing that action preparation can be proactive, and preparatory processes can take place even before decisions are made. Specifically, we discuss how prior knowledge and prospective abilities can be used to maximize utility even before deciding what to do. To support our view, we present a computational implementation of (an approximated version of) proactive action preparation, showing its advantages in a simulated tennis-like scenario.

  2. Optimization of Computational Performance and Accuracy in 3-D Transient CFD Model for CFB Hydrodynamics Predictions

    NASA Astrophysics Data System (ADS)

    Rampidis, I.; Nikolopoulos, A.; Koukouzas, N.; Grammelis, P.; Kakaras, E.

    2007-09-01

    This work presents a pure 3-D CFD model, accurate and efficient, for simulating the hydrodynamics of a pilot-scale CFB. The accuracy of the model was investigated as a function of the numerical parameters in order to derive an optimum model setup with respect to computational cost. The need for in-depth examination of hydrodynamics emerges from the trend to scale up CFBCs. This scale-up brings forward numerous design problems and uncertainties, which can be successfully elucidated by CFD techniques. Deriving guidelines for setting up a computationally efficient model is important as the scale of CFBs grows fast while computational power is limited. However, the question of optimum efficiency has not been investigated thoroughly in the literature, as authors have been more concerned with their models' accuracy and validity. The objective of this work is to investigate the parameters that influence the efficiency and accuracy of CFB computational fluid dynamics models, find the optimum set of these parameters, and thus establish the technique as a competitive method for the simulation and design of industrial, large-scale beds, where the computational cost is otherwise prohibitive. In the tests performed in this work, the influence of the turbulence modeling approach, temporal and spatial resolution, and discretization schemes was investigated on a 1.2 MWth CFB test rig. Using Fourier analysis, dominant frequencies were extracted in order to estimate an adequate time period for averaging all instantaneous values. The agreement with the experimental measurements was very good. The basic differences between the predictions arising from the various model setups were pointed out and analyzed. The results showed that a model with high-order space discretization schemes applied on a coarse grid, with averaging of the instantaneous scalar values over a 20 s period, adequately described the transient hydrodynamic behaviour of a pilot CFB while keeping the computational cost low. Flow patterns inside the bed, such as the core-annulus flow and the transport of clusters, were at least qualitatively captured.
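    The Fourier-analysis step mentioned above can be sketched as follows: take a fluctuating monitor signal, find its dominant frequency, and size the averaging window to cover several dominant periods. The signal, time step, and five-period rule here are illustrative assumptions.

    ```python
    # Extract the dominant frequency of a fluctuating monitor signal
    # (e.g., pressure at a point in the bed) via an FFT.
    import numpy as np

    dt = 0.01                                  # solver time step, s (assumption)
    t = np.arange(0.0, 40.0, dt)
    rng = np.random.default_rng(0)
    signal = np.sin(2 * np.pi * 0.9 * t) + 0.3 * rng.standard_normal(t.size)

    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(t.size, d=dt)
    f_dom = freqs[np.argmax(spectrum)]
    print(f"dominant frequency: {f_dom:.2f} Hz "
          f"-> average over >= {5.0 / f_dom:.1f} s (several dominant periods)")
    ```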

  3. Influence of ionization on the Gupta and on the Park chemical models

    NASA Astrophysics Data System (ADS)

    Morsa, Luigi; Zuppardi, Gennaro

    2014-12-01

    This study is an extension of former works by the authors, in which the influence of the chemical models by Gupta and by Park on thermo-fluid-dynamic parameters in the flow field, including transport coefficients, related characteristic numbers, and the heat flux on two current capsules (EXPERT and Orion) during the high-altitude re-entry path, was evaluated. The results verified that the models, even though they compute different air compositions in the flow field, compute only slightly different compositions on the capsule surface; therefore, the difference in the heat flux is not very relevant. In the above-mentioned studies, ionization was neglected because the velocities of the capsules (about 5000 m/s for EXPERT and about 7600 m/s for Orion) were not high enough to produce meaningful ionization. The aim of the present work is to evaluate the influence of ionization, within the chemical models by Gupta and by Park, on both the heat flux and the thermo-fluid-dynamic parameters. The computer tests were carried out with a direct simulation Monte Carlo code (DS2V) in the velocity interval 7600-12000 m/s, considering only the Orion capsule at an altitude of 85 km. The results confirmed the earlier findings: when ionization is not considered, the chemical models compute only slightly different gas compositions in the core of the shock wave and practically the same composition on the surface, and therefore the same heat flux. By contrast, when ionization is considered, the chemical models compute different compositions in the whole shock layer and on the surface, and therefore different heat fluxes. The analysis relies on both a qualitative and a quantitative evaluation of the effects of ionization on the two chemical models. The main result of the study is that when ionization is taken into account, the Park model is more reactive than the Gupta model; consequently, the heat flux computed with Park is lower than that computed with Gupta. Using the Gupta model in the design of a thermal protection system is therefore recommended.

  4. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    PubMed

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which differed only with respect to the candidate predictor variables: (i) a full set of predictors, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predictors, including only the self-reported factors and software-recorded computer usage patterns, which are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) errors and, for the practical models only, exposure classification agreement across low-, medium-, and high-exposure categories. The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were not that different for the two models, indicating that the full models performed better only for some physical exposures. Relative RMS errors ranged between 5% and 19% for the full models, and between 10% and 19% for the practical models. When the predicted physical exposures were classified into low, medium, and high, classification agreement ranged from 26% to 71%. The full prediction models, based on self-reported factors, software-recorded computer usage patterns, and additional measurements of anthropometrics and workstation set-up, showed better predictive quality than the practical models based on self-reported factors and recorded computer usage patterns only. However, predictive quality varied largely across different arm-wrist-hand exposure parameters. Future exploration of the relation between predicted physical exposure and symptoms is therefore only recommended for physical exposures that can be reasonably well predicted. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
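    A schematic of the prediction-model idea, with synthetic stand-ins for the questionnaire and software-recorded predictors, is sketched below; the linear model and reported metrics are illustrative, not the study's actual modelling pipeline.

    ```python
    # Predict a physical exposure parameter from easy-to-collect predictors
    # and report R2 and RMS error on a held-out set.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score, mean_squared_error

    rng = np.random.default_rng(5)
    n = 117                                  # workers in the field study
    X = rng.normal(size=(n, 4))              # e.g., hours of mouse use, BMI, ...
    exposure = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(0, 1.5, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, exposure, random_state=0)
    pred = LinearRegression().fit(X_tr, y_tr).predict(X_te)
    print(f"R2 = {r2_score(y_te, pred):.2f}, "
          f"RMSE = {mean_squared_error(y_te, pred) ** 0.5:.2f}")
    ```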

  5. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high consequence decision-making.

  6. Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  7. Final Report Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  8. BES-III distributed computing status

    NASA Astrophysics Data System (ADS)

    Belov, S. D.; Deng, Z. Y.; Korenkov, V. V.; Li, W. D.; Lin, T.; Ma, Z. T.; Nicholson, C.; Pelevanyuk, I. S.; Suo, B.; Trofimov, V. V.; Tsaregorodtsev, A. U.; Uzhinskiy, A. V.; Yan, T.; Yan, X. F.; Zhang, X. M.; Zhemchugov, A. S.

    2016-09-01

    The BES-III experiment at the Institute of High Energy Physics (Beijing, China) is aimed at precision measurements in e+e- annihilation in the energy range from 2.0 to 4.6 GeV. The world's largest samples of J/psi and psi' events and unique samples of XYZ data have already been collected. The expected increase of the data volume in the coming years requires a significant evolution of the computing model, namely a shift from centralized data processing to a distributed one. This report summarizes the current design of the BES-III distributed computing system, some key decisions, and the experience gained during two years of operation.

  9. Applying systems biology methods to the study of human physiology in extreme environments

    PubMed Central

    2013-01-01

    Systems biology is defined in this review as ‘an iterative process of computational model building and experimental model revision with the aim of understanding or simulating complex biological systems’. We propose that, in practice, systems biology rests on three pillars: computation, the omics disciplines and repeated experimental perturbation of the system of interest. The number of ethical and physiologically relevant perturbations that can be used in experiments on healthy humans is extremely limited and principally comprises exercise, nutrition, infusions (e.g. Intralipid), some drugs and altered environment. Thus, we argue that systems biology and environmental physiology are natural symbionts for those interested in a system-level understanding of human biology. However, despite excellent progress in high-altitude genetics and several proteomics studies, systems biology research into human adaptation to extreme environments is in its infancy. A brief description and overview of systems biology in its current guise is given, followed by a mini review of computational methods used for modelling biological systems. Special attention is given to high-altitude research, metabolic network reconstruction and constraint-based modelling. PMID:23849719

  10. An overview of bioinformatics methods for modeling biological pathways in yeast

    PubMed Central

    Hou, Jie; Acharya, Lipi; Zhu, Dongxiao

    2016-01-01

    The advent of high-throughput genomics techniques, along with the completion of genome sequencing projects, identification of protein–protein interactions and reconstruction of genome-scale pathways, has accelerated the development of systems biology research in the yeast organism Saccharomyces cerevisiae. In particular, discovery of biological pathways in yeast has become an important forefront in systems biology, which aims to understand the interactions among molecules within a cell leading to certain cellular processes in response to a specific environment. While the existing theoretical and experimental approaches enable the investigation of well-known pathways involved in metabolism, gene regulation and signal transduction, bioinformatics methods offer new insights into computational modeling of biological pathways. A wide range of computational approaches has been proposed in the past for reconstructing biological pathways from high-throughput datasets. Here we review selected bioinformatics approaches for modeling biological pathways in S. cerevisiae, including metabolic pathways, gene-regulatory pathways and signaling pathways. We start with reviewing the research on biological pathways followed by discussing key biological databases. In addition, several representative computational approaches for modeling biological pathways in yeast are discussed. PMID:26476430

  11. Development of a High Resolution 3D Infant Stomach Model for Surgical Planning

    NASA Astrophysics Data System (ADS)

    Chaudry, Qaiser; Raza, S. Hussain; Lee, Jeonggyu; Xu, Yan; Wulkan, Mark; Wang, May D.

    Medical surgical procedures have not changed much during the past century due to the lack of an accurate, low-cost workbench for testing new improvements. Increasingly cheap and powerful computer technologies have made computer-based surgery planning and training feasible. In this work, we have developed an accurate 3D stomach model, which aims to improve the surgical procedure that treats infant pediatric and neonatal gastro-esophageal reflux disease (GERD). We generate the 3D infant stomach model based on in vivo computed tomography (CT) scans of an infant. CT is a widely used clinical imaging modality that is cheap but has low spatial resolution. To improve the model's accuracy, we use the high-resolution Visible Human Project (VHP) in model building. Next, we add soft muscle material properties to make the 3D model deformable. We then use virtual reality techniques such as haptic devices to make the 3D stomach model deform in response to touching force. This accurate 3D stomach model provides a workbench for testing new GERD treatment surgical procedures. It has the potential to reduce or eliminate the extensive cost associated with animal testing when improving any surgical procedure and, ultimately, to reduce the risk associated with infant GERD surgery.

  12. Computational modeling of radiobiological effects in bone metastases for different radionuclides.

    PubMed

    Liberal, Francisco D C Guerra; Tavares, Adriana Alexandre S; Tavares, João Manuel R S

    2017-06-01

    Computational simulation is a simple and practical way to study and compare a variety of radioisotopes for different medical applications, including the palliative treatment of bone metastases. This study aimed to evaluate and compare the cellular effects modelled for different radioisotopes currently in use or under research for the treatment of bone metastases. Computational models were used to estimate the radiation-induced cellular effects (Virtual Cell Radiobiology algorithm) post-irradiation with selected particles emitted by Strontium-89 (89Sr), Samarium-153 (153Sm), Lutetium-177 (177Lu), and Radium-223 (223Ra). Cellular kinetics post-irradiation with 89Sr, 153Sm and 177Lu beta-minus particles and 223Ra alpha particles showed that the cell response was dose- and radionuclide-dependent. 177Lu beta-minus particles and, in particular, 223Ra alpha particles yielded the lowest survival fraction of all investigated particles. 223Ra alpha particles induced the highest cell death of all investigated particles on metastatic prostate cells, in comparison to irradiation with beta-minus radionuclides, two of the most frequently used radionuclides in the palliative treatment of bone metastases in routine clinical practice. Moreover, the data obtained suggest that the computational methods used might provide insight into cellular effects following irradiation with different radionuclides.
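    The standard way to express dose- and radiation-quality-dependent survival fractions of this kind is the linear-quadratic (LQ) model; the sketch below is a generic LQ illustration with assumed alpha/beta values, not the study's Virtual Cell Radiobiology algorithm or its parameters.

    ```python
    # Linear-quadratic cell survival: SF(D) = exp(-(alpha*D + beta*D^2)).
    # High-LET alpha emitters are often modeled with a dominant linear term.
    import numpy as np

    def survival_fraction(dose_gy, alpha, beta):
        return np.exp(-(alpha * dose_gy + beta * dose_gy**2))

    doses = np.linspace(0.0, 6.0, 7)
    sf_beta_emitter = survival_fraction(doses, alpha=0.2, beta=0.03)   # low LET
    sf_alpha_emitter = survival_fraction(doses, alpha=1.0, beta=0.0)   # high LET
    for d, s1, s2 in zip(doses, sf_beta_emitter, sf_alpha_emitter):
        print(f"D = {d:.0f} Gy: SF(beta) = {s1:.3f}, SF(alpha) = {s2:.4f}")
    ```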

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project are discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration, and technical cost modeling. The integrated approach is initially illustrated using a 980-grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  14. The recognition of potato varieties using of neural image analysis method

    NASA Astrophysics Data System (ADS)

    Przybył, K.; Górna, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Przybylak, A.; Boniecki, P.; Koszela, K.; Zaborowicz, M.; Janczak, D.; Lewicki, A.

    2015-07-01

    The aim of this paper was to extract representative features and generate an appropriate neural model for the classification of edible potato varieties. Potatoes of the Vineta and Denar varieties were the empirical objects of this study. The main concept of the project was to develop and prepare an image database using computer image analysis software, and to choose the neural model with the greatest ability to identify the selected varieties. Ultimately, the project aims to assist and accelerate the work of the expert who classifies and stores different potato varieties in heaps.

  15. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing the design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that with increasing probability, the expected effect of ASMC on increasing the maximum discharge diminishes. Only for low probabilities is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge is small for any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed, aimed at testing whether rational runoff coefficient tables can be prepared in advance for use with the rational method, and a comparison between peak discharges obtained from the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
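    The Green-Ampt component of this chain can be sketched as below: cumulative infiltration F(t) satisfies F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t, solved here by fixed-point iteration; all soil parameter values are illustrative, and dtheta (the moisture deficit) is where the ASMC dependence enters.

    ```python
    # Green-Ampt cumulative infiltration via fixed-point iteration,
    # plus the corresponding instantaneous infiltration rate.
    import numpy as np

    K = 1.0e-6            # saturated hydraulic conductivity, m/s (assumption)
    psi = 0.11            # wetting-front suction head, m (assumption)
    dtheta = 0.3          # soil moisture deficit (set by ASMC)

    def cumulative_infiltration(t_s, tol=1e-10):
        F = max(K * t_s, 1e-9)                 # initial guess
        for _ in range(200):
            F_new = K * t_s + psi * dtheta * np.log(1.0 + F / (psi * dtheta))
            if abs(F_new - F) < tol:
                break
            F = F_new
        return F

    for hours in (0.5, 1.0, 2.0):
        F = cumulative_infiltration(hours * 3600.0)
        rate = K * (1.0 + psi * dtheta / F)    # Green-Ampt infiltration rate
        print(f"t = {hours:.1f} h: F = {F*1000:.2f} mm, "
              f"f = {rate*1000*3600:.2f} mm/h")
    ```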

  16. Cybermaterials: materials by design and accelerated insertion of materials

    NASA Astrophysics Data System (ADS)

    Xiong, Wei; Olson, Gregory B.

    2016-02-01

    Cybermaterials innovation entails an integration of Materials by Design and accelerated insertion of materials (AIM), which transfers studio ideation into industrial manufacturing. By assembling a hierarchical architecture of integrated computational materials design (ICMD) based on materials genomic fundamental databases, the ICMD mechanistic design models accelerate innovation. We here review progress in the development of linkage models of the process-structure-property-performance paradigm, as well as related design accelerating tools. Extending the materials development capability based on phase-level structural control requires more fundamental investment at the level of the Materials Genome, with focus on improving applicable parametric design models and constructing high-quality databases. Future opportunities in materials genomic research serving both Materials by Design and AIM are addressed.

  17. On the feasibility of the computational modelling of the endoluminal vacuum-assisted closure of an oesophageal anastomotic leakage

    PubMed Central

    Bellomo, Facundo J.; Rosales, Iván; del Castillo, Luis F.; Sánchez, Ricardo; Turon, Pau

    2018-01-01

    Endoluminal vacuum-assisted closure (E-VAC) is a promising therapy to treat anastomotic leakages of the oesophagus and bowel, which are associated with high morbidity and mortality rates. An open-pore polyurethane foam is introduced into the leakage cavity and connected to a device that applies a suction pressure to accelerate the closure of the defect. Computational analysis of this healing process can advance our understanding of the biomechanical mechanisms at play. To this end, we use a dual-stage finite-element analysis in which (i) the structural problem addresses the cavity reduction caused by the suction and (ii) a new constitutive formulation models tissue healing via permanent deformations coupled to a stiffness increase. The numerical implementation in an in-house code is described and a qualitative example illustrates the basic characteristics of the model. The computational model successfully reproduces the generic closure of an anastomotic leakage cavity, supporting the hypothesis that suction pressure promotes healing by means of the aforementioned mechanisms. However, the current framework needs to be enriched with empirical data to help advance device designs and treatment guidelines. Nonetheless, this conceptual study confirms that computational analysis can reproduce E-VAC of anastomotic leakages and establishes the bases for better understanding the mechanobiology of anastomotic defect healing. PMID:29515846

  18. Assessing the Electromagnetic Fields Generated by a Radiofrequency MRI Body Coil at 64 MHz: Defeaturing vs. Accuracy

    PubMed Central

    Lucano, Elena; Liberti, Micaela; Mendoza, Gonzalo G.; Lloyd, Tom; Iacono, Maria Ida; Apollonio, Francesca; Wedan, Steve; Kainz, Wolfgang; Angelone, Leonardo M.

    2016-01-01

    Goal This study aims at a systematic assessment of five computational models of a birdcage coil for magnetic resonance imaging (MRI) with respect to accuracy and computational cost. Methods The models were implemented using the same geometrical model and numerical algorithm, but different driving methods (i.e., coil “defeaturing”). The defeatured models were labeled as specific (S2), generic (G32, G16), and hybrid (H16, H16fr-forced). The accuracy of the models was evaluated using the “symmetric mean absolute percentage error” (SMAPE), by comparison with measurements in terms of frequency response as well as electric (||E⃗||) and magnetic (||B⃗||) field magnitude. Results All the models computed ||B⃗|| within 35% of the measurements; only the S2, G32, and H16 were able to accurately model ||E⃗|| inside the phantom, with a maximum SMAPE of 16%. Outside the phantom, only the S2 showed a SMAPE lower than 11%. Conclusions The results showed that assessing the accuracy of ||B⃗|| based only on comparison along the central longitudinal line of the coil can be misleading. Generic or hybrid coils, when properly modeling the currents along the rings/rungs, were sufficient to accurately reproduce the fields inside a phantom, while a specific model was needed to accurately model ||E⃗|| in the space between coil and phantom. Significance Computational modeling of birdcage body coils is extensively used in the evaluation of RF-induced heating during MRI. Experimental validation of numerical models is needed to determine whether a model is an accurate representation of a physical coil. PMID:26685220
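    One common definition of SMAPE, assumed in the sketch below since the paper's exact formula is not reproduced here, compares simulated and measured field magnitudes pointwise:

    ```python
    # Symmetric mean absolute percentage error between measured and
    # simulated field magnitudes along a line of probe positions.
    import numpy as np

    def smape(measured: np.ndarray, simulated: np.ndarray) -> float:
        """SMAPE in percent; one common definition (assumed here)."""
        num = np.abs(simulated - measured)
        den = np.abs(simulated) + np.abs(measured)
        return 100.0 * np.mean(2.0 * num / den)

    measured = np.array([1.00, 0.95, 0.90, 0.80])   # toy |B| profile, a.u.
    simulated = np.array([1.05, 0.93, 0.85, 0.82])
    print(f"SMAPE = {smape(measured, simulated):.1f}%")
    ```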

  19. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis.

  20. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  1. Using Modeling and Simulation to Complement Testing for Increased Understanding of Weapon Subassembly Response.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Michael K.; Davidson, Megan

    As part of Sandia’s nuclear deterrence mission, the B61-12 Life Extension Program (LEP) aims to modernize the aging weapon system. Modernization requires requalification, and Sandia is using high performance computing to perform advanced computational simulations to better understand, evaluate, and verify weapon system performance in conjunction with limited physical testing. The Nose Bomb Subassembly (NBSA) of the B61-12 is responsible for producing a fuzing signal upon ground impact. The fuzing signal depends on electromechanical impact sensors producing valid electrical fuzing signals at impact. Computer-generated models were used to assess the timing between the impact sensors’ response to the deceleration of impact and damage to major components and system subassemblies. The modeling and simulation team worked alongside the physical test team to design a large-scale reverse ballistic test to not only assess system performance, but also validate their computational models. The reverse ballistic test conducted at Sandia’s sled test facility sent a rocket sled with a representative target into a stationary B61-12 NBSA to characterize the nose crush and functional response of NBSA components. Data obtained from data recorders and high-speed photometrics were integrated with previously generated computer models in order to refine and validate the models’ ability to reliably simulate real-world effects. Large-scale tests are impractical to conduct for every single impact scenario. By creating reliable computer models, we can perform simulations that identify trends and produce estimates of outcomes over the entire range of required impact conditions. Sandia’s HPCs enable geometric resolution that was unachievable before, allowing for more fidelity and detail, and creating simulations that can provide insight to support evaluation of requirements and performance margins. As computing resources continue to improve, researchers at Sandia hope to improve these simulations so they provide increasingly credible analysis of the system response and performance over the full range of conditions.

  2. Mathematical modeling based on ordinary differential equations: A promising approach to vaccinology

    PubMed Central

    Bonin, Carla Rezende Barbosa; Fernandes, Guilherme Cortes; dos Santos, Rodrigo Weber; Lobosco, Marcelo

    2017-01-01

    New contributions that aim to accelerate the development or to improve the efficacy and safety of vaccines arise from many different areas of research and technology. One of these areas is computational science, which traditionally participates in the initial steps, such as the pre-screening of active substances that have the potential to become a vaccine antigen. In this work, we present another promising way to use computational science in vaccinology: mathematical and computational models of important cell and protein dynamics of the immune system. A system of Ordinary Differential Equations represents different immune system populations, such as B cells and T cells, antigen presenting cells and antibodies. In this way, it is possible to simulate, in silico, the immune response to vaccines under development or under study. Distinct scenarios can be simulated by varying the parameters of the mathematical model. As a proof of concept, we developed a model of the immune response to vaccination against yellow fever. Our simulations have shown consistent results when compared with experimental data available in the literature. The model is generic enough to represent the action of other diseases or vaccines in the human immune system, such as dengue and Zika virus. PMID:28027002
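    A deliberately simplified two-population sketch of this ODE approach is given below; the antigen-antibody structure, rate constants, and time span are illustrative assumptions, far smaller than the multi-population model the authors describe.

    ```python
    # Toy immune-response ODEs: antigen V stimulates antibody A,
    # which in turn clears the antigen.
    import numpy as np
    from scipy.integrate import solve_ivp

    def immune_ode(t, y, r=0.3, k=0.1, p=0.2, d=0.05):
        V, A = y
        dV = r * V - k * V * A       # antigen replication minus clearance
        dA = p * V - d * A           # antibody production minus decay
        return [dV, dA]

    sol = solve_ivp(immune_ode, t_span=(0.0, 60.0), y0=[1.0, 0.0],
                    t_eval=np.linspace(0.0, 60.0, 7))
    for t, V, A in zip(sol.t, sol.y[0], sol.y[1]):
        print(f"day {t:4.0f}: antigen = {V:10.3f}, antibody = {A:10.3f}")
    ```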

  3. MRI-Based Computational Fluid Dynamics in Experimental Vascular Models: Toward the Development of an Approach for Prediction of Cardiovascular Changes During Prolonged Space Missions

    NASA Technical Reports Server (NTRS)

    Spirka, T. A.; Myers, J. G.; Setser, R. M.; Halliburton, S. S.; White, R. D.; Chatzimavroudis, G. P.

    2005-01-01

    A priority of NASA is to identify and study possible risks to astronauts' health during prolonged space missions [1]. The goal is to develop a procedure for preflight evaluation of the cardiovascular system of an astronaut and to forecast how it will be affected during the mission. To predict these changes, a computational cardiovascular model must be constructed. Although physiology data can be used to make a general model, a more desirable subject-specific model requires anatomical, functional, and flow data from the specific astronaut. MRI has the unique advantage of providing images with all of the above information, including three-directional velocity data that can be used as boundary conditions in a computational fluid dynamics (CFD) program [2,3]. MRI-based CFD is very promising for reproducing the flow patterns of a specific subject and predicting changes in the absence of gravity. The aim of this study was to test the feasibility of this approach by reconstructing the geometry of MRI-scanned arterial models and reproducing the MRI-measured velocities using CFD simulations on these geometries.

  4. Mathematical modeling based on ordinary differential equations: A promising approach to vaccinology.

    PubMed

    Bonin, Carla Rezende Barbosa; Fernandes, Guilherme Cortes; Dos Santos, Rodrigo Weber; Lobosco, Marcelo

    2017-02-01

    New contributions that aim to accelerate the development or to improve the efficacy and safety of vaccines arise from many different areas of research and technology. One of these areas is computational science, which traditionally participates in the initial steps, such as the pre-screening of active substances that have the potential to become a vaccine antigen. In this work, we present another promising way to use computational science in vaccinology: mathematical and computational models of important cell and protein dynamics of the immune system. A system of Ordinary Differential Equations represents different immune system populations, such as B cells and T cells, antigen presenting cells and antibodies. In this way, it is possible to simulate, in silico, the immune response to vaccines under development or under study. Distinct scenarios can be simulated by varying parameters of the mathematical model. As a proof of concept, we developed a model of the immune response to vaccination against the yellow fever. Our simulations have shown consistent results when compared with experimental data available in the literature. The model is generic enough to represent the action of other diseases or vaccines in the human immune system, such as dengue and Zika virus.

  5. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

    Complex environmental models are now the primary tool to inform decision makers about the current or future management of environmental resources under climate and environmental change. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in reducing the computational burden when applied to systems with an extra-large number of input factors (~100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.
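    The variogram idea underlying VARS can be caricatured in a few lines: for each input factor, estimate gamma_i(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2] from paired model runs, with larger values flagging more sensitive factors. The sketch below is schematic only and omits the full VARS machinery (integrated variograms, star-based sampling).

    ```python
    # Directional variogram of model output along each input factor,
    # estimated from paired runs at base points and shifted points.
    import numpy as np

    def model(x):
        return 5.0 * x[..., 0] ** 2 + 1.0 * x[..., 1] + 0.1 * x[..., 2]

    rng = np.random.default_rng(0)
    n, dim, h = 2000, 3, 0.1
    base = rng.uniform(0.0, 1.0, (n, dim))

    for i in range(dim):
        shifted = base.copy()
        shifted[:, i] = np.clip(shifted[:, i] + h, 0.0, 1.0)
        gamma = 0.5 * np.mean((model(shifted) - model(base)) ** 2)
        print(f"factor {i}: variogram gamma({h}) = {gamma:.4f}")
    ```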

  6. A model for the neural control of pineal periodicity

    NASA Astrophysics Data System (ADS)

    de Oliveira Cruz, Frederico Alan; Soares, Marilia Amavel Gomes; Cortez, Celia Martins

    2016-12-01

    The aim of this work was to verify whether a computational model associating the synchronization dynamics of coupled oscillators with a set of synaptic transmission equations would be able to simulate the control of the pineal gland by the complex neural pathway that connects the retina to this gland. Results from the simulations showed that the frequency and temporal firing patterns were in the range of values found in the literature.
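
    As a rough illustration of the coupled-oscillator ingredient only (the paper's synaptic transmission equations are not reproduced here), two Kuramoto phase oscillators with illustrative frequencies lock when the coupling is strong enough:

    ```python
    import numpy as np

    # Two Kuramoto phase oscillators; frequencies and coupling are illustrative.
    def kuramoto_step(theta, omega, K, dt=1e-3):
        coupling = K * np.sin(theta[::-1] - theta)  # each pulled toward the other
        return theta + dt * (omega + coupling)

    theta = np.array([0.0, np.pi / 2])
    omega = np.array([2 * np.pi * 1.0, 2 * np.pi * 1.2])  # ~1 Hz and ~1.2 Hz
    for _ in range(20000):
        theta = kuramoto_step(theta, omega, K=2.0)
    print("locked phase difference:", (theta[1] - theta[0]) % (2 * np.pi))
    ```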

  7. Application of Mathematical and Three-Dimensional Computer Modeling Tools in the Planning of Processes of Fuel and Energy Complexes

    NASA Astrophysics Data System (ADS)

    Aksenova, Olesya; Nikolaeva, Evgenia; Cehlár, Michal

    2017-11-01

    This work investigates the effectiveness of mathematical and three-dimensional computer modeling tools for planning the processes of fuel and energy complexes at the planning and design phase of a thermal power plant (TPP). A solution for the purification of gas emissions at the design phase of waste treatment systems is proposed, employing mathematical and three-dimensional computer modeling: the E-nets apparatus together with a 3D model of the future gas emission purification system. This approach makes it possible to visualize the designed result, to select and scientifically justify an economically feasible technology, and to ensure a high environmental and social benefit from the developed waste treatment system. The authors present results of describing the planned technological processes and the gas emission purification system in terms of E-nets, using mathematical modeling in the Simulink application, which allowed a model of a device to be built from the library of standard blocks and calculations to be performed. A three-dimensional model of the gas emission purification system has been constructed; it visualizes the technological processes so they can be compared with the theoretical calculations at the design phase of a TPP and, if necessary, adjusted.

  8. Invasion emerges from cancer cell adaptation to competitive microenvironments: Quantitative predictions from multiscale mathematical models

    PubMed Central

    Rejniak, Katarzyna A.; Gerlee, Philip

    2013-01-01

    In this review we summarize our recent efforts using mathematical modeling and computation to simulate cancer invasion, with a special emphasis on the tumor microenvironment. We consider cancer progression as a complex multiscale process and approach it with three single-cell based mathematical models that examine the interactions between tumor microenvironment and cancer cells at several scales. The models exploit distinct mathematical and computational techniques, yet they share core elements and can be compared and/or related to each other. The overall aim of using mathematical models is to uncover the fundamental mechanisms that lend cancer progression its direction towards invasion and metastasis. The models effectively simulate various modes of cancer cell adaptation to the microenvironment in a growing tumor. All three point to a general mechanism underlying cancer invasion: competition for adaptation between distinct cancer cell phenotypes, driven by a tumor microenvironment with scarce resources. These theoretical predictions pose an intriguing experimental challenge: test the hypothesis that invasion is an emergent property of cancer cell populations adapting to selective microenvironment pressure, rather than the culmination of cancer progression producing cells with the “invasive phenotype”. In broader terms, we propose that fundamental insights into cancer can be achieved by experimentation interacting with theoretical frameworks provided by computational and mathematical modeling. PMID:18524624

  9. Designers Workbench: Towards Real-Time Immersive Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuester, F; Duchaineau, M A; Hamann, B

    2001-10-03

    This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or "digital gap" experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.

  10. Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models.

    PubMed

    Fortmeier, Dirk; Wilms, Matthias; Mastmeyer, Andre; Handels, Heinz

    2015-01-01

    This article presents methods for direct visuo-haptic 4D volume rendering of virtual patient models under respiratory motion. Breathing models are computed based on patient-specific 4D CT image data sequences. Virtual patient models are visualized in real-time by ray casting based rendering of a reference CT image warped by a time-variant displacement field, which is computed using the motion models at run-time. Furthermore, haptic interaction with the animated virtual patient models is provided by using the displacements computed at high rendering rates to translate the position of the haptic device into the space of the reference CT image. This concept is applied to virtual palpation and the haptic simulation of insertion of a virtual bendable needle. To this aim, different motion models that are applicable in real-time are presented and the methods are integrated into a needle puncture training simulation framework, which can be used for simulated biopsy or vessel puncture in the liver. To confirm real-time applicability, a performance analysis of the resulting framework is given. It is shown that the presented methods achieve mean update rates around 2,000 Hz for haptic simulation and interactive frame rates for volume rendering and thus are well suited for visuo-haptic rendering of virtual patients under respiratory motion.
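
    The position mapping can be sketched in a few lines: sample the displacement field at the device position and subtract it to land in reference-CT coordinates. The field below is synthetic; in the actual framework it would come from the patient-specific breathing model at the current respiratory phase, and grid size and units are invented.

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates

    # Synthetic time-variant displacement field on a voxel grid (z, y, x).
    shape = (64, 64, 64)
    zz, yy, xx = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    disp = np.stack([2.0 * np.sin(2 * np.pi * zz / 64),   # z-displacement (voxels)
                     np.zeros(shape), np.zeros(shape)])   # y, x displacements

    def to_reference(pos):
        """Map a device position (voxel coords) into reference-CT space by
        subtracting the trilinearly interpolated displacement at that point."""
        coords = np.array(pos).reshape(3, 1)
        d = np.array([map_coordinates(disp[i], coords, order=1)[0]
                      for i in range(3)])
        return np.array(pos) - d

    print(to_reference([16.0, 32.0, 32.0]))
    ```

    Because only one interpolation per query is needed, this kind of lookup is cheap enough to run inside a kilohertz-rate haptic loop.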

  11. Brain-Computer Interfaces: A Neuroscience Paradigm of Social Interaction? A Matter of Perspective

    PubMed Central

    Mattout, Jérémie

    2012-01-01

    A number of recent studies have put human subjects in true social interactions, with the aim of better identifying the psychophysiological processes underlying social cognition. Interestingly, this emerging Neuroscience of Social Interactions (NSI) field brings up challenges which resemble important ones in the field of Brain-Computer Interfaces (BCI). Importantly, these challenges go beyond common objectives such as the eventual use of BCI and NSI protocols in the clinical domain or common interests pertaining to the use of online neurophysiological techniques and algorithms. Common fundamental challenges are now apparent and one can argue that a crucial one is to develop computational models of brain processes relevant to human interactions with an adaptive agent, whether human or artificial. Coupled with neuroimaging data, such models have proved promising in revealing the neural basis and mental processes behind social interactions. Similar models could help BCI to move from well-performing but offline static machines to reliable online adaptive agents. This emphasizes a social perspective to BCI, which is not limited to a computational challenge but extends to all questions that arise when studying the brain in interaction with its environment. PMID:22675291

  12. Analysis of the Harrier forebody/inlet design using computational techniques

    NASA Technical Reports Server (NTRS)

    Chow, Chuen-Yen

    1993-01-01

    Under the support of this Cooperative Agreement, computations of transonic flow past the complex forebody/inlet configuration of the AV-8B Harrier II have been performed. The actual aircraft configuration was measured and its surface and surrounding domain were defined using computational structured grids. The thin-layer Navier-Stokes equations were used to model the flow along with the Chimera embedded multi-grid technique. A fully conservative, alternating direction implicit (ADI), approximately-factored, partially flux-split algorithm was employed to perform the computation. An existing code was altered to conform with the needs of the study, and some special engine-face boundary conditions were developed. The algorithm incorporated the Chimera technique and an algebraic turbulence model in order to deal with the embedded multi-grids and viscous governing equations. Comparison with experimental data has yielded good agreement, given the simplifications incorporated into the analysis. The aim of the present research was to provide a methodology for the numerical solution of complex, combined external/internal flows. This is the first time-dependent Navier-Stokes solution for a geometry in which the fuselage and inlet share a wall. The results indicate the methodology used here is a viable tool for transonic aircraft modeling.
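
    The alternating-direction-implicit idea can be illustrated on a much simpler PDE. The sketch below applies a Peaceman-Rachford ADI step to the 2D heat equation with zero boundaries; it is a toy stand-in for the approximately-factored Navier-Stokes algorithm, and the grid size and time step are arbitrary.

    ```python
    import numpy as np
    from scipy.linalg import solve_banded

    n, dt = 64, 1e-4
    h = 1.0 / (n + 1)
    r = dt / (2 * h * h)

    def tridiag_solve(rhs):
        # Implicit (I - r * d^2) system: 1 + 2r on the diagonal, -r off it.
        ab = np.zeros((3, n))
        ab[0, 1:], ab[1, :], ab[2, :-1] = -r, 1 + 2 * r, -r
        return solve_banded((1, 1), ab, rhs)

    def lap(u, axis):
        # Second difference with zero Dirichlet boundaries along `axis`.
        pad = [(0, 0), (0, 0)]
        pad[axis] = (1, 1)
        up = np.pad(u, pad)
        lo = [slice(None), slice(None)]
        hi = [slice(None), slice(None)]
        lo[axis], hi[axis] = slice(0, n), slice(2, n + 2)
        return up[tuple(hi)] + up[tuple(lo)] - 2 * u

    def adi_step(u):
        # Half step 1: implicit in x (axis 0), explicit in y; then swap.
        u_half = np.apply_along_axis(tridiag_solve, 0, u + r * lap(u, 1))
        return np.apply_along_axis(tridiag_solve, 1, u_half + r * lap(u_half, 0))

    u = np.zeros((n, n))
    u[n // 2, n // 2] = 1.0              # initial heat spike
    for _ in range(100):
        u = adi_step(u)
    print(round(float(u.max()), 6))      # spike diffuses smoothly
    ```

    Each direction requires only tridiagonal solves, which is the factorization trick that makes implicit time stepping affordable on structured grids.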

  13. Cooperative parallel adaptive neighbourhood search for the disjunctively constrained knapsack problem

    NASA Astrophysics Data System (ADS)

    Quan, Zhe; Wu, Lei

    2017-09-01

    This article investigates the use of parallel computing for solving the disjunctively constrained knapsack problem. The proposed parallel computing model can be viewed as a cooperative algorithm based on a multi-neighbourhood search. The cooperation system is composed of a team manager and a crowd of team members. The team members aim at applying their own search strategies to explore the solution space. The team manager collects the solutions from the members and shares the best one with them. The performance of the proposed method is evaluated on a group of benchmark data sets. The results obtained are compared to those reached by the best methods from the literature. The results show that the proposed method is able to provide the best solutions in most cases. In order to highlight the robustness of the proposed parallel computing model, a new set of large-scale instances is introduced. Encouraging results have been obtained.
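
    A toy sketch of the manager/member cooperation follows; the instance data, the two move operators, and the round structure are all invented for illustration, and the actual method runs the members in parallel with richer search strategies.

    ```python
    import random

    # Toy disjunctively constrained knapsack: maximize value under a weight
    # capacity while never selecting both items of a conflict pair.
    random.seed(1)
    n, cap = 30, 50
    w = [random.randint(1, 10) for _ in range(n)]
    v = [random.randint(1, 20) for _ in range(n)]
    conflicts = {frozenset(random.sample(range(n), 2)) for _ in range(15)}

    def feasible(s):
        return sum(w[i] for i in s) <= cap and not any(c <= s for c in conflicts)

    def value(s):
        return sum(v[i] for i in s)

    def flip_move(s):        # member 1: toggle one random item
        return s ^ {random.randrange(n)}

    def swap_move(s):        # member 2: exchange a selected item for an outside one
        if not s or len(s) == n:
            return s
        i = random.choice(sorted(s))
        j = random.choice(sorted(set(range(n)) - s))
        return (s - {i}) | {j}

    best = set()                                     # the manager's shared incumbent
    for _ in range(2000):
        for move in (flip_move, swap_move):          # each member explores its own neighbourhood
            cand = move(set(best))
            if feasible(cand) and value(cand) > value(best):
                best = cand                          # manager keeps and shares the best
    print(value(best), sorted(best))
    ```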

  14. Pattern statistics on Markov chains and sensitivity to parameter estimation

    PubMed Central

    Nuel, Grégory

    2006-01-01

    Background: In order to compute pattern statistics in computational biology, a Markov model is commonly used to take into account the sequence composition. Usually its parameters must be estimated. The aim of this paper is to determine how sensitive these statistics are to parameter estimation, and what the consequences of this variability are for pattern studies (finding the most over-represented words in a genome, the most significant common words in a set of sequences, ...). Results: In the particular case where pattern statistics (overlap counting only) are computed through binomial approximations, we use the delta-method to give an explicit expression of σ, the standard deviation of a pattern statistic. This result is validated using simulations and a simple pattern study is also considered. Conclusion: We establish that the use of high-order Markov models could easily lead to major mistakes due to the high sensitivity of pattern statistics to parameter estimation. PMID:17044916
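
    To make the object of study concrete, the sketch below estimates a first-order Markov model from a sequence and forms a binomial z-score for one word's count. It illustrates the kind of statistic whose sensitivity is being analyzed, not the paper's delta-method derivation; the sequence and word are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    alphabet = "ACGT"
    seq = "".join(rng.choice(list(alphabet), size=10000))

    # Estimate first-order transition probabilities from bigram counts.
    idx = {c: i for i, c in enumerate(alphabet)}
    counts = np.zeros((4, 4))
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1
    P = counts / counts.sum(axis=1, keepdims=True)   # estimated transitions
    pi = counts.sum(axis=1) / counts.sum()           # marginal letter frequencies

    # Probability of the word under the fitted model, then a binomial z-score.
    word = "ACGT"
    p_word = pi[idx[word[0]]] * np.prod([P[idx[a], idx[b]]
                                         for a, b in zip(word, word[1:])])
    n_pos = len(seq) - len(word) + 1
    expected = n_pos * p_word
    observed = sum(seq[i:i + 4] == word for i in range(n_pos))
    z = (observed - expected) / np.sqrt(expected * (1 - p_word))
    print(f"observed={observed} expected={expected:.1f} z={z:.2f}")
    ```

    Since `P` and `pi` are themselves estimates, `z` inherits their sampling error, which is precisely the sensitivity the paper quantifies.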

  15. Pattern statistics on Markov chains and sensitivity to parameter estimation.

    PubMed

    Nuel, Grégory

    2006-10-17

    In order to compute pattern statistics in computational biology, a Markov model is commonly used to take into account the sequence composition. Usually its parameters must be estimated. The aim of this paper is to determine how sensitive these statistics are to parameter estimation, and what the consequences of this variability are for pattern studies (finding the most over-represented words in a genome, the most significant common words in a set of sequences, ...). In the particular case where pattern statistics (overlap counting only) are computed through binomial approximations, we use the delta-method to give an explicit expression of sigma, the standard deviation of a pattern statistic. This result is validated using simulations and a simple pattern study is also considered. We establish that the use of high-order Markov models could easily lead to major mistakes due to the high sensitivity of pattern statistics to parameter estimation.

  16. On Writing and Reading Artistic Computational Ecosystems.

    PubMed

    Antunes, Rui Filipe; Leymarie, Frederic Fol; Latham, William

    2015-01-01

    We study the use of the generative systems known as computational ecosystems to convey artistic and narrative aims. These are virtual worlds running on computers, composed of agents that trade units of energy and emulate cycles of life and behaviors adapted from biological life forms. In this article we propose a conceptual framework in order to understand these systems, which are involved in processes of authorship and interpretation that this investigation analyzes in order to identify critical instruments for artistic exploration. We formulate a model of narrative that we call system stories (after Mitchell Whitelaw), characterized by the dynamic network of material and conceptual processes that define these artefacts. They account for narrative constellations with multiple agencies from which meaning and messages emerge. Finally, we present three case studies to explore the potential of this model within an artistic and generative domain, arguing that this understanding expands and enriches the palette of the language of these systems.

  17. The Peer Assisted Teaching Model for Undergraduate Research at a HBCU

    ERIC Educational Resources Information Center

    Wu, Liyun; Lewis, Marilyn W.

    2018-01-01

    Despite wide application of research skills in higher education, undergraduate students reported research and computer anxiety, and low association between research and their professional goals. This study aims to assess whether peer-assisted mentoring programs would promote positive changes in undergraduates' attitudes toward research. Using a…

  18. Interaction Network Estimation: Predicting Problem-Solving Diversity in Interactive Environments

    ERIC Educational Resources Information Center

    Eagle, Michael; Hicks, Drew; Barnes, Tiffany

    2015-01-01

    Intelligent tutoring systems and computer aided learning environments aimed at developing problem solving produce large amounts of transactional data which make it a challenge for both researchers and educators to understand how students work within the environment. Researchers have modeled student-tutor interactions using complex networks in…

  19. In Vitro Screening of Environmental Chemicals for Targeted Testing Prioritization: The ToxCast Project

    EPA Science Inventory

    Chemical toxicity testing is being transformed by advances in biology and computer modeling, concerns over animal use, and the thousands of environmental chemicals lacking toxicity data. The U.S. Environmental Protection Agency’s ToxCast program aims to address these concerns by ...

  20. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation (DBS) includes evaluation of its effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously, aiming to tackle this subjectivity by using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation. This allows a more patient-oriented approach. Additionally, usability was improved by processing in situ on a smartphone instead of a computer. The system has been shown to be reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician in decision-making when setting stimulation parameters.
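
    As an illustration of the signal-descriptor idea only (the study's actual descriptor and calibrated models are not given here), one might score rigidity from the peak angular velocity reached during passive flexion; the gyro trace, sampling rate, and descriptor below are all invented.

    ```python
    import numpy as np

    fs = 100.0                                         # assumed sampling rate (Hz)
    t = np.arange(0, 5, 1 / fs)
    omega = 80 * np.abs(np.sin(2 * np.pi * 0.8 * t))   # synthetic gyro trace (deg/s)

    def rigidity_descriptor(omega):
        # Higher peak angular velocity during passive movement suggests lower
        # rigidity, so use an inverse peak-based score (hypothetical choice).
        return 1.0 / np.percentile(omega, 95)

    baseline = rigidity_descriptor(omega)
    stimulated = rigidity_descriptor(1.3 * omega)      # faster movement under DBS
    reduction = 100 * (1 - stimulated / baseline)
    print(f"estimated rigidity reduction: {reduction:.1f}%")
    ```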

  1. Multiscale Aspects of Modeling Gas-Phase Nanoparticle Synthesis

    PubMed Central

    Buesser, B.; Gröhn, A.J.

    2013-01-01

    Aerosol reactors are utilized to manufacture nanoparticles in industrially relevant quantities. The development, understanding and scale-up of aerosol reactors can be facilitated with models and computer simulations. This review aims to provide an overview of recent developments of models and simulations and discuss their interconnection in a multiscale approach. A short introduction of the various aerosol reactor types and gas-phase particle dynamics is presented as a background for the later discussion of the models and simulations. Models are presented with decreasing time and length scales in sections on continuum, mesoscale, molecular dynamics and quantum mechanics models. PMID:23729992

  2. The universal function in color dipole model

    NASA Astrophysics Data System (ADS)

    Jalilian, Z.; Boroun, G. R.

    2017-10-01

    In this work we review the color dipole model and recall the properties of saturation and geometrical scaling in this model. Our primary aim is to determine the exact universal function in terms of the introduced scaling variable at distances different from the saturation radius. By including the quark mass in the calculation, we numerically compute the contribution of heavy-quark production at small x to the total structure function through the ratio of universal functions, and show that geometrical scaling holds for our scaling variable.

  3. Validation Methods Research for Fault-Tolerant Avionics and Control Systems Sub-Working Group Meeting. CARE 3 peer review

    NASA Technical Reports Server (NTRS)

    Trivedi, K. S. (Editor); Clary, J. B. (Editor)

    1980-01-01

    A computer aided reliability estimation procedure (CARE 3), developed to model the behavior of ultrareliable systems required by flight-critical avionics and control systems, is evaluated. The mathematical models, numerical method, and fault-tolerant architecture modeling requirements are examined, and the testing and characterization procedures are discussed. Recommendations aimed at enhancing CARE 3 are presented; in particular, the need for a better exposition of the method and the user interface is emphasized.

  4. Computational Methods Used in Hit-to-Lead and Lead Optimization Stages of Structure-Based Drug Discovery.

    PubMed

    Heifetz, Alexander; Southey, Michelle; Morao, Inaki; Townsend-Nicholson, Andrea; Bodkin, Mike J

    2018-01-01

    GPCR modeling approaches are widely used in the hit-to-lead (H2L) and lead optimization (LO) stages of drug discovery. The aims of these modeling approaches are to predict the 3D structures of the receptor-ligand complexes, to explore the key interactions between the receptor and the ligand and to utilize these insights in the design of new molecules with improved binding, selectivity or other pharmacological properties. In this book chapter, we present a brief survey of key computational approaches integrated with hierarchical GPCR modeling protocol (HGMP) used in hit-to-lead (H2L) and in lead optimization (LO) stages of structure-based drug discovery (SBDD). We outline the differences in modeling strategies used in H2L and LO of SBDD and illustrate how these tools have been applied in three drug discovery projects.

  5. NOTE: Acceleration of Monte Carlo-based scatter compensation for cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Sohlberg, A.; Watabe, H.; Iida, H.

    2008-07-01

    Single photon emission computed tomography (SPECT) images are degraded by photon scatter, making scatter compensation essential for accurate reconstruction. Reconstruction-based scatter compensation with Monte Carlo (MC) modelling of scatter shows promise for accurate scatter correction, but it is normally hampered by long computation times. The aim of this work was to accelerate the MC-based scatter compensation using coarse-grid and intermittent scatter modelling. The acceleration methods were compared to an un-accelerated implementation using MC-simulated projection data of the mathematical cardiac torso (MCAT) phantom modelling 99mTc uptake and clinical myocardial perfusion studies. The results showed that, when combined, the acceleration methods reduced the reconstruction time for 10 ordered subset expectation maximization (OS-EM) iterations from 56 to 11 min without a significant reduction in image quality, indicating that coarse-grid and intermittent scatter modelling are suitable for MC-based scatter compensation in cardiac SPECT.
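
    The reconstruction loop being accelerated is, at its core, the multiplicative EM update. The sketch below shows plain MLEM (OS-EM with a single subset) on a toy random system matrix; in the scatter-compensated setting, the forward projection `A @ x` would additionally include the Monte Carlo scatter estimate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.uniform(0, 1, size=(40, 25))          # 40 projection bins, 25 pixels
    x_true = rng.uniform(0.5, 2.0, size=25)
    y = rng.poisson(A @ x_true).astype(float)     # noisy measured projections

    x = np.ones(25)                               # flat initial image
    sens = A.sum(axis=0)                          # sensitivity image
    for _ in range(50):
        ratio = y / np.maximum(A @ x, 1e-12)      # measured / modelled projections
        x *= (A.T @ ratio) / sens                 # multiplicative EM update
    print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```

    Since each iteration costs one forward and one back projection, making the (MC) forward model cheaper or less frequent, as the coarse-grid and intermittent schemes do, directly cuts the reconstruction time.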

  6. The modeling and simulation of visuospatial working memory

    PubMed Central

    Liang, Lina; Zhang, Zhikang

    2010-01-01

    Camperi and Wang (Comput Neurosci 5:383–405, 1998) presented a network model for working memory that combines intrinsic cellular bistability with the recurrent network architecture of the neocortex, while Fall and Rinzel (Comput Neurosci 20:97–107, 2006) replaced this intrinsic bistability with a biological mechanism: a Ca2+ release subsystem. In this study, we aim to further expand the above work. We integrate the traditional firing-rate network with Ca2+ subsystem-induced bistability, amend the synaptic weights, and suggest that the Ca2+ concentration only increases the efficacy of synaptic input but has nothing to do with the external input for the transient cue. We found that our network model maintained persistent activity in response to a brief transient stimulus, like the previous two models, and that the working memory performance was resistant to noise and distraction stimuli when the Ca2+ subsystem was tuned to be bistable. PMID:22132045

  7. Is realistic neuronal modeling realistic?

    PubMed Central

    Almog, Mara

    2016-01-01

    Scientific models are abstractions that aim to explain natural phenomena. A successful model shows how a complex phenomenon arises from relatively simple principles while preserving major physical or biological rules and predicting novel experiments. A model should not be a facsimile of reality; it is an aid for understanding it. Contrary to this basic premise, with the 21st century has come a surge in computational efforts to model biological processes in great detail. Here we discuss the oxymoronic, realistic modeling of single neurons. This rapidly advancing field is driven by the discovery that some neurons don't merely sum their inputs and fire if the sum exceeds some threshold. Thus researchers have asked what are the computational abilities of single neurons and attempted to give answers using realistic models. We briefly review the state of the art of compartmental modeling, highlighting recent progress and intrinsic flaws. We then attempt to address two fundamental questions. Practically, can we realistically model single neurons? Philosophically, should we realistically model single neurons? We use layer 5 neocortical pyramidal neurons as a test case to examine these issues. We subject three publicly available models of layer 5 pyramidal neurons to three simple computational challenges. Based on their performance and a partial survey of published models, we conclude that current compartmental models are ad hoc, unrealistic models functioning poorly once they are stretched beyond the specific problems for which they were designed. We then attempt to plot possible paths for generating realistic single neuron models. PMID:27535372

  8. A Comparative Evaluation of Mixed Dentition Analysis on Reliability of Cone Beam Computed Tomography Image Compared to Plaster Model.

    PubMed

    Gowd, Snigdha; Shankar, T; Dash, Samarendra; Sahoo, Nivedita; Chatterjee, Suravi; Mohanty, Pritam

    2017-01-01

    The aim of the study was to evaluate the reliability of cone beam computed tomography (CBCT)-derived images against plaster models for mixed dentition analysis. Thirty CBCT-derived images and thirty plaster models were obtained from the dental archives, and Moyer's and Tanaka-Johnston analyses were performed. The data obtained were interpreted and analyzed statistically using SPSS 10.0/PC (SPSS Inc., Chicago, IL, USA). Descriptive and analytical analysis along with Student's t-test was performed to qualitatively evaluate the data, and P < 0.05 was considered statistically significant. Statistically significant results were obtained on comparing data between the CBCT-derived images and the plaster models; the mean for Moyer's analysis in the left and right lower arch for CBCT and plaster models was 21.2 mm, 21.1 mm and 22.5 mm, 22.5 mm, respectively. CBCT-derived images were less reliable than data obtained directly from plaster models for mixed dentition analysis.

  9. Predictive biophysical modeling and understanding of the dynamics of mRNA translation and its evolution

    PubMed Central

    Zur, Hadas; Tuller, Tamir

    2016-01-01

    mRNA translation is the fundamental process of decoding the information encoded in mRNA molecules by the ribosome for the synthesis of proteins. The centrality of this process in various biomedical disciplines such as cell biology, evolution and biotechnology, encouraged the development of dozens of mathematical and computational models of translation in recent years. These models aimed at capturing various biophysical aspects of the process. The objective of this review is to survey these models, focusing on those based and/or validated on real large-scale genomic data. We consider aspects such as the complexity of the models, the biophysical aspects they regard and the predictions they may provide. Furthermore, we survey the central systems biology discoveries reported on their basis. This review demonstrates the fundamental advantages of employing computational biophysical translation models in general, and discusses the relative advantages of the different approaches and the challenges in the field. PMID:27591251

  10. A cascaded neuro-computational model for spoken word recognition

    NASA Astrophysics Data System (ADS)

    Hoya, Tetsuya; van Leeuwen, Cees

    2010-03-01

    In human speech recognition, words are analysed at both pre-lexical (i.e., sub-word) and lexical (word) levels. The aim of this paper is to propose a constructive neuro-computational model that incorporates both these levels as cascaded layers of pre-lexical and lexical units. The layered structure enables the system to handle the variability of real speech input. Within the model, receptive fields of the pre-lexical layer consist of radial basis functions; the lexical layer is composed of units that perform pattern matching between their internal template and a series of labels, corresponding to the winning receptive fields in the pre-lexical layer. The model adapts through self-tuning of all units, in combination with the formation of a connectivity structure through unsupervised (first layer) and supervised (higher layers) network growth. Simulation studies show that the model can achieve a level of performance in spoken word recognition similar to that of a benchmark approach using hidden Markov models, while enabling parallel access to word candidates in lexical decision making.

  11. Modelling total solar irradiance since 1878 from simulated magnetograms

    NASA Astrophysics Data System (ADS)

    Dasi-Espuig, M.; Jiang, J.; Krivova, N. A.; Solanki, S. K.

    2014-10-01

    Aims: We present a new model of total solar irradiance (TSI) based on magnetograms simulated with a surface flux transport model (SFTM) and the Spectral And Total Irradiance REconstructions (SATIRE) model. Our model provides daily maps of the distribution of the photospheric field and the TSI starting from 1878. Methods: The modelling is done in two main steps. We first calculate the magnetic flux on the solar surface emerging in active and ephemeral regions. The evolution of the magnetic flux in active regions (sunspots and faculae) is computed using a surface flux transport model fed with the observed record of sunspot group areas and positions. The magnetic flux in ephemeral regions is treated separately using the concept of overlapping cycles. We then use a version of the SATIRE model to compute the TSI. The area coverage and the distribution of different magnetic features as a function of time, which are required by SATIRE, are extracted from the simulated magnetograms and the modelled ephemeral region magnetic flux. Previously computed intensity spectra of the various types of magnetic features are employed. Results: Our model reproduces the PMOD composite of TSI measurements starting from 1978 at daily and rotational timescales more accurately than the previous version of the SATIRE model computing TSI over this period of time. The simulated magnetograms provide a more realistic representation of the evolution of the magnetic field on the photosphere and also allow us to make use of information on the spatial distribution of the magnetic fields before the times when observed magnetograms were available. We find that the secular increase in TSI since 1878 is fairly stable to modifications of the treatment of the ephemeral region magnetic flux.

  12. Spatial operator approach to flexible multibody system dynamics and control

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.

    1991-01-01

    The inverse and forward dynamics problems for flexible multibody systems were solved using the techniques of spatially recursive Kalman filtering and smoothing. These algorithms are easily developed using a set of identities associated with mass matrix factorization and inversion. These identities are easily derived using the spatial operator algebra developed by the author. Current work is aimed at computational experiments with the described algorithms and at modelling for control design of limber manipulator systems. It is also aimed at handling and manipulation of flexible objects.

  13. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
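
    As a concrete illustration of the surrogate idea (not the RAVEN/RELAP-7 workflow itself), a radial-basis-function surrogate can be fitted to a handful of expensive runs and then evaluated almost instantly; `expensive_model` below is a hypothetical stand-in for a simulation code.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def expensive_model(x):                        # placeholder "physics code"
        return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1])

    rng = np.random.default_rng(0)
    X_train = rng.uniform(0, 1, size=(50, 2))      # 50 "simulation runs"
    surrogate = RBFInterpolator(X_train, expensive_model(X_train))

    X_test = rng.uniform(0, 1, size=(1000, 2))     # cheap surrogate sweeps
    err = np.abs(surrogate(X_test) - expensive_model(X_test))
    print("max surrogate error:", err.max())
    ```

    The 1000-point sweep costs microseconds per evaluation; with a real thermo-hydraulic code behind `expensive_model`, each of those points would otherwise be hours of computation.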

  14. Effects of artificial hypolimnetic oxygenation in a shallow lake. Part 2: numerical modelling.

    PubMed

    Toffolon, Marco; Serafini, Michele

    2013-01-15

    A three-dimensional numerical model is used to simulate the thermal destratification caused by hypolimnetic jets releasing oxygen-rich water for lake restoration. Focussing on the case study described in the companion paper (Toffolon et al., 2013), i.e. the small, relatively shallow Lake Serraia (Trentino, Italy), a specific simplified sub-grid model is developed in the numerical model to reproduce jet entrainment with reduced computational costs, with the aim to simulate the whole lake dynamics along several weeks. The noticeable agreement between numerical results and available measurements suggests that the model can be used to understand the main effects of the hypolimnetic oxygenation in different scenarios. Therefore, different options can be evaluated and guidelines can be proposed for lake management, with the aim to preserve the typical thermal stratification while providing sufficient oxygen mass to proceed with the restoration phase. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Forecasting of construction and demolition waste in Brazil.

    PubMed

    Paz, Diogo Hf; Lafayette, Kalinny Pv

    2016-08-01

    The objective of this article is to develop a computerised tool (software) that facilitates the analysis of strategies for waste management on construction sites through the use of indicators of construction and demolition waste generation. The development involved the following steps: knowledge acquisition, structuring the system, coding, and system evaluation. The knowledge acquisition step provides the basis for representing the acquired knowledge through models. The system structuring step formalises this knowledge for the development of the system in two stages: construction of the conceptual model and its subsequent instantiation. The coding step implements (codes) the conceptual model as a digital model executed by computer. The results showed that the system is very useful and applicable on construction sites, helping to improve the quality of waste management and creating a database that will support new research. © The Author(s) 2016.

  16. Building confidence and credibility amid growing model and computing complexity

    NASA Astrophysics Data System (ADS)

    Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.

    2017-12-01

    As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more difficult, for reasons that are generally well known yet still challenging to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.

  17. Tangible Landscape: Cognitively Grasping the Flow of Water

    NASA Astrophysics Data System (ADS)

    Harmon, B. A.; Petrasova, A.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2016-06-01

    Complex spatial forms like topography can be challenging to understand, much less intentionally shape, given the heavy cognitive load of visualizing and manipulating 3D form. Spatiotemporal processes like the flow of water over a landscape are even more challenging to understand and intentionally direct as they are dependent upon their context and require the simulation of forces like gravity and momentum. This cognitive work can be offloaded onto computers through 3D geospatial modeling, analysis, and simulation. Interacting with computers, however, can also be challenging, often requiring training and highly abstract thinking. Tangible computing - an emerging paradigm of human-computer interaction in which data is physically manifested so that users can feel it and directly manipulate it - aims to offload this added cognitive work onto the body. We have designed Tangible Landscape, a tangible interface powered by an open source geographic information system (GRASS GIS), so that users can naturally shape topography and interact with simulated processes with their hands in order to make observations, generate and test hypotheses, and make inferences about scientific phenomena in a rapid, iterative process. Conceptually Tangible Landscape couples a malleable physical model with a digital model of a landscape through a continuous cycle of 3D scanning, geospatial modeling, and projection. We ran a flow modeling experiment to test whether tangible interfaces like this can effectively enhance spatial performance by offloading cognitive processes onto computers and our bodies. We used hydrological simulations and statistics to quantitatively assess spatial performance. We found that Tangible Landscape enhanced 3D spatial performance and helped users understand water flow.

  18. The fourth radiation transfer model intercomparison (RAMI-IV): Proficiency testing of canopy reflectance models with ISO-13528

    NASA Astrophysics Data System (ADS)

    Widlowski, J.-L.; Pinty, B.; Lopatka, M.; Atzberger, C.; Buzica, D.; Chelle, M.; Disney, M.; Gastellu-Etchegorry, J.-P.; Gerboles, M.; Gobron, N.; Grau, E.; Huang, H.; Kallel, A.; Kobayashi, H.; Lewis, P. E.; Qin, W.; Schlerf, M.; Stuckens, J.; Xie, D.

    2013-07-01

    The radiation transfer model intercomparison (RAMI) activity aims at assessing the reliability of physics-based radiative transfer (RT) models under controlled experimental conditions. RAMI focuses on computer simulation models that mimic the interactions of radiation with plant canopies. These models are increasingly used in the development of satellite retrieval algorithms for terrestrial essential climate variables (ECVs). Rather than applying ad hoc performance metrics, RAMI-IV makes use of existing ISO standards to enhance the rigor of its protocols evaluating the quality of RT models. ISO-13528 was developed "to determine the performance of individual laboratories for specific tests or measurements." More specifically, it aims to guarantee that measurement results fall within specified tolerance criteria from a known reference. Of particular interest to RAMI is that ISO-13528 provides guidelines for comparisons where the true value of the target quantity is unknown. In those cases, "truth" must be replaced by a reliable "conventional reference value" to enable absolute performance tests. This contribution will show, for the first time, how the ISO-13528 standard developed by the chemical and physical measurement communities can be applied to proficiency testing of computer simulation models. Step by step, the pre-screening of data, the identification of reference solutions, and the choice of proficiency statistics will be discussed and illustrated with simulation results from the RAMI-IV "abstract canopy" scenarios. Detailed performance statistics of the participating RT models will be provided and the role of the accuracy of the reference solutions as well as the choice of the tolerance criteria will be highlighted.

  19. Analysis about modeling MEC7000 excitation system of nuclear power unit

    NASA Astrophysics Data System (ADS)

    Liu, Guangshi; Sun, Zhiyuan; Dou, Qian; Liu, Mosi; Zhang, Yihui; Wang, Xiaoming

    2018-02-01

    Given the importance of accurately modeling excitation systems in stability calculations for inland nuclear power plants, and the lack of research on modeling the MEC7000 excitation system, this paper summarizes a general method for modeling and simulating the MEC7000 excitation system. The method also resolves two key issues: the computation of the I/O interface parameters and the conversion of the measured excitation system model into a BPA simulation model. The simulation model of the MEC7000 excitation system is thereby completed for the first time domestically. A no-load small-disturbance check demonstrates that the proposed model and algorithm are correct and efficient.

  20. Kinematics of a vertical axis wind turbine with a variable pitch angle

    NASA Astrophysics Data System (ADS)

    Jakubowski, Mateusz; Starosta, Roman; Fritzkowski, Pawel

    2018-01-01

    A computational model for the kinematics of a vertical axis wind turbine (VAWT) is presented. An H-type rotor turbine with a controlled pitch angle is considered. The aim of this solution is to improve VAWT productivity. The method discussed belongs to a computational branch based on Blade Element Momentum (BEM) theory. The paper can be regarded as a theoretical basis and an introduction to further studies with the application of BEM. The obtained torque values show the main advantage of using the variable pitch angle.
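
    The core kinematics can be sketched in a few lines: for tip-speed ratio λ, the relative flow angle at azimuth θ is φ = atan2(sin θ, λ + cos θ), and pitching the blade by β shifts the local angle of attack to φ − β. The values below (λ = 3 and a 4° sinusoidal pitch law) are illustrative only, not the paper's settings.

    ```python
    import numpy as np

    lam = 3.0                                  # tip-speed ratio (illustrative)
    theta = np.linspace(0, 2 * np.pi, 361)     # blade azimuth over one revolution
    phi = np.arctan2(np.sin(theta), lam + np.cos(theta))   # relative flow angle

    alpha_fixed = np.degrees(phi)                          # zero fixed pitch
    beta = np.radians(4.0) * np.sin(theta)                 # simple variable pitch law
    alpha_var = np.degrees(phi - beta)

    print("max |alpha|, fixed pitch   :", np.abs(alpha_fixed).max())
    print("max |alpha|, variable pitch:", np.abs(alpha_var).max())
    ```

    Reducing the angle-of-attack excursion over the revolution is what keeps the blade away from stall and improves torque production, which is the motivation for the variable pitch.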

  1. SPIP: A computer program implementing the Interaction Picture method for simulation of light-wave propagation in optical fibre

    NASA Astrophysics Data System (ADS)

    Balac, Stéphane; Fernandez, Arnaud

    2016-02-01

    The computer program SPIP is aimed at solving the Generalized Non-Linear Schrödinger Equation (GNLSE), which arises in optics, e.g. in the modelling of light-wave propagation in an optical fibre, by the Interaction Picture method, a new efficient alternative to the Symmetric Split-Step method. In the SPIP program, a dedicated costless adaptive step-size control based on a 4th-order embedded Runge-Kutta method is implemented in order to speed up the resolution.
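
    The embedded-pair step-size logic can be illustrated on a scalar ODE. SPIP itself uses a 4th-order embedded Runge-Kutta pair on the GNLSE; the sketch below uses a simpler 2nd/1st-order Heun/Euler pair, but the accept/reject test and step-update rule follow the same pattern.

    ```python
    import numpy as np

    def integrate(f, y, t, t_end, h=1e-2, tol=1e-6):
        """Adaptive integration of y' = f(t, y) with an embedded Heun/Euler pair."""
        while t < t_end:
            h = min(h, t_end - t)
            k1 = f(t, y)
            k2 = f(t + h, y + h * k1)
            y_high = y + 0.5 * h * (k1 + k2)      # Heun (order 2)
            y_low = y + h * k1                    # Euler (order 1)
            err = abs(y_high - y_low)             # local error estimate, "for free"
            if err <= tol:                        # accept the step
                t, y = t + h, y_high
            # Grow or shrink h toward the target error (exponent 1/(p+1), p = 1).
            h *= min(5.0, max(0.2, 0.9 * (tol / max(err, 1e-16)) ** 0.5))
        return y

    print(integrate(lambda t, y: -y, 1.0, 0.0, 5.0), np.exp(-5))
    ```

    The error estimate is "costless" in the same sense as in SPIP: the lower-order solution reuses stages already computed for the higher-order one.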

  2. Optical computing research

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    1987-10-01

    Work accomplished: OPTICAL INTERCONNECTIONS - the powerful interconnect abilities of optical beams have led to much optimism about possible roles for optics in solving interconnect problems at various levels of computer architecture; the power requirements of optical interconnects at the gate-to-gate and chip-to-chip levels were examined. OPTICAL NEURAL NETWORKS - basic studies of the convergence properties of the Hopfield model, based on a mathematical approach (graph theory). OPTICS AND ARTIFICIAL INTELLIGENCE - a review of the field of optical processing and artificial intelligence, with the aim of finding areas that might be particularly attractive for future investigation.

  3. The theory of reasoned action as parallel constraint satisfaction: towards a dynamic computational model of health behavior.

    PubMed

    Orr, Mark G; Thrush, Roxanne; Plaut, David C

    2013-01-01

    The reasoned action approach, although ubiquitous in health behavior theory (e.g., Theory of Reasoned Action/Planned Behavior), does not adequately address two key dynamical aspects of health behavior: learning and the effect of immediate social context (i.e., social influence). To remedy this, we put forth a computational implementation of the Theory of Reasoned Action (TRA) using artificial-neural networks. Our model re-conceptualized behavioral intention as arising from a dynamic constraint satisfaction mechanism among a set of beliefs. In two simulations, we show that constraint satisfaction can simultaneously incorporate the effects of past experience (via learning) with the effects of immediate social context to yield behavioral intention, i.e., intention is dynamically constructed from both an individual's pre-existing belief structure and the beliefs of others in the individual's social context. In a third simulation, we illustrate the predictive ability of the model with respect to empirically derived behavioral intention. As the first known computational model of health behavior, it represents a significant advance in theory towards understanding the dynamics of health behavior. Furthermore, our approach may inform the development of population-level agent-based models of health behavior that aim to incorporate psychological theory into models of population dynamics.
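
    As a rough illustration of the parallel constraint satisfaction mechanism (not the paper's trained networks), a few belief units can be settled under symmetric constraint weights plus an external social-context input; all weights, inputs, and the read-out below are invented.

    ```python
    import numpy as np

    # Belief-belief constraints: positive weights push beliefs to agree,
    # negative weights push them apart (symmetric, Hopfield-style).
    W = np.array([[0.0,  0.8, -0.6],
                  [0.8,  0.0, -0.4],
                  [-0.6, -0.4, 0.0]])
    ext = np.array([0.2, 0.0, 0.5])        # immediate social-context input

    a = np.zeros(3)                         # belief activations
    for _ in range(200):
        net = W @ a + ext
        a += 0.1 * (np.tanh(net) - a)       # gradual settling toward consistency
    intention = a.mean()                    # toy read-out of behavioral intention
    print(a.round(3), round(float(intention), 3))
    ```

    The settled activations depend jointly on the weight structure (past learning) and on `ext` (the current social context), which is the dynamic construction of intention the abstract describes.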

  4. The Theory of Reasoned Action as Parallel Constraint Satisfaction: Towards a Dynamic Computational Model of Health Behavior

    PubMed Central

    Orr, Mark G.; Thrush, Roxanne; Plaut, David C.

    2013-01-01

    The reasoned action approach, although ubiquitous in health behavior theory (e.g., Theory of Reasoned Action/Planned Behavior), does not adequately address two key dynamical aspects of health behavior: learning and the effect of immediate social context (i.e., social influence). To remedy this, we put forth a computational implementation of the Theory of Reasoned Action (TRA) using artificial-neural networks. Our model re-conceptualized behavioral intention as arising from a dynamic constraint satisfaction mechanism among a set of beliefs. In two simulations, we show that constraint satisfaction can simultaneously incorporate the effects of past experience (via learning) with the effects of immediate social context to yield behavioral intention, i.e., intention is dynamically constructed from both an individual’s pre-existing belief structure and the beliefs of others in the individual’s social context. In a third simulation, we illustrate the predictive ability of the model with respect to empirically derived behavioral intention. As the first known computational model of health behavior, it represents a significant advance in theory towards understanding the dynamics of health behavior. Furthermore, our approach may inform the development of population-level agent-based models of health behavior that aim to incorporate psychological theory into models of population dynamics. PMID:23671603

  5. Establishing a relationship between maximum torque production of isolated joints to simulate EVA ratchet push-pull maneuver: A case study

    NASA Technical Reports Server (NTRS)

    Pandya, Abhilash; Maida, James; Hasson, Scott; Greenisen, Michael; Woolford, Barbara

    1993-01-01

    As manned exploration of space continues, analytical evaluation of human strength characteristics is critical. These extraterrestrial environments will spawn issues of human performance which will impact the designs of tools, work spaces, and space vehicles. Computer modeling is an effective method of correlating human biomechanical and anthropometric data with models of space structures and human work spaces. The aim of this study is to provide biomechanical data from isolated joints to be utilized in a computer modeling system for calculating torque resulting from any upper extremity motions: in this study, the ratchet wrench push-pull operation (a typical extravehicular activity task). Established here are mathematical relationships used to calculate maximum torque production of isolated upper extremity joints. These relationships are a function of joint angle and joint velocity.

  6. On agent-based modeling and computational social science.

    PubMed

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS.

  7. On agent-based modeling and computational social science

    PubMed Central

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642

  8. Space-time programming.

    PubMed

    Beal, Jacob; Viroli, Mirko

    2015-07-28

    Computation increasingly takes place not on an individual device, but distributed throughout a material or environment, whether it be a silicon surface, a network of wireless devices, a collection of biological cells or a programmable material. Emerging programming models embrace this reality and provide abstractions inspired by physics, such as computational fields, that allow such systems to be programmed holistically, rather than in terms of individual devices. This paper aims to provide a unified approach for the investigation and engineering of computations programmed with the aid of space-time abstractions, by bringing together a number of recent results, as well as to identify critical open problems. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  9. A comparative research of different ensemble surrogate models based on set pair analysis for the DNAPL-contaminated aquifer remediation strategy optimization.

    PubMed

    Hou, Zeyu; Lu, Wenxi; Xue, Haibo; Lin, Jin

    2017-08-01

    Surrogate-based simulation-optimization is an effective approach for optimizing a surfactant-enhanced aquifer remediation (SEAR) strategy for clearing DNAPLs. The performance of the surrogate model, which replaces the simulation model in order to reduce the computational burden, is the key to such studies. However, previous research has generally been based on a stand-alone surrogate model and has rarely tried to improve the surrogate's approximation accuracy with respect to the simulation model by combining various methods. In this regard, we present set pair analysis (SPA) as a new method to build ensemble surrogate (ES) models, and we conducted a comparative study to select a better ES modeling pattern for SEAR strategy optimization problems. Surrogate models were developed using a radial basis function artificial neural network (RBFANN), support vector regression (SVR), and Kriging. One ES model assembles the RBFANN, SVR, and Kriging models using set pair weights according to their performance; the other assembles several Kriging models (Kriging being the best of the three surrogate modeling methods) built with different training sample datasets. Finally, an optimization model, in which the ES model was embedded, was established to obtain the optimal remediation strategy. The results showed that the residuals of the outputs between the best ES model and the simulation model for 100 testing samples were lower than 1.5%. Using an ES model instead of the simulation model was critical for considerably reducing the computation time of the simulation-optimization process while maintaining high computational accuracy. Copyright © 2017 Elsevier B.V. All rights reserved.
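
    The assembling step can be sketched as a weighted combination of member predictions. In the sketch below, the set pair analysis weighting of the paper is replaced by a simple inverse-validation-error rule for illustration, and the member surrogates are toy functions standing in for the RBFANN, SVR, and Kriging models.

    ```python
    import numpy as np

    def ensemble_predict(members, weights, X):
        preds = np.stack([m(X) for m in members])
        return np.average(preds, axis=0, weights=weights)

    # Toy members with different biases around the "true" response sin(x).
    members = [lambda X: np.sin(X) + 0.05,
               lambda X: np.sin(X) * 1.1,
               lambda X: np.sin(X) - 0.2]
    X_val = np.linspace(0, np.pi, 50)
    errors = [np.mean((m(X_val) - np.sin(X_val)) ** 2) for m in members]
    weights = 1.0 / np.array(errors)        # better members get larger weights
    weights /= weights.sum()

    print(weights.round(3))
    print(np.abs(ensemble_predict(members, weights, X_val) - np.sin(X_val)).max())
    ```

    The design intent is the same as in the paper: by weighting members according to demonstrated performance, the ensemble tends to be more accurate than any single stand-alone surrogate.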

  10. Allen Newell's Program of Research: The Video-Game Test.

    PubMed

    Gobet, Fernand

    2017-04-01

    Newell (1973) argued that progress in psychology was slow because research focused on experiments trying to answer binary questions, such as serial versus parallel processing. In addition, not enough attention was paid to the strategies used by participants, and there was a lack of theories implemented as computer models offering sufficient precision for being tested rigorously. He proposed a three-headed research program: to develop computational models able to carry out the task they aimed to explain; to study one complex task in detail, such as chess; and to build computational models that can account for multiple tasks. This article assesses the extent to which the papers in this issue advance Newell's program. While half of the papers devote much attention to strategies, several papers still average across them, a capital sin according to Newell. The three courses of action he proposed were not popular in these papers: Only two papers used computational models, with no model being both able to carry out the task and to account for human data; there was no systematic analysis of a specific video game; and no paper proposed a computational model accounting for human data in several tasks. It is concluded that, while they use sophisticated methods of analysis and discuss interesting results, overall these papers contribute only little to Newell's program of research. In this respect, they reflect the current state of psychology and cognitive science. This is a shame, as Newell's ideas might help address the current crisis of lack of replication and fraud in psychology. Copyright © 2017 The Author. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.

  11. A stochastic model for density-dependent microwave Snow- and Graupel scattering coefficients of the NOAA JCSDA community radiative transfer model

    NASA Astrophysics Data System (ADS)

    Stegmann, Patrick G.; Tang, Guanglin; Yang, Ping; Johnson, Benjamin T.

    2018-05-01

    A structural model is developed for the single-scattering properties of snow and graupel particles with a strongly heterogeneous morphology and an arbitrarily variable mass density. This effort aims to provide a mechanism to consider particle mass density variation in the microwave scattering coefficients implemented in the Community Radiative Transfer Model (CRTM). The stochastic model applies a bicontinuous random medium algorithm to a simple base shape and uses the Finite-Difference-Time-Domain (FDTD) method to compute the single-scattering properties of the resulting complex morphology.

  12. Designing an Advanced Instructional Design Advisor: Principles of Instructional Design. Volume 2

    DTIC Science & Technology

    1991-05-01

    ones contained in this paper would comprise a substantial part of the knowledge base for the AIDA...the classroom (e.g., computer simulation models can be used to enhance CBI). The Advanced Instructional Design Advisor is a project aimed at providing...model shares with its variations. Tennyson then identifies research-based prescriptions from the cognitive sciences which should become part of ISD in

  13. Potential implementation of reservoir computing models based on magnetic skyrmions

    NASA Astrophysics Data System (ADS)

    Bourianoff, George; Pinna, Daniele; Sitte, Matthias; Everschor-Sitte, Karin

    2018-05-01

    Reservoir Computing is a type of recurrent neural network commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to generate a memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts to implement reservoir computing have focused on utilizing memristor techniques to implement recurrent neural networks. This paper examines the potential of magnetic skyrmion fabrics, and the complex current patterns which form in them, as an attractive physical instantiation for Reservoir Computing. We argue that their nonlinear dynamical interplay resulting from anisotropic magnetoresistance and spin-torque effects allows for an effective and energy-efficient nonlinear processing of spatio-temporal events, with the aim of event recognition and prediction.
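
    The defining trait described here, that only the readout is trained while the reservoir itself stays fixed, is easy to show in miniature. In the sketch below a random tanh network stands in for the skyrmion fabric's current response; all sizes and the toy prediction task are assumptions.

```python
# Echo-state sketch of Reservoir Computing: fixed random reservoir,
# ridge-regression training of the linear readout only.
import numpy as np

rng = np.random.default_rng(2)
n_res, T = 200, 1000
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1: fading memory

u = np.sin(np.linspace(0.0, 20.0 * np.pi, T))     # input signal
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])              # untrained nonlinear reservoir dynamics
    states[t] = x

target = np.roll(u, -5)                           # task: predict the input 5 steps ahead
lam = 1e-6                                        # ridge regularization
W_out = np.linalg.solve(states.T @ states + lam * np.eye(n_res), states.T @ target)
pred = states @ W_out
print("training MSE:", np.mean((pred[:-5] - target[:-5]) ** 2))
```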

  14. Students Perception towards the Implementation of Computer Graphics Technology in Class via Unified Theory of Acceptance and Use of Technology (UTAUT) Model

    NASA Astrophysics Data System (ADS)

    Binti Shamsuddin, Norsila

    Technology advancement and development in higher learning institutions give students a chance to be motivated to learn the information technology areas in depth. Students should seize the opportunity to blend their skills with these technologies in preparation for graduation. The curriculum itself can raise students' interest and persuade them to become directly involved in the evolution of the technology. The aim of this study is to gauge the depth of students' involvement as well as their acceptance of the technology adopted in Computer Graphics and Image Processing subjects. The study targets Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science and BSc. Computer Science (Software Engineering). This study utilizes the new Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four (4) of the eight (8) independent factors in UTAUT will be studied with respect to the dependent factor.

  15. Computational fluid dynamics analysis of cyclist aerodynamics: performance of different turbulence-modelling and boundary-layer modelling approaches.

    PubMed

    Defraeye, Thijs; Blocken, Bert; Koninckx, Erwin; Hespel, Peter; Carmeliet, Jan

    2010-08-26

    This study aims at assessing the accuracy of computational fluid dynamics (CFD) for applications in sports aerodynamics, for example for drag predictions of swimmers, cyclists or skiers, by evaluating the applied numerical modelling techniques by means of detailed validation experiments. In this study, a wind-tunnel experiment on a scale model of a cyclist (scale 1:2) is presented. In addition to three-component forces and moments, high-resolution surface pressure measurements on the scale model's surface, at 115 locations, are performed to provide detailed information on the flow field. These data are used to compare the performance of different turbulence-modelling techniques, such as steady Reynolds-averaged Navier-Stokes (RANS), with several k-epsilon and k-omega turbulence models, and unsteady large-eddy simulation (LES), and also boundary-layer modelling techniques, namely wall functions and low-Reynolds-number modelling (LRNM). The commercial CFD code Fluent 6.3 is used for the simulations. The RANS shear-stress transport (SST) k-omega model shows the best overall performance, followed by the more computationally expensive LES. Furthermore, LRNM is clearly preferred over wall functions to model the boundary layer. This study showed that there are more accurate alternatives for evaluating flow around bluff bodies with CFD than the standard k-epsilon model combined with wall functions, which is often used in CFD studies in sports. 2010 Elsevier Ltd. All rights reserved.

  16. 68Ga/177Lu-labeled DOTA-TATE shows similar imaging and biodistribution in neuroendocrine tumor model.

    PubMed

    Liu, Fei; Zhu, Hua; Yu, Jiangyuan; Han, Xuedi; Xie, Qinghua; Liu, Teli; Xia, Chuanqin; Li, Nan; Yang, Zhi

    2017-06-01

    Somatostatin receptors, whose endogenous ligands are somatostatins, are overexpressed in neuroendocrine tumors. DOTA-TATE is a somatostatin analogue with high binding affinity to somatostatin receptors. We aimed to evaluate a 68Ga/177Lu-labeled DOTA-TATE kit in a neuroendocrine tumor model for molecular imaging and to attempt human positron emission tomography/computed tomography (PET/CT) imaging of 68Ga-DOTA-TATE in neuroendocrine tumor patients. DOTA-TATE kits were formulated and radiolabeled with 68Ga or 177Lu to yield 68Ga/177Lu-DOTA-TATE (M-DOTA-TATE). In vitro and in vivo stability studies of 177Lu-DOTA-TATE were performed. Nude mice bearing human tumors were injected with 68Ga-DOTA-TATE or 177Lu-DOTA-TATE for micro-PET and micro-single-photon emission computed tomography (SPECT)/CT imaging, respectively, and clinical PET/CT images of 68Ga-DOTA-TATE were obtained at 1 h post-intravenous injection from patients with neuroendocrine tumors. Micro-PET and micro-SPECT/CT imaging of 68Ga-DOTA-TATE and 177Lu-DOTA-TATE both showed clear tumor uptake, which could be blocked by excess DOTA-TATE. In addition, 68Ga-DOTA-TATE PET/CT imaging in neuroendocrine tumor patients revealed primary and metastatic lesions. 68Ga-DOTA-TATE and 177Lu-DOTA-TATE accumulated in tumors in animal models, paving the way for better clinical peptide receptor radionuclide therapy for neuroendocrine tumor patients in the Asian population.

  17. Meeting the computer technology needs of community faculty: building new models for faculty development.

    PubMed

    Baldwin, Constance D; Niebuhr, Virginia N; Sullivan, Brian

    2004-01-01

    We aimed to identify the evolving computer technology needs and interests of community faculty in order to design an effective faculty development program focused on computer skills: the Teaching and Learning Through Educational Technology (TeLeTET) program. Repeated surveys were conducted between 1994 and 2002 to assess computer resources and needs in a pool of over 800 primary care physician-educators in community practice in East Texas. Based on the results, we developed and evaluated several models to teach community preceptors about computer technologies that are useful for education. Before 1998, only half of our community faculty identified a strong interest in developing their technology skills. As the revolution in telecommunications advanced, however, preceptors' needs and interests changed, and the use of this technology to support community-based teaching became feasible. In 1998 and 1999, resource surveys showed that many of our community teaching sites had computers and Internet access. By 2001, the desire for teletechnology skills development was strong in a nucleus of community faculty, although lack of infrastructure, time, and skills were identified as barriers. The TeLeTET project developed several innovative models for technology workshops and conferences, supplemented by online resources, that were well attended and positively evaluated by 181 community faculty over a 3-year period. We have identified the evolving needs of community faculty through iterative needs assessments, developed a flexible faculty development curriculum, and used open-ended, formative evaluation techniques to keep the TeLeTET program responsive to a rapidly changing environment for community-based education in computer technology.

  18. Computational Model of Primary Visual Cortex Combining Visual Attention for Action Recognition

    PubMed Central

    Shu, Na; Gao, Zhiyong; Chen, Xiangan; Liu, Haihua

    2015-01-01

    Humans can easily understand other people's actions through their visual systems, while computers cannot. Therefore, a new bio-inspired computational model aimed at automatic action recognition is proposed in this paper. The model focuses on the dynamic properties of neurons and neural networks in the primary visual cortex (V1) and simulates the procedure of information processing in V1, which consists of visual perception, visual attention and representation of human action. In our model, a family of three-dimensional spatio-temporal correlative Gabor filters, tuned to different speeds and orientations, is used to model the dynamic properties of the classical receptive field of V1 simple cells for detection of spatiotemporal information from video sequences. Based on the inhibitory effect of stimuli outside the classical receptive field caused by lateral connections of spiking neural networks in V1, we propose a surround suppression operator to further process the spatiotemporal information. A visual attention model based on perceptual grouping is integrated into our model to filter and group different regions. Moreover, in order to represent the human action, we consider a characteristic of the neural code: a mean motion map based on analysis of spike trains generated by spiking neurons. The experimental evaluation on some publicly available action datasets and comparison with state-of-the-art approaches demonstrate the superior performance of the proposed model. PMID:26132270
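
    The core filtering step can be illustrated with a single spatio-temporal Gabor kernel tuned to an orientation and speed, as below; all parameter values are illustrative, not those of the paper, and the suppression, attention, and spiking stages are omitted.

```python
# One 3D (x, y, t) Gabor kernel of the kind used to model V1 simple-cell
# receptive fields: a drifting oriented sinusoid under a Gaussian envelope.
import numpy as np

def gabor_3d(size=15, frames=7, theta=0.0, speed=1.0, wavelength=5.0, sigma=3.0):
    r = np.arange(size) - size // 2
    t = np.arange(frames) - frames // 2
    X, Y, T = np.meshgrid(r, r, t, indexing="ij")
    Xr = X * np.cos(theta) + Y * np.sin(theta)     # rotate to the preferred orientation
    Yr = -X * np.sin(theta) + Y * np.cos(theta)
    carrier = np.cos(2 * np.pi * (Xr - speed * T) / wavelength)  # drift sets speed tuning
    envelope = np.exp(-(Xr**2 + Yr**2) / (2 * sigma**2) - T**2 / (2 * (frames / 4.0) ** 2))
    return carrier * envelope

kernel = gabor_3d(theta=np.pi / 4, speed=1.5)
print(kernel.shape)   # (15, 15, 7); convolving a video volume with it yields motion energy
```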

  19. Assessment of cell death mechanisms triggered by 177Lu-anti-CD20 in lymphoma cells.

    PubMed

    Azorín-Vega, E; Rojas-Calderón, E; Martínez-Ventura, B; Ramos-Bernal, J; Serrano-Espinoza, L; Jiménez-Mancilla, N; Ordaz-Rosado, D; Ferro-Flores, G

    2018-08-01

    The aim of this research was to evaluate the cell cycle redistribution and activation of early and late apoptotic pathways in lymphoma cells after treatment with 177Lu-anti-CD20. Experimental and computer models were used to calculate the radiation absorbed dose to cancer cell nuclei. The computer model (Monte Carlo, PENELOPE) consisted of twenty spheres representing cells, each with an inner sphere (cell nucleus), embedded in culture media. Radiation emissions of the radiopharmaceutical located in cell membranes and in culture media were considered for nuclei dose calculations. Flow cytometric analyses demonstrated that doses as low as 4.8 Gy are enough to induce cell cycle arrest and activate late apoptotic pathways. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. A novel track-before-detect algorithm based on optimal nonlinear filtering for detecting and tracking infrared dim target

    NASA Astrophysics Data System (ADS)

    Tian, Yuexin; Gao, Kun; Liu, Ying; Han, Lu

    2015-08-01

    To address the nonlinear and non-Gaussian features of real infrared scenes, an algorithm based on optimal nonlinear filtering is proposed for the infrared dim-target track-before-detect application. It uses nonlinear theory to construct the state and observation models, and resolves the stochastic differential equation of the constructed models with the spectral-separation-scheme-based Wiener chaos expansion method. In order to improve computational efficiency, the most time-consuming operations, those independent of the observation data, are performed in a pre-observation stage; the remaining rapid, observation-dependent computations are implemented subsequently. Simulation results show that the algorithm possesses excellent detection performance and is more suitable for real-time processing.

  1. 3D nozzle flow simulations including state-to-state kinetics calculation

    NASA Astrophysics Data System (ADS)

    Cutrone, L.; Tuttafesta, M.; Capitelli, M.; Schettino, A.; Pascazio, G.; Colonna, G.

    2014-12-01

    In supersonic and hypersonic flows, thermal and chemical non-equilibrium is one of the fundamental aspects that must be taken into account for accurate characterization of the plasma. In this paper, we present an optimized methodology for plasma numerical simulation using state-to-state kinetics calculations in a fully 3D Navier-Stokes CFD solver. Numerical simulations of an expanding flow are presented, aimed at comparing the behavior of state-to-state chemical kinetics models with the macroscopic thermochemical non-equilibrium models that are usually used in numerical computations of high-temperature hypersonic flows. The comparison focuses both on the differences in the numerical results and on the computational effort associated with each approach.

  2. EUROPLANET-RI modelling service for the planetary science community: European Modelling and Data Analysis Facility (EMDAF)

    NASA Astrophysics Data System (ADS)

    Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian

    2010-05-01

    Computational modeling and observational data analysis are two major aspects of modern scientific research, and both are currently under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation, and the technical capabilities for rapid communication, allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on Europe's existing computational modelling and data analysis centers, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will emerge as a product of collaborative communication and joint research efforts of numerical and data analysis experts together with planetary scientists. The major goal of EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user oriented, 3) continuously developing and rapidly upgrading computational and data analysis service to support and intensify European planetary research. Initially, EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) development and implementation of tools for remote interactive communication between planetary scientists and computing experts (including related RIs); (b) development of standard routine packages and user-friendly interfaces for operation of existing numerical codes and data analysis algorithms by specialized planetary scientists; (c) development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) development of a prototype of coordinated, interconnected simulations of planetary phenomena and objects (global multi-model simulators); and (f) provision of demonstrators of coordinated use of high-performance computing facilities (supercomputer networks), done in cooperation with the European HPC Grid DEISA.

  3. Towards a Sufficient Theory of Transition in Cognitive Development.

    ERIC Educational Resources Information Center

    Wallace, J. G.

    The work reported aims at the construction of a sufficient theory of transition in cognitive development. The method of theory construction employed is computer simulation of cognitive process. The core of the model of transition presented comprises self-modification processes that, as a result of continuously monitoring an exhaustive record of…

  4. The Construction of a Multidimensional Spiritual Identity via ICT

    ERIC Educational Resources Information Center

    Gross, Zehavit

    2006-01-01

    This article aims to examine how media and computers can serve as a vehicle for the enhancement of spiritual and religious identity and socialization. An innovative typological model (RSTM) for assessing secularity and religiosity and its implications on the need to utilize advanced information and communication technologies (ICT) are discussed.…

  5. Using Epidemiological Survey Data to Examine Factors Influencing Participation in Parent-Training Programmes

    ERIC Educational Resources Information Center

    Morawska, Alina; Dyah Ramadewi, Mikha; Sanders, Matthew R.

    2014-01-01

    Evidence-based parent-training programmes aim to reduce child behaviour problems; however, the effects of these programmes are often limited by poor participation rates. This study proposes a model of parent, child and family factors related to parental participation in parenting interventions. A computer-assisted telephone interview was used to…

  6. Preparing Computing Students for Culturally Diverse E-Mediated IT Projects

    ERIC Educational Resources Information Center

    Conrad, Marc; French, Tim; Maple, Carsten; Zhang, Sijing

    2006-01-01

    In this paper we present an account of an undergraduate team-based assignment designed to facilitate, exhibit and record team-working skills in an e-mediated environment. By linking the student feedback received to Hofstede's classic model of cultural dimensions we aim to show the assignment's suitability in revealing the student's multi-cultural…

  7. Fun Microbiology: How To Measure Growth of a Fungus.

    ERIC Educational Resources Information Center

    Mitchell, James K.; And Others

    1997-01-01

    Describes an experiment to demonstrate a simple method for measuring fungus growth by monitoring the effect of temperature on the growth of Trichoderma viride. Among the advantages that this experimental model provides is introducing students to the importance of using the computer as a scientific tool for analyzing and presenting data. (AIM)

  8. Development of a reactive burn model based on an explicit viscoplastic pore collapse model

    NASA Astrophysics Data System (ADS)

    Bouton, E.; Lefrançois, A.; Belmas, R.

    2017-01-01

    The aim of this study is to develop a reactive burn model, based upon a microscopic hot-spot model, to compute the shock initiation of pressed TATB high explosives. Such a model has been implemented in a Lagrangian hydrodynamic code. In our calculations, 8 pore radii, ranging from 40 nm to 0.63 μm, have been taken into account, and the porosity fraction associated with each void radius has been deduced from Ultra-Small-Angle X-ray Scattering (USAXS) measurements for PBX-9502. The last parameter of our model is a burn rate that depends on three variables: the first two are the reaction progress variable and the lead shock pressure; the last is the number of chemical reaction sites produced in the flow, calculated by the microscopic model. This burn rate has been calibrated by fitting pressure profiles, velocity profiles and run distances to detonation. As the computed results are in close agreement with the measured ones, this model is able to perform a wide variety of numerical simulations including single and double shock waves and the desensitization phenomenon.
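
    To make the role of the three-variable burn rate concrete, the toy integration below uses an assumed power-law form dλ/dt = k·N·P^b·(1-λ)^(2/3); the functional form and every constant are placeholders for demonstration, not the paper's calibrated law.

```python
# Toy burn-fraction integration with a rate depending on reaction progress
# (lambda), lead shock pressure, and hot-spot (reaction site) number.
def burn_rate(lam, p_shock, n_sites, k=1e9, b=2.0):
    # grows with site count and pressure, slows as the reactant depletes
    return k * n_sites * p_shock**b * (1.0 - lam) ** (2.0 / 3.0)

lam, t, dt = 0.0, 0.0, 1e-9          # time step in seconds
p_shock, n_sites = 10.0, 5e-4        # GPa and site density, hypothetical values
while lam < 0.99:
    lam = min(1.0, lam + dt * burn_rate(lam, p_shock, n_sites))
    t += dt
print(f"~99% reacted after {t * 1e9:.0f} ns")
```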

  9. Three-Dimensional Modeling May Improve Surgical Education and Clinical Practice.

    PubMed

    Jones, Daniel B; Sung, Robert; Weinberg, Crispin; Korelitz, Theodore; Andrews, Robert

    2016-04-01

    Three-dimensional (3D) printing has been used in the manufacturing industry for rapid prototyping and product testing. The aim of our study was to assess the feasibility of creating anatomical 3D models from a digital image using 3D printers. Furthermore, we sought face validity of models and explored potential opportunities for using 3D printing to enhance surgical education and clinical practice. Computed tomography and magnetic resonance images were reviewed, converted to computer models, and printed by stereolithography to create near exact replicas of human organs. Medical students and surgeons provided feedback via survey at the 2014 Surgical Education Week conference. There were 51 respondents, and 95.8% wanted these models for their patients. Cost was a concern, but 82.6% found value in these models at a price less than $500. All respondents thought the models would be useful for integration into the medical school curriculum. Three-dimensional printing is a potentially disruptive technology to improve both surgical education and clinical practice. As the technology matures and cost decreases, we envision 3D models being increasingly used in surgery. © The Author(s) 2015.

  10. [Artificial intelligence meeting neuropsychology. Semantic memory in normal and pathological aging].

    PubMed

    Aimé, Xavier; Charlet, Jean; Maillet, Didier; Belin, Catherine

    2015-03-01

    Artificial intelligence (AI) is the subject of much research, but also many fantasies. It aims to reproduce human intelligence in its learning capacity, knowledge storage and computation. In 2014, the Defense Advanced Research Projects Agency (DARPA) started the Restoring Active Memory (RAM) program, which attempts to develop implantable technology to bridge gaps in the injured brain and restore normal memory function to people with memory loss caused by injury or disease. In another field of AI, computational ontologies (formal and shared conceptualizations) attempt to model knowledge in order to represent a structured and unambiguous meaning of the concepts of a target domain. The aim of these structures is to ensure a consensual understanding of their meaning and a consistent use (the same concept is used by all to categorize the same individuals). The first knowledge representations in the AI domain were largely based on models of semantic memory. Semantic memory, a component of long-term memory, is the memory of words, ideas and concepts. It is the only declarative memory system that resists the effects of age so remarkably. In contrast, non-specific cognitive changes may decrease the performance of the elderly on various tasks, reflecting difficulties of access to semantic representations rather than degradation of the semantic stock itself. Some dementias, like semantic dementia and Alzheimer's disease, are linked to alteration of semantic memory. Using the computational ontologies model, we propose in this paper a formal and relatively lightweight modeling in the service of neuropsychology: 1) for the practitioner, through decision-support systems; 2) for the patient, as an outsourced cognitive prosthesis; and 3) for the researcher, to study semantic memory.

  11. Continuum and discrete approach in modeling biofilm development and structure: a review.

    PubMed

    Mattei, M R; Frunzo, L; D'Acunto, B; Pechaud, Y; Pirozzi, F; Esposito, G

    2018-03-01

    The scientific community has recognized that almost 99% of the microbial life on earth is represented by biofilms. Considering the impacts of their sessile lifestyle on both natural and human activities, extensive experimental activity has been carried out to understand how biofilms grow and interact with the environment. Many mathematical models have also been developed to simulate and elucidate the main processes characterizing the biofilm growth. Two main mathematical approaches for biomass representation can be distinguished: continuum and discrete. This review is aimed at exploring the main characteristics of each approach. Continuum models can simulate the biofilm processes in a quantitative and deterministic way. However, they require a multidimensional formulation to take into account the biofilm spatial heterogeneity, which makes the models quite complicated, requiring significant computational effort. Discrete models are more recent and can represent the typical multidimensional structural heterogeneity of biofilm reflecting the experimental expectations, but they generate computational results including elements of randomness and introduce stochastic effects into the solutions.

  12. RF Models for Plasma-Surface Interactions in VSim

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.

    2014-10-01

    An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath physics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.

  13. A general-purpose framework to simulate musculoskeletal system of human body: using a motion tracking approach.

    PubMed

    Ehsani, Hossein; Rostami, Mostafa; Gudarzi, Mohammad

    2016-02-01

    Computation of the muscle force patterns that produce specified movements of muscle-actuated dynamic models is an important and challenging problem. The problem is underdetermined, so a proper optimization is required to calculate muscle forces. The purpose of this paper is to develop a general model for calculating all muscle activation and force patterns in an arbitrary human body movement. To this end, the forward-dynamics equations of a multibody system representing the skeletal system of the human body model are derived using the Lagrange-Euler formulation. Next, muscle contraction dynamics is added to this model, and the forward dynamics of an arbitrary musculoskeletal system is obtained. For optimization purposes, the obtained model is used in the computed muscle control algorithm, and a closed-loop system for tracking desired motions is derived. Finally, a popular sport exercise, the biceps curl, is simulated using this algorithm, and the validity of the obtained results is evaluated via EMG signals.
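
    The redundancy-resolution step inside such an algorithm can be sketched as a static optimization: distribute a required joint torque over more muscles than equations by minimizing summed squared activations. The moment arms, strengths, and torque below are invented, and the tracking-law and forward-integration parts of computed muscle control are omitted.

```python
# One static-optimization step: find activations a in [0, 1] minimizing
# sum(a^2) subject to the muscles producing the demanded elbow torque.
import numpy as np
from scipy.optimize import minimize

F_max = np.array([800.0, 400.0, 300.0])    # hypothetical max isometric forces (N)
r = np.array([0.03, 0.025, 0.02])          # hypothetical moment arms (m)
tau_required = 25.0                        # torque demanded by the tracking law (N*m)

cons = {"type": "eq", "fun": lambda a: a @ (F_max * r) - tau_required}
res = minimize(lambda a: np.sum(a**2), x0=np.full(3, 0.5),
               bounds=[(0.0, 1.0)] * 3, constraints=cons, method="SLSQP")
print("activations:", np.round(res.x, 3), " torque:", res.x @ (F_max * r))
```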

  14. Stents: Biomechanics, Biomaterials, and Insights from Computational Modeling.

    PubMed

    Karanasiou, Georgia S; Papafaklis, Michail I; Conway, Claire; Michalis, Lampros K; Tzafriri, Rami; Edelman, Elazer R; Fotiadis, Dimitrios I

    2017-04-01

    Coronary stents have revolutionized the treatment of coronary artery disease. Improvement in clinical outcomes requires detailed evaluation of stent biomechanics and of the effectiveness and safety of biomaterials, aiming at the optimization of endovascular devices. Stents need to harmonize with the hemodynamic environment and promote beneficial vessel healing processes with decreased thrombogenicity. Stent design variables and expansion properties are critical for vessel scaffolding. Drug elution from stents can help inhibit in-stent restenosis but adds further complexity, as drug release kinetics and coating formulations can dominate tissue responses. Biodegradable and bioabsorbable stents go one step further, providing complete absorption over time governed by corrosion and erosion mechanisms. The advances in computing power and computational methods have enabled the application of numerical simulations and the in silico evaluation of the performance of stent devices made of complex alloys and bioerodible materials in a range of dimensions and designs and with the capacity to retain and elute bioactive agents. This review presents the current knowledge on stent biomechanics, stent fatigue, as well as drug release and the mechanisms governing biodegradability, focusing on the insights from computational modeling approaches.

  15. Sparsity-based fast CGH generation using layer-based approach for 3D point cloud model

    NASA Astrophysics Data System (ADS)

    Kim, Hak Gu; Jeong, Hyunwook; Ro, Yong Man

    2017-03-01

    Computer-generated holography (CGH) is becoming increasingly important for 3-D displays in various applications, including virtual reality. In CGH, holographic fringe patterns are generated by numerical calculation on computer simulation systems. However, a heavy computational cost is required to calculate the complex amplitude on the CGH plane for all points of a 3D object. This paper proposes a new fast CGH generation method based on the sparsity of the CGH for a 3D point cloud model. The aim of the proposed method is to significantly reduce computational complexity while maintaining the quality of the holographic fringe patterns. To that end, we present a new layer-based approach for calculating the complex amplitude distribution on the CGH plane using a sparse FFT (sFFT). We observe that the CGH of a single layer of a 3D object is sparse, so the dominant CGH components can be rapidly generated from a small set of signals by sFFT. Experimental results have shown that the proposed method is one order of magnitude faster than recently reported fast CGH generation methods.
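
    The layer-based idea, one FFT-based propagation per depth layer instead of one computation per point, can be sketched as follows. A dense FFT with an angular-spectrum kernel is used here for clarity where the paper substitutes a sparse FFT; the grid size, pitch, wavelength, and toy point cloud are assumptions.

```python
# Layer-based CGH sketch: slice the point cloud by depth, propagate each
# layer to the hologram plane with one FFT pair, and sum the fields.
import numpy as np

N, pitch, wl = 256, 8e-6, 532e-9                    # grid, pixel pitch (m), wavelength (m)
fx = np.fft.fftfreq(N, d=pitch)
FX, FY = np.meshgrid(fx, fx)

def propagate(field, z):
    """Angular-spectrum propagation of a complex field over distance z."""
    arg = np.maximum(1.0 - (wl * FX) ** 2 - (wl * FY) ** 2, 0.0)
    H = np.exp(1j * 2 * np.pi / wl * z * np.sqrt(arg))
    return np.fft.ifft2(np.fft.fft2(field) * H)

rng = np.random.default_rng(3)
points = rng.integers(0, N, (300, 2))               # toy point cloud (pixel coordinates)
depths = rng.choice([0.05, 0.06, 0.07], 300)        # three depth layers (m)

hologram = np.zeros((N, N), complex)
for z in np.unique(depths):
    layer = np.zeros((N, N), complex)
    sel = depths == z
    layer[points[sel, 0], points[sel, 1]] = 1.0     # all points of one layer at once
    hologram += propagate(layer, z)                 # one FFT pair per layer, not per point
fringe = np.angle(hologram)                         # phase-only fringe pattern
print(fringe.shape)
```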

  16. Joint kinematic calculation based on clinical direct kinematic versus inverse kinematic gait models.

    PubMed

    Kainz, H; Modenese, L; Lloyd, D G; Maine, S; Walsh, H P J; Carty, C P

    2016-06-14

    Most clinical gait laboratories use the conventional gait analysis model. This model uses a computational method called Direct Kinematics (DK) to calculate joint kinematics. In contrast, musculoskeletal modelling approaches use Inverse Kinematics (IK) to obtain joint angles. IK allows additional analysis (e.g. muscle-tendon length estimates), which may provide valuable information for clinical decision-making in people with movement disorders. The twofold aims of the current study were: (1) to compare joint kinematics obtained by a clinical DK model (Vicon Plug-in-Gait) with those produced by a widely used IK model (available with the OpenSim distribution), and (2) to evaluate the difference in joint kinematics that can be solely attributed to the different computational methods (DK versus IK), anatomical models and marker sets by using MRI based models. Eight children with cerebral palsy were recruited and presented for gait and MRI data collection sessions. Differences in joint kinematics up to 13° were found between the Plug-in-Gait and the gait 2392 OpenSim model. The majority of these differences (94.4%) were attributed to differences in the anatomical models, which included different anatomical segment frames and joint constraints. Different computational methods (DK versus IK) were responsible for only 2.7% of the differences. We recommend using the same anatomical model for kinematic and musculoskeletal analysis to ensure consistency between the obtained joint angles and musculoskeletal estimates. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Big Data Processing for a Central Texas Groundwater Case Study

    NASA Astrophysics Data System (ADS)

    Cantu, A.; Rivera, O.; Martínez, A.; Lewis, D. H.; Gentle, J. N., Jr.; Fuentes, G.; Pierce, S. A.

    2016-12-01

    As computational methods improve, scientists are able to expand the level and scale of experimental simulation and testing completed for case studies. This study presents a comparative analysis of multiple models for the Barton Springs segment of the Edwards aquifer. Several numerical simulations using state-mandated MODFLOW models, run on Stampede, a High Performance Computing system housed at the Texas Advanced Computing Center, were performed for multiple-scenario testing. One goal of this multidisciplinary project is to visualize and compare the output data of the groundwater model using the statistical programming language R, to find revealing data patterns produced by different pumping scenarios. Presenting data in a friendly post-processing format is covered in this paper. Visualization of the data and creation of workflows applicable to the management of the data are tasks performed after data extraction. The resulting analyses provide an example of how supercomputing can be used to accelerate the evaluation of scientific uncertainty and geological knowledge in relation to policy and management decisions. Understanding the aquifer's behavior helps policy makers avoid negative impacts on endangered species and environmental services, and aids in maximizing the aquifer yield.

  18. Anticipatory dynamics of biological systems: from molecular quantum states to evolution

    NASA Astrophysics Data System (ADS)

    Igamberdiev, Abir U.

    2015-08-01

    Living systems possess anticipatory behaviour that is based on the flexibility of internal models generated by the system's embedded description. The idea was suggested by Aristotle and was explicitly introduced into theoretical biology by Rosen. The possibility of holding the embedded internal model is grounded in the principle of stable non-equilibrium (Bauer). From the quantum mechanical view, this principle aims to minimize energy dissipation at the expense of long relaxation times. The ideas of stable non-equilibrium were developed by Liberman, who viewed living systems as subdivided into a quantum regulator and a molecular computer supporting the coherence of the regulator's internal quantum state. The computational power of the cell's molecular computer is based on the possibility of molecular rearrangements according to molecular addresses. In evolution, anticipatory strategies are realized both as a precession of phylogenesis by ontogenesis (Berg) and as the anticipatory search for genetic fixation of adaptive changes, which incorporates them into the internal model of the genetic system. We discuss how these fundamental ideas of anticipation can be introduced into the basic foundations of theoretical biology.

  19. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  20. Numerical Investigations of Two Typical Unsteady Flows in Turbomachinery Using the Multi-Passage Model

    NASA Astrophysics Data System (ADS)

    Zhou, Di; Lu, Zhiliang; Guo, Tongqing; Shen, Ennan

    2016-06-01

    In this paper, two types of unsteady flow problems in turbomachinery, blade flutter and rotor-stator interaction, are investigated by means of numerical simulation. For the former, the energy method is often used to predict aeroelastic stability by calculating the aerodynamic work per vibration cycle; the inter-blade phase angle (IBPA) is an important parameter in the computation and may have significant effects on aeroelastic behavior. For the latter, the numbers of blades in each row are usually not equal, and the unsteady rotor-stator interactions can be strong; an effective way to perform multi-row calculations is the domain scaling method (DSM). These two cases share a common feature: considering their respective characteristics, the computational domain has to be extended to multiple passages (MP). The present work is aimed at modeling these two issues with the developed MP model. Computational fluid dynamics (CFD) techniques are applied to resolve the unsteady Reynolds-averaged Navier-Stokes (RANS) equations and simulate the flow fields. With the parallel technique, the additional time cost due to modeling more passages can be largely reduced. Results are presented for two test cases: a vibrating rotor blade and a turbine stage.

  1. An overview of bioinformatics methods for modeling biological pathways in yeast.

    PubMed

    Hou, Jie; Acharya, Lipi; Zhu, Dongxiao; Cheng, Jianlin

    2016-03-01

    The advent of high-throughput genomics techniques, along with the completion of genome sequencing projects, identification of protein-protein interactions and reconstruction of genome-scale pathways, has accelerated the development of systems biology research in the yeast organism Saccharomyces cerevisiae. In particular, discovery of biological pathways in yeast has become an important forefront in systems biology, which aims to understand the interactions among molecules within a cell leading to certain cellular processes in response to a specific environment. While the existing theoretical and experimental approaches enable the investigation of well-known pathways involved in metabolism, gene regulation and signal transduction, bioinformatics methods offer new insights into computational modeling of biological pathways. A wide range of computational approaches has been proposed in the past for reconstructing biological pathways from high-throughput datasets. Here we review selected bioinformatics approaches for modeling biological pathways in S. cerevisiae, including metabolic pathways, gene-regulatory pathways and signaling pathways. We start with reviewing the research on biological pathways followed by discussing key biological databases. In addition, several representative computational approaches for modeling biological pathways in yeast are discussed. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  2. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization

    PubMed Central

    Chen, Qingkui; Zhao, Deyu; Wang, Jingjuan

    2017-01-01

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes’ diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services. PMID:28777325

  3. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization.

    PubMed

    Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan

    2017-08-04

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.
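
    The "best blocks and threads configuration" search can be illustrated with a classic occupancy heuristic: for each candidate block size, the number of resident blocks per multiprocessor is bounded by threads, registers, and shared memory, and the size maximizing occupancy wins. The device limits below are Fermi-like illustrative values, not figures from the paper, and the ILP-coupling part of the TLPOM is not modeled.

```python
# Resource-constrained block-size selection sketch (theoretical occupancy).
MAX_THREADS_PER_SM = 1536
MAX_BLOCKS_PER_SM = 8
REGS_PER_SM = 32768
SMEM_PER_SM = 48 * 1024

def occupancy(block_size, regs_per_thread, smem_per_block):
    """Fraction of the SM's thread slots kept busy, given each resource cap."""
    by_threads = MAX_THREADS_PER_SM // block_size
    by_regs = REGS_PER_SM // (regs_per_thread * block_size)
    by_smem = SMEM_PER_SM // smem_per_block if smem_per_block else MAX_BLOCKS_PER_SM
    blocks = min(by_threads, by_regs, by_smem, MAX_BLOCKS_PER_SM)
    return blocks * block_size / MAX_THREADS_PER_SM

candidates = [64, 128, 192, 256, 384, 512]
best = max(candidates, key=lambda b: occupancy(b, regs_per_thread=24, smem_per_block=4096))
print("best block size:", best, " occupancy:", occupancy(best, 24, 4096))
```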

  4. Mechanisms and Kinetics of Amyloid Aggregation Investigated by a Phenomenological Coarse-Grained Model

    NASA Astrophysics Data System (ADS)

    Magno, Andrea; Pellarin, Riccardo; Caflisch, Amedeo

    Amyloid fibrils are ordered polypeptide aggregates that have been implicated in several neurodegenerative pathologies, such as Alzheimer's, Parkinson's, Huntington's, and prion diseases, [1, 2] and, more recently, also in biological functionalities. [3, 4, 5] These findings have paved the way for a wide range of experimental and computational studies aimed at understanding the details of the fibril-formation mechanism. Computer simulations using low-resolution models, which employ a simplified representation of protein geometry and energetics, have provided insights into the basic physical principles underlying protein aggregation in general [6, 7, 8] and ordered amyloid aggregation. [9, 10, 11, 12, 13, 14, 15] For example, Dokholyan and coworkers have used the Discrete Molecular Dynamics method [16, 17] to shed light on the mechanisms of protein oligomerization [18] and the conformational changes that take place in proteins before the aggregation onset. [19, 20] One challenging aspect, difficult to reproduce in computer simulations, is the wide range of aggregation scenarios emerging from a variety of biophysical measurements. [21, 22] Atomistic models have been employed to study the conformational space of amyloidogenic polypeptides in the monomeric state, [23, 24, 25] the very initial steps of amyloid formation, [26, 27, 28, 29, 30, 31, 32] and the structural stability of fibril models. [33, 34, 35] However, all-atom simulations of the kinetics of fibril formation are beyond what can be done with modern computers.

  5. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.

    PubMed

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei

    2016-01-22

    Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including a reduction in the number of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive testing (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases, generated following a normal distribution (mean [SD] 0 [1]), were simulated using three Rasch models with three fixed-item scenarios (dichotomous, rating scale, and partial credit), respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and a count difference number ratio of less than 5%, using independent t tests). We found that use of CAT led to a smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for the dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administration of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.
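
    The adaptive loop itself is compact: administer the item with maximum Fisher information at the current ability estimate, update the estimate, and stop early. The sketch below uses a dichotomous Rasch model for brevity (the study calibrates with the PCM, where the selection logic is the same) and simulates both the item bank and the respondent.

```python
# Minimal CAT loop under a dichotomous Rasch model.
import numpy as np

rng = np.random.default_rng(4)
b = rng.normal(0.0, 1.0, 30)            # calibrated item difficulties (simulated)
theta_true = 0.8                        # respondent's latent trait (simulated)

def p_yes(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

theta, given, resp = 0.0, [], []
for _ in range(10):                     # stop after 10 of 30 items (reduced burden)
    info = p_yes(theta, b) * (1.0 - p_yes(theta, b))   # Fisher information per item
    info[given] = -np.inf               # never repeat an item
    j = int(np.argmax(info))
    given.append(j)
    resp.append(rng.random() < p_yes(theta_true, b[j]))
    p = p_yes(theta, b[given])          # one Newton step on the log-likelihood
    theta += np.sum(np.array(resp) - p) / max(np.sum(p * (1.0 - p)), 1e-6)

print(f"theta estimate {theta:.2f} (true {theta_true}) after {len(given)} items")
```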

  6. Review on modeling of the anode solid electrolyte interphase (SEI) for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Wang, Aiping; Kadam, Sanket; Li, Hong; Shi, Siqi; Qi, Yue

    2018-03-01

    A passivation layer called the solid electrolyte interphase (SEI) is formed on electrode surfaces from decomposition products of electrolytes. The SEI allows Li+ transport and blocks electrons in order to prevent further electrolyte decomposition and ensure continued electrochemical reactions. The formation and growth mechanism of the nanometer thick SEI films are yet to be completely understood owing to their complex structure and lack of reliable in situ experimental techniques. Significant advances in computational methods have made it possible to predictively model the fundamentals of SEI. This review aims to give an overview of state-of-the-art modeling progress in the investigation of SEI films on the anodes, ranging from electronic structure calculations to mesoscale modeling, covering the thermodynamics and kinetics of electrolyte reduction reactions, SEI formation, modification through electrolyte design, correlation of SEI properties with battery performance, and the artificial SEI design. Multi-scale simulations have been summarized and compared with each other as well as with experiments. Computational details of the fundamental properties of SEI, such as electron tunneling, Li-ion transport, chemical/mechanical stability of the bulk SEI and electrode/(SEI/) electrolyte interfaces have been discussed. This review shows the potential of computational approaches in the deconvolution of SEI properties and design of artificial SEI. We believe that computational modeling can be integrated with experiments to complement each other and lead to a better understanding of the complex SEI for the development of a highly efficient battery in the future.

  7. Computational Modeling of Tissue Self-Assembly

    NASA Astrophysics Data System (ADS)

    Neagu, Adrian; Kosztin, Ioan; Jakab, Karoly; Barz, Bogdan; Neagu, Monica; Jamison, Richard; Forgacs, Gabor

    As a theoretical framework for understanding the self-assembly of living cells into tissues, Steinberg proposed the differential adhesion hypothesis (DAH), according to which a specific cell type possesses a specific adhesion apparatus that, combined with cell motility, leads to cell assemblies of various cell types in the lowest adhesive energy state. Experimental and theoretical efforts of four decades turned the DAH into a fundamental principle of developmental biology that has been validated both in vitro and in vivo. Based on computational models of cell sorting, we have developed a DAH-based lattice model for tissues in interaction with their environment and simulated biological self-assembly using the Monte Carlo method. The present brief review highlights results on specific morphogenetic processes with relevance to tissue engineering applications. Our own work is presented against the background of several decades of theoretical efforts aimed at modeling morphogenesis in living tissues. Simulations of systems involving about 10^5 cells have been performed on high-end personal computers with CPU times on the order of days. Studied processes include cell sorting, cell sheet formation, and the development of endothelialized tubes from rings made of spheroids of two randomly intermixed cell types, where the medium in the interior of the tube differed from the external one. We conclude by noting that computer simulations based on mathematical models of living tissues yield useful guidelines for laboratory work and can catalyze the emergence of innovative technologies in tissue engineering.
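
    A stripped-down version of such a lattice simulation is shown below: two cell types with type-dependent contact energies, and Metropolis-accepted swaps of neighbouring cells that drive the configuration toward the lowest adhesive energy, so the more cohesive type clusters. The energies, lattice size, and temperature are illustrative.

```python
# 2D Metropolis Monte Carlo sketch of DAH-driven cell sorting.
import numpy as np

rng = np.random.default_rng(5)
L = 40
grid = rng.integers(1, 3, (L, L))                   # two intermixed cell types, 1 and 2
J = {(1, 1): -1.6, (2, 2): -1.0, (1, 2): -0.6, (2, 1): -0.6}   # contact energies
nbrs = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

def site_energy(g, i, j):
    return sum(J[(int(g[i, j]), int(g[(i + di) % L, (j + dj) % L]))] for di, dj in nbrs)

T = 0.5                                             # "motility" temperature
for _ in range(50000):
    i, j = rng.integers(0, L, 2)
    di, dj = nbrs[rng.integers(4)]
    k, l = (i + di) % L, (j + dj) % L
    e0 = site_energy(grid, i, j) + site_energy(grid, k, l)
    grid[i, j], grid[k, l] = grid[k, l], grid[i, j]             # trial swap
    dE = site_energy(grid, i, j) + site_energy(grid, k, l) - e0
    if dE > 0 and rng.random() >= np.exp(-dE / T):
        grid[i, j], grid[k, l] = grid[k, l], grid[i, j]         # reject: swap back

same = np.mean(grid == np.roll(grid, 1, axis=0))    # same-type contact fraction rises
print("same-type vertical contact fraction:", same)
```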

  8. Prediction of 5-year overall survival in cervical cancer patients treated with radical hysterectomy using computational intelligence methods.

    PubMed

    Obrzut, Bogdan; Kusy, Maciej; Semczuk, Andrzej; Obrzut, Marzanna; Kluska, Jacek

    2017-12-12

    Computational intelligence methods, including non-linear classification algorithms, can be used in medical research and practice as decision-making tools. This study aimed to evaluate the usefulness of artificial intelligence models for predicting 5-year overall survival in patients with cervical cancer treated by radical hysterectomy. The data set was collected from 102 patients with FIGO stage IA2-IIB cervical cancer who underwent primary surgical treatment. Twenty-three demographic and tumor-related parameters and selected perioperative data were collected for each patient. The simulations involved six computational intelligence methods: the probabilistic neural network (PNN), multilayer perceptron network, gene expression programming classifier, support vector machines algorithm, radial basis function neural network and k-Means algorithm. The prediction ability of the models was determined based on accuracy, sensitivity and specificity, as well as the area under the receiver operating characteristic curve. The results of the computational intelligence methods were compared with the results of linear regression analysis as a reference model. The best results were obtained by the PNN model. This neural network provided very high prediction ability, with an accuracy of 0.892 and a sensitivity of 0.975. The area under the receiver operating characteristic curve of the PNN was also high, at 0.818. The outcomes obtained by the other classifiers were markedly worse. The PNN model is an effective tool for predicting 5-year overall survival in cervical cancer patients treated with radical hysterectomy.
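
    The winning classifier is simple to sketch: a PNN is essentially a Parzen-window density estimate per class, assigning a case to the class whose kernel-smoothed density is higher. The data below are synthetic stand-ins for the 23 clinical parameters, and the smoothing width is an assumption.

```python
# Probabilistic neural network (Parzen-window) classification sketch with
# leave-one-out evaluation on synthetic "patients".
import numpy as np

rng = np.random.default_rng(6)
n, d = 102, 23
X = rng.normal(size=(n, d))                        # 23 features per patient (synthetic)
w = rng.normal(size=d)
y = (X @ w + 0.5 * rng.normal(size=n) > 0).astype(int)   # toy 5-year survival label

def pnn_predict(X_tr, y_tr, x, sigma=1.0):
    """Class with the larger average Gaussian kernel over its pattern units."""
    dens = [np.mean(np.exp(-np.sum((X_tr[y_tr == c] - x) ** 2, axis=1) / (2 * sigma**2)))
            for c in (0, 1)]
    return int(np.argmax(dens))

correct = sum(pnn_predict(np.delete(X, i, 0), np.delete(y, i), X[i]) == y[i]
              for i in range(n))
print("leave-one-out accuracy:", correct / n)
```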

  9. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package

    PubMed Central

    2012-01-01

    Background: Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. Results: In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Conclusions: Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org. PMID:23281941

  10. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package.

    PubMed

    El-Kalioby, Mohamed; Abouelhoda, Mohamed; Krüger, Jan; Giegerich, Robert; Sczyrba, Alexander; Wall, Dennis P; Tonellato, Peter

    2012-01-01

    Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org.

  11. ScrumPy: metabolic modelling with Python.

    PubMed

    Poolman, M G

    2006-09-01

    ScrumPy is a software package for the definition and analysis of metabolic models. It is written in the Python programming language, which also serves as its user interface. ScrumPy has features for both kinetic and structural modelling, but the emphasis is on structural modelling and those features of most relevance to the analysis of large (genome-scale) models. The aim here is to describe ScrumPy's functionality for readers with some knowledge of metabolic modelling; implementation, programming and other computational details are omitted. ScrumPy is released under the GNU Public Licence and is available for download from http://mudshark.brookes.ac.uk/ScrumPy.

  12. Approximate Single-Diode Photovoltaic Model for Efficient I-V Characteristics Estimation

    PubMed Central

    Ting, T. O.; Zhang, Nan; Guan, Sheng-Uei; Wong, Prudence W. H.

    2013-01-01

    Precise photovoltaic (PV) behavior models are normally described by nonlinear analytical equations. To solve such equations, it is necessary to use iterative procedures. Aiming to make the computation easier, this paper proposes an approximate single-diode PV model that enables high-speed predictions for the electrical characteristics of commercial PV modules. Based on the experimental data, statistical analysis is conducted to validate the approximate model. Simulation results show that the calculated current-voltage (I-V) characteristics fit the measured data with high accuracy. Furthermore, compared with the existing modeling methods, the proposed model reduces the simulation time by approximately 30% in this work. PMID:24298205
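    For context, the standard implicit single-diode relation that such approximate models aim to avoid iterating on has the well-known textbook form below (symbols as conventionally defined; the paper's own explicit approximation is not reproduced here):

```latex
% Standard implicit single-diode PV model (textbook form, not the paper's
% approximation): photocurrent I_ph, diode saturation current I_0,
% series/shunt resistances R_s, R_sh, ideality factor n, thermal voltage V_T.
\[
  I = I_{ph} - I_0\left[\exp\!\left(\frac{V + I R_s}{n V_T}\right) - 1\right]
      - \frac{V + I R_s}{R_{sh}}
\]
```

    Because I appears on both sides, a numerical root-finder is normally required at every operating point, which is what motivates explicit approximations of this kind.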

  13. Zero-inflated spatio-temporal models for disease mapping.

    PubMed

    Torabi, Mahmoud

    2017-05-01

    In this paper, our aim is to analyze the geographical and temporal variability of disease incidence when spatio-temporal count data have excess zeros. To that end, we consider random effects in zero-inflated Poisson models to investigate geographical and temporal patterns of disease incidence. Spatio-temporal models that employ conditionally autoregressive smoothing across the spatial dimension and B-spline smoothing over the temporal dimension are proposed. The analysis of these complex models is computationally difficult from the frequentist perspective. On the other hand, the advent of the Markov chain Monte Carlo algorithm has made the Bayesian analysis of complex models computationally convenient. The recently developed data cloning method provides a frequentist approach to mixed models that is also computationally convenient. We propose to use data cloning, which yields maximum likelihood estimates, to conduct a frequentist analysis of zero-inflated spatio-temporal modeling of disease incidence. One of the advantages of the data cloning approach is that predictions and corresponding standard errors (or prediction intervals) of smoothed disease incidence over space and time are easily obtained. We illustrate our approach using a real dataset of monthly children's asthma visits to hospital in the province of Manitoba, Canada, during the period April 2006 to March 2010. The performance of our approach is also evaluated through a simulation study. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
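    For reference, the generic zero-inflated Poisson mixture underlying such models can be written as follows; the paper's CAR spatial and B-spline temporal terms enter through the links on λ and π and are not reproduced here:

```latex
% Generic zero-inflated Poisson mixture for count Y_{it} in region i at
% time t, with zero-inflation probability \pi_{it} and Poisson mean
% \lambda_{it} (standard form; covariate and random-effect structure omitted).
\[
  P(Y_{it}=y) =
  \begin{cases}
    \pi_{it} + (1-\pi_{it})\,e^{-\lambda_{it}}, & y = 0,\\[4pt]
    (1-\pi_{it})\,\dfrac{\lambda_{it}^{\,y}\, e^{-\lambda_{it}}}{y!}, & y = 1,2,\dots
  \end{cases}
\]
```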

  14. Characterization of three-dimensional anisotropic heart valve tissue mechanical properties using inverse finite element analysis.

    PubMed

    Abbasi, Mostafa; Barakat, Mohammed S; Vahidkhah, Koohyar; Azadani, Ali N

    2016-09-01

    Computational modeling has an important role in the design and assessment of medical devices. In computational simulations, considering accurate constitutive models is of the utmost importance to capture the mechanical response of soft tissue and biomedical materials under physiological loading conditions. The lack of comprehensive three-dimensional constitutive models for soft tissue limits the effectiveness of computational modeling in the research and development of medical devices. The aim of this study was to use inverse finite element (FE) analysis to determine three-dimensional mechanical properties of bovine pericardial leaflets of a surgical bioprosthesis under dynamic loading conditions. Using inverse parameter estimation, 3D anisotropic Fung model parameters were estimated for the leaflets. The FE simulations were validated using experimental in-vitro measurements, and the impact of different constitutive material models on leaflet stress distribution was investigated. The results of this study showed that the anisotropic Fung model accurately simulated the leaflet deformation and coaptation during valve opening and closing. During systole, the peak stress reached 3.17 MPa at the leaflet boundary, while during diastole high stress regions were primarily observed in the commissures, with a peak stress of 1.17 MPa. In addition, the Rayleigh damping coefficient introduced into the FE simulations to capture the viscous damping effects of the surrounding fluid was determined. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. An Open-Source Toolbox for Surrogate Modeling of Joint Contact Mechanics

    PubMed Central

    Eskinazi, Ilan

    2016-01-01

    Goal: Incorporation of elastic joint contact models into simulations of human movement could facilitate studying the interactions between muscles, ligaments, and bones. Unfortunately, elastic joint contact models are often too expensive computationally to be used within iterative simulation frameworks. This limitation can be overcome by using fast and accurate surrogate contact models that fit or interpolate input-output data sampled from existing elastic contact models. However, construction of surrogate contact models remains an arduous task. The aim of this paper is to introduce an open-source program called Surrogate Contact Modeling Toolbox (SCMT) that facilitates surrogate contact model creation, evaluation, and use. Methods: SCMT interacts with the third party software FEBio to perform elastic contact analyses of finite element models and uses Matlab to train neural networks that fit the input-output contact data. SCMT features sample point generation for multiple domains, automated sampling, sample point filtering, and surrogate model training and testing. Results: An overview of the software is presented along with two example applications. The first example demonstrates creation of surrogate contact models of artificial tibiofemoral and patellofemoral joints and evaluates their computational speed and accuracy, while the second demonstrates the use of surrogate contact models in a forward dynamic simulation of an open-chain leg extension-flexion motion. Conclusion: SCMT facilitates the creation of computationally fast and accurate surrogate contact models. Additionally, it serves as a bridge between FEBio and OpenSim musculoskeletal modeling software. Significance: Researchers may now create and deploy surrogate models of elastic joint contact with minimal effort. PMID:26186761
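    The surrogate idea described above can be illustrated generically as below; SCMT itself trains Matlab neural networks on FEBio samples, whereas this sketch uses an invented "contact model" and scikit-learn, so all names and ranges are placeholders:

```python
# Generic surrogate-model sketch: sample an expensive model, fit a small
# neural network, then use the cheap surrogate inside iterative loops.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_contact_model(pose):
    # Stand-in for an elastic contact analysis: pose -> contact force.
    flex, trans = pose
    return np.exp(-trans) * (1.0 + 0.1 * flex**2)

# Sample the input domain (flexion angle, translation) and evaluate once.
X = rng.uniform([-10.0, 0.0], [10.0, 2.0], size=(500, 2))
y = np.array([expensive_contact_model(p) for p in X])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

# Fast approximate evaluations, e.g. inside a forward dynamic simulation:
print(surrogate.predict([[2.0, 0.5]]))
```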

  16. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
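    As a toy illustration of consuming such a REST interface from Python (the service root, route, and parameter names below are invented placeholders, not documented Ice2Ocean endpoints):

```python
# Hypothetical REST query against a cloud-hosted data service; the URL
# and parameters are placeholders for illustration only.
import requests

BASE = "https://example.org/api/v1"   # placeholder service root

resp = requests.get(f"{BASE}/runoff",
                    params={"basin": "gulf-of-alaska", "year": 2015},
                    timeout=30)
resp.raise_for_status()
for record in resp.json():            # assume the API returns a JSON list
    print(record)
```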

  17. Toward Multiscale Models of Cyanobacterial Growth: A Modular Approach

    PubMed Central

    Westermark, Stefanie; Steuer, Ralf

    2016-01-01

    Oxygenic photosynthesis dominates global primary productivity ever since its evolution more than three billion years ago. While many aspects of phototrophic growth are well understood, it remains a considerable challenge to elucidate the manifold dependencies and interconnections between the diverse cellular processes that together facilitate the synthesis of new cells. Phototrophic growth involves the coordinated action of several layers of cellular functioning, ranging from the photosynthetic light reactions and the electron transport chain, to carbon-concentrating mechanisms and the assimilation of inorganic carbon. It requires the synthesis of new building blocks by cellular metabolism, protection against excessive light, as well as diurnal regulation by a circadian clock and the orchestration of gene expression and cell division. Computational modeling allows us to quantitatively describe these cellular functions and processes relevant for phototrophic growth. As yet, however, computational models are mostly confined to the inner workings of individual cellular processes, rather than describing the manifold interactions between them in the context of a living cell. Using cyanobacteria as model organisms, this contribution seeks to summarize existing computational models that are relevant to describe phototrophic growth and seeks to outline their interactions and dependencies. Our ultimate aim is to understand cellular functioning and growth as the outcome of a coordinated operation of diverse yet interconnected cellular processes. PMID:28083530

  18. Evaluating the roles of detailed endocardial structures on right ventricular haemodynamics by means of CFD simulations.

    PubMed

    Sacco, Federica; Paun, Bruno; Lehmkuhl, Oriol; Iles, Tinen L; Iaizzo, Paul A; Houzeaux, Guillaume; Vázquez, Mariano; Butakoff, Constantine; Aguado-Sierra, Jazmin

    2018-06-11

    Computational modelling plays an important role in right ventricular (RV) haemodynamic analysis. However, current approaches employ smoothed ventricular anatomies. The aim of this study is to characterise RV haemodynamics including detailed endocardial structures such as trabeculae, the moderator band and papillary muscles (PMs). Four paired detailed and smoothed RV endocardium models (two male and two female) were reconstructed from high-resolution magnetic resonance images (MRI) of ex-vivo human hearts. The detailed models include structures with a cross-sectional area ≥ 1 mm². Haemodynamic characterisation was done by computational fluid dynamics (CFD) simulations with steady and transient inflows, using high performance computing (HPC). The differences between the flows in smoothed and detailed models were assessed using the Q-criterion for vorticity quantification, the pressure drop between inlet and outlet, and the wall shear stress (WSS). Results demonstrated that detailed endocardial structures increase the degree of intra-ventricular pressure drop, decrease the WSS and disrupt the dominant vortex, creating secondary small vortices. Increasingly turbulent blood flow was observed in the detailed RVs. Female RVs were less trabeculated and presented lower pressure drops than the male ones. In conclusion, neglecting endocardial structures in RV haemodynamic models may lead to inaccurate conclusions about the pressures, stresses, and blood flow behaviour in the cavity. This article is protected by copyright. All rights reserved.

  19. A computational feedforward model predicts categorization of masked emotional body language for longer, but not for shorter, latencies.

    PubMed

    Stienen, Bernard M C; Schindler, Konrad; de Gelder, Beatrice

    2012-07-01

    Given the presence of massive feedback loops in brain networks, it is difficult to disentangle the contribution of feedforward and feedback processing to the recognition of visual stimuli, in this case, of emotional body expressions. The aim of the work presented in this letter is to shed light on how well feedforward processing explains rapid categorization of this important class of stimuli. By means of parametric masking, it may be possible to control the contribution of feedback activity in human participants. A close comparison is presented between human recognition performance and the performance of a computational neural model that exclusively modeled feedforward processing and was engineered to fulfill the computational requirements of recognition. Results show that the longer the stimulus onset asynchrony (SOA), the closer the performance of the human participants was to the values predicted by the model, with an optimum at an SOA of 100 ms. At short SOA latencies, human performance deteriorated, but the categorization of the emotional expressions was still above baseline. The data suggest that, although theoretically, feedback arising from inferotemporal cortex is likely to be blocked when the SOA is 100 ms, human participants still seem to rely on more local visual feedback processing to equal the model's performance.

  20. Algorithmic mechanisms for reliable crowdsourcing computation under collusion.

    PubMed

    Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A; Pareja, Daniel

    2015-01-01

    We consider a computing system where a master processor assigns a task for execution to worker processors that may collude. We model the workers' decision of whether to comply (compute the task) or not (return a bogus result to save the computation cost) as a game among workers. That is, we assume that workers are rational in a game-theoretic sense. We identify analytically the parameter conditions for a unique Nash Equilibrium where the master obtains the correct result. We also evaluate experimentally mixed equilibria aiming to attain better reliability-profit trade-offs. For a wide range of parameter values that may be used in practice, our simulations show that, in fact, both master and workers are better off using a pure equilibrium where no worker cheats, even under collusion, and even for colluding behaviors that involve deviating from the game.

  1. Algorithmic Mechanisms for Reliable Crowdsourcing Computation under Collusion

    PubMed Central

    Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Pareja, Daniel

    2015-01-01

    We consider a computing system where a master processor assigns a task for execution to worker processors that may collude. We model the workers’ decision of whether to comply (compute the task) or not (return a bogus result to save the computation cost) as a game among workers. That is, we assume that workers are rational in a game-theoretic sense. We identify analytically the parameter conditions for a unique Nash Equilibrium where the master obtains the correct result. We also evaluate experimentally mixed equilibria aiming to attain better reliability-profit trade-offs. For a wide range of parameter values that may be used in practice, our simulations show that, in fact, both master and workers are better off using a pure equilibrium where no worker cheats, even under collusion, and even for colluding behaviors that involve deviating from the game. PMID:25793524
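    A toy numerical check of the comply-versus-cheat trade-off can be sketched as below; the payoff structure and parameter values are invented for illustration and are not the paper's mechanism:

```python
# Toy check of a pure "all comply" equilibrium under an invented payoff
# scheme: a worker is rewarded for a result, paying a computation cost
# if honest, and fined with audit probability p_audit if it cheats.
def worker_utility(comply, p_audit, reward, cost, fine):
    """Expected utility of one worker, other workers assumed honest."""
    if comply:
        return reward - cost
    # Cheating: caught (and fined) only if the master audits.
    return (1 - p_audit) * reward - p_audit * fine

p_audit, reward, cost, fine = 0.2, 10.0, 3.0, 50.0
u_comply = worker_utility(True, p_audit, reward, cost, fine)
u_cheat = worker_utility(False, p_audit, reward, cost, fine)
print(u_comply, u_cheat, "comply is best response:", u_comply >= u_cheat)
```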

  2. QPSO-Based Adaptive DNA Computing Algorithm

    PubMed Central

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a new computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This approach aims to run the DNA computing algorithm with adaptive parameters towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions provided by the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are simultaneously tuned in an adaptive process; (2) the adaptive algorithm uses QPSO for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate the ability of the proposed approach to provide effective optimization, considerable convergence speed, and high accuracy compared to the DNA computing algorithm. PMID:23935409
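    For reference, the standard QPSO position update used for such parameter tuning is shown below, in the usual notation of Sun et al.; β is the contraction-expansion coefficient, u and φ are uniform random draws in (0,1), and mbest is the mean of all personal bests:

```latex
% Standard QPSO position update (not specific to this paper's tuning setup).
\[
\begin{aligned}
  p_i &= \varphi\, \mathrm{pbest}_i + (1-\varphi)\, \mathrm{gbest},\\
  x_i &\leftarrow p_i \pm \beta\,\lvert \mathrm{mbest} - x_i \rvert\, \ln(1/u)
\end{aligned}
\]
```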

  3. Computational understanding of Li-ion batteries

    NASA Astrophysics Data System (ADS)

    Urban, Alexander; Seo, Dong-Hwa; Ceder, Gerbrand

    2016-03-01

    Over the last two decades, computational methods have made tremendous advances, and today many key properties of lithium-ion batteries can be accurately predicted by first principles calculations. For this reason, computations have become a cornerstone of battery-related research by providing insight into fundamental processes that are not otherwise accessible, such as ionic diffusion mechanisms and electronic structure effects, as well as a quantitative comparison with experimental results. The aim of this review is to provide an overview of state-of-the-art ab initio approaches for the modelling of battery materials. We consider techniques for the computation of equilibrium cell voltages, 0-Kelvin and finite-temperature voltage profiles, ionic mobility and thermal and electrolyte stability. The strengths and weaknesses of different electronic structure methods, such as DFT+U and hybrid functionals, are discussed in the context of voltage and phase diagram predictions, and we review the merits of lattice models for the evaluation of finite-temperature thermodynamics and kinetics. With such a complete set of methods at hand, first principles calculations of ordered, crystalline solids, i.e., of most electrode materials and solid electrolytes, have become reliable and quantitative. However, the description of molecular materials and disordered or amorphous phases remains an important challenge. We highlight recent exciting progress in this area, especially regarding the modelling of organic electrolytes and solid-electrolyte interfaces.
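    For concreteness, the equilibrium cell voltage mentioned above is commonly obtained from first-principles total energies via the standard relation below (a textbook formula, not specific to this review):

```latex
% Average equilibrium voltage between lithiation limits x_1 < x_2, from
% first-principles total energies E of the lithiated host and of bcc Li.
\[
  \bar{V} = -\,\frac{E(\mathrm{Li}_{x_2}\mathrm{Host})
                   - E(\mathrm{Li}_{x_1}\mathrm{Host})
                   - (x_2 - x_1)\,E(\mathrm{Li})}
                  {(x_2 - x_1)\,e}
\]
```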

  4. From design to manufacturing of asymmetric teeth gears using computer application

    NASA Astrophysics Data System (ADS)

    Suciu, F.; Dascalescu, A.; Ungureanu, M.

    2017-05-01

    Asymmetric cylindrical gears, with involute tooth profiles having different base circle diameters, are nonstandard gears used with the aim of obtaining better functional parameters for the active profile. One would expect that manufacturing these gears becomes possible only after the design and realization of specific tools. The paper presents how computer-aided design and applications developed in MATLAB, used to obtain the geometrical parameters as well as to calculate functional parameters such as stress and displacements, transmission error and gear efficiency, together with the 2D models generated with AUTOLISP applications, are used for the computer-aided manufacturing of asymmetric gears with standard tools. Thus the specific tools, considered one of the disadvantages of these gears, are not necessary, and implicitly the expected supplementary costs are avoided. The calculation algorithm established for the asymmetric gear design application uses the "direct design" of spur gears. This method offers the possibility of determining the parameters of the gears first, followed by the determination of the asymmetric gear rack's parameters based on those of the gears. Using the original design method and computer applications, the geometrical parameters and the 2D and 3D models of the asymmetric gears have been determined, and on the basis of these models asymmetric gears have been manufactured on a CNC machine tool.

  5. Analytical and scale model research aimed at improved hangglider design

    NASA Technical Reports Server (NTRS)

    Kroo, I.; Chang, L. S.

    1979-01-01

    Research consisted of a theoretical analysis which attempts to predict aerodynamic characteristics using lifting surface theory and finite-element structural analysis as well as an experimental investigation using 1/5 scale elastically similar models in the NASA Ames 2m x 3m (7' x 10') wind tunnel. Experimental data were compared with theoretical results in the development of a computer program which may be used in the design and evaluation of ultralight gliders.

  6. High-resolution downscaling for hydrological management

    NASA Astrophysics Data System (ADS)

    Ulbrich, Uwe; Rust, Henning; Meredith, Edmund; Kpogo-Nuwoklo, Komlan; Vagenas, Christos

    2017-04-01

    Hydrological modellers and water managers require high-resolution climate data to model regional hydrologies and how these may respond to future changes in the large-scale climate. The ability to successfully model such changes and, by extension, critical infrastructure planning is often impeded by a lack of suitable climate data. This typically takes the form of too-coarse data from climate models, which are not sufficiently detailed in either space or time to be able to support water management decisions and hydrological research. BINGO (Bringing INnovation in onGOing water management) aims to bridge the gap between the needs of hydrological modellers and planners, and the currently available range of climate data, with the overarching aim of providing adaptation strategies for climate change-related challenges. Producing the kilometre- and sub-daily-scale climate data needed by hydrologists through continuous simulations is generally computationally infeasible. To circumvent this hurdle, we adopt a two-pronged approach involving (1) selective dynamical downscaling and (2) conditional stochastic weather generators, with the former presented here. We take an event-based approach to downscaling in order to achieve the kilometre-scale input needed by hydrological modellers. Computational expenses are minimized by identifying extremal weather patterns for each BINGO research site in lower-resolution simulations and then only downscaling to the kilometre-scale (convection permitting) those events during which such patterns occur. Here we (1) outline the methodology behind the selection of the events, and (2) compare the modelled precipitation distribution and variability (preconditioned on the extremal weather patterns) with that found in observations.

  7. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented, stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace of hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycles and costs. This goal has three elements: a reactor-scale model, a feature-level model, and a database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk aims to describe these various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to the reactor model and computational surface science. This coupling poses challenging issues of orders-of-magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large-scale computational chemistry efforts are critical, as experiments alone cannot meet database needs given the difficulty and cost of such controlled experiments.

  8. Poster — Thur Eve — 44: Linearization of Compartmental Models for More Robust Estimates of Regional Hemodynamic, Metabolic and Functional Parameters using DCE-CT/PET Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blais, AR; Dekaban, M; Lee, T-Y

    2014-08-15

    Quantitative analysis of dynamic positron emission tomography (PET) data usually involves minimizing a cost function with nonlinear regression, wherein the choice of starting parameter values and the presence of local minima affect the bias and variability of the estimated kinetic parameters. These nonlinear methods can also require lengthy computation time, making them unsuitable for use in clinical settings. Kinetic modeling of PET aims to estimate the rate parameter k₃, which is the binding affinity of the tracer to a biological process of interest and is highly susceptible to noise inherent in PET image acquisition. We have developed linearized kinetic models for kinetic analysis of dynamic contrast enhanced computed tomography (DCE-CT)/PET imaging, including a 2-compartment model for DCE-CT and a 3-compartment model for PET. Use of kinetic parameters estimated from DCE-CT can stabilize the kinetic analysis of dynamic PET data, allowing for more robust estimation of k₃. Furthermore, these linearized models are solved with a non-negative least squares algorithm, and together they provide other advantages, including: 1) only one possible solution, with no need to choose starting parameter values; 2) parameter estimates comparable in accuracy to those from nonlinear models; 3) significantly reduced computational time. Our simulated data show that when blood volume and permeability are estimated with DCE-CT, the bias of k₃ estimation with our linearized model is 1.97 ± 38.5% for 1,000 runs with a signal-to-noise ratio of 10. In summary, we have developed a computationally efficient technique for accurate estimation of k₃ from noisy dynamic PET data.
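    The non-negative least-squares ingredient named above can be sketched as follows; the design matrix of the actual DCE-CT/PET compartment models is not reproduced, so A below is stand-in data:

```python
# Minimal NNLS sketch: fit a linearized model y ~ A @ k with k >= 0.
# Unlike nonlinear regression, this needs no starting guess and has a
# unique solution for full-rank A.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
A = rng.random((60, 3))                 # linearized basis functions (stand-in)
k_true = np.array([0.5, 0.0, 1.2])      # a zero rate is representable
y = A @ k_true + 0.01 * rng.standard_normal(60)

k_est, residual = nnls(A, y)
print(k_est, residual)
```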

  9. A Mixtures-of-Trees Framework for Multi-Label Classification

    PubMed Central

    Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos

    2015-01-01

    We propose a new probabilistic approach for multi-label classification that aims to represent the class posterior distribution P(Y|X). Our approach uses a mixture of tree-structured Bayesian networks, which can leverage the computational advantages of conditional tree-structured models and the abilities of mixtures to compensate for tree-structured restrictions. We develop algorithms for learning the model from data and for performing multi-label predictions using the learned model. Experiments on multiple datasets demonstrate that our approach outperforms several state-of-the-art multi-label classification methods. PMID:25927011
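    In generic form, a mixtures-of-trees class posterior can be written as below; this is the standard mixture formulation, and the paper's exact parameterization of the mixture weights is not reproduced:

```latex
% Mixture of K tree-structured conditional networks over d labels:
% mixture weights \lambda_k, and pa_k(i) the parent of label Y_i in tree k.
\[
  P(Y \mid X) \;=\; \sum_{k=1}^{K} \lambda_k
     \prod_{i=1}^{d} P_k\!\left(Y_i \mid Y_{pa_k(i)},\, X\right)
\]
```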

  10. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    DTIC Science & Technology

    2001-10-25

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for pulmonary perfusion analysis.

  11. Image quality assessment by preprocessing and full reference model combination

    NASA Astrophysics Data System (ADS)

    Bianco, S.; Ciocca, G.; Marini, F.; Schettini, R.

    2009-01-01

    This paper focuses on full-reference image quality assessment and presents different computational strategies aimed at improving the robustness and accuracy of some well-known and widely used state-of-the-art models, namely the Structural Similarity approach (SSIM) by Wang and Bovik and the S-CIELAB spatial-color model by Zhang and Wandell. We investigate the hypothesis that combining error images with a visual attention model could allow a better fit of the psycho-visual data of the LIVE Image Quality Assessment Database Release 2. We show that the proposed quality assessment metric better correlates with the experimental data.
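    For reference, the SSIM index named above has the standard form below, for image patches x and y with stabilizing constants C₁ and C₂:

```latex
% Structural Similarity index (Wang & Bovik), standard patch-wise form:
% \mu local means, \sigma^2 local variances, \sigma_{xy} local covariance.
\[
  \mathrm{SSIM}(x,y) =
    \frac{(2\mu_x\mu_y + C_1)\,(2\sigma_{xy} + C_2)}
         {(\mu_x^2 + \mu_y^2 + C_1)\,(\sigma_x^2 + \sigma_y^2 + C_2)}
\]
```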

  12. Global Seismic Imaging Based on Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Bozdag, E.; Lefebvre, M.; Lei, W.; Peter, D. B.; Smith, J. A.; Zhu, H.; Komatitsch, D.; Tromp, J.

    2013-12-01

    Our aim is to perform adjoint tomography on the scale of the globe to image the entire planet. We have started elastic inversions with a global data set of 253 CMT earthquakes with moment magnitudes in the range 5.8 ≤ Mw ≤ 7 and used GSN stations as well as some local networks such as USArray, European stations, etc. Using an iterative pre-conditioned conjugate gradient scheme, we initially aim to obtain a global crustal and mantle model with transverse isotropy confined to the upper mantle. Global adjoint tomography has so far remained a challenge mainly due to computational limitations. Recent improvements in our 3D solvers (e.g., a GPU version) and access to high-performance computing centers (e.g., ORNL's Cray XK7 "Titan" system) now enable us to perform iterations with higher-resolution (T > 9 s) and longer-duration (200 min) simulations to accommodate high-frequency body waves and major-arc surface waves, respectively, which help improve data coverage. The remaining challenge is the heavy I/O traffic caused by the numerous files generated during the forward/adjoint simulations and the pre- and post-processing stages of our workflow. We improve the global adjoint tomography workflow by adopting the ADIOS file format for our seismic data as well as models, kernels, etc., to improve efficiency on high-performance clusters. Our ultimate aim is to use data from all available networks and earthquakes within the magnitude range of our interest (5.5 ≤ Mw ≤ 7), which requires a solid framework for managing big data in our global adjoint tomography workflow. We discuss the current status and future of global adjoint tomography based on our initial results, as well as practical issues such as handling big data in inversions and on high-performance computing systems.

  13. Development of a Model and Computer Code to Describe Solar Grade Silicon Production Processes

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Gould, R. K.

    1979-01-01

    The program aims at developing mathematical models and computer codes based on these models, which allow prediction of the product distribution in chemical reactors for converting gaseous silicon compounds to condensed-phase silicon. The major interest is in collecting silicon as a liquid on the reactor walls and other collection surfaces. Two reactor systems are of major interest, a SiCl4/Na reactor in which Si(l) is collected on the flow tube reactor walls and a reactor in which Si(l) droplets formed by the SiCl4/Na reaction are collected by a jet impingement method. During this quarter the following tasks were accomplished: (1) particle deposition routines were added to the boundary layer code; and (2) Si droplet sizes in SiCl4/Na reactors at temperatures below the dew point of Si are being calculated.

  14. Rough set classification based on quantum logic

    NASA Astrophysics Data System (ADS)

    Hassan, Yasser F.

    2017-11-01

    By combining the advantages of quantum computing and soft computing, the paper shows that rough sets can be used with quantum logic for classification and recognition systems. We suggest a new definition of rough set theory as quantum logic theory. Rough approximations are essential elements in rough set theory; the quantum rough set model for set-valued data directly constructs set approximations based on a kind of quantum similarity relation, which is presented here. Theoretical analyses demonstrate that the new model for quantum rough sets has a new type of decision rule with less redundancy, which can be used to give accurate classification using the principles of quantum superposition and non-linear quantum relations. To our knowledge, this is the first attempt to define rough sets in a quantum representation rather than a logical or set-theoretic one. Experiments on data sets have demonstrated that the proposed model is more accurate than traditional rough sets in terms of finding optimal classifications.

  15. Multilayer shallow water models with locally variable number of layers and semi-implicit time discretization

    NASA Astrophysics Data System (ADS)

    Bonaventura, Luca; Fernández-Nieto, Enrique D.; Garres-Díaz, José; Narbona-Reina, Gladys

    2018-07-01

    We propose an extension of the discretization approaches for multilayer shallow water models, aimed at making them more flexible and efficient for realistic applications to coastal flows. A novel discretization approach is proposed, in which the number of vertical layers and their distribution are allowed to change in different regions of the computational domain. Furthermore, semi-implicit schemes are employed for the time discretization, leading to a significant efficiency improvement for subcritical regimes. We show that, in the typical regimes in which the application of multilayer shallow water models is justified, the resulting discretization does not introduce any major spurious feature and allows again to reduce substantially the computational cost in areas with complex bathymetry. As an example of the potential of the proposed technique, an application to a sediment transport problem is presented, showing a remarkable improvement with respect to standard discretization approaches.

  16. Nanotoxicity prediction using computational modelling - review and future directions

    NASA Astrophysics Data System (ADS)

    Saini, Bhavna; Srivastava, Sumit

    2018-04-01

    Nanomaterials have stimulated various outlooks for the future in a number of industries and scientific ventures. Applications such as cosmetics, medicines, and electronics employ nanomaterials due to their various compelling properties. The unending growth of nanomaterials usage in our daily life has escalated health and environmental risks. Early nanotoxicity recognition is a big challenge. Much research is ongoing in the field of nanotoxicity, which faces several problems such as the inadequacy of proper datasets, the lack of appropriate rules, and the characterization of nanomaterials. Computational modelling would be a beneficial asset for nanomaterials researchers because it can foresee toxicity, resting on previous experimental data. In this study, we review work demonstrating a proper pathway for proceeding with QSAR analysis of nanomaterials for toxicity modelling. The paper aims at providing comprehensive insight into nano-QSAR, the various theories, tools and approaches used, along with an outline of future research directions to work on.

  17. A local time stepping algorithm for GPU-accelerated 2D shallow water models

    NASA Astrophysics Data System (ADS)

    Dazzi, Susanna; Vacondio, Renato; Dal Palù, Alessandro; Mignosa, Paolo

    2018-01-01

    In the simulation of flooding events, mesh refinement is often required to capture local bathymetric features and/or to detail areas of interest; however, if an explicit finite volume scheme is adopted, the presence of small cells in the domain can restrict the allowable time step due to the stability condition, thus reducing the computational efficiency. With the aim of overcoming this problem, the paper proposes the application of a Local Time Stepping (LTS) strategy to a GPU-accelerated 2D shallow water numerical model able to handle non-uniform structured meshes. The algorithm is specifically designed to exploit the computational capability of GPUs, minimizing the overheads associated with the LTS implementation. The results of theoretical and field-scale test cases show that the LTS model guarantees appreciable reductions in the execution time compared to the traditional Global Time Stepping strategy, without compromising the solution accuracy.
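    The control flow of a two-level local time stepping scheme can be sketched as below; this is a generic illustration with a placeholder update rule, not the paper's GPU shallow-water solver:

```python
# Two-level LTS sketch on a 1D array of cells: "small" cells (tighter
# CFL limit) advance with dt/2 twice per global step, the rest with dt.
import numpy as np

u = np.zeros(10)                       # cell states
small = np.zeros(10, dtype=bool)       # flag refined cells
small[4:6] = True
dt = 0.1

def update(state, mask, step):
    state = state.copy()
    state[mask] += step                # placeholder for a flux update
    return state

for _ in range(5):                     # global steps
    u = update(u, ~small, dt)          # coarse cells: one full step
    for _ in range(2):                 # fine cells: two half steps
        u = update(u, small, dt / 2)
print(u)
```

    A real scheme must also synchronize fluxes at coarse-fine interfaces to remain conservative, which the sketch omits.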

  18. Toward an integrated software platform for systems pharmacology

    PubMed Central

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2013-01-01

    Understanding complex biological systems requires the extensive support of computational tools. This is particularly true for systems pharmacology, which aims to understand the action of drugs and their interactions in a systems context. Computational models play an important role, as they can be viewed as an explicit representation of biological hypotheses to be tested. A series of software and data resources are used for model development, verification and exploration of possible behaviors of biological systems, explorations that may not be possible or cost-effective with experiments. Software platforms play a dominant role in creativity and productivity support and have transformed many industries; these techniques can be applied to biology as well. Establishing an integrated software platform will be the next important step in the field. © 2013 The Authors. Biopharmaceutics & Drug Disposition published by John Wiley & Sons, Ltd. PMID:24150748

  19. Restoring Fort Frontenac in 3D: Effective Usage of 3D Technology for Heritage Visualization

    NASA Astrophysics Data System (ADS)

    Yabe, M.; Goins, E.; Jackson, C.; Halbstein, D.; Foster, S.; Bazely, S.

    2015-02-01

    This paper is composed of three elements: 3D modeling, web design, and heritage visualization. The aim is to use computer graphics design to inform and create an interest in historical visualization by rebuilding Fort Frontenac using 3D modeling and interactive design. The final model will be integrated into an interactive website to learn more about the fort's historic importance. It is apparent that using computer graphics can save time and money when it comes to historical visualization. Visitors do not have to travel to the actual archaeological buildings. They can simply use the Web in their own home to learn about this information virtually. Meticulously following historical records to create a sophisticated restoration of archaeological buildings will draw viewers into visualizations, such as the historical world of Fort Frontenac. As a result, it allows the viewers to effectively understand the fort's social system, habits, and historical events.

  20. A full-wave Helmholtz model for continuous-wave ultrasound transmission.

    PubMed

    Huttunen, Tomi; Malinen, Matti; Kaipio, Jari P; White, Phillip Jason; Hynynen, Kullervo

    2005-03-01

    A full-wave Helmholtz model of continuous-wave (CW) ultrasound fields may offer several attractive features over widely used partial-wave approximations. For example, many full-wave techniques can be easily adjusted for complex geometries, and multiple reflections of sound are automatically taken into account in the model. To date, however, the full-wave modeling of CW fields in general 3D geometries has been avoided due to the large computational cost associated with the numerical approximation of the Helmholtz equation. Recent developments in computing capacity together with improvements in finite element type modeling techniques are making possible wave simulations in 3D geometries which reach over tens of wavelengths. The aim of this study is to investigate the feasibility of a full-wave solution of the 3D Helmholtz equation for modeling of continuous-wave ultrasound fields in an inhomogeneous medium. The numerical approximation of the Helmholtz equation is computed using the ultraweak variational formulation (UWVF) method. In addition, an inverse problem technique is utilized to reconstruct the velocity distribution on the transducer which is used to model the sound source in the UWVF scheme. The modeling method is verified by comparing simulated and measured fields in the case of transmission of 531 kHz CW fields through layered plastic plates. The comparison shows a reasonable agreement between simulations and measurements at low angles of incidence but, due to mode conversion, the Helmholtz model becomes insufficient for simulating ultrasound fields in plates at large angles of incidence.
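    For reference, the time-harmonic model such full-wave CW approaches solve is the Helmholtz equation for an inhomogeneous (here lossless) medium; writing the pressure as p(x,t) = Re{p̂(x) e^(−iωt)} with sound speed c(x):

```latex
% Helmholtz equation for the complex pressure amplitude \hat{p}(x)
% of a continuous-wave field at angular frequency \omega (lossless case).
\[
  \nabla^2 \hat{p}(x) + k(x)^2\, \hat{p}(x) = 0,
  \qquad k(x) = \frac{\omega}{c(x)}
\]
```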

  1. Model reduction of the numerical analysis of Low Impact Developments techniques

    NASA Astrophysics Data System (ADS)

    Brunetti, Giuseppe; Šimůnek, Jirka; Wöhling, Thomas; Piro, Patrizia

    2017-04-01

    Mechanistic models have proven to be accurate and reliable tools for the numerical analysis of the hydrological behavior of Low Impact Development (LID) techniques. However, their widespread adoption is limited by their complexity and computational cost. Recent studies have tried to address this issue by investigating the application of new techniques, such as surrogate-based modeling. However, current results are still limited and fragmented. One such approach, the Model Order Reduction (MOR) technique, can be a valuable tool for reducing the computational complexity of numerical problems by computing an approximation of the original model. While this technique has been extensively used in water-related problems, no studies have evaluated its use in LID modeling. Thus, the main aim of this study is to apply the MOR technique for the development of a reduced order model (ROM) for the numerical analysis of the hydrologic behavior of LIDs, in particular green roofs. The model should be able to correctly reproduce all the hydrological processes of a green roof while reducing the computational cost. The proposed model decouples the subsurface water dynamics of a green roof into a) one-dimensional (1D) vertical flow through the green roof itself and b) one-dimensional saturated lateral flow along the impervious rooftop. The green roof is horizontally discretized into N elements. Each element represents a vertical domain, which can have different properties or boundary conditions. The 1D Richards equation is used to simulate flow in the substrate and drainage layers. Simulated outflow from the vertical domain is used as a recharge term for saturated lateral flow, which is described using the kinematic wave approximation of the Boussinesq equation. The proposed model has been compared with the mechanistic model HYDRUS-2D, which numerically solves the Richards equation for the whole domain. The HYDRUS-1D code has been used for the description of vertical flow, while a Finite Volume Scheme has been adopted for lateral flow. Two scenarios involving flat and steep green roofs were analyzed. Results confirmed the accuracy of the reduced order model, which was able to reproduce both subsurface outflow and the moisture distribution in the green roof, significantly reducing the computational cost.
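    The two building blocks named above have the following usual 1D forms; this is a sketch under stated assumptions (pressure head h, water content θ(h), conductivity K(h), saturated depth h_s, drainable porosity φ, slope angle α, recharge r(t) from the vertical domain), and the paper's exact coupling and boundary conditions are not reproduced:

```latex
% 1D Richards equation for vertical flow (z positive upward), and a
% kinematic-wave balance for saturated lateral flow along a sloped roof.
\[
\begin{aligned}
  \frac{\partial \theta(h)}{\partial t}
    &= \frac{\partial}{\partial z}\!\left[ K(h)\left(
       \frac{\partial h}{\partial z} + 1 \right)\right],\\
  \phi\,\frac{\partial h_s}{\partial t}
    &= -\frac{\partial}{\partial x}\bigl(K_s\, h_s \sin\alpha\bigr) + r(t)
\end{aligned}
\]
```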

  2. Optimization-Based Inverse Identification of the Parameters of a Concrete Cap Material Model

    NASA Astrophysics Data System (ADS)

    Král, Petr; Hokeš, Filip; Hušek, Martin; Kala, Jiří; Hradil, Petr

    2017-10-01

    Advanced numerical analysis of concrete building structures in sophisticated computing systems currently requires the involvement of nonlinear mechanics tools. Efforts to design safer, more durable and, above all, more economically efficient concrete structures are supported by the use of advanced nonlinear concrete material models and the geometrically nonlinear approach. The application of nonlinear mechanics tools undoubtedly presents another step towards approximating the real behaviour of concrete building structures in computer numerical simulations. However, the success of this application depends on a thorough understanding of the behaviour of the concrete material models used and of the meaning of their parameters. The effective application of nonlinear concrete material models within computer simulations often becomes very problematic, because these material models often contain parameters (material constants) whose values are difficult to obtain, yet correct parameter values are essential for the proper function of the material model. Today, one possibility for solving this problem is the use of optimization algorithms for inverse material parameter identification. Parameter identification goes hand in hand with experimental investigation: it seeks parameter values of the material model such that the results of the computer simulation best approximate the experimental data. This paper is focused on the optimization-based inverse identification of the parameters of a concrete cap material model known as the Continuous Surface Cap Model. Within this paper, material parameters of the model are identified on the basis of interaction between nonlinear computer simulations, gradient-based and nature-inspired optimization algorithms, and experimental data, the latter of which take the form of a load-extension curve obtained from the evaluation of uniaxial tensile test results. The aim of this research was to obtain material model parameters corresponding to quasi-static tensile loading, which may be further used in research involving dynamic and high-speed tensile loading. Based on the obtained results it can be concluded that this goal has been reached.
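    The overall loop of such optimization-based inverse identification can be sketched as follows; simulate() below is a cheap stand-in for what is, in the paper, a nonlinear FE simulation with the Continuous Surface Cap Model, and the data are synthetic:

```python
# Inverse identification sketch: minimize the misfit between a simulated
# and a measured load-extension curve over the material parameters.
import numpy as np
from scipy.optimize import minimize

extension = np.linspace(0.0, 1.0, 50)
measured = 2.0 * extension * np.exp(-3.0 * extension)   # stand-in test data

def simulate(params, x):
    a, b = params
    return a * x * np.exp(-b * x)       # placeholder for the FE model

def misfit(params):
    return np.sum((simulate(params, extension) - measured) ** 2)

res = minimize(misfit, x0=[1.0, 1.0], method="Nelder-Mead")
print(res.x)                             # recovered parameters ~ (2, 3)
```

    A gradient-free method is used here because, as with a real FE model, gradients of the objective are generally unavailable or noisy.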

  3. Three-dimensional modeling of the Ca II H and K lines in the solar atmosphere

    NASA Astrophysics Data System (ADS)

    Bjørgen, Johan P.; Sukhorukov, Andrii V.; Leenaarts, Jorrit; Carlsson, Mats; de la Cruz Rodríguez, Jaime; Scharmer, Göran B.; Hansteen, Viggo H.

    2018-03-01

    Context: CHROMIS, a new imaging spectrometer at the Swedish 1-m Solar Telescope (SST), can observe the chromosphere in the H and K lines of Ca II at high spatial and spectral resolution. Accurate modeling as well as an understanding of the formation of these lines are needed to interpret the SST/CHROMIS observations. Such modeling is computationally challenging because these lines are influenced by strong departures from local thermodynamic equilibrium, three-dimensional radiative transfer, and partially coherent resonance scattering of photons. Aims: We aim to model the Ca II H and K lines in 3D model atmospheres to understand their formation and to investigate their diagnostic potential for probing the chromosphere. Methods: We model the synthetic spectrum of Ca II using the radiative transfer code Multi3D in three different radiation-magnetohydrodynamic model atmospheres computed with the Bifrost code. We classify synthetic intensity profiles according to their shapes and study how their features are related to the physical properties in the model atmospheres. We investigate whether the synthetic data reproduce the observed spatially-averaged line shapes and center-to-limb variation, and compare the data with SST/CHROMIS images. Results: The spatially-averaged synthetic line profiles show central emission peaks that are too low and a peak separation that is too small. The trends of the observed center-to-limb variation of the profile properties are reproduced by the models. The Ca II H and K line profiles provide a temperature diagnostic of the temperature minimum and of the temperature at the formation height of the emission peaks. The Doppler shift of the central depression is an excellent probe of the velocity in the upper chromosphere.

  4. Development and experimental assessment of a numerical modelling code to aid the design of profile extrusion cooling tools

    NASA Astrophysics Data System (ADS)

    Carneiro, O. S.; Rajkumar, A.; Fernandes, C.; Ferrás, L. L.; Habla, F.; Nóbrega, J. M.

    2017-10-01

    In the extrusion of thermoplastic profiles, after the forming stage that takes place in the extrusion die, the profile must be cooled in a metallic calibrator. This stage must proceed at a high rate to assure increased productivity, while avoiding the development of high temperature gradients in order to minimize the induced thermal residual stresses. In this work, we present a new coupled numerical solver, developed in the framework of the OpenFOAM® computational library, that computes the temperature distribution in both domains simultaneously (metallic calibrator and plastic profile), and whose implementation aims to minimize computational time. The new solver was experimentally assessed with an industrial case study.

  5. Study of unsteady performance of a twin-entry mixed flow turbine

    NASA Astrophysics Data System (ADS)

    Bencherif, M. M.; Hamidou, M. K.; Hamel, M.; Abidat, M.

    2016-03-01

    The aim of this investigation is to study the performance of a twin-entry turbine under pulsed flow conditions. The ANSYS-CFX code is used to solve three-dimensional compressible turbulent flow equations. The computational results are compared with those of a one-dimensional model and experimental data, and good agreement is found.

  6. A Model for New Linkages for Prior Learning Assessment

    ERIC Educational Resources Information Center

    Kalz, Marco; van Bruggen, Jan; Giesbers, Bas; Waterink, Wim; Eshuis, Jannes; Koper, Rob

    2008-01-01

    Purpose: The purpose of this paper is twofold: first the paper aims to sketch the theoretical basis for the use of electronic portfolios for prior learning assessment; second it endeavours to introduce latent semantic analysis (LSA) as a powerful method for the computation of semantic similarity between texts and a basis for a new observation link…

  7. Screen Design Principles of Computer-Aided Instructional Software for Elementary School Students

    ERIC Educational Resources Information Center

    Berrin, Atiker; Turan, Bülent Onur

    2017-01-01

    This study aims to present primary school students' views about current educational software interfaces, and to propose principles for educational software screens. The study was carried out with a general screening model. Sample group of the study consisted of sixth grade students in Sehit Ögretmen Hasan Akan Elementary School. In this context,…

  8. Distance Learning Success--A Perspective from Socio-Technical Systems Theory

    ERIC Educational Resources Information Center

    Wang, Jianfeng; Solan, David; Ghods, Abe

    2010-01-01

    With widespread adoption of computer-based distance education as a mission-critical component of the institution's educational program, the need for evaluation has emerged. In this research, we aim to expand on the systems approach by offering a model for evaluation based on socio-technical systems theory addressing a stated need in the literature…

  9. Exploring Meaning Negotiation Patterns in Synchronous Audio and Video Conferencing English Classes in China

    ERIC Educational Resources Information Center

    Li, Chenxi; Wu, Ligao; Li, Chen; Tang, Jinlan

    2017-01-01

    This work-in-progress doctoral research project aims to identify meaning negotiation patterns in synchronous audio and video Computer-Mediated Communication (CMC) environments based on the model of CMC text chat proposed by Smith (2003). The study was conducted in the Institute of Online Education at Beijing Foreign Studies University. Four dyads…

  10. Computer aided design of architecture of degradable tissue engineering scaffolds.

    PubMed

    Heljak, M K; Kurzydlowski, K J; Swieszkowski, W

    2017-11-01

    One important factor affecting the process of tissue regeneration is scaffold stiffness loss, which should be properly balanced with the rate of tissue regeneration. The aim of the research reported here was to develop a computer tool for designing the architecture of biodegradable scaffolds fabricated by melt-dissolution deposition systems (e.g. Fused Deposition Modeling) to provide the required scaffold stiffness at each stage of degradation/regeneration. The original idea presented in the paper is that the stiffness of a tissue engineering scaffold can be controlled during degradation by means of a proper selection of the diameter of the constituent fibers and the distances between them. This idea is based on the size-effect on degradation of aliphatic polyesters. The presented computer tool combines a genetic algorithm and a diffusion-reaction model of polymer hydrolytic degradation. In particular, we show how to design the architecture of scaffolds made of poly(DL-lactide-co-glycolide) with the required Young's modulus change during hydrolytic degradation.

  11. Computational model of lightness perception in high dynamic range imaging

    NASA Astrophysics Data System (ADS)

    Krawczyk, Grzegorz; Myszkowski, Karol; Seidel, Hans-Peter

    2006-02-01

    The anchoring theory of lightness perception by Gilchrist et al. [1999] explains many characteristics of the human visual system, such as lightness constancy and its spectacular failures, which are important in the perception of images. The principal concept of this theory is the perception of complex scenes in terms of groups of consistent areas (frameworks). Such areas, following the gestalt theorists, are defined by regions of common illumination. The key aspect of image perception is the estimation of lightness within each framework through anchoring to the luminance perceived as white, followed by the computation of the global lightness. In this paper we provide a computational model for the automatic decomposition of HDR images into frameworks. We derive a tone mapping operator which predicts lightness perception of real-world scenes and aims at its accurate reproduction on low dynamic range displays. Furthermore, such a decomposition into frameworks opens new grounds for local image analysis in view of human perception.

  12. Scale-Resolving simulations (SRS): How much resolution do we really need?

    NASA Astrophysics Data System (ADS)

    Pereira, Filipe M. S.; Girimaji, Sharath

    2017-11-01

    Scale-resolving simulations (SRS) are emerging as the computational approach of choice for many engineering flows with coherent structures. The SRS methods seek to resolve only the most important features of the coherent structures and model the remainder of the flow field with canonical closures. With reference to a typical Large-Eddy Simulation (LES), practical SRS methods aim to resolve a considerably narrower range of scales (reduced physical resolution) to achieve an adequate degree of accuracy at reasonable computational effort. While the objective of SRS is well-founded, the criteria for establishing the optimal degree of resolution required to achieve an acceptable level of accuracy are not clear. This study considers the canonical case of the flow around a circular cylinder to address the issue of 'optimal' resolution. Two important criteria are developed. The first condition addresses the issue of adequate resolution of the flow field. The second guideline provides an assessment of whether the modeled field is canonical (stochastic) turbulence amenable to closure-based computations.

  13. Infrastructures for Distributed Computing: the case of BESIII

    NASA Astrophysics Data System (ADS)

    Pellegrino, J.

    2018-05-01

    BESIII is an electron-positron collision experiment hosted at BEPCII in Beijing and aimed at investigating Tau-Charm physics. BESIII has now been running for several years and has gathered more than 1 PB of raw data. In order to analyze these data and perform massive Monte Carlo simulations, a large amount of computing and storage resources is needed. The distributed computing system is based upon DIRAC and has been in production since 2012. It integrates computing and storage resources from different institutes and a variety of resource types, such as cluster, grid, cloud, and volunteer computing. About 15 sites from the BESIII Collaboration from all over the world have joined this distributed computing infrastructure, giving a significant contribution to the IHEP computing facility. Nowadays cloud computing is playing a key role in the HEP computing field, due to its scalability and elasticity. Cloud infrastructures take advantage of several tools, such as VMDirac, to manage virtual machines through cloud managers according to the job requirements. With the virtually unlimited resources of commercial clouds, the computing capacity could scale accordingly in order to deal with any burst demand. General computing models are addressed herein, with particular focus on the BESIII infrastructure; new computing tools and upcoming infrastructures are also addressed.

  14. Study of reactive collisions between electrons and molecular cations using multichannel quantum defect theory: Application to HD{sup +} and BeH{sup +}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pop, N., E-mail: nicolina.pop@upt.ro; Ilie, S.; Motapon, O.

    2014-11-24

    The present work is aimed at the computation of cross sections and Maxwell rate coefficients in the framework of the stepwise version of the Multichannel Quantum Defect Theory (MQDT). Cross sections and rate coefficients suitable for modelling the kinetics of HD{sup +} and BeH{sup +} in fusion plasmas and in stellar atmospheres are presented and discussed. Very good agreement is found between our results for rotational transitions of HD{sup +} and other computations, as well as with experiment.

  15. Coupling of Mechanical Behavior of Cell Components to Electrochemical-Thermal Models for Computer-Aided Engineering of Batteries under Abuse (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pesaran, A.; Wierzbicki, T.; Sahraei, E.

    The EV Everywhere Grand Challenge aims to produce plug-in electric vehicles that are as affordable and convenient for the American family as gasoline-powered vehicles by 2022. Among the requirements set by the challenge, electric vehicles must be as safe as conventional vehicles, and EV batteries must not lead to unsafe situations under abuse conditions. NREL's project started in October 2013, based on a proposal in response to the January 2013 DOE VTO FOA, with the goal of developing computer-aided engineering tools to accelerate the development of safer lithium-ion batteries.

  16. Systems Biology and Cancer Prevention: All Options on the Table

    PubMed Central

    Rosenfeld, Simon; Kapetanovic, Izet

    2008-01-01

    In this paper, we outline the status quo and approaches to further development of systems biology concepts, with a focus on applications in cancer prevention science. We discuss the biological aspects of cancer research that are of primary importance in cancer prevention, the motivations for their mathematical modeling, and some recent advances in computational oncology. We also attempt to outline, in broad conceptual terms, the contours of future work aimed at the creation of a large-scale computational and informational infrastructure for use as a routine tool in cancer prevention science and decision making. PMID:19787092

  17. Integrative computational models of cardiac arrhythmias -- simulating the structurally realistic heart

    PubMed Central

    Trayanova, Natalia A; Tice, Brock M

    2009-01-01

    Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585

  18. Short-term effects of playing computer games on attention.

    PubMed

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-05-01

    The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and in normal controls. One hundred and one children aged between 9 and 12 years were recruited for the study. All participants played a motor-racing game on the computer for 1 hour. The TBAG form of the Stroop task was administered to all participants twice, before playing and immediately after playing the game. Participants with improved posttest scores, compared to their pretest scores, used the computer for an average of 0.67 +/- 1.1 hr/day, whereas average daily computer use was 1.6 +/- 1.4 hr/day and 1.3 +/- 0.9 hr/day for participants with worse or unaltered scores, respectively. According to the regression model, male gender, younger age, duration of daily computer use, and ADHD inattentive type were found to be independent risk factors for worsened posttest scores. Time spent playing computer games can exert a short-term effect on attention as measured by the Stroop test.

  19. Equivalent Viscous Damping Methodologies Applied on VEGA Launch Vehicle Numerical Model

    NASA Astrophysics Data System (ADS)

    Bartoccini, D.; Di Trapani, C.; Fransen, S.

    2014-06-01

    Part of the mission analysis of a spacecraft is the so-called launcher-satellite coupled loads analysis, which aims at computing the dynamic environment of the satellite and of the launch vehicle for the most severe load cases in flight. Evidently, the damping of the coupled system shall be defined with care so as not to overestimate or underestimate the loads derived for the spacecraft. In this paper the application of several EqVD (Equivalent Viscous Damping) methodologies to Craig-Bampton (CB) systems is investigated. Based on the structural damping defined for the various materials in the parent FE-models of the CB-components, EqVD matrices can be computed according to different methodologies. The effect of these methodologies on the numerical reconstruction of the VEGA launch vehicle dynamic environment will be presented.
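
    As a minimal illustration of one common EqVD recipe (not necessarily among those compared in the paper), the sketch below converts structural damping into an equivalent viscous damping matrix in mass-normalised modal coordinates, using the resonance relation zeta = eta/2; the frequencies and loss factors are assumed values.

    ```python
    import numpy as np

    def modal_eqvd(omega, eta):
        """Equivalent viscous damping matrix in mass-normalised modal
        coordinates, using the common resonance conversion zeta = eta / 2
        from a structural (hysteretic) loss factor."""
        zeta = np.asarray(eta) / 2.0
        return np.diag(2.0 * zeta * np.asarray(omega))

    # Hypothetical CB component: three modes at 5, 12, and 30 Hz, 2% loss factor
    omega = 2.0 * np.pi * np.array([5.0, 12.0, 30.0])   # rad/s
    C = modal_eqvd(omega, eta=[0.02, 0.02, 0.02])
    print(C)
    ```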

  20. Single-ion hydration thermodynamics from clusters to bulk solutions: Recent insights from molecular modeling

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2016-01-03

    The importance of single-ion hydration thermodynamic properties for understanding the driving forces of aqueous electrolyte processes, along with the impossibility of their direct experimental measurement, have prompted a large number of experimental, theoretical, and computational studies aimed at separating the cation and anion contributions. Here we provide an overview of historical approaches based on extrathermodynamic assumptions and more recent computational studies of single-ion hydration in order to evaluate the approximations involved in these methods and quantify their accuracy, reliability, and limitations in the light of the latest developments. Finally, we also offer new insights into the factors that influence the accuracy of ion–water interaction models and our views on possible ways to fill this substantial knowledge gap in aqueous physical chemistry.

  1. Value-based decision making via sequential sampling with hierarchical competition and attentional modulation

    PubMed Central

    2017-01-01

    In principle, formal dynamical models of decision making hold the potential to represent fundamental computations underpinning value-based (i.e., preferential) decisions in addition to perceptual decisions. Sequential-sampling models such as the race model and the drift-diffusion model that are grounded in simplicity, analytical tractability, and optimality remain popular, but some of their more recent counterparts have instead been designed with an aim for more feasibility as architectures to be implemented by actual neural systems. Connectionist models are proposed herein at an intermediate level of analysis that bridges mental phenomena and underlying neurophysiological mechanisms. Several such models drawing elements from the established race, drift-diffusion, feedforward-inhibition, divisive-normalization, and competing-accumulator models were tested with respect to fitting empirical data from human participants making choices between foods on the basis of hedonic value rather than a traditional perceptual attribute. Even when considering performance at emulating behavior alone, more neurally plausible models were set apart from more normative race or drift-diffusion models both quantitatively and qualitatively despite remaining parsimonious. To best capture the paradigm, a novel six-parameter computational model was formulated with features including hierarchical levels of competition via mutual inhibition as well as a static approximation of attentional modulation, which promotes “winner-take-all” processing. Moreover, a meta-analysis encompassing several related experiments validated the robustness of model-predicted trends in humans’ value-based choices and concomitant reaction times. These findings have yet further implications for analysis of neurophysiological data in accordance with computational modeling, which is also discussed in this new light. PMID:29077746
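
    For readers unfamiliar with the model class, the sketch below simulates a bare-bones race model for a two-alternative value-based choice: two non-negative accumulators integrate noisy value evidence, and the first to hit a bound determines the choice and reaction time. Parameters are illustrative and not fitted to the study's data; the paper's own six-parameter model adds hierarchical competition and attentional modulation on top of such dynamics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def race_trial(v1, v2, bound=1.0, dt=1e-3, noise=0.3, max_t=5.0):
        """One trial of a two-accumulator race: the first accumulator to
        reach the bound gives the choice; elapsed time is the RT."""
        x = np.zeros(2)
        t = 0.0
        while t < max_t:
            x += np.array([v1, v2]) * dt + noise * np.sqrt(dt) * rng.standard_normal(2)
            x = np.maximum(x, 0.0)            # accumulators stay non-negative
            t += dt
            if x.max() >= bound:
                return int(x.argmax()), t     # (choice, reaction time)
        return -1, max_t                      # no decision within the deadline

    trials = [race_trial(v1=0.8, v2=0.5) for _ in range(200)]
    p_first = np.mean([choice == 0 for choice, _ in trials])
    print("P(choose the higher-valued item):", p_first)
    ```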

  2. Value-based decision making via sequential sampling with hierarchical competition and attentional modulation.

    PubMed

    Colas, Jaron T

    2017-01-01

    In principle, formal dynamical models of decision making hold the potential to represent fundamental computations underpinning value-based (i.e., preferential) decisions in addition to perceptual decisions. Sequential-sampling models such as the race model and the drift-diffusion model that are grounded in simplicity, analytical tractability, and optimality remain popular, but some of their more recent counterparts have instead been designed with an aim for more feasibility as architectures to be implemented by actual neural systems. Connectionist models are proposed herein at an intermediate level of analysis that bridges mental phenomena and underlying neurophysiological mechanisms. Several such models drawing elements from the established race, drift-diffusion, feedforward-inhibition, divisive-normalization, and competing-accumulator models were tested with respect to fitting empirical data from human participants making choices between foods on the basis of hedonic value rather than a traditional perceptual attribute. Even when considering performance at emulating behavior alone, more neurally plausible models were set apart from more normative race or drift-diffusion models both quantitatively and qualitatively despite remaining parsimonious. To best capture the paradigm, a novel six-parameter computational model was formulated with features including hierarchical levels of competition via mutual inhibition as well as a static approximation of attentional modulation, which promotes "winner-take-all" processing. Moreover, a meta-analysis encompassing several related experiments validated the robustness of model-predicted trends in humans' value-based choices and concomitant reaction times. These findings have yet further implications for analysis of neurophysiological data in accordance with computational modeling, which is also discussed in this new light.

  3. Pelvic belt effects on sacroiliac joint ligaments: a computational approach to understand therapeutic effects of pelvic belts.

    PubMed

    Sichting, Freddy; Rossol, Jerome; Soisson, Odette; Klima, Stefan; Milani, Thomas; Hammer, Niels

    2014-01-01

    The sacroiliac joint is a widely described source of low back pain. Therapeutic approaches to relieve pain include the application of pelvic belts. However, the effects of pelvic belts on sacroiliac joint ligaments as potential pain generators are mostly unknown. The aim of our study was to analyze the influence of pelvic belts on ligament load by means of a computer model. This experimental computer study used the finite element method. A computer model of the human pelvis was created, comprising bones, ligaments, and cartilage. Detailed geometries, material properties of ligaments, and in-vivo pressure distribution patterns of a pelvic belt were implemented. The effects of pelvic belts on ligament strain were computed in the double-leg stance. Pelvic belts increase sacroiliac joint motion around the sagittal axis but decrease motion around the transverse axis. With pelvic belt application, most of the strained sacroiliac joint ligaments were relieved, especially the sacrospinous, sacrotuberous, and interosseous sacroiliac ligaments. Sacroiliac joint motion and ligament strains were minute. These results agree with validation data from other studies. Assigning homogeneous and linear material properties and excluding muscle forces are clear simplifications of the complex reality. Pelvic belts alter sacroiliac joint motion and provide partial relief of ligament strain that is subjectively marked, although minimal in absolute terms. These findings confirm theories that, besides being mechanical stabilizers, the sacroiliac joint ligaments are likely involved in neuromuscular feedback mechanisms. The results from our computer model help with unraveling the therapeutic mechanisms of pelvic belts.

  4. CelOWS: an ontology based framework for the provision of semantic web services related to biological models.

    PubMed

    Matos, Ely Edison; Campos, Fernanda; Braga, Regina; Palazzi, Daniele

    2010-02-01

    The amount of information generated by biological research has led to an intensive use of models. Mathematical and computational modeling needs accurate description to share, reuse, and simulate models as formulated by the original authors. In this paper, we introduce the Cell Component Ontology (CelO), expressed in OWL-DL. This ontology captures both the structure of a cell model and the properties of functional components. We use this ontology in a Web project (CelOWS) to describe, query, and compose CellML models using semantic web services. It aims to improve the reuse and composition of existing components and allow semantic validation of new models.

  5. Artificial Exo-Society Modeling: a New Tool for SETI Research

    NASA Astrophysics Data System (ADS)

    Gardner, James N.

    2002-01-01

    One of the newest fields of complexity research is artificial society modeling. Methodologically related to artificial life research, artificial society modeling utilizes agent-based computer simulation tools like SWARM and SUGARSCAPE developed by the Santa Fe Institute, Los Alamos National Laboratory and the Brookings Institution in an effort to introduce an unprecedented degree of rigor and quantitative sophistication into social science research. The broad aim of artificial society modeling is to begin the development of a more unified social science that embeds cultural evolutionary processes in a computational environment that simulates demographics, the transmission of culture, conflict, economics, disease, the emergence of groups and coadaptation with an environment in a bottom-up fashion. When an artificial society computer model is run, artificial societal patterns emerge from the interaction of autonomous software agents (the "inhabitants" of the artificial society). Artificial society modeling invites the interpretation of society as a distributed computational system and the interpretation of social dynamics as a specialized category of computation. Artificial society modeling techniques offer the potential of computational simulation of hypothetical alien societies in much the same way that artificial life modeling techniques offer the potential to model hypothetical exobiological phenomena. NASA recently announced its intention to begin exploring the possibility of including artificial life research within the broad portfolio of scientific fields comprised by the interdisciplinary astrobiology research endeavor. It may be appropriate for SETI researchers to likewise commence an exploration of the possible inclusion of artificial exo-society modeling within the SETI research endeavor. Artificial exo-society modeling might be particularly useful in a post-detection environment by (1) coherently organizing the set of data points derived from a detected ETI signal, (2) mapping trends in the data points over time (assuming receipt of an extended ETI signal), and (3) projecting such trends forward to derive alternative cultural evolutionary scenarios for the exo-society under analysis. The latter exercise might be particularly useful to compensate for the inevitable time lag between generation of an ETI signal and receipt of an ETI signal on Earth. For this reason, such an exercise might be a helpful adjunct to the decisional process contemplated by Paragraph 9 of the Declaration of Principles Concerning Activities Following the Detection of Extraterrestrial Intelligence.

  6. Fast and Accurate Poisson Denoising With Trainable Nonlinear Diffusion.

    PubMed

    Feng, Wensen; Qiao, Peng; Chen, Yunjin

    2018-06-01

    The degradation of the acquired signal by Poisson noise is a common problem for various imaging applications, such as medical imaging, night vision, and microscopy. Up to now, many state-of-the-art Poisson denoising techniques have mainly concentrated on achieving utmost performance, with little consideration for computational efficiency. Therefore, in this paper we aim to propose an efficient Poisson denoising model with both high computational efficiency and high recovery quality. To this end, we exploit the newly developed trainable nonlinear reaction diffusion (TNRD) model, which has proven to be an extremely fast image restoration approach with performance surpassing recent state-of-the-art methods. However, the straightforward direct gradient descent employed in the original TNRD-based denoising task is not applicable in this setting. To solve this problem, we resort to the proximal gradient descent method. We retrain the model parameters, including the linear filters and influence functions, by taking into account the Poisson noise statistics, and end up with a well-trained nonlinear diffusion model specialized for Poisson denoising. The trained model provides strongly competitive results against state-of-the-art approaches, meanwhile bearing the properties of simple structure and high efficiency. Furthermore, our proposed model comes along with an additional advantage: the diffusion process is well suited for parallel computation on graphics processing units (GPUs). For images of size , our GPU implementation takes less than 0.1 s to produce state-of-the-art Poisson denoising performance.
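
    The key ingredient, the proximal step for the Poisson data term, has a closed form; the sketch below applies it in a proximal gradient loop, with a simple quadratic smoothness penalty standing in for the learned diffusion term (the TNRD filters and influence functions themselves are not reproduced here, and all parameters are illustrative).

    ```python
    import numpy as np

    def prox_poisson(x, y, lam):
        """Closed-form proximal operator of the Poisson data term
        f(u) = sum(u - y*log(u)): the positive root of
        u**2 + (lam - x)*u - lam*y = 0."""
        return 0.5 * ((x - lam) + np.sqrt((x - lam) ** 2 + 4.0 * lam * y))

    def grad_smooth_prior(u, mu=0.1):
        # Simple quadratic smoothness penalty standing in for the learned
        # diffusion term (the trained filters are not reproduced here)
        g = np.zeros_like(u)
        g[1:-1] = mu * (2.0 * u[1:-1] - u[:-2] - u[2:])
        return g

    rng = np.random.default_rng(0)
    clean = 20.0 + 10.0 * np.sin(np.linspace(0.0, 6.0, 256))
    y = rng.poisson(clean).astype(float)      # Poisson-degraded 1D signal

    u, step = y.copy(), 0.5
    for _ in range(100):
        u = prox_poisson(u - step * grad_smooth_prior(u), y, step)
    print("MSE after denoising:", np.mean((u - clean) ** 2))
    ```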

  7. Developing an Online Framework for Publication of Uncertainty Information in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Etienne, E.; Piasecki, M.

    2012-12-01

    Inaccuracies in data collection and parameter estimation, together with imperfections in model structure, make the predictions of hydrological models uncertain. Finding a way to communicate the uncertainty information in a model output is important in decision-making. This work aims to publish uncertainty information (computed by a project partner at Penn State) associated with hydrological predictions on catchments. To this end we have developed a DB schema (derived from the CUAHSI ODM design) which is focused on storing uncertainty information and its associated metadata. The technologies used to build the system are: OGC's Sensor Observation Service (SOS) for publication, the uncertML markup language (also developed by the OGC) to describe uncertainty information, and the Interoperability and Automated Mapping (INTAMAP) Web Processing Service (WPS), which handles part of the statistical computations. We also develop a DRUPAL-based service that provides users with the capability to exploit all the functionality of the system. Users will be able to request and visualize uncertainty data, and also publish their data in the system.

  8. Bayesian analysis of caustic-crossing microlensing events

    NASA Astrophysics Data System (ADS)

    Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.

    2010-06-01

    Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of detecting an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework and investigate several interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
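
    The mechanics of folding a prior into an MCMC fit are compact enough to sketch. Below is a generic Metropolis sampler whose acceptance ratio combines a log-prior with a log-likelihood; both functions are placeholders (a real application would use the light-curve model and the analytically computed priors from the paper).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def log_prior(theta):
        # Placeholder prior: standard normal on each model parameter
        return -0.5 * np.sum(theta ** 2)

    def log_likelihood(theta):
        # Placeholder chi-square misfit of a light-curve model (assumption)
        return -0.5 * np.sum((theta - 1.0) ** 2 / 0.5 ** 2)

    def metropolis(n_steps, theta0, step=0.2):
        theta = np.asarray(theta0, dtype=float)
        logp = log_prior(theta) + log_likelihood(theta)
        chain = []
        for _ in range(n_steps):
            proposal = theta + step * rng.standard_normal(theta.shape)
            logp_prop = log_prior(proposal) + log_likelihood(proposal)
            # Accept with probability min(1, posterior ratio); prior included
            if np.log(rng.random()) < logp_prop - logp:
                theta, logp = proposal, logp_prop
            chain.append(theta.copy())
        return np.array(chain)

    chain = metropolis(5000, theta0=[0.0, 0.0])
    print("posterior mean:", chain[1000:].mean(axis=0))
    ```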

  9. Rollover risk prediction of heavy vehicles by reliability index and empirical modelling

    NASA Astrophysics Data System (ADS)

    Sellami, Yamine; Imine, Hocine; Boubezoul, Abderrahmane; Cadiou, Jean-Charles

    2018-03-01

    This paper focuses on a combination of a reliability-based approach and an empirical modelling approach for the rollover risk assessment of heavy vehicles. A reliability-based warning system is developed to alert the driver to a potential rollover before entering a bend. The idea behind the proposed methodology is to estimate the rollover risk by the probability that the vehicle load transfer ratio (LTR) exceeds a critical threshold. Accordingly, a so-called reliability index may be used as a measure to assess the safe functioning of the vehicle. In the reliability method, computing the maximum of the LTR requires predicting the vehicle dynamics over the bend, which can in some cases be intractable or time-consuming. With the aim of improving the reliability computation time, an empirical model is developed to substitute for the vehicle dynamics and rollover models. This is done by using the SVM (Support Vector Machines) algorithm. The preliminary results obtained demonstrate the effectiveness of the proposed approach.
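
    To show the shape of such a computation, the sketch below estimates the failure probability P(max LTR >= 1) by Monte Carlo sampling over uncertain inputs fed through a surrogate, then converts it to a reliability index. The surrogate function, the input distributions, and every coefficient are invented placeholders (a trained SVM regressor would take the surrogate's place).

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)

    def surrogate_max_ltr(speed, curvature, load_height):
        # Invented placeholder for the SVM surrogate of the maximum LTR
        return 0.22 * speed * np.sqrt(curvature) * load_height

    # Assumed input uncertainties for one bend
    n = 100_000
    speed = rng.normal(22.0, 2.0, n)                             # m/s
    curvature = np.clip(rng.normal(8e-3, 1e-3, n), 1e-4, None)   # 1/m
    load_height = rng.normal(2.1, 0.15, n)                       # m

    ltr = surrogate_max_ltr(speed, curvature, load_height)
    p_fail = np.mean(ltr >= 1.0)             # P(max LTR exceeds the threshold)
    beta = -norm.ppf(max(p_fail, 1e-12))     # reliability index
    print("failure probability:", p_fail, "reliability index:", beta)
    ```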

  10. Methods for Real-Time Prediction of the Mode of Travel Using Smartphone-Based GPS and Accelerometer Data

    PubMed Central

    Martin, Bryan D.; Wolfson, Julian; Adomavicius, Gediminas; Fan, Yingling

    2017-01-01

    We propose and compare combinations of several methods for classifying transportation activity data from smartphone GPS and accelerometer sensors. We have two main objectives. First, we aim to classify our data as accurately as possible. Second, we aim to reduce the dimensionality of the data as much as possible in order to reduce the computational burden of the classification. We combine dimension reduction and classification algorithms and compare them with a metric that balances accuracy and dimensionality. In doing so, we develop a classification algorithm that accurately classifies five different modes of transportation (i.e., walking, biking, car, bus, and rail) while being computationally simple enough to run on a typical smartphone. Further, collecting our data required no behavioral changes from the smartphone users. Our best classification model uses the random forest algorithm to achieve 96.8% accuracy. PMID:28885550
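
    The combination strategy is easy to picture as a pipeline. The sketch below pairs one dimension-reduction choice (PCA) with the random forest classifier on synthetic stand-in features; the paper compares several such combinations, and the feature matrix here is random noise purely for illustration.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # Synthetic stand-in for trip-level GPS/accelerometer features: 1000 trips,
    # 40 features, labels in {walk, bike, car, bus, rail}
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 40))
    y = rng.integers(0, 5, 1000)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Dimension reduction first, then classification
    clf = make_pipeline(PCA(n_components=10),
                        RandomForestClassifier(n_estimators=100, random_state=0))
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```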

  11. Methods for Real-Time Prediction of the Mode of Travel Using Smartphone-Based GPS and Accelerometer Data.

    PubMed

    Martin, Bryan D; Addona, Vittorio; Wolfson, Julian; Adomavicius, Gediminas; Fan, Yingling

    2017-09-08

    We propose and compare combinations of several methods for classifying transportation activity data from smartphone GPS and accelerometer sensors. We have two main objectives. First, we aim to classify our data as accurately as possible. Second, we aim to reduce the dimensionality of the data as much as possible in order to reduce the computational burden of the classification. We combine dimension reduction and classification algorithms and compare them with a metric that balances accuracy and dimensionality. In doing so, we develop a classification algorithm that accurately classifies five different modes of transportation (i.e., walking, biking, car, bus, and rail) while being computationally simple enough to run on a typical smartphone. Further, collecting our data required no behavioral changes from the smartphone users. Our best classification model uses the random forest algorithm to achieve 96.8% accuracy.

  12. Radiotherapy and chemotherapy change vessel tree geometry and metastatic spread in a small cell lung cancer xenograft mouse tumor model

    PubMed Central

    Bethge, Anja; Schumacher, Udo

    2017-01-01

    Background Tumor vasculature is critical for tumor growth, the formation of distant metastases, and the efficiency of radio- and chemotherapy treatments. However, how the vasculature itself, and with it the metastatic behavior, is affected during cancer treatment has not been thoroughly investigated. Therefore, the aim of this study was to analyze the influence of hypofractionated radiotherapy and cisplatin chemotherapy on vessel tree geometry and metastasis formation in a small cell lung cancer xenograft mouse tumor model, in order to investigate the spread of malignant cells under different treatment modalities. Methods The biological data gained during these experiments were fed into our previously developed computer model “Cancer and Treatment Simulation Tool” (CaTSiT) to model the growth of the primary tumor and its metastatic deposits, and also the influence of the different therapies. Furthermore, we performed quantitative histology analyses to verify our predictions in the xenograft mouse tumor model. Results According to the computer simulation, the number of engrafting cells must vary considerably to explain the different weights of the primary tumor at the end of the experiment. Once a primary tumor is established, the fractal dimension of its vasculature correlates with the tumor size. Furthermore, the fractal dimension of the tumor vasculature changes during treatment, indicating that the therapy affects the blood vessels’ geometry. We corroborated these findings with a quantitative histological analysis showing that the blood vessel density is depleted during radiotherapy and cisplatin chemotherapy. The CaTSiT computer model reveals that chemotherapy influences the tumor’s therapeutic susceptibility and its metastatic spreading behavior. Conclusion Using a systems biology approach in combination with xenograft models and computer simulations revealed that the use of chemotherapy and radiation therapy determines the spreading behavior by changing the blood vessel geometry of the primary tumor. PMID:29107953
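
    The fractal dimension referred to here is typically estimated by box counting. The sketch below implements that estimator for a binary 2D vessel mask; the toy diagonal 'vessel' should yield a dimension near 1 (real vessel trees are segmented from histology or imaging first, which is not shown).

    ```python
    import numpy as np

    def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
        """Estimate the fractal (box-counting) dimension of a binary 2D
        vessel mask: the slope of log N(s) versus log(1/s)."""
        counts = []
        for s in sizes:
            # Trim so the image tiles exactly into s x s boxes
            h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
            tiles = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(np.sum(tiles.any(axis=(1, 3))))  # occupied boxes
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    # Toy example: a diagonal 'vessel' in a 256x256 image (dimension ~ 1)
    mask = np.eye(256, dtype=bool)
    print(box_counting_dimension(mask))
    ```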

  13. Plant metabolic modeling: achieving new insight into metabolism and metabolic engineering.

    PubMed

    Baghalian, Kambiz; Hajirezaei, Mohammad-Reza; Schreiber, Falk

    2014-10-01

    Models are used to represent aspects of the real world for specific purposes, and mathematical models have opened up new approaches in studying the behavior and complexity of biological systems. However, modeling is often time-consuming and requires significant computational resources for data development, data analysis, and simulation. Computational modeling has been successfully applied as an aid for metabolic engineering in microorganisms. But such model-based approaches have only recently been extended to plant metabolic engineering, mainly due to greater pathway complexity in plants and their highly compartmentalized cellular structure. Recent progress in plant systems biology and bioinformatics has begun to disentangle this complexity and facilitate the creation of efficient plant metabolic models. This review highlights several aspects of plant metabolic modeling in the context of understanding, predicting and modifying complex plant metabolism. We discuss opportunities for engineering photosynthetic carbon metabolism, sucrose synthesis, and the tricarboxylic acid cycle in leaves and oil synthesis in seeds and the application of metabolic modeling to the study of plant acclimation to the environment. The aim of the review is to offer a current perspective for plant biologists without requiring specialized knowledge of bioinformatics or systems biology. © 2014 American Society of Plant Biologists. All rights reserved.

  14. Plant Metabolic Modeling: Achieving New Insight into Metabolism and Metabolic Engineering

    PubMed Central

    Baghalian, Kambiz; Hajirezaei, Mohammad-Reza; Schreiber, Falk

    2014-01-01

    Models are used to represent aspects of the real world for specific purposes, and mathematical models have opened up new approaches in studying the behavior and complexity of biological systems. However, modeling is often time-consuming and requires significant computational resources for data development, data analysis, and simulation. Computational modeling has been successfully applied as an aid for metabolic engineering in microorganisms. But such model-based approaches have only recently been extended to plant metabolic engineering, mainly due to greater pathway complexity in plants and their highly compartmentalized cellular structure. Recent progress in plant systems biology and bioinformatics has begun to disentangle this complexity and facilitate the creation of efficient plant metabolic models. This review highlights several aspects of plant metabolic modeling in the context of understanding, predicting and modifying complex plant metabolism. We discuss opportunities for engineering photosynthetic carbon metabolism, sucrose synthesis, and the tricarboxylic acid cycle in leaves and oil synthesis in seeds and the application of metabolic modeling to the study of plant acclimation to the environment. The aim of the review is to offer a current perspective for plant biologists without requiring specialized knowledge of bioinformatics or systems biology. PMID:25344492

  15. Respectful Modeling: Addressing Uncertainty in Dynamic System Models for Molecular Biology.

    PubMed

    Tsigkinopoulou, Areti; Baker, Syed Murtuza; Breitling, Rainer

    2017-06-01

    Although there is still some skepticism in the biological community regarding the value and significance of quantitative computational modeling, important steps are continually being taken to enhance its accessibility and predictive power. We view these developments as essential components of an emerging 'respectful modeling' framework which has two key aims: (i) respecting the models themselves and facilitating the reproduction and update of modeling results by other scientists, and (ii) respecting the predictions of the models and rigorously quantifying the confidence associated with the modeling results. This respectful attitude will guide the design of higher-quality models and facilitate the use of models in modern applications such as engineering and manipulating microbial metabolism by synthetic biology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Computational modeling to predict nitrogen balance during acute metabolic decompensation in patients with urea cycle disorders

    PubMed Central

    MacLeod, Erin L.; Hall, Kevin D.; McGuire, Peter J.

    2015-01-01

    SUMMARY Nutritional management of acute metabolic decompensation in amino acid inborn errors of metabolism (AA IEM) aims to restore nitrogen balance. While nutritional recommendations have been published, they have never been rigorously evaluated. Furthermore, despite these recommendations, there is a wide variation in the nutritional strategies employed amongst providers, particularly regarding the inclusion of parenteral lipids for protein-free caloric support. Since randomized clinical trials during acute metabolic decompensation are difficult and potentially dangerous, mathematical modeling of metabolism can serve as a surrogate for the preclinical evaluation of nutritional interventions aimed at restoring nitrogen balance during acute decompensation in AA IEM. A validated computational model of human macronutrient metabolism was adapted to predict nitrogen balance in response to various nutritional interventions in a simulated patient with a urea cycle disorder (UCD) during acute metabolic decompensation due to dietary non-adherence or infection. The nutritional interventions were constructed from published recommendations as well as clinical anecdotes. Overall, dextrose alone (DEX) was predicted to be better at restoring nitrogen balance and limiting nitrogen excretion during dietary non-adherence and infection scenarios, suggesting that the published recommended nutritional strategy involving dextrose and parenteral lipids (ISO) may be suboptimal. The implications for patients with AA IEM are that the medical course during acute metabolic decompensation may be influenced by the choice of protein-free caloric support. These results are also applicable to intensive care patients undergoing catabolism (postoperative phase or sepsis), where parenteral nutritional support aimed at restoring nitrogen balance may be more tailored regarding metabolic fuel selection. PMID:26260782

  17. Computational modeling to predict nitrogen balance during acute metabolic decompensation in patients with urea cycle disorders.

    PubMed

    MacLeod, Erin L; Hall, Kevin D; McGuire, Peter J

    2016-01-01

    Nutritional management of acute metabolic decompensation in amino acid inborn errors of metabolism (AA IEM) aims to restore nitrogen balance. While nutritional recommendations have been published, they have never been rigorously evaluated. Furthermore, despite these recommendations, there is a wide variation in the nutritional strategies employed amongst providers, particularly regarding the inclusion of parenteral lipids for protein-free caloric support. Since randomized clinical trials during acute metabolic decompensation are difficult and potentially dangerous, mathematical modeling of metabolism can serve as a surrogate for the preclinical evaluation of nutritional interventions aimed at restoring nitrogen balance during acute decompensation in AA IEM. A validated computational model of human macronutrient metabolism was adapted to predict nitrogen balance in response to various nutritional interventions in a simulated patient with a urea cycle disorder (UCD) during acute metabolic decompensation due to dietary non-adherence or infection. The nutritional interventions were constructed from published recommendations as well as clinical anecdotes. Overall, dextrose alone (DEX) was predicted to be better at restoring nitrogen balance and limiting nitrogen excretion during dietary non-adherence and infection scenarios, suggesting that the published recommended nutritional strategy involving dextrose and parenteral lipids (ISO) may be suboptimal. The implications for patients with AA IEM are that the medical course during acute metabolic decompensation may be influenced by the choice of protein-free caloric support. These results are also applicable to intensive care patients undergoing catabolism (postoperative phase or sepsis), where parenteral nutritional support aimed at restoring nitrogen balance may be more tailored regarding metabolic fuel selection.

  18. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    NASA Astrophysics Data System (ADS)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVIDIA GPUs, and the Intel Xeon Phi. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
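
    To give a feel for the forward-modeling kernel that dominates FWI cost, here is a deliberately minimal 1D acoustic finite-difference time-stepping loop in plain NumPy. SeisCL itself solves the 2D/3D viscoelastic equations in OpenCL; the grid, velocity model, and Ricker-style source below are illustrative assumptions.

    ```python
    import numpy as np

    # Minimal 1D acoustic finite-difference time stepping (SeisCL itself solves
    # the 2D/3D viscoelastic equations in OpenCL; this only shows the kernel)
    nx, nt = 400, 800
    dx, dt = 5.0, 5e-4              # grid spacing (m) and time step (s)
    c = np.full(nx, 2000.0)         # velocity model (m/s); CFL = c*dt/dx = 0.2
    src, f0 = 100, 25.0             # source grid index and peak frequency (Hz)

    p_old, p = np.zeros(nx), np.zeros(nx)
    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = (p[2:] - 2.0 * p[1:-1] + p[:-2]) / dx ** 2
        p_new = 2.0 * p - p_old + (c * dt) ** 2 * lap
        a = (np.pi * f0 * (it * dt - 1.0 / f0)) ** 2
        p_new[src] += (1.0 - 2.0 * a) * np.exp(-a)   # Ricker source wavelet
        p_old, p = p, p_new
    print("final wavefield energy:", np.sum(p ** 2))
    ```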

  19. a Cultural Market Model

    NASA Astrophysics Data System (ADS)

    Herdağdelen, Amaç; Bingol, Haluk

    Social interactions and personal tastes shape our consumption behavior of cultural products. In this study, we present a computational model of a cultural market and we aim to analyze the behavior of the consumer population as an emergent phenomenon. Our results suggest that the final market shares of cultural products depend dramatically on consumer heterogeneity and social interaction pressure. Furthermore, the relation between the resulting market shares and social interaction is robust with respect to a wide range of variation in the parameter values and the type of topology.
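
    A toy model in this spirit fits in a few lines. In the sketch below, agents repeatedly choose the product maximizing a mix of private taste and current market share; the mixing weight plays the role of social interaction pressure and the taste spread plays the role of heterogeneity. The update rule and every parameter are illustrative, not the paper's exact dynamics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, n_products, n_steps = 500, 10, 200
    heterogeneity = 0.5    # spread of personal tastes (assumed parameter)
    social_pressure = 0.7  # weight of market share vs personal taste (assumed)

    taste = rng.normal(0.0, heterogeneity, (n_agents, n_products))
    shares = np.full(n_products, 1.0 / n_products)

    for _ in range(n_steps):
        # Each agent scores products by taste plus social signal, picks the best
        utility = (1 - social_pressure) * taste + social_pressure * shares
        choices = utility.argmax(axis=1)
        shares = np.bincount(choices, minlength=n_products) / n_agents

    print(np.sort(shares)[::-1])   # final market shares, largest first
    ```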

  20. Hypersonic research engine/aerothermodynamic integration model, experimental results. Volume 1: Mach 6 component integration

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Mackley, E. A.

    1976-01-01

    The NASA Hypersonic Research Engine (HRE) Project was initiated for the purpose of advancing the technology of airbreathing propulsion for hypersonic flight. The project encompassed a large component (inlet, combustor, and nozzle) and structures development program. Tests of a full-scale (18 in. diameter cowl, 87 in. long) HRE concept, designated the Aerothermodynamic Integration Model (AIM), were conducted at Mach numbers of 5, 6, and 7. Computer program results for the Mach 6 component integration tests are presented.

  1. Computational Psychiatry: towards a mathematically informed understanding of mental illness

    PubMed Central

    Huys, Quentin J M; Roiser, Jonathan P

    2016-01-01

    Computational Psychiatry aims to describe the relationship between the brain's neurobiology, its environment and mental symptoms in computational terms. In so doing, it may improve psychiatric classification and the diagnosis and treatment of mental illness. It can unite many levels of description in a mechanistic and rigorous fashion, while avoiding biological reductionism and artificial categorisation. We describe how computational models of cognition can infer the current state of the environment and weigh up future actions, and how these models provide new perspectives on two example disorders, depression and schizophrenia. Reinforcement learning describes how the brain can choose and value courses of action according to their long-term future value. Some depressive symptoms may result from aberrant valuations, which could arise from prior beliefs about the loss of agency (‘helplessness’), or from an inability to inhibit the mental exploration of aversive events. Predictive coding explains how the brain might perform Bayesian inference about the state of its environment by combining sensory data with prior beliefs, each weighted according to their certainty (or precision). Several cortical abnormalities in schizophrenia might reduce precision at higher levels of the inferential hierarchy, biasing inference towards sensory data and away from prior beliefs. We discuss whether striatal hyperdopaminergia might have an adaptive function in this context, and also how reinforcement learning and incentive salience models may shed light on the disorder. Finally, we review some of Computational Psychiatry's applications to neurological disorders, such as Parkinson's disease, and some pitfalls to avoid when applying its methods. PMID:26157034
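
    As a concrete instance of the reinforcement-learning ingredient, the sketch below runs temporal-difference value learning on a two-state toy task: values are updated from reward prediction errors, the quantity most often linked to dopaminergic signaling. The task and parameters are illustrative assumptions, not a model of any particular disorder.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Temporal-difference value learning on a toy two-state task: state 1 is
    # terminal and rewarded, transitions are random (all values illustrative)
    n_states, alpha, gamma = 2, 0.1, 0.9
    V = np.zeros(n_states)

    for episode in range(1000):
        s = 0
        while True:
            s_next = int(rng.integers(0, n_states))
            r = 1.0 if s_next == 1 else 0.0
            done = s_next == 1
            target = r + (0.0 if done else gamma * V[s_next])
            V[s] += alpha * (target - V[s])   # reward prediction-error update
            if done:
                break
            s = s_next

    print("learned state values:", V)
    ```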

  2. Computational Methods for HSCT-Inlet Controls/CFD Interdisciplinary Research

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Melcher, Kevin J.; Chicatelli, Amy K.; Hartley, Tom T.; Chung, Joongkee

    1994-01-01

    A program aimed at facilitating the use of computational fluid dynamics (CFD) simulations by the controls discipline is presented. The objective is to reduce the development time and cost for propulsion system controls by using CFD simulations to obtain high-fidelity system models for control design and as numerical test beds for control system testing and validation. An interdisciplinary team has been formed to develop analytical and computational tools in three discipline areas: controls, CFD, and computational technology. The controls effort has focused on specifying requirements for an interface between the controls specialist and CFD simulations and a new method for extracting linear, reduced-order control models from CFD simulations. Existing CFD codes are being modified to permit time-accurate execution and provide realistic boundary conditions for controls studies. Parallel processing and distributed computing techniques, along with existing system integration software, are being used to reduce CFD execution times and to support the development of an integrated analysis/design system. This paper describes: the initial application for the technology being developed, the high speed civil transport (HSCT) inlet control problem; activities being pursued in each discipline area; and a prototype analysis/design system in place for interactive operation and visualization of a time-accurate HSCT-inlet simulation.

  3. OptFlux: an open-source software platform for in silico metabolic engineering.

    PubMed

    Rocha, Isabel; Maia, Paulo; Evangelista, Pedro; Vilaça, Paulo; Soares, Simão; Pinto, José P; Nielsen, Jens; Patil, Kiran R; Ferreira, Eugénio C; Rocha, Miguel

    2010-04-19

    Over the last few years a number of methods have been proposed for the phenotype simulation of microorganisms under different environmental and genetic conditions. These have been used as the basis to support the discovery of successful genetic modifications of the microbial metabolism to address industrial goals. However, the use of these methods has been restricted to bioinformaticians or other expert researchers. The main aim of this work is, therefore, to provide a user-friendly computational tool for Metabolic Engineering applications. OptFlux is an open-source and modular software aimed at being the reference computational application in the field. It is the first tool to incorporate strain optimization tasks, i.e., the identification of Metabolic Engineering targets, using Evolutionary Algorithms/Simulated Annealing metaheuristics or the previously proposed OptKnock algorithm. It also allows the use of stoichiometric metabolic models for (i) phenotype simulation of both wild-type and mutant organisms, using the methods of Flux Balance Analysis, Minimization of Metabolic Adjustment or Regulatory on/off Minimization of Metabolic flux changes, (ii) Metabolic Flux Analysis, computing the admissible flux space given a set of measured fluxes, and (iii) pathway analysis through the calculation of Elementary Flux Modes. OptFlux also contemplates several methods for model simplification and other pre-processing operations aimed at reducing the search space for optimization algorithms. The software supports importing/exporting to several flat file formats and it is compatible with the SBML standard. OptFlux has a visualization module that allows the analysis of the model structure that is compatible with the layout information of Cell Designer, allowing the superimposition of simulation results with the model graph. The OptFlux software is freely available, together with documentation and other resources, thus bridging the gap between research in strain optimization algorithms and the final users. It is a valuable platform that provides researchers in the field with a number of useful tools. Its open-source nature invites contributions by all those interested in making their methods available for the community. Given its plug-in-based architecture, it can be extended with new functionalities. Currently, several plug-ins are being developed, including network topology analysis tools and the integration with Boolean network based regulatory models.
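
    Among the listed methods, Flux Balance Analysis reduces to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. The sketch below solves a three-reaction toy network with scipy; OptFlux itself operates on genome-scale SBML models through its own interface, so this only shows the underlying idea.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: v0 = uptake of A, v1 = conversion A -> B, v2 = biomass from B
    # Metabolite balance rows: A is produced by v0 and consumed by v1;
    # B is produced by v1 and consumed by v2. Maximize the biomass flux v2.
    S = np.array([[1, -1,  0],
                  [0,  1, -1]])
    bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds (assumed)
    c = np.array([0.0, 0.0, -1.0])         # linprog minimizes, so negate

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x, "biomass:", -res.fun)
    ```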

  4. OptFlux: an open-source software platform for in silico metabolic engineering

    PubMed Central

    2010-01-01

    Background Over the last few years a number of methods have been proposed for the phenotype simulation of microorganisms under different environmental and genetic conditions. These have been used as the basis to support the discovery of successful genetic modifications of the microbial metabolism to address industrial goals. However, the use of these methods has been restricted to bioinformaticians or other expert researchers. The main aim of this work is, therefore, to provide a user-friendly computational tool for Metabolic Engineering applications. Results OptFlux is an open-source and modular software aimed at being the reference computational application in the field. It is the first tool to incorporate strain optimization tasks, i.e., the identification of Metabolic Engineering targets, using Evolutionary Algorithms/Simulated Annealing metaheuristics or the previously proposed OptKnock algorithm. It also allows the use of stoichiometric metabolic models for (i) phenotype simulation of both wild-type and mutant organisms, using the methods of Flux Balance Analysis, Minimization of Metabolic Adjustment or Regulatory on/off Minimization of Metabolic flux changes, (ii) Metabolic Flux Analysis, computing the admissible flux space given a set of measured fluxes, and (iii) pathway analysis through the calculation of Elementary Flux Modes. OptFlux also contemplates several methods for model simplification and other pre-processing operations aimed at reducing the search space for optimization algorithms. The software supports importing/exporting to several flat file formats and it is compatible with the SBML standard. OptFlux has a visualization module that allows the analysis of the model structure that is compatible with the layout information of Cell Designer, allowing the superimposition of simulation results with the model graph. Conclusions The OptFlux software is freely available, together with documentation and other resources, thus bridging the gap between research in strain optimization algorithms and the final users. It is a valuable platform that provides researchers in the field with a number of useful tools. Its open-source nature invites contributions by all those interested in making their methods available for the community. Given its plug-in-based architecture, it can be extended with new functionalities. Currently, several plug-ins are being developed, including network topology analysis tools and the integration with Boolean network based regulatory models. PMID:20403172

  5. Java Performance for Scientific Applications on LLNL Computer Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapfer, C; Wissink, A

    2002-05-10

    Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.

  6. An Efficient Finite Element Framework to Assess Flexibility Performances of SMA Self-Expandable Carotid Artery Stents

    PubMed Central

    Ferraro, Mauro; Auricchio, Ferdinando; Boatti, Elisa; Scalet, Giulia; Conti, Michele; Morganti, Simone; Reali, Alessandro

    2015-01-01

    Computer-based simulations are nowadays widely exploited for the prediction of the mechanical behavior of different biomedical devices. In this respect, structural finite element analyses (FEA) are currently the preferred computational tool to evaluate the stent response under bending. This work aims at developing a computational framework based on linear and higher-order FEA to evaluate the flexibility of self-expandable carotid artery stents. In particular, numerical simulations involving large deformations and inelastic shape memory alloy constitutive modeling are performed, and the results suggest that the employment of higher-order FEA allows the computational domain to be represented accurately and a better approximation of the solution to be obtained with a greatly reduced number of degrees of freedom with respect to linear FEA. Moreover, when buckling phenomena occur, higher-order FEA shows a superior capability of reproducing the nonlinear local effects related to buckling. PMID:26184329

  7. Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.

    PubMed

    Park, Eun-Jun; Park, Mihyun

    2015-11-01

    The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.

  8. A CFD-informed quasi-steady model of flapping wing aerodynamics.

    PubMed

    Nakata, Toshiyuki; Liu, Hao; Bomphrey, Richard J

    2015-11-01

    Aerodynamic performance and agility during flapping flight are determined by the combination of wing shape and kinematics. The degree of morphological and kinematic optimisation is unknown and depends upon a large parameter space. Aimed at providing an accurate and computationally inexpensive modelling tool for flapping-wing aerodynamics, we propose a novel CFD (computational fluid dynamics)-informed quasi-steady model (CIQSM), which assumes that the aerodynamic forces on a flapping wing can be decomposed into the quasi-steady forces and parameterised based on CFD results. Using least-squares fitting, we determine a set of proportional coefficients for the quasi-steady model relating wing kinematics to instantaneous aerodynamic force and torque; we calculate power with the product of quasi-steady torques and angular velocity. With the quasi-steady model fully and independently parameterised on the basis of high-fidelity CFD modelling, it is capable of predicting flapping-wing aerodynamic forces and power more accurately than the conventional blade element model (BEM) does. The improvement can be attributed to, for instance, taking into account the effects of the induced downwash and the wing tip vortex on the force generation and power consumption. Our model is validated by comparing the aerodynamics of a CFD model and the present quasi-steady model using the example case of a hovering hawkmoth. It demonstrates that the CIQSM outperforms the conventional BEM while remaining computationally cheap, and hence can be an effective tool for revealing the mechanisms of optimization and control of kinematics and morphology in flapping-wing flight for both bio-flyers and unmanned air systems.
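
    The parameterisation step is ordinary linear least squares once the quasi-steady terms are chosen. The sketch below fits proportional coefficients relating assumed kinematic regressors (translational, rotational, and added-mass-like terms) to a synthetic 'CFD' force signal; the regressors and coefficients are illustrative, not the paper's exact decomposition.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500

    # Assumed kinematic regressors over a stroke cycle: translational,
    # rotational (Kramer-effect-like), and added-mass-like terms
    u2 = rng.uniform(0.0, 1.0, n)        # squared wing-tip speed term
    u_omega = rng.uniform(-1.0, 1.0, n)  # rotational term
    udot = rng.uniform(-1.0, 1.0, n)     # acceleration term
    A = np.column_stack([u2, u_omega, udot])

    # Synthetic 'CFD' force with known coefficients plus noise
    F_cfd = A @ np.array([1.8, 0.9, 0.4]) + 0.05 * rng.standard_normal(n)

    coeffs, *_ = np.linalg.lstsq(A, F_cfd, rcond=None)
    print("fitted quasi-steady coefficients:", coeffs)
    ```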

  9. A CFD-informed quasi-steady model of flapping wing aerodynamics

    PubMed Central

    Nakata, Toshiyuki; Liu, Hao; Bomphrey, Richard J.

    2016-01-01

    Aerodynamic performance and agility during flapping flight are determined by the combination of wing shape and kinematics. The degree of morphological and kinematic optimisation is unknown and depends upon a large parameter space. Aimed at providing an accurate and computationally inexpensive modelling tool for flapping-wing aerodynamics, we propose a novel CFD (computational fluid dynamics)-informed quasi-steady model (CIQSM), which assumes that the aerodynamic forces on a flapping wing can be decomposed into the quasi-steady forces and parameterised based on CFD results. Using least-squares fitting, we determine a set of proportional coefficients for the quasi-steady model relating wing kinematics to instantaneous aerodynamic force and torque; we calculate power with the product of quasi-steady torques and angular velocity. With the quasi-steady model fully and independently parameterised on the basis of high-fidelity CFD modelling, it is capable of predicting flapping-wing aerodynamic forces and power more accurately than the conventional blade element model (BEM) does. The improvement can be attributed to, for instance, taking into account the effects of the induced downwash and the wing tip vortex on the force generation and power consumption. Our model is validated by comparing the aerodynamics of a CFD model and the present quasi-steady model using the example case of a hovering hawkmoth. It demonstrates that the CIQSM outperforms the conventional BEM while remaining computationally cheap, and hence can be an effective tool for revealing the mechanisms of optimization and control of kinematics and morphology in flapping-wing flight for both bio-flyers and unmanned air systems. PMID:27346891

  10. Numerical Modelling of Foundation Slabs with use of Schur Complement Method

    NASA Astrophysics Data System (ADS)

    Koktan, Jiří; Brožovský, Jiří

    2017-10-01

    The paper discusses numerical modelling of foundation slabs with use of advanced numerical approaches, which are suitable for parallel processing. The solution is based on the Finite Element Method with slab-type elements. The subsoil is modelled with use of a Winkler-type contact model (as an alternative, a multi-parameter model can be used). The proposed modelling approach uses the Schur Complement method to speed up the computations of the problem. The method is based on a special division of the analyzed model into several substructures. It adds some complexity to the numerical procedures, especially when subsoil models are used inside the finite element method solution. On the other hand, this method makes a fast solution of large models possible, although it introduces further problems to the process. Thus, the main aim of this paper is to verify that such a method can be successfully used for this type of problem. The most suitable finite elements are discussed, along with the finite element mesh and the limitations of its construction for such problems. The core approaches of the implementation of the Schur Complement Method for this type of problem are also presented. The proposed approach was implemented in the form of a computer program, which is also briefly introduced. Results of example computations are presented, which demonstrate an important speed-up of the solution even in the case of non-parallel processing, as well as the ability to bypass size limitations of numerical models with use of the discussed approach.
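
    To make the elimination step concrete, the following minimal sketch condenses the interior unknowns of a substructure onto the interface unknowns via the Schur complement and checks the result against a monolithic solve. The random symmetric system is a stand-in for an actual slab/subsoil stiffness matrix.

      import numpy as np

      rng = np.random.default_rng(1)
      n1, n2 = 8, 3                               # interior and interface unknowns
      A = np.eye(n1) * 4.0 + 0.1 * rng.random((n1, n1))
      A = (A + A.T) / 2.0                         # symmetric interior block
      B = rng.random((n1, n2))                    # interior-interface coupling
      C = np.eye(n2) * 4.0                        # interface block
      f, g = rng.random(n1), rng.random(n2)

      Ainv_B = np.linalg.solve(A, B)
      Ainv_f = np.linalg.solve(A, f)
      S = C - B.T @ Ainv_B                        # Schur complement on the interface
      x2 = np.linalg.solve(S, g - B.T @ Ainv_f)   # interface solve
      x1 = Ainv_f - Ainv_B @ x2                   # back-substitution for the interior

      # Agreement with the monolithic solve of [[A, B], [B.T, C]].
      K = np.block([[A, B], [B.T, C]])
      x_mono = np.linalg.solve(K, np.concatenate([f, g]))
      assert np.allclose(np.concatenate([x1, x2]), x_mono)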

  11. A framework for different levels of integration of computational models into web-based virtual patients.

    PubMed

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the description of four integration strategies, and the last element consisted of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the result. The observed differences were not statistically significant. This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome.

  12. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the description of four integration strategies, and the last element consisted of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the result. The observed differences were not statistically significant. Conclusions This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome. PMID:24463466

  13. Student Perceptions in the Design of a Computer Card Game for Learning Computer Literacy Issues: A Case Study

    ERIC Educational Resources Information Center

    Kordaki, Maria; Papastergiou, Marina; Psomos, Panagiotis

    2016-01-01

    The aim of this work was twofold. First, an empirical study was designed aimed at investigating the perceptions that entry-level non-computing majors--namely Physical Education and Sport Science (PESS) undergraduate students--hold about basic Computer Literacy (CL) issues. The participants were 90 first-year PESS students, and their perceptions…

  14. The development of global GRAPES 4DVAR

    NASA Astrophysics Data System (ADS)

    Liu, Yongzhu

    2017-04-01

    Four-dimensional variational data assimilation (4DVAR) has made a great contribution to the improvement of NWP systems over the past twenty years. Therefore, our strategy is to develop an operational global 4D-Var system from the outset. The aim of the paper is to introduce the development of the global GRAPES four-dimensional variational data assimilation (4DVAR) using incremental analysis schemes, and to present results of a comparison between 4DVAR, using a 6-hour assimilation window and simplified physics during the minimization, and three-dimensional variational data assimilation (3DVAR). The dynamical cores of the tangent-linear and adjoint models are developed directly from the non-hydrostatic forecast model. In addition, the standard correctness checks have been performed. As well as the development of the adjoint codes, most of our work is focused on improving the computational efficiency, since the bulk of the computational cost of 4D-Var is in the integration of the tangent-linear and adjoint models. For the tangent-linear model, the wall-clock time is reduced to about 1.2 times that of the nonlinear model through optimization of the software framework. The significant computational cost savings in the adjoint model result from removing redundant recomputations of model trajectories. It is encouraging that the wall-clock time of the adjoint model is less than 1.5 times that of the nonlinear model. The current difficulty is that the numerical scheme used within the linear model is based strategically on the numerics of the corresponding nonlinear model; further computational acceleration should be expected from improvements to the nonlinear numerical algorithm. A series of linearized physical parameterization schemes has been developed to improve the representation of perturbed fields in the linear model. It consists of horizontal and vertical diffusion, sub-grid scale orographic gravity wave drag, large-scale condensation, and cumulus convection schemes. We also found that straightforward linearization based on the nonlinear physical scheme might lead to significant growth of spurious unstable perturbations; it is essential to simplify the linear physics with respect to the non-linear schemes. The improvement in the perturbed fields of the tangent-linear model is visible with the linear physics included, especially at low levels. The GRAPES variational data assimilation system adopts the incremental approach. Work is ongoing to develop a pre-operational 4DVAR suite with 0.25° outer-loop resolution and multiple outer-loop configurations. One 4DVAR analysis using a 6-hour assimilation window can be finished within 40 minutes when using the available conventional and satellite data. In summary, it was found that the analysis over the northern and southern hemispheres, the tropical region, and the East Asian area of GRAPES 4DVAR performed better than GRAPES 3DVAR in one-month experiments. Moreover, the forecast results show that northern and southern extra-tropical scores for GRAPES 4DVAR are already better than GRAPES 3DVAR, but the tropical performance needs further investigation. Therefore, the subsequent main improvements will aim to enhance computational efficiency and accuracy in 2017. The global GRAPES 4DVAR is planned for operation in 2018.
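
    The standard correctness checks mentioned above include the dot-product (adjoint) test: for the tangent-linear operator M and its adjoint M*, the identity <M dx, y> = <dx, M* y> must hold to machine precision. A minimal sketch, with a random matrix standing in for the GRAPES tangent-linear model:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 50
      M = rng.standard_normal((n, n))     # stand-in tangent-linear operator
      M_adj = M.T                         # its adjoint

      dx = rng.standard_normal(n)         # state perturbation
      y = rng.standard_normal(n)

      lhs = np.dot(M @ dx, y)             # <M dx, y>
      rhs = np.dot(dx, M_adj @ y)         # <dx, M* y>
      print(f"relative error: {abs(lhs - rhs) / abs(lhs):.2e}")   # ~1e-16 expected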

  15. A sense of life: computational and experimental investigations with models of biochemical and evolutionary processes.

    PubMed

    Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael

    2003-01-01

    We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus) to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  17. An extended Kalman filter approach to non-stationary Bayesian estimation of reduced-order vocal fold model parameters.

    PubMed

    Hadwin, Paul J; Peterson, Sean D

    2017-04-01

    The Bayesian framework for parameter inference provides a basis from which subject-specific reduced-order vocal fold models can be generated. Previously, it has been shown that a particle filter technique is capable of producing estimates and associated credibility intervals of time-varying reduced-order vocal fold model parameters. However, the particle filter approach is difficult to implement and has a high computational cost, which can be barriers to clinical adoption. This work presents an alternative estimation strategy based upon Kalman filtering, aimed at reducing the computational cost of subject-specific model development. The robustness of this approach to Gaussian and non-Gaussian noise is discussed. The extended Kalman filter (EKF) approach is found to perform very well in comparison with the particle filter technique at dramatically lower computational cost. Based upon the test cases explored, the EKF is comparable in terms of accuracy to the particle filter technique when more than 6000 particles are employed; if fewer particles are employed, the EKF actually performs better. For comparable levels of accuracy, the solution time is reduced by 2 orders of magnitude when employing the EKF. By virtue of the approximations used in the EKF, however, the credibility intervals tend to be slightly underpredicted.
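
    A generic EKF predict/update cycle is sketched below, with a scalar random-walk state and a linear observation standing in for the reduced-order vocal fold model and its measured output; it illustrates the filtering machinery, not the authors' implementation.

      import numpy as np

      def ekf_step(x, P, z, Q, R, f, F_jac, h, H_jac):
          """One EKF cycle: predict through the model f, correct with measurement z."""
          x_pred = f(x)                                # propagate state
          F = F_jac(x)
          P_pred = F @ P @ F.T + Q                     # propagate covariance
          H = H_jac(x_pred)
          S = H @ P_pred @ H.T + R                     # innovation covariance
          K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
          x_new = x_pred + K @ (z - h(x_pred))
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new

      # Toy usage: track a slowly drifting scalar parameter from noisy readings.
      x, P = np.array([0.0]), np.eye(1)
      Q, R = np.eye(1) * 1e-4, np.eye(1) * 1e-2        # process / measurement noise
      identity = lambda v: v                           # random-walk dynamics, direct observation
      jac = lambda v: np.eye(1)
      for z in [0.10, 0.12, 0.15, 0.13, 0.18]:
          x, P = ekf_step(x, P, np.array([z]), Q, R, identity, jac, identity, jac)
      print("estimate:", x, "variance:", P.ravel())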

  18. The Impact of Time Delay on the Content of Discussions at a Computer-Mediated Conference

    NASA Astrophysics Data System (ADS)

    Huntley, Byron C.; Thatcher, Andrew

    2008-11-01

    This study investigates the relationship between the content of computer-mediated discussions and the time delay between online postings. The study aims to broaden understanding of the dynamics of computer-mediated discussion regarding the time delay and the actual content of computer-mediated discussions (knowledge construction, social aspects, number of words, and number of postings), an area which has barely been researched. The computer-mediated discussions of the CybErg 2005 virtual conference served as the sample for this study. The Interaction Analysis Model [1] was utilized to analyze the level of knowledge construction in the content of the computer-mediated discussions. Correlations were computed for all combinations of the variables. The results demonstrate that knowledge construction, social aspects, and the number of words generated within postings were independent of, and not affected by, the time delay between the postings and the posting from which the reply was formulated. When greater numbers of words were utilized within postings, this was typically associated with a greater level of knowledge construction. Social aspects in the discussion were found to neither advantage nor disadvantage the overall effectiveness of the computer-mediated discussion.

  19. A COMPUTATIONAL APPROACH TO UNDERSTANDING THE CARDIAC ELECTROMECHANICAL ACTIVATION SEQUENCE IN THE NORMAL AND FAILING HEART, WITH TRANSLATION TO THE CLINICAL PRACTICE OF CRT

    PubMed Central

    Constantino, Jason; Hu, Yuxuan; Trayanova, Natalia A.

    2012-01-01

    Cardiac resynchronization therapy (CRT) is an established clinical treatment modality that aims to recoordinate contraction of the heart in dyssynchronous heart failure (DHF) patients. Although CRT reduces morbidity and mortality, a significant percentage of CRT patients fail to respond to the therapy, reflecting an insufficient understanding of the electromechanical activity of the DHF heart. Computational models of ventricular electromechanics are now poised to fill this knowledge gap and provide a comprehensive characterization of the spatiotemporal electromechanical interactions in the normal and DHF heart. The objective of this paper is to demonstrate the powerful utility of computational models of ventricular electromechanics in characterizing the relationship between the electrical and mechanical activation in the DHF heart, and how this understanding can be utilized to devise better CRT strategies. The computational research presented here exploits knowledge regarding the three-dimensional distribution of the electromechanical delay, defined as the time interval between myocyte depolarization and onset of myofiber shortening, in determining the optimal location of the LV pacing electrode for CRT. The simulation results shown here also suggest utilizing myocardial efficiency and regional energy consumption as a guide to optimize CRT. PMID:22884712
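
    The electromechanical delay defined above can be computed per location as the interval between two threshold crossings. A toy sketch with synthetic transmembrane potential and fiber strain traces (not the paper's simulation output):

      import numpy as np

      t = np.linspace(0.0, 400.0, 4001)                       # time in ms
      v_m = -85.0 + 120.0 / (1.0 + np.exp(-(t - 50.0)))       # depolarisation around 50 ms
      strain = np.where(t > 80.0, -0.1 * (1.0 - np.exp(-(t - 80.0) / 40.0)), 0.0)

      t_depol = t[np.argmax(v_m > -40.0)]                     # threshold crossing of V_m
      t_shorten = t[np.argmax(strain < -1e-3)]                # onset of fiber shortening
      print(f"electromechanical delay = {t_shorten - t_depol:.1f} ms")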

  20. A study on strategic provisioning of cloud computing services.

    PubMed

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Through successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics, can be guaranteed. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  1. Computer-aided design and rapid prototyping-assisted contouring of costal cartilage graft for facial reconstructive surgery.

    PubMed

    Lee, Shu Jin; Lee, Heow Pueh; Tse, Kwong Ming; Cheong, Ee Cherk; Lim, Siak Piang

    2012-06-01

    Complex 3-D defects of the facial skeleton are difficult to reconstruct with freehand carving of autogenous bone grafts. Onlay bone grafts are hard to carve and are associated with imprecise graft-bone interface contact and bony resorption. Autologous cartilage is well established in ear reconstruction as it is easy to carve and is associated with minimal resorption. In the present study, we aimed to reconstruct the hypoplastic orbitozygomatic region in a patient with left hemifacial microsomia using computer-aided design and rapid prototyping to facilitate costal cartilage carving and grafting. A three-step process of (1) 3-D reconstruction of the computed tomographic image, (2) mirroring the facial skeleton, and (3) modeling and rapid prototyping of the left orbitozygomaticomalar region and reconstruction template was performed. The template aided in donor site selection and extracorporeal contouring of the rib cartilage graft to allow for an accurate fit of the graft to the bony model prior to final fixation in the patient. We are able to refine the existing computer-aided design and rapid prototyping methods to allow for extracorporeal contouring of grafts and present rib cartilage as a good alternative to bone for autologous reconstruction.

  2. A Study on Strategic Provisioning of Cloud Computing Services

    PubMed Central

    Rejaul Karim Chowdhury, Md

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Through successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics, can be guaranteed. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified. PMID:25032243

  3. Development and validation of real-time simulation of X-ray imaging with respiratory motion.

    PubMed

    Vidal, Franck P; Villard, Pierre-Frédéric

    2016-04-01

    We present a framework that combines evolutionary optimisation, soft tissue modelling and ray tracing on GPU to simultaneously compute the respiratory motion and X-ray imaging in real-time. Our aim is to provide validated building blocks with high fidelity to closely match both the human physiology and the physics of X-rays. A CPU-based set of algorithms is presented to model organ behaviours during respiration. Soft tissue deformation is computed with an extension of the Chain Mail method. Rigid elements move according to kinematic laws. A GPU-based surface rendering method is proposed to compute the X-ray image using the Beer-Lambert law. It is provided as an open-source library. A quantitative validation study is provided to objectively assess the accuracy of both components: (i) the respiration against anatomical data, and (ii) the X-ray against the Beer-Lambert law and the results of Monte Carlo simulations. Our implementation can be used in various applications, such as an interactive medical virtual environment to train percutaneous transhepatic cholangiography in interventional radiology, 2D/3D registration, computation of digitally reconstructed radiographs, and simulation of 4D sinograms to test tomography reconstruction tools. Copyright © 2015 Elsevier Ltd. All rights reserved.
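
    The Beer-Lambert law at the heart of the rendering step states that transmitted intensity decays exponentially with the integral of the attenuation coefficient along the ray. A one-ray sketch with illustrative attenuation values (not the library's API):

      import numpy as np

      I0 = 1.0                                       # incident beam intensity
      mu = np.array([0.02, 0.05, 0.50])              # 1/mm: soft tissue, liver, bone (illustrative)
      path = np.array([120.0, 40.0, 10.0])           # mm of each material crossed by one ray

      I = I0 * np.exp(-np.sum(mu * path))            # Beer-Lambert attenuation
      print(f"transmitted fraction: {I / I0:.2e}")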

  4. An Empirical Examination of EFL Learners' Perceptual Learning Styles and Acceptance of ASR-Based Computer-Assisted Pronunciation Training

    ERIC Educational Resources Information Center

    Hsu, Liwei

    2016-01-01

    This study aims to explore the structural relationships among the variables of EFL (English as a foreign language) learners' perceptual learning styles and Technology Acceptance Model (TAM). Three hundred and forty-one (n = 341) EFL learners were invited to join a self-regulated English pronunciation training program through automatic speech…

  5. Factors Affecting Students' Acceptance of Tablet PCs: A Study in Italian High Schools

    ERIC Educational Resources Information Center

    Cacciamani, Stefano; Villani, Daniela; Bonanomi, Andrea; Carissoli, Claudia; Olivari, Maria Giulia; Morganti, Laura; Riva, Giuseppe; Confalonieri, Emanuela

    2018-01-01

    To maximize the advantages of the tablet personal computer (TPC) at school, this technology needs to be accepted by students as new tool for learning. With reference to the Technology Acceptance Model and the Unified Theory of Acceptance and Use of Technology, the aims of this study were (a) to analyze factors influencing high school students'…

  6. Air pollution-induced health impacts on the national economy of China: demonstration of a computable general equilibrium approach.

    PubMed

    Wan, Yue; Yang, Hongwei; Masui, Toshihiko

    2005-01-01

    At the present time, ambient air pollution is a serious public health problem in China. Based on the concentration-response relationships provided by international and domestic epidemiologic studies, the authors estimated the mortality and morbidity induced by the ambient air pollution of 2000. To address the mechanism of the health impact on the national economy, the authors applied a computable general equilibrium (CGE) model, named AIM/Material China, containing 39 production sectors and 32 commodities. AIM/Material analyzes changes in the gross domestic product (GDP), final demand, and production activity originating from health damages. If ambient air quality had met Grade II of China's air quality standard in 2000, then the avoidable GDP loss would have been 0.38‰ of the national total, of which 95% was due to labor loss. Comparatively, medical expenditure had less impact on the national economy, which is explained in terms of final demand by commodity and production activity by sector. The authors conclude that the CGE model is a suitable tool for assessing health impacts from the point of view of the national economy, as shown through the discussion of its applicability.

  7. Computational tools for comparative phenomics; the role and promise of ontologies

    PubMed Central

    Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert

    2012-01-01

    A major aim of the biological sciences is to gain an understanding of human physiology and disease. One important step towards such a goal is the discovery of the function of genes that will lead to better understanding of the physiology and pathophysiology of organisms ultimately providing better understanding, diagnosis, and therapy. Our increasing ability to phenotypically characterise genetic variants of model organisms coupled with systematic and hypothesis-driven mutagenesis is resulting in a wealth of information that could potentially provide insight to the functions of all genes in an organism. The challenge we are now facing is to develop computational methods that can integrate and analyse such data. The introduction of formal ontologies that make their semantics explicit and accessible to automated reasoning promises the tantalizing possibility of standardizing biomedical knowledge allowing for novel, powerful queries that bridge multiple domains, disciplines, species and levels of granularity. We review recent computational approaches that facilitate the integration of experimental data from model organisms with clinical observations in humans. These methods foster novel cross species analysis approaches, thereby enabling comparative phenomics and leading to the potential of translating basic discoveries from the model systems into diagnostic and therapeutic advances at the clinical level. PMID:22814867

  8. A pointing facilitation system for motor-impaired users combining polynomial smoothing and time-weighted gradient target prediction models.

    PubMed

    Blow, Nikolaus; Biswas, Pradipta

    2017-01-01

    As computers become more and more essential for everyday life, people who cannot use them are missing out on an important tool. The predominant method of interaction with a screen is a mouse, and difficulty in using a mouse can be a huge obstacle for people who would otherwise gain great value from using a computer. If mouse pointing were to be made easier, then a large number of users may be able to begin using a computer efficiently where they may previously have been unable to. The present article aimed to improve pointing speeds for people with arm or hand impairments. The authors investigated different smoothing and prediction models on a stored data set involving 25 people, and the best of these algorithms were chosen. A web-based prototype was developed combining a polynomial smoothing algorithm with a time-weighted gradient target prediction model. The adapted interface gave an average improvement of 13.5% in target selection times in a 10-person study of representative users of the system. A demonstration video of the system is available at https://youtu.be/sAzbrKHivEY.
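
    The smoothing half of the combination can be sketched as fitting a low-order polynomial to recent cursor samples and evaluating it at the latest timestamp, damping tremor while preserving the overall movement; the window length and order below are illustrative choices, not the authors' settings.

      import numpy as np

      def smooth_pointer(xs, ys, ts, order=2):
          """Fit low-order polynomials to recent samples; return smoothed (x, y) at ts[-1]."""
          px = np.polynomial.polynomial.polyfit(ts, xs, order)
          py = np.polynomial.polynomial.polyfit(ts, ys, order)
          return (np.polynomial.polynomial.polyval(ts[-1], px),
                  np.polynomial.polynomial.polyval(ts[-1], py))

      # Noisy samples of a pointer moving along a straight line (pixels, with tremor).
      rng = np.random.default_rng(3)
      ts = np.linspace(0.0, 0.5, 20)                      # seconds
      xs = 100.0 + 400.0 * ts + rng.normal(0.0, 4.0, ts.size)
      ys = 200.0 + 150.0 * ts + rng.normal(0.0, 4.0, ts.size)
      print("smoothed position:", smooth_pointer(xs, ys, ts))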

  9. Regression-based adaptive sparse polynomial dimensional decomposition for sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Tang, Kunkun; Congedo, Pietro; Abgrall, Remi

    2014-11-01

    Polynomial dimensional decomposition (PDD) is employed in this work for global sensitivity analysis and uncertainty quantification of stochastic systems subject to a large number of random input variables. Due to the intimate structure between PDD and Analysis-of-Variance, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices when compared to polynomial chaos (PC). Unfortunately, the number of PDD terms grows exponentially with respect to the size of the input random vector, which makes the computational cost of the standard method unaffordable for real engineering applications. In order to address this curse of dimensionality, this work proposes a variance-based adaptive strategy aiming to build a cheap meta-model by sparse PDD, with the PDD coefficients computed by regression. During this adaptive procedure, the model representation by PDD contains only a few terms, so that the cost of repeatedly solving the linear system of the least-squares regression problem is negligible. The size of the final sparse-PDD representation is much smaller than the full PDD, since only significant terms are eventually retained. Consequently, far fewer calls to the deterministic model are required to compute the final PDD coefficients.
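
    The regression step can be illustrated in one dimension: expansion coefficients of an orthonormal polynomial basis are estimated by least squares from model evaluations, and only terms with a significant variance contribution are retained. Legendre polynomials and the toy model below are placeholders for the PDD basis and the deterministic solver:

      import numpy as np

      rng = np.random.default_rng(4)
      n_samples, max_order = 200, 5
      x = rng.uniform(-1.0, 1.0, n_samples)
      # Toy "deterministic model": constant + P_1 + P_2 content, plus small noise.
      y = 1.0 + 0.8 * x + 0.3 * (3 * x**2 - 1) / 2
      y = y + 0.01 * rng.standard_normal(n_samples)

      # Design matrix of orthonormalised Legendre polynomials P_0..P_5.
      Phi = np.column_stack([
          np.polynomial.legendre.legval(x, np.eye(max_order + 1)[k]) * np.sqrt(2 * k + 1)
          for k in range(max_order + 1)
      ])
      coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)

      # Variance contributions of the non-constant terms; keep only significant ones.
      var_contrib = coeffs[1:] ** 2
      keep = 1 + np.flatnonzero(var_contrib > 1e-3 * var_contrib.sum())
      print("retained basis terms:", keep)
      print("their coefficients:  ", np.round(coeffs[keep], 3))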

  10. Comparison of Gunshot Entrance Morphologies Caused by .40-Caliber Smith & Wesson, .380-Caliber, and 9-mm Luger Bullets: A Finite Element Analysis Study

    PubMed Central

    Matoso, Rodrigo Ivo; Freire, Alexandre Rodrigues; Santos, Leonardo Soriano de Mello; Daruge Junior, Eduardo; Rossi, Ana Claudia; Prado, Felippe Bevilacqua

    2014-01-01

    Firearms can cause fatal wounds, which can be identified by traces on or around the body. However, there are cases where neither the bullet nor the gun is found at the crime scene. Ballistic research involving finite element models can reproduce computational biomechanical conditions, without compromising bioethics, as they involve no direct tests on animals or humans. This study aims to compare the morphologies of gunshot entrance holes caused by .40-caliber Smith & Wesson (S&W), .380-caliber, and 9×19-mm Luger bullets. A fully metal-jacketed .40 S&W projectile, a fully metal-jacketed .380 projectile, and a fully metal-jacketed 9×19-mm Luger projectile were computationally fired at the glabellar region of the finite element model from a distance of 10 cm, at perpendicular incidence. The results show different morphologies in the entrance holes produced by the three bullets, using the same skull at the same shot distance. The results and traits of the entrance holes are discussed. Finite element models allow feasible computational ballistic research, which may be useful to forensic experts when comparing and analyzing data related to gunshot wounds in the forehead. PMID:25343337

  11. Knowledge-acquisition tools for medical knowledge-based systems.

    PubMed

    Lanzola, G; Quaglini, S; Stefanelli, M

    1995-03-01

    Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European Project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture, so that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated with examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.

  12. Networks and games for precision medicine.

    PubMed

    Biane, Célia; Delaplace, Franck; Klaudel, Hanna

    2016-12-01

    Recent advances in omics technologies provide the leverage for the emergence of precision medicine, which aims at personalizing therapy to the patient. In this undertaking, computational methods play a central role in assisting physicians in their clinical decision-making by combining data analysis and systems biology modelling. Complex diseases such as cancer or diabetes arise from the intricate interplay of various biological molecules. Therefore, assessing drug efficiency requires studying the effects of elementary perturbations caused by diseases on relevant biological networks. In this paper, we propose a computational framework called Network-Action Game, applied to the best drug selection problem, combining Game Theory and discrete models of dynamics (Boolean networks). Decision-making is modelled using Game Theory, which defines the process of drug selection among alternative possibilities, while Boolean networks are used to model the effects of the interplay between disease and drug actions on the patient's molecular system. The actions/strategies of disease and drugs are focused on arc alterations of the interactome. The efficiency of this framework has been evaluated for drug prediction on a model of breast cancer signalling. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
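
    A minimal synchronous Boolean network sketch of the kind used to model a perturbation's effect on signalling dynamics is given below; the three-node network and the "drug" action (forcing one interaction arc off) are invented for illustration and do not correspond to the paper's breast cancer model.

      # Synchronous update of a three-node Boolean network; a "drug" removes the
      # interaction arc A->B, changing the long-run behaviour of output node C.
      def step(state, removed_arcs=frozenset()):
          a, b, c = state["A"], state["B"], state["C"]
          return {
              "A": a,                                    # input node, held fixed
              "B": a and "A->B" not in removed_arcs,     # A activates B via arc A->B
              "C": b and not c,                          # B activates C; C self-inhibits
          }

      def trajectory(state, n_steps, removed_arcs=frozenset()):
          states = [state]
          for _ in range(n_steps):
              states.append(step(states[-1], removed_arcs))
          return [s["C"] for s in states]                # track the output node

      init = {"A": True, "B": False, "C": False}
      print("disease dynamics: ", trajectory(init, 5))            # C oscillates
      print("with A->B removed:", trajectory(init, 5, {"A->B"}))  # C stays off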

  13. Imaging genetics approach to predict progression of Parkinson's diseases.

    PubMed

    Kim, Mansu; Son, Seong-Jin; Park, Hyunjin

    2017-07-01

    Imaging genetics is a tool to extract genetic variants associated with both clinical phenotypes and imaging information. The approach can extract additional genetic variants compared to conventional approaches to better investigate various diseased conditions. Here, we applied imaging genetics to study Parkinson's disease (PD). We aimed to extract significant features derived from imaging genetics and neuroimaging. We built a regression model based on extracted significant features combining genetics and neuroimaging to better predict clinical scores of PD progression (i.e., MDS-UPDRS). Our model yielded a high correlation (r = 0.697, p < 0.001) and a low root mean squared error (8.36) between predicted and actual MDS-UPDRS scores. Neuroimaging predictors (from 123I-Ioflupane SPECT) of the regression model were computed using an independent component analysis approach. Genetic features were computed using an imaging genetics approach based on the identified neuroimaging features as intermediate phenotypes. Joint modeling of neuroimaging and genetics could provide complementary information and thus has the potential to provide further insight into the pathophysiology of PD. Our model included newly found neuroimaging features and genetic variants which need further investigation.
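
    Schematically, the final modelling step regresses clinical scores on a feature matrix concatenating neuroimaging components and genetic variants, and evaluates the fit out-of-sample. All arrays below are randomly generated placeholders; only the shape of the analysis mirrors the study:

      import numpy as np
      from scipy.stats import pearsonr
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(5)
      n_subjects = 120
      X_imaging = rng.standard_normal((n_subjects, 5))     # e.g. ICA component loadings
      X_genetic = rng.integers(0, 3, (n_subjects, 4))      # minor-allele counts (0/1/2)
      X = np.hstack([X_imaging, X_genetic])
      y = X @ rng.standard_normal(X.shape[1]) + rng.normal(0.0, 2.0, n_subjects)

      y_pred = cross_val_predict(LinearRegression(), X, y, cv=10)   # out-of-sample predictions
      r, p = pearsonr(y, y_pred)
      rmse = float(np.sqrt(np.mean((y - y_pred) ** 2)))
      print(f"r = {r:.3f} (p = {p:.3g}), RMSE = {rmse:.2f}")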

  14. Hierarchical Processing of Auditory Objects in Humans

    PubMed Central

    Kumar, Sukhbinder; Stephan, Klaas E; Warren, Jason D; Friston, Karl J; Griffiths, Timothy D

    2007-01-01

    This work examines the computational architecture used by the brain during the analysis of the spectral envelope of sounds, an important acoustic feature for defining auditory objects. Dynamic causal modelling and Bayesian model selection were used to evaluate a family of 16 network models explaining functional magnetic resonance imaging responses in the right temporal lobe during spectral envelope analysis. The models encode different hypotheses about the effective connectivity between Heschl's Gyrus (HG), containing the primary auditory cortex, planum temporale (PT), and superior temporal sulcus (STS), and the modulation of that coupling during spectral envelope analysis. In particular, we aimed to determine whether information processing during spectral envelope analysis takes place in a serial or parallel fashion. The analysis provides strong support for a serial architecture with connections from HG to PT and from PT to STS and an increase of the HG to PT connection during spectral envelope analysis. The work supports a computational model of auditory object processing, based on the abstraction of spectro-temporal “templates” in the PT before further analysis of the abstracted form in anterior temporal lobe areas. PMID:17542641

  15. Computer graphics testbed to simulate and test vision systems for space applications

    NASA Technical Reports Server (NTRS)

    Cheatham, John B.; Wu, Chris K.; Lin, Y. H.

    1991-01-01

    A system was developed for displaying computer graphics images of space objects, and the use of the system was demonstrated as a testbed for evaluating vision systems for space applications. In order to evaluate vision systems, it is desirable to be able to control all factors involved in creating the images used for processing by the vision system. Considerable time and expense are involved in building accurate physical models of space objects. Also, precise location of the model relative to the viewer and accurate location of the light source require additional effort. As part of this project, graphics models of space objects such as the Solarmax satellite were created in which the user can control the light direction and the relative position of the object and the viewer. The work is also aimed at providing control of hue, shading, noise, and shadows for use in demonstrating and testing image processing techniques. The simulated camera data can provide XYZ coordinates, pitch, yaw, and roll for the models. A physical model is also being used to provide comparison of camera images with the graphics images.

  16. Plate refractive camera model and its applications

    NASA Astrophysics Data System (ADS)

    Huang, Longxiang; Zhao, Xu; Cai, Shen; Liu, Yuncai

    2017-03-01

    In real applications, a pinhole camera capturing objects through a planar parallel transparent plate is frequently employed. Due to the refractive effects of the plate, such an imaging system does not comply with the conventional pinhole camera model. Although the system is ubiquitous, it has not been thoroughly studied. This paper aims at presenting a simple virtual camera model, called the plate refractive camera model, which has a form similar to a pinhole camera model and can efficiently model refractions through a plate. The key idea is to employ a pixel-wise viewpoint concept to encode the refraction effects into a pixel-wise pinhole camera model. The proposed camera model realizes an efficient forward projection computation method and has some advantages in applications. First, the model can help to compute the caustic surface to represent the changes of the camera viewpoints. Second, the model has strengths in analyzing and rectifying the image caustic distortion caused by the plate refraction effects. Third, the model can be used to calibrate the camera's intrinsic parameters without removing the plate. Last but not least, the model contributes to putting forward plate refractive triangulation methods that solve the plate refractive triangulation problem easily in multiple views. We verify our theory in both synthetic and real experiments.

  17. Cloud-based large-scale air traffic flow optimization

    NASA Astrophysics Data System (ADS)

    Cao, Yi

    The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. Firstly, a new aggregate model, called the Link Transmission Model (LTM), is proposed, which models the nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of the distribution of traversal times in historical flight records, and the mode is estimated by using Kernel Density Estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of LTM is validated against recorded traffic data. Secondly, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on LTM. The optimization problem aims at alleviating traffic congestion with minimal global delays. This problem is intractable due to millions of variables. A dual decomposition method is applied to decompose the large-scale problem so that the subproblems become solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that pursues integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to a linear runtime increase when the problem size increases. To address the computational efficiency problem, a parallel computing framework is designed which accommodates concurrent executions via multithreaded programming. The multithreaded version is compared with its monolithic version to show the decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability. This framework is an "off-the-shelf" parallel computing model that can be used for both offline historical traffic data analysis and online traffic flow optimization. It provides an efficient and robust platform for easy deployment and implementation. A small cloud consisting of five workstations was configured and used to demonstrate the advantages of cloud computing in dealing with large-scale parallelizable traffic problems.
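
    The traversal-time estimate used by the LTM can be sketched directly: fit a kernel density to historical flight times for one origin-destination route and take the density peak as the nominal traversal time. The flight records below are synthetic:

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(6)
      # Synthetic historical traversal times (minutes) for one origin-destination route.
      times = np.concatenate([
          rng.normal(128.0, 6.0, 400),    # typical flights
          rng.normal(155.0, 10.0, 60),    # a delayed minority (weather, reroutes)
      ])

      kde = gaussian_kde(times)
      grid = np.linspace(times.min(), times.max(), 1000)
      mode = grid[np.argmax(kde(grid))]   # density peak = nominal traversal time
      print(f"estimated traversal time (mode): {mode:.1f} min")   # ~128 min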

  18. Intentions of hospital nurses to work with computers: based on the theory of planned behavior.

    PubMed

    Shoham, Snunith; Gonen, Ayala

    2008-01-01

    The purpose of this study was to determine registered nurses' attitudes related to intent to use computers in the hospital setting as a predictor of their future behavior. The study was further aimed at identifying the relationship between these attitudes and selected sociological, professional, and personal factors, and at describing a research model integrating these various factors. The study was based on the theory of planned behavior. A random sample of 411 registered nurses was selected from a single large medical center in Israel. The study tool was a Likert-style questionnaire. Nine different indices were used: (1) behavioral intention toward computer use; (2) general attitudes toward computer use; (3) nursing attitudes toward computer use; (4) threat involved in computer use; (5) challenge involved in computer use; (6) organizational climate; (7) departmental climate; (8) attraction to technological innovations/innovativeness; (9) self-efficacy, ability to control behavior. Strong significant positive correlations were found between the nurses' attitudes (general attitudes and nursing attitudes), self-efficacy, innovativeness, and intentions to use computers. Higher correlations were found between departmental climate and attitudes than between organizational climate and attitudes. The threat and challenge involved in computer use were shown to be important mediating variables in understanding the process of predicting attitudes and intentions toward using computers.

  19. The effect of in situ/in vitro three-dimensional quantitative computed tomography image voxel size on the finite element model of human vertebral cancellous bone.

    PubMed

    Lu, Yongtao; Engelke, Klaus; Glueer, Claus-C; Morlock, Michael M; Huber, Gerd

    2014-11-01

    The quantitative computed tomography-based finite element modeling technique is a promising clinical tool for the prediction of bone strength. However, quantitative computed tomography-based finite element models have been created from image datasets with different image voxel sizes. The aim of this study was to investigate whether there is an influence of image voxel size on the finite element models. In all, 12 thoracolumbar vertebrae were scanned prior to autopsy (in situ) using two different quantitative computed tomography scan protocols, which resulted in image datasets with two different voxel sizes (0.29 × 0.29 × 1.3 mm³ vs. 0.18 × 0.18 × 0.6 mm³). Eight of them were scanned after autopsy (in vitro) and the datasets were reconstructed with two voxel sizes (0.32 × 0.32 × 0.6 mm³ vs. 0.18 × 0.18 × 0.3 mm³). Finite element models with a cuboid volume of interest extracted from the vertebral cancellous part were created, and inhomogeneous bilinear bone properties were defined. Axial compression was simulated. No effect of voxel size was detected on the apparent bone mineral density for either the in situ or in vitro cases. However, the apparent modulus and yield strength showed significant differences between the two voxel-size groups (in situ and in vitro). In conclusion, the image voxel size may have to be considered when the finite element voxel modeling technique is used in clinical applications. © IMechE 2014.
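
    Why apparent density is insensitive to voxel size while stiffness-related quantities need not be can be illustrated with block averaging: coarsening preserves the mean density exactly but smooths away the trabecular contrast that drives finite element stiffness. The density field below is synthetic, not CT data:

      import numpy as np

      rng = np.random.default_rng(7)
      # Binary trabecular-like field: marrow (0.05) and bone (0.9) densities in g/cm^3.
      fine = rng.choice([0.05, 0.9], size=(60, 60, 60), p=[0.7, 0.3])

      # Coarsen by averaging 2x2x2 blocks of fine voxels into one coarse voxel.
      coarse = fine.reshape(30, 2, 30, 2, 30, 2).mean(axis=(1, 3, 5))

      print(f"apparent BMD  fine: {fine.mean():.4f}   coarse: {coarse.mean():.4f}")  # identical
      print(f"density std   fine: {fine.std():.4f}   coarse: {coarse.std():.4f}")    # contrast lost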

  20. Modeling plant growth and development.

    PubMed

    Prusinkiewicz, Przemyslaw

    2004-02-01

    Computational plant models or 'virtual plants' are increasingly seen as a useful tool for comprehending complex relationships between gene function, plant physiology, plant development, and the resulting plant form. The theory of L-systems, which was introduced by Lindenmayer in 1968, has led to a well-established methodology for simulating the branching architecture of plants. Many current architectural models provide insights into the mechanisms of plant development by incorporating physiological processes, such as the transport and allocation of carbon. Other models aim at elucidating the geometry of plant organs, including flower petals and apical meristems, and are beginning to address the relationship between patterns of gene expression and the resulting plant form.
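
    The core of an L-system is parallel string rewriting: every symbol of the current string is replaced simultaneously according to production rules, and the resulting strings are interpreted graphically (for example with turtle graphics) to draw branching structures. A minimal sketch with a classic plant-like rule set:

      # Parallel rewriting of an L-system string.
      def rewrite(axiom, rules, iterations):
          s = axiom
          for _ in range(iterations):
              s = "".join(rules.get(ch, ch) for ch in s)   # all symbols replaced at once
          return s

      # A classic plant-like system: X branches ([ and ] delimit branches), F elongates.
      rules = {"X": "F[+X][-X]FX", "F": "FF"}
      print(rewrite("X", rules, 2))
      # -> FF[+F[+X][-X]FX][-F[+X][-X]FX]FFF[+X][-X]FX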

  1. A combustion model of vegetation burning in "Tiger" fire propagation tool

    NASA Astrophysics Data System (ADS)

    Giannino, F.; Ascoli, D.; Sirignano, M.; Mazzoleni, S.; Russo, L.; Rego, F.

    2017-11-01

    In this paper, we propose a semi-physical model for the burning of vegetation in a wildland fire. The main physical-chemical processes involved in fire spreading are modelled through a set of ordinary differential equations, which describe the combustion process as linearly related to the consumption of fuel. The water evaporation process from leaves and wood is also considered. Mass and energy balance equations are written for the fuel (leaves and wood), assuming that the combustion process is homogeneous in space. The model is developed with the final aim of simulating large-scale wildland fires which spread over heterogeneous landscapes, while keeping the computational cost very low.
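
    The modelling idea can be sketched as a small ODE system: fuel mass is consumed at a rate proportional to the remaining fuel once an ignition temperature is reached, the released heat drives the fuel-bed temperature, and water evaporation acts as an additional heat sink. All coefficients below are invented for illustration and are not the Tiger model's parameters:

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, y):
          m_fuel, m_water, T = y                      # kg/m^2, kg/m^2, K
          burning = T > 500.0                         # assumed ignition threshold
          dm_fuel = -0.05 * m_fuel if burning else 0.0        # fuel consumption
          dm_water = -0.02 * m_water if T > 373.0 else 0.0    # water evaporation
          # Heat release from burning, minus evaporation sink and losses to ambient,
          # expressed directly as a temperature rate (lumped heat capacity assumed).
          dT = -dm_fuel * 300.0 + dm_water * 50.0 - 0.05 * (T - 300.0)
          return [dm_fuel, dm_water, dT]

      sol = solve_ivp(rhs, (0.0, 300.0), [2.0, 0.4, 600.0], max_step=1.0)
      print(f"fuel remaining after {sol.t[-1]:.0f} s: {sol.y[0, -1]:.3f} kg/m^2")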

  2. The Hydrodynamic Study of the Swimming Gliding: a Two-Dimensional Computational Fluid Dynamics (CFD) Analysis.

    PubMed

    Marinho, Daniel A; Barbosa, Tiago M; Rouboa, Abel I; Silva, António J

    2011-09-01

    Nowadays the underwater gliding after the starts and the turns plays a major role in the overall swimming performance. Hence, minimizing hydrodynamic drag during the underwater phases should be a main aim during swimming. Indeed, there are several postures that swimmers can assume during the underwater gliding, although experimental results have not been conclusive concerning the best body position to accomplish this aim. Therefore, the purpose of this study was to analyse the effect on hydrodynamic drag forces of using different body positions during gliding through computational fluid dynamics (CFD) methodology. For this purpose, two-dimensional models of the human body in steady flow conditions were studied. Two-dimensional virtual models were created: (i) a prone position with the arms extended at the front of the body; (ii) a prone position with the arms placed alongside the trunk; (iii) a lateral position with the arms extended at the front; and (iv) a dorsal position with the arms extended at the front. The drag forces were computed between speeds of 1.6 m/s and 2 m/s in a two-dimensional Fluent® analysis. The positions with the arms extended at the front presented lower drag values than the position with the arms alongside the trunk. The lateral position was the one in which the drag was lowest and seems to be the one that should be adopted during the gliding after starts and turns.

  3. Construction of three-dimensional tooth model by micro-computed tomography and application for data sharing.

    PubMed

    Kato, A; Ohno, N

    2009-03-01

    The study of dental morphology is essential in terms of phylogeny. Advances in three-dimensional (3D) measurement devices have enabled us to make 3D images of teeth without destruction of samples. However, raw fundamental data on tooth shape requires complex equipment and techniques. An online database of 3D teeth models is therefore indispensable. We aimed to explore the basic methodology for constructing 3D teeth models, with application for data sharing. Geometric information on the human permanent upper left incisor was obtained using micro-computed tomography (micro-CT). Enamel, dentine, and pulp were segmented by thresholding of different gray-scale intensities. Segmented data were separately exported in STereo-Lithography Interface Format (STL). STL data were converted to Wavefront OBJ (OBJect), as many 3D computer graphics programs support the Wavefront OBJ format. Data were also applied to Quick Time Virtual Reality (QTVR) format, which allows the image to be viewed from any direction. In addition to Wavefront OBJ and QTVR data, the original CT series were provided as 16-bit Tag Image File Format (TIFF) images on the website. In conclusion, 3D teeth models were constructed in general-purpose data formats, using micro-CT and commercially available programs. Teeth models that can be used widely would benefit all those who study dental morphology.

  4. Global mean dynamic topography based on GOCE data and Wiener filters

    NASA Astrophysics Data System (ADS)

    Gilardoni, Maddalena; Reguzzoni, Mirko; Albertella, Alberta

    2015-04-01

    A mean dynamic ocean topography (MDT) has been computed by using a GOCE-only gravity model and a given mean sea surface (MSS) obtained from satellite altimetry. Since the gravity model used, i.e. the fifth release of the time-wise solution covering the full mission lifetime, is truncated at a maximum harmonic degree of 280, the obtained MDT has to be consistently filtered. This has been done globally by using the spherical harmonic representation and following a Wiener minimization principle. This global filtering approach is convenient from the computational point of view, but requires MDT values all over the Earth's surface and therefore requires filling the continents with fictitious data. The main improvements with respect to the previously presented results are in the MDT filling procedure (to guarantee that the global signal has the same covariance as the one over the oceans), in the error modelling of the input MSS, and in the error estimation of the filtered MDT and of the corresponding geostrophic velocities. The impact of GOCE data on global ocean circulation modelling has been assessed by comparing the pattern of the obtained geostrophic currents with those computed by using EGM2008. Comparisons with independent circulation data based on drifters and with other MDT models have also been performed, with the aim of evaluating the accuracy of the obtained results.
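
    In the spectral domain, a Wiener filter weights each spherical harmonic degree l by the ratio of signal to signal-plus-noise degree variances, w_l = s_l / (s_l + n_l), which minimises the expected mean squared error. The degree-variance curves below are invented illustrative shapes, not the GOCE/MSS error models:

      import numpy as np

      l = np.arange(2, 281)                           # spherical harmonic degrees up to 280
      signal = 1.0e-3 * l.astype(float) ** -2.5       # assumed MDT signal degree variances
      noise = 1.0e-9 * np.exp(l / 35.0)               # assumed commission-error degree variances

      w = signal / (signal + noise)                   # Wiener weight per degree
      print("weight at degree 10:", round(float(w[l == 10][0]), 4))
      print("first degree with weight < 0.5:", int(l[np.argmax(w < 0.5)]))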

  5. Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs

    NASA Astrophysics Data System (ADS)

    Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen

    2018-02-01

    Nowadays, mathematical models and computer simulation allow the analysis of many different technological solutions, as well as the testing of various scenarios, in a short time and at low cost, in order to simulate a scenario under conditions typical of the real system and help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation processes in the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in earlier work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and, in turn, renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save money, and decrease pollutant emissions to water and air. In this context, computer simulation can be an easy and very useful tool to improve efficiency without interfering with the actual process performance.

  6. Refinement, Reduction, and Replacement of Animal Toxicity Tests by Computational Methods.

    PubMed

    Ford, Kevin A

    2016-12-01

    Widespread public and scientific interest in promoting the care and well-being of animals used for toxicity testing has given rise to improvements in animal welfare practices and views over time, as well as laws and regulations that support means to reduce, refine, and replace animal use (known as the 3Rs) in certain toxicity studies. One way these regulations continue to achieve their aim is by promoting the research, development, and application of alternative testing approaches to characterize potential toxicities either without animals or with minimal use. An important example of an alternative approach is the use of computational toxicology models. Along with the potential capacity to reduce or replace the use of animals for the assessment of particular toxicological endpoints, computational models offer several advantages compared to in vitro and in vivo approaches, including cost-effectiveness, rapid availability of results, and the ability to fully standardize procedures. Pharmaceutical research incorporating the use of computational models has increased steadily over the past 15 years, likely driven by the motivation of companies to screen out toxic compounds in the early stages of development. Models are currently available to aid in the prediction of several important toxicological endpoints, including mutagenicity, carcinogenicity, eye irritation, hepatotoxicity, and skin sensitization, albeit with varying degrees of success. This review serves to introduce the concepts of computational toxicology and evaluate their role in the safety assessment of compounds, while also highlighting the application of in silico methods in the support of the goal and vision of the 3Rs. © The Author 2016. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  7. Multiscale approach including microfibril scale to assess elastic constants of cortical bone based on neural network computation and homogenization method.

    PubMed

    Barkaoui, Abdelwahed; Chamekh, Abdessalem; Merzouki, Tarek; Hambli, Ridha; Mkaddem, Ali

    2014-03-01

    The complexity and heterogeneity of bone tissue require multiscale modeling to understand its mechanical behavior and its remodeling mechanisms. In this paper, a novel multiscale hierarchical approach including the microfibril scale, based on hybrid neural network (NN) computation and homogenization equations, was developed to link nanoscopic and macroscopic scales and estimate the elastic properties of human cortical bone. The multiscale model is divided into three main phases: (i) in step 0, the elastic constants of collagen-water and mineral-water composites are calculated by averaging the upper and lower Hill bounds; (ii) in step 1, the elastic properties of the collagen microfibril are computed using a trained NN simulation. Finite element calculation is performed at nanoscopic levels to provide a database to train an in-house NN program; and (iii) in steps 2-10, from fibril to continuum cortical bone tissue, homogenization equations are used to perform the computation at the higher scales. The NN outputs (elastic properties of the microfibril) are used as inputs for the homogenization computation to determine the properties of the mineralized collagen fibril. The mechanical and geometrical properties of bone constituents (mineral, collagen, and cross-links) as well as the porosity were taken into consideration. This paper aims to predict analytically the effective elastic constants of cortical bone by modeling its elastic response at these different scales, ranging from the nanostructural to mesostructural levels. The outputs of the lowest scale were well integrated with the higher levels and serve as inputs for the modelling at the next higher scale. Good agreement was obtained between our predicted results and literature data. Copyright © 2013 John Wiley & Sons, Ltd.
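
    Step 0's averaging of the upper and lower Hill bounds corresponds to the classical Voigt-Reuss-Hill estimate. A minimal sketch for a two-phase composite modulus follows; the phase values are illustrative, not the paper's calibrated inputs.

        def voigt_reuss_hill(m1, m2, f1):
            """Hill estimate for a two-phase composite modulus.

            m1, m2 : phase moduli [GPa]
            f1     : volume fraction of phase 1 (phase 2 gets 1 - f1)
            """
            f2 = 1.0 - f1
            m_voigt = f1 * m1 + f2 * m2            # upper (Voigt) bound
            m_reuss = 1.0 / (f1 / m1 + f2 / m2)    # lower (Reuss) bound
            return 0.5 * (m_voigt + m_reuss)       # Hill average

        # Illustrative numbers only: a stiff mineral phase and a water-like phase.
        print(voigt_reuss_hill(114.0, 2.3, 0.6))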

  8. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems

    PubMed Central

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-01-01

    Background We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. Results We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above mentioned) successful methods. Conclusion Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems. PMID:17081289

  9. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems.

    PubMed

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-11-02

    We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above mentioned) successful methods. Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems.
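
    A deliberately minimal sketch of the scatter-search idea underlying the two records above: keep a small, diverse reference set, combine members pairwise along connecting lines, and polish the incumbent with a local solver. This illustrates the general scheme under stated assumptions, not the authors' actual code.

        import numpy as np
        from scipy.optimize import minimize

        def scatter_search(fun, bounds, n_ref=10, n_iter=20, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds, float).T
            ref = rng.uniform(lo, hi, size=(n_ref, len(lo)))   # diverse initial set
            for _ in range(n_iter):
                children = []
                for i in range(n_ref):
                    for j in range(i + 1, n_ref):
                        lam = rng.uniform(-0.5, 1.5)           # combine along the line
                        children.append(np.clip(ref[i] + lam * (ref[j] - ref[i]), lo, hi))
                pool = np.vstack([ref] + children)
                pool = pool[np.argsort([fun(x) for x in pool])][:n_ref]
                res = minimize(fun, pool[0], bounds=list(zip(lo, hi)))  # local polish
                pool[0] = res.x
                ref = pool
            return ref[0], fun(ref[0])

        x_best, f_best = scatter_search(lambda x: np.sum((x - 1.2)**2), [(-5, 5)] * 3)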

  10. HIGH-FIDELITY SIMULATION-DRIVEN MODEL DEVELOPMENT FOR COARSE-GRAINED COMPUTATIONAL FLUID DYNAMICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanna, Botros N.; Dinh, Nam T.; Bolotnov, Igor A.

    Nuclear reactor safety analysis requires identifying various credible accident scenarios and determining their consequences. For full-scale nuclear power plant system behavior, it is impossible to obtain sufficient experimental data for a broad range of risk-significant accident scenarios. In single-phase flow convective problems, Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) can provide high-fidelity results when physical data are unavailable. However, these methods are computationally expensive and cannot be afforded for the simulation of long transient scenarios in nuclear accidents, despite extraordinary advances in high-performance scientific computing over the past decades. The major issue is the inability to parallelize the transient computation, which makes the number of time steps required by high-fidelity methods unaffordable for long transients. In this work, we propose to apply a high-fidelity simulation-driven approach to model sub-grid scale (SGS) effects in Coarse-Grained Computational Fluid Dynamics (CG-CFD). This approach aims to develop a statistical surrogate model instead of a deterministic SGS model. We chose to start with a turbulent natural convection case with volumetric heating in a horizontal fluid layer with a rigid, insulated lower boundary and an isothermal (cold) upper boundary. This scenario of unstable stratification is relevant to turbulent natural convection in a molten corium pool during a severe nuclear reactor accident, as well as to containment mixing and passive cooling. The presented approach demonstrates how to create a correction for the CG-CFD solution by modifying the energy balance equation. A global correction for the temperature equation achieves a significant improvement in the prediction of the steady-state temperature distribution through the fluid layer.
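
    The statistical-surrogate idea can be sketched as a regression from coarse-grid features to an SGS correction term. The feature and target choices below are illustrative assumptions of this sketch, not the authors' exact setup.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)
        # Hypothetical training set: coarse-grid features per cell
        # (e.g., local temperature, gradient magnitude, wall distance).
        X = rng.normal(size=(5000, 3))
        # Hypothetical target: SGS correction to the energy balance at each cell,
        # which in practice would come from filtered high-fidelity (DNS/LES) data.
        y = 0.8 * X[:, 1] - 0.3 * X[:, 0] * X[:, 2] + 0.05 * rng.normal(size=5000)

        surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
        # At run time, the CG-CFD solver would query the surrogate each step:
        correction = surrogate.predict(X[:10])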

  11. A Parallel Numerical Micromagnetic Code Using FEniCS

    NASA Astrophysics Data System (ADS)

    Nagy, L.; Williams, W.; Mitchell, L.

    2013-12-01

    Many problems in the geosciences depend on understanding the ability of magnetic minerals to provide stable paleomagnetic recordings. Numerical micromagnetic modelling allows us to calculate the domain structures found in naturally occurring magnetic materials. However, the computational cost rises exceedingly quickly with the size and complexity of the geometries that we wish to model. This problem is compounded by the fact that modern processor design no longer focuses on the speed at which calculations are performed, but rather on the number of computational units amongst which calculations may be distributed. Consequently, to better exploit modern computational resources, our micromagnetic simulations must "go parallel". We present a parallel and scalable micromagnetics code written using FEniCS. FEniCS is a multinational collaboration involving several institutions (University of Cambridge, University of Chicago, the Simula Research Laboratory, etc.) that aims to provide a set of tools for writing scientific software, in particular software that employs the finite element method. The advantages of this approach are the leveraging of pre-existing projects from the world of scientific computing (PETSc, Trilinos, Metis/ParMETIS, etc.) and exposing these so that researchers may pose problems in a manner closer to the mathematical language of their domain. Our code provides a scriptable interface (in Python) that allows users not only to run micromagnetic models in parallel, but also to perform pre- and post-processing of data.
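
    A minimal legacy-FEniCS (dolfin) sketch of the workflow such a code builds on: mesh, function space, variational problem, and a parallel solve via PETSc. This is a Poisson stand-in under stated assumptions, not the micromagnetic (Landau-Lifshitz-Gilbert) equations of the record above.

        from dolfin import (UnitCubeMesh, FunctionSpace, TrialFunction, TestFunction,
                            Function, DirichletBC, Constant, dot, grad, dx, solve)

        mesh = UnitCubeMesh(16, 16, 16)        # distributed automatically under MPI
        V = FunctionSpace(mesh, "Lagrange", 1)
        u, v = TrialFunction(V), TestFunction(V)
        bc = DirichletBC(V, Constant(0.0), "on_boundary")

        a = dot(grad(u), grad(v)) * dx         # bilinear form
        L = Constant(1.0) * v * dx             # linear form
        uh = Function(V)
        solve(a == L, uh, bc)                  # PETSc handles the parallel solve
        # Run with: mpirun -n 4 python this_script.py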

  12. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data.

    PubMed

    Guo, Ao; Ma, Jianhua

    2018-02-25

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated by the prevalent use of digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., as a Cyber-I (cyber individual). This work aims at establishing a unique and comprehensive description of an individual to mesh with various personalized services and applications. An extensive research literature exists on or related to psychological modelling, in particular on automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling of Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide plenty of personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach based on a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research focuses on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  13. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    PubMed Central

    Ma, Jianhua

    2018-01-01

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated by the prevalent use of digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., as a Cyber-I (cyber individual). This work aims at establishing a unique and comprehensive description of an individual to mesh with various personalized services and applications. An extensive research literature exists on or related to psychological modelling, in particular on automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling of Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide plenty of personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach based on a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research focuses on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model. PMID:29495343

  14. Structural contributions to fibrillatory rotors in a patient-derived computational model of the atria

    PubMed Central

    Gonzales, Matthew J.; Vincent, Kevin P.; Rappel, Wouter-Jan; Narayan, Sanjiv M.; McCulloch, Andrew D.

    2014-01-01

    Aims: The aim of this study was to investigate structural contributions to the maintenance of rotors in human atrial fibrillation (AF) and possible mechanisms of termination. Methods and results: A three-dimensional human biatrial finite element model based on patient-derived computed tomography and arrhythmia observed at electrophysiology study was used to study AF. With normal physiological electrical conductivity and effective refractory periods (ERPs), wave break failed to sustain reentrant activity or electrical rotors. With depressed excitability, decreased conduction anisotropy, and shorter ERP characteristic of AF, reentrant rotors were readily maintained. Rotors were transiently or permanently trapped by fibre discontinuities on the lateral wall of the right atrium near the tricuspid valve orifice and adjacent to the crista terminalis, both known sites of right atrial arrhythmias. Modelling inexcitable regions near the rotor tip to simulate fibrosis anchored the rotors, converting the arrhythmia to macro-reentry. Accordingly, increasing the spatial core of inexcitable tissue decreased the frequency of rotation, widened the excitable gap, and enabled an external wave to impinge on the rotor core and displace the source. Conclusion: These model findings highlight the importance of structural features in rotor dynamics and suggest that regions of fibrosis may anchor fibrillatory rotors. Increasing extent of fibrosis and scar may eventually convert fibrillation to excitable gap reentry. Such macro-reentry can then be eliminated by extending the obstacle or by external stimuli that penetrate the excitable gap. PMID:25362167

  15. Influence of speckle image reconstruction on photometric precision for large solar telescopes

    NASA Astrophysics Data System (ADS)

    Peck, C. L.; Wöger, F.; Marino, J.

    2017-11-01

    Context. High-resolution observations from large solar telescopes require adaptive optics (AO) systems to overcome image degradation caused by Earth's turbulent atmosphere. AO corrections are, however, only partial. Achieving near-diffraction-limited resolution over a large field of view typically requires post-facto techniques to reconstruct the source image. Aims: This study aims to examine the expected photometric precision of amplitude-reconstructed solar images calibrated using models for the on-axis speckle transfer functions and input parameters derived from AO control data. We perform a sensitivity analysis of the photometric precision under variations in the model input parameters for high-resolution solar images consistent with four-meter class solar telescopes. Methods: Using simulations of both atmospheric turbulence and partial compensation by an AO system, we computed the speckle transfer function under variations in the input parameters. We then convolved high-resolution numerical simulations of the solar photosphere with the simulated atmospheric transfer function, and subsequently deconvolved them with the model speckle transfer function to obtain a reconstructed image. To compute the resulting photometric precision, we compared the intensity of the original image with that of the reconstructed image. Results: The analysis demonstrates that high photometric precision can be obtained for speckle amplitude reconstruction using speckle transfer function models combined with AO-derived input parameters. Additionally, it shows that the reconstruction is most sensitive to the input parameter that characterizes the atmospheric distortion, and sub-2% photometric precision is readily obtained when that parameter is well estimated.
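
    The deconvolution step, dividing the observed image by the model speckle transfer function (STF) in the Fourier domain, can be sketched as a regularised inverse filter. The FFT layout and regularisation below are assumptions of this sketch, not the paper's code.

        import numpy as np

        def amplitude_reconstruct(observed, stf, eps=1e-3):
            """Fourier-domain deconvolution of an observed image by a model STF.

            `stf` is assumed sampled in FFT layout on the same grid; `eps`
            regularises frequencies where the STF is small. Illustrative only."""
            I = np.fft.fft2(observed)
            O = I * np.conj(stf) / (np.abs(stf)**2 + eps)
            return np.real(np.fft.ifft2(O))

        # Photometric precision check against a known source image `truth`:
        # precision = np.std((amplitude_reconstruct(obs, stf) - truth) / truth)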

  16. Variability of hemodynamic parameters using the common viscosity assumption in a computational fluid dynamics analysis of intracranial aneurysms.

    PubMed

    Suzuki, Takashi; Takao, Hiroyuki; Suzuki, Takamasa; Suzuki, Tomoaki; Masuda, Shunsuke; Dahmani, Chihebeddine; Watanabe, Mitsuyoshi; Mamori, Hiroya; Ishibashi, Toshihiro; Yamamoto, Hideki; Yamamoto, Makoto; Murayama, Yuichi

    2017-01-01

    In most simulations of intracranial aneurysm hemodynamics, blood is assumed to be a Newtonian fluid. However, it is a non-Newtonian fluid, and its viscosity profile differs among individuals. Therefore, the common viscosity assumption may not be valid for all patients. This study aims to test the suitability of the common viscosity assumption. Blood viscosity datasets were obtained from two healthy volunteers. Three simulations were performed for each of three different-sized aneurysms: two using non-Newtonian models based on the measured values and one using a Newtonian model. The parameters proposed to predict aneurysmal rupture obtained using the non-Newtonian models were compared with those obtained using the Newtonian model. The largest difference (25%) in the normalized wall shear stress (NWSS) was observed in the smallest aneurysm. Between the two non-Newtonian models, the ratio of the difference in NWSS relative to the Newtonian model differed by 17.3%. Irrespective of aneurysmal size, computational fluid dynamics simulations with either the common Newtonian or a common non-Newtonian viscosity assumption can lead to values of hemodynamic parameters such as the NWSS that differ from those of a patient-specific viscosity model.
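
    A common measured-value-based non-Newtonian choice for blood in such CFD studies is the Carreau model. The sketch below uses widely quoted literature parameters for blood, which are not the volunteer-derived values of this study.

        import numpy as np

        def carreau_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345,
                              lam=3.313, n=0.3568):
            """Carreau viscosity model often used for blood (literature
            parameters); shear_rate in 1/s, viscosity in Pa*s."""
            return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate)**2) ** ((n - 1) / 2)

        for gd in (1.0, 10.0, 100.0, 1000.0):
            print(f"{gd:7.1f} 1/s -> {carreau_viscosity(gd)*1e3:6.2f} mPa*s")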

  17. Detection and quantification of flow consistency in business process models.

    PubMed

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and, second, to show how such features can be quantified into computational metrics applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Finally, two further automatic evaluations describing the performance and the computational capabilities of our metrics are reported.
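
    One plausible quantification of flow-direction consistency (an assumption of this sketch, not necessarily one of the paper's three metrics) is the share of edges whose direction lies within a tolerance of the dominant layout direction:

        import numpy as np

        def flow_consistency(edges, tol_deg=45.0):
            """edges: list of ((x1, y1), (x2, y2)) pairs from the model layout."""
            vecs = np.array([(x2 - x1, y2 - y1)
                             for (x1, y1), (x2, y2) in edges], float)
            angles = np.arctan2(vecs[:, 1], vecs[:, 0])
            # Dominant direction = circular mean of edge angles.
            dominant = np.arctan2(np.sin(angles).mean(), np.cos(angles).mean())
            dev = np.abs(np.angle(np.exp(1j * (angles - dominant))))
            return float(np.mean(np.degrees(dev) <= tol_deg))

        print(flow_consistency([((0, 0), (2, 0)), ((2, 0), (4, 0)), ((4, 0), (4, -2))]))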

  18. Patient-specific finite element modeling of bones.

    PubMed

    Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A

    2013-04-01

    Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly, and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state-of-the-art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and judgment is passed on the feasibility of applying patient-specific finite element modeling as a part of clinical orthopedic routine. It is concluded that further development in certain aspects of patient-specific finite element modeling is needed before finite element modeling can be used as a routine clinical tool.
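
    Material assignment in patient-specific bone models typically maps CT Hounsfield units to density and then to Young's modulus through a power law. The coefficients in this sketch are illustrative literature-style values and must be calibrated per scanner and anatomical site.

        import numpy as np

        def hu_to_modulus(hu, a=0.0, b=0.0008, c=6850.0, d=1.49):
            """HU -> apparent density [g/cm^3] -> Young's modulus [MPa].

            a, b : linear CT density calibration (scanner-specific, illustrative)
            c, d : power-law coefficients (site-specific, illustrative)"""
            rho = a + b * np.asarray(hu, float)        # linear CT calibration
            return c * np.clip(rho, 1e-6, None) ** d   # power-law modulus relation

        print(hu_to_modulus([200, 800, 1400]))          # trabecular-to-cortical range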

  19. Creating, generating and comparing random network models with NetworkRandomizer.

    PubMed

    Tosadori, Gabriele; Bestvina, Ivan; Spoto, Fausto; Laudanna, Carlo; Scardoni, Giovanni

    2016-01-01

    Biological networks are becoming a fundamental tool for the investigation of high-throughput data in several fields of biology and biotechnology. With the increasing amount of information, network-based models are gaining more and more interest, and new techniques are required to mine the information and to validate the results. To fill the validation gap we present an app for the Cytoscape platform that creates randomised networks and randomises existing, real networks. Since there is a lack of tools for performing such operations, our app aims at enabling researchers to exploit different, well-known random network models that can be used as benchmarks for validating real, biological datasets. We also propose a novel methodology for creating random weighted networks from real, quantitative data, i.e. the multiplication algorithm. Finally, the app provides a statistical tool that compares real versus randomly computed attributes in order to validate the numerical findings. In summary, our app aims at creating a standardised methodology for the validation of results within the Cytoscape platform.
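
    The benchmark-style validation the app automates can be sketched with networkx: compare a real network's attribute against its distribution over size-matched random graphs. The Erdos-Renyi model and z-score comparison here are one possible choice, not the app's only option.

        import networkx as nx
        import numpy as np

        real = nx.karate_club_graph()                  # stand-in for a real network
        n, m = real.number_of_nodes(), real.number_of_edges()

        real_clust = nx.average_clustering(real)
        random_clust = [nx.average_clustering(nx.gnm_random_graph(n, m, seed=s))
                        for s in range(100)]           # size-matched random networks

        z = (real_clust - np.mean(random_clust)) / np.std(random_clust)
        print(f"clustering: real={real_clust:.3f}, z-score vs random={z:.1f}")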

  20. Low-profile heliostat design for solar central receiver systems

    NASA Technical Reports Server (NTRS)

    Fourakis, E.; Severson, A. M.

    1977-01-01

    Heliostat designs intended to reduce costs and the effect of adverse wind loads on the devices were developed. Included was the low-profile heliostat, consisting of a stiff frame with sectional focusing reflectors coupled together to turn as a unit. The entire frame is arranged to turn angularly about a center point. The ability of the heliostat to rotate about both the vertical and horizontal axes permits a central computer control system to continuously aim the sun's reflection onto a selected target. An engineering model of the basic device was built and is being tested. Control and mirror parameters, such as surface roughness and the need for fine aiming, are being studied. The fabrication of prototypes is in progress. The model was also designed to test mirror focusing techniques, heliostat geometry, mechanical functioning, and tracking control. It can be easily relocated to test mirror imaging on a tower from various directions. In addition to steering and aiming studies, the tests include the effects of temperature changes, wind gusting, and weathering. The results of economic studies on this heliostat are also presented.
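
    The aiming law behind such computer control is simple reflection geometry: the mirror normal must bisect the unit vectors from the heliostat toward the sun and toward the target. A minimal sketch with illustrative geometry, not the report's control algorithm:

        import numpy as np

        def heliostat_normal(sun_dir, target_dir):
            """Unit mirror normal that reflects the sun vector onto the target:
            the normal bisects the (unit) sun and target directions."""
            s = np.asarray(sun_dir, float)
            t = np.asarray(target_dir, float)
            s /= np.linalg.norm(s)
            t /= np.linalg.norm(t)
            n = s + t
            return n / np.linalg.norm(n)

        # Sun 30 degrees above the southern horizon, receiver tower to the north-east:
        sun = np.array([0.0, -np.cos(np.radians(30)), np.sin(np.radians(30))])
        target = np.array([1.0, 1.0, 0.5])
        print(heliostat_normal(sun, target))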
