NASA Astrophysics Data System (ADS)
Seo, Hyeon; Kim, Donghyeon; Jun, Sung Chan
2016-06-01
Electrical brain stimulation (EBS) is an emerging therapy for the treatment of neurological disorders, and computational modeling studies of EBS have been used to determine the optimal parameters for highly cost-effective electrotherapy. Recent notable growth in computing capability has enabled researchers to consider an anatomically realistic head model that represents the full head and complex geometry of the brain rather than the previous simplified partial-head model (extruded slab) that represents only the precentral gyrus. In this work, subdural cortical stimulation (SuCS) was found to offer a better understanding of the differential activation of cortical neurons in the anatomically realistic full-head model than in the simplified partial-head models. We observed that layer 3 pyramidal neurons had comparable stimulation thresholds in both head models, while layer 5 pyramidal neurons showed a notable discrepancy between the models; in particular, layer 5 pyramidal neurons demonstrated asymmetry in the thresholds and action potential initiation sites in the anatomically realistic full-head model. Overall, the anatomically realistic full-head model may offer a better understanding of layer 5 pyramidal neuronal responses. Accordingly, the effects of using the realistic full-head model in SuCS are compelling in computational modeling studies, even though this modeling requires substantially more effort.
Is realistic neuronal modeling realistic?
Almog, Mara
2016-01-01
Scientific models are abstractions that aim to explain natural phenomena. A successful model shows how a complex phenomenon arises from relatively simple principles while preserving major physical or biological rules and predicting novel experiments. A model should not be a facsimile of reality; it is an aid for understanding it. Contrary to this basic premise, the 21st century has brought a surge in computational efforts to model biological processes in great detail. Here we discuss the oxymoronic notion of realistic modeling of single neurons. This rapidly advancing field is driven by the discovery that some neurons don't merely sum their inputs and fire if the sum exceeds some threshold. Thus researchers have asked what the computational abilities of single neurons are and have attempted to answer this question using realistic models. We briefly review the state of the art of compartmental modeling, highlighting recent progress and intrinsic flaws. We then attempt to address two fundamental questions. Practically, can we realistically model single neurons? Philosophically, should we realistically model single neurons? We use layer 5 neocortical pyramidal neurons as a test case to examine these issues. We subject three publicly available models of layer 5 pyramidal neurons to three simple computational challenges. Based on their performance and a partial survey of published models, we conclude that current compartmental models are ad hoc, unrealistic models that function poorly once they are stretched beyond the specific problems for which they were designed. We then attempt to plot possible paths for generating realistic single-neuron models. PMID:27535372
Fast multigrid-based computation of the induced electric field for transcranial magnetic stimulation
NASA Astrophysics Data System (ADS)
Laakso, Ilkka; Hirata, Akimasa
2012-12-01
In transcranial magnetic stimulation (TMS), the distribution of the induced electric field, and hence the affected brain areas, depends on the position of the stimulation coil and the individual geometry of the head and brain. The distribution of the induced electric field in realistic anatomies can be modelled using computational methods. However, existing computational methods for accurately determining the induced electric field in realistic anatomical models have suffered from long computation times, typically in the range of tens of minutes or longer. This paper presents a matrix-free implementation of the finite-element method with a geometric multigrid method that can potentially reduce the computation time to several seconds or less, even on an ordinary computer. The performance of the method is studied by computing the induced electric field in two anatomically realistic models. An idealized two-loop coil is used as the stimulating coil. Multiple computational grid resolutions, ranging from 2 to 0.25 mm, are used. The results show that, for macroscopic modelling of the electric field in an anatomically realistic model, computational grid resolutions of 1 mm or 2 mm appear to provide good numerical accuracy compared with higher resolutions. The multigrid iteration typically converges in fewer than ten iterations, independent of the grid resolution. Even without parallelization, each iteration takes about 1.0 s or 0.1 s for the 1 and 2 mm resolutions, respectively. This suggests that calculating the electric field with sufficient accuracy in real time is feasible.
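The resolution-independent iteration counts reported above are the hallmark of geometric multigrid. As a hedged illustration of the idea only (a 1-D Poisson toy problem, not the paper's matrix-free FEM solver), a recursive V-cycle can be sketched as:

```python
# Geometric multigrid V-cycle for -u'' = f on [0, 1] with zero Dirichlet
# boundaries. Illustrative toy: shows why the iteration count stays flat
# as the grid is refined, which is the property exploited above.

def smooth(u, f, h, sweeps=3):
    # Gauss-Seidel relaxation on the standard 3-point stencil.
    for _ in range(sweeps):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])

def residual(u, f, h):
    r = [0.0] * len(u)
    for i in range(1, len(u) - 1):
        r[i] = f[i] - (2 * u[i] - u[i - 1] - u[i + 1]) / (h * h)
    return r

def restrict(r):
    # Full-weighting restriction onto the coarse grid (half the points).
    n = (len(r) - 1) // 2 + 1
    rc = [0.0] * n
    for i in range(1, n - 1):
        rc[i] = 0.25 * r[2 * i - 1] + 0.5 * r[2 * i] + 0.25 * r[2 * i + 1]
    return rc

def prolong(ec, n):
    # Linear interpolation of the coarse-grid correction to the fine grid.
    e = [0.0] * n
    for i in range(len(ec)):
        e[2 * i] = ec[i]
    for i in range(1, n - 1, 2):
        e[i] = 0.5 * (e[i - 1] + e[i + 1])
    return e

def v_cycle(u, f, h):
    if len(u) <= 3:
        smooth(u, f, h, sweeps=50)  # coarsest grid: solve (nearly) exactly
        return
    smooth(u, f, h)                            # pre-smoothing
    rc = restrict(residual(u, f, h))           # coarse-grid residual
    ec = [0.0] * len(rc)
    v_cycle(ec, rc, 2 * h)                     # recurse on the error equation
    e = prolong(ec, len(u))
    for i in range(len(u)):
        u[i] += e[i]                           # apply correction
    smooth(u, f, h)                            # post-smoothing
```

Each V-cycle cuts the algebraic error by a grid-independent factor, so a handful of cycles suffices at any resolution.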
A fast analytical undulator model for realistic high-energy FEL simulations
NASA Astrophysics Data System (ADS)
Tatchyn, R.; Cremer, T.
1997-02-01
A number of leading FEL simulation codes used for modeling gain in the ultralong undulators required for SASE saturation in the <100 Å range employ simplified analytical models both for field and error representations. Although it is recognized that both the practical and theoretical validity of such codes could be enhanced by incorporating realistic undulator field calculations, the computational cost of doing this can be prohibitive, especially for point-to-point integration of the equations of motion through each undulator period. In this paper we describe a simple analytical model suitable for modeling realistic permanent magnet (PM), hybrid/PM, and non-PM undulator structures, and discuss selected techniques for minimizing computation time.
Computational Difficulties in the Identification and Optimization of Control Systems.
1980-01-01
As more realistic models for resource management are developed, the need grows for efficient computational techniques for parameter identification and optimization (optimal control) in "state" models. This research was supported in part by the National Science Foundation under grant NSF-MCS 79-05774.
Modelling DC responses of 3D complex fracture networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beskardes, Gungor Didem; Weiss, Chester Joseph
2018-03-01
Here, the determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretizations of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm produces realistic-looking objects without any noteworthy increase in the computational cost.
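The Phong model the method integrates is a standard shading formula. A minimal per-point sketch (illustrative only, with made-up coefficient values; the hologram-synthesis step itself is omitted) is:

```python
# Standard Phong illumination for one scene point and one white light:
# I = ka + kd * max(0, N.L) + ks * max(0, R.V)^shininess
# All vectors point away from the surface point. Coefficients are
# hypothetical material parameters, not taken from the paper.
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _normalize(v):
    n = math.sqrt(_dot(v, v))
    return tuple(x / n for x in v)

def phong_intensity(normal, to_light, to_viewer,
                    ka=0.1, kd=0.6, ks=0.3, shininess=16):
    """Scalar reflected intensity at one point."""
    n = _normalize(normal)
    l = _normalize(to_light)
    v = _normalize(to_viewer)
    diff = max(0.0, _dot(n, l))                      # Lambertian term
    # Reflection of the light direction about the surface normal.
    r = tuple(2 * diff * nc - lc for nc, lc in zip(n, l))
    spec = max(0.0, _dot(r, v)) ** shininess if diff > 0 else 0.0
    return ka + kd * diff + ks * spec
```

Evaluating this per point cloud sample before hologram synthesis is what lets surface properties show up as reflections and shading in the reconstruction.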
Quantitative, steady-state properties of Catania's computational model of the operant reserve.
Berg, John P; McDowell, J J
2011-05-01
Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.
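For reference, the benchmarks implied by "classic and modern matching theory" are, assuming the standard formulations (Herrnstein's strict matching and Baum's generalized matching; here the \(B_i\) are response rates, the \(r_i\) reinforcement rates, \(a\) sensitivity, and \(b\) bias):

```latex
% Strict matching (Herrnstein):
\frac{B_1}{B_1 + B_2} = \frac{r_1}{r_1 + r_2}
% Generalized matching (Baum):
\log \frac{B_1}{B_2} = a \, \log \frac{r_1}{r_2} + \log b
```

Departures from matching would then show up as fitted values of \(a\) and \(b\) far from 1, or as systematic residuals around the generalized-matching line.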
Junwei Ma; Han Yuan; Sunderam, Sridhar; Besio, Walter; Lei Ding
2017-07-01
Neural activity inside the human brain generates electrical signals that can be detected on the scalp. Electroencephalography (EEG) is one of the most widely used techniques helping physicians and researchers diagnose and understand various brain diseases. By their nature, EEG signals have very high temporal resolution but poor spatial resolution. To achieve higher spatial resolution, a novel tri-polar concentric ring electrode (TCRE) has been developed to directly measure the surface Laplacian (SL). The objective of the present study is to accurately calculate the SL for the TCRE based on a realistic-geometry head model. A locally dense mesh was proposed to represent the head surface, with the locally dense regions matched to the small structural components of the TCRE; elsewhere, a coarser mesh was used to reduce the computational load. We conducted computer simulations to evaluate the performance of the proposed mesh and assessed possible numerical errors against a low-density model. Finally, with the achieved accuracy, we present, for the first time, the computed forward lead field of the SL for the TCRE in a realistic-geometry head model, and demonstrate that it has better spatial resolution than the SL computed from classic EEG recordings.
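For context, the surface Laplacian that the TCRE estimates is the second spatial derivative of the scalp potential \(\Phi\) over the scalp surface; for a locally flat patch with tangential coordinates \(x\) and \(y\) (sign conventions vary across the literature):

```latex
\mathrm{SL}(\Phi) \;=\; \nabla_s^2 \Phi
  \;=\; \frac{\partial^2 \Phi}{\partial x^2}
      + \frac{\partial^2 \Phi}{\partial y^2}
```

Because this is a spatial high-pass operation, it sharpens localized sources, which is why computing it accurately on a realistic (curved) scalp geometry matters.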
A rapid algorithm for realistic human reaching and its use in a virtual reality system
NASA Technical Reports Server (NTRS)
Aldridge, Ann; Pandya, Abhilash; Goldsby, Michael; Maida, James
1994-01-01
The Graphics Analysis Facility (GRAF) at JSC has developed a rapid algorithm for computing realistic human reaching. The algorithm was applied to GRAF's anthropometrically correct human model and used in a 3D computer graphics system and a virtual reality system. The nature of the algorithm and its uses are discussed.
Electrical Wave Propagation in a Minimally Realistic Fiber Architecture Model of the Left Ventricle
NASA Astrophysics Data System (ADS)
Song, Xianfeng; Setayeshgar, Sima
2006-03-01
Experimental results indicate a nested, layered geometry for the fiber surfaces of the left ventricle, where fiber directions are approximately aligned in each surface and gradually rotate through the thickness of the ventricle. Numerical and analytical results have highlighted the importance of this rotating anisotropy and its possible destabilizing role on the dynamics of scroll waves in excitable media with application to the heart. Based on the work of Peskin [1] and Peskin and McQueen [2], we present a minimally realistic model of the left ventricle that adequately captures the geometry and anisotropic properties of the heart as a conducting medium while being easily parallelizable, and computationally more tractable than fully realistic anatomical models. Complementary to fully realistic and anatomically-based computational approaches, studies using such a minimal model with the addition of successively realistic features, such as excitation-contraction coupling, should provide unique insight into the basic mechanisms of formation and obliteration of electrical wave instabilities. We describe our construction, implementation and validation of this model. [1] C. S. Peskin, Communications on Pure and Applied Mathematics 42, 79 (1989). [2] C. S. Peskin and D. M. McQueen, in Case Studies in Mathematical Modeling: Ecology, Physiology, and Cell Biology (1996), p. 309.
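The role of rotating anisotropy can be made concrete with the standard monodomain reaction-diffusion formulation (a common simplification in the field, not necessarily the authors' exact equations):

```latex
\frac{\partial V}{\partial t}
  = \nabla \cdot \left( D \, \nabla V \right) + f(V),
\qquad
D = D_t I + (D_l - D_t)\, \hat{f}\hat{f}^{\,T}
```

Here \(V\) is the transmembrane potential, \(f(V)\) the excitable-membrane kinetics, \(D_l\) and \(D_t\) the conductivities along and across fibers, and \(\hat{f}\) the local fiber direction, which rotates with depth through the ventricular wall; it is this depth-dependence of \(D\) that can destabilize scroll waves.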
Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi
2011-11-01
Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1100 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.
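Such networks map well onto GPUs because each neuron's state update is independent given its synaptic input. A toy leaky integrate-and-fire update (a stand-in for illustration, not the paper's conductance-based basal ganglia models) makes the data-parallel structure explicit:

```python
# Each loop iteration depends only on that neuron's own state plus its
# precomputed synaptic input, so the body could run as one GPU thread per
# neuron. Parameters are generic textbook LIF values, not from the paper.
def lif_step(v, i_syn, dt=0.1, tau=10.0,
             v_rest=-65.0, v_th=-50.0, v_reset=-70.0):
    """Advance all neurons one time step; return new voltages and spike flags."""
    spikes = [False] * len(v)
    v_new = [0.0] * len(v)
    for n in range(len(v)):          # embarrassingly parallel across n
        dv = (-(v[n] - v_rest) + i_syn[n]) * (dt / tau)
        v_new[n] = v[n] + dv
        if v_new[n] >= v_th:         # threshold crossing: emit spike, reset
            v_new[n] = v_reset
            spikes[n] = True
    return v_new, spikes
```

The serial part of a real simulation is the synaptic gather between steps, which is why the paper parallelizes in both data-parallel and task-parallel fashion.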
Training in Methods in Computational Neuroscience
1992-08-29
...in Tritonia. Roger Traub: models with realistic neurons, with an emphasis on large-scale modeling of epileptic phenomena in hippocampus. Rodolpho... Cell Model Plan: 1) Convert some of my simulations from NEURON to GENESIS (and thus learn GENESIS). 2) Develop a realistic inhibitory model. 3) Further... General Hospital, MA. Course Project: Membrane Properties of a Neostriatal Neuron and Dopamine Modulation. The purpose of my project was to model the...
Babiloni, F; Babiloni, C; Carducci, F; Fattorini, L; Onorati, P; Urbano, A
1996-04-01
This paper presents a realistic Laplacian (RL) estimator based on a tensorial formulation of the surface Laplacian (SL) that uses the 2-D thin plate spline function to obtain a mathematical description of a realistic scalp surface. Because of this tensorial formulation, the RL does not need an orthogonal reference frame placed on the realistic scalp surface. In simulation experiments the RL was estimated with an increasing number of "electrodes" (up to 256) on a mathematical scalp model, the analytic Laplacian being used as a reference. Second and third order spherical spline Laplacian estimates were examined for comparison. Noise of increasing magnitude and spatial frequency was added to the simulated potential distributions. Movement-related potentials and somatosensory evoked potentials sampled with 128 electrodes were used to estimate the RL on a realistically shaped, MR-constructed model of the subject's scalp surface. The RL was also estimated on a mathematical spherical scalp model computed from the real scalp surface. Simulation experiments showed that the performance of the RL estimator was similar to that of the second and third order spherical spline Laplacians. Furthermore, the information content of scalp-recorded potentials was clearly better when the RL estimator computed the SL of the potential on an MR-constructed scalp surface model.
Simulation of radiofrequency ablation in real human anatomy.
Zorbas, George; Samaras, Theodoros
2014-12-01
The objective of the current work was to simulate radiofrequency ablation treatment in computational models with realistic human anatomy, in order to investigate the effect of realistic geometry on the treatment outcome. The body sites considered in the study were the liver, lung and kidney. One numerical model for each body site was obtained from Duke, a member of the IT'IS Virtual Family. A spherical tumour was embedded in each model and a single electrode was inserted into the tumour. The same excitation voltage was used in all cases to underline the differences in the resulting temperature rise due to the different anatomy at each body site investigated. The same numerical calculations were performed for a two-compartment model of the tissue geometry, as well as with the use of an analytical approximation for a single tissue compartment. Radiofrequency ablation (RFA) therapy appears efficient for tumours in the liver and lung, but less efficient in the kidney. Moreover, the time evolution of temperature for a realistic geometry differs from that for a two-compartment model, and even more from that for an infinite homogeneous tissue model. However, it appears that the most critical parameters of computational models for RFA treatment planning are tissue properties rather than tissue geometry. Computational simulations of realistic anatomy models show that the conventional technique of a single electrode inside the tumour volume requires a careful choice of both the excitation voltage and treatment time in order to achieve effective treatment, since the ablation zone differs considerably for various body sites.
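Such RFA simulations conventionally couple resistive heating from a quasi-static electric field to the Pennes bioheat equation (the standard formulation in this area; the study's exact model may differ):

```latex
\rho c \,\frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right)
  + \sigma \,\lVert \mathbf{E} \rVert^{2}
  - \rho_b c_b \,\omega_b \,(T - T_b)
```

The \(\sigma \lVert \mathbf{E} \rVert^2\) term is the Joule heating deposited by the electrode, and the last term is cooling by blood perfusion; the tissue properties \(\sigma\), \(k\) and \(\omega_b\) are exactly the parameters the study identifies as most critical.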
Computational 3-D Model of the Human Respiratory System
We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...
Schuch, Klaus; Logothetis, Nikos K.; Maass, Wolfgang
2011-01-01
A major goal of computational neuroscience is the creation of computer models for cortical areas whose response to sensory stimuli resembles that of cortical areas in vivo in important aspects. It is seldom considered whether the simulated spiking activity is realistic (in a statistical sense) in response to natural stimuli. Because certain statistical properties of spike responses were suggested to facilitate computations in the cortex, acquiring a realistic firing regimen in cortical network models might be a prerequisite for analyzing their computational functions. We present a characterization and comparison of the statistical response properties of the primary visual cortex (V1) in vivo and in silico in response to natural stimuli. We recorded from multiple electrodes in area V1 of 4 macaque monkeys and developed a large state-of-the-art network model for a 5 × 5-mm patch of V1 composed of 35,000 neurons and 3.9 million synapses that integrates previously published anatomical and physiological details. By quantitative comparison of the model response to the “statistical fingerprint” of responses in vivo, we find that our model for a patch of V1 responds to the same movie in a way which matches the statistical structure of the recorded data surprisingly well. The deviations between the firing regimen of the model and the in vivo data are on the same level as the deviations among monkeys and sessions. This suggests that, despite strong simplifications and abstractions of cortical network models, they are nevertheless capable of generating realistic spiking activity.
To reach a realistic firing state, it was not only necessary to include both N-methyl-d-aspartate and GABAB synaptic conductances in our model, but also to markedly increase the strength of excitatory synapses onto inhibitory neurons (>2-fold) in comparison to literature values, hinting at the importance to carefully adjust the effect of inhibition for achieving realistic dynamics in current network models. PMID:21106898
A 4DCT imaging-based breathing lung model with relative hysteresis
Miyawaki, Shinjiro; Choi, Sanghun; Hoffman, Eric A.; Lin, Ching-Long
2016-01-01
To reproduce realistic airway motion and airflow, the authors developed a deforming lung computational fluid dynamics (CFD) model based on four-dimensional (4D, space and time) dynamic computed tomography (CT) images. A total of 13 time points within controlled tidal volume respiration were used to account for realistic and irregular lung motion in human volunteers. Because of the irregular motion of 4DCT-based airways, we identified an optimal interpolation method for airway surface deformation during respiration, and implemented a computational solid mechanics-based moving mesh algorithm to produce smooth deforming airway mesh. In addition, we developed physiologically realistic airflow boundary conditions for both models based on multiple images and a single image. Furthermore, we examined simplified models based on one or two dynamic or static images. By comparing these simplified models with the model based on 13 dynamic images, we investigated the effects of relative hysteresis of lung structure with respect to lung volume, lung deformation, and imaging methods, i.e., dynamic vs. static scans, on CFD-predicted pressure drop. The effect of imaging method on pressure drop was 24 percentage points due to the differences in airflow distribution and airway geometry. PMID:28260811
Kovačič, Aljaž; Borovinšek, Matej; Vesenjak, Matej; Ren, Zoran
2018-01-26
This paper addresses the problem of reconstructing realistic, irregular pore geometries of lotus-type porous iron for computer models that allow for simple porosity and pore size variation in computational characterization of their mechanical properties. The presented methodology uses image-recognition algorithms for the statistical analysis of pore morphology in real material specimens, from which a unique fingerprint of pore morphology at a certain porosity level is derived. The representative morphology parameter is introduced and used for the indirect reconstruction of realistic and statistically representative pore morphologies, which can be used for the generation of computational models with an arbitrary porosity. Such models were subjected to parametric computer simulations to characterize the dependence of engineering elastic modulus on the porosity of lotus-type porous iron. The computational results are in excellent agreement with experimental observations, which confirms the suitability of the presented methodology of indirect pore geometry reconstruction for computational simulations of similar porous materials.
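The statistical analysis such a pipeline starts from can be sketched in miniature: a connected-component pass over a binary micrograph (hypothetical input, not the authors' image-recognition code) yields the porosity and a per-pore size distribution from which morphology statistics are derived.

```python
# Toy pore-morphology statistics from a 2-D binary image
# (1 = pore, 0 = iron), using 4-connected flood fill.
def pore_statistics(img):
    """Return (porosity, sorted pore areas) for a 2-D binary grid."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if img[r][c] == 1 and not seen[r][c]:
                stack, area = [(r, c)], 0        # flood-fill one pore
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and img[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    porosity = sum(map(sum, img)) / (rows * cols)
    return porosity, sorted(areas)
```

In the paper these per-porosity statistics are condensed into a representative morphology parameter and used to reconstruct statistically equivalent pore geometries at arbitrary porosity.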
Three-Dimensional Computer-Assisted Two-Layer Elastic Models of the Face.
Ueda, Koichi; Shigemura, Yuka; Otsuki, Yuki; Fuse, Asuka; Mitsuno, Daisuke
2017-11-01
To make three-dimensional computer-assisted elastic models for the face, we decided on five requirements: (1) an elastic texture like skin and subcutaneous tissue; (2) the ability to take pen marking for incisions; (3) the ability to be cut with a surgical knife; (4) the ability to keep stitches in place for a long time; and (5) a layered structure. After testing many elastic solvents, we have made realistic three-dimensional computer-assisted two-layer elastic models of the face and cleft lip from the computed tomographic and magnetic resonance imaging stereolithographic data. The surface layer is made of polyurethane and the inner layer is silicone. Using this elastic model, we taught residents and young doctors how to make several typical local flaps and to perform cheiloplasty. They could experience realistic simulated surgery and understand three-dimensional movement of the flaps.
Computer model for harmonic ultrasound imaging.
Li, Y; Zagzebski, J A
2000-01-01
Harmonic ultrasound imaging has received great attention from ultrasound scanner manufacturers and researchers. In this paper, we present a computer model that can generate realistic harmonic images. In this model, the incident ultrasound is modeled using the "KZK" equation, and the echo signal is modeled using linear propagation theory because the echo signal is much weaker than the incident pulse. Both time domain and frequency domain numerical solutions to the "KZK" equation were studied. Realistic harmonic images of spherical lesion phantoms were generated for scans by a circular transducer. This model can be a very useful tool for studying the harmonic buildup and dissipation processes in a nonlinear medium, and it can be used to investigate a wide variety of topics related to B-mode harmonic imaging.
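For reference, the KZK (Khokhlov-Zabolotskaya-Kuznetsov) parabolic wave equation for the acoustic pressure \(p\) along beam axis \(z\), in retarded time \(\tau\), is commonly written as:

```latex
\frac{\partial^2 p}{\partial z \, \partial \tau}
  = \frac{c_0}{2}\, \nabla_{\!\perp}^{2} p
  + \frac{\delta}{2 c_0^{3}}\, \frac{\partial^3 p}{\partial \tau^3}
  + \frac{\beta}{2 \rho_0 c_0^{3}}\, \frac{\partial^2 p^2}{\partial \tau^2}
```

The three right-hand terms account for diffraction, thermoviscous absorption, and quadratic nonlinearity respectively; it is the nonlinear \(p^2\) term that transfers energy into the harmonics imaged in B-mode harmonic imaging.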
Deana D. Pennington
2007-01-01
Exploratory modeling is an approach used when process and/or parameter uncertainties are such that modeling attempts at realistic prediction are not appropriate. Exploratory modeling makes use of computational experimentation to test how varying model scenarios drive model outcome. The goal of exploratory modeling is to better understand the system of interest through...
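The exploratory-modeling workflow described above can be sketched minimally: sweep uncertain parameters across scenarios and examine how outcomes vary, rather than committing to a single "realistic" prediction. The logistic-growth model and parameter ranges below are hypothetical stand-ins.

```python
# Computational experimentation over a cross product of parameter scenarios.
import itertools

def logistic_final(r, k, n0=10.0, steps=100):
    """Discrete logistic growth; population after `steps` steps."""
    n = n0
    for _ in range(steps):
        n += r * n * (1.0 - n / k)
    return n

def explore(rs, ks):
    """Run every scenario in the cross product of parameter values."""
    return {(r, k): logistic_final(r, k)
            for r, k in itertools.product(rs, ks)}

# Four scenarios: two growth rates crossed with two carrying capacities.
outcomes = explore(rs=[0.1, 0.5], ks=[100.0, 500.0])
```

Inspecting how `outcomes` varies across the grid is the point of the exercise: the analyst learns which uncertainties actually drive the outcome before investing in a predictive model.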
ARACHNE: A neural-neuroglial network builder with remotely controlled parallel computing
Rusakov, Dmitri A.; Savtchenko, Leonid P.
2017-01-01
Creating and running realistic models of neural networks has hitherto been a task for computing professionals rather than experimental neuroscientists. This is mainly because such networks usually engage substantial computational resources, the handling of which requires specific programming skills. Here we put forward a newly developed simulation environment, ARACHNE: it enables an investigator to build and explore cellular networks of arbitrary biophysical and architectural complexity using the logic of NEURON and a simple interface on a local computer or a mobile device. The interface can control, through the internet, an optimized computational kernel installed on a remote computer cluster. ARACHNE can combine neuronal (wired) and astroglial (extracellular volume-transmission driven) network types and adopt realistic cell models from the NEURON library. The program and documentation (current version) are available at GitHub repository https://github.com/LeonidSavtchenko/Arachne under the MIT License (MIT). PMID:28362877
Kahnert, Michael; Nousiainen, Timo; Lindqvist, Hannakaisa
2013-04-08
Optical properties of light absorbing carbon (LAC) aggregates encapsulated in a shell of sulfate are computed for realistic model geometries based on field measurements. Computations are performed for wavelengths from the UV-C to the mid-IR. Both climate- and remote sensing-relevant optical properties are considered. The results are compared to commonly used simplified model geometries, none of which gives a realistic representation of the distribution of the LAC mass within the host material and, as a consequence, fail to predict the optical properties accurately. A new core-gray shell model is introduced, which accurately reproduces the size- and wavelength dependence of the integrated and differential optical properties.
Comprehensive review: Computational modelling of schizophrenia.
Valton, Vincent; Romaniuk, Liana; Douglas Steele, J; Lawrie, Stephen; Seriès, Peggy
2017-12-01
Computational modelling has been used to address: (1) the variety of symptoms observed in schizophrenia using abstract models of behavior (e.g. Bayesian models - top-down descriptive models of psychopathology); (2) the causes of these symptoms using biologically realistic models involving abnormal neuromodulation and/or receptor imbalance (e.g. connectionist and neural networks - bottom-up realistic models of neural processes). These different levels of analysis have been used to answer different questions (i.e. understanding behavioral vs. neurobiological anomalies) about the nature of the disorder. As such, these computational studies have mostly supported diverging hypotheses of schizophrenia's pathophysiology, resulting in a literature that is not always expanding coherently. Some of these hypotheses are however ripe for revision using novel empirical evidence. Here we present a review that first synthesizes the literature of computational modelling for schizophrenia and psychotic symptoms into categories supporting the dopamine, glutamate, GABA, dysconnection and Bayesian inference hypotheses respectively. Secondly, we compare model predictions against the accumulated empirical evidence and finally we identify specific hypotheses that have been left relatively under-investigated. Copyright © 2017. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Heister, Timo; Dannberg, Juliane; Gassmöller, Rene; Bangerth, Wolfgang
2017-08-01
Computations have helped elucidate the dynamics of Earth's mantle for several decades already. The numerical methods that underlie these simulations have greatly evolved within this time span, and today include dynamically changing and adaptively refined meshes, sophisticated and efficient solvers, and parallelization to large clusters of computers. At the same time, many of the methods - discussed in detail in a previous paper in this series - were developed and tested primarily using model problems that lack many of the complexities that are common to the realistic models our community wants to solve today. With several years of experience solving complex and realistic models, we here revisit some of the algorithm designs of the earlier paper and discuss the incorporation of more complex physics. In particular, we re-consider time stepping and mesh refinement algorithms, evaluate approaches to incorporate compressibility, and discuss dealing with strongly varying material coefficients, latent heat, and how to track chemical compositions and heterogeneities. Taken together and implemented in a high-performance, massively parallel code, the techniques discussed in this paper then allow for high resolution, 3-D, compressible, global mantle convection simulations with phase transitions, strongly temperature dependent viscosity and realistic material properties based on mineral physics data.
Arenas, Miguel
2015-04-01
NGS technologies enable fast and inexpensive generation of genomic data. Nevertheless, ancestral genome inference is not so straightforward, owing to complex evolutionary processes acting on this material such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Recently, models of genome evolution that accommodate such complex genomic events have been emerging. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulations, such as approximate Bayesian computation, which may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls in using these analytical methods are discussed. Potential applications of these ancestral genomic inferences are also pointed out.
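The letter points to approximate Bayesian computation (ABC) as a simulation-based route to such inferences. As a minimal illustration of the rejection form of ABC, the sketch below uses a toy "rearrangement count" simulator; the simulator, prior, and tolerance are invented for this fragment, not taken from the letter.

```python
import random

def simulate(rate, n=50, seed=None):
    # Toy simulator: count "rearrangement events" across n lineages,
    # each occurring independently with probability `rate`.
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.random() < rate)

def abc_rejection(observed, prior_draw, tol, n_draws=10000, seed=0):
    # Rejection ABC: draw a parameter from the prior, simulate data,
    # and keep the draw if the simulated summary statistic falls
    # within `tol` of the observed one.
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_draw(rng)
        if abs(simulate(theta, seed=rng.randrange(2**30)) - observed) <= tol:
            accepted.append(theta)
    return accepted

# Observed: 15 events in 50 lineages; uniform prior on the event rate.
posterior = abc_rejection(observed=15,
                          prior_draw=lambda rng: rng.random(),
                          tol=2)
# The accepted draws approximate the posterior over the event rate;
# their mean lands near the empirical rate 15/50 = 0.3.
```

Real applications replace the toy simulator with a genome-evolution model and the scalar summary with multiple summary statistics, but the accept/reject core is the same.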
Effect of conductor geometry on source localization: Implications for epilepsy studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlitt, H.; Heller, L.; Best, E.
1994-07-01
We shall discuss the effects of conductor geometry on source localization for applications in epilepsy studies. The most popular conductor model for clinical MEG studies is a homogeneous sphere. However, several studies have indicated that a sphere is a poor model for the head when the sources are deep, as is the case for epileptic foci in the mesial temporal lobe. We believe that replacing the spherical model with a more realistic one in the inverse fitting procedure will improve the accuracy of localizing epileptic sources. In order to include a realistic head model in the inverse problem, we must first solve the forward problem for the realistic conductor geometry. We create a conductor geometry model from MR images, and then solve the forward problem via a boundary integral equation for the electric potential due to a specified primary source. Once the electric potential is known, the magnetic field can be calculated directly. The most time-intensive part of the problem is generating the conductor model; fortunately, this needs to be done only once for each patient. It takes little time to change the primary current and calculate a new magnetic field for use in the inverse fitting procedure. We present the results of a series of computer simulations in which we investigate the localization accuracy achieved by replacing the spherical model with the realistic head model in the inverse fitting procedure. The data to be fit consist of a computer-generated magnetic field due to a known current dipole in a realistic head model, with added noise. We compare the localization errors when this field is fit using a spherical model to those obtained using a realistic head model. Using a spherical model is comparable to what is usually done when localizing epileptic sources in humans, where the conductor model used in the inverse fitting procedure does not correspond to the actual head.
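The inverse fitting procedure described here can be sketched in miniature. The fragment below fits a current dipole's location by a global search over candidate positions, using the free-space dipole field in place of a true head-model forward solution; the sensor grid, candidate grid, and 10 nA·m moment are all hypothetical.

```python
import math

MU0_4PI = 1e-7  # mu_0 / (4*pi)

def dipole_field(q, r0, r):
    # Free-space field of a current dipole q at r0, sampled at r:
    #   B = (mu0/4pi) * q x d / |d|^3,  with d = r - r0.
    # (A simplification: a realistic head model would instead solve a
    # boundary integral over the conductor surface, as in the abstract.)
    d = [r[k] - r0[k] for k in range(3)]
    dist = math.sqrt(sum(x * x for x in d))
    cross = [q[1] * d[2] - q[2] * d[1],
             q[2] * d[0] - q[0] * d[2],
             q[0] * d[1] - q[1] * d[0]]
    s = MU0_4PI / dist ** 3
    return [s * c for c in cross]

def fit_location(sensors, data, q, candidates):
    # Global search over candidate source locations for the smallest
    # residual between measured and predicted fields; a global search
    # matters because the residual surface can have local minima.
    best, best_err = None, float("inf")
    for r0 in candidates:
        err = sum((bi - oi) ** 2
                  for r, obs in zip(sensors, data)
                  for bi, oi in zip(dipole_field(q, r0, r), obs))
        if err < best_err:
            best, best_err = r0, err
    return best

# Hypothetical setup: sensor grid above the source, tangential dipole.
sensors = [(0.02 * i, 0.02 * j, 0.10) for i in range(-3, 4) for j in range(-3, 4)]
q_true = (1e-8, 0.0, 0.0)            # 10 nA*m dipole moment
r_true = (0.0, 0.01, 0.05)
data = [dipole_field(q_true, r_true, r) for r in sensors]
grid = [(0.01 * i, 0.01 * j, 0.05) for i in range(-2, 3) for j in range(-2, 3)]
fit_location(sensors, data, q_true, grid)  # recovers (0.0, 0.01, 0.05)
```

Adding noise to `data` and refining `grid` would reproduce, in toy form, the localization-error experiments the abstract describes.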
A Pulsatile Cardiovascular Computer Model for Teaching Heart-Blood Vessel Interaction.
ERIC Educational Resources Information Center
Campbell, Kenneth; And Others
1982-01-01
Describes a model which gives realistic predictions of pulsatile pressure, flow, and volume events in the cardiovascular system. Includes computer oriented laboratory exercises for veterinary and graduate students; equations of the dynamic and algebraic models; and a flow chart for the cardiovascular teaching program. (JN)
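The abstract does not reproduce the model's equations; as a generic stand-in, a two-element Windkessel conveys the flavor of predicting pulsatile pressure from a lumped heart-vessel interaction. All parameter values below are illustrative, not the authors'.

```python
import math

def windkessel(R=1.0, C=1.5, heart_rate=72, t_end=10.0, dt=1e-3):
    # Two-element Windkessel:  C * dP/dt = Q_in(t) - P/R.
    # Q_in is a half-sine ejection pulse during systole, zero in diastole.
    period = 60.0 / heart_rate
    systole = 0.35 * period
    p, t, pressures = 80.0, 0.0, []
    while t < t_end:
        phase = t % period
        q_in = 400.0 * math.sin(math.pi * phase / systole) if phase < systole else 0.0
        p += dt * (q_in - p / R) / C   # forward-Euler step
        pressures.append(p)
        t += dt
    return pressures

p = windkessel()
# After a few beats the waveform settles into a pulsatile steady state:
# pressure rises during ejection and decays exponentially in diastole.
```

The exponential diastolic decay (time constant RC) and the systolic upstroke are exactly the kind of "realistic pulsatile pressure events" a teaching model of heart-vessel interaction needs to show.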
Xie, Tianwu; Zaidi, Habib
2016-01-01
The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is provided of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.
An empirical generative framework for computational modeling of language acquisition.
Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon
2010-06-01
This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.
Challenges to the development of complex virtual reality surgical simulations.
Seymour, N E; Røtnes, J S
2006-11-01
Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.
The Use of Computer Simulation Techniques in Educational Planning.
ERIC Educational Resources Information Center
Wilson, Charles Z.
Computer simulations provide powerful models for establishing goals, guidelines, and constraints in educational planning. They are dynamic models that allow planners to examine logical descriptions of organizational behavior over time as well as permitting consideration of the large and complex systems required to provide realistic descriptions of…
Computer modeling of human decision making
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
Models of human decision making are reviewed. Models which treat just the cognitive aspects of human behavior are included, as well as models which include motivation. Both models which have associated computer programs and those that do not are considered. Since flow diagrams that assist in constructing computer simulations of such models were not generally available, such diagrams were constructed and are presented. The result provides a rich source of information, which can aid in the construction of more realistic future simulations of human decision making.
CFD simulation of flow through heart: a perspective review.
Khalafvand, S S; Ng, E Y K; Zhong, L
2011-01-01
The heart is an organ that pumps blood around the body by contraction of its muscular wall. The heart involves a coupled system, the motion of the wall and the motion of the blood; both must be computed simultaneously, which makes biological computational fluid dynamics (CFD) difficult. The wall of the heart is not rigid, and hence proper boundary conditions are essential for CFD modelling. Fluid-wall interaction is very important for realistic CFD modelling. Many assumptions made in CFD simulations of the heart keep them far from a real model. Realistic fluid-structure interaction, modelling the structure by the finite element method and the fluid flow by CFD, uses more realistic coupling algorithms. This type of method is very powerful for resolving the complex properties of the cardiac structure and the sensitive interaction of fluid and structure. The final goal of heart modelling is to simulate total heart function by integrating cardiac anatomy, electrical activation, mechanics, metabolism and fluid mechanics together in a single computational framework.
Zhou, Xiangmin; Zhang, Nan; Sha, Desong; Shen, Yunhe; Tamma, Kumar K; Sweet, Robert
2009-01-01
The inability to render realistic soft-tissue behavior in real time has remained a barrier to the face and content validity of many virtual reality surgical training systems. Biophysically based models are suitable not only for training purposes but also for patient-specific clinical applications, physiological modeling and surgical planning. Among existing approaches to modeling soft tissue for virtual reality surgical simulation, the computer graphics-based approach lacks predictive capability; the mass-spring model (MSM) approach lacks biophysically realistic soft-tissue dynamic behavior; and finite element method (FEM) approaches fail to meet the real-time requirement. The present development stems from the first law of thermodynamics: for a space-discrete dynamic system, it directly formulates the space-discrete but time-continuous governing equations with an embedded material constitutive relation, resulting in a discrete mechanics framework that strikes a unique balance between computational effort and physically realistic soft-tissue dynamic behavior. We describe the development of this discrete mechanics framework with attention focused on a virtual laparoscopic nephrectomy application.
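For contrast with the discrete mechanics framework described above (whose formulation is not reproduced here), the mass-spring baseline the authors critique can be sketched in a few lines; all constants are arbitrary.

```python
def step(pos, vel, rest, k=50.0, c=2.0, m=0.1, dt=1e-3):
    # One explicit time step for a 1-D chain of point masses joined by
    # damped linear springs (the basic mass-spring soft-tissue model).
    forces = [0.0] * len(pos)
    for i in range(len(pos) - 1):
        f = k * ((pos[i + 1] - pos[i]) - rest[i])  # Hooke's law
        forces[i] += f
        forces[i + 1] -= f
    vel = [(v + dt * f / m) * (1.0 - c * dt) for v, f in zip(vel, forces)]
    pos = [p + dt * v for p, v in zip(pos, vel)]
    return pos, vel

# Stretch a three-node chain, anchor the first node, and let it relax.
pos, vel, rest = [0.0, 1.5, 3.0], [0.0, 0.0, 0.0], [1.0, 1.0]
for _ in range(20000):
    pos, vel = step(pos, vel, rest)
    pos[0], vel[0] = 0.0, 0.0   # boundary condition: node 0 is anchored
# The chain settles near its rest configuration [0.0, 1.0, 2.0].
```

The sketch illustrates the MSM's appeal (trivially cheap per step) and its weakness: the spring constants are tuning knobs, not material constitutive parameters, which is precisely the gap the discrete mechanics framework targets.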
Development of a realistic stress analysis for fatigue analysis of notched composite laminates
NASA Technical Reports Server (NTRS)
Humphreys, E. A.; Rosen, B. W.
1979-01-01
A finite element stress analysis which consists of a membrane and interlaminar shear spring analysis was developed. This approach was utilized in order to model physically realistic failure mechanisms while maintaining a high degree of computational economy. The accuracy of the stress analysis predictions is verified through comparisons with other solutions to the composite laminate edge effect problem. The stress analysis model was incorporated into an existing fatigue analysis methodology and the entire procedure computerized. A fatigue analysis is performed upon a square laminated composite plate with a circular central hole. A complete description and users guide for the computer code FLAC (Fatigue of Laminated Composites) is included as an appendix.
A 4DCT imaging-based breathing lung model with relative hysteresis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miyawaki, Shinjiro; Choi, Sanghun; Hoffman, Eric A.
To reproduce realistic airway motion and airflow, the authors developed a deforming lung computational fluid dynamics (CFD) model based on four-dimensional (4D, space and time) dynamic computed tomography (CT) images. A total of 13 time points within controlled tidal volume respiration were used to account for realistic and irregular lung motion in human volunteers. Because of the irregular motion of 4DCT-based airways, we identified an optimal interpolation method for airway surface deformation during respiration, and implemented a computational solid mechanics-based moving mesh algorithm to produce a smooth deforming airway mesh. In addition, we developed physiologically realistic airflow boundary conditions for both models based on multiple images and a single image. Furthermore, we examined simplified models based on one or two dynamic or static images. By comparing these simplified models with the model based on 13 dynamic images, we investigated the effects of relative hysteresis of lung structure with respect to lung volume, lung deformation, and imaging methods, i.e., dynamic vs. static scans, on CFD-predicted pressure drop. The effect of imaging method on pressure drop was 24 percentage points due to the differences in airflow distribution and airway geometry. - Highlights: • We developed a breathing human lung CFD model based on 4D-dynamic CT images. • The 4DCT-based breathing lung model is able to capture lung relative hysteresis. • A new boundary condition for lung model based on one static CT image was proposed. • The difference between lung models based on 4D and static CT images was quantified.
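The paper compares interpolation methods for airway surface deformation between the 13 image time points; the simplest baseline, piecewise-linear interpolation of a node's position between frames, looks like this (the trajectory data are hypothetical, and the authors' chosen scheme may differ).

```python
def interp_node(times, frames, t):
    # Piecewise-linear interpolation of one mesh node's position between
    # successive image time points - the simplest of the interpolation
    # schemes one might compare for 4DCT airway deformation.
    if t <= times[0]:
        return frames[0]
    if t >= times[-1]:
        return frames[-1]
    for i in range(len(times) - 1):
        if times[i] <= t <= times[i + 1]:
            w = (t - times[i]) / (times[i + 1] - times[i])
            return tuple(a + w * (b - a) for a, b in zip(frames[i], frames[i + 1]))

# Hypothetical trajectory of one airway-surface node over three time points.
times = [0.0, 1.0, 2.0]
frames = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (2.0, 0.0, 0.0)]
interp_node(times, frames, 0.5)  # midway between the first two frames
```

Linear interpolation gives velocity jumps at each frame; smoother (e.g. spline) schemes avoid this, which is one reason identifying an "optimal" interpolation matters for a moving CFD mesh.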
Blend Shape Interpolation and FACS for Realistic Avatar
NASA Astrophysics Data System (ADS)
Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila
2015-03-01
The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has given further impetus to the rapid advancement of complex virtual human facial models. Face-to-face communication being the most natural form of human interaction, facial animation systems have become attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors remains a challenging problem. A facial expression carries the signature of happiness, sadness, anger, cheerfulness, etc.; the mood of a particular person in the midst of a large group can be identified immediately via very subtle changes in facial expression. Facial expressions, a complex and important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach that integrates blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. The BSI is used to generate the natural face, while the FACS is employed to reflect the exact facial muscle movements for four basic emotional expressions (angry, happy, sad and fearful) with high fidelity. The results in perceiving realistic facial expressions of virtual human emotions, based on facial skin color and texture, may contribute to the development of virtual reality and game environments in computer-aided graphics animation systems.
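Blend shape interpolation itself is standard: each vertex is the neutral position plus a weighted sum of per-target displacements, v = v0 + Σᵢ wᵢ(tᵢ − v0). A minimal sketch follows; the two-vertex "face" and smile target are toy data, and mapping FACS action units to weights is beyond this fragment.

```python
def blend(neutral, targets, weights):
    # Blend-shape interpolation: v = v0 + sum_i w_i * (t_i - v0),
    # applied per vertex, where t_i are target shapes and w_i weights.
    out = []
    for vi, v0 in enumerate(neutral):
        v = list(v0)
        for tgt, w in zip(targets, weights):
            for a in range(3):
                v[a] += w * (tgt[vi][a] - v0[a])
        out.append(tuple(v))
    return out

# Two-vertex toy face with one "smile" target applied at half strength.
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
smile   = [(0.0, 0.2, 0.0), (1.0, 0.2, 0.0)]
half_smile = blend(neutral, [smile], [0.5])
# half_smile == [(0.0, 0.1, 0.0), (1.0, 0.1, 0.0)]
```

In a BSI+FACS pipeline of the kind the abstract describes, each FACS action unit would drive the weight of one or more sculpted target shapes; here the weights are simply supplied by hand.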
Simulation studies of the application of SEASAT data in weather and state of sea forecasting models
NASA Technical Reports Server (NTRS)
Cardone, V. J.; Greenwood, J. A.
1979-01-01
The design and analysis of SEASAT simulation studies in which the error structure of conventional analyses and forecasts is modeled realistically are presented. The development and computer implementation of a global spectral ocean wave model is described. The design of algorithms for the assimilation of theoretical wind data into computers and for the utilization of real wind data and wave height data in a coupled computer system are presented.
Canstein, C; Cachot, P; Faust, A; Stalder, A F; Bock, J; Frydrychowicz, A; Küffer, J; Hennig, J; Markl, M
2008-03-01
The knowledge of local vascular anatomy and function in the human body is of high interest for the diagnosis and treatment of cardiovascular disease. A comprehensive analysis of the hemodynamics in the thoracic aorta is presented based on the integration of flow-sensitive 4D MRI with state-of-the-art rapid prototyping technology and computational fluid dynamics (CFD). Rapid prototyping was used to transform aortic geometries as measured by contrast-enhanced MR angiography into realistic vascular models with large anatomical coverage. Integration into a flow circuit with patient-specific pulsatile in-flow conditions and application of flow-sensitive 4D MRI permitted detailed analysis of local and global 3D flow dynamics in a realistic vascular geometry. Visualization of characteristic 3D flow patterns and quantitative comparisons of the in vitro experiments with in vivo data and CFD simulations in identical vascular geometries were performed to evaluate the accuracy of vascular model systems. The results indicate the potential of such patient-specific model systems for detailed experimental simulation of realistic vascular hemodynamics. Further studies are warranted to examine the influence of refined boundary conditions of the human circulatory system such as fluid-wall interaction and their effect on normal and pathological blood flow characteristics associated with vascular geometry. (c) 2008 Wiley-Liss, Inc.
A gridded global description of the ionosphere and thermosphere for 1996 - 2000
NASA Astrophysics Data System (ADS)
Ridley, A.; Kihn, E.; Kroehl, H.
The modeling and simulation community has asked for a realistic representation of the near-Earth space environment covering a significant number of years to be used in scientific and engineering applications. The data, data management systems, assimilation techniques, physical models, and computer resources are now available to construct a realistic description of the ionosphere and thermosphere over a 5-year period. DMSP and NOAA POES satellite data and solar emissions were used to compute Hall and Pedersen conductances in the ionosphere. Interplanetary magnetic field measurements on the ACE satellite define average electrostatic potential patterns over the northern and southern Polar Regions. These conductances, electric field patterns, and ground-based magnetometer data were input to the Assimilative Mapping of Ionospheric Electrodynamics model to compute the distribution of electric fields and currents in the ionosphere. The Global Thermosphere Ionosphere Model (GITM) used the ionospheric electrodynamic parameters to compute the distribution of particles and fields in the ionosphere and thermosphere. GITM uses a general circulation approach to solve the fundamental equations. Model results offer a unique opportunity to assess the relative importance of different forcing terms under a variety of conditions as well as the accuracies of different estimates of ionospheric electrodynamic parameters.
A Neural Model of How the Brain Computes Heading from Optic Flow in Realistic Scenes
ERIC Educational Resources Information Center
Browning, N. Andrew; Grossberg, Stephen; Mingolla, Ennio
2009-01-01
Visually-based navigation is a key competence during spatial cognition. Animals avoid obstacles and approach goals in novel cluttered environments using optic flow to compute heading with respect to the environment. Most navigation models try either to explain data or to demonstrate navigational competence in real-world environments without regard…
Computational Psychometrics for Modeling System Dynamics during Stressful Disasters.
Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe
2017-01-01
Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.
Leahy, P.P.
1982-01-01
The Trescott computer program for modeling groundwater flow in three dimensions has been modified to (1) treat aquifer and confining bed pinchouts more realistically and (2) reduce the computer memory requirements needed for the input data. Using the original program, simulation of aquifer systems with nonrectangular external boundaries may result in a large number of nodes that are not involved in the numerical solution of the problem, but require computer storage. (USGS)
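The abstract does not describe the storage scheme; one common way to avoid allocating memory for nodes outside the aquifer is to map only active cells to a compact solver index. The sketch below is a hypothetical illustration of that device, not the Trescott code's actual data structure.

```python
def build_active_map(active_mask):
    # Map each active (row, col, layer) cell to a compact index so solver
    # arrays are sized by the number of active cells, not the full grid.
    index = {}
    for cell, is_active in active_mask.items():
        if is_active:
            index[cell] = len(index)
    return index

# 3x3x1 grid whose aquifer occupies an L-shaped region: 5 of 9 cells active.
mask = {(i, j, 0): (i == 0 or j == 0) for i in range(3) for j in range(3)}
idx = build_active_map(mask)
len(idx)   # solver vectors of length 5 instead of 9
```

With an irregular (non-rectangular) aquifer boundary, the fraction of inactive cells in a rectangular grid can be large, which is exactly the memory waste the modified program addresses.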
NASA Astrophysics Data System (ADS)
Chun, Poo-Reum; Lee, Se-Ah; Yook, Yeong-Geun; Choi, Kwang-Sung; Cho, Deog-Geun; Yu, Dong-Hun; Chang, Won-Seok; Kwon, Deuk-Chul; Im, Yeon-Ho
2013-09-01
Although plasma etch profile simulation has attracted much interest for developing reliable plasma etching, large gaps remain between the current state of research and predictive modeling due to the inherent complexity of plasma processes. In an effort to address this issue, we present a 3D feature profile simulation coupled with a well-defined plasma-surface kinetic model for the silicon dioxide etching process under fluorocarbon plasmas. To capture realistic plasma-surface reaction behavior, a polymer-layer-based surface kinetic model was proposed to account for simultaneous polymer deposition and oxide etching. The realistic plasma surface model was then used to calculate the speed function for 3D topology simulation, which consists of a multiple-level-set-based moving algorithm and a ballistic transport module. In addition, the time-consuming computations in the ballistic transport calculation were accelerated drastically by GPU-based numerical computation, enabling real-time computation. Finally, we demonstrate that the surface kinetic model can be coupled successfully to 3D etch profile simulations of high-aspect-ratio contact hole plasma etching.
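A single explicit update of the level-set equation φ_t + F|∇φ| = 0, with a first-order upwind gradient, conveys the moving-front core of such a topology simulator. This is a 2-D toy with a constant unit speed function, not the paper's multiple-level-set, GPU-accelerated scheme.

```python
import math

def level_set_step(phi, speed, dx=1.0, dt=0.4):
    # One explicit update of the level-set equation phi_t + F|grad phi| = 0
    # with a first-order upwind (Godunov) gradient norm; the zero level set
    # of phi is the moving etch front and F is the local speed function.
    n, m = len(phi), len(phi[0])
    new = [row[:] for row in phi]
    for i in range(n):
        for j in range(m):
            dmx = (phi[i][j] - phi[i - 1][j]) / dx if i > 0 else 0.0
            dpx = (phi[i + 1][j] - phi[i][j]) / dx if i < n - 1 else 0.0
            dmy = (phi[i][j] - phi[i][j - 1]) / dx if j > 0 else 0.0
            dpy = (phi[i][j + 1] - phi[i][j]) / dx if j < m - 1 else 0.0
            f = speed(i, j)
            if f > 0:   # front moves outward: upwind differencing
                grad = math.hypot(max(dmx, 0.0, -dpx), max(dmy, 0.0, -dpy))
            else:
                grad = math.hypot(max(dpx, 0.0, -dmx), max(dpy, 0.0, -dmy))
            new[i][j] = phi[i][j] - dt * f * grad
    return new

# Signed-distance initialization: the zero level set is a circle of radius 5.
N = 21
phi0 = [[math.hypot(i - 10, j - 10) - 5.0 for j in range(N)] for i in range(N)]
phi1 = level_set_step(phi0, speed=lambda i, j: 1.0)
# With unit speed the front expands: phi drops by about dt per step.
```

In an etch simulator the speed function would come from the surface kinetics (deposition vs. etching) and the ballistic flux, which is where the coupling described in the abstract enters.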
Perfusion kinetics in human brain tumor with DCE-MRI derived model and CFD analysis.
Bhandari, A; Bansal, A; Singh, A; Sinha, N
2017-07-05
Cancer is one of the leading causes of death worldwide. Among the strategies used for cancer treatment, the effectiveness of chemotherapy is often hindered by factors such as irregular and non-uniform uptake of drugs inside the tumor. Thus, accurate prediction of drug transport and deposition inside the tumor is crucial for increasing the effectiveness of chemotherapeutic treatment. In this study, a computational model of a human brain tumor is developed that incorporates dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) data into a voxelized porous media model. The model takes into account realistic transport and perfusion kinetics parameters together with realistic heterogeneous tumor vasculature and an accurate arterial input function (AIF), which makes it patient specific. The computational results for interstitial fluid pressure (IFP), interstitial fluid velocity (IFV) and tracer concentration show good agreement with the experimental results. The computational model can be extended further for predicting the deposition of chemotherapeutic drugs in the tumor environment as well as selecting the best chemotherapeutic drug for a specific patient. Copyright © 2017 Elsevier Ltd. All rights reserved.
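Interstitial fluid velocity in such porous-media tumor models typically follows Darcy's law, v = −K∇p. A one-dimensional finite difference sketch is below; the hydraulic conductivity value and the pressure profile are invented for illustration, not taken from the study.

```python
def darcy_velocity(pressure, K=3e-7, dx=1.0):
    # Interstitial fluid velocity from Darcy's law, v = -K * grad(p),
    # with the pressure gradient estimated by central differences
    # along a 1-D line of voxels.
    v = []
    for i in range(1, len(pressure) - 1):
        grad = (pressure[i + 1] - pressure[i - 1]) / (2 * dx)
        v.append(-K * grad)
    return v

# Hypothetical IFP profile: elevated in the tumor core and dropping at
# the rim, which drives outward convection at the tumor margin.
ifp = [20.0, 20.0, 18.0, 10.0, 2.0, 0.0, 0.0]  # mmHg along a voxel line
vel = darcy_velocity(ifp)
# All velocities point outward (positive), peaking where the IFP
# gradient at the tumor rim is steepest.
```

In the full 3-D voxelized model the same relation is evaluated per voxel from the DCE-MRI-derived pressure field, which is what makes the IFP/IFV predictions patient specific.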
Mathematical Description of Complex Chemical Kinetics and Application to CFD Modeling Codes
NASA Technical Reports Server (NTRS)
Bittker, D. A.
1993-01-01
A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.
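At the extreme end of the mechanism reduction described above, a one-step global reaction with an Arrhenius rate already reproduces ignition and heat release qualitatively. The sketch below uses invented constants and is not one of the paper's mechanisms.

```python
import math

def burn(Y0=1.0, T0=1200.0, A=1e9, Ea=1.5e5, q=2000.0, dt=1e-6, steps=20000):
    # One-step global mechanism Fuel -> Products with an Arrhenius rate:
    #   dY/dt = -A * Y * exp(-Ea / (R*T)),   dT/dt = -q * dY/dt,
    # where Y is the fuel mass fraction, T the temperature, and q
    # couples heat release to fuel consumption.
    R = 8.314
    Y, T = Y0, T0
    for _ in range(steps):
        rate = A * Y * math.exp(-Ea / (R * T))
        dY = min(dt * rate, Y)   # never consume more fuel than remains
        Y -= dY
        T += q * dY              # heating proportional to fuel burned
    return Y, T

y_end, t_end = burn()
# Thermal runaway consumes the fuel; by construction the end temperature
# approaches the adiabatic value T0 + q*Y0 = 3200 K.
```

A practical reduced mechanism would use several such steps fitted to the detailed chemistry, but the computational appeal is the same: a handful of rate evaluations per cell per time step instead of hundreds.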
Feasibility of Equivalent Dipole Models for Electroencephalogram-Based Brain Computer Interfaces.
Schimpf, Paul H
2017-09-15
This article examines the localization errors of equivalent dipolar sources inverted from the surface electroencephalogram in order to determine the feasibility of using their location as classification parameters for non-invasive brain computer interfaces. Inverse localization errors are examined for two head models: a model represented by four concentric spheres and a realistic model based on medical imagery. It is shown that the spherical model results in localization ambiguity such that a number of dipolar sources, with different azimuths and varying orientations, provide a near match to the electroencephalogram of the best equivalent source. No such ambiguity exists for the elevation of inverted sources, indicating that for spherical head models, only the elevation of inverted sources (and not the azimuth) can be expected to provide meaningful classification parameters for brain-computer interfaces. In a realistic head model, all three parameters of the inverted source location are found to be reliable, providing a more robust set of parameters. In both cases, the residual error hypersurfaces demonstrate local minima, indicating that a search for the best-matching sources should be global. Source localization error vs. signal-to-noise ratio is also demonstrated for both head models.
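The global search that the residual-error local minima call for can be illustrated with a toy inverse problem. This is a minimal sketch under assumed physics: the `forward` function below is a made-up dipole-to-sensor mapping with an arbitrary four-sensor layout, not the paper's spherical-shell or realistic head model, and the dipole orientation is held fixed for brevity.

```python
# Toy equivalent-dipole localization by exhaustive (global) search.
# The forward model and sensor layout are illustrative assumptions.
SENSORS = [(1.0, 0.0, 1.0), (-1.0, 0.0, 1.0), (0.0, 1.0, 1.0), (0.0, -1.0, 1.0)]

def forward(pos, moment):
    """Assumed forward model: dipole potential falls off as 1/r^2,
    modulated by the dot product of moment and displacement."""
    out = []
    for s in SENSORS:
        d = [s[i] - pos[i] for i in range(3)]
        r2 = sum(x * x for x in d) or 1e-12
        out.append(sum(m * x for m, x in zip(moment, d)) / r2 ** 1.5)
    return out

def residual(measured, pos, moment):
    pred = forward(pos, moment)
    return sum((a - b) ** 2 for a, b in zip(measured, pred))

def global_grid_search(measured, step=0.25):
    # Because the error hypersurface has local minima (as the abstract
    # notes), a purely local descent can stall; scan the whole volume.
    best = None
    grid = [i * step for i in range(-2, 3)]
    for x in grid:
        for y in grid:
            for z in grid:
                err = residual(measured, (x, y, z), (0.0, 0.0, 1.0))
                if best is None or err < best[0]:
                    best = (err, (x, y, z))
    return best[1]

true_pos = (0.5, -0.25, 0.0)
measured = forward(true_pos, (0.0, 0.0, 1.0))
est = global_grid_search(measured)
```

With noiseless synthetic data the coarse grid recovers the generating position exactly; a realistic pipeline would refine the best grid cell with a local optimizer.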
2015-06-24
physically. While not distinct from IH models, they require inner boundary magnetic field and plasma property values, the latter not currently measured ... initialization for the computational grid. Model integration continues until a physically consistent steady-state is attained. Because of the more ... physical basis and greater likelihood of realistic solutions, only MHD-type coronal models were considered in the review. There are two major types of
Computational Modeling of Inflammation and Wound Healing
Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram
2013-01-01
Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histo-pathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362
Image-Based Reverse Engineering and Visual Prototyping of Woven Cloth.
Schroder, Kai; Zinke, Arno; Klein, Reinhard
2015-02-01
Realistic visualization of cloth has many applications in computer graphics. An ongoing research problem is how to best represent and capture cloth models, specifically when considering computer-aided design of cloth. Previous methods produce highly realistic images; however, they are either difficult to edit or require the measurement of large databases to capture all variations of a cloth sample. We propose a pipeline to reverse engineer cloth and estimate a parametrized cloth model from a single image. We introduce a geometric yarn model, integrating state-of-the-art textile research. We present an automatic analysis approach to estimate yarn paths, yarn widths, their variation and a weave pattern. Several examples demonstrate that we are able to model the appearance of the original cloth sample. Properties derived from the input image give a physically plausible basis that is fully editable using a few intuitive parameters.
Particle-Size-Grouping Model of Precipitation Kinetics in Microalloyed Steels
NASA Astrophysics Data System (ADS)
Xu, Kun; Thomas, Brian G.
2012-03-01
The formation, growth, and size distribution of precipitates greatly affects the microstructure and properties of microalloyed steels. Computational particle-size-grouping (PSG) kinetic models based on population balances are developed to simulate precipitate particle growth resulting from collision and diffusion mechanisms. First, the generalized PSG method for collision is explained clearly and verified. Then, a new PSG method is proposed to model diffusion-controlled precipitate nucleation, growth, and coarsening with complete mass conservation and no fitting parameters. Compared with the original population-balance models, this PSG method saves significant computation and preserves enough accuracy to model a realistic range of particle sizes. Finally, the new PSG method is combined with an equilibrium phase fraction model for plain carbon steels and is applied to simulate the precipitated fraction of aluminum nitride and the size distribution of niobium carbide during isothermal aging processes. Good matches are found with experimental measurements, suggesting that the new PSG method offers a promising framework for the future development of realistic models of precipitation.
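The size-grouping idea at the core of the PSG method can be sketched in a few lines. This is an illustrative reduction only, with an assumed group ratio R = 3; the paper's method additionally evolves the grouped populations under collision and diffusion kinetics, which is omitted here.

```python
import math

# Particle-size-grouping (PSG) sketch: instead of tracking every particle
# size j = 1..N, sizes are binned into geometric groups whose characteristic
# sizes grow by a fixed ratio R, so ~log(N) groups span a realistic range.
R = 3.0  # group size ratio (an assumed value)

def group_index(size, r=R):
    # Smallest group g whose characteristic size r**g reaches `size`.
    return max(0, math.ceil(math.log(size, r)))

def build_groups(densities, r=R):
    """Lump a full size distribution {size: number density} into PSG
    groups, tracking both particle number and mass per group so that
    total mass (sum of size * density) is conserved."""
    groups = {}
    for size, n in densities.items():
        g = group_index(size, r)
        num, mass = groups.get(g, (0.0, 0.0))
        groups[g] = (num + n, mass + n * size)
    return groups

# A 1000-class distribution collapses to a handful of groups.
full = {j: 1.0 / j for j in range(1, 1001)}
psg = build_groups(full)
total_mass_full = sum(j * n for j, n in full.items())
total_mass_psg = sum(mass for _, mass in psg.values())
```

Here 1000 size classes reduce to 8 groups while the lumping step conserves total mass to rounding, mirroring the paper's point that PSG saves computation without sacrificing mass conservation.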
Theoretical models for duct acoustic propagation and radiation
NASA Technical Reports Server (NTRS)
Eversman, Walter
1991-01-01
The development of computational methods in acoustics has led to the introduction of analysis and design procedures which model the turbofan inlet as a coupled system, simultaneously modeling propagation and radiation in the presence of realistic internal and external flows. Such models are generally large, require substantial computer speed and capacity, and can be expected to be used in the final design stages, with the simpler models being used in the early design iterations. Emphasis is given to practical modeling methods that have been applied to the acoustical design problem in turbofan engines. The mathematical model is established and the simplest case of propagation in a duct with hard walls is solved to introduce concepts and terminologies. An extensive overview is given of methods for the calculation of attenuation in uniform ducts with uniform flow and with shear flow. Subsequent sections deal with numerical techniques which provide an integrated representation of duct propagation and near- and far-field radiation for realistic geometries and flight conditions.
Trayanova, Natalia A; Tice, Brock M
2009-01-01
Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585
Virtual reality neurosurgery: a simulator blueprint.
Spicer, Mark A; van Velsen, Martin; Caffrey, John P; Apuzzo, Michael L J
2004-04-01
This article details preliminary studies undertaken to integrate the most relevant advancements across multiple disciplines in an effort to construct a highly realistic neurosurgical simulator based on a distributed computer architecture. Techniques based on modified computational modeling paradigms incorporating finite element analysis are presented, as are current and projected efforts directed toward the implementation of a novel bidirectional haptic device. Patient-specific data derived from noninvasive magnetic resonance imaging sequences are used to construct a computational model of the surgical region of interest. Magnetic resonance images of the brain may be coregistered with those obtained from magnetic resonance angiography, magnetic resonance venography, and diffusion tensor imaging to formulate models of varying anatomic complexity. The majority of the computational burden is encountered in the presimulation reduction of the computational model and allows realization of the required threshold rates for the accurate and realistic representation of real-time visual animations. Intracranial neurosurgical procedures offer an ideal testing site for the development of a totally immersive virtual reality surgical simulator when compared with the simulations required in other surgical subspecialties. The material properties of the brain as well as the typically small volumes of tissue exposed in the surgical field, coupled with techniques and strategies to minimize computational demands, provide unique opportunities for the development of such a simulator. Incorporation of real-time haptic and visual feedback is approached here and likely will be accomplished soon.
Computational Psychometrics for Modeling System Dynamics during Stressful Disasters
Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe
2017-01-01
Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future. PMID:28861026
Bohme, Andrea; van Rienen, Ursula
2016-08-01
Computational modeling of the stimulating field distribution during Deep Brain Stimulation provides an opportunity to advance our knowledge of this neurosurgical therapy for Parkinson's disease. Several approaches exist to model the target region for Deep Brain Stimulation in hemiparkinsonian rats with volume conductor models. We describe and compare the normalized mapping approach as well as modeling with three-dimensional structures, which uses curvilinear coordinates to ensure an anatomically realistic conductivity tensor orientation.
NASA Astrophysics Data System (ADS)
He, Xiao Dong
This thesis studies light scattering processes off rough surfaces. Analytic models for reflection, transmission and subsurface scattering of light are developed. The results are applicable to realistic image generation in computer graphics. The investigation focuses on the basic issue of how light is scattered locally by general surfaces which are neither diffuse nor specular. Physical optics is employed to account for diffraction and interference, which play a crucial role in the scattering of light for most surfaces. The thesis presents: (1) a new reflectance model; (2) a new transmittance model; (3) a new subsurface scattering model. All of these models are physically based, depend on only physical parameters, apply to a wide range of materials and surface finishes and, more importantly, provide a smooth transition from diffuse-like to specular reflection as the wavelength and incidence angle are increased or the surface roughness is decreased. The reflectance and transmittance models are based on the Kirchhoff theory, and the subsurface scattering model is based on energy transport theory. They are valid only for surfaces with shallow slopes. The thesis shows that predicted reflectance distributions given by the reflectance model compare favorably with experiment. The thesis also investigates and implements fast ways of computing the reflectance and transmittance models. Furthermore, the thesis demonstrates that a high level of realistic image generation can be achieved due to the physically correct treatment of the scattering processes by the reflectance model.
Effects of damping on mode shapes, volume 2
NASA Technical Reports Server (NTRS)
Gates, R. M.; Merchant, D. H.; Arnquist, J. L.
1977-01-01
Displacement, velocity, and acceleration admittances were calculated for a realistic NASTRAN structural model of space shuttle for three conditions: liftoff, maximum dynamic pressure and end of solid rocket booster burn. The realistic model of the orbiter, external tank, and solid rocket motors included the representation of structural joint transmissibilities by finite stiffness and damping elements. Data values for the finite damping elements were assigned to duplicate overall low-frequency modal damping values taken from tests of similar vehicles. For comparison with the calculated admittances, position and rate gains were computed for a conventional shuttle model for the liftoff condition. Dynamic characteristics and admittances for the space shuttle model are presented.
Navarro, Rafael; Palos, Fernando; Lanchares, Elena; Calvo, Begoña; Cristóbal, José A
2009-01-01
To develop a realistic model of the optomechanical behavior of the cornea after curved relaxing incisions to simulate the induced astigmatic change and predict the optical aberrations produced by the incisions. ICMA Consejo Superior de Investigaciones Científicas and Universidad de Zaragoza, Zaragoza, Spain. A 3-dimensional finite element model of the anterior hemisphere of the ocular surface was used. The corneal tissue was modeled with a quasi-incompressible, anisotropic hyperelastic constitutive behavior strongly dependent on the physiological collagen fibril distribution. Similar behaviors were assigned to the limbus and sclera. With this model, corneal incisions were computer simulated following the Lindstrom nomogram. The resulting geometry of the biomechanical simulation was analyzed in the optical zone, and finite ray tracing was performed to compute refractive power and higher-order aberrations (HOAs). The finite-element simulation provided the new geometry of the corneal surfaces, from which elevation topographies were obtained. The surgically induced astigmatism (SIA) of the simulated incisions according to the Lindstrom nomogram was computed by finite ray tracing. However, paraxial computations would yield slightly different results (undercorrection of astigmatism). In addition, arcuate incisions would induce significant amounts of HOAs. Finite-element models, together with finite ray-tracing computations, yielded realistic simulations of the biomechanical and optical changes induced by relaxing incisions. The model reproduced the SIA indicated by the Lindstrom nomogram for the simulated incisions and predicted a significant increase in optical aberrations induced by arcuate keratotomy.
Precision Modeling Of Targets Using The VALUE Computer Program
NASA Astrophysics Data System (ADS)
Hoffman, George A.; Patton, Ronald; Akerman, Alexander
1989-08-01
The 1976-vintage LASERX computer code has been augmented to produce realistic electro-optical images of targets. Capabilities lacking in LASERX but recently incorporated into its VALUE successor include: shadows cast onto the ground; shadows cast onto parts of the target; see-through transparencies (e.g., canopies); apparent images due both to atmospheric scattering and turbulence; and surfaces characterized by multiple bidirectional reflectance functions. VALUE provides not only realistic target modeling through its precise and comprehensive representation of all target attributes; it is also very user friendly. Specifically, runs are set up through screen-prompted menus in a sequence of queries that is logical to the user. VALUE also incorporates the Optical Encounter (OPEC) software developed by Tricor Systems, Inc., Elgin, IL.
A computationally tractable version of the collective model
NASA Astrophysics Data System (ADS)
Rowe, D. J.
2004-05-01
A computationally tractable version of the Bohr-Mottelson collective model is presented which makes it possible to diagonalize realistic collective models and obtain convergent results in relatively small appropriately chosen subspaces of the collective model Hilbert space. Special features of the proposed model are that it makes use of the beta wave functions given analytically by the softened-beta version of the Wilets-Jean model, proposed by Elliott et al., and a simple algorithm for computing SO(5)⊃SO(3) spherical harmonics. The latter has much in common with the methods of Chacon, Moshinsky, and Sharp but is conceptually and computationally simpler. Results are presented for collective models ranging from the spherical vibrator to the Wilets-Jean and axially symmetric rotor-vibrator models.
A computational model for epidural electrical stimulation of spinal sensorimotor circuits.
Capogrosso, Marco; Wenger, Nikolaus; Raspopovic, Stanisa; Musienko, Pavel; Beauparlant, Janine; Bassi Luciani, Lorenzo; Courtine, Grégoire; Micera, Silvestro
2013-12-04
Epidural electrical stimulation (EES) of lumbosacral segments can restore a range of movements after spinal cord injury. However, the mechanisms and neural structures through which EES facilitates movement execution remain unclear. Here, we designed a computational model and performed in vivo experiments to investigate the type of fibers, neurons, and circuits recruited in response to EES. We first developed a realistic finite element computer model of rat lumbosacral segments to identify the currents generated by EES. To evaluate the impact of these currents on sensorimotor circuits, we coupled this model with an anatomically realistic axon-cable model of motoneurons, interneurons, and myelinated afferent fibers for antagonistic ankle muscles. Comparisons between computer simulations and experiments revealed the ability of the model to predict EES-evoked motor responses over multiple intensities and locations. Analysis of the recruited neural structures revealed the lack of direct influence of EES on motoneurons and interneurons. Simulations and pharmacological experiments demonstrated that EES engages spinal circuits trans-synaptically through the recruitment of myelinated afferent fibers. The model also predicted the capacity of spatially distinct EES to modulate side-specific limb movements and, to a lesser extent, extension versus flexion. These predictions were confirmed during standing and walking enabled by EES in spinal rats. These combined results provide a mechanistic framework for the design of spinal neuroprosthetic systems to improve standing and walking after neurological disorders.
2011-01-01
Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine builds on the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters, including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side.
Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355
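The stochastic metapopulation scheme that GLEaM-style engines implement can be caricatured in a few lines. This sketch is not GLEaMviz code: the two-city network, the rates BETA, GAMMA, and TRAVEL, and the binomial update rule are all assumptions chosen for illustration.

```python
import random

# Toy metapopulation SIR: two cities exchange infectious travelers, and
# within each city the disease follows stochastic S -> I -> R transitions.
random.seed(42)

BETA, GAMMA = 0.30, 0.10     # per-day transmission / recovery rates (assumed)
TRAVEL = 0.01                # daily fraction of infectious hosts that travel

def step(cities):
    # Within-city stochastic transitions S->I and I->R.
    for c in cities:
        n = c["S"] + c["I"] + c["R"]
        p_inf = 1 - (1 - BETA / n) ** c["I"] if n else 0.0
        new_i = sum(random.random() < p_inf for _ in range(c["S"]))
        new_r = sum(random.random() < GAMMA for _ in range(c["I"]))
        c["S"] -= new_i; c["I"] += new_i - new_r; c["R"] += new_r
    # Symmetric mobility: a fixed fraction of infectious hosts swaps cities.
    move = [int(c["I"] * TRAVEL) for c in cities]
    cities[0]["I"] += move[1] - move[0]
    cities[1]["I"] += move[0] - move[1]

cities = [{"S": 990, "I": 10, "R": 0}, {"S": 1000, "I": 0, "R": 0}]
total0 = sum(sum(c.values()) for c in cities)
for _ in range(200):
    step(cities)
total1 = sum(sum(c.values()) for c in cities)
```

The update conserves total population by construction; real engines such as GLEaM replace the two-city toy with worldwide demographic and mobility data and a configurable compartmental structure.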
Where Next for Marine Cloud Brightening Research?
NASA Astrophysics Data System (ADS)
Jenkins, A. K. L.; Forster, P.
2014-12-01
Realistic estimates of geoengineering effectiveness will be central to informed decision-making on its possible role in addressing climate change. Over the last decade, global-scale computer climate modelling of geoengineering has been developing. While these developments have allowed quantitative estimates of geoengineering effectiveness to be produced, the relative coarseness of the grid of these models (tens of kilometres) means that key practical details of the proposed geoengineering are not always realistically captured. This is particularly true for marine cloud brightening (MCB), where neither the clouds nor the tens-of-metres-scale sea-going implementation vessels can be captured in detail. Previous research using cloud-resolving modelling has shown that neglecting such details may lead to MCB effectiveness being overestimated by up to half. Estimates of MCB effectiveness will likely become more realistic through ongoing developments in the understanding and modelling of clouds. We also propose that realism can be increased via more specific improvements (see figure). A readily achievable example would be the reframing of previous MCB effectiveness estimates in light of the cloud-resolving-scale findings. Implementation details could also be incorporated, via parameterisation, into future global-scale modelling of MCB. However, as significant unknowns regarding the design of the MCB aerosol production technique remain, resource-intensive cloud-resolving computer modelling of MCB may be premature unless of broader benefit to the wider understanding of clouds. One of the most essential recommendations is for enhanced communication between climate scientists and MCB designers. This would facilitate the identification of potentially important design aspects necessary for realistic computer simulations. Such relationships could be mutually beneficial, with computer modelling potentially informing more efficient designs of the MCB implementation technique.
(Acknowledgment) This work is part of the Integrated Assessment of Geoengineering Proposals (IAGP) project, funded by the Engineering and Physical Sciences Research Council and the Natural Environment Research Council (EP/I014721/1).
Schwalenberg, Simon
2005-06-01
The present work represents a first attempt to perform computations of output intensity distributions for different parametric holographic scattering patterns. Based on the model for parametric four-wave mixing processes in photorefractive crystals and taking into account realistic material properties, we present computed images of selected scattering patterns. We compare these calculated light distributions to the corresponding experimental observations. Our analysis is especially devoted to dark scattering patterns as they make high demands on the underlying model.
A video, text, and speech-driven realistic 3-d virtual head for human-machine interface.
Yu, Jun; Wang, Zeng-Fu
2015-05-01
A multiple-inputs-driven realistic facial animation system based on a 3-D virtual head for human-machine interface is proposed. The system can be driven independently by video, text, and speech, and thus can interact with humans through diverse interfaces. The combination of a parameterized model and a muscular model is used to obtain a tradeoff between computational efficiency and high realism of 3-D facial animation. The online appearance model is used to track 3-D facial motion from video in the framework of particle filtering, and multiple measurements, i.e., the pixel color values of the input image and the Gabor wavelet coefficients of the illumination ratio image, are fused to reduce the influence of lighting and person dependence in the construction of the online appearance model. The tri-phone model is used to reduce the computational cost of visual co-articulation in speech-synchronized viseme synthesis without sacrificing any performance. The objective and subjective experiments show that the system is suitable for human-machine interaction.
Computational evolution: taking liberties.
Correia, Luís
2010-09-01
Evolution has, for a long time, inspired computer scientists to produce computer models mimicking its behavior. Evolutionary algorithm (EA) is one of the areas where this approach has flourished. EAs have been used to model and study evolution, but they have been especially developed for their aptitude as optimization tools for engineering. Developed models are quite simple in comparison with their natural sources of inspiration. However, since EAs run on computers, we have the freedom, especially in optimization models, to test approaches both realistic and outright speculative, from the biological point of view. In this article, we discuss different common evolutionary algorithm models, and then present some alternatives of interest. These include biologically inspired models, such as co-evolution and, in particular, symbiogenetics and outright artificial operators and representations. In each case, the advantages of the modifications to the standard model are identified. The other area of computational evolution, which has allowed us to study basic principles of evolution and ecology dynamics, is the development of artificial life platforms for open-ended evolution of artificial organisms. With these platforms, biologists can test theories by directly manipulating individuals and operators, observing the resulting effects in a realistic way. An overview of the most prominent of such environments is also presented. If instead of artificial platforms we use the real world for evolving artificial life, then we are dealing with evolutionary robotics (ERs). A brief description of this area is presented, analyzing its relations to biology. Finally, we present the conclusions and identify future research avenues in the frontier of computation and biology. Hopefully, this will help to draw the attention of more biologists and computer scientists to the benefits of such interdisciplinary research.
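The "standard model" EA discussed above can be stated compactly. A minimal sketch under assumed choices: a OneMax toy fitness, bit-string individuals, tournament selection, one-point crossover, and bit-flip mutation, with the best individual seen so far tracked separately; real EA studies would add elitism, restarts, and problem-specific operators.

```python
import random

# Minimal generational evolutionary algorithm on OneMax
# (fitness = number of ones in a bit string).
random.seed(0)
GENES, POP, GENS, MUT = 32, 40, 60, 1.0 / 32  # assumed parameter choices

def fitness(ind):
    return sum(ind)

def tournament(pop, k=3):
    # Pick k random individuals; the fittest wins the right to reproduce.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    cut = random.randrange(1, GENES)           # one-point crossover
    return a[:cut] + b[cut:]

def mutate(ind):
    # Flip each bit independently with probability MUT.
    return [g ^ (random.random() < MUT) for g in ind]

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
best = max(pop, key=fitness)
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]
    best = max(pop + [best], key=fitness)      # remember the best ever seen
```

The alternatives the article surveys (co-evolution, symbiogenetic operators, artificial-life platforms) replace or extend exactly these components: the representation, the operators, and the selection scheme.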
Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.
Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A
2017-02-01
We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed ∼ 200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure.
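The multi-pool exchange idea at the heart of the generalized tissue model can be sketched for the longitudinal components of two pools. All parameter values below are assumptions, and a forward-Euler integrator stands in for MRiLab's GPU kernels; transverse components, RF pulses, and gradients are omitted.

```python
# Two-pool longitudinal Bloch-McConnell sketch: a free-water pool (a) and a
# macromolecular pool (b) exchange magnetization while each relaxes toward
# its equilibrium value. All parameters are illustrative assumptions.
M0A, M0B = 0.9, 0.1          # equilibrium magnetizations
R1A, R1B = 1.0, 2.0          # longitudinal relaxation rates, 1/s
K_AB = 2.0                   # a -> b exchange rate, 1/s
K_BA = K_AB * M0A / M0B      # detailed balance fixes the reverse rate

def relax(ma, mb, t, dt=1e-4):
    """Euler-integrate the coupled dM/dt exchange equations over time t."""
    for _ in range(int(t / dt)):
        dma = R1A * (M0A - ma) - K_AB * ma + K_BA * mb
        dmb = R1B * (M0B - mb) + K_AB * ma - K_BA * mb
        ma, mb = ma + dt * dma, mb + dt * dmb
    return ma, mb

# Saturation-recovery: start with both pools saturated (M = 0) and recover.
ma, mb = relax(0.0, 0.0, t=10.0)
```

Because the exchange rates satisfy detailed balance, both pools recover to their equilibrium magnetizations; collapsing the pools into independent isochromats, as earlier simulators did, removes exactly this coupling.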
Simulation of Combustion Systems with Realistic g-jitter
NASA Technical Reports Server (NTRS)
Mell, William E.; McGrattan, Kevin B.; Baum, Howard R.
2003-01-01
In this project, a transient, fully three-dimensional computer simulation code was developed to simulate the effects of realistic g-jitter on a number of combustion systems. The code is capable of simulating flame spread over a solid and nonpremixed or premixed gaseous combustion in nonturbulent flow. Simple combustion models were used to preserve computational efficiency, since this is meant to be an engineering code. The use of sophisticated turbulence models was also not pursued (a simple Smagorinsky-type model can be implemented if deemed appropriate), because if flow velocities are large enough for turbulence to develop in a reduced-gravity combustion scenario, it is unlikely that g-jitter disturbances (in NASA's reduced-gravity facilities) will play an important role in the flame dynamics. Acceleration disturbances of realistic orientation, magnitude, and time dependence can be easily included in the simulation. The simulation algorithm was based on techniques used in an existing large eddy simulation code which has successfully simulated fire dynamics in complex domains. A series of simulations with measured and predicted acceleration disturbances on the International Space Station (ISS) is presented. The results of this series of simulations suggested that a passive isolation system and appropriate scheduling of crew activity would provide a sufficiently "quiet" acceleration environment for spherical diffusion flames.
Patient-Specific Computational Modeling of Human Phonation
NASA Astrophysics Data System (ADS)
Xue, Qian; Zheng, Xudong; University of Maine Team
2013-11-01
Phonation is a common biological process resulting from the complex nonlinear coupling between glottal aerodynamics and vocal fold vibrations. In the past, simplified symmetric straight geometric models were commonly employed for experimental and computational studies. The shapes of the larynx lumen and the vocal folds are in fact highly three-dimensional, and the complex realistic geometry produces profound impacts on both glottal flow and vocal fold vibrations. To elucidate the effect of geometric complexity on voice production and improve the fundamental understanding of human phonation, a full flow-structure interaction simulation was carried out on a patient-specific larynx model. To the best of our knowledge, this is the first patient-specific flow-structure interaction study of human phonation. The simulation results compare well with established human data. The effects of realistic geometry on glottal flow and vocal fold dynamics are investigated. It is found that both glottal flow and vocal fold dynamics differ markedly from those of the previous simplified model. This study also paves an important step toward the development of computer models for voice disease diagnosis and surgical planning. The project described was supported by Grant Number R01DC007125 from the National Institute on Deafness and Other Communication Disorders (NIDCD).
Riveros, Fabián; Chandra, Santanu; Finol, Ender A; Gasser, T Christian; Rodriguez, Jose F
2013-04-01
Biomechanical studies on abdominal aortic aneurysms (AAA) seek to provide better decision criteria for surgical intervention in AAA repair. More accurate results can be obtained by using appropriate material models for the tissues, along with accurate geometric models and more realistic boundary conditions for the lesion. However, patient-specific AAA models are generated from gated medical images in which the artery is under pressure. Therefore, identification of the AAA zero-pressure geometry would allow a more realistic estimate of the aneurysmal wall mechanics. This study proposes a novel iterative algorithm to find the zero-pressure geometry of patient-specific AAA models. The methodology considers the anisotropic hyperelastic behavior of the aortic wall and its thickness, and accounts for the presence of the intraluminal thrombus. Results on 12 patient-specific AAA geometric models indicate that the procedure is computationally tractable and efficient, and preserves the global volume of the model. In addition, a comparison of the peak wall stress computed with the zero-pressure and CT-based geometries during systole indicates that computations using CT-based geometric models underestimate the peak wall stress by 59 ± 64 kPa and 47 ± 64 kPa for the isotropic and anisotropic material models of the arterial wall, respectively.
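Zero-pressure identification of this kind is typically posed as a fixed-point iteration: inflate a candidate geometry, compare it with the image-derived geometry, and subtract the residual. The sketch below illustrates that backward-displacement style of iteration with a toy linear `inflate` function standing in for the full anisotropic finite element solve; the paper's actual algorithm is necessarily more involved:

```python
# Toy backward-displacement iteration for zero-pressure geometry recovery.
# "inflate" is a hypothetical stand-in for a hyperelastic FE pressurization.
import numpy as np

def inflate(X0, stiffness=2.0, pressure=1.0):
    """Hypothetical forward model: nodal positions after pressurization."""
    return X0 * (1.0 + pressure / stiffness)

def find_zero_pressure(X_image, tol=1e-10, max_iter=100):
    """Iterate X0 <- X0 - (inflate(X0) - X_image) until the inflated
    candidate matches the image-derived (pressurized) geometry."""
    X0 = X_image.copy()
    for _ in range(max_iter):
        residual = inflate(X0) - X_image
        if np.max(np.abs(residual)) < tol:
            break
        X0 = X0 - residual
    return X0

X_image = np.array([1.0, 2.0, 3.0])   # nodal coordinates from the gated image
X0 = find_zero_pressure(X_image)       # re-inflating X0 reproduces X_image
```

Because the update is a contraction for this toy model, the recovered geometry is smaller than the imaged one, as expected for an unloaded artery.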
Economic Modeling as a Component of Academic Strategic Planning.
ERIC Educational Resources Information Center
MacKinnon, Joyce; Sothmann, Mark; Johnson, James
2001-01-01
Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)
On Connectivity of Wireless Sensor Networks with Directional Antennas
Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.
2017-01-01
In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze network connectivity while considering various antenna models and channel randomness. Since existing directional antenna models have their pros and cons in terms of accuracy in reflecting realistic antennas and computational complexity, we propose a new analytical directional antenna model, called the iris model, to balance accuracy against complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our analytical model of network connectivity is accurate, and that the iris antenna model provides a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081
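A minimal flavor of a directional-antenna connectivity simulation (using a simple sector antenna, not the iris model, whose definition is in the paper) is to declare a directed link when the receiver falls inside the transmitter's beam and range. All parameter values below are invented:

```python
# Sector-antenna connectivity sketch with invented parameters: a directed
# link i -> j exists when j is within range and inside i's beam sector.
import numpy as np

rng = np.random.default_rng(0)

def connected(p_tx, heading, p_rx, r=0.3, beamwidth=np.pi / 3):
    d = p_rx - p_tx
    if np.hypot(d[0], d[1]) > r:
        return False
    angle = np.arctan2(d[1], d[0])
    diff = (angle - heading + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi]
    return abs(diff) <= beamwidth / 2

n = 200
pts = rng.random((n, 2))                      # nodes in the unit square
headings = rng.uniform(0, 2 * np.pi, n)
links = sum(connected(pts[i], headings[i], pts[j])
            for i in range(n) for j in range(n) if i != j)
density = links / (n * (n - 1))               # fraction of ordered pairs linked
```

Sweeping `beamwidth` in such a sketch shows the accuracy/complexity trade-off the abstract refers to: narrower beams concentrate gain but sparsify the directed link graph.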
An exactly solvable, spatial model of mutation accumulation in cancer
NASA Astrophysics Data System (ADS)
Paterson, Chay; Nowak, Martin A.; Waclaw, Bartlomiej
2016-12-01
One of the hallmarks of cancer is the accumulation of driver mutations, which increase the net reproductive rate of cancer cells and allow them to spread. This process has been studied in mathematical models of well-mixed populations and in computer simulations of three-dimensional spatial models. But the computational complexity of these more realistic, spatial models makes it difficult to simulate realistically large and clinically detectable solid tumours. Here we describe an exactly solvable mathematical model of a tumour featuring replication, mutation and local migration of cancer cells. The model predicts a quasi-exponential growth of large tumours, even if different fragments of the tumour grow sub-exponentially due to nutrient and space limitations. The model reproduces clinically observed tumour growth times using biologically plausible rates for cell birth, death, and migration. We also show that the expected number of accumulated driver mutations increases exponentially in time if the average fitness gain per driver is constant, and that it reaches a plateau if the gains decrease over time. We discuss the realism of the underlying assumptions and possible extensions of the model.
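The driver-accumulation process can be caricatured with a small Wright-Fisher-style Monte Carlo (an illustration only, not the paper's exactly solvable spatial model): each division acquires a driver with probability u, and each driver multiplies fitness by (1 + s), so heavily mutated lineages are sampled more often as parents.

```python
# Wright-Fisher-style caricature of driver-mutation accumulation.
# u, s, and the population size are invented illustrative values.
import random

random.seed(1)

def mean_drivers(generations, u=0.01, s=0.1, cells=500):
    drivers = [0] * cells
    for _ in range(generations):
        weights = [(1 + s) ** d for d in drivers]        # fitness per cell
        parents = random.choices(drivers, weights=weights, k=cells)
        drivers = [d + (random.random() < u) for d in parents]
    return sum(drivers) / cells

early, late = mean_drivers(20), mean_drivers(100)   # mean drivers grows with time
```

Even this toy version shows the qualitative point of the abstract: with a constant gain per driver, selection makes the expected driver count grow faster than the neutral rate u per generation.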
Stewart, Terrence C; Eliasmith, Chris
2013-06-01
Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).
Modeling the Cerebellar Microcircuit: New Strategies for a Long-Standing Issue.
D'Angelo, Egidio; Antonietti, Alberto; Casali, Stefano; Casellato, Claudia; Garrido, Jesus A; Luque, Niceto Rafael; Mapelli, Lisa; Masoli, Stefano; Pedrocchi, Alessandra; Prestori, Francesca; Rizza, Martina Francesca; Ros, Eduardo
2016-01-01
The cerebellar microcircuit has been a workbench for theoretical and computational modeling since the beginning of neuroscientific research. The regular neural architecture of the cerebellum inspired different solutions to the long-standing issue of how its circuitry could control motor learning and coordination. Originally, the cerebellar network was modeled using a statistical-topological approach that was later extended by considering the geometrical organization of local microcircuits. However, with the advancement of anatomical and physiological investigations, new discoveries have revealed an unexpected richness of connections, neuronal dynamics and plasticity, calling for a change in modeling strategies, so as to include the multitude of elementary aspects of the network in an integrated and easily updatable computational framework. Recently, biophysically accurate "realistic" models using a bottom-up strategy have accounted for both detailed connectivity and nonlinear neuronal membrane dynamics. In this perspective review, we consider the state of the art and discuss how these initial efforts could be further improved. Moreover, we consider how embodied neurorobotic models including spiking cerebellar networks could help explain the role and interplay of distributed forms of plasticity. We envisage that realistic modeling, combined with closed-loop simulations, will help to capture the essence of cerebellar computations and could eventually be applied to neurological diseases and neurorobotic control systems.
A baroclinic quasigeostrophic open ocean model
NASA Technical Reports Server (NTRS)
Miller, R. N.; Robinson, A. R.; Haidvogel, D. B.
1983-01-01
A baroclinic quasigeostrophic open ocean model is presented, calibrated by a series of test problems, and demonstrated to be feasible and efficient for application to realistic mid-oceanic mesoscale eddy flow regimes. Two methods of treating the depth dependence of the flow, a finite difference method and a collocation method, are tested and intercompared. Sample Rossby wave calculations with and without advection are performed with constant stratification and two levels of nonlinearity, one weaker than and one typical of real ocean flows. Using exact analytical solutions for comparison, the accuracy and efficiency of the model are tabulated as a function of the computational parameters and the stability limits set; typically, errors were controlled between 1 percent and 10 percent RMS after two wave periods. Further Rossby wave tests with realistic stratification and wave parameters chosen to mimic real ocean conditions were performed to determine computational parameters for use with real and simulated data. Finally, a prototype calculation with quasiturbulent simulated data was performed successfully, which demonstrates the practicality of the model for scientific use.
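The exact analytical solutions used for comparison in such tests are Rossby waves; for a quasigeostrophic mode with deformation radius Rd, the zonal phase speed is c = -β/(k² + l² + 1/Rd²). A quick check with illustrative mid-latitude magnitudes (not the paper's parameter values):

```python
# Quasigeostrophic Rossby wave zonal phase speed:
#   c = -beta / (k^2 + l^2 + 1/Rd^2)
# beta and Rd below are typical mid-latitude magnitudes, for illustration.
import math

def rossby_phase_speed(k, l, beta=2e-11, Rd=5e4):
    """Zonal phase speed in m/s; negative means westward propagation."""
    return -beta / (k**2 + l**2 + 1.0 / Rd**2)

k = 2 * math.pi / 3e5            # zonal wavenumber for a 300 km wavelength
c = rossby_phase_speed(k, 0.0)   # a few cm/s, westward
```

The uniformly westward phase propagation makes these waves a convenient analytical yardstick for tabulating RMS error after a fixed number of wave periods.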
A computational method for comparing the behavior and possible failure of prosthetic implants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nielsen, C.; Hollerbach, K.; Perfect, S.
1995-05-01
Prosthetic joint implants currently in use exhibit high failure rates. Realistic computer modeling of prosthetic implants provides an opportunity for orthopedic biomechanics researchers and physicians to understand possible in vivo failure modes without having to resort to lengthy and costly clinical trials. The research presented here is part of a larger effort to develop realistic models of implanted joint prostheses. The example used here is the thumb carpo-metacarpal (cmc) joint. The work, however, can be applied to any other human joints for which prosthetic implants have been designed. Preliminary results of prosthetic joint loading, without surrounding human tissue (i.e., simulating conditions under which the prosthetic joint has not yet been implanted into the human joint), are presented, based on a three-dimensional, nonlinear finite element analysis of three different joint implant designs.
Uterus models for use in virtual reality hysteroscopy simulators.
Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias
2009-05-01
Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.
Kahnert, Michael; Nousiainen, Timo; Lindqvist, Hannakaisa; Ebert, Martin
2012-04-23
Light scattering by light-absorbing carbon (LAC) aggregates encapsulated in sulfate shells is computed by use of the discrete dipole method. Computations are performed for one UV, one visible, and one IR wavelength, different particle sizes, and volume fractions. Reference computations are compared to three classes of simplified model particles that have been proposed for climate modeling purposes. None of the models matches the reference results sufficiently well. Remarkably, the more realistic core-shell geometries fall behind homogeneous mixture models. An extended model based on a core-shell-shell geometry is proposed and tested. Good agreement is found for total optical cross sections and the asymmetry parameter. © 2012 Optical Society of America
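One elementary geometric relation underlying core-shell model particles: for a concentric shell, the outer radius follows directly from the core radius and the core volume fraction, since volumes scale as the cube of the radius. The values below are illustrative:

```python
# Volume-equivalent core-shell geometry: outer radius from the core radius
# and the core volume fraction f_core (volumes scale as r**3).
def shell_radius(r_core, f_core):
    """Total particle radius such that the core occupies fraction f_core
    of the particle volume."""
    return r_core * (1.0 / f_core) ** (1.0 / 3.0)

r_total = shell_radius(0.1, 0.125)   # 12.5% core by volume -> r_total = 0.2
```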
Characterization of photomultiplier tubes with a realistic model through GPU-boosted simulation
NASA Astrophysics Data System (ADS)
Anthony, M.; Aprile, E.; Grandi, L.; Lin, Q.; Saldanha, R.
2018-02-01
The accurate characterization of a photomultiplier tube (PMT) is crucial in a wide variety of applications. However, current methods do not give fully accurate representations of the response of a PMT, especially at very low light levels. In this work, we present a new and more realistic model of the response of a PMT, called the cascade model, and use it to characterize two different PMTs at various voltages and light levels. The cascade model is shown to outperform the more common Gaussian model in almost all circumstances and to agree well with a newly introduced model-independent approach. The technical and computational challenges of this model are also presented, along with the employed solution of developing a robust GPU-based analysis framework for this and other non-analytical models.
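The cascade idea can be sketched as a compound Poisson process: Poisson-distributed photoelectrons, each amplified through a chain of Poisson multiplication stages (dynodes). The stage gains below are invented, and this toy is far simpler than the paper's full cascade model:

```python
# Compound-Poisson "cascade" sketch of a PMT at low light levels.
# Stage gains and the mean light level are invented illustrative values.
import numpy as np

rng = np.random.default_rng(42)

def simulate_charges(n_events, mu=0.5, stage_gains=(4, 3, 3)):
    """mu: mean photoelectrons per event; returns electron counts per event."""
    charges = np.zeros(n_events)
    for i, n_pe in enumerate(rng.poisson(mu, n_events)):
        electrons = n_pe
        for g in stage_gains:
            electrons = rng.poisson(g * electrons)   # stochastic amplification
        charges[i] = electrons
    return charges

charges = simulate_charges(20000)
mean_charge = charges.mean()   # expected value: mu * 4 * 3 * 3 = 18 electrons
```

Unlike a single Gaussian single-photoelectron peak, the resulting charge spectrum retains a large zero-charge population and a skewed tail, which is exactly the low-light regime where Gaussian models break down.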
A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging
NASA Astrophysics Data System (ADS)
Solomon, Justin; Samei, Ehsan
2014-11-01
Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R2) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operator characteristic (ROC) analysis. The average R2 of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems.
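The ROC area reported above (0.55, near the chance level of 0.5) can be estimated nonparametrically from observer confidence ratings via the Mann-Whitney statistic; the ratings below are invented toy data, not the study's observer scores:

```python
# Rank-based (Mann-Whitney) AUC from confidence ratings; ties count 1/2.
def auc(real_scores, simulated_scores):
    """Probability that a random real lesion is rated more 'real' than a
    random simulated one; 0.5 is chance performance."""
    wins = 0.0
    for r in real_scores:
        for s in simulated_scores:
            if r > s:
                wins += 1.0
            elif r == s:
                wins += 0.5
    return wins / (len(real_scores) * len(simulated_scores))

score = auc([3, 4, 4, 5], [3, 3, 4, 2])   # 0.8125 for these toy ratings
```

An AUC near 0.5, as in the study, means the rating distributions for real and simulated lesions overlap almost completely.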
Tena, Ana F; Fernández, Joaquín; Álvarez, Eduardo; Casan, Pere; Walters, D Keith
2017-06-01
The need for a better understanding of pulmonary diseases has led to increased interest in the development of realistic computational models of the human lung. To minimize computational cost, a reduced geometry model is used for a model lung airway geometry up to generation 16. Truncated airway branches require physiologically realistic boundary conditions to accurately represent the effect of the removed airway sections. A user-defined function has been developed that applies velocities mapped from similar locations in fully resolved airway sections. The methodology can be applied in any general-purpose computational fluid dynamics code, with the only limitation that the lung model must be symmetrical in each truncated branch. Unsteady simulations have been performed to verify the operation of the model. The test case simulates spirometry, in which the lung must rapidly perform both inspiration and expiration. Once the simulation was completed, the computed pressure in the lower level of the lung was used as a boundary condition. The output velocity, which constitutes a numerical spirometry, was compared with experimental spirometry for validation purposes. This model can be applied at a wide range of patient-specific resolution levels. If the upper airway generations have been constructed from a computed tomography scan, it would be possible to quickly obtain a complete reconstruction of the lung specific to an individual patient, which would allow individualized therapies. Copyright © 2016 John Wiley & Sons, Ltd.
Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo
2018-02-01
The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has attained increased importance nowadays. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU; multicore, GPU, and space adaptivity; multicore, GPU, space adaptivity, and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh, i.e., complex geometry, sinus-rhythm, and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
Testolin, Alberto; Zorzi, Marco
2016-01-01
Connectionist models can be characterized within the more general framework of probabilistic graphical models, which can efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262
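A restricted Boltzmann machine is a canonical example of the stochastic generative networks discussed here. A minimal contrastive-divergence (CD-1) learning step, on invented toy data and layer sizes, can be sketched as:

```python
# Minimal restricted Boltzmann machine: one CD-1 update on toy binary data.
# Layer sizes, learning rate, and data are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, b_vis, b_hid, v0, lr=0.1):
    """One contrastive-divergence (CD-1) update on a batch of visible vectors."""
    ph0 = sigmoid(v0 @ W + b_hid)                     # hidden activation probs
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # stochastic hidden sample
    pv1 = sigmoid(h0 @ W.T + b_vis)                   # reconstruction
    ph1 = sigmoid(pv1 @ W + b_hid)
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n          # positive - negative phase
    b_vis += lr * (v0 - pv1).mean(axis=0)
    b_hid += lr * (ph0 - ph1).mean(axis=0)

W = 0.01 * rng.standard_normal((6, 3))                # 6 visible, 3 hidden units
b_vis, b_hid = np.zeros(6), np.zeros(3)
data = rng.integers(0, 2, size=(8, 6)).astype(float)
W_before = W.copy()
cd1_step(W, b_vis, b_hid, data)
```

Stacking such layers yields the hierarchical deep networks mentioned above, with the top-down pass (`h0 @ W.T`) playing the role of generative, predictive feedback.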
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
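The two comparison metrics named above, maximum amplitude and power spectral density, are straightforward to compute for a pair of signals; the synthetic "measured" and "simulated" A-scans below are stand-ins for the study's actual waveforms:

```python
# Maximum-amplitude and power-spectral-density comparison of two signals.
# The waveforms and sampling rate are synthetic stand-ins, not study data.
import numpy as np

fs = 1e6                                    # sampling rate in Hz (assumed)
t = np.arange(0, 1e-3, 1 / fs)
measured = np.sin(2 * np.pi * 50e3 * t) * np.exp(-t / 2e-4)
simulated = 0.9 * measured                  # e.g. a 10% amplitude mismatch

amp_diff = abs(measured.max() - simulated.max())

def psd(x, fs):
    """One-sided periodogram estimate of the power spectral density."""
    X = np.fft.rfft(x)
    return (np.abs(X) ** 2) / (fs * len(x))

psd_err = np.sum(np.abs(psd(measured, fs) - psd(simulated, fs)))
```

Tracking both metrics, as the abstract describes, separates gross amplitude miscalibration from frequency-content differences between model and measurement.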
Zelenyak, Andreea-Manuela; Schorer, Nora; Sause, Markus G R
2018-02-01
This paper presents a method for embedding realistic defect geometries of a fiber-reinforced material in a finite element modeling environment in order to simulate active ultrasonic inspection. When ultrasonic inspection is used experimentally to investigate the presence of defects in composite materials, the microscopic defect geometry may cause signal characteristics that are difficult to interpret. Hence, modeling this interaction is key to improving our understanding and interpretation of the acquired ultrasonic signals. To model the true interaction of the ultrasonic wave field with such defect structures as pores, cracks or delaminations, a realistic three-dimensional geometry reconstruction is required. We present a 3D-image-based reconstruction process that converts computed tomography data into adequate surface representations ready to be embedded for processing with finite element methods. Subsequent modeling using these geometries applies a multi-scale and multi-physics simulation approach that yields quantitative A-scan ultrasonic signals which can be directly compared with experimental signals. Therefore, besides the properties of the composite material, a full transducer implementation, piezoelectric conversion and simultaneous modeling of the attached circuit are applied. Comparison between simulated and experimental signals provides very good agreement in electrical voltage amplitude and signal arrival time and thus validates the proposed modeling approach. Simulating ultrasound wave propagation in a medium with a realistic geometry clearly shows a difference in how the disturbance of the waves takes place and ultimately allows more realistic modeling of A-scans. Copyright © 2017 Elsevier B.V. All rights reserved.
Realistic micromechanical modeling and simulation of two-phase heterogeneous materials
NASA Astrophysics Data System (ADS)
Sreeranganathan, Arun
This dissertation research focuses on micromechanical modeling and simulations of two-phase heterogeneous materials exhibiting anisotropic and non-uniform microstructures with long-range spatial correlations. Completed work involves development of methodologies for realistic micromechanical analyses of materials using a combination of stereological techniques, two- and three-dimensional digital image processing, and finite element based modeling tools. The methodologies are developed via its applications to two technologically important material systems, namely, discontinuously reinforced aluminum composites containing silicon carbide particles as reinforcement, and boron modified titanium alloys containing in situ formed titanium boride whiskers. Microstructural attributes such as the shape, size, volume fraction, and spatial distribution of the reinforcement phase in these materials were incorporated in the models without any simplifying assumptions. Instrumented indentation was used to determine the constitutive properties of individual microstructural phases. Micromechanical analyses were performed using realistic 2D and 3D models and the results were compared with experimental data. Results indicated that 2D models fail to capture the deformation behavior of these materials and 3D analyses are required for realistic simulations. The effect of clustering of silicon carbide particles and associated porosity on the mechanical response of discontinuously reinforced aluminum composites was investigated using 3D models. Parametric studies were carried out using computer simulated microstructures incorporating realistic microstructural attributes. 
The intrinsic merit of this research is the development and integration of the required enabling techniques and methodologies for representation, modeling, and simulations of complex geometry of microstructures in two- and three-dimensional space facilitating better understanding of the effects of microstructural geometry on the mechanical behavior of materials.
Loudos, George K; Papadimitroulas, Panagiotis G; Kagadis, George C
2014-01-01
Monte Carlo (MC) simulations play a crucial role in nuclear medical imaging since they can provide the ground truth for clinical acquisitions by integrating and quantifying all physical parameters that affect image quality. Over the last decade, a number of realistic computational anthropomorphic models have been developed to serve imaging as well as other biomedical engineering applications. The combination of MC techniques with realistic computational phantoms can provide a powerful tool for pre- and post-processing in imaging, data analysis and dosimetry. This work aims to create a global database of simulated Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) exams; the methodology, as well as its first elements, is presented. Simulations are performed using the well-validated GATE open-source toolkit, standard anthropomorphic phantoms, and activity distributions of various radiopharmaceuticals derived from the literature. The resulting images, projections and sinograms of each study are provided in the database and can be further exploited to evaluate processing and reconstruction algorithms. Patient studies with different characteristics are included in the database, and different computational phantoms were tested for the same acquisitions. These include the XCAT, Zubal and Virtual Family phantoms, some of which are used for the first time in nuclear imaging. The created database will be freely available, and our current work is directed toward its extension by simulating additional clinical pathologies.
A Theory of Eye Movements during Target Acquisition
ERIC Educational Resources Information Center
Zelinsky, Gregory J.
2008-01-01
The gaze movements accompanying target localization were examined via human observers and a computational model (target acquisition model [TAM]). Search contexts ranged from fully realistic scenes to toys in a crib to Os and Qs, and manipulations included set size, target eccentricity, and target-distractor similarity. Observers and the model…
L. Linsen; B.J. Karis; E.G. McPherson; B. Hamann
2005-01-01
In computer graphics, models describing the fractal branching structure of trees typically exploit the modularity of tree structures. The models are based on local production rules, which are applied iteratively and simultaneously to create a complex branching system. The objective is to generate three-dimensional scenes of often many realistic-looking and non-...
ERIC Educational Resources Information Center
Blanco, Francesco; La Rocca, Paola; Petta, Catia; Riggi, Francesco
2009-01-01
An educational model simulation of the sound produced by lightning in the sky has been employed to demonstrate realistic signatures of thunder and its connection to the particular structure of the lightning channel. Algorithms used in the past have been revisited and implemented, making use of current computer techniques. The basic properties of…
Fluid-Structure Interaction Analysis of Ruptured Mitral Chordae Tendineae.
Toma, Milan; Bloodworth, Charles H; Pierce, Eric L; Einstein, Daniel R; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S
2017-03-01
The chordal structure is a part of mitral valve geometry that has been commonly neglected or simplified in computational modeling due to its complexity. However, these simplifications cannot be used when investigating the roles of individual chordae tendineae in mitral valve closure. For the first time, advancements in imaging, computational techniques, and hardware technology make it possible to create models of the mitral valve without simplifications to its complex geometry, and to quickly run validated computer simulations that more realistically capture its function. Such simulations can then be used for a detailed analysis of chordae-related diseases. In this work, a comprehensive model of a subject-specific mitral valve with detailed chordal structure is used to analyze the distinct role played by individual chordae in closure of the mitral valve leaflets. Mitral closure was simulated for 51 possible chordal rupture points. Resultant regurgitant orifice area and strain change in the chordae at the papillary muscle tips were then calculated to examine the role of each ruptured chorda in the mitral valve closure. For certain subclassifications of chordae, regurgitant orifice area was found to trend positively with ruptured chordal diameter, and strain changes correlated negatively with regurgitant orifice area. Further advancements in clinical imaging modalities, coupled with the next generation of computational techniques, will enable more physiologically realistic simulations.
Cortical Spiking Network Interfaced with Virtual Musculoskeletal Arm and Robotic Arm.
Dura-Bernal, Salvador; Zhou, Xianlian; Neymotin, Samuel A; Przekwas, Andrzej; Francis, Joseph T; Lytton, William W
2015-01-01
Embedding computational models in the physical world is a critical step towards constraining their behavior and building practical applications. Here we aim to drive a realistic musculoskeletal arm model using a biomimetic cortical spiking model, and make a robot arm reproduce the same trajectories in real time. Our cortical model consisted of a 3-layered cortex, composed of several hundred spiking model-neurons, which display physiologically realistic dynamics. We interconnected the cortical model to a two-joint musculoskeletal model of a human arm, with realistic anatomical and biomechanical properties. The virtual arm received muscle excitations from the neuronal model, and fed back proprioceptive information, forming a closed-loop system. The cortical model was trained using spike-timing-dependent reinforcement learning to drive the virtual arm in a 2D reaching task. Limb position was used to simultaneously control a robot arm using an improved network interface. Virtual arm muscle activations responded to motoneuron firing rates, with virtual arm muscle lengths encoded via population coding in the proprioceptive population. After training, the virtual arm performed reaching movements which were smoother and more realistic than those obtained using a simplistic arm model. This system provided access to both spiking network properties and to arm biophysical properties, including muscle forces. The use of a musculoskeletal virtual arm and the improved control system allowed the robot arm to perform movements which were smoother than those reported in our previous paper using a simplistic arm. This work provides a novel approach consisting of bidirectionally connecting a cortical model to a realistic virtual arm, and using the system output to drive a robotic arm in real time. Our techniques are applicable to the future development of brain neuroprosthetic control systems, and may enable enhanced brain-machine interfaces with the possibility for finer control of limb prosthetics.
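The training rule described above, spike-timing-dependent reinforcement learning, can be sketched for a single synapse. The following is a generic reward-modulated STDP toy in Python, not the paper's actual implementation; all parameter values and the event sequence are illustrative:

```python
import math

# Minimal reward-modulated STDP sketch (one synapse): spike-pair timing
# builds an eligibility trace, and a global reward signal converts the
# trace into a weight change.  Parameter values are illustrative, not
# those of the paper's model.
A_PLUS, A_MINUS = 0.01, 0.012   # STDP amplitudes
TAU = 20.0                      # STDP time constant (ms)
TAU_ELIG = 100.0                # eligibility-trace decay (ms)
LR = 0.5                        # learning rate applied to the reward

def stdp(dt_ms):
    """Pre-before-post (dt > 0) potentiates; post-before-pre depresses."""
    if dt_ms > 0:
        return A_PLUS * math.exp(-dt_ms / TAU)
    return -A_MINUS * math.exp(dt_ms / TAU)

w, elig = 0.5, 0.0
events = [(+5.0, 0.0), (+8.0, 0.0), (+6.0, 1.0)]  # (post-pre dt, reward)
for dt, reward in events:
    elig = elig * math.exp(-10.0 / TAU_ELIG) + stdp(dt)  # decay + new pairing
    w += LR * reward * elig                              # reward gates learning
    print(f"dt={dt:+.0f} ms reward={reward:.0f} -> w={w:.4f}")
```

Note that the weight only moves on the rewarded event; unrewarded pairings merely charge the eligibility trace, which is what makes the rule a reinforcement scheme rather than plain STDP.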
Robust mode space approach for atomistic modeling of realistically large nanowire transistors
NASA Astrophysics Data System (ADS)
Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard
2018-01-01
Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight-binding non-equilibrium Green's function simulation of a nanowire metal-oxide-semiconductor field-effect transistor (MOSFET) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.
NASA Technical Reports Server (NTRS)
Cohen, C.
1981-01-01
A hierarchy of experiments was run, starting with an all-water planet with zonally symmetric sea surface temperatures, then adding, one at a time, flat continents, mountains, surface physics, and realistic sea surface temperatures. The model was run with the sun fixed at a perpetual January. Ensemble means and standard deviations were computed, and the t-test was used to determine the statistical significance of the results. The addition of realistic surface physics does not affect the model climatology to as large an extent as does the addition of mountains. Departures from zonal symmetry of the SST field result in a better simulation of the real atmosphere.
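The significance testing described above can be illustrated with a minimal sketch: ensemble means and standard deviations are computed for two hypothetical model configurations and compared with a two-sample t-test. The synthetic data, sample sizes, and the Welch variant of the test are assumptions for illustration, not taken from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical ensembles of a January-mean statistic (e.g. zonal-mean
# temperature at one grid point) from two model configurations.
flat_continents = rng.normal(loc=250.0, scale=2.0, size=10)   # K
with_mountains  = rng.normal(loc=253.0, scale=2.0, size=10)   # K

# Ensemble means and standard deviations, as in the experiment hierarchy.
for name, ens in [("flat", flat_continents), ("mtns", with_mountains)]:
    print(f"{name}: mean={ens.mean():.2f} K, std={ens.std(ddof=1):.2f} K")

# Two-sample (Welch) t-test: is the configuration difference significant?
t, p = stats.ttest_ind(flat_continents, with_mountains, equal_var=False)
print(f"t={t:.2f}, p={p:.4f}")
```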
CG2Real: Improving the Realism of Computer Generated Images Using a Large Collection of Photographs.
Johnson, Micah K; Dale, Kevin; Avidan, Shai; Pfister, Hanspeter; Freeman, William T; Matusik, Wojciech
2011-09-01
Computer-generated (CG) images have achieved high levels of realism. This realism, however, comes at the cost of long and expensive manual modeling, and often humans can still distinguish between CG and real images. We introduce a new data-driven approach for rendering realistic imagery that uses a large collection of photographs gathered from online repositories. Given a CG image, we retrieve a small number of real images with similar global structure. We identify corresponding regions between the CG and real images using a mean-shift cosegmentation algorithm. The user can then automatically transfer color, tone, and texture from matching regions to the CG image. Our system only uses image processing operations and does not require a 3D model of the scene, making it fast and easy to integrate into digital content creation workflows. Results of a user study show that our hybrid images appear more realistic than the originals.
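A minimal stand-in for the color-transfer step can be sketched as global per-channel statistics matching (Reinhard-style). The paper's system transfers color per cosegmented region, so this is only the simplest version of the idea; the array shapes and data below are invented for illustration:

```python
import numpy as np

def match_color_stats(cg, real):
    """Shift/scale each channel of `cg` so its mean and std match `real`.
    A global, numpy-only stand-in for the per-region color transfer the
    paper performs after cosegmentation.  Arrays are float, shape (H, W, 3)."""
    out = np.empty_like(cg)
    for c in range(cg.shape[2]):
        mu_c, sd_c = cg[..., c].mean(), cg[..., c].std()
        mu_r, sd_r = real[..., c].mean(), real[..., c].std()
        out[..., c] = (cg[..., c] - mu_c) * (sd_r / (sd_c + 1e-8)) + mu_r
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(1)
cg_img   = rng.uniform(0.2, 0.4, size=(8, 8, 3))   # dull synthetic "CG" patch
real_img = rng.uniform(0.1, 0.9, size=(8, 8, 3))   # wider-range "photo" patch
out = match_color_stats(cg_img, real_img)
print(out.mean(axis=(0, 1)))  # per-channel means pulled toward real_img's
```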
Roche, Benjamin; Guégan, Jean-François; Bousquet, François
2008-10-15
Computational biology is often associated with genetic or genomic studies only. However, thanks to the increase in computational resources, computational models are appreciated as useful tools in many other scientific fields. Such modeling systems are particularly relevant for the study of complex systems, like the epidemiology of emerging infectious diseases. So far, mathematical models remain the main tool for the epidemiological and ecological analysis of infectious diseases, with SIR models seen as an implicit standard in epidemiology. Unfortunately, these models are based on differential equations and can therefore very rapidly become unmanageable because of the many parameters that need to be taken into consideration. For instance, in the case of zoonotic and vector-borne diseases in wildlife, many different potential host species could be involved in the life-cycle of disease transmission, and SIR models might not be the most suitable tool to truly capture the overall disease circulation within that environment. This limitation underlines the necessity to develop a standard spatial model that can cope with the transmission of disease in realistic ecosystems. Computational biology may prove to be flexible enough to take into account the natural complexity observed in both natural and man-made ecosystems. In this paper, we propose a new computational model to study the transmission of infectious diseases in a spatially explicit context. We developed a multi-agent system model for vector-borne disease transmission in a realistic spatial environment. Here we describe in detail the general behavior of this model, which we hope will become a standard reference for the study of vector-borne disease transmission in wildlife. To conclude, we show how this simple model could be easily adapted and modified to be used as a common framework for further research developments in this field.
Investigation of the effects of aeroelastic deformations on the radar cross section of aircraft
NASA Astrophysics Data System (ADS)
McKenzie, Samuel D.
1991-12-01
The effects of aeroelastic deformations on the radar cross section (RCS) of a T-38 trainer jet and a C-5A transport aircraft are examined and characterized. Realistic representations of structural wing deformations are obtained from a mechanical/computer-aided design software package called NASTRAN. NASTRAN is used to evaluate the structural parameters of the aircraft as well as the restraints and loads associated with realistic flight conditions. Geometries for both the non-deformed and deformed airframes are obtained from the NASTRAN models and translated into RCS models. The RCS is analyzed using a numerical modeling code called the Radar Cross Section - Basic Scattering Code, version 2, which was developed at the Ohio State University and is based on the uniform geometric theory of diffraction. The code is used to analyze the effects of aeroelastic deformations on the RCS of the aircraft by comparing the computed RCS representing the deformed airframe to that of the non-deformed airframe and characterizing the differences between them.
Stimulation from Simulation? A Teaching Model of Hillslope Hydrology for Use on Microcomputers.
ERIC Educational Resources Information Center
Burt, Tim; Butcher, Dave
1986-01-01
The design and use of a simple computer model which simulates hillslope hydrology is described in a teaching context. The model shows that a relatively complex environmental system can be constructed on the basis of a simple but realistic theory, thus allowing students to simulate the hydrological response of real hillslopes. (Author/TRS)
NASA Astrophysics Data System (ADS)
Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertania, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.
2017-06-01
The charged particle densities obtained from CORSIKA-simulated EAS, using the QGSJet-II.04 hadronic interaction model, are used for primary energy reconstruction. Simulated data are reconstructed using Lateral Energy Correction Functions computed with a new realistic model of the Grande stations implemented in Geant4.10.
Na, Okpin; Cai, Xiao-Chuan; Xi, Yunping
2017-01-01
Predicting chloride-induced corrosion is important for assessing the durable life of concrete structures. Simulating the durability performance of concrete structures realistically requires sophisticated methods and accurate material models. A large number of fine meshes is also needed to predict the corrosion initiation time robustly and to resolve the thin layer between the concrete surface and the reinforcement. The purpose of this study is to propose a more realistic physical model of coupled hygro-chemo transport and to implement it with a parallel finite element algorithm. Furthermore, a microclimate model with environmental humidity and seasonal temperature is adopted. As a result, a prediction model of chloride diffusion under unsaturated conditions was developed with parallel algorithms and applied to an existing bridge to validate the model with multiple boundary conditions. As the number of processors increased, the computational time decreased until the processor count was optimal; beyond that point it increased again because of growing inter-processor communication time. The framework of the present model can be extended in future work to simulate multi-species de-icing salt ingress into non-saturated concrete structures. PMID:28772714
NASA Astrophysics Data System (ADS)
De Geeter, N.; Crevecoeur, G.; Dupré, L.; Van Hecke, W.; Leemans, A.
2012-04-01
Accurate simulations on detailed realistic head models are necessary to gain a better understanding of the response to transcranial magnetic stimulation (TMS). Hitherto, head models with simplified geometries and constant isotropic material properties are often used, whereas some biological tissues have anisotropic characteristics which vary naturally with frequency. Moreover, most computational methods do not take the tissue permittivity into account. Therefore, we calculate the electromagnetic behaviour due to TMS in a head model with realistic geometry and where realistic dispersive anisotropic tissue properties are incorporated, based on T1-weighted and diffusion-weighted magnetic resonance images. This paper studies the impact of tissue anisotropy, permittivity and frequency dependence, using the anisotropic independent impedance method. The results show that anisotropy yields differences up to 32% and 19% of the maximum induced currents and electric field, respectively. Neglecting the permittivity values leads to a decrease of about 72% and 24% of the maximum currents and field, respectively. Implementing the dispersive effects of biological tissues results in a difference of 6% of the maximum currents. The cerebral voxels show limited sensitivity of the induced electric field to changes in conductivity and permittivity, whereas the field varies approximately linearly with frequency. These findings illustrate the importance of including each of the above parameters in the model and confirm the need for accuracy in the applied patient-specific method, which can be used in computer-assisted TMS.
NASA Technical Reports Server (NTRS)
Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo
2015-01-01
Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
Multi-Scale Computational Models for Electrical Brain Stimulation
Seo, Hyeon; Jun, Sung C.
2017-01-01
Electrical brain stimulation (EBS) is an appealing method to treat neurological disorders. To achieve optimal stimulation effects and a better understanding of the underlying brain mechanisms, neuroscientists have proposed computational modeling studies for a decade. Recently, multi-scale models that combine a volume conductor head model and multi-compartmental models of cortical neurons have been developed to predict stimulation effects on the macroscopic and microscopic levels more precisely. As the need for better computational models continues to increase, we overview here recent multi-scale modeling studies, focusing on approaches that coupled a simplified or high-resolution volume conductor head model with multi-compartmental models of cortical neurons and constructed realistic fiber models using diffusion tensor imaging (DTI). Further implications for achieving better precision in estimating cellular responses are discussed. PMID:29123476
NASA Technical Reports Server (NTRS)
Narayan, K. Lakshmi; Kelton, K. F.; Ray, C. S.
1996-01-01
Heterogeneous nucleation and its effects on the crystallization of lithium disilicate glass containing small amounts of Pt are investigated. Measurements of the nucleation frequencies and induction times with and without Pt are shown to be consistent with predictions based on the classical nucleation theory. A realistic computer model for the transformation is presented. Computed differential thermal analysis data (such as crystallization rates as a function of time and temperature) are shown to be in good agreement with experimental results. This modeling provides a new, more quantitative method for analyzing calorimetric data.
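The classical nucleation theory referred to above predicts a steady-state rate controlled by the work of forming a critical spherical nucleus. A minimal sketch, with the kinetic (diffusion) terms folded into a single prefactor and all numbers illustrative rather than fitted to lithium disilicate:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def critical_work(sigma, dg_v):
    """Classical-nucleation-theory barrier for a spherical nucleus:
    W* = 16*pi*sigma^3 / (3*dg_v^2), with sigma the interfacial energy
    (J/m^2) and dg_v the driving free energy per unit volume (J/m^3)."""
    return 16.0 * math.pi * sigma**3 / (3.0 * dg_v**2)

def nucleation_rate(prefactor, sigma, dg_v, temp):
    """Steady-state rate I = A * exp(-W*/kT); kinetic terms are folded
    into the prefactor A, and dg_v is held fixed for simplicity (in
    reality it grows in magnitude with undercooling)."""
    return prefactor * math.exp(-critical_work(sigma, dg_v) / (K_B * temp))

# Illustrative (not fitted) numbers of a plausible order for an oxide glass.
sigma, dg_v, A = 0.15, 4.0e8, 1.0e39
for temp in (700.0, 750.0, 800.0):
    print(f"T = {temp:.0f} K  I = {nucleation_rate(A, sigma, dg_v, temp):.3e} m^-3 s^-1")
```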
Analysis and Modeling of Realistic Compound Channels in Transparent Relay Transmissions
Kanjirathumkal, Cibile K.; Mohammed, Sameer S.
2014-01-01
Analytical approaches for the characterisation of the compound channels in transparent multihop relay transmissions over independent fading channels are considered in this paper. Compound channels with homogeneous links are considered first. Using the Mellin transform technique, exact expressions are derived for the moments of cascaded Weibull distributions. Subsequently, two performance metrics, namely, coefficient of variation and amount of fade, are derived using the computed moments. These metrics quantify the possible variations in the channel gain and signal to noise ratio from their respective average values and can be used to characterise the achievable receiver performance. This approach is suitable for analysing more realistic compound channel models for scattering density variations of the environment, experienced in multihop relay transmissions. The performance metrics for such heterogeneous compound channels, having a distinct distribution in each hop, are computed and compared with those having identical constituent component distributions. The moments and the coefficient of variation computed are then used to develop computationally efficient estimators for the distribution parameters and the optimal hop count. The metrics and estimators proposed are complemented with numerical and simulation results to demonstrate the accuracy of the approaches. PMID:24701175
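Because the per-hop gains are independent, the moments of the cascaded (product) channel factor into per-hop Weibull moments, from which the coefficient of variation and amount of fade follow. A minimal sketch under the common convention that SNR is proportional to the squared gain (the paper's exact definitions may differ):

```python
import math

def weibull_moment(n, scale, shape):
    """n-th raw moment of a Weibull(scale, shape) variable:
    E[X^n] = scale^n * Gamma(1 + n/shape)."""
    return scale**n * math.gamma(1.0 + n / shape)

def cascade_moment(n, hops):
    """n-th raw moment of the product of independent per-hop gains.
    By independence, moments of a product multiply."""
    m = 1.0
    for scale, shape in hops:
        m *= weibull_moment(n, scale, shape)
    return m

def coefficient_of_variation(hops):
    """CV of the compound channel gain: sqrt(E[X^2]/E[X]^2 - 1)."""
    m1, m2 = cascade_moment(1, hops), cascade_moment(2, hops)
    return math.sqrt(m2 / m1**2 - 1.0)

def amount_of_fade(hops):
    """AF of the SNR, taken proportional to the squared gain,
    so it uses the 2nd and 4th gain moments."""
    m2, m4 = cascade_moment(2, hops), cascade_moment(4, hops)
    return m4 / m2**2 - 1.0

hops = [(1.0, 2.0), (1.0, 2.5), (1.0, 3.0)]  # (scale, shape) per hop
print(coefficient_of_variation(hops), amount_of_fade(hops))
```

As a sanity check, a single hop with shape 2 is Rayleigh fading, for which the amount of fade is the textbook value of 1.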
Effects of convection electric field on upwelling and escape of ionospheric O(+)
NASA Technical Reports Server (NTRS)
Cladis, J. B.; Chiu, Yam T.; Peterson, William K.
1992-01-01
A Monte Carlo code is used to explore the full effects of the convection electric field on distributions of upflowing O(+) ions from the cusp/cleft ionosphere. Trajectories of individual ions/neutrals are computed as they undergo multiple charge-exchange collisions. In the ion state, the trajectories are computed in realistic models of the magnetic field and the convection, corotation, and ambipolar electric fields. The effects of ion-ion collisions are included, and the trajectories are computed with and without simultaneous stochastic heating perpendicular to the magnetic field by a realistic model of broadband, low-frequency waves. In the neutral state, ballistic trajectories in the gravitational field are computed. The initial conditions of the ions, in addition to the ambipolar electric field and the number densities and temperatures of O(+), H(+), and electrons as a function of height in the cusp/cleft region, were obtained from the results of Gombosi and Killeen (1987), who used a hydrodynamic code to simulate the time-dependent frictional-heating effects in a magnetic tube during its motion through the convection throat. The distribution of the ion fluxes as a function of height is constructed from the case histories.
Shaded-Color Picture Generation of Computer-Defined Arbitrary Shapes
NASA Technical Reports Server (NTRS)
Cozzolongo, J. V.; Hermstad, D. L.; Mccoy, D. S.; Clark, J.
1986-01-01
SHADE computer program generates realistic color-shaded pictures from computer-defined arbitrary shapes. Objects defined for computer representation are displayed as smooth, color-shaded surfaces, including varying degrees of transparency. The results are also used for presentation of computational results. By performing color mapping, SHADE colors the model surface to display analysis results such as pressures, stresses, and temperatures. NASA has used SHADE extensively in the design and analysis of high-performance aircraft. Industry should find applications for SHADE in computer-aided design and computer-aided manufacturing. SHADE is written in VAX FORTRAN and MACRO Assembler for either interactive or batch execution.
Cloud immersion building shielding factors for US residential structures.
Dickson, E D; Hamby, D M
2014-12-01
This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basement, as well as for single-wide manufactured housing-units.
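The weighted-average representative factor can be illustrated with a short sketch. The occupancy weighting scheme and all numbers below are hypothetical; the paper derives its weights from the detailed housing-unit models:

```python
def representative_shielding_factor(floor_factors, weights):
    """Weighted average of floor-specific shielding factors.
    A shielding factor is (dose rate inside) / (dose rate in the open),
    so smaller means more protection.  The occupancy weighting used here
    is an assumption for illustration."""
    assert len(floor_factors) == len(weights)
    total = sum(weights)
    return sum(f * w for f, w in zip(floor_factors, weights)) / total

# Hypothetical two-story home with basement (all values illustrative).
factors   = {"basement": 0.05, "first": 0.4, "second": 0.5}
occupancy = {"basement": 0.1,  "first": 0.6, "second": 0.3}
sf = representative_shielding_factor(list(factors.values()),
                                     list(occupancy.values()))
print(f"representative SF = {sf:.3f}")  # 0.05*0.1 + 0.4*0.6 + 0.5*0.3 = 0.395
```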
Fang, Yibin; Yu, Ying; Cheng, Jiyong; Wang, Shengzhang; Wang, Kuizhong; Liu, Jian-Min; Huang, Qinghai
2013-01-01
Adjusting hemodynamics via flow diverter (FD) implantation is emerging as a novel method of treating cerebral aneurysms. However, most previous FD-related hemodynamic studies were based on virtual FD deployment, which may produce different hemodynamic outcomes than realistic (in vivo) FD deployment. We compared hemodynamics between virtual FD and realistic FD deployments in rabbit aneurysm models using computational fluid dynamics (CFD) simulations. FDs were implanted for aneurysms in 14 rabbits. Vascular models based on rabbit-specific angiograms were reconstructed for CFD studies. Real FD configurations were reconstructed based on micro-CT scans after sacrifice, while virtual FD configurations were constructed with SolidWorks software. Hemodynamic parameters before and after FD deployment were analyzed. According to the metal coverage (MC) of implanted FDs calculated based on micro-CT reconstruction, 14 rabbits were divided into two groups (A, MC >35%; B, MC <35%). Normalized mean wall shear stress (WSS), relative residence time (RRT), inflow velocity, and inflow volume in Group A were significantly different (P<0.05) from virtual FD deployment, but pressure was not (P>0.05). The normalized mean WSS in Group A after realistic FD implantation was significantly lower than that of Group B. All parameters in Group B exhibited no significant difference between realistic and virtual FDs. This study confirmed MC-correlated differences in hemodynamic parameters between realistic and virtual FD deployment. PMID:23823503
[The research on bidirectional reflectance computer simulation of forest canopy at pixel scale].
Song, Jin-Ling; Wang, Jin-Di; Shuai, Yan-Min; Xiao, Zhi-Qiang
2009-08-01
Computer simulation is based on computer graphics to generate a realistic 3D structural scene of vegetation and to simulate the canopy radiation regime using the radiosity method. In the present paper, the authors extend the computer simulation model to simulate forest canopy bidirectional reflectance at the pixel scale. Trees, however, are complex structures that are tall and have many branches, so hundreds of thousands or even millions of facets are usually needed to build a realistic structure scene for a forest, and it is difficult for the radiosity method to compute so many facets. To make the radiosity method applicable to a forest scene at the pixel scale, the authors proposed simplifying the structure of the forest crowns by abstracting the crowns as ellipsoids. Based on the optical characteristics of the tree components and the internal energy transport of photons in real crowns, the authors assigned optical characteristics to the ellipsoid surface facets. Following the idea of geometric-optics models, a gap model is incorporated in the computer simulation of the forest to obtain the canopy bidirectional reflectance at the pixel scale. Comparing the computer simulation results with the GOMS model and Multi-angle Imaging SpectroRadiometer (MISR) multi-angle remote sensing data, the simulation results agree with the GOMS simulation results and the MISR BRF, although some problems remain to be solved. The authors conclude that the study has important value for the application of multi-angle remote sensing and the inversion of vegetation canopy structure parameters.
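A generic gap model of the kind mentioned above is the Beer-law gap probability through a canopy layer. This is a textbook stand-in, not necessarily the formulation used by the authors:

```python
import math

def gap_fraction(lai, theta_deg, g=0.5):
    """Beer-law gap probability through a canopy layer:
    P_gap(theta) = exp(-G * LAI / cos(theta)), with G the leaf projection
    function (0.5 for a spherical leaf-angle distribution) and LAI the
    leaf area index.  A generic stand-in for the gap model coupled with
    the simplified ellipsoid crowns; the paper's own form may differ."""
    theta = math.radians(theta_deg)
    return math.exp(-g * lai / math.cos(theta))

# Gap fraction shrinks as the view path through the canopy lengthens.
for theta in (0.0, 30.0, 60.0):
    print(f"view zenith {theta:4.1f} deg -> P_gap = {gap_fraction(3.0, theta):.3f}")
```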
Turbofan Duct Propagation Model
NASA Technical Reports Server (NTRS)
Lan, Justin H.; Posey, Joe W. (Technical Monitor)
2001-01-01
The CDUCT code utilizes a parabolic approximation to the convected Helmholtz equation in order to efficiently model acoustic propagation in acoustically treated, complex-shaped ducts. The parabolic approximation solves one-way wave propagation with a marching method which neglects backwards-reflected waves. The derivation of the parabolic approximation is presented. Several code validation cases are given. An acoustic lining design process for an example aft fan duct is discussed. It is noted that the method can efficiently model realistic three-dimensional effects, acoustic lining, and flow within the computational capabilities of a typical computer workstation.
NASA Technical Reports Server (NTRS)
Austin, F.; Markowitz, J.; Goldenberg, S.; Zetkov, G. A.
1973-01-01
The formulation of a mathematical model for predicting the dynamic behavior of rotating flexible space station configurations was conducted. The overall objectives of the study were: (1) to develop the theoretical techniques for determining the behavior of a realistically modeled rotating space station, (2) to provide a versatile computer program for the numerical analysis, and (3) to present practical concepts for experimental verification of the analytical results. The mathematical model and its associated computer program are described.
NASA Astrophysics Data System (ADS)
Dünser, Simon; Meyer, Daniel W.
2016-06-01
In most groundwater aquifers, dispersion of tracers is dominated by flow-field inhomogeneities resulting from the underlying heterogeneous conductivity or transmissivity field. This effect is referred to as macrodispersion. Since in practice, besides a few point measurements, the complete conductivity field is virtually never available, a probabilistic treatment is needed. To quantify the uncertainty in tracer concentrations from a given geostatistical model for the conductivity, Monte Carlo (MC) simulation is typically used. To avoid the excessive computational costs of MC, the polar Markovian velocity process (PMVP) model was recently introduced, delivering predictions at about three orders of magnitude smaller computing times. In artificial test cases, the PMVP model has provided good results in comparison with MC. In this study, we further validate the model in a more challenging and realistic setup. The setup considered is derived from the well-known benchmark macrodispersion experiment (MADE), which is highly heterogeneous and non-stationary with a large number of unevenly scattered conductivity measurements. Validations were done against reference MC and good overall agreement was found. Moreover, simulations of a simplified setup with a single measurement were conducted in order to reassess the model's most fundamental assumptions and to provide guidance for model improvements.
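The macrodispersion mechanism, ensemble spread induced by conductivity uncertainty, can be illustrated with a deliberately simplified Monte Carlo toy in which each realization has a single uniform velocity drawn from a lognormal conductivity model. Real MC simulations of this kind use spatially correlated heterogeneous fields; everything below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy Monte Carlo over conductivity realizations: in each realization the
# (uniform) Darcy velocity is proportional to a lognormal conductivity,
# and a tracer particle is simply advected with it.  The spread of particle
# positions across realizations is the macrodispersion effect.
n_real = 5000
log_k = rng.normal(loc=0.0, scale=0.5, size=n_real)   # ln K, geostatistical model
velocity = 1.0 * np.exp(log_k)                        # v proportional to K

for t in (1.0, 2.0, 4.0):
    x = velocity * t                                  # particle position per realization
    print(f"t={t:.0f}: mean={x.mean():.2f}, std={x.std():.2f}")
```

In this uncorrelated toy the spread grows linearly in time (purely ballistic); with finite correlation lengths the growth eventually transitions toward a Fickian regime.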
NASA Astrophysics Data System (ADS)
Ingley, Spencer J.; Rahmani Asl, Mohammad; Wu, Chengde; Cui, Rongfeng; Gadelhak, Mahmoud; Li, Wen; Zhang, Ji; Simpson, Jon; Hash, Chelsea; Butkowski, Trisha; Veen, Thor; Johnson, Jerald B.; Yan, Wei; Rosenthal, Gil G.
2015-12-01
Experimental approaches to studying behaviors based on visual signals are ubiquitous, yet these studies are limited by the difficulty of combining realistic models with the manipulation of signals in isolation. Computer animations are a promising way to break this trade-off. However, animations are often prohibitively expensive and difficult to program, thus limiting their utility in behavioral research. We present anyFish 2.0, a user-friendly platform for creating realistic animated 3D fish. anyFish 2.0 dramatically expands anyFish's utility by allowing users to create animations of members of several groups of fish from model systems in ecology and evolution (e.g., sticklebacks, Poeciliids, and zebrafish). The visual appearance and behaviors of the model can easily be modified. We have added several features that facilitate more rapid creation of realistic behavioral sequences. anyFish 2.0 provides a powerful tool that will be of broad use in animal behavior and evolution and serves as a model for transparency, repeatability, and collaboration.
NASA Astrophysics Data System (ADS)
Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki
2017-12-01
This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), merely as a case study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of cases of MJO events for numerical simulations, in addition to integrating time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.
Synthesized interstitial lung texture for use in anthropomorphic computational phantoms
NASA Astrophysics Data System (ADS)
Becchetti, Marc F.; Solomon, Justin B.; Segars, W. Paul; Samei, Ehsan
2016-04-01
A realistic model of the anatomical texture of the pulmonary interstitium was developed with the goal of extending the capability of anthropomorphic computational phantoms (e.g., XCAT, Duke University), allowing for more accurate image quality assessment. Contrast-enhanced, high-dose thorax images for a healthy patient from a clinical CT system (Discovery CT750HD, GE Healthcare) with thin (0.625 mm) slices and filtered back-projection (FBP) were used to inform the model. The interstitium, which gives rise to the texture, was defined using 24 volumes of interest (VOIs). These VOIs were selected manually to avoid vasculature, bronchi, and bronchioles. A small-scale Hessian-based line filter was applied to minimize the amount of partial-volumed supernumerary vessels and bronchioles within the VOIs. The texture in the VOIs was characterized using 8 Haralick and 13 gray-level run-length features. A clustered lumpy background (CLB) model, with noise and blurring added to match the CT system, was optimized to resemble the texture in the VOIs using a genetic algorithm with the Mahalanobis distance as a similarity metric between the texture features. The most similar CLB model was then used to generate the interstitial texture to fill the lung. The optimization improved the similarity by 45%. This will substantially enhance the capabilities of anthropomorphic computational phantoms, allowing for more realistic CT simulations.
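The similarity metric driving the optimization above, the Mahalanobis distance between texture-feature vectors, can be sketched as follows. This is a minimal illustration, not the authors' code; the feature values are synthetic stand-ins for the measured Haralick and run-length features.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of a feature vector x from a reference distribution."""
    diff = x - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# Synthetic stand-ins for texture features measured in the 24 VOIs (hypothetical).
rng = np.random.default_rng(0)
ref_features = rng.normal(size=(24, 5))   # 24 VOIs x 5 features
mean = ref_features.mean(axis=0)
cov = np.cov(ref_features, rowvar=False)

# A candidate CLB texture whose features equal the VOI mean scores a distance of 0;
# a genetic algorithm can minimize this value to match the measured texture.
best_score = mahalanobis(mean, mean, cov)
```

Unlike a plain Euclidean distance, this metric accounts for the scale of, and correlations between, the individual texture features.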
Mechanical Modeling and Computer Simulation of Protein Folding
ERIC Educational Resources Information Center
Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene
2014-01-01
In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…
Computational Labs Using VPython Complement Conventional Labs in Online and Regular Physics Classes
NASA Astrophysics Data System (ADS)
Bachlechner, Martina E.
2009-03-01
Fairmont State University has developed online physics classes for the high-school teaching certificate based on the textbook Matter and Interactions by Chabay and Sherwood. This led to the use of computational VPython labs in the traditional classroom setting as well, to complement conventional labs. The computational modeling process has proven to provide an excellent basis for the subsequent conventional lab and allows for a concrete experience of the difference between behavior according to a model and realistic behavior. Observations in the regular classroom setting feed back into the development of the online classes.
Computation of the unsteady facilitated transport of oxygen in hemoglobin
NASA Technical Reports Server (NTRS)
Davis, Sanford
1990-01-01
The transport of a reacting permeant diffusing through a thin membrane is extended to more realistic dissociation models. A new nonlinear analysis of the reaction-diffusion equations, using implicit finite-difference methods and direct block solvers, is used to study the limits of linearized and equilibrium theories. Computed curves of molecular oxygen permeating through hemoglobin solution are used to illustrate higher-order reaction models, the effect of concentration boundary layers at the membrane interfaces, and the transient buildup of oxygen flux.
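The implicit finite-difference approach mentioned above can be illustrated with a single backward-Euler step of a 1D diffusion equation, solved directly with the Thomas (tridiagonal) algorithm. This is a simplified linear sketch, not the paper's coupled reaction-diffusion solver.

```python
import numpy as np

def implicit_diffusion_step(u, d, dt, dx):
    """One backward-Euler step of u_t = d * u_xx with fixed (Dirichlet) ends,
    solved directly with the Thomas algorithm for the tridiagonal system."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    r = d * dt / dx**2
    a = np.full(n, -r)                 # sub-diagonal
    b = np.full(n, 1.0 + 2.0 * r)      # diagonal
    c = np.full(n, -r)                 # super-diagonal
    b[0] = b[-1] = 1.0; c[0] = 0.0; a[-1] = 0.0   # hold boundary values fixed
    rhs = u.copy()
    for i in range(1, n):              # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        rhs[i] -= w * rhs[i - 1]
    x = np.empty(n)
    x[-1] = rhs[-1] / b[-1]
    for i in range(n - 2, -1, -1):     # back substitution
        x[i] = (rhs[i] - c[i] * x[i + 1]) / b[i]
    return x

steady = np.linspace(0.0, 1.0, 11)     # linear profile: an exact steady state
after = implicit_diffusion_step(steady, d=1.0, dt=0.1, dx=0.1)

bump = np.zeros(11); bump[5] = 1.0     # a concentration spike diffuses outward
smoothed = implicit_diffusion_step(bump, d=1.0, dt=0.1, dx=0.1)
```

Because the step is implicit, it remains stable even for large r = d*dt/dx², which is what makes such direct solvers attractive for stiff reaction-diffusion problems.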
High-efficiency AlGaAs-GaAs Cassegrainian concentrator cells
NASA Technical Reports Server (NTRS)
Werthen, J. G.; Hamaker, H. C.; Virshup, G. F.; Lewis, C. R.; Ford, C. W.
1985-01-01
AlGaAs-GaAs heteroface space concentrator solar cells have been fabricated by metalorganic chemical vapor deposition. AM0 efficiencies as high as 21.1% have been observed both for p-n and n-p structures under concentration (90 to 100X) at 25 C. Both cell structures are characterized by high quantum efficiencies, and their performances are close to those predicted by a realistic computer model. In agreement with the computer model, the n-p cell exhibits a higher short-circuit current density.
Neuronize: a tool for building realistic neuronal cell morphologies
Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis; DeFelipe, Javier; Benavides-Piccione, Ruth
2013-01-01
This study presents a tool, Neuronize, for building realistic three-dimensional models of neuronal cells from the morphological information extracted through computer-aided tracing applications. Neuronize consists of a set of methods designed to build 3D neural meshes that approximate the cell membrane at different resolution levels, allowing a balance to be reached between the complexity and the quality of the final model. The main contribution of the present study is the proposal of a novel approach to build a realistic and accurate 3D shape of the soma from the incomplete information stored in the digitally traced neuron, which usually consists of a 2D cell body contour. This technique is based on the deformation of an initial shape driven by the position and thickness of the first order dendrites. The addition of a set of spines along the dendrites completes the model, building a final 3D neuronal cell suitable for its visualization in a wide range of 3D environments. PMID:23761740
A Fully Distributed Approach to the Design of a KBIT/SEC VHF Packet Radio Network,
1984-02-01
topological change and consequent out-moded routing data. Algorithm development has been aided by computer simulation using a finite state machine technique to model a realistic network of up to fifty nodes. This is...use of computer-based equipment in weapons systems and their associated sensors and command and control elements and the trend from voice to data
NASA Astrophysics Data System (ADS)
Shi, X.; Zhang, G.
2013-12-01
Because of the extensive computational burden, parametric uncertainty analyses are rarely conducted for geological carbon sequestration (GCS) process-based multi-phase models. The difficulty of predictive uncertainty analysis for CO2 plume migration in realistic GCS models is due not only to the spatial distribution of the caprock and reservoir (i.e., heterogeneous model parameters), but also to the fact that the GCS optimization estimation problem has multiple local minima, owing to the complex nonlinear multi-phase (gas and aqueous) and multi-component (water, CO2, salt) transport equations. The geological model built by Doughty and Pruess (2004) for the Frio pilot site (Texas) was selected and assumed to represent the 'true' system, which was composed of seven different facies (geological units) distributed among 10 layers. We chose to calibrate the permeabilities of these facies. Pressure and gas saturation values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. Each simulation of the model lasts about 2 hours. In this study, we develop a new approach that improves the computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid stochastic collocation method. The surrogate response-surface global optimization algorithm is first used to calibrate the model parameters; then the prediction uncertainty of the CO2 plume position, propagated from the parametric uncertainty, is quantified in the numerical experiments and compared to the actual plume from the 'true' model. Results show that the approach is computationally efficient for multi-modal optimization and prediction uncertainty quantification for computationally expensive simulation models. Both our inverse methodology and findings can be broadly applicable to GCS in heterogeneous storage formations.
A computer model of molecular arrangement in a n-paraffinic liquid
NASA Astrophysics Data System (ADS)
Vacatello, Michele; Avitabile, Gustavo; Corradini, Paolo; Tuzi, Angela
1980-07-01
A computer model of a bulk liquid polymer was built to investigate the problem of local order. The model is made of C30 n-alkane molecules; it is not a lattice model, but it allows for a continuous variability of torsion angles and interchain distances, subject to realistic intra- and intermolecular potentials. Experimental x-ray scattering curves and radial distribution functions are well reproduced. Calculated properties like end-to-end distances, distribution of torsion angles, radial distribution functions, and chain direction correlation parameters, all indicate a random coil conformation and no tendency to form bundles of parallel chains.
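The radial distribution functions used to validate the model above can be estimated from particle coordinates with the minimum-image convention. The following is a generic sketch (not the authors' code), exercised here on a toy two-particle system rather than a C30 alkane melt.

```python
import numpy as np

def radial_distribution(positions, box, r_max, n_bins=50):
    """Estimate g(r) for particles in a cubic periodic box of side `box`."""
    n = len(positions)
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)              # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r[r < r_max], bins=edges)[0]
    rho = n / box**3                              # number density
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    ideal = rho * shell * n / 2.0                 # expected pair counts, ideal gas
    return edges, counts / ideal                  # g(r) = observed / ideal

# Toy system: two particles 1.0 apart; g(r) is nonzero in exactly one bin.
edges, g = radial_distribution(np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]),
                               box=10.0, r_max=2.0)
```

For a disordered liquid, g(r) computed this way tends to 1 at large r, which is the signature of the random-coil, bundle-free packing the abstract describes.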
Spectral decontamination of a real-time helicopter simulation
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.
1983-01-01
Nonlinear mathematical models of a rotor system, referred to as rotating blade-element models, produce steady-state, high-frequency harmonics of significant magnitude. In a discrete simulation model, certain of these harmonics may be incompatible with realistic real-time computational constraints because of their aliasing into the operational low-pass region. However, the energy in an aliased harmonic may be suppressed by increasing the computation rate of an isolated, causal nonlinearity and using an appropriate filter. This decontamination technique is applied to Sikorsky's real-time model of the Black Hawk helicopter, as supplied to NASA for handling-qualities investigations.
Cognitive and Neural Bases of Skilled Performance.
1987-10-04
advantage is that this method is not computationally demanding, and model-specific analyses such as high-precision source localization with realistic...and a two-high-threshold model satisfy theoretical and pragmatic independence. Discrimination and bias measures from these two models comparing...recognition memory of patients with dementing diseases, amnesics, and normal controls. We found the two-high-threshold model to be more sensitive
João A. N. Filipe; Richard C. Cobb; David M. Rizzo; Ross K. Meentemeyer; Christopher A. Gilligan
2010-01-01
Landscape- to regional-scale models of plant epidemics are urgently needed to predict large-scale impacts of disease and assess practicable options for control. While landscape heterogeneity is recognized as a major driver of disease dynamics, epidemiological models are rarely applied to realistic landscape conditions due to computational and data limitations. Here we...
ERIC Educational Resources Information Center
Hirsch, Jorge E.; Scalapino, Douglas J.
1983-01-01
Discusses ways computers are being used in condensed-matter physics by experimenters and theorists. Experimenters use them to control experiments and to gather and analyze data. Theorists use them for detailed predictions based on realistic models and for studies on systems not realizable in practice. (JN)
Computational Fluid Dynamics Analysis of the Effect of Plaques in the Left Coronary Artery
Chaichana, Thanapong; Sun, Zhonghua; Jewkes, James
2012-01-01
The aim of this study was to investigate the hemodynamic effect of simulated plaques in left coronary artery models, which were generated from a sample patient's data. Plaques were simulated and placed at the left main stem and the left anterior descending (LAD) to produce at least 60% coronary stenosis. Computational fluid dynamics analysis was performed to simulate realistic physiological conditions that reflect the in vivo cardiac hemodynamics, and a comparison of wall shear stress (WSS) between Newtonian and non-Newtonian fluid models was performed. The pressure gradient (PSG) and flow velocities in the left coronary artery were measured and compared in the left coronary models with and without the presence of plaques during the cardiac cycle. Our results showed that the highest PSG was observed in stenotic regions caused by the plaques. Low flow velocity areas were found at postplaque locations in the left circumflex, LAD, and bifurcation. WSS at the stenotic locations was similar between the non-Newtonian and Newtonian models, although some more details were observed with the non-Newtonian model. There is a direct correlation between coronary plaques and subsequent hemodynamic changes, based on the simulation of plaques in the realistic coronary models. PMID:22400051
Computational physics of the mind
NASA Astrophysics Data System (ADS)
Duch, Włodzisław
1996-08-01
In the XIX century and earlier, physicists such as Newton, Mayer, Hooke, Helmholtz and Mach were actively engaged in research on psychophysics, trying to relate psychological sensations to intensities of physical stimuli. Computational physics makes it possible to simulate complex neural processes, offering a chance to answer not only the original psychophysical questions but also to create models of the mind. In this paper several approaches relevant to modeling of the mind are outlined. Since direct modeling of brain functions is rather limited due to the complexity of such models, a number of approximations are introduced. The path from the brain, or computational neurosciences, to the mind, or cognitive sciences, is sketched, with emphasis on higher cognitive functions such as memory and consciousness. No fundamental problems in understanding the mind seem to arise. From a computational point of view, realistic models require massively parallel architectures.
Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D
2015-04-01
Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.
A real-time photo-realistic rendering algorithm of ocean color based on bio-optical model
NASA Astrophysics Data System (ADS)
Ma, Chunyong; Xu, Shu; Wang, Hongsong; Tian, Fenglin; Chen, Ge
2016-12-01
A real-time photo-realistic rendering algorithm for ocean color is introduced in this paper, which considers the impact of an ocean bio-optical model. The ocean bio-optical model mainly involves phytoplankton, colored dissolved organic material (CDOM), inorganic suspended particles, etc., which make different contributions to the absorption and scattering of light. We decompose the emergent light of the ocean surface into the reflected light from the sun and the sky and the subsurface scattering light. We establish an ocean surface transmission model based on the ocean bidirectional reflectance distribution function (BRDF) and the Fresnel law, and this model's outputs serve as the incident light parameters for subsurface scattering. Using an ocean subsurface scattering algorithm combined with the bio-optical model, we compute the emergent scattered radiance in different directions. Then, we blend the reflection of sunlight and sky light to implement real-time ocean color rendering on the graphics processing unit (GPU). Finally, we use the radiance reflectance calculated by the Hydrolight radiative transfer model and by our algorithm to validate the physical realism of our method, and the results show that our algorithm can achieve real-time, highly realistic ocean color scenes.
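The Fresnel law used in the surface transmission model gives the fraction of incident light reflected at the air-water interface. A standard unpolarized Fresnel reflectance, assuming a water refractive index of roughly 1.34 (the paper's exact optical constants are not given here), can be sketched as:

```python
import math

def fresnel_reflectance(theta_i, n1=1.0, n2=1.34):
    """Unpolarized Fresnel reflectance for light hitting water at angle theta_i (radians)."""
    sin_t = n1 / n2 * math.sin(theta_i)
    if sin_t >= 1.0:
        return 1.0                                # total internal reflection
    theta_t = math.asin(sin_t)                    # Snell's law
    cos_i, cos_t = math.cos(theta_i), math.cos(theta_t)
    # s- and p-polarized reflectances, then the unpolarized average
    rs = ((n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)) ** 2
    rp = ((n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)) ** 2
    return 0.5 * (rs + rp)
```

Reflectance grows from about 2% at normal incidence toward 1 at grazing angles, which is why distant water mirrors the sky while nearby water shows the color contributed by subsurface scattering.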
Multi-ray medical ultrasound simulation without explicit speckle modelling.
Tuzer, Mert; Yazıcı, Abdulkadir; Türkay, Rüştü; Boyman, Michael; Acar, Burak
2018-05-04
To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as the input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. The proposed method uses a novel multi-ray image formation approach with a virtual phased-array transducer probe. A domain model is built from input MR images. Multiple virtual acoustic rays emerge from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusion of the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to transducers are taken into account. A preliminary implementation using GPUs is presented. Preliminary results show that the multi-ray approach is capable of generating viewpoint-dependent realistic US images with an inherent Rician-distributed speckle pattern automatically. The proposed simulator can reproduce the shadowing artefacts and demonstrates frequency dependence apt for practical training purposes. We have also presented preliminary results towards the utilization of the method for real-time simulations. The proposed method offers a low-cost near-real-time wave-like simulation of realistic US images from input MR data. It can further be improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or manual embedding of virtual pathologies for training purposes.
Physically-Based Modelling and Real-Time Simulation of Fluids.
NASA Astrophysics Data System (ADS)
Chen, Jim Xiong
1995-01-01
Simulating physically realistic complex fluid behaviors presents an extremely challenging problem for computer graphics researchers. Such behaviors include the effects of driving boats through water, blending differently colored fluids, rain falling and flowing on a terrain, fluids interacting in a Distributed Interactive Simulation (DIS), etc. Such capabilities are useful in computer art, advertising, education, entertainment, and training. We present a new method for physically-based modeling and real-time simulation of fluids in computer graphics and dynamic virtual environments. By solving the 2D Navier-Stokes equations using a CFD method, we map the surface into 3D using the corresponding pressures in the fluid flow field. This achieves realistic real-time fluid surface behaviors by employing the physical governing laws of fluids while avoiding extensive 3D fluid dynamics computations. To complement the surface behaviors, we calculate fluid volume and external boundary changes separately to achieve full 3D general fluid flow. To simulate physical activities in a DIS, we introduce a mechanism which uses a uniform time scale proportional to the clock-time and variable time-slicing to synchronize physical models such as fluids in the networked environment. Our approach can simulate many different fluid behaviors by changing the internal or external boundary conditions. It can model different kinds of fluids by varying the Reynolds number. It can simulate objects moving or floating in fluids. It can also produce synchronized general fluid flows in a DIS. Our model can serve as a testbed to simulate many other fluid phenomena which have never been successfully modeled previously.
Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter
2009-06-01
Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data in which image quality is limited by the lack of detail (maximum, approximately 50 voxels/mm3), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and real-time volume rendered as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm3) and true color, 2D, or optional anaglyph stereoscopic 3D was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm3) and realistic interaction with deformable soft tissue components such as skin, tympanic membrane, dura, and cholesteatomas, features some of which are not possible with computed tomographic-/magnetic resonance imaging-based systems.
Optimizing human activity patterns using global sensitivity analysis.
Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M
2014-12-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
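The sample entropy statistic used above to quantify schedule regularity can be sketched directly from its definition: SampEn = -ln(A/B), where B counts template matches of length m and A counts matches of length m+1 within tolerance r. The following is a simplified O(n²) version; the paper's own implementation may differ in details such as self-match handling.

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """SampEn = -ln(A/B): A, B = template-match counts at lengths m+1 and m."""
    n = len(series)

    def count_matches(length):
        templates = [series[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                # Chebyshev distance between templates must stay within r
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

regular = [0.0, 1.0] * 30                            # perfectly periodic "schedule"
noisy = [math.sin(0.7 * i * i) for i in range(60)]   # irregular signal
```

A lower SampEn indicates a more regular schedule, so tuning an activity's regularity amounts to adjusting schedule parameters until SampEn reaches a target value.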
Optimizing human activity patterns using global sensitivity analysis
Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; ...
2013-12-10
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
A 'Turing' Test for Landscape Evolution Models
NASA Astrophysics Data System (ADS)
Parsons, A. J.; Wise, S. M.; Wainwright, J.; Swift, D. A.
2008-12-01
Resolving the interactions among tectonics, climate and surface processes at long timescales has benefited from the development of computer models of landscape evolution. However, testing these Landscape Evolution Models (LEMs) has been piecemeal and partial. We argue that a more systematic approach is required. What is needed is a test that will establish how 'realistic' an LEM is and thus the extent to which its predictions may be trusted. We propose a test based upon the Turing Test of artificial intelligence as a way forward. In 1950 Alan Turing posed the question of whether a machine could think. Rather than attempt to address the question directly, he proposed a test in which an interrogator asked questions of a person and a machine, with no means of telling which was which. If the machine's answers could not be distinguished from those of the human, the machine could be said to demonstrate artificial intelligence. By analogy, if an LEM cannot be distinguished from a real landscape, it can be deemed to be realistic. The Turing test of intelligence is a test of the way in which a computer behaves. The analogy in the case of an LEM is that it should show realistic behaviour in terms of form and process, both at a given moment in time (punctual) and in the way both form and process evolve over time (dynamic). For some of these behaviours, tests already exist. For example, there are numerous morphometric tests of punctual form and measurements of punctual process. The test discussed in this paper provides new ways of assessing the dynamic behaviour of an LEM over realistically long timescales. However, challenges remain in developing an appropriate suite of challenging tests, in applying these tests to current LEMs, and in developing LEMs that pass them.
Realistic and efficient 2D crack simulation
NASA Astrophysics Data System (ADS)
Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek
2010-04-01
Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive, and efficient runtime 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level of detail. The generated binary decomposition tree also provides an efficient neighbor retrieval mechanism used for mesh element splitting and merging with minimal memory requirements, essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are all taken into account to produce the criteria of crack initialization, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters based on prior benchmark data generated by offline physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shards generation) with diverse potentials in military and civil M&S applications such as training and mission planning.
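The recursive triangular refinement described above can be illustrated with a much-simplified longest-edge bisection, a hypothetical stand-in for the Peano-Cesaro scheme rather than the paper's algorithm: each split preserves total area while doubling the element count at every level.

```python
def area(a, b, c):
    """Unsigned area of triangle (a, b, c) via the 2D cross product."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def bisect(tri, depth):
    """Recursively split a triangle at the midpoint of its longest edge."""
    if depth == 0:
        return [tri]

    def d2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    a, b, c = tri
    # rotate vertices so that (a, b) is the longest edge
    _, (a, b, c) = max((d2(a, b), (a, b, c)),
                       (d2(b, c), (b, c, a)),
                       (d2(c, a), (c, a, b)))
    m = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)   # midpoint of longest edge
    return bisect((a, m, c), depth - 1) + bisect((m, b, c), depth - 1)

# Four levels of refinement of a unit right triangle -> 2**4 = 16 fragments.
tris = bisect(((0.0, 0.0), (1.0, 0.0), (0.0, 1.0)), 4)
```

In an actual fracture simulator, the recursion depth and split locations would be driven by impact energy and material properties rather than applied uniformly as here.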
An evaluation of differences due to changing source directivity in room acoustic computer modeling
NASA Astrophysics Data System (ADS)
Vigeant, Michelle C.; Wang, Lily M.
2004-05-01
This project examines the effects of changing source directivity in room acoustic computer models on objective parameters and subjective perception. Acoustic parameters and auralizations calculated from omnidirectional versus directional sources were compared. Three realistic directional sources were used, measured in a limited number of octave bands from a piano, singing voice, and violin. A highly directional source that beams only within a sixteenth-tant of a sphere was also tested. Objectively, there were differences of 5% or more in reverberation time (RT) between the realistic directional and omnidirectional sources. Between the beaming directional and omnidirectional sources, differences in clarity were close to the just-noticeable-difference (jnd) criterion of 1 dB. Subjectively, participants had great difficulty distinguishing between the realistic and omnidirectional sources; very few could discern the differences in RTs. However, a larger percentage (32% vs 20%) could differentiate between the beaming and omnidirectional sources, as well as the respective differences in clarity. Further studies of the objective results from different beaming sources have been pursued. The direction of the beaming source in the room is changed, as well as the beamwidth. The objective results are analyzed to determine if differences fall within the jnd of sound-pressure level, RT, and clarity.
Challenges of forest landscape modeling - simulating large landscapes and validating results
Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson
2011-01-01
Over the last 20 years, we have seen rapid development in the field of forest landscape models (FLMs), fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of FLMs: (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...
An Eight-Parameter Function for Simulating Model Rocket Engine Thrust Curves
ERIC Educational Resources Information Center
Dooling, Thomas A.
2007-01-01
The toy model rocket is used extensively as an example of a realistic physical system; teachers from grade school to the university level use it. Many teachers and students write computer programs to investigate rocket physics, since the problem involves nonlinear functions related to air resistance and mass loss. This paper describes a nonlinear…
Antonietti, Alberto; Casellato, Claudia; Garrido, Jesús A; Luque, Niceto R; Naveros, Francisco; Ros, Eduardo; D' Angelo, Egidio; Pedrocchi, Alessandra
2016-01-01
In this study, we defined a realistic cerebellar model using artificial spiking neural networks, testing it in computational simulations that reproduce associative motor tasks over multiple sessions of acquisition and extinction. Using evolutionary algorithms, we tuned the cerebellar microcircuit to find the near-optimal plasticity-mechanism parameters that best reproduced human-like behavior in eye blink classical conditioning, one of the most extensively studied paradigms related to the cerebellum. We used two models: one with only cortical plasticity and another including two additional plasticity sites at the nuclear level. First, both spiking cerebellar models reproduced real human behavior well, in terms of both "timing" and "amplitude", expressing rapid acquisition, stable late acquisition, rapid extinction, and faster reacquisition of an associative motor task. Even though the model with only the cortical plasticity site showed good learning capabilities, the model with distributed plasticity produced faster and more stable acquisition of conditioned responses in the reacquisition phase. This behavior is explained by the effect of the nuclear plasticities, which have slow dynamics and can express memory consolidation and saving. We showed how the spiking dynamics of multiple interactive neural mechanisms implicitly drive multiple essential components of complex learning processes. This study presents a very advanced computational model, developed jointly by biomedical engineers, computer scientists, and neuroscientists. Given its realistic features, the proposed model can provide confirmations of, and suggestions about, neurophysiological and pathological hypotheses, and can be used in challenging clinical applications.
Pediatric in vitro and in silico models of deposition via oral and nasal inhalation.
Carrigy, Nicholas B; Ruzycki, Conor A; Golshahi, Laleh; Finlay, Warren H
2014-06-01
Respiratory tract deposition models provide a useful method for optimizing the design and administration of inhaled pharmaceutical aerosols, and can be useful for estimating exposure risks to inhaled particulate matter. As aerosol must first pass through the extrathoracic region before reaching the lungs, deposition in this region plays an important role in both cases. Compared to adults, much less extrathoracic deposition data are available for pediatric subjects. Recently, the use of magnetic resonance imaging and computed tomography scans to develop pediatric extrathoracic airway replicas has helped address this issue. Indeed, the use of realistic replicas for benchtop inhaler testing is now relatively common during the development and in vitro evaluation of pediatric respiratory drug delivery devices. Recently, in vitro empirical modeling studies using a moderate number of these realistic replicas have related airway geometry, particle size, fluid properties, and flow rate to extrathoracic deposition. Idealized geometries provide a standardized platform for inhaler testing and exposure risk assessment, and have been designed to mimic average in vitro deposition in infants and children by replicating representative average geometrical dimensions. In silico mathematical models have used morphometric data and aerosol physics to illustrate the relative importance of different deposition mechanisms on respiratory tract deposition. Computational fluid dynamics simulations allow for the quantification of local deposition patterns and an in-depth examination of aerosol behavior in the respiratory tract. Recent studies have used both in vitro and in silico deposition measurements in realistic pediatric airway geometries with some success. This article reviews the current understanding of pediatric in vitro and in silico deposition modeling via oral and nasal inhalation.
Modelling and analysis of the sugar cataract development process using stochastic hybrid systems.
Riley, D; Koutsoukos, X; Riley, K
2009-05-01
Modelling and analysis of biochemical systems such as sugar cataract development (SCD) are critical because they can provide new insights into systems that cannot easily be tested with experiments; however, they are challenging problems due to the highly coupled chemical reactions involved. The authors present a stochastic hybrid system (SHS) framework for modelling biochemical systems and demonstrate the approach for the SCD process. A novel feature of the framework is that it allows modelling the effect of drug treatment on the system dynamics. The authors validate three sugar cataract models by comparing trajectories computed by two simulation algorithms. Further, the authors present a probabilistic verification method for computing the probability of sugar cataract formation for different chemical concentrations, using safety and reachability analysis methods for SHSs. The verification method employs dynamic programming based on a discretisation of the state space and therefore suffers from the curse of dimensionality. To analyse the SCD process, a parallel dynamic programming implementation that can handle large, realistic systems was developed. Although scalability is a limiting factor, this work demonstrates that the proposed method is feasible for realistic biochemical systems.
XCAT/DRASIM: a realistic CT/human-model simulation package
NASA Astrophysics Data System (ADS)
Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.
2011-03-01
The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer-generated, NURBS-surface-based phantom that provides a realistic model of human anatomy and respiratory and cardiac motions, with the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools, which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantage of combining a realistic model of human anatomy and physiological motions, without voxelization, with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library of functions to read in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line-integral calculation in DRASIM, computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views, followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package with great potential as a tool in the design and optimization of CT scanners and in the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.
Radiating dipoles in photonic crystals
Busch; Vats; John; Sanders
2000-09-01
The radiation dynamics of a dipole antenna embedded in a photonic crystal are modeled by an initially excited harmonic oscillator coupled to a non-Markovian bath of harmonic oscillators representing the colored electromagnetic vacuum within the crystal. Realistic coupling constants based on the natural modes of the photonic crystal, i.e., Bloch waves and their associated dispersion relation, are derived. For simple model systems, well-known results such as decay times and emission spectra are reproduced. This approach enables direct incorporation of realistic band structure computations into studies of radiative emission from atoms and molecules within photonic crystals. We therefore provide a predictive and interpretative tool for experiments in both the microwave and optical regimes.
A method for modeling contact dynamics for automated capture mechanisms
NASA Technical Reports Server (NTRS)
Williams, Philip J.
1991-01-01
Logicon Control Dynamics develops contact dynamics models for space-based docking and berthing vehicles. The models compute contact forces for the physical contact between mating capture-mechanism surfaces. Realistic simulation requires that the proportionality constants used to calculate contact forces approximate the surface stiffness of the contacting bodies. For rigid metallic bodies this proportionality becomes quite large, so small penetrations of the surface boundaries can produce large contact forces.
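The force law described, contact force proportional to surface penetration with a large stiffness for rigid metal, is essentially a penalty spring-damper model. A minimal sketch (the stiffness and damping constants below are illustrative, not Logicon's values):

```python
def contact_force(penetration_m, rel_velocity_mps, k=1e7, c=1e3):
    """Penalty-method contact: zero force while the bodies are separated;
    otherwise a spring term k * penetration plus a damping term
    c * relative_velocity. Because k must approximate the surface
    stiffness of rigid metallic bodies, it is very large, so even a tiny
    penetration of the surface boundary yields a large force."""
    if penetration_m <= 0.0:
        return 0.0
    return k * penetration_m + c * rel_velocity_mps

# A 0.1 mm penetration already produces roughly 1 kN of contact force.
print(contact_force(1e-4, 0.0))
```

Large k also makes the resulting equations of motion stiff, which is why such simulations typically need small integration steps or implicit solvers.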
NASA Astrophysics Data System (ADS)
Einspigel, D.; Sachl, L.; Martinec, Z.
2014-12-01
We present DEBOT, a new global barotropic ocean model. DEBOT is primarily designed for modelling ocean flow generated by the tidal attraction of the Moon and the Sun; however, it can be used for other applications where a barotropic model is sufficient, for instance tsunami wave propagation. The model has been thoroughly tested by several different methods: (1) a synthetic example involving tsunami-like propagation of an initial Gaussian depression, with testing of the conservation of integral invariants; (2) a benchmark study against another barotropic model, LSGbt; and (3) comparison of realistic simulations with tide gauge measurements around the world. The test computations prove the validity of the numerical code and demonstrate the ability of DEBOT to simulate realistic ocean tides. DEBOT will be principally applied in related geophysical disciplines, for instance in investigating the influence of ocean tides on the geomagnetic field or the Earth's rotation. A module for modelling the secondary poloidal magnetic field generated by ocean flow is already implemented in DEBOT, and preliminary results will be presented. The future aim is to assimilate magnetic data provided by the Swarm satellite mission into the ocean flow model.
Study of the stability of a SEIRS model for computer worm propagation
NASA Astrophysics Data System (ADS)
Hernández Guillén, J. D.; Martín del Rey, A.; Hernández Encinas, L.
2017-08-01
Nowadays, malware is the most important threat to information security. Accordingly, several mathematical models that simulate malware spreading have appeared. They are compartmental models, in which the population of devices is classified into compartments: susceptible, exposed, infectious, recovered, etc. The main goal of this work is to propose an improved SEIRS (Susceptible-Exposed-Infectious-Recovered-Susceptible) mathematical model of computer worm propagation. It is a continuous model whose dynamics are governed by a system of ordinary differential equations. It considers more realistic propagation parameters; in particular, a modified incidence rate is used. Moreover, the equilibrium points are computed and their local and global stability are analyzed. From the explicit expression of the basic reproductive number, efficient control measures are also obtained.
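A generic SEIRS system of this kind can be integrated with explicit Euler steps. The sketch below uses the standard bilinear incidence rate, not the paper's modified one (which the abstract does not give), and the parameter values are invented for illustration:

```python
def seirs_step(s, e, i, r, beta, sigma, gamma, xi, n, dt):
    """One explicit-Euler step of a standard SEIRS model:
    S -> E at rate beta*S*I/N (incidence), E -> I at rate sigma*E,
    I -> R at rate gamma*I, and R -> S at rate xi*R (waning immunity).
    The four derivatives sum to zero, so the population is conserved."""
    inc = beta * s * i / n
    ds = -inc + xi * r
    de = inc - sigma * e
    di = sigma * e - gamma * i
    dr = gamma * i - xi * r
    return s + ds * dt, e + de * dt, i + di * dt, r + dr * dt

n = 10_000.0                                # devices on the network
s, e, i, r = n - 10.0, 0.0, 10.0, 0.0       # seed with 10 infectious hosts
for _ in range(5_000):                      # 500 time units, dt = 0.1
    s, e, i, r = seirs_step(s, e, i, r, 0.4, 0.2, 0.1, 0.01, n, 0.1)

r0 = 0.4 / 0.1  # basic reproductive number beta/gamma; outbreak iff R0 > 1
print(r0)
```

With R0 = 4 the worm takes off and the system settles toward an endemic equilibrium instead of dying out; driving R0 below 1 (e.g. by patching, which lowers beta) is exactly the kind of control measure the abstract derives from the explicit R0 expression.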
Numerical methods in Markov chain modeling
NASA Technical Reports Server (NTRS)
Philippe, Bernard; Saad, Youcef; Stewart, William J.
1989-01-01
Several methods for computing stationary probability distributions of Markov chains are described and compared. The main linear algebra problem consists of computing an eigenvector of a sparse, usually nonsymmetric, matrix associated with a known eigenvalue. It can also be cast as the problem of solving a homogeneous singular linear system. Several methods based on combinations of Krylov subspace techniques are presented. The performance of these methods on some realistic problems is compared.
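The core problem, an eigenvector of the transition matrix for the known eigenvalue 1, can be sketched with plain power iteration; the Krylov subspace methods the abstract surveys refine this basic idea for large sparse matrices:

```python
def stationary(P, iters=10_000):
    """Power iteration on a row-stochastic transition matrix P:
    repeatedly apply pi <- pi @ P. The limit satisfies pi = pi @ P,
    i.e. pi is a left eigenvector of P for the known eigenvalue 1."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Tiny two-state chain: state 0 is "sticky", state 1 leaks back quickly.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)
print([round(x, 4) for x in pi])  # → [0.8333, 0.1667]
```

Convergence is geometric in the second-largest eigenvalue modulus (here 0.4), which is why subspace-accelerated methods matter for chains whose subdominant eigenvalues are close to 1.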
Michel, Miriam; Egender, Friedemann; Heßling, Vera; Dähnert, Ingo; Gebauer, Roman
2016-01-01
Background Postoperative junctional ectopic tachycardia (JET) occurs frequently after pediatric cardiac surgery. R-wave synchronized atrial (AVT) pacing is used to re-establish atrioventricular synchrony. AVT pacing is complex, with technical pitfalls. We sought to establish and test a low-cost simulation model suitable for training and analysis in AVT pacing. Methods A simulation model was developed based on a JET simulator, a simulation doll, a cardiac monitor, and a pacemaker. A computer program simulated electrocardiograms. Ten experienced pediatric cardiologists tested the model. Their performance was analyzed using a testing protocol with 10 working steps. Results Four testers found the simulation model realistic; 6 found it very realistic. Nine claimed that the trial had improved their skills. All testers considered the model useful in teaching AVT pacing. The simulation test identified 5 working steps in which major mistakes in performance may impede safe and effective AVT pacing, thus permitting specific training. The components of the model (excluding the monitor and pacemaker) cost less than $50. Assembly and training-session expenses were trivial. Conclusions A realistic, low-cost simulation model of AVT pacing is described. The model is suitable for teaching and analyzing AVT pacing technique. PMID:26943363
Antonioletti, Mario; Biktashev, Vadim N; Jackson, Adrian; Kharche, Sanjay R; Stary, Tomas; Biktasheva, Irina V
2017-01-01
The BeatBox simulation environment combines a flexible scripting-language user interface with robust computational tools to set up cardiac electrophysiology in silico experiments without low-level re-coding: cell excitation, tissue/anatomy models, and stimulation protocols can be included in a BeatBox script, and simulations run either sequentially or in parallel (MPI) without re-compilation. BeatBox is free software written in C and runs on Unix-based platforms. It provides the whole spectrum of multiscale tissue modelling, from 0-dimensional individual-cell simulation through 1-dimensional fibre, 2-dimensional sheet, and 3-dimensional slab of tissue, up to anatomically realistic whole-heart simulations, with run-time measurements including cardiac re-entry tip/filament tracing, ECG, local/global samples of any variables, etc. BeatBox solvers and cell and tissue/anatomy model repositories are extended via robust and flexible interfaces, providing an open framework for new developments in the field. In this paper we give an overview of the current state of BeatBox, together with a description of the main computational methods and MPI parallelisation approaches.
Automated dynamic analytical model improvement for damped structures
NASA Technical Reports Server (NTRS)
Fuh, J. S.; Berman, A.
1985-01-01
A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat the complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.
De Marco, Tommaso; Ries, Florian; Guermandi, Marco; Guerrieri, Roberto
2012-05-01
Electrical impedance tomography (EIT) is an imaging technology based on impedance measurements. To retrieve meaningful insights from these measurements, EIT relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of current flows therein. The nonhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which at present severely limits the capabilities of EIT. This work presents a complete algorithmic flow for an accurate EIT modeling environment featuring high anatomical fidelity with a spatial resolution equal to that provided by an MRI and a novel realistic complete electrode model implementation. At the same time, we demonstrate that current graphics processing unit (GPU)-based platforms provide enough computational power that a domain discretized with five million voxels can be numerically modeled in about 30 s.
Padhi, Radhakant; Bhardhwaj, Jayender R
2009-06-01
An adaptive drug delivery design is presented in this paper using neural networks for effective treatment of infectious diseases. The generic mathematical model used describes the coupled evolution of the concentrations of pathogens, plasma cells, and antibodies, together with a numerical value indicating the relative condition of an organ damaged by the disease, under the influence of external drugs. From a system-theoretic point of view, the external drugs can be interpreted as control inputs, which can be designed based on control-theoretic concepts. In this study, assuming a set of nominal parameters in the mathematical model, first a nonlinear controller (drug administration plan) is designed based on the principle of dynamic inversion. This nominal drug administration plan was found to be effective in curing "nominal model patients" (patients whose immunological dynamics conform exactly to the mathematical model used for the control design). However, it was found to be ineffective in general for curing "realistic model patients" (patients whose immunological dynamics may have off-nominal parameter values and possibly unwanted inputs). Hence, to make the drug delivery dosage design more effective for realistic model patients, a model-following adaptive control design is carried out with the help of neural networks that are trained online. Simulation studies indicate that the adaptive controller proposed in this paper holds promise for killing the invading pathogens and healing the damaged organ even in the presence of parameter uncertainties and continued pathogen attack. Note that the computational requirements for computing the control are minimal, and all associated computations (including the training of the neural networks) can be carried out online. The approach does assume, however, that the required diagnosis can be carried out quickly enough that all the states are available for control computation.
Forward and inverse effects of the complete electrode model in neonatal EEG
Lew, S.; Wolters, C. H.
2016-01-01
This paper investigates finite element method-based modeling in the context of neonatal electroencephalography (EEG). In particular, the focus lies on electrode boundary conditions. We compare the complete electrode model (CEM) with the point electrode model (PEM), which is the current standard in EEG. In the CEM, the voltage experienced by an electrode is modeled more realistically as the integral average of the potential distribution over its contact surface, whereas the PEM relies on a point value. Consequently, the CEM takes into account the subelectrode shunting currents, which are absent in the PEM. In this study, we aim to find out how the electrode voltages predicted by these two models differ when standard-size electrodes are attached to the head of a neonate. Additionally, we study voltages and voltage variation on electrode surfaces with two source locations: 1) next to the C6 electrode and 2) directly under the Fz electrode and the frontal fontanel. A realistic model of a neonatal head, including a skull with fontanels and sutures, is used. Based on the results, the forward simulation differences between the CEM and PEM are in general small, but significant outliers can occur in the vicinity of the electrodes. The CEM can be considered an integral part of the outer head model. The outcome of this study helps in understanding the volume conduction of neonatal EEG, since it clarifies the role of advanced skull and electrode modeling in forward and inverse computations. NEW & NOTEWORTHY The effect of the complete electrode model on electroencephalography forward and inverse computations is explored. A realistic neonatal head model, including a skull structure with fontanels and sutures, is used. The electrode and skull modeling differences are analyzed and compared with each other. The results suggest that the complete electrode model can be considered an integral part of the outer head model.
To achieve optimal source localization results, accurate electrode modeling might be necessary. PMID:27852731
Contaminant deposition building shielding factors for US residential structures.
Dickson, Elijah; Hamby, David; Eckerman, Keith
2017-10-10
This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario from contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-units models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked from a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and include, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basement as well for single-wide manufactured housing-unit. © 2017 IOP Publishing Ltd.
Cortical Spiking Network Interfaced with Virtual Musculoskeletal Arm and Robotic Arm
Dura-Bernal, Salvador; Zhou, Xianlian; Neymotin, Samuel A.; Przekwas, Andrzej; Francis, Joseph T.; Lytton, William W.
2015-01-01
Embedding computational models in the physical world is a critical step towards constraining their behavior and building practical applications. Here we aim to drive a realistic musculoskeletal arm model using a biomimetic cortical spiking model, and to make a robot arm reproduce the same trajectories in real time. Our cortical model consisted of a 3-layered cortex composed of several hundred spiking model neurons, which display physiologically realistic dynamics. We interconnected the cortical model with a two-joint musculoskeletal model of a human arm, with realistic anatomical and biomechanical properties. The virtual arm received muscle excitations from the neuronal model and fed back proprioceptive information, forming a closed-loop system. The cortical model was trained using spike-timing-dependent reinforcement learning to drive the virtual arm in a 2D reaching task. Limb position was used to simultaneously control a robot arm using an improved network interface. Virtual arm muscle activations responded to motoneuron firing rates, with virtual arm muscle lengths encoded via population coding in the proprioceptive population. After training, the virtual arm performed reaching movements that were smoother and more realistic than those obtained using a simplistic arm model. This system provided access both to spiking network properties and to arm biophysical properties, including muscle forces. The use of a musculoskeletal virtual arm and the improved control system allowed the robot arm to perform movements that were smoother than those reported in our previous paper using a simplistic arm. This work provides a novel approach consisting of bidirectionally connecting a cortical model to a realistic virtual arm, and using the system output to drive a robotic arm in real time.
Our techniques are applicable to the future development of brain neuroprosthetic control systems, and may enable enhanced brain-machine interfaces with the possibility for finer control of limb prosthetics. PMID:26635598
NASA Technical Reports Server (NTRS)
Arias, Adriel (Inventor)
2016-01-01
The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed uses the products being developed in the Hybrid Reality Lab (HRL). The HRL combines technologies for merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts needed to integrate it with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way of interfacing with computers using Brain Computer Interface (BCI) technology. On my first task, I created precise virtual tool models (accurate down to thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D-printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D-printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Near the end of my internship, the lab bought a professional-grade 3D scanner, with which I was able to replicate more intricate tools at a much more time-effective rate.
The second task was to investigate the use of BCI to control objects inside the hybrid reality ISS environment. This task looked at using an Electroencephalogram (EEG) headset to collect brain-state data that could be mapped to commands for a computer to execute. On this task, I had a setback with the hardware, which stopped working and was returned to the vendor for repair. However, I was still able to collect some data, process it, and start creating correlation algorithms between the electrical patterns in the brain and the commands we wanted the computer to carry out. I also carried out a test to investigate the comfort of the headset when worn for a long time. The knowledge gained will benefit me in my future career. I learned how to use various modeling and programming tools, including Blender, Maya, Substance Painter, Artec Studio, GitHub, and Unreal Engine 4. I learned how to use a professional-grade 3D scanner and 3D printer. On the BCI project I learned about data mining and how to create correlation algorithms. I also supported various demos, including a live demo of the Hybrid Reality Lab capabilities at Comicpalooza. This internship has given me a good look into engineering at NASA. I developed a more thorough understanding of engineering, and my overall confidence has grown. I have also realized that any problem can be fixed if you try hard enough, and that as an engineer it is your job not only to fix problems but to embrace coming up with solutions to them.
NASA Astrophysics Data System (ADS)
Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten
2014-03-01
Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, render the majority of standard image-analysis methods suboptimal. Furthermore, validation of adapted computer vision methods proves difficult due to missing ground-truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with realistic speckle-pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
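A minimal version of fully developed speckle, multiplicative Rayleigh-distributed amplitude noise, is one of the standard models such a phantom would parameterize per tissue type. The sketch below is pure illustration, not the paper's simulator; `sigma` is the per-tissue parameter a flexible phantom would vary:

```python
import math
import random

def rayleigh(rng, sigma):
    # The magnitude of a zero-mean complex Gaussian scatterer sum is
    # Rayleigh-distributed: the classic fully-developed-speckle model.
    return math.hypot(rng.gauss(0.0, sigma), rng.gauss(0.0, sigma))

def add_speckle(image, sigma=0.5, seed=0):
    """Multiply each echogenicity value by independent Rayleigh noise.
    A tissue-specific phantom would choose sigma (or a different
    distribution entirely) per tissue label, e.g. muscle vs. fat."""
    rng = random.Random(seed)
    return [[pix * rayleigh(rng, sigma) for pix in row] for row in image]

tissue = [[1.0] * 64 for _ in range(64)]  # uniform test region
speckled = add_speckle(tissue)
mean = sum(map(sum, speckled)) / 64**2
print(mean)  # close to the Rayleigh mean sigma*sqrt(pi/2) ~ 0.63
```

Statistical checks like the one on `mean` (and on higher moments) are exactly the kind of quantitative realism measure the abstract describes for validating simulated textures.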
Cybersim: geographic, temporal, and organizational dynamics of malware propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan
2010-01-01
Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (d) allows designated initial nodes where malware gets introduced; (e) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; and (f) provides a graphical visualization of the spread of infection, its severity, businesses affected, etc. to the analyst. We present sample simulations on a national-level network with millions of computers.
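The two core ingredients named above, a scale-free contact network and event-driven spread over it, can be sketched at toy scale. This is a minimal illustration (preferential attachment plus a discrete-time susceptible-infected process) with hypothetical parameters, not the CyberSim implementation.

```python
# Hedged sketch: scale-free network generation + simple malware spread.
# Parameters and the spread rule are illustrative, not from CyberSim.
import random

def preferential_attachment(n, m, rng):
    """Barabasi-Albert-style graph: each new node links to m existing nodes,
    chosen proportionally to degree via a repeated-endpoint list."""
    targets = list(range(m))
    repeated = []            # node ids repeated once per incident edge
    edges = set()
    for new in range(m, n):
        for t in set(targets):
            edges.add((min(new, t), max(new, t)))
            repeated.extend([new, t])
        targets = [rng.choice(repeated) for _ in range(m)]
    return edges

def spread(edges, n, seed_node, p_infect, steps, rng):
    """SI-style spread: each step, every infected host infects each
    susceptible neighbor with probability p_infect."""
    nbrs = {i: set() for i in range(n)}
    for a, b in edges:
        nbrs[a].add(b); nbrs[b].add(a)
    infected = {seed_node}
    for _ in range(steps):
        new = {v for u in infected for v in nbrs[u]
               if v not in infected and rng.random() < p_infect}
        infected |= new
    return infected

rng = random.Random(7)
n = 200
edges = preferential_attachment(n, m=2, rng=rng)
infected = spread(edges, n, seed_node=0, p_infect=0.3, steps=10, rng=rng)
print(len(infected))
```

In a scale-free network the hubs created by preferential attachment accelerate spread, which is why realistic topology matters for the kind of national-scale analysis the abstract describes.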
Crystallographic Lattice Boltzmann Method
Namburi, Manjusha; Krithivasan, Siddharth; Ansumali, Santosh
2016-01-01
Current approaches to Direct Numerical Simulation (DNS) are computationally quite expensive for most realistic scientific and engineering applications of fluid dynamics, such as automobiles or atmospheric flows. The Lattice Boltzmann Method (LBM), with its simplified kinetic descriptions, has emerged as an important tool for simulating hydrodynamics. In a heterogeneous computing environment, it is often preferred due to its flexibility and better parallel scaling. However, direct simulation of realistic applications, without the use of turbulence models, remains a distant dream even with highly efficient methods such as LBM. In LBM, a fictitious lattice with suitable isotropy in the velocity space is considered to recover Navier-Stokes hydrodynamics in the macroscopic limit. The same lattice is mapped onto a Cartesian grid for spatial discretization of the kinetic equation. In this paper, we present an inverted argument of the LBM, making spatial discretization the central theme. We argue that the optimal spatial discretization for LBM is a Body Centered Cubic (BCC) arrangement of grid points. We illustrate an order-of-magnitude gain in efficiency for LBM and thus significant progress towards the feasibility of DNS for realistic flows. PMID:27251098
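The BCC arrangement the paper advocates can be generated directly: the cubic grid points plus one extra point at the center of every cell. A minimal sketch, checking the well-known geometric property that the nearest-neighbor spacing is sqrt(3)/2 times the cell size (shorter than the cubic grid spacing):

```python
# Hedged sketch: generating a body-centered cubic (BCC) point arrangement.
# Illustrative geometry only, not the paper's LBM implementation.
import math

def bcc_points(n, a=1.0):
    """Corner + body-center sites of an n x n x n block of cubic cells."""
    pts = [(i * a, j * a, k * a)
           for i in range(n + 1) for j in range(n + 1) for k in range(n + 1)]
    pts += [((i + 0.5) * a, (j + 0.5) * a, (k + 0.5) * a)
            for i in range(n) for j in range(n) for k in range(n)]
    return pts

pts = bcc_points(2)
# Nearest neighbor of a body center is a cell corner, at distance sqrt(3)/2 * a.
p0 = (0.5, 0.5, 0.5)
dists = sorted(math.dist(p0, q) for q in pts if q != p0)
print(round(dists[0], 4))
```

The tighter, more isotropic neighbor spacing is what makes BCC attractive as a spatial discretization compared with a plain Cartesian grid.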
A survey on hair modeling: styling, simulation, and rendering.
Ward, Kelly; Bertails, Florence; Kim, Tae-Yong; Marschner, Stephen R; Cani, Marie-Paule; Lin, Ming C
2007-01-01
Realistic hair modeling is a fundamental part of creating virtual humans in computer graphics. This paper surveys the state of the art in the major topics of hair modeling: hairstyling, hair simulation, and hair rendering. Because of the difficult, often unsolved problems that arise in all these areas, a broad diversity of approaches are used, each with strengths that make it appropriate for particular applications. We discuss each of these major topics in turn, presenting the unique challenges facing each area and describing solutions that have been presented over the years to handle these complex issues. Finally, we outline some of the remaining computational challenges in hair modeling.
A Computational Model of Human Table Tennis for Robot Application
NASA Astrophysics Data System (ADS)
Mülling, Katharina; Peters, Jan
Table tennis is a difficult motor skill that requires all basic components of a general motor skill learning system. In order to get a step closer to such a generic approach to the automatic acquisition and refinement of table tennis skills, we study table tennis from a human motor control point of view. We make use of the basic models of discrete human movement phases, virtual hitting points, and the operational timing hypothesis. Using these components, we create a computational model aimed at reproducing human-like behavior. We verify the functionality of this model in a physically realistic simulation of a Barrett WAM.
A new polyvinyl alcohol hydrogel vascular model (KEZLEX) for microvascular anastomosis training
Mutoh, Tatsushi; Ishikawa, Tatsuya; Ono, Hidenori; Yasui, Nobuyuki
2010-01-01
Background: Microvascular anastomosis is a challenging neurosurgical technique that requires extensive training to master. We developed a new vascular model (KEZLEX, Ono and Co., Ltd., Tokyo, Japan) as a realistic, non-animal tool for practicing microvascular anastomosis. Methods: The model was manufactured from polyvinyl alcohol hydrogel to provide tubes of 1.0–3.0 mm diameter (available in 0.5-mm increments) and 6–8 cm length that have qualitatively similar surface characteristics, visibility, and stiffness to human donor and recipient arteries for various bypass surgeries, based on three-dimensional computed tomography/magnetic resonance imaging data reconstruction using the Visible Human data set and vessel casts. Results: Trainees can acquire basic microsuturing techniques for end-to-end, end-to-side, and side-to-side anastomoses with handling similar to that of real arteries. To practice standard deep bypass techniques under realistic circumstances, the substitute vessel can be fixed with pins to specific locations on a commercially available brain model. Conclusion: Our vascular prosthesis model is simple and easy to set up for repeated practice, and will help facilitate “off-the-job” training by trainees. PMID:21170365
A Eulerian-Lagrangian Model to Simulate Two-Phase/Particulate Flows
NASA Technical Reports Server (NTRS)
Apte, S. V.; Mahesh, K.; Lundgren, T.
2003-01-01
Figure 1 shows a snapshot of liquid fuel spray coming out of an injector nozzle in a realistic gas-turbine combustor. Here the spray atomization was simulated using a stochastic secondary breakup model (Apte et al. 2003a) with a point-particle approximation for the droplets. Very close to the injector, the spray density is large and the droplets cannot be treated as point particles. The volume displaced by the liquid in this region is significant and can alter the gas-phase flow and spray evolution. In order to address this issue, one can compute the dense spray regime by an Eulerian-Lagrangian technique using advanced interface tracking/level-set methods (Sussman et al. 1994; Tryggvason et al. 2001; Herrmann 2003). This, however, is computationally intensive and may not be viable in realistic complex configurations. We therefore plan to develop a methodology based on an Eulerian-Lagrangian technique which will allow us to capture the essential features of primary atomization using models for the interactions between the fluid and droplets, and which can be directly applied to the standard atomization models used in practice. The numerical scheme for unstructured grids developed by Mahesh et al. (2003) for incompressible flows is modified to take into account the droplet volume fraction. The numerical framework is directly applicable to realistic combustor geometries. Our main objectives in this work are to: develop a numerical formulation based on Eulerian-Lagrangian techniques, with models for interaction terms between the fluid and particles, to capture the Kelvin-Helmholtz type instabilities observed during primary atomization; validate this technique for various two-phase and particulate flows; and assess its applicability to capture the primary atomization of liquid jets in conjunction with secondary atomization models.
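The point-particle approximation mentioned above reduces each droplet to an ordinary differential equation for its velocity. A minimal sketch, assuming simple Stokes drag dv/dt = (u_gas - v) / tau with illustrative values (not the paper's model or parameters):

```python
# Hedged sketch of Lagrangian point-particle tracking under Stokes drag.
# Values are illustrative; the actual solver couples many droplets to an
# unstructured-grid gas-phase solution.

def track_particle(v0, u_gas, tau, dt, steps):
    """Explicit-Euler integration of Stokes drag on one droplet velocity."""
    v = v0
    for _ in range(steps):
        v += dt * (u_gas - v) / tau
    return v

u_gas = 10.0      # local gas-phase velocity, m/s
tau = 1e-3        # droplet aerodynamic response time, s
v = track_particle(v0=0.0, u_gas=u_gas, tau=tau, dt=1e-4, steps=200)
print(round(v, 3))
```

The droplet velocity relaxes exponentially toward the gas velocity on the timescale tau; in the dense near-injector region this one-way picture breaks down, which is exactly the volume-displacement issue the abstract addresses.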
Tsoukias, Nikolaos M; Goldman, Daniel; Vadapalli, Arjun; Pittman, Roland N; Popel, Aleksander S
2007-10-21
A detailed computational model is developed to simulate oxygen transport from a three-dimensional (3D) microvascular network to the surrounding tissue in the presence of hemoglobin-based oxygen carriers. The model accounts for nonlinear O2 consumption, myoglobin-facilitated diffusion and nonlinear oxyhemoglobin dissociation in the RBCs and plasma. It also includes a detailed description of intravascular resistance to O2 transport and is capable of incorporating realistic 3D microvascular network geometries. Simulations in this study were performed using a computer-generated microvascular architecture that mimics morphometric parameters for the hamster cheek pouch retractor muscle. Theoretical results are presented next to corresponding experimental data. Phosphorescence quenching microscopy provided PO2 measurements at the arteriolar and venular ends of capillaries in the hamster retractor muscle before and after isovolemic hemodilution with three different hemodiluents: a non-oxygen-carrying plasma expander and two hemoglobin solutions with different oxygen affinities. Sample results in a microvascular network show an enhancement of diffusive shunting between arterioles, venules and capillaries and a decrease in hemoglobin's effectiveness for tissue oxygenation when its affinity for O2 is decreased. Model simulations suggest that microvascular network anatomy can affect the optimal hemoglobin affinity for reducing tissue hypoxia. O2 transport simulations in realistic representations of microvascular networks should provide a theoretical framework for choosing optimal parameter values in the development of hemoglobin-based blood substitutes.
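The nonlinear oxyhemoglobin dissociation the model incorporates is commonly represented by a Hill-type saturation curve. A minimal sketch with the classic Hill form and illustrative parameter values (the paper's actual dissociation model may differ):

```python
# Hedged sketch of a Hill-type oxyhemoglobin dissociation curve.
# p50 and n below are illustrative, not the paper's fitted values.

def hill_saturation(po2, p50, n):
    """Fractional hemoglobin O2 saturation at partial pressure po2 (mmHg)."""
    return po2 ** n / (p50 ** n + po2 ** n)

p50, n = 28.0, 2.7   # half-saturation pressure (mmHg), Hill coefficient
for po2 in (10.0, 28.0, 100.0):
    print(po2, round(hill_saturation(po2, p50, n), 3))
```

Lowering hemoglobin's O2 affinity corresponds to raising p50, which shifts the curve rightward; the abstract's finding is that the optimal shift depends on the network anatomy, not just on this curve in isolation.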
Size effects on insect hovering aerodynamics: an integrated computational study.
Liu, H; Aono, H
2009-03-01
Hovering is a miracle of insects that is observed in flying insects of all sizes. The effect of size on the flapping-wing aerodynamics of insect hovering is of interest to both the micro-air-vehicle (MAV) community and comparative morphologists. In this study, we present an integrated computational study of such size effects on insect hovering aerodynamics, performed using a biology-inspired dynamic flight simulator that integrates the modelling of realistic wing-body morphology, the modelling of flapping-wing and body kinematics, and an in-house Navier-Stokes solver. Results for four typical insect hovering flights, including a hawkmoth, a honeybee, a fruit fly and a thrips, over a wide range of Reynolds numbers from O(10^4) to O(10^1), are presented, which demonstrate the feasibility of the present integrated computational methods in quantitatively modelling and evaluating the unsteady aerodynamics of insect flapping flight. Our results, based on realistic modelling of insect hovering, therefore offer an integrated understanding of the near-field vortex dynamics, the far-field wake and downwash structures, and their correlation with force production in terms of size and Reynolds number as well as wing kinematics. Our results not only give an integrated interpretation of the similarity and discrepancy of the near- and far-field vortex structures in insect hovering but also demonstrate that our methods can be an effective tool in MAV design.
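The Reynolds-number range quoted above can be illustrated with the common flapping-flight estimate based on mean wingtip speed, Re = (2 Phi f R) c / nu. The sketch below uses rough, illustrative hawkmoth-like and thrips-like parameters (not the paper's measured values):

```python
# Hedged sketch: Reynolds-number estimate for flapping flight.
# All parameter values are rough orders of magnitude, purely illustrative.

def flapping_reynolds(phi, f, r, chord, nu=1.5e-5):
    """Re based on mean wingtip speed 2*phi*f*r (phi: stroke amplitude in rad,
    f: wingbeat frequency in Hz, r: wing length in m, chord in m, nu: air
    kinematic viscosity in m^2/s)."""
    u_tip = 2.0 * phi * f * r
    return u_tip * chord / nu

re_moth = flapping_reynolds(phi=2.0, f=26.0, r=0.05, chord=0.018)
re_thrips = flapping_reynolds(phi=2.0, f=200.0, r=0.0005, chord=0.0002)
print(round(re_moth), round(re_thrips, 1))
```

The four orders of magnitude separating these two estimates is exactly the span over which the study compares hovering aerodynamics.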
Population of 224 realistic human subject-based computational breast phantoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, David W.; Wells, Jered R., E-mail: jered.wells@duke.edu; Sturgeon, Gregory M.
Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230+ dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied, followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined, including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range of breast types, volumes, densities, and parenchymal patterns.
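The fuzzy C-means idea at the heart of the segmentation steps above assigns every sample a membership in each class rather than a hard label. A 1-D toy sketch (the paper's algorithm is 3-D, multi-class, and bias-corrected; data and parameters here are illustrative):

```python
# Hedged sketch of fuzzy C-means clustering on scalar "voxel intensities".
# Toy 1-D version; not the paper's 3-D bias-corrected implementation.

def fuzzy_c_means(xs, c, m=2.0, iters=50):
    """Return (centers, memberships) for c fuzzy clusters of scalar data."""
    centers = [min(xs) + (i + 0.5) * (max(xs) - min(xs)) / c for i in range(c)]
    u = [[0.0] * c for _ in xs]
    for _ in range(iters):
        # Update memberships from current centers.
        for i, x in enumerate(xs):
            d = [max(abs(x - ck), 1e-12) for ck in centers]
            for k in range(c):
                u[i][k] = 1.0 / sum((d[k] / dj) ** (2.0 / (m - 1.0)) for dj in d)
        # Update centers as membership-weighted means.
        for k in range(c):
            num = sum((u[i][k] ** m) * x for i, x in enumerate(xs))
            den = sum(u[i][k] ** m for i in range(len(xs)))
            centers[k] = num / den
    return centers, u

# Toy intensities: a dim (adipose-like) and a bright (glandular-like) cluster.
xs = [0.9, 1.1, 1.0, 0.95, 5.0, 5.2, 4.8, 5.1]
centers, u = fuzzy_c_means(xs, c=2)
print([round(ck, 2) for ck in sorted(centers)])
```

The soft memberships are what make it natural to define fractional glandular density classes, as the abstract describes, instead of a binary adipose/glandular split.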
NASA Technical Reports Server (NTRS)
Zhang, Ming
2005-01-01
The primary goal of this project was to perform theoretical calculations of the propagation of cosmic rays and energetic particles in three-dimensional heliospheric magnetic fields. We used Markov stochastic process simulation to achieve this goal. We developed computational software that can be used to study particle propagation in heliospheric magnetic fields that must be treated in three dimensions; two such examples are the heliospheric magnetic field suggested by Fisk (1996) and a global heliosphere including the region beyond the termination shock. The results from our model calculations were compared with particle measurements from Ulysses, from Earth-based spacecraft such as IMP-8, WIND and ACE, and from the Voyagers and Pioneers in the outer heliosphere, as tests of the magnetic field models. We particularly looked for features of particle variations that allow us to significantly distinguish the Fisk magnetic field from the conventional Parker spiral field. The computer code will eventually lead to a new generation of integrated software for solving complicated problems of particle acceleration, propagation and modulation in a realistic three-dimensional heliosphere, with realistic magnetic fields and solar wind, within a single computational approach.
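The Markov stochastic-process approach replaces the transport equation with an equivalent stochastic differential equation integrated over many pseudo-particles. A 1-D toy sketch using Euler-Maruyama for dx = v dt + sqrt(2 kappa dt) dW (the actual model is 3-D with a full heliospheric field; all parameters here are illustrative):

```python
# Hedged sketch: Euler-Maruyama integration of a 1-D advection-diffusion SDE,
# the basic mechanism behind Markov stochastic process simulation of transport.
import math
import random

def simulate_paths(n_paths, steps, dt, v, kappa, rng):
    """Final positions of n_paths pseudo-particles starting at x = 0."""
    finals = []
    for _ in range(n_paths):
        x = 0.0
        for _ in range(steps):
            x += v * dt + math.sqrt(2.0 * kappa * dt) * rng.gauss(0.0, 1.0)
        finals.append(x)
    return finals

rng = random.Random(1)
v, kappa, dt, steps = 1.0, 0.5, 0.01, 100   # drift, diffusion coeff., step, count
finals = simulate_paths(2000, steps, dt, v, kappa, rng)
mean = sum(finals) / len(finals)
print(round(mean, 2))   # theory: mean drifts to v * steps * dt = 1.0
```

Averaging over pseudo-particle trajectories recovers the particle distribution, which is why the method extends naturally to complicated 3-D field geometries.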
NASA Technical Reports Server (NTRS)
Deese, J. E.; Agarwal, R. K.
1989-01-01
Computational fluid dynamics has an increasingly important role in the design and analysis of aircraft as computer hardware becomes faster and algorithms become more efficient. Progress is being made in two directions: more complex and realistic configurations are being treated, and algorithms based on higher approximations to the complete Navier-Stokes equations are being developed. The literature indicates that linear panel methods can model detailed, realistic aircraft geometries in flow regimes where this approximation is valid. As algorithms including higher approximations to the Navier-Stokes equations are developed, computer resource requirements increase rapidly. Generation of suitable grids becomes more difficult, and the number of grid points required to resolve flow features of interest increases. Recently, the development of large vector computers has enabled researchers to attempt more complex geometries with Euler and Navier-Stokes algorithms. The results of calculations for transonic flow about typical transport and fighter wing-body configurations using the thin-layer Navier-Stokes equations are described, along with flow about helicopter rotor blades using both Euler and Navier-Stokes equations.
Ross K. Meentemeyer; Nik Cunniffe; Alex Cook; David M. Rizzo; Chris A. Gilligan
2010-01-01
Landscape- to regional-scale models of plant epidemics are direly needed to predict large-scale impacts of disease and assess practicable options for control. While landscape heterogeneity is recognized as a major driver of disease dynamics, epidemiological models are rarely applied to realistic landscape conditions due to computational and data limitations. Here we...
Computational Fluid Dynamics of Whole-Body Aircraft
NASA Astrophysics Data System (ADS)
Agarwal, Ramesh
1999-01-01
The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.
ERIC Educational Resources Information Center
Slisko, Josip; Krokhin, Arkady
1995-01-01
Though the field of physics is moving toward more realistic problems and the use of computers and mathematical modeling to promote insightful treatment of physical problems, artificial problems still appear in textbooks in the field of electrostatics. Discusses physical arguments why one of the most popular textbook applications of Coulomb's Law…
Functional Risk Modeling for Lunar Surface Systems
NASA Technical Reports Server (NTRS)
Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed
2010-01-01
We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
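The functional-diversity credit described above follows from elementary reliability arithmetic: a function backed by independent redundant providers is down only when every provider is down. A minimal sketch with illustrative availability numbers (not the study's actual element data):

```python
# Hedged sketch of the redundancy credit in functional availability modeling.
# The 0.90 / 0.95 availabilities below are illustrative placeholders.

def function_availability(element_availabilities):
    """Availability of a function backed by independent redundant elements:
    1 minus the probability that every provider is simultaneously down."""
    p_all_down = 1.0
    for a in element_availabilities:
        p_all_down *= (1.0 - a)
    return 1.0 - p_all_down

rover_only = function_availability([0.90])
rover_plus_habitat = function_availability([0.90, 0.95])  # diverse backup
print(round(rover_only, 3), round(rover_plus_habitat, 3))
```

Tracking per-system state alone would miss this credit; tracking the function itself captures the boost that connected, diverse elements give to each operational mode.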
NASA Astrophysics Data System (ADS)
Ding, Lei; Lai, Yuan; He, Bin
2005-01-01
It is of importance to localize neural sources from scalp-recorded EEG. Low resolution brain electromagnetic tomography (LORETA) has received considerable attention for localizing brain electrical sources. However, most such efforts have used spherical head models to represent the head volume conductor. Investigating the performance of LORETA in a realistic-geometry head model, as compared with the spherical model, will provide useful information to guide interpretation of data obtained using the spherical head model. The performance of LORETA was evaluated by means of computer simulations. The boundary element method was used to solve the forward problem. A three-shell realistic-geometry (RG) head model was constructed from MRI scans of a human subject. Dipole source configurations of a single dipole located at different regions of the brain with varying depth were used to assess the performance of LORETA in different regions of the brain. A three-sphere head model was also used to approximate the RG head model, similar simulations were performed, and the results were compared with those of the RG-LORETA with reference to the locations of the simulated sources. Multi-source localizations were discussed and examples given in the RG head model. Localization errors employing the spherical LORETA, with reference to the source locations within the realistic-geometry head, were about 20-30 mm for the four brain regions evaluated: frontal, parietal, temporal and occipital. Localization errors employing the RG head model were about 10 mm over the same four brain regions. The present simulation results suggest that the use of the RG head model reduces the localization error of LORETA, and that the RG head model based LORETA is desirable if high localization accuracy is needed.
Coalescent: an open-source and scalable framework for exact calculations in coalescent theory
2012-01-01
Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics, nor is there a scalable GUI-based user application. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and attach dynamically a number of output processors. The user application defines jobs in a plug-in-like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts, and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. The models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878
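One classic exact calculation under the infinite-alleles model is the Ewens sampling formula, which gives the probability of an allele-count configuration in closed form. A minimal sketch with an illustrative theta (this is the standard textbook formula, not this framework's code):

```python
# Hedged sketch: Ewens sampling formula for the infinite-alleles model.
# theta below is illustrative; the framework computes such probabilities exactly.
from math import factorial

def ewens_probability(a, theta):
    """Probability of configuration a, where a[j-1] is the number of allele
    types carried by exactly j individuals in the sample."""
    n = sum(j * aj for j, aj in enumerate(a, start=1))
    rising = 1.0                 # theta_(n) = theta (theta+1) ... (theta+n-1)
    for i in range(n):
        rising *= theta + i
    p = factorial(n) / rising
    for j, aj in enumerate(a, start=1):
        p *= (theta / j) ** aj / factorial(aj)
    return p

theta = 1.5
# All configurations for a sample of n = 3: three singletons, 1+2, one triple.
configs = [(3, 0, 0), (1, 1, 0), (0, 0, 1)]
total = sum(ewens_probability(a, theta) for a in configs)
print(round(total, 10))
```

Summing over all configurations of a sample gives 1, a handy sanity check of the kind such a framework can expose as a pluggable computation.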
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, George S.; Brown, William Michael
2007-09-01
Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.
Giese, Martin A; Rizzolatti, Giacomo
2015-10-07
Action recognition has received enormous interest in the field of neuroscience over the last two decades. In spite of this interest, the knowledge in terms of fundamental neural mechanisms that provide constraints for underlying computations remains rather limited. This fact stands in contrast with a wide variety of speculative theories about how action recognition might work. This review focuses on new fundamental electrophysiological results in monkeys, which provide constraints for the detailed underlying computations. In addition, we review models for action recognition and processing that have concrete mathematical implementations, as opposed to conceptual models. We think that only such implemented models can be meaningfully linked quantitatively to physiological data and have a potential to narrow down the many possible computational explanations for action recognition. In addition, only concrete implementations allow judging whether postulated computational concepts have a feasible implementation in terms of realistic neural circuits. Copyright © 2015 Elsevier Inc. All rights reserved.
Simple model of hydrophobic hydration.
Lukšič, Miha; Urbic, Tomaz; Hribar-Lee, Barbara; Dill, Ken A
2012-05-31
Water is an unusual liquid in its solvation properties. Here, we model the process of transferring a nonpolar solute into water. Our goal was to capture the physical balance between water's hydrogen bonding and van der Waals interactions in a model that is simple enough to be nearly analytical and not heavily computational. We develop a 2-dimensional Mercedes-Benz-like model of water with which we compute the free energy, enthalpy, entropy, and the heat capacity of transfer as a function of temperature, pressure, and solute size. As validation, we find that this model gives the same trends as Monte Carlo simulations of the underlying 2D model and gives qualitative agreement with experiments. The advantages of this model are that it gives simple insights and that computational time is negligible. It may provide a useful starting point for developing more efficient and more realistic 3D models of aqueous solvation.
A blended continuous–discontinuous finite element method for solving the multi-fluid plasma model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sousa, E.M., E-mail: sousae@uw.edu; Shumlak, U., E-mail: shumlak@uw.edu
The multi-fluid plasma model represents electrons, multiple ion species, and multiple neutral species as separate fluids that interact through short-range collisions and long-range electromagnetic fields. The model spans a large range of temporal and spatial scales, which renders the model stiff and presents numerical challenges. To address the large range of timescales, a blended continuous and discontinuous Galerkin method is proposed, where the massive ion and neutral species are modeled using an explicit discontinuous Galerkin method while the electrons and electromagnetic fields are modeled using an implicit continuous Galerkin method. This approach is able to capture large-gradient ion and neutral physics like shock formation, while resolving high-frequency electron dynamics in a computationally efficient manner. The details of the Blended Finite Element Method (BFEM) are presented. The numerical method is benchmarked for accuracy and tested using a two-fluid one-dimensional soliton problem and an electromagnetic shock problem. The results are compared to conventional finite volume and finite element methods, and demonstrate that the BFEM is particularly effective in resolving physics in stiff problems involving realistic physical parameters, including realistic electron mass and speed of light. The benefit is illustrated by computing a three-fluid plasma application that demonstrates species separation in multi-component plasmas.
Construction of hexahedral finite element mesh capturing realistic geometries of a petroleum reserve
Park, Byoung Yoon; Roberts, Barry L.; Sobolik, Steven R.
2017-07-27
The three-dimensional finite element mesh capturing realistic geometries of the Bayou Choctaw site has been constructed using sonar and seismic survey data obtained from the field. The mesh consists of hexahedral elements because the salt constitutive model is coded using hexahedral elements. Various ideas and techniques for constructing finite element meshes capturing artificially and naturally formed geometries are provided. Techniques to reduce the number of elements as much as possible, to save computer run time while maintaining computational accuracy, are also introduced. The steps and methodologies could be applied to construct the meshes of the Big Hill, Bryan Mound, and West Hackberry strategic petroleum reserve sites. The methodology could also be applied to masses with complicated shapes for various civil and geological structures.
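The bookkeeping behind a structured hexahedral mesh, nodes on a grid and an 8-node connectivity list per cell, can be sketched as follows. This is only the index skeleton; real salt-dome meshes are deformed to conform to the surveyed geometry.

```python
# Hedged sketch of structured hexahedral mesh connectivity.
# Illustrative bookkeeping only, not the site-specific meshing workflow.

def hex_mesh(nx, ny, nz):
    """Return (node_count, elements) for a structured hexahedral block of
    nx * ny * nz cells; each element lists its 8 corner node indices."""
    def nid(i, j, k):                       # node index from grid coordinates
        return (k * (ny + 1) + j) * (nx + 1) + i
    elements = []
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                elements.append([nid(i,     j,     k),     nid(i + 1, j,     k),
                                 nid(i + 1, j + 1, k),     nid(i,     j + 1, k),
                                 nid(i,     j,     k + 1), nid(i + 1, j,     k + 1),
                                 nid(i + 1, j + 1, k + 1), nid(i,     j + 1, k + 1)])
    return (nx + 1) * (ny + 1) * (nz + 1), elements

n_nodes, elems = hex_mesh(4, 3, 2)
print(n_nodes, len(elems))   # 60 nodes, 24 hexahedral elements
```

Element-count reduction of the kind the abstract mentions amounts to coarsening this grid where the solution allows it, while keeping the hexahedral topology the constitutive model requires.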
Drawert, Brian; Engblom, Stefan; Hellander, Andreas
2012-06-22
Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible enough to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry- and mesh-handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model-building interface and written in a low-level language for computational efficiency. The connection to the geometry-handling software is realized via a Matlab interface, which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines.
Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
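The reaction-diffusion master equation formalism that URDME builds on can be illustrated with a minimal stochastic simulation. The sketch below is not URDME code: it is a hypothetical single-species system on a periodic 1D lattice, stepped with Gillespie's algorithm over diffusive hops and a degradation reaction.

```python
import numpy as np

def rdme_ssa(n0, D, h, k, t_end, seed=0):
    """Minimal RDME sketch: one species on a periodic 1D lattice.

    Events per voxel: hop left and hop right (rate D/h^2 per molecule
    in each direction) and degradation A -> 0 (rate k per molecule).
    """
    rng = np.random.default_rng(seed)
    n = np.asarray(n0, dtype=int).copy()
    V = n.size
    d = D / h**2          # per-molecule hop rate in one direction
    t = 0.0
    while t < t_end:
        # propensity vector: [hop left | hop right | degrade], per voxel
        a = np.concatenate([d * n, d * n, k * n]).astype(float)
        a_tot = a.sum()
        if a_tot == 0.0:
            break                       # no molecules left
        t += rng.exponential(1.0 / a_tot)
        if t >= t_end:
            break
        # pick the next event by inverting the cumulative propensity
        j = np.searchsorted(np.cumsum(a), rng.random() * a_tot)
        v = j % V
        n[v] -= 1
        if j < V:                       # hop to the left neighbour
            n[(v - 1) % V] += 1
        elif j < 2 * V:                 # hop to the right neighbour
            n[(v + 1) % V] += 1
        # else: degradation, the molecule is simply removed
    return n
```

Placing all molecules in one voxel and letting them hop spreads them over the lattice while conserving the total count when k = 0.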
A parallel computational model for GATE simulations.
Rannou, F R; Vega-Acevedo, N; El Bitar, Z
2013-12-01
GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently to Positron Emission Tomography (PET) experiments, because it requires centralized coincidence processing and incurs large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing while maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of the number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
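The statistical equivalence check mentioned above can be sketched with a standard two-sided Mann-Whitney test. The per-run tally counts below are synthetic stand-ins drawn from the same distribution, not GATE output.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)

# Hypothetical per-run tally counts from the sequential and the
# parallel execution modes (here: both drawn from Poisson(1000))
seq_tallies = rng.poisson(lam=1000, size=30)
par_tallies = rng.poisson(lam=1000, size=30)

stat, p = mannwhitneyu(seq_tallies, par_tallies, alternative="two-sided")

# A large p-value means we cannot reject the hypothesis that both
# modes draw tallies from the same distribution ("equivalent but
# not equal" output).
equivalent = p > 0.05
```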
Mind the Gap! A Journey towards Computational Toxicology.
Mangiatordi, Giuseppe Felice; Alberga, Domenico; Altomare, Cosimo Damiano; Carotti, Angelo; Catto, Marco; Cellamare, Saverio; Gadaleta, Domenico; Lattanzi, Gianluca; Leonetti, Francesco; Pisani, Leonardo; Stefanachi, Angela; Trisciuzzi, Daniela; Nicolotti, Orazio
2016-09-01
Computational methods have advanced toxicology towards the development of target-specific models based on a clear cause-effect rationale. However, the predictive potential of these models has both strengths and weaknesses. On the positive side, in silico models are valuable, cheap alternatives to in vitro and in vivo experiments. On the other hand, the uncritical use of in silico methods can mislead end-users with illusory results. The focus of this review is on the basic scientific and regulatory recommendations for the derivation and application of computational models. Attention is paid to the interplay between computational toxicology and drug discovery and development. Avoiding the easy temptation of an overoptimistic future, we report our view on what can, or cannot, realistically be done. Indeed, studies of safety/toxicity represent a key element of the chemical prioritization programs carried out by chemical industries, and primarily by pharmaceutical companies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Shakiba, Maryam; Ozer, Hasan; Ziyadi, Mojtaba; Al-Qadi, Imad L.
2016-11-01
The structure-induced rolling resistance of pavements, and its impact on vehicle fuel consumption, is investigated in this study. The structural response of pavement causes additional rolling resistance and fuel consumption of vehicles through deformation of pavement and various dissipation mechanisms associated with inelastic material properties and damping. Accurate and computationally efficient models are required to capture these mechanisms and obtain realistic estimates of changes in vehicle fuel consumption. Two mechanistic-based approaches are currently used to calculate vehicle fuel consumption as related to structural rolling resistance: dissipation-induced and deflection-induced methods. The deflection-induced approach is adopted in this study, and realistic representation of pavement-vehicle interactions (PVIs) is incorporated. In addition to considering viscoelastic behavior of asphalt concrete layers, the realistic representation of PVIs in this study includes non-uniform three-dimensional tire contact stresses and dynamic analysis in pavement simulations. The effects of analysis type, tire contact stresses, pavement viscoelastic properties, pavement damping coefficients, vehicle speed, and pavement temperature are then investigated.
NASA Astrophysics Data System (ADS)
Sakimoto, S. E. H.
2016-12-01
Planetary volcanism has redefined what is considered volcanism. "Magma" may now be anything from the molten rock familiar at terrestrial volcanoes to the cryovolcanic ammonia-water mixes erupted on outer solar system moons. However, even with unfamiliar compositions and source mechanisms, we find familiar landforms such as volcanic channels, lakes, flows, and domes, and thus a multitude of possibilities for modeling. As on Earth, these landforms lend themselves to analysis for estimating storage, eruption, and/or flow rates. This has potential pitfalls, as extension of the simplified analytic models we often use for terrestrial features into unfamiliar parameter space might yield misleading results. Our most commonly used tools for estimating flow and cooling have tended to lag significantly behind the state of the art; the easiest methods to use are neither realistic nor accurate, while the more realistic and accurate computational methods are not simple to use. Since the latter computational tools tend to be both expensive and demand a significant learning curve, there is a need for a user-friendly approach that still takes advantage of their accuracy. One method is to use the computational package to generate a server-based tool that lets less computationally inclined users obtain accurate results over their range of input parameters for a given problem geometry. A second method is to use the computational package to generate a polynomial empirical solution for each class of flow geometry that can be solved fairly easily by anyone with a spreadsheet. In this study, we demonstrate both approaches for several channel flow and lava lake geometries with terrestrial and extraterrestrial examples and compare their results. Specifically, we model cooling rectangular channel flow of a yield-strength material, with applications to Mauna Loa, Kilauea, Venus, and Mars.
This approach also shows promise with model applications to lava lakes, magma flow through cracks, and volcanic dome formation.
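The second approach, a spreadsheet-friendly polynomial surrogate fitted to runs of an expensive model, can be sketched as follows. The "expensive model" here is a cheap analytical stand-in (the Jeffreys sheet-flow equation with made-up material parameters), not the authors' computational package.

```python
import numpy as np

# Stand-in for an expensive CFD run: volumetric flux per unit width of
# an isothermal Newtonian sheet flow (Jeffreys equation). The density,
# viscosity, and slope values are hypothetical.
g, rho, mu, theta = 9.81, 2600.0, 100.0, np.radians(5.0)

def expensive_model(depth):
    return rho * g * np.sin(theta) * depth**3 / (3.0 * mu)

depths = np.linspace(0.5, 5.0, 20)     # "design points" (flow depths, m)
runs = expensive_model(depths)          # pretend each is a costly run

# Fit a cubic polynomial surrogate usable in any spreadsheet
coeffs = np.polyfit(depths, runs, deg=3)
surrogate = np.poly1d(coeffs)

# Relative error of the surrogate at an unsampled depth
d_test = 2.34
err = abs(surrogate(d_test) - expensive_model(d_test)) / expensive_model(d_test)
```

Because the stand-in model is itself cubic in depth, the fit is essentially exact; for a real computational package the residual at the design points would indicate whether a higher polynomial degree is needed.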
Accurate Ray-tracing of Realistic Neutron Star Atmospheres for Constraining Their Parameters
NASA Astrophysics Data System (ADS)
Vincent, Frederic H.; Bejger, Michał; Różańska, Agata; Straub, Odele; Paumard, Thibaut; Fortin, Morgane; Madej, Jerzy; Majczyna, Agnieszka; Gourgoulhon, Eric; Haensel, Paweł; Zdunik, Leszek; Beldycki, Bartosz
2018-03-01
Thermal-dominated X-ray spectra of neutron stars in quiescent, transient X-ray binaries and neutron stars that undergo thermonuclear bursts are sensitive to mass and radius. The mass–radius relation of neutron stars depends on the equation of state (EoS) that governs their interior. Constraining this relation accurately is therefore of fundamental importance to understand the nature of dense matter. In this context, we introduce a pipeline to calculate realistic model spectra of rotating neutron stars with hydrogen and helium atmospheres. An arbitrarily fast-rotating neutron star with a given EoS generates the spacetime in which the atmosphere emits radiation. We use the LORENE/NROTSTAR code to compute the spacetime numerically and the ATM24 code to solve the radiative transfer equations self-consistently. Emerging specific intensity spectra are then ray-traced through the neutron star’s spacetime from the atmosphere to a distant observer with the GYOTO code. Here, we present and test our fully relativistic numerical pipeline. To discuss and illustrate the importance of realistic atmosphere models, we compare our model spectra to simpler models like the commonly used isotropic color-corrected blackbody emission. We highlight the importance of considering realistic model-atmosphere spectra together with relativistic ray-tracing to obtain accurate predictions. We also insist upon the crucial impact of the star’s rotation on the observables. Finally, we close a controversy that has been ongoing in the literature in recent years, regarding the validity of the ATM24 code.
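The simple baseline the authors compare against, an isotropic color-corrected ("diluted") blackbody, can be sketched as below. The color-correction factor and the (arbitrary) normalization are illustrative assumptions, not values from the paper.

```python
import numpy as np

KB_KEV = 8.617333262e-8   # Boltzmann constant [keV/K]

def bb_photon_flux(E_keV, T_eff_K, f_c=1.4):
    """Color-corrected blackbody photon spectrum (arbitrary units).

    The Planck photon spectrum ~ E^2 / (exp(E/kT_c) - 1) is evaluated
    at the color temperature T_c = f_c * T_eff and diluted by 1/f_c^4,
    a common simple stand-in for a neutron-star atmosphere spectrum.
    The value f_c = 1.4 is an assumed, typical hardness factor.
    """
    T_c = f_c * T_eff_K
    x = E_keV / (KB_KEV * T_c)
    return E_keV**2 / np.expm1(x) / f_c**4
```

Realistic atmosphere models deviate from this shape mainly in the Wien tail and in the angular dependence of the emission, which is one reason the paper advocates full radiative-transfer spectra.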
Efficient scatter model for simulation of ultrasound images from computed tomography data
NASA Astrophysics Data System (ADS)
D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.
2015-12-01
Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is a growing interest in the use of this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run either on notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering-map generation was revised, with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results, where a performance of up to 55 fps was achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
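The scatter model described, multiplicative noise followed by convolution with a PSF, can be sketched as follows. This is a minimal stand-in: the echogenicity map, the noise statistics, and the separable PSF shape are assumptions, not the authors' tailored implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(3)

# Echogenicity map derived from CT (stand-in: uniform tissue with a
# brighter circular inclusion)
img = np.full((128, 128), 0.4)
yy, xx = np.mgrid[:128, :128]
img[(yy - 64)**2 + (xx - 64)**2 < 20**2] = 0.9

# Multiplicative scatter: exponentially distributed intensity noise,
# as in a fully developed speckle model
noise = rng.exponential(scale=1.0, size=img.shape)
scattered = img * noise

# Assumed separable PSF: Gaussian-weighted cosine along the axial
# direction (pulse oscillation), plain Gaussian laterally
ax = np.arange(-8, 9)
psf = np.outer(np.exp(-ax**2 / 18.0) * np.cos(2 * np.pi * ax / 4.0),
               np.exp(-ax**2 / 32.0))

# Convolving the noisy map with the PSF yields the speckle texture
speckle_img = np.abs(fftconvolve(scattered, psf, mode="same"))
```

Because the noise multiplies the CT-derived map before convolution, the speckle brightness tracks the underlying tissue echogenicity, which is what makes the texture coherent with the transducer view.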
A geostationary Earth orbit satellite model using Easy Java Simulation
NASA Astrophysics Data System (ADS)
Wee, Loo Kang; Hwee Goh, Giam
2013-01-01
We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic 3D view and associated learning in the real world; (2) comparative visualization of permanent geostationary satellites; (3) examples of non-geostationary orbits of different rotation senses, periods and planes; and (4) an incorrect physics model for conceptual discourse. General feedback from the students has been relatively positive, and we hope teachers will find the computer model useful in their own classes.
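The simplified physics behind such a model reduces to Kepler's third law with a one-sidereal-day period. A minimal sketch (not the EJS model itself; standard constants only):

```python
import math

# Geostationary orbit from Kepler's third law: the orbital period T
# equals one sidereal day, and GM * T^2 = 4 * pi^2 * r^3.
GM = 3.986004418e14        # Earth's gravitational parameter [m^3/s^2]
T = 86164.0905             # sidereal day [s]

r = (GM * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)   # orbital radius [m]
altitude = r - 6.371e6     # height above the mean Earth radius [m]
omega = 2 * math.pi / T    # constant angular velocity [rad/s]
```

This gives the familiar radius of about 42,164 km (altitude near 35,800 km); in the simulation the satellite's longitude is simply advanced by omega * dt each frame.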
Experimental and computational analysis of sound absorption behavior in needled nonwovens
NASA Astrophysics Data System (ADS)
Soltani, Parham; Azimian, Mehdi; Wiegmann, Andreas; Zarrebini, Mohammad
2018-07-01
In this paper, the application of X-ray micro-computed tomography (μCT) together with fluid simulation techniques to predict the sound absorption characteristics of needled nonwovens is discussed. Melt-spun polypropylene fibers of different fineness were made on an industrial-scale compact melt spinning line. A conventional batt forming-needling line was used to prepare the needled samples. The normal-incidence sound absorption coefficients were measured using the impedance tube method. Realistic 3D images of the samples at micron-level spatial resolution were obtained using μCT. The morphology of the fabrics was characterized in terms of porosity, fiber diameter distribution, fiber curliness, and pore size distribution from the high-resolution 3D images using the GeoDict software. In order to calculate the permeability and flow resistivity of the media, fluid flow was simulated by numerically solving incompressible laminar Newtonian flow through the 3D pore space of the realistic structures. Based on the flow resistivity, the frequency-dependent acoustic absorption coefficient of the needled nonwovens was predicted using the empirical model of Delany and Bazley (1970) and its associated modified models. The results were compared and validated against the corresponding experimental results. Based on the morphological analysis, it was concluded that, for a given weight per unit area, finer fibers result in a higher number of fibers in the sample. This results in the formation of smaller and more tortuous pores, which in turn leads to an increase in the flow resistivity of the media. It was established that, among the empirical models, Mechel's modification to the Delany and Bazley model had superior predictive ability compared with the original Delany and Bazley model over the frequency range of 100-5000 Hz and is well suited to polypropylene needled nonwovens.
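The empirical prediction step can be sketched with the original Delany and Bazley (1970) formulas for a hard-backed porous layer. The flow resistivity and thickness below are hypothetical, and the sign convention follows the common e^{jωt} choice; Mechel's modified coefficients would replace the constants here.

```python
import numpy as np

def delany_bazley_alpha(f, sigma, d, rho0=1.21, c0=343.0):
    """Normal-incidence absorption coefficient of a hard-backed porous
    layer from the Delany-Bazley (1970) empirical model.

    f: frequency [Hz]; sigma: flow resistivity [Pa.s/m^2]; d: layer
    thickness [m]; rho0, c0: air density and sound speed.
    """
    X = rho0 * f / sigma                     # dimensionless parameter
    # characteristic impedance and complex wavenumber of the material
    Zc = rho0 * c0 * (1 + 0.0571 * X**-0.754 - 1j * 0.087 * X**-0.732)
    k = (2 * np.pi * f / c0) * (1 + 0.0978 * X**-0.700 - 1j * 0.189 * X**-0.595)
    Zs = -1j * Zc / np.tan(k * d)            # surface impedance, rigid backing
    R = (Zs - rho0 * c0) / (Zs + rho0 * c0)  # pressure reflection coefficient
    return 1.0 - np.abs(R)**2
```

For a typical nonwoven, absorption predicted this way rises with frequency, which is why higher flow resistivity (finer fibers) shifts useful absorption toward lower frequencies.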
Modeling flow around bluff bodies and predicting urban dispersion using large eddy simulation.
Tseng, Yu-Heng; Meneveau, Charles; Parlange, Marc B
2006-04-15
Modeling air pollutant transport and dispersion in urban environments is especially challenging due to complex ground topography. In this study, we describe a large eddy simulation (LES) tool including a new dynamic subgrid closure and boundary treatment to model urban dispersion problems. The numerical model is developed, validated, and extended to a realistic urban layout. In such applications fairly coarse grids must be used in which each building can be represented using relatively few grid-points only. By carrying out LES of flow around a square cylinder and of flow over surface-mounted cubes, the coarsest resolution required to resolve the bluff body's cross section while still producing meaningful results is established. Specifically, we perform grid refinement studies showing that at least 6-8 grid points across the bluff body are required for reasonable results. The performance of several subgrid models is also compared. Although effects of the subgrid models on the mean flow are found to be small, dynamic Lagrangian models give a physically more realistic subgrid-scale (SGS) viscosity field. When scale-dependence is taken into consideration, these models lead to more realistic resolved fluctuating velocities and spectra. These results set the minimum grid resolution and subgrid model requirements needed to apply LES in simulations of neutral atmospheric boundary layer flow and scalar transport over a realistic urban geometry. The results also illustrate the advantages of LES over traditional modeling approaches, particularly its ability to take into account the complex boundary details and the unsteady nature of atmospheric boundary layer flow. Thus LES can be used to evaluate probabilities of extreme events (such as probabilities of exceeding threshold pollutant concentrations). Some comments about computer resources required for LES are also included.
NASA Astrophysics Data System (ADS)
Pécoul, S.; Heuraux, S.; Koch, R.; Leclert, G.; Bécoulet, A.; Colas, L.
1999-09-01
Self-consistent calculations of the 3D electric field patterns between the screen and the plasma have been made with the ICANT code for realistic antennas. Here we explain how the ICRH antennas of the Tore Supra tokamak are modelled.
Towards a Universal Calving Law: Modeling Ice Shelves Using Damage Mechanics
NASA Astrophysics Data System (ADS)
Whitcomb, M.; Bassis, J. N.; Price, S. F.; Lipscomb, W. H.
2017-12-01
Modeling iceberg calving from ice shelves and ice tongues is a particularly difficult problem in glaciology because of the wide range of observed calving rates. Ice shelves naturally calve large tabular icebergs at infrequent intervals, but may instead calve smaller bergs regularly or disintegrate due to hydrofracturing in warmer conditions. Any complete theory of iceberg calving in ice shelves must be able to generate realistic calving rate values depending on the magnitudes of the external forcings. Here we show that a simple damage evolution law, which represents crevasse distributions as a continuum field, produces reasonable estimates of ice shelf calving rates when added to the Community Ice Sheet Model (CISM). Our damage formulation is based on a linear stability analysis and depends upon the bulk stress and strain rate in the ice shelf, as well as the surface and basal melt rates. The basal melt parameter in our model enhances crevasse growth near the ice shelf terminus, leading to an increased iceberg production rate. This implies that increasing ocean temperatures underneath ice shelves will drive ice shelf retreat, as has been observed in the Amundsen and Bellingshausen Seas. We show that our model predicts broadly correct calving rates for ice tongues ranging in length from 10 km (Erebus) to over 100 km (Drygalski), by matching the computed steady state lengths to observations. In addition, we apply the model to idealized Antarctic ice shelves and show that we can also predict realistic ice shelf extents. Our damage mechanics model provides a promising, computationally efficient way to compute calving fluxes and links ice shelf stability to climate forcing.
Arai, Noriyoshi; Yasuoka, Kenji; Koishi, Takahiro; Ebisuzaki, Toshikazu; Zeng, Xiao Cheng
2013-06-12
The "asymmetric Brownian ratchet model", a variation of Feynman's ratchet and pawl system, is invoked to understand the kinesin walking behavior along a microtubule. The model system, consisting of a motor and a rail, can exhibit two distinct binding states, namely, the random Brownian state and the asymmetric potential state. When the system is transformed back and forth between the two states, the motor can be driven to "walk" in one direction. Previously, we suggested a fundamental mechanism, that is, bubble formation in a nanosized channel surrounded by hydrophobic atoms, to explain the transition between the two states. In this study, we propose a more realistic and viable switching method in our computer simulation of molecular motor walking. Specifically, we propose a thermosensitive polymer model with which the transition between the two states can be controlled by temperature pulses. Based on this new motor system, the stepping size and stepping time of the motor can be recorded. Remarkably, the "walking" behavior observed in the newly proposed model resembles that of the real motor protein. The bubble-formation-based motor not only can be highly efficient but also offers new insights into the physical mechanisms of real biomolecular motors.
Collision detection and modeling of rigid and deformable objects in laparoscopic simulator
NASA Astrophysics Data System (ADS)
Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru
2015-03-01
Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at no less than 30 fps. Our current laparoscopic simulator detects the collision between a point on the tool tip and the organ surfaces, in which haptic devices are attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation response. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
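One way to realize multi-point collision detection along a rigid rod is to sample points on the tool axis and test them against surface vertices. The sketch below illustrates that general idea under a simple proximity threshold; it is not the simulator's actual algorithm, and `radius` and `n_samples` are illustrative parameters.

```python
import numpy as np

def rod_collisions(p0, p1, surface_pts, n_samples=16, radius=2.0):
    """Multi-point collision test along a rigid rod (sketch).

    Samples n_samples points on the segment p0-p1 (the tool shaft)
    and returns the indices of surface vertices that lie closer to
    any sample than the assumed rod radius.
    """
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    samples = (1 - t) * p0 + t * p1                      # (n_samples, 3)
    # pairwise distances between rod samples and surface vertices
    d = np.linalg.norm(samples[:, None, :] - surface_pts[None, :, :], axis=-1)
    hit_mask = d.min(axis=0) < radius
    return np.nonzero(hit_mask)[0]
```

In a real-time simulator this brute-force distance test would be pruned with a spatial data structure (e.g. a bounding-volume hierarchy) so the 1 kHz haptic loop is not overwhelmed.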
NASA Astrophysics Data System (ADS)
Šimkanin, Ján; Kyselica, Juraj
2017-12-01
Numerical simulations of the geodynamo are becoming more realistic because of advances in computer technology. Here, the geodynamo model is investigated numerically at extremely low Ekman and magnetic Prandtl numbers using the PARODY dynamo code. These parameters are more realistic than those used in previous numerical studies of the geodynamo. Our model is based on the Boussinesq approximation, and the temperature gradient between the upper and lower boundaries is the source of convection. This study attempts to answer the question of how realistic geodynamo models are. Numerical results show that our dynamo belongs to the strong-field dynamos. The generated magnetic field is dipolar and large-scale, while convection is small-scale and sheet-like flows (plumes) are preferred to columnar convection. The scales of the magnetic and velocity fields are separated, which enables hydromagnetic dynamos to maintain the magnetic field at low magnetic Prandtl numbers. The inner core rotation rate is lower than that in previous geodynamo models. On the other hand, the dimensional magnitudes of the velocity and magnetic fields, and those of the magnetic and viscous dissipation, are larger than those expected in the Earth's core, due to the chosen parameter range.
Computational studies of photoluminescence from disordered nanocrystalline systems
NASA Astrophysics Data System (ADS)
John, George
2000-03-01
The size (d) dependence of emission energies from semiconductor nanocrystallites has been shown to follow an effective exponent (d^-β) determined by the disorder in the system (V. Ranjan, V. A. Singh and G. C. John, Phys. Rev. B 58, 1158 (1998)). Our earlier calculation was based on a simple quantum confinement model assuming a normal distribution of crystallite sizes. This model is now extended to study realistic systems with a lognormal distribution of particle sizes, accounting for carrier hopping and nonradiative transitions. Computer simulations of this model, performed using the Microcal Origin software, can explain several conflicting experimental results reported in the literature.
Introducing DeBRa: a detailed breast model for radiological studies
NASA Astrophysics Data System (ADS)
Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.
2009-07-01
Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support studies of these advanced imaging systems; such studies call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen mammography, digital mammography, and digital breast tomosynthesis studies, or a non-compressed breast, as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments, and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours, and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels, including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study, and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.
Simplified realistic human head model for simulating Tumor Treating Fields (TTFields).
Wenger, Cornelia; Bomzon, Ze'ev; Salvador, Ricardo; Basser, Peter J; Miranda, Pedro C
2016-08-01
Tumor Treating Fields (TTFields) are alternating electric fields in the intermediate frequency range (100-300 kHz) and of low intensity (1-3 V/cm). TTFields are an anti-mitotic treatment against solid tumors, which is approved for Glioblastoma Multiforme (GBM) patients. These electric fields are induced non-invasively by transducer arrays placed directly on the patient's scalp. Cell culture experiments showed that treatment efficacy depends on the induced field intensity. In clinical practice, software called NovoTal™ uses head measurements to estimate the optimal array placement to maximize the electric field delivered to the tumor. Computational studies predict an increase in the tumor's electric field strength when the transducer arrays are adapted to its location. Ideally, a personalized head model could be created for each patient, to calculate the electric field distribution for the specific situation. Thus, the optimal transducer layout could be inferred from field calculation rather than from distance measurements. Nonetheless, creating realistic head models of patients is time-consuming and often needs user interaction, because automated image segmentation is prone to failure. This study presents a first approach to creating simplified head models consisting of convex hulls of the tissue layers. The model is able to account for anisotropic conductivity in the cortical tissues by using a tensor representation estimated from Diffusion Tensor Imaging. The induced electric field distribution is compared in the simplified and realistic head models. The average field intensities in the brain and tumor are generally slightly higher in the realistic head model, with a maximal ratio of 114% for a simplified model with reasonable layer thicknesses. Thus, the present pipeline is a fast and efficient means towards personalized head models, with less complexity involved in characterizing tissue interfaces, while enabling accurate predictions of the electric field distribution.
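The convex-hull simplification step can be sketched with `scipy.spatial.ConvexHull`. The point cloud below is a random stand-in for a segmented tissue surface; in the study these points would come from the patient's MR images.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)

# Stand-in for segmented scalp-surface voxel coordinates (mm),
# roughly head-sized along each axis
scalp_points = rng.normal(size=(500, 3)) * [90.0, 70.0, 80.0]

# The convex hull replaces the irregular segmented surface with a
# watertight simplified boundary for that tissue layer
hull = ConvexHull(scalp_points)

# hull.simplices indexes the triangles of the simplified surface;
# hull.volume approximates the volume enclosed by the layer
surface_triangles = scalp_points[hull.simplices]
```

Nesting one hull per tissue layer (scalp, skull, CSF, brain) yields a layered model whose interfaces need no manual segmentation repair, which is the time saving the paper targets.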
Balaya, V; Uhl, J-F; Lanore, A; Salachas, C; Samoyeau, T; Ngo, C; Bensaid, C; Cornou, C; Rossi, L; Douard, R; Bats, A-S; Lecuru, F; Delmas, V
2016-05-01
To achieve a 3D vectorial model of a female pelvis by Computer-Assisted Anatomical Dissection and to assess its educational and surgical applications. From the "visible female" database of the Visible Human Project(®) (VHP) of the National Library of Medicine (NLM, United States), we used 739 transverse anatomical slices of 0.33 mm thickness extending from L4 to the trochanters. Manual segmentation of each anatomical structure was done with the Winsurf(®) software, version 4.3. Each anatomical element was built as a separate vectorial object. The complete color-rendered vectorial model with realistic textures was exported in 3D PDF format to allow real-time interactive manipulation with the Acrobat(®) Pro version 11 software. Each element can be handled separately at any transparency, which allows anatomical learning by system: skeleton, pelvic organs, urogenital system, arterial and venous vascularization. This 3D anatomical model can be used as a data bank for teaching fundamental anatomy. This realistic and interactive 3D vectorial model constitutes an efficient educational tool for teaching the anatomy of the pelvis. 3D printing of the pelvis is possible with the new printers. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review
Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.
2009-01-01
Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508
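A deformation model of the simplest kind surveyed above (a mass-spring network) can be sketched with a single explicit-Euler step. The spring constants, damping, masses, and topology here are illustrative assumptions; production simulators typically use implicit integrators or FE methods for stability.

```python
import numpy as np

def mass_spring_step(x, v, edges, rest_len, k_s, k_d, mass, dt, pinned):
    """One explicit-Euler step of a mass-spring tissue model (sketch).

    x, v: (n, 3) vertex positions and velocities; edges: list of
    (i, j) vertex index pairs; rest_len: rest length per edge;
    pinned: (n,) boolean mask of fixed (boundary) vertices.
    """
    f = np.zeros_like(x)
    for (i, j), L0 in zip(edges, rest_len):
        d = x[j] - x[i]
        L = np.linalg.norm(d)
        if L < 1e-12:
            continue
        u = d / L
        rel_v = np.dot(v[j] - v[i], u)          # closing speed along the spring
        fs = (k_s * (L - L0) + k_d * rel_v) * u  # spring + damper force on i
        f[i] += fs
        f[j] -= fs
    a = f / mass
    v_new = np.where(pinned[:, None], 0.0, v + dt * a)  # pinned vertices stay
    x_new = x + dt * v_new
    return x_new, v_new
```

Calling this once per haptic frame (1 kHz) on a stretched two-vertex "spring" pulls the free vertex back toward its rest length.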
Uncertainty propagation of p-boxes using sparse polynomial chaos expansions
NASA Astrophysics Data System (ADS)
Schöbi, Roland; Sudret, Bruno
2017-06-01
In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.
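The abstract above hinges on propagating p-boxes (bounds on a CDF arising from mixed aleatory and epistemic uncertainty) through a computational model. As a rough illustration of the two-level idea only, and not of the authors' sparse polynomial chaos surrogates, the sketch below propagates a parametric p-box (a normal input whose mean is known only to lie in an interval) through a toy model by brute force: an outer loop over the epistemic parameter and an inner Monte Carlo loop, keeping the envelope of an output quantile. All names and numbers are illustrative.

```python
import numpy as np

def propagate_pbox(model, mu_bounds, sigma, q=0.95, n_mc=20000, n_epi=11, seed=0):
    """Two-level propagation of a parametric p-box.  The input is
    Normal(mu, sigma) where mu is only known to lie in an interval
    (epistemic uncertainty).  Outer loop: scan candidate mu values.
    Inner loop: plain Monte Carlo over the aleatory variability.
    Returns the envelope [lower, upper] of the output q-quantile."""
    rng = np.random.default_rng(seed)
    lo, hi = np.inf, -np.inf
    for mu in np.linspace(mu_bounds[0], mu_bounds[1], n_epi):
        x = rng.normal(mu, sigma, n_mc)          # aleatory samples for this mu
        qv = np.quantile(model(x), q)
        lo, hi = min(lo, qv), max(hi, qv)
    return lo, hi

# toy stand-in for the expensive computational model
lo, hi = propagate_pbox(lambda x: x**2, mu_bounds=(1.0, 2.0), sigma=0.1)
```

In the paper's setting the inner model is an expensive finite element code, which is precisely why a surrogate (the sparse PCE) replaces this brute-force inner loop.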
Uncertainty propagation of p-boxes using sparse polynomial chaos expansions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schöbi, Roland, E-mail: schoebi@ibk.baug.ethz.ch; Sudret, Bruno, E-mail: sudret@ibk.baug.ethz.ch
2017-06-15
In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.
NASA Technical Reports Server (NTRS)
Downward, James G.
1992-01-01
This document represents the final report for the View Generated Database (VGD) project, NAS7-1066. It documents the work done on the project up to the point at which all project work was terminated due to lack of project funds. The VGD was to provide the capability to accurately represent any real-world object or scene as a computer model. Such models include both an accurate spatial/geometric representation of surfaces of the object or scene, as well as any surface detail present on the object. Applications of such models are numerous, including acquisition and maintenance of work models for tele-autonomous systems, generation of accurate 3-D geometric/photometric models for various 3-D vision systems, and graphical models for realistic rendering of 3-D scenes via computer graphics.
1993-11-01
... way is to develop a crude but working model of an entire system. The other is by developing a realistic model of the user interface, leaving out most ... devices or by incorporating software for a more user-friendly interface. Automation introduces the possibility of making data entry errors. Multimode ... across various human-computer interfaces. Memory: Minimize the amount of information that the user must maintain in short-term memory
Three dimensional hair model by means particles using Blender
NASA Astrophysics Data System (ADS)
Alvarez-Cedillo, Jesús Antonio; Almanza-Nieto, Roberto; Herrera-Lozada, Juan Carlos
2010-09-01
The simulation and modeling of human hair is a process of very large computational complexity, due to the large number of factors that must be calculated to give a realistic appearance. Generally, the method used in the film industry to simulate hair is based on particle-handling graphics. In this paper we present a simple approximation of how to model human hair using particles in Blender.
NASA Technical Reports Server (NTRS)
Youngblut, C.
1984-01-01
Orography and geographically fixed heat sources which force a zonally asymmetric motion field are examined. An extensive space-time spectral analysis of the GLAS climate model (D130) response and observations are compared. An updated version of the model (D150) showed a remarkable improvement in the simulation of the standing waves. The main differences in the model code are an improved boundary layer flux computation and a more realistic specification of the global boundary conditions.
A poroelastic model coupled to a fluid network with applications in lung modelling.
Berger, Lorenz; Bordas, Rafel; Burrowes, Kelly; Grau, Vicente; Tavener, Simon; Kay, David
2016-01-01
We develop a lung ventilation model based on a continuum poroelastic representation of lung parenchyma that is strongly coupled to a pipe network representation of the airway tree. The continuous system of equations is discretized using a low-order stabilised finite element method. The framework is applied to a realistic lung anatomical model derived from computed tomography data and an artificially generated airway tree to model the conducting airway region. Numerical simulations produce physiologically realistic solutions and demonstrate the effect of airway constriction and reduced tissue elasticity on ventilation, tissue stress and alveolar pressure distribution. The key advantage of the model is the ability to provide insight into the mutual dependence between ventilation and deformation. This is essential when studying lung diseases, such as chronic obstructive pulmonary disease and pulmonary fibrosis. Thus the model can be used to form a better understanding of integrated lung mechanics in both the healthy and diseased states. Copyright © 2015 John Wiley & Sons, Ltd.
Protein structure prediction with local adjust tabu search algorithm
2014-01-01
Background Protein folding structure prediction is one of the most challenging problems in the bioinformatics domain. Because of the complexity of realistic protein structures, simplified structural models and computational methods must be adopted in this research. The AB off-lattice model is one such simplification, which considers only two classes of amino acids: hydrophobic (A) residues and hydrophilic (B) residues. Results The main work of this paper is to discuss how to find the lowest-energy configurations in the 2D and 3D off-lattice models using Fibonacci sequences and real protein sequences. To avoid falling into local minima and to converge faster to the global minimum, we introduce a novel method (SATS) for the protein structure problem, which combines the simulated annealing and tabu search algorithms. Various strategies, such as a new encoding strategy, an adaptive neighborhood generation strategy and a local adjustment strategy, are adopted successfully for a high-speed search for the optimal conformation corresponding to the lowest energy of the protein sequence. Experimental results show that some of the results obtained by the improved SATS are better than those reported in the previous literature, and we can be confident that the lowest-energy folding states for short Fibonacci sequences have been found. Conclusions Although off-lattice models are not very realistic, they reflect some important characteristics of real proteins. We find that the 3D off-lattice model is closer to the native folding structure of real proteins than the 2D off-lattice model. In addition, compared with some previous research, the proposed hybrid algorithm searches the spatial folding structure of a protein chain more effectively and more quickly. PMID:25474708
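For concreteness, the AB off-lattice model mentioned above has a simple closed-form energy (in Stillinger's formulation): a bending term over interior angles plus a Lennard-Jones-like term whose well depth depends on the species pair (+1 for AA, +0.5 for BB, −0.5 for mixed AB). A minimal sketch of that energy function, assuming unit bond lengths; it illustrates the model only, not the paper's SATS search itself:

```python
import numpy as np

def ab_energy(coords, seq):
    """Energy of a chain in the AB off-lattice model (Stillinger's form):
    a bending term (1/4) * sum(1 - cos theta_i) over interior angles, plus
    a Lennard-Jones-like term 4 * (r**-12 - C * r**-6) over residue pairs
    separated by at least two bonds, where C is +1 for AA, +0.5 for BB and
    -0.5 for mixed AB pairs.  coords: (n, d) array with unit bond lengths;
    seq: string of 'A'/'B' characters."""
    r = np.asarray(coords, dtype=float)
    bonds = r[1:] - r[:-1]                               # unit bond vectors
    cos_theta = np.einsum('ij,ij->i', bonds[:-1], bonds[1:])
    e_bend = 0.25 * np.sum(1.0 - cos_theta)
    coupling = lambda a, b: 1.0 if a == b == 'A' else (0.5 if a == b == 'B' else -0.5)
    e_nb = 0.0
    for i in range(len(seq) - 2):
        for j in range(i + 2, len(seq)):
            d = np.linalg.norm(r[j] - r[i])
            e_nb += 4.0 * (d**-12 - coupling(seq[i], seq[j]) * d**-6)
    return e_bend + e_nb

# straight three-residue 'AAA' chain: zero bending energy, one
# non-bonded AA pair at distance 2
E = ab_energy([[0, 0], [1, 0], [2, 0]], 'AAA')   # 4*(2**-12 - 2**-6) = -0.0615234375
```

The optimization problem the paper attacks is minimizing this energy over chain conformations, which is highly multimodal even for short sequences.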
Tveito, Aslak; Skavhaug, Ola; Lines, Glenn T; Artebrant, Robert
2011-08-01
Instabilities in the electro-chemical resting state of the heart can generate ectopic waves that in turn can initiate arrhythmias. We derive methods for computing the resting state for mathematical models of the electro-chemical process underpinning a heartbeat, and we estimate the stability of the resting state by invoking the largest real part of the eigenvalues of a linearized model. The implementation of the methods is described and a number of numerical experiments illustrate the feasibility of the methods. In particular, we test the methods for problems where we can compare the solutions with analytical results, and problems where we have solutions computed by independent software. The software is also tested for a fairly realistic 3D model. Copyright © 2011 Elsevier Ltd. All rights reserved.
Computational analysis of an aortic valve jet
NASA Astrophysics Data System (ADS)
Shadden, Shawn C.; Astorino, Matteo; Gerbeau, Jean-Frédéric
2009-11-01
In this work we employ a coupled FSI scheme using an immersed boundary method to simulate flow through a realistic deformable, 3D aortic valve model. This data was used to compute Lagrangian coherent structures, which revealed flow separation from the valve leaflets during systole, and correspondingly, the boundary between the jet of ejected fluid and the regions of separated, recirculating flow. Advantages of computing LCS in multi-dimensional FSI models of the aortic valve are twofold. For one, the quality and effectiveness of existing clinical indices used to measure aortic jet size can be tested by taking advantage of the accurate measure of the jet area derived from LCS. Secondly, as an ultimate goal, a reliable computational framework for the assessment of the aortic valve stenosis could be developed.
Bilinauskaite, Milda; Mantha, Vishveshwar Rajendra; Rouboa, Abel Ilah; Ziliukas, Pranas; Silva, Antonio Jose
2013-01-01
The aim of this paper is to determine the hydrodynamic characteristics of scanned models of a swimmer's hand for various combinations of the angle of attack, the sweepback angle, and the shape and velocity of the hand, simulating separate underwater arm-stroke phases of freestyle (front crawl) swimming. Four realistic 3D models of a swimmer's hand, corresponding to different combinations of separated/closed finger positions, were used to simulate different underwater front crawl phases. The fluid flow was simulated using FLUENT (ANSYS, PA, USA). Drag force and drag coefficient were calculated using computational fluid dynamics (CFD) in steady state. Results showed that the drag force and coefficient varied with flow velocity for all hand shapes, and variation was observed among the hand positions corresponding to different stroke phases. The models of the hand with the thumb adducted and abducted generated the highest drag forces and drag coefficients. The current study suggests that realistic variation of both orientation angles led to higher values of the drag, lift, and resultant coefficients and forces. To augment the resultant force, which drives the swimmer's propulsion, the swimmer should concentrate on effectively optimising the achievable hand area during the crucial propulsive phases. PMID:23691493
In silico reconstitution of Listeria propulsion exhibits nano-saltation.
Alberts, Jonathan B; Odell, Garrett M
2004-12-01
To understand how the actin-polymerization-mediated movements in cells emerge from myriad individual protein-protein interactions, we developed a computational model of Listeria monocytogenes propulsion that explicitly simulates a large number of monomer-scale biochemical and mechanical interactions. The literature on actin networks and L. monocytogenes motility provides the foundation for a realistic mathematical/computer simulation, because most of the key rate constants governing actin network dynamics have been measured. We use a cluster of 80 Linux processors and our own suite of simulation and analysis software to characterize salient features of bacterial motion. Our "in silico reconstitution" produces qualitatively realistic bacterial motion with regard to speed and persistence of motion and actin tail morphology. The model also produces smaller-scale emergent behavior; we demonstrate how the observed nano-saltatory motion of L. monocytogenes, in which runs punctuate pauses, can emerge from a cooperative binding and breaking of attachments between actin filaments and the bacterium. We describe our modeling methodology in detail, as it is likely to be useful for understanding any subcellular system in which the dynamics of many simple interactions lead to complex emergent behavior, e.g., lamellipodia and filopodia extension, cellular organization, and cytokinesis.
Performance evaluation of an automatic MGRF-based lung segmentation approach
NASA Astrophysics Data System (ADS)
Soliman, Ahmed; Khalifa, Fahmi; Alansary, Amir; Gimel'farb, Georgy; El-Baz, Ayman
2013-10-01
The segmentation of the lung tissues in chest Computed Tomography (CT) images is an important step for developing any Computer-Aided Diagnostic (CAD) system for lung cancer and other pulmonary diseases. In this paper, we introduce a new framework for validating the accuracy of our developed Joint Markov-Gibbs based lung segmentation approach using 3D realistic synthetic phantoms. These phantoms are created using a 3D Generalized Gauss-Markov Random Field (GGMRF) model of voxel intensities with pairwise interaction to model the 3D appearance of the lung tissues. Then, the appearance of the generated 3D phantoms is simulated based on iterative minimization of an energy function that is based on the learned 3D-GGMRF image model. These 3D realistic phantoms can be used to evaluate the performance of any lung segmentation approach. The performance of our segmentation approach is evaluated using three metrics, namely, the Dice Similarity Coefficient (DSC), the modified Hausdorff distance, and the Average Volume Difference (AVD) between our segmentation and the ground truth. Our approach achieves mean values of 0.994±0.003, 8.844±2.495 mm, and 0.784±0.912 mm3, for the DSC, Hausdorff distance, and the AVD, respectively.
Density-Functional Theory description of transport in the single-electron transistor
NASA Astrophysics Data System (ADS)
Zawadzki, Krissia; Oliveira, Luiz N.
The Kondo effect governs the low-temperature transport properties of the single-electron transistor (SET), a quantum dot bridging two electron gases. In the weak-coupling limit, for odd dot occupation, the gate-potential profile of the conductance approaches a step, known as the Kondo plateau. With the plateau and other SET properties well understood on the basis of the Anderson model, more realistic (i.e., DFT) descriptions of the device are now desired. This poses a challenge, since the SET is strongly correlated. DFT computations that reproduce the conductance plateau have been reported, relying on the exact functional provided by the Bethe-Ansatz solution of the Anderson model. Here, sticking to DFT tradition, we employ a functional derived from a homogeneous system: a parametrization of the Lieb-Wu solution of the Hubbard model. Our computations reproduce the plateau and yield other results in accurate agreement with the exact diagonalization of the Anderson Hamiltonian. The prospects for extensions to realistic descriptions of two-dimensional nanostructured devices will be discussed. Luiz N. Oliveira thanks CNPq (312658/2013-3) and Krissia Zawadzki thanks CNPq (140703/2014-4) for financial support.
Nakajima, T Y; Imai, T; Uchino, O; Nagai, T
1999-08-20
The influence of daylight and noise current on cloud and aerosol observations by a realistic spaceborne lidar was examined by computer simulations. The reflected solar radiation, which contaminates the daytime return signals of lidar operations, was strictly and explicitly estimated by accurate radiative transfer calculations. It was found that the model multilayer cirrus clouds and boundary-layer aerosols could be observed during both daytime and nighttime with only a few laser shots. However, high background noise and noise current make it difficult to observe volcanic aerosols in the middle and upper atmospheric layers. Optimal combinations of laser power and receiver field of view are proposed to compensate for the negative influence of these noise sources. For the computer simulations, we used a realistic set of lidar parameters similar to those of the Experimental Lidar in-Space Equipment of the National Space Development Agency of Japan.
Unsteady transonic flow calculations for realistic aircraft configurations
NASA Technical Reports Server (NTRS)
Batina, John T.; Seidel, David A.; Bland, Samuel R.; Bennett, Robert M.
1987-01-01
A transonic unsteady aerodynamic and aeroelasticity code has been developed for application to realistic aircraft configurations. The new code is called CAP-TSD which is an acronym for Computational Aeroelasticity Program - Transonic Small Disturbance. The CAP-TSD code uses a time-accurate approximate factorization (AF) algorithm for solution of the unsteady transonic small-disturbance equation. The AF algorithm is very efficient for solution of steady and unsteady transonic flow problems. It can provide accurate solutions in only several hundred time steps yielding a significant computational cost savings when compared to alternative methods. The new code can treat complete aircraft geometries with multiple lifting surfaces and bodies including canard, wing, tail, control surfaces, launchers, pylons, fuselage, stores, and nacelles. Applications are presented for a series of five configurations of increasing complexity to demonstrate the wide range of geometrical applicability of CAP-TSD. These results are in good agreement with available experimental steady and unsteady pressure data. Calculations for the General Dynamics one-ninth scale F-16C aircraft model are presented to demonstrate application to a realistic configuration. Unsteady results for the entire F-16C aircraft undergoing a rigid pitching motion illustrated the capability required to perform transonic unsteady aerodynamic and aeroelastic analyses for such configurations.
Model-based sensorimotor integration for multi-joint control: development of a virtual arm model.
Song, D; Lan, N; Loeb, G E; Gordon, J
2008-06-01
An integrated, sensorimotor virtual arm (VA) model has been developed and validated for simulation studies of the control of human arm movements. Realistic anatomical features of the shoulder, elbow and forearm joints were captured with a graphic modeling environment, SIMM. The model included 15 musculotendon elements acting at the shoulder, elbow and forearm. Muscle actions on joints were evaluated by SIMM-generated moment arms that were matched to experimentally measured profiles. The Virtual Muscle (VM) model contained an appropriate admixture of slow and fast twitch fibers with realistic physiological properties for force production. A realistic spindle model was embedded in each VM with inputs of fascicle length, gamma static (gamma(stat)) and dynamic (gamma(dyn)) controls, and outputs of primary (I(a)) and secondary (II) afferents. A piecewise linear model of the Golgi Tendon Organ (GTO) represented the ensemble sampling (I(b)) of the total muscle force at the tendon. All model components were integrated into a Simulink block using a special software tool. The complete VA model was validated with open-loop simulation at discrete hand positions within the full range of alpha and gamma drives to extrafusal and intrafusal muscle fibers. The model behaviors were consistent with a wide variety of physiological phenomena. Spindle afferents were effectively modulated by fusimotor drives and the hand positions of the arm. These simulations validated the VA model as a computational tool for studying arm movement control. The VA model is available to researchers at http://pt.usc.edu/cel.
Sevink, G J A; Schmid, F; Kawakatsu, T; Milano, G
2017-02-22
We have extended an existing hybrid MD-SCF simulation technique that employs a coarsening step to enhance the computational efficiency of evaluating non-bonded particle interactions. This technique is conceptually equivalent to the single chain in mean-field (SCMF) method in polymer physics, in the sense that non-bonded interactions are derived from the non-ideal chemical potential in self-consistent field (SCF) theory, after a particle-to-field projection. In contrast to SCMF, however, MD-SCF evolves particle coordinates by the usual Newton's equation of motion. Since collisions are seriously affected by the softening of non-bonded interactions that originates from their evaluation at the coarser continuum level, we have devised a way to reinsert the effect of collisions on the structural evolution. Merging MD-SCF with multi-particle collision dynamics (MPCD), we mimic particle collisions at the level of computational cells and at the same time properly account for the momentum transfer that is important for a realistic system evolution. The resulting hybrid MD-SCF/MPCD method was validated for a particular coarse-grained model of phospholipids in aqueous solution, against reference full-particle simulations and the original MD-SCF model. We additionally implemented and tested an alternative and more isotropic finite difference gradient. Our results show that efficiency is improved by merging MD-SCF with MPCD, as properly accounting for hydrodynamic interactions considerably speeds up the phase separation dynamics, with negligible additional computational costs compared to efficient MD-SCF. This new method enables realistic simulations of large-scale systems that are needed to investigate the applications of self-assembled structures of lipids in nanotechnologies.
Wang, Xiao-Jing; Krystal, John H.
2014-01-01
Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941
Converting differential-equation models of biological systems to membrane computing.
Muniyandi, Ravie Chandren; Zin, Abdullah Mohd; Sanders, J W
2013-12-01
This paper presents a method to convert the deterministic, continuous representation of a biological system by ordinary differential equations into a non-deterministic, discrete membrane computation. The dynamics of the membrane computation are governed by rewrite rules operating at certain rates. This has the advantage of applying accurately to small systems, and of expressing rates of change that are determined locally, by region, but not necessarily globally. Such spatial information augments the standard differential-equation approach to provide a more realistic model. A biological case study of the ligand-receptor network of the protein TGF-β is used to validate the effectiveness of the conversion method. It demonstrates the sense in which the behaviours and properties of the system are better preserved in the membrane computing model, suggesting that the proposed conversion method may prove useful for biological systems in particular. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
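To make the conversion idea concrete: a deterministic rate equation such as dx/dt = -k*x can be recast as a discrete rewrite rule X -> (nothing) applied stochastically at propensity k*X, in the spirit of Gillespie's algorithm. The toy sketch below is my own illustration of that correspondence, not the paper's TGF-β membrane system; it compares the stochastic mean against the ODE solution x(t) = x0 * exp(-k*t):

```python
import math
import random

def ssa_degradation(x0, k, t_end, rng):
    """Gillespie-style simulation of the single rewrite rule X -> (nothing),
    applied at propensity k * X.  The deterministic counterpart is the
    ODE dx/dt = -k * x."""
    x, t = x0, 0.0
    while x > 0:
        t += rng.expovariate(k * x)   # waiting time to the next rule firing
        if t > t_end:
            break
        x -= 1                        # apply the rewrite rule once
    return x

rng = random.Random(42)
runs = [ssa_degradation(1000, 1.0, 1.0, rng) for _ in range(200)]
mean = sum(runs) / len(runs)
ode = 1000.0 * math.exp(-1.0)         # ODE prediction: about 367.9
```

The rewrite-rule view generalizes to networks of rules with region-local rates, which is the spatial refinement the membrane computing model exploits.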
NASA Astrophysics Data System (ADS)
Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R. N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.
2017-07-01
The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
NASA Astrophysics Data System (ADS)
Clark, M. P.; Nijssen, B.; Wood, A.; Mizukami, N.; Newman, A. J.
2017-12-01
The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
Modeling Computer Communication Networks in a Realistic 3D Environment
2010-03-01
Multiplexed Predictive Control of a Large Commercial Turbofan Engine
NASA Technical Reports Server (NTRS)
Richter, Hanz; Singaraju, Anil; Litt, Jonathan S.
2008-01-01
Model predictive control is a strategy well-suited to handle the highly complex, nonlinear, uncertain, and constrained dynamics involved in aircraft engine control problems. However, it has thus far been infeasible to implement model predictive control in engine control applications, because of the combination of model complexity and the time allotted for the control update calculation. In this paper, a multiplexed implementation is proposed that dramatically reduces the computational burden of the quadratic programming optimization that must be solved online as part of the model-predictive-control algorithm. Actuator updates are calculated sequentially and cyclically in a multiplexed implementation, as opposed to the simultaneous optimization taking place in conventional model predictive control. Theoretical aspects are discussed based on a nominal model, and actual computational savings are demonstrated using a realistic commercial engine model.
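The key idea in the abstract, updating one actuator per control tick instead of solving one large QP over all actuators simultaneously, can be illustrated on a toy quadratic cost with box constraints. The sketch below is a deliberately simplified stand-in (cyclic coordinate-wise minimization of ||Au − b||²), not the commercial engine model or the paper's QP formulation:

```python
import numpy as np

def multiplexed_step(A, b, u, i, lo, hi):
    """Re-optimize only actuator i of the quadratic cost ||A u - b||^2,
    holding the other actuators fixed, subject to lo <= u_i <= hi.
    The scalar subproblem has a closed-form solution, so each control
    tick is cheap compared with a full multivariable QP solve."""
    a_i = A[:, i]
    resid = b - A @ u + a_i * u[i]           # residual with u_i's contribution removed
    u_star = (a_i @ resid) / (a_i @ a_i)     # unconstrained scalar optimum
    u[i] = np.clip(u_star, lo, hi)           # enforce the actuator limits
    return u

# toy 2-actuator plant: actuators are updated cyclically, one per tick
A = np.array([[1.0, 0.2], [0.1, 1.0]])
b = np.array([0.5, -0.3])
u = np.zeros(2)
for tick in range(20):
    u = multiplexed_step(A, b, u, tick % 2, -1.0, 1.0)
```

With the coupling between actuators weak, the cyclic updates converge to the same optimum the simultaneous solve would reach, which is the trade-off the multiplexed scheme exploits.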
PSF modeling by spikes simulations and wings measurements for the MOONS multi fiber spectrograph
NASA Astrophysics Data System (ADS)
Li Causi, G.; Lee, D.; Vitali, F.; Royer, F.; Oliva, E.
2016-08-01
The optical design of MOONS, the next-generation thousand-fiber NIR spectrograph for the VLT, involves both on-axis reflective collimators and on-axis very fast reflective cameras, which yields both beam obstruction, due to the fiber slit and detector support, and image spread, due to propagation within the detector substrate. The need to model and control i) the effect of the diffraction spikes produced by these obstructions, ii) the detector-induced shape variation of the Point Spread Function (PSF), and iii) the intensity profile of the PSF wings, leads us to perform both simulations and lab measurements, in order to optimize the spider design and build a reliable PSF model, useful for simulating realistic raw images for testing the data reduction. Starting from the unobstructed PSF variation, as computed with the ZEMAX software, we numerically computed the diffraction spikes for different spider shapes, to which we added the PSF wing profile, as measured on a sample of the MOONS VPH diffraction grating. Finally, we implemented the PSF defocusing due to the thick detector (for the visible channel), convolved the PSF with the fiber core image, and added the optical ghosts, finally obtaining a detailed and realistic PSF model that we use for spectral extraction testing, cross-talk estimation, and sensitivity predictions.
Modeling driver behavior in a cognitive architecture.
Salvucci, Dario D
2006-01-01
This paper explores the development of a rigorous computational model of driver behavior in a cognitive architecture--a computational framework with underlying psychological theories that incorporate basic properties and limitations of the human system. Computational modeling has emerged as a powerful tool for studying the complex task of driving, allowing researchers to simulate driver behavior and explore the parameters and constraints of this behavior. An integrated driver model developed in the ACT-R (Adaptive Control of Thought-Rational) cognitive architecture is described that focuses on the component processes of control, monitoring, and decision making in a multilane highway environment. This model accounts for the steering profiles, lateral position profiles, and gaze distributions of human drivers during lane keeping, curve negotiation, and lane changing. The model demonstrates how cognitive architectures facilitate understanding of driver behavior in the context of general human abilities and constraints and how the driving domain benefits cognitive architectures by pushing model development toward more complex, realistic tasks. The model can also serve as a core computational engine for practical applications that predict and recognize driver behavior and distraction.
A novel patient-specific model to compute coronary fractional flow reserve.
Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo
2014-09-01
The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%. Copyright © 2014. Published by Elsevier Ltd.
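The index itself is the ratio of cycle-averaged pressure distal to the stenosis to the aortic pressure under hyperemia. A minimal sketch of that ratio, with synthetic traces standing in for the CFD/LPM output (the waveform and the 27 mmHg pressure drop are illustrative, not patient data):

```python
import numpy as np

def compute_ffr(p_distal, p_aortic):
    # FFR = cycle-averaged distal pressure / cycle-averaged aortic pressure.
    # With uniform time samples, a plain mean stands in for the time integral.
    return p_distal.mean() / p_aortic.mean()

# Synthetic one-cycle pressure traces (illustrative values only):
t = np.linspace(0.0, 0.8, 200)                     # one cardiac cycle [s]
p_aortic = 90 + 10 * np.sin(2 * np.pi * t / 0.8)   # mmHg
p_distal = p_aortic - 27                           # fixed 27 mmHg stenotic drop
print(round(compute_ffr(p_distal, p_aortic), 2))   # → 0.7
```

An FFR below roughly 0.8 is the usual clinical cue for functionally significant stenosis, which is why the pressure fields from the coupled CFD-LPM solution are reduced to this single number.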
Fitted Hanbury-Brown Twiss radii versus space-time variances in flow-dominated models
NASA Astrophysics Data System (ADS)
Frodermann, Evan; Heinz, Ulrich; Lisa, Michael Annan
2006-04-01
The inability of otherwise successful dynamical models to reproduce the Hanbury-Brown Twiss (HBT) radii extracted from two-particle correlations measured at the Relativistic Heavy Ion Collider (RHIC) is known as the RHIC HBT Puzzle. Most comparisons between models and experiment exploit the fact that for Gaussian sources the HBT radii agree with certain combinations of the space-time widths of the source that can be directly computed from the emission function without having to evaluate, at significant expense, the two-particle correlation function. We here study the validity of this approach for realistic emission function models, some of which exhibit significant deviations from simple Gaussian behavior. By Fourier transforming the emission function, we compute the two-particle correlation function, and fit it with a Gaussian to partially mimic the procedure used for measured correlation functions. We describe a novel algorithm to perform this Gaussian fit analytically. We find that for realistic hydrodynamic models the HBT radii extracted from this procedure agree better with the data than the values previously extracted from the space-time widths of the emission function. Although serious discrepancies between the calculated and the measured HBT radii remain, we show that a more apples-to-apples comparison of models with data can play an important role in any eventually successful theoretical description of RHIC HBT data.
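The analytic Gaussian fit can be illustrated in one dimension: for a correlator of the form C(q) = 1 + λ exp(-R²q²), ln(C - 1) is linear in q², so the fit reduces to closed-form linear least squares. The exponential (deliberately non-Gaussian) source and momentum grid below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Correlator from an exponential source, mimicking the deviations from
# Gaussian behavior discussed above (all numbers illustrative).
q = np.linspace(0.005, 0.1, 40)        # relative momentum grid
C = 1.0 + 0.9 * np.exp(-25.0 * q)      # "measured" correlation function

# Gaussian ansatz C = 1 + lam*exp(-R^2 q^2): fit ln(C-1) against {1, -q^2}.
A = np.vstack([np.ones_like(q), -q**2]).T
(lnlam, R2), *_ = np.linalg.lstsq(A, np.log(C - 1.0), rcond=None)
R = float(np.sqrt(R2))
print(R)                               # fitted HBT radius, in 1/q units
```

The fitted R differs from the source's true width parameter precisely because the source is not Gaussian, which is the apples-to-apples point the abstract makes about comparing fitted radii rather than space-time variances.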
Scidac-Data: Enabling Data Driven Modeling of Exascale Computing
Mubarak, Misbah; Ding, Pengfei; Aliaga, Leo; ...
2017-11-23
Here, the SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab data center on the organization, movement, and consumption of high energy physics (HEP) data. The project analyzes the analysis patterns and data organization that have been used by NOvA, MicroBooNE, MINERvA, CDF, D0, and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization, and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We present the use of a subset of the SciDAC-Data distributions, acquired from analysis of approximately 71,000 HEP workflows run on the Fermilab data center and corresponding to over 9 million individual analysis jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in high performance computing (HPC) and high throughput computing (HTC) environments. In particular we describe how the Sequential Access via Metadata (SAM) data-handling system in combination with the dCache/Enstore-based data archive facilities has been used to develop radically different models for analyzing the HEP data. We also show how the simulations may be used to assess the impact of design choices in archive facilities.
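The core of such a trace-driven queuing simulation can be sketched minimally: jobs from a workflow trace are served first-come-first-served by a pool of identical servers. The job list and server count below are toy values, not SciDAC-Data inputs:

```python
import heapq

def simulate_fifo(jobs, n_servers):
    # Trace-driven queueing sketch: jobs = [(arrival, service), ...],
    # served FIFO by n_servers identical servers; returns mean wait time.
    free = [0.0] * n_servers          # next-free times, kept as a min-heap
    heapq.heapify(free)
    total_wait = 0.0
    for arrival, service in sorted(jobs):
        t_free = heapq.heappop(free)  # earliest-available server
        start = max(arrival, t_free)
        total_wait += start - arrival
        heapq.heappush(free, start + service)
    return total_wait / len(jobs)

# Three jobs on one server: arrivals at t=0,1,2 s, each needing 2 s.
print(simulate_fifo([(0, 2), (1, 2), (2, 2)], n_servers=1))  # → 1.0
```

Replacing the toy job list with empirically measured arrival and service distributions is what turns a sketch like this into the kind of validated simulation the project describes.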
Random sphere packing model of heterogeneous propellants
NASA Astrophysics Data System (ADS)
Kochevets, Sergei Victorovich
It is well recognized that combustion of heterogeneous propellants is strongly dependent on the propellant morphology. Recent developments in computing systems make it possible to start three-dimensional modeling of heterogeneous propellant combustion. A key component of such large-scale computations is a realistic model of industrial propellants which retains the true morphology---a goal never achieved before. The research presented develops the Random Sphere Packing Model of heterogeneous propellants and generates numerical samples of actual industrial propellants. This is done by developing a sphere packing algorithm which randomly packs a large number of spheres with a polydisperse size distribution within a rectangular domain. First, the packing code is developed, optimized for performance, and parallelized using the OpenMP shared memory architecture. Second, the morphology and packing fraction of two simple cases of unimodal and bimodal packs are investigated computationally and analytically. It is shown that both the Loose Random Packing and Dense Random Packing limits are not well defined and the growth rate of the spheres is identified as the key parameter controlling the efficiency of the packing. For a properly chosen growth rate, computational results are found to be in excellent agreement with experimental data. Third, two strategies are developed to define numerical samples of polydisperse heterogeneous propellants: the Deterministic Strategy and the Random Selection Strategy. Using these strategies, numerical samples of industrial propellants are generated. The packing fraction is investigated and it is shown that the experimental values of the packing fraction can be achieved computationally. It is strongly believed that this Random Sphere Packing Model of propellants is a major step forward in the realistic computational modeling of heterogeneous propellant combustion.
In addition, a method of analysis of the morphology of heterogeneous propellants is developed which uses the concept of multi-point correlation functions. A set of intrinsic length scales of local density fluctuations in random heterogeneous propellants is identified by performing a Monte-Carlo study of the correlation functions. This method of analysis shows great promise for understanding the origins of the combustion instability of heterogeneous propellants, and is believed to become a valuable tool for the development of safe and reliable rocket engines.
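A random-sequential-addition loop gives a minimal sketch of polydisperse sphere packing; it is far cruder than the growth-based algorithm described above (no sphere growth, no OpenMP parallelism), and the box size and bimodal radii below are illustrative:

```python
import math
import random

def pack_spheres(radii, box, max_tries=2000):
    # Random sequential addition: place spheres largest-first at random
    # positions inside the box, rejecting any placement that overlaps
    # an already-placed sphere.
    placed = []
    for r in sorted(radii, reverse=True):
        for _ in range(max_tries):
            c = [random.uniform(r, L - r) for L in box]
            if all(math.dist(c, p) >= r + rp for p, rp in placed):
                placed.append((c, r))
                break
    return placed

random.seed(1)
box = (10.0, 10.0, 10.0)
radii = [1.0] * 5 + [0.5] * 40          # toy bimodal size distribution
pack = pack_spheres(radii, box)
frac = sum(4 / 3 * math.pi * r**3 for _, r in pack) / (box[0] * box[1] * box[2])
print(len(pack), round(frac, 3))
```

At the dilute packing fraction used here rejection sampling succeeds easily; it stalls well below the dense limits the thesis targets, which is exactly why a growth-based algorithm is needed for realistic propellant loadings.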
The Performance of Chinese Primary School Students on Realistic Arithmetic Word Problems
ERIC Educational Resources Information Center
Xin, Ziqiang; Lin, Chongde; Zhang, Li; Yan, Rong
2007-01-01
Compared with standard arithmetic word problems demanding only the direct use of number operations and computations, realistic problems are harder to solve because children need to incorporate "real-world" knowledge into their solutions. Using the realistic word problem testing materials developed by Verschaffel, De Corte, and Lasure…
Neural simulations on multi-core architectures.
Eichner, Hubert; Klug, Tobias; Borst, Alexander
2009-01-01
Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever-increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology is emerging with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high-performance and standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e., user-transparent load balancing.
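The user-transparent load balancing mentioned above can be illustrated with a greedy longest-processing-time heuristic: each neuron, costed by its compartment count, goes to the currently lightest core. The neuron names and compartment counts below are invented for illustration:

```python
import heapq

def balance(neurons, n_cores):
    # Greedy LPT: assign neurons (cost = compartment count) largest-first
    # to whichever core currently carries the least load.
    cores = [(0, i, []) for i in range(n_cores)]   # (load, core id, members)
    heapq.heapify(cores)
    for name, n_comp in sorted(neurons.items(), key=lambda kv: -kv[1]):
        load, i, members = heapq.heappop(cores)
        members.append(name)
        heapq.heappush(cores, (load + n_comp, i, members))
    return sorted(cores)

# Hypothetical neurons with compartment counts (not from the paper):
neurons = {"HS": 1200, "VS1": 900, "CH": 850, "T4": 120, "T5": 110}
for load, i, members in balance(neurons, 2):
    print(i, load, members)
```

A real simulator must additionally account for synaptic coupling between compartments on different cores, which is where communication cost complicates this simple per-neuron cost model.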
NASA Astrophysics Data System (ADS)
Chai, Xintao; Tang, Genyang; Peng, Ronghua; Liu, Shaoyong
2018-03-01
Full-waveform inversion (FWI) reconstructs the subsurface properties from acquired seismic data via minimization of the misfit between observed and simulated data. However, FWI suffers from considerable computational costs resulting from the numerical solution of the wave equation for each source at each iteration. To reduce the computational burden, constructing supershots by combining several sources (aka source encoding) mitigates the number of simulations at each iteration, but it gives rise to crosstalk artifacts because of interference between the individual sources of the supershot. A modified Gauss-Newton FWI (MGNFWI) approach showed that as long as the difference between the initial and true models permits a sparse representation, the ℓ1-norm constrained model updates suppress subsampling-related artifacts. However, the spectral-projected gradient ℓ1 (SPGℓ1) algorithm employed by MGNFWI is rather complicated, which makes its implementation difficult. To facilitate realistic applications, we adapt a linearized Bregman (LB) method to sparsity-promoting FWI (SPFWI) because of the efficiency and simplicity of LB in the framework of the ℓ1-norm constrained optimization problem and compressive sensing. Numerical experiments performed with the BP Salt model, the Marmousi model and the BG Compass model verify the following points. The FWI result with LB solving the ℓ1-norm sparsity-promoting problem for the model update outperforms that generated by solving the ℓ2-norm problem in terms of crosstalk elimination and high-fidelity results. The simpler LB method performs comparably and even superiorly to the complicated SPGℓ1 method in terms of computational efficiency and model quality, making the LB method a viable alternative for realistic implementations of SPFWI.
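The simplicity the abstract credits to LB is visible in a minimal sketch of the standard linearized Bregman iteration for min ‖x‖₁ subject to Ax = b: a residual-accumulation step followed by soft thresholding. The test matrix, sparsity pattern, and parameter values below are illustrative, not from the paper's FWI setting:

```python
import numpy as np

def shrink(z, lam):
    # Soft-thresholding (the proximal map of the l1 norm).
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def linearized_bregman(A, b, lam=1.0, n_iter=2000):
    # Standard LB iteration: accumulate the residual-correlation in z,
    # then sparsify. Converges for step sizes below 2 / ||A||_2^2.
    delta = 1.0 / np.linalg.norm(A, 2) ** 2
    z = np.zeros(A.shape[1])
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = z + A.T @ (b - A @ x)
        x = delta * shrink(z, lam)
    return x

# Toy compressed-sensing demo: recover a 3-sparse vector from 30 of 60 dims.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60))
x_true = np.zeros(60)
x_true[[5, 20, 41]] = [2.0, -3.0, 1.5]
b = A @ x_true
x = linearized_bregman(A, b, lam=10.0, n_iter=3000)
print(np.count_nonzero(np.abs(x) > 0.1),
      round(float(np.linalg.norm(A @ x - b) / np.linalg.norm(b)), 3))
```

In SPFWI the role of x is played by the (sparsely represented) model update and A by the linearized wave-equation operator, so each iteration costs wavefield simulations rather than a matrix multiply.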
NASA Technical Reports Server (NTRS)
Poe, C. H.; Owocki, S. P.; Castor, J. I.
1990-01-01
The steady state solution topology for absorption line-driven flows is investigated for the condition that the Sobolev approximation is not used to compute the line force. The solution topology near the sonic point is of the nodal type with two positive slope solutions. The shallower of these slopes applies for reasonable lower boundary conditions and realistic ion thermal speed v(th), and reduces to the usual Castor, Abbott, and Klein model in the Sobolev limit of zero v(th). At finite v(th), this solution consists of a family of very similar solutions converging on the sonic point. It is concluded that a non-Sobolev, absorption line-driven flow with realistic values of v(th) has no uniquely defined steady state. To the extent that a pure absorption model of the outflow of stellar winds is applicable, radiatively driven winds should be intrinsically variable.
Geostatistical borehole image-based mapping of karst-carbonate aquifer pores
Sukop, Michael; Cunningham, Kevin J.
2016-01-01
Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes.
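The variogram step that detected the inter-borehole autocorrelation can be sketched on a synthetic downhole indicator log; the strata thickness and porosity fraction below are assumed values, not Biscayne aquifer data:

```python
import numpy as np

def semivariogram(z, max_lag):
    # Empirical semivariogram gamma(h) = 0.5 * mean((z[i+h] - z[i])^2),
    # the autocorrelation summary computed before Gaussian simulation.
    return [0.5 * float(np.mean((z[h:] - z[:-h]) ** 2))
            for h in range(1, max_lag + 1)]

# Synthetic indicator log (1 = vuggy megaporosity, 0 = matrix porosity),
# built as 10-sample-thick strata so it is spatially autocorrelated:
rng = np.random.default_rng(0)
z = np.repeat((rng.random(50) < 0.3).astype(float), 10)
gamma = semivariogram(z, 5)
print([round(g, 3) for g in gamma])
```

The semivariogram rising with lag (rather than sitting flat at the sill) is the signature of spatial continuity that, per the abstract, variogram analysis captured but the borehole-trained multiple-point statistics missed.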
A Computational Framework for Realistic Retina Modeling.
Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco
2016-11-01
Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.
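The idea of composing retina models from reusable microcircuits can be illustrated with two generic building blocks, a temporal low-pass filter and a static nonlinearity; the toy center-surround composition below is an assumption for illustration, not one of the paper's validated models:

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    # Single-pole temporal low-pass, a recurring retinal building block.
    y = np.zeros_like(x)
    a = dt / (tau + dt)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def rectify(x, gain=1.0, thresh=0.0):
    # Static nonlinearity stage (half-wave rectification).
    return gain * np.maximum(x - thresh, 0.0)

# Compose the microcircuits into a toy transient "bipolar cell" stage:
# fast center minus slow surround, then rectification.
t = np.arange(300)
stim = (t > 100).astype(float)        # light step at t = 100
response = rectify(lowpass(stim, 5) - lowpass(stim, 50))
print(float(response.max()))
```

The response peaks just after the step and decays back toward zero, a transient (bandpass) behavior obtained purely by wiring together the two primitive stages; this composability is the hypothesis the paper validates against electrophysiological recordings.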
Sultanov, Renat A; Guster, Dennis
2009-01-01
We report computational results of blood flow through a model of the human aortic arch and a vessel of actual diameter and length. A realistic pulsatile flow is used in all simulations. Calculations for bifurcation type vessels are also carried out and presented. Different mathematical methods for numerical solution of the fluid dynamics equations have been considered. The non-Newtonian behaviour of the human blood is investigated together with turbulence effects. A detailed time-dependent mathematical convergence test has been carried out. The results of computer simulations of the blood flow in vessels of three different geometries are presented: for pressure, strain rate and velocity component distributions we found significant disagreements between our results obtained with realistic non-Newtonian treatment of human blood and the widely used method in the literature: a simple Newtonian approximation. A significant increase of the strain rate and, as a result, the wall shear stress distribution, is found in the region of the aortic arch. Turbulent effects are found to be important, particularly in the case of bifurcation vessels.
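The Newtonian-versus-non-Newtonian contrast can be quantified with a Carreau shear-thinning viscosity model, a common choice for blood; the abstract does not state which rheological model was used, and the parameter values below are commonly quoted literature values, not the paper's:

```python
def carreau_viscosity(shear_rate):
    # Carreau model with illustrative blood-like parameters.
    mu_inf, mu0 = 0.00345, 0.056    # limiting viscosities [Pa s]
    lam, n = 3.313, 0.3568          # relaxation time [s], power-law index
    return mu_inf + (mu0 - mu_inf) * (1 + (lam * shear_rate) ** 2) ** ((n - 1) / 2)

newtonian = 0.00345                  # Pa s, the common constant approximation
for gd in (0.1, 1.0, 100.0):         # shear rates [1/s]
    print(gd, round(carreau_viscosity(gd) / newtonian, 2))
```

At low shear rates the Carreau viscosity is an order of magnitude above the Newtonian constant, while at arterial shear rates the two nearly coincide, which is consistent with the abstract's finding that the disagreement concentrates in low-shear regions such as recirculation zones.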
Multiscale Modeling of UHTC: Thermal Conductivity
NASA Technical Reports Server (NTRS)
Lawson, John W.; Murry, Daw; Squire, Thomas; Bauschlicher, Charles W.
2012-01-01
We are developing a multiscale framework in computational modeling for the ultra high temperature ceramics (UHTC) ZrB2 and HfB2. These materials are characterized by high melting point, good strength, and reasonable oxidation resistance. They are candidate materials for a number of applications in extreme environments including sharp leading edges of hypersonic aircraft. In particular, we used a combination of ab initio methods, atomistic simulations and continuum computations to obtain insights into fundamental properties of these materials. Ab initio methods were used to compute basic structural, mechanical and thermal properties. From these results, a database was constructed to fit a Tersoff style interatomic potential suitable for atomistic simulations. These potentials were used to evaluate the lattice thermal conductivity of single crystals and the thermal resistance of simple grain boundaries. Finite element method (FEM) computations using atomistic results as inputs were performed with meshes constructed on SEM images thereby modeling the realistic microstructure. These continuum computations showed the reduction in thermal conductivity due to the grain boundary network.
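The conductivity reduction from a grain-boundary network can be estimated, in the simplest one-dimensional view, by placing one interfacial (Kapitza) resistance in series per grain. The bulk conductivity and boundary resistance below are assumed round numbers, not the atomistic values computed in the paper:

```python
def effective_conductivity(k_bulk, r_kapitza, grain_size):
    # Series-resistance estimate for a polycrystal:
    # 1/k_eff = 1/k_bulk + R_K / d, one boundary per grain of size d.
    return 1.0 / (1.0 / k_bulk + r_kapitza / grain_size)

k_bulk = 100.0      # W/(m K), assumed single-crystal lattice value
r_k = 1e-8          # m^2 K / W, assumed boundary thermal resistance
for d_um in (0.1, 1.0, 10.0):
    d = d_um * 1e-6
    print(d_um, round(effective_conductivity(k_bulk, r_k, d), 1))  # → 9.1, 50.0, 90.9
```

The strong grain-size dependence of this toy estimate is the same qualitative effect the FEM computations on SEM-derived meshes capture with a realistic, rather than idealized, boundary network.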
A finite element head and neck model as a supportive tool for deformable image registration.
Kim, Jihun; Saitou, Kazuhiro; Matuszak, Martha M; Balter, James M
2016-07-01
A finite element (FE) head and neck model was developed as a tool to aid investigations and development of deformable image registration and patient modeling in radiation oncology. Useful aspects of a FE model for these purposes include the ability to produce realistic deformations (similar to those seen in patients over the course of treatment) and a rational means of generating new configurations, e.g., via the application of force and/or displacement boundary conditions. The model was constructed based on a cone-beam computed tomography image of a head and neck cancer patient. The three-node triangular surface meshes created for the bony elements (skull, mandible, and cervical spine) and joint elements were integrated into a skeletal system and combined with the exterior surface. Nodes were additionally created inside the surface structures, which were composed of the three-node triangular surface meshes, so that four-node tetrahedral FE elements were created over the whole region of the model. The bony elements were modeled as a homogeneous linear elastic material connected by intervertebral disks. The surrounding tissues were modeled as a homogeneous linear elastic material. Under force or displacement boundary conditions, FE analysis on the model calculates approximate solutions of the displacement vector field. A FE head and neck model was thus constructed in which the skull, mandible, and cervical vertebrae were mechanically connected by intervertebral disks. The developed FE model is capable of generating realistic deformations that are strain-free for the bony elements and of creating new configurations of the skeletal system with the surrounding tissues reasonably deformed. The FE model can generate realistic deformations for skeletal elements. In addition, the model provides a way of evaluating the accuracy of image alignment methods by producing a ground truth deformation and correspondingly simulated images.
The ability to combine force and displacement conditions provides flexibility for simulating realistic anatomic configurations.
Convective dynamics and chemical disequilibrium in the atmospheres of substellar objects
NASA Astrophysics Data System (ADS)
Bordwell, Baylee; Brown, Benjamin P.; Oishi, Jeffrey S.
2017-11-01
The thousands of substellar objects now known provide a unique opportunity to test our understanding of atmospheric dynamics across a range of environments. The chemical timescales of certain species transition from being much shorter than the dynamical timescales to being much longer than them at a point in the atmosphere known as the quench point. This transition leads to a state of dynamical disequilibrium, the effects of which can be used to probe the atmospheric dynamics of these objects. Unfortunately, due to computational constraints, models that inform the interpretation of these observations are run at dynamical parameters which are far from realistic values. In this study, we explore the behavior of a disequilibrium chemical process with increasingly realistic planetary conditions, to quantify the effects of the approximations used in current models. We simulate convection in 2-D, plane-parallel, polytropically-stratified atmospheres, into which we add reactive passive tracers that explore disequilibrium behavior. We find that as we increase the Rayleigh number, and thus achieve more realistic planetary conditions, the behavior of these tracers does not conform to the classical predictions of disequilibrium chemistry.
NASA Astrophysics Data System (ADS)
Laakso, Ilkka; Kännälä, Sami; Jokela, Kari
2013-04-01
Medical staff working near magnetic resonance imaging (MRI) scanners are exposed both to the static magnetic field itself and also to electric currents that are induced in the body when the body moves in the magnetic field. However, there are currently limited data available on the induced electric field for realistic movements. This study computationally investigates the movement induced electric fields for realistic movements in the magnetic field of a 3 T MRI scanner. The path of movement near the MRI scanner is based on magnetic field measurements using a coil sensor attached to a human volunteer. Utilizing realistic models for both the motion of the head and the magnetic field of the MRI scanner, the induced fields are computationally determined using the finite-element method for five high-resolution numerical anatomical models. The results show that the time-derivative of the magnetic flux density (dB/dt) is approximately linearly proportional to the induced electric field in the head, independent of the position of the head with respect to the magnet. This supports the use of dB/dt measurements for occupational exposure assessment. For the path of movement considered herein, the spatial maximum of the induced electric field is close to the basic restriction for the peripheral nervous system and exceeds the basic restriction for the central nervous system in the international guidelines. The 99th percentile electric field is a considerably less restrictive metric for the exposure than the spatial maximum electric field; the former is typically 60-70% lower than the latter. However, the 99th percentile electric field may exceed the basic restriction for dB/dt values that can be encountered during tasks commonly performed by MRI workers. 
It is also shown that the movement-induced eddy currents may reach magnitudes that could electrically stimulate the vestibular system, which could play a significant role in the generation of vertigo-like sensations reported by people moving in a strong static magnetic field.
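The rough scale of such movement-induced fields can be checked with a Faraday-loop estimate, E ≈ (r/2)·dB/dt for a circular tissue path of radius r; this is an order-of-magnitude sketch, not the paper's finite-element computation, and both input values below are assumed:

```python
def induced_e_field(db_dt, radius):
    # Loop estimate from Faraday's law: for a uniform dB/dt through a
    # circular path, E * 2*pi*r = pi*r^2 * dB/dt, so E = (r/2) * dB/dt.
    return 0.5 * radius * db_dt

# Suppose the head sees dB/dt ~ 2 T/s while moving near the 3 T magnet,
# over a roughly head-sized loop radius of 0.08 m (assumed values):
print(induced_e_field(2.0, 0.08))   # V/m → 0.08
```

Fields of this order are already within range of the basic restrictions discussed above, which is why the paper resolves the spatial distribution with anatomical models rather than relying on a loop estimate.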
Queuing theory models for computer networks
NASA Technical Reports Server (NTRS)
Galant, David C.
1989-01-01
A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them because of the lack of fine detail in the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. This Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
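The closed-form averages a spreadsheet model like this rests on can be sketched with the classic M/M/1 formulas; the arrival and service rates below are illustrative, not Ames network measurements:

```python
def mm1_metrics(arrival_rate, service_rate):
    # M/M/1 steady-state averages: utilization rho, mean time in system
    # T = 1/(mu - lambda), and mean number in system N = rho/(1 - rho).
    rho = arrival_rate / service_rate
    if rho >= 1:
        raise ValueError("unstable queue: utilization >= 1")
    t_response = 1.0 / (service_rate - arrival_rate)
    n_in_system = rho / (1.0 - rho)
    return rho, t_response, n_in_system

# A channel serving 80 msgs/s offered 60 msgs/s:
print(mm1_metrics(60.0, 80.0))   # → (0.75, 0.05, 3.0)
```

The sharp growth of both T and N as utilization approaches 1 is what makes even this coarse model useful for capacity planning across the LANs and backbone channels described above.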
Improving Barotropic Tides by Two-way Nesting High and Low Resolution Domains
NASA Astrophysics Data System (ADS)
Jeon, C. H.; Buijsman, M. C.; Wallcraft, A. J.; Shriver, J. F.; Hogan, P. J.; Arbic, B. K.; Richman, J. G.
2017-12-01
In a realistically forced global ocean model, relatively large sea-surface-height root-mean-square (RMS) errors are observed in the North Atlantic near the Hudson Strait. These may be associated with large tidal resonances interacting with coastal bathymetry that are not correctly represented with a low resolution grid. This issue can be overcome by using high resolution grids, but at a high computational cost. In this paper we apply two-way nesting as an alternative solution. This approach applies high resolution to the area with large RMS errors and a lower resolution to the rest. It is expected to improve the tidal solution as well as reduce the computational cost. To minimize modification of the original source codes of the ocean circulation model (HYCOM), we apply the coupler OASIS3-MCT. This coupler is used to exchange barotropic pressures and velocity fields through its APIs (Application Programming Interface) between the parent and the child components. The developed two-way nesting framework has been validated with an idealized test case where the parent and the child domains have identical grid resolutions. The result of the idealized case shows very small RMS errors between the child and parent solutions. We plan to show results for a case with realistic tidal forcing in which the resolution of the child grid is three times that of the parent grid. The numerical results of this realistic case are compared to TPXO data.
The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention
ERIC Educational Resources Information Center
Elangovan, Tavasuria; Ismail, Zurida
2014-01-01
A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…
The influence of computational assumptions on analysing abdominal aortic aneurysm haemodynamics.
Ene, Florentina; Delassus, Patrick; Morris, Liam
2014-08-01
The variation in computational assumptions for analysing abdominal aortic aneurysm haemodynamics can influence the desired output results and computational cost. Such assumptions for abdominal aortic aneurysm modelling include static/transient pressures, steady/transient flows and rigid/compliant walls. Six computational methods and these various assumptions were simulated and compared within a realistic abdominal aortic aneurysm model with and without intraluminal thrombus. A full transient fluid-structure interaction was required to analyse the flow patterns within the compliant abdominal aortic aneurysms models. Rigid wall computational fluid dynamics overestimates the velocity magnitude by as much as 40%-65% and the wall shear stress by 30%-50%. These differences were attributed to the deforming walls which reduced the outlet volumetric flow rate for the transient fluid-structure interaction during the majority of the systolic phase. Static finite element analysis accurately approximates the deformations and von Mises stresses when compared with transient fluid-structure interaction. Simplifying the modelling complexity reduces the computational cost significantly. In conclusion, the deformation and von Mises stress can be approximately found by static finite element analysis, while for compliant models a full transient fluid-structure interaction analysis is required for acquiring the fluid flow phenomenon. © IMechE 2014.
High-fidelity simulation capability for virtual testing of seismic and acoustic sensors
NASA Astrophysics Data System (ADS)
Wilson, D. Keith; Moran, Mark L.; Ketcham, Stephen A.; Lacombe, James; Anderson, Thomas S.; Symons, Neill P.; Aldridge, David F.; Marlin, David H.; Collier, Sandra L.; Ostashev, Vladimir E.
2005-05-01
This paper describes development and application of a high-fidelity, seismic/acoustic simulation capability for battlefield sensors. The purpose is to provide simulated sensor data so realistic that they cannot be distinguished by experts from actual field data. This emerging capability provides rapid, low-cost trade studies of unattended ground sensor network configurations, data processing and fusion strategies, and signatures emitted by prototype vehicles. There are three essential components to the modeling: (1) detailed mechanical signature models for vehicles and walkers, (2) high-resolution characterization of the subsurface and atmospheric environments, and (3) state-of-the-art seismic/acoustic models for propagating moving-vehicle signatures through realistic, complex environments. With regard to the first of these components, dynamic models of wheeled and tracked vehicles have been developed to generate ground force inputs to seismic propagation models. Vehicle models range from simple, 2D representations to highly detailed, 3D representations of entire linked-track suspension systems. Similarly detailed models of acoustic emissions from vehicle engines are under development. The propagation calculations for both the seismics and acoustics are based on finite-difference, time-domain (FDTD) methodologies capable of handling complex environmental features such as heterogeneous geologies, urban structures, surface vegetation, and dynamic atmospheric turbulence. Any number of dynamic sources and virtual sensors may be incorporated into the FDTD model. The computational demands of 3D FDTD simulation over tactical distances require massively parallel computers. Several example calculations of seismic/acoustic wave propagation through complex atmospheric and terrain environments are shown.
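The FDTD propagation idea the paper builds on can be sketched in one dimension with a staggered pressure/velocity grid. This is a generic acoustic leapfrog update, not the authors' 3D code, and the grid and material values are invented:

```python
import numpy as np

# 1-D acoustic FDTD sketch: pressure on integer grid points, particle
# velocity on half points, advanced leapfrog-style each time step.
c, rho = 343.0, 1.2          # sound speed (m/s), air density (kg/m^3)
dx = 1.0                     # grid spacing (m)
dt = 0.5 * dx / c            # time step satisfying the CFL stability limit
n = 200
p = np.zeros(n)              # pressure; ends held at p = 0
v = np.zeros(n - 1)          # particle velocity at half points

for step in range(150):
    # velocity from the pressure gradient, then pressure from the velocity
    # divergence (first-order acoustic equations)
    v -= dt / (rho * dx) * np.diff(p)
    p[1:-1] -= dt * rho * c**2 / dx * np.diff(v)
    p[n // 2] += np.exp(-((step - 30) / 10.0) ** 2)  # soft Gaussian source

print("peak pressure:", p.max())
```

The paper's 3D models add heterogeneous media, terrain, and moving sources, which is what drives the need for massively parallel computers.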
NASA Astrophysics Data System (ADS)
Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid
2017-12-01
Parameter recovery in diffuse optical tomography is a computationally expensive algorithm, especially when used for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, and the lack of a real-time solution impedes practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable for modeling both continuous wave and frequency-domain systems, with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ~600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ~0.25 s per excitation source.
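A toy sketch of the structure of such a forward solver: the finite-element system matrix depends only on the mesh and optical properties, so the per-source cost is one linear solve. The tridiagonal "mesh" below is a stand-in for the ~600,000-node head model, and a direct factorization stands in for the paper's GPU-parallelized solver; this illustrates the one-matrix-many-sources structure only.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

n = 100
# Illustrative diffusion + absorption terms on a 1-D chain of nodes
main = 2.1 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format="csc")

lu = splu(A)                    # factorize the system matrix once

sources = []
for node in (10, 50, 90):       # three hypothetical source positions
    b = np.zeros(n)
    b[node] = 1.0               # point source term at this node
    sources.append(lu.solve(b)) # each extra source costs only a solve

print("solved", len(sources), "sources with one factorization")
```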
More-Realistic Digital Modeling of a Human Body
NASA Technical Reports Server (NTRS)
Rogge, Renee
2010-01-01
A MATLAB computer program has been written to enable improved (relative to an older program) modeling of a human body for purposes of designing space suits and other hardware with which an astronaut must interact. The older program implements a kinematic model based on traditional anthropometric measurements that do provide important volume and surface information. The present program generates a three-dimensional (3D) whole-body model from 3D body-scan data. The program utilizes thin-plate spline theory to reposition the model without need for additional scans.
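The program itself is written in MATLAB; purely as an illustration, the thin-plate-spline repositioning step can be sketched with SciPy's `RBFInterpolator`. The landmark correspondences and point cloud below are synthetic stand-ins for body-scan data:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
landmarks_src = rng.uniform(-1, 1, size=(12, 3))                # scan pose
landmarks_dst = landmarks_src + 0.1 * rng.normal(size=(12, 3))  # target pose

# Thin-plate-spline interpolant of the landmark displacement field
warp = RBFInterpolator(landmarks_src, landmarks_dst - landmarks_src,
                       kernel="thin_plate_spline")

cloud = rng.uniform(-1, 1, size=(1000, 3))   # stand-in body-scan points
cloud_reposed = cloud + warp(cloud)          # reposition the whole model

# The warp reproduces the landmark correspondences exactly (interpolation)
print(np.max(np.abs(landmarks_src + warp(landmarks_src) - landmarks_dst)))
```

Because the warp is smooth away from the landmarks, the whole scan can be reposed without re-scanning, which is the point the abstract makes.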
Data Visualization and Animation Lab (DVAL) overview
NASA Technical Reports Server (NTRS)
Stacy, Kathy; Vonofenheim, Bill
1994-01-01
The general capabilities of the Langley Research Center Data Visualization and Animation Laboratory are described. These capabilities include digital image processing, 3-D interactive computer graphics, data visualization and analysis, video-rate acquisition and processing of video images, photo-realistic modeling and animation, video report generation, and color hardcopies. A specialized video image processing system is also discussed.
ERIC Educational Resources Information Center
Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark
2016-01-01
A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models,…
Sharp Interface Tracking in Rotating Microflows of Solvent Extraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glimm, James; Almeida, Valmor de; Jiao, Xiangmin
2013-01-08
The objective of this project is to develop a specialized sharp interface tracking simulation capability for predicting interaction of micron-sized drops and bubbles in rotating flows relevant to optimized design of contactor devices used in solvent extraction processes of spent nuclear fuel reprocessing. The primary outcomes of this project include the capability to resolve drops and bubbles micro-hydrodynamics in solvent extraction contactors, determining from first principles continuum fluid mechanics how micro-drops and bubbles interact with each other and the surrounding shearing fluid for realistic flows. In the near term, this effort will play a central role in providing parameters and insight into the flow dynamics of models that average over coarser scales, say at the millimeter unit length. In the longer term, it will prove to be the platform to conduct full-device, detailed simulations as parallel computing power reaches the exaflop level. The team will develop an accurate simulation tool for flows containing interacting droplets and bubbles with sharp interfaces under conditions that mimic those found in realistic contactor operations. The main objective is to create an off-line simulation capability to model drop and bubble interactions in a domain representative of the averaged length scale. The technical approach is to combine robust interface tracking software, subgrid modeling, validation quality experiments, powerful computational hardware, and a team with simulation modeling, physical modeling and technology integration experience. Simulations will then fully resolve the microflow of drops and bubbles at the microsecond time scale. This approach is computationally intensive but very accurate in treating important coupled physical phenomena in the vicinity of interfaces.
The method makes it possible to resolve spatial scales smaller than the typical distance between bubbles and to model some non-equilibrium thermodynamic features such as finite critical tension in cavitating liquids.
NASA Astrophysics Data System (ADS)
Jahandari, H.; Farquharson, C. G.
2017-11-01
Unstructured grids enable representing arbitrary structures more accurately and with fewer cells compared to regular structured grids. These grids also allow more efficient refinements compared to rectilinear meshes. In this study, tetrahedral grids are used for the inversion of magnetotelluric (MT) data, which allows for the direct inclusion of topography in the model, for constraining an inversion using a wireframe-based geological model and for local refinement at the observation stations. A minimum-structure method with an iterative model-space Gauss-Newton algorithm for optimization is used. An iterative solver is employed for solving the normal system of equations at each Gauss-Newton step, and the sensitivity matrix-vector products that are required by this solver are calculated using pseudo-forward problems. This method alleviates the need to explicitly form the Hessian or Jacobian matrices, which significantly reduces the required computation memory. Forward problems are formulated using an edge-based finite-element approach, and a sparse direct solver is used for the solutions. This solver allows saving and re-using the factorization of matrices for similar pseudo-forward problems within a Gauss-Newton iteration, which greatly reduces the computation time. Two examples are presented to show the capability of the algorithm: the first example uses a benchmark model while the second example represents a realistic geological setting with topography and a sulphide deposit. The data that are inverted are the full-tensor impedance and the magnetic transfer function vector. The inversions sufficiently recovered the models and reproduced the data, which shows the effectiveness of unstructured grids for complex and realistic MT inversion scenarios. The first example is also used to demonstrate the computational efficiency of the presented model-space method by comparison with its data-space counterpart.
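The matrix-free structure of such a Gauss-Newton step can be sketched as follows. A small dense matrix stands in for the sensitivity operator, which in the real method is never formed: each `J v` and `J^T w` product would be computed by a pseudo-forward problem. Dimensions and the regularization weight are invented.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(1)
n_data, n_model = 40, 25
J = rng.normal(size=(n_data, n_model))   # stand-in; never stored in practice
r = rng.normal(size=n_data)              # data residual at this iteration
lam = 1e-2                               # regularization weight (illustrative)

def normal_matvec(v):
    # stands in for two pseudo-forward solves: w = J v, then J^T w
    return J.T @ (J @ v) + lam * v

# Solve (J^T J + lam*I) dm = J^T r iteratively, without forming the Hessian
A = LinearOperator((n_model, n_model), matvec=normal_matvec, dtype=float)
dm, info = cg(A, J.T @ r)                # Gauss-Newton model update
print("CG converged:", info == 0, "| update norm:", np.linalg.norm(dm))
```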
Coniferous canopy BRF simulation based on 3-D realistic scene.
Wang, Xin-Yun; Guo, Zhi-Feng; Qin, Wen-Han; Sun, Guo-Qing
2011-09-01
It is difficult for computer simulation methods to study the radiation regime at large scales. A simplified coniferous model was investigated in the present study. It makes computer simulation methods such as L-systems and the radiosity-graphics combined method (RGM) more powerful in remote sensing of heterogeneous coniferous forests over a large-scale region. L-systems is applied to render 3-D coniferous forest scenarios, and the RGM model was used to calculate the BRF (bidirectional reflectance factor) in the visible and near-infrared regions. Results in this study show that in most cases both agreed well. Meanwhile, at the tree and forest levels, the results are also good.
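The L-systems step is a string-rewriting process whose output a renderer turns into branch geometry. A minimal sketch with a generic branching rule (not the paper's actual production rules):

```python
# Minimal L-system: repeatedly rewrite every symbol using its production rule.
def lsystem(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# F = grow forward, [ ] = push/pop a branch, + / - = turn: a common
# turtle-graphics encoding that a renderer expands into 3-D branch geometry.
rules = {"F": "F[+F]F[-F]F"}
print(lsystem("F", rules, 2))
```

Each iteration multiplies the branch count, which is how a compact rule set generates a whole forest scene.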
Coniferous Canopy BRF Simulation Based on 3-D Realistic Scene
NASA Technical Reports Server (NTRS)
Wang, Xin-yun; Guo, Zhi-feng; Qin, Wen-han; Sun, Guo-qing
2011-01-01
It is difficult for computer simulation methods to study the radiation regime at large scales. A simplified coniferous model was investigated in the present study. It makes computer simulation methods such as L-systems and the radiosity-graphics combined method (RGM) more powerful in remote sensing of heterogeneous coniferous forests over a large-scale region. L-systems is applied to render 3-D coniferous forest scenarios, and the RGM model was used to calculate the BRF (bidirectional reflectance factor) in the visible and near-infrared regions. Results in this study show that in most cases both agreed well. Meanwhile, at the tree and forest levels, the results are also good.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis
The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; ...
2017-07-11
The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
Real-time computing platform for spiking neurons (RT-spike).
Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael
2006-07-01
A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
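The time-driven update the platform implements in hardware can be sketched in software: each clock tick advances an exponentially decaying synaptic conductance and a leaky membrane in parallel. This is a generic conductance-based leaky integrate-and-fire sketch of that scheme; all constants are illustrative, not the paper's.

```python
import numpy as np

dt = 0.1e-3                  # 0.1 ms clock tick
tau_m, tau_s = 20e-3, 5e-3   # membrane and synaptic time constants (s)
v_rest, v_thresh, v_reset = -70e-3, -54e-3, -70e-3
e_syn = 0.0                  # excitatory synaptic reversal potential (V)

v, g = v_rest, 0.0           # membrane voltage and synaptic conductance
spikes = []
for step in range(5000):     # 0.5 s of simulated time
    if step % 50 == 0:       # presynaptic spike every 5 ms
        g += 10.0            # conductance jump (dimensionless weight)
    g -= dt / tau_s * g                          # conductance decay
    dv = (-(v - v_rest) - g * (v - e_syn)) / tau_m
    v += dt * dv                                 # leaky integration
    if v >= v_thresh:                            # threshold crossing
        spikes.append(step * dt)
        v = v_reset
print("spikes fired:", len(spikes))
```

The gradual charge injection through `g` is exactly the feature the abstract says makes event-driven software approaches awkward and motivates the clock-driven hardware pipeline.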
Nonhydrostatic icosahedral atmospheric model (NICAM) for global cloud resolving simulations
NASA Astrophysics Data System (ADS)
Satoh, M.; Matsuno, T.; Tomita, H.; Miura, H.; Nasuno, T.; Iga, S.
2008-03-01
A new type of ultra-high resolution atmospheric global circulation model is developed. The new model is designed to perform "cloud resolving simulations" by directly calculating deep convection and meso-scale circulations, which play key roles not only in the tropical circulations but in the global circulations of the atmosphere. Since cores of deep convection have a few km in horizontal size, they have not directly been resolved by existing atmospheric general circulation models (AGCMs). In order to drastically enhance horizontal resolution, a new framework of a global atmospheric model is required; we adopted nonhydrostatic governing equations and icosahedral grids to the new model, and call it Nonhydrostatic ICosahedral Atmospheric Model (NICAM). In this article, we review governing equations and numerical techniques employed, and present the results from the unique 3.5-km mesh global experiments—with O(10^9) computational nodes—using realistic topography and land/ocean surface thermal forcing. The results show realistic behaviors of multi-scale convective systems in the tropics, which have not been captured by AGCMs. We also argue future perspective of the roles of the new model in the next generation atmospheric sciences.
Towards Modeling False Memory With Computational Knowledge Bases.
Li, Justin; Kohanyi, Emma
2017-01-01
One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
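A toy sketch of the spreading-activation mechanism the study layers on top of such knowledge bases: activation flows from studied words to their associates, so an unstudied but densely connected "critical lure" ends up highly active. The mini-network below is hand-made; the paper draws its networks from WordNet and DBpedia.

```python
# Undirected associative links of a tiny DRM-style network (hand-made example)
edges = {
    "bed": ["sleep", "rest", "pillow"],
    "tired": ["sleep", "rest"],
    "dream": ["sleep"],
    "pillow": ["sleep", "bed"],
}

activation = {}
for studied in ("bed", "tired", "dream", "pillow"):   # the study list
    activation[studied] = activation.get(studied, 0.0) + 1.0
    for neighbor in edges.get(studied, []):           # one spreading step
        activation[neighbor] = activation.get(neighbor, 0.0) + 0.5

# "sleep" was never studied but receives activation from every list word,
# modeling the false-memory intrusion in the DRM task.
print(sorted(activation.items(), key=lambda kv: -kv[1])[:3])
```

The paper's point is that in real knowledge bases many neighbors are irrelevant, so this spreading step becomes noisy and expensive.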
Three-Dimensional Mechanical Model of the Human Spine and the Versatility of its Use
NASA Astrophysics Data System (ADS)
Sokol, Milan; Velísková, Petra; Rehák, Ľuboš; Žabka, Martin
2014-03-01
The aim of this work is the simulation and modeling of the lumbar and thoracic human spine as a load-bearing 3D system in a computer program (ANSYS). The human spine model includes a determination of the geometry based on X-ray pictures of frontal and lateral projections. For this reason, another computer code, BMPCOORDINATES, was developed as an aid to obtain the most precise and realistic model of the spine. Various positions, deformations, scoliosis, rotation and torsion can be modelled. Once the geometry is done, external loading on different spinal segments is entered; consequently, the response can be analysed. This can contribute much to medical practice as a tool for diagnosis and for developing implants or other artificial instruments for fixing the spine.
Realistic terrain visualization based on 3D virtual world technology
NASA Astrophysics Data System (ADS)
Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai
2009-09-01
The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.
Realistic terrain visualization based on 3D virtual world technology
NASA Astrophysics Data System (ADS)
Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai
2010-11-01
The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.
NASA Astrophysics Data System (ADS)
Rundle, P. B.; Rundle, J. B.; Morein, G.; Donnellan, A.; Turcotte, D.; Klein, W.
2004-12-01
The research community is rapidly moving towards the development of an earthquake forecast technology based on the use of complex, system-level earthquake fault system simulations. Using these topologically and dynamically realistic simulations, it is possible to develop ensemble forecasting methods similar to that used in weather and climate research. To effectively carry out such a program, one needs 1) a topologically realistic model to simulate the fault system; 2) data sets to constrain the model parameters through a systematic program of data assimilation; 3) a computational technology making use of modern paradigms of high performance and parallel computing systems; and 4) software to visualize and analyze the results. In particular, we focus attention on a new version of our code Virtual California (version 2001) in which we model all of the major strike slip faults in California, from the Mexico-California border to the Mendocino Triple Junction. Virtual California is a "backslip model", meaning that the long term rate of slip on each fault segment in the model is matched to the observed rate. We use the historic data set of earthquakes larger than magnitude M > 6 to define the frictional properties of 650 fault segments (degrees of freedom) in the model. To compute the dynamics and the associated surface deformation, we use message passing as implemented in the MPICH standard distribution on a Beowulf cluster consisting of >10 CPUs. We will also report results from implementing the code on significantly larger machines so that we can begin to examine much finer spatial scales of resolution, and to assess scaling properties of the code. We present results of simulations both as static images and as mpeg movies, so that the dynamical aspects of the computation can be assessed by the viewer. We compute a variety of statistics from the simulations, including magnitude-frequency relations, and compare these with data from real fault systems.
We report recent results on use of Virtual California for probabilistic earthquake forecasting for several sub-groups of major faults in California. These methods have the advantage that system-level fault interactions are explicitly included, as well as laboratory-based friction laws.
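One of the catalog statistics mentioned above, the Gutenberg-Richter magnitude-frequency relation log10 N(>=M) = a - b*M, can be sketched as follows. The catalog is synthetic (drawn with b = 1), and the b-value is recovered with the standard Aki maximum-likelihood estimator; none of the numbers come from Virtual California itself.

```python
import numpy as np

rng = np.random.default_rng(42)
m_min, b_true = 6.0, 1.0
# Gutenberg-Richter magnitudes are exponentially distributed above the
# completeness magnitude m_min, with rate b*ln(10).
mags = m_min + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=20000)

# Aki (1965) maximum-likelihood estimate of the b-value
b_est = np.log10(np.e) / (mags.mean() - m_min)
print(f"estimated b-value: {b_est:.3f}")
```

Comparing such a b-value from simulated catalogs against the observed value is one way the simulations are validated against real fault systems.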
NASA Astrophysics Data System (ADS)
Blanco, Francesco; La Rocca, Paola; Petta, Catia; Riggi, Francesco
2009-01-01
An educational model simulation of the sound produced by lightning in the sky has been employed to demonstrate realistic signatures of thunder and its connection to the particular structure of the lightning channel. Algorithms used in the past have been revisited and implemented, making use of current computer techniques. The basic properties of the mathematical model, together with typical results and suggestions for additional developments are discussed. The paper is intended as a teaching aid for students and teachers in the context of introductory physics courses at university level.
Computer Model Predicts the Movement of Dust
NASA Technical Reports Server (NTRS)
2002-01-01
A new computer model of the atmosphere can now actually pinpoint where global dust events come from, and can project where they're going. The model may help scientists better evaluate the impact of dust on human health, climate, ocean carbon cycles, ecosystems, and atmospheric chemistry. Also, by seeing where dust originates and where it blows, people with respiratory problems can get advance warning of approaching dust clouds. 'The model is physically more realistic than previous ones,' said Mian Chin, a co-author of the study and an Earth and atmospheric scientist at Georgia Tech and the Goddard Space Flight Center (GSFC) in Greenbelt, Md. 'It is able to reproduce the short term day-to-day variations and long term inter-annual variations of dust concentrations and distributions that are measured from field experiments and observed from satellites.' The above images show both aerosols measured from space (left) and the movement of aerosols predicted by computer model for the same date (right). For more information, read New Computer Model Tracks and Predicts Paths Of Earth's Dust. Images courtesy Paul Giroux, Georgia Tech/NASA Goddard Space Flight Center
High performance MRI simulations of motion on multi-GPU systems.
Xanthis, Christos G; Venetis, Ioannis E; Aletras, Anthony H
2014-07-04
MRI physics simulators have been developed in the past for optimizing imaging protocols and for training purposes. However, these simulators have only addressed motion within a limited scope. The purpose of this study was the incorporation of realistic motion, such as cardiac motion, respiratory motion and flow, within MRI simulations in a high performance multi-GPU environment. Three different motion models were introduced in the Magnetic Resonance Imaging SIMULator (MRISIMUL) of this study: cardiac motion, respiratory motion and flow. Simulation of a simple Gradient Echo pulse sequence and a CINE pulse sequence on the corresponding anatomical model was performed. Myocardial tagging was also investigated. In pulse sequence design, software crushers were introduced to accommodate the long execution times in order to avoid spurious echo formation. The displacement of the anatomical model isochromats was calculated within the Graphics Processing Unit (GPU) kernel for every timestep of the pulse sequence. Experiments that would allow simulation of custom anatomical and motion models were also performed. Last, simulations of motion with MRISIMUL on single-node and multi-node multi-GPU systems were examined. Gradient Echo and CINE images of the three motion models were produced and motion-related artifacts were demonstrated. The temporal evolution of the contractility of the heart was presented through the application of myocardial tagging. Better simulation performance and image quality were obtained through the introduction of software crushers without the need to further increase the computational load and GPU resources. Finally, MRISIMUL demonstrated almost linear performance scaling with the increasing number of available GPU cards, in both single-node and multi-node multi-GPU computer systems. MRISIMUL is the first MR physics simulator to have implemented motion with a large 3D computational load on a single computer multi-GPU configuration.
The incorporation of realistic motion models, such as cardiac motion, respiratory motion and flow may benefit the design and optimization of existing or new MR pulse sequences, protocols and algorithms, which examine motion related MR applications.
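The per-timestep work an MR physics simulator performs for each isochromat can be sketched with a single free-precession Bloch update: rotation about B0 (off-resonance) plus T1/T2 relaxation. MRISIMUL evaluates this kind of step in GPU kernels across millions of isochromats; the tissue values below are generic, not taken from the paper.

```python
import numpy as np

def bloch_step(m, dt, off_res_hz, t1, t2, m0=1.0):
    """Advance magnetization m = (mx, my, mz) by dt of free precession."""
    mx, my, mz = m
    phi = 2.0 * np.pi * off_res_hz * dt          # precession angle
    c, s = np.cos(phi), np.sin(phi)
    e1, e2 = np.exp(-dt / t1), np.exp(-dt / t2)
    mx, my = (c * mx + s * my) * e2, (-s * mx + c * my) * e2   # T2 decay
    mz = m0 + (mz - m0) * e1                      # T1 recovery toward m0
    return np.array([mx, my, mz])

m = np.array([1.0, 0.0, 0.0])                     # just after a 90-degree pulse
for _ in range(1000):                             # 1000 steps of 0.1 ms
    m = bloch_step(m, 1e-4, off_res_hz=50.0, t1=0.85, t2=0.05)
print("transverse magnitude after 100 ms:", np.hypot(m[0], m[1]))
```

Motion enters by displacing each isochromat between steps, which changes the local field (and hence `off_res_hz`) it experiences under the applied gradients.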
Ly, Cheng
2013-10-01
The population density approach to neural network modeling has been utilized in a variety of contexts. The idea is to group many similar noisy neurons into populations and track the probability density function for each population that encompasses the proportion of neurons with a particular state rather than simulating individual neurons (i.e., Monte Carlo). It is commonly used for both analytic insight and as a time-saving computational tool. The main shortcoming of this method is that when realistic attributes are incorporated in the underlying neuron model, the dimension of the probability density function increases, leading to intractable equations or, at best, computationally intensive simulations. Thus, developing principled dimension-reduction methods is essential for the robustness of these powerful methods. As a more pragmatic tool, it would be of great value for the larger theoretical neuroscience community. For exposition of this method, we consider a single uncoupled population of leaky integrate-and-fire neurons receiving external excitatory synaptic input only. We present a dimension-reduction method that reduces a two-dimensional partial differential-integral equation to a computationally efficient one-dimensional system and gives qualitatively accurate results in both the steady-state and nonequilibrium regimes. The method, termed modified mean-field method, is based entirely on the governing equations and not on any auxiliary variables or parameters, and it does not require fine-tuning. The principles of the modified mean-field method have potential applicability to more realistic (i.e., higher-dimensional) neural networks.
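The object the population-density method evolves, the distribution of membrane voltages across many identical noisy neurons, can be illustrated with the Monte Carlo simulation it is designed to replace. This sketch simulates an uncoupled population of leaky integrate-and-fire neurons with generic parameters (not the paper's) and histograms the voltages:

```python
import numpy as np

rng = np.random.default_rng(7)
n, dt, tau = 20000, 1e-4, 20e-3          # population size, step, time constant
mu, sigma = 1.2, 0.5                     # mean drive and noise strength
v_thresh, v_reset = 1.0, 0.0

v = np.zeros(n)                          # nondimensional membrane voltages
for _ in range(2000):                    # 200 ms of Euler-Maruyama steps
    v += dt / tau * (mu - v) + sigma * np.sqrt(dt / tau) * rng.normal(size=n)
    fired = v >= v_thresh
    v[fired] = v_reset                   # fire-and-reset boundary condition

# Empirical voltage density: the quantity the density method evolves as a PDE
hist, edges = np.histogram(v, bins=50, range=(-0.5, 1.0))
density = hist / (n * (edges[1] - edges[0]))
print("fraction of neurons in histogram range:", hist.sum() / n)
```

The density approach replaces the 20,000 trajectories with one equation for this histogram; the paper's contribution is keeping that equation low-dimensional when the neuron model gains realistic attributes.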
Development of a model of the coronary arterial tree for the 4D XCAT phantom
NASA Astrophysics Data System (ADS)
Fung, George S. K.; Segars, W. Paul; Gullberg, Grant T.; Tsui, Benjamin M. W.
2011-09-01
A detailed three-dimensional (3D) model of the coronary artery tree with cardiac motion has great potential for applications in a wide variety of medical imaging research areas. In this work, we first developed a computer-generated 3D model of the coronary arterial tree for the heart in the extended cardiac-torso (XCAT) phantom, thereby creating a realistic computer model of the human anatomy. The coronary arterial tree model was based on two datasets: (1) a gated cardiac dual-source computed tomography (CT) angiographic dataset obtained from a normal human subject and (2) statistical morphometric data of porcine hearts. The initial proximal segments of the vasculature and the anatomical details of the boundaries of the ventricles were defined by segmenting the CT data. An iterative rule-based generation method was developed and applied to extend the coronary arterial tree beyond the initial proximal segments. The algorithm was governed by three factors: (1) statistical morphometric measurements of the connectivity, lengths and diameters of the arterial segments; (2) avoidance forces from other vessel segments and the boundaries of the myocardium, and (3) optimality principles which minimize the drag force at the bifurcations of the generated tree. Using this algorithm, the 3D computational model of the largest six orders of the coronary arterial tree was generated, which spread across the myocardium of the left and right ventricles. The 3D coronary arterial tree model was then extended to 4D to simulate different cardiac phases by deforming the original 3D model according to the motion vector map of the 4D cardiac model of the XCAT phantom at the corresponding phases. As a result, a detailed and realistic 4D model of the coronary arterial tree was developed for the XCAT phantom by imposing constraints of anatomical and physiological characteristics of the coronary vasculature. 
This new 4D coronary artery tree model provides a unique simulation tool that can be used in the development and evaluation of instrumentation and methods for imaging normal and pathological hearts with myocardial perfusion defects.
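The bifurcation "optimality principle" in tree-generation algorithms of this kind is commonly a Murray-type radius-exponent law, in which minimizing pumping work (closely related to minimizing drag) fixes the daughter radii. Below is a minimal sketch of that rule in isolation, with a hypothetical flow split and the classic exponent of 3; the paper's actual algorithm couples its optimality principle to morphometric statistics and avoidance forces.

```python
def murray_daughter_radii(r_parent, flow_split=0.5, exponent=3.0):
    """Daughter radii at a bifurcation under a Murray-type optimality rule,
    r_parent**k = r1**k + r2**k, with flow partitioned in proportion to r**k.
    flow_split is the fraction of parent flow routed to daughter 1."""
    r1 = r_parent * flow_split ** (1.0 / exponent)
    r2 = r_parent * (1.0 - flow_split) ** (1.0 / exponent)
    return r1, r2

# symmetric split: each daughter carries half the flow of the parent
r1, r2 = murray_daughter_radii(1.0, flow_split=0.5)
```

For the symmetric case each daughter radius is (1/2)^(1/3) of the parent, so total cross-sectional area increases slightly at every generation, as observed in real arterial trees.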
Hybrid reduced order modeling for assembly calculations
Bang, Youngsuk; Abdel-Khalik, Hany S.; Jessee, Matthew A.; ...
2015-08-14
While the accuracy of assembly calculations has greatly improved due to the increase in computer power, enabling a more refined description of the phase space and the use of more sophisticated numerical algorithms, the computational cost continues to increase, which limits the full utilization of their effectiveness for routine engineering analysis. Reduced order modeling is a mathematical vehicle that scales down the dimensionality of large-scale numerical problems to enable their repeated execution on the small computing environments often available to end users. This is done by capturing the most dominant underlying relationships between the model's inputs and outputs. Previous works demonstrated the use of reduced order modeling for a single-physics code, such as a radiation transport calculation. This paper extends those works to coupled code systems as currently employed in assembly calculations. Finally, numerical tests are conducted using realistic SCALE assembly models with resonance self-shielding, neutron transport, and nuclide transmutation/depletion models representing the components of the coupled code system.
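As a minimal illustration of the idea (not of the SCALE implementation): a reduced order model can be built by sampling an expensive code, extracting the dominant input-output directions with an SVD, and replacing the code with a low-rank surrogate. All names, sizes, and the "full model" below are hypothetical stand-ins.

```python
import numpy as np

# Projection-based reduced order modeling sketch: sample an expensive model,
# extract dominant output directions with an SVD, and build a low-rank
# linear surrogate. The hidden low-rank map stands in for a coupled code.
rng = np.random.default_rng(0)
n_in, n_out, rank = 50, 40, 3

A_true = rng.standard_normal((n_out, rank)) @ rng.standard_normal((rank, n_in))
def full_model(x):
    return A_true @ x              # the "expensive" code being replaced

# snapshot phase: more samples than input dimensions, so the fit is unique
X = rng.standard_normal((n_in, 60))
Y = np.column_stack([full_model(x) for x in X.T])

# dominant output subspace from the snapshot SVD
U, s, _ = np.linalg.svd(Y, full_matrices=False)
U_r = U[:, :rank]

# least-squares map from inputs to reduced coordinates
coeff, *_ = np.linalg.lstsq(X.T, (U_r.T @ Y).T, rcond=None)
def rom(x):
    return U_r @ (coeff.T @ x)     # cheap surrogate: rank 3 instead of 40x50

x_new = rng.standard_normal(n_in)
err = (np.linalg.norm(rom(x_new) - full_model(x_new))
       / np.linalg.norm(full_model(x_new)))
```

Because the toy "full model" is exactly low rank, the surrogate reproduces it to machine precision on unseen inputs; for a real coupled code the truncated singular values quantify the accuracy lost by the reduction.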
Interactions and triggering in a 3D rate and state asperity model
NASA Astrophysics Data System (ADS)
Dublanchet, P.; Bernard, P.
2012-12-01
Precise relocation of micro-seismicity and careful analysis of seismic source parameters have progressively imposed the concept of seismic asperities embedded in a creeping fault segment as one of the most important aspects that should appear in a realistic representation of micro-seismic sources. Another important issue concerning micro-seismic activity is the existence of robust empirical laws describing the temporal and magnitude distribution of earthquakes, such as the Omori law, the distribution of inter-event times and the Gutenberg-Richter law. In this framework, this study aims at understanding the statistical properties of earthquakes by generating synthetic catalogs with a 3D, quasi-dynamic, continuous rate and state asperity model that takes into account a realistic geometry of asperities. Our approach contrasts with the ETAS models (Kagan and Knopoff, 1981) usually implemented to produce earthquake catalogs, in the sense that the nonlinearity observed in rock friction experiments (Dieterich, 1979) is fully taken into account by the use of a rate and state friction law. Furthermore, our model differs from discrete fault models (Ziv and Cochard, 2006) because its continuity allows us to define realistic geometries and distributions of asperities by assembling sub-critical computational cells that always fail in a single event. Moreover, this model allows us to address the question of the influence of barriers and the distribution of asperities on event statistics. After recalling the main observations of asperities in the specific case of the Parkfield segment of the San Andreas Fault, we analyse earthquake statistical properties computed for this area. Then, we present synthetic statistics obtained by our model that allow us to discuss the role of barriers in clustering and triggering phenomena among a population of sources.
It appears that an effective barrier size, which depends on the barrier's frictional strength, controls the presence or absence, in the synthetic catalog, of statistical laws similar to those observed for real earthquakes. As an application, we attempt to draw a comparison between synthetic statistics and the observed statistics of Parkfield in order to characterize what could be a realistic frictional model of the Parkfield area. More generally, we obtained synthetic statistical properties in agreement with power-law decays characterized by exponents that match observations at a global scale, showing that our mechanical model is able to provide new insights into the understanding of earthquake interaction processes in general.
Modeling Images of Natural 3D Surfaces: Overview and Potential Applications
NASA Technical Reports Server (NTRS)
Jalobeanu, Andre; Kuehnel, Frank; Stutz, John
2004-01-01
Generative models of natural images have long been used in computer vision. However, since they only describe 2D scenes, they fail to capture all the properties of the underlying 3D world. Even though such models are sufficient for many vision tasks, a 3D scene model is needed when it comes to inferring a 3D object or its characteristics. In this paper, we present such a generative model, incorporating both a multiscale surface prior model for surface geometry and reflectance, and an image formation process model based on realistic rendering. We show how to efficiently invert the model within a Bayesian framework, focusing on the computation of the posterior model parameter densities and on the critical aspects of the rendering. We present a few potential applications, such as asteroid modeling and planetary topography recovery, illustrated by promising results on real images.
Towards a systematic construction of realistic D-brane models on a del Pezzo singularity
NASA Astrophysics Data System (ADS)
Dolan, Matthew J.; Krippendorf, Sven; Quevedo, Fernando
2011-10-01
A systematic approach is followed in order to identify realistic D-brane models at toric del Pezzo singularities. Requiring the quark and lepton spectrum and Yukawas from D3 branes and a massless hypercharge, we are led to Pati-Salam extensions of the Standard Model. Hierarchies of masses, flavour mixings and control of couplings select higher order del Pezzo singularities; minimising the Higgs sector prefers toric del Pezzos, with dP3 providing the most successful compromise. A supersymmetric local string model is then presented with the following properties at low energies: (i) the MSSM spectrum plus a local B - L gauge field or additional Higgs fields, depending on the breaking pattern; (ii) a realistic hierarchy of quark and lepton masses; and (iii) realistic flavour mixing between quark and lepton families with computable CKM and PMNS matrices, and CP violation consistent with observations. In this construction, kinetic terms are diagonal and under calculational control, suppressing standard FCNC contributions. Proton decay operators of dimension 4, 5 and 6 are suppressed, and gauge couplings can unify, depending on the breaking scales, from string scales at energies in the range 10^12-10^16 GeV, consistent with TeV soft masses from moduli-mediated supersymmetry breaking. The GUT scale model corresponds to D3 branes at dP3 with two copies of the Pati-Salam gauge symmetry SU(4) × SU(2)_R × SU(2)_L. D-brane instantons generate a non-vanishing μ-term. Right-handed sneutrinos can break the B - L symmetry and induce a see-saw mechanism for neutrino masses and R-parity violating operators with observable low-energy implications.
Experimentally validated modification to Cook-Torrance BRDF model for improved accuracy
NASA Astrophysics Data System (ADS)
Butler, Samuel D.; Ethridge, James A.; Nauyoks, Stephen E.; Marciniak, Michael A.
2017-09-01
The BRDF describes optical scatter off realistic surfaces. The microfacet BRDF model assumes geometric optics but is computationally simple compared to wave optics models. In this work, MERL BRDF data are fitted to the original Cook-Torrance microfacet model, and to a modified Cook-Torrance model that uses the polarization factor in place of the mathematically problematic cross section conversion and geometric attenuation terms. The results provide experimental evidence that this modified Cook-Torrance model leads to improved fits, particularly at large incident and scattered angles. These results are expected to lead to more accurate BRDF modeling for remote sensing.
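For reference, the original Cook-Torrance form fitted here, f = D·F·G / (4 cos θi cos θo), can be evaluated in the plane of incidence as below. The Beckmann distribution, Schlick Fresnel approximation, "V-cavity" geometric attenuation, and all parameter values are common textbook choices standing in for the paper's actual fitting setup.

```python
import math

# In-plane Cook-Torrance microfacet BRDF sketch: D = microfacet distribution,
# F = Fresnel reflectance, G = shadowing/masking, over 4*cos(ti)*cos(to).
def cook_torrance(theta_i, theta_o, roughness=0.3, f0=0.04):
    # incidence on one side of the normal, exitance on the other
    wi = (-math.sin(theta_i), math.cos(theta_i))
    wo = ( math.sin(theta_o), math.cos(theta_o))
    hx, hy = wi[0] + wo[0], wi[1] + wo[1]
    hn = math.hypot(hx, hy)
    h = (hx / hn, hy / hn)                    # half vector
    ci, co, ch = wi[1], wo[1], h[1]           # cosines w.r.t. the normal
    cd = wi[0] * h[0] + wi[1] * h[1]          # cos(angle between wi and h)
    t2 = (1.0 - ch * ch) / (ch * ch)          # tan^2 of the facet angle
    # Beckmann distribution, Schlick Fresnel, Cook-Torrance attenuation
    D = math.exp(-t2 / roughness**2) / (math.pi * roughness**2 * ch**4)
    F = f0 + (1.0 - f0) * (1.0 - cd) ** 5
    G = min(1.0, 2.0 * ch * ci / cd, 2.0 * ch * co / cd)
    return D * F * G / (4.0 * ci * co)

peak = cook_torrance(0.5, 0.5)   # mirror direction: specular lobe maximum
off = cook_torrance(0.5, 1.2)    # away from the lobe: lower reflectance
```

The specular lobe peaks near the mirror direction and falls off as the half vector tilts away from the surface normal, which is the behavior the distribution term D controls.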
On coarse projective integration for atomic deposition in amorphous systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chuang, Claire Y.; Sinno, Talid; Han, Sang M.
2015-10-07
Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of time scales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity, and computational efficiency. Coarse projective integration, an example application of the “equation-free” framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute time derivatives of slowly evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of this technique in realistic settings is the “lifting” operation, in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations composed of collections of Ge islands on amorphous SiO2 using only a few measures of the island size distribution. The approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.
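The outer loop of coarse projective integration can be sketched generically: short bursts of fine-scale simulation estimate the time derivative of the coarse variable, which is then extrapolated over a large step. The toy relaxation ODE below stands in for the molecular dynamics inner simulator; in a real application the projected coarse state must also be "lifted" back to a consistent atomistic configuration, which is the hard part the abstract describes.

```python
# Coarse projective integration sketch with a toy fine-scale simulator.
def fine_step(y, dt, rate=1.0, target=2.0):
    # inner simulator: one explicit step of dy/dt = rate * (target - y)
    return y + dt * rate * (target - y)

def projective_step(y, dt_fine, n_fine, dt_project):
    ys = [y]
    for _ in range(n_fine):                 # burst of short fine steps
        ys.append(fine_step(ys[-1], dt_fine))
    dydt = (ys[-1] - ys[-2]) / dt_fine      # coarse derivative estimate
    return ys[-1] + dt_project * dydt       # project over the large step

y, t = 0.0, 0.0
while t < 5.0:
    y = projective_step(y, dt_fine=0.01, n_fine=10, dt_project=0.4)
    t += 10 * 0.01 + 0.4
# y approaches the fixed point (2.0) of the toy dynamics while taking far
# fewer fine steps than direct integration over the same interval would need
```

Each outer cycle spends only 10 fine steps (0.1 time units of inner simulation) to advance 0.5 time units of coarse time, which is the source of the method's speedup.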
NDE and SHM Simulation for CFRP Composites
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Parker, F. Raymond
2014-01-01
Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic 3-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.
On Coarse Projective Integration for Atomic Deposition in Amorphous Systems
Chuang, Claire Y.; Han, Sang M.; Zepeda-Ruiz, Luis A.; ...
2015-10-02
Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of timescales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity and computational efficiency. Coarse projective integration, an example application of the ‘equation-free’ framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute gradients of slowly evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of this technique in realistic settings is the ‘lifting’ operation, in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations composed of collections of Ge islands on amorphous SiO2 using only a few measures of the island size distribution. In conclusion, the approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.
Neuron Bifurcations in an Analog Electronic Burster
NASA Astrophysics Data System (ADS)
Savino, Guillermo V.; Formigli, Carlos M.
2007-05-01
Although bursting electrical activity is typical of some brain neurons and biological excitable systems, its functions and mechanisms of generation are as yet unknown. In modeling such complex oscillations, analog electronic models are faster than mathematical ones, whether phenomenologically or theoretically based. We show experimentally that bursting oscillator circuits can be greatly simplified by using the nonlinear characteristics of two bipolar transistors. Since our circuit qualitatively mimics the bursting activity of Hodgkin-Huxley model neurons, and the bifurcations that give rise to neuro-computational properties, it is not merely a caricature but a realistic model.
2013-11-01
duration, or shock-pulse shape. Used in this computational study is a coarse-grained model of the lipid vesicle as a simplified model of a cell... realistic detail but to focus on a simple model of the major constituent of a cell membrane, the phospholipid bilayer. In this work, we studied the
NASA Astrophysics Data System (ADS)
Fee, David; Izbekov, Pavel; Kim, Keehoon; Yokoo, Akihiko; Lopez, Taryn; Prata, Fred; Kazahaya, Ryunosuke; Nakamichi, Haruhisa; Iguchi, Masato
2017-12-01
Eruption mass and mass flow rate are critical parameters for determining the aerial extent and hazard of volcanic emissions. Infrasound waveform inversion is a promising technique to quantify volcanic emissions. Although topography may substantially alter the infrasound waveform as it propagates, advances in wave propagation modeling and station coverage permit robust inversion of infrasound data from volcanic explosions. The inversion can estimate eruption mass flow rate and total eruption mass if the flow density is known. However, infrasound-based eruption flow rates and mass estimates have yet to be validated against independent measurements, and numerical modeling has only recently been applied to the inversion technique. Here we present a robust full-waveform acoustic inversion method, and use it to calculate eruption flow rates and masses for 49 explosions from Sakurajima Volcano, Japan. Six infrasound stations deployed from 12 to 20 February 2015 recorded the explosions. We compute numerical Green's functions using 3-D Finite Difference Time Domain modeling and a high-resolution digital elevation model. The inversion, assuming a simple acoustic monopole source, provides realistic eruption masses and an excellent fit to the data for the majority of the explosions. The inversion results are compared to independent eruption masses derived from ground-based ash collection and volcanic gas measurements. Assuming realistic flow densities, our infrasound-derived eruption masses for ash-rich eruptions compare favorably to the ground-based estimates, with agreement ranging from within a factor of two to one order of magnitude. Uncertainties in the time-dependent flow density and acoustic propagation likely contribute to the mismatch between the methods. Our results suggest that realistic and accurate infrasound-based eruption mass and mass flow rate estimates can be computed using the method employed here.
If accurate volcanic flow parameters are known, this technique could be broadly applied to enable near real-time calculation of eruption mass flow rates and total masses. These critical input parameters for volcanic eruption modeling and monitoring are not currently available.
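As a sketch of the underlying relation (not the paper's method, which uses numerical 3-D FDTD Green's functions over real topography): for a free-space acoustic monopole, p(r, t) = ρ_air · Q̇(t − r/c) / (4πr), so a pressure record can be scaled back to the volume acceleration Q̇, integrated to the volume flow rate Q, and integrated again to an erupted volume; multiplying by an assumed flow density then gives an eruption mass. All numbers below are synthetic.

```python
import numpy as np

# Free-space monopole forward model and its trivial inversion, with a
# synthetic Gaussian pulse for the "true" volume flow rate Q(t).
rho_air, c, r = 1.2, 340.0, 3000.0        # air density, sound speed, range
dt = 0.01
t = np.arange(0.0, 20.0, dt)

Q_true = 4.0e4 * np.exp(-((t - 5.0) / 1.5) ** 2)   # volume flow rate (m^3/s)
Qdot_true = np.gradient(Q_true, dt)                # volume acceleration

# forward model: pressure at the station (propagation delay only shifts t)
p = rho_air * Qdot_true / (4.0 * np.pi * r)

# inversion: recover Qdot from p, integrate to Q, then to erupted volume;
# an assumed flow density converts the volume to an eruption mass
Qdot_inv = p * 4.0 * np.pi * r / rho_air
Q_inv = np.cumsum(Qdot_inv) * dt
volume = np.sum(Q_inv) * dt               # total erupted volume (m^3)
mass = 3.0 * volume                       # assumed flow density of 3 kg/m^3
```

The recovered volume matches the analytic integral of the Gaussian pulse; in practice the flow density assumption dominates the uncertainty of the mass estimate, as the abstract notes.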
CatSim: a new computer assisted tomography simulation environment
NASA Astrophysics Data System (ADS)
De Man, Bruno; Basu, Samit; Chandra, Naveen; Dunham, Bruce; Edic, Peter; Iatrou, Maria; McOlash, Scott; Sainath, Paavana; Shaughnessy, Charlie; Tower, Brendon; Williams, Eugene
2007-03-01
We present a new simulation environment for X-ray computed tomography, called CatSim. CatSim provides a research platform for GE researchers and collaborators to explore new reconstruction algorithms, CT architectures, and X-ray source or detector technologies. The main requirements for this simulator are accurate physics modeling, low computation times, and geometrical flexibility. CatSim allows simulating complex analytic phantoms, such as the FORBILD phantoms, including boxes, ellipsoids, elliptical cylinders, cones, and cut planes. CatSim incorporates polychromaticity, realistic quantum and electronic noise models, finite focal spot size and shape, finite detector cell size, detector cross-talk, detector lag or afterglow, bowtie filtration, finite detector efficiency, non-linear partial volume, scatter (variance-reduced Monte Carlo), and absorbed dose. We present an overview of CatSim along with a number of validation experiments.
Fitting neuron models to spike trains.
Rossant, Cyrille; Goodman, Dan F M; Fontaine, Bertrand; Platkiewicz, Jonathan; Magnusson, Anna K; Brette, Romain
2011-01-01
Computational modeling is increasingly used to understand the function of neural circuits in systems neuroscience. These studies require models of individual neurons with realistic input-output properties. Recently, it was found that spiking models can accurately predict the precisely timed spike trains produced by cortical neurons in response to somatically injected currents, if properly fitted. This requires fitting techniques that are efficient and flexible enough to easily test different candidate models. We present a generic solution, based on the Brian simulator (a neural network simulator in Python), which allows the user to define and fit arbitrary neuron models to electrophysiological recordings. It relies on vectorization and parallel computing techniques to achieve efficiency. We demonstrate its use on neural recordings in the barrel cortex and in the auditory brainstem, and confirm that simple adaptive spiking models can accurately predict the response of cortical neurons. Finally, we show how a complex multicompartmental model can be reduced to a simple effective spiking model.
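A toy version of the fitting task described above, with hypothetical parameters and a frozen input current: simulate a leaky integrate-and-fire (LIF) neuron, then recover its threshold by grid search against the "recorded" spike train. Brian's actual fitting toolbox instead optimizes many parameters at once on vectorized model instances.

```python
import numpy as np

def lif_spikes(I, dt, tau=0.02, R=1.0, v_thresh=1.0, v_reset=0.0):
    """Spike times of a leaky integrate-and-fire neuron driven by current I."""
    v, spikes = 0.0, []
    for k, i_k in enumerate(I):
        v += dt * (-v + R * i_k) / tau      # forward-Euler membrane update
        if v >= v_thresh:
            spikes.append(k * dt)
            v = v_reset
    return spikes

def match(a, b, window=0.002):
    """Symmetric coincidence score in [0, 1]; 1 means identical trains."""
    c = sum(any(abs(x - y) < window for y in b) for x in a)
    return 2.0 * c / (len(a) + len(b)) if (a or b) else 1.0

rng = np.random.default_rng(1)
dt = 1e-4
I = 1.5 + 0.5 * rng.standard_normal(20000)  # 2 s of noisy current at 10 kHz

target = lif_spikes(I, dt, v_thresh=1.0)    # the "recorded" spike train
candidates = np.linspace(0.8, 1.2, 41)      # candidate thresholds
scores = [match(lif_spikes(I, dt, v_thresh=th), target) for th in candidates]
best = float(candidates[int(np.argmax(scores))])
```

With the input current frozen, the score is maximal at the true threshold; real recordings add trial-to-trial noise, which is why coincidence-based metrics and efficient parallel search over candidate models matter.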
NASA Technical Reports Server (NTRS)
Suzen, Y. B.; Huang, P. G.; Ashpis, D. E.; Volino, R. J.; Corke, T. C.; Thomas, F. O.; Huang, J.; Lake, J. P.; King, P. I.
2007-01-01
A transport equation for the intermittency factor is employed to predict transitional flows in low-pressure turbines. The intermittent behavior of the transitional flows is taken into account and incorporated into the computations by modifying the eddy viscosity, mu_p, with the intermittency factor, gamma. Turbulent quantities are predicted using Menter's two-equation (SST) turbulence model. The intermittency factor is obtained from a transport equation model which can reproduce both the experimentally observed streamwise variation of intermittency and a realistic profile in the cross-stream direction. The model had previously been validated against low-pressure turbine experiments with success. In this paper, the model is applied to predictions of three sets of recent low-pressure turbine experiments on the Pack B blade to further validate its predictive capabilities under various flow conditions. Comparisons of computational results with experimental data are provided. Overall, good agreement between the experimental data and computational results is obtained. The new model has been shown to be capable of accurately predicting transitional flows under a wide range of low-pressure turbine conditions.
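The model's central coupling, scaling the eddy viscosity by the intermittency factor before it enters the mean-flow equations, reduces to a one-line operation; the numbers below are illustrative, not values from the paper.

```python
# Intermittency-weighted eddy viscosity: gamma = 0 recovers laminar flow,
# gamma = 1 recovers the fully turbulent SST prediction.
def effective_eddy_viscosity(mu_t, gamma):
    gamma = min(max(gamma, 0.0), 1.0)   # intermittency is bounded in [0, 1]
    return gamma * mu_t

mu_t = 2.4e-3                            # illustrative SST eddy viscosity
laminar = effective_eddy_viscosity(mu_t, 0.0)      # transition not started
transitional = effective_eddy_viscosity(mu_t, 0.4) # partly turbulent flow
turbulent = effective_eddy_viscosity(mu_t, 1.0)    # fully turbulent flow
```

The transport equation's job is to supply a physically realistic spatial field of gamma, so that this blending switches on smoothly through the transition region.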
The role of blood vessels in high-resolution volume conductor head modeling of EEG.
Fiederer, L D J; Vorwerk, J; Lucka, F; Dannhauer, M; Yang, S; Dümpelmann, M; Schulze-Bonhage, A; Aertsen, A; Speck, O; Wolters, C H; Ball, T
2016-03-01
Reconstruction of the electrical sources of human EEG activity at high spatio-temporal accuracy is an important aim in neuroscience and neurological diagnostics. Over the last decades, numerous studies have demonstrated that realistic modeling of head anatomy improves the accuracy of source reconstruction of EEG signals. For example, including a cerebro-spinal fluid compartment and the anisotropy of white matter electrical conductivity were both shown to significantly reduce modeling errors. Here, we for the first time quantify the role of detailed reconstructions of the cerebral blood vessels in volume conductor head modeling for EEG. To study the role of the highly arborized cerebral blood vessels, we created a submillimeter head model based on ultra-high-field-strength (7T) structural MRI datasets. Blood vessels (arteries and emissary/intraosseous veins) were segmented using Frangi multi-scale vesselness filtering. The final head model consisted of a geometry-adapted cubic mesh with over 17 × 10^6 nodes. We solved the forward model using a finite-element-method (FEM) transfer matrix approach, which substantially reduced computation times, and quantified the importance of the blood vessel compartment by computing the forward and inverse errors that result from ignoring the blood vessels. Our results show that ignoring emissary veins piercing the skull leads to focal localization errors of approximately 5 to 15 mm. Large errors (>2 cm) were observed due to the carotid arteries and the dense arterial vasculature in areas such as the insula or the medial temporal lobe. Thus, in such predisposed areas, errors caused by neglecting blood vessels can reach magnitudes similar to those previously reported for neglecting white matter anisotropy, the CSF or the dura - structures which are generally considered important components of realistic EEG head models.
Our findings thus imply that including a realistic blood vessel compartment in EEG head models will be helpful to improve the accuracy of EEG source analyses particularly when high accuracies in brain areas with dense vasculature are required. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
3D multicellular model of shock wave-cell interaction.
Li, Dongli; Hallack, Andre; Cleveland, Robin O; Jérusalem, Antoine
2018-05-01
Understanding the interaction between shock waves and tissue is critical for advancing the use of shock waves for medical applications, such as cancer therapy. This work aims to study shock wave-cell interaction in a more realistic environment, relevant to in vitro and in vivo studies, by using 3D computational models of healthy and cancerous cells. The results indicate that for a single cell embedded in an extracellular environment, the cellular geometry does not significantly influence the membrane strain but does influence the von Mises stress. On the contrary, the presence of neighbouring cells has a strong effect on the cell response, increasing both quantities fourfold. The membrane strain response of a cell converges with more than three neighbouring cell layers, indicating that a cluster of four layers of cells is sufficient to model the membrane strain in a large domain of tissue. However, a full 3D tissue model is needed if the stress evaluation is of main interest. A tumour-mimicking multicellular spheroid model is also proposed to study the mutual interaction between healthy and cancer cells, and shows that cancer cells can be specifically targeted in an early-stage tumour-mimicking environment. This work presents 3D computational models of shock-wave/cell interaction in a biophysically realistic environment, using real cell morphology in a tissue-mimicking phantom and a multicellular spheroid. Results show that cell morphology does not strongly influence the membrane strain but does influence the von Mises stress. While the presence of neighbouring cells significantly increases the cell response, four cell layers are enough to capture the membrane strain change in tissue. However, a full tissue model is necessary if accurate stress analysis is needed. The work also shows that cancer cells can be specifically targeted in an early-stage tumour-mimicking environment.
This work is a step towards realistic modelling of shock-wave/cell interactions in tissues and provides insight on the use of 3D models for different scenarios. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hansen, Akio; Ament, Felix; Lammert, Andrea
2017-04-01
Large-eddy simulations have been performed for several decades, but due to computational limits most studies were restricted to small domains or idealised initial and boundary conditions. Within the High Definition Clouds and Precipitation for advancing Climate Prediction (HD(CP)2) project, realistic, weather-forecast-like LES simulations were performed with the newly developed ICON LES model for several days. The domain covers central Europe with a horizontal resolution down to 156 m. The setup consists of more than 3 billion grid cells, whereby a single 3D dump requires roughly 500 GB. A newly developed online evaluation toolbox was created to check instantaneously whether the model simulations are realistic. The toolbox automatically combines model results with observations and generates quicklooks for various variables. So far, temperature and humidity profiles, cloud cover, integrated water vapour, precipitation and many more are included. All kinds of observations, such as aircraft observations, soundings or precipitation radar networks, are used. For each dataset, a specific module is created, which allows for easy handling and enhancement of the toolbox. Most of the observations are automatically downloaded from the Standardized Atmospheric Measurement Database (SAMD). The evaluation tool is intended to support scientists in monitoring computationally costly model simulations and to give a first overview of a model's performance. The structure of the toolbox as well as the SAMD database are presented. Furthermore, the toolbox was applied to an ICON LES sensitivity study, for which example results are shown.
NASA Technical Reports Server (NTRS)
Wallett, Thomas M.; Mueller, Carl H.; Griner, James H., Jr.
2009-01-01
This paper describes the efforts in modeling and simulating electromagnetic transmission and reception as in a wireless sensor network through a realistic wing model for the Integrated Vehicle Health Management project at the Glenn Research Center. A computer model in a standard format for an S-3 Viking aircraft was obtained, converted to a Microwave Studio software format, and scaled to proper dimensions in Microwave Studio. The left wing portion of the model was used with two antenna models, one transmitting and one receiving, to simulate radio frequency transmission through the wing. Transmission and reception results were inconclusive.
2010-05-01
circulation from December 2003 to June 2008. The model is driven by tidal harmonics, realistic atmospheric forcing, and dynamically consistent initial and open... important element of the regional circulation (He and Wilkin 2006). We applied the method of Mellor and Yamada (1982) to compute vertical turbulent... shelfbreak ROMS hindcast ran continuously from December 2003 through January 2008. Initial conditions were taken from the MABGOM ROMS simulation on 1
Thermal Nonequilibrium in Hypersonic Separated Flow
2014-12-22
flow duration and steadiness. Subject terms: Hypersonic Flowfield Measurements, Laser Diagnostics of Gas Flow, Laser Induced... extent than the NS computation. While it would be convenient to believe that the more physically realistic flow modeling of the DSMC gas-surface... index and absorption coefficient. Each of the curves was produced assuming a 0.5% concentration of lithium at the Condition A nozzle exit conditions
ERIC Educational Resources Information Center
Armstrong, Matt; Comitz, Richard L.; Biaglow, Andrew; Lachance, Russ; Sloop, Joseph
2008-01-01
A novel approach to the Chemical Engineering curriculum sequence of courses at West Point enabled our students to experience a much more realistic design process, which more closely replicated a real world scenario. Students conduct the synthesis in the organic chemistry lab, then conduct computer modeling of the reaction with ChemCad and…
ERIC Educational Resources Information Center
Haglund, Jesper; Stromdahl, Helge
2012-01-01
Nineteen informants (n = 19) were asked to study and comment two computer animations of the Otto combustion engine. One animation was non-interactive and realistic in the sense of depicting a physical engine. The other animation was more idealised, interactive and synchronised with a dynamic PV-graph. The informants represented practical and…
NASA Astrophysics Data System (ADS)
Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael
2013-05-01
In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
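A minimal compartment-model sketch in the same spirit (first-order transfer plasma → kidneys → bladder, with illustrative rate constants rather than the paper's fitted MAG3 values) demonstrates the tracer-conservation property the abstract emphasizes: the total amount in the phantom is preserved between time points.

```python
import numpy as np

# Three-compartment toy pharmacokinetic model integrated with forward Euler.
k_pk, k_kb = 0.05, 0.08            # toy rates (1/s): plasma->kidney, kidney->bladder
dt, T = 0.1, 1200.0                # time step and duration (s)
n = int(T / dt)
plasma, kidney, bladder = np.empty(n), np.empty(n), np.empty(n)
plasma[0], kidney[0], bladder[0] = 1.0, 0.0, 0.0   # injected activity = 1

for i in range(1, n):
    f1 = k_pk * plasma[i - 1]      # plasma -> kidney flux
    f2 = k_kb * kidney[i - 1]      # kidney -> bladder flux
    plasma[i] = plasma[i - 1] - dt * f1
    kidney[i] = kidney[i - 1] + dt * (f1 - f2)
    bladder[i] = bladder[i - 1] + dt * f2

total = plasma[-1] + kidney[-1] + bladder[-1]   # conserved at every step
```

Each structure's time-activity curve (here `plasma`, `kidney`, `bladder`) is what would be assigned to the corresponding XCAT phantom structure; changing the rate constants mimics, e.g., altered relative uptake or renal transit time.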
What we can and cannot (yet) do with functional near infrared spectroscopy
Strait, Megan; Scheutz, Matthias
2014-01-01
Functional near infrared spectroscopy (NIRS) is a relatively new technique complementary to EEG for the development of brain-computer interfaces (BCIs). NIRS-based systems for detecting various cognitive and affective states such as mental and emotional stress have already been demonstrated in a range of adaptive human–computer interaction (HCI) applications. However, before NIRS-BCIs can be used reliably in realistic HCI settings, substantial challenges concerning signal processing and modeling must be addressed. Although many of those challenges have been identified previously, the solutions to overcome them remain scant. In this paper, we first review what can currently be done with NIRS, specifically, NIRS-based approaches to measuring cognitive and affective user states as well as demonstrations of passive NIRS-BCIs. We then discuss some of the primary challenges these systems would face if deployed in more realistic settings, including detection latencies and motion artifacts. Lastly, we investigate the effects of some of these challenges on signal reliability via a quantitative comparison of three NIRS models. The hope is that this paper will actively engage researchers to facilitate the advancement of NIRS as a more robust and useful tool to the BCI community. PMID:24904261
Graded meshes in bio-thermal problems with transmission-line modeling method.
Milan, Hugo F M; Carvalho, Carlos A T; Maia, Alex S C; Gebremedhin, Kifle G
2014-10-01
In this study, the transmission-line modeling (TLM) method applied to bio-thermal problems was improved by incorporating several novel computational techniques, including graded meshes, which reduced computational time by a factor of nine and used only a fraction (16%) of the computational resources required by regular meshes in analyzing heat flow through heterogeneous media. Graded meshes, unlike regular meshes, allow heat sources to be modeled in all segments of the mesh. A new boundary condition that considers thermal properties, resulting in more realistic modeling of complex problems, is introduced, along with a new way of calculating an error parameter. The calculated temperatures between nodes were compared against results obtained from the literature and agreed within less than 1% difference. It is reasonable, therefore, to conclude that the improved TLM model described herein has great potential in heat transfer of biological systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
Costa, Paulo R; Caldas, Linda V E
2002-01-01
This work presents the development and evaluation of modern techniques for calculating radiation protection barriers in clinical radiographic facilities. Our methodology uses realistic primary and scattered spectra. The primary spectra were computer simulated using a waveform generalization and a semiempirical model (the Tucker-Barnes-Chakraborty model). The scattered spectra were obtained from published data. An analytical function was used to produce attenuation curves from polychromatic radiation for specified kVp, waveform, and filtration. The results of this analytical function are given in ambient dose equivalent units. The attenuation curves were obtained by applying Archer's model to computer-simulation data. The parameters for the best fit to the model, using primary and secondary radiation data from different radiographic procedures, were determined. The result is an optimized shielding-calculation model for any radiographic room. The shielding costs were about 50% lower than those calculated using the traditional method based on Report No. 49 of the National Council on Radiation Protection and Measurements.
A general method for generating bathymetric data for hydrodynamic computer models
Burau, J.R.; Cheng, R.T.
1989-01-01
To generate water depth data from randomly distributed bathymetric data for numerical hydrodynamic models, raw input data from field surveys, water depth data digitized from nautical charts, or a combination of the two are sorted to give an ordered data set on which a search algorithm is used to isolate data for interpolation. Water depths at locations required by hydrodynamic models are interpolated from the bathymetric database using the linear or cubic shape functions used in the finite-element method. The bathymetric database organization and preprocessing, the search algorithm used in finding the bounding points for interpolation, the mathematics of the interpolation formulae, and the features of the automatic generation of water depths at hydrodynamic model grid points are included in the analysis. This report includes documentation of two computer programs which are used to: (1) organize the input bathymetric data; and (2) interpolate depths for hydrodynamic models. An example of computer program operation is drawn from a realistic application to the San Francisco Bay estuarine system.
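The linear shape-function interpolation mentioned above amounts to barycentric weighting within the triangle of bathymetric points that bounds a model grid point. A minimal sketch, with illustrative vertex coordinates and depths (the report's actual data structures and search algorithm are not reproduced here):

```python
# Sketch of depth interpolation at a model grid point using linear
# finite-element (barycentric) shape functions over a bounding triangle.
# Vertex coordinates and depths are illustrative values.

def interp_depth(p, tri, depths):
    """Linear shape-function interpolation of depth at point p inside tri."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    x, y = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    w3 = 1.0 - w1 - w2          # weights sum to one inside the triangle
    return w1 * depths[0] + w2 * depths[1] + w3 * depths[2]

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
depths = [10.0, 14.0, 12.0]
d = interp_depth((0.25, 0.25), tri, depths)  # weights (0.5, 0.25, 0.25) -> 11.5
```

The cubic variant mentioned in the abstract would replace these linear weights with higher-order shape functions evaluated at the same barycentric coordinates.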
Sirry, Mazin S.; Davies, Neil H.; Kadner, Karen; Dubuis, Laura; Saleh, Muhammad G.; Meintjes, Ernesta M.; Spottiswoode, Bruce S.; Zilla, Peter; Franz, Thomas
2013-01-01
Biomaterial injection based therapies have shown cautious success in restoring cardiac function and preventing adverse remodelling into heart failure after myocardial infarction (MI). However, the underlying mechanisms are not well understood, and previous computational studies utilised simplified representations of the therapeutic myocardial injectates. Wistar rats underwent experimental infarction followed by immediate injection of polyethylene glycol hydrogel in the infarct region. Hearts were explanted, cryo-sectioned and the region with the injectate histologically analysed. Histological micrographs were used to reconstruct the dispersed hydrogel injectate. Cardiac magnetic resonance imaging (CMRI) data from a healthy rat were used to obtain an end-diastolic biventricular geometry which was subsequently adjusted and combined with the injectate model. The computational geometry of the injectate exhibited the microscopic structural details found in situ. The combination of injectate and cardiac geometry provides realistic geometries for multiscale computational studies of intra-myocardial injectate therapies for the rat model that has been widely used for MI research. PMID:23682845
Measurements and Computations of Flow in an Urban Street System
NASA Astrophysics Data System (ADS)
Castro, Ian P.; Xie, Zheng-Tong; Fuka, V.; Robins, Alan G.; Carpentieri, M.; Hayden, P.; Hertwig, D.; Coceal, O.
2017-02-01
We present results from laboratory and computational experiments on the turbulent flow over an array of rectangular blocks modelling a typical, asymmetric urban canopy at various orientations to the approach flow. The work forms part of a larger study on dispersion within such arrays (project DIPLOS) and concentrates on the nature of the mean flow and turbulence fields within the canopy region, recognising that unless the flow field is adequately represented in computational models there is no reason to expect realistic simulations of the nature of the dispersion of pollutants emitted within the canopy. Comparisons between the experimental data and those obtained from both large-eddy simulation (LES) and direct numerical simulation (DNS) are shown and it is concluded that careful use of LES can produce generally excellent agreement with laboratory and DNS results, lending further confidence in the use of LES for such situations. Various crucial issues are discussed and advice offered to both experimentalists and those seeking to compute canopy flows with turbulence resolving models.
The effect of a realistic thermal diffusivity on numerical model of a subducting slab
NASA Astrophysics Data System (ADS)
Maierova, P.; Steinle-Neumann, G.; Cadek, O.
2010-12-01
A number of numerical studies of subducting slabs assume simplified (constant or only depth-dependent) models of thermal conductivity. The available mineral physics data indicate, however, that thermal diffusivity is strongly temperature- and pressure-dependent and may also vary among different mantle materials. In the present study, we examine the influence of realistic thermal properties of mantle materials on the thermal state of the upper mantle and the dynamics of subducting slabs. On the basis of data published in the mineral physics literature, we compile analytical relationships that approximate the pressure and temperature dependence of thermal diffusivity for major mineral phases of the mantle (olivine, wadsleyite, ringwoodite, garnet, clinopyroxenes, stishovite and perovskite). We propose a simplified composition of the mineral assemblages predominating in the subducting slab and the surrounding mantle (pyrolite, mid-ocean ridge basalt, harzburgite) and estimate their thermal diffusivity using the Hashin-Shtrikman bounds. The resulting complex formula for the diffusivity of each aggregate is then approximated by a simpler analytical relationship that is used as an input parameter in our numerical model. For the numerical modeling we use the Elmer software (open-source finite element software for multiphysical problems, see http://www.csc.fi/english/pages/elmer). We set up a 2D Cartesian thermo-mechanical steady-state model of a subducting slab. The model is partly kinematic, as the flow is driven by a velocity boundary condition prescribed on the top of the subducting lithospheric plate. The rheology of the material is non-linear and is coupled with the thermal equation. Using the realistic relationship for the thermal diffusivity of mantle materials, we compute the thermal and flow fields for different input velocities and ages of the subducting plate, and we compare the results against models assuming a constant thermal diffusivity.
The importance of the realistic description of thermal properties in models of subducted slabs is discussed.
Basic Modeling of the Solar Atmosphere and Spectrum
NASA Technical Reports Server (NTRS)
Avrett, Eugene H.; Wagner, William J. (Technical Monitor)
2000-01-01
During the last three years we have continued the development of extensive computer programs for constructing realistic models of the solar atmosphere and for calculating detailed spectra to use in the interpretation of solar observations. This research involves two major interrelated efforts: work by Avrett and Loeser on the Pandora computer program for optically thick non-LTE modeling of the solar atmosphere including a wide range of physical processes, and work by Kurucz on the detailed high-resolution synthesis of the solar spectrum using data for over 58 million atomic and molecular lines. Our objective is to construct atmospheric models from which the calculated spectra agree as well as possible with high-and low-resolution observations over a wide wavelength range. Such modeling leads to an improved understanding of the physical processes responsible for the structure and behavior of the atmosphere.
A 3D object-based model to simulate highly-heterogeneous, coarse, braided river deposits
NASA Astrophysics Data System (ADS)
Huber, E.; Huggenberger, P.; Caers, J.
2016-12-01
There is a critical need in hydrogeological modeling for geologically more realistic representations of the subsurface. Indeed, widely-used representations of subsurface heterogeneity based on smooth basis functions, such as cokriging or the pilot-point approach, fail to reproduce the connectivity of the highly permeable geological structures that control subsurface solute transport. To realistically model the connectivity of highly permeable structures in coarse, braided river deposits, multiple-point statistics and object-based models are promising alternatives. We therefore propose a new object-based model that, according to a sedimentological model, mimics the dominant processes of floodplain dynamics. Contrary to existing models, this object-based model has the following properties: (1) it is consistent with field observations (outcrops, ground-penetrating radar data, etc.), (2) it allows different sedimentological dynamics to be modeled that result in different subsurface heterogeneity patterns, (3) it is light in memory and computationally fast, and (4) it can be conditioned to geophysical data. In this model, the main sedimentological elements (scour fills with open-framework-bimodal gravel cross-beds, gravel sheet deposits, open-framework and sand lenses) and their internal structures are described by geometrical objects. Several spatial distributions are proposed that allow simulation of the horizontal position of the objects on the floodplain as well as the net rate of sediment deposition. The model is grid-independent and any vertical section can be computed algebraically. Furthermore, model realizations can serve as training images for multiple-point statistics. The significance of this model is shown by its impact on the subsurface flow distribution, which strongly depends on the sedimentological dynamics modeled. The code will be provided as a free and open-source R package.
Towards a 'siliconeural computer': technological successes and challenges.
Hughes, Mark A; Shipston, Mike J; Murray, Alan F
2015-07-28
Electronic signals govern the function of both nervous systems and computers, albeit in different ways. As such, hybridizing both systems to create an iono-electric brain-computer interface is a realistic goal, and one that promises exciting advances in both heterotic computing and neuroprosthetics capable of circumventing devastating neuropathology. 'Neural networks' were, in the 1980s, viewed naively as a potential panacea for all computational problems that did not fit well with conventional computing. The field bifurcated during the 1990s into a highly successful and much more realistic machine learning community and an equally pragmatic, biologically oriented 'neuromorphic computing' community. Algorithms found in nature that use the non-synchronous, spiking nature of neuronal signals have been found to be (i) implementable efficiently in silicon and (ii) computationally useful. As a result, interest has grown in techniques that could create mixed 'siliconeural' computers. Here, we discuss potential approaches and focus on one particular platform using parylene-patterned silicon dioxide.
A Computer Model of Drafting Effects on Collective Behavior in Elite 10,000-m Runners.
Trenchard, Hugh; Renfree, Andrew; Peters, Derek M
2017-03-01
Drafting in cycling influences collective behavior of pelotons. Although evidence for collective behavior in competitive running events exists, it is not clear if this results from energetic savings conferred by drafting. This study modeled the effects of drafting on behavior in elite 10,000-m runners. Using performance data from a men's elite 10,000-m track running event, computer simulations were constructed using Netlogo 5.1 to test the effects of 3 different drafting quantities on collective behavior: no drafting, drafting up to 3 m behind with up to ~8% energy savings (a realistic running draft), and drafting up to 3 m behind with up to 38% energy savings (a realistic cycling draft). Three measures of collective behavior were analyzed in each condition: mean speed, mean group stretch (distance between first- and last-placed runner), and runner-convergence ratio (RCR), which represents the degree of drafting benefit obtained by the follower in a pair of coupled runners. Mean speeds were 6.32 ± 0.28, 5.57 ± 0.18, and 5.51 ± 0.13 m/s in the cycling-draft, runner-draft, and no-draft conditions, respectively (all P < .001). RCR was lower in the cycling-draft condition but did not differ between the other 2. Mean stretch did not differ between conditions. Collective behaviors observed in running events cannot be fully explained through energetic savings conferred by realistic drafting benefits. They may therefore result from other, possibly psychological, processes. The benefits or otherwise of engaging in such behavior are as yet unclear.
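The drafting rule at the heart of such simulations can be sketched simply: a follower within 3 m of a leader saves a fraction of the energetic cost, up to the condition's maximum (~8% for running, 38% for cycling, per the abstract). The linear decay with separation and the reinvestment of saved energy as speed are illustrative assumptions, not the paper's exact NetLogo rules.

```python
# Toy sketch of a peloton-style drafting rule. The 8% and 38% maximum
# savings follow the abstract; the linear decay over the 3 m drafting
# zone and the speed-boost mapping are illustrative assumptions.

def draft_saving(gap_m, max_saving):
    """Fractional energy saving for a follower gap_m behind a leader."""
    if gap_m < 0.0 or gap_m > 3.0:
        return 0.0                      # outside the drafting zone
    return max_saving * (1.0 - gap_m / 3.0)

def sustainable_speed(base_speed, gap_m, max_saving):
    """Crude mapping: saved energy reinvested proportionally as speed."""
    return base_speed * (1.0 + draft_saving(gap_m, max_saving))

run = sustainable_speed(5.5, 1.0, 0.08)   # running-draft condition
cyc = sustainable_speed(5.5, 1.0, 0.38)   # cycling-draft condition
```

Even this crude rule reproduces the qualitative ordering in the results: the cycling-magnitude draft yields markedly higher sustainable speeds than the running-magnitude draft at the same separation.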
Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias
2011-10-01
Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
Papadimitroulas, P; Loudos, G; Le Maitre, A; Efthimiou, N; Visvikis, D; Nikiforidis, G; Kagadis, G C
2012-06-01
In the present study a patient-specific dataset of realistic PET simulations was created, taking into account the variability of clinical oncology data. Tumor variability was tested in the simulated results. The produced simulated data were compared to clinical PET/CT data for the validation and evaluation of the procedure. Clinical PET/CT data of oncology patients were used as the basis of the simulated variability, inserting patient-specific characteristics into the NCAT and Zubal anthropomorphic phantoms. The GATE Monte Carlo toolkit was used to simulate a commercial PET scanner. The standard computational anthropomorphic phantoms were adapted to the CT data (organ shapes) using a fitting algorithm. The activity map was derived from the PET images. Patient tumors were segmented and inserted into the phantom using different activity distributions. The produced simulated data were reconstructed using the STIR open-source software and compared to the original clinical data. The accuracy of the procedure was tested in four different oncology cases. Each pathological situation was illustrated by simulating a) a healthy body, b) insertion of the clinical tumor with homogeneous activity, and c) insertion of the clinical tumor with variable activity (voxel-by-voxel) based on the clinical PET data. The accuracy of the presented dataset was compared to the original PET/CT data. Partial Volume Correction (PVC) was also applied to the simulated data. In this study patient-specific characteristics were used in computational anthropomorphic models to simulate realistic pathological patients. Voxel-by-voxel activity distribution with PVC within the tumor gives the most accurate results. Radiotherapy applications can utilize the benefits of accurate, realistic imaging simulations, using the anatomical and biological information of each patient.
Further work will incorporate the development of analytical anthropomorphic models with motion and cardiac correction, combined with pathological patients to achieve high accuracy in tumor imaging. This research was supported by the Joint Research and Technology Program between Greece and France; 2009-2011 (protocol ID: 09FR103). © 2012 American Association of Physicists in Medicine.
Client-server program analysis in the EPOCA environment
NASA Astrophysics Data System (ADS)
Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano
1996-09-01
Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer must first ensure that the implementation behaves correctly, in particular that it is deadlock-free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.
A feasibility study of a 3-D finite element solution scheme for aeroengine duct acoustics
NASA Technical Reports Server (NTRS)
Abrahamson, A. L.
1980-01-01
The advantage from development of a 3-D model of aeroengine duct acoustics is the ability to analyze axial and circumferential liner segmentation simultaneously. The feasibility of a 3-D duct acoustics model was investigated using Galerkin or least squares element formulations combined with Gaussian elimination, successive over-relaxation, or conjugate gradient solution algorithms on conventional scalar computers and on a vector machine. A least squares element formulation combined with a conjugate gradient solver on a CDC Star vector computer initially appeared to have great promise, but severe difficulties were encountered with matrix ill-conditioning. These difficulties in conditioning rendered this technique impractical for realistic problems.
NASA Technical Reports Server (NTRS)
Lawson, John W.; Daw, Murray S.; Squire, Thomas H.; Bauschlicher, Charles W.
2012-01-01
We are developing a multiscale framework in computational modeling for the ultra high temperature ceramics (UHTC) ZrB2 and HfB2. These materials are characterized by high melting point, good strength, and reasonable oxidation resistance. They are candidate materials for a number of applications in extreme environments including sharp leading edges of hypersonic aircraft. In particular, we used a combination of ab initio methods, atomistic simulations and continuum computations to obtain insights into fundamental properties of these materials. Ab initio methods were used to compute basic structural, mechanical and thermal properties. From these results, a database was constructed to fit a Tersoff style interatomic potential suitable for atomistic simulations. These potentials were used to evaluate the lattice thermal conductivity of single crystals and the thermal resistance of simple grain boundaries. Finite element method (FEM) computations using atomistic results as inputs were performed with meshes constructed on SEM images thereby modeling the realistic microstructure. These continuum computations showed the reduction in thermal conductivity due to the grain boundary network.
Constructing Precisely Computing Networks with Biophysical Spiking Neurons.
Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T
2015-07-15
While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denéve and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denéve, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. 
These findings also uncover how several components of biological networks may work together to efficiently carry out computation. Copyright © 2015 the authors 0270-6474/15/3510112-23$15.00/0.
Cosmic-ray propagation with DRAGON2: I. numerical solver and astrophysical ingredients
NASA Astrophysics Data System (ADS)
Evoli, Carmelo; Gaggero, Daniele; Vittino, Andrea; Di Bernardo, Giuseppe; Di Mauro, Mattia; Ligorini, Arianna; Ullio, Piero; Grasso, Dario
2017-02-01
We present version 2 of the DRAGON code, designed for computing realistic predictions of cosmic-ray (CR) densities in the Galaxy. The code numerically solves the interstellar CR transport equation (including inhomogeneous and anisotropic diffusion in both space and momentum, advective transport, and energy losses) under realistic conditions. The new version includes an updated numerical solver and several models for the astrophysical ingredients involved in the transport equation. Improvements in the accuracy of the numerical solution are demonstrated against analytical solutions and in reference diffusion scenarios. The novel features implemented in the code allow simulation of the diverse scenarios proposed to reproduce the most recent measurements of local and diffuse CR fluxes, going beyond the limitations of the homogeneous galactic transport paradigm. To this end, several applications using DRAGON2 are presented as well. This new version makes it easier for users to include their own physical models by means of a modular C++ structure.
3D simulations of early blood vessel formation
NASA Astrophysics Data System (ADS)
Cavalli, F.; Gamba, A.; Naldi, G.; Semplice, M.; Valdembri, D.; Serini, G.
2007-08-01
Blood vessel networks form by spontaneous aggregation of individual cells migrating toward vascularization sites (vasculogenesis). A successful theoretical model of two-dimensional experimental vasculogenesis has recently been proposed, showing the relevance of percolation concepts and of cell cross-talk (the chemotactic autocrine loop) to the understanding of this self-aggregation process. Here we study the natural 3D extension of the computational model proposed earlier, which is relevant for the investigation of the genuinely three-dimensional process of vasculogenesis in vertebrate embryos. The computational model is based on a multidimensional Burgers equation coupled with a reaction-diffusion equation for a chemotactic factor and a mass conservation law. The numerical approximation of the computational model is obtained by high-order relaxed schemes. Space and time discretizations are performed using TVD and IMEX schemes, respectively. Due to the computational cost of realistic simulations, we have implemented the numerical algorithm on a cluster for parallel computation. Starting from initial conditions mimicking the experimentally observed ones, numerical simulations produce network-like structures qualitatively similar to those observed in the early stages of in vivo vasculogenesis. We develop the computation of critical percolative indices as a robust measure of the network geometry, as a first step towards the comparison of computational and experimental data.
Numerical simulation of magmatic hydrothermal systems
Ingebritsen, S.E.; Geiger, S.; Hurwitz, S.; Driesner, T.
2010-01-01
The dynamic behavior of magmatic hydrothermal systems entails coupled and nonlinear multiphase flow, heat and solute transport, and deformation in highly heterogeneous media. Thus, quantitative analysis of these systems depends mainly on numerical solution of coupled partial differential equations and complementary equations of state (EOS). The past 2 decades have seen steady growth of computational power and the development of numerical models that have eliminated or minimized the need for various simplifying assumptions. Considerable heuristic insight has been gained from process-oriented numerical modeling. Recent modeling efforts employing relatively complete EOS and accurate transport calculations have revealed dynamic behavior that was damped by linearized, less accurate models, including fluid property control of hydrothermal plume temperatures and three-dimensional geometries. Other recent modeling results have further elucidated the controlling role of permeability structure and revealed the potential for significant hydrothermally driven deformation. Key areas for future research include incorporation of accurate EOS for the complete H2O-NaCl-CO2 system, more realistic treatment of material heterogeneity in space and time, realistic description of large-scale relative permeability behavior, and intercode benchmarking comparisons. Copyright 2010 by the American Geophysical Union.
Realistic Covariance Prediction for the Earth Science Constellation
NASA Technical Reports Server (NTRS)
Duncan, Matthew; Long, Anne
2006-01-01
Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. One component of the risk assessment process is computing the collision probability between two space objects. The collision probability is computed using Monte Carlo techniques as well as by numerically integrating relative state probability density functions. Each algorithm takes as inputs state vector and state vector uncertainty information for both objects. The state vector uncertainty information is expressed in terms of a covariance matrix. The collision probability computation is only as good as the inputs. Therefore, to obtain a collision calculation that is a useful decision-making metric, realistic covariance matrices must be used as inputs to the calculation. This paper describes the process used by the NASA/Goddard Space Flight Center's Earth Science Mission Operations Project to generate realistic covariance predictions for three of the Earth Science Constellation satellites: Aqua, Aura and Terra.
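The Monte Carlo branch of the collision-probability computation described above can be sketched as follows: sample each object's position from a Gaussian defined by its state vector and covariance, then count how often the miss distance falls below a combined hard-body radius. The state vectors, sigmas, and radius below are invented illustrative values, and the covariance is taken as diagonal for simplicity; operational computations use full correlated covariance matrices.

```python
import math
import random

# Monte Carlo sketch of collision probability between two space objects.
# Inputs (positions, per-axis sigmas, hard-body radius) are illustrative;
# a diagonal covariance stands in for the full covariance matrix.

def collision_probability(pos_a, sig_a, pos_b, sig_b, radius, n=20000, seed=1):
    """Fraction of sampled state pairs closer than the combined radius."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        pa = [rng.gauss(m, s) for m, s in zip(pos_a, sig_a)]
        pb = [rng.gauss(m, s) for m, s in zip(pos_b, sig_b)]
        if math.dist(pa, pb) < radius:
            hits += 1
    return hits / n

p = collision_probability([0.0, 0.0, 0.0], [0.2, 0.2, 0.2],
                          [0.5, 0.0, 0.0], [0.2, 0.2, 0.2],
                          radius=0.3)
```

This makes concrete why the paper stresses realistic covariances: the estimate p is driven entirely by the input uncertainties, so unrealistically small or large sigmas produce misleading risk metrics.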
Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models
NASA Astrophysics Data System (ADS)
Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.
2013-12-01
We present a study on the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints, performed on a synthetic reservoir model. Our aim is to invert directly for the structure and rock bulk properties of the target reservoir zone. To infer rock facies, porosity, and oil saturation, seismology alone is not sufficient; a rock physics model, which links the unknown properties to the elastic parameters, must also be taken into account. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it offers the possibility to handle non-linearity and complex, multi-step forward models, and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of its very high computational demand. One strategy to face this challenge is to feed the algorithm with realistic models, hence relying on proper prior information. To this end, we utilize an algorithm drawn from geostatistics to generate geologically plausible models that represent samples of the prior distribution. The geostatistical algorithm learns the multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties of interest by performing statistical analysis on the collection of solutions.
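The prior-sampling Metropolis scheme described above, in which candidate models drawn from a prior generator are accepted or rejected by data misfit, can be sketched in miniature. Here a toy linear forward model and a Gaussian prior stand in for the convolutional seismic forward model and the geostatistical simulator; all numbers are illustrative.

```python
import math
import random

# Minimal independence Metropolis sampler: proposals are independent
# draws from the prior (standing in for the geostatistical simulator),
# accepted by likelihood ratio. Forward model and noise are toy stand-ins.

def forward(model):
    return [2.0 * m for m in model]          # toy linear "seismic" forward

def misfit(model, data, sigma=0.1):
    """Negative log-likelihood up to a constant (Gaussian noise)."""
    return sum((d - p) ** 2 for d, p in zip(data, forward(model))) / (2 * sigma ** 2)

def metropolis(prior_sampler, data, n_iter=2000, seed=0):
    rng = random.Random(seed)
    current = prior_sampler(rng)
    current_m = misfit(current, data)
    samples = [current]
    for _ in range(n_iter):
        proposal = prior_sampler(rng)        # independent draw from the prior
        prop_m = misfit(proposal, data)
        # Accept with probability exp(current_m - prop_m), capped at 1.
        if math.log(rng.random() + 1e-300) < current_m - prop_m:
            current, current_m = proposal, prop_m
        samples.append(current)
    return samples

prior = lambda rng: [rng.gauss(0.0, 1.0) for _ in range(3)]
data = forward([0.5, -0.2, 0.1])             # synthetic "measured" data
samples = metropolis(prior, data)
post_mean = [sum(s[i] for s in samples) / len(samples) for i in range(3)]
```

Because proposals come straight from the prior, the acceptance ratio reduces to a likelihood ratio; this is what makes geologically plausible prior realizations (e.g. multiple-point-statistics simulations) so valuable as proposal generators.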
Application of Computational Fluid Dynamics (CFD) in transonic wind-tunnel/flight-test correlation
NASA Technical Reports Server (NTRS)
Murman, E. M.
1982-01-01
The capability for calculating transonic flows over realistic configurations and conditions is discussed. Various phenomena that were modeled are shown to have influences of the same order of magnitude on the predicted results. It is concluded that CFD can make the following contributions to the task of correlating wind tunnel and flight test data: some effects of geometry differences and aeroelastic distortion can be predicted; tunnel wall effects can be assessed and corrected for; and the effects of model support systems and free-stream nonuniformities can be modeled.
Kinetics of phase transformation in glass forming systems
NASA Technical Reports Server (NTRS)
Ray, Chandra S.
1994-01-01
The objectives of this research were to (1) develop computer models for realistic simulations of nucleation and crystal growth in glasses, which would also have the flexibility to accommodate the different variables related to sample characteristics and experimental conditions, and (2) design and perform nucleation and crystallization experiments using calorimetric measurements, such as differential scanning calorimetry (DSC) and differential thermal analysis (DTA), to verify these models. The variables related to sample characteristics mentioned in (1) above include the size of the glass particles, nucleating agents, and the relative concentration of surface and internal nuclei. A change in any of these variables changes the mode of the transformation (crystallization) kinetics. The variations in experimental conditions include isothermal and nonisothermal DSC/DTA measurements. This research would lead to the development of improved, more realistic methods for the analysis of DSC/DTA peak profiles to determine the kinetic parameters of nucleation and crystal growth, as well as to assess the relative merits and demerits of the thermoanalytical models presently used to study phase transformation in glasses.
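The abstract does not name a particular peak-profile analysis; as one common example, a Kissinger-type peak-shift analysis recovers an activation energy from DSC/DTA peak temperatures measured at several heating rates. All numbers below are synthetic.

```python
import numpy as np

R = 8.314            # gas constant, J/(mol K)
Ea_true = 300e3      # assumed activation energy, J/mol
C = 10.0             # intercept ln(A R / Ea); arbitrary for the sketch

# Synthetic peak temperatures Tp and the heating rates beta that would
# place the crystallization peak there under Kissinger kinetics.
Tp = np.array([780.0, 790.0, 800.0, 810.0])          # K
beta = Tp**2 * np.exp(C - Ea_true / (R * Tp))        # K/s

# Kissinger plot: ln(beta / Tp^2) vs 1/Tp is linear with slope -Ea/R
y = np.log(beta / Tp**2)
slope, intercept = np.polyfit(1.0 / Tp, y, 1)
Ea_est = -slope * R
print(Ea_est / 1e3)          # ≈ 300 kJ/mol, recovering the input value
```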
ODEion--a software module for structural identification of ordinary differential equations.
Gennemark, Peter; Wedelin, Dag
2014-02-01
In the systems biology field, algorithms for structural identification of ordinary differential equations (ODEs) have mainly focused on fixed model spaces like S-systems and/or on methods that require sufficiently good data so that derivatives can be accurately estimated. There is therefore a lack of methods and software that can handle more general models and realistic data. We present ODEion, a software module for structural identification of ODEs. Main characteristic features of the software are: • The model space is defined by arbitrary user-defined functions that can be nonlinear in both variables and parameters, such as for example chemical rate reactions. • ODEion implements computationally efficient algorithms that have been shown to efficiently handle sparse and noisy data. It can run a range of realistic problems that previously required a supercomputer. • ODEion is easy to use and provides SBML output. We describe the mathematical problem, the ODEion system itself, and provide several examples of how the system can be used. Available at: http://www.odeidentification.org.
Restricted diffusion in a model acinar labyrinth by NMR: Theoretical and numerical results
NASA Astrophysics Data System (ADS)
Grebenkov, D. S.; Guillot, G.; Sapoval, B.
2007-01-01
A branched geometrical structure of the mammalian lungs is known to be crucial for rapid access of oxygen to blood. But an important pulmonary disease like emphysema results in partial destruction of the alveolar tissue and enlargement of the distal airspaces, which may reduce the total oxygen transfer. This effect has been intensively studied during the last decade by MRI of hyperpolarized gases like helium-3. The relation between geometry and signal attenuation has remained obscure due to the lack of a realistic geometrical model of the acinar morphology. In this paper, we use Monte Carlo simulations of restricted diffusion in a realistic model acinus to compute the signal attenuation in a diffusion-weighted NMR experiment. We demonstrate that this technique should be sensitive to destruction of the branched structure: partial removal of the interalveolar tissue creates loops in the tree-like acinar architecture that enhance diffusive motion and the consequent signal attenuation. The role of the local geometry and related practical applications are discussed.
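A hedged 1-D toy analogue of the Monte Carlo approach (the actual computation runs in a 3-D branched acinus model): random walkers reflect off pore walls, and the narrow-pulse diffusion-weighted signal is the phase average over net displacements. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n_walk, L = 20000, 10.0             # walkers, pore size (µm)
D, dt, steps = 2.0, 0.05, 1000      # µm^2/ms, ms per step, 50 ms in total
q = 0.5                             # gradient wavevector (rad/µm)

x0 = rng.uniform(0.0, L, n_walk)    # walkers start uniformly in the pore
x = x0.copy()
for _ in range(steps):
    x += rng.normal(0.0, np.sqrt(2.0 * D * dt), n_walk)
    x = np.where(x < 0.0, -x, x)            # reflect at the left wall
    x = np.where(x > L, 2.0 * L - x, x)     # reflect at the right wall

S_restricted = np.cos(q * (x - x0)).mean()  # narrow-pulse PGSE signal
S_free = np.exp(-q * q * D * steps * dt)    # unrestricted reference
# Restriction suppresses the attenuation: S_restricted >> S_free here.
```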
Geostatistical Borehole Image-Based Mapping of Karst-Carbonate Aquifer Pores.
Sukop, Michael C; Cunningham, Kevin J
2016-03-01
Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes. © 2015, National Ground Water Association.
A 3D virtual reality simulator for training of minimally invasive surgery.
Mi, Shao-Hua; Hou, Zeng-Guang; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin
2014-01-01
For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provides a real-time force computation and force feedback module for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views is developed. Moreover, the simulator is provided with a human-machine interaction module that gives doctors the sense of touch during surgery training and enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
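A minimal sketch of a multi-body mass-spring instrument model of the kind described, assuming point masses joined by linear springs and damped symplectic Euler integration; all parameters are illustrative, not from the paper.

```python
import numpy as np

# Point masses joined by linear springs (a crude guide-wire), clamped at
# one end and sagging under gravity.
n, rest, k, m, dt, damping = 10, 1.0, 200.0, 0.1, 0.002, 0.5
pos = np.column_stack([np.arange(n) * rest, np.zeros(n)])  # start horizontal
vel = np.zeros_like(pos)
gravity = np.array([0.0, -9.81])

for _ in range(5000):                                      # 10 s simulated
    force = np.tile(m * gravity, (n, 1))
    seg = pos[1:] - pos[:-1]
    length = np.linalg.norm(seg, axis=1, keepdims=True)
    f = k * (length - rest) * seg / length                 # Hooke's law
    force[:-1] += f                                        # pull toward neighbor
    force[1:] -= f
    force -= damping * vel
    vel += dt * force / m
    vel[0] = 0.0                                           # proximal end clamped
    pos += dt * vel
# The free end ends up hanging below the clamp; the clamped node never moves.
```

A real-time simulator would add bending stiffness and collision response against the vessel wall; this sketch shows only the core mass-spring update.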
DOE Office of Scientific and Technical Information (OSTI.GOV)
El Osery, I.A.
1983-12-01
Modelling studies of metal hydride hydrogen storage beds are part of an extensive R and D program conducted in Egypt on hydrogen energy. In this context, two computer programs, namely RET and RET1, have been developed. In the RET computer program, a cylindrical conduction bed model is considered and an approximate analytical solution is used for the associated mass and heat transfer problem. In the RET1 computer program this problem is solved numerically, allowing more flexibility in operating conditions but still limited to a cylindrical configuration with only two alternatives for heat exchange: either fluid passes through tubes embedded in the solid alloy matrix, or solid rods are surrounded by annular fluid tubes. The present computer code TOBA is more flexible and realistic. It performs the mass and heat transfer dynamic analysis of metal hydride storage beds using a variety of geometrical and operating alternatives.
Corrado, Cesare; Zemzemi, Nejib
2018-01-01
Computational models of heart electrophysiology have attracted considerable interest in the medical community, as they represent a novel framework for studying the mechanisms underpinning heart pathologies. The high demand for computational resources and the long computational time required to evaluate the model solution hamper the use of detailed computational models in clinical applications. In this paper, we present a multi-front eikonal algorithm that adapts the conduction velocity (CV) to the activation frequency of the tissue substrate. We then couple the new eikonal algorithm with the Mitchell-Schaeffer (MS) ionic model to determine the tissue electrical state. Compared to the standard eikonal model, this model introduces three novelties: first, it evaluates the local value of the transmembrane potential and of the ionic variable by solving an ionic model; second, it computes the action potential duration (APD) and the diastolic interval (DI) from the solution of the MS model and uses them to determine whether the tissue is locally re-excitable; third, it adapts the CV to the underpinning electrophysiological state through an analytical expression of the CV restitution and the computed local DI. We conduct a series of simulations on a 3D tissue slab and on a realistic heart geometry and compare the solutions with those obtained by solving the monodomain equation. Our results show that the new model is significantly more accurate than the standard eikonal model. The proposed model enables the numerical simulation of heart electrophysiology on a clinical time scale and thus constitutes a viable candidate for computer-guided radio-frequency ablation. Copyright © 2017 Elsevier B.V. All rights reserved.
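The Mitchell-Schaeffer model mentioned above is a two-variable ionic model; a minimal forward-Euler sketch follows, using commonly quoted parameter values that are not necessarily those of the paper.

```python
import numpy as np

# Forward-Euler integration of the two-variable Mitchell-Schaeffer model.
tau_in, tau_out, tau_open, tau_close, v_gate = 0.3, 6.0, 120.0, 150.0, 0.13
dt, steps = 0.01, 60000                  # ms per step, 600 ms total
v, w = 0.0, 1.0                          # membrane variable, gating variable
v_trace = np.empty(steps)

for i in range(steps):
    t = i * dt
    stim = 0.5 if 1.0 <= t < 2.0 else 0.0          # brief stimulus current
    dv = w * v * v * (1.0 - v) / tau_in - v / tau_out + stim
    dw = (1.0 - w) / tau_open if v < v_gate else -w / tau_close
    v, w = v + dt * dv, w + dt * dw
    v_trace[i] = v

apd = np.sum(v_trace > v_gate) * dt      # crude action potential duration (ms)
```

With these values a single stimulus elicits one action potential whose duration is governed mainly by tau_close; the paper couples this kind of state update to the multi-front eikonal activation times.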
Equivalent circuit simulation of HPEM-induced transient responses at nonlinear loads
NASA Astrophysics Data System (ADS)
Kotzev, Miroslav; Bi, Xiaotang; Kreitlow, Matthias; Gronwald, Frank
2017-09-01
In this paper, the equivalent circuit modeling of a nonlinearly loaded loop antenna and its transient responses to HPEM field excitations are investigated. For the circuit modeling, the general strategy of characterizing the nonlinearly loaded antenna by a linear and a nonlinear circuit part is pursued. The linear circuit part can be determined by standard methods of antenna theory and numerical field computation. The modeling of the nonlinear circuit part requires realistic circuit models of the nonlinear loads, which are given by Schottky diodes. Combining both parts, appropriate circuit models are obtained and analyzed by means of a standard SPICE circuit simulator. The main result is that full-wave simulation results can be reproduced in this way. Furthermore, it is clearly seen that the equivalent circuit modeling offers considerable advantages with respect to computation speed and also leads to improved physical insight into the coupling between the HPEM field excitation and the nonlinearly loaded loop antenna.
Techniques for interpretation of geoid anomalies
NASA Technical Reports Server (NTRS)
Chapman, M. E.
1979-01-01
For purposes of geological interpretation, techniques are developed to compute directly the geoid anomaly over models of density within the earth. Ideal bodies such as line segments, vertical sheets, and rectangles are first used to calculate the geoid anomaly. Realistic bodies are modeled with formulas for two-dimensional polygons and three-dimensional polyhedra. By using Fourier transform methods the two-dimensional geoid is seen to be a filtered version of the gravity field, in which the long-wavelength components are magnified and the short-wavelength components diminished.
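The closing observation, that the 2-D geoid is a filtered version of the gravity field, can be sketched with the flat-earth wavenumber relation N(k) = Δg(k)/(g0 |k|), which amplifies long wavelengths and suppresses short ones. The profile below is synthetic and purely illustrative.

```python
import numpy as np

# Flat-earth admittance: geoid N(k) = gravity(k) / (g0 |k|), i.e. the
# gravity anomaly low-pass filtered in the wavenumber domain.
g0 = 9.81                          # m/s^2
n, dx = 512, 1000.0                # samples, spacing (m)
x = np.arange(n) * dx
grav = 1e-4 * np.exp(-((x - 256e3) / 20e3) ** 2)   # ~10 mGal Gaussian anomaly

k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)          # angular wavenumber (rad/m)
G = np.fft.fft(grav)
N = np.zeros_like(G)
nz = k != 0
N[nz] = G[nz] / (g0 * np.abs(k[nz]))               # apply the admittance filter
geoid = np.fft.ifft(N).real                        # zero-mean geoid height (m)
```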
Analysis of Crystallization Kinetics
NASA Technical Reports Server (NTRS)
Kelton, Kenneth F.
1997-01-01
A realistic computer model for polymorphic crystallization (i.e., initial and final phases with identical compositions), which includes time-dependent nucleation and cluster-size-dependent growth rates, is developed and tested by fits to experimental data. Model calculations are used to assess the validity of two of the more common approaches for the analysis of crystallization data. The effects of particle size on transformation kinetics, important for the crystallization of many systems of limited dimension including thin films, fine powders, and nanoparticles, are examined.
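As a hedged illustration of the kind of polymorphic crystallization kinetics such a model is tested against, the Johnson-Mehl-Avrami-Kolmogorov (JMAK) transformed fraction and its standard Avrami-plot linearization (k and n are arbitrary example values, not from the paper):

```python
import numpy as np

# JMAK transformed fraction X(t) = 1 - exp(-(k t)^n).
k, n = 0.02, 3.0                          # rate constant (1/s), Avrami exponent
t = np.linspace(1.0, 200.0, 400)          # time (s)
X = 1.0 - np.exp(-((k * t) ** n))         # transformed fraction

# ln(-ln(1 - X)) vs ln(t) is a line with slope n
mask = (X > 1e-4) & (X < 0.99)            # avoid the numerically flat tails
slope, _ = np.polyfit(np.log(t[mask]), np.log(-np.log(1.0 - X[mask])), 1)
print(round(slope, 2))                    # recovers the exponent, ≈ 3.0
```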
Users manual for linear Time-Varying Helicopter Simulation (Program TVHIS)
NASA Technical Reports Server (NTRS)
Burns, M. R.
1979-01-01
A linear time-varying helicopter simulation program (TVHIS) is described. The program is designed as a realistic yet efficient helicopter simulation. It is based on a linear time-varying helicopter model which includes rotor, actuator, and sensor models, as well as a simulation of flight computer logic. The TVHIS can generate a mean trajectory simulation along a nominal trajectory, or propagate covariance of helicopter states, including rigid-body, turbulence, control command, controller states, and rigid-body state estimates.
A neuron-astrocyte transistor-like model for neuromorphic dressed neurons.
Valenza, G; Pioggia, G; Armato, A; Ferro, M; Scilingo, E P; De Rossi, D
2011-09-01
Experimental evidence on the role of the synaptic glia as an active partner, together with the synapse, in neuronal signaling and the dynamics of neural tissue strongly suggests investigating a more realistic neuron-glia model for a better understanding of human brain processing. Among the glial cells, the astrocytes play a crucial role in the tripartite synapse, i.e., the dressed neuron. A well-known two-way astrocyte-neuron interaction can be found in the literature, completely revising the purely supportive role of the glia. The aim of this study is to provide a computationally efficient model of the neuron-glia interaction. The neuron-glia interactions were simulated by implementing the Li-Rinzel model for an astrocyte and the Izhikevich model for a neuron. Assuming that the dressed-neuron dynamics are similar to the nonlinear input-output characteristics of a bipolar junction transistor, we derived our computationally efficient model. This model may represent the fundamental computational unit for the development of real-time artificial neuron-glia networks, opening new perspectives in pattern recognition systems and in brain neurophysiology. Copyright © 2011 Elsevier Ltd. All rights reserved.
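A minimal sketch of the Izhikevich neuron used for the neuronal side of the pair (regular-spiking parameters from the original 2003 formulation); the Li-Rinzel astrocyte and the transistor-like coupling are omitted here.

```python
# Izhikevich model with regular-spiking parameters and a constant input
# current; tonic spiking is guaranteed because the nullclines do not cross.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = -65.0, 0.2 * -65.0            # membrane potential (mV), recovery
dt, I = 0.25, 10.0                   # ms per step, constant input
spikes = 0
for _ in range(4000):                # 1000 ms of simulated time
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                    # spike detected: reset
        v, u = c, u + d
        spikes += 1
```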
Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid
2017-12-01
Parameter recovery in diffuse optical tomography is computationally expensive, especially for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, and the lack of a real-time solution is impeding practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable to modeling both continuous-wave and frequency-domain systems, with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ∼600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ∼0.25 s/excitation source. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
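A hedged 1-D finite-difference stand-in for the diffusion-approximation forward problem (the paper uses 3-D finite elements on a head mesh): solve -d/dx(D dPhi/dx) + mua*Phi = S as a linear system. Optical properties are generic tissue-like values, not from the paper.

```python
import numpy as np

# Continuous-wave diffusion approximation in 1-D, central differences.
n, dx = 200, 0.05                   # nodes, grid spacing (cm)
mua, musp = 0.1, 10.0               # absorption, reduced scattering (1/cm)
D = 1.0 / (3.0 * (mua + musp))      # diffusion coefficient (cm)

A = np.zeros((n, n))
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = -D / dx**2
    A[i, i] = 2.0 * D / dx**2 + mua
A[0, 0] = A[-1, -1] = 1.0           # crude zero-fluence boundaries

S = np.zeros(n)
S[n // 4] = 1.0 / dx                # point-like source at one quarter depth
phi = np.linalg.solve(A, S)         # fluence decays away from the source
```

The paper's speed-up comes from parallelizing exactly this kind of sparse solve, repeated per source, on GPU hardware.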
Adjudicating between face-coding models with individual-face fMRI responses
Kriegeskorte, Nikolaus
2017-01-01
The perceptual representation of individual faces is often explained with reference to a norm-based face space. In such spaces, individuals are encoded as vectors where identity is primarily conveyed by direction and distinctiveness by eccentricity. Here we measured human fMRI responses and psychophysical similarity judgments of individual face exemplars, which were generated as realistic 3D animations using a computer-graphics model. We developed and evaluated multiple neurobiologically plausible computational models, each of which predicts a representational distance matrix and a regional-mean activation profile for 24 face stimuli. In the fusiform face area, a face-space coding model with sigmoidal ramp tuning provided a better account of the data than one based on exemplar tuning. However, an image-processing model with weighted banks of Gabor filters performed similarly. Accounting for the data required the inclusion of a measurement-level population averaging mechanism that approximates how fMRI voxels locally average distinct neuronal tunings. Our study demonstrates the importance of comparing multiple models and of modeling the measurement process in computational neuroimaging. PMID:28746335
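A sketch of the representational-distance-matrix comparison described above, assuming random stand-in faces and sigmoidal ramp tuning; none of the values come from the study.

```python
import numpy as np

# Each candidate model predicts a response pattern per face; models are
# compared via representational distance matrices (RDMs).
rng = np.random.default_rng(3)
faces = rng.normal(size=(24, 5))              # 24 faces in a 5-D face space
units = rng.normal(size=(5, 100))             # preferred directions, 100 units
resp = 1.0 / (1.0 + np.exp(-faces @ units))   # sigmoidal ramp tuning

# RDM: correlation distance between response patterns
z = (resp - resp.mean(1, keepdims=True)) / resp.std(1, keepdims=True)
rdm = 1.0 - (z @ z.T) / resp.shape[1]         # 24 x 24, zero diagonal
```

In the study, such model RDMs are compared against the RDM measured from fMRI response patterns for the same 24 stimuli.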
NASA Astrophysics Data System (ADS)
Baart, F.; van Gils, A.; Hagenaars, G.; Donchyts, G.; Eisemann, E.; van Velzen, J. W.
2016-12-01
A compelling visualization is captivating, beautiful and narrative. Here we show how melding the skills of computer graphics, art, statistics, and environmental modeling can be used to generate innovative, attractive and very informative visualizations. We focus on the topic of visualizing forecasts and measurements of water (water level, waves, currents, density, and salinity). For the field of computer graphics and the arts, water is an important topic because it occurs in many natural scenes. For environmental modeling and statistics, water is an important topic because water is essential for transport, a healthy environment, fruitful agriculture, and safety. The different disciplines take different approaches to visualizing water. In computer graphics, the focus is on making water look as realistic as possible. This focus on realistic perception (versus the focus on physical balance pursued by environmental scientists) has resulted in fascinating renderings, as seen in recent games and movies. Visualization techniques for statistical results have benefited from advances in design and journalism, resulting in enthralling infographics. The field of environmental modeling has absorbed advances in contemporary cartography, as seen in the latest interactive data-driven maps. We systematically review the design of emerging types of water visualizations. The examples that we analyze range from dynamically animated forecasts, interactive paintings, and infographics to modern cartography and web-based photorealistic rendering. By characterizing the intended audience, the design choices, the scales (e.g. time, space), and the explorability, we provide a set of guidelines and genres. The unique contributions of the different fields show how innovations in the current state of the art of water visualization have benefited from interdisciplinary collaborations.
Neo-deterministic seismic hazard scenarios for India—a preventive tool for disaster mitigation
NASA Astrophysics Data System (ADS)
Parvez, Imtiyaz A.; Magrin, Andrea; Vaccari, Franco; Ashish; Mir, Ramees R.; Peresan, Antonella; Panza, Giuliano Francesco
2017-11-01
Current computational resources and physical knowledge of the seismic wave generation and propagation processes allow for reliable numerical and analytical models of waveform generation and propagation. From the simulation of ground motion, it is easy to extract the desired earthquake hazard parameters. Accordingly, a scenario-based approach to seismic hazard assessment has been developed, namely the neo-deterministic seismic hazard assessment (NDSHA), which allows for a wide range of possible seismic sources to be used in the definition of reliable scenarios by means of realistic waveforms modelling. Such reliable and comprehensive characterization of expected earthquake ground motion is essential to improve building codes, particularly for the protection of critical infrastructures and for land use planning. Parvez et al. (Geophys J Int 155:489-508, 2003) published the first ever neo-deterministic seismic hazard map of India by computing synthetic seismograms with input data set consisting of structural models, seismogenic zones, focal mechanisms and earthquake catalogues. As described in Panza et al. (Adv Geophys 53:93-165, 2012), the NDSHA methodology evolved with respect to the original formulation used by Parvez et al. (Geophys J Int 155:489-508, 2003): the computer codes were improved to better fit the need of producing realistic ground shaking maps and ground shaking scenarios, at different scale levels, exploiting the most significant pertinent progresses in data acquisition and modelling. Accordingly, the present study supplies a revised NDSHA map for India. The seismic hazard, expressed in terms of maximum displacement (Dmax), maximum velocity (Vmax) and design ground acceleration (DGA), has been extracted from the synthetic signals and mapped on a regular grid over the studied territory.
MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance
2014-01-01
Background: Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. Methods: The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. Results: MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. Conclusion: The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions. PMID:25204441
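A minimal sketch of the retrospective Cartesian k-t undersampling such a phantom framework supports, with a synthetic "beating disc" standing in for the phantom and simple zero-filled reconstruction (the paper's k-t PCA / k-t SPARSE reconstructions are not reproduced here):

```python
import numpy as np

# Retrospective k-t undersampling of a synthetic dynamic object.
rng = np.random.default_rng(4)
ny, nx, nt, R = 64, 64, 8, 4                   # matrix, frames, acceleration
yy, xx = np.mgrid[0:ny, 0:nx]
frames = np.stack([(np.hypot(yy - 32, xx - 32) < 10 + f) * 1.0
                   for f in range(nt)])        # radius grows frame to frame

kspace = np.fft.fft2(frames)                   # per-frame 2-D k-space
mask = np.zeros((nt, ny, 1))
for f in range(nt):                            # keep ny/R phase-encode lines
    keep = rng.choice(ny, ny // R, replace=False)
    mask[f, keep, 0] = 1.0
zero_filled = np.fft.ifft2(kspace * mask).real # aliased reconstruction
```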
A Lagrangian particle model to predict the airborne spread of foot-and-mouth disease virus
NASA Astrophysics Data System (ADS)
Mayer, D.; Reiczigel, J.; Rubel, F.
Airborne spread of bioaerosols in the boundary layer over complex terrain is simulated using a Lagrangian particle model, applied here to the airborne spread of foot-and-mouth disease (FMD) virus. Two case studies are made with study domains located in a hilly region northwest of the Styrian capital Graz, the second largest town in Austria. Mountainous terrain as well as inhomogeneous and time-varying meteorological conditions prevent the application of the Gaussian dispersion models used so far, while the proposed model can handle these conditions realistically. In the model, the trajectories of several thousand particles are computed and the distribution of virus concentration near the ground is calculated. This makes it possible to assess infection-risk areas with respect to the animal species of interest, such as cattle, swine or sheep. Meteorological input data such as the wind field and other variables necessary to compute turbulence were taken from the new pre-operational version of the non-hydrostatic numerical weather prediction model LMK (Lokal-Modell-Kürzestfrist) running at the German weather service DWD (Deutscher Wetterdienst). The LMK model provides meteorological parameters with a spatial resolution of about 2.8 km. To account for the spatial resolution of 400 m used by the Lagrangian particle model, the initial wind field is interpolated onto the finer grid by a mass-consistent interpolation method. The case studies show a significant influence of local wind systems on the spread of the virus. Higher virus concentrations on the upwind side of the hills and marginal concentrations in the lee are clearly observable, as are canalization effects in valleys. The study demonstrates that the Lagrangian particle model is an appropriate tool for risk assessment of the airborne spread of the virus, since it takes into account realistic orographic and meteorological conditions.
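A schematic Lagrangian particle dispersion sketch under strong simplifications (homogeneous turbulence as an uncorrelated random velocity, flat terrain, constant mean wind, all values invented): particles are advected and then binned into a near-ground concentration map.

```python
import numpy as np

# Particles released at a source, advected by mean wind + random turbulence.
rng = np.random.default_rng(1)
n, steps, dt = 20000, 600, 1.0          # particles, time steps, s per step
wind = np.array([3.0, 1.0])             # mean wind (m/s)
sigma_turb = 0.8                        # turbulent velocity scale (m/s)

pos = np.zeros((n, 2))                  # all particles start at the source
for _ in range(steps):
    turb = rng.normal(0.0, sigma_turb, size=(n, 2))
    pos += (wind + turb) * dt

# Concentration: particles per 100 m x 100 m cell
extent, cell = 4000.0, 100.0
edges = np.arange(-extent, extent + cell, cell)
conc, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=[edges, edges])
```

The real model replaces the constant wind with the interpolated 400 m LMK wind field and adds orography, deposition and correlated turbulent velocities.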
NASA Astrophysics Data System (ADS)
Fedosov, Dmitry
2011-03-01
Computational biophysics is a large and rapidly growing area of computational physics. In this talk, we will focus on a number of biophysical problems related to blood cells and blood flow in health and disease. Blood flow plays a fundamental role in a wide range of physiological processes and pathologies in the organism. To understand and, if necessary, manipulate the course of these processes, it is essential to investigate blood flow under realistic conditions, including the deformability of blood cells, their interactions, and their behavior in the complex microvascular network. Using a multiscale cell model, we are able to accurately capture red blood cell mechanics, rheology, and dynamics in agreement with a number of single-cell experiments. Further, this validated model yields accurate predictions of blood rheological properties, cell migration, the cell-free layer, and hemodynamic resistance in microvessels. In addition, we investigate blood-related changes in malaria, which include a considerable stiffening of red blood cells and their cytoadherence to the endothelium. For these biophysical problems, computational modeling is able to provide new physical insights and capabilities for quantitative predictions of blood flow in health and disease.
Fault Detection of Rotating Machinery using the Spectral Distribution Function
NASA Technical Reports Server (NTRS)
Davis, Sanford S.
1997-01-01
The spectral distribution function is introduced to characterize the process leading to faults in rotating machinery. It is shown to be a more robust indicator than conventional power spectral density estimates, but requires only slightly more computational effort. The method is illustrated with examples from seeded gearbox transmission faults and an analytical model of a defective bearing. Procedures are suggested for implementation in realistic environments.
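The spectral distribution function is the running integral of the power spectral density, so a narrowband fault tone appears as a sharp step; a sketch with an illustrative synthetic signal (not the paper's gearbox or bearing data):

```python
import numpy as np

# Cumulative (integrated) periodogram of a tone-in-noise signal.
fs, n = 1000.0, 4096
t = np.arange(n) / fs
rng = np.random.default_rng(2)
x = np.sin(2.0 * np.pi * 125.0 * t) + 0.5 * rng.normal(size=n)

psd = np.abs(np.fft.rfft(x)) ** 2 / n       # one-sided periodogram
freqs = np.fft.rfftfreq(n, 1.0 / fs)
sdf = np.cumsum(psd)
sdf /= sdf[-1]                              # normalize to [0, 1]

# Most of the cumulative power jumps in a narrow band around 125 Hz
band = (freqs > 120.0) & (freqs < 130.0)
jump = sdf[band][-1] - sdf[band][0]         # fraction of power in the tone
```

Because the tone carries two thirds of the total power here, the step dominates the otherwise smooth noise ramp, which is what makes the statistic robust.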
Evaluation of one dimensional analytical models for vegetation canopies
NASA Technical Reports Server (NTRS)
Goel, Narendra S.; Kuusk, Andres
1992-01-01
The SAIL model for one-dimensional homogeneous vegetation canopies has been modified to include the specular reflectance and hot spot effects. This modified model and the Nilson-Kuusk model are evaluated by comparing the reflectances given by them against those given by a radiosity-based computer model, Diana, for a set of canopies, characterized by different leaf area index (LAI) and leaf angle distribution (LAD). It is shown that for homogeneous canopies, the analytical models are generally quite accurate in the visible region, but not in the infrared region. For architecturally realistic heterogeneous canopies of the type found in nature, these models fall short. These shortcomings are quantified.
Magnetic reconnection in the low solar chromosphere with a more realistic radiative cooling model
NASA Astrophysics Data System (ADS)
Ni, Lei; Lukin, Vyacheslav S.; Murphy, Nicholas A.; Lin, Jun
2018-04-01
Magnetic reconnection is the most likely mechanism responsible for the high temperature events that are observed in strongly magnetized locations around the temperature minimum in the low solar chromosphere. This work improves upon our previous work [Ni et al., Astrophys. J. 852, 95 (2018)] by using a more realistic radiative cooling model computed from the OPACITY project and the CHIANTI database. We find that the rate of ionization of the neutral component of the plasma is still faster than recombination within the current sheet region. For low β plasmas, the ionized and neutral fluid flows are well-coupled throughout the reconnection region resembling the single-fluid Sweet-Parker model dynamics. Decoupling of the ion and neutral inflows appears in the higher β case with β₀ = 1.46, which leads to a reconnection rate about three times faster than the rate predicted by the Sweet-Parker model. In all cases, the plasma temperature increases with time inside the current sheet, and the maximum value is above 2 × 10⁴ K when the reconnection magnetic field strength is greater than 500 G. While the more realistic radiative cooling model does not result in qualitative changes of the characteristics of magnetic reconnection, it is necessary for studying the variations of the plasma temperature and ionization fraction inside current sheets in strongly magnetized regions of the low solar atmosphere. It is also important for studying energy conversion during the magnetic reconnection process when the hydrogen-dominated plasma approaches full ionization.
NASA Astrophysics Data System (ADS)
Homainejad, Amir S.; Satari, Mehran
2000-05-01
Virtual reality (VR) brings users closer to reality by computer, and a virtual environment (VE) is a simulated world that can take users to any point and direction of the object. VR and VE can be very useful if accurate and precise data are used, allowing users to work with a realistic model. Photogrammetry is a technique that can collect and provide accurate and precise data for building a 3D model in a computer. Data can be collected from various sensors and cameras, and the methods of data collection vary based on the method of image acquisition. VR includes real-time graphics, 3D models, and displays, and it has applications in the entertainment industry, flight simulators, and industrial design.
The pros and cons of code validation
NASA Technical Reports Server (NTRS)
Bobbitt, Percy J.
1988-01-01
Computational and wind tunnel error sources are examined and quantified using specific calculations of experimental data, and a substantial comparison of theoretical and experimental results, or code validation, is discussed. Wind tunnel error sources considered include wall interference, sting effects, Reynolds number effects, flow quality and transition, and instrumentation such as strain gage balances, electronically scanned pressure systems, hot film gages, hot wire anemometers, and laser velocimeters. Computational error sources include the math model equation sets, the solution algorithm, artificial viscosity/dissipation, boundary conditions, the uniqueness of solutions, grid resolution, turbulence modeling, and Reynolds number effects. It is concluded that, although improvements in theory are being made more quickly than in experiments, wind tunnel research has the advantage of the more realistic transition process of a free-transition test over reliance on a turbulence model.
Epstein, Joshua M.; Pankajakshan, Ramesh; Hammond, Ross A.
2011-01-01
We introduce a novel hybrid of two fields—Computational Fluid Dynamics (CFD) and Agent-Based Modeling (ABM)—as a powerful new technique for urban evacuation planning. CFD is a predominant technique for modeling airborne transport of contaminants, while ABM is a powerful approach for modeling social dynamics in populations of adaptive individuals. The hybrid CFD-ABM method is capable of simulating how large, spatially-distributed populations might respond to a physically realistic contaminant plume. We demonstrate the overall feasibility of CFD-ABM evacuation design, using the case of a hypothetical aerosol release in Los Angeles to explore potential effectiveness of various policy regimes. We conclude by arguing that this new approach can be powerfully applied to arbitrary population centers, offering an unprecedented preparedness and catastrophic event response tool. PMID:21687788
EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.
NASA Astrophysics Data System (ADS)
Vilagosh, Zoltan; Lajevardipour, Alireza; Wood, Andrew
2018-01-01
Finite-difference time-domain (FDTD) computational phantoms aid the analysis of THz radiation interaction with human skin. The presented computational phantoms have accurate anatomical layering and electromagnetic properties. A novel "large sheet" simulation technique is used, allowing for a realistic representation of lateral absorption and reflection in in-vivo measurements. Simulations carried out to date have indicated that hair follicles act as THz propagation channels and confirm that melanin, both in nevi and in skin pigmentation, may act as a significant absorber of THz radiation. A novel freezing technique has promise in increasing the depth of skin penetration of THz radiation to aid diagnostic imaging.
Systematic comparison of the behaviors produced by computational models of epileptic neocortex.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warlaumont, A. S.; Lee, H. C.; Benayoun, M.
2010-12-01
Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified) are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.
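The metric-extraction-plus-PCA comparison described in this abstract can be sketched with a minimal numpy-only version. Each row is a vector of high-level metrics from one model run or recording; the metric values, group sizes, and the centroid-distance overlap proxy are illustrative assumptions, not the paper's actual data or statistics:

```python
import numpy as np

def behavior_space(metric_rows, n_components=2):
    """Z-score each high-level metric, then project onto the leading
    principal components to obtain a low-dimensional 'behavior space'."""
    X = np.asarray(metric_rows, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize metrics
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T                  # derived features

# Stand-in metric sets for a detailed model and an abstract model
rng = np.random.default_rng(0)
detailed = rng.normal(0.0, 1.0, size=(40, 6))
abstract = rng.normal(0.3, 1.0, size=(40, 6))
feats = behavior_space(np.vstack([detailed, abstract]))
# Overlap proxy: distance between group centroids in behavior space
gap = np.linalg.norm(feats[:40].mean(axis=0) - feats[40:].mean(axis=0))
```

Because the projection is computed on centered data, the derived features are zero-mean, and any distance or range measure in the reduced space compares the two model families on equal footing.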
Surrogate based wind farm layout optimization using manifold mapping
NASA Astrophysics Data System (ADS)
Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester
2016-09-01
The high computational cost of high-fidelity wake models such as RANS or LES is the primary bottleneck to performing direct high-fidelity wind farm layout optimization (WFLO) with accurate CFD-based wake models. Therefore, a surrogate-based multi-fidelity WFLO methodology (SWFLO) is proposed. The surrogate model is built using a surrogate-based optimization (SBO) method referred to as manifold mapping (MM). As verification, optimization of the spacing between two staggered wind turbines was performed using the proposed surrogate-based methodology, and its performance was compared with that of direct optimization using the high-fidelity model. Significant reduction in computational cost was achieved using MM: a maximum computational cost reduction of 65%, while arriving at the same optimum as direct high-fidelity optimization. The similarity between the responses of the models, as well as the number of mapping points and their positions, strongly influences the computational efficiency of the proposed method. As a proof of concept, realistic WFLO of a small 7-turbine wind farm was performed using the proposed surrogate-based methodology. Two variants of the Jensen wake model with different decay coefficients were used as the fine and coarse models. The proposed SWFLO method arrived at the same optimum as the fine model with far fewer fine-model simulations.
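The fine/coarse pairing of Jensen-model variants mentioned above can be illustrated with the classic top-hat wake formula. The parameter values below (thrust coefficient, rotor radius, decay coefficients) are illustrative choices, not the paper's settings:

```python
import numpy as np

def jensen_deficit(x, ct=0.8, r0=40.0, k=0.075):
    """Axial fractional velocity deficit at downstream distance x (m) in
    the Jensen (Park) wake model: (1 - sqrt(1 - Ct)) * (r0 / (r0 + k*x))**2.
    The wake decay coefficient k is exactly what distinguishes the paper's
    fine and coarse model variants."""
    return (1.0 - np.sqrt(1.0 - ct)) * (r0 / (r0 + k * np.asarray(x))) ** 2

x = np.linspace(100.0, 2000.0, 50)          # downstream distances, m
fine = jensen_deficit(x, k=0.05)            # slower wake recovery
coarse = jensen_deficit(x, k=0.10)          # coarse variant recovers faster
```

Because only the scalar k differs, the two responses have very similar shape, which is the property a manifold-mapping surrogate exploits when it corrects the coarse model toward the fine one at a handful of mapping points.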
Comprehensive Model of Single Particle Pulverized Coal Combustion Extended to Oxy-Coal Conditions
Holland, Troy; Fletcher, Thomas H.
2017-02-22
Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive CFD simulations are valuable tools in evaluating and deploying oxy-fuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive simulations require physically realistic submodels with low computational requirements. In particular, comprehensive char oxidation and gasification models have been developed that describe multiple reaction and diffusion processes. Our work extends a comprehensive char conversion code (CCK), which treats surface oxidation and gasification reactions as well as processes such as film diffusion, pore diffusion, ash encapsulation, and annealing. In this work several submodels in the CCK code were updated with more realistic physics or otherwise extended to function in oxy-coal conditions. Improved submodels include the annealing model, the swelling model, the mode of burning parameter, and the kinetic model, as well as the addition of the chemical percolation devolatilization (CPD) model. We compare the results of the char combustion model to oxy-coal data, and further compare them to parallel data sets near conventional conditions. A potential method to apply the detailed code in CFD work is given.
GPU-powered Shotgun Stochastic Search for Dirichlet process mixtures of Gaussian Graphical Models
Mukherjee, Chiranjit; Rodriguez, Abel
2016-01-01
Gaussian graphical models are popular for modeling high-dimensional multivariate data with sparse conditional dependencies. A mixture of Gaussian graphical models extends this model to the more realistic scenario where observations come from a heterogeneous population composed of a small number of homogeneous sub-groups. In this paper we present a novel stochastic search algorithm for finding the posterior mode of high-dimensional Dirichlet process mixtures of decomposable Gaussian graphical models. Further, we investigate how to harness the massive thread-parallelization capabilities of graphical processing units to accelerate computation. The computational advantages of our algorithms are demonstrated with various simulated, moderate-dimensional data examples in which we compare our stochastic search with a Markov chain Monte Carlo algorithm. These experiments show that our stochastic search largely outperforms the Markov chain Monte Carlo algorithm both in computing times and in the quality of the posterior mode discovered. Finally, we analyze a gene expression dataset in which Markov chain Monte Carlo algorithms are too slow to be practically useful. PMID:28626348
Plank, Gernot; Zhou, Lufang; Greenstein, Joseph L; Cortassa, Sonia; Winslow, Raimond L; O'Rourke, Brian; Trayanova, Natalia A
2008-01-01
Computer simulations of electrical behaviour in the whole ventricles have become commonplace during the last few years. The goals of this article are (i) to review the techniques that are currently employed to model cardiac electrical activity in the heart, discussing the strengths and weaknesses of the various approaches, and (ii) to implement a novel modelling approach, based on physiological reasoning, that lifts some of the restrictions imposed by current state-of-the-art ionic models. To illustrate the latter approach, the present study uses a recently developed ionic model of the ventricular myocyte that incorporates an excitation–contraction coupling and mitochondrial energetics model. A paradigm to bridge the vastly disparate spatial and temporal scales, from subcellular processes to the entire organ, and from sub-microseconds to minutes, is presented. Achieving sufficient computational efficiency is the key to success in the quest to develop multiscale realistic models that are expected to lead to better understanding of the mechanisms of arrhythmia induction following failure at the organelle level, and ultimately to the development of novel therapeutic applications. PMID:18603526
Image-based models of cardiac structure in health and disease
Vadakkumpadan, Fijoy; Arevalo, Hermenegild; Prassl, Anton J.; Chen, Junjie; Kickinger, Ferdinand; Kohl, Peter; Plank, Gernot; Trayanova, Natalia
2010-01-01
Computational approaches to investigating the electromechanics of healthy and diseased hearts are becoming essential for the comprehensive understanding of cardiac function. In this article, we first present a brief review of existing image-based computational models of cardiac structure. We then provide a detailed explanation of a processing pipeline which we have recently developed for constructing realistic computational models of the heart from high resolution structural and diffusion tensor (DT) magnetic resonance (MR) images acquired ex vivo. The presentation of the pipeline incorporates a review of the methodologies that can be used to reconstruct models of cardiac structure. In this pipeline, the structural image is segmented to reconstruct the ventricles, normal myocardium, and infarct. A finite element mesh is generated from the segmented structural image, and fiber orientations are assigned to the elements based on DTMR data. The methods were applied to construct seven different models of healthy and diseased hearts. These models contain millions of elements, with spatial resolutions in the order of hundreds of microns, providing unprecedented detail in the representation of cardiac structure for simulation studies. PMID:20582162
Computational Modeling of 3D Tumor Growth and Angiogenesis for Chemotherapy Evaluation
Tang, Lei; van de Ven, Anne L.; Guo, Dongmin; Andasari, Vivi; Cristini, Vittorio; Li, King C.; Zhou, Xiaobo
2014-01-01
Solid tumors develop abnormally at spatial and temporal scales, giving rise to biophysical barriers that impact anti-tumor chemotherapy. This may increase the expenditure and time for conventional drug pharmacokinetic and pharmacodynamic studies. In order to facilitate drug discovery, we propose a mathematical model that couples three-dimensional tumor growth and angiogenesis to simulate tumor progression for chemotherapy evaluation. This application-oriented model incorporates complex dynamical processes including cell- and vascular-mediated interstitial pressure, mass transport, angiogenesis, cell proliferation, and vessel maturation to model tumor progression through multiple stages including tumor initiation, avascular growth, and transition from avascular to vascular growth. Compared to pure mechanistic models, the proposed empirical methods are not only easy to conduct but can provide realistic predictions and calculations. A series of computational simulations were conducted to demonstrate the advantages of the proposed comprehensive model. The computational simulation results suggest that solid tumor geometry is related to the interstitial pressure, such that tumors with high interstitial pressure are more likely to develop dendritic structures than those with low interstitial pressure. PMID:24404145
A gel as an array of channels.
Zimm, B H
1996-06-01
We consider the theory of charged point molecules ('probes') being pulled by an electric field through a two-dimensional net of channels that represents a piece of gel. Associated with the position in the net is a free energy of interaction between the probe and the net; this free energy fluctuates randomly with the position of the probe in the net. The free energy is intended to represent weak interactions between the probe and the gel, such as entropy associated with the restriction of the freedom of motion of the probe by the gel, or electrostatic interactions between the probe and charges fixed to the gel. The free energy can be thought of as a surface with the appearance of a rough, hilly landscape spread over the net; the roughness is measured by the standard deviation of the free-energy distribution. Two variations of the model are examined: (1) the net is assumed to have all channels open, or (2) only channels parallel to the electric field are open and all the cross-connecting channels are closed. Model (1) is more realistic but presents a two-dimensional mathematical problem which can only be solved by slow iteration methods, while model (2) is less realistic but presents a one-dimensional problem that can be reduced to simple quadratures and is easy to solve by numerical integration. In both models the mobility of the probe decreases as the roughness parameter is increased, but the effect is larger in the less realistic model (2) if the same free-energy surface is used in both. The mobility in model (2) is reduced both by high points in the rough surface ('bumps') and by low points ('traps'), while in model (1) only the traps are effective, since the probes can flow around the bumps through the cross channels. The mobility in model (2) can be made to agree with model (1) simply by cutting off the bumps of the surface. Thus the simple model (2) can be used in place of the more realistic model (1) that is more difficult to compute.
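Model (2)'s qualitative conclusion can be reproduced numerically with the standard weak-field result for one-dimensional transport in a random free-energy landscape, μ/μ₀ = 1/(⟨e^{F/kT}⟩⟨e^{−F/kT}⟩). This is a stand-in for Zimm's actual quadratures, with Gaussian roughness assumed; note the formula is symmetric in F, matching the abstract's point that both bumps and traps slow the probe in the one-dimensional model:

```python
import numpy as np

def relative_mobility(F_over_kT):
    """Weak-field mobility of a probe in a 1-D channel whose site free
    energies are F (in units of kT), relative to a smooth channel.
    Bumps (F > 0) and traps (F < 0) enter symmetrically, so either
    kind of roughness reduces the mobility."""
    w = np.exp(F_over_kT)
    return 1.0 / (w.mean() * (1.0 / w).mean())

rng = np.random.default_rng(1)
mobility = {s: relative_mobility(rng.normal(0.0, s, 400_000))
            for s in (0.0, 0.5, 1.0, 1.5)}
# For Gaussian roughness of standard deviation sigma this average
# tends to exp(-sigma**2): mobility falls rapidly with roughness.
```

The monotone decrease with the roughness parameter sigma is the same trend the abstract reports for both models.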
Development of a realistic, dynamic digital brain phantom for CT perfusion validation
NASA Astrophysics Data System (ADS)
Divel, Sarah E.; Segars, W. Paul; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.
2016-03-01
Physicians rely on CT Perfusion (CTP) images and quantitative image data, including cerebral blood flow, cerebral blood volume, and bolus arrival delay, to diagnose and treat stroke patients. However, the quantification of these metrics may vary depending on the computational method used. Therefore, we have developed a dynamic and realistic digital brain phantom upon which CTP scans can be simulated based on a set of ground truth scenarios. Building upon the previously developed 4D extended cardiac-torso (XCAT) phantom containing a highly detailed brain model, this work consisted of expanding the intricate vasculature by semi-automatically segmenting existing MRA data and fitting nonuniform rational B-spline surfaces to the new vessels. Using time attenuation curves input by the user as reference, the contrast enhancement in the vessels changes dynamically. At each time point, the iodine concentration in the arteries and veins is calculated from the curves and the material composition of the blood changes to reflect the expected values. CatSim, a CT system simulator, generates simulated data sets of this dynamic digital phantom which can be further analyzed to validate CTP studies and post-processing methods. The development of this dynamic and realistic digital phantom provides a valuable resource with which current uncertainties and controversies surrounding the quantitative computations generated from CTP data can be examined and resolved.
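The user-supplied time attenuation curves that drive the phantom's contrast enhancement are often modeled analytically. Below is a sketch using a peak-normalized gamma-variate bolus shape and an assumed linear HU-to-iodine conversion; the 25 HU per mg/ml factor and all curve parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

HU_PER_MG_ML_IODINE = 25.0   # assumed linear conversion at a typical kVp

def gamma_variate_tac(t, t0=5.0, peak_hu=300.0, alpha=3.0, beta=1.5):
    """Contrast enhancement (HU) vs. time (s): zero before bolus arrival
    t0, then a gamma-variate pulse that peaks at peak_hu when
    t = t0 + alpha*beta."""
    s = np.clip(np.asarray(t, dtype=float) - t0, 0.0, None)
    return peak_hu * (s / (alpha * beta)) ** alpha * np.exp(alpha - s / beta)

t = np.linspace(0.0, 40.0, 401)
enhancement = gamma_variate_tac(t)
# The iodine concentration assigned to the blood material at each time
# point follows from the assumed linear conversion:
iodine_mg_ml = enhancement / HU_PER_MG_ML_IODINE
```

In the phantom, a curve like this is sampled at each simulated scan time to set the iodine fraction of the vessel material before CatSim generates the projection data.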
Bacillus subtilis Lipid Extract, A Branched-Chain Fatty Acid Model Membrane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nickels, Jonathan D.; Chatterjee, Sneha; Mostofian, Barmak
Lipid extracts are an excellent choice of model biomembrane; however, at present there are no commercially available lipid extracts or computational models that mimic microbial membranes containing the branched-chain fatty acids found in many pathogenic and industrially relevant bacteria. Here, we advance the extract of Bacillus subtilis as a standard model for these diverse systems, providing a detailed experimental description and an equilibrated atomistic bilayer model, included as Supporting Information to this Letter and at (http://cmb.ornl.gov/members/cheng). The development and validation of this model represents an advance that enables more realistic simulations and experiments on bacterial membranes and reconstituted bacterial membrane proteins.
Study on the tumor-induced angiogenesis using mathematical models.
Suzuki, Takashi; Minerva, Dhisa; Nishiyama, Koichi; Koshikawa, Naohiko; Chaplain, Mark Andrew Joseph
2018-01-01
We studied angiogenesis using mathematical models describing the dynamics of tip cells. We reviewed the basic ideas of angiogenesis models and their numerical simulation techniques for producing realistic computer graphics images of sprouting angiogenesis. We examined the classical model of Anderson-Chaplain using fundamental concepts of mass transport and chemical reaction, with ECM degradation included. We then constructed two types of numerical schemes, model-faithful and model-driven ones, in which new techniques of numerical simulation are introduced, such as transient probability, particle velocity, and Boolean variables. © 2017 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.
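The transient-probability idea can be illustrated with a toy tip-cell walk on a grid, where movement probabilities are weighted by the local chemoattractant (TAF) field. The exponential weighting, field shape, and all parameter values are our own illustrative choices, not the Anderson-Chaplain coefficients:

```python
import numpy as np

def simulate_tip(taf, start, steps=200, chi=20.0, seed=0):
    """Biased random walk of one tip cell: each step moves to one of the
    4 grid neighbours with probability proportional to exp(chi * TAF)
    there, so the sprout tends to migrate up the TAF gradient toward
    the source (e.g. a tumor)."""
    rng = np.random.default_rng(seed)
    n = taf.shape[0]
    i, j = start
    for _ in range(steps):
        nbrs = [(a, b) for a, b in ((i+1, j), (i-1, j), (i, j+1), (i, j-1))
                if 0 <= a < n and 0 <= b < n]
        w = np.array([np.exp(chi * taf[a, b]) for a, b in nbrs])
        i, j = nbrs[rng.choice(len(w), p=w / w.sum())]
    return i, j

n = 41
rows, cols = np.mgrid[0:n, 0:n]
# TAF concentration: highest at a source located at row 20, column 40
taf = 1.0 - np.hypot(cols - 40, rows - 20) / np.hypot(40, 20)
end = simulate_tip(taf, start=(20, 0))
```

Branching and anastomosis (handled with Boolean variables in the schemes above) would be layered on top of this single-tip movement rule.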
Ginn, T.R.; Woolfenden, L.
2002-01-01
A project for modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin aquifer in California is discussed. The Rialto-Colton aquifer has been divided into four primary and significant flowpaths following the general direction of groundwater flow from NW to SE. The introductory investigation includes sophisticated chemical reaction modeling with highly simplified flow-path simulation. A comprehensive reactive transport model with the established set of geochemical reactions over the whole aquifer will also be developed to treat both reactions and transport realistically. This will be accomplished using HBGC123D extended with an isotopic calculation step to compute carbon-14 (C14) and stable carbon-13 (C13) contents of the water. Computed carbon contents will then be calibrated against measured carbon contents to assess the amount of imported recharge into the Linden pond.
Shi, Yunfei; Yao, Jiang; Young, Jonathan M.; Fee, Judy A.; Perucchio, Renato; Taber, Larry A.
2014-01-01
The morphogenetic process of cardiac looping transforms the straight heart tube into a curved tube that resembles the shape of the future four-chambered heart. Although great progress has been made in identifying the molecular and genetic factors involved in looping, the physical mechanisms that drive this process have remained poorly understood. Recent work, however, has shed new light on this complicated problem. After briefly reviewing the current state of knowledge, we propose a relatively comprehensive hypothesis for the mechanics of the first phase of looping, termed c-looping, as the straight heart tube deforms into a c-shaped tube. According to this hypothesis, differential hypertrophic growth in the myocardium supplies the main forces that cause the heart tube to bend ventrally, while regional growth and cytoskeletal contraction in the omphalomesenteric veins (primitive atria) and compressive loads exerted by the splanchnopleuric membrane drive rightward torsion. A computational model based on realistic embryonic heart geometry is used to test the physical plausibility of this hypothesis. The behavior of the model is in reasonable agreement with available experimental data from control and perturbed embryos, offering support for our hypothesis. The results also suggest, however, that several other mechanisms contribute secondarily to normal looping, and we speculate that these mechanisms play backup roles when looping is perturbed. Finally, some outstanding questions are discussed for future study. PMID:25161623
Automatic Reconstruction of Spacecraft 3D Shape from Imagery
NASA Astrophysics Data System (ADS)
Poelman, C.; Radtke, R.; Voorhees, H.
We describe a system that computes the three-dimensional (3D) shape of a spacecraft from a sequence of uncalibrated, two-dimensional images. While the mathematics of multi-view geometry is well understood, building a system that accurately recovers 3D shape from real imagery remains an art. A novel aspect of our approach is the combination of algorithms from computer vision, photogrammetry, and computer graphics. We demonstrate our system by computing spacecraft models from imagery taken by the Air Force Research Laboratory's XSS-10 satellite and DARPA's Orbital Express satellite. Using feature tie points (each identified in two or more images), we compute the relative motion of each frame and the 3D location of each feature using iterative linear factorization followed by non-linear bundle adjustment. The "point cloud" that results from this traditional shape-from-motion approach is typically too sparse to generate a detailed 3D model. Therefore, we use the computed motion solution as input to a volumetric silhouette-carving algorithm, which constructs a solid 3D model based on viewpoint consistency with the image frames. The resulting voxel model is then converted to a facet-based surface representation and is texture-mapped, yielding realistic images from arbitrary viewpoints. We also illustrate other applications of the algorithm, including 3D mensuration and stereoscopic 3D movie generation.
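The silhouette-carving step can be sketched in its simplest, axis-aligned orthographic form. Real systems project voxels through the recovered camera poses; the sphere and the three fixed views below are purely illustrative:

```python
import numpy as np

n = 32
z, y, x = np.mgrid[0:n, 0:n, 0:n]
true_shape = (x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2 <= 10 ** 2

# Binary silhouettes of the object as seen along each axis
sil_z = true_shape.any(axis=0)   # (y, x) view
sil_y = true_shape.any(axis=1)   # (z, x) view
sil_x = true_shape.any(axis=2)   # (z, y) view

# Carving: a voxel survives only if it is viewpoint-consistent, i.e. its
# projection falls inside the silhouette in every view.  The result is
# the visual hull, a solid superset of the true shape.
carved = sil_z[None, :, :] & sil_y[:, None, :] & sil_x[:, :, None]
```

With more views the hull tightens around the object; the carved voxel model is then converted to facets and texture-mapped, as described above.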
An efficient framework for modeling clouds from Landsat8 images
NASA Astrophysics Data System (ADS)
Yuan, Chunqiang; Guo, Jing
2015-03-01
Clouds play an important role in creating realistic outdoor scenes for video game and flight simulation applications. Classic methods have been proposed for cumulus cloud modeling. However, these methods are not flexible for modeling large cloud scenes with hundreds of clouds, since the user must repeatedly model each cloud and adjust its various properties. This paper presents a meteorologically based method to reconstruct cumulus clouds from high-resolution Landsat8 satellite images. From these input satellite images, the clouds are first segmented from the background. Then, the cloud top surface is estimated from the temperature in the infrared image. After that, under a mild assumption of a flat base for each cumulus cloud, the base height of each cloud is computed by averaging the top height over pixels on the cloud edge. Then, the extinction is generated from the visible image. Finally, we enrich the initial shapes of the clouds using a fractal method and represent the recovered clouds as a particle system. The experimental results demonstrate that our method can yield realistic cloud scenes resembling those in the satellite images.
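The flat-base computation (averaging the top height over cloud-edge pixels) can be sketched directly; the synthetic dome-shaped cloud and its height values below are illustrative:

```python
import numpy as np

def cloud_base_height(mask, top_height):
    """Flat-base assumption: base height = mean of the top-surface heights
    over the cloud's edge pixels (mask pixels with a 4-neighbour outside
    the mask).  Assumes the cloud does not touch the image border, since
    np.roll wraps around."""
    interior = (mask & np.roll(mask, 1, 0) & np.roll(mask, -1, 0)
                     & np.roll(mask, 1, 1) & np.roll(mask, -1, 1))
    edge = mask & ~interior
    return float(top_height[edge].mean())

# Synthetic dome-shaped cloud: top surface rises toward the centre
rows, cols = np.mgrid[0:21, 0:21]
r2 = (rows - 10) ** 2 + (cols - 10) ** 2
mask = r2 <= 36
top = 2000.0 + np.where(mask, 1500.0 * (1.0 - r2 / 36.0), 0.0)  # metres
base = cloud_base_height(mask, top)
```

Because edge pixels sit where the dome meets its lowest top heights, the averaged value lands near the bottom of the cloud, which is exactly the behaviour the flat-base assumption relies on.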
A Transfer Hamiltonian Model for Devices Based on Quantum Dot Arrays
Illera, S.; Prades, J. D.; Cirera, A.; Cornet, A.
2015-01-01
We present a model of electron transport through a random distribution of interacting quantum dots embedded in a dielectric matrix to simulate realistic devices. The method underlying the model depends only on fundamental parameters of the system and it is based on the Transfer Hamiltonian approach. A set of noncoherent rate equations can be written and the interaction between the quantum dots and between the quantum dots and the electrodes is introduced by transition rates and capacitive couplings. A realistic modelization of the capacitive couplings, the transmission coefficients, the electron/hole tunneling currents, and the density of states of each quantum dot have been taken into account. The effects of the local potential are computed within the self-consistent field regime. While the description of the theoretical framework is kept as general as possible, two specific prototypical devices, an arbitrary array of quantum dots embedded in a matrix insulator and a transistor device based on quantum dots, are used to illustrate the kind of unique insight that numerical simulations based on the theory are able to provide. PMID:25879055
Ground Contact Modeling for the Morpheus Test Vehicle Simulation
NASA Technical Reports Server (NTRS)
Cordova, Luis
2014-01-01
The Morpheus vertical test vehicle is an autonomous robotic lander being developed at Johnson Space Center (JSC) to test hazard detection technology. Because the initial ground contact simulation model was not very realistic, it was decided to improve the model without making it too computationally expensive. The first development cycle added capability to define vehicle attachment points (AP) and to keep track of their states in the lander reference frame (LFRAME). These states are used with a spring damper model to compute an AP contact force. The lateral force is then overwritten, if necessary, by the Coulomb static or kinetic friction force. The second development cycle added capability to use the PolySurface class as the contact surface. The class can load CAD data in STL (Stereo Lithography) format, and use the data to compute line of sight (LOS) intercepts. A polygon frame (PFRAME) is computed from the facet intercept normal and used to convert the AP state to PFRAME. Three flat plane tests validate the transitions from kinetic to static, static to kinetic, and vertical impact. The hazardous terrain test will be used to test for visual reasonableness. The improved model is numerically inexpensive, robust, and produces results that are reasonable.
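The spring-damper normal force with the Coulomb lateral overwrite can be sketched as follows. Gains, friction coefficients, and the stiction velocity threshold are illustrative values, not the Morpheus tunings, and forces are expressed in a polygon frame (PFRAME) with z along the facet normal:

```python
import numpy as np

def contact_force(pen, pen_rate, f_lat_applied, v_lat,
                  k=4.0e4, c=3.0e3, mu_s=0.8, mu_k=0.6, v_stick=1e-3):
    """Contact force at one attachment point, in PFRAME [fx, fy, fz].
    Normal: spring-damper on penetration depth (pen_rate > 0 while
    sinking), clamped to be non-adhesive.  Lateral: overwritten by
    static friction (cancels the applied lateral force up to mu_s*N)
    or kinetic friction (opposes sliding velocity at mu_k*N)."""
    if pen <= 0.0:
        return np.zeros(3)                       # not in contact
    fn = max(k * pen + c * pen_rate, 0.0)
    v = np.asarray(v_lat, dtype=float)
    speed = np.hypot(v[0], v[1])
    if speed < v_stick:                          # static regime
        f_app = np.asarray(f_lat_applied, dtype=float)
        mag = np.hypot(f_app[0], f_app[1])
        f_lat = -f_app * min(1.0, mu_s * fn / mag) if mag > 0 else np.zeros(2)
    else:                                        # kinetic regime
        f_lat = -mu_k * fn * v / speed
    return np.array([f_lat[0], f_lat[1], fn])

# Static case: a small applied lateral force is fully cancelled
f_static = contact_force(0.01, 0.0, (50.0, 0.0), (0.0, 0.0))
# Kinetic case: sliding at 1 m/s in +x gets mu_k * N opposing it
f_kinetic = contact_force(0.01, 0.0, (0.0, 0.0), (1.0, 0.0))
```

The static/kinetic branch mirrors the transitions validated by the flat-plane tests: stick-to-slip occurs when the applied lateral force exceeds mu_s*N, and slip-to-stick when the sliding speed falls below the threshold.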
NASA Astrophysics Data System (ADS)
Tran, Thang H.; Baba, Yoshihiro; Somu, Vijaya B.; Rakov, Vladimir A.
2017-12-01
The finite difference time domain (FDTD) method in the 2-D cylindrical coordinate system was used to compute the nearly full-frequency-bandwidth vertical electric field and azimuthal magnetic field waveforms produced on the ground surface by lightning return strokes. The lightning source was represented by the modified transmission-line model with linear current decay with height, which was implemented in the FDTD computations as an appropriate vertical phased-current-source array. The conductivity of the atmosphere was assumed to increase exponentially with height, with different conductivity profiles being used for daytime and nighttime conditions. The fields were computed at distances ranging from 50 to 500 km. Sky waves (reflections from the ionosphere) were identified in computed waveforms and used for estimation of apparent ionospheric reflection heights. It was found that our model reproduces reasonably well the daytime electric field waveforms measured at different distances and simulated (using a more sophisticated propagation model) by Qin et al. (2017). Sensitivity of model predictions to changes in the parameters of the atmospheric conductivity profile, as well as influences of the lightning source characteristics (current waveshape parameters, return-stroke speed, and channel length) and ground conductivity, were examined.
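The exponentially increasing conductivity profile assumed above can be sketched as sigma(z) = sigma0 * exp(z / H). The ground value and scale height below are illustrative placeholders, not the paper's fitted daytime or nighttime profiles.

```python
import numpy as np

def conductivity(z_km, sigma0=5.0e-14, scale_km=6.0):
    """Atmospheric conductivity (S/m) at height z_km; parameters are assumed."""
    return sigma0 * np.exp(np.asarray(z_km, dtype=float) / scale_km)

# sample the profile from the ground up to 80 km in 10-km steps
profile = conductivity(np.arange(0.0, 90.0, 10.0))
```

In an FDTD grid, a value of this profile would be assigned to each cell at its height before time stepping.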
Development and analysis of a finite element model to simulate pulmonary emphysema in CT imaging.
Diciotti, Stefano; Nobis, Alessandro; Ciulli, Stefano; Landini, Nicholas; Mascalchi, Mario; Sverzellati, Nicola; Innocenti, Bernardo
2015-01-01
In CT imaging, pulmonary emphysema appears as lung regions with Low-Attenuation Areas (LAA). In this study we propose a finite element (FE) model of lung parenchyma, based on a 2-D grid of beam elements, which simulates smoking-related pulmonary emphysema in CT imaging. Simulated LAA images were generated through space sampling of the model output. We employed two measurements of emphysema extent: Relative Area (RA) and the exponent D of the cumulative distribution function of LAA cluster sizes. The model has been used to compare RA and D computed on the simulated LAA images with those computed on the model's output. Different mesh element sizes and various model parameters, simulating different physiological/pathological conditions, have been considered and analyzed. A proper mesh element size has been determined as the best trade-off between reliable results and reasonable computational cost. Both RA and D computed on simulated LAA images were underestimated with respect to those calculated on the model's output. Such underestimations were larger for RA (≈ -44 to -26%) than for D (≈ -16 to -2%). Our FE model could be useful for generating standard test images and for designing realistic physical phantoms of LAA images for assessing the accuracy of descriptors for quantifying emphysema in CT imaging.
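The two descriptors above can be computed on a binary LAA mask roughly as follows. The power-law exponent here comes from a simple log-log least-squares fit of the cumulative cluster-size counts; the paper's exact fitting procedure may differ, and the example mask is synthetic.

```python
import numpy as np
from scipy import ndimage

def relative_area(laa_mask):
    """RA: fraction of pixels flagged as low attenuation."""
    return float(laa_mask.mean())

def cluster_exponent(laa_mask):
    """D: negative log-log slope of the cumulative LAA cluster-size counts."""
    labels, n = ndimage.label(laa_mask)
    sizes = np.sort(ndimage.sum(laa_mask, labels, index=range(1, n + 1)))
    y = np.arange(n, 0, -1)   # number of clusters with size >= sizes[i]
    slope, _ = np.polyfit(np.log(sizes), np.log(y), 1)
    return -slope

# synthetic mask with four well-separated clusters of sizes 1, 2, 4, 8
mask = np.zeros((12, 12), dtype=int)
mask[0, 0] = 1
mask[3, 0:2] = 1
mask[6, 0:4] = 1
mask[9, 0:8] = 1
ra = relative_area(mask)
d = cluster_exponent(mask)
```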
NASA Technical Reports Server (NTRS)
Lilley, D. G.; Rhode, D. L.
1982-01-01
A primitive pressure-velocity variable finite difference computer code was developed to predict swirling recirculating inert turbulent flows in axisymmetric combustors in general, and for application to a specific idealized combustion chamber with sudden or gradual expansion. The technique involves a staggered grid system for axial and radial velocities, a line relaxation procedure for efficient solution of the equations, a two-equation k-epsilon turbulence model, a stairstep boundary representation of the expansion flow, and realistic accommodation of swirl effects. A user's manual, dealing with the computational problem, showing how the mathematical basis and computational scheme may be translated into a computer program is presented. A flow chart, FORTRAN IV listing, notes about various subroutines and a user's guide are supplied as an aid to prospective users of the code.
Transonic Flow Field Analysis for Wing-Fuselage Configurations
NASA Technical Reports Server (NTRS)
Boppe, C. W.
1980-01-01
A computational method for simulating the aerodynamics of wing-fuselage configurations at transonic speeds is developed. The finite difference scheme is characterized by a multiple embedded mesh system coupled with a modified or extended small disturbance flow equation. This approach permits a high degree of computational resolution in addition to coordinate system flexibility for treating complex realistic aircraft shapes. To augment the analysis method and permit applications to a wide range of practical engineering design problems, an arbitrary fuselage geometry modeling system is incorporated as well as methodology for computing wing viscous effects. Configuration drag is broken down into its friction, wave, and lift induced components. Typical computed results for isolated bodies, isolated wings, and wing-body combinations are presented. The results are correlated with experimental data. A computer code which employs this methodology is described.
High-performance computing on GPUs for resistivity logging of oil and gas wells
NASA Astrophysics Data System (ADS)
Glinskikh, V.; Dudaev, A.; Nechaev, O.; Surodina, I.
2017-10-01
We developed and implemented into software an algorithm for high-performance simulation of electrical logs from oil and gas wells using high-performance heterogeneous computing. The numerical solution of the 2D forward problem is based on the finite-element method and the Cholesky decomposition for solving a system of linear algebraic equations (SLAE). Software implementations of the algorithm were made using NVIDIA CUDA technology and computing libraries, allowing us to perform the decomposition of the SLAE and find its solution on the central processing unit (CPU) and the graphics processing unit (GPU). The calculation time is analyzed as a function of the matrix size and the number of its non-zero elements. We estimated the computing speed on CPU and GPU, including high-performance heterogeneous CPU-GPU computing. Using the developed algorithm, we simulated resistivity data in realistic models.
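The algorithmic core above — a symmetric positive-definite FEM-style system solved by Cholesky decomposition — can be illustrated on the CPU with SciPy. The paper's code works on sparse matrices on the GPU via CUDA libraries; this sketch only mirrors the algorithmic idea on a small dense system.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def solve_spd(A, b):
    """Solve A x = b for SPD A via Cholesky: factor once, then two triangular solves."""
    c, low = cho_factor(A)          # A = L L^T
    return cho_solve((c, low), b)

# small SPD example: 1-D Laplacian stiffness matrix (tridiagonal -1, 2, -1)
n = 5
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = solve_spd(A, b)
```

For repeated logging simulations with the same matrix structure, the factorization can be reused across right-hand sides, which is where most of the speedup comes from.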
Challenges in reducing the computational time of QSTS simulations for distribution system analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.
The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
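The scale of the burden described above follows from simple arithmetic: a yearlong run at 1-second resolution means one sequential power flow per simulated second. The per-step solve times below are illustrative assumptions, not measurements from the report.

```python
# Number of sequential power flows in a yearlong, 1-s-resolution QSTS run
SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000 time steps

def total_hours(ms_per_power_flow):
    """Wall-clock hours for one yearlong QSTS simulation at 1-s resolution."""
    return SECONDS_PER_YEAR * ms_per_power_flow / 1000.0 / 3600.0
```

Even a 1 ms power-flow solve accumulates to roughly 8.8 hours; a few milliseconds per step lands squarely in the 10 to 120 hour range quoted above.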
High performance MRI simulations of motion on multi-GPU systems
2014-01-01
Background: MRI physics simulators have been developed in the past for optimizing imaging protocols and for training purposes. However, these simulators have only addressed motion within a limited scope. The purpose of this study was the incorporation of realistic motion, such as cardiac motion, respiratory motion and flow, within MRI simulations in a high performance multi-GPU environment. Methods: Three different motion models were introduced in the Magnetic Resonance Imaging SIMULator (MRISIMUL) of this study: cardiac motion, respiratory motion and flow. Simulation of a simple Gradient Echo pulse sequence and a CINE pulse sequence on the corresponding anatomical model was performed. Myocardial tagging was also investigated. In pulse sequence design, software crushers were introduced to accommodate the long execution times and to avoid the formation of spurious echoes. The displacement of the anatomical model isochromats was calculated within the Graphics Processing Unit (GPU) kernel for every timestep of the pulse sequence. Experiments that would allow simulation of custom anatomical and motion models were also performed. Last, simulations of motion with MRISIMUL on single-node and multi-node multi-GPU systems were examined. Results: Gradient Echo and CINE images of the three motion models were produced and motion-related artifacts were demonstrated. The temporal evolution of the contractility of the heart was presented through the application of myocardial tagging. Better simulation performance and image quality were obtained through the introduction of software crushers without the need to further increase the computational load and GPU resources. Last, MRISIMUL demonstrated an almost linearly scalable performance with the increasing number of available GPU cards, in both single-node and multi-node multi-GPU computer systems.
Conclusions: MRISIMUL is the first MR physics simulator to have implemented motion with a large 3D computational load on a single-computer multi-GPU configuration. The incorporation of realistic motion models, such as cardiac motion, respiratory motion and flow, may benefit the design and optimization of existing or new MR pulse sequences, protocols and algorithms that examine motion-related MR applications. PMID:24996972
Material parameter computation for multi-layered vocal fold models.
Schmidt, Bastian; Stingl, Michael; Leugering, Günter; Berry, David A; Döllinger, Michael
2011-04-01
Today, the prevention and treatment of voice disorders is an ever-increasing health concern. Since many occupations rely on verbal communication, vocal health is necessary just to maintain one's livelihood. Commonly applied models to study vocal fold vibrations and air flow distributions are self-sustained physical models of the larynx composed of artificial silicone vocal folds. Choosing appropriate mechanical parameters for these vocal fold models while considering simplifications due to manufacturing restrictions is difficult but crucial for achieving realistic behavior. In the present work, a combination of experimental and numerical approaches to compute material parameters for synthetic vocal fold models is presented. The material parameters are derived from deformation behaviors of excised human larynges. The resulting deformations are used as reference displacements for a tracking functional to be optimized. Material optimization was applied to three-dimensional vocal fold models based on isotropic and transverse-isotropic material laws, considering both a layered model with homogeneous material properties on each layer and an inhomogeneous model. The best results were obtained with a transverse-isotropic, inhomogeneous (i.e., not manufacturable) model. For the homogeneous model (three layers), the transverse-isotropic material parameters were also computed for each layer, yielding deformations similar to the measured human vocal fold deformations.
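The tracking-functional idea above — find material parameters that make a forward model's displacements match reference displacements in a least-squares sense — can be sketched with a toy surrogate. The one-parameter forward model and all numbers below are illustrative stand-ins for the paper's 3-D finite-element larynx model.

```python
import numpy as np
from scipy.optimize import minimize

u_ref = np.array([0.10, 0.25, 0.45])   # "measured" reference displacements (mm)
loads = np.array([1.0, 2.5, 4.5])      # applied loads (N)

def forward(p):
    """Toy forward model: displacement = load / stiffness."""
    stiffness, = p
    return loads / stiffness

def tracking_functional(p):
    """Sum of squared mismatches between model and reference displacements."""
    return np.sum((forward(p) - u_ref) ** 2)

res = minimize(tracking_functional, x0=[5.0], bounds=[(0.1, 100.0)])
stiffness_opt = res.x[0]
```

The real problem replaces `forward` with an FE solve and `p` with layer-wise (an)isotropic material constants, but the optimization loop has the same shape.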
How do microalgae perceive light in a high-rate pond? Towards more realistic Lagrangian experiments.
Demory, David; Combe, Charlotte; Hartmann, Philipp; Talec, Amélie; Pruvost, Eric; Hamouda, Raouf; Souillé, Fabien; Lamare, Pierre-Olivier; Bristeau, Marie-Odile; Sainte-Marie, Jacques; Rabouille, Sophie; Mairet, Francis; Sciandra, Antoine; Bernard, Olivier
2018-05-01
Hydrodynamics in a high-rate production reactor for microalgae cultivation affects the light history perceived by cells. The interplay between cell movement and medium turbidity leads to a complex light pattern, whose forcing effects on photosynthesis and photoacclimation dynamics are non-trivial. Hydrodynamics of high density algal ponds mixed by a paddle wheel has been studied recently, although the focus has never been on describing its impact on photosynthetic growth efficiency. In this multidisciplinary downscaling study, we first reconstructed single cell trajectories in an open raceway using an original hydrodynamical model offering a powerful discretization of the Navier-Stokes equations tailored to systems with free surfaces. The trajectory of a particular cell was selected and the associated high-frequency light pattern was computed. This light pattern was then experimentally reproduced in an Arduino-driven computer controlled cultivation system with a low density Dunaliella salina culture. The effect on growth and pigment content was recorded for various frequencies of the light pattern, by setting different paddle wheel velocities. Results show that the frequency of this realistic signal plays a decisive role in the dynamics of photosynthesis, thus revealing an unexpected photosynthetic response compared to that recorded under the on/off signals usually used in the literature. Indeed, the light received by a single cell contains signals from low to high frequencies that nonlinearly interact with the photosynthesis process and differentially stimulate the various time scales associated with photoacclimation and energy dissipation. This study highlights the need for experiments with more realistic light stimuli to better understand microalgal growth at high cell densities. An experimental protocol is also proposed, with simple, yet more realistic, step functions for light fluctuations.
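A Lagrangian light signal of the kind described above can be sketched by attenuating surface irradiance along a cell's depth trajectory with a Beer-Lambert law. The trajectory here is a synthetic vertical oscillation standing in for the hydrodynamic-model output, and the irradiance and attenuation values are illustrative assumptions.

```python
import numpy as np

I0 = 1500.0    # surface irradiance, umol photons m^-2 s^-1 (assumed)
K_ATT = 30.0   # attenuation coefficient of the dense culture, m^-1 (assumed)

def light_history(depth_m, I0=I0, k=K_ATT):
    """Irradiance perceived by a cell following the given depth trajectory."""
    return I0 * np.exp(-k * np.asarray(depth_m))

t = np.linspace(0.0, 60.0, 601)                      # 60 s at 0.1 s resolution
depth = 0.15 + 0.14 * np.sin(2 * np.pi * t / 10.0)   # cell cycling 1-29 cm deep
signal = light_history(depth)
```

A signal like `signal` is what would then be replayed to the culture by the computer-controlled light source.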
Adiabatic Quantum Computing via the Rydberg Blockade
NASA Astrophysics Data System (ADS)
Keating, Tyler; Goyal, Krittika; Deutsch, Ivan
2012-06-01
We study an architecture for implementing adiabatic quantum computation with trapped neutral atoms. Ground state atoms are dressed by laser fields in a manner conditional on the Rydberg blockade mechanism, thereby providing the requisite entangling interactions. As a benchmark we study the performance of a Quadratic Unconstrained Binary Optimization (QUBO) problem whose solution is found in the ground state spin configuration of an Ising-like model. We model a realistic architecture, including the effects of magnetic level structure, with qubits encoded into the clock states of 133Cs, effective B-fields implemented through microwaves and light shifts, and atom-atom coupling achieved by excitation to a high-lying Rydberg level. Including the fundamental effects of photon scattering we find a high fidelity for the two-qubit implementation.
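For small instances, the QUBO benchmark above — minimize x^T Q x over binary x — can be checked classically by brute force. The paper finds this minimum adiabatically via Rydberg-dressed interactions; this sketch just defines the problem and its ground state, with an arbitrary illustrative Q.

```python
import itertools
import numpy as np

def qubo_ground_state(Q):
    """Exhaustively minimize x^T Q x over x in {0,1}^n (feasible for small n)."""
    n = Q.shape[0]
    best_x, best_e = None, np.inf
    for bits in itertools.product((0, 1), repeat=n):
        x = np.array(bits)
        e = x @ Q @ x
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# small 3-variable instance with negative on-site terms and a penalty coupling
Q = np.array([[-1.0, 2.0, 0.0],
              [0.0, -1.0, 2.0],
              [0.0, 0.0, -1.0]])
x_opt, e_opt = qubo_ground_state(Q)
```

The adiabatic device's output would be compared against `x_opt` to assess fidelity; the brute-force cost is 2^n, which is exactly the scaling the quantum approach aims to beat.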
Cubical Mass-Spring Model design based on a tensile deformation test and nonlinear material model.
San-Vicente, Gaizka; Aguinaga, Iker; Tomás Celigüeta, Juan
2012-02-01
Mass-Spring Models (MSMs) are used to simulate the mechanical behavior of deformable bodies such as soft tissues in medical applications. Although they are fast to compute, they lack accuracy, and their design still remains a great challenge. The major difficulties in building realistic MSMs lie in spring stiffness estimation and topology identification. In this work, the mechanical behavior of MSMs under tensile loads is analyzed before studying the spring stiffness estimation. In particular, the qualitative and quantitative analysis of the behavior of cubical MSMs shows that they have a nonlinear response similar to hyperelastic material models. According to this behavior, a new method for spring stiffness estimation valid for linear and nonlinear material models is proposed. This method adjusts the stress-strain and compressibility curves to a given reference behavior. The accuracy of the MSMs designed with this method is tested by taking as reference some soft-tissue simulations based on the nonlinear Finite Element Method (FEM). The obtained results show that MSMs can be designed to realistically model the behavior of hyperelastic materials such as soft tissues and can become an interesting alternative to other approaches such as nonlinear FEM.
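The basic mechanics of an MSM can be sketched with a single spring under a tensile load, stepped with explicit Euler. The stiffness here is a plain linear constant with illustrative values; the method in the text instead estimates stiffnesses so that the network reproduces nonlinear hyperelastic reference curves.

```python
K = 100.0   # spring stiffness, N/m (assumed, linear)
M = 0.01    # node mass, kg (assumed)
C = 0.5     # velocity damping, N*s/m (assumed)
L0 = 1.0    # spring rest length, m
DT = 1e-4   # time step, s

def step(x, v, f_ext):
    """Advance the free node of one spring by a single explicit-Euler step."""
    f_spring = -K * (x - L0)              # linear restoring force
    a = (f_spring - C * v + f_ext) / M    # Newton's second law
    return x + DT * v, v + DT * a

# pull the node with a constant 5 N tensile load until it settles
x, v = L0, 0.0
for _ in range(200000):   # 20 s of simulated time
    x, v = step(x, v, 5.0)
```

At equilibrium the spring force balances the load, so the node settles at L0 + 5/K = 1.05 m; a cubical MSM repeats this update over every node and spring of the lattice.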
Fitting Neuron Models to Spike Trains
Rossant, Cyrille; Goodman, Dan F. M.; Fontaine, Bertrand; Platkiewicz, Jonathan; Magnusson, Anna K.; Brette, Romain
2011-01-01
Computational modeling is increasingly used to understand the function of neural circuits in systems neuroscience. These studies require models of individual neurons with realistic input–output properties. Recently, it was found that spiking models can accurately predict the precisely timed spike trains produced by cortical neurons in response to somatically injected currents, if properly fitted. This requires fitting techniques that are efficient and flexible enough to easily test different candidate models. We present a generic solution, based on the Brian simulator (a neural network simulator in Python), which allows the user to define and fit arbitrary neuron models to electrophysiological recordings. It relies on vectorization and parallel computing techniques to achieve efficiency. We demonstrate its use on neural recordings in the barrel cortex and in the auditory brainstem, and confirm that simple adaptive spiking models can accurately predict the response of cortical neurons. Finally, we show how a complex multicompartmental model can be reduced to a simple effective spiking model. PMID:21415925
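The kind of model-to-data comparison described above can be sketched with a plain leaky integrate-and-fire (LIF) neuron driven by an injected current, scoring predicted spikes against reference spikes by coincidence counting within a small window. The paper does this with the Brian simulator and fitted models on real recordings; the parameters below are illustrative assumptions.

```python
import numpy as np

DT = 1e-4                     # time step, s
TAU = 0.02                    # membrane time constant, s (assumed)
R = 1e8                       # membrane resistance, ohm (assumed)
V_TH, V_RESET = 0.015, 0.0    # spike threshold / reset, V (assumed)

def lif_spikes(current, dt=DT):
    """Spike times of an LIF neuron driven by the sampled input current (A)."""
    v, spikes = 0.0, []
    for i, I in enumerate(current):
        v += dt / TAU * (-v + R * I)   # forward-Euler membrane update
        if v >= V_TH:
            spikes.append(i * dt)
            v = V_RESET
    return np.array(spikes)

def coincidences(pred, ref, window=2e-3):
    """Predicted spikes landing within `window` of some reference spike."""
    return sum(np.any(np.abs(ref - t) <= window) for t in pred)

I_inj = np.full(int(0.2 / DT), 2e-10)   # 200 ms, 0.2 nA current step
pred = lif_spikes(I_inj)
```

A fitting procedure would adjust TAU, R and V_TH to maximize the coincidence count against recorded spike trains, which is essentially what the vectorized Brian-based toolbox automates over many candidate parameter sets.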
A preliminary theoretical line-blanketed model solar photosphere
NASA Technical Reports Server (NTRS)
Kurucz, R. L.
1974-01-01
In the theoretical approach to model-atmosphere construction, all opacities are computed theoretically and the temperature-pressure structure is determined by conservation of energy. Until recently, this has not been a very useful method for later-type stars, because the line opacity was both poorly known and difficult to calculate. However, methods have now been developed that are capable of representing the line opacity well enough for construction of realistic models. A preliminary theoretical solar model is presented that produces closer agreement with observation than has been heretofore possible. The qualitative advantages and shortcomings of this model are discussed and projected improvements are outlined.
Modeling Computer Communication Networks in a Realistic 3D Environment
2010-03-01
Time-optimal aircraft pursuit-evasion with a weapon envelope constraint
NASA Technical Reports Server (NTRS)
Menon, P. K. A.; Duke, E. L.
1990-01-01
The optimal pursuit-evasion problem between two aircraft, including nonlinear point-mass vehicle models and a realistic weapon envelope, is analyzed. Using a linear combination of flight time and the square of the vehicle acceleration as the performance index, a closed-form solution is obtained in nonlinear feedback form. Due to its modest computational requirements, this guidance law can be used for onboard real-time implementation.
Kupczik, Kornelius; Stark, Heiko; Mundry, Roger; Neininger, Fabian T; Heidlauf, Thomas; Röhrle, Oliver
2015-10-07
Skeletal muscle models are used to investigate motion and force generation in both biological and bioengineering research. Yet, they often lack a realistic representation of the muscle's internal architecture which is primarily composed of muscle fibre bundles, known as fascicles. Recently, it has been shown that fascicles can be resolved with micro-computed tomography (µCT) following staining of the muscle tissue with iodine potassium iodide (I2KI). Here, we present the reconstruction of the fascicular spatial arrangement and geometry of the superficial masseter muscle of a dog based on a combination of pattern recognition and streamline computation. A cadaveric head of a dog was incubated in I2KI and µCT-scanned. Following segmentation of the masseter muscle a statistical pattern recognition algorithm was applied to create a vector field of fascicle directions. Streamlines were then used to transform the vector field into a realistic muscle fascicle representation. The lengths of the reconstructed fascicles and the pennation angles in two planes (frontal and sagittal) were extracted and compared against a tracked fascicle field obtained through cadaver dissection. Both fascicle lengths and angles were found to vary substantially within the muscle confirming the complex and heterogeneous nature of skeletal muscle described by previous studies. While there were significant differences in the pennation angle between the experimentally derived and µCT-reconstructed data, there was congruence in the fascicle lengths. We conclude that the presented approach allows for embedding realistic fascicle information into finite element models of skeletal muscles to better understand the functioning of the musculoskeletal system.
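The streamline step above — integrating a point through a field of fascicle directions — can be sketched with a fixed-step midpoint (RK2) integrator. The direction field here is a synthetic constant stand-in; the paper derives its field from the I2KI-stained µCT data via pattern recognition.

```python
import numpy as np

def direction(p):
    """Unit fascicle direction at point p (synthetic: constant 30 deg in x-y)."""
    return np.array([np.cos(np.pi / 6), np.sin(np.pi / 6), 0.0])

def streamline(p0, step=0.1, n_steps=50):
    """Trace a streamline from p0 through the direction field with RK2 steps."""
    pts = [np.asarray(p0, dtype=float)]
    for _ in range(n_steps):
        p = pts[-1]
        k1 = direction(p)
        k2 = direction(p + 0.5 * step * k1)   # midpoint evaluation
        pts.append(p + step * k2)
    return np.array(pts)

line = streamline((0.0, 0.0, 0.0))
length = np.sum(np.linalg.norm(np.diff(line, axis=0), axis=1))
```

Fascicle length and pennation angles then fall out of each traced polyline: the length is the summed segment norms, and the angles come from the segment directions projected into the frontal and sagittal planes.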
Jacob, Richard E.; Kuprat, Andrew P.; Einstein, Daniel R.; Corley, Richard A.
2016-01-01
Context: Computational fluid dynamics (CFD) simulations of airflows coupled with physiologically-based pharmacokinetic (PBPK) modeling of respiratory tissue doses of airborne materials have traditionally used either steady-state inhalation or a sinusoidal approximation of the breathing cycle for airflow simulations despite their differences from normal breathing patterns. Objective: Evaluate the impact of realistic breathing patterns, including sniffing, on predicted nasal tissue concentrations of a reactive vapor that targets the nose in rats as a case study. Materials and methods: Whole-body plethysmography measurements from a free-breathing rat were used to produce profiles of normal breathing, sniffing, and combinations of both as flow inputs to CFD/PBPK simulations of acetaldehyde exposure. Results: For the normal measured ventilation profile, modest reductions in time- and tissue depth-dependent areas under the curve (AUC) acetaldehyde concentrations were predicted in the wet squamous, respiratory, and transitional epithelium along the main airflow path, while corresponding increases were predicted in the olfactory epithelium, especially the most distal regions of the ethmoid turbinates, versus the idealized profile. The higher amplitude/frequency sniffing profile produced greater AUC increases over the idealized profile in the olfactory epithelium, especially in the posterior region. Conclusions: The differences in tissue AUCs at known lesion-forming regions for acetaldehyde between normal and idealized profiles were minimal, suggesting that sinusoidal profiles may be used for this chemical and exposure concentration. However, depending upon the chemical, exposure system and concentration, and the time spent sniffing, the use of realistic breathing profiles—including sniffing—could become an important modulator for local tissue dose predictions. PMID:26986954
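The dose metric reported above, the area under the curve (AUC) of a tissue concentration time course, can be computed by trapezoidal integration. The concentration trace below is synthetic; the paper's traces come from the coupled CFD/PBPK simulation.

```python
import numpy as np

def auc(t, conc):
    """Trapezoidal area under the concentration curve conc(t)."""
    t, conc = np.asarray(t), np.asarray(conc)
    return float(np.sum(0.5 * (conc[1:] + conc[:-1]) * np.diff(t)))

t = np.linspace(0.0, 10.0, 1001)          # time, s
conc = 5.0 * (1.0 - np.exp(-t / 2.0))     # synthetic uptake curve, rising to plateau
exposure = auc(t, conc)
```

Comparing such AUCs, region by region and tissue depth by tissue depth, between the idealized and realistic breathing profiles is exactly the comparison summarized in the Results above.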
Computer simulation of heterogeneous polymer photovoltaic devices
NASA Astrophysics Data System (ADS)
Kodali, Hari K.; Ganapathysubramanian, Baskar
2012-04-01
Polymer-based photovoltaic devices have the potential for widespread usage due to their low cost per watt and mechanical flexibility. Efficiencies close to 9.0% have been achieved recently in conjugated polymer based organic solar cells (OSCs). These devices were fabricated using solvent-based processing of electron-donating and electron-accepting materials into the so-called bulk heterojunction (BHJ) architecture. Experimental evidence suggests that a key property determining the power-conversion efficiency of such devices is the final morphological distribution of the donor and acceptor constituents. In order to understand the role of morphology on device performance, we develop a scalable computational framework that efficiently interrogates OSCs to investigate relationships between the morphology at the nano-scale with the device performance. In this work, we extend the Buxton and Clarke model (2007 Modelling Simul. Mater. Sci. Eng. 15 13-26) to simulate realistic devices with complex active layer morphologies using a dimensionally independent, scalable, finite-element method. We incorporate all stages involved in current generation, namely (1) exciton generation and diffusion, (2) charge generation and (3) charge transport in a modular fashion. The numerical challenges encountered during interrogation of realistic microstructures are detailed. We compare each stage of the photovoltaic process for two microstructures: a BHJ morphology and an idealized sawtooth morphology. The results are presented for both two- and three-dimensional structures.
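The first stage listed above, exciton generation and diffusion, can be sketched in 1-D at steady state: D n'' - n/tau + G = 0 on a donor slab with exciton quenching (n = 0) at both boundaries, discretized with finite differences. All parameter values are illustrative assumptions, not the paper's device data.

```python
import numpy as np

D = 1.0e-7     # exciton diffusivity, m^2/s (assumed) -> diffusion length 5 nm
TAU = 2.5e-10  # exciton lifetime, s (assumed)
G = 1.0e28     # photogeneration rate, m^-3 s^-1 (assumed)
L = 20e-9      # donor slab thickness, m
N = 201        # grid points

x = np.linspace(0.0, L, N)
h = x[1] - x[0]
# tridiagonal system for interior nodes: (D/h^2)(n_{i-1} - 2 n_i + n_{i+1}) - n_i/tau = -G
main = -2.0 * D / h**2 - 1.0 / TAU
off = D / h**2
A = (np.diag(np.full(N - 2, main))
     + np.diag(np.full(N - 3, off), 1)
     + np.diag(np.full(N - 3, off), -1))
n = np.zeros(N)                                  # quenching: n = 0 at both walls
n[1:-1] = np.linalg.solve(A, np.full(N - 2, -G))
```

The exciton density saturates toward G*tau in the bulk and is pulled to zero at the interfaces, which is why donor domains much wider than the diffusion length waste absorbed photons — the morphology dependence the full model interrogates.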
Implementing Realistic Helicopter Physics in 3D Game Environments
2002-09-01
Many other video games include helicopters but omit realistic third-person helicopter behaviors in their applications. Blade element theory has generally been considered too computationally expensive for a PC-based video game, so at most some basic parts of it are present in any attempt to model realistic helicopter flight.
A method for the computational modeling of the physics of heart murmurs
NASA Astrophysics Data System (ADS)
Seo, Jung Hee; Bakhshaee, Hani; Garreau, Guillaume; Zhu, Chi; Andreou, Andreas; Thompson, William R.; Mittal, Rajat
2017-05-01
A computational method for direct simulation of the generation and propagation of blood-flow-induced sounds is proposed. This computational hemoacoustic method is based on the immersed boundary approach and employs high-order finite difference methods to resolve wave propagation and scattering accurately. The current method employs a two-step, one-way coupled approach for the sound generation and its propagation through the tissue. The blood flow is simulated by solving the incompressible Navier-Stokes equations using the sharp-interface immersed boundary method, and the equations governing the generation and propagation of the three-dimensional elastic wave that constitutes the murmur are resolved with a high-order, immersed-boundary-based finite-difference method in the time domain. The proposed method is applied to a model problem of aortic stenosis murmur, and the simulation results are verified and validated by comparing with known solutions as well as experimental measurements. The murmur propagation in a realistic model of a human thorax is also simulated using the computational method. The roles of hemodynamics and elastic wave propagation in the murmur are discussed based on the simulation results.
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.; Davis, Damek; Isaacson, Douglas R.
2012-01-01
A class of problems in air traffic management asks for a scheduling algorithm that supplies the air traffic services authority not only with a schedule of arrivals and departures, but also with speed advisories. Since advisories must be finite, a scheduling algorithm must ultimately produce a finite data set, hence must either start with a purely discrete model or involve a discretization of a continuous one. The former choice, often preferred for intuitive clarity, naturally leads to mixed-integer programs, hindering proofs of correctness and computational cost bounds (crucial for real-time operations). In this paper, a hybrid control system is used to model air traffic scheduling, capturing both the discrete and continuous aspects. This framework is applied to a class of problems, called the Fully Routed Nominal Problem. We prove a number of geometric results on feasible schedules and use these results to formulate an algorithm that attempts to compute a collective speed advisory that is effectively finite and has computational cost polynomial in the number of aircraft. This work is a first step toward optimization and models refined with more realistic detail.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moog, E. R.; Dejus, R. J.; Sasaki, S.
2017-01-01
Magnetic modeling was performed to estimate achievable magnetic field strengths of superconducting undulators (SCUs) and to compare them with those of cryogenically cooled permanent magnet undulators (CPMUs). Starting with vacuum (beam stay-clear) gaps of 4.0 and 6.0 mm, realistic allowances for beam chambers (in the SCU case) and beam liners (in the CPMU case) were added. (A 6.0-mm vacuum gap is planned for the upgraded APS.) The CPMU magnetic models consider both CPMUs that use NdFeB magnets at ~150 K and CPMUs that use PrFeB magnets at 77 K. Parameters of the magnetic models are presented along with fitted coefficients of a Halbach-type expression for the field dependence on the gap-to-period ratio. Field strengths for SCUs are estimated using a scaling law for planar SCUs, for which an equation is given. The SCUs provide higher magnetic fields than the highest-field CPMUs (those using PrFeB at 77 K) for period lengths longer than ~14 mm for NbTi-based SCUs and ~10 mm for Nb3Sn-based SCUs. To show that the model calculations and scaling-law results are realistic, they are compared to CPMUs and NbTi-based SCUs that have already been built. Brightness tuning curves of CPMUs (PrFeB) and SCUs (NbTi) for the upgraded APS lattice are also provided for realistic period lengths.
Tsunami Modeling to Validate Slip Models of the 2007 Mw 8.0 Pisco Earthquake, Central Peru
NASA Astrophysics Data System (ADS)
Ioualalen, M.; Perfettini, H.; Condo, S. Yauri; Jimenez, C.; Tavera, H.
2013-03-01
Following the August 15, 2007, Mw 8.0 Pisco earthquake in central Peru, Sladen et al. (J Geophys Res 115:B02405, 2010) derived several slip models of this event. They inverted teleseismic data together with geodetic (InSAR) measurements to look for the co-seismic slip distribution on the fault plane, considering those data sets separately or jointly. But how close to the real slip distribution are those inverted slip models? To answer this crucial question, the authors generated tsunami records based on their slip models and compared them to DART buoy tsunami records and available runup data. Such an approach requires a robust and accurate tsunami model (non-linear, dispersive, with accurate bathymetry and topography, etc.); otherwise the differences between the data and the model may be attributed to the slip models themselves even though they actually arise from an incomplete tsunami simulation. The accuracy of a numerical tsunami simulation depends strongly, among other factors, on two important constraints: (i) a fine computational grid (and thus the bathymetry and topography data sets used), which is unfortunately not always available, and (ii) a realistic tsunami propagation model that includes dispersion. Here, we extend Sladen's work using newly available data, namely a tide gauge record at Callao (Lima harbor) and the Chilean DART buoy record, while considering a complete set of runup data along with a more realistic tsunami numerical model that accounts for dispersion, and also a fine-resolution computational grid, which is essential. Through these accurate numerical simulations we infer that the InSAR-based model is in better agreement with the tsunami data; the case of the Pisco earthquake thus indicates that geodetic data seem essential for recovering the final co-seismic slip distribution on the rupture plane.
Slip models based on teleseismic data are unable to describe the observed tsunami, suggesting that a significant amount of co-seismic slip may have been aseismic. Finally, we compute the runup distribution along the central part of the Peruvian coast to better understand the wave amplification/attenuation processes of the tsunami generated by the Pisco earthquake.
Computer-based learning in neuroanatomy: A longitudinal study of learning, transfer, and retention
NASA Astrophysics Data System (ADS)
Chariker, Julia H.
A longitudinal experiment was conducted to explore computer-based learning of neuroanatomy. Using a realistic 3D graphical model of neuroanatomy, and sections derived from the model, exploratory graphical tools were integrated into interactive computer programs to allow adaptive exploration. Seventy-two participants learned either sectional anatomy alone or whole anatomy followed by sectional anatomy. Sectional anatomy was explored either in perceptually continuous animation or discretely, as in the use of an anatomical atlas. Learning was measured longitudinally to a high performance criterion. After learning, transfer to biomedical images and long-term retention were tested. Learning whole anatomy before sectional anatomy led to a more efficient learning experience. Learners demonstrated high levels of transfer from whole anatomy to sectional anatomy and from sectional anatomy to complex biomedical images. All learning groups demonstrated high levels of retention at 2-3 weeks.
Status and prospects of computational fluid dynamics for unsteady transonic viscous flows
NASA Technical Reports Server (NTRS)
Mccroskey, W. J.; Kutler, P.; Bridgeman, J. O.
1984-01-01
Applications of computational aerodynamics to aeronautical research, design, and analysis have increased rapidly over the past decade, and these applications offer significant benefits to aeroelasticians. The past developments are traced by means of a number of specific examples, and the trends are projected over the next several years. The crucial factors that limit the present capabilities for unsteady analyses are identified; they include computer speed and memory, algorithm and solution methods, grid generation, turbulence modeling, vortex modeling, data processing, and coupling of the aerodynamic and structural dynamic analyses. The prospects for overcoming these limitations are presented, and many improvements appear to be readily attainable. If so, a complete and reliable numerical simulation of the unsteady, transonic viscous flow around a realistic fighter aircraft configuration could become possible within the next decade. The possibilities of using artificial intelligence concepts to hasten the achievement of this goal are also discussed.
The Influence of Realistic Reynolds Numbers on Slat Noise Simulations
NASA Technical Reports Server (NTRS)
Lockard, David P.; Choudhari, Meelan M.
2012-01-01
The slat noise from the 30P/30N high-lift system has been computed using a computational fluid dynamics code in conjunction with a Ffowcs Williams-Hawkings solver. Varying the Reynolds number from 1.71 to 12.0 million based on the stowed chord resulted in slight changes in the radiated noise. Tonal features in the spectra were robust and evident for all Reynolds numbers and even when a spanwise flow was imposed. The general trends observed in near-field fluctuations were also similar for all the different Reynolds numbers. Experiments on simplified, subscale high-lift systems have exhibited noticeable dependencies on the Reynolds number and tripping, although primarily for tonal features rather than the broadband portion of the spectra. Either the 30P/30N model behaves differently, or the computational model is unable to capture these effects. Hence, the results underscore the need for more detailed measurements of the slat cove flow.
Nagaoka, Tomoaki; Watanabe, Soichi
2012-01-01
Electromagnetic simulation with an anatomically realistic computational human model using the finite-difference time-domain (FDTD) method has recently been performed in a number of fields in biomedical engineering. To improve the method's calculation speed and realize large-scale computing with the computational human model, we adapted three-dimensional FDTD code to a multi-GPU cluster environment with Compute Unified Device Architecture and Message Passing Interface. Our multi-GPU cluster system consists of three nodes, with seven GPU boards (NVIDIA Tesla C2070) mounted on each node. We examined the performance of the FDTD calculation in this multi-GPU cluster environment. We confirmed that the FDTD calculation on the multi-GPU cluster is faster than that on a multi-GPU single workstation, and we also found that the GPU cluster system calculates faster than a vector supercomputer. In addition, our GPU cluster system allowed us to perform large-scale FDTD calculations because we were able to use over 100 GB of GPU memory.
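The core of any FDTD solver, including the GPU-accelerated one described above, is a leapfrog update of electric and magnetic fields on staggered grids. The following one-dimensional sketch is our illustration of that update scheme only; the authors' code is three-dimensional and uses an anatomical human model.

```python
import numpy as np

def fdtd_1d(nx=200, nt=300, courant=0.5):
    """Minimal 1-D FDTD (Yee) sketch: E and H live on staggered grids and
    are updated in leapfrog fashion. Grid size, step count, and the soft
    Gaussian source are illustrative choices."""
    ez = np.zeros(nx)        # electric field at integer grid points
    hy = np.zeros(nx - 1)    # magnetic field at half-integer grid points
    for n in range(nt):
        hy += courant * np.diff(ez)          # H update from the curl of E
        ez[1:-1] += courant * np.diff(hy)    # E update from the curl of H
        ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft source
    return ez

field = fdtd_1d()
```

In a multi-GPU version, each device updates one subdomain and exchanges a single layer of boundary field values with its neighbors (e.g. via MPI) at every time step, which is why the method scales well across a cluster.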
[Virtual reality in ophthalmological education].
Wagner, C; Schill, M; Hennen, M; Männer, R; Jendritza, B; Knorz, M C; Bender, H J
2001-04-01
We present a computer-based medical training workstation for the simulation of intraocular eye surgery. The surgeon manipulates two original instruments inside a mechanical model of the eye. The instrument positions are tracked by CCD cameras and monitored by a PC, which renders the scenery using a computer-graphic model of the eye and the instruments. The simulator incorporates a model of the operation table, a mechanical eye, three CCD cameras for position tracking, the stereo display, and a computer. The three cameras are mounted under the operation table, from where they can observe the interior of the mechanical eye. Using small markers, the cameras recognize the instruments and the eye; their position and orientation in space are determined by stereoscopic back projection. The simulation runs at more than 20 frames per second and provides a realistic impression of the surgery. It includes the cold light source, which can be moved inside the eye, and the shadow of the instruments on the retina, which is important for navigational purposes.
Lu, Zhonghua; Arikatla, Venkata S; Han, Zhongqing; Allen, Brian F; De, Suvranu
2014-12-01
High-frequency electricity is used in the majority of surgical interventions. However, modern computer-based training and simulation systems rely on physically unrealistic models that fail to capture the interplay of the electrical, mechanical and thermal properties of biological tissue. We present a real-time and physically realistic simulation of electrosurgery by modelling the electrical, thermal and mechanical properties as three iteratively solved finite element models. To provide subfinite-element graphical rendering of vaporized tissue, a dual-mesh dynamic triangulation algorithm based on isotherms is proposed. The block compressed row storage (BCRS) structure is shown to be critical in allowing computationally efficient changes in the tissue topology due to vaporization. We have demonstrated our physics-based electrosurgery cutting algorithm through various examples. Our matrix manipulation algorithms designed for topology changes have shown low computational cost. Our simulator offers substantially greater physical fidelity compared to previous simulators that use simple geometry-based heat characterization. Copyright © 2013 John Wiley & Sons, Ltd.
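Block compressed row storage groups the nonzeros of a sparse system matrix into small dense blocks, so that removing a vaporized element's degrees of freedom touches whole blocks rather than scattered scalar entries. A minimal sketch using SciPy's BSR format follows; the block values and sizes are illustrative, not the paper's BCRS implementation.

```python
import numpy as np
from scipy.sparse import bsr_matrix

# Two 2x2 blocks on the diagonal of a 4x4 system matrix, as might couple
# the degrees of freedom of two finite elements (toy values).
blocks = np.array([[[4.0, 1.0], [1.0, 4.0]],
                   [[4.0, 1.0], [1.0, 4.0]]])
indices = np.array([0, 1])    # block column index of each stored block
indptr = np.array([0, 1, 2])  # block-row pointers into `blocks`
K = bsr_matrix((blocks, indices, indptr), shape=(4, 4))

# "Removing" a vaporized element amounts to dropping its block row/column;
# with a block layout this is one contiguous block, not scattered scalars.
K_dense = K.toarray()
```

This is the design advantage the abstract alludes to: topology changes due to vaporization become cheap block-level edits of the stored structure.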
Iterative Importance Sampling Algorithms for Parameter Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grout, Ray W; Morzfeld, Matthias; Day, Marcus S.
In parameter estimation problems one computes a posterior distribution over uncertain parameters defined jointly by a prior distribution, a model, and noisy data. Markov chain Monte Carlo (MCMC) is often used for the numerical solution of such problems. An alternative to MCMC is importance sampling, which can exhibit near perfect scaling with the number of cores on high performance computing systems because samples are drawn independently. However, finding a suitable proposal distribution is a challenging task. Several sampling algorithms have been proposed over the past years that take an iterative approach to constructing a proposal distribution. We investigate the applicability of such algorithms by applying them to two realistic and challenging test problems, one in subsurface flow, and one in combustion modeling. More specifically, we implement importance sampling algorithms that iterate over the mean and covariance matrix of Gaussian or multivariate t-proposal distributions. Our implementation leverages massively parallel computers, and we present strategies to initialize the iterations using 'coarse' MCMC runs or Gaussian mixture models.
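The iteration over the mean and covariance of a Gaussian proposal can be sketched as follows, with a toy standard-normal "posterior" standing in for the subsurface-flow and combustion targets: draw independent samples from the current proposal, compute normalized importance weights, and re-fit the proposal moments.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):
    # Toy unnormalized log-posterior (a standard normal); the paper's
    # targets are expensive subsurface-flow and combustion models.
    return -0.5 * np.sum(x**2, axis=-1)

d, n = 2, 5000
mu, cov = np.full(d, 3.0), 4.0 * np.eye(d)        # deliberately poor start
for _ in range(10):
    x = rng.multivariate_normal(mu, cov, size=n)  # independent draws
    diff = x - mu
    log_q = -0.5 * np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
    logw = log_post(x) - log_q        # log importance weights (constants
    w = np.exp(logw - logw.max())     # and determinants cancel after
    w /= w.sum()                      # normalization)
    mu = w @ x                                    # weighted-mean update
    diff = x - mu
    cov = (w[:, None] * diff).T @ diff + 1e-6 * np.eye(d)
```

Because each sample is independent, the inner loop over `x` parallelizes trivially across cores, which is the scaling advantage the abstract highlights over MCMC.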
The benefits of virtual reality simulator training for laparoscopic surgery.
Hart, Roger; Karthigasu, Krishnan
2007-08-01
Virtual reality is a computer-generated system that provides a representation of an environment. This review will analyse the literature with regard to any benefit to be derived from training with virtual reality equipment and describe the equipment currently available. Virtual reality systems are not yet realistic representations of the live operating environment, because they lack tactile sensation and do not represent a complete operation. The literature suggests that virtual reality training is a valuable learning tool for gynaecologists in training, particularly those in the early stages of their careers. Furthermore, it may be of benefit for the ongoing audit of surgical skills and for the early identification of a surgeon's deficiencies before operative incidents occur. It is only a matter of time before realistic virtual reality models of most complete gynaecological operations are available, with improved haptics as a result of improved computer technology. It is inevitable that in the modern climate of litigation virtual reality training will become an essential part of clinical training, as evidence for its effectiveness as a training tool exists, and in many countries training by operating on live animals is not possible.
Moulin, Emmanuel; Grondel, Sébastien; Assaad, Jamal; Duquenne, Laurent
2008-12-01
The work described in this paper is intended to present a simple and efficient way of modeling a full Lamb wave emission and reception system. The emitter behavior and the Lamb wave generation are predicted using a two-dimensional (2D) hybrid finite element-normal mode expansion model. Then the receiver electrical response is obtained from a finite element computation with prescribed displacements. A numerical correction is applied to the 2D results in order to account for the in-plane radiation divergence caused by the finite length of the emitter. The advantage of this modular approach is that realistic configurations can be simulated without performing cumbersome modeling and time-consuming computations. It also provides insight into the physical interpretation of the results. A good agreement is obtained between predicted and measured signals. The range of application of the method is discussed.
Stochastic Model of Clogging in a Microfluidic Cell Sorter
NASA Astrophysics Data System (ADS)
Fai, Thomas; Rycroft, Chris
2016-11-01
Microfluidic devices for sorting cells by deformability show promise for various medical purposes, e.g. detecting sickle cell anemia and circulating tumor cells. One class of such devices consists of a two-dimensional array of narrow channels, each column containing several identical channels in parallel. Cells are driven through the device by an applied pressure or flow rate. Such devices allow many cells to be sorted simultaneously, but cells eventually clog individual channels and change the device properties in an unpredictable manner. In this talk, we propose a stochastic model for the failure of such microfluidic devices by clogging and present preliminary theoretical and computational results. The model can be recast as an ODE that exhibits finite-time blow-up under certain conditions. The failure time distribution is investigated analytically in certain limiting cases, and more realistic versions of the model are solved by computer simulation.
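A toy version of such a clogging cascade (our construction, not the authors' model) already shows the accelerating failure: with the total flow rate held fixed, each clog increases the flow through, and hence the clogging rate of, the surviving channels.

```python
import numpy as np

rng = np.random.default_rng(1)

def time_to_failure(n_channels=50, total_flow=50.0, base_rate=0.02):
    """Toy clogging cascade: each open channel clogs as a Poisson process
    whose rate grows with the flow squeezed through it, so survivors clog
    ever faster. All parameter values are illustrative."""
    t, open_ch = 0.0, n_channels
    while open_ch > 0:
        per_flow = total_flow / open_ch                 # flow per channel
        total_rate = open_ch * base_rate * per_flow**2  # summed clog rate
        t += rng.exponential(1.0 / total_rate)          # next clog event
        open_ch -= 1
    return t

times = [time_to_failure() for _ in range(200)]  # failure-time sample
```

Sampling many such cascades gives an empirical failure-time distribution of the kind the abstract analyzes in limiting cases.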
Ab initio results for intermediate-mass, open-shell nuclei
NASA Astrophysics Data System (ADS)
Baker, Robert B.; Dytrych, Tomas; Launey, Kristina D.; Draayer, Jerry P.
2017-01-01
A theoretical understanding of nuclei in the intermediate-mass region is vital to astrophysical models, especially for nucleosynthesis. Here, we employ the ab initio symmetry-adapted no-core shell model (SA-NCSM) in an effort to push first-principle calculations across the sd-shell region. The ab initio SA-NCSM's advantages come from its ability to control the growth of model spaces by including only physically relevant subspaces, which allows us to explore ultra-large model spaces beyond the reach of other methods. We report on calculations for 19Ne and 20Ne up through 13 harmonic oscillator shells using realistic interactions and discuss the underlying structure as well as implications for various astrophysical reactions. This work was supported by the U.S. NSF (OCI-0904874 and ACI-1516338) and the U.S. DOE (DE-SC0005248), and also benefitted from the Blue Waters sustained-petascale computing project and high performance computing resources provided by LSU.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590
Romps, David M.
2016-03-01
Convective entrainment is a process that is poorly represented in existing convective parameterizations. By many estimates, convective entrainment is the leading source of error in global climate models. As a potential remedy, an Eulerian implementation of the Stochastic Parcel Model (SPM) is presented here as a convective parameterization that treats entrainment in a physically realistic and computationally efficient way. Drawing on evidence that convecting clouds comprise air parcels subject to Poisson-process entrainment events, the SPM calculates the deterministic limit of an infinite number of such parcels. For computational efficiency, the SPM groups parcels at each height by their purity, which is a measure of their total entrainment up to that height. This reduces the calculation of convective fluxes to a sequence of matrix multiplications. The SPM is implemented in a single-column model and compared with a large-eddy simulation of deep convection.
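The parcel picture underlying the SPM can be mimicked with a small Monte Carlo calculation (illustrative rates and dilution factor, not the paper's parameterization): parcels rise level by level, and Poisson-distributed entrainment events multiply down their purity.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Monte Carlo of the parcel picture: parcels ascend through height
# levels and suffer Poisson-distributed entrainment events, each of which
# dilutes their purity by a fixed factor. All numbers are illustrative.
n_parcels, n_levels = 20000, 40
dz, rate_per_m = 50.0, 1.0e-3   # level thickness [m], events per metre
dilution = 0.1                  # purity lost per entrainment event

purity = np.ones(n_parcels)
mean_purity = np.empty(n_levels)
for k in range(n_levels):
    events = rng.poisson(rate_per_m * dz, size=n_parcels)
    purity *= (1.0 - dilution) ** events   # each event dilutes the parcel
    mean_purity[k] = purity.mean()
```

The SPM replaces this Monte Carlo limit with a deterministic calculation by grouping parcels into purity bins at each height, which is what reduces the convective fluxes to a sequence of matrix multiplications.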
A computational model unifies apparently contradictory findings concerning phantom pain
Boström, Kim J.; de Lussanet, Marc H. E.; Weiss, Thomas; Puta, Christian; Wagner, Heiko
2014-01-01
Amputation often leads to painful phantom sensations, whose pathogenesis is still unclear. Supported by experimental findings, an explanatory model has been proposed that identifies maladaptive reorganization of the primary somatosensory cortex (S1) as a cause of phantom pain. However, it was recently found that BOLD activity during voluntary movements of the phantom positively correlates with phantom pain rating, giving rise to a model of persistent representation. In the present study, we develop a physiologically realistic, computational model to resolve the conflicting findings. Simulations yielded that both the amount of reorganization and the level of cortical activity during phantom movements were enhanced in a scenario with strong phantom pain as compared to a scenario with weak phantom pain. These results suggest that phantom pain, maladaptive reorganization, and persistent representation may all be caused by the same underlying mechanism, which is driven by an abnormally enhanced spontaneous activity of deafferented nociceptive channels. PMID:24931344
Combining 3D structure of real video and synthetic objects
NASA Astrophysics Data System (ADS)
Kim, Man-Bae; Song, Mun-Sup; Kim, Do-Kyoon
1998-04-01
This paper presents a new approach to combining real video and synthetic objects. The purpose of this work is to use the proposed technology in fields such as advanced animation, virtual reality, and games. Computer graphics has long been used in these fields. Recently, some applications have added real video to graphic scenes to augment the realism that computer graphics lacks. This approach, called augmented or mixed reality, can produce a more realistic environment than the exclusive use of computer graphics. Our approach differs from virtual reality and augmented reality in that computer-generated graphic objects are combined with a 3D structure extracted from monocular image sequences. The extraction of the 3D structure requires the estimation of 3D depth followed by the construction of a height map; graphic objects are then combined with the height map. Our proposed approach is realized in the following steps: (1) We derive the 3D structure from test image sequences; this requires the estimation of depth and the construction of a height map. Due to the contents of the test sequence, the height map represents the 3D structure. (2) The height map is modeled by Delaunay triangulation or a Bezier surface, and each planar surface is texture-mapped. (3) Finally, graphic objects are combined with the height map. Because the 3D structure of the height map is already known, Step (3) is easily accomplished. Following this procedure, we produced an animation video demonstrating the combination of the 3D structure and graphic models. Users can navigate the realistic 3D world whose associated image is rendered on the display monitor.
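The height-map modeling step can be sketched with SciPy's Delaunay triangulation; the 8x8 grid and the analytic surface below are toy stand-ins for the structure recovered from video, and each resulting triangle is a planar, texture-mappable face.

```python
import numpy as np
from scipy.spatial import Delaunay

# Toy height map on a regular grid (illustrative, not video-derived).
x, y = np.meshgrid(np.linspace(0.0, 1.0, 8), np.linspace(0.0, 1.0, 8))
pts2d = np.column_stack([x.ravel(), y.ravel()])
height = np.sin(np.pi * pts2d[:, 0]) * np.cos(np.pi * pts2d[:, 1])

tri = Delaunay(pts2d)                    # triangulate in the (x, y) plane
mesh = np.column_stack([pts2d, height])  # lift the vertices to 3-D
faces = tri.simplices                    # vertex indices of each triangle
```

Because the triangulation lives in the plane and only the vertices carry height, compositing a synthetic object reduces to comparing its footprint against the known heights of the faces it overlaps.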
Neuron array with plastic synapses and programmable dendrites.
Ramakrishnan, Shubha; Wunderlich, Richard; Hasler, Jennifer; George, Suma
2013-10-01
We describe a novel neuromorphic chip architecture that models neurons for efficient computation. Traditional architectures of neuron array chips consist of large scale systems that are interfaced with AER for implementing intra- or inter-chip connectivity. We present a chip that uses AER for inter-chip communication but uses fast, reconfigurable FPGA-style routing with local memory for intra-chip connectivity. We model neurons with biologically realistic channel models, synapses and dendrites. This chip is suitable for small-scale network simulations and can also be used for sequence detection, utilizing directional selectivity properties of dendrites, ultimately for use in word recognition.
NASA Astrophysics Data System (ADS)
Zhang, Ju; Jackson, Thomas; Balachandar, Sivaramakrishnan
2015-06-01
We will develop a computational model built upon our verified and validated in-house SDT code to provide improved description of the multiphase blast wave dynamics where solid particles are considered deformable and can even undergo phase transitions. Our SDT computational framework includes a reactive compressible flow solver with sophisticated material interface tracking capability and realistic equation of state (EOS) such as Mie-Gruneisen EOS for multiphase flow modeling. The behavior of diffuse interface models by Shukla et al. (2010) and Tiwari et al. (2013) at different shock impedance ratio will be first examined and characterized. The recent constrained interface reinitialization by Shukla (2014) will then be developed to examine if conservation property can be improved. This work was supported in part by the U.S. Department of Energy and by the Defense Threat Reduction Agency.
Electromagnetic Modeling of Human Body Using High Performance Computing
NASA Astrophysics Data System (ADS)
Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada
Realistic simulation of electromagnetic wave propagation in the actual human body can expedite the investigation of wirelessly powering implanted devices from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory, is based on the finite element method for high-fidelity accelerator simulation and can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom that is characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom have been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.
New 3D model for dynamics modeling
NASA Astrophysics Data System (ADS)
Perez, Alain
1994-05-01
The wrist articulation is one of the most complex mechanical systems of the human body. It is composed of eight bones rolling and sliding along their surfaces and along the faces of the five metacarpals of the hand and the two bones of the forearm. Wrist dynamics are fundamental to hand movement, yet the joint is so complex that it remains incompletely explored. This work is part of a new concept of computer-assisted surgery, which consists of developing computer models to perfect surgical acts by predicting their consequences. The modeling of wrist dynamics is based first on a static 3D model of its bones. This 3D model must optimize the collision detection procedure, which is the necessary step for estimating the physical contact constraints. As many other computer vision models do not fit this problem with enough precision, a new 3D model has been developed based on the medial axis of the digital distance map of the reconstructed bone volume. The collision detection procedure is then simplified, because contacts are detected between spheres. Experiments with this original 3D dynamic model produce realistic computer animation images of solids in contact. The next steps are to detect ligaments in digital medical images and to model them in order to complete the wrist model.
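The distance-map construction can be sketched with SciPy, using a toy voxel ball in place of a reconstructed carpal bone: the Euclidean distance transform gives, at each interior voxel, the radius of the largest inscribed sphere, and ridges (local maxima) of this map trace the medial axis used for the sphere-based collision test. Volume size and radius are illustrative.

```python
import numpy as np
from scipy import ndimage

# Toy binary voxel "bone": a ball of radius 10 in a 32^3 volume.
z, y, x = np.indices((32, 32, 32))
bone = (x - 16)**2 + (y - 16)**2 + (z - 16)**2 <= 10**2

# Distance from each interior voxel to the background = radius of the
# largest sphere inscribed at that voxel.
dist = ndimage.distance_transform_edt(bone)
r_max = dist.max()  # the biggest inscribed sphere sits on the medial axis

# A collision check between two such models then reduces to comparing the
# distance between sphere centres with the sum of the sphere radii.
```

This is why the paper's approach simplifies collision detection: spheres carry their own contact test, whereas general surface meshes need expensive proximity queries.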
ERIC Educational Resources Information Center
McCartney, Robert; Tenenberg, Josh
2008-01-01
Some have proposed that realistic problem situations are better for learning. This issue contains two articles that examine the effects of "making it real" in computer architecture and human-computer interaction.
Numerical Study of Solar Storms from the Sun to Earth
NASA Astrophysics Data System (ADS)
Feng, Xueshang; Jiang, Chaowei; Zhou, Yufen
2017-04-01
As solar storms sweep past the Earth, adverse changes occur in the geospace environment. How humans can mitigate and avoid the destructive damage caused by solar storms has become an important frontier issue that we must face in the high-tech era. Understanding the dynamic processes of solar storms propagating through interplanetary space is of scientific significance, and conducting physics-based numerical studies of the three-dimensional evolution of solar storms in interplanetary space, with the aid of powerful computing capacity, has practical value for predicting their arrival times, intensities, and probable geoeffectiveness at Earth. So far, numerical studies based on magnetohydrodynamics (MHD) have gone through the transition from initial qualitative principle studies to systematic quantitative studies of concrete events and numerical predictions. The numerical modeling community has a common goal: to develop an end-to-end physics-based modeling system for forecasting the Sun-Earth relationship. The transition of these models to operational use depends on the availability of computational resources at reasonable cost, and the models' prediction capabilities may be improved by incorporating observational findings and constraints into the physics-based models, combining observations, empirical models, and MHD simulations in organic ways. In this talk, we focus on our recent progress in using solar observations to produce realistic magnetic configurations of CMEs as they leave the Sun and in coupling data-driven simulations of CMEs to heliospheric simulations that propagate the CME configuration to 1 AU, and we outline the important numerical issues and their possible solutions in numerical space weather modeling from the Sun to Earth for future research.
Koppert, Marc; Kalitzin, Stiliyan; Velis, Demetrios; Lopes Da Silva, Fernando; Viergever, Max A
2016-12-01
Epilepsy is a condition in which periods of normal ongoing EEG activity alternate with periods of oscillatory behavior characteristic of epileptic seizures. The dynamics of the transitions between the two states are still unclear. Computational models provide a powerful tool to explore the underlying mechanisms of such transitions, with the purpose of eventually finding therapeutic interventions for this debilitating condition. In this study, the possibility of postponing seizures elicited by a decrease of inhibition is investigated by using external stimulation in a realistic bistable neuronal model consisting of two interconnected neuronal populations representing pyramidal cells and interneurons. In the simulations, seizures are induced by slowly decreasing the conductivity of GABA-A synaptic channels over time. Since the model is bistable, the system will change state from the initial steady state (SS) to the limit cycle (LC) state because of internal noise when the inhibition falls below a certain threshold. Several state-independent stimulation paradigms are simulated. Their effectiveness is analyzed for various stimulation frequencies and intensities in combination with periodic and random stimulation sequences. The distributions of the time to first seizure in the presence of stimulation are compared with the situation without stimulation. In addition, stimulation protocols targeted to specific subsystems are applied with the objective of counteracting the baseline shift due to decreased inhibition in the system. Furthermore, an analytical model is used to investigate the effects of random noise. The relation between the strength of random-noise stimulation, the control parameter of the system, and the transitions between steady state and limit cycle is investigated.
The study shows that it is possible to postpone epileptic activity by targeted stimulation in a realistic neuronal model featuring bistability and that it is possible to stop seizures by random noise in an analytical model.
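The role of noise intensity in triggering the transition from steady state to limit cycle can be illustrated with a much simpler bistable system: a scalar double-well SDE of our own choosing, not the two-population neuronal model. Here the noise amplitude stands in for random-noise stimulation intensity, and stronger noise shortens the time to the first transition.

```python
import numpy as np

rng = np.random.default_rng(3)

def first_transition_time(sigma, dt=0.02, t_max=1000.0):
    """First passage from the 'resting' well (x = -1) to the other well
    (x > 0.5) of the toy bistable SDE dx = (x - x**3) dt + sigma dW,
    integrated with Euler-Maruyama. Illustrative parameters only."""
    x, t = -1.0, 0.0
    sq = np.sqrt(dt)
    while t < t_max:
        x += (x - x**3) * dt + sigma * sq * rng.standard_normal()
        t += dt
        if x > 0.5:
            return t
    return t_max  # capped: no transition within the simulated window

strong = np.median([first_transition_time(0.8) for _ in range(20)])
weak = np.median([first_transition_time(0.3) for _ in range(20)])
```

With strong noise the median first-passage time is far shorter than with weak noise, mirroring how stimulation intensity shifts the time-to-first-seizure distribution in the study above.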
Automatic Perceptual Color Map Generation for Realistic Volume Visualization
Silverstein, Jonathan C.; Parsad, Nigel M.; Tsirline, Victor
2008-01-01
Advances in computed tomography imaging technology and inexpensive high-performance computer graphics hardware are making high-resolution, full color (24-bit) volume visualizations commonplace. However, many of the color maps used in volume rendering provide questionable value in knowledge representation, and non-perceptual maps can bias data analysis or even obscure information. These drawbacks, coupled with our need for realistic anatomical volume rendering for teaching and surgical planning, have motivated us to explore the auto-generation of color maps that combine natural colorization with the perceptual discriminating capacity of grayscale. As evidenced by the examples shown, which were created by the algorithm described, the merging of perceptually accurate and realistically colorized virtual anatomy appears to insightfully interpret and impartially enhance volume-rendered patient data. PMID:18430609
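One way to combine a natural tint with the discriminating capacity of grayscale, in the spirit of the approach above, is to scale a fixed body color so that luma rises strictly and linearly with the data value. The tint and the Rec.601 luma weights below are our illustrative choices, not the authors' algorithm.

```python
import numpy as np

def tinted_perceptual_map(n=256, base=(1.0, 0.6, 0.2)):
    """Toy colormap whose luma is a strictly linear function of the data
    value, so the grayscale ordering of intensities is preserved under the
    tint. Base color and luma weights are illustrative assumptions."""
    w = np.array([0.299, 0.587, 0.114])  # Rec.601 luma weights
    b = np.asarray(base) / max(base)     # keep all channels within [0, 1]
    ramp = np.linspace(0.0, 1.0, n)[:, None]
    rgb = ramp * b                       # luma(entry i) = ramp[i] * (w @ b)
    return rgb, rgb @ w

rgb, luma = tinted_perceptual_map()
```

A strictly monotone, linear luma is the property that keeps such a map "perceptual" in the grayscale sense: equal steps in the data always map to equal steps in perceived lightness, whatever the hue.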
Probabilistic design of fibre concrete structures
NASA Astrophysics Data System (ADS)
Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.
2017-09-01
Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing, and shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness, and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty and randomness of the material properties, obtained from material tests, are accounted for in the random distributions. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis.
The presented methodology is illustrated on results from two probabilistic studies with different types of concrete structures related to practical applications and made from various materials (with the parameters obtained from real material tests).
Streamtube expansion effects on the Darrieus wind turbine
NASA Astrophysics Data System (ADS)
Paraschivoiu, I.; Fraunie, P.; Beguier, C.
1985-04-01
The purpose of the work described in this paper was to determine the aerodynamic loads and performance of a Darrieus wind turbine by including the expansion effects of the streamtubes through the rotor. The double-multiple streamtube model with variable interference factors was used to estimate the induced velocities with a modified CARDAAV computer code. Comparison with measured data and predictions shows that the streamtube expansion effects are relatively significant at high tip-speed ratios, allowing a more realistic modeling of the upwind/downwind flowfield asymmetries inherent in the Darrieus rotor.
Hořava Gravity is Asymptotically Free in 2+1 Dimensions.
Barvinsky, Andrei O; Blas, Diego; Herrero-Valea, Mario; Sibiryakov, Sergey M; Steinwachs, Christian F
2017-11-24
We compute the β functions of marginal couplings in projectable Hořava gravity in 2+1 spacetime dimensions. We show that the renormalization group flow has an asymptotically free fixed point in the ultraviolet (UV), establishing the theory as a UV-complete model with dynamical gravitational degrees of freedom. Therefore, this theory may serve as a toy model to study fundamental aspects of quantum gravity. Our results represent a step forward towards understanding the UV properties of realistic versions of Hořava gravity.
Modeling Primary Atomization of Liquid Fuels using a Multiphase DNS/LES Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arienti, Marco; Oefelein, Joe; Doisneau, Francois
2016-08-01
As part of a Laboratory Directed Research and Development project, we are developing a modeling-and-simulation capability to study fuel direct injection in automotive engines. Predicting mixing and combustion at realistic conditions remains a challenging objective of energy science and a research priority in Sandia's mission-critical area of energy security; it is also relevant to many flows in defense and climate applications. High-performance computing applied to this non-linear, multi-scale problem is key to engine calculations with increased scientific reliability.
Lightning protection of distribution lines
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDermott, T.E.; Short, T.A.; Anderson, J.G.
1994-01-01
This paper reports a study of distribution line lightning performance, using computer simulations of lightning overvoltages. The results of previous investigations are extended with a detailed model of induced voltages from nearby strokes, coupled into a realistic power system model. The paper also considers the energy duty of distribution-class surge arresters exposed to direct strokes. The principal result is that widely separated pole-top arresters can effectively protect a distribution line from induced-voltage flashovers. This means that nearby lightning strokes need not be a significant lightning performance problem for most distribution lines.
Large-scale ground motion simulation using GPGPU
NASA Astrophysics Data System (ADS)
Aoi, S.; Maeda, T.; Nishizawa, N.; Aoki, T.
2012-12-01
Huge computation resources are required to perform large-scale ground motion simulations using the 3-D finite difference method (FDM) for realistic and complex models with high accuracy. Furthermore, thousands of various simulations are necessary to evaluate the variability of the assessment caused by uncertainty in the assumptions of the source models for future earthquakes. To overcome the problem of restricted computational resources, we introduced the use of GPGPU (general-purpose computing on graphics processing units), the technique of using a GPU as an accelerator for computation that has traditionally been conducted by the CPU. We employed the CPU version of GMS (Ground motion Simulator; Aoi et al., 2004) as the original code and implemented the function for GPU calculation using CUDA (Compute Unified Device Architecture). GMS is a total system for seismic wave propagation simulation based on a 3-D FDM scheme using discontinuous grids (Aoi & Fujiwara, 1999), which includes the solver as well as preprocessor tools (parameter generation tool) and postprocessor tools (filter tool, visualization tool, and so on). The computational model is decomposed in the two horizontal directions, and each decomposed model is allocated to a different GPU. We evaluated the performance of our newly developed GPU version of GMS on TSUBAME2.0, one of Japan's fastest supercomputers, operated by the Tokyo Institute of Technology. First, we performed a strong-scaling test using a model with about 22 million grids and achieved speed-ups of 3.2 and 7.3 times using 4 and 16 GPUs, respectively. Next, we examined a weak-scaling test in which the model sizes (number of grids) were increased in proportion to the degree of parallelism (number of GPUs).
The results showed almost perfect linearity up to a simulation with 22 billion grids using 1024 GPUs, where the calculation speed reached 79.7 TFlops, about 34 times faster than the CPU calculation using the same number of cores. Finally, we applied GPU calculation to the simulation of the 2011 Tohoku-oki earthquake. The model was constructed using a slip model from inversion of strong motion data (Suzuki et al., 2012) and a geological- and geophysical-based velocity structure model comprising the entire Tohoku and Kanto regions as well as the large source area, which consists of about 1.9 billion grids. The overall characteristics of observed velocity seismograms for periods longer than 8 s were successfully reproduced (Maeda et al., 2012 AGU meeting). The turnaround time for a 50,000-step calculation (corresponding to 416 s of seismograms) using 100 GPUs was 52 minutes, which is fairly short, especially considering that this is the performance for the realistic and complex model.
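The strong-scaling figures quoted in this abstract can be converted into parallel efficiencies with a small helper; the baseline time of 1.0 below is a notional placeholder, not a measured value.

```python
def strong_scaling_efficiency(baseline_time, times_by_ngpu):
    """Speed-up and parallel efficiency relative to a single-device baseline.

    Returns {n_gpus: (speedup, efficiency)} where efficiency = speedup / n.
    """
    out = {}
    for n, t in times_by_ngpu.items():
        speedup = baseline_time / t
        out[n] = (speedup, speedup / n)
    return out

# Using the speed-ups reported in the abstract (3.2x on 4 GPUs, 7.3x on
# 16 GPUs) with a notional baseline time of 1.0:
eff = strong_scaling_efficiency(1.0, {4: 1 / 3.2, 16: 1 / 7.3})
# eff[4] is approximately (3.2, 0.80); eff[16] approximately (7.3, 0.46)
```

The drop in efficiency from 4 to 16 GPUs is typical of strong scaling on a fixed-size model, which is why the weak-scaling test (growing the model with the GPU count) is the better indicator of scalability here.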
Comparing Realistic Subthalamic Nucleus Neuron Models
NASA Astrophysics Data System (ADS)
Njap, Felix; Claussen, Jens C.; Moser, Andreas; Hofmann, Ulrich G.
2011-06-01
The mechanism of action of clinically effective electrical high frequency stimulation is still under debate. However, recent evidence points at the specific activation of GABA-ergic ion channels. Using a computational approach, we analyze temporal properties of the spike trains emitted by biologically realistic neurons of the subthalamic nucleus (STN) as a function of GABA-ergic synaptic input conductances. Our contribution is based on a model proposed by Rubin and Terman and exhibits a wide variety of different firing patterns: silent, low spiking, moderate spiking, and intense spiking activity. We observed that most of the cells in our network turn to silent mode when we increase the GABAA input conductance above the threshold of 3.75 mS/cm2. On the other hand, insignificant changes in firing activity are observed when the input conductance is low or close to zero. We thus reproduce Rubin's model with vanishing synaptic conductances. To quantitatively compare spike trains from the original model with the modified model at different conductance levels, we apply four different (dis)similarity measures between them. We observe that the Mahalanobis distance, the Victor-Purpura metric, and the Interspike Interval distribution are sensitive to different firing regimes, whereas Mutual Information appears unable to discriminate these functional changes.
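Of the measures listed, the Victor-Purpura metric has a compact dynamic-programming form; a minimal sketch (not the authors' implementation) follows, with unit cost for inserting or deleting a spike and cost q·|Δt| for shifting one.

```python
def victor_purpura(t1, t2, q):
    """Victor-Purpura distance between two sorted spike time lists.

    Edit costs: 1 to insert or delete a spike, q * |dt| to shift one.
    Computed by dynamic programming, like an edit distance on spike times.
    """
    n, m = len(t1), len(t2)
    G = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        G[i][0] = float(i)          # delete all spikes of train 1
    for j in range(1, m + 1):
        G[0][j] = float(j)          # insert all spikes of train 2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            G[i][j] = min(G[i - 1][j] + 1,                     # delete
                          G[i][j - 1] + 1,                     # insert
                          G[i - 1][j - 1]
                          + q * abs(t1[i - 1] - t2[j - 1]))    # shift
    return G[n][m]
```

With q = 0 the metric ignores timing and reduces to the difference in spike counts; as q grows, shifting becomes more expensive than delete-plus-insert (cost 2), so the metric becomes sensitive to precise spike timing.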
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
Patient-Specific Simulation of Cardiac Blood Flow From High-Resolution Computed Tomography.
Lantz, Jonas; Henriksson, Lilian; Persson, Anders; Karlsson, Matts; Ebbers, Tino
2016-12-01
Cardiac hemodynamics can be computed from medical imaging data, and results could potentially aid in cardiac diagnosis and treatment optimization. However, simulations are often based on simplified geometries, ignoring features such as papillary muscles and trabeculae due to their complex shape, limitations in image acquisitions, and challenges in computational modeling. This severely hampers the use of computational fluid dynamics in clinical practice. The overall aim of this study was to develop a novel numerical framework that incorporated these geometrical features. The model included the left atrium, ventricle, ascending aorta, and heart valves. The framework used image registration to obtain patient-specific wall motion, automatic remeshing to handle topological changes due to the complex trabeculae motion, and a fast interpolation routine to obtain intermediate meshes during the simulations. Velocity fields and residence time were evaluated, and they indicated that papillary muscles and trabeculae strongly interacted with the blood, which could not be observed in a simplified model. The framework resulted in a model with outstanding geometrical detail, demonstrating the feasibility as well as the importance of a framework that is capable of simulating blood flow in physiologically realistic hearts.
NASA Astrophysics Data System (ADS)
Lou, Yang; Zhou, Weimin; Matthews, Thomas P.; Appleton, Catherine M.; Anastasio, Mark A.
2017-04-01
Photoacoustic computed tomography (PACT) and ultrasound computed tomography (USCT) are emerging modalities for breast imaging. As in all emerging imaging technologies, computer-simulation studies play a critically important role in developing and optimizing the designs of hardware and image reconstruction methods for PACT and USCT. Using computer simulations, the parameters of an imaging system can be systematically and comprehensively explored in a way that is generally not possible through experimentation. When conducting such studies, numerical phantoms are employed to represent the physical properties of the patient or object to be imaged that influence the measured image data. It is highly desirable to utilize numerical phantoms that are realistic, especially when task-based measures of image quality are to be utilized to guide system design. However, most reported computer-simulation studies of PACT and USCT breast imaging employ simple numerical phantoms that oversimplify the complex anatomical structures in the human female breast. We develop and implement a methodology for generating anatomically realistic numerical breast phantoms from clinical contrast-enhanced magnetic resonance imaging data. The phantoms will depict vascular structures and the volumetric distribution of different tissue types in the breast. By assigning optical and acoustic parameters to different tissue structures, both optical and acoustic breast phantoms will be established for use in PACT and USCT studies.
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
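The bootstrap construction of pseudo-predator signatures can be sketched as follows. This is a simplified illustration, not the paper's algorithm for objectively choosing bootstrap sample sizes: here `n_boot` is simply supplied by the caller, and prey signatures are plain lists of fatty acid proportions.

```python
import random

def pseudo_predator(prey_sigs, diet, n_boot, seed=0):
    """Construct one pseudo-predator fatty acid signature.

    prey_sigs: dict mapping prey type -> list of signatures (each a list
               of fatty acid proportions summing to 1).
    diet:      dict mapping prey type -> known diet proportion (sums to 1).
    n_boot:    bootstrap sample size drawn per prey type.
    """
    rng = random.Random(seed)
    n_fa = len(next(iter(prey_sigs.values()))[0])
    sig = [0.0] * n_fa
    for prey, proportion in diet.items():
        # bootstrap: resample this prey's signatures with replacement
        sample = [rng.choice(prey_sigs[prey]) for _ in range(n_boot)]
        mean = [sum(s[k] for s in sample) / n_boot for k in range(n_fa)]
        for k in range(n_fa):
            sig[k] += proportion * mean[k]
    total = sum(sig)                 # renormalize to a proper signature
    return [v / total for v in sig]
```

The known `diet` is what makes such pseudo-predators useful: a QFASA estimator run on the simulated signature can be scored against the diet proportions that generated it.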
Confirmation of saturation equilibrium conditions in crater populations
NASA Technical Reports Server (NTRS)
Hartmann, William K.; Gaskell, Robert W.
1993-01-01
We have continued work on realistic numerical models of cratered surfaces, as first reported at last year's LPSC. We confirm the saturation equilibrium level with a new, independent test. One of us has developed a realistic computer simulation of a cratered surface. The model starts with a smooth surface or fractal topography, and adds primary craters according to the cumulative power law with exponent -1.83, as observed on lunar maria and Martian plains. Each crater has an ejecta blanket with the volume of the crater, feathering out to a distance of 4 crater radii. We use the model to test the levels of saturation equilibrium reached in naturally occurring systems, by increasing crater density and observing its dependence on various parameters. In particular, we have tested to see if these artificial systems reach the level found by Hartmann on heavily cratered planetary surfaces, hypothesized to be the natural saturation equilibrium level. This year's work gives the first results of a crater population that includes secondaries. Our model 'Gaskell-4' (September, 1992) includes primaries as described above, but also includes a secondary population, defined by exponent -4. We allowed the largest secondary from each primary to be 0.10 times the size of the primary. These parameters will be changed to test their effects in future models. The model gives realistic images of a cratered surface although it appears richer in secondaries than real surfaces are. The effect of running the model toward saturation gives interesting results for the diameter distribution. Our most heavily cratered surface had the input number of primary craters reach about 0.65 times the hypothesized saturation equilibrium, but the input number rises to more than 100 times that level for secondaries below 1.4 km in size.
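A primary-crater population following the cumulative power law with exponent -1.83 can be drawn by inverse-transform sampling; the sketch below generates diameters only (ejecta blankets and the secondary population are omitted).

```python
import random

def sample_crater_diameters(n, d_min, b=1.83, seed=0):
    """Draw crater diameters from the cumulative power law N(>D) ~ D**-b.

    Inverse-transform sampling: if u is uniform on (0, 1], then
    D = d_min * u**(-1/b) has the desired cumulative distribution.
    b = 1.83 is the primary-crater exponent quoted in the abstract;
    d_min is a cutoff below which craters are not generated.
    """
    rng = random.Random(seed)
    # 1 - random() lies in (0, 1], avoiding u = 0
    return [d_min * (1.0 - rng.random()) ** (-1.0 / b) for _ in range(n)]
```

By construction, the fraction of sampled craters larger than 2·d_min should be close to 2**(-1.83), roughly 0.28.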
Dietschreit, Johannes C B; Diestler, Dennis J; Knapp, Ernst W
2016-05-10
To speed up the generation of an ensemble of poly(ethylene oxide) (PEO) polymer chains in solution, a tetrahedral lattice model possessing the appropriate bond angles is used. The distance between noncovalently bonded atoms is maintained at realistic values by generating chains with an enhanced degree of self-avoidance by a very efficient Monte Carlo (MC) algorithm. Potential energy parameters characterizing this lattice model are adjusted so as to mimic realistic PEO polymer chains in water simulated by molecular dynamics (MD), which serves as a benchmark. The MD data show that PEO chains have a fractal dimension of about two, in contrast to self-avoiding walk lattice models, which exhibit the fractal dimension of 1.7. The potential energy accounts for a mild hydrophobic effect (HYEF) of PEO and for a proper setting of the distribution between trans and gauche conformers. The potential energy parameters are determined by matching the Flory radius, the radius of gyration, and the fraction of trans torsion angles in the chain. A gratifying result is the excellent agreement of the pair distribution function and the angular correlation for the lattice model with the benchmark distribution. The lattice model allows for the precise computation of the torsional entropy of the chain. The generation of polymer conformations of the adjusted lattice model is at least 2 orders of magnitude more efficient than MD simulations of the PEO chain in explicit water. This method of generating chain conformations on a tetrahedral lattice can also be applied to other types of polymers with appropriate adjustment of the potential energy function. The efficient MC algorithm for generating chain conformations on a tetrahedral lattice is available for download at https://github.com/Roulattice/Roulattice .
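A chain on the tetrahedral lattice with the correct bond geometry can be grown with a naive self-avoiding walk. This is a simplified illustration with restarts, not the paper's efficient MC algorithm (and simple growth like this is statistically biased for long chains).

```python
import random

def grow_saw(n_bonds, seed=0, max_tries=10000):
    """Grow a self-avoiding chain on the tetrahedral (diamond) lattice.

    Sites alternate between two sublattices; the four bond vectors leaving
    an A-site are the negatives of those leaving a B-site, which fixes the
    tetrahedral bond angle (~109.5 degrees) of the lattice model.
    """
    bonds_a = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]
    rng = random.Random(seed)
    for _ in range(max_tries):
        chain = [(0, 0, 0)]
        occupied = {chain[0]}
        ok = True
        for step in range(n_bonds):
            sign = 1 if step % 2 == 0 else -1   # alternate sublattice
            moves = []
            for b in bonds_a:
                nxt = tuple(c + sign * d for c, d in zip(chain[-1], b))
                if nxt not in occupied:          # enforce self-avoidance
                    moves.append(nxt)
            if not moves:                        # walk trapped: restart
                ok = False
                break
            chain.append(rng.choice(moves))
            occupied.add(chain[-1])
        if ok:
            return chain
    raise RuntimeError("failed to grow a self-avoiding chain")
```

For the chain lengths used in polymer studies, restarting trapped walks wastes many attempts, which is one motivation for the more efficient MC scheme the abstract describes.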
Modeling and Analysis of Realistic Fire Scenarios in Spacecraft
NASA Technical Reports Server (NTRS)
Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.
2015-01-01
An accidental fire inside a spacecraft is an unlikely, but very real, emergency situation that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection, and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments, and detailed numerical modeling to provide the data necessary for vehicle and safety system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection, and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large eddy simulation form of the Navier-Stokes equations. FDS provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where the detailed treatment of the mass and energy transport is unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict maximum survivable fires aboard spacecraft. This one-dimensional model simplifies the heat and mass transfer as well as toxic species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (having already been used to help in the hatch design of the Multi-Purpose Crew Vehicle, MPCV).
Peng, Ying; Dai, Zoujun; Mansy, Hansen A.; Sandler, Richard H.; Balk, Robert A.; Royston, Thomas J.
2014-01-01
Chest physical examination often includes performing chest percussion, which involves introducing sound stimulus to the chest wall and detecting an audible change. This approach relies on observations that underlying acoustic transmission, coupling, and resonance patterns can be altered by chest structure changes due to pathologies. More accurate detection and quantification of these acoustic alterations may provide further useful diagnostic information. To elucidate the physical processes involved, a realistic computer model of sound transmission in the chest is helpful. In the present study, a computational model was developed and validated by comparing its predictions with results from animal and human experiments which involved applying acoustic excitation to the anterior chest while detecting skin vibrations at the posterior chest. To investigate the effect of pathology on sound transmission, the computational model was used to simulate the effects of pneumothorax on sounds introduced at the anterior chest and detected at the posterior. Model predictions and experimental results showed similar trends. The model also predicted wave patterns inside the chest, which may be used to assess results of elastography measurements. Future animal and human tests may expand the predictive power of the model to include acoustic behavior for a wider range of pulmonary conditions. PMID:25001497
A novel medical image data-based multi-physics simulation platform for computational life sciences.
Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels
2013-04-06
Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a visualization toolkit-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.
A tree-parenchyma coupled model for lung ventilation simulation.
Pozin, Nicolas; Montesantos, Spyridon; Katz, Ira; Pichelin, Marine; Vignon-Clementel, Irene; Grandmont, Céline
2017-11-01
In this article, we develop a lung ventilation model. The parenchyma is described as an elastic homogenized media. It is irrigated by a space-filling dyadic resistive pipe network, which represents the tracheobronchial tree. In this model, the tree and the parenchyma are strongly coupled. The tree induces an extra viscous term in the system constitutive relation, which leads, in the finite element framework, to a full matrix. We consider an efficient algorithm that takes advantage of the tree structure to enable a fast matrix-vector product computation. This framework can be used to model both free and mechanically induced respiration, in health and disease. Patient-specific lung geometries acquired from computed tomography scans are considered. Realistic Dirichlet boundary conditions can be deduced from surface registration on computed tomography images. The model is compared to a more classical exit compartment approach. Results illustrate the coupling between the tree and the parenchyma, at global and regional levels, and how conditions for the purely 0D model can be inferred. Different types of boundary conditions are tested, including a nonlinear Robin model of the surrounding lung structures. Copyright © 2017 John Wiley & Sons, Ltd.
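The resistive contribution of a dyadic pipe tree can be illustrated with a symmetric Poiseuille-resistance recursion; the homothety ratio `h` and the root resistance below are illustrative assumptions, not values from the paper, and the real model couples an asymmetric, space-filling tree to the parenchyma rather than collapsing it to one number.

```python
def tree_resistance(n_gen, r0=1.0, h=0.79):
    """Equivalent resistance of a symmetric dyadic airway tree.

    Each branch splits into two identical children in parallel; a child's
    Poiseuille resistance (R ~ L / radius**4) is its parent's scaled by
    1 / h**3 when radius and length both shrink by the factor h.
    h ~ 0.79 ~ 2**(-1/3) is a commonly cited homothety ratio for airways.
    """
    if n_gen == 0:
        return r0
    child = tree_resistance(n_gen - 1, r0 / h**3, h)
    return r0 + child / 2.0          # two identical subtrees in parallel
```

With h exactly 2**(-1/3), each generation contributes the same resistance as the root, so an n-generation tree has equivalent resistance (n + 1) * r0, a convenient sanity check on the recursion.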
Three-dimensional turbopump flowfield analysis
NASA Technical Reports Server (NTRS)
Sharma, O. P.; Belford, K. A.; Ni, R. H.
1992-01-01
A program was conducted to develop a flow prediction method applicable to rocket turbopumps. The complex nature of a flowfield in turbopumps is described and examples of flowfields are discussed to illustrate that physics based models and analytical calculation procedures based on computational fluid dynamics (CFD) are needed to develop reliable design procedures for turbopumps. A CFD code developed at NASA ARC was used as the base code. The turbulence model and boundary conditions in the base code were modified, respectively, to: (1) compute transitional flows and account for extra rates of strain, e.g., rotation; and (2) compute surface heat transfer coefficients and allow computation through multistage turbomachines. Benchmark quality data from two and three-dimensional cascades were used to verify the code. The predictive capabilities of the present CFD code were demonstrated by computing the flow through a radial impeller and a multistage axial flow turbine. Results of the program indicate that the present code operated in a two-dimensional mode is a cost effective alternative to full three-dimensional calculations, and that it permits realistic predictions of unsteady loadings and losses for multistage machines.
NASA Technical Reports Server (NTRS)
1979-01-01
The computer model for erythropoietic control was adapted to the mouse system by altering system parameters originally given for the human to those which more realistically represent the mouse. Parameter values were obtained from a variety of literature sources. Using the mouse model, the mouse was studied as a potential experimental model for spaceflight. Simulation studies of dehydration and hypoxia were performed. A comparison of system parameters for the mouse and human models is presented. Aside from the obvious differences expected in fluid volumes, blood flows and metabolic rates, larger differences were observed in the following: erythrocyte life span, erythropoietin half-life, and normal arterial pO2.
Structure and anomalous solubility for hard spheres in an associating lattice gas model.
Szortyka, Marcia M; Girardi, Mauricio; Henriques, Vera B; Barbosa, Marcia C
2012-08-14
In this paper we investigate the solubility of a hard-sphere gas in a solvent modeled as an associating lattice gas. The solution phase diagram for solute at 5% is compared with the phase diagram of the original solute free model. Model properties are investigated both through Monte Carlo simulations and a cluster approximation. The model solubility is computed via simulations and is shown to exhibit a minimum as a function of temperature. The line of minimum solubility (TmS) coincides with the line of maximum density (TMD) for different solvent chemical potentials, in accordance with the literature on continuous realistic models and on the "cavity" picture.
Evaluating the Influence of the Client Behavior in Cloud Computing.
Souza Pardo, Mário Henrique; Centurion, Adriana Molina; Franco Eustáquio, Paulo Sérgio; Carlucci Santana, Regina Helena; Bruschi, Sarita Mazzini; Santana, Marcos José
2016-01-01
This paper proposes a novel approach for the implementation of simulation scenarios, providing a client entity for cloud computing systems. The client entity allows the creation of scenarios in which the client behavior has an influence on the simulation, making the results more realistic. The proposed client entity is based on several characteristics that affect the performance of a cloud computing system, including different modes of submission and their behavior when the waiting time between requests (think time) is considered. The proposed characterization of the client enables the sending of either individual requests or group of Web services to scenarios where the workload takes the form of bursts. The client entity is included in the CloudSim, a framework for modelling and simulation of cloud computing. Experimental results show the influence of the client behavior on the performance of the services executed in a cloud computing system.
Computer assisted surgery in preoperative planning of acetabular fracture surgery: state of the art.
Boudissa, Mehdi; Courvoisier, Aurélien; Chabanas, Matthieu; Tonetti, Jérôme
2018-01-01
The development of imaging modalities and computer technology provides a new approach to acetabular surgery. Areas covered: This review describes the role of computer-assisted surgery (CAS) in understanding fracture patterns, in virtual preoperative planning of the surgery, and in the use of custom-made plates in acetabular fractures, with or without 3D printing technologies. A PubMed search of the English-language literature of the last 20 years was carried out for studies concerning computer-assisted surgery in acetabular fractures. The several steps of CAS in acetabular fracture surgery are presented and commented on by the main author in light of his personal experience. Expert commentary: Computer-assisted surgery in acetabular fractures is still in its early stages, with promising results. Patient-specific biomechanical models considering soft tissues should be developed to allow more realistic planning.
What Today's Educational Technology Needs: Defensible Evaluations and Realistic Implementation.
ERIC Educational Resources Information Center
Roweton, William E.; And Others
It is argued that in order to make computer assisted instruction effective in the schools, educators should pay more attention to implementation issues (including modifying teacher attitudes, changing classroom routines, and offering realistic technical training and support) and to producing understandable product and performance evaluations.…
Realistic simulated MRI and SPECT databases. Application to SPECT/MRI registration evaluation.
Aubert-Broche, Berengere; Grova, Christophe; Reilhac, Anthonin; Evans, Alan C; Collins, D Louis
2006-01-01
This paper describes the construction of simulated SPECT and MRI databases that account for realistic anatomical and functional variability. The data are used as a gold standard to evaluate four SPECT/MRI similarity-based registration methods. Simulation realism was ensured by using accurate physical models of data generation and acquisition. MRI and SPECT simulations were generated from three subjects to take inter-subject anatomical variability into account. Functional SPECT data were computed from six functional models of brain perfusion. Previous models of normal perfusion and of ictal perfusion observed in Mesial Temporal Lobe Epilepsy (MTLE) were considered to generate functional variability. We studied the impact that noise and intensity non-uniformity in MRI simulations, as well as SPECT scatter correction, may have on registration accuracy. We quantified the amount of registration error caused by anatomical and functional variability. Registration involving ictal data was less accurate than registration involving normal data. MR intensity non-uniformity was the main factor decreasing registration accuracy. The proposed simulated database is promising for evaluating many functional neuroimaging methods involving MRI and SPECT data.
Assembly, Integration, and Test Methods for Operationally Responsive Space Satellites
2010-03-01
like assembly and vibration tests, to ensure there have been no failures induced by the activities. External thermal control blankets and radiator...configuration of the satellite post-vibration test and adds time to the process. • Thermal blanketing is not realistic with current technology or...patterns for thermal blankets and radiator tape. The computer-aided design (CAD) solid model was used to generate patterns that were cut and applied real
Wen, Tingxi; Medveczky, David; Wu, Jackie; Wu, Jianhuang
2018-01-25
Colonoscopy plays an important role in the clinical screening and management of colorectal cancer. The traditional 'see one, do one, teach one' training style for such an invasive procedure is resource intensive and ineffective. Given that colonoscopy is difficult and time-consuming to master, the use of virtual reality simulators to train gastroenterologists in colonoscopy operations offers a promising alternative. In this paper, a realistic and real-time interactive simulator for training in the colonoscopy procedure is presented, which can even include polypectomy simulation. Our approach models the colonoscope as thick flexible elastic rods with different resolutions that adapt dynamically to the curvature of the colon. Additional material characteristics of this deformable material are integrated into our discrete model to realistically simulate the behavior of the colonoscope. In addition, we propose a set of key aspects of our simulator that give fast, high-fidelity feedback to trainees. We also conducted an initial validation of this colonoscopic simulator to determine its clinical utility and efficacy.
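The elastic-rod idea can be illustrated with a discrete bending energy over a polyline. This is a minimal sketch of rod elasticity; the angle-squared energy and all names are assumptions, not the simulator's actual discrete model:

```python
import math

def bending_energy(points, stiffness=1.0):
    """Discrete bending energy of a polyline rod: squared turning angle at
    each interior vertex, weighted by the local edge length (a much
    simplified stand-in for a full discrete elastic-rod model)."""
    E = 0.0
    for i in range(1, len(points) - 1):
        e0 = [a - b for a, b in zip(points[i], points[i - 1])]
        e1 = [a - b for a, b in zip(points[i + 1], points[i])]
        l0 = math.sqrt(sum(c * c for c in e0))
        l1 = math.sqrt(sum(c * c for c in e1))
        cos_t = max(-1.0, min(1.0, sum(a * b for a, b in zip(e0, e1)) / (l0 * l1)))
        E += stiffness * math.acos(cos_t) ** 2 / (0.5 * (l0 + l1))
    return E

straight = bending_energy([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)])
bent = bending_energy([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)])
```

A straight configuration carries zero bending energy, while a right-angle kink is penalized; a dynamics solver would move vertices to lower this energy.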
Improving atomic displacement and replacement calculations with physically realistic damage models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordlund, Kai; Zinkle, Steven J.; Sand, Andrea E.
Atomic collision processes are fundamental to numerous advanced materials technologies such as electron microscopy, semiconductor processing and nuclear power generation. Extensive experimental and computer simulation studies over the past several decades provide the physical basis for understanding the atomic-scale processes occurring during primary displacement events. The current international standard for quantifying this energetic particle damage, the Norgett-Robinson-Torrens displacements per atom (NRT-dpa) model, nowadays has several well-known limitations. In particular, the number of radiation defects produced in energetic cascades in metals is only ~1/3 the NRT-dpa prediction, while the number of atoms involved in atomic mixing is about a factor of 30 larger than the dpa value. Here we propose two new complementary estimators of displacement production (athermal recombination corrected dpa, arc-dpa) and atomic mixing (replacements per atom, rpa) that extend the NRT-dpa by providing more physically realistic descriptions of primary defect creation in materials and may become additional standard measures for radiation damage quantification.
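A minimal sketch of the two displacement measures, assuming the standard piecewise NRT form and the published arc-dpa efficiency shape. The iron-like parameter values used below should be treated as illustrative:

```python
def nrt_dpa(T_dam, E_d):
    """NRT displacements for damage energy T_dam (eV), threshold E_d (eV)."""
    if T_dam < E_d:
        return 0.0
    if T_dam < 2.0 * E_d / 0.8:
        return 1.0
    return 0.8 * T_dam / (2.0 * E_d)

def arc_dpa(T_dam, E_d, b, c):
    """arc-dpa: NRT scaled by a surviving-defect efficiency xi(T_dam) that
    falls from 1 toward c at high cascade energy (sketch of the published
    functional form; b and c are material fit parameters)."""
    if T_dam < 2.0 * E_d / 0.8:
        return nrt_dpa(T_dam, E_d)
    xi = (1.0 - c) / (2.0 * E_d / 0.8) ** b * T_dam ** b + c
    return xi * nrt_dpa(T_dam, E_d)

# iron-like parameters, treated here as illustrative
b_fe, c_fe, E_d_fe = -0.568, 0.286, 40.0
ratio = arc_dpa(1e6, E_d_fe, b_fe, c_fe) / nrt_dpa(1e6, E_d_fe)
```

At high cascade energies the ratio approaches the saturation constant c, reproducing the roughly 1/3-of-NRT defect survival noted in the abstract.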
Mechanical stabilization of the Levitron's realistic model
NASA Astrophysics Data System (ADS)
Olvera, Arturo; De la Rosa, Abraham; Giordano, Claudia M.
2016-11-01
The stability of the magnetic levitation shown by the Levitron was studied by M.V. Berry as a six-degrees-of-freedom Hamiltonian system using an adiabatic approximation. Later, H.R. Dullin found critical spin-rate bounds within which the levitation persists, and R.F. Gans et al. offered numerical results regarding the manifold of initial conditions for which this occurs. In line with this series of works, we first extend the equations of motion to include dissipation for a more realistic model, and then introduce a mechanical forcing to inject energy into the system in order to prevent the Levitron from falling. A systematic study of the flying time as a function of the forcing parameters is carried out, yielding detailed bifurcation diagrams that show an Arnold tongues structure. The stability of these solutions was studied with the help of a novel method to compute the maximum Lyapunov exponent, called MEGNO. The bifurcation diagrams for MEGNO reproduce the same Arnold tongue structure.
Improving atomic displacement and replacement calculations with physically realistic damage models.
Nordlund, Kai; Zinkle, Steven J; Sand, Andrea E; Granberg, Fredric; Averback, Robert S; Stoller, Roger; Suzudo, Tomoaki; Malerba, Lorenzo; Banhart, Florian; Weber, William J; Willaime, Francois; Dudarev, Sergei L; Simeone, David
2018-03-14
Atomic collision processes are fundamental to numerous advanced materials technologies such as electron microscopy, semiconductor processing and nuclear power generation. Extensive experimental and computer simulation studies over the past several decades provide the physical basis for understanding the atomic-scale processes occurring during primary displacement events. The current international standard for quantifying this energetic particle damage, the Norgett-Robinson-Torrens displacements per atom (NRT-dpa) model, nowadays has several well-known limitations. In particular, the number of radiation defects produced in energetic cascades in metals is only ~1/3 the NRT-dpa prediction, while the number of atoms involved in atomic mixing is about a factor of 30 larger than the dpa value. Here we propose two new complementary estimators of displacement production (athermal recombination corrected dpa, arc-dpa) and atomic mixing (replacements per atom, rpa) that extend the NRT-dpa by providing more physically realistic descriptions of primary defect creation in materials and may become additional standard measures for radiation damage quantification.
Cosmic-ray propagation with DRAGON2: I. numerical solver and astrophysical ingredients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evoli, Carmelo; Gaggero, Daniele; Vittino, Andrea
2017-02-01
We present version 2 of the DRAGON code, designed for computing realistic predictions of CR densities in the Galaxy. The code numerically solves the interstellar CR transport equation (including inhomogeneous and anisotropic diffusion, in both space and momentum, advective transport and energy losses) under realistic conditions. The new version includes an updated numerical solver and several models for the astrophysical ingredients involved in the transport equation. Improvements in the accuracy of the numerical solution are proved against analytical solutions and in reference diffusion scenarios. The novel features implemented in the code make it possible to simulate the diverse scenarios proposed to reproduce the most recent measurements of local and diffuse CR fluxes, going beyond the limitations of the homogeneous galactic transport paradigm. To this end, several applications using DRAGON2 are presented as well. This new version allows users to include their own physical models by means of a modular C++ structure.
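A toy 1D analogue of such a transport equation (diffusion plus a source term, with free-escape boundaries) can be solved with a few lines of finite differences. This is a sketch for intuition only, not DRAGON2's solver, and all parameter values are illustrative:

```python
import numpy as np

def solve_diffusion(H=1.0, D=0.5, Q=1.0, nz=101, t_end=5.0):
    """Explicit finite-difference solution of dN/dt = D d2N/dz2 + Q on
    z in [-H, H] with free-escape boundaries N(+-H) = 0: a toy 1D analogue
    of a CR transport equation."""
    z = np.linspace(-H, H, nz)
    dz = z[1] - z[0]
    dt = 0.4 * dz ** 2 / D            # respect the explicit stability limit
    N = np.zeros(nz)
    t = 0.0
    while t < t_end:
        lap = (N[2:] - 2.0 * N[1:-1] + N[:-2]) / dz ** 2
        N[1:-1] += dt * (D * lap + Q)  # boundaries stay pinned at zero
        t += dt
    return z, N

z, N = solve_diffusion()
# the steady state is N(z) = Q (H^2 - z^2) / (2 D), so N(0) tends to Q H^2 / (2 D)
```

Validating a solver against such closed-form steady states is exactly the kind of check the abstract describes for the full code.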
Coherence penalty functional: A simple method for adding decoherence in Ehrenfest dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akimov, Alexey V., E-mail: alexvakimov@gmail.com, E-mail: oleg.prezhdo@rochester.edu; Chemistry Department, Brookhaven National Laboratory, Upton, New York 11973; Long, Run
2014-05-21
We present a new semiclassical approach for the description of decoherence in electronically non-adiabatic molecular dynamics. The method is formulated on the grounds of Ehrenfest dynamics and the Meyer-Miller-Thoss-Stock mapping of the time-dependent Schrödinger equation onto a fully classical Hamiltonian representation. We introduce a coherence penalty functional (CPF) that accounts for decoherence effects by randomizing the wavefunction phase and penalizing the development of coherences in regions of strong non-adiabatic coupling. The performance of the method is demonstrated with several model and realistic systems. Compared to other semiclassical methods tested, the CPF method eliminates artificial interference and improves agreement with fully quantum calculations on the models. When applied to the study of electron transfer dynamics in nanoscale systems, the method shows improved accuracy of the predicted time scales. The simplicity and high computational efficiency of the CPF approach make it an excellent practical candidate for applications to realistic systems.
Taking error into account when fitting models using Approximate Bayesian Computation.
van der Vaart, Elske; Prangle, Dennis; Sibly, Richard M
2018-03-01
Stochastic computer simulations are often the only practical way of answering questions relating to ecological management. However, due to their complexity, such models are difficult to calibrate and evaluate. Approximate Bayesian Computation (ABC) offers an increasingly popular approach to this problem, widely applied across a variety of fields. However, ensuring the accuracy of ABC's estimates has been difficult. Here, we obtain more accurate estimates by incorporating estimation of error into the ABC protocol. We show how this can be done where the data consist of repeated measures of the same quantity and errors may be assumed to be normally distributed and independent. We then derive the correct acceptance probabilities for a probabilistic ABC algorithm, and update the coverage test with which accuracy is assessed. We apply this method, which we call error-calibrated ABC, to a toy example and a realistic 14-parameter simulation model of earthworms that is used in environmental risk assessment. A comparison with exact methods and the diagnostic coverage test show that our approach improves estimation of parameter values and their credible intervals for both models. © 2017 by the Ecological Society of America.
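For orientation, the core ABC idea can be sketched as plain rejection sampling. This baseline deliberately omits the paper's error-calibration step, and the toy Gaussian problem and all names are illustrative:

```python
import random, statistics

def abc_rejection(observed, prior_sampler, simulate, distance,
                  n_draws=20000, eps=0.2, seed=1):
    """Plain rejection ABC: keep parameter draws whose simulated summary
    falls within eps of the observed one (a baseline; the paper's
    error-calibrated variant modifies the acceptance step)."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler(rng)
        if distance(simulate(theta, rng), observed) < eps:
            accepted.append(theta)
    return accepted

# toy problem: infer the mean of a unit-variance Gaussian from a sample mean
post = abc_rejection(
    observed=2.0,
    prior_sampler=lambda rng: rng.uniform(-5.0, 5.0),
    simulate=lambda th, rng: statistics.fmean(rng.gauss(th, 1.0) for _ in range(20)),
    distance=lambda a, b: abs(a - b),
)
```

The accepted draws approximate the posterior; with a 14-parameter earthworm model the same loop applies, only with a costlier `simulate` and a multivariate distance.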
NASA Astrophysics Data System (ADS)
Yu, Jonas C. P.; Wee, H. M.; Yang, P. C.; Wu, Simon
2016-06-01
One of the supply chain risks for hi-tech products is the result of rapid technological innovation; it results in a significant decline in the selling price and demand after the initial launch period. Hi-tech products include computers and consumer communication products. From a practical standpoint, a more realistic replenishment policy is needed to consider the impact of risks, especially when some portion of shortages is lost. In this paper, suboptimal and optimal order policies with partial backordering are developed for a buyer when the component cost, the selling price, and the demand rate decline at a continuous rate. Two mathematical models are derived and discussed: one has a suboptimal solution with a fixed replenishment interval and a simpler computational process; the other has the optimal solution with a varying replenishment interval and a more complicated computational process. The second model results in more profit. Numerical examples are provided to illustrate the two replenishment models. Sensitivity analysis is carried out to investigate the relationship between the parameters and the net profit.
Computational Modeling of Fluid–Structure–Acoustics Interaction during Voice Production
Jiang, Weili; Zheng, Xudong; Xue, Qian
2017-01-01
This paper presents a three-dimensional, first-principles-based fluid-structure-acoustics interaction computer model of voice production, which employs realistic human laryngeal and vocal tract geometries. Self-sustained vibrations, the important convergent-divergent vibration pattern of the vocal folds, and entrainment of the two dominant vibratory modes were captured. Voice-quality-associated parameters including the frequency, open quotient, skewness quotient, and flow rate of the glottal flow waveform were found to be well within the normal physiological ranges. The analogy between the vocal tract and a quarter-wave resonator was demonstrated. The acoustically perturbed flux and pressure inside the glottis were found to be of the same order as their incompressible counterparts, suggesting strong source-filter interactions during voice production. Such a high-fidelity computational model will be useful for investigating a variety of pathological conditions that involve complex vibrations, such as vocal fold paralysis, vocal nodules, and vocal polyps. The model is also an important step toward a patient-specific surgical planning tool that can serve as a no-risk trial-and-error platform for different procedures, such as injection of biomaterials and thyroplastic medialization. PMID:28243588
GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.
Monroy, Javier; Hernandez-Bennets, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier
2017-06-23
This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.
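The filament-dispersion idea can be sketched as Gaussian filaments that are advected by the local wind and spread over time. The function names and parameters below are illustrative assumptions, not GADEN's API:

```python
import math

def filament_concentration(p, center, sigma, q=1.0):
    """Gas concentration contributed by one Gaussian filament carrying an
    amount q of gas, evaluated at point p (all 3D tuples)."""
    r2 = sum((a - b) ** 2 for a, b in zip(p, center))
    norm = q / ((2 * math.pi) ** 1.5 * sigma ** 3)
    return norm * math.exp(-r2 / (2 * sigma ** 2))

def advect_and_grow(center, wind, dt, sigma, growth=0.01):
    """One time step: move a filament with the local wind vector and let it
    spread by molecular diffusion (growth rate is an assumed constant)."""
    new_center = tuple(c + w * dt for c, w in zip(center, wind))
    new_sigma = math.sqrt(sigma ** 2 + growth * dt)
    return new_center, new_sigma

c0 = filament_concentration((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.5)
c1 = filament_concentration((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.5)
center2, sigma2 = advect_and_grow((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 1.0, 0.5)
```

Summing many such filaments, each following a CFD-computed wind field around walls and furniture, yields the plume a simulated gas sensor would read.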
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We address a multiprocessing situation using advanced technologies and distributed applications, including remote ship scenarios and the automation of ship operations.
NASA Technical Reports Server (NTRS)
Stabe, Roy G.; Schwab, John R.
1991-01-01
A 0.767-scale model of a turbine stator designed for the core of a high-bypass-ratio aircraft engine was tested with uniform inlet conditions and with an inlet radial temperature profile simulating engine conditions. The principal measurements were radial and circumferential surveys of stator-exit total temperature, total pressure, and flow angle. The stator-exit flow field was also computed by using a three-dimensional Navier-Stokes solver. Other than temperature, there were no apparent differences in performance due to the inlet conditions. The computed results compared quite well with the experimental results.
Computer Modeling of Non-Isothermal Crystallization
NASA Technical Reports Server (NTRS)
Kelton, K. F.; Narayan, K. Lakshmi; Levine, L. E.; Cull, T. C.; Ray, C. S.
1996-01-01
A realistic computer model for simulating isothermal and non-isothermal phase transformations proceeding by homogeneous and heterogeneous nucleation and interface-limited growth is presented. A new treatment for particle size effects on the crystallization kinetics is developed and is incorporated into the numerical model. Time-dependent nucleation rates, size-dependent growth rates, and surface crystallization are also included. Model predictions are compared with experimental measurements of DSC/DTA peak parameters for the crystallization of lithium disilicate glass as a function of particle size, Pt doping levels, and water content. The quantitative agreement that is demonstrated indicates that the numerical model can be used to extract key kinetic data from easily obtained calorimetric data. The model can also be used to probe nucleation and growth behavior in regimes that are otherwise inaccessible. Based on a fit to data, an earlier prediction that the time-dependent nucleation rate in a DSC/DTA scan can rise above the steady-state value at a temperature higher than the peak in the steady-state rate is demonstrated.
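The nucleation-and-growth bookkeeping can be illustrated with a minimal KJMA-type integration of the extended volume. Constant nucleation and growth rates are assumed here, whereas the paper's model adds time-dependent nucleation, size-dependent growth and surface crystallization:

```python
import math

def fraction_transformed(I, u, t, n_steps=2000):
    """Numerically integrate the extended volume for nucleation rate I and
    interface-limited growth speed u, then apply the Avrami correction
    X = 1 - exp(-X_ext). A sketch of KJMA kinetics, far simpler than the
    paper's full model (no particle-size or surface effects)."""
    dt = t / n_steps
    x_ext = 0.0
    for i in range(n_steps):
        tau = (i + 0.5) * dt          # nucleation instant (midpoint rule)
        radius = u * (t - tau)        # grain nucleated at tau grows until t
        x_ext += I * (4.0 / 3.0) * math.pi * radius ** 3 * dt
    return 1.0 - math.exp(-x_ext)

# constant I and u recover the classic result X = 1 - exp(-(pi/3) I u^3 t^4)
X = fraction_transformed(I=1.0, u=1.0, t=1.0)
```

Replacing the constants I and u with temperature-dependent functions of a DSC/DTA heating schedule turns this loop into a non-isothermal calculation of the kind the abstract describes.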
Numerical analysis of hypersonic turbulent film cooling flows
NASA Technical Reports Server (NTRS)
Chen, Y. S.; Chen, C. P.; Wei, H.
1992-01-01
As a building block, numerical capabilities for predicting heat flux and turbulent flowfields of hypersonic vehicles require extensive model validation. Computational procedures for calculating turbulent flows and heat fluxes for supersonic film cooling with parallel slot injections are described in this study. Two injectant mass flow rates with matched and unmatched pressure conditions, using the database of Holden et al. (1990), are considered. To avoid uncertainties associated with the boundary conditions in testing turbulence models, detailed three-dimensional flowfields of the injection nozzle were calculated. Two computational fluid dynamics codes, GASP and FDNS, with the algebraic Baldwin-Lomax and k-epsilon models with compressibility corrections, were used. It was found that the B-L model, which resolves the near-wall viscous sublayer, is very sensitive to the inlet boundary conditions at the nozzle exit face. The k-epsilon models with improved wall functions are less sensitive to the inlet boundary conditions. The tests show that compressibility corrections are necessary for the k-epsilon model to realistically predict the heat fluxes of hypersonic film cooling problems.
A new method for constructing networks from binary data
NASA Astrophysics Data System (ADS)
van Borkulo, Claudia D.; Borsboom, Denny; Epskamp, Sacha; Blanken, Tessa F.; Boschloo, Lynn; Schoevers, Robert A.; Waldorp, Lourens J.
2014-08-01
Network analysis is entering fields where network structures are unknown, such as psychology and the educational sciences. A crucial step in the application of network models lies in the assessment of network structure. Current methods either have serious drawbacks or are only suitable for Gaussian data. In the present paper, we present a method for assessing network structures from binary data. Although models for binary data are infamous for their computational intractability, we present a computationally efficient model for estimating network structures. The approach, which is based on Ising models as used in physics, combines logistic regression with model selection based on a Goodness-of-Fit measure to identify relevant relationships between variables that define connections in a network. A validation study shows that this method succeeds in revealing the most relevant features of a network for realistic sample sizes. We apply our proposed method to estimate the network of depression and anxiety symptoms from symptom scores of 1108 subjects. Possible extensions of the model are discussed.
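The nodewise idea can be sketched as follows: regress each binary variable on all the others and keep edges whose coefficients are large. The published method uses l1-penalised logistic regression with EBIC model selection; the plain gradient-ascent fit and the fixed threshold below are simplifying assumptions:

```python
import math, random

def logistic_fit(X, y, lr=0.1, n_iter=500):
    """Plain logistic regression by batch gradient ascent (a simplified
    stand-in for the l1-penalised fits used in the published method)."""
    n, p = len(X), len(X[0])
    w, b = [0.0] * p, 0.0
    for _ in range(n_iter):
        gw, gb = [0.0] * p, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            err = yi - 1.0 / (1.0 + math.exp(-z))
            gb += err
            for j in range(p):
                gw[j] += err * xi[j]
        b += lr * gb / n
        w = [wj + lr * gj / n for wj, gj in zip(w, gw)]
    return w

def estimate_network(data, threshold=0.8):
    """Nodewise regressions; an edge is kept when the averaged absolute
    coefficients exceed a (hypothetical) threshold."""
    p = len(data[0])
    coef = [[0.0] * p for _ in range(p)]
    for j in range(p):
        X = [[row[k] for k in range(p) if k != j] for row in data]
        w = logistic_fit(X, [row[j] for row in data])
        for k, wk in zip([k for k in range(p) if k != j], w):
            coef[j][k] = wk
    return [[(abs(coef[j][k]) + abs(coef[k][j])) / 2 > threshold
             for k in range(p)] for j in range(p)]

# synthetic data: variables 0 and 1 agree 90% of the time, variable 2 is noise
rng = random.Random(7)
data = []
for _ in range(600):
    a = rng.random() < 0.5
    b = a if rng.random() < 0.9 else not a
    data.append([int(a), int(b), int(rng.random() < 0.5)])
net = estimate_network(data)
```

On this toy dataset the procedure recovers the single true edge between variables 0 and 1 and leaves the noise variable unconnected.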
An interface finite element model can be used to predict healing outcome of bone fractures.
Alierta, J A; Pérez, M A; García-Aznar, J M
2014-01-01
After fractures, bone can experience different potential outcomes: successful bone consolidation, non-union and bone failure. Although many factors influence fracture healing, experimental studies have shown that the interfragmentary movement (IFM) is one of the main regulators of the course of bone healing. In this sense, computational models may help to improve the development of mechanically based treatments for bone fracture healing. Based on this fact, we propose a combined repair-failure mechanistic computational model to describe bone fracture healing. Despite being a simple model, it correctly estimates the time-course evolution of the IFM compared with in vivo measurements under different mechanical conditions. Therefore, this mathematical approach is especially suitable for modeling the healing response of bone to fractures treated with different mechanical fixators, simulating realistic clinical conditions. This model will be a useful tool to identify factors and define targets for patient-specific therapeutic interventions. © 2013 Published by Elsevier Ltd.
Modeling cation/anion-water interactions in functional aluminosilicate structures.
Richards, A J; Barnes, P; Collins, D R; Christodoulos, F; Clark, S M
1995-02-01
A need for the computer simulation of hydration/dehydration processes in functional aluminosilicate structures has been noted. Full and realistic simulations of these systems can be somewhat ambitious and require the aid of interactive computer graphics to identify key structural/chemical units, both in devising suitable water-ion simulation potentials and in analyzing hydrogen-bonding schemes in the subsequent simulation studies. In this article, the former is demonstrated by assembling a range of essential water-ion potentials. These span the range of formal charges from +4e to -2e, and are evaluated in the context of three types of structure: a porous zeolite, a calcium silicate cement, and a layered clay. As an example of the latter, the computer graphics output from Monte Carlo simulation studies of hydration/dehydration in calcium-zeolite A is presented.
A Comparison of Approximation Modeling Techniques: Polynomial Versus Interpolating Models
NASA Technical Reports Server (NTRS)
Giunta, Anthony A.; Watson, Layne T.
1998-01-01
Two methods of creating approximation models are compared through the calculation of the modeling accuracy on test problems involving one, five, and ten independent variables. Here, the test problems are representative of the modeling challenges typically encountered in realistic engineering optimization problems. The first approximation model is a quadratic polynomial created using the method of least squares. This type of polynomial model has seen considerable use in recent engineering optimization studies due to its computational simplicity and ease of use. However, quadratic polynomial models may be of limited accuracy when the response data to be modeled have multiple local extrema. The second approximation model employs an interpolation scheme known as kriging developed in the fields of spatial statistics and geostatistics. This class of interpolating model has the flexibility to model response data with multiple local extrema. However, this flexibility is obtained at an increase in computational expense and a decrease in ease of use. The intent of this study is to provide an initial exploration of the accuracy and modeling capabilities of these two approximation methods.
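The contrast between the two approximation types can be sketched in a few lines: a least-squares quadratic cannot interpolate data with multiple local extrema, while a Gaussian-kernel interpolator (a minimal stand-in for kriging, with an assumed fixed length scale rather than one estimated from the data) reproduces the training points exactly:

```python
import numpy as np

def quadratic_fit(x, y):
    """Least-squares quadratic response surface in one variable."""
    coef, *_ = np.linalg.lstsq(np.vander(x, 3), y, rcond=None)
    return lambda xq: np.polyval(coef, xq)

def kriging_fit(x, y, length=0.1):
    """Minimal Gaussian-kernel interpolator (zero-mean simple kriging with a
    fixed, assumed length scale; real kriging estimates it from the data)."""
    K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * length ** 2))
    alpha = np.linalg.solve(K + 1e-10 * np.eye(len(x)), y)
    return lambda xq: np.exp(
        -((np.atleast_1d(xq)[:, None] - x[None, :]) ** 2) / (2 * length ** 2)
    ) @ alpha

# response with multiple local extrema: hard for a quadratic, easy to interpolate
x = np.linspace(0.0, 1.0, 9)
y = np.sin(2 * np.pi * x) + 0.3 * np.sin(6 * np.pi * x)
quad, krig = quadratic_fit(x, y), kriging_fit(x, y)
```

The interpolator passes through every sample while the quadratic leaves large residuals, which is exactly the trade-off (flexibility versus simplicity and cost) the study examines.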
Forward Field Computation with OpenMEEG
Gramfort, Alexandre; Papadopoulo, Théodore; Olivi, Emmanuel; Clerc, Maureen
2011-01-01
To recover the sources giving rise to electro- and magnetoencephalography in individual measurements, realistic physiological modeling is required, and accurate numerical solutions must be computed. We present OpenMEEG, which solves the electromagnetic forward problem in the quasistatic regime, for head models with piecewise constant conductivity. The core of OpenMEEG consists of the symmetric Boundary Element Method, which is based on an extended Green Representation theorem. OpenMEEG is able to provide lead fields for four different electromagnetic forward problems: Electroencephalography (EEG), Magnetoencephalography (MEG), Electrical Impedance Tomography (EIT), and intracranial electric potentials (IPs). OpenMEEG is open source and multiplatform. It can be used from Python and Matlab in conjunction with toolboxes that solve the inverse problem; its integration within FieldTrip is operational since release 2.0. PMID:21437231
Heliocentric interplanetary low thrust trajectory optimization program, supplement 1, part 2
NASA Technical Reports Server (NTRS)
Mann, F. I.; Horsewood, J. L.
1978-01-01
The improvements made to the HILTOP electric propulsion trajectory computer program are described. A more realistic propulsion system model was implemented, in which the various thrust subsystem efficiencies and the specific impulse are modeled as variable functions of the power available to the propulsion system. The number of operating thrusters is staged, and the beam voltage is selected from a set of five (or fewer) constant voltages, based upon the application of variational calculus. The constant beam voltages may be optimized individually or collectively. The propulsion system logic is activated by a single program input key in such a manner as to preserve the original HILTOP logic. An analysis describing these features, a complete description of program input quantities, and sample cases of computer output illustrating the program's capabilities are presented.
On the prediction of swirling flowfields found in axisymmetric combustor geometries
NASA Technical Reports Server (NTRS)
Rhode, D. L.; Lilley, D. G.; Mclaughlin, D. K.
1981-01-01
The paper reports research restricted to steady turbulent flow in axisymmetric geometries under low-speed and nonreacting conditions. Numerical computations are performed for a basic two-dimensional axisymmetric flowfield similar to that found in a conventional gas turbine combustor. Calculations include a stairstep boundary representation of the expansion flow, a conventional k-epsilon turbulence model and realistic accommodation of swirl effects. A preliminary evaluation of the accuracy of the computed flowfields is accomplished by comparison with flow visualizations using neutrally buoyant helium-filled soap bubbles as tracer particles. Comparisons of the calculated results show good agreement, although a remaining problem in swirling flows is the accuracy with which the sizes and shapes of the recirculation zones may be predicted, which may be attributed to the quality of the turbulence model.
Levels of detail analysis of microwave scattering from human head models for brain stroke detection
2017-01-01
In this paper, we have presented a microwave scattering analysis from multiple human head models. This study incorporates different levels of detail in the human head models and examines their effect on the microwave scattering phenomenon. Two levels of detail are taken into account: (i) a simplified ellipse-shaped head model and (ii) an anatomically realistic head model, both implemented using 2-D geometry. In addition, the heterogeneous and frequency-dispersive behavior of the brain tissues has also been incorporated into our head models. It is identified during this study that the microwave scattering phenomenon changes significantly once the complexity of the head model is increased by incorporating more details using a magnetic resonance imaging database. It is also found that the microwave scattering results match for both types of head model (i.e., geometrically simple and anatomically realistic) once the measurements are made in the structurally simplified regions. However, the results diverge considerably in the complex areas of the brain due to the arbitrarily shaped interfaces of tissue layers in the anatomically realistic head model. After incorporating the various levels of detail, the solution of the subject microwave scattering problem and the measurement of transmitted and backscattered signals were obtained using the finite element method. A mesh convergence analysis was also performed to achieve error-free results with a minimum number of mesh elements and fewer degrees of freedom in a fast computational time. The results were promising, and the E-field values converged for both the simple and complex geometrical models. However, the E-field difference between the two types of head model at the same reference point differed considerably in magnitude: at a complex location, a difference of 0.04236 V/m was measured, compared to 0.00197 V/m at a simple location.
This study also provides a comparison of direct and iterative solvers, so as to solve the microwave scattering problem in minimum computational time and with minimal memory requirements. The study shows that microwave imaging may be utilized effectively for the detection, localization, and differentiation of different types of brain stroke. The simulation results verify that microwave imaging can exploit the significant contrast between the electric field values of normal and abnormal brain tissues to investigate brain anomalies. Finally, a specific absorption rate analysis compares the effects of microwave exposure on the different types of head model against a safety factor for brain tissues. After a careful survey of the inversion methods in use for microwave head imaging, it is suggested that the contrast source inversion method may be the most suitable and computationally efficient for such problems. PMID:29177115
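The specific absorption rate analysis mentioned above rests on the standard definition SAR = sigma * |E|^2 / rho. A minimal sketch of that computation; the tissue parameters below are illustrative assumptions, not values from the study:

```python
def sar(sigma, e_field, density):
    """Local specific absorption rate (W/kg): SAR = sigma * |E|^2 / rho.

    sigma   : tissue conductivity (S/m)
    e_field : local electric field magnitude (V/m)
    density : tissue mass density (kg/m^3)
    """
    return sigma * e_field ** 2 / density

# Illustrative (assumed) grey-matter parameters at ~1 GHz:
grey_sar = sar(sigma=0.99, e_field=10.0, density=1045.0)
```

In a full model the field magnitude varies point by point, so SAR is evaluated per mesh element and often averaged over 1 g or 10 g of tissue before comparison with safety limits.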
NASA Astrophysics Data System (ADS)
De Geeter, N.; Crevecoeur, G.; Leemans, A.; Dupré, L.
2015-01-01
In transcranial magnetic stimulation (TMS), an applied alternating magnetic field induces an electric field in the brain that can interact with the neural system. It is generally assumed that this induced electric field is the crucial effect exciting a certain region of the brain. More specifically, it is the component of this field parallel to the neuron's local orientation, the so-called effective electric field, that can initiate neuronal stimulation. Deeper insight into the stimulation mechanisms can be acquired through extensive TMS modelling. Most models study simple representations of neurons with assumed geometries, whereas we embed realistic neural trajectories computed using tractography based on diffusion tensor images. This way of modelling ensures a more accurate spatial distribution of the effective electric field that is, in addition, patient and case specific. The case study of this paper focuses on single-pulse stimulation of the left primary motor cortex with a standard figure-of-eight coil. Including realistic neural geometry in the model demonstrates the strong and localized variations of the effective electric field between and along the tracts, due to the interplay of factors such as the tract's position and orientation relative to the TMS coil and the neural trajectory's course along the white and grey matter interface. Furthermore, the influence of changes in coil orientation is studied. Investigating the impact of tissue anisotropy confirms that its contribution is not negligible. Moreover, assuming isotropic tissues leads to errors of the same size as rotating or tilting the coil by 10 degrees. In contrast, the model proves to be less sensitive to the poorly known tissue conductivity values.
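The "effective electric field" described above is the projection of the induced field onto the neuron's local orientation. A minimal sketch of that projection along a discretized tract; the function name and array layout are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def effective_field(e_field, tract_points):
    """Project the local electric field onto the tract's unit tangent.

    e_field      : (N, 3) induced-field vectors sampled at the N tract points
    tract_points : (N, 3) coordinates along the neural trajectory
    Returns the signed effective-field magnitude at each point.
    """
    tangents = np.gradient(tract_points, axis=0)       # local direction
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    return np.einsum("ij,ij->i", e_field, tangents)    # row-wise dot product
```

For a straight tract aligned with a uniform field, the projection simply recovers the field magnitude; along curved tractography streamlines it varies point by point, which is exactly the localized variation the abstract reports.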
Divertor target shape optimization in realistic edge plasma geometry
NASA Astrophysics Data System (ADS)
Dekeyser, W.; Reiter, D.; Baelmans, M.
2014-07-01
Tokamak divertor design for next-step fusion reactors relies heavily on numerical simulations of the plasma edge. Currently, design is mainly done in a forward approach, in which the designer, strongly guided by experience and physical intuition, proposes divertor shapes that are then assessed thoroughly by numerical computation. On the other hand, automated design methods based on optimization have proven very successful in the related field of aerodynamic design. By recasting design objectives and constraints in the framework of a mathematical optimization problem, efficient forward-adjoint algorithms can be used to automatically compute the divertor shape that performs best with respect to the selected edge plasma model and design criteria. In past years, we have extended these methods to automated divertor target shape design using somewhat simplified edge plasma models and geometries. In this paper, we build on and extend that work to apply these shape optimization methods for the first time in a more realistic single-null edge plasma and divertor geometry, as commonly used in current divertor design studies. In a case study with JET-like parameters, we show that the so-called one-shot method is very effective in solving divertor target design problems. Furthermore, detailed shape sensitivity analysis demonstrates that the method, already in its present state, yields physically plausible trends, allowing us to achieve a divertor design with an almost perfectly uniform power load for our particular choice of edge plasma model and design criteria.
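The one-shot method mentioned above advances the state, adjoint, and design variables together, instead of fully converging a flow solve for every design update. A toy sketch on a one-variable model problem (state equation u - q = 0, objective J = 0.5*(u - 1)^2); this is an illustration of the iteration structure only, not the paper's edge plasma formulation:

```python
def one_shot(q=0.0, steps=200, tau=0.5):
    """Toy one-shot optimization: state u, adjoint lam, and design q are
    updated simultaneously, each with a single relaxation sweep per step.
    Lagrangian: L = 0.5*(u - 1)**2 + lam*(u - q), so the adjoint target is
    lam = -(u - 1) and the design gradient is dJ/dq = -lam."""
    u, lam = 0.0, 0.0
    for _ in range(steps):
        u += 0.8 * (q - u)                 # one sweep of the state solver
        lam += 0.8 * (-(u - 1.0) - lam)    # one sweep of the adjoint solver
        q -= tau * (-lam)                  # gradient step on the design
    return q, u
```

The iterates spiral into the optimum q = u = 1 even though neither the state nor the adjoint equation is ever solved exactly at any step, which is the efficiency argument for one-shot methods in shape design.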
Classical and all-floating FETI methods for the simulation of arterial tissues
Augustin, Christoph M.; Holzapfel, Gerhard A.; Steinbach, Olaf
2015-01-01
High-resolution, anatomically realistic computer models of biological soft tissues play a significant role in understanding the function of cardiovascular components in health and disease. However, the computational effort required to handle fine grids that resolve the geometries, as well as sophisticated tissue models, is very challenging. One possibility for deriving a strongly scalable parallel solution algorithm is to consider finite element tearing and interconnecting (FETI) methods. In this study we propose and investigate the application of FETI methods to simulate the elastic behavior of biological soft tissues. As one particular example we choose the artery, which, like most other biological tissues, is characterized by anisotropic and nonlinear material properties. We compare two specific FETI approaches, classical and all-floating, and investigate the numerical behavior of different preconditioning techniques. In comparison to classical FETI, the all-floating approach has advantages not only in implementation but, in many cases, also in the convergence of the global iterative solution method. This behavior is illustrated with numerical examples. We present results of linear elastic simulations that show convergence rates as expected from theory, and results from the more sophisticated nonlinear case, in which we apply a well-known anisotropic model to the realistic geometry of an artery. Although FETI methods are highly applicable to artery simulations, we also discuss some limitations concerning the dependence on material parameters. PMID:26751957
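The core FETI idea, tearing the mesh into subdomains and interconnecting them with Lagrange multipliers that enforce continuity across the interface, can be sketched on a 1D Poisson problem. This toy version solves the small saddle-point system directly, whereas a real FETI solver iterates on the dual (multiplier) problem with preconditioning; everything here is an illustration under those stated simplifications, not the paper's elasticity implementation:

```python
import numpy as np

def feti_two_domain(n=8):
    """Solve -u'' = 1 on (0,1) with u(0) = u(1) = 0, torn at x = 0.5 into
    two subdomains of n linear elements each, tied by one Lagrange
    multiplier enforcing interface continuity u1_I = u2_I."""
    h = 0.5 / n
    # Subdomain stiffness: Dirichlet at the outer end, free interface end.
    K = (np.diag([2.0] * n) - np.diag([1.0] * (n - 1), 1)
         - np.diag([1.0] * (n - 1), -1)) / h
    K[-1, -1] = 1.0 / h                       # only one element at interface
    f = np.full(n, h)                         # load from f(x) = 1
    f[-1] = h / 2                             # half support at interface node
    # Saddle-point system [K1 0 B1^T; 0 K2 B2^T; B1 B2 0] [u1; u2; lam].
    m = 2 * n + 1
    A = np.zeros((m, m))
    b = np.zeros(m)
    A[:n, :n] = K
    A[n:2 * n, n:2 * n] = K
    A[n - 1, -1] = A[-1, n - 1] = 1.0         # interface dof, subdomain 1
    A[2 * n - 1, -1] = A[-1, 2 * n - 1] = -1.0  # interface dof, subdomain 2
    b[:n] = f
    b[n:2 * n] = f
    sol = np.linalg.solve(A, b)
    return sol[n - 1], sol[2 * n - 1]         # interface value from each side
```

Both subdomains recover the same interface value, matching the exact solution u(x) = x(1-x)/2 at x = 0.5, i.e. 0.125; the multiplier plays the role of the interface traction that classical and all-floating FETI variants iterate on.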
Modeling of Flow Transition Using an Intermittency Transport Equation
NASA Technical Reports Server (NTRS)
Suzen, Y. B.; Huang, P. G.
1999-01-01
A new transport equation for the intermittency factor is proposed to model transitional flows. The intermittent behavior of transitional flows is incorporated into the computations by modifying the eddy viscosity, mu(sub t), obtained from a turbulence model, with the intermittency factor, gamma: mu(sub t, sup *) = gamma.mu(sub t). In this paper, Menter's SST model (Menter, 1994) is employed to compute mu(sub t) and other turbulent quantities. The proposed intermittency transport equation can be considered a blending of two models: Steelant and Dick (1996) and Cho and Chung (1992). The former was proposed for near-wall flows and was designed to reproduce the streamwise variation of the intermittency factor in the transition zone following the Dhawan and Narasimha correlation (Dhawan and Narasimha, 1958); the latter was proposed for free shear flows and provides a realistic cross-stream variation of the intermittency profile. The new model was used to predict the T3 series experiments assembled by Savill (1993a, 1993b), including flows with different freestream turbulence intensities and two pressure-gradient cases. For all test cases, good agreement between the computed results and the experimental data is observed.
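The modification mu(sub t, sup *) = gamma.mu(sub t) and the streamwise Dhawan-Narasimha correlation it is designed to reproduce can be sketched directly. The constant 0.412 is the commonly quoted correlation constant; the function names and inputs are illustrative:

```python
import numpy as np

def dhawan_narasimha_gamma(x, x_t, lam):
    """Streamwise intermittency per the Dhawan-Narasimha correlation:
    gamma = 1 - exp(-0.412 * xi**2), xi = (x - x_t)/lam, downstream of the
    transition-onset location x_t, with lam a transition length scale."""
    xi = np.maximum(x - x_t, 0.0) / lam
    return 1.0 - np.exp(-0.412 * xi ** 2)

def modified_eddy_viscosity(mu_t, gamma):
    """Blend laminar (gamma=0) and fully turbulent (gamma=1) behavior."""
    return gamma * mu_t
```

Upstream of x_t the modified eddy viscosity vanishes (laminar flow); far downstream gamma approaches 1 and the unmodified turbulence-model value is recovered.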
Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot
2017-10-01
Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent this relationship mechanistically, based on first principles. Such models are increasingly considered a clinical research tool, with the prospect of ultimately being used as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expense is significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel, computationally efficient reaction-eikonal (R-E) model for computing extracellular potential maps and electrograms. Using a biventricular human electrophysiology model that incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate, by comparison against a high-resolution reaction-diffusion (R-D) bidomain model, that the R-E model predicts extracellular potential fields, electrograms, and body-surface ECGs with high fidelity while offering computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies that attempt to personalize electrophysiological model features.
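The eikonal half of an R-E model supplies wavefront arrival times from a local conduction-velocity field, which then trigger the reaction (cellular) model. As a rough stand-in for a proper eikonal solver, arrival times on a grid can be approximated with Dijkstra's algorithm over 4-neighbor edges; this is an illustration of the idea only, not the authors' method:

```python
import heapq

def eikonal_activation(speed, h=1.0, source=(0, 0)):
    """Approximate activation (arrival) times on a 2D grid: each hop to a
    neighboring cell costs h / (local conduction velocity at that cell)."""
    ny, nx = len(speed), len(speed[0])
    times = [[float("inf")] * nx for _ in range(ny)]
    times[source[0]][source[1]] = 0.0
    pq = [(0.0, source)]
    while pq:
        t, (i, j) = heapq.heappop(pq)
        if t > times[i][j]:
            continue                      # stale queue entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx:
                nt = t + h / speed[ni][nj]
                if nt < times[ni][nj]:
                    times[ni][nj] = nt
                    heapq.heappush(pq, (nt, (ni, nj)))
    return times
```

The appeal, mirrored in the abstract's three-orders-of-magnitude savings, is that a single front-propagation sweep like this is vastly cheaper than integrating a stiff reaction-diffusion PDE on a fine mesh.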
Crustal deformation in Great California Earthquake cycles
NASA Technical Reports Server (NTRS)
Li, Victor C.; Rice, James R.
1987-01-01
A model in which coupling is described approximately through a generalized Elsasser model is proposed for computation of the periodic crustal deformation associated with repeated strike-slip earthquakes. The model is found to provide a more realistic physical description of tectonic loading than do simpler kinematic models. Parameters are chosen to model the 1857 and 1906 San Andreas ruptures, and predictions are found to be consistent with data on variations of contemporary surface strain and displacement rates as a function of distance from the 1857 and 1906 rupture traces. Results indicate that the asthenosphere appropriate to describe crustal deformation on the earthquake cycle time scale lies in the lower crust and perhaps the crust-mantle transition zone.
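For contrast with the simpler kinematic models the abstract mentions, the classic screw-dislocation (Savage-Burford) description of interseismic surface velocity near a strike-slip fault can be written in a few lines; the parameter values in the test are illustrative, not fitted to the San Andreas:

```python
import math

def interseismic_velocity(x, v_plate, locking_depth):
    """Fault-parallel surface velocity in the screw-dislocation model:
    v(x) = (v_plate / pi) * atan(x / D), where x is the distance from the
    fault trace and D is the locking depth. Far from the fault the velocity
    approaches +/- v_plate / 2 on the two sides."""
    return (v_plate / math.pi) * math.atan(x / locking_depth)
```

Models with Elsasser-type lithosphere-asthenosphere coupling, as in the paper, predict surface rates that vary over the earthquake cycle, whereas this kinematic profile is time-independent; that difference is what makes the coupled model a more realistic description of tectonic loading.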
SciDAC-Data, A Project to Enable Data-Driven Modeling of Exascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mubarak, M.; Ding, P.; Aliaga, L.
The SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics collected by the Fermilab Data Center on the organization, movement, and consumption of high energy physics (HEP) data. The project will analyze the analysis patterns and data organization used by the NOvA, MicroBooNE, MINERvA, and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization, and workflow structures that are prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We address the use of the SciDAC-Data distributions, acquired from the Fermilab Data Center's analysis workflows and corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular, we describe in detail how the Sequential Access via Metadata (SAM) data handling system, in combination with the dCache/Enstore-based data archive facilities, has been analyzed to develop models of HEP data analysis. We present how the simulation may be used to analyze the impact of design choices in archive facilities.
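The cache-optimization questions described above reduce, at their core, to replaying an access trace through a cache policy and measuring the hit rate. A minimal sketch with an LRU policy and a hypothetical trace; this is not the project's simulator:

```python
from collections import OrderedDict

def lru_hit_rate(accesses, capacity):
    """Replay a trace of dataset accesses (e.g. files requested by HEP jobs)
    through an LRU cache of the given capacity; return the hit rate."""
    cache, hits = OrderedDict(), 0
    for item in accesses:
        if item in cache:
            hits += 1
            cache.move_to_end(item)        # mark as most recently used
        else:
            cache[item] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(accesses)
```

Sweeping the capacity against a realistic trace (the kind of input vector SciDAC-Data aims to provide) shows how much cache an archive front-end needs before the hit rate, and hence tape traffic, stops improving.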
Model-based surgical planning and simulation of cranial base surgery.
Abe, M; Tabuchi, K; Goto, M; Uchino, A
1998-11-01
Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.
Computational aspects in mechanical modeling of the articular cartilage tissue.
Mohammadi, Hadi; Mequanint, Kibret; Herzog, Walter
2013-04-01
This review focuses on the modeling of articular cartilage (at the tissue level), chondrocyte mechanobiology (at the cell level), and a combination of both in a multiscale computation scheme. The primary objective is to evaluate the advantages and disadvantages of the conventional models used to study the mechanics of the articular cartilage tissue and chondrocytes. From monophasic material models, the simplest form, to more complicated multiscale theories, these approaches have been used frequently to model articular cartilage and have contributed significantly to modeling joint mechanics, addressing and resolving numerous issues regarding cartilage mechanics and function. Care is required when selecting among modeling approaches, however, as the choice of model limits the applications for which it is valid. In this review, we discuss the conventional models applicable to some of the mechanical aspects of articular cartilage, such as lubrication, swelling pressure, and chondrocyte mechanics, and address some of the issues associated with current modeling approaches. We then suggest future pathways toward a more realistic modeling strategy for simulating the mechanics of cartilage tissue using multiscale and parallelized finite element methods.
PIV-measured versus CFD-predicted flow dynamics in anatomically realistic cerebral aneurysm models.
Ford, Matthew D; Nikolov, Hristo N; Milner, Jaques S; Lownie, Stephen P; Demont, Edwin M; Kalata, Wojciech; Loth, Francis; Holdsworth, David W; Steinman, David A
2008-04-01
Computational fluid dynamics (CFD) modeling of nominally patient-specific cerebral aneurysms is increasingly being used as a research tool to further understand the development, prognosis, and treatment of brain aneurysms. We have previously developed virtual angiography to indirectly validate CFD-predicted gross flow dynamics against routinely acquired digital subtraction angiograms. Toward a more direct validation, here we compare detailed CFD-predicted velocity fields against those measured using particle image velocimetry (PIV). Two anatomically realistic flow-through phantoms, one a giant internal carotid artery (ICA) aneurysm and the other a basilar artery (BA) tip aneurysm, were constructed of a clear silicone elastomer. The phantoms were placed within a computer-controlled flow loop programmed with representative flow rate waveforms. PIV images were collected on several anterior-posterior (AP) and lateral (LAT) planes. CFD simulations were then carried out using a well-validated in-house solver, based on micro-CT reconstructions of the geometries of the flow-through phantoms and inlet/outlet boundary conditions derived from the flow rates measured during the PIV experiments. PIV and CFD results from the central AP plane of the ICA aneurysm showed a large stable vortex throughout the cardiac cycle. Complex vortex dynamics, captured by both PIV and CFD, persisted throughout the cardiac cycle on the central LAT plane. Velocity vector fields showed good overall agreement. For the BA aneurysm, agreement was more compelling, with both PIV and CFD similarly resolving the dynamics of counter-rotating vortices on both the AP and LAT planes. Despite the imposition of periodic flow boundary conditions for the CFD simulations, cycle-to-cycle fluctuations were evident in the BA aneurysm simulations, and these agreed well, in terms of both amplitude and spatial distribution, with the cycle-to-cycle fluctuations measured by PIV in the same geometry.
The overall good agreement between PIV and CFD suggests that CFD can reliably predict the details of the intra-aneurysmal flow dynamics observed in anatomically realistic in vitro models. Nevertheless, given the various modeling assumptions, this does not prove that such models mimic the actual in vivo hemodynamics, and so validation against in vivo data is encouraged whenever possible.
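Quantifying PIV-vs-CFD agreement of velocity vector fields can be sketched with simple pointwise magnitude and direction metrics; this is an illustrative comparison, not the authors' analysis pipeline:

```python
import numpy as np

def velocity_agreement(v_piv, v_cfd):
    """Compare two planar velocity fields of shape (N, 2) sampled at the
    same points. Returns (mean magnitude difference, mean angular
    difference in radians)."""
    mag_p = np.linalg.norm(v_piv, axis=1)
    mag_c = np.linalg.norm(v_cfd, axis=1)
    mag_diff = np.abs(mag_p - mag_c).mean()
    # Angle between corresponding vectors via the normalized dot product;
    # clip guards against round-off outside [-1, 1] and zero magnitudes.
    cos = np.einsum("ij,ij->i", v_piv, v_cfd) / np.clip(mag_p * mag_c, 1e-12, None)
    ang_diff = np.arccos(np.clip(cos, -1.0, 1.0)).mean()
    return mag_diff, ang_diff
```

Applied plane by plane (e.g. the AP and LAT planes above), such metrics turn "good overall agreement" into numbers that can be tracked across geometries and cardiac phases.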
Evaluation of the new EMAC-SWIFT chemistry climate model
NASA Astrophysics Data System (ADS)
Scheffler, Janice; Langematz, Ulrike; Wohltmann, Ingo; Rex, Markus
2016-04-01
It is well known that the representation of atmospheric ozone chemistry in weather and climate models is essential for a realistic simulation of the atmospheric state. Atmospheric ozone chemistry is usually included in climate simulations by prescribing a climatological ozone field, by including a fast linear ozone scheme in the model, or by using a climate model with complex interactive chemistry. While prescribed climatological ozone fields are often not aligned with the modelled dynamics, a linear ozone scheme may not be applicable over a wide range of climatological conditions. And although interactive chemistry provides a realistic representation of atmospheric chemistry, such simulations are computationally very expensive and hence not suitable for ensemble simulations or simulations with multiple climate change scenarios. A new approach to representing atmospheric chemistry in climate models, one that can cope with non-linearities in ozone chemistry and is applicable to a wide range of climatic states, is the Semi-empirical Weighted Iterative Fit Technique (SWIFT), which is driven by reanalysis data and has been validated against satellite observations and runs of a full chemistry and transport model. SWIFT has recently been implemented in the ECHAM/MESSy (EMAC) chemistry climate model, which uses a modular approach to climate modelling in which individual model components can be switched on and off. Here, we show first results of EMAC-SWIFT simulations and validate them against EMAC simulations using the complex interactive chemistry scheme MECCA, and against observations.
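A fast linear ozone scheme of the kind contrasted with SWIFT above is typically a first-order (Cariolle-type) expansion of the ozone tendency around a climatology. A minimal sketch of one such time step; the coefficient names and the values in the test are placeholders, not from any real coefficient table:

```python
def linear_ozone_step(o3, temp, dt, coeff):
    """Advance an ozone mixing ratio by one time step of a linearized scheme:
    d[O3]/dt ~= c0 + c1*([O3] - [O3]_ref) + c2*(T - T_ref),
    where c0, c1, c2 and the reference state come from a precomputed,
    climatology-derived table (here: a plain dict of assumed entries)."""
    tendency = (coeff["c0"]
                + coeff["c1"] * (o3 - coeff["o3_ref"])
                + coeff["c2"] * (temp - coeff["t_ref"]))
    return o3 + dt * tendency
```

Because the tendency is linear in the departures from the reference state, the scheme is nearly free to evaluate, but, as the abstract notes, it cannot capture strongly non-linear chemistry far from the climatology it was linearized around.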
Application of cellular automata approach for cloud simulation and rendering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christopher Immanuel, W.; Paul Mary Deborrah, S.; Samuel Selvaraj, R.
Current techniques for creating clouds in games and other real-time applications produce static, homogeneous clouds. Such clouds, while viable for real-time applications, lack the organic feel that clouds in nature exhibit. The clouds produced by the cellular automata approach, when viewed over a period of time, deform their initial shape and move in a more organic and dynamic way, and the technique should extend to creating even more cloud shapes in real time under additional forces. Clouds are an essential part of any computer model of a landscape or animation of an outdoor scene, and a realistic animation of clouds is also important in creating scenes for flight simulators, movies, games, and other applications. Our goal was to create a realistic animation of clouds.
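The cellular-automaton cloud models this line of work builds on (Nagel-Raschke-style rules, popularized for graphics by Dobashi et al.) update three boolean fields per cell: humidity (hum), phase transition (act), and cloud (cld). A minimal sketch of one update with periodic boundaries; the exact rule set used by the paper is an assumption here:

```python
import numpy as np

def step(hum, act, cld):
    """One update of a boolean cellular-automaton cloud model on 2D grids:
    a cell activates if it is humid, not already active, and has an active
    4-neighbour; activation consumes humidity and deposits cloud."""
    nbr_act = (np.roll(act, 1, 0) | np.roll(act, -1, 0)
               | np.roll(act, 1, 1) | np.roll(act, -1, 1))
    new_act = ~act & hum & nbr_act
    new_hum = hum & ~act
    new_cld = cld | act
    return new_hum, new_act, new_cld
```

Iterating this rule makes activation fronts sweep through humid regions, leaving cloud cells behind, which is what gives the clouds their organic growth and deformation over time; extra rules (wind advection, random extinction) are usually layered on top.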