National Institute of Standards and Technology Data Gateway
SRD 100 Database for Simulation of Electron Spectra for Surface Analysis (SESSA) (PC database for purchase) This database has been designed to facilitate quantitative interpretation of Auger-electron and X-ray photoelectron spectra and to improve the accuracy of quantitation in routine analysis. The database contains all physical data needed to perform quantitative interpretation of an electron spectrum for a thin-film specimen of given composition. A simulation module provides an estimate of peak intensities as well as the energy and angular distributions of the emitted electron flux.
Debriefing after Human Patient Simulation and Nursing Students' Learning
ERIC Educational Resources Information Center
Benhuri, Gloria
2014-01-01
Human Patient Simulation (HPS) exercises with life-like computerized manikins provide clinical experiences for nursing students in a safe environment followed by debriefing that promotes learning. Quantitative research in techniques to support learning from debriefing is limited. The purpose of the quantitative quasi-experimental study using a…
Establish an Agent-Simulant Technology Relationship (ASTR)
2017-04-14
for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs...Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative ...methodology report [14]. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT
Simulation Of Combat With An Expert System
NASA Technical Reports Server (NTRS)
Provenzano, J. P.
1989-01-01
Proposed expert system predicts outcomes of combat situations. Called "COBRA" (Combat Outcome Based on Rules for Attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experience. Used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, the knowledge-based system enables qualitative as well as quantitative simulations.
Normalized Temperature Contrast Processing in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, along with methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as the emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
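For readers who want the normalization made explicit, a minimal sketch follows; the abstract does not give the author's exact formulas, so the symbols (pixel intensity I, temperature T, subscripts d for a flaw region and s for a nearby sound region, pre-flash reference at t_0) are assumptions:

$$C_I(t) = \frac{I_d(t) - I_d(t_0)}{I_s(t) - I_s(t_0)}, \qquad C_T(t) = \frac{T_d(t) - T_d(t_0)}{T_s(t) - T_s(t_0)}$$

Converting between the two requires the camera's radiometric calibration T = f(I) together with the emissivity and reflection-temperature estimates described above.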
NASA Astrophysics Data System (ADS)
Narita, Y.; Iida, H.; Ebert, S.; Nakamura, T.
1997-12-01
Two independent scatter correction techniques, transmission dependent convolution subtraction (TDCS) and the triple-energy window (TEW) method, were evaluated in terms of quantitative accuracy and noise properties using Monte Carlo simulation (EGS4). Emission projections (primary, scatter, and scatter plus primary) were simulated for three numerical phantoms for ²⁰¹Tl. Data were reconstructed with an ordered-subset EM algorithm including attenuation correction based on noise-less transmission data. The accuracy of the TDCS and TEW scatter corrections was assessed by comparison with the simulated true primary data. The uniform cylindrical phantom simulation demonstrated better quantitative accuracy with TDCS than with TEW (-2.0% vs. 16.7%) and better S/N (6.48 vs. 5.05). A uniform ring myocardial phantom simulation demonstrated better homogeneity with TDCS than TEW in the myocardium; i.e., anterior-to-posterior wall count ratios were 0.99 and 0.76 with TDCS and TEW, respectively. For the MCAT phantom, TDCS provided good visual and quantitative agreement with the simulated true primary image without noticeably increasing the noise after scatter correction. Overall, TDCS proved to be more accurate and less noisy than TEW, facilitating quantitative assessment of physiological functions with SPECT.
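As background for the comparison, the TEW estimate admits a compact standard (trapezoidal) form; the window symbols below are the conventional ones, not values from this study:

$$C_{\text{primary}} \approx C_m - \left( \frac{C_l}{W_l} + \frac{C_r}{W_r} \right) \frac{W_m}{2},$$

where C_m is the count in the main photopeak window of width W_m, and C_l, C_r are counts in the narrow flanking windows of widths W_l and W_r.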
Simulation of FRET dyes allows quantitative comparison against experimental data
NASA Astrophysics Data System (ADS)
Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander
2018-03-01
Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only a few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other, leading to new insights into biomolecular dynamics and function.
Toward canonical ensemble distribution from self-guided Langevin dynamics simulation
NASA Astrophysics Data System (ADS)
Wu, Xiongwu; Brooks, Bernard R.
2011-04-01
This work derives a quantitative description of the conformational distribution in self-guided Langevin dynamics (SGLD) simulations. SGLD simulations employ guiding forces calculated from local average momentums to enhance low-frequency motion. This enhancement in low-frequency motion dramatically accelerates conformational search efficiency, but also induces certain perturbations in conformational distribution. Through the local averaging, we separate properties of molecular systems into low-frequency and high-frequency portions. The guiding force effect on the conformational distribution is quantitatively described using these low-frequency and high-frequency properties. This quantitative relation provides a way to convert between a canonical ensemble and a self-guided ensemble. Using example systems, we demonstrated how to utilize the relation to obtain canonical ensemble properties and conformational distributions from SGLD simulations. This development makes SGLD not only an efficient approach for conformational searching, but also an accurate means for conformational sampling.
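A schematic form of the SGLD equation of motion may help fix ideas; this is a sketch from the general SGLD literature rather than a restatement of this paper's derivation:

$$m_i \dot{v}_i = f_i - \gamma_i m_i v_i + \lambda\, \gamma_i \langle m_i v_i \rangle_L + R_i(t),$$

where ⟨·⟩_L denotes the local average over a short preceding time window, λ is the guiding factor, γ_i the collision frequency, and R_i(t) the random force; setting λ = 0 recovers ordinary Langevin dynamics and the canonical ensemble.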
A Method to Measure and Estimate Normalized Contrast in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, along with methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as the emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
Computer simulation of the metastatic progression.
Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo
2014-01-01
A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.
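Where the abstract mentions analytical growth functions, a common concrete choice in metastasis modeling is Gompertzian growth; this is illustrative background, since the paper's exact functions are not given here:

$$N(t) = N_0 \exp\!\left(\frac{a}{b}\left(1 - e^{-bt}\right)\right),$$

where N(t) is the number of tumor cells, a the initial growth rate, and b its exponential deceleration; the intravasation rate is then typically modeled as proportional to a power of N(t), driving a Poisson-process event rate in the discrete event simulation.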
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
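A minimal sketch of the pseudo-predator construction the abstract describes is given below; the function and variable names are hypothetical, and the paper's algorithm for objectively choosing the bootstrap sample sizes is not reproduced here.

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, n_boot, rng=None):
    """Build one pseudo-predator fatty acid signature with a known diet.

    prey_sigs: dict mapping prey type -> (n_samples, n_fatty_acids) array
    diet:      dict mapping prey type -> true diet proportion (sums to 1)
    n_boot:    dict mapping prey type -> bootstrap sample size
    """
    rng = rng or np.random.default_rng()
    n_fa = next(iter(prey_sigs.values())).shape[1]
    sig = np.zeros(n_fa)
    for prey, samples in prey_sigs.items():
        idx = rng.integers(0, samples.shape[0], size=n_boot[prey])
        sig += diet[prey] * samples[idx].mean(axis=0)  # bootstrap prey mean
    return sig / sig.sum()  # renormalize to a proportional signature
```

Diet estimators are then applied to many such signatures to evaluate their bias and variance against the known diet.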
Smeal, Steven W; Schmitt, Margaret A; Pereira, Ronnie Rodrigues; Prasad, Ashok; Fisk, John D
2017-01-01
To expand the quantitative, systems level understanding and foster the expansion of the biotechnological applications of the filamentous bacteriophage M13, we have unified the accumulated quantitative information on M13 biology into a genetically-structured, experimentally-based computational simulation of the entire phage life cycle. The deterministic chemical kinetic simulation explicitly includes the molecular details of DNA replication, mRNA transcription, protein translation and particle assembly, as well as the competing protein-protein and protein-nucleic acid interactions that control the timing and extent of phage production. The simulation reproduces the holistic behavior of M13, closely matching experimentally reported values of the intracellular levels of phage species and the timing of events in the M13 life cycle. The computational model provides a quantitative description of phage biology, highlights gaps in the present understanding of M13, and offers a framework for exploring alternative mechanisms of regulation in the context of the complete M13 life cycle.
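As an illustration of the deterministic chemical kinetic style of simulation described (not the authors' full genetically structured model), a toy transcription-translation-assembly module might look like the following; all rate constants are invented for the example.

```python
import numpy as np
from scipy.integrate import solve_ivp

k_tx, k_deg = 2.0, 0.2     # mRNA synthesis/decay rates (assumed, 1/min)
k_tl, k_asm = 10.0, 0.05   # translation and assembly-consumption rates (assumed)

def rhs(t, y):
    mrna, coat = y
    return [k_tx - k_deg * mrna,          # transcription minus mRNA decay
            k_tl * mrna - k_asm * coat]   # translation minus use in assembly

sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0])
print(sol.y[:, -1])  # intracellular species levels after 60 minutes
```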
Molecular-level simulations of turbulence and its decay
Gallis, M. A.; Bitter, N. P.; Koehler, T. P.; ...
2017-02-08
Here, we provide the first demonstration that molecular-level methods based on gas kinetic theory and molecular chaos can simulate turbulence and its decay. The direct simulation Monte Carlo (DSMC) method, a molecular-level technique for simulating gas flows that resolves phenomena from molecular to hydrodynamic (continuum) length scales, is applied to simulate the Taylor-Green vortex flow. The DSMC simulations reproduce the Kolmogorov −5/3 law and agree well with the turbulent kinetic energy and energy dissipation rate obtained from direct numerical simulation of the Navier-Stokes equations using a spectral method. This agreement provides strong evidence that molecular-level methods for gases can be used to investigate turbulent flows quantitatively.
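For reference, the law being reproduced is the Kolmogorov inertial-range spectrum,

$$E(k) = C_K\, \varepsilon^{2/3}\, k^{-5/3},$$

where ε is the energy dissipation rate, k the wavenumber, and C_K the Kolmogorov constant.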
Melittin Aggregation in Aqueous Solutions: Insight from Molecular Dynamics Simulations.
Liao, Chenyi; Esai Selvan, Myvizhi; Zhao, Jun; Slimovitch, Jonathan L; Schneebeli, Severin T; Shelley, Mee; Shelley, John C; Li, Jianing
2015-08-20
Melittin is a natural peptide that aggregates in aqueous solutions with paradigmatic monomer-to-tetramer and coil-to-helix transitions. Since little is known about the molecular mechanisms of melittin aggregation in solution, we simulated its self-aggregation process under various conditions. After confirming the stability of a melittin tetramer in solution, we observed—for the first time in atomistic detail—that four separated melittin monomers aggregate into a tetramer. Our simulated dependence of melittin aggregation on peptide concentration, temperature, and ionic strength is in good agreement with prior experiments. We propose that melittin mainly self-aggregates via a mechanism involving the sequential addition of monomers, which is supported by both qualitative and quantitative evidence obtained from unbiased and metadynamics simulations. Moreover, by combining computer simulations and a theory of the electrical double layer, we provide evidence to suggest why melittin aggregation in solution likely stops at the tetramer, rather than forming higher-order oligomers. Overall, our study not only explains prior experimental results at the molecular level but also provides quantitative mechanistic information that may guide the engineering of melittin for higher efficacy and safety.
Simulating the Effects of Alternative Forest Management Strategies on Landscape Structure
Eric J. Gustafson; Thomas Crow
1996-01-01
Quantitative, spatial tools are needed to assess the long-term spatial consequences of alternative management strategies for land use planning and resource management. We constructed a timber harvest allocation model (HARVEST) that provides a visual and quantitative means to predict the spatial pattern of forest openings produced by alternative harvest strategies....
Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.
Lee, Won Hee; Bullmore, Ed; Frangou, Sophia
2017-02-01
There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states and highlight the sensitive dependence of the solutions obtained in simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs.
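A compact sketch of the simulation step described, phase oscillators coupled through an empirical structural connectivity matrix, is shown below; the Euler integration scheme, parameter values, and the use of sin(theta) as a proxy regional signal are illustrative assumptions.

```python
import numpy as np

def kuramoto_fc(C, omega, K=1.0, dt=1e-3, steps=20000, seed=0):
    """Simulate Kuramoto oscillators on connectivity C; return correlations.

    C:     (n, n) structural connectivity (e.g., 66 anatomical regions)
    omega: (n,) natural frequencies of the regional oscillators
    """
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=len(omega))
    signals = np.empty((steps, len(omega)))
    for t in range(steps):
        # d(theta_i)/dt = omega_i + K * sum_j C_ij * sin(theta_j - theta_i)
        coupling = (C * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta = theta + dt * (omega + K * coupling)
        signals[t] = np.sin(theta)  # proxy regional activity
    return np.corrcoef(signals.T)   # simulated functional connectivity
```

Graph metrics are then computed on thresholded versions of the simulated and empirical correlation matrices and compared via relative error.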
Yamaguchi, Takumi; Sakae, Yoshitake; Zhang, Ying; Yamamoto, Sayoko; Okamoto, Yuko; Kato, Koichi
2014-10-06
Exploration of the conformational spaces of flexible biomacromolecules is essential for quantitatively understanding the energetics of their molecular recognition processes. We employed stable isotope- and lanthanide-assisted NMR approaches in conjunction with replica-exchange molecular dynamics (REMD) simulations to obtain atomic descriptions of the conformational dynamics of high-mannose-type oligosaccharides, which harbor intracellular glycoprotein-fate determinants in their triantennary structures. The experimentally validated REMD simulation provided quantitative views of the dynamic conformational ensembles of the complicated, branched oligosaccharides, and indicated significant expansion of the conformational space upon removal of a terminal mannose residue during the functional glycan-processing pathway.
Ion Counting from Explicit-Solvent Simulations and 3D-RISM
Giambaşu, George M.; Luchko, Tyler; Herschlag, Daniel; York, Darrin M.; Case, David A.
2014-01-01
The ionic atmosphere around nucleic acids remains only partially understood at atomic-level detail. Ion counting (IC) experiments provide a quantitative measure of the ionic atmosphere around nucleic acids and, as such, are a natural route for testing quantitative theoretical approaches. In this article, we replicate IC experiments involving duplex DNA in NaCl(aq) using molecular dynamics (MD) simulation, the three-dimensional reference interaction site model (3D-RISM), and nonlinear Poisson-Boltzmann (NLPB) calculations and test against recent buffer-equilibration atomic emission spectroscopy measurements. Further, we outline the statistical mechanical basis for interpreting IC experiments and clarify the use of specific concentration scales. Near physiological concentrations, MD simulation and 3D-RISM estimates are close to experimental results, but at higher concentrations (>0.7 M), both methods underestimate the number of condensed cations and overestimate the number of excluded anions. The effect of DNA charge on ion and water atmosphere extends 20–25 Å from its surface, yielding layered density profiles. Overall, ion distributions from 3D-RISM are relatively close to those from corresponding MD simulations, but with less Na+ binding in grooves and tighter binding to phosphates. NLPB calculations, on the other hand, systematically underestimate the number of condensed cations at almost all concentrations and yield nearly structureless ion distributions that are qualitatively distinct from those generated by both MD simulation and 3D-RISM. These results suggest that MD simulation and 3D-RISM may be further developed to provide quantitative insight into the characterization of the ion atmosphere around nucleic acids and their effect on structure and stability.
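The quantity measured by ion counting can be written as a preferential interaction coefficient; a standard definition (symbols are conventional, not taken verbatim from the paper) is

$$\Gamma_{\pm} = \langle N_{\pm} \rangle - c_{\pm}^{\mathrm{bulk}}\, V,$$

the excess number of cations (or anions) in the volume V relative to a bulk solution at concentration c±; in MD and 3D-RISM this is obtained by integrating the ion density around the DNA and subtracting the bulk contribution.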
Interactive Simulations: Improving Learning Retention in Knowledge-Based Online Training Courses
ERIC Educational Resources Information Center
Boyd, James L.
2017-01-01
The purpose of this quasi-experimental quantitative study was to investigate whether online interactive simulations would provide a positive improvement in learners' ability to apply critical thinking skills in a dangerous work environment. The course in which an improvement in critical thinking skills was the target outcome was a course which…
Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul
2017-02-01
Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters were impactful on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed.
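The quantitative-genetic core of such models can be summarized by the classical breeder's equation; this is background intuition, not the paper's full simulation framework:

$$R = h^2 S,$$

where S is the selection differential imposed by herbicide survival in one generation, h² the narrow-sense heritability of the resistance trait, and R the shift in mean resistance passed to the next generation; iterating this recursion over seasons, together with seed bank dynamics, yields resistance evolution on realistic timescales.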
Predicting mesoscale microstructural evolution in electron beam welding
Rodgers, Theron M.; Madison, Jonathan D.; Tikare, Veena; ...
2016-03-16
Using the kinetic Monte Carlo simulator, Stochastic Parallel PARticle Kinetic Simulator, from Sandia National Laboratories, a user routine has been developed to simulate mesoscale predictions of a grain structure near a moving heat source. Here, we demonstrate the use of this user routine to produce voxelized, synthetic, three-dimensional microstructures for electron-beam welding by comparing them with experimentally produced microstructures. When simulation input parameters are matched to experimental process parameters, qualitative and quantitative agreement for both grain size and grain morphology are achieved. The method is capable of simulating both single- and multipass welds. As a result, the simulations provide an opportunity for not only accelerated design but also the integration of simulation and experiments in design, such that simulations can receive parameter bounds from experiments and, in turn, provide predictions of a resultant microstructure.
USDA-ARS?s Scientific Manuscript database
Agricultural research increasingly is expected to provide precise, quantitative information with an explicit geographic coverage. Limited availability of continuous daily meteorological records often constrains efforts to provide such information through integrated use of simulation models, spatial ...
Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.
ERIC Educational Resources Information Center
Lindahl, William H.; Gardner, James H.
Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast
Pang, Wei; Coghill, George M.
2015-01-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chuang, Claire Y.; Zepeda-Ruiz, Luis A.; Han, Sang M.
2015-06-01
Molecular dynamics simulations were used to study Ge island nucleation and growth on amorphous SiO₂ substrates. This process is relevant in selective epitaxial growth of Ge on Si, for which SiO₂ is often used as a template mask. The islanding process was studied over a wide range of temperatures and fluxes, using a recently proposed empirical potential model for the Si–SiO₂–Ge system. The simulations provide an excellent quantitative picture of the Ge islanding and compare well with detailed experimental measurements. These quantitative comparisons were enabled by an analytical rate model as a bridge between simulations and experiments despite the fact that deposition fluxes accessible in simulations and experiments are necessarily different by many orders of magnitude. In particular, the simulations led to accurate predictions of the critical island size and the scaling of island density as a function of temperature. Lastly, the overall approach used here should be useful not just for future studies in this particular system, but also for molecular simulations of deposition in other materials.
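The analytical rate model referred to is of the standard mean-field nucleation type; in its textbook form for complete condensation, the saturated island density scales as

$$N \propto \left( \frac{F}{D} \right)^{\chi}, \qquad \chi = \frac{i^*}{i^* + 2},$$

where F is the deposition flux, D the adatom surface diffusivity, and i* the critical island size. The exact rate equations used in the paper may differ, but scaling of this kind is what allows simulations at high flux to be compared with experiments at much lower flux.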
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, in which two key questions are faced: first, how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; second, how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning and to find a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework for land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, which emphasizes quantitative process, effectively complemented the SA method, which emphasizes qualitative process. This combination unites qualitative and quantitative land-use planning work and provides a new approach for land-use planning and the sustainable management of land resources.
Systems Biology in Immunology – A Computational Modeling Perspective
Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.
2011-01-01
Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease.
Hakkaart, Xavier D V; Pronk, Jack T; van Maris, Antonius J A
2017-01-01
Understanding microbial growth and metabolism is a key learning objective of microbiology and biotechnology courses, essential for understanding microbial ecology, microbial biotechnology and medical microbiology. Chemostat cultivation, a key research tool in microbial physiology that enables quantitative analysis of growth and metabolism under tightly defined conditions, provides a powerful platform to teach key features of microbial growth and metabolism. Substrate-limited chemostat cultivation can be mathematically described by four equations. These encompass mass balances for biomass and substrate, an empirical relation that describes distribution of consumed substrate over growth and maintenance energy requirements (Pirt equation), and a Monod-type equation that describes the relation between substrate concentration and substrate-consumption rate. The authors felt that the abstract nature of these mathematical equations and a lack of visualization contributed to a suboptimal operative understanding of quantitative microbial physiology among students who followed their Microbial Physiology B.Sc. courses. The studio-classroom workshop presented here was developed to improve student understanding of quantitative physiology by a set of question-guided simulations. Simulations are run on Chemostatus, a specially developed MATLAB-based program, which visualizes key parameters of simulated chemostat cultures as they proceed from dynamic growth conditions to steady state. In practice, the workshop stimulated active discussion between students and with their teachers. Moreover, its introduction coincided with increased average exam scores for questions on quantitative microbial physiology. The workshop can be easily implemented in formal microbial physiology courses or used by individuals seeking to test and improve their understanding of quantitative microbial physiology and/or chemostat cultivation.
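The four equations named in the abstract take the following standard form (the notation here is the conventional one and may differ from that used in the course materials):

$$\frac{dC_x}{dt} = (\mu - D)\,C_x \qquad \text{(biomass balance)}$$
$$\frac{dC_s}{dt} = D\,(C_{s,\mathrm{in}} - C_s) - q_s C_x \qquad \text{(substrate balance)}$$
$$q_s = \frac{\mu}{Y_{xs}^{\max}} + m_s \qquad \text{(Pirt)}$$
$$\mu = \mu_{\max}\,\frac{C_s}{K_s + C_s} \qquad \text{(Monod)}$$

where D is the dilution rate, C_x and C_s the biomass and substrate concentrations, q_s the biomass-specific substrate consumption rate, and m_s the maintenance coefficient; at steady state dC_x/dt = dC_s/dt = 0 and hence μ = D.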
Montgomery, Kymberlee; Morse, Catherine; Smith-Glasgow, Mary Ellen; Posmontier, Bobbie; Follen, Michele
2012-02-01
This manuscript presents the methodology used to assess the impact of a clinical simulation module used for training providers specializing in women's health. The methodology presented here will be used for a quantitative study in the future.
NASA Astrophysics Data System (ADS)
Crespo Ramos, Edwin O.
This research was aimed at establishing the differences, if any, between traditional direct teaching and constructive teaching through the use of computer simulations, and their effect on pre-service teachers. It was also intended to gain feedback on the users of these simulations as providers of constructive teaching and learning experiences. The experimental framework used a quantitative method with a descriptive focus. The research was guided by two hypotheses and five inquiries. The data were obtained from a group of twenty-nine students from a private metropolitan university in Puerto Rico, all elementary school pre-service teachers. They were divided into two sub-groups: experimental and control. Two means were used to collect data: tests and surveys. Quantitative data were analyzed through the paired-samples t-test and the non-parametric Wilcoxon test. The results of the pre- and post-tests do not provide enough evidence to conclude that using the simulations as learning tools was more effective than traditional teaching; the quantitative results obtained were not sufficient to reject hypothesis Ho1. On the other hand, an overall positive attitude towards these simulations was obtained from the surveys. The importance of including hands-on activities in daily lesson planning was well recognized among the pre-service teachers. After participating and working with these simulations, the pre-service teachers expressed being convinced that they would definitely use them as teaching tools in the classroom. Due to these results, hypothesis Ho2 was rejected. Evidence also showed that pre-service teachers need further professional development to improve their skills in applying these simulations in the classroom environment. The majority of these pre-service teachers showed concern about not being instructed on important aspects of the use of simulation as part of their college education curriculum towards becoming teachers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimminau, G; Nagler, B; Higginbotham, A
2008-06-19
Calculations of the x-ray diffraction patterns from shocked crystals derived from the results of Non-Equilibrium-Molecular-Dynamics (NEMD) simulations are presented. The atomic coordinates predicted by the NEMD simulations combined with atomic form factors are used to generate a discrete distribution of electron density. A Fast-Fourier-Transform (FFT) of this distribution provides an image of the crystal in reciprocal space, which can be further processed to produce quantitative simulated data for direct comparison with experiments that employ picosecond x-ray diffraction from laser-irradiated crystalline targets.
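A minimal sketch of the described pipeline (atomic coordinates binned to an electron-density grid, FFT to reciprocal space, intensity from the squared modulus) is given below; approximating the atomic form factor as a constant weight is a simplification for illustration, since real form factors depend on the scattering vector.

```python
import numpy as np

def diffraction_intensity(coords, box, n=128, f_atom=1.0):
    """Reciprocal-space intensity |F(q)|^2 from atomic coordinates.

    coords: (n_atoms, 3) positions inside a cubic box of side `box`
    """
    density, _ = np.histogramdd(coords, bins=(n, n, n),
                                range=[(0.0, box)] * 3)
    F = np.fft.fftn(f_atom * density)       # structure factor on a grid
    return np.abs(np.fft.fftshift(F)) ** 2  # simulated diffraction pattern
```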
A Computational Model of Liver Iron Metabolism
Mitchell, Simon; Mendes, Pedro
2013-01-01
Iron is essential for all known life due to its redox properties; however, these same properties can also lead to its toxicity in overload through the production of reactive oxygen species. Robust systemic and cellular control are required to maintain safe levels of iron, and the liver seems to be where this regulation is mainly located. Iron misregulation is implicated in many diseases, and as our understanding of iron metabolism improves, the list of iron-related disorders grows. Recent developments have resulted in greater knowledge of the fate of iron in the body and have led to a detailed map of its metabolism; however, a quantitative understanding at the systems level of how its components interact to produce tight regulation remains elusive. A mechanistic computational model of human liver iron metabolism, which includes the core regulatory components, is presented here. It was constructed based on known mechanisms of regulation and on their kinetic properties, obtained from several publications. The model was then quantitatively validated by comparing its results with previously published physiological data, and it is able to reproduce multiple experimental findings. A time course simulation following an oral dose of iron was compared to a clinical time course study and the simulation was found to recreate the dynamics and time scale of the systems response to iron challenge. A disease state simulation of haemochromatosis was created by altering a single reaction parameter that mimics a human haemochromatosis gene (HFE) mutation. The simulation provides a quantitative understanding of the liver iron overload that arises in this disease. This model supports and supplements understanding of the role of the liver as an iron sensor and provides a framework for further modelling, including simulations to identify valuable drug targets and design of experiments to improve further our knowledge of this system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutland, Christopher J.
2009-04-26
The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.
Gao, Fan; Rodriguez, Johanan; Kapp, Susan
2016-06-01
Harness fitting in the body-powered prosthesis remains more art than science due to a lack of consistent and quantitative evaluation. The aim of this study was to develop a mechanical, human-body-shaped apparatus to simulate body-powered upper limb prosthetic usage and evaluate its capability of quantitative examination of harness configuration. The apparatus was built upon a torso of a wooden mannequin and integrated major mechanical joints to simulate terminal device operation. Sensors were used to register cable tension, cable excursion, and grip force simultaneously. The apparatus allowed the scapula to move up to 127 mm laterally and the load cell can measure the cable tension up to 445 N. Our preliminary evaluation highlighted the needs and importance of investigating harness configurations in a systematic and controllable manner. The apparatus allows objective, systematic, and quantitative evaluation of effects of realistic harness configurations and will provide insightful and working knowledge on harness fitting in upper limb amputees using body-powered prosthesis.
Monte Carlo modeling of light-tissue interactions in narrow band imaging.
Le, Du V N; Wang, Quanzeng; Ramella-Roman, Jessica C; Pfefer, T Joshua
2013-01-01
Light-tissue interactions that influence vascular contrast enhancement in narrow band imaging (NBI) have not been the subject of extensive theoretical study. In order to elucidate relevant mechanisms in a systematic and quantitative manner we have developed and validated a Monte Carlo model of NBI and used it to study the effect of device and tissue parameters, specifically, imaging wavelength (415 versus 540 nm) and vessel diameter and depth. Simulations provided quantitative predictions of contrast, including up to 125% improvement in small, superficial vessel contrast for 415 nm over 540 nm. Our findings indicated that absorption, rather than scattering (the mechanism often cited in prior studies), was the dominant factor behind spectral variations in vessel depth-selectivity. Narrow-band images of a tissue-simulating phantom showed good agreement in terms of trends and quantitative values. Numerical modeling represents a powerful tool for elucidating the factors that affect the performance of spectral imaging approaches such as NBI.
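The contrast figures quoted can be read against a common definition (the paper's exact metric is not stated in this abstract):

$$C = \frac{I_{\mathrm{bg}} - I_v}{I_{\mathrm{bg}}},$$

where I_v and I_bg are the detected intensities over the vessel and the adjacent background mucosa; stronger hemoglobin absorption of 415 nm light in superficial vessels lowers I_v and thus raises C.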
Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan
2016-05-01
In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of the detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration goodness is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of the natural radioactivity levels present (⁴⁰K, ²³⁸U and ²³²Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken.
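Once the full-energy-peak efficiency ε has been obtained from the Monte Carlo simulation, a quantitative assessment follows from the usual relation (symbols here are the conventional ones, not values from the paper):

$$A = \frac{N_{\mathrm{peak}}}{\varepsilon\, I_\gamma\, t},$$

where N_peak is the net peak count accumulated over live time t, I_γ the gamma emission probability, and A the activity of the corresponding radionuclide (e.g., ⁴⁰K at 1461 keV).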
Gao, L.; Hagen, N.; Tkaczyk, T.S.
2012-01-01
We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and accuracy of measured fluorophores' emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the use of the proposed full-spectrum imaging technique may result in a three-fold improvement in signal dynamic range compared to what can be achieved in filter-based imaging.
NASA Technical Reports Server (NTRS)
Mercer, Joey; Callantine, Todd; Martin, Lynne
2012-01-01
A recent human-in-the-loop simulation in the Airspace Operations Laboratory (AOL) at NASA's Ames Research Center investigated the robustness of Controller-Managed Spacing (CMS) operations. CMS refers to AOL-developed controller tools and procedures for enabling arrivals to conduct efficient Optimized Profile Descents with sustained high throughput. The simulation provided a rich data set for examining how a traffic management supervisor and terminal-area controller participants used the CMS tools and coordinated to respond to off-nominal events. This paper proposes quantitative measures for characterizing the participants' responses. Case studies of go-around events, replicated during the simulation, provide insights into the strategies employed and the role the CMS tools played in supporting them.
NASA Astrophysics Data System (ADS)
Barrett, Samuel; Webster, Jody
2016-04-01
Numerical simulation of the stratigraphy and sedimentology of carbonate systems (carbonate forward stratigraphic modelling, CFSM) provides significant insight into the physical nature of these systems and the processes which control their development. It also provides the opportunity to quantitatively test conceptual models concerning stratigraphy, sedimentology or geomorphology, and allows us to extend our knowledge either spatially (e.g. between bore holes) or temporally (forwards or backwards in time). The latter is especially important in determining the likely future development of carbonate systems, particularly regarding the effects of climate change. This application, by its nature, requires successful simulation of carbonate systems on short time scales and at high spatial resolutions. Previous modelling attempts have typically focused on scales of kilometers and kilo-years or greater (the scale of entire carbonate platforms), rather than on the scale of centuries or decades and tens to hundreds of meters (the scale of individual reefs). Previous work has identified limitations in common approaches to simulating important reef processes. We present a new CFSM, the Reef Sedimentary Accretion Model (ReefSAM), which is designed to test new approaches to simulating reef-scale processes, with the aim of better simulating the past and future development of coral reefs. Four major features have been tested: 1. A simulation of wave-based hydrodynamic energy with multiple simultaneous directions and intensities, including wave refraction, interaction, and lateral sheltering. 2. Sediment transport simulated as sediment being moved from cell to cell in an iterative fashion until complete deposition (see the sketch after this abstract). 3. A coral growth model including consideration of local wave energy and the composition of the basement substrate (as well as depth). 4. A highly quantitative model testing approach in which dozens of output parameters describing reef morphology and development are compared with observational data. Despite being a test-bed and a work in progress, ReefSAM was able to simulate the Holocene development of One Tree Reef in the southern Great Barrier Reef (Australia) and improved upon previous modelling attempts in terms of both quantitative measures and qualitative outputs, such as the presence of previously un-simulated reef features. Given the success of the model in simulating the Holocene development of OTR, we used it to quantitatively explore the effect of basement substrate depth and morphology on reef maturity and lagoonal filling (as discussed by Purdy and Gischler, 2005). Initial results show a number of non-linear relationships between basement substrate depth, lagoonal filling, and the volume of sand produced on the reef rims and deposited in the lagoon. Lastly, further testing of the model has revealed new challenges which are likely to manifest in any attempt at reef-scale simulation. Subtly different sets of energy direction and magnitude input parameters (different in each time step but with identical probability distributions across the entire model run) resulted in a wide range of quantitative model outputs. Time step length is a likely contributing factor, and the results of further testing to address this challenge will be presented.
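A toy version of the iterative cell-to-cell sediment routing idea is sketched below; this is an illustrative 1D relaxation scheme under an assumed angle-of-repose criterion, not ReefSAM's actual transport rule.

```python
import numpy as np

def move_sediment(height, supply, max_slope=0.1, max_iter=10000):
    """Toy 1D cell-to-cell sediment router (illustrative only).

    Sediment is added to each cell, then repeatedly shifted across the
    steepest interface whenever the local slope exceeds `max_slope`,
    iterating until everything has come to rest (complete deposition).
    """
    h = np.asarray(height, dtype=float) + np.asarray(supply, dtype=float)
    for _ in range(max_iter):
        slope = np.diff(h)
        if not (np.abs(slope) > max_slope).any():
            break                            # complete deposition reached
        i = int(np.argmax(np.abs(slope)))    # steepest interface
        flux = 0.5 * (abs(slope[i]) - max_slope)
        if slope[i] > 0:                     # cell i+1 higher: move downhill
            h[i + 1] -= flux; h[i] += flux
        else:
            h[i] -= flux; h[i + 1] += flux
    return h
```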
A simulation model of IT risk on program trading
NASA Astrophysics Data System (ADS)
Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan
2015-12-01
The biggest difficulty in measuring IT risk in program trading lies in the scarcity of loss data. In view of this situation, the current approach among scholars is to collect reports of IT incidents of all kinds, both domestic and foreign, from courts, networks and other public media, and to base quantitative analysis of IT risk losses on the resulting database. However, an IT risk loss database established by this method can only fuzzily reflect the real situation and cannot offer a fundamental explanation of it. In this paper, based on a study of the concept and steps of MC simulation, we use a computer simulation method: the MC simulation method within the "Program trading simulation system" developed by our team is used to simulate real program trading, and IT risk loss data are obtained through its IT-failure experiments; at the end of the article, the effectiveness of the experimental data is verified. In this way, the approach overcomes the deficiency of the traditional research method and solves the problem of the lack of IT risk data in quantitative research. It also provides researchers with an empirical, simulation-based template for the ideas and process of such studies.
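As a sketch of the frequency-severity style of Monte Carlo loss simulation described, the snippet below draws annual IT-failure counts and per-event losses; the distributions and parameters are illustrative assumptions, not values calibrated from the authors' trading-system experiments.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000          # simulated trading years
lam = 3.0                  # mean IT failures per year (assumed Poisson)
mu, sigma = 10.0, 1.5      # per-event lognormal loss parameters (assumed)

counts = rng.poisson(lam, n_years)
annual_loss = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])
print(f"mean = {annual_loss.mean():,.0f}, "
      f"99% quantile = {np.percentile(annual_loss, 99):,.0f}")  # VaR-style tail
```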
Comparison of GEANT4 very low energy cross section models with experimental data in water.
Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C
2010-09-01
The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
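The core of the quantitative comparison is the two-sample Kolmogorov-Smirnov test; a minimal sketch is shown below, with placeholder data standing in for the simulated cross sections and the experimental reference values (the dedicated statistical toolkit used in the study is not reproduced here).

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
simulated = rng.normal(1.00, 0.10, 200)  # placeholder GEANT4-DNA values
measured = rng.normal(1.05, 0.12, 150)   # placeholder experimental data

stat, p = ks_2samp(simulated, measured)
print(f"KS statistic = {stat:.3f}, p = {p:.3g}")  # compatibility measure
```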
Comparison of numerical model simulations and SFO wake vortex windline measurements
DOT National Transportation Integrated Search
2003-06-23
To provide quantitative support for the Simultaneous Offset Instrument Approach (SOIA) procedure, an extensive data collection effort was undertaken at San Francisco International Airport by the Federal Aviation Administration (FAA, U.S. Dept. of Tra...
Nagayama, T.; Bailey, J. E.; Loisel, G.; ...
2016-02-05
Recently, frequency-resolved iron opacity measurements at electron temperatures of 170–200 eV and electron densities of (0.7–4.0) × 10²² cm⁻³ revealed a 30–400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static-uniform picture of the data interpretation and the dynamic-gradient reality of the experiments, and they will allow us to quantitatively assess the impact of effects neglected in the data interpretation.
Variables affecting learning in a simulation experience: a mixed methods study.
Beischel, Kelly P
2013-02-01
The primary purpose of this study was to test a hypothesized model describing the direct effects of learning variables on anxiety and cognitive learning outcomes in a high-fidelity simulation (HFS) experience. The secondary purpose was to explain and explore student perceptions concerning the qualities and context of HFS affecting anxiety and learning. This study used a mixed methods quantitative-dominant explanatory design with concurrent qualitative data collection to examine variables affecting learning in undergraduate, beginning nursing students (N = 124). Being ready to learn, having a strong auditory-verbal learning style, and being prepared for simulation directly affected anxiety, whereas learning outcomes were directly affected by having strong auditory-verbal and hands-on learning styles. Anxiety did not quantitatively mediate cognitive learning outcomes as theorized, although students qualitatively reported debilitating levels of anxiety. This study advances nursing education science by providing evidence concerning variables affecting learning outcomes in HFS.
Classical Molecular Dynamics Simulation of Nuclear Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devanathan, Ram; Krack, Matthias; Bertolus, Marjorie
2015-10-10
Molecular dynamics simulation is well suited to study primary damage production by irradiation, defect interactions with fission gas atoms, gas bubble nucleation, grain boundary effects on defect and gas bubble evolution in nuclear fuel, and the resulting changes in thermo-mechanical properties. In these simulations, the forces on the ions are dictated by interaction potentials generated by fitting properties of interest to experimental data. The results obtained from the present generation of potentials are qualitatively similar, but quantitatively different. There is a need to refine existing potentials to provide a better representation of the performance of polycrystalline fuel under a variety of operating conditions, and to develop models that are equipped to handle deviations from stoichiometry. In addition to providing insights into fundamental mechanisms governing the behaviour of nuclear fuel, MD simulations can also provide parameters that can be used as inputs for mesoscale models.
Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M
2016-01-01
Comprehensive two-dimensional gas chromatography (GC×GC) provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the GC×GC topography to provide quantitative compound-cognizant interpretation beyond target compound analysis, with petroleum forensics as a practical application. We focus on the GC×GC topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers with highest-to-lowest peak ratio within an injection ranging from 4.86 to 19.6 (precise numbers depend on biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining "match" between samples, without necessitating training data sets. We validate our methods across 34 GC×GC injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster that released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit a statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the GC×GC biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak location variability in both dimensions and compare the results against PCA analysis over the same set of simulated images. Detailed description of the simulation experiment and discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of GC×GC topography. The proposed topographic analysis enables GC×GC forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
Fang, Jiansong; Pang, Xiaocong; Wu, Ping; Yan, Rong; Gao, Li; Li, Chao; Lian, Wenwen; Wang, Qi; Liu, Ai-lin; Du, Guan-hua
2016-05-01
A dataset of 67 berberine derivatives for the inhibition of butyrylcholinesterase (BuChE) was studied based on a combination of quantitative structure-activity relationship models, molecular docking, and molecular dynamics methods. First, a series of berberine derivatives were reported, and their inhibitory activities toward butyrylcholinesterase (BuChE) were evaluated. In 2D quantitative structure-activity relationship studies, the best model, built by partial least-squares, had a conventional correlation coefficient for the training set (R²) of 0.883, a cross-validation correlation coefficient (Q²cv) of 0.777, and a conventional correlation coefficient for the test set (R²pred) of 0.775. The model was also confirmed by Y-randomization examination. In addition, molecular docking and molecular dynamics simulation were performed to better elucidate the inhibitory mechanism of three typical berberine derivatives (berberine, C2, and C55) toward BuChE. The predicted binding free energy results were consistent with the experimental data and showed that the van der Waals energy term (ΔEvdw) difference played the most important role in differentiating the activity among the three inhibitors (berberine, C2, and C55). The developed quantitative structure-activity relationship models provide details on the fine relationship linking structure and activity and offer clues for structural modifications, and the molecular simulation helps to explain the inhibitory mechanism of the three typical inhibitors. In conclusion, the results of this study provide useful clues for new drug design and discovery of BuChE inhibitors from berberine derivatives. © 2015 John Wiley & Sons A/S.
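The fit statistics above follow standard definitions; as a quick illustration of how a training-set R² and a leave-one-out Q² are computed for a PLS model, here is a minimal sketch. The descriptor matrix, activities, and component count are synthetic placeholders, not the paper's berberine dataset.

```python
# Minimal sketch: training-set R^2 and leave-one-out Q^2 for a PLS model.
# Random data stand in for the berberine descriptors and activities.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))        # 50 hypothetical compounds, 10 descriptors
y = X @ rng.normal(size=10) + 0.3 * rng.normal(size=50)

model = PLSRegression(n_components=3).fit(X, y)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - np.sum((y - model.predict(X).ravel()) ** 2) / ss_tot

press = 0.0                          # predictive residual sum of squares
for train, test in LeaveOneOut().split(X):
    m = PLSRegression(n_components=3).fit(X[train], y[train])
    press += ((y[test] - m.predict(X[test]).ravel()) ** 2).item()
q2 = 1 - press / ss_tot
print(f"R2 = {r2:.3f}, Q2(LOO) = {q2:.3f}")
```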
Voltz, Karine; Léonard, Jérémie; Touceda, Patricia Tourón; Conyard, Jamie; Chaker, Ziyad; Dejaegere, Annick; Godet, Julien; Mély, Yves; Haacke, Stefan; Stote, Roland H.
2016-01-01
Molecular dynamics (MD) simulations and time resolved fluorescence (TRF) spectroscopy were combined to quantitatively describe the conformational landscape of the DNA primary binding sequence (PBS) of the HIV-1 genome, a short hairpin targeted by retroviral nucleocapsid proteins implicated in the viral reverse transcription. Three 2-aminopurine (2AP) labeled PBS constructs were studied. For each variant, the complete distribution of fluorescence lifetimes covering 5 orders of magnitude in timescale was measured and the populations of conformers experimentally observed to undergo static quenching were quantified. A binary quantification permitted the comparison of populations from experimental lifetime amplitudes to populations of aromatically stacked 2AP conformers obtained from simulation. Both populations agreed well, supporting the general assumption that quenching of 2AP fluorescence results from pi-stacking interactions with neighboring nucleobases and demonstrating the success of the proposed methodology for the combined analysis of TRF and MD data. Cluster analysis of the latter further identified predominant conformations that were consistent with the fluorescence decay times and amplitudes, providing a structure-based rationalization for the wide range of fluorescence lifetimes. Finally, the simulations provided evidence of local structural perturbations induced by 2AP. The approach presented is a general tool to investigate fine structural heterogeneity in nucleic acid and nucleoprotein assemblies. PMID:26896800
The Obtaining of Oil from an Oil Reservoir.
ERIC Educational Resources Information Center
Dawe, R. A.
1979-01-01
Discusses the mechanics of how an actual oil reservoir works and provides some technical background in physics. An experiment which simulates an oil reservoir and demonstrates quantitatively all the basic concepts of oil reservoir rock properties is also presented. (HM)
Tracking Expected Improvements of Decadal Prediction in Climate Services
NASA Astrophysics Data System (ADS)
Suckling, E.; Thompson, E.; Smith, L. A.
2013-12-01
Physics-based simulation models are ultimately expected to provide the best available (decision-relevant) probabilistic climate predictions, as they can capture the dynamics of the Earth System across a range of situations, including those for which observations for the construction of empirical models are scant if not nonexistent. This fact in itself provides neither evidence that predictions from today's Earth System Models will outperform today's empirical models, nor a guide to the space and time scales on which today's model predictions are adequate for a given purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales. The skill of these forecasts is contrasted with that of state-of-the-art climate models, and the challenges faced by each approach are discussed. The focus is on providing decision-relevant probability forecasts for decision support. An empirical model, known as Dynamic Climatology, is shown to be competitive with CMIP5 climate models on decadal-scale probability forecasts. Contrasting the skill of simulation models not only with each other but also with empirical models can reveal the space and time scales on which a generation of simulation models exploits its physical basis effectively. It can also quantify their ability to add information in the formation of operational forecasts. Difficulties (i) of information contamination, (ii) of the interpretation of probabilistic skill, and (iii) of artificial skill complicate each modelling approach, and are discussed. "Physics free" empirical models provide fixed, quantitative benchmarks for the evaluation of ever more complex climate models that are not available from (inter)comparisons restricted to complex models alone. At present, empirical models can also provide a background term for blending in the formation of probability forecasts from ensembles of simulation models. In weather forecasting this role is filled by the climatological distribution, and it can significantly enhance the value of longer lead-time weather forecasts to those who use them. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast intercomparison and evaluation. This would clarify the extent to which a given generation of state-of-the-art simulation models provides information beyond that available from simpler empirical models. It would also clarify current limitations in using simulation forecasting for decision support. No model-based probability forecast is complete without a quantitative estimate of its own irrelevance; this estimate is likely to increase as a function of lead time. A lack of decision-relevant quantitative skill would not bring the science-based foundation of anthropogenic warming into doubt. Similar levels of skill with empirical models do, however, suggest a clear quantification of limits, as a function of lead time, on the spatial and temporal scales at which decisions based on such model output are expected to prove maladaptive. Failing to clearly state such weaknesses of a given generation of simulation models, while clearly stating their strengths and their foundation, risks the credibility of science in support of policy in the long term.
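Skill comparisons of this kind are often expressed as a relative ignorance (logarithmic) score against the climatological benchmark. The sketch below illustrates the bookkeeping, assuming Gaussian forecast distributions and synthetic observations; it is not the study's Dynamic Climatology model.

```python
# Minimal sketch: ignorance score of a forecast relative to climatology.
# Positive skill (in bits) means the model adds information beyond climatology.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
obs = rng.normal(0.2, 1.0, size=200)     # synthetic verifying observations

clim = norm(loc=0.0, scale=1.1)          # fixed climatological benchmark
fcst = norm(loc=0.15, scale=1.0)         # a "model" forecast distribution

ign_clim = -np.mean(np.log2(clim.pdf(obs)))
ign_fcst = -np.mean(np.log2(fcst.pdf(obs)))
print(f"skill relative to climatology: {ign_clim - ign_fcst:.3f} bits")
```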
Practical Unitary Simulator for Non-Markovian Complex Processes
NASA Astrophysics Data System (ADS)
Binder, Felix C.; Thompson, Jayne; Gu, Mile
2018-06-01
Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction for simulators for a large class of stochastic processes hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of a numerical simulation, by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation, so that its reliability can be improved.
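One standard building block for estimating numerical approximation error is grid-convergence (Richardson) analysis, in which solutions at two mesh resolutions yield an error estimate. This is a sketch of the general technique on a toy integration "simulation", not the authors' methodology.

```python
# Minimal sketch: Richardson extrapolation from two grid resolutions.
import numpy as np

def solve(n):
    """Toy 'simulation': trapezoidal integral of sin(x) on [0, pi], n cells."""
    x = np.linspace(0.0, np.pi, n + 1)
    fx = np.sin(x)
    return float(np.sum((fx[1:] + fx[:-1]) * np.diff(x)) / 2.0)

coarse, fine = solve(32), solve(64)          # refinement ratio r = 2
p = 2                                        # formal order of accuracy
err_fine = (fine - coarse) / (2 ** p - 1)    # Richardson estimate of fine-grid error
print(f"estimated fine-grid error: {err_fine:.2e}")
print(f"extrapolated value: {fine + err_fine:.8f} (exact = 2)")
```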
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R. Chad; Bonifas, Andrew P.; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A.; Huang, Yonggang; West, Dennis P.; Paller, Amy S.; Alam, Murad
2014-01-01
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here we report a skin-like electronics platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of ‘epidermal’ electronics system in a realistic, clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management. PMID:24668927
Quantitative Technique for Comparing Simulant Materials through Figures of Merit
NASA Technical Reports Server (NTRS)
Rickman, Doug; Hoelzer, Hans; Fourroux, Kathy; Owens, Charles; McLemore, Carole; Fikes, John
2007-01-01
The 1989 workshop report Workshop on Production and Uses of Simulated Lunar Materials and the NASA Technical Publication Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage both identified and reinforced a need for a set of standards and requirements for the production and usage of Lunar simulant materials. As NASA prepares to return to the Moon and set out for Mars, a set of early requirements has been developed for simulant materials, and the initial methods to produce and measure those simulants have been defined. Addressed in the requirements document are: 1) a method for evaluating the quality of any simulant of a regolith, 2) the minimum characteristics for simulants of Lunar regolith, and 3) a method to produce simulants needed for NASA's Exploration mission. As an extension of the requirements document, a method to evaluate new and current simulants has been rigorously defined through the mathematics of Figures of Merit (FoM). Requirements and techniques have been developed that allow a simulant provider to compare their product to a standard reference material through Figures of Merit. The standard reference material may be physical material, such as the Apollo core samples, or material properties predicted for a landing site. The simulant provider is not restricted to providing a single "high fidelity" simulant, which may be costly to produce. The provider can now develop "lower fidelity" simulants for engineering applications such as drilling and mobility applications.
NASA Astrophysics Data System (ADS)
Wu, Di; Donovan Wong, Molly; Li, Yuhua; Fajardo, Laurie; Zheng, Bin; Wu, Xizeng; Liu, Hong
2017-12-01
The objective of this study was to quantitatively investigate the ability to distribute microbubbles along the interface between two tissues, in an effort to improve the edge and/or boundary features in phase contrast imaging. The experiments were conducted by employing a custom-designed tissue-simulating phantom, which also simulated a clinical condition in which ligand-targeted microbubbles are self-aggregated on the endothelium of blood vessels surrounding malignant cells. Four different concentrations of microbubble suspension were injected into the phantom: 0%, 0.1%, 0.2%, and 0.4%. A time delay of 5 min was implemented before image acquisition to allow the microbubbles to become distributed at the interface between the acrylic and the cavity simulating a blood vessel segment. For comparison purposes, images were acquired using three system configurations for both projection and tomosynthesis imaging with a fixed radiation dose delivery: conventional low-energy contact mode, low-energy in-line phase contrast, and high-energy in-line phase contrast. The resultant images illustrate the edge feature enhancements in the in-line phase contrast imaging mode when the microbubble concentration is extremely low. The quantitative edge-enhancement-to-noise ratio calculations not only agree with the direct image observations, but also indicate that the edge feature enhancement can be improved by increasing the microbubble concentration. In addition, high-energy in-line phase contrast imaging provided better performance in detecting low-concentration microbubble distributions.
From in silica to in silico: retention thermodynamics at solid-liquid interfaces.
El Hage, Krystel; Bemish, Raymond J; Meuwly, Markus
2018-06-28
The dynamics of solvated molecules at the solid/liquid interface is essential for a molecular-level understanding of the solution thermodynamics in reversed-phase liquid chromatography (RPLC). The heterogeneous nature of the systems and the competing intermolecular interactions make solute retention in RPLC a surprisingly challenging problem, one which benefits greatly from modelling at atomistic resolution. However, the quality of the underlying computational model needs to be sufficiently accurate to provide a realistic description of the energetics and dynamics of the systems, especially for solution-phase simulations. Here, the retention thermodynamics and the retention mechanism of a range of benzene derivatives in C18 stationary-phase chains in contact with water/methanol mixtures are studied using point charge (PC) and multipole (MTP) electrostatic models. The results demonstrate that free energy simulations with a faithful MTP representation of the computational model provide quantitative and molecular-level insight into the thermodynamics of adsorption/desorption in chromatographic systems, while a conventional PC representation fails to do so. This provides a rational basis to develop more quantitative and validated models for the optimization of separation systems.
Huff, G.F.
2004-01-01
The tendency of solutes in input water to precipitate efficiency-lowering scale deposits on the membranes of reverse osmosis (RO) desalination systems is an important factor in determining the suitability of input water for desalination. Simulated input water evaporation can be used as a technique to quantitatively assess the potential for scale formation in RO desalination systems. The technique was demonstrated by simulating the increase in solute concentrations required to form calcite, gypsum, and amorphous silica scales at 25 °C and 40 °C from 23 desalination input waters taken from the literature. Simulation results could be used to quantitatively assess the potential of a given input water to form scale or to compare the potential of a number of input waters to form scale during RO desalination. Simulated evaporation of input waters cannot accurately predict the conditions under which scale will form, owing to the effects of potentially stable supersaturated solutions, solution velocity, and residence time inside RO systems. However, the simulated scale-forming potential of proposed input waters could be compared with the simulated scale-forming potentials and actual scale-forming properties of input waters having documented operational histories in RO systems. This may provide a technique to estimate the actual performance and suitability of proposed input waters during RO.
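The core of the technique is a concentration-factor sweep: multiply the input-water solute concentrations by a factor representing evaporation and check each mineral's ion activity product against its solubility product. Below is a minimal sketch for gypsum alone, assuming ideal activities and an illustrative Ksp; the study used full geochemical speciation at both temperatures.

```python
# Minimal sketch: evaporative concentration until gypsum saturation.
# Ksp value and water composition are illustrative assumptions.
import numpy as np

KSP_GYPSUM = 10 ** -4.58            # illustrative Ksp for CaSO4*2H2O near 25 C
ca0, so4_0 = 2.0e-3, 1.5e-3         # hypothetical input water, mol/L

for cf in np.arange(1.0, 50.0, 0.1):        # evaporative concentration factor
    iap = (ca0 * cf) * (so4_0 * cf)         # ion activity product, ideal solution
    if iap >= KSP_GYPSUM:
        print(f"gypsum saturation reached at concentration factor {cf:.1f}")
        break
else:
    print("no gypsum saturation below a concentration factor of 50")
```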
Rietsch, Stefan H G; Quick, Harald H; Orzada, Stephan
2015-08-01
In this work, the transmit performance and interelement coupling characteristics of radio frequency (RF) antenna microstrip line elements are examined in simulations and measurements. The starting point of the simulations is a microstrip line element loaded with a phantom. Meander structures are then introduced at the ends of the element. The size of the meanders is increased in fixed steps and the magnetic field is optimized. In further simulations, the coupling between identical elements is evaluated for different element spacings and loading conditions. Verification of the simulation results is accomplished by measuring the coupling between two identical elements for four different meander sizes. Image acquisition on a 7 T magnetic resonance imaging (MRI) system provides qualitative and quantitative comparisons to confirm the simulation results. Simulations indicate an optimum range of meander sizes with respect to coupling in all chosen geometric setups. Coupling measurement results are in good agreement with the simulations. Qualitative and quantitative comparisons of the acquired MRI images substantiate the coupling results. The coupling between coil elements in RF antenna arrays consisting of the investigated element types can be optimized under consideration of the central magnetic field strength or efficiency, depending on the desired application.
Leadership Development Through Peer-Facilitated Simulation in Nursing Education.
Brown, Karen M; Rode, Jennifer L
2018-01-01
Baccalaureate nursing graduates must possess leadership skills, yet few opportunities exist to cultivate leadership abilities in a clinical environment. Peer-facilitated learning may increase the leadership skills of competence, self-confidence, self-reflection, and role modeling. Facilitating human patient simulation provides opportunities to develop leadership skills. With faculty supervision, senior baccalaureate students led small-group simulation experiences with sophomore and junior peers and then conducted subsequent debriefings. Quantitative and qualitative descriptive data allowed evaluation of students' satisfaction with this teaching innovation and whether the experience affected students' desire to take on leadership roles. Students expressed satisfaction with the peer-facilitated simulation experience and confidence in mastering the content while developing necessary skills for practice. Peer-facilitated simulation provides an opportunity for leadership development and learning. Study results can inform the development of nursing curricula to best develop the leadership skills of nursing students. [J Nurs Educ. 2018;57(1):53-57.]. Copyright 2018, SLACK Incorporated.
Establishing a Novel Modeling Tool: A Python-Based Interface for a Neuromorphic Hardware System
Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz
2008-01-01
Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated. PMID:19562085
Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system.
Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz
2009-01-01
Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.
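The simulator-independent description language referred to in both records matches the PyNN approach; below is a minimal sketch assuming the PyNN API. Swapping the backend import (for example, a neuromorphic-hardware module in place of pyNN.nest) reruns the identical experiment description on a different platform, which is what enables the quantitative hardware-versus-simulator comparison.

```python
# Minimal sketch assuming the PyNN API; the same script reruns on another
# backend by changing only this import.
import pyNN.nest as sim

sim.setup(timestep=0.1)                               # ms
neurons = sim.Population(100, sim.IF_cond_exp())      # conductance-based cells
noise = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))
sim.Projection(noise, neurons, sim.OneToOneConnector(),
               sim.StaticSynapse(weight=0.01))        # weight in uS
neurons.record("spikes")
sim.run(1000.0)                                       # ms
spiketrains = neurons.get_data().segments[0].spiketrains
sim.end()
print(f"mean rate: {sum(len(st) for st in spiketrains) / 100.0:.2f} Hz")
```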
Kim, K B; Shanyfelt, L M; Hahn, D W
2006-01-01
Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantifying dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, along with animal studies, is necessary.
Study on Contaminant Transportation of a Typical Chemical Industry Park Based on GMS Software
NASA Astrophysics Data System (ADS)
Huang, LinXian; Liu, GuoZhen; Xing, LiTing; Liu, BenHua; Xu, ZhengHe; Yang, LiZhi; Zhu, HebgHua
2018-03-01
A groundwater solute transport model can effectively simulate the transport path, the transport extent, and the concentration of a contaminant, providing quantitative data for groundwater pollution remediation and groundwater resource management. In this study, we selected the modern biological technology research base of Shandong Province as the study area and simulated the pollution characteristics of the typical contaminant cis-1,3-dichloropropene under different operating conditions using GMS software.
Wind Shear/Turbulence Inputs to Flight Simulation and Systems Certification
NASA Technical Reports Server (NTRS)
Bowles, Roland L. (Editor); Frost, Walter (Editor)
1987-01-01
The purpose of the workshop was to provide a forum for industry, universities, and government to assess current status and likely future requirements for application of flight simulators to aviation safety concerns and system certification issues associated with wind shear and atmospheric turbulence. Research findings presented included characterization of wind shear and turbulence hazards based on modeling efforts and quantitative results obtained from field measurement programs. Future research thrusts needed to maximally exploit flight simulators for aviation safety application involving wind shear and turbulence were identified. The conference contained sessions on: Existing wind shear data and simulator implementation initiatives; Invited papers regarding wind shear and turbulence simulation requirements; and Committee working session reports.
MONALISA for stochastic simulations of Petri net models of biochemical systems.
Balazki, Pavel; Lindauer, Klaus; Einloft, Jens; Ackermann, Jörg; Koch, Ina
2015-07-10
The concept of Petri nets (PN) is widely used in systems biology and allows modeling of complex biochemical systems like metabolic systems, signal transduction pathways, and gene expression networks. In particular, PN allow topological analysis based on structural properties, which is important and useful when quantitative (kinetic) data are incomplete or unknown. Knowing the kinetic parameters, the simulation of the time evolution of such models can help to study the dynamic behavior of the underlying system. If the number of involved entities (molecules) is low, a stochastic simulation should be preferred over the classical deterministic approach of solving ordinary differential equations. The Stochastic Simulation Algorithm (SSA) is a common method for such simulations. The combination of qualitative and semi-quantitative PN modeling and stochastic analysis techniques provides a valuable approach in the field of systems biology. Here, we describe the implementation of stochastic analysis in a PN environment. We extended MONALISA - open-source software for the creation, visualization, and analysis of PN - by several stochastic simulation methods. The simulation module offers four simulation modes, among them the stochastic mode with constant firing rates and Gillespie's algorithm in exact and approximate versions. The simulator is operated by a user-friendly graphical interface and accepts input data such as concentrations and reaction rate constants that are common parameters in the biological context. The key features of the simulation module are visualization of simulations, interactive plotting, export of results into a text file, mathematical expressions for describing simulation parameters, and up to 500 parallel simulations of the same parameter set. To illustrate the method we discuss a model for insulin receptor recycling as a case study. We present software that combines the modeling power of Petri nets with stochastic simulation of dynamic processes in a user-friendly environment supported by an intuitive graphical interface. The program offers a valuable alternative to modeling using ordinary differential equations, especially when simulating single-cell experiments with low molecule counts. The ability to use mathematical expressions provides additional flexibility in describing the simulation parameters. The open-source distribution allows further extensions by third-party developers. The software is cross-platform and is licensed under the Artistic License 2.0.
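For reference, the exact SSA named above amounts to repeatedly sampling an exponential waiting time from the total propensity and then choosing a reaction in proportion to its propensity. A minimal sketch for a toy reversible binding reaction follows; the rate constants and molecule counts are illustrative, not from the insulin receptor case study.

```python
# Minimal sketch of Gillespie's exact Stochastic Simulation Algorithm.
import random

def gillespie(x, t_end, k_on=0.001, k_off=0.1):
    """Exact SSA for A + B <-> AB; x = [A, B, AB] molecule counts."""
    t, traj = 0.0, [(0.0, tuple(x))]
    while t < t_end:
        a1 = k_on * x[0] * x[1]              # propensity of A + B -> AB
        a2 = k_off * x[2]                    # propensity of AB -> A + B
        a0 = a1 + a2
        if a0 == 0.0:
            break                            # no reaction can fire
        t += random.expovariate(a0)          # exponential waiting time
        if random.random() * a0 < a1:        # pick reaction by propensity
            x[0] -= 1; x[1] -= 1; x[2] += 1
        else:
            x[0] += 1; x[1] += 1; x[2] -= 1
        traj.append((t, tuple(x)))
    return traj

print(gillespie([100, 80, 0], t_end=50.0)[-1])   # final time and counts
```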
Reilly, Thomas E.; Plummer, Niel; Phillips, Patrick J.; Busenberg, Eurybiades
1994-01-01
Measurements of the concentrations of chlorofluorocarbons (CFCs), tritium, and other environmental tracers can be used to calculate recharge ages of shallow groundwater and estimate rates of groundwater movement. Numerical simulation also provides quantitative estimates of flow rates, flow paths, and mixing properties of the groundwater system. The environmental tracer techniques and the hydraulic analyses each contribute to the understanding and quantification of the flow of shallow groundwater. However, when combined, the two methods provide feedback that improves the quantification of the flow system and provides insight into the processes that are the most uncertain. A case study near Locust Grove, Maryland, is used to investigate the utility of combining groundwater age dating, based on CFCs and tritium, and hydraulic analyses using numerical simulation techniques. The results of the feedback between an advective transport model and the estimates of groundwater ages determined by the CFCs improve a quantitative description of the system by refining the system conceptualization and estimating system parameters. The plausible system developed with this feedback between the advective flow model and the CFC ages is further tested using a solute transport simulation to reproduce the observed tritium distribution in the groundwater. The solute transport simulation corroborates the plausible system developed and also indicates that, for the system under investigation with the data obtained from 0.9-m-long (3-foot-long) well screens, the hydrodynamic dispersion is negligible. Together the two methods enable a coherent explanation of the flow paths and rates of movement while indicating weaknesses in the understanding of the system that will require future data collection and conceptual refinement of the groundwater system.
Numerical simulation of a 100-ton ANFO detonation
NASA Astrophysics Data System (ADS)
Weber, P. W.; Millage, K. K.; Crepeau, J. E.; Happ, H. J.; Gitterman, Y.; Needham, C. E.
2015-03-01
This work describes the results from a US government-owned hydrocode (SHAMRC, Second-Order Hydrodynamic Automatic Mesh Refinement Code) that simulated an explosive detonation experiment with 100,000 kg of Ammonium Nitrate-Fuel Oil (ANFO) and 2,080 kg of Composition B (CompB). The explosive surface charge was nearly hemispherical and was detonated in desert terrain. Two-dimensional axisymmetric (2D) and three-dimensional (3D) simulations were conducted, with the 3D model providing a more accurate representation of the experimental setup geometry. Both 2D and 3D simulations yielded overpressure and impulse waveforms that agreed qualitatively with experiment, including capture of the secondary shock observed in the experiment. The 2D simulation predicted the primary shock arrival time correctly, but the secondary shock arrival time was early. The 2D-predicted impulse waveforms agreed very well with the experiment, especially at later calculation times, and prediction of the early part of the impulse waveform (associated with the initial peak) was quantitatively better for 2D than for 3D. The 3D simulation also predicted the primary shock arrival time correctly, and secondary shock arrival times in 3D were closer to the experiment than in the 2D results. The 3D-predicted impulse waveform had better quantitative agreement than 2D for the later part of the impulse waveform. The results of this numerical study show that SHAMRC may be used reliably to predict phenomena associated with the 100-ton detonation. The ultimate fidelity of the simulations was limited by both computer time and memory. The results obtained provide good accuracy and indicate that the code is well suited to predicting the outcomes of explosive detonations.
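The impulse waveforms compared here are the running time integrals of the overpressure traces. A minimal sketch of that reduction on a synthetic, Friedlander-shaped overpressure pulse follows; all constants are illustrative, not values from the test.

```python
# Minimal sketch: impulse as the cumulative time integral of overpressure.
import numpy as np

t = np.linspace(0.0, 0.1, 2000)                   # time, s
p_peak, t_pos = 50.0e3, 0.04                      # Pa, positive-phase duration
overpressure = p_peak * (1 - t / t_pos) * np.exp(-4 * t / t_pos)

# impulse(t) = integral from 0 to t of overpressure dt' (trapezoidal rule)
impulse = np.concatenate(([0.0], np.cumsum(
    np.diff(t) * 0.5 * (overpressure[1:] + overpressure[:-1]))))
print(f"peak (positive-phase) impulse: {impulse.max():.1f} Pa*s")
```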
2009-06-01
simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based ... Because the model produces multiple potential outcomes, further development and analysis are required before the model is used for large-scale analysis.
Computational simulation of extravehicular activity dynamics during a satellite capture attempt.
Schaffner, G; Newman, D J; Robinson, S K
2000-01-01
A more quantitative approach to the analysis of astronaut extravehicular activity (EVA) tasks is needed because of their increasing complexity, particularly in preparation for the on-orbit assembly of the International Space Station. Existing useful EVA computer analyses produce either high-resolution three-dimensional computer images based on anthropometric representations or empirically derived predictions of astronaut strength based on lean body mass and the position and velocity of body joints but do not provide multibody dynamic analysis of EVA tasks. Our physics-based methodology helps fill the current gap in quantitative analysis of astronaut EVA by providing a multisegment human model and solving the equations of motion in a high-fidelity simulation of the system dynamics. The simulation work described here improves on the realism of previous efforts by including three-dimensional astronaut motion, incorporating joint stops to account for the physiological limits of range of motion, and incorporating use of constraint forces to model interaction with objects. To demonstrate the utility of this approach, the simulation is modeled on an actual EVA task, namely, the attempted capture of a spinning Intelsat VI satellite during STS-49 in May 1992. Repeated capture attempts by an EVA crewmember were unsuccessful because the capture bar could not be held in contact with the satellite long enough for the capture latches to fire and successfully retrieve the satellite.
Voltz, Karine; Léonard, Jérémie; Touceda, Patricia Tourón; Conyard, Jamie; Chaker, Ziyad; Dejaegere, Annick; Godet, Julien; Mély, Yves; Haacke, Stefan; Stote, Roland H
2016-04-20
Molecular dynamics (MD) simulations and time resolved fluorescence (TRF) spectroscopy were combined to quantitatively describe the conformational landscape of the DNA primary binding sequence (PBS) of the HIV-1 genome, a short hairpin targeted by retroviral nucleocapsid proteins implicated in the viral reverse transcription. Three 2-aminopurine (2AP) labeled PBS constructs were studied. For each variant, the complete distribution of fluorescence lifetimes covering 5 orders of magnitude in timescale was measured and the populations of conformers experimentally observed to undergo static quenching were quantified. A binary quantification permitted the comparison of populations from experimental lifetime amplitudes to populations of aromatically stacked 2AP conformers obtained from simulation. Both populations agreed well, supporting the general assumption that quenching of 2AP fluorescence results from pi-stacking interactions with neighboring nucleobases and demonstrating the success of the proposed methodology for the combined analysis of TRF and MD data. Cluster analysis of the latter further identified predominant conformations that were consistent with the fluorescence decay times and amplitudes, providing a structure-based rationalization for the wide range of fluorescence lifetimes. Finally, the simulations provided evidence of local structural perturbations induced by 2AP. The approach presented is a general tool to investigate fine structural heterogeneity in nucleic acid and nucleoprotein assemblies. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao
2014-03-01
Raw signal simulation is a useful tool for system design, mission planning, processing algorithm testing, and inversion algorithm design for Synthetic Aperture Radar (SAR). Due to the wide and highly frequent variation of an aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS) recording data, it is difficult to quantitatively study the sensitivity of the key parameters of an airborne Interferometric SAR (InSAR) system, i.e., the baseline length and inclination, the absolute phase, and the orientation of the antennas, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), the Inertial Measurement Unit (IMU), and the InSAR antennas compounds the issue. An airborne InSAR simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on airborne InSAR applications such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation measurement by Differential InSAR technology. The simulation can also provide a reference for the optimal design of InSAR systems and the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.
O'Connor, Michael; Lee, Caroline; Ellens, Harma; Bentz, Joe
2015-02-01
Current USFDA and EMA guidance for drug transporter interactions is dependent on IC50 measurements as these are utilized in determining whether a clinical interaction study is warranted. It is therefore important not only to standardize transport inhibition assay systems but also to develop uniform statistical criteria with associated probability statements for generation of robust IC50 values, which can be easily adopted across the industry. The current work provides a quantitative examination of critical factors affecting the quality of IC50 fits for P-gp inhibition through simulations of perfect data with randomly added error as commonly observed in the large data set collected by the P-gp IC50 initiative. The types of errors simulated were (1) variability in replicate measures of transport activity; (2) transformations of error-contaminated transport activity data prior to IC50 fitting (such as performed when determining an IC50 for inhibition of P-gp based on efflux ratio); and (3) the lack of well defined "no inhibition" and "complete inhibition" plateaus. The effect of the algorithm used in fitting the inhibition curve (e.g., two or three parameter fits) was also investigated. These simulations provide strong quantitative support for the recommendations provided in Bentz et al. (2013) for the determination of IC50 values for P-gp and demonstrate the adverse effect of data transformation prior to fitting. Furthermore, the simulations validate uniform statistical criteria for robust IC50 fits in general, which can be easily implemented across the industry. A calibration of the t-statistic is provided through calculation of confidence intervals associated with the t-statistic.
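As an illustration of the kind of fit being evaluated, the sketch below generates perfect inhibition data, adds multiplicative replicate error, and performs an untransformed three-parameter fit with a rough confidence interval on the IC50. The functional form (fixed Hill slope) and parameter values are illustrative assumptions, not the study's specification.

```python
# Minimal sketch: three-parameter IC50 fit on noisy transport-activity data.
import numpy as np
from scipy.optimize import curve_fit

def inhibition(c, top, bottom, ic50):
    """Three-parameter inhibition curve (Hill slope fixed at 1)."""
    return bottom + (top - bottom) / (1.0 + c / ic50)

rng = np.random.default_rng(7)
conc = np.logspace(-2, 2, 9)                       # inhibitor concentrations, uM
truth = inhibition(conc, 100.0, 5.0, 1.5)          # "perfect" activity data
activity = truth * (1 + 0.10 * rng.normal(size=conc.size))  # 10% replicate error

popt, pcov = curve_fit(inhibition, conc, activity,
                       p0=(90.0, 1.0, 1.0), bounds=(0.0, np.inf))
ic50, se = popt[2], np.sqrt(pcov[2, 2])
print(f"IC50 = {ic50:.2f} uM, approx 95% CI [{ic50 - 2*se:.2f}, {ic50 + 2*se:.2f}]")
```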
NASA Astrophysics Data System (ADS)
Hardie, Russell C.; Rucci, Michael A.; Dapore, Alexander J.; Karch, Barry K.
2017-07-01
We present a block-matching and Wiener filtering approach to atmospheric turbulence mitigation for long-range imaging of extended scenes. We evaluate the proposed method, along with some benchmark methods, using simulated and real-image sequences. The simulated data are generated with a simulation tool developed by one of the authors. These data provide objective truth and allow for quantitative error analysis. The proposed turbulence mitigation method takes a sequence of short-exposure frames of a static scene and outputs a single restored image. A block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged, and the average image is processed with a Wiener filter to provide deconvolution. An important aspect of the proposed method lies in how we model the degradation point spread function (PSF) for the purposes of Wiener filtering. We use a parametric model that takes into account the level of geometric correction achieved during image registration. This is unlike any method we are aware of in the literature. By matching the PSF to the level of registration in this way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. We also describe a method for estimating the atmospheric coherence diameter (or Fried parameter) from the estimated motion vectors. We provide a detailed performance analysis that illustrates how the key tuning parameters impact system performance. The proposed method is relatively simple computationally, yet it has excellent performance in comparison with state-of-the-art benchmark methods in our study.
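A minimal sketch of the average-then-deconvolve stage described above follows, with the registration step omitted and a fixed Gaussian PSF standing in for the paper's registration-matched parametric PSF model; the scene, PSF width, and noise-to-signal ratio are all assumptions.

```python
# Minimal sketch: average registered frames, then Wiener-deconvolve.
import numpy as np

rng = np.random.default_rng(3)
truth = np.zeros((64, 64))
truth[24:40, 24:40] = 1.0                       # toy extended scene

y, x = np.indices(truth.shape)
psf = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()                                # assumed residual blur after registration

H = np.fft.fft2(np.fft.ifftshift(psf))          # PSF transfer function
frames = [np.real(np.fft.ifft2(np.fft.fft2(truth) * H))
          + 0.02 * rng.normal(size=truth.shape) for _ in range(30)]
avg = np.mean(frames, axis=0)                   # average of "registered" frames

nsr = 1e-3                                      # assumed noise-to-signal ratio
W = np.conj(H) / (np.abs(H) ** 2 + nsr)         # Wiener deconvolution filter
restored = np.real(np.fft.ifft2(np.fft.fft2(avg) * W))
print(f"RMSE averaged: {np.sqrt(np.mean((avg - truth) ** 2)):.4f}, "
      f"restored: {np.sqrt(np.mean((restored - truth) ** 2)):.4f}")
```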
O'Connor, Michael; Lee, Caroline; Ellens, Harma; Bentz, Joe
2015-01-01
Current USFDA and EMA guidance for drug transporter interactions is dependent on IC50 measurements as these are utilized in determining whether a clinical interaction study is warranted. It is therefore important not only to standardize transport inhibition assay systems but also to develop uniform statistical criteria with associated probability statements for generation of robust IC50 values, which can be easily adopted across the industry. The current work provides a quantitative examination of critical factors affecting the quality of IC50 fits for P-gp inhibition through simulations of perfect data with randomly added error as commonly observed in the large data set collected by the P-gp IC50 initiative. The types of errors simulated were (1) variability in replicate measures of transport activity; (2) transformations of error-contaminated transport activity data prior to IC50 fitting (such as performed when determining an IC50 for inhibition of P-gp based on efflux ratio); and (3) the lack of well defined “no inhibition” and “complete inhibition” plateaus. The effect of the algorithm used in fitting the inhibition curve (e.g., two or three parameter fits) was also investigated. These simulations provide strong quantitative support for the recommendations provided in Bentz et al. (2013) for the determination of IC50 values for P-gp and demonstrate the adverse effect of data transformation prior to fitting. Furthermore, the simulations validate uniform statistical criteria for robust IC50 fits in general, which can be easily implemented across the industry. A calibration of the t-statistic is provided through calculation of confidence intervals associated with the t-statistic. PMID:25692007
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Lijima, Byron A.
2013-01-01
ISOGAME is designed and developed to assess quantitatively the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observation system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth-orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the observation system of interest. Observation systems other than those based on GNSS can also be analyzed. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules, an International Reference Ionosphere (IRI) model that has been developed by international ionospheric research communities, an observation simulator, visualization software, and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++) that includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.
Tepper, Ronnie
2017-01-01
Background: Workplaces today demand graduates who are prepared with field-specific knowledge, advanced social skills, problem-solving skills, and integration capabilities. Meeting these goals with didactic learning (DL) is becoming increasingly difficult. Enhanced training methods that would better prepare tomorrow's graduates must be more engaging and game-like, such as feedback-based e-learning or simulation-based training, while saving time. Empirical evidence regarding the effectiveness of advanced learning methods is lacking, and objective quantitative research comparing advanced training methods with DL is sparse. Objectives: This quantitative study assessed the effectiveness of a computerized interactive simulator coupled with an instructor who monitored students' progress and provided Web-based immediate feedback. Methods: A low-cost, globally accessible telemedicine simulator, developed at the Technion - Israel Institute of Technology (Haifa, Israel), was used. A previous study in the field of interventional cardiology, evaluating the efficacy of the simulator for enhanced learning via knowledge exams, presented promising results, with average scores rising from 54% before training to 94% after training (n=20, P<.001). Two independent experiments involving obstetrics and gynecology (Ob-Gyn) physicians and senior ultrasound sonographers, with 32 subjects, were conducted using a new interactive concept of the WOZ (Wizard of OZ) simulator platform. The contribution of an instructor to learning outcomes was evaluated by comparing students' knowledge before and after each interactive instructor-led session as well as after fully automated e-learning in the field of Ob-Gyn. Results from objective knowledge tests were analyzed using hypothesis testing and model fitting. Results: A significant advantage (P=.01) was found in favor of the WOZ training approach. Content type and training audience were not significant. Conclusions: This study evaluated the contribution of an integrated teaching environment using a computerized interactive simulator, with an instructor providing immediate Web-based feedback to trainees. Involvement of an instructor in the simulation-based training process provided better learning outcomes; varying the training content and trainee populations did not affect the overall learning gains. PMID:28432039
Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.
2017-01-01
Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.
Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu
2018-06-01
Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to the multiple criteria involved, spanning technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved and a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation is presented in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process - preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess the remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that the systematic method provides a reliable way to quantify the priority of the remedial alternatives.
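To make the ranking step concrete, here is a minimal PROMETHEE II-style sketch with AHP-derived weights taken as given; the alternatives, criteria, scores, and the linear preference function are illustrative assumptions, not the paper's case-study data.

```python
# Minimal sketch: PROMETHEE II net outranking flows for three alternatives.
import numpy as np

# rows = remedial alternatives, columns = criteria (scaled to "larger is better")
scores = np.array([[0.7, 0.4, 0.9],    # alternative A
                   [0.5, 0.8, 0.6],    # alternative B
                   [0.9, 0.3, 0.5]])   # alternative C
weights = np.array([0.5, 0.2, 0.3])    # criterion weights, e.g. from AHP

n = scores.shape[0]
phi_plus, phi_minus = np.zeros(n), np.zeros(n)   # leaving / entering flows
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        pref = np.clip(scores[i] - scores[j], 0.0, 1.0)  # linear preference
        pi_ij = float(weights @ pref)    # aggregated preference of i over j
        phi_plus[i] += pi_ij / (n - 1)
        phi_minus[j] += pi_ij / (n - 1)

net_flow = phi_plus - phi_minus          # PROMETHEE II net outranking flow
for rank, idx in enumerate(np.argsort(-net_flow), start=1):
    print(f"rank {rank}: alternative {'ABC'[idx]} (net flow {net_flow[idx]:+.3f})")
```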
A Comparison of Compressed Sensing and Sparse Recovery Algorithms Applied to Simulation Data
Fan, Ya Ju; Kamath, Chandrika
2016-09-01
The move toward exascale computing for scientific simulations is placing new demands on compression techniques. It is expected that the I/O system will not be able to support the volume of data that is expected to be written out. To enable quantitative analysis and scientific discovery, we are interested in techniques that compress high-dimensional simulation data and can provide perfect or near-perfect reconstruction. In this paper, we explore the use of compressed sensing (CS) techniques to reduce the size of the data before they are written out. Using large-scale simulation data, we investigate how the sufficient sparsity condition and the contrast in the data affect the quality of reconstruction and the degree of compression. Also, we provide suggestions for the practical implementation of CS techniques and compare them with other sparse recovery methods. Finally, our results show that despite longer times for reconstruction, compressed sensing techniques can provide near perfect reconstruction over a range of data with varying sparsity.
A Comparison of Compressed Sensing and Sparse Recovery Algorithms Applied to Simulation Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, Ya Ju; Kamath, Chandrika
The move toward exascale computing for scientific simulations is placing new demands on compression techniques. It is expected that the I/O system will not be able to support the volume of data that is expected to be written out. To enable quantitative analysis and scientific discovery, we are interested in techniques that compress high-dimensional simulation data and can provide perfect or near-perfect reconstruction. In this paper, we explore the use of compressed sensing (CS) techniques to reduce the size of the data before they are written out. Using large-scale simulation data, we investigate how the sufficient sparsity condition and the contrast in the data affect the quality of reconstruction and the degree of compression. Also, we provide suggestions for the practical implementation of CS techniques and compare them with other sparse recovery methods. Finally, our results show that despite longer times for reconstruction, compressed sensing techniques can provide near perfect reconstruction over a range of data with varying sparsity.
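The basic CS workflow studied in both records can be illustrated compactly: a sparse signal is measured with a random Gaussian matrix and recovered by a sparse-recovery solver, here orthogonal matching pursuit. Sizes and sparsity below are illustrative, not drawn from the paper's simulation data.

```python
# Minimal sketch: compressed sensing measurement and OMP reconstruction.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(5)
n, m, k = 256, 80, 8                       # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)   # k-sparse signal

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x                                  # compressed measurements (m << n)

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(A, y)
err = np.linalg.norm(omp.coef_ - x) / np.linalg.norm(x)
print(f"relative reconstruction error: {err:.2e}")
```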
Gravitational Effects on Near Field Flow Structure of Low Density Gas Jets
NASA Technical Reports Server (NTRS)
Griffin, D. W.; Yep, T. W.; Agrawal, A. K.
2005-01-01
Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on the near field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2-second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive, whole-field line-of-sight measurement technique. The flow structure was characterized by distributions of angular deflection and helium mole percentage obtained from color schlieren images taken at 60 Hz. Results show that the jet in microgravity was up to 70 percent wider than that in Earth gravity. The global jet flow oscillations observed in Earth gravity were absent in microgravity, providing direct experimental evidence that the flow instability in the low density jet was buoyancy induced. The paper provides quantitative details of temporal flow evolution as the experiment undergoes change in gravity in the drop tower.
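The conversion from measured refractivity to helium mole fraction typically uses a linear mixing rule for the refractive index. A hedged sketch (textbook refractivity constants at standard conditions, not the calibration actually used in the experiment):

```python
# Recover helium mole fraction X from the mixture refractivity (n - 1) via
# a linear mixing rule: (n_mix - 1) = X*(n_He - 1) + (1 - X)*(n_air - 1).
# Constants below are approximate textbook values, assumed for illustration.
N_AIR = 2.92e-4   # refractivity (n - 1) of air
N_HE = 3.5e-5     # refractivity (n - 1) of helium

def helium_mole_fraction(n_mix: float) -> float:
    return (N_AIR - (n_mix - 1.0)) / (N_AIR - N_HE)

print(helium_mole_fraction(1.0 + 1.6e-4))  # ~0.51 -> roughly 51% helium
```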
How runoff begins (and ends): characterizing hydrologic response at the catchment scale
Mirus, Benjamin B.; Loague, Keith
2013-01-01
Improved understanding of the complex dynamics associated with spatially and temporally variable runoff response is needed to better understand the hydrology component of interdisciplinary problems. The objective of this study was to quantitatively characterize the environmental controls on runoff generation for the range of different streamflow-generation mechanisms illustrated in the classic Dunne diagram. The comprehensive physics-based model of coupled surface-subsurface flow, InHM, is employed in a heuristic mode. InHM has been employed previously to successfully simulate the observed hydrologic response at four diverse, well-characterized catchments, which provides the foundation for this study. The C3 and CB catchments are located within steep, forested terrain; the TW and R5 catchments are located in gently sloping rangeland. The InHM boundary-value problems for these four catchments provide the cornerstones for alternative simulation scenarios designed to address the question of how runoff begins (and ends). Simulated rainfall-runoff events are used to systematically explore the impact of soil-hydraulic properties and rainfall characteristics. This approach facilitates quantitative analysis of both integrated and distributed hydrologic responses at high spatial and temporal resolution over the wide range of environmental conditions represented by the four catchments. The results from 140 unique simulation scenarios illustrate how rainfall intensity/depth, subsurface permeability contrasts, characteristic curve shapes, and topography provide important controls on the hydrologic-response dynamics. The processes by which runoff begins (and ends) are shown, in large part, to be defined by the relative rates of rainfall, infiltration, lateral flow convergence, and storage dynamics within the variably saturated soil layers.
Linking agent-based models and stochastic models of financial markets
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene
2012-01-01
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R Chad; Bonifas, Andrew P; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A; Huang, Yonggang; West, Dennis P; Paller, Amy S; Alam, Murad; Yeo, Woon-Hong; Rogers, John A
2014-10-01
Non-invasive biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here, an electronic sensor platform is reported that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of "epidermal" electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management.
Parallel FEM Simulation of Electromechanics in the Heart
NASA Astrophysics Data System (ADS)
Xia, Henian; Wong, Kwai; Zhao, Xiaopeng
2011-11-01
Cardiovascular disease is the leading cause of death in America. Computer simulation of the complicated dynamics of the heart could provide valuable quantitative guidance for diagnosis and treatment of heart problems. In this paper, we present an integrated numerical model which encompasses the interaction of cardiac electrophysiology, electromechanics, and mechanoelectrical feedback. The model is solved by the finite element method on a Linux cluster and the Cray XT5 supercomputer, Kraken. Dynamic influences between the effects of electromechanical coupling and mechanoelectrical feedback are shown.
Virtual reality laparoscopic simulator for assessment in gynaecology.
Gor, Mounna; McCloy, Rory; Stone, Robert; Smith, Anthony
2003-02-01
A validated virtual reality (VR) laparoscopic simulator, the minimally invasive surgical trainer (MIST) 2, was used to assess the psychomotor skills of 21 gynaecologists (2 consultants, 8 registrars and 11 senior house officers). Nine gynaecologists failed to complete the VR tasks at the first attempt and were excluded from sequential evaluation. Each of the remaining 12 gynaecologists was tested on MIST 2 on four occasions within four weeks. The MIST 2 simulator provided quantitative data on time to complete tasks, errors, economy of movement and economy of diathermy use, for both right- and left-hand performance. The results show a significant early learning curve for the majority of tasks, which plateaued by the third session. This suggests a high quality surgeon-computer interface. MIST 2 provides objective assessment of laparoscopic skills in gynaecologists.
Digital video timing analyzer for the evaluation of PC-based real-time simulation systems
NASA Astrophysics Data System (ADS)
Jones, Shawn R.; Crosby, Jay L.; Terry, John E., Jr.
2009-05-01
Due to the rapid acceleration in technology and the drop in costs, the use of commercial off-the-shelf (COTS) PC-based hardware and software components for digital and hardware-in-the-loop (HWIL) simulations has increased. However, the increase in PC-based components creates new challenges for HWIL test facilities such as cost-effective hardware and software selection, system configuration and integration, performance testing, and simulation verification/validation. This paper will discuss how the Digital Video Timing Analyzer (DiViTA) installed in the Aviation and Missile Research, Development and Engineering Center (AMRDEC) provides quantitative characterization data for PC-based real-time scene generation systems. An overview of the DiViTA is provided followed by details on measurement techniques, applications, and real-world examples of system benefits.
Quantitative trait nucleotide analysis using Bayesian model selection.
Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D
2005-10-01
Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
Quantitative Oxygenation Venography from MRI Phase
Fan, Audrey P.; Bilgic, Berkin; Gagnon, Louis; Witzel, Thomas; Bhat, Himanshu; Rosen, Bruce R.; Adalsteinsson, Elfar
2014-01-01
Purpose: To demonstrate acquisition and processing methods for quantitative oxygenation venograms that map in vivo oxygen saturation (SvO2) along the cerebral venous vasculature. Methods: Regularized quantitative susceptibility mapping (QSM) is used to reconstruct susceptibility values and estimate SvO2 in veins. QSM with ℓ1 and ℓ2 regularization are compared in numerical simulations of vessel structures with known magnetic susceptibility. Dual-echo, flow-compensated phase images are collected in three healthy volunteers to create QSM images. Bright veins in the susceptibility maps are vectorized and used to form a three-dimensional vascular mesh, or venogram, along which to display SvO2 values from QSM. Results: Quantitative oxygenation venograms that map SvO2 along brain vessels of arbitrary orientation and geometry are shown in vivo. SvO2 values in major cerebral veins lie within the normal physiological range reported by 15O positron emission tomography. SvO2 from QSM is consistent with previous MR susceptometry methods for vessel segments oriented parallel to the main magnetic field. In vessel simulations, ℓ1 regularization results in less than 10% SvO2 absolute error across all vessel tilt orientations and provides more accurate SvO2 estimation than ℓ2 regularization. Conclusion: The proposed analysis of susceptibility images enables reliable mapping of quantitative SvO2 along venograms and may facilitate clinical use of venous oxygenation imaging. PMID:24006229
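The susceptibility-to-SvO2 conversion behind such venograms is commonly written as Δχ = Δχ_do · Hct · (1 − SvO2), with the vein susceptibility measured relative to surrounding tissue. A minimal sketch, assuming literature constants (the paper's exact values may differ):

```python
# Standard conversion: delta_chi = dchi_do * Hct * (1 - SvO2).
# Constants are assumed literature values, not parameters from this paper.
DCHI_DO = 4 * 3.14159 * 0.27  # ppm (SI), fully deoxygenated blood per unit Hct
HCT = 0.44                    # assumed hematocrit

def svo2_from_qsm(delta_chi_ppm: float) -> float:
    """Oxygen saturation from a vein-vs-water susceptibility shift in ppm."""
    return 1.0 - delta_chi_ppm / (DCHI_DO * HCT)

print(svo2_from_qsm(0.45))  # ~0.70, within the normal venous range
```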
NASA Astrophysics Data System (ADS)
Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.
2011-11-01
High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
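The core correction step, maximizing the normalized cross-correlation between a simulated basis signal and the measured one, can be sketched as a brute-force shift search (a generic illustration of the idea, not the QM-QUEST code):

```python
import numpy as np

def align_by_xcorr(basis: np.ndarray, signal: np.ndarray, max_shift: int) -> int:
    """Return the shift (in points) of `basis` that maximizes the normalized
    cross-correlation with `signal`, searched over +/- max_shift."""
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(basis, s)
        score = np.dot(shifted, signal) / (
            np.linalg.norm(shifted) * np.linalg.norm(signal) + 1e-12)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

t = np.linspace(0, 1, 1000)
peak = np.exp(-((t - 0.50) / 0.01) ** 2)   # simulated metabolite peak
observed = np.roll(peak, 7)                # pH-induced chemical-shift change
print(align_by_xcorr(peak, observed, 20))  # -> 7
```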
NASA Astrophysics Data System (ADS)
Olivieri, Giorgia; Parry, Krista M.; Powell, Cedric J.; Tobias, Douglas J.; Brown, Matthew A.
2016-04-01
Over the past decade, energy-dependent ambient pressure X-ray photoelectron spectroscopy (XPS) has emerged as a powerful analytical probe of the ion spatial distributions at the vapor (vacuum)-aqueous electrolyte interface. These experiments are often paired with complementary molecular dynamics (MD) simulations in an attempt to provide a complete description of the liquid interface. There is, however, no systematic protocol that permits a straightforward comparison of the two sets of results. XPS is an integrated technique that averages signals from multiple layers in a solution even at the lowest photoelectron kinetic energies routinely employed, whereas MD simulations provide a microscopic layer-by-layer description of the solution composition near the interface. Here, we use the National Institute of Standards and Technology database for the Simulation of Electron Spectra for Surface Analysis (SESSA) to quantitatively interpret atom-density profiles from MD simulations for XPS signal intensities using sodium and potassium iodide solutions as examples. We show that electron inelastic mean free paths calculated from a semi-empirical formula depend strongly on solution composition, varying by up to 30% between pure water and concentrated NaI. The XPS signal thus arises from different information depths in different solutions for a fixed photoelectron kinetic energy. XPS signal intensities are calculated using SESSA as a function of photoelectron kinetic energy (probe depth) and compared with a widely employed ad hoc method. SESSA simulations illustrate the importance of accounting for elastic-scattering events at low photoelectron kinetic energies (<300 eV) where the ad hoc method systematically underestimates the preferential enhancement of anions over cations. Finally, some technical aspects of applying SESSA to liquid interfaces are discussed.
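The "ad hoc method" referred to above amounts to depth-weighting the MD density profile by an exponential attenuation factor. A minimal sketch, with invented profile and IMFP values; unlike SESSA, it neglects elastic scattering:

```python
import numpy as np

def xps_intensity(z, n_z, imfp, theta_deg=0.0):
    """Depth-integrate an atom-density profile n(z) with exponential
    attenuation, I ~ sum n(z) * exp(-z / (IMFP * cos(theta))) * dz.
    This is the simple ad hoc weighting; elastic scattering is ignored."""
    dz = z[1] - z[0]
    w = np.exp(-z / (imfp * np.cos(np.radians(theta_deg))))
    return np.sum(n_z * w) * dz

z = np.linspace(0.0, 60.0, 601)                          # depth, angstroms
n_iodide = 1.0 + 2.0 * np.exp(-((z - 3.0) / 2.0) ** 2)   # toy surface-enhanced anion
print(xps_intensity(z, n_iodide, imfp=10.0))             # arbitrary units
```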
Solheim, Elisabeth; Plathe, Hilde Syvertsen; Eide, Hilde
2017-11-01
Clinical skills training is an important part of nurses' education programmes. Clinical skills are complex, and a common understanding of what characterizes clinical skills and learning outcomes needs to be established. The aim of the study was to develop and evaluate a new reflection and feedback tool for formative assessment. The study has a descriptive quantitative design. 129 students participated who were at the end of the first year of a Bachelor degree in nursing. After high-fidelity simulation, data were collected using a questionnaire with 19 closed-ended and 2 open-ended questions. The tool stimulated peer assessment and enabled students to be more thorough in what to assess as an observer in clinical skills. The tool provided a structure for self-assessment and made visible items that are important to be aware of in clinical skills. This article adds to the simulation literature and provides a tool that is useful in enhancing peer learning, which is essential for nurses in practice. The tool has potential for enabling students to learn about reflection and for developing skills for guiding others in practice after they have graduated.
Study of the Imaging Capabilities of SPIRIT/SPECS Concept Interferometers
NASA Technical Reports Server (NTRS)
Allen, Ronald J.
2002-01-01
Several new space science mission concepts under development at NASA-GSFC for astronomy are intended to carry out synthetic imaging using Michelson interferometers or direct (Fizeau) imaging with sparse apertures. Examples of these mission concepts include the Stellar Imager (SI), the Space Infrared Interferometric Telescope (SPIRIT), the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS), and the Fourier-Kelvin Stellar Interferometer (FKSI). We have been developing computer-based simulators for these missions. These simulators are aimed at providing a quantitative evaluation of the imaging capabilities of the mission by modeling the performance on different realistic targets in terms of sensitivity, angular resolution, and dynamic range. Both Fizeau and Michelson modes of operation can be considered. Our work is based on adapting a computer simulator called imSIM, which was initially written for the Space Interferometry Mission (SIM), to simulate the imaging mode of new missions such as those listed. This report covers the activities we have undertaken to provide a preliminary version of a simulator for the SPIRIT mission concept.
Atomistic and coarse-grained computer simulations of raft-like lipid mixtures.
Pandit, Sagar A; Scott, H Larry
2007-01-01
Computer modeling can provide insights into the existence, structure, size, and thermodynamic stability of localized raft-like regions in membranes. However, the challenges in the construction and simulation of accurate models of heterogeneous membranes are great. The primary obstacle in modeling the lateral organization within a membrane is the relatively slow lateral diffusion rate for lipid molecules. Microsecond or longer time-scales are needed to fully model the formation and stability of a raft in a membrane. Atomistic simulations currently are not able to reach this scale, but they do provide quantitative information on the intermolecular forces and correlations that are involved in lateral organization. In this chapter, the steps needed to carry out and analyze atomistic simulations of hydrated lipid bilayers having heterogeneous composition are outlined. It is then shown how the data from a molecular dynamics simulation can be used to construct a coarse-grained model for the heterogeneous bilayer that can predict the lateral organization and stability of rafts at up to millisecond time-scales.
Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.
2016-01-01
Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
Simulation and the Development of Clinical Judgment: A Quantitative Study
ERIC Educational Resources Information Center
Holland, Susan
2015-01-01
The purpose of this quantitative pretest posttest quasi-experimental research study was to explore the effect of the NESD on clinical judgment in associate degree nursing students and compare the differences between groups when the Nursing Education Simulation Design (NESD) guided simulation in order to identify educational strategies promoting…
An ice sheet model validation framework for the Greenland ice sheet.
Price, Stephen F; Hoffman, Matthew J; Bonin, Jennifer A; Howat, Ian M; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P; Evans, Katherine J; Kennedy, Joseph H; Lenaerts, Jan; Lipscomb, William H; Perego, Mauro; Salinger, Andrew G; Tuminaro, Raymond S; van den Broeke, Michiel R; Nowicki, Sophie M J
2017-01-01
We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
Predicting low-temperature free energy landscapes with flat-histogram Monte Carlo methods
NASA Astrophysics Data System (ADS)
Mahynski, Nathan A.; Blanco, Marco A.; Errington, Jeffrey R.; Shen, Vincent K.
2017-02-01
We present a method for predicting the free energy landscape of fluids at low temperatures from flat-histogram grand canonical Monte Carlo simulations performed at higher ones. We illustrate our approach for both pure and multicomponent systems using two different sampling methods as a demonstration. This allows us to predict the thermodynamic behavior of systems which undergo both first order and continuous phase transitions upon cooling using simulations performed only at higher temperatures. After surveying a variety of different systems, we identify a range of temperature differences over which the extrapolation of high temperature simulations tends to quantitatively predict the thermodynamic properties of fluids at lower ones. Beyond this range, extrapolation still provides a reasonably well-informed estimate of the free energy landscape; this prediction then requires less computational effort to refine with an additional simulation at the desired temperature than reconstruction of the surface without any initial estimate. In either case, this method significantly increases the computational efficiency of these flat-histogram methods when investigating thermodynamic properties of fluids over a wide range of temperatures. For example, we demonstrate how a binary fluid phase diagram may be quantitatively predicted for many temperatures using only information obtained from a single supercritical state.
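The essence of the extrapolation is a Taylor expansion of the logarithm of the grand-canonical macrostate distribution ln Π(N) in inverse temperature, using energy moments collected per macrostate. A schematic first-order version (our notation and toy inputs, not the authors' code):

```python
import numpy as np

def extrapolate_lnpi(lnpi, u_n, n, mu, beta0, beta1):
    """First-order Taylor extrapolation of ln Pi(N) from beta0 to beta1,
    assuming d lnPi(N)/d beta = mu*N - <U>_N - <mu*N - U>, where the last
    term is the Pi-weighted ensemble average (keeps the distribution
    normalized). A sketch of the idea under these stated assumptions."""
    f = mu * n - u_n
    p = np.exp(lnpi - lnpi.max())
    p /= p.sum()
    out = lnpi + (beta1 - beta0) * (f - np.dot(p, f))
    # Renormalize so that sum(exp(lnpi)) = 1.
    return out - (out.max() + np.log(np.sum(np.exp(out - out.max()))))

n = np.arange(200.0)
lnpi = -0.5 * ((n - 80.0) / 25.0) ** 2   # toy high-temperature distribution
u_n = -1.5 * n                           # toy mean energy per macrostate
print(extrapolate_lnpi(lnpi, u_n, n, mu=-3.0, beta0=1.0, beta1=1.1)[:3])
```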
Matt Reeves; Paulette Ford; Leonardo Frid; David Augustine; Justin Derner
2016-01-01
The Great Plains grasslands of North America provide a multitude of ecosystem services including clean water, forage, habitat, recreation, and pollination of native and agricultural plants. A general lack of quantitative information regarding the effects of varied management strategies on these spatially heterogeneous landscapes complicates our understanding...
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
Pütter, Carolin; Pechlivanis, Sonali; Nöthen, Markus M; Jöckel, Karl-Heinz; Wichmann, Heinz-Erich; Scherag, André
2011-01-01
Genome-wide association studies have identified robust associations between single nucleotide polymorphisms and complex traits. As the proportion of phenotypic variance explained is still limited for most of the traits, larger and larger meta-analyses are being conducted to detect additional associations. Here we investigate the impact of the study design and the underlying assumption about the true genetic effect in a bimodal mixture situation on the power to detect associations. We performed simulations of quantitative phenotypes analysed by standard linear regression and dichotomized case-control data sets from the extremes of the quantitative trait analysed by standard logistic regression. Using linear regression, markers with an effect in the extremes of the traits were almost undetectable, whereas analysing extremes by case-control design had superior power even for much smaller sample sizes. Two real data examples are provided to support our theoretical findings and to explore our mixture and parameter assumption. Our findings support the idea to re-analyse the available meta-analysis data sets to detect new loci in the extremes. Moreover, our investigation offers an explanation for discrepant findings when analysing quantitative traits in the general population and in the extremes.
Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses
Alexander, Elsinore; Wei, Xin; Lee, Shinwook
2018-01-01
Purpose: Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods: One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results: The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion: The HDR system produced quantitative, preclinical metrics that correlated to patients' subjective perception of halos. PMID:29503526
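The AUC metric described above can be sketched as the integral of the log of a peak-normalized luminance profile over field angle. Variable names, normalization choices, and the toy profile below are our assumptions, not the paper's definitions:

```python
import numpy as np

def halo_auc(radius_deg, luminance):
    """Toy AUC metric: integrate the log10 of the peak-normalized halo
    luminance profile (shifted to be non-negative) over field angle."""
    profile = np.clip(luminance / luminance.max(), 1e-6, None)
    y = np.log10(profile)
    y = y - y.min()
    # Trapezoidal integration over radius.
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(radius_deg)))

r = np.linspace(0.1, 3.0, 300)                            # field angle, degrees
halo = 1e4 * np.exp(-r / 0.05) + 30.0 * np.exp(-r / 1.0)  # toy headlight + halo
print(halo_auc(r, halo))
```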
Kessner, Darren; Novembre, John
2015-01-01
Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50–100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates. PMID:25672748
Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang
2017-01-01
The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is important for the treatment monitoring of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach has a 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from CRM are all significantly different among the groups and highly correlated with experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
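At its core, a change-rate map is a voxel-wise relative difference between co-registered baseline and follow-up volumes. A minimal sketch (the masking threshold and toy data are ours; the CAA pipeline's registration, normalization, and smoothing steps are omitted):

```python
import numpy as np

def change_rate_map(baseline, followup, mask_threshold=0.1):
    """Voxel-wise change-rate map between two co-registered volumes:
    CRM = (followup - baseline) / baseline, masked where counts are low."""
    base = baseline.astype(float)
    mask = base > mask_threshold * base.max()
    crm = np.zeros_like(base)
    crm[mask] = (followup[mask] - base[mask]) / base[mask]
    return crm

rng = np.random.default_rng(1)
pre = rng.poisson(100, size=(8, 8, 8)).astype(float)
post = pre * 1.25                            # simulated 25% rCBF recovery
print(change_rate_map(pre, post).max())      # ~0.25
```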
NASA Technical Reports Server (NTRS)
Hoelzer, H. D.; Fourroux, K. A.; Rickman, D. L.; Schrader, C. M.
2011-01-01
Figures of Merit (FoMs) and the FoM software provide a method for quantitatively evaluating the quality of a regolith simulant by comparing the simulant to a reference material. FoMs may be used for comparing a simulant to actual regolith material, for specification (by stating the values a simulant's FoMs must attain to be suitable for a given application), and for comparing simulants from different vendors or production runs. FoMs may even be used to compare different simulants to each other. A single FoM is conceptually an algorithm that computes a single number quantifying the similarity or difference of a single characteristic of a simulant material and a reference material, and provides a clear measure of how well the simulant and reference material match. FoMs have been constructed to lie between zero and 1, with zero indicating a poor or no match and 1 indicating a perfect match. FoMs are defined for modal composition, particle size distribution, particle shape distribution (aspect ratio and angularity), and density. This Technical Memorandum covers the mathematics, use, installation, and licensing of the existing FoM code in detail.
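To make the zero-to-1 convention concrete, here is a toy FoM for particle size distribution, scoring 1 minus the mean absolute difference between cumulative distributions. This illustrates the convention only; the TM's actual per-characteristic algorithms differ:

```python
import numpy as np

def fom_size_distribution(cdf_sim, cdf_ref):
    """Toy figure of merit on [0, 1]: 1 = perfect match between the
    cumulative size distributions of simulant and reference."""
    cdf_sim, cdf_ref = np.asarray(cdf_sim), np.asarray(cdf_ref)
    return 1.0 - np.mean(np.abs(cdf_sim - cdf_ref))

bins = np.linspace(0, 1, 50)
ref = bins ** 0.5          # toy reference cumulative distribution
sim = bins ** 0.6          # toy simulant cumulative distribution
print(fom_size_distribution(sim, ref))   # < 1: imperfect but close match
```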
Translating the simulation of procedural drilling techniques for interactive neurosurgical training.
Stredney, Don; Rezai, Ali R; Prevedello, Daniel M; Elder, J Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J
2013-10-01
Through previous efforts we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. These volumetric data help drive an interactive multisensory, i.e., visual (stereo), aural (stereo), and tactile, simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the Congress of Neurological Surgeons simulation initiative. Our objective was to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. We discuss issues of biofidelity and our methods to provide objective, quantitative and automated assessment for the residents. We conclude with a discussion of our experiences, reporting preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principle and defined the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment.
A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.
Rong, Xing; Frey, Eric C
2013-08-01
Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different from that typical for these collimators. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume of interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters). To make the optimization results for quantitative 90Y bremsstrahlung SPECT more general, the authors simulated multiple tumors of various sizes in the liver. The authors realistically simulated human anatomy using a digital phantom and the image formation process using a previously validated and computationally efficient method for modeling the image-degrading effects including object scatter, attenuation, and the full collimator-detector response (CDR). The scatter kernels and CDR function tables used in the modeling method were generated using a previously validated Monte Carlo simulation code. The hole length, hole diameter, and septal thickness of the obtained optimal collimator were 84, 3.5, and 1.4 mm, respectively. Compared to a commercial high-energy general-purpose collimator, the optimal collimator improved the resolution and FOM by 27% and 18%, respectively. The proposed collimator optimization method may be useful for improving quantitative SPECT imaging for radionuclides with complex energy spectra. The obtained optimal collimator provided a substantial improvement in quantitative performance for the microsphere radioembolization task considered.
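The figure of merit can be sketched as an inverse-mass-weighted RMSE combining per-VOI bias and variance; the weight normalization below is our assumption, not the paper's exact convention:

```python
import numpy as np

def collimator_fom(bias, variance, mass):
    """Inverse-mass-weighted RMSE over volumes of interest:
    sqrt(sum w_i * (bias_i^2 + var_i)) with w proportional to 1/mass,
    reflecting the dosimetry-oriented weighting described above."""
    w = 1.0 / np.asarray(mass, dtype=float)
    w /= w.sum()
    mse = np.asarray(bias, dtype=float) ** 2 + np.asarray(variance, dtype=float)
    return float(np.sqrt(np.sum(w * mse)))

# Toy numbers for three tumors of different sizes (MBq, MBq^2, grams):
print(collimator_fom(bias=[0.8, 0.5, 0.3], variance=[0.4, 0.2, 0.1],
                     mass=[5.0, 20.0, 80.0]))
```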
ERIC Educational Resources Information Center
Punch, Raymond J.
2012-01-01
The purpose of the quantitative regression study was to explore and identify relationships between college instructors' attitudes toward the use, and perceptions of the value, of computer-based simulation programs. A relationship has been reported between attitudes toward use and perceptions of the value of…
Lee, Heow Peuh; Gordon, Bruce R.
2012-01-01
During the past decades, numerous computational fluid dynamics (CFD) studies, constructed from CT or MRI images, have simulated human nasal models. As compared to rhinomanometry and acoustic rhinometry, which provide quantitative information only of nasal airflow, resistance, and cross sectional areas, CFD enables additional measurements of airflow passing through the nasal cavity that help visualize the physiologic impact of alterations in intranasal structures. Therefore, it becomes possible to quantitatively measure, and visually appreciate, the airflow pattern (laminar or turbulent), velocity, pressure, wall shear stress, particle deposition, and temperature changes at different flow rates, in different parts of the nasal cavity. The effects of both existing anatomical factors, as well as post-operative changes, can be assessed. With recent improvements in CFD technology and computing power, there is a promising future for CFD to become a useful tool in planning, predicting, and evaluating outcomes of nasal surgery. This review discusses the possibilities and potential impacts, as well as technical limitations, of using CFD simulation to better understand nasal airflow physiology. PMID:23205221
NASA Astrophysics Data System (ADS)
Mo, Yunjeong
The purpose of this research is to support the development of an intelligent Decision Support System (DSS) by integrating quantitative information with expert knowledge in order to facilitate effective retrofit decision-making. To achieve this goal, the Energy Retrofit Decision Process Framework is analyzed. Expert system shell software, a retrofit measure cost database, and energy simulation software are needed for developing the DSS; Exsys Corvid, the NREM database and BEopt were chosen for implementing an integration model. This integration model demonstrates the holistic function of a residential energy retrofit system for existing homes, by providing a prioritized list of retrofit measures with cost information, energy simulation and expert advice. The users, such as homeowners and energy auditors, can acquire all of the necessary retrofit information from this unified system without having to explore several separate systems. The integration model plays the role of a prototype for the finalized intelligent decision support system. It implements all of the necessary functions for the finalized DSS, including integration of the database, energy simulation and expert knowledge.
Ashbaugh, H S; Garde, S; Hummer, G; Kaler, E W; Paulaitis, M E
1999-01-01
Conformational free energies of butane, pentane, and hexane in water are calculated from molecular simulations with explicit waters and from a simple molecular theory in which the local hydration structure is estimated based on a proximity approximation. This proximity approximation uses only the two nearest carbon atoms on the alkane to predict the local water density at a given point in space. Conformational free energies of hydration are subsequently calculated using a free energy perturbation method. Quantitative agreement is found between the free energies obtained from simulations and theory. Moreover, free energy calculations using this proximity approximation are approximately four orders of magnitude faster than those based on explicit water simulations. Our results demonstrate the accuracy and utility of the proximity approximation for predicting water structure as the basis for a quantitative description of n-alkane conformational equilibria in water. In addition, the proximity approximation provides a molecular foundation for extending predictions of water structure and hydration thermodynamic properties of simple hydrophobic solutes to larger clusters or assemblies of hydrophobic solutes. PMID:10423414
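Schematically, the proximity approximation predicts the local water density at a point from the distances to the two nearest alkane carbons, for instance as a product of carbon-water proximal distribution values. A toy version under that stated assumption (the paper's exact construction may differ):

```python
import numpy as np

def proximity_density(point, carbons, g_of_r, r_grid, rho_bulk=1.0):
    """Estimate local water density at `point` from only the two nearest
    carbons: rho(point) ~ rho_bulk * g(r1) * g(r2), where g(r) is a
    carbon-water proximal distribution function tabulated on r_grid."""
    d = np.sort(np.linalg.norm(carbons - point, axis=1))[:2]
    g1, g2 = np.interp(d, r_grid, g_of_r)
    return rho_bulk * g1 * g2

r_grid = np.linspace(0.0, 10.0, 200)   # distance from carbon, angstroms
# Toy g(r): excluded core below 3 A, a hydration peak near 4 A.
g = np.where(r_grid < 3.0, 0.0, 1.0 + 0.5 * np.exp(-(r_grid - 4.0) ** 2))
carbons = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
print(proximity_density(np.array([0.0, 4.0, 0.0]), carbons, g, r_grid))
```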
Schram-Bijkerk, D; van Kempen, E; Knol, A B; Kruize, H; Staatsen, B; van Kamp, I
2009-10-01
Few quantitative health impact assessments (HIAs) of transport policies have been published so far, and there is a lack of a common methodology for such assessments. The aim was to evaluate the usability of existing HIA methodology to quantify health effects of transport policies at the local level. The health impact of two simulated but realistic transport interventions - speed limit reduction and traffic re-allocation - was quantified by selecting traffic-related exposures and health endpoints, modelling population exposure, selecting exposure-effect relations, and estimating the number of local traffic-related cases and the disease burden, expressed in disability-adjusted life-years (DALYs), before and after the intervention. Exposure information was difficult to retrieve because of the local scale of the interventions, and exposure-effect relations for subgroups and combined effects were missing. Given uncertainty in the outcomes originating from this kind of missing information, simulated changes in population health from the two local traffic interventions were estimated to be small (<5%), except for the estimated 60% reduction in DALYs from fewer traffic accidents due to speed limit reduction. Quantitative HIA of transport policies at a local scale is possible, provided that data on exposures, the exposed population and their baseline health status are available. The interpretation of the HIA information should be carried out in the context of the quality of input data and the assumptions and uncertainties of the analysis.
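The case-and-DALY bookkeeping in such an HIA reduces to population attributable fraction arithmetic. A back-of-envelope sketch with invented numbers (the study's exposures, relative risks, and baselines differ):

```python
# Attributable burden via the population attributable fraction (PAF):
# PAF = (RR - 1) * p / ((RR - 1) * p + 1), then DALYs = PAF * cases * weight.
# All numbers are invented for illustration.
def attributable_dalys(rr, p_exposed, baseline_cases, daly_per_case):
    paf = (rr - 1.0) * p_exposed / ((rr - 1.0) * p_exposed + 1.0)
    return paf * baseline_cases * daly_per_case

before = attributable_dalys(rr=1.10, p_exposed=0.40, baseline_cases=500,
                            daly_per_case=0.5)
after = attributable_dalys(rr=1.10, p_exposed=0.30, baseline_cases=500,
                           daly_per_case=0.5)
print(before, after, (before - after) / before)  # DALYs and relative reduction
```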
Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios
2014-01-01
The aim was to introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test, and to compare its performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real-time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real-time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time.
Li, Chen; Nagasaki, Masao; Koh, Chuan Hock; Miyano, Satoru
2011-05-01
Mathematical modeling and simulation studies are playing an increasingly important role in helping researchers elucidate how living organisms function in cells. In systems biology, researchers typically tune many parameters manually to achieve simulation results that are consistent with biological knowledge. This severely limits the size and complexity of the simulation models built. In order to break this limitation, we propose a computational framework to automatically estimate kinetic parameters for a given network structure. We utilized an online (on-the-fly) model checking technique (which saves resources compared to the offline approach), with a quantitative modeling and simulation architecture named hybrid functional Petri net with extension (HFPNe). We demonstrate the applicability of this framework by analysis of the underlying model for the neuronal cell fate decision model (ASE fate model) in Caenorhabditis elegans. First, we built a quantitative ASE fate model containing 3327 components emulating nine genetic conditions. Then, using our efficient online model checker, MIRACH 1.0, together with parameter estimation, we ran 20 million simulation runs and were able to locate 57 parameter sets for 23 parameters in the model that are consistent with 45 biological rules extracted from published biological articles, without much manual intervention. To evaluate the robustness of these 57 parameter sets, we ran another 20 million simulation runs using different magnitudes of noise. Our simulation results indicated that, among these models, one is the most reasonable and robust owing to its high stability against these stochastic noises. Our simulation results provide interesting biological findings which could be used for future wet-lab experiments.
Gravitational Effects on Near Field Flow Structure of Low Density Gas Jets
NASA Technical Reports Server (NTRS)
Yep, Tze-Wing; Agrawal, Ajay K.; Griffin, DeVon; Salzman, Jack (Technical Monitor)
2001-01-01
Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on the near field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2-second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive, whole-field line-of-sight measurement technique. The flow structure was characterized by distributions of angular deflection and helium mole percentage obtained from color schlieren images taken at 60 Hz. Results show that the jet flow was significantly influenced by gravity. The jet in microgravity was up to 70 percent wider than that in Earth gravity. The jet flow oscillations observed in Earth gravity were absent in microgravity, providing direct experimental evidence that the flow instability in the low density jet was buoyancy induced. The paper provides quantitative details of temporal flow evolution as the experiment undergoes a change in gravity in the drop tower.
He, Li-hong; Wang, Hai-yan; Lei, Xiang-dong
2016-02-01
Models based on vegetation ecophysiological processes contain many parameters, and reasonable parameter values will greatly improve simulation ability. Sensitivity analysis, as an important method to screen out the sensitive parameters, can comprehensively analyze how model parameters affect the simulation results. In this paper, we conducted a parameter sensitivity analysis of the BIOME-BGC model with a case study of simulating the net primary productivity (NPP) of a Larix olgensis forest in Wangqing, Jilin Province. First, with a contrastive analysis between field measurement data and the simulation results, we tested the BIOME-BGC model's capability of simulating the NPP of the L. olgensis forest. Then, the Morris and EFAST sensitivity methods were used to screen the sensitive parameters that had a strong influence on NPP. On this basis, we also quantitatively estimated the sensitivity of the screened parameters, and calculated the global, first-order and second-order sensitivity indices. The results showed that the BIOME-BGC model could well simulate the NPP of the L. olgensis forest in the sample plot. The Morris sensitivity method provided a reliable parameter sensitivity analysis result under the condition of a relatively small sample size. The EFAST sensitivity method could quantitatively measure the impact of a single parameter on the simulation result, as well as the interactions between parameters in the BIOME-BGC model. The influential sensitive parameters for L. olgensis forest NPP were the new stem carbon to new leaf carbon allocation ratio and the leaf carbon to nitrogen ratio; the effect of their interaction was significantly greater than that of the other parameter interactions.
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
ERIC Educational Resources Information Center
Arora, A.; Arora, A. Saxena
2015-01-01
This article provides educators in business schools with a new interdisciplinary experiential lab game called Supply Chain-Marketing (SC-Mark) Shark Tank game, which can be implemented in both Supply Chain Management (SCM) and Marketing courses. The SC-Mark experiential lab game is a real-life business environment simulation that explores…
Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.
Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M
2016-05-05
Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.
Saho, Tatsunori; Onishi, Hideo
2015-07-01
In this study, we evaluated hemodynamics using simulated models and determined how cerebral aneurysms develop in simulated and patient-specific models based on medical images. Computational fluid dynamics (CFD) was analyzed by use of OpenFOAM software. Flow velocity, stream line, and wall shear stress (WSS) were evaluated in a simulated model aneurysm with known geometry and in a three-dimensional angiographic model. The ratio of WSS at the aneurysm compared with that at the basilar artery was 1:10 in simulated model aneurysms with a diameter of 10 mm and 1:18 in the angiographic model, indicating similar tendencies. Vortex flow occurred in both model aneurysms, and the WSS decreased in larger model aneurysms. The angiographic model provided accurate CFD information, and the tendencies of simulated and angiographic models were similar. These findings indicate that hemodynamic effects are involved in the development of aneurysms.
Dynamics of water bound to crystalline cellulose
DOE Office of Scientific and Technical Information (OSTI.GOV)
O’Neill, Hugh; Pingali, Sai Venkatesh; Petridis, Loukas
Interactions of water with cellulose are of both fundamental and technological importance. Here, we characterize the properties of water associated with cellulose using deuterium labeling, neutron scattering and molecular dynamics simulation. Quasi-elastic neutron scattering provided quantitative details about the dynamical relaxation processes that occur and was supported by structural characterization using small-angle neutron scattering and X-ray diffraction. We can unambiguously detect two populations of water associated with cellulose. The first is “non-freezing bound” water that gradually becomes mobile with increasing temperature and can be related to surface water. The second population is consistent with confined water that abruptly becomes mobile at ~260 K, and can be attributed to water that accumulates in the narrow spaces between the microfibrils. Quantitative analysis of the QENS data showed that, at 250 K, the water diffusion coefficient was 0.85 ± 0.04 × 10⁻¹⁰ m² s⁻¹ and increased to 1.77 ± 0.09 × 10⁻¹⁰ m² s⁻¹ at 265 K. MD simulations are in excellent agreement with the experiments and support the interpretation that water associated with cellulose exists in two dynamical populations. Our results provide clarity to previous work investigating the states of bound water and provide a new approach for probing water interactions with lignocellulose materials.
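The two diffusion coefficients quoted above are enough for a small worked example: assuming simple Arrhenius behavior between 250 and 265 K (an illustration, not part of the authors' analysis), the apparent activation energy follows directly from the two-point form of the Arrhenius equation.

```python
import math

R = 8.314                    # gas constant, J mol^-1 K^-1
T1, D1 = 250.0, 0.85e-10     # m^2 s^-1, from the QENS analysis quoted above
T2, D2 = 265.0, 1.77e-10

# D = D0 * exp(-Ea / (R T))  =>  Ea = R * ln(D2/D1) / (1/T1 - 1/T2)
Ea = R * math.log(D2 / D1) / (1.0 / T1 - 1.0 / T2)
print(f"apparent activation energy ~ {Ea / 1000:.1f} kJ/mol")   # roughly 27 kJ/mol
```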
Thermodynamics and Mechanics of Membrane Curvature Generation and Sensing by Proteins and Lipids
Baumgart, Tobias; Capraro, Benjamin R.; Zhu, Chen; Das, Sovan L.
2014-01-01
Research investigating lipid membrane curvature generation and sensing is a rapidly developing frontier in membrane physical chemistry and biophysics. The fast recent progress is based on the discovery of a plethora of proteins involved in coupling membrane shape to cellular membrane function, the design of new quantitative experimental techniques to study aspects of membrane curvature, and the development of analytical theories and simulation techniques that allow a mechanistic interpretation of quantitative measurements. The present review first provides an overview of important classes of membrane proteins for which function is coupled to membrane curvature. We then survey several mechanisms that are assumed to underlie membrane curvature sensing and generation. Finally, we discuss relatively simple thermodynamic/mechanical models that allow quantitative interpretation of experimental observations. PMID:21219150
Burger, Stefan; Fraunholz, Thomas; Leirer, Christian; Hoppe, Ronald H W; Wixforth, Achim; Peter, Malte A; Franke, Thomas
2013-06-25
Phase decomposition in lipid membranes has been the subject of numerous investigations by both experiment and theoretical simulation, yet quantitative comparisons of the simulated data to the experimental results are rare. In this work, we present a novel way of comparing the temporal development of liquid-ordered domains obtained from numerically solving the Cahn-Hilliard equation and by inducing a phase transition in giant unilamellar vesicles (GUVs). Quantitative comparison is done by calculating the structure factor of the domain pattern. It turns out that the decomposition takes place in three distinct regimes in both experiment and simulation. These regimes are characterized by different rates of growth of the mean domain diameter, and there is quantitative agreement between experiment and simulation as to the duration of each regime and the absolute rate of growth in each regime.
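A minimal sketch of the simulation side of such a comparison, assuming the standard dimensionless Cahn-Hilliard formulation on a periodic grid (grid size, time step, and gradient coefficient are illustrative, not the authors' settings): a semi-implicit spectral step evolves the composition field, and the radially averaged structure factor summarizes the domain pattern, as in the comparison described above.

```python
import numpy as np

# Semi-implicit spectral Cahn-Hilliard: dc/dt = lap(c^3 - c - kappa*lap c)
n, dt, kappa, steps = 128, 0.05, 1.0, 2000
rng = np.random.default_rng(1)
c = 0.1 * rng.standard_normal((n, n))          # symmetric quench around c = 0
k = 2 * np.pi * np.fft.fftfreq(n)
k2 = k[:, None] ** 2 + k[None, :] ** 2         # |k|^2 on the grid
for _ in range(steps):
    nhat = np.fft.fft2(c ** 3 - c)             # nonlinear term, treated explicitly
    chat = (np.fft.fft2(c) - dt * k2 * nhat) / (1.0 + dt * kappa * k2 ** 2)
    c = np.real(np.fft.ifft2(chat))

# Radially averaged structure factor S(k) of the domain pattern
s2d = np.abs(np.fft.fft2(c - c.mean())) ** 2
kr = np.sqrt(k2)
bins = np.linspace(0.0, kr.max(), 60)
idx = np.digitize(kr.ravel(), bins)
s_k = np.bincount(idx, weights=s2d.ravel(), minlength=bins.size + 1)
cnt = np.bincount(idx, minlength=bins.size + 1)
s_k = s_k[1:bins.size] / np.maximum(cnt[1:bins.size], 1)
print("peak wavenumber ~", bins[np.argmax(s_k)])   # lower bin edge; tracks mean domain size
```

Tracking the peak of S(k) over time gives the mean domain diameter growth rate that the abstract uses to identify the three regimes.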
Nijran, Kuldip S; Houston, Alex S; Fleming, John S; Jarritt, Peter H; Heikkinen, Jari O; Skrypniuk, John V
2014-07-01
In this second UK audit of quantitative parameters obtained from renography, phantom simulations were used in cases in which the 'true' values could be estimated, allowing the accuracy of the measured parameters to be assessed. A renal physical phantom was used to generate a set of three phantom simulations (six kidney functions) acquired on three different gamma camera systems. A total of nine phantom simulations and three real patient studies were distributed to UK hospitals participating in the audit. Centres were asked to provide results for the following parameters: relative function and time-to-peak (whole kidney and cortical region). As with previous audits, a questionnaire collated information on methodology. Errors were assessed as the root mean square deviation from the true value. Sixty-one centres responded to the audit, with some hospitals providing multiple sets of results. Twenty-one centres provided a complete set of parameter measurements. Relative function and time-to-peak showed a reasonable degree of accuracy and precision in most UK centres. The overall average root mean square deviation from the true value was 7.7% for the whole-kidney time-to-peak measurement and 4.5% for the relative function measurement. These results showed a measure of consistency in the relative function and time-to-peak that was similar to the results reported in a previous renogram audit by our group. Analysis of audit data suggests a reasonable degree of accuracy in the quantification of renography function using relative function and time-to-peak measurements. However, it is reasonable to conclude that the objectives of the audit could not be fully realized because of the limitations of the mechanical phantom in providing true values for renal parameters.
Implementation of an interactive liver surgery planning system
NASA Astrophysics Data System (ADS)
Wang, Luyao; Liu, Jingjing; Yuan, Rong; Gu, Shuguo; Yu, Long; Li, Zhitao; Li, Yanzhao; Li, Zhen; Xie, Qingguo; Hu, Daoyu
2011-03-01
Liver tumor, one of the most wide-spread diseases, has a very high mortality in China. To improve the success rates of liver surgeries and the quality of life of such patients, we implement an interactive liver surgery planning system based on contrast-enhanced liver CT images. The system consists of five modules: pre-processing, segmentation, modeling, quantitative analysis and surgery simulation. The Graph Cuts method is utilized to automatically segment the liver based on the anatomical prior knowledge that the liver is the biggest organ and has an almost homogeneous gray value. The system supports users in building patient-specific liver segment and sub-segment models using interactive portal vein branch labeling, and in performing anatomical resection simulation. It also provides several tools to simulate atypical resection, including resection plane, sphere and curved surface. To match actual surgery resections well and simulate the process flexibly, we extend our work to develop a virtual scalpel model and simulate the scalpel movement in the hepatic tissue using multi-plane continuous resection. In addition, the quantitative analysis module makes it possible to assess the risk of a liver surgery. The preliminary results show that the system has the potential to offer an accurate 3D delineation of the liver anatomy, as well as the tumors' location in relation to vessels, and to facilitate liver resection surgeries. Furthermore, we are testing the system in a full-scale clinical trial.
Modern Scientific Visualization is more than Just Pretty Pictures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E Wes; Rubel, Oliver; Wu, Kesheng
2008-12-05
While the primary product of scientific visualization is images and movies, its primary objective is really scientific insight. Too often, the focus of visualization research is on the product, not the mission. This paper presents two case studies, both of which appeared in previous publications, that focus on using visualization technology to produce insight. The first applies "Query-Driven Visualization" concepts to laser wakefield simulation data to help identify and analyze the process of beam formation. The second uses topological analysis to provide a quantitative basis for (i) understanding the mixing process in hydrodynamic simulations, and (ii) performing comparative analysis of data from two different types of simulations that model hydrodynamic instability.
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
NASA Technical Reports Server (NTRS)
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
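A minimal, stdlib-only sketch of the kind of discrete event simulation described: requests with exponential interarrival and service times contend for a fixed server pool, and mean response time serves as the effectiveness measure. The rates and pool sizes are illustrative assumptions, not the authors' workload model.

```python
import heapq, random

def simulate(n_servers, arrival_rate, service_rate, n_requests=50_000, seed=0):
    """Event-driven FIFO multi-server queue; returns mean response time
    (waiting plus service) over all requests."""
    random.seed(seed)
    t, free, busy_until, total_resp = 0.0, n_servers, [], 0.0
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)        # next arrival
        while busy_until and busy_until[0] <= t:     # release finished servers
            heapq.heappop(busy_until)
            free += 1
        service = random.expovariate(service_rate)
        if free:                                     # a server is idle: start now
            free -= 1
            heapq.heappush(busy_until, t + service)
            total_resp += service
        else:                                        # wait for the earliest release
            start = heapq.heappop(busy_until)
            heapq.heappush(busy_until, start + service)
            total_resp += (start - t) + service
    return total_resp / n_requests

# Offered load is arrival_rate/service_rate = 10 servers; pools near or below
# that saturate, illustrating the provisioning question the abstract raises.
for servers in (12, 16, 32):
    print(servers, round(simulate(servers, arrival_rate=10.0, service_rate=1.0), 3))
```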
Undergraduate interprofessional education using high-fidelity paediatric simulation.
Stewart, Moira; Kennedy, Neil; Cuene-Grandidier, Hazel
2010-06-01
High-fidelity simulation is becoming increasingly important in the delivery of teaching and learning to health care professionals within a safe environment. Its use in an interprofessional context and at undergraduate level has the potential to facilitate the learning of good communication and teamworking, in addition to clinical knowledge and skills. Interprofessional teaching and learning workshops using high-fidelity paediatric simulation were developed and delivered to undergraduate medical and nursing students at Queen's University Belfast. Learning outcomes common to both professions, and essential in the clinical management of sick children, included basic competencies, communication and teamworking skills. Quantitative and qualitative evaluation was undertaken using published questionnaires. Quantitative results - the 32-item questionnaire was analysed for reliability using SPSS. Responses were positive for both groups of students across four domains - acquisition of knowledge and skills, communication and teamworking, professional identity and role awareness, and attitudes to shared learning. Qualitative results - thematic content analysis was used to analyse open-ended responses. Students from both groups commented that an interprofessional education (IPE) approach to paediatric simulation improved clinical and practice-based skills, and provided a safe learning environment. Students commented that there should be more interprofessional and simulation learning opportunities. High-fidelity paediatric simulation, used in an interprofessional context, has the potential to meet the requirements of undergraduate medical and nursing curricula. Further research is needed into the long-term benefits for patient care, and its generalisability to other areas within health care teaching and learning. © Blackwell Publishing Ltd 2010.
Shao, Qiang
2016-10-26
Large-scale conformational changes in proteins are important for their functions. Tracking the conformational change in real time at the level of a single protein molecule, however, remains a great challenge. In this article, we present a novel in silico approach with the combination of normal mode analysis and integrated-tempering-sampling molecular simulation (NMA-ITS) to give quantitative data for exploring the conformational transition pathway in multi-dimensional energy landscapes starting only from the knowledge of the two endpoint structures of the protein. The open-to-closed transitions of three proteins, including nCaM, AdK, and HIV-1 PR, were investigated using NMA-ITS simulations. The three proteins have varied structural flexibilities and domain communications in their respective conformational changes. The transition state structure in the conformational change of nCaM and the associated free-energy barrier are in agreement with those measured in a standard explicit-solvent REMD simulation. The experimentally measured transition intermediate structures of the intrinsically flexible AdK are captured by the conformational transition pathway measured here. The dominant transition pathways between the closed and fully open states of HIV-1 PR are very similar to those observed in recent REMD simulations. Finally, the evaluated relaxation times of the conformational transitions of three proteins are roughly at the same level as reported experimental data. Therefore, the NMA-ITS method is applicable for a variety of cases, providing both qualitative and quantitative insights into the conformational changes associated with the real functions of proteins.
Evaluation of virtual simulation in a master's-level nurse education certificate program.
Foronda, Cynthia; Lippincott, Christine; Gattamorta, Karina
2014-11-01
Master's-level, nurse education certificate students performed virtual clinical simulations as a portion of their clinical practicum. Virtual clinical simulation is an innovative pedagogy using avatars in Web-based platforms to provide simulated clinical experiences. The purpose of this mixed-methods study was to evaluate nurse educator students' experience with virtual simulation and the effect of virtual simulation on confidence in teaching ability. Aggregated quantitative results yielded no significant change in confidence in teaching ability. Individually, some students indicated change of either increased or decreased confidence, whereas others exhibited no change in confidence after engaging in virtual simulation. Qualitative findings revealed a process of precursors of anxiety and frustration with technical difficulties followed by outcomes of appreciation and learning. Instructor support was a mediating factor to decrease anxiety and technical difficulties. This study served as a starting point regarding the application of a virtual world to teach the art of instruction. As the movement toward online education continues, educators should further explore use of virtual simulation to prepare nurse educators.
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; ...
2014-03-26
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. In this paper, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of “epidermal” electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. Finally, the results have the potential to address important unmet needs in chronic wound management.
Background of SAM atom-fraction profiles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ernst, Frank
Atom-fraction profiles acquired by SAM (scanning Auger microprobe) have important applications, e.g. in the context of alloy surface engineering by infusion of carbon or nitrogen through the alloy surface. However, such profiles often exhibit an artifact in the form of a background whose level anti-correlates with the local atom fraction. This article presents a theory explaining this phenomenon as a consequence of the way in which random noise in the spectrum propagates into the discretized differentiated spectrum that is used for quantification. The resulting model of “energy channel statistics” leads to a useful semi-quantitative background reduction procedure, which is validated by applying it to simulated data. Subsequently, the procedure is applied to an example of experimental SAM data. The analysis leads to conclusions regarding optimum experimental acquisition conditions. The proposed method of background reduction is based on general principles and should be useful for a broad variety of applications. - Highlights: • Atom-fraction–depth profiles of carbon measured by scanning Auger microprobe. • Strong background that varies with local carbon concentration. • Needs correction, e.g. for quantitative comparison with simulations. • Quantitative theory explains the background. • Provides a background removal strategy and practical advice for acquisition.
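The underlying noise-propagation mechanism can be demonstrated generically (this is an illustration of the phenomenon, not Ernst's energy-channel-statistics model; peak shape and noise level are invented): independent channel noise of standard deviation sigma in N(E) appears in the finite-difference dN/dE spectrum amplified by sqrt(2)/dE, regardless of the local signal level, so its relative contribution is largest where the atom fraction is low.

```python
import numpy as np

rng = np.random.default_rng(0)
e = np.linspace(240.0, 290.0, 500)                 # energy channels (eV), illustrative
de = e[1] - e[0]
peak = np.exp(-0.5 * ((e - 272.0) / 1.5) ** 2)     # idealized carbon-KLL-like peak
sigma_noise = 0.02
for scale in (1.0, 0.3):                           # high vs low local atom fraction
    n_e = scale * peak + rng.normal(0.0, sigma_noise, e.size)
    dn = np.diff(n_e) / de                         # discretized differentiated spectrum
    # Differencing two independent noisy channels multiplies sigma by sqrt(2)/de:
    print(f"signal x{scale}: predicted noise {np.sqrt(2) * sigma_noise / de:.3f}, "
          f"measured {dn[:150].std():.3f}")        # off-peak region is pure noise
```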
The subtle business of model reduction for stochastic chemical kinetics
NASA Astrophysics Data System (ADS)
Gillespie, Dan T.; Cao, Yang; Sanft, Kevin R.; Petzold, Linda R.
2009-02-01
This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.
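For readers who want to experiment with the reduction, below is a minimal direct-method Gillespie SSA for the full reaction set S1⇌S2→S3; rate constants and initial populations are illustrative, chosen so the reversible pair is fast and S3 production is slow.

```python
import numpy as np

def ssa(c1, c2, c3, x0=(1000, 0, 0), t_end=10.0, seed=0):
    """Gillespie direct method for S1 <-> S2 -> S3 with rate constants
    c1 (S1->S2), c2 (S2->S1), c3 (S2->S3). Returns final time and state."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=int)
    stoich = np.array([[-1,  1, 0],    # S1 -> S2
                       [ 1, -1, 0],    # S2 -> S1
                       [ 0, -1, 1]])   # S2 -> S3
    t = 0.0
    while t < t_end:
        a = np.array([c1 * x[0], c2 * x[1], c3 * x[1]])  # propensities
        a0 = a.sum()
        if a0 == 0.0:
            break                                  # all S1/S2 consumed
        t += rng.exponential(1.0 / a0)             # time to next reaction
        x += stoich[rng.choice(3, p=a / a0)]       # fire one reaction
    return t, x

print(ssa(c1=10.0, c2=10.0, c3=0.1))   # fast reversible pair, slow S3 production
```

Comparing this full simulation against the single-reaction replacement, using the accuracy and efficiency criteria the paper develops, is exactly the exercise the abstract describes.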
The subtle business of model reduction for stochastic chemical kinetics.
Gillespie, Dan T; Cao, Yang; Sanft, Kevin R; Petzold, Linda R
2009-02-14
This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.
DLTPulseGenerator: A library for the simulation of lifetime spectra based on detector-output pulses
NASA Astrophysics Data System (ADS)
Petschke, Danny; Staab, Torsten E. M.
2018-01-01
The quantitative analysis of lifetime spectra relevant in both life and materials sciences presents one of the ill-posed inverse problems and, hence, leads to most stringent requirements on the hardware specifications and the analysis algorithms. Here we present DLTPulseGenerator, a library written in native C++11, which provides a simulation of lifetime spectra according to the measurement setup. The simulation is based on pairs of non-TTL detector output pulses. Those pulses require the Constant Fraction Principle (CFD) for the determination of the exact timing signal and, thus, the calculation of the time difference, i.e. the lifetime. To verify the functionality, simulation results were compared to experimentally obtained data using Positron Annihilation Lifetime Spectroscopy (PALS) on pure tin.
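A minimal sketch of constant-fraction timing on a simulated pulse pair (pulse shape, fraction, and delay are illustrative assumptions, not DLTPulseGenerator's pulse model): the zero crossing of the delayed pulse minus the attenuated pulse gives an amplitude-independent timing mark, and the lifetime is the difference of the two marks.

```python
import numpy as np

def cfd_time(t, pulse, fraction=0.35, delay_s=2e-9):
    """Constant-fraction timing: zero crossing of delayed minus attenuated pulse."""
    d = int(round(delay_s / (t[1] - t[0])))
    bip = np.roll(pulse, d) - fraction * pulse            # bipolar CFD signal
    bip[:d] = 0.0                                         # kill roll wrap-around
    i = np.where((bip[:-1] < 0) & (bip[1:] >= 0))[0][0]   # first up-crossing
    # Linear interpolation between samples for sub-sample timing:
    return t[i] + (t[i + 1] - t[i]) * (-bip[i]) / (bip[i + 1] - bip[i])

t = np.arange(0.0, 200e-9, 0.1e-9)
shape = lambda t0: np.exp(-(t - t0) / 5e-9) * (1 - np.exp(-(t - t0) / 1e-9)) * (t >= t0)
start, stop = shape(20e-9), shape(20e-9 + 0.4e-9)         # simulated 0.4 ns "lifetime"
print(f"measured lifetime: {(cfd_time(t, stop) - cfd_time(t, start)) * 1e12:.1f} ps")
```

Histogramming such time differences over many simulated pulse pairs is what builds up the lifetime spectrum the library produces.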
A quantitative dynamic systems model of health-related quality of life among older adults
Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela
2015-01-01
Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722
Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru
2009-04-27
Model checking approaches were first applied to biological pathway validation around 2003. Recently, Fisher et al. proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the model checking approach. A novel method of modeling and simulating biological systems with model checking is proposed, based on hybrid functional Petri net with extension (HFPNe) as a framework dealing with both discrete and continuous events. First, we construct a quantitative VPC fate model with 1761 components using HFPNe. Second, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully performed by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Such hybrid lineages are hard to capture with a discrete model because they occur when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to its higher coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants); further insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and better understandings of biological systems and observation data that may be hard to capture with the qualitative approach.
2010-01-01
Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely-accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. PMID:20587024
Rebecca S.H. Kennedy; Michael C. Wimberly
2009-01-01
Regional conservation planning frequently relies on general assumptions about historical disturbance regimes to inform decisions about landscape restoration, reserve allocations, and landscape management. Spatially explicit simulations of landscape dynamics provide quantitative estimates of landscape structure and allow for the testing of alternative scenarios. We used...
Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts
Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.
2017-01-18
Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here this work will demonstrate that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.
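For orientation, the classical planar-diffusion baseline that such voltammetric quantitation starts from is the Randles-Sevcik relation; the sketch below inverts it for concentration. All numerical values are illustrative, and the paper's corrections for uncompensated resistance and cylindrical diffusion are not reproduced here.

```python
import math

# Randles-Sevcik (reversible couple, planar diffusion, 298 K):
#   ip = 2.69e5 * n^(3/2) * A * C * sqrt(D * v)
# with ip in A, A in cm^2, D in cm^2/s, v in V/s, and C in mol/cm^3.
n = 3          # electrons transferred; illustrative
A = 0.5        # electrode area, cm^2
D = 1.0e-5     # diffusion coefficient, cm^2/s
v = 0.2        # scan rate, V/s
ip = 0.030     # measured peak current, A

C = ip / (2.69e5 * n ** 1.5 * A * math.sqrt(D * v))   # mol/cm^3
print(f"estimated concentration: {C * 1000:.4f} mol/L")
```

At high weight loadings the measured peak current falls below this ideal prediction, which is the underprediction the abstract attributes to uncompensated resistance and cylindrical diffusion.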
Multifractal spectrum and lacunarity as measures of complexity of osseointegration.
de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko
2016-07-01
The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration, by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analyses are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposure to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion reveals rather different quantitative measures (reflecting complexity) for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (lesser variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces, for several different treatments. Such quantitative description should provide a fundamental tool for future large scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon, in general, and provide a basis for further systematic experimental studies. Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.
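Of the three descriptors, the fractal dimension is the simplest to sketch; below is a generic box-counting estimate for a binary (thresholded) image, assuming a power-of-two crop. This is an illustration of the technique, not the authors' pipeline.

```python
import numpy as np

def box_counting_dimension(binary_img):
    """Estimate the fractal dimension of a 2D binary image by box counting:
    the slope of log N(s) versus log(1/s) over dyadic box sizes s."""
    img = np.asarray(binary_img, dtype=bool)
    n = 2 ** int(np.floor(np.log2(min(img.shape))))
    img = img[:n, :n]                        # crop to a power-of-two square
    sizes, counts = [], []
    s = n
    while s >= 2:
        # Count s-by-s boxes containing at least one occupied pixel:
        occupied = img.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        sizes.append(s)
        counts.append(occupied.sum())
        s //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check on a filled square (dimension should be ~2)
print(box_counting_dimension(np.ones((256, 256))))
```

Lacunarity (gliding-box variance of box masses) and the multifractal spectrum follow the same box-partitioning idea with different statistics over the box counts.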
An ice sheet model validation framework for the Greenland ice sheet
NASA Astrophysics Data System (ADS)
Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.
2017-01-01
We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
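At their simplest, the basin-scale and whole-ice-sheet-scale elevation metrics reduce to masked bias and RMSE between gridded fields; a minimal sketch follows (the function and argument names are hypothetical, and the data are synthetic, not CmCt output).

```python
import numpy as np

def elevation_scores(model_dem, obs_dem, basin_mask):
    """Whole-sheet and basin-scale comparison of simulated vs observed surface
    elevation: mean difference (bias) and RMSE over valid (non-NaN) cells."""
    diff = model_dem - obs_dem
    valid = ~np.isnan(diff)
    return {"mean_diff_m": diff[valid].mean(),
            "rmse_m": np.sqrt((diff[valid] ** 2).mean()),
            "basin_mean_diff_m": diff[valid & basin_mask].mean()}

# Synthetic example: a model field biased by +0.5 m over half the domain
obs = np.random.default_rng(0).normal(1500.0, 200.0, (100, 100))
mask = np.zeros_like(obs, dtype=bool)
mask[:, :50] = True
print(elevation_scores(obs + 0.5 * mask, obs, basin_mask=mask))
```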
Fang, Juan; Gong, He; Kong, Lingyan; Zhu, Dong
2013-12-20
Bone can adjust its morphological structure to adapt to changes in the mechanical environment, i.e. bone structure change is related to mechanical loading. This implies that osteoarthritis may be closely associated with knee joint deformity. The purposes of this paper were to simulate the internal bone mineral density (BMD) change in the three-dimensional (3D) proximal tibia under different mechanical environments, and to explore the relationship between the mechanical environment and bone morphological abnormity. The right proximal tibia was scanned with CT to reconstruct a 3D proximal tibia model in MIMICS, which was then imported into the finite element software ANSYS to establish a 3D finite element model. The internal structure of the 3D proximal tibia of young healthy people was simulated using quantitative bone remodeling theory in combination with the finite element method; then, based on the changing pattern of joint contact force on the tibial plateau in valgus knees, the mechanical loading was changed, and the simulated normal tibia structure was used as the initial structure to simulate the internal structure of the 3D proximal tibia of older people with 6° valgus deformity. Four regions of interest (ROIs) were selected in the proximal tibia to quantitatively analyze BMD and compare with clinical measurements. The simulation results showed that the BMD distribution in the 3D proximal tibia was consistent with clinical measurements in normal knees, and that in valgus knees was consistent with clinical measurements of patients with osteoarthritis. This shows that the change of mechanical environment is the main cause of the change in subchondral bone structure, and that being under an abnormal mechanical environment for a long time may lead to osteoarthritis. Besides, the simulation method adopted in this paper can accurately simulate the internal structure of the 3D proximal tibia under different mechanical environments. It helps to better understand the mechanism of osteoarthritis and provides a theoretical basis and computational method for the prevention and treatment of osteoarthritis. It can also serve as a basis for further study of periprosthetic BMD changes after total knee arthroplasty, and provide a theoretical basis for the optimization design of prostheses.
2013-01-01
Background Bone can adjust its morphological structure to adapt to changes in the mechanical environment, i.e. bone structure change is related to mechanical loading. This implies that osteoarthritis may be closely associated with knee joint deformity. The purposes of this paper were to simulate the internal bone mineral density (BMD) change in the three-dimensional (3D) proximal tibia under different mechanical environments, and to explore the relationship between the mechanical environment and bone morphological abnormity. Methods The right proximal tibia was scanned with CT to reconstruct a 3D proximal tibia model in MIMICS, which was then imported into the finite element software ANSYS to establish a 3D finite element model. The internal structure of the 3D proximal tibia of young healthy people was simulated using quantitative bone remodeling theory in combination with the finite element method; then, based on the changing pattern of joint contact force on the tibial plateau in valgus knees, the mechanical loading was changed, and the simulated normal tibia structure was used as the initial structure to simulate the internal structure of the 3D proximal tibia of older people with 6° valgus deformity. Four regions of interest (ROIs) were selected in the proximal tibia to quantitatively analyze BMD and compare with clinical measurements. Results The simulation results showed that the BMD distribution in the 3D proximal tibia was consistent with clinical measurements in normal knees, and that in valgus knees was consistent with clinical measurements of patients with osteoarthritis. Conclusions It is shown that the change of mechanical environment is the main cause of the change in subchondral bone structure, and that being under an abnormal mechanical environment for a long time may lead to osteoarthritis. Besides, the simulation method adopted in this paper can accurately simulate the internal structure of the 3D proximal tibia under different mechanical environments. It helps to better understand the mechanism of osteoarthritis and provides a theoretical basis and computational method for the prevention and treatment of osteoarthritis. It can also serve as a basis for further study of periprosthetic BMD changes after total knee arthroplasty, and provide a theoretical basis for the optimization design of prostheses. PMID:24359345
An ice sheet model validation framework for the Greenland ice sheet
Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.
2018-01-01
We propose a new ice sheet model validation framework – the Cryospheric Model Comparison Tool (CmCt) – that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation. PMID:29697704
An Ice Sheet Model Validation Framework for the Greenland Ice Sheet
NASA Technical Reports Server (NTRS)
Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas A.; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey R.; Chambers, Don P.; Evans, Katherine J.;
2017-01-01
We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of less than 1 meter). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
Shi, Kuangyu; Bayer, Christine; Gaertner, Florian C; Astner, Sabrina T; Wilkens, Jan J; Nüsslin, Fridtjof; Vaupel, Peter; Ziegler, Sibylle I
2017-02-01
Positron-emission tomography (PET) with hypoxia specific tracers provides a noninvasive method to assess the tumor oxygenation status. Reaction-diffusion models have advantages in revealing the quantitative relation between in vivo imaging and the tumor microenvironment. However, there is no quantitative comparison of the simulation results with the real PET measurements yet. The lack of experimental support hampers further applications of computational simulation models. This study aims to compare the simulation results with a preclinical [18F]FMISO PET study and to optimize the reaction-diffusion model accordingly. Nude mice with xenografted human squamous cell carcinomas (CAL33) were investigated with a 2 h dynamic [18F]FMISO PET followed by immunofluorescence staining using the hypoxia marker pimonidazole and the endothelium marker CD31. A large data pool of tumor time-activity curves (TAC) was simulated for each mouse by feeding the arterial input function (AIF) extracted from experiments into the model with different configurations of the tumor microenvironment. A measured TAC was considered to match a simulated TAC when the difference metric was below a certain, noise-dependent threshold. As an extension to the well-established Kelly model, a flow-limited oxygen-dependent (FLOD) model was developed to improve the matching between measurements and simulations. The matching rate between the simulated TACs of the Kelly model and the mouse PET data ranged from 0 to 28.1% (on average 9.8%). By modifying the Kelly model to an FLOD model, the matching rate between the simulation and the PET measurements could be improved to 41.2-84.8% (on average 64.4%). Using a simulation data pool and a matching strategy, we were able to compare the simulated temporal course of dynamic PET with in vivo measurements. By modifying the Kelly model to an FLOD model, the computational simulation was able to approach the dynamic [18F]FMISO measurements in the investigated tumors.
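The matching strategy lends itself to a compact sketch: a measured TAC counts as matched when at least one simulated TAC from the pool lies within a noise-dependent RMS threshold. The pool, curves, and threshold below are synthetic illustrations, not the study's data or its exact difference metric.

```python
import numpy as np

def match_rate(measured_tacs, simulated_pool, threshold):
    """Fraction of measured time-activity curves with at least one simulated
    curve whose RMS difference falls below the given threshold."""
    hits = 0
    for tac in measured_tacs:
        d = np.sqrt(((simulated_pool - tac) ** 2).mean(axis=1))  # RMS vs every sim
        hits += bool((d < threshold).any())
    return hits / len(measured_tacs)

# Synthetic illustration: noisy copies of pool curves should nearly all match.
rng = np.random.default_rng(0)
pool = rng.random((500, 60))                       # 500 simulated TACs, 60 frames
meas = pool[rng.choice(500, 20)] + rng.normal(0.0, 0.05, (20, 60))
print(match_rate(meas, pool, threshold=0.08))
```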
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat
Transportation networks are critical to the social and economic function of nations. Given the continuing increase in the populations of cities throughout the world, the criticality of transportation infrastructure is expected to increase. Thus, it is ever more important to mitigate congestion as well as to assess the impact disruptions would have on individuals who depend on transportation for their work and livelihood. Moreover, several government organizations are responsible for ensuring transportation networks are available despite the constant threat of natural disasters and terrorist activities. Most of the previous transportation network vulnerability research has been performed in the context of static traffic models, many of which are formulated as traditional optimization problems. However, transportation networks are dynamic because their usage varies over time. Thus, more appropriate methods to characterize the vulnerability of transportation networks should consider their dynamic properties. This paper presents a quantitative approach to assess the vulnerability of a transportation network to disruptions with methods from traffic simulation. Our approach can prioritize the critical links over time and is generalizable to the case where both link and node disruptions are of concern. We illustrate the approach through a series of examples. Our results demonstrate that the approach provides quantitative insight into the time-varying criticality of links. Such an approach could be used as the objective function of less traditional optimization methods that use simulation and other techniques to evaluate the relative utility of a particular network defense to reduce vulnerability and increase resilience.
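A minimal single-time-slice sketch of link criticality scoring (the network, travel times, and demand below are invented, and this brute-force enumeration stands in for the paper's traffic simulation): each link is scored by the demand-weighted travel-time increase its removal causes, and repeating the scoring per time slice with time-varying demands yields the dynamic prioritization the abstract describes.

```python
import networkx as nx

def link_criticality(G, demands):
    """Rank each directed link by the total travel-time increase its loss
    would cause for a set of (origin, destination, demand) triples."""
    base = sum(d * nx.shortest_path_length(G, o, t, weight="time")
               for o, t, d in demands)
    scores = {}
    for e in list(G.edges):
        H = G.copy()
        H.remove_edge(*e)
        try:
            cost = sum(d * nx.shortest_path_length(H, o, t, weight="time")
                       for o, t, d in demands)
        except nx.NetworkXNoPath:
            cost = float("inf")          # a disconnecting link is maximally critical
        scores[e] = cost - base
    return sorted(scores.items(), key=lambda kv: -kv[1])

G = nx.DiGraph()
G.add_weighted_edges_from([("A", "B", 5), ("B", "C", 4), ("A", "C", 12), ("C", "D", 3)],
                          weight="time")
morning = [("A", "D", 100.0)]            # one OD pair; repeat per time slice
print(link_criticality(G, morning))
```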
From Single-Cell Dynamics to Scaling Laws in Oncology
NASA Astrophysics Data System (ADS)
Chignola, Roberto; Sega, Michela; Stella, Sabrina; Vyshemirsky, Vladislav; Milotti, Edoardo
We are developing a biophysical model of tumor biology. We follow a strictly quantitative approach where each step of model development is validated by comparing simulation outputs with experimental data. While this strategy may slow down our advancements, at the same time it provides an invaluable reward: we can trust simulation outputs and use the model to explore territories of cancer biology where current experimental techniques fail. Here, we review our multi-scale biophysical modeling approach and show how a description of cancer at the cellular level has led us to general laws obeyed by both in vitro and in vivo tumors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsumoto, H.; Eki, Y.; Kaji, A.
1993-12-01
An expert system which can support operators of fossil power plants in creating the optimum startup schedule and executing it accurately is described. The optimum turbine speed-up and load-up pattern is obtained in an iterative manner based on fuzzy reasoning, using quantitative calculations from plant dynamics models and qualitative knowledge in the form of schedule optimization rules with fuzziness. The rules represent relationships between stress margins and modification rates of the schedule parameters. Simulation analysis shows that the system provides quick and accurate plant startups.
NASA Astrophysics Data System (ADS)
Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank
2016-10-01
Thermoforming of continuously fiber reinforced thermoplastics (CFRTP) is ideally suited to thin-walled and complex-shaped products. By means of forming simulation, an initial validation of the producibility of a specific geometry, an optimization of the forming process, and the prediction of fiber reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented that enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing the curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus/Explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared by the proposed validation method.
NASA Astrophysics Data System (ADS)
van Poppel, Bret; Owkes, Mark; Nelson, Thomas; Lee, Zachary; Sowell, Tyler; Benson, Michael; Vasquez Guzman, Pablo; Fahrig, Rebecca; Eaton, John; Kurman, Matthew; Kweon, Chol-Bum; Bravo, Luis
2014-11-01
In this work, we present high-fidelity Computational Fluid Dynamics (CFD) results of liquid fuel injection from a pressure-swirl atomizer and compare the simulations to experimental results obtained using both shadowgraphy and phase-averaged X-ray computed tomography (CT) scans. The CFD and experimental results focus on the dense near-nozzle region to identify the dominant mechanisms of breakup during primary atomization. Simulations are performed using the NGA code of Desjardins et al (JCP 227 (2008)) and employ the volume of fluid (VOF) method proposed by Owkes and Desjardins (JCP 270 (2013)), a second order accurate, un-split, conservative, three-dimensional VOF scheme providing second order density fluxes and capable of robust and accurate high density ratio simulations. Qualitative features and quantitative statistics are assessed and compared for the simulation and experimental results, including the onset of atomization, spray cone angle, and drop size and distribution.
Quantitative validation of carbon-fiber laminate low velocity impact simulations
English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.
2015-09-26
Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks, and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described in conjunction. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang
2016-08-01
Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather station data toward dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing, and numerical simulation. Moreover, the potential advantages and disadvantages, the applicability, and the development trends of these techniques were summarized, aiming to add fundamental knowledge for understanding urban thermal environment assessment and optimization.
A Virtual Look at Epstein–Barr Virus Infection: Biological Interpretations
Delgado-Eckert, Edgar; Hadinoto, Vey; Jarrah, Abdul S; Laubenbacher, Reinhard; Lee, Kichol; Luzuriaga, Katherine; Polys, Nicholas F; Thorley-Lawson, David A
2007-01-01
The possibility of using computer simulation and mathematical modeling to gain insight into biological and other complex systems is receiving increased attention. However, it is as yet unclear to what extent these techniques will provide useful biological insights or even what the best approach is. Epstein–Barr virus (EBV) provides a good candidate to address these issues. It persistently infects most humans and is associated with several important diseases. In addition, a detailed biological model has been developed that provides an intricate understanding of EBV infection in the naturally infected human host and accounts for most of the virus' diverse and peculiar properties. We have developed an agent-based computer model/simulation (PathSim, Pathogen Simulation) of this biological model. The simulation is performed on a virtual grid that represents the anatomy of the tonsils of the nasopharyngeal cavity (Waldeyer ring) and the peripheral circulation—the sites of EBV infection and persistence. The simulation is presented via a user friendly visual interface and reproduces quantitative and qualitative aspects of acute and persistent EBV infection. The simulation also had predictive power in validation experiments involving certain aspects of viral infection dynamics. Moreover, it allows us to identify switch points in the infection process that direct the disease course towards the end points of persistence, clearance, or death. Lastly, we were able to identify parameter sets that reproduced aspects of EBV-associated diseases. These investigations indicate that such simulations, combined with laboratory and clinical studies and animal models, will provide a powerful approach to investigating and controlling EBV infection, including the design of targeted anti-viral therapies. PMID:17953479
Gravitational Effects on Flow Instability and Transition in Low Density Jets
NASA Technical Reports Server (NTRS)
Agrawal, Ajay K.; Parthasarathy, Ramkumar
2004-01-01
Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on near field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2-second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive line of sight measurement technique suited for the microgravity environment. The flow structure was characterized by distributions of helium mole fraction obtained from color schlieren images taken at 60 Hz. Results show that the jet in microgravity was up to 70 percent wider than that in Earth gravity. Experiments reveal that the global flow oscillations observed in Earth gravity are absent in microgravity. The report provides quantitative details of flow evolution as the experiment undergoes change in gravity in the drop tower.
Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios
2017-02-01
Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
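A minimal sketch of the general approach described above (not the authors' algorithms): compare the observed nearest-neighbour distances of segmented cell centroids against a Monte Carlo baseline of uniformly random positions in the same imaged volume. All dimensions, counts, and names are illustrative.

```python
# Illustrative sketch: 3D nearest-neighbour distances vs. a Monte Carlo
# "complete spatial randomness" baseline in the same imaged volume.
import numpy as np

rng = np.random.default_rng(0)

def nn_distances(points):
    """Nearest-neighbour distance for each point (brute force, small n)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

# Hypothetical segmented cell centroids in a 200 x 200 x 50 um volume.
cells = rng.uniform([0, 0, 0], [200, 200, 50], size=(300, 3))
obs = nn_distances(cells).mean()

# Monte Carlo baseline: redraw the same number of cells uniformly at random.
null = np.array([nn_distances(rng.uniform([0, 0, 0], [200, 200, 50],
                                          size=cells.shape)).mean()
                 for _ in range(500)])
p = (np.sum(null <= obs) + 1) / (null.size + 1)   # one-sided MC p-value
print(f"observed mean NN distance {obs:.2f} um, MC p ~ {p:.3f}")
```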
QFASAR: Quantitative fatty acid signature analysis with R
Bromaghin, Jeffrey F.
2017-01-01
Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
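Since qfasar itself is an R package, the following Python sketch illustrates only the core QFASA idea it implements: estimate diet proportions as the convex combination of mean prey fatty acid signatures closest to the predator's signature. The real method additionally applies calibration coefficients and specialized distance measures; all data here are invented.

```python
# Illustrative QFASA-style diet estimation: constrained least squares
# on the simplex of diet proportions. Signatures below are made up.
import numpy as np
from scipy.optimize import minimize

prey = np.array([[0.40, 0.30, 0.20, 0.10],   # mean signature, prey species A
                 [0.10, 0.50, 0.25, 0.15],   # prey species B
                 [0.25, 0.25, 0.25, 0.25]])  # prey species C
predator = np.array([0.22, 0.38, 0.23, 0.17])

def dist(p):
    """Squared distance between predator signature and mixed prey signature."""
    return np.sum((predator - p @ prey) ** 2)

n = prey.shape[0]
res = minimize(dist, x0=np.full(n, 1 / n), method="SLSQP",
               bounds=[(0, 1)] * n,
               constraints={"type": "eq", "fun": lambda p: p.sum() - 1})
print("estimated diet proportions:", res.x.round(3))
```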
Simulating Extraterrestrial Ices in the Laboratory
NASA Astrophysics Data System (ADS)
Berisford, D. F.; Carey, E. M.; Hand, K. P.; Choukroun, M.
2017-12-01
Several ongoing experiments at JPL attempt to simulate the ice environment for various regimes associated with icy moons. The Europa Penitent Ice Experiment (EPIX) simulates the surface environment of an icy moon to investigate the physics of ice surface morphology growth. This experiment features half-meter-scale cryogenic ice samples, a cryogenic radiative sink environment, vacuum conditions, and diurnally cycled solar simulation. The experiment also includes several smaller fixed-geometry vacuum chambers for ice simulation at Earth-like and intermediate temperature and vacuum conditions, for development of surface morphology growth scaling relations. Additionally, an ice cutting facility built on a similar platform provides qualitative data on the mechanical behavior of cryogenic ice with impurities under vacuum, and allows testing of ice cutting/sampling tools relevant for landing spacecraft. A larger cutting facility is under construction at JPL, which will provide more quantitative data and allow full-scale sampling tool tests. Another facility, the JPL Ice Physics Laboratory, features icy analog simulant preparation capabilities that span icy solar system objects such as Mars, Ceres and the icy satellites of Saturn and Jupiter. In addition, the Ice Physics Lab has unique facilities for Icy Analog Tidal Simulation and Rheological Studies of Cryogenic Icy Slurries, as well as equipment to perform thermal and mechanical properties testing on icy analog materials and their response to sinusoidal tidal stresses.
Jha, Abhinav K; Caffo, Brian; Frey, Eric C
2016-01-01
The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.
Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus
2017-01-01
Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
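A hedged sketch of the waveform-relaxation idea mentioned above (illustrative, not the reference implementation): the whole trajectory of a rate network is integrated repeatedly, with the instantaneous coupling term taken from the previous sweep, until successive sweeps agree. Network size, weights, and tolerances are arbitrary.

```python
# Illustrative waveform relaxation for rate units with instantaneous
# coupling: iterate tau*dx/dt = -x + W @ x(t) + I over the full interval,
# using the previous iterate for the coupling term.
import numpy as np

rng = np.random.default_rng(1)
n, steps, dt, tau = 10, 400, 0.1, 10.0
W = rng.normal(0, 0.2 / np.sqrt(n), (n, n))   # weak random coupling
I = rng.normal(0, 1, n)                       # constant external drive

x = np.zeros((steps, n))                      # initial guess for the waveform
for sweep in range(50):
    x_prev = x.copy()
    for t in range(steps - 1):                # forward Euler within one sweep
        coupling = W @ x_prev[t + 1]          # instantaneous term, last iterate
        x[t + 1] = x[t] + dt / tau * (-x[t] + coupling + I)
    err = np.max(np.abs(x - x_prev))
    if err < 1e-10:
        break
print(f"converged after {sweep + 1} sweeps, max change {err:.2e}")
```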
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, A; Peterson, T; Johnson, L
2015-06-15
Purpose: To evaluate the impact of the exceptional energy resolution of germanium detectors for preclinical SPECT in comparison to conventional detectors. Methods: A cylindrical water phantom was created in GATE with a spherical Tc-99m source in the center. Sixty-four projections over 360 degrees using a pinhole collimator were simulated. The same phantom was simulated using air instead of water to establish the true reconstructed voxel intensity without attenuation. Attenuation correction based on the Chang method was performed on MLEM reconstructed images from the water phantom to determine a quantitative measure of the effectiveness of the attenuation correction. Similarly, a NEMA phantom was simulated, and the effectiveness of the attenuation correction was evaluated. Both simulations were carried out using both NaI detectors with an energy resolution of 10% FWHM and Ge detectors with an energy resolution of 1%. Results: Analysis shows that attenuation correction without scatter correction using germanium detectors can reconstruct a small spherical source to within 3.5%. Scatter analysis showed that for standard sized objects in a preclinical scanner, a NaI detector has a scatter-to-primary ratio between 7% and 12.5% compared to between 0.8% and 1.5% for a Ge detector. Preliminary results from line profiles through the NEMA phantom suggest that applying attenuation correction without scatter correction provides acceptable results for the Ge detectors but overestimates the phantom activity using NaI detectors. Due to the decreased scatter, we believe that the spillover ratio for the air and water cylinders in the NEMA phantom will be lower using germanium detectors compared to NaI detectors. Conclusion: This work indicates that the superior energy resolution of germanium detectors allows fewer scattered photons to be included within the energy window compared to traditional SPECT detectors. This may allow for quantitative SPECT without implementing scatter correction, reducing uncertainties introduced by scatter correction algorithms. Funding provided by NIH/NIBIB grant R01EB013677; Todd Peterson, Ph.D., has had a research contract with PHDs Co., Knoxville, TN.
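For readers unfamiliar with the attenuation correction named here, a simplified 2D sketch of first-order Chang correction follows (assumed parameters, not the authors' GATE/MLEM pipeline): every pixel of the reconstruction is multiplied by the inverse of its attenuation survival factor averaged over projection angles.

```python
# Illustrative first-order Chang attenuation correction in 2D.
import numpy as np

mu_water = 0.15                     # 1/cm, approx. for 140 keV Tc-99m photons
nx, pix_cm = 32, 0.2
yy, xx = np.mgrid[:nx, :nx]
mu_map = np.where((xx - 16) ** 2 + (yy - 16) ** 2 < 12 ** 2, mu_water, 0.0)

def survival(mu, x, y, angle, step=1.0):
    """exp(-line integral of mu) from pixel (x, y) to the image edge."""
    total, cx, cy = 0.0, float(x), float(y)
    dx, dy = np.cos(angle) * step, np.sin(angle) * step
    while 0 <= cx < nx and 0 <= cy < nx:
        total += mu[int(cy), int(cx)] * step * pix_cm
        cx, cy = cx + dx, cy + dy
    return np.exp(-total)

angles = np.linspace(0, 2 * np.pi, 32, endpoint=False)
chang = np.zeros_like(mu_map)
for y in range(nx):
    for x in range(nx):
        mean_surv = np.mean([survival(mu_map, x, y, a) for a in angles])
        chang[y, x] = 1.0 / mean_surv
# corrected = mlem_reconstruction * chang   # applied voxel-wise
print("correction factor at phantom centre:", round(chang[16, 16], 2))
```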
Linearization improves the repeatability of quantitative dynamic contrast-enhanced MRI.
Jones, Kyle M; Pagel, Mark D; Cárdenas-Rodríguez, Julio
2018-04-01
The purpose of this study was to compare the repeatabilities of the linear and nonlinear Tofts and reference region models (RRM) for dynamic contrast-enhanced MRI (DCE-MRI). Simulated and experimental DCE-MRI data from 12 rats with a flank tumor of C6 glioma acquired over three consecutive days were analyzed using four quantitative and semi-quantitative DCE-MRI metrics. The quantitative methods used were: 1) linear Tofts model (LTM), 2) non-linear Tofts model (NTM), 3) linear RRM (LRRM), and 4) non-linear RRM (NRRM). The following semi-quantitative metrics were used: 1) maximum enhancement ratio (MER), 2) time to peak (TTP), 3) initial area under the curve (iauc64), and 4) slope. LTM and NTM were used to estimate Ktrans, while LRRM and NRRM were used to estimate Ktrans relative to muscle (RKtrans). Repeatability was assessed by calculating the within-subject coefficient of variation (wSCV) and the percent intra-subject variation (iSV) determined with the Gage R&R analysis. The iSV for RKtrans using LRRM was two-fold lower compared to NRRM at all simulated and experimental conditions. A similar trend was observed for the Tofts model, where LTM was at least 50% more repeatable than the NTM under all experimental and simulated conditions. The semi-quantitative metrics iauc64 and MER were as repeatable as Ktrans and RKtrans estimated by LTM and LRRM, respectively. The iSV for iauc64 and MER were significantly lower than the iSV for slope and TTP. In simulations and experimental results, linearization improves the repeatability of quantitative DCE-MRI by at least 30%, making it as repeatable as semi-quantitative metrics. Copyright © 2017 Elsevier Inc. All rights reserved.
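A short sketch of why the linear Tofts fit is attractive (illustrative synthetic data, not the study's code): the Murase-style linearization Ct(t) = Ktrans·∫Cp dτ − kep·∫Ct dτ turns parameter estimation into a single linear least-squares solve, avoiding the iteration and starting-value sensitivity of nonlinear fitting.

```python
# Illustrative linearized Tofts fit on a synthetic tissue curve.
import numpy as np
from scipy.integrate import cumulative_trapezoid

t = np.linspace(0, 5, 301)                        # minutes
Cp = 3.0 * (np.exp(-0.5 * t) - np.exp(-4.0 * t))  # toy plasma input function
Ktrans_true, kep_true = 0.25, 0.60                # 1/min, assumed

# Forward-simulate a "measured" tissue curve from the Tofts ODE.
Ct = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(len(t) - 1):
    Ct[i + 1] = Ct[i] + dt * (Ktrans_true * Cp[i] - kep_true * Ct[i])
Ct += np.random.default_rng(2).normal(0, 0.002, t.size)  # measurement noise

# Linearized fit: design matrix of running integrals, one lstsq solve.
A = np.column_stack([cumulative_trapezoid(Cp, t, initial=0),
                     -cumulative_trapezoid(Ct, t, initial=0)])
Ktrans_fit, kep_fit = np.linalg.lstsq(A, Ct, rcond=None)[0]
print(f"Ktrans = {Ktrans_fit:.3f} (true 0.25), kep = {kep_fit:.3f} (true 0.60)")
```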
Multi-scale modeling in cell biology
Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick
2009-01-01
Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808
SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.
Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko
2013-05-01
Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
Gao, Xi; Kong, Bo; Vigil, R Dennis
2017-01-01
A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter is varied (inner cylinder rotation speed). However, biomass growth curves are consistently over-predicted and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beleggia, M.; Helmholtz-Zentrum Berlin für Materialien und Energie, Berlin; Kasama, T.
We apply off-axis electron holography and Lorentz microscopy in the transmission electron microscope to map the electric field generated by a sharp biased metallic tip. A combination of experimental data and modelling provides quantitative information about the potential and the field around the tip. Close to the tip apex, we measure a maximum field intensity of 82 MV/m, corresponding to a field k factor of 2.5, in excellent agreement with theory. In order to verify the validity of the measurements, we use the inferred charge density distribution in the tip region to generate simulated phase maps and Fresnel (out-of-focus) images for comparison with experimental measurements. While the overall agreement is excellent, the simulations also highlight the presence of an unexpected astigmatic contribution to the intensity in a highly defocused Fresnel image, which is thought to result from the geometry of the applied field.
Thermal dynamics on the lattice with exponentially improved accuracy
NASA Astrophysics Data System (ADS)
Pawlowski, Jan M.; Rothkopf, Alexander
2018-03-01
We present a novel simulation prescription for thermal quantum fields on a lattice that operates directly in imaginary frequency space. By distinguishing initial conditions from quantum dynamics it provides access to correlation functions also outside of the conventional Matsubara frequencies ω_n = 2πnT. In particular it resolves their frequency dependence between ω = 0 and ω_1 = 2πT, where the thermal physics ω ∼ T of e.g. transport phenomena is dominantly encoded. Real-time spectral functions are related to these correlators via an integral transform with rational kernel, so that their unfolding from the novel simulation data is exponentially improved compared to standard Euclidean simulations. We demonstrate this improvement within a non-trivial 0+1-dimensional quantum mechanical toy model and show that spectral features inaccessible in standard Euclidean simulations are quantitatively captured.
Gambarota, Giulio
2017-07-15
Magnetic resonance spectroscopy (MRS) is a well established modality for investigating tissue metabolism in vivo. In recent years, many efforts by the scientific community have been directed towards the improvement of metabolite detection and quantitation. Quantum mechanics simulations allow for investigations of the MR signal behaviour of metabolites; thus, they provide an essential tool in the optimization of metabolite detection. In this review, we will examine quantum mechanics simulations based on the density matrix formalism. The density matrix was introduced by von Neumann in 1927 to take into account statistical effects within the theory of quantum mechanics. We will discuss the main steps of the density matrix simulation of an arbitrary spin system and show some examples for the strongly coupled two spin system. Copyright © 2016 Elsevier Inc. All rights reserved.
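A minimal sketch of the density-matrix machinery reviewed here, for a weakly coupled two-spin system (all offsets and couplings assumed): prepare equilibrium magnetization, apply a hard 90° pulse, evolve under chemical shifts plus J coupling, and read out the FID as Tr[ρ(Ix + iIy)].

```python
# Illustrative density-matrix simulation of a weakly coupled two-spin system.
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 0.5], [0.5, 0]], dtype=complex)
sy = np.array([[0, -0.5j], [0.5j, 0]])
sz = np.array([[0.5, 0], [0, -0.5]], dtype=complex)
I2, kron = np.eye(2), np.kron
Ix = kron(sx, I2) + kron(I2, sx)                # total transverse operators
Iy = kron(sy, I2) + kron(I2, sy)
I1z, I2z = kron(sz, I2), kron(I2, sz)

w1, w2 = 2 * np.pi * 100.0, 2 * np.pi * 140.0   # offsets in rad/s (assumed)
J = 2 * np.pi * 7.0                             # J coupling in rad/s (assumed)
H = w1 * I1z + w2 * I2z + J * (I1z @ I2z)       # weak-coupling Hamiltonian

rho = I1z + I2z                                 # high-temperature equilibrium
P90 = expm(-1j * (np.pi / 2) * Iy)              # hard 90-degree pulse about y
rho = P90 @ rho @ P90.conj().T

dt, npts = 1e-4, 4096                           # 10 kHz spectral width
U = expm(-1j * H * dt)
fid = np.empty(npts, dtype=complex)
for k in range(npts):
    fid[k] = np.trace(rho @ (Ix + 1j * Iy))     # detectable magnetization
    rho = U @ rho @ U.conj().T
spectrum = np.fft.fftshift(np.fft.fft(fid))     # doublets near 100 and 140 Hz
```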
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process-performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and discusses in detail the findings of a gap analysis of the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
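A hedged sketch of the kind of Monte Carlo baseline model described (driver weights and input distributions are invented, not the ACSI's): uncertain driver scores are propagated through assumed regression weights to a distribution of predicted satisfaction scores.

```python
# Illustrative Monte Carlo propagation of uncertain ACSI-style drivers.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
expectations = rng.triangular(70, 78, 88, n)   # 0-100 scale driver scores
quality      = rng.triangular(72, 80, 90, n)
value        = rng.triangular(60, 72, 85, n)

w = np.array([0.15, 0.55, 0.30])               # assumed driver weights
acsi = np.column_stack([expectations, quality, value]) @ w

print(f"baseline ACSI ~ {acsi.mean():.1f}, "
      f"90% interval [{np.percentile(acsi, 5):.1f}, "
      f"{np.percentile(acsi, 95):.1f}]")
```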
Leachate Testing of Hamlet City Lake, North Carolina, Sediment
1992-11-01
Sediment leaching studies of Hamlet City Lake, Hamlet, NC, were conducted in...laboratories at the U.S. Army Engineer Waterways Experiment Station. The purpose of these studies was to provide quantitative information on the...conditions similar to landfarming. The study involved three elements: batch leach tests, column leach tests, and simulations using the Hydrologic
Chemical and Physical Characterization of Comp A-3 Type II Prills
2013-06-01
the composition and properties of the explosive for implementation into modeling and simulation tools as part of the Multi-scale Response of...emulsion were identified using desorption-gas chromatography/mass spectroscopy (D-GC-MS) and Fourier transform infrared (FTIR) spectroscopy. Quantitative...understanding the microstructure of the pressed explosive and provides critical information for the development of a high fidelity particle-based coarse-grain
Dynamics of water bound to crystalline cellulose.
O'Neill, Hugh; Pingali, Sai Venkatesh; Petridis, Loukas; He, Junhong; Mamontov, Eugene; Hong, Liang; Urban, Volker; Evans, Barbara; Langan, Paul; Smith, Jeremy C; Davison, Brian H
2017-09-19
Interactions of water with cellulose are of both fundamental and technological importance. Here, we characterize the properties of water associated with cellulose using deuterium labeling, neutron scattering and molecular dynamics simulation. Quasi-elastic neutron scattering provided quantitative details about the dynamical relaxation processes that occur and was supported by structural characterization using small-angle neutron scattering and X-ray diffraction. We can unambiguously detect two populations of water associated with cellulose. The first is "non-freezing bound" water that gradually becomes mobile with increasing temperature and can be related to surface water. The second population is consistent with confined water that abruptly becomes mobile at ~260 K, and can be attributed to water that accumulates in the narrow spaces between the microfibrils. Quantitative analysis of the QENS data showed that, at 250 K, the water diffusion coefficient was 0.85 ± 0.04 × 10⁻¹⁰ m² s⁻¹ and increased to 1.77 ± 0.09 × 10⁻¹⁰ m² s⁻¹ at 265 K. MD simulations are in excellent agreement with the experiments and support the interpretation that water associated with cellulose exists in two dynamical populations. Our results provide clarity to previous work investigating the states of bound water and provide a new approach for probing water interactions with lignocellulose materials.
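As a back-of-envelope illustration (not a result from the paper), the two quoted diffusion coefficients imply an apparent Arrhenius activation energy:

```python
# E_a = R * ln(D2/D1) / (1/T1 - 1/T2), using the two values in the abstract.
import numpy as np

R = 8.314                   # J mol^-1 K^-1
T1, D1 = 250.0, 0.85e-10    # m^2 s^-1 at 250 K
T2, D2 = 265.0, 1.77e-10    # m^2 s^-1 at 265 K
Ea = R * np.log(D2 / D1) / (1 / T1 - 1 / T2)
print(f"apparent activation energy ~ {Ea / 1000:.0f} kJ/mol")  # ~27 kJ/mol
```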
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Peter Andrew
The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation, and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.
HYDRODYNAMIC SIMULATIONS OF H ENTRAINMENT AT THE TOP OF He-SHELL FLASH CONVECTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodward, Paul R.; Lin, Pei-Hung; Herwig, Falk, E-mail: paul@lcse.umn.edu, E-mail: fherwig@uvic.ca
2015-01-01
We present the first three-dimensional, fully compressible gas-dynamics simulations in 4π geometry of He-shell flash convection with proton-rich fuel entrainment at the upper boundary. This work is motivated by the insufficiently understood observed consequences of the H-ingestion flash in post-asymptotic giant branch (post-AGB) stars (Sakurai's object) and metal-poor AGB stars. Our investigation is focused on the entrainment process at the top convection boundary and on the subsequent advection of H-rich material into deeper layers, and we therefore ignore the burning of the proton-rich fuel in this study. We find that for our deep convection zone, coherent convective motions of near-global scale appear to dominate the flow. At the top boundary convective shear flows are stable against Kelvin-Helmholtz instabilities. However, such shear instabilities are induced by the boundary-layer separation in large-scale, opposing flows. This links the global nature of thick shell convection with the entrainment process. We establish the quantitative dependence of the entrainment rate on grid resolution. With our numerical technique, simulations with 1024³ cells or more are required to reach a numerical fidelity appropriate for this problem. However, only the result from the 1536³ simulation provides a clear indication that we approach convergence with regard to the entrainment rate. Our results demonstrate that our method, which is described in detail, can provide quantitative results related to entrainment and convective boundary mixing in deep stellar interior environments with very stiff convective boundaries. For the representative case we study in detail, we find an entrainment rate of 4.38 ± 1.48 × 10⁻¹³ M☉ s⁻¹.
Some aspects of robotics calibration, design and control
NASA Technical Reports Server (NTRS)
Tawfik, Hazem
1990-01-01
The main objective is to introduce techniques in the areas of testing and calibration, design, and control of robotic systems. A statistical technique is described that analyzes a robot's performance and provides quantitative three-dimensional evaluation of its repeatability, accuracy, and linearity. Based on this analysis, a corrective action should be taken to compensate for any existing errors and enhance the robot's overall accuracy and performance. A comparison between robotics simulation software packages that were commercially available (SILMA, IGRIP) and that of Kennedy Space Center (ROBSIM) is also included. These computer codes simulate the kinematics and dynamics patterns of various robot arm geometries to help the design engineer in sizing and building the robot manipulator and control system. A brief discussion on an adaptive control algorithm is provided.
Efficient Constant-Time Complexity Algorithm for Stochastic Simulation of Large Reaction Networks.
Thanh, Vo Hong; Zunino, Roberto; Priami, Corrado
2017-01-01
Exact stochastic simulation is an indispensable tool for a quantitative study of biochemical reaction networks. The simulation realizes the time evolution of the model by randomly choosing a reaction to fire, with probability proportional to the reaction propensity, and updating the system state accordingly. Two computationally expensive tasks in simulating large biochemical networks are the selection of next reaction firings and the update of reaction propensities due to state changes. We present in this work a new exact algorithm to optimize both of these simulation bottlenecks. Our algorithm employs composition-rejection on the propensity bounds of reactions to select the next reaction firing. The selection of next reaction firings is independent of the number of reactions, while the update of propensities is skipped and performed only when necessary. It therefore provides a favorable scaling of the computational complexity in simulating large reaction networks. We benchmark our new algorithm against the state-of-the-art algorithms available in the literature to demonstrate its applicability and efficiency.
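A hedged sketch of the composition-rejection selection step (simplified from the algorithm class this work builds on; the paper's contribution additionally maintains propensity bounds incrementally so that updates and bin rebuilds can be skipped): reactions are binned by propensity magnitude, a bin is chosen in proportion to its propensity sum, and a reaction is drawn within the bin by rejection against the bin's upper bound.

```python
# Illustrative composition-rejection selection: returns reaction j with
# probability a_j / sum(a). Bins are rebuilt per call here for clarity.
import math, random

def cr_select(propensities):
    bins = {}                                  # exponent -> reaction indices
    for j, a in enumerate(propensities):
        if a > 0:
            bins.setdefault(math.frexp(a)[1], []).append(j)  # a in [2^(e-1), 2^e)
    sums = {e: sum(propensities[j] for j in idx) for e, idx in bins.items()}
    # composition step: pick a bin proportional to its propensity sum
    r = random.random() * sum(sums.values())
    for e, s in sums.items():
        if r < s:
            break
        r -= s
    ub = 2.0 ** e                              # upper propensity bound of bin
    while True:                                # rejection step within the bin
        j = random.choice(bins[e])
        if random.random() * ub < propensities[j]:   # accept w.p. >= 1/2
            return j

a = [0.3, 5.0, 1.2, 0.01, 2.5]
counts = [0] * len(a)
for _ in range(100_000):
    counts[cr_select(a)] += 1
print([c / 100_000 for c in counts])           # approaches a_j / sum(a)
```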
Geochemical Reaction Mechanism Discovery from Molecular Simulation
Stack, Andrew G.; Kent, Paul R. C.
2014-11-10
Methods to explore reactions using computer simulation are becoming increasingly quantitative, versatile, and robust. In this review, a rationale for how molecular simulation can help build better geochemical kinetics models is first given. We summarize some common methods that geochemists use to simulate reaction mechanisms, specifically classical molecular dynamics and quantum chemical methods, and discuss their strengths and weaknesses. Useful tools such as umbrella sampling and metadynamics that enable one to explore reactions are discussed. Several case studies wherein geochemists have used these tools to understand reaction mechanisms are presented, including water exchange and sorption on aqueous species and mineral surfaces, surface charging, crystal growth and dissolution, and electron transfer. The impact that molecular simulation has had on our understanding of geochemical reactivity is highlighted in each case. In the future, it is anticipated that molecular simulation of geochemical reaction mechanisms will become more commonplace as a tool to validate and interpret experimental data, and to provide a check on the plausibility of geochemical kinetic models.
Key issues review: numerical studies of turbulence in stars
NASA Astrophysics Data System (ADS)
Arnett, W. David; Meakin, Casey
2016-10-01
Three major problems of single-star astrophysics are convection, magnetic fields and rotation. Numerical simulations of convection in stars now have sufficient resolution to be truly turbulent, with effective Reynolds numbers of Re > 10⁴, and some turbulent boundary layers have been resolved. Implications of these developments are discussed for stellar structure, evolution and explosion as supernovae. Methods for three-dimensional (3D) simulations of stars are compared and discussed for 3D atmospheres, solar rotation, core-collapse and stellar boundary layers. Reynolds-averaged Navier-Stokes (RANS) analysis of the numerical simulations has been shown to provide a novel and quantitative estimate of resolution errors. Present treatments of stellar boundaries require revision, even for early burning stages (e.g. for mixing regions during He-burning). As stellar core-collapse is approached, asymmetry and fluctuations grow, rendering spherically symmetric models of progenitors more unrealistic. The numerical resolution of several different types of three-dimensional (3D) stellar simulations is compared; it is suggested that core-collapse simulations may be under-resolved. The Rayleigh-Taylor instability in explosions has a deep connection to convection, for which the abundance structure in supernova remnants may provide evidence.
NASA Astrophysics Data System (ADS)
Venkataraman, Ajey; Shade, Paul A.; Adebisi, R.; Sathish, S.; Pilchak, Adam L.; Viswanathan, G. Babu; Brandes, Matt C.; Mills, Michael J.; Sangid, Michael D.
2017-05-01
Ti-7Al is a good model material for mimicking the α phase response of the near-α and α+β phases of many widely used titanium-based engineering alloys, including Ti-6Al-4V. In this study, three model structures of Ti-7Al are investigated using atomistic simulations by varying the Ti and Al atom positions within the crystalline lattice. These atomic arrangements are based on transmission electron microscopy observations of short-range order. The elastic constants of the three model structures considered are calculated using molecular dynamics simulations. Resonant ultrasound spectroscopy experiments are conducted to obtain the elastic constants at room temperature, and good agreement is found between the simulation and experimental results, providing confidence that the model structures are reasonable. Additionally, energy barriers for crystalline slip are established for these structures by calculating the γ-surfaces for different slip systems. Finally, the positions of Al atoms with regard to solid solution strengthening are studied using density functional theory simulations, which demonstrate a higher energy barrier for slip when the Al solute atom is closer to (or at) the fault plane. These results provide quantitative insights into the deformation mechanisms of this alloy.
Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.
2011-01-01
Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.
NASA Astrophysics Data System (ADS)
Ward, T.; Fleming, J. S.; Hoffmann, S. M. A.; Kemp, P. M.
2005-11-01
Simulation is useful in the validation of functional image analysis methods, particularly when considering the number of analysis techniques currently available lacking thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results making it problematic to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF), and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows for the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects that are to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, then transformed to the individual subject space, convolved with a measured PSF and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College, London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to system resolution over the slightly larger kernel used routinely. Significant correlation was found between effective volume of a simulated abnormality and the detected size using SPM99. Improvements in VOI analysis sensitivity were found when using the region median over the region mean. The method and dataset provide an efficient methodology for use in the comparison and cross validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.
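A simplified 2D sketch of the simulation strategy described above (a Gaussian stands in for the measured PSF, and the geometry is invented; the actual method defines the abnormality in stereotactic space and transforms it to each subject):

```python
# Illustrative insertion of a PSF-convolved perfusion deficit into a
# "normal" image to create a simulated abnormal subject.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
normal = 100.0 + rng.normal(0.0, 2.0, (128, 128))  # stand-in normal slice

# Abnormality defined as a fractional perfusion deficit in template space.
yy, xx = np.mgrid[:128, :128]
deficit = np.zeros((128, 128))
deficit[(yy - 60) ** 2 + (xx - 70) ** 2 < 10 ** 2] = 0.30  # 30% reduction

sigma = 8.0 / 2.355                             # FWHM of 8 px for system PSF
deficit_psf = gaussian_filter(deficit, sigma)   # PSF-convolved abnormality
simulated = normal * (1.0 - deficit_psf)        # simulated abnormal subject
```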
Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando
2017-01-01
Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.
Combining Experiments and Simulations Using the Maximum Entropy Principle
Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten
2014-01-01
A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges. PMID:24586124
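A minimal sketch of the generic maximum-entropy reweighting recipe discussed here (invented data; λ is the Lagrange multiplier enforcing the experimental constraint): simulation frames receive minimally perturbed weights w_i ∝ exp(λ·o_i) chosen so that the weighted average of the observable matches experiment.

```python
# Illustrative maximum-entropy reweighting of simulation frames.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(4)
o = rng.normal(3.0, 1.0, 5000)    # observable per frame; simulation mean = 3.0
o_exp = 3.4                       # experimental target value (assumed)

def mismatch(lam):
    w = np.exp(lam * (o - o.max()))   # shifted exponent for numerical stability
    w /= w.sum()
    return w @ o - o_exp

lam = brentq(mismatch, -50, 50)       # solve for the Lagrange multiplier
w = np.exp(lam * (o - o.max())); w /= w.sum()
print(f"lambda = {lam:.3f}, reweighted mean = {w @ o:.3f}")
```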
Bera, Maitreyee; Ortel, Terry W.
2018-01-12
The U.S. Geological Survey, in cooperation with DuPage County Stormwater Management Department, is testing a near real-time streamflow simulation system that assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek and West Branch DuPage River drainage basins in DuPage County, Illinois. As part of this effort, the U.S. Geological Survey maintains a database of hourly meteorological and hydrologic data for use in this near real-time streamflow simulation system. Among these data are next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data, which are retrieved from the North Central River Forecasting Center of the National Weather Service. The DuPage County streamflow simulation system uses these quantitative precipitation forecast data to create streamflow predictions for the two simulated drainage basins. This report discusses in detail how these data are processed for inclusion in the Watershed Data Management files used in the streamflow simulation system for the Salt Creek and West Branch DuPage River drainage basins.
Gyrokinetic modeling of impurity peaking in JET H-mode plasmas
NASA Astrophysics Data System (ADS)
Manas, P.; Camenen, Y.; Benkadda, S.; Weisen, H.; Angioni, C.; Casson, F. J.; Giroud, C.; Gelfusa, M.; Maslov, M.
2017-06-01
Quantitative comparisons are presented between gyrokinetic simulations and experimental values of the carbon impurity peaking factor in a database of JET H-modes during the carbon wall era. These plasmas feature strong NBI heating and hence high values of toroidal rotation and corresponding gradient. Furthermore, the carbon profiles present particularly interesting shapes for fusion devices, i.e., hollow in the core and peaked near the edge. Dependencies of the experimental carbon peaking factor (R/L_nC) on plasma parameters are investigated via multilinear regressions. A marked correlation between R/L_nC and the normalised toroidal rotation gradient is observed in the core, which suggests an important role of the rotation in establishing hollow carbon profiles. The carbon peaking factor is then computed with the gyrokinetic code GKW, using a quasi-linear approach, supported by a few non-linear simulations. The comparison of the quasi-linear predictions to the experimental values at mid-radius reveals two main regimes. At low normalised collisionality ν* and T_e/T_i < 1, the gyrokinetic simulations quantitatively recover experimental carbon density profiles, provided that rotodiffusion is taken into account. In contrast, at higher ν* and T_e/T_i > 1, the very hollow experimental carbon density profiles are never predicted by the simulations and the carbon density peaking is systematically overestimated. This points to a possible missing ingredient in this regime.
Impacts of using inbred animals in studies for detection of quantitative trait loci.
Freyer, G; Vukasinovic, N; Cassell, B
2009-02-01
Effects of utilizing inbred and noninbred family structures in experiments for detection of quantitative trait loci (QTL) were compared in this simulation study. Simulations were based on a general pedigree design originating from 2 unrelated sires. A variance component approach of mapping QTL was applied to simulated data that reflected common family structures from dairy populations. Five different family structures were considered: FS0 without inbreeding, FS1 with an inbred sire from an aunt-nephew mating, FS2 with an inbred sire originating from a half-sib mating, FS3 and FS4 based on FS2 but containing an increased number of offspring of the inbred sire (FS3), and another extremely inbred sire with its final offspring (FS4). Sixty replicates of each of the 5 family structures, in each of 2 simulation scenarios, were analyzed to provide a praxis-like QTL analysis situation. The largest proportion of QTL position estimates within the correct interval of 3 cM, the best test statistic profiles, and the smallest average bias were obtained from the pedigrees described by FS4 and FS2. The approach does not depend on the kind and number of genetic markers. Inbreeding is not a recommended practice for commercial dairy production because of possible inbreeding depression, but inbred animals and their offspring that already exist could be advantageous for QTL mapping because of reduced genetic variance in inbred parents.
An iterative method for near-field Fresnel region polychromatic phase contrast imaging
NASA Astrophysics Data System (ADS)
Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.
2017-07-01
We present an iterative method for polychromatic phase contrast imaging that is suitable for broadband illumination and which allows for the quantitative determination of the thickness of an object given the refractive index of the sample material. Experimental and simulation results suggest the iterative method provides comparable image quality and quantitative object thickness determination when compared to the analytical polychromatic transport of intensity and contrast transfer function methods. The ability of the iterative method to work over a wider range of experimental conditions means the iterative method is a suitable candidate for use with polychromatic illumination and may deliver more utility for laboratory-based x-ray sources, which typically have a broad spectrum.
NASA Astrophysics Data System (ADS)
Wang, Lin; Cao, Xin; Ren, Qingyun; Chen, Xueli; He, Xiaowei
2018-05-01
Cerenkov luminescence imaging (CLI) is an imaging method that uses an optical imaging scheme to probe a radioactive tracer. Application of CLI with clinically approved radioactive tracers has opened an opportunity for translating optical imaging from preclinical to clinical applications. Such translation was further improved by developing an endoscopic CLI system. However, two-dimensional endoscopic imaging cannot identify accurate depth and obtain quantitative information. Here, we present an imaging scheme to retrieve the depth and quantitative information from endoscopic Cerenkov luminescence tomography, which can also be applied for endoscopic radio-luminescence tomography. In the scheme, we first constructed a physical model for image collection, and then a mathematical model for characterizing the luminescent light propagation from tracer to the endoscopic detector. The mathematical model is a hybrid light transport model combined with the 3rd order simplified spherical harmonics approximation, diffusion, and radiosity equations to warrant accuracy and speed. The mathematical model integrates finite element discretization, regularization, and primal-dual interior-point optimization to retrieve the depth and the quantitative information of the tracer. A heterogeneous-geometry-based numerical simulation was used to explore the feasibility of the unified scheme, which demonstrated that it can provide a satisfactory balance between imaging accuracy and computational burden.
Quantitative fluorescence angiography for neurosurgical interventions.
Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute
2013-06-01
Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.
NASA Astrophysics Data System (ADS)
Le, Du; Wang, Quanzeng; Ramella-Roman, Jessica; Pfefer, Joshua
2012-06-01
Narrow-band imaging (NBI) is a spectrally selective reflectance imaging technique for enhanced visualization of superficial vasculature. Prior clinical studies have indicated NBI's potential for detection of vasculature abnormalities associated with gastrointestinal mucosal neoplasia. While the basic mechanisms behind the increased vessel contrast (hemoglobin absorption and tissue scattering) are known, a quantitative understanding of the effect of tissue and device parameters has not been achieved. In this investigation, we developed and implemented a numerical model of light propagation that simulates NBI reflectance distributions. This was accomplished by incorporating mucosal tissue layers and vessel-like structures in a voxel-based Monte Carlo algorithm. Epithelial and mucosal layers as well as blood vessels were defined using wavelength-specific optical properties. The model was implemented to calculate reflectance distributions and vessel contrast values as a function of vessel depth (0.05 to 0.50 mm) and diameter (0.01 to 0.10 mm). These relationships were determined for NBI wavelengths of 410 nm and 540 nm, as well as for the broadband illumination common to standard endoscopic imaging. The effects of illumination bandwidth on vessel contrast were also simulated. Our results provide a quantitative analysis of the effect of absorption and scattering on vessel contrast. Additional insights and potential approaches for improving NBI system contrast are discussed.
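The wavelength dependence of the absorption mechanism can be illustrated with a single-pass Beer-Lambert estimate that ignores scattering and geometry entirely; the following is a minimal sketch, with rough illustrative whole-blood absorption values rather than the study's actual optical properties.

    import numpy as np

    # Rough whole-blood absorption coefficients (1/cm); illustrative values only.
    MU_A = {410: 1500.0, 540: 250.0}  # hemoglobin absorbs far more strongly at 410 nm

    def vessel_contrast(diameter_cm, wavelength_nm):
        """Single-pass Beer-Lambert vessel contrast, C = 1 - exp(-mu_a * d).

        Ignores scattering and vessel depth, so this is only a first-order feel
        for why narrow bands at hemoglobin absorption peaks raise contrast.
        """
        return 1.0 - np.exp(-MU_A[wavelength_nm] * diameter_cm)

    for d_mm in (0.01, 0.05, 0.10):
        d_cm = d_mm / 10.0
        print(f"d = {d_mm:.2f} mm: C(410 nm) = {vessel_contrast(d_cm, 410):.2f}, "
              f"C(540 nm) = {vessel_contrast(d_cm, 540):.2f}")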
Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin
2016-01-01
Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed “temperature-modulated fluorescence tomography” (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object in the conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, which is in contrast with the 83% error of the conventional FT. PMID:26368884
Translating the Simulation of Procedural Drilling Techniques for Interactive Neurosurgical Training
Stredney, Don; Rezai, Ali R.; Prevedello, Daniel M.; Elder, J. Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J.
2014-01-01
Background: Through previous and concurrent efforts, we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. This volumetric data helps drive an interactive multi-sensory, i.e., visual (stereo), aural (stereo), and tactile simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the CNS simulation initiative. Objective: The goal of this multi-level development is to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. Methods: We discuss issues of biofidelity as well as our methods to provide objective, quantitative automated assessment for the residents. Results: We conclude with a discussion of our experiences by reporting on preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. Conclusion: We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principles and define the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum. PMID:24051887
A qualitative and quantitative assessment for a bone marrow harvest simulator.
Machado, Liliane S; Moraes, Ronei M
2009-01-01
Several approaches to performing assessment in training simulators based on virtual reality have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for general cases that satisfy these requirements can be found in the literature. A drawback of those approaches is that they offer unsatisfactory solutions for specific cases, as in some medical procedures, where both quantitative and qualitative information is available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier that can handle qualitative and quantitative variables simultaneously. A special medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and evidenced the applicability of the method.
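The abstract does not spell out the modification to Naive Bayes; one common way to mix the two variable types is Gaussian likelihoods for quantitative features and smoothed frequency tables for qualitative ones. The following is a minimal sketch under that assumption, with hypothetical trainee-assessment features; it is not the authors' classifier.

    import numpy as np

    class MixedNaiveBayes:
        """Naive Bayes over continuous (Gaussian) and categorical features.

        A generic stand-in for the paper's Modified Naive Bayes; the actual
        modification is not specified in the abstract.
        """
        def fit(self, Xc, Xq, y):
            self.classes = np.unique(y)
            self.prior, self.mu, self.sig, self.cat = {}, {}, {}, {}
            for c in self.classes:
                m = (y == c)
                self.prior[c] = m.mean()
                self.mu[c] = Xq[m].mean(axis=0)
                self.sig[c] = Xq[m].std(axis=0) + 1e-9
                # Laplace-smoothed frequency table per categorical feature
                self.cat[c] = [np.bincount(col[m], minlength=col.max() + 1) + 1
                               for col in Xc.T]
            return self

        def predict(self, xc, xq):
            scores = {}
            for c in self.classes:
                lp = np.log(self.prior[c])
                lp += np.sum(-0.5 * ((xq - self.mu[c]) / self.sig[c]) ** 2
                             - np.log(self.sig[c] * np.sqrt(2 * np.pi)))
                lp += sum(np.log(t[v] / t.sum()) for t, v in zip(self.cat[c], xc))
                scores[c] = lp
            return max(scores, key=scores.get)

    # Toy data: grade trainees as 0 (needs practice) or 1 (proficient)
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 200)
    Xq = rng.normal(loc=y[:, None] * [2.0, 1.0], scale=1.0)          # e.g. speed, accuracy
    Xc = (rng.random((200, 1)) < 0.3 + 0.4 * y[:, None]).astype(int)  # e.g. correct site chosen
    model = MixedNaiveBayes().fit(Xc, Xq, y)
    print(model.predict(np.array([1]), np.array([2.1, 0.9])))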
Sadovskyy, I. A.; Koshelev, A. E.; Glatz, A.; ...
2016-01-01
The ability of high-temperature superconductors (HTSs) to carry very large currents with almost no dissipation makes them irreplaceable for high-power applications. The development and further improvement of HTS-based cables require an in-depth understanding of the superconducting vortex dynamics in the presence of complex pinning landscapes. We present a critical current analysis of a real HTS sample in a magnetic field by combining state-of-the-art large-scale Ginzburg-Landau simulations with reconstructive three-dimensional scanning-transmission-electron-microscopy tomography of the pinning landscape in Dy-doped YBa2Cu3O7-δ. This methodology provides a unique look at the vortex dynamics in the presence of a complex pinning landscape responsible for the high-current-carrying-capacity characteristic of commercial HTS wires. Finally, our method demonstrates very good functional and quantitative agreement of the critical current between simulation and experiment, providing a new predictive tool for HTS wire designs.
A GIS-based modeling system for petroleum waste management. Geographical information system.
Chen, Z; Huang, G H; Li, J B
2003-01-01
With an urgent need for effective management of petroleum-contaminated sites, a GIS-aided simulation (GISSIM) system is presented in this study. The GISSIM contains two components: an advanced 3D numerical model and a geographical information system (GIS), which are integrated within a general framework. The modeling component undertakes simulation for the fate of contaminants in subsurface unsaturated and saturated zones. The GIS component is used in three areas throughout the system development and implementation process: (i) managing spatial and non-spatial databases; (ii) linking inputs, model, and outputs; and (iii) providing an interface between the GISSIM and its users. The developed system is applied to a North American case study. Concentrations of benzene, toluene, and xylenes in groundwater under a petroleum-contaminated site are dynamically simulated. Reasonable outputs have been obtained and presented graphically. They provide quantitative and scientific bases for further assessment of site-contamination impacts and risks, as well as decisions on practical remediation actions.
Developing a database for pedestrians' earthquake emergency evacuation in indoor scenarios.
Zhou, Junxue; Li, Sha; Nie, Gaozhong; Fan, Xiwei; Tan, Jinxian; Li, Huayue; Pang, Xiaoke
2018-01-01
With the booming development of evacuation simulation software, developing an extensive database in indoor scenarios for evacuation models is imperative. In this paper, we conduct a qualitative and quantitative analysis of the collected videotapes and aim to provide a complete and unitary database of pedestrians' earthquake emergency response behaviors in indoor scenarios, including human-environment interactions. Using the qualitative analysis method, we extract keyword groups and keywords that code the response modes of pedestrians and construct a general decision flowchart using chronological organization. Using the quantitative analysis method, we analyze data on the delay time, evacuation speed, evacuation route and emergency exit choices. Furthermore, we study the effect of classroom layout on emergency evacuation. The database for indoor scenarios provides reliable input parameters and allows the construction of real and effective constraints for use in software and mathematical models. The database can also be used to validate the accuracy of evacuation models.
Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism
Chang, Roger L; Ghamsari, Lila; Manichaikul, Ani; Hom, Erik F Y; Balaji, Santhanam; Fu, Weiqi; Shen, Yun; Hao, Tong; Palsson, Bernhard Ø; Salehi-Ashtiani, Kourosh; Papin, Jason A
2011-01-01
Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. PMID:21811229
Electrochemistry in hollow-channel paper analytical devices.
Renault, Christophe; Anderson, Morgan J; Crooks, Richard M
2014-03-26
In the present article we provide a detailed analysis of fundamental electrochemical processes in a new class of paper-based analytical devices (PADs) having hollow channels (HCs). Voltammetry and amperometry were applied under flow and no-flow conditions, yielding reproducible electrochemical signals that can be described by classical electrochemical theory as well as by finite-element simulations. The results shown here provide new and quantitative insights into the flow within HC-PADs. The interesting new result is that, despite their remarkable simplicity, these HC-PADs exhibit electrochemical and hydrodynamic behavior similar to that of traditional microelectrochemical devices.
NASA Astrophysics Data System (ADS)
Møller, Søren H.; Vester-Petersen, Joakim; Nazir, Adnan; Eriksen, Emil H.; Julsgaard, Brian; Madsen, Søren P.; Balling, Peter
2018-02-01
Quantitative measurements of the electric near-field distribution of star-shaped gold nanoparticles have been performed by femtosecond laser ablation. Measurements were carried out on and off the plasmon resonance. A detailed comparison with numerical simulations of the electric fields is presented. Semi-quantitative agreement is found, with slight systematic differences between experimentally observed and simulated near-field patterns close to strong electric-field gradients. The deviations are attributed to carrier transport preceding ablation.
NASA Technical Reports Server (NTRS)
Lackey, J.; Hadfield, C.
1992-01-01
Recent mishaps and incidents on Class IV aircraft have shown a need for establishing quantitative longitudinal high angle of attack (AOA) pitch control margin design guidelines for future aircraft. NASA Langley Research Center has conducted a series of simulation tests to define these design guidelines. Flight test results have confirmed the simulation studies in that pilot rating of high AOA nose-down recoveries were based on the short-term response interval in the forms of pitch acceleration and rate.
Disease dynamics in a dynamic social network
NASA Astrophysics Data System (ADS)
Christensen, Claire; Albert, István; Grenfell, Bryan; Albert, Réka
2010-07-01
We develop a framework for simulating a realistic, evolving social network (a city) into which a disease is introduced. We compare our results to prevaccine era measles data for England and Wales, and find that they capture the quantitative and qualitative features of epidemics in populations spanning two orders of magnitude. Our results provide unique insight into how and why the social topology of the contact network influences the propagation of the disease through the population. We argue that network simulation is suitable for concurrently probing contact network dynamics and disease dynamics in ways that prior modeling approaches cannot and it can be extended to the study of less well-documented diseases.
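As a rough illustration of how contact topology shapes an epidemic, here is a minimal discrete-time SIR simulation on a static random contact network. The paper's network is dynamic and far more realistic; all parameters below are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    N, k_mean = 1000, 8                 # people and mean number of contacts
    p_edge = k_mean / (N - 1)           # Erdos-Renyi stand-in for a city network
    adj = [[] for _ in range(N)]
    for i in range(N):
        for j in range(i + 1, N):
            if rng.random() < p_edge:
                adj[i].append(j); adj[j].append(i)

    beta, gamma = 0.06, 0.2             # per-contact infection and recovery prob/day
    state = np.zeros(N, dtype=int)      # 0 = susceptible, 1 = infected, 2 = recovered
    state[rng.choice(N, 5, replace=False)] = 1

    daily_cases = []
    while (state == 1).any():
        new_inf = []
        for i in np.flatnonzero(state == 1):
            for j in adj[i]:            # transmission along network edges only
                if state[j] == 0 and rng.random() < beta:
                    new_inf.append(j)
            if rng.random() < gamma:
                state[i] = 2
        state[new_inf] = 1
        daily_cases.append(len(new_inf))
    print("final attack rate:", (state == 2).mean())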
Using simulation modeling to improve patient flow at an outpatient orthopedic clinic.
Rohleder, Thomas R; Lewkonia, Peter; Bischak, Diane P; Duffy, Paul; Hendijani, Rosa
2011-06-01
We report on the use of discrete event simulation modeling to support process improvements at an orthopedic outpatient clinic. The clinic was effective in treating patients, but waiting time and congestion in the clinic created patient dissatisfaction and staff morale issues. The modeling helped to identify improvement alternatives including optimized staffing levels, better patient scheduling, and an emphasis on staff arriving promptly. Quantitative results from the modeling provided motivation to implement the improvements. Statistical analysis of data taken before and after the implementation indicates that waiting time measures were significantly improved and overall patient time in the clinic was reduced.
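A minimal sketch of the discrete-event idea, not the authors' model: an event-driven clinic with exponential arrivals and service, showing how mean waiting time responds to staffing. The function name and all rates are hypothetical.

    import heapq, random

    def simulate_clinic(n_staff, n_patients=500, mean_arrival=5.0,
                        mean_service=12.0, seed=7):
        """Event-driven M/M/c clinic; returns mean patient waiting time (minutes)."""
        random.seed(seed)
        free = n_staff
        events, queue, waits = [], [], []
        arrival = 0.0
        for _ in range(n_patients):
            arrival += random.expovariate(1.0 / mean_arrival)
            heapq.heappush(events, (arrival, "arrive"))
        while events:
            t, kind = heapq.heappop(events)
            if kind == "arrive":
                queue.append(t)          # patient joins the waiting room
            else:
                free += 1                # a staff member becomes available
            while free and queue:        # start service for waiting patients
                arrived = queue.pop(0)
                waits.append(t - arrived)
                free -= 1
                heapq.heappush(events,
                               (t + random.expovariate(1.0 / mean_service), "done"))
        return sum(waits) / len(waits)

    for staff in (2, 3, 4):
        print(staff, "staff -> mean wait", round(simulate_clinic(staff), 1), "min")

With these illustrative rates the offered load is 2.4 servers' worth of work, so two staff are overloaded while three or four keep waits short, which is the kind of staffing trade-off the study quantified.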
NASA Astrophysics Data System (ADS)
Nyboer, John
Issues related to the reduction of greenhouse gases are encumbered with uncertainties for decision makers. Unfortunately, conventional analytical tools generate widely divergent forecasts of the effects of actions designed to mitigate these emissions. "Bottom-up" models show the costs of reducing emissions attained through the penetration of efficient technologies to be low or negative. In contrast, more aggregate "top-down" models show costs of reduction to be high. The methodological approaches of the different models used to simulate energy consumption generate, in part, the divergence found in model outputs. To address this uncertainty and bring convergence, I use a technology-explicit model that simulates turnover of equipment stock as a function of detailed data on equipment costs and stock characteristics and of verified behavioural data related to equipment acquisition and retrofitting. Such detail can inform the decision maker of the effects of actions to reduce greenhouse gases due to changes in (1) technology stocks, (2) products or services, or (3) the mix of fuels used. This thesis involves two main components: (1) the development of a quantitative model to analyse energy demand and (2) the application of this tool to a policy issue, abatement of CO2 emissions. The analysis covers all of Canada by sector (8 industrial subsectors, residential, commercial) and region. An electricity supply model providing local electricity prices supplemented the quantitative model. Forecasts of growth and structural change were provided by national macroeconomic models. Seven different simulations were applied to each sector in each region, including a base case run and three runs simulating emission charges of 75/tonne, 150/tonne, and 225/tonne CO2. The analysis reveals that there is significant variation in the costs and quantity of emissions reduction by sector and region. Aggregated results show that Canada can meet both stabilisation targets (1990 levels of emissions by 2000) and reduction targets (20% less than 1990 by 2010), but the cost of meeting reduction targets exceeds 225/tonne. After a review of the results, I provide several reasons for concluding that the costs are overestimated and the emissions reduction underestimated. I also provide several future research options.
Polymer Brushes under High Load
Balko, Suzanne M.; Kreer, Torsten; Costanzo, Philip J.; Patten, Tim E.; Johner, Albert; Kuhl, Tonya L.; Marques, Carlos M.
2013-01-01
Polymer coatings are frequently used to provide repulsive forces between surfaces in solution. After 25 years of design and study, a quantitative model to explain and predict repulsion under strong compression is still lacking. Here, we combine experiments, simulations, and theory to study polymer coatings under high loads and demonstrate a validated model for the repulsive forces, proposing that this universal behavior can be predicted from the polymer solution properties. PMID:23516470
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.
2014-04-01
Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml (min g)^-1; cardiac output = 3, 5, 8 L min^-1). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability, and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, and the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated. This suggests that there is no particular advantage among the quantitative estimation methods, nor any advantage to performing dose reduction via tube current reduction rather than temporal sampling reduction. These data are important for optimizing implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
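The two-compartment estimation idea can be sketched in a few lines: simulate a tissue time-attenuation curve by convolving an arterial input with a mono-exponential impulse response, add noise, and refit the flow-related parameter K1. The arterial input function and rate constants below are illustrative, not those used in the study.

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.arange(0, 60, 1.0)                                    # s, 1 s sampling
    aif = 400 * (t / 8.0) * np.exp(-(t - 8.0) / 6.0) * (t > 0)   # toy arterial input (HU)

    def tissue_curve(t, K1, k2):
        """One-tissue compartment model: C_t = K1 * [AIF convolved with exp(-k2 t)]."""
        dt = t[1] - t[0]
        return K1 * dt * np.convolve(aif, np.exp(-k2 * t))[: len(t)]

    true_K1, true_k2 = 0.02, 0.05            # 1/s, illustrative values
    rng = np.random.default_rng(3)
    meas = tissue_curve(t, true_K1, true_k2) + rng.normal(0, 2, len(t))  # noisy TAC

    (K1_hat, k2_hat), _ = curve_fit(tissue_curve, t, meas, p0=(0.01, 0.1))
    print(f"K1 true {true_K1:.3f}  est {K1_hat:.3f}")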
Tait, Lauren; Lee, Kenneth; Rasiah, Rohan; Cooper, Joyce M; Ling, Tristan; Geelan, Benjamin; Bindoff, Ivan
2018-05-03
Background: There are numerous approaches to simulating a patient encounter in pharmacy education. However, little direct comparison between these approaches has been undertaken. Our objective was to investigate student experiences, satisfaction, and feedback preferences between three scenario simulation modalities (paper-, actor-, and computer-based). Methods: We conducted a mixed methods study with randomized cross-over of simulation modalities on final-year Australian graduate-entry Master of Pharmacy students. Participants completed case-based scenarios within each of three simulation modalities, with feedback provided at the completion of each scenario in a format corresponding to each simulation modality. A post-simulation questionnaire collected qualitative and quantitative responses pertaining to participant satisfaction, experiences, and feedback preferences. Results: Participants reported similar levels of satisfaction across all three modalities. However, each modality resulted in unique positive and negative experiences, such as student disengagement with paper-based scenarios. Conclusion: Importantly, the themes of guidance and opportunity for peer discussion underlie the best forms of feedback for students. The provision of feedback following simulation should be carefully considered and delivered, with all three simulation modalities producing both positive and negative experiences in regard to their feedback format.
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2016-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
Adaptive resolution simulation of oligonucleotides
NASA Astrophysics Data System (ADS)
Netz, Paulo A.; Potestio, Raffaello; Kremer, Kurt
2016-12-01
Nucleic acids are characterized by a complex hierarchical structure and a variety of interaction mechanisms with other molecules. These features suggest the need for multiscale simulation methods in order to grasp the relevant physical properties of deoxyribonucleic acid (DNA) and RNA using in silico experiments. Here we report an implementation of dual-resolution modeling of a DNA oligonucleotide in physiological conditions; in the presented setup, only the nucleotide molecule and the solvent and ions in its proximity are described at the atomistic level; in contrast, the water molecules and ions far from the DNA are represented as computationally less expensive coarse-grained particles. Through the analysis of several structural and dynamical parameters, we show that this setup reliably reproduces the physical properties of the DNA molecule as observed in reference atomistic simulations. These results represent a first step towards a realistic multiscale modeling of nucleic acids and provide a quantitatively solid ground for their simulation using dual-resolution methods.
A sensitivity analysis of regional and small watershed hydrologic models
NASA Technical Reports Server (NTRS)
Ambaruch, R.; Salomonson, V. V.; Simmons, J. W.
1975-01-01
Continuous simulation models of the hydrologic behavior of watersheds are important tools in several practical applications such as hydroelectric power planning, navigation, and flood control. Several recent studies have addressed the feasibility of using remote earth observations as sources of input data for hydrologic models. The objective of the study reported here was to determine how accurately remotely sensed measurements must be to provide inputs to hydrologic models of watersheds, within the tolerances needed for acceptably accurate synthesis of streamflow by the models. The study objective was achieved by performing a series of sensitivity analyses using continuous simulation models of three watersheds. The sensitivity analysis showed quantitatively how variations in each of 46 model inputs and parameters affect simulation accuracy with respect to five different performance indices.
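A one-at-a-time sensitivity analysis of a toy bucket-style rainfall-runoff model conveys the approach; the model, its function names, and its parameters are hypothetical stand-ins for the study's watershed models.

    import numpy as np

    rng = np.random.default_rng(5)
    rain = rng.gamma(0.3, 8.0, 365)                 # daily rainfall (mm), toy record

    def simulate_streamflow(infil_cap, recession, et_rate):
        """Toy bucket model: infiltration-excess runoff plus a linear reservoir."""
        store, flow = 0.0, []
        for p in rain:
            runoff = max(p - infil_cap, 0.0)        # infiltration-excess overland flow
            store += min(p, infil_cap) - et_rate    # recharge minus evapotranspiration
            store = max(store, 0.0)
            base = recession * store                # linear-reservoir baseflow
            store -= base
            flow.append(runoff + base)
        return np.array(flow)

    base_params = dict(infil_cap=10.0, recession=0.05, et_rate=1.5)
    q0 = simulate_streamflow(**base_params)
    for name in base_params:
        p = dict(base_params); p[name] *= 1.10      # +10% one-at-a-time perturbation
        q = simulate_streamflow(**p)
        sens = (q.sum() - q0.sum()) / q0.sum() * 100
        print(f"+10% {name:10s} -> {sens:+.1f}% change in annual streamflow")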
Numerical simulation of mechanical mixing in high solid anaerobic digester.
Yu, Liang; Ma, Jingwei; Chen, Shulin
2011-01-01
Computational fluid dynamics (CFD) was employed to study mixing performance in a high solid anaerobic digester (HSAD) with an A-310 impeller and a helical ribbon. A mathematical model was constructed to assess flow fields. Good agreement of the model results with experimental data was obtained for the A-310 impeller. A systematic comparison of the interrelationship of power number, flow number, and Reynolds number was simulated in a digester with less than 5% TS and 10% TS (total solids). The simulation results suggested a great potential for using the helical ribbon mixer in the mixing of high solids digesters. The results also provided quantitative confirmation of the minimum power consumption in HSAD and the effect of shear rate on bio-structure.
Numerical study of read scheme in one-selector one-resistor crossbar array
NASA Astrophysics Data System (ADS)
Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin
2015-12-01
A comprehensive numerical circuit analysis of read schemes of a one-selector one-resistance-change-memory (1S1R) crossbar array is carried out. Three schemes, the ground, V/2, and V/3 schemes, are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate the entire current flows and node voltages within a crossbar array. Understanding such phenomena is essential for successfully evaluating the electrical specifications of selectors for suppressing intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance problems. This method provides a quantitative tool for the accurate analysis of crossbar arrays and provides guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
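The abstract does not give the iteration itself; one simple possibility consistent with it is a Gauss-Seidel sweep over the Kirchhoff current balance at every word-line and bit-line node. The following is a sketch under that assumption, with purely linear cells and illustrative resistances; a real 1S1R cell adds a nonlinear selector, which fits the same iteration if the cell conductance is recomputed from the node voltages each sweep.

    import numpy as np

    def read_current(R_cell, V_read=1.0, R_line=2.0, sweeps=500):
        """Gauss-Seidel solve of node voltages in an NxN crossbar, V/2 read scheme."""
        N = R_cell.shape[0]
        Gl, Gc = 1.0 / R_line, 1.0 / R_cell
        Vw = np.zeros((N, N))                      # word-line node voltages
        Vb = np.zeros((N, N))                      # bit-line node voltages
        sel = 0                                    # read the cell at (0, 0)
        Vdrv = np.full(N, V_read / 2); Vdrv[sel] = V_read   # word-line drivers
        Vterm = np.full(N, V_read / 2); Vterm[sel] = 0.0    # bit-line terminations
        for _ in range(sweeps):
            for i in range(N):
                for j in range(N):
                    # Word-line node: cell branch + left/right line segments
                    g, s = Gc[i, j], Gc[i, j] * Vb[i, j]
                    g += Gl; s += Gl * (Vdrv[i] if j == 0 else Vw[i, j - 1])
                    if j < N - 1:
                        g += Gl; s += Gl * Vw[i, j + 1]
                    Vw[i, j] = s / g               # V = sum(G_k * V_k) / sum(G_k)
                    # Bit-line node: cell branch + up/down line segments
                    g, s = Gc[i, j], Gc[i, j] * Vw[i, j]
                    g += Gl; s += Gl * (Vterm[j] if i == N - 1 else Vb[i + 1, j])
                    if i > 0:
                        g += Gl; s += Gl * Vb[i - 1, j]
                    Vb[i, j] = s / g
        return Gl * (Vb[N - 1, sel] - Vterm[sel])  # current into the sense node

    rng = np.random.default_rng(2)
    R = rng.choice([1e4, 1e6], size=(16, 16))      # random LRS/HRS pattern (ohms)
    print("sense current ~", read_current(R), "A")

The sensed current includes the half-select leakage gathered along the selected bit line, which is exactly the sneak-path and line-resistance effect such an analysis is meant to quantify.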
Large-scale 3D modeling of projectile impact damage in brittle plates
NASA Astrophysics Data System (ADS)
Seagraves, A.; Radovitzky, R.
2015-10-01
The damage and failure of brittle plates subjected to projectile impact is investigated through large-scale three-dimensional simulation using the DG/CZM approach introduced by Radovitzky et al. [Comput. Methods Appl. Mech. Eng. 2011; 200(1-4), 326-344]. Two standard experimental setups are considered: first, we simulate edge-on impact experiments on Al2O3 tiles by Strassburger and Senf [Technical Report ARL-CR-214, Army Research Laboratory, 1995]. Qualitative and quantitative validation of the simulation results is pursued by direct comparison of simulations with experiments at different loading rates and good agreement is obtained. In the second example considered, we investigate the fracture patterns in normal impact of spheres on thin, unconfined ceramic plates over a wide range of loading rates. For both the edge-on and normal impact configurations, the full field description provided by the simulations is used to interpret the mechanisms underlying the crack propagation patterns and their strong dependence on loading rate.
Howard, Valerie Michele; Ross, Carl; Mitchell, Ann M; Nelson, Glenn M
2010-01-01
Although human patient simulators provide an innovative teaching method for nursing students, they are quite expensive. To investigate the value of this expenditure, a quantitative, quasi-experimental, two-group pretest and posttest design was used to compare two educational interventions: human patient simulators and interactive case studies. The sample (N = 49) consisted of students from baccalaureate, accelerated baccalaureate, and diploma nursing programs. Custom-designed Health Education Systems, Inc examinations were used to measure knowledge before and after the implementation of the two educational interventions. Students in the human patient simulation group scored significantly higher than did those in the interactive case study group on the posttest Health Education Systems, Inc examination, and no significant difference was found in student scores among the three types of nursing programs that participated in the study. Data obtained from a questionnaire administered to participants indicated that students responded favorably to the use of human patient simulators as a teaching method.
The relative entropy is fundamental to adaptive resolution simulations
NASA Astrophysics Data System (ADS)
Kreis, Karsten; Potestio, Raffaello
2016-07-01
Adaptive resolution techniques are powerful methods for the efficient simulation of soft matter systems in which they simultaneously employ atomistic and coarse-grained (CG) force fields. In such simulations, two regions with different resolutions are coupled with each other via a hybrid transition region, and particles change their description on the fly when crossing this boundary. Here we show that the relative entropy, which provides a fundamental basis for many approaches in systematic coarse-graining, is also an effective instrument for the understanding of adaptive resolution simulation methodologies. We demonstrate that the use of coarse-grained potentials which minimize the relative entropy with respect to the atomistic system can help achieve a smoother transition between the different regions within the adaptive setup. Furthermore, we derive a quantitative relation between the width of the hybrid region and the seamlessness of the coupling. Our results do not only shed light on the what and how of adaptive resolution techniques but will also help setting up such simulations in an optimal manner.
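The central quantity is straightforward to estimate from sampled data. Below is a minimal histogram-based sketch of the relative entropy S_rel = Σ p_AA ln(p_AA / p_CG) between an "atomistic" and a coarse-grained distribution of some observable, using synthetic Gaussian samples as stand-ins.

    import numpy as np

    def relative_entropy(samples_aa, samples_cg, bins=60):
        """Histogram estimate of S_rel = sum p_AA * ln(p_AA / p_CG)."""
        lo = min(samples_aa.min(), samples_cg.min())
        hi = max(samples_aa.max(), samples_cg.max())
        p, _ = np.histogram(samples_aa, bins=bins, range=(lo, hi))
        q, _ = np.histogram(samples_cg, bins=bins, range=(lo, hi))
        p = (p + 1e-12) / p.sum()          # small floor avoids log(0) in empty bins
        q = (q + 1e-12) / q.sum()
        return float(np.sum(p * np.log(p / q)))

    rng = np.random.default_rng(4)
    aa = rng.normal(0.0, 1.0, 100_000)     # "atomistic" observable samples
    for sigma in (1.0, 1.2, 1.5):          # progressively worse CG models
        cg = rng.normal(0.0, sigma, 100_000)
        print(f"CG sigma {sigma}: S_rel ~ {relative_entropy(aa, cg):.3f}")

A CG potential that minimizes this quantity keeps the two distributions close, which is the property the paper links to a smoother atomistic/CG transition.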
Optimization of a middle atmosphere diagnostic scheme
NASA Astrophysics Data System (ADS)
Akmaev, Rashid A.
1997-06-01
A new assimilative diagnostic scheme based on the use of a spectral model was recently tested on the CIRA-86 empirical model. It reproduced the observed climatology with an annual global rms temperature deviation of 3.2 K in the 15-110 km layer. The most important new component of the scheme is that the zonal forcing necessary to maintain the observed climatology is diagnosed from empirical data and subsequently substituted into the simulation model at the prognostic stage of the calculation in an annual cycle mode. The simulation results are then quantitatively compared with the empirical model, and the above-mentioned rms temperature deviation provides an objective measure of the `distance' between the two climatologies. This quantitative criterion makes it possible to apply standard optimization procedures to the whole diagnostic scheme and/or the model itself. The estimates of the zonal drag have been improved in this study by introducing a nudging (Newtonian-cooling) term into the thermodynamic equation at the diagnostic stage. A proper optimal adjustment of the strength of this term makes it possible to further reduce the rms temperature deviation of simulations down to approximately 2.7 K. These results suggest that direct optimization can successfully be applied to atmospheric model parameter identification problems of moderate dimensionality.
Quantitative assessment of AOD from 17 CMIP5 models based on satellite-derived AOD over India
DOE Office of Scientific and Technical Information (OSTI.GOV)
Misra, Amit; Kanawade, Vijay P.; Tripathi, Sachchida Nand
2016-08-03
Aerosol optical depth (AOD) values from 17 CMIP5 models are compared with Moderate Resolution Imaging Spectroradiometer (MODIS) and Multiangle Imaging Spectroradiometer (MISR) derived AODs over India. The objective is to identify the cases of successful AOD simulation by CMIP5 models, considering satellite-derived AOD as a benchmark. Six years of AOD data (2000–2005) from MISR and MODIS are processed to create quality-assured gridded AOD maps over India, which are compared with corresponding maps of 17 CMIP5 models at the same grid resolution. Intercomparison of model and satellite data shows that model-AOD is better correlated with MISR-derived AOD than MODIS. The correlation between model-AOD and MISR-AOD is used to segregate the models into three categories identifying their performance in simulating the AOD over India. Maps of correlation between model-AOD and MISR-/MODIS-AOD are generated to provide quantitative information about the intercomparison. The two sets of data are examined for different seasons and years to examine the seasonal and interannual variation in the correlation coefficients. Finally, latitudinal and longitudinal variations in AOD as simulated by models are also examined and compared with corresponding variations observed by satellites.
WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNamara, A; Held, K; Paganetti, H
2016-06-15
Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g., cells and organelles) to complex nano-scale geometries (e.g., DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g., histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.
Taming Wild Horses: The Need for Virtual Time-based Scheduling of VMs in Network Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoginath, Srikanth B; Perumalla, Kalyan S; Henz, Brian J
2012-01-01
The next generation of scalable network simulators employ virtual machines (VMs) to act as high-fidelity models of traffic producer/consumer nodes in simulated networks. However, network simulations could be inaccurate if VMs are not scheduled according to virtual time, especially when many VMs are hosted per simulator core in a multi-core simulator environment. Since VMs are by default free-running, at the outset it is not clear if, and to what extent, their untamed execution affects the results in simulated scenarios. Here, we provide the first quantitative basis for establishing the need for generalized virtual-time scheduling of VMs in network simulators, based on actual prototyped implementations. To exercise breadth, our system is tested with multiple disparate applications: (a) a set of message passing parallel programs, (b) a computer worm propagation phenomenon, and (c) a mobile ad-hoc wireless network simulation. We define and use error metrics and benchmarks in scaled tests to empirically report the poor match of traditional, fairness-based VM scheduling to VM-based network simulation, and also clearly show the better performance of our simulation-specific scheduler, with up to 64 VMs hosted on a 12-core simulator node.
Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations
2010-11-01
...from a training set of compounds. Other methods include Quantitative Structure-Activity Relationship (QSAR) and Quantitative Structure-Property Relationship (QSPR)...the development of QSPR/QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve...Quadratic Configuration Interaction Singles Doubles; QSAR: Quantitative Structure-Activity Relationship; QSPR: Quantitative Structure-Property Relationship
Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas
2017-01-01
The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
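For context, the dictionary-matching baseline that the CNN replaces amounts to a maximum normalized inner product between the measured signal and precomputed simulated signals. Below is a sketch with a deliberately simplistic placeholder signal model; real MRF dictionaries come from Bloch simulations of the actual pseudo-random sequence.

    import numpy as np

    rng = np.random.default_rng(6)
    n_t = 300                                    # time points in the fingerprint

    def simulate_fingerprint(T1, T2):
        """Placeholder signal model; real MRF uses Bloch simulation of the sequence."""
        t = np.linspace(0.1, 3.0, n_t)
        return np.exp(-t / T2) * (1 - np.exp(-t / T1)) * np.sin(5 * t)

    # Build the dictionary over a (T1, T2) grid and normalize each atom
    T1s = np.linspace(0.2, 2.0, 40)
    T2s = np.linspace(0.02, 0.3, 40)
    grid = [(T1, T2) for T1 in T1s for T2 in T2s]
    D = np.array([simulate_fingerprint(*p) for p in grid])
    D /= np.linalg.norm(D, axis=1, keepdims=True)

    # Match a noisy measured fingerprint by maximum normalized inner product
    truth = (1.1, 0.12)
    meas = simulate_fingerprint(*truth) + rng.normal(0, 0.01, n_t)
    best = int(np.argmax(D @ (meas / np.linalg.norm(meas))))
    print("matched (T1, T2):", grid[best], "true:", truth)

The cost of this exhaustive match grows with dictionary size, which is the computation the trained CNN is meant to replace.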
NASA Technical Reports Server (NTRS)
Dong, Y.; Spedding, G. R.; Egolfopoulos, F. N.; Miller, F. J.
2003-01-01
The main objective of this research is to introduce accurate fluid mechanics measurement diagnostics in the 2.2-s drop tower for the determination of the detailed flow field at the states of extinction. These results are important as they can then be compared with confidence against detailed numerical simulations, providing important insight into near-limit phenomena that are controlled by poorly understood kinetics and thermal radiation processes. Past qualitative studies did enhance our general understanding of the subject. However, quantitative studies are essential for the validation of existing models that can subsequently be used to describe near-limit phenomena that can initiate catastrophic events in microgravity and/or reduced-gravity environments.
Forest management and economics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buongiorno, J.; Gilless, J.K.
1987-01-01
This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.
Quantitative Uncertainty Assessment and Numerical Simulation of Micro-Fluid Systems
2005-04-01
...flow at Sandia that was supported by the Laboratory Directed Research and Development program, and by the Dept. of Energy, Office of Basic Energy...finite energy. θ is used to denote the random nature of the corresponding quantity. Being symmetrical and positive definite, REE has all its...Laboratory Directed Research and Development Program at Sandia National Laboratories, funded by the U.S. Department of Energy. Support was also provided
Fourier phase in Fourier-domain optical coherence tomography.
Uttam, Shikhar; Liu, Yang
2015-12-01
The phase of an electromagnetic wave propagating through a sample of interest is well understood in the context of quantitative phase imaging in transmission-mode microscopy. In the past decade, Fourier-domain optical coherence tomography has been used to extend quantitative phase imaging to the reflection mode. Unlike transmission-mode electromagnetic phase, however, the origin and characteristics of reflection-mode Fourier phase are poorly understood, especially in samples with a slowly varying refractive index. In this paper, the general theory of Fourier phase from first principles is presented, and it is shown that Fourier phase is a joint estimate of subresolution offset and mean spatial frequency of the coherence-gated sample refractive index. It is also shown that both spectral-domain phase microscopy and depth-resolved spatial-domain low-coherence quantitative phase microscopy are special cases of this general theory. Analytical expressions are provided for both, and simulations are presented to explain and support the theoretical results. These results are further used to show how Fourier phase allows the estimation of an axial mean spatial frequency profile of the sample, along with depth-resolved characterization of localized optical density change and sample heterogeneity. Finally, a Fourier phase-based explanation of Doppler optical coherence tomography is also provided.
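The subresolution-offset claim rests on the Fourier shift theorem: displacing a profile by Δz multiplies its spectrum by exp(-i 2π f Δz), so the phase slope across frequency recovers offsets far below the sampling interval. A minimal numerical demonstration, with a synthetic Gaussian stand-in for a scattering profile:

    import numpy as np

    n, dz = 256, 1.0                       # samples and sampling interval (a.u.)
    z = np.arange(n) * dz
    profile = np.exp(-((z - 80.0) / 6.0) ** 2)      # stand-in scattering profile

    offset = 0.23 * dz                     # a deliberately subresolution shift
    f = np.fft.fftfreq(n, d=dz)
    shifted = np.fft.ifft(np.fft.fft(profile)
                          * np.exp(-2j * np.pi * f * offset)).real

    # Phase slope of the cross-spectrum recovers the offset
    cross = np.fft.fft(shifted) * np.conj(np.fft.fft(profile))
    k = slice(1, 20)                       # low-frequency bins with good SNR
    slope = np.polyfit(f[k], np.unwrap(np.angle(cross[k])), 1)[0]
    print("estimated offset:", -slope / (2 * np.pi))   # ~0.23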
Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.
Obuchowski, Nancy A; Bullen, Jennifer
2017-01-01
Introduction: Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e., precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods: A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results: Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. Conclusion: Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
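The core of such a simulation fits in a few lines: generate measurements with a fixed bias and known precision, build the confidence interval under the no-bias assumption, and count how often it covers the truth. A sketch with illustrative numbers, not the paper's actual design:

    import numpy as np

    rng = np.random.default_rng(8)

    def coverage(bias_pct, sigma_pct=10.0, n_trials=20000, true_val=100.0):
        """Coverage of the 95% CI  y +/- 1.96*sigma  built assuming no bias."""
        sigma = sigma_pct / 100.0 * true_val
        y = true_val * (1 + bias_pct / 100.0) + rng.normal(0, sigma, n_trials)
        lo, hi = y - 1.96 * sigma, y + 1.96 * sigma
        return np.mean((lo <= true_val) & (true_val <= hi))

    for b in (0, 5, 12, 20):
        print(f"fixed bias {b:2d}% -> coverage {coverage(b):.3f}")

With these illustrative settings, coverage erodes from the nominal 0.95 as the unmodeled fixed bias grows, which is the qualitative behavior behind the paper's bias threshold.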
A quantitative approach to evaluating caring in nursing simulation.
Eggenberger, Terry L; Keller, Kathryn B; Chase, Susan K; Payne, Linda
2012-01-01
This study was designed to test a quantitative method of measuring caring in the simulated environment. Since competency in caring is central to nursing practice, ways of including caring concepts in designing scenarios and in evaluation of performance need to be developed. Coates' Caring Efficacy scales were adapted for simulation and named the Caring Efficacy Scale-Simulation Student Version (CES-SSV) and Caring Efficacy Scale-Simulation Faculty Version (CES-SFV). A correlational study was designed to compare student self-ratings with faculty ratings on caring efficacy during an adult acute simulation experience with traditional and accelerated baccalaureate students in a nursing program grounded in caring theory. Student self-ratings were significantly correlated with objective ratings (r = 0.345, 0.356). Both the CES-SSV and the CES-SFV were found to have excellent internal consistency and significantly correlated interrater reliability. They were useful in measuring caring in the simulated learning environment.
Markov chain Monte Carlo linkage analysis: effect of bin width on the probability of linkage.
Slager, S L; Juo, S H; Durner, M; Hodge, S E
2001-01-01
We analyzed part of the Genetic Analysis Workshop (GAW) 12 simulated data using Markov chain Monte Carlo (MCMC) methods that are implemented in the computer program Loki. The MCMC method reports the "probability of linkage" (PL) across the chromosomal regions of interest. The point of maximum PL can then be taken as a "location estimate" for the quantitative trait locus (QTL). However, Loki does not provide a formal statistical test of linkage. In this paper, we explore how the bin width used in the calculations affects the maximum PL and the location estimate. We analyzed age at onset (AO) and quantitative trait number 5, Q5, from 26 replicates of the general simulated data in one region where we knew a major gene, MG5, is located. For each trait, we found the maximum PL and the corresponding location estimate, using four different bin widths. We found that bin width, as expected, does affect the maximum PL and the location estimate, and we recommend that users of Loki explore how their results vary with different bin widths.
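The bin-width effect is easy to reproduce on synthetic data: binning a sampled location trace more coarsely raises the maximum per-bin probability while coarsening the location estimate. A toy sketch (synthetic samples, not Loki output):

    import numpy as np

    rng = np.random.default_rng(9)
    # Toy posterior samples of QTL location along a 100 cM chromosome:
    # a peak near 62 cM plus a diffuse background
    locs = np.concatenate([rng.normal(62.0, 4.0, 3000), rng.uniform(0, 100, 2000)])

    for width in (1, 2, 5, 10):                  # bin width in cM
        edges = np.arange(0, 100 + width, width)
        counts, _ = np.histogram(locs, bins=edges)
        pl = counts / counts.sum()               # per-bin "probability of linkage"
        i = int(np.argmax(pl))
        centre = (edges[i] + edges[i + 1]) / 2
        print(f"bin {width:2d} cM: max PL = {pl[i]:.3f} at ~{centre:.1f} cM")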
NASA Astrophysics Data System (ADS)
Brun, Christophe
2017-05-01
This paper is the second part of a study of a katabatic jet along a convexly curved slope with a maximum angle of about 35.5°. Large-eddy simulation (LES) is performed with a special focus on the outer-layer shear of the katabatic jet. In the first part, a basic statistical quantitative analysis of the flow was performed. Here, a qualitative and quantitative description of vortical structures is used to gain insight into the present 3-D turbulent flow. It is shown that Görtler vortices oriented in the streamwise downslope direction develop in the shear layer. They spread with a characteristic mushroom shape in the vertical direction up to a height of about 100 m. They play a major role in local turbulent mixing in the ground surface boundary layer. The present curved-slope configuration constitutes a realistic model for alpine orography. This paper provides a procedure based on local turbulence anisotropy to track Görtler vortices in in situ measurements, which has never been proposed in the literature.
A GATE evaluation of the sources of error in quantitative {sup 90}Y PET
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydhorst, Jared
Purpose: Accurate reconstruction of the dose delivered by 90Y microspheres using a postembolization PET scan would permit the establishment of more accurate dose–response relationships for treatment of hepatocellular carcinoma with 90Y. However, the quality of the PET data obtained is compromised by several factors, including poor count statistics and a very high random fraction. This work uses Monte Carlo simulations to investigate what impact factors other than low count statistics have on the quantification of 90Y PET. Methods: PET acquisitions of two phantoms, a NEMA PET phantom and the NEMA IEC PET body phantom, containing either 90Y or 18F were simulated using GATE. Simulated projections were created with subsets of the simulation data, allowing the contributions of randoms, scatter, and LSO background to be independently evaluated. The simulated projections were reconstructed using the commercial software for the simulated scanner, and the quantitative accuracy of the reconstruction and the contrast recovery of the reconstructed images were evaluated. Results: The quantitative accuracy of the 90Y reconstructions was not strongly influenced by the high random fraction present in the projection data, and the activity concentration was recovered to within 5% of the known value. The contrast recovery measured for simulated 90Y data was slightly poorer than that for simulated 18F data with similar count statistics. However, the degradation was not strongly linked to any particular factor. Using a more restricted energy range to reduce the random fraction in the projections had no significant effect. Conclusions: Simulations of 90Y PET confirm that quantitative 90Y imaging is achievable with the same approach as that used for 18F, and that there is likely very little margin for improvement by attempting to model aspects unique to 90Y, such as the much higher random fraction or the presence of bremsstrahlung in the singles data.
Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yubin; Yuan, Zhen
Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors' two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, and ex vivo and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media when the diffusion approximation was not effective at describing the photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed by the property profiles, where the authors discovered that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the authors discovered that the errors ranged from 0 for relatively small targets to 26% for relatively large targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues. In particular, their methods are able to resolve the intrinsic difficulties that occur when quantitative PAT is conducted by combining conventional PAT with the diffusion approximation or with radiation transport modeling.
Uncovering the genetic signature of quantitative trait evolution with replicated time series data.
Franssen, S U; Kofler, R; Schlötterer, C
2017-01-01
The genetic architecture of adaptation in natural populations has not yet been resolved: it is not clear to what extent the spread of beneficial mutations (selective sweeps) or the response of many quantitative trait loci drives adaptation to environmental changes. Although much attention has been given to the genomic footprint of selective sweeps, the importance of selection on quantitative traits is still not well studied, as the associated genomic signature is extremely difficult to detect. We propose 'Evolve and Resequence' as a promising tool to study polygenic adaptation of quantitative traits in evolving populations. Simulating replicated time series data, we show that adaptation to a new intermediate trait optimum has three characteristic phases that are reflected on the genomic level: (1) directional frequency changes towards the new trait optimum, (2) plateauing of allele frequencies when the new trait optimum has been reached and (3) subsequent divergence between replicated trajectories ultimately leading to the loss or fixation of alleles while the trait value does not change. We explore these three phases across the relevant population genetic parameters to provide expectations for various experimental evolution designs. Remarkably, over a broad range of parameters the trajectories of selected alleles display a pattern across replicates that differs both from neutrality and from directional selection. We conclude that replicated time series data from experimental evolution studies provide a promising framework to study polygenic adaptation from whole-genome population genetics data.
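The three-phase behavior described above can be reproduced qualitatively in a few lines. The sketch below is a deliberately minimal Wright-Fisher-style model with equal additive effects and a crude stabilizing-selection rule; all parameter values are illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_replicate(n_loci=50, pop=500, gens=300, opt=30.0):
    """One replicate of adaptation to a new intermediate trait optimum.
    Phases from the abstract: (1) directional change while the trait is
    below the optimum, (2) plateau once it is reached, (3) drift-driven
    divergence toward loss or fixation."""
    p = np.full(n_loci, 0.1)             # starting allele frequencies
    traj = np.empty((gens, n_loci))
    for g in range(gens):
        trait = 2.0 * p.sum()            # additive trait, equal unit effects
        s = 0.05 * np.sign(opt - trait)  # crude stabilizing-selection rule
        p = np.clip(p + s * p * (1 - p), 0.0, 1.0)
        p = rng.binomial(2 * pop, p) / (2 * pop)   # binomial drift
        traj[g] = p
    return traj

reps = [simulate_replicate() for _ in range(5)]    # replicated time series
```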
Mspire-Simulator: LC-MS shotgun proteomic simulator for creating realistic gold standard data.
Noyce, Andrew B; Smith, Rob; Dalgleish, James; Taylor, Ryan M; Erb, K C; Okuda, Nozomu; Prince, John T
2013-12-06
The most important step in any quantitative proteomic pipeline is feature detection (aka peak picking). However, generating quality hand-annotated data sets to validate the algorithms, especially for lower abundance peaks, is nearly impossible. An alternative for creating gold standard data is to simulate it with features closely mimicking real data. We present Mspire-Simulator, a free, open-source shotgun proteomic simulator that goes beyond previous simulation attempts by generating LC-MS features with realistic m/z and intensity variance along with other noise components. It also includes machine-learned models for retention time and peak intensity prediction and a genetic algorithm to custom fit model parameters for experimental data sets. We show that these methods are applicable to data from three different mass spectrometers, including two fundamentally different types, and show visually and analytically that simulated peaks are nearly indistinguishable from actual data. Researchers can use simulated data to rigorously test quantitation software, and proteomic researchers may benefit from overlaying simulated data on actual data sets.
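As a rough illustration of what simulating an LC-MS feature involves, the sketch below generates a single Gaussian elution profile with centroid m/z jitter and counting noise. The parameters are invented for illustration; Mspire-Simulator's machine-learned retention-time and intensity models are far richer.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_feature(mz0=524.265, rt0=41.2, height=1e5,
                     mz_sigma=0.003, rt_sigma=0.15, n_scans=30):
    """One toy LC-MS feature: Gaussian elution profile in retention time,
    per-scan centroid m/z jitter, and Poisson counting noise."""
    rts = np.linspace(rt0 - 0.5, rt0 + 0.5, n_scans)
    profile = height * np.exp(-0.5 * ((rts - rt0) / rt_sigma) ** 2)
    mzs = mz0 + rng.normal(0.0, mz_sigma, n_scans)    # centroid wobble
    intensities = rng.poisson(profile).astype(float)  # shot-noise variance
    return rts, mzs, intensities

rts, mzs, intens = simulate_feature()
```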
Assessment of simulation fidelity using measurements of piloting technique in flight
NASA Technical Reports Server (NTRS)
Clement, W. F.; Cleveland, W. B.; Key, D. L.
1984-01-01
The U.S. Army and NASA joined together on a project to conduct a systematic investigation and validation of a ground-based piloted simulation of the Army/Sikorsky UH-60A helicopter. Flight testing was an integral part of the validation effort. Nap-of-the-Earth (NOE) piloting tasks which were investigated included the bob-up, the hover turn, the dash/quickstop, the sidestep, the dolphin, and the slalom. Results from the simulation indicate that the pilot's NOE task performance in the simulator is noticeably and quantifiably degraded when compared with the task performance results generated in flight test. The results of the flight test and ground-based simulation experiments support a unique rationale for the assessment of simulation fidelity: flight simulation fidelity should be judged quantitatively by measuring the pilot's control strategy and technique as induced by the simulator. A quantitative comparison is offered between the piloting technique observed in a flight simulator and that observed in flight test for the same tasks performed by the same pilots.
Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C; Gulsen, Gultekin
2015-09-01
Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed "temperature-modulated fluorescence tomography" (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes, using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object by conventional FT (x = 4.5 mm, y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm, y = 4.6 mm). As a result, the quantitative accuracy of TM-FT is superior: it recovers the concentration of the agent with a 22% error, in contrast with the 83% error of conventional FT.
Genome Scale Modeling in Systems Biology: Algorithms and Resources
Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali
2014-01-01
In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks, in five sections. We also illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031
Rigorous analysis of an electric-field-driven liquid crystal lens for 3D displays
NASA Astrophysics Data System (ADS)
Kim, Bong-Sik; Lee, Seung-Chul; Park, Woo-Sang
2014-08-01
We numerically analyzed the optical performance of an electric-field-driven liquid crystal (ELC) lens adopted for 3-dimensional liquid crystal displays (3D-LCDs) through rigorous ray tracing. For the calculation, we first obtain the director distribution profile of the liquid crystals by using the Ericksen-Leslie equation of motion; then, we calculate the transmission of light through the ELC lens by using the extended Jones matrix method. The simulation was carried out for a 9-view 3D-LCD with a diagonal of 17.1 inches, where the ELC lens was slanted to achieve natural stereoscopic images. The results show that each view exists separately according to the viewing position at an optimum viewing distance of 80 cm. In addition, our simulation results provide a quantitative explanation for the ghost or blurred images between views observed from a 3D-LCD with an ELC lens. The numerical simulations are also shown to be in good agreement with the experimental results. The present simulation method is expected to provide optimum design conditions for obtaining natural 3D images by rigorously analyzing the optical functionalities of an ELC lens.
Savelyev, Alexey; MacKerell, Alexander D.
2015-01-01
In the present study we report on interactions of and competition between monovalent ions for two DNA sequences in MD simulations. Efforts included the development and validation of parameters for interactions among the first-group monovalent cations, Li+, Na+, K+ and Rb+, and DNA in the Drude polarizable and additive CHARMM36 force fields (FF). The optimization process targeted gas-phase QM interaction energies of various model compounds with ions and osmotic pressures of bulk electrolyte solutions of chemically relevant ions. The optimized ionic parameters are validated against counterion condensation theory and buffer exchange-atomic emission spectroscopy measurements providing quantitative data on the competitive association of different monovalent ions with DNA. Comparison between experimental and MD simulation results demonstrates that, compared to the additive CHARMM36 model, the Drude FF provides an improved description of the general features of the ionic atmosphere around DNA and leads to closer agreement with experiment on the ionic competition within the ion atmosphere. Results indicate the importance of extended simulation systems on the order of 25 Å beyond the DNA surface to obtain proper convergence of ion distributions. PMID:25751286
Rim, Yonghoon; Laing, Susan T; McPherson, David D; Kim, Hyunggun
2014-01-01
Mitral valve (MV) repair using expanded polytetrafluoroethylene sutures is an established and preferred interventional method to resolve the complex pathophysiologic problems associated with chordal rupture. We developed a novel computational evaluation protocol to determine the effect of the artificial sutures on restoring MV function following valve repair. A virtual MV was created using three-dimensional echocardiographic data in a patient with ruptured mitral chordae tendineae (RMCT). Virtual repairs were designed by adding artificial sutures between the papillary muscles and the posterior leaflet where the native chordae were ruptured. Dynamic finite element simulations were performed to evaluate pre- and post-repair MV function. Abnormal posterior leaflet prolapse and mitral regurgitation were clearly demonstrated in the MV with ruptured chordae. Following virtual repair to reconstruct the ruptured chordae, the severity of the posterior leaflet prolapse decreased and stress concentration was markedly reduced both in the leaflet tissue and in the intact native chordae. Complete leaflet coaptation was restored when four or six sutures were utilized. Computational simulations provided quantitative information on functional improvement following MV repair. This novel simulation strategy may provide a powerful tool for evaluation and prediction of interventional treatment for RMCT.
Simulation of Laser Additive Manufacturing and its Applications
NASA Astrophysics Data System (ADS)
Lee, Yousub
Laser and metal powder based additive manufacturing (AM), a key category of advanced Direct Digital Manufacturing (DDM), produces metallic components directly from a digital representation of the part such as a CAD file. It is well suited for the production of high-value, customizable components with complex geometry and for the repair of damaged components. Currently, the main challenges for laser and metal powder based AM include the formation of defects (e.g., porosity), low surface finish quality, and spatially non-uniform material properties. Such challenges stem largely from the limited knowledge of the complex physical processes in AM, especially the molten pool physics such as melting, molten metal flow, heat conduction, vaporization of alloying elements, and solidification. Direct experimental measurement of melt pool phenomena is highly difficult since the process is localized (on the order of 0.1 mm to 1 mm melt pool size) and transient (on the order of 1 m/s scanning speed). Furthermore, current optical and infrared cameras are limited to observing the melt pool surface. As a result, fluid flows in the melt pool, the melt pool shape and the formation of sub-surface defects are difficult to visualize experimentally. On the other hand, numerical simulation, based on rigorous solution of the mass, momentum and energy transport equations, can provide important quantitative knowledge of the complex transport phenomena taking place in AM. The overarching goal of this dissertation research is to develop an analytical foundation for fundamental understanding of heat transfer, molten metal flow and free surface evolution. Two key types of laser AM processes are studied: a) powder injection, commonly used for repair of turbine blades, and b) powder bed, commonly used for manufacturing of new parts with complex geometry. In the powder injection simulation, fluid convection, temperature gradient (G), solidification rate (R) and melt pool shape are calculated using a heat transfer and fluid flow model, which solves the mass, momentum and energy transport equations using the volume of fluid (VOF) method. These results provide quantitative understanding of the underlying mechanisms of solidification morphology, solidification scale and deposit side bulging. In particular, it is shown that convective mixing alters the solidification conditions (G and R), the cooling trend and the resultant size of the primary dendrite arm spacing. Melt pool convexity in multiple-layer laser AM is associated not only with the convex shape of the prior deposit but also with Marangoni flow. Lastly, it is shown that the lateral width of the bulge is possibly controlled by the type of surface tension gradient. It is noted that the laser beam spot size in powder injection AM is about 2 mm, so it melts hundreds of powder particles. Hence, the injection of individual particles is approximated by a lumped mass flux into the molten pool. On the other hand, for laser powder bed AM, the laser beam spot size is about 100 μm and thus it melts only a few tens of particles. Therefore, resolution of individual powder particles is essential for the accurate simulation of laser powder bed AM. To obtain the powder packing information in the powder bed, dynamic discrete element method (DEM) simulation is used. It considers particle-particle interactions during packing to provide quantitative structural powder bed properties such as particle arrangement, size and packing density, which are then input as the initial geometry for the heat transfer and fluid flow simulation.
This coupled 3D transient transport model provides a high spatial resolution while requiring less demanding computation. The results show that negatively skewed particle size distribution, faster scanning speed, low power and low packing density worsen the surface finish quality and promote the formation of balling defects. Taken together, both powder injection and powder bed models have resulted in an improved quantitative understanding of heat transfer, molten metal flow and free surface evolution. Furthermore, the analytical foundation that is developed in this dissertation provides the temperature history in AM, a prerequisite for predicting the solid-state phase transformation kinetics, residual stresses and distortion using other models. Moreover, it can be integrated with experimental monitoring and sensing tools to provide the capability of controlling melt pool shape, solidification microstructure, defect formation and surface finish.
GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.
Monroy, Javier; Hernandez-Bennets, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier
2017-06-23
This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.
Development and validation of real-time simulation of X-ray imaging with respiratory motion.
Vidal, Franck P; Villard, Pierre-Frédéric
2016-04-01
We present a framework that combines evolutionary optimisation, soft tissue modelling and ray tracing on the GPU to simultaneously compute the respiratory motion and X-ray imaging in real time. Our aim is to provide validated building blocks with high fidelity to closely match both the human physiology and the physics of X-rays. A CPU-based set of algorithms is presented to model organ behaviours during respiration. Soft tissue deformation is computed with an extension of the Chain Mail method. Rigid elements move according to kinematic laws. A GPU-based surface rendering method is proposed to compute the X-ray image using the Beer-Lambert law. It is provided as an open-source library. A quantitative validation study is provided to objectively assess the accuracy of both components: (i) the respiration against anatomical data, and (ii) the X-ray against the Beer-Lambert law and the results of Monte Carlo simulations. Our implementation can be used in various applications, such as interactive medical virtual environments for training percutaneous transhepatic cholangiography in interventional radiology, 2D/3D registration, computation of digitally reconstructed radiographs, and simulation of 4D sinograms to test tomography reconstruction tools.
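The X-ray component reduces, per ray, to an evaluation of the Beer-Lambert law. A minimal sketch follows, with illustrative attenuation coefficients rather than values from the paper.

```python
import numpy as np

def beer_lambert(i0, mus, lengths):
    """Transmitted intensity after traversing segments with linear
    attenuation coefficients `mus` (1/cm) over path lengths `lengths` (cm):
    I = I0 * exp(-sum(mu_i * d_i))."""
    return i0 * np.exp(-np.dot(np.asarray(mus), np.asarray(lengths)))

# a ray crossing 3 cm of soft tissue (mu ~ 0.2/cm) and 1 cm of bone (~0.5/cm)
print(f"I/I0 = {beer_lambert(1.0, [0.2, 0.5], [3.0, 1.0]):.3f}")  # ~0.333
```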
Gamble, Andree S
2017-03-01
Simulation in health education has been shown to increase confidence, psychomotor and professional skills, and thus to positively impact student preparedness for clinical placement. It is recognised as a valuable tool to expose and engage students in realistic patient care encounters without the potential to cause patient harm. Although inherent challenges exist in the development and implementation of simulation, variability in clinical placement time, availability and quality dictates the need to provide students with learning opportunities they may otherwise not experience. With this and a myriad of other issues providing the impetus for improved clinical preparation, 28 final-semester undergraduate nursing students in a paediatric nursing course were involved in an extended multi-scenario simulated clinical shift prior to clinical placement. The simulation focussed on a complex ward experience, giving students the opportunity to demonstrate a variety of psychomotor skills, decision making, leadership, teamwork and other professional attributes integral for successful transition into the clinical arena. Evaluation data were collected at three points: post-simulation, post clinical placement, and 3 months after commencing employment as a Registered Nurse. Quantitative and qualitative analysis suggested that positive impacts on critical nursing concepts and psychomotor skills resulted for participants both in clinical placement and beyond, into the first months of employment.
A quantitative study on magnesium alloy stent biodegradation.
Gao, Yuanming; Wang, Lizhen; Gu, Xuenan; Chu, Zhaowei; Guo, Meng; Fan, Yubo
2018-06-06
Insufficient scaffolding time caused by rapid corrosion is the main problem of magnesium alloy stents (MAS). The finite element method has been used to investigate MAS corrosion; however, related studies have mostly treated all elements as corroding one-dimensionally. Multi-dimensional corrosion significantly influences the mechanical integrity of MAS structures such as edges and corners. In this study, the effects of multi-dimensional corrosion were quantified experimentally, and a phenomenological corrosion model was developed to account for them. We performed immersion tests with magnesium alloy (AZ31B) cubes having different numbers of exposed surfaces to analyze the dimensional differences. The corrosion rates of the cubes were found to be almost proportional to their numbers of exposed surfaces, especially when pitting corrosion was not marked. The cubes also represented the hexahedral elements used in simulation. In conclusion, the corrosion rate of each element accelerates with an increasing number of corrosion surfaces in multi-dimensional corrosion. The damage ratios among elements of the same size are proportional to the ratios of their corrosion-surface numbers under uniform corrosion. A finite element simulation using the proposed model provided more details of the changes in morphology and mechanics over the scaffolding time, removing 25.7% of the MAS elements. The proposed corrosion model reflects the effects of multiple dimensions on corrosion and can be used to predict the degradation process of MAS quantitatively.
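A minimal sketch of the element-removal idea follows, assuming (as the abstract suggests) that per-element damage accumulates in proportion to the number of corrosion-exposed faces. The rate constant, the noise term, and the static face counts are simplifications for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def corrode(exposed_faces, k=0.02, steps=120):
    """Damage accumulates per element in proportion to its number of
    corrosion-exposed faces; an element is removed once damage >= 1.
    (Simplification: face counts stay fixed here, whereas a full model
    would re-expose neighbors as elements are removed.)"""
    damage = np.zeros(exposed_faces.shape)
    alive = np.ones(exposed_faces.shape, dtype=bool)
    removed_fraction = []
    for _ in range(steps):
        rate = k * exposed_faces * rng.uniform(0.8, 1.2, exposed_faces.shape)
        damage[alive] += rate[alive]
        alive &= damage < 1.0
        removed_fraction.append(1.0 - alive.mean())
    return np.array(removed_fraction)

faces = rng.integers(1, 4, size=1000)   # 1-3 exposed faces per element
curve = corrode(faces)                   # fraction of elements removed
```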
Fundamentals and Recent Developments in Approximate Bayesian Computation
Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka
2017-01-01
Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
NASA Astrophysics Data System (ADS)
Kanno, C.; Edlin, D.; Borrillo-Hutter, T.; McCray, J. E.
2014-12-01
Potential contamination of ground water and surface water supplies from chemical contaminants in hydraulic fracturing fluids or in natural gas is of high public concern. However, quantitative assessments have rarely been conducted at specific energy-producing locations so that the true risk of contamination can be evaluated. The most likely pathways for contamination are surface spills and faulty well bores that leak production fluids directly into an aquifer. This study conducts fate and transport simulations of the most mobile chemical contaminants, based on reactivity to subsurface soils, degradation potential, and source concentration, to better understand which chemicals are most likely to contaminate water resources, and to provide information to planners who wish to be prepared for accidental releases. The simulations are intended to be most relevant to the Niobrara shale formation.
Quantum mechanical force fields for condensed phase molecular simulations
NASA Astrophysics Data System (ADS)
Giese, Timothy J.; York, Darrin M.
2017-09-01
Molecular simulations are powerful tools for providing atomic-level details into complex chemical and physical processes that occur in the condensed phase. For strongly interacting systems where quantum many-body effects are known to play an important role, density-functional methods are often used to provide the model with the potential energy used to drive dynamics. These methods, however, suffer from two major drawbacks. First, they are often too computationally intensive to practically apply to large systems over long time scales, limiting their scope of application. Second, there remain challenges for these models to achieve the accuracy in weak non-bonded interactions needed for quantitative prediction of a wide range of condensed phase properties. Quantum mechanical force fields (QMFFs) provide a potential solution to both of these limitations. In this review, we address recent advances in the development of QMFFs for condensed phase simulations. In particular, we examine the development of QMFF models using both approximate and ab initio density-functional models, the treatment of short-ranged non-bonded and long-ranged electrostatic interactions, and stability issues in molecular dynamics calculations. Example calculations are provided for crystalline systems, liquid water, and ionic liquids. We conclude with a perspective on emerging challenges and future research directions.
Steady-state and transient operation of a heat-pipe radiator system
NASA Technical Reports Server (NTRS)
Sellers, J. P.
1974-01-01
Data obtained on a VCHP heat-pipe radiator system tested in a vacuum environment were studied. Analyses and interpretation of the steady-state results are presented along with an initial analysis of some of the transient data. Particular emphasis was placed on quantitative comparisons of the experimental data with computer model simulations. The results of the study provide a better understanding of the system but do not provide a complete explanation for the observed low VCHP performance and the relatively flat radiator panel temperature distribution. The results of the study also suggest hardware, software, and testing improvements.
NASA Astrophysics Data System (ADS)
Coats, S.; Smerdon, J. E.; Stevenson, S.; Fasullo, J.; Otto-Bliesner, B. L.
2017-12-01
The observational record, which provides only limited sampling of past climate variability, has made it difficult to quantitatively analyze the complex spatio-temporal character of drought. To provide a more complete characterization of drought, machine-learning-based methods that identify drought in three-dimensional space-time are applied to climate model simulations of the last millennium and future, as well as to tree-ring based reconstructions of hydroclimate over the Northern Hemisphere extratropics. A focus is given to the most persistent and severe droughts of the past 1000 years. Analyzing reconstructions and simulations in this context allows for a validation of the spatio-temporal character of persistent and severe drought in climate model simulations. Furthermore, the long records provided by the reconstructions and simulations allow for sufficient sampling to constrain projected changes to the spatio-temporal character of these features using the reconstructions. Along these lines, climate models suggest that there will be large increases in the persistence and severity of droughts over the coming century, but little change in their spatial extent. These models, however, exhibit biases in the spatio-temporal character of persistent and severe drought over parts of the Northern Hemisphere, which may undermine their usefulness for future projections. Despite these limitations, and in contrast to previous claims, there are no systematic changes in the character of persistent and severe droughts in simulations of the historical interval. This suggests that climate models are not systematically overestimating the hydroclimate response to anthropogenic forcing over this period, with critical implications for confidence in hydroclimate projections.
Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations
NASA Technical Reports Server (NTRS)
Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.
1993-01-01
We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
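A percolation-style void definition of this kind is straightforward to prototype: threshold the smoothed density field and measure connected under-dense regions. The sketch below uses scipy.ndimage on a synthetic 2D field; the threshold and smoothing values are placeholders, not the paper's percolation threshold density.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

def void_sizes(density, threshold):
    """Sizes of connected under-dense regions, largest first."""
    labels, n_voids = ndimage.label(density < threshold)
    sizes = np.bincount(labels.ravel())[1:]   # label 0 is the background
    return np.sort(sizes)[::-1]

# synthetic smoothed 2D density field
field = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=4)
print(void_sizes(field, threshold=-0.1)[:5])  # five largest voids
```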
Role of small oligomers on the amyloidogenic aggregation free-energy landscape.
He, Xianglan; Giurleo, Jason T; Talaga, David S
2010-01-08
We combine atomic-force-microscopy particle-size-distribution measurements with earlier measurements on 1-anilino-8-naphthalene sulfonate, thioflavin T, and dynamic light scattering to develop a quantitative kinetic model for the aggregation of beta-lactoglobulin into amyloid. We directly compare our simulations to the population distributions provided by dynamic light scattering and atomic force microscopy. We combine species in the simulation according to structural type for comparison with fluorescence fingerprint results. The kinetic model of amyloidogenesis leads to an aggregation free-energy landscape. We define the roles of and propose a classification scheme for different oligomeric species based on their location in the aggregation free-energy landscape. We relate the different types of oligomers to the amyloid cascade hypothesis and the toxic oligomer hypothesis for amyloid-related diseases. We discuss existing kinetic mechanisms in terms of the different types of oligomers. We provide a possible resolution to the toxic oligomer-amyloid coincidence.
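The kind of kinetic scheme underlying such a model can be sketched as a small ODE system. The three-species nucleation-conversion-elongation cascade below is a drastic simplification chosen for illustration; it is not the authors' fitted mechanism, and all rate constants are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp

def cascade(t, y, k_n=1e-4, k_c=5e-3, k_e=1e-2):
    """Monomer (M) -> oligomer (O) -> fibril (F) mass flow; total mass
    is conserved. Rate constants are arbitrary illustration values."""
    M, O, F = y
    nucleation = k_n * M ** 2     # monomers form oligomers
    conversion = k_c * O          # oligomers convert to fibrils
    elongation = k_e * M * F      # fibrils grow by monomer addition
    return [-nucleation - elongation,
            nucleation - conversion,
            conversion + elongation]

sol = solve_ivp(cascade, (0.0, 5000.0), [100.0, 0.0, 0.0], max_step=10.0)
M, O, F = sol.y   # species trajectories, loosely comparable to DLS/AFM data
```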
Kwok, Ezra; Gopaluni, Bhushan; Kizhakkedathu, Jayachandran N.
2013-01-01
Molecular dynamics (MD) simulation results are herein incorporated into an electrostatic model used to determine the structure of an effective polymer-based antidote to the anticoagulant fondaparinux. In silico data for the polymer or its cationic binding groups have not, up to now, been available, and experimental data on the structure of the polymer-fondaparinux complex are extremely limited. Consequently, the task of optimizing the polymer structure is a daunting challenge. MD simulations provided a means to gain microscopic information on the interactions of the binding groups and fondaparinux that would have otherwise been inaccessible. This was used to refine the electrostatic model and improve the quantitative model predictions of binding affinity. Once refined, the model provided guidelines to improve electrostatic forces between candidate polymers and fondaparinux in order to increase association rate constants. PMID:27006916
NASA Astrophysics Data System (ADS)
Tian, Ye; Jiang, Lianjun; Zhang, Xuejun; Zhang, Guangfu; Zhu, Qiuxiang
2018-03-01
For the use of memristors in functional circuits, a predictive physical model is of great importance. However, in contrast to the development of memristive models accounting for bulk effects, achievements in simulating interfacial memristance are still insufficient. Here we provide a physical model to describe the electrical switching of the memristive interface. It considers the trap-assisted transition between Schottky emission and Fowler-Nordheim tunneling, and successfully reproduces the memristive behaviors occurring at the interface between Bi2S3 nano-networks and F-doped SnO2. Such success not only allows us to uncover several features of the memristive interface, including the distribution nature of the traps and the barrier height/thickness, but also provides a foundation from which we can quantitatively simulate real interfacial memristors.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.
Quantization of liver tissue in dual kVp computed tomography using linear discriminant analysis
NASA Astrophysics Data System (ADS)
Tkaczyk, J. Eric; Langan, David; Wu, Xiaoye; Xu, Daniel; Benson, Thomas; Pack, Jed D.; Schmitz, Andrea; Hara, Amy; Palicek, William; Licato, Paul; Leverentz, Jaynne
2009-02-01
Linear discriminant analysis (LDA) is applied to dual kVp CT and used for tissue characterization. The potential to quantitatively model both malignant and benign, hypo-intense liver lesions is evaluated by analysis of portal-phase, intravenous CT scan data obtained on human patients. Masses with an a priori classification are mapped to a distribution of points in basis material space. The degree of localization of tissue types in the material basis space is related to both quantum noise and real compositional differences. The density maps are analyzed with LDA and studied with system simulations to differentiate these factors. The discriminant analysis is formulated so as to incorporate the known statistical properties of the data. Effective kVp separation and mAs relate to the precision of tissue localization. Bias in the material position is related to the degree of X-ray scatter and the partial-volume effect. Experimental data and simulations demonstrate that for single-energy (HU) imaging or image-based decomposition, pixel values of water-like tissues depend on proximity to other iodine-filled bodies. Beam-hardening errors cause a shift in image value on the scale of the difference sought between cancerous and cystic lesions. In contrast, projection-based decomposition, or its equivalent implemented on a carefully calibrated system, can provide accurate data. On such a system, LDA may provide novel quantitative capabilities for tissue characterization in dual energy CT.
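To make the LDA step concrete, the sketch below fits a discriminant to synthetic (water, iodine) basis-material pairs for two hypothetical lesion classes. The class means, covariances, and sample sizes are invented for illustration, not patient data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)

# synthetic (water, iodine) basis-material densities for two lesion classes
cyst  = rng.multivariate_normal([1.00, 0.002], np.diag([0.02, 1e-6]), 200)
tumor = rng.multivariate_normal([0.98, 0.010], np.diag([0.02, 1e-6]), 200)
X = np.vstack([cyst, tumor])
y = np.repeat([0, 1], 200)

lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"training accuracy = {lda.score(X, y):.2f}")
print("discriminant direction:", lda.coef_[0])  # weights on (water, iodine)
```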
Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes
Zhang, Hong; Pei, Yun
2016-01-01
Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions. PMID:27529266
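The equivalent continuous level itself is a standard energy average over the period of interest. A minimal sketch, with hypothetical construction-activity events:

```python
import numpy as np

def leq(levels_db, durations_s):
    """Equivalent continuous sound level over the total period:
    Leq = 10*log10((1/T) * sum(t_i * 10^(L_i/10))); each (L_i, t_i) pair
    could come from one simulated construction activity event."""
    L = np.asarray(levels_db, dtype=float)
    t = np.asarray(durations_s, dtype=float)
    return 10 * np.log10(np.sum(t * 10 ** (L / 10)) / t.sum())

# excavator 85 dB for 20 min, dump truck 78 dB for 30 min, idle 60 dB for 10 min
print(f"Leq = {leq([85, 78, 60], [1200, 1800, 600]):.1f} dB")
```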
A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.
Hanna, Debra; Romero, Klaus; Schito, Marco
2017-03-01
The development of novel tuberculosis (TB) multi-drug regimens that are more efficacious and of shorter duration requires a robust drug development pipeline. Advances in quantitative modeling and simulation can be used to maximize the utility of patient-level data from prior and contemporary clinical trials, thus optimizing study design for anti-TB regimens. This perspective article highlights the work of seven project teams developing first-in-class translational and quantitative methodologies that aim to inform drug development decision-making, dose selection, trial design, and safety assessments, in order to achieve shorter and safer therapies for patients in need. These tools offer the opportunity to evaluate multiple hypotheses and provide a means to identify, quantify, and understand relevant sources of variability, to optimize translation and clinical trial design. When incorporated into the broader regulatory sciences framework, these efforts have the potential to transform the development paradigm for TB combination development, as well as other areas of global health.
[A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].
Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang
2015-05-01
The aim of this study was to construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to identify effective interventions to reduce Salmonella contamination. We constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using the poultry process parameters and the Salmonella concentration surveillance data of Jinan in 2012. The MPRM was simulated with @Risk software. The concentration of Salmonella on carcasses after chilling calculated by the model was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients for the concentration of Salmonella after defeathering and in the chilling pool were 0.84 and 0.34, respectively, making them the primary factors affecting the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. Risk managers could control Salmonella contamination on carcasses after chilling by reducing the concentration of Salmonella after defeathering and in the chilling pool.
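The modular structure of such a model can be sketched as a Monte Carlo chain in which each slaughter stage shifts the log10 concentration. All distributions below are illustrative placeholders, not the Jinan surveillance data or the authors' fitted modules.

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_chain(n=100_000):
    """log10 Salmonella concentration propagated module by module from
    evisceration to chilling; each module applies a random change factor."""
    c = rng.normal(1.0, 0.5, n)     # input: after defeathering (log10 MPN/g)
    c += rng.normal(0.2, 0.2, n)    # evisceration cross-contamination
    c += rng.normal(-0.4, 0.2, n)   # washing reduction
    c += rng.normal(-0.5, 0.3, n)   # chilling-pool reduction
    return c

final_log10 = simulate_chain()
print(f"median concentration ~ {10 ** np.median(final_log10):.2f} MPN/g")
```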
Full 3D opto-electronic simulation tool for nanotextured solar cells (Conference Presentation)
NASA Astrophysics Data System (ADS)
Michallon, Jérôme; Collin, Stéphane
2017-04-01
Increasing efforts in photovoltaics research have recently been devoted to material savings, leading to the emergence of new designs based on nanotextured and nanowire-based solar cells. The use of small absorber volumes, light-trapping nanostructures and unconventional carrier collection schemes (radial nanowire junctions, point contacts in planar structures, …) increases the impact of surface recombination and induces inhomogeneity in the photogenerated carrier concentrations. The investigation of their impact on device performance needs to be addressed using full 3D coupled opto-electrical modeling. In this context, we have developed a new tool for full 3D opto-electrical simulation using the most advanced optical and electrical simulation techniques. We will present an overview of its simulation capabilities and the key issues that have been solved to make it fully operational and reliable. We will provide various examples of opto-electronic simulation of (i) nanostructured solar cells with localized contacts and (ii) nanowire solar cells. We will also show how opto-electronic simulation can be used to simulate light- and electron-beam induced current (LBIC/EBIC) experiments, targeting quantitative analysis of the passivation properties of surfaces.
Lui, Justin T; Hoy, Monica Y
2017-06-01
Background: The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives: To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources: Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods: Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results: A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P < .006). Conclusion: In the context of a diverse population of virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
NASA Technical Reports Server (NTRS)
1979-01-01
Satellites provide an excellent platform from which to observe crops on the scale and frequency required to provide accurate crop production estimates on a worldwide basis. Multispectral imaging sensors aboard these platforms are capable of providing data from which to derive acreage and production estimates. The issue of sensor swath width was examined. The quantitative trade study necessary to resolve the combined issues of sensor swath width, number of platforms, and their orbits was generated and is included. Problems with different swath width sensors were analyzed, and an assessment of the system trade-offs of swath width versus number of satellites was made for achieving Global Crop Production Forecasting.
Simulation of UV atomic radiation for application in exhaust plume spectrometry
NASA Astrophysics Data System (ADS)
Wallace, T. L.; Powers, W. T.; Cooper, A. E.
1993-06-01
Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground-level testing at MSFC. Computational results are provided for chromium and copper at selected transitions, which indicate a strong dependence upon the broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.
SVD compression for magnetic resonance fingerprinting in the time domain.
McGivney, Debra F; Pierre, Eric; Ma, Dan; Jiang, Yun; Saybasili, Haris; Gulani, Vikas; Griswold, Mark A
2014-12-01
Magnetic resonance (MR) fingerprinting is a technique for acquiring and processing MR data that simultaneously provides quantitative maps of different tissue parameters through a pattern recognition algorithm. A predefined dictionary models the possible signal evolutions simulated using the Bloch equations with different combinations of various MR parameters, and pattern recognition is completed by computing the inner product between the observed signal and each of the predicted signals within the dictionary. Though this matching algorithm has been shown to accurately predict the MR parameters of interest, a more efficient method of obtaining the quantitative images is desirable. We propose to compress the dictionary using the singular value decomposition, which provides a low-rank approximation. By compressing the size of the dictionary in the time domain, we are able to speed up the pattern recognition algorithm by a factor of between 3.4 and 4.8, without sacrificing the high signal-to-noise ratio of the original scheme presented previously.
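The compression step itself is compact. The sketch below builds a toy dictionary of smooth signal evolutions (stand-ins for Bloch simulations), truncates its SVD in the time domain, and checks that matching in the compressed space agrees with full matching; the dictionary, rank, and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy dictionary: damped oscillations over a T2-like parameter sweep
t = np.linspace(0, 5, 1000)[:, None]
T2 = np.linspace(0.02, 2.0, 2000)[None, :]
D = np.exp(-t / T2) * np.cos(2 * np.pi * 3 * t)
D /= np.linalg.norm(D, axis=0)

U, s, Vt = np.linalg.svd(D, full_matrices=False)
k = 25                                   # retained rank
Uk = U[:, :k]                            # time-domain compression basis
Dk = Uk.T @ D                            # compressed (k x N) dictionary

signal = D[:, 1234] + 0.01 * rng.standard_normal(1000)
full_match = int(np.argmax(D.T @ signal))            # O(T) per entry
comp_match = int(np.argmax(Dk.T @ (Uk.T @ signal)))  # O(k) per entry
print(full_match, comp_match)            # both should be at or near 1234
```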
Preliminary experiments on pharmacokinetic diffuse fluorescence tomography of CT-scanning mode
NASA Astrophysics Data System (ADS)
Zhang, Yanqi; Wang, Xin; Yin, Guoyan; Li, Jiao; Zhou, Zhongxing; Zhao, Huijuan; Gao, Feng; Zhang, Limin
2016-10-01
In vivo tomographic imaging of fluorescence pharmacokinetic parameters in tissues can provide specific and quantitative physiological and pathological information beyond that of fluorescence concentration. This modality normally requires a highly sensitive diffuse fluorescence tomography (DFT) system working in a dynamic way to extract the pharmacokinetic parameters from the measured pharmacokinetics-associated, temporally varying boundary intensity. This paper is devoted to preliminary experimental validation of our proposed direct reconstruction scheme for instantaneous-sampling-based pharmacokinetic DFT: a highly sensitive DFT system of CT-scanning mode working with four parallel photomultiplier-tube photon-counting channels is developed to generate an instantaneous sampling dataset, and a direct reconstruction scheme then extracts images of the pharmacokinetic parameters using the adaptive-EKF strategy. We designed a dynamic phantom that can simulate agent metabolism in living tissue. The results of the dynamic phantom experiments verify the validity of the experimental system and reconstruction algorithms, and demonstrate that the system provides good resolution, high sensitivity and quantitativeness at different pump speeds.
Option generation in the treatment of unstable patients: An experienced-novice comparison study.
Whyte, James; Pickett-Hauber, Roxanne; Whyte, Maria D
2016-09-01
There is a dearth of studies that quantitatively measure nurses' appreciation of stimuli and the subsequent generation of options in practice environments. The purpose of this paper was to provide an examination of nurses' ability to solve problems while quantifying the stimuli upon which they focus during patient care activities. The study used a quantitative descriptive method that gathered performance data from a simulated task environment using multi-angle video and audio. These videos were coded, and transcripts of all of the actions that occurred in the scenario and the verbal reports of the participants were compiled. The results revealed a pattern of superiority of the experienced exemplar group. Novice actions were characterized by difficulty in following common protocols, inconsistencies in their evaluative approaches, and a pattern of omissions of key actions. The study provides support for deliberate-practice-based programs designed to facilitate higher-level performance in novices.
Greater involvement of action simulation mechanisms in emotional vs cognitive empathy
Oliver, Lindsay D; Vieira, Joana B; Neufeld, Richard W J; Dziobek, Isabel; Mitchell, Derek G V
2018-01-01
Empathy is crucial for successful interpersonal interactions, and it is impaired in many psychiatric and neurological disorders. Action-perception matching, or action simulation mechanisms, has been suggested to facilitate empathy by supporting the simulation of perceived experience in others. However, this remains unclear, and the involvement of the action simulation circuit in cognitive empathy (the ability to adopt another's perspective) vs emotional empathy (the capacity to share and react affectively to another's emotional experience) has not been quantitatively compared. Presently, healthy adults completed a classic cognitive empathy task (false belief), an emotional empathy task and an action simulation button-pressing task during functional magnetic resonance imaging. Conjunction analyses revealed common recruitment of the inferior frontal gyrus (IFG), thought to be critical for action-perception matching, during both action simulation and emotional, but not cognitive, empathy. Furthermore, activation was significantly greater in action simulation regions in the left IFG during emotional vs cognitive empathy, and activity in this region was positively correlated with mean feeling ratings during the emotional empathy task. These findings provide evidence for greater involvement of action simulation mechanisms in emotional than cognitive empathy. Thus, the action simulation circuit may be an important target for delineating the pathophysiology of disorders featuring emotional empathy impairments. PMID:29462481
Analyzing wildfire exposure on Sardinia, Italy
NASA Astrophysics Data System (ADS)
Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella
2014-05-01
We used simulation modeling based on the minimum travel time (MTT) algorithm to analyze wildfire exposure of key ecological, social and economic features on Sardinia, Italy. Sardinia is the second largest island of the Mediterranean Basin, and in the last fifty years it has experienced large and dramatic wildfires, which caused losses and threatened urban interfaces, forests and natural areas, and agricultural production. Historical fire and environmental data for the period 1995-2009 were used as input to estimate fine-scale burn probability, conditional flame length, and potential fire size in the study area. For this purpose, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distribution of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92×10⁻³, with a mean value of 6.48×10⁻⁵. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in modeled outputs showed a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat areas were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to inform wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean basin.
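The burn probability output of such simulations is, at its core, a per-pixel frequency over the ensemble of simulated fires. The sketch below mocks fire footprints as random rectangles purely to show the bookkeeping; it does not implement the MTT fire-spread algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def burn_probability(n_fires=10_000, shape=(100, 100)):
    """Per-cell burn probability: the fraction of simulated fire
    footprints covering each cell (footprints mocked as random boxes)."""
    counts = np.zeros(shape)
    for _ in range(n_fires):
        r = rng.integers(0, shape[0] - 10)
        c = rng.integers(0, shape[1] - 10)
        h, w = rng.integers(2, 10, size=2)
        counts[r:r + h, c:c + w] += 1
    return counts / n_fires

bp = burn_probability()
print(f"max burn probability = {bp.max():.4f}")
```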
Probing the neutrino mass hierarchy with the rise time of a supernova burst
NASA Astrophysics Data System (ADS)
Serpico, Pasquale D.; Chakraborty, Sovan; Fischer, Tobias; Hüdepohl, Lorenz; Janka, Hans-Thomas; Mirizzi, Alessandro
2012-04-01
The rise time of a Galactic supernova (SN) ν̄e light curve, observable at a high-statistics experiment such as the IceCube Cherenkov detector, can provide a diagnostic tool for the neutrino mass hierarchy at “large” 1-3 leptonic mixing angle ϑ13. Thanks to the combination of matter suppression of collective effects at early post-bounce times on one hand and the presence of the ordinary Mikheyev-Smirnov-Wolfenstein effect in the outer layers of the SN on the other hand, a sufficiently fast rise time on an O(100) ms scale is indicative of an inverted mass hierarchy. We investigate results from an extensive set of stellar core-collapse simulations, providing a first exploration of the astrophysical robustness of these features. We find that for all the models analyzed (sharing the same weak interaction microphysics) the rise times for the same hierarchy are similar not only qualitatively, but also quantitatively, with the signals for the two classes of hierarchies significantly separated. We show via Monte Carlo simulations that the two cases should be distinguishable at IceCube for SNe at a typical Galactic distance 99% of the time. Finally, a preliminary survey seems to show that the faster rise time for inverted hierarchy as compared to normal hierarchy is a qualitatively robust feature predicted by several simulation groups. Since the viability of this signature ultimately depends on the quantitative assessment of theoretical/numerical uncertainties, our results motivate an extensive campaign of comparison of different code predictions at early accretion times with implementation of microphysics of comparable sophistication, including effects such as nucleon recoils in weak interactions.
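A toy Monte Carlo in the spirit of the rise-time discrimination test described above; the light-curve shape, event rates, rise times, and detection statistic are invented placeholders rather than the paper's models.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 300.0, 2.0)               # post-bounce time bins [ms]

def light_curve(tau_rise, rate0=1e4):
    """Toy anti-nu_e event rate: exponential rise to a plateau (invented shape)."""
    return rate0 * (1.0 - np.exp(-t / tau_rise))

def measured_rise_time(rate):
    counts = rng.poisson(rate * 2.0)         # Poisson counts per 2 ms bin
    smooth = np.convolve(counts, np.ones(5) / 5.0, mode="same")
    half = 0.5 * smooth[-20:].mean()         # half of the late-time plateau
    return t[np.argmax(smooth >= half)]      # first bin crossing half maximum

# hypothetical rise times standing in for inverted vs normal hierarchy
ih = np.array([measured_rise_time(light_curve(30.0)) for _ in range(2000)])
nh = np.array([measured_rise_time(light_curve(60.0)) for _ in range(2000)])

print(f"IH trials faster than the NH median: {np.mean(ih < np.median(nh)):.1%}")
```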
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet
2002-01-01
A gas-fueled high-pressure combustion facility with optical access, which was developed over the last 2 years, has just been completed. The High Pressure Gaseous Burner (HPGB) rig at the NASA Glenn Research Center can operate at sustained pressures up to 60 atm with a variety of gaseous fuels and liquid jet fuel. The facility is unique in that it is the only continuous-flow, hydrogen-capable, 60-atm rig in the world with optical access. It will provide researchers with new insights into flame conditions that simulate the environment inside the ultra-high-pressure-ratio combustion chambers of tomorrow's advanced aircraft engines. The facility provides optical access to the flame zone, enabling the calibration of nonintrusive optical diagnostics to measure chemical species and temperature. The data from the HPGB rig enable the validation of numerical codes that simulate gas turbine combustors, such as the National Combustor Code (NCC). The validation of such numerical codes is often best achieved with nonintrusive optical diagnostic techniques that meet these goals: information-rich (multispecies) and quantitative while providing good spatial and time resolution. Achieving these goals is a challenge for most nonintrusive optical diagnostic techniques. Raman scattering is a technique that meets these challenges. Raman scattering occurs when intense laser light interacts with molecules to radiate light at a shifted wavelength (known as the Raman shift). This shift in wavelength is unique to each chemical species and provides a "fingerprint" of the different species present. The facility will first be used to gather a comprehensive database of laser Raman spectra at high pressures. These calibration data will then be used to quantify future laser Raman measurements of chemical species concentration and temperature in this facility and other facilities that use Raman scattering.
Huang, Xiao Yan; Shan, Zhi Jie; Zhai, Hong Lin; Li, Li Na; Zhang, Xiao Yun
2011-08-22
Heat shock protein 90 (Hsp90) is involved in the development of several cancers. Novobiocin, a typical C-terminal inhibitor of Hsp90, will probably be used as an important anticancer drug in the future. In this work, we extracted valuable structural information and designed new novobiocin derivatives based on three-dimensional quantitative structure-activity relationship (3D QSAR) models. Comparative molecular field analysis and comparative molecular similarity indices analysis models with high predictive capability were established, and their reliability is supported by the statistical parameters. Based on several important influence factors obtained from these models, six new novobiocin derivatives with higher predicted inhibitory activities were designed and confirmed by molecular simulation with our models, providing potential anticancer drug leads for further research.
DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1989-01-01
This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2017-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
Free-energy simulations reveal molecular mechanism for functional switch of a DNA helicase
Ma, Wen; Whitley, Kevin D; Schulten, Klaus
2018-01-01
Helicases play key roles in genome maintenance, yet it remains elusive how these enzymes change conformations and how transitions between different conformational states regulate nucleic acid reshaping. Here, we developed a computational technique combining structural bioinformatics approaches and atomic-level free-energy simulations to characterize how the Escherichia coli DNA repair enzyme UvrD changes its conformation at the fork junction to switch its function from unwinding to rezipping DNA. The lowest free-energy path shows that UvrD opens the interface between two domains, allowing the bound ssDNA to escape. The simulation results predict a key metastable 'tilted' state during ssDNA strand switching. By simulating FRET distributions with fluorophores attached to UvrD, we show that the new state is supported quantitatively by single-molecule measurements. The present study deciphers key elements for the 'hyper-helicase' behavior of a mutant and provides an effective framework to characterize directly structure-function relationships in molecular machines. PMID:29664402
Minerva exoplanet detection sensitivity from simulated observations
NASA Astrophysics Data System (ADS)
McCrady, Nate; Nava, C.
2014-01-01
Small rocky planets induce radial velocity signals that are difficult to detect in the presence of stellar noise sources of comparable or larger amplitude. Minerva is a dedicated, robotic observatory that will attain 1 meter per second precision to detect these rocky planets in the habitable zone around nearby stars. We present results of an ongoing project investigating Minerva’s planet detection sensitivity as a function of observational cadence, planet mass, and orbital parameters (period, eccentricity, and argument of periastron). Radial velocity data is simulated with realistic observing cadence, accounting for weather patterns at Mt. Hopkins, Arizona. Instrumental and stellar noise are added to the simulated observations, including effects of oscillation, jitter, starspots and rotation. We extract orbital parameters from the simulated RV data using the RVLIN code. A Monte Carlo analysis is used to explore the parameter space and evaluate planet detection completeness. Our results will inform the Minerva observing strategy by providing a quantitative measure of planet detection sensitivity as a function of orbital parameters and cadence.
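A minimal numpy sketch of this kind of injection-and-recovery completeness study, assuming a circular orbit, white noise only, and a crude amplitude-threshold detection criterion; all of these are placeholders for the noise models and RVLIN fitting used in the actual study.

```python
import numpy as np

rng = np.random.default_rng(2)
SIGMA = 1.0                                # per-point RV noise, m/s

def simulate_rv(K, period, n_nights=180, p_clear=0.6):
    """Toy campaign: one RV point per clear night; white noise stands in
    for the oscillation/jitter/starspot terms of the real study."""
    nights = np.arange(n_nights)[rng.random(n_nights) < p_clear]
    phase = 2 * np.pi * nights / period
    return nights, K * np.sin(phase) + rng.normal(0, SIGMA, nights.size)

def completeness(K, period, n_trials=500):
    hits = 0
    for _ in range(n_trials):
        nights, rv = simulate_rv(K, period)
        phase = 2 * np.pi * nights / period
        A = np.column_stack([np.sin(phase), np.cos(phase)])
        coef, *_ = np.linalg.lstsq(A, rv, rcond=None)
        # crude criterion: fitted semi-amplitude above 3x its noise floor
        if np.hypot(*coef) > 3 * SIGMA * np.sqrt(2.0 / nights.size):
            hits += 1
    return hits / n_trials

for K in (0.5, 1.0, 2.0):                  # injected semi-amplitudes, m/s
    print(f"K = {K} m/s: completeness ~ {completeness(K, period=23.0):.2f}")
```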
Fault diagnosis based on continuous simulation models
NASA Technical Reports Server (NTRS)
Feyock, Stefan
1987-01-01
The results of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems are described, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.
Leeper, W Robert; Haut, Elliott R; Pandian, Vinciya; Nakka, Sajan; Dodd-O, Jeffrey; Bhatti, Nasir; Hunt, Elizabeth A; Saheed, Mustapha; Dalesio, Nicholas; Schiavi, Adam; Miller, Christina; Kirsch, Thomas D; Berkow, Lauren
2018-04-05
A hospital-wide difficult airway response team was developed in 2008 at The Johns Hopkins Hospital with three central pillars: operations, safety monitoring, and education. The objective of this study was to assess the outcomes of the educational pillar of the difficult airway response team program, known as the multidisciplinary difficult airway course (MDAC). The comprehensive, full-day MDAC involves trainees and staff from all provider groups who participate in airway management. The MDAC occurs within the Johns Hopkins Medicine Simulation Center approximately four times per year and uses a combination of didactic lectures, hands-on sessions, and high-fidelity simulation training. Participation in MDAC is the main intervention being investigated in this study. Data were collected prospectively using a course evaluation survey with quantitative and qualitative components and pre-/post-course knowledge assessment multiple-choice questions (MCQs). Outcomes include course evaluation scores, themes derived from qualitative assessments, and pre-/post-course MCQ scores. The setting was a tertiary care academic hospital center; participants included students, residents, fellows, and practicing physicians from the departments of Surgery, Otolaryngology Head and Neck Surgery, Anesthesiology/Critical Care Medicine, and Emergency Medicine; advanced practice providers (nurse practitioners and physician assistants), nurse anesthetists, nurses, and respiratory therapists. In total, 23 MDACs have been conducted, including 499 participants. Course evaluations were uniformly positive with a mean score of 86.9 of 95 points. Qualitative responses suggest major value from high-fidelity simulation, the hands-on skill stations, and teamwork practice. MCQ scores demonstrated significant improvement: median (interquartile range) pre: 69% (60%-81%) vs post: 81% (72%-89%), p < 0.001. Implementation of a MDAC successfully disseminated principles and protocols to all airway providers. Demonstrable improvement in pre-/post-course knowledge assessment and overwhelmingly positive course evaluations (quantitative and qualitative) suggest a critical and ongoing role for the MDAC course. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Development of a realistic, dynamic digital brain phantom for CT perfusion validation
NASA Astrophysics Data System (ADS)
Divel, Sarah E.; Segars, W. Paul; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.
2016-03-01
Physicians rely on CT Perfusion (CTP) images and quantitative image data, including cerebral blood flow, cerebral blood volume, and bolus arrival delay, to diagnose and treat stroke patients. However, the quantification of these metrics may vary depending on the computational method used. Therefore, we have developed a dynamic and realistic digital brain phantom upon which CTP scans can be simulated based on a set of ground truth scenarios. Building upon the previously developed 4D extended cardiac-torso (XCAT) phantom containing a highly detailed brain model, this work consisted of expanding the intricate vasculature by semi-automatically segmenting existing MRA data and fitting nonuniform rational B-spline surfaces to the new vessels. Using time attenuation curves input by the user as reference, the contrast enhancement in the vessels changes dynamically. At each time point, the iodine concentration in the arteries and veins is calculated from the curves and the material composition of the blood changes to reflect the expected values. CatSim, a CT system simulator, generates simulated data sets of this dynamic digital phantom which can be further analyzed to validate CTP studies and post-processing methods. The development of this dynamic and realistic digital phantom provides a valuable resource with which current uncertainties and controversies surrounding the quantitative computations generated from CTP data can be examined and resolved.
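A small sketch of the user-input time attenuation curve idea: a gamma-variate bolus shape (a commonly used analytic form) is converted to iodine concentration and CT enhancement. The peak concentration and the HU-per-mg/mL conversion factor are assumptions for illustration, not values from the phantom.

```python
import numpy as np

def gamma_variate(t, t0=5.0, alpha=3.0, beta=1.5):
    """A commonly used analytic form for a contrast time-attenuation curve."""
    tt = np.clip(t - t0, 0.0, None)
    return tt ** alpha * np.exp(-tt / beta)

t = np.arange(0.0, 60.0, 0.5)                         # time [s]
tac = gamma_variate(t)
conc = 8.0 * tac / tac.max()                          # assumed 8 mg/mL peak iodine

HU_PER_MG_ML = 25.0   # nominal iodine enhancement per mg/mL (assumed value)
delta_hu = HU_PER_MG_ML * conc

arrival = t[np.argmax(delta_hu > 0.05 * delta_hu.max())]
print(f"bolus arrival ~ {arrival:.1f} s, peak enhancement {delta_hu.max():.0f} HU")
```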
Effects of biases in domain wall network evolution. II. Quantitative analysis
NASA Astrophysics Data System (ADS)
Correia, J. R. C. C. C.; Leite, I. S. C. R.; Martins, C. J. A. P.
2018-04-01
Domain walls form at phase transitions which break discrete symmetries. In a cosmological context, they often overclose the Universe (contrary to observational evidence), although one may prevent this by introducing biases or forcing anisotropic evolution of the walls. In a previous work [Correia et al., Phys. Rev. D 90, 023521 (2014), 10.1103/PhysRevD.90.023521], we numerically studied the evolution of various types of biased domain wall networks in the early Universe, confirming that anisotropic networks ultimately reach scaling while those with a biased potential or biased initial conditions decay. We also found that the analytic decay law obtained by Hindmarsh was in good agreement with simulations of biased potentials, but not of biased initial conditions, and suggested that the difference was related to the Gaussian approximation underlying the analytic law. Here, we extend our previous work in several ways. For the cases of biased potential and biased initial conditions, we study in detail the field distributions in the simulations, confirming that the validity (or not) of the Gaussian approximation is the key difference between the two cases. For anisotropic walls, we carry out a more extensive set of numerical simulations and compare them to the canonical velocity-dependent one-scale model for domain walls, finding that the model accurately predicts the linear scaling regime after isotropization. Overall, our analysis provides a quantitative description of the cosmological evolution of these networks.
Health and climate benefits of offshore wind facilities in the Mid-Atlantic United States
Buonocore, Jonathan J.; Luckow, Patrick; Fisher, Jeremy; ...
2016-07-14
Electricity from fossil fuels contributes substantially to both climate change and the health burden of air pollution. Renewable energy sources are capable of displacing electricity from fossil fuels, but the quantity of health and climate benefits depends on site-specific attributes that are not often included in quantitative models. Here, we link an electrical grid simulation model to an air pollution health impact assessment model and US regulatory estimates of the impacts of carbon to estimate the health and climate benefits of offshore wind facilities of different sizes in two different locations. We find that offshore wind in the Mid-Atlantic is capable of producing health and climate benefits of between $54 and $120 per MWh of generation, with the largest simulated facility (3000 MW off the coast of New Jersey) producing approximately $690 million in benefits in 2017. The variability in benefits per unit generation is a function of differences in locations (Maryland versus New Jersey), simulated years (2012 versus 2017), and facility generation capacity, given complexities of the electrical grid and differences in which power plants are offset. In the end, this work demonstrates health and climate benefits of offshore wind, provides further evidence of the utility of geographically-refined modeling frameworks, and yields quantitative insights that would allow for inclusion of both climate and public health in benefits assessments of renewable energy.
NASA Technical Reports Server (NTRS)
Asenov, Asen
1998-01-01
A three-dimensional (3-D) "atomistic" simulation study of random dopant induced threshold voltage lowering and fluctuations in sub-0.1 μm MOSFETs is presented. For the first time a systematic analysis of random dopant effects down to an individual dopant level was carried out in 3-D on a scale sufficient to provide quantitative statistical predictions. Efficient algorithms based on a single multigrid solution of the Poisson equation followed by the solution of a simplified current continuity equation are used in the simulations. The effects of various MOSFET design parameters, including the channel length and width, oxide thickness and channel doping, on the threshold voltage lowering and fluctuations are studied using typical samples of 200 atomistically different MOSFETs. The atomistic results for the threshold voltage fluctuations were compared with two analytical models based on dopant number fluctuations. Although the analytical models predict the general trends in the threshold voltage fluctuations, they fail to describe quantitatively the magnitude of the fluctuations. The distribution of the atomistically calculated threshold voltage and its correlation with the number of dopants in the channel of the MOSFETs was analyzed based on a sample of 2500 microscopically different devices. The detailed analysis shows that the threshold voltage fluctuations are determined not only by fluctuations in the dopant number, but also by fluctuations in the dopant position.
Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.
2011-01-01
Purpose: We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures: The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results: Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions: We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346
2016-01-01
Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated against the direct thermal desorption coupled to gas chromatography–mass spectrometry method used here as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
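A numpy-only sketch of multivariate calibration on simulated spectral fingerprints, using ridge regression as a stand-in for the multivariate procedures evaluated in the paper; the component spectra, linear mixing model, and noise level are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n_mz, n_comp = 200, 3

# synthetic "pure component" mass spectra with partial overlap
S = rng.gamma(2.0, 1.0, size=(n_comp, n_mz))
S /= S.sum(axis=1, keepdims=True)

def mixtures(n, noise=0.002):
    C = rng.uniform(0, 1, size=(n, n_comp))            # true concentrations
    return C @ S + rng.normal(0, noise, (n, n_mz)), C  # DIMS-like fingerprints

X_cal, C_cal = mixtures(60)
X_val, C_val = mixtures(30)

lam = 1e-3                                             # ridge penalty
B = np.linalg.solve(X_cal.T @ X_cal + lam * np.eye(n_mz), X_cal.T @ C_cal)
rmse = np.sqrt(np.mean((X_val @ B - C_val) ** 2))
print(f"external-validation RMSE: {rmse:.4f}")
```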
NASA Astrophysics Data System (ADS)
Karakatsanis, Nicolas A.; Rahmim, Arman
2014-03-01
Graphical analysis is employed in the research setting to provide quantitative estimation of PET tracer kinetics from dynamic images at a single bed position. Recently, we proposed a multi-bed dynamic acquisition framework enabling clinically feasible whole-body parametric PET imaging by employing post-reconstruction parameter estimation. In addition, by incorporating linear Patlak modeling within the system matrix, we enabled direct 4D reconstruction in order to effectively circumvent noise amplification in dynamic whole-body imaging. However, direct 4D Patlak reconstruction exhibits relatively slow convergence due to the presence of non-sparse spatial correlations in temporal kinetic analysis. In addition, the standard Patlak model does not account for reversible uptake, thus underestimating the influx rate Ki. We have developed a novel whole-body PET parametric reconstruction framework in the STIR platform, a widely employed open-source reconstruction toolkit, (a) enabling accelerated convergence of direct 4D multi-bed reconstruction, by employing a nested algorithm to decouple the temporal parameter estimation from the spatial image update process, and (b) enhancing the quantitative performance particularly in regions with reversible uptake, by pursuing a non-linear generalized Patlak 4D nested reconstruction algorithm. A set of published kinetic parameters and the XCAT phantom were employed for the simulation of dynamic multi-bed acquisitions. Quantitative analysis on the Ki images demonstrated considerable acceleration in the convergence of the nested 4D whole-body Patlak algorithm. In addition, our simulated and patient whole-body data in the post-reconstruction domain indicated the quantitative benefits of our extended generalized Patlak 4D nested reconstruction for tumor diagnosis and treatment response monitoring.
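For readers unfamiliar with Patlak graphical analysis, the following sketch recovers the influx rate Ki as the late-time slope of Ct/Cp plotted against the normalized integral of Cp; the input function and kinetic values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(0.5, 60.5, 0.5)               # frame mid-times [min]
Cp = 10.0 * np.exp(-t / 20.0) + 1.0         # assumed plasma input function

Ki_true, V_true = 0.05, 0.3                 # influx rate [1/min], fractional volume
int_Cp = np.cumsum(Cp) * 0.5                # running integral of Cp
Ct = Ki_true * int_Cp + V_true * Cp + rng.normal(0, 0.05, t.size)

# Patlak coordinates; fit only late frames, where the plot becomes linear
x, y = int_Cp / Cp, Ct / Cp
late = t > 20
Ki_est, V_est = np.polyfit(x[late], y[late], 1)
print(f"Ki = {Ki_est:.4f}/min (true {Ki_true}), V = {V_est:.3f} (true {V_true})")
```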
NASA Technical Reports Server (NTRS)
Gasiewski, A. J.; Skofronick, G. M.
1992-01-01
Progress by investigators at Georgia Tech in defining the requirements for large space antennas for passive microwave Earth imaging systems is reviewed. In order to determine antenna constraints (e.g., the aperture size, illumination taper, and gain uncertainty limits) necessary for the retrieval of geophysical parameters (e.g., rain rate) with adequate spatial resolution and accuracy, a numerical simulation of the passive microwave observation and retrieval process is being developed. Due to the small spatial scale of precipitation and the nonlinear relationships between precipitation parameters (e.g., rain rate, water density profile) and observed brightness temperatures, the retrieval of precipitation parameters are of primary interest in the simulation studies. Major components of the simulation are described as well as progress and plans for completion. The overall goal of providing quantitative assessments of the accuracy of candidate geosynchronous and low-Earth orbiting imaging systems will continue under a separate grant.
NASA Astrophysics Data System (ADS)
Schmieschek, S.; Shamardin, L.; Frijters, S.; Krüger, T.; Schiller, U. D.; Harting, J.; Coveney, P. V.
2017-08-01
We introduce the lattice-Boltzmann code LB3D, version 7.1. Building on a parallel program and supporting tools which have enabled research utilising high performance computing resources for nearly two decades, LB3D version 7 provides a subset of the research code functionality as an open source project. Here, we describe the theoretical basis of the algorithm as well as computational aspects of the implementation. The software package is validated against simulations of meso-phases resulting from self-assembly in ternary fluid mixtures comprising immiscible and amphiphilic components such as water-oil-surfactant systems. The impact of the surfactant species on the dynamics of spinodal decomposition is tested, and quantitative measurement of the permeability of a body centred cubic (BCC) model porous medium for a simple binary mixture is described. Single-core performance and scaling behaviour of the code are reported for simulations on current supercomputer architectures.
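A minimal single-phase D2Q9 BGK kernel of the kind at the core of lattice-Boltzmann codes such as LB3D; the multicomponent and amphiphilic couplings that distinguish LB3D are omitted, and the grid size and relaxation time are arbitrary choices.

```python
import numpy as np

# D2Q9 lattice velocities and weights
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4 / 9] + [1 / 9] * 4 + [1 / 36] * 4)
tau, NX, NY = 0.8, 64, 64                   # BGK relaxation time; grid size

def equilibrium(rho, ux, uy):
    eu = e[:, 0, None, None] * ux + e[:, 1, None, None] * uy
    u2 = ux ** 2 + uy ** 2
    return w[:, None, None] * rho * (1 + 3 * eu + 4.5 * eu ** 2 - 1.5 * u2)

# uniform density with a small sinusoidal shear wave (decays viscously)
rho = np.ones((NX, NY))
ux = 0.05 * np.sin(2 * np.pi * np.arange(NX) / NX)[:, None] * np.ones(NY)
uy = np.zeros((NX, NY))
f = equilibrium(rho, ux, uy)

for step in range(200):
    rho = f.sum(axis=0)
    ux = (f * e[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * e[:, 1, None, None]).sum(axis=0) / rho
    f += (equilibrium(rho, ux, uy) - f) / tau              # collide (BGK)
    for k in range(9):                                      # stream
        f[k] = np.roll(np.roll(f[k], e[k, 0], axis=0), e[k, 1], axis=1)

print(f"shear amplitude after 200 steps: {np.abs(ux).max():.4f} (started at 0.05)")
```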
Information Security Analysis Using Game Theory and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlicher, Bob G; Abercrombie, Robert K
Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, allowing us to overcome limitations of current stochastic game models, which consider only perfect information (assuming that the defender is always able to detect attacks), assume that the state transition probabilities are fixed before the game and that the players' actions are always synchronous, and are generally not scalable to the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.
Antonopoulos, Markos; Stamatakos, Georgios
2015-01-01
Intensive glioma tumor infiltration into the surrounding normal brain tissues is one of the most critical causes of glioma treatment failure. To quantitatively understand and mathematically simulate this phenomenon, several diffusion-based mathematical models have appeared in the literature. The majority of them ignore the anisotropic character of diffusion of glioma cells since availability of pertinent truly exploitable tomographic imaging data is limited. Aiming at enriching the anisotropy-enhanced glioma model weaponry so as to increase the potential of exploiting available tomographic imaging data, we propose a Brownian motion-based mathematical analysis that could serve as the basis for a simulation model estimating the infiltration of glioblastoma cells into the surrounding brain tissue. The analysis is based on clinical observations and exploits diffusion tensor imaging (DTI) data. Numerical simulations and suggestions for further elaboration are provided.
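A short sketch of anisotropic Brownian motion driven by a diffusion tensor, the basic ingredient of such DTI-informed infiltration models: steps are drawn from N(0, 2D dt) via a Cholesky factor, and the tensor is recovered from the endpoint statistics. The tensor values are invented stand-ins for DTI-derived data.

```python
import numpy as np

rng = np.random.default_rng(5)

# assumed local diffusion tensor (stand-in for a DTI voxel), units mm^2/day
D = np.array([[0.20, 0.08],
              [0.08, 0.05]])
dt, n_steps, n_cells = 0.1, 500, 2000

# Brownian steps ~ N(0, 2 D dt), generated via a Cholesky factor of 2 D dt
Lc = np.linalg.cholesky(2 * D * dt)
steps = rng.normal(size=(n_cells, n_steps, 2)) @ Lc.T
pos = steps.sum(axis=1)                     # cell positions after n_steps

cov = np.cov(pos.T) / (2 * n_steps * dt)    # endpoint statistics recover D
print("recovered tensor:\n", np.round(cov, 3))
```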
Systematic coarse-grained modeling of complexation between small interfering RNA and polycations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Zonghui; Luijten, Erik, E-mail: luijten@northwestern.edu; Department of Materials Science and Engineering, Northwestern University, Evanston, Illinois 60208
All-atom molecular dynamics simulations can provide insight into the properties of polymeric gene-delivery carriers by elucidating their interactions and detailed binding patterns with nucleic acids. However, to explore nanoparticle formation through complexation of these polymers and nucleic acids and study their behavior at experimentally relevant time and length scales, a reliable coarse-grained model is needed. Here, we systematically develop such a model for the complexation of small interfering RNA (siRNA) and grafted polyethyleneimine copolymers, a promising candidate for siRNA delivery. We compare the predictions of this model with all-atom simulations and demonstrate that it is capable of reproducing detailed binding patterns, charge characteristics, and water release kinetics. Since the coarse-grained model accelerates the simulations by one to two orders of magnitude, it will make it possible to quantitatively investigate nanoparticle formation involving multiple siRNA molecules and cationic copolymers.
Comparison of Different Methods of Grading a Level Turn Task on a Flight Simulator
NASA Technical Reports Server (NTRS)
Heath, Bruce E.; Crier, Tomyka
2003-01-01
With the advancements in the computing power of personal computers, PC-based flight simulators and trainers have opened new avenues in the training of airplane pilots. It may be desirable to have the flight simulator make a quantitative evaluation of the progress of a pilot's training, thereby reducing the demand on the flight instructor, who must otherwise watch every flight. In an experiment, university students conducted six different flights, each consisting of two level turns. The flights were three minutes in duration. By evaluating videotapes, two certified flight instructors provided separate letter grades for each turn. These level turns were also evaluated using two other computer-based grading methods. One method assigned automated grades based on prescribed tolerances in bank angle, airspeed, and altitude. The other used deviations in altitude and bank angle to compute a performance index and performance grades.
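A toy version of the tolerance-based automated grading approach: score a level turn by the fraction of samples inside prescribed bands around the target bank angle, altitude, and airspeed. The tolerances and grade cutoffs below are illustrative, not those used in the study.

```python
import numpy as np

def grade_level_turn(bank_deg, alt_ft, speed_kt,
                     bank_tgt=30.0, alt_tgt=3000.0, speed_tgt=100.0):
    """Toy automated grader: score = fraction of samples inside tolerance
    bands (tolerance values and cutoffs are illustrative)."""
    ok = ((np.abs(bank_deg - bank_tgt) < 5.0) &
          (np.abs(alt_ft - alt_tgt) < 100.0) &
          (np.abs(speed_kt - speed_tgt) < 10.0))
    score = ok.mean()
    for letter, cutoff in [("A", 0.9), ("B", 0.8), ("C", 0.7), ("D", 0.6)]:
        if score >= cutoff:
            return letter, score
    return "F", score

# simulated 90 s level turn sampled at 10 Hz with modest tracking errors
rng = np.random.default_rng(6)
n = 900
letter, score = grade_level_turn(30 + rng.normal(0, 3, n),
                                 3000 + rng.normal(0, 40, n),
                                 100 + rng.normal(0, 4, n))
print(f"automated grade: {letter} (score {score:.2f})")
```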
Schmidt, Steven R; Katti, Dinesh R; Ghosh, Pijush; Katti, Kalpana S
2005-08-16
The mechanical response of the interlayer of hydrated montmorillonite was evaluated using steered molecular dynamics. An atomic model of the sodium montmorillonite was previously constructed. In the current study, the interlayer of the model was hydrated with multiple layers of water. Using steered molecular dynamics, external forces were applied to individual atoms of the clay surface, and the response of the model was studied. The displacement versus applied stress and stress versus strain relationships of various parts of the interlayer were studied. The paper describes the construction of the model, the simulation procedure, and results of the simulations. Some results of the previous work are further interpreted in the light of the current research. The simulations provide quantitative stress deformation relationships as well as an insight into the molecular interactions taking place between the clay surface and interlayer water and cations.
The co-development of looking dynamics and discrimination performance
Perone, Sammy; Spencer, John P.
2015-01-01
The study of looking dynamics and discrimination forms the backbone of developmental science, and these are central processes in theories of infant cognition. Looking dynamics and discrimination change dramatically across the first year of life. Surprisingly, developmental changes in looking and discrimination have not been studied together. Recent simulations of a dynamic neural field (DNF) model of infant looking and memory suggest that looking and discrimination do change together over development and arise from a single neurodevelopmental mechanism. We probe this claim by measuring looking dynamics and discrimination along continuous, metrically organized dimensions in 5-, 7-, and 10-month-old infants (N = 119). The results showed that looking dynamics and discrimination changed together over development and are linked within individuals. Quantitative simulations of a DNF model provide insights into the processes that underlie developmental change in looking dynamics and discrimination. Simulation results support the view that these changes might arise from a single neurodevelopmental mechanism. PMID:23957821
Ion specific correlations in bulk and at biointerfaces.
Kalcher, I; Horinek, D; Netz, R R; Dzubiella, J
2009-10-21
Ion specific effects are ubiquitous in any complex colloidal or biological fluid in bulk or at interfaces. The molecular origins of these 'Hofmeister effects' are not well understood and their theoretical description poses a formidable challenge to the modeling and simulation community. On the basis of the combination of atomistically resolved molecular dynamics (MD) computer simulations and statistical mechanics approaches, we present a few selected examples of specific electrolyte effects in bulk, at simple neutral and charged interfaces, and on a short α-helical peptide. The structural complexity in these strongly Coulomb-correlated systems is highlighted and analyzed in the light of available experimental data. While in general the comparison of MD simulations to experiments often lacks quantitative agreement, mostly because molecular force fields and coarse-graining procedures remain to be optimized, the consensus as regards trends provides important insights into microscopic hydration and binding mechanisms.
From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses
Zenker, Sven; Rubin, Jonathan; Clermont, Gilles
2007-01-01
The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses. We outline possible steps toward translating this computational approach to the bedside, to supplement today's evidence-based medicine with a quantitatively founded model-based medicine that integrates mechanistic knowledge with patient-specific information. PMID:17997590
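A compressed illustration of the paper's central point, assuming a deliberately crude hemodynamic relation (mean arterial pressure ≈ cardiac output × systemic vascular resistance, with all values invented): a single observation leaves a ridge of equally plausible parameter combinations, and an additional measurement, e.g. after a fluid challenge, collapses it.

```python
import numpy as np

# toy hemodynamics: mean arterial pressure MAP ~= CO * SVR (units arbitrary)
co = np.linspace(2, 10, 200)       # cardiac output grid
svr = np.linspace(8, 40, 200)      # systemic vascular resistance grid
CO, SVR = np.meshgrid(co, svr)

map_obs, sigma = 65.0, 3.0         # observed hypotension and measurement noise

def posterior(extra_co_obs=None):
    loglik = -0.5 * ((CO * SVR - map_obs) / sigma) ** 2
    if extra_co_obs is not None:   # e.g., a CO estimate after a fluid challenge
        loglik += -0.5 * ((CO - extra_co_obs) / 0.5) ** 2
    p = np.exp(loglik - loglik.max())
    return p / p.sum()

for label, p in [("MAP only", posterior()),
                 ("MAP + CO", posterior(extra_co_obs=3.0))]:
    # inverse participation ratio: effective number of plausible grid states
    print(f"{label}: effective support = {1.0 / (p ** 2).sum():.0f} grid cells")
```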
Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method
NASA Astrophysics Data System (ADS)
Yuan, Zhe; Zhang, Yiming; Zheng, Qijia
2018-02-01
An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, in which the Earth impulse response is identified by measuring the system output given the voltage response as input. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
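A sketch of the identification-system view: generate an m-sequence with a linear-feedback shift register, drive a hypothetical earth impulse response with it, and recover that response by circular cross-correlation, exploiting the sequence's nearly ideal autocorrelation. The impulse response and noise level are invented for illustration.

```python
import numpy as np

def m_sequence(taps=(7, 6), n=7):
    """Maximal-length sequence from a Fibonacci LFSR; taps (7, 6) give
    period 2**7 - 1 = 127."""
    state, seq = [1] * n, []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return 2.0 * np.array(seq) - 1.0        # map {0,1} -> {-1,+1}

rng = np.random.default_rng(7)
m = m_sequence()
N = m.size

# hypothetical earth impulse response and a noisy received signal
h = np.exp(-np.arange(N) / 10.0) * (np.arange(N) < 40)
y = np.real(np.fft.ifft(np.fft.fft(m) * np.fft.fft(h))) + rng.normal(0, 2.0, N)

# identification: circular cross-correlation with the transmitted m-sequence,
# whose near-delta autocorrelation makes the correlation approximate h
h_est = np.real(np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(m)))) / N
print(f"correlation with true response: {np.corrcoef(h, h_est)[0, 1]:.3f}")
```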
Yap, John Stephen; Fan, Jianqing; Wu, Rongling
2009-12-01
Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L2 penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
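A numpy sketch of the modified-Cholesky idea: regress each time point on its predecessors with an L2 penalty, collect the negated coefficients in a unit lower-triangular T and the residual variances in D, and form the covariance estimate as T^{-1} D T^{-T}. The simulated data and penalty value are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
T_pts, n_sub, lam = 10, 80, 0.1

# simulate longitudinal trait data with an AR(1)-like true covariance
Y = np.zeros((n_sub, T_pts))
Y[:, 0] = rng.normal(size=n_sub)
for t in range(1, T_pts):
    Y[:, t] = 0.7 * Y[:, t - 1] + rng.normal(size=n_sub)

# modified Cholesky: regress each time point on its predecessors (L2 penalty)
Tmat, d = np.eye(T_pts), np.empty(T_pts)
d[0] = Y[:, 0].var()
for t in range(1, T_pts):
    X = Y[:, :t]
    phi = np.linalg.solve(X.T @ X + lam * np.eye(t), X.T @ Y[:, t])
    Tmat[t, :t] = -phi
    d[t] = (Y[:, t] - X @ phi).var()        # innovation variances -> D

Tinv = np.linalg.inv(Tmat)
Sigma = Tinv @ np.diag(d) @ Tinv.T          # regularized covariance estimate
r = Sigma[1, 0] / np.sqrt(Sigma[0, 0] * Sigma[1, 1])
print(f"estimated lag-1 correlation: {r:.2f} (true 0.7)")
```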
Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W
2016-04-01
Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
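A bare-bones Monte Carlo of the quantitative step described above: estimate the probability of failing a specification at candidate settings of two process parameters. The process model, noise level, and specification limit are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(9)
SPEC = 75.0                           # fail if percent dissolved < SPEC

def p_fail(temp, speed, n_mc=20000):
    """Hypothetical process model: percent dissolved as a linear function
    of two critical process parameters plus batch-to-batch noise."""
    mean = 60 + 0.8 * (temp - 40) + 0.5 * (speed - 50)
    batches = mean + rng.normal(0, 4.0, n_mc)
    return np.mean(batches < SPEC)

print("temp  speed  P(fail)")
for temp in (40, 50, 60):
    for speed in (40, 60):
        print(f"{temp:4d} {speed:6d}  {p_fail(temp, speed):.3f}")
```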
Punkvang, Auradee; Hannongbua, Supa; Saparpakorn, Patchreenart; Pungpo, Pornpan
2016-05-01
The Mycobacterium tuberculosis protein kinase B (PknB) is critical for growth and survival of M. tuberculosis within the host. The series of aminopyrimidine derivatives show impressive activity against PknB (IC50 < 0.5 μM). However, most of them show weak or no cellular activity against M. tuberculosis (MIC > 63 μM). Consequently, the key structural features related to activity against both PknB and M. tuberculosis need to be investigated. Here, two- and three-dimensional quantitative structure-activity relationship (2D and 3D QSAR) analyses combined with molecular dynamics (MD) simulations were employed with the aim of evaluating these key structural features of aminopyrimidine derivatives. Hologram quantitative structure-activity relationship (HQSAR) and CoMSIA models constructed from IC50 and MIC values of aminopyrimidine compounds could establish the structural requirements for better activity against both PknB and M. tuberculosis. The NH linker and the R1 substituent of the template compound are not only crucial for the biological activity against PknB but also for the biological activity against M. tuberculosis. Moreover, the results obtained from MD simulations show that these moieties are the key fragments for binding of aminopyrimidine compounds in PknB. The combination of QSAR analysis and MD simulations helps us to provide a structural concept that could guide future design of PknB inhibitors with improved potency against both the purified enzyme and whole M. tuberculosis cells.
Li, W.; Ma, Q.; Thorne, R. M.; ...
2016-06-10
Various physical processes are known to cause acceleration, loss, and transport of energetic electrons in the Earth's radiation belts, but their quantitative roles in different time and space need further investigation. During the largest storm over the past decade (17 March 2015), relativistic electrons experienced fairly rapid acceleration up to ~7 MeV within 2 days after an initial substantial dropout, as observed by Van Allen Probes. In the present paper, we evaluate the relative roles of various physical processes during the recovery phase of this large storm using a 3-D diffusion simulation. By quantitatively comparing the observed and simulated electron evolution, we found that chorus plays a critical role in accelerating electrons up to several MeV near the developing peak location and produces characteristic flat-top pitch angle distributions. By only including radial diffusion, the simulation underestimates the observed electron acceleration, while radial diffusion plays an important role in redistributing electrons and potentially accelerates them to even higher energies. Moreover, plasmaspheric hiss is found to provide efficient pitch angle scattering losses for hundreds of keV electrons, while its scattering effect on > 1 MeV electrons is relatively slow. Although an additional loss process is required to fully explain the overestimated electron fluxes at multi-MeV, the combined physical processes of radial diffusion and pitch angle and energy diffusion by chorus and hiss reproduce the observed electron dynamics remarkably well, suggesting that quasi-linear diffusion theory is reasonable to evaluate radiation belt electron dynamics during this big storm.
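As a rough sketch of one ingredient of such 3-D diffusion simulations, the following solves a 1-D radial diffusion equation with a simple loss term; the diffusion-coefficient scaling, loss time, boundary conditions, and initial profile are all placeholder choices, not those of the study.

```python
import numpy as np

# explicit solver for df/dt = L^2 d/dL ( D_LL / L^2 * df/dL ) - f / tau
L = np.arange(3.0, 6.5, 0.05)
dL, dt, t_end = 0.05, 0.005, 2.0            # L step; dt, t_end in days
D0, tau = 1e-9, 5.0                         # assumed D_LL scale; loss time [days]

f = np.exp(-0.5 * ((L - 4.5) / 0.4) ** 2)   # initial phase-space density
Lh = 0.5 * (L[1:] + L[:-1])                 # half-grid points
Dh = D0 * Lh ** 10 / Lh ** 2                # assumed D_LL ~ L^10 scaling

for _ in range(int(t_end / dt)):            # explicit step (dt chosen stable)
    flux = Dh * np.diff(f) / dL
    f[1:-1] += dt * (L[1:-1] ** 2 * np.diff(flux) / dL - f[1:-1] / tau)
    f[-1], f[0] = 1.0, 0.0                  # outer source, inner absorber

for Lq in (4.0, 5.0, 6.0):
    idx = np.argmin(np.abs(L - Lq))
    print(f"f(L = {Lq}) = {f[idx]:.3f}")
```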
2D Quantum Transport Modeling in Nanoscale MOSFETs
NASA Technical Reports Server (NTRS)
Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, Bryan
2001-01-01
With the onset of quantum confinement in the inversion layer in nanoscale MOSFETs, behavior of the resonant level inevitably determines all device characteristics. While most classical device simulators take quantization into account in some simplified manner, the important details of electrostatics are missing. Our work addresses this shortcoming and provides: (a) a framework to quantitatively explore device physics issues such as the source-drain and gate leakage currents, DIBL, and threshold voltage shift due to quantization, and (b) a means of benchmarking quantum corrections to semiclassical models (such as density-gradient and quantum-corrected MEDICI). We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions, oxide tunneling and phase-breaking scattering are treated on an equal footing. Electrons in the ellipsoids of the conduction band are treated within the anisotropic effective mass approximation. Quantum simulations are focused on MIT 25, 50 and 90 nm "well-tempered" MOSFETs and compared to classical and quantum-corrected models. An important feature of the quantum model is the smaller slope of the Id-Vg curve and, consequently, the higher threshold voltage. These results are quantitatively consistent with 1-D Schrödinger-Poisson calculations. The effect of gate length on gate-oxide leakage and sub-threshold current has been studied. The shorter gate length device has an order of magnitude smaller current at zero gate bias than the longer gate length device without a significant trade-off in on-current. This should be a device design consideration.
Conti, Michele; Van Loo, Denis; Auricchio, Ferdinando; De Beule, Matthieu; De Santis, Gianluca; Verhegghe, Benedict; Pirrelli, Stefano; Odero, Attilio
2011-06-01
The aim of this study was to quantitatively evaluate the impact of carotid stent cell design on vessel scaffolding by using patient-specific finite element analysis of carotid artery stenting (CAS). The study was organized in 2 parts: (1) validation of a patient-specific finite element analysis of CAS and (2) evaluation of vessel scaffolding. Micro-computed tomography (CT) images of an open-cell stent deployed in a patient-specific silicone mock artery were compared with the corresponding finite element analysis results. This simulation was repeated for the closed-cell counterpart. In the second part, the stent strut distribution, as reflected by the inter-strut angles, was evaluated for both cell types in different vessel cross sections as a measure of scaffolding. The results of the patient-specific finite element analysis of CAS matched well with experimental stent deployment both qualitatively and quantitatively, demonstrating the reliability of the numerical approach. The measured inter-strut angles suggested that the closed-cell design provided superior vessel scaffolding compared to the open-cell counterpart. However, the full strut interconnection of the closed-cell design reduced the stent's ability to accommodate to the irregular eccentric profile of the vessel cross section, leading to a gap between the stent surface and the vessel wall. Even though this study was limited to a single stent design and one vascular anatomy, the study confirmed the capability of dedicated computer simulations to predict differences in scaffolding by open- and closed-cell carotid artery stents. These simulations have the potential to be used in the design of novel carotid stents or for procedure planning.
Precisely detecting atomic positions in atomic intensity images.
Wang, Zhijun; Guo, Yaolin; Tang, Sai; Li, Junjie; Wang, Jincheng; Zhou, Yaohe
2015-03-01
We propose a quantitative method to detect atomic positions in atomic intensity images obtained from experiments, such as high-resolution transmission electron microscopy and atomic force microscopy, and from simulations, such as phase-field crystal modeling. An evaluation of detection accuracy demonstrates the excellent performance of the method. The method makes it possible to precisely determine atomic interactions from the atomic positions detected in an intensity image, and hence to investigate related physical, chemical, and electrical properties. Copyright © 2014 Elsevier B.V. All rights reserved.
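To make the flavor of such a detector concrete, the following is a minimal sketch (not the authors' algorithm) that finds candidate atomic positions as local intensity maxima and refines each to sub-pixel precision with an intensity-weighted centroid; the window size and threshold are assumed tuning parameters.

```python
import numpy as np
from scipy.ndimage import maximum_filter, center_of_mass, label

def detect_atoms(image, window=7, threshold=0.3):
    """Locate atomic columns as local intensity maxima, then refine each
    position with an intensity-weighted centroid for sub-pixel accuracy."""
    img = (image - image.min()) / (np.ptp(image) + 1e-12)  # normalize to [0, 1]
    peaks = (img == maximum_filter(img, size=window)) & (img > threshold)
    labels, n = label(peaks)
    coarse = center_of_mass(peaks, labels, range(1, n + 1))  # integer-level peaks
    half, refined = window // 2, []
    for (r, c) in coarse:
        r0 = max(int(round(r)) - half, 0)
        c0 = max(int(round(c)) - half, 0)
        dr, dc = center_of_mass(img[r0:r0 + window, c0:c0 + window])
        refined.append((r0 + dr, c0 + dc))  # sub-pixel position
    return np.array(refined)
```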
NASA Technical Reports Server (NTRS)
Pocinki, L. S.; Kaplan, L. D.; Cornell, M. E.; Greenstone, R.
1979-01-01
A model was developed to generate quantitative estimates of the risk associated with the release of graphite fibers during fires involving commercial aircraft constructed with graphite fiber composite materials. The model was used to estimate the risk associated with accidents at several U.S. airports. These results were then combined to provide an estimate of the total risk to the nation.
Chu, X; Korzekwa, K; Elsby, R; Fenner, K; Galetin, A; Lai, Y; Matsson, P; Moss, A; Nagar, S; Rosania, GR; Bai, JPF; Polli, JW; Sugiyama, Y; Brouwer, KLR
2013-01-01
Intracellular concentrations of drugs and metabolites are often important determinants of efficacy, toxicity, and drug interactions. Hepatic drug distribution can be affected by many factors, including physicochemical properties, uptake/efflux transporters, protein binding, organelle sequestration, and metabolism. This white paper highlights determinants of hepatocyte drug/metabolite concentrations and provides an update on model systems, methods, and modeling/simulation approaches used to quantitatively assess hepatocellular concentrations of molecules. The critical scientific gaps and future research directions in this field are discussed. PMID:23588320
Hydrodynamic predictions for 5.44 TeV Xe+Xe collisions
NASA Astrophysics Data System (ADS)
Giacalone, Giuliano; Noronha-Hostler, Jacquelyn; Luzum, Matthew; Ollitrault, Jean-Yves
2018-03-01
We argue that relativistic hydrodynamics is able to make robust predictions for soft particle production in Xe+Xe collisions at the CERN Large Hadron Collider (LHC). The change of system size from Pb+Pb to Xe+Xe provides a unique opportunity to test the scaling laws inherent to fluid dynamics. Using event-by-event hydrodynamic simulations, we make quantitative predictions for several observables: mean transverse momentum, anisotropic flow coefficients, and their fluctuations. Results are shown as a function of collision centrality.
Role of Correlations in the Collective Behavior of Microswimmer Suspensions
NASA Astrophysics Data System (ADS)
Stenhammar, Joakim; Nardini, Cesare; Nash, Rupert W.; Marenduzzo, Davide; Morozov, Alexander
2017-07-01
In this Letter, we study the collective behavior of a large number of self-propelled microswimmers immersed in a fluid. Using unprecedentedly large-scale lattice Boltzmann simulations, we reproduce the transition to bacterial turbulence. We show that, even well below the transition, swimmers move in a correlated fashion that cannot be described by a mean-field approach. We develop a novel kinetic theory that captures these correlations and is nonperturbative in the swimmer density. To provide an experimentally accessible measure of correlations, we calculate the diffusivity of passive tracers and reveal its nontrivial density dependence. The theory is in quantitative agreement with the lattice Boltzmann simulations and captures the asymmetry between pusher and puller swimmers below the transition to turbulence.
2012-01-30
Sensors: LIDAR, Camera, SONAR) is qualitatively or quantitatively ranked against the other options in such categories as weight and power consumption... Mapping (SLAM) and A*. The second software change in progress is upgrading from Unreal 2004 to... is a bridge between an external program that defines a... current simulation setup, a simulated quad-copter with an Inertial Navigation System (INS) and ranging LIDAR sensor spawns within an environment and
A permeation theory for single-file ion channels: one- and two-step models.
Nelson, Peter Hugo
2011-04-28
How many steps are required to model permeation through ion channels? This question is investigated by comparing one- and two-step models of permeation with experiment and MD simulation for the first time. In recent MD simulations, the observed permeation mechanism was identified as resembling a Hodgkin and Keynes knock-on mechanism with one voltage-dependent rate-determining step [Jensen et al., PNAS 107, 5833 (2010)]. These previously published simulation data are fitted to a one-step knock-on model that successfully explains the highly non-Ohmic current-voltage curve observed in the simulation. However, these predictions (and the simulations upon which they are based) are not representative of real channel behavior, which is typically Ohmic at low voltages. A two-step association/dissociation (A/D) model is then compared with experiment for the first time. This two-parameter model is shown to be remarkably consistent with previously published permeation experiments through the MaxiK potassium channel over a wide range of concentrations and positive voltages. The A/D model also provides a first-order explanation of permeation through the Shaker potassium channel, but it does not explain the asymmetry observed experimentally. To address this, a new asymmetric variant of the A/D model is developed using the present theoretical framework. It includes a third parameter that represents the value of the "permeation coordinate" (fractional electric potential energy) corresponding to the triply occupied state n of the channel. This asymmetric A/D model is fitted to published permeation data through the Shaker potassium channel at physiological concentrations, and it successfully predicts qualitative changes in the negative current-voltage data (including a transition to super-Ohmic behavior) based solely on a fit to positive-voltage data (that appear linear). The A/D model appears to be qualitatively consistent with a large group of published MD simulations, but no quantitative comparison has yet been made. The A/D model makes a network of predictions for how the elementary steps and the channel occupancy vary with both concentration and voltage. In addition, the proposed theoretical framework suggests a new way of plotting the energetics of the simulated system using a one-dimensional permeation coordinate that uses electric potential energy as a metric for the net fractional progress through the permeation mechanism. This approach has the potential to provide a quantitative connection between atomistic simulations and permeation experiments for the first time.
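To illustrate the qualitative behavior credited to the two-step model above (Ohmic at low voltage, Michaelis-Menten-like saturation in concentration), here is a generic two-state association/dissociation cycle with the membrane potential split evenly across the two steps via Eyring factors; the rate constants and the 50/50 voltage split are illustrative assumptions, not the published parameterization.

```python
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def ad_current(v, conc, k_a=1e8, k_d=1e7, z=1, T=298.0):
    """Steady-state flux for an empty <-> occupied two-step cycle.
    v: membrane voltage (V); conc: symmetric ion concentration (M);
    k_a: association rate constant (1/(M*s)); k_d: dissociation rate (1/s)."""
    u = z * v / (K_B_EV * T)                      # reduced voltage zeV/kT
    k1, km1 = k_a * conc * np.exp(u / 4), k_a * conc * np.exp(-u / 4)
    k2, km2 = k_d * np.exp(u / 4), k_d * np.exp(-u / 4)
    flux = (k1 * k2 - km1 * km2) / (k1 + k2 + km1 + km2)  # two-state cycle algebra
    return 1.602e-19 * z * flux                   # current in amperes

# Near-Ohmic at small v; current saturates in conc like c/(c + k_d/k_a):
print(ad_current(np.linspace(-0.1, 0.1, 5), conc=0.1))
```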
The use of simulation in neurosurgical education and training. A systematic review.
Kirkman, Matthew A; Ahmed, Maria; Albert, Angelique F; Wilson, Mark H; Nandi, Dipankar; Sevdalis, Nick
2014-08-01
There is increasing evidence that simulation provides high-quality, time-effective training in an era of resident duty-hour restrictions. Simulation may also permit trainees to acquire key skills in a safe environment, important in a specialty such as neurosurgery, where technical error can result in devastating consequences. The authors systematically reviewed the application of simulation within neurosurgical training and explored the state of the art in simulation within this specialty. To their knowledge this is the first systematic review published on this topic to date. The authors searched the Ovid MEDLINE, Embase, and PsycINFO databases and identified 4101 articles; 195 abstracts were screened by 2 authors for inclusion. The authors reviewed data on study population, study design and setting, outcome measures, key findings, and limitations. Twenty-eight articles formed the basis of this systematic review. Several different simulators are at the neurosurgeon's disposal, including those for ventriculostomy, neuroendoscopic procedures, and spinal surgery, with evidence for improved performance in a range of procedures. Feedback from participants has generally been favorable. However, study quality was found to be poor overall, with many studies hampered by nonrandomized design, presenting normal rather than abnormal anatomy, lack of control groups and long-term follow-up, poor study reporting, lack of evidence of improved simulator performance translating into clinical benefit, and poor reliability and validity evidence. The mean Medical Education Research Study Quality Instrument score of included studies was 9.21 ± 1.95 (± SD) out of a possible score of 18. The authors demonstrate qualitative and quantitative benefits of a range of neurosurgical simulators but find significant shortfalls in methodology and design. Future studies should seek to improve study design and reporting, and provide long-term follow-up data on simulated and ideally patient outcomes.
Fourier phase in Fourier-domain optical coherence tomography
Uttam, Shikhar; Liu, Yang
2015-01-01
Phase of an electromagnetic wave propagating through a sample-of-interest is well understood in the context of quantitative phase imaging in transmission-mode microscopy. In the past decade, Fourier-domain optical coherence tomography has been used to extend quantitative phase imaging to the reflection-mode. Unlike transmission-mode electromagnetic phase, however, the origin and characteristics of reflection-mode Fourier phase are poorly understood, especially in samples with a slowly varying refractive index. In this paper, the general theory of Fourier phase from first principles is presented, and it is shown that Fourier phase is a joint estimate of subresolution offset and mean spatial frequency of the coherence-gated sample refractive index. It is also shown that both spectral-domain phase microscopy and depth-resolved spatial-domain low-coherence quantitative phase microscopy are special cases of this general theory. Analytical expressions are provided for both, and simulations are presented to explain and support the theoretical results. These results are further used to show how Fourier phase allows the estimation of an axial mean spatial frequency profile of the sample, along with depth-resolved characterization of localized optical density change and sample heterogeneity. Finally, a Fourier phase-based explanation of Doppler optical coherence tomography is also provided. PMID:26831383
Linkage disequilibrium interval mapping of quantitative trait loci.
Boitard, Simon; Abdallah, Jihad; de Rochambeau, Hubert; Cierco-Ayrolles, Christine; Mangin, Brigitte
2006-03-16
For many years gene mapping studies have been performed through linkage analyses based on pedigree data. Recently, linkage disequilibrium methods based on unrelated individuals have been advocated as powerful tools to refine estimates of gene location. Many strategies have been proposed to deal with simply inherited disease traits. However, locating quantitative trait loci is statistically more challenging and considerable research is needed to provide robust and computationally efficient methods. Under a three-locus Wright-Fisher model, we derived approximate expressions for the expected haplotype frequencies in a population. We considered haplotypes comprising one trait locus and two flanking markers. Using these theoretical expressions, we built a likelihood-maximization method, called HAPim, for estimating the location of a quantitative trait locus. For each postulated position, the method only requires information from the two flanking markers. Over a wide range of simulation scenarios it was found to be more accurate than a two-marker composite likelihood method. It also performed as well as identity by descent methods, whilst being valuable in a wider range of populations. Our method makes efficient use of marker information, and can be valuable for fine mapping purposes. Its performance is increased if multiallelic markers are available. Several improvements can be developed to account for more complex evolution scenarios or provide robust confidence intervals for the location estimates.
Etienne, E; Le Breton, N; Martinho, M; Mileo, E; Belle, V
2017-08-01
Site-directed spin labeling (SDSL) combined with continuous wave electron paramagnetic resonance (cw EPR) spectroscopy is a powerful technique to reveal, at the residue level, structural transitions in proteins. SDSL-EPR is based on the selective grafting of a paramagnetic label onto the protein under study, followed by cw EPR analysis. To extract valuable quantitative information from SDSL-EPR spectra, and thus give reliable interpretations of biological system dynamics, numerical simulations of the spectra are required. Such spectral simulations can be carried out in MATLAB using functions from the EasySpin toolbox; for non-expert users of MATLAB, this can be a complex task or even impede the use of such simulation tools. We developed a graphical user interface called SimLabel dedicated to running cw EPR spectrum simulations, particularly those arising from SDSL-EPR experiments. SimLabel provides an intuitive way to visualize, simulate, and fit such cw EPR spectra. An example of SDSL-EPR spectra simulation concerning the study of an intrinsically disordered region undergoing a local induced folding is described and discussed. We believe that this new tool will help users rapidly obtain reliable simulated spectra and hence facilitate the interpretation of their results. Copyright © 2017 John Wiley & Sons, Ltd.
General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.
de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael
2016-11-01
Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
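As a concrete special case of the latent-to-observed conversion described above, the Poisson model with log link admits closed-form lognormal expressions (the abstract notes that known formulas for Poisson and binomial traits fall out of the general theory as special cases); a minimal sketch, with averaging over fixed effects omitted:

```python
import numpy as np

def poisson_log_h2(mu, var_a, var_resid):
    """Observed-scale mean, phenotypic variance, and heritability for a
    Poisson GLMM with log link. mu, var_a, var_resid are the latent-scale
    intercept, additive genetic variance, and remaining latent variance."""
    v_tot = var_a + var_resid
    lam = np.exp(mu + v_tot / 2)               # expected count (lognormal mean)
    var_exp = lam**2 * (np.exp(v_tot) - 1)     # variance of exp(latent value)
    var_phen = var_exp + lam                   # plus Poisson sampling variance
    var_a_obs = lam**2 * var_a                 # first-order observed-scale V_A
    return lam, var_phen, var_a_obs / var_phen

lam, vp, h2 = poisson_log_h2(mu=1.0, var_a=0.3, var_resid=0.2)
print(f"mean={lam:.2f}  Vp={vp:.2f}  h2_obs={h2:.3f}")
```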
Smith, Eric G.
2015-01-01
Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
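A toy numerical reading of the estimation step follows, under the assumption that amplification can be summarized as a single multiplier a on residual bias, so the between-model change in the estimate equals bias × (a − 1); the paper's adjustment for any outcome association of the introduced variable is omitted, and all numbers are hypothetical.

```python
def residual_confounding(est_base, est_amplified, amp_factor):
    """Estimate residual confounding from two nested propensity models.
    est_base: treatment effect estimate from the base model;
    est_amplified: estimate after adding the exposure-predictive variable;
    amp_factor: assumed multiplier a applied to residual bias (a > 1)."""
    bias = (est_amplified - est_base) / (amp_factor - 1.0)
    return bias, est_base - bias  # (residual bias, bias-corrected estimate)

# e.g. the estimate moves from 1.30 to 1.45 under an assumed 1.5x amplification:
bias, corrected = residual_confounding(1.30, 1.45, 1.5)
print(f"residual confounding ~ {bias:.2f}; corrected effect ~ {corrected:.2f}")
```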
Zhao, Ying; Liu, Dongmei; Tang, Huan; Lu, Jing; Cui, Fuyi
2014-01-01
As nanotechnology develops, more nanomaterials will enter water environments. Studying the forms that nanomaterials take in water will help people benefit from their correct use and reduce the harm they can cause, since some nanomaterials have polluting effects. Aggregation is a main behavior of nanoparticles in water, and nanoscale zero-valent iron (NZVI) is widely used in many fields, so increasing amounts of NZVI reach the water environment. Molecular dynamics simulations and the Materials Studio software are used to investigate the microscale aggregation behaviors of NZVI particles. Two scenarios are involved: (1) the NZVI particle size in each simulation system is the same, but the initial distance between the two NZVI particles differs; (2) the initial distance between the two NZVI particles in each system is the same, but the NZVI particle size differs. Atomistic trajectories, nanoparticle activity, total energy, and adsorption of H2O are analyzed with Materials Studio. The method provides new quantitative insight into the structure, energy, and dynamics of the aggregation behaviors of NZVI particles in water. Understanding microscale changes of nanoparticles in water is necessary because it provides the theoretical grounding needed to reduce their polluting effects on the water environment. PMID:25250388
Rim, Yonghoon; Laing, Susan T.; McPherson, David D.; Kim, Hyunggun
2013-01-01
Mitral valve repair using expanded polytetrafluoroethylene (ePTFE) sutures is an established and preferred interventional method to resolve the complex pathophysiologic problems associated with chordal rupture. We developed a novel computational evaluation protocol to determine the effect of the artificial sutures on restoring mitral valve function following valve repair. A virtual mitral valve was created using three-dimensional echocardiographic data in a patient with ruptured mitral chordae tendineae. Virtual repairs were designed by adding artificial sutures between the papillary muscles and the posterior leaflet where the native chordae were ruptured. Dynamic finite element simulations were performed to evaluate pre- and post-repair mitral valve function. Abnormal posterior leaflet prolapse and mitral regurgitation was clearly demonstrated in the mitral valve with ruptured chordae. Following virtual repair to reconstruct ruptured chordae, the severity of the posterior leaflet prolapse decreased and stress concentration was markedly reduced both in the leaflet tissue and the intact native chordae. Complete leaflet coaptation was restored when four or six sutures were utilized. Computational simulations provided quantitative information of functional improvement following mitral valve repair. This novel simulation strategy may provide a powerful tool for evaluation and prediction of interventional treatment for ruptured mitral chordae tendineae. PMID:24072489
Aura Satellite Mission: Oxford/RAL Spring School in Quantitative Earth Observation
NASA Technical Reports Server (NTRS)
Douglass, Anne
2005-01-01
The four instruments on Aura are providing new and exciting measurements of stratospheric and tropospheric ozone, species that contribute to ozone production and loss, and long-lived gases such as nitrous oxide and methane that provide information about atmospheric transport. These discussions of atmospheric chemistry will start with the basic principles of ozone production and loss. Aura data will be used where possible to illustrate the pertinent atmospheric processes. Three-dimensional model simulations will be used both to illustrate present capabilities in constituent modeling and to demonstrate how observations are used to evaluate and improve models and our ability to predict future ozone evolution.
NASA Astrophysics Data System (ADS)
Wang, Ye; He, Honghui; Chang, Jintao; He, Chao; Liu, Shaoxiong; Li, Migao; Zeng, Nan; Wu, Jian; Ma, Hui
2016-07-01
Today the increasing cancer incidence rate is becoming one of the biggest threats to human health. Among all types of cancers, liver cancer ranks in the top five in both frequency and mortality rate all over the world. During the development of liver cancer, fibrosis often evolves as part of a healing process in response to liver damage, resulting in cirrhosis of liver tissues. In a previous study, we applied the Mueller matrix microscope to pathological liver tissue samples and found that both the Mueller matrix polar decomposition (MMPD) and Mueller matrix transformation (MMT) parameters are closely related to the fibrous microstructures. In this paper, we take this one step further to quantitatively facilitate the fibrosis detections and scorings of pathological liver tissue samples in different stages from cirrhosis to cancer using the Mueller matrix microscope. The experimental results of MMPD and MMT parameters for the fibrotic liver tissue samples in different stages are measured and analyzed. We also conduct Monte Carlo simulations based on the sphere birefringence model to examine in detail the influence of structural changes in different fibrosis stages on the imaging parameters. Both the experimental and simulated results indicate that the polarized light microscope and transformed Mueller matrix parameters can provide additional quantitative information helpful for fibrosis detections and scorings of liver cirrhosis and cancers. Therefore, the polarized light microscope and transformed Mueller matrix parameters have a good application prospect in liver cancer diagnosis.
Chen, Bin; Zhao, Kai; Li, Bo; Cai, Wenchao; Wang, Xiaoying; Zhang, Jue; Fang, Jing
2015-10-01
To demonstrate the feasibility of the improved temporal resolution by using compressed sensing (CS) combined imaging sequence in dynamic contrast-enhanced MRI (DCE-MRI) of kidney, and investigate its quantitative effects on renal perfusion measurements. Ten rabbits were included in the accelerated scans with a CS-combined 3D pulse sequence. To evaluate the image quality, the signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were compared between the proposed CS strategy and the conventional full sampling method. Moreover, renal perfusion was estimated by using the separable compartmental model in both CS simulation and realistic CS acquisitions. The CS method showed DCE-MRI images with improved temporal resolution and acceptable image contrast, while presenting significantly higher SNR than the fully sampled images (p<.01) at 2-, 3- and 4-X acceleration. In quantitative measurements, renal perfusion results were in good agreement with the fully sampled one (concordance correlation coefficient=0.95, 0.91, 0.88) at 2-, 3- and 4-X acceleration in CS simulation. Moreover, in realistic acquisitions, the estimated perfusion by the separable compartmental model exhibited no significant differences (p>.05) between each CS-accelerated acquisition and the full sampling method. The CS-combined 3D sequence could improve the temporal resolution for DCE-MRI in kidney while yielding diagnostically acceptable image quality, and it could provide effective measurements of renal perfusion. Copyright © 2015 Elsevier Inc. All rights reserved.
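For background, reconstructing the undersampled k-space data that CS acceleration produces is conventionally posed as a sparsity-regularized least-squares problem; a generic form (the abstract does not specify the sequence's actual reconstruction pipeline) is

$$\hat{x}=\arg\min_{x}\;\tfrac{1}{2}\lVert F_{u}x-y\rVert_{2}^{2}+\lambda\lVert \Psi x\rVert_{1},$$

where y is the acquired k-space data, F_u the undersampled Fourier encoding operator, Ψ a sparsifying transform (e.g. wavelets, or temporal differences for a DCE series), and λ the regularization weight trading data fidelity against sparsity.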
A Study of Imaging Interferometer Simulators
NASA Technical Reports Server (NTRS)
Allen, Ronald J.
2002-01-01
Several new space science mission concepts under development at NASA-GSFC for astronomy are intended to carry out synthetic imaging using Michelson interferometers or direct (Fizeau) imaging with sparse apertures. Examples of these mission concepts include the Stellar Imager (SI), the Space Infrared Interferometric Telescope (SPIRIT), the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS), and the Fourier-Kelvin Stellar Interferometer (FKSI). We have been developing computer-based simulators for these missions. These simulators are aimed at providing a quantitative evaluation of the imaging capabilities of the mission by modelling the performance on different realistic targets in terms of sensitivity, angular resolution, and dynamic range. Both Fizeau and Michelson modes of operation can be considered. Our work is based on adapting a computer simulator called imSIM, which was initially written for the Space Interferometer Mission in order to simulate the imaging mode of new missions such as those listed. In a recent GSFC-funded study we have successfully written a preliminary version of a simulator SISIM for the Stellar Imager and carried out some preliminary studies with it. In a separately funded study we have also been applying these methods to SPECS/SPIRIT.
Comparing Macroscale and Microscale Simulations of Porous Battery Electrodes
Higa, Kenneth; Wu, Shao-Ling; Parkinson, Dilworth Y.; ...
2017-06-22
This article describes a vertically-integrated exploration of NMC electrode rate limitations, combining experiments with corresponding macroscale (macro-homogeneous) and microscale models. Parameters common to both models were obtained from experiments or based on published results. Positive electrode tortuosity was the sole fitting parameter used in the macroscale model, while the microscale model used no fitting parameters, instead relying on microstructural domains generated from X-ray microtomography of pristine electrode material held under compression while immersed in electrolyte solution (additionally providing novel observations of electrode wetting). Macroscale simulations showed that the capacity decrease observed at higher rates resulted primarily from solution-phase diffusion resistance. This ability to provide such qualitative insights at low computational costs is a strength of macroscale models, made possible by neglecting electrode spatial details. To explore the consequences of such simplification, the corresponding, computationally-expensive microscale model was constructed. This was found to have limitations preventing quantitatively accurate predictions, for reasons that are discussed in the hope of guiding future work. Nevertheless, the microscale simulation results complement those of the macroscale model by providing a reality-check based on microstructural information; in particular, this novel comparison of the two approaches suggests a reexamination of salt diffusivity measurements.
NASA Astrophysics Data System (ADS)
Brereton, Carol A.; Johnson, Matthew R.
2012-05-01
Fugitive pollutant sources from the oil and gas industry are typically quite difficult to find within industrial plants and refineries, yet they are a significant contributor of global greenhouse gas emissions. A novel approach for locating fugitive emission sources using computationally efficient trajectory statistical methods (TSM) has been investigated in detailed proof-of-concept simulations. Four TSMs were examined in a variety of source emissions scenarios developed using transient CFD simulations on the simplified geometry of an actual gas plant: potential source contribution function (PSCF), concentration weighted trajectory (CWT), residence time weighted concentration (RTWC), and quantitative transport bias analysis (QTBA). Quantitative comparisons were made using a correlation measure based on search area from the source(s). PSCF, CWT and RTWC could all distinguish areas near major sources from the surroundings. QTBA successfully located sources in only some cases, even when provided with a large data set. RTWC, given sufficient domain trajectory coverage, distinguished source areas best, but otherwise could produce false source predictions. Using RTWC in conjunction with CWT could overcome this issue as well as reduce sensitivity to noise in the data. The results demonstrate that TSMs are a promising approach for identifying fugitive emissions sources within complex facility geometries.
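Of the four TSMs compared, PSCF has the simplest form: each grid cell is scored by the fraction of its trajectory visits associated with high receptor concentrations. A minimal sketch follows; the 75th-percentile threshold and once-per-trajectory cell counting are common but assumed choices, and no low-count weighting is applied.

```python
import numpy as np

def pscf(traj_cells, conc, grid_shape, threshold=None):
    """Potential Source Contribution Function: PSCF(i,j) = m_ij / n_ij,
    where n_ij counts trajectories passing through cell (i,j) and m_ij
    counts those tied to receptor concentrations above the threshold.
    traj_cells: per-trajectory arrays of (i, j) grid cells visited;
    conc: receptor concentration associated with each trajectory."""
    conc = np.asarray(conc)
    if threshold is None:
        threshold = np.percentile(conc, 75)  # common ad hoc criterion
    n, m = np.zeros(grid_shape), np.zeros(grid_shape)
    for cells, c in zip(traj_cells, conc):
        for (i, j) in set(map(tuple, cells)):  # each cell once per trajectory
            n[i, j] += 1
            if c > threshold:
                m[i, j] += 1
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(n > 0, m / n, np.nan)
```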
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Damao; Wang, Zhien; Heymsfield, Andrew J.
Measurement of ice number concentration in clouds is important but still challenging. Stratiform mixed-phase clouds (SMCs) provide a simple scenario for retrieving ice number concentration from remote sensing measurements. The simple ice generation and growth pattern in SMCs offers opportunities to use cloud radar reflectivity (Ze) measurements and other cloud properties to infer ice number concentration quantitatively. To understand the strong temperature dependency of ice habit and growth rate quantitatively, we develop a 1-D ice growth model to calculate the ice diffusional growth along its falling trajectory in SMCs. The radar reflectivity and fall velocity profiles of ice crystals calculated from the 1-D ice growth model are evaluated with the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF) ground-based high vertical resolution radar measurements. Combining Ze measurements and 1-D ice growth model simulations, we develop a method to retrieve the ice number concentrations in SMCs at given cloud top temperature (CTT) and liquid water path (LWP). The retrieved ice concentrations in SMCs are evaluated with in situ measurements and with a three-dimensional cloud-resolving model simulation with a bin microphysical scheme. These comparisons show that the retrieved ice number concentrations are statistically within an uncertainty of a factor of 2.
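The quantitative leverage behind this retrieval can be seen from the Rayleigh-regime definition of radar reflectivity: if, at fixed CTT and LWP, all crystals at a given range share the size history predicted by the 1-D growth model (a deliberate simplification in this sketch), then

$$Z_{e}=\int_{0}^{\infty}N(D)\,D^{6}\,dD\;\approx\;N_{\mathrm{ice}}\,\langle D^{6}\rangle,$$

so Ze scales linearly with the ice number concentration N_ice, and the growth model supplies the proportionality factor ⟨D⁶⟩. Ice dielectric-factor and non-spherical-habit corrections are omitted here.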
Mehraei, Mani; Bashirov, Rza; Tüzmen, Şükrü
2016-10-01
Recent molecular studies provide important clues for the treatment of β-thalassemia, sickle-cell anaemia and other β-globin disorders, revealing that increased production of fetal hemoglobin, which is normally suppressed in adulthood, can ameliorate the severity of these diseases. In this paper, we present a novel approach to drug prediction for β-globin disorders. Our approach is centered upon quantitative modeling of interactions in the human fetal-to-adult hemoglobin switch network using hybrid functional Petri nets. In accordance with the reverse pharmacology approach, we pose a hypothesis regarding modulation of specific protein targets that induce γ-globin and consequently fetal hemoglobin. Comparison of simulation results for the proposed strategy with those obtained for existing drugs shows that our strategy is optimal, as it leads to the highest level of γ-globin induction and thereby has potential beneficial therapeutic effects on β-globin disorders. Simulation results enable verification of model coherence, demonstrating that the model is consistent with qPCR data available for known strategies and/or drugs.
Overview: early history of crop growth and photosynthesis modeling.
El-Sharkawy, Mabrouk A
2011-02-01
As in industrial and engineering systems, there is a need to quantitatively study and analyze the many constituents of complex natural biological systems and agro-ecosystems via research-based mechanistic modeling. This objective is normally addressed by developing mathematical descriptions of multilevel biological processes, giving biologists a means to quantitatively integrate experimental research findings and thereby reach a better understanding of whole systems and their interactions with surrounding environments. Aided by the computational capacity of the computer technology then available, pioneering cropping system simulations took place in the second half of the 20th century in several research groups across continents. This overview summarizes that initial pioneering effort to simulate plant growth and photosynthesis of crop canopies, focusing on the discovery of gaps in the scientific knowledge of the time. Examples are given of gaps where experimental research was needed to improve the validity and application of the constructed models, so that their benefit to mankind could be enhanced. Such research necessitates close collaboration between experimentalists and model builders under a multidisciplinary, inter-institutional approach. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Use of a Simulated MCAT to Predict Real MCAT Scores.
ERIC Educational Resources Information Center
Pohlman, Mary; And Others
1979-01-01
A simulated Medical College Admission Test (MCAT) was administered to 39 premedical students two weeks prior to the new MCAT. High correlations between simulated and active test scores were obtained in the biology, chemistry, physics, science problems, reading, and quantitative areas. (MH)
ERIC Educational Resources Information Center
Raymond, Chad; Usherwood, Simon
2013-01-01
Simulations are employed widely as teaching tools in political science, yet evidence of their pedagogical effectiveness, in comparison to other methods of instruction, is mixed. The assessment of learning outcomes is often a secondary concern in simulation design, and the qualitative and quantitative methods used to evaluate outcomes are…
Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements
NASA Astrophysics Data System (ADS)
Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.
2017-12-01
Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, that are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.
NASA Astrophysics Data System (ADS)
Cobden, L. J.
2017-12-01
Mineral physics provides the essential link between seismic observations of the Earth's interior, and laboratory (or computer-simulated) measurements of rock properties. In this presentation I will outline the procedure for quantitative conversion from thermochemical structure to seismic structure (and vice versa) using the latest datasets from seismology and mineralogy. I will show examples of how this method can allow us to infer major chemical and dynamic properties of the deep mantle. I will also indicate where uncertainties and limitations in the data require us to exercise caution, in order not to "over-interpret" seismic observations. Understanding and modelling these uncertainties serves as a useful guide for mineralogists to ascertain which mineral parameters are most useful in seismic interpretation, and enables seismologists to optimise their data assembly and inversions for quantitative interpretations.
Conducting multicenter research in healthcare simulation: Lessons learned from the INSPIRE network.
Cheng, Adam; Kessler, David; Mackinnon, Ralph; Chang, Todd P; Nadkarni, Vinay M; Hunt, Elizabeth A; Duval-Arnould, Jordan; Lin, Yiqun; Pusic, Martin; Auerbach, Marc
2017-01-01
Simulation-based research has grown substantially over the past two decades; however, relatively few published simulation studies are multicenter in nature. Multicenter research confers many distinct advantages over single-center studies, including larger sample sizes for more generalizable findings, sharing resources amongst collaborative sites, and promoting networking. Well-executed multicenter studies are more likely to improve provider performance and/or have a positive impact on patient outcomes. In this manuscript, we offer a step-by-step guide to conducting multicenter, simulation-based research based upon our collective experience with the International Network for Simulation-based Pediatric Innovation, Research and Education (INSPIRE). Like multicenter clinical research, simulation-based multicenter research can be divided into four distinct phases. Each phase has specific differences when applied to simulation research: (1) Planning phase, to define the research question, systematically review the literature, identify outcome measures, and conduct pilot studies to ensure feasibility and estimate power; (2) Project Development phase, when the primary investigator identifies collaborators, develops the protocol and research operations manual, prepares grant applications, obtains ethical approval and executes subsite contracts, registers the study in a clinical trial registry, forms a manuscript oversight committee, and conducts feasibility testing and data validation at each site; (3) Study Execution phase, involving recruitment and enrollment of subjects, clear communication and decision-making, quality assurance measures and data abstraction, validation, and analysis; and (4) Dissemination phase, where the research team shares results via conference presentations, publications, traditional media, and social media, and implements strategies for translating results to practice. With this manuscript, we provide a guide to conducting quantitative multicenter research with a focus on simulation-specific issues.
NASA Astrophysics Data System (ADS)
Shi, Guochao; Wang, Mingli; Zhu, Yanying; Shen, Lin; Wang, Yuhong; Ma, Wanli; Chen, Yuee; Li, Ruifeng
2018-04-01
In this work, we present an eco-friendly and low-cost method to fabricate a flexible and stable Au nanoparticles/graphene oxide/cicada wing (AuNPs/GO/CW) substrate. By controlling the ratio of reactants, an optimal SERS substrate with an average AuNPs size of 65 nm was obtained. The Raman enhancement factor for rhodamine 6G (R6G) was 1.08 × 10^6 and the limit of detection (LOD) was as low as 10^-8 M. After calibrating the Raman peak intensities of R6G, it could be quantitatively detected. To better understand the experimental results, 3D finite-difference time-domain simulations of the AuNPs/GO/CW-1 substrate (AuNPs diameter of 65 nm) were used to further investigate the SERS enhancement effect. More importantly, the AuNPs/GO/CW-1 substrates not only provide strong enhancement factors but are also stable and reproducible: the SERS intensity was reduced by only 25% after an aging time of 60 days, and the relative standard deviation was lower than 20%, revealing excellent uniformity and reproducibility. These positive findings can pave a new way to optimize the application of SERS substrates and provide additional SERS platforms for quantitative detection of traces of organic contaminants, which makes the approach very promising for the trace detection of biological molecules.
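For reference, the quoted Raman enhancement factor is conventionally computed as the SERS-to-reference intensity ratio normalized by the numbers of probed molecules (the abstract does not spell out its exact normalization):

$$\mathrm{EF}=\frac{I_{\mathrm{SERS}}/N_{\mathrm{SERS}}}{I_{\mathrm{ref}}/N_{\mathrm{ref}}},$$

where I_SERS and I_ref are the analyte peak intensities on the SERS substrate and in the normal Raman reference, and N_SERS and N_ref are the corresponding numbers of molecules contributing to each signal.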
EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.
ERIC Educational Resources Information Center
Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith
2002-01-01
Introduces the EnviroLand computer program, which features laboratory simulations of theoretical calculations for quantitative analysis, environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)
Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations
NASA Astrophysics Data System (ADS)
Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert
2017-01-01
How galaxies assemble is a central question in astronomy; a variety of potentially important effects are in play, including baryonic accretion from the intergalactic medium as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy's light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations and coupled these with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to “observe” the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. In this presentation, we will present the main results from our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z>2) systems.
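For reference, G and M20 have standard pixel-level definitions in the literature (Lotz et al. 2004); the sketch below uses a flux-weighted centroid rather than the centroid that minimizes the total second-order moment, and omits the segmentation-map step applied in real pipelines.

```python
import numpy as np

def gini(flux):
    """Gini coefficient of the (absolute) pixel flux distribution."""
    f = np.sort(np.abs(np.ravel(flux)))
    n = f.size
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * f) / (f.mean() * n * (n - 1))

def m20(flux, x, y):
    """M20 = log10 of the second-order moment of the brightest pixels
    holding 20% of the total flux, normalized by the total moment."""
    f, x, y = np.ravel(flux), np.ravel(x), np.ravel(y)
    xc, yc = np.average(x, weights=f), np.average(y, weights=f)  # flux centroid
    mi = f * ((x - xc) ** 2 + (y - yc) ** 2)     # per-pixel second-order moment
    order = np.argsort(f)[::-1]                  # brightest pixels first
    cum = np.cumsum(f[order])
    k = np.searchsorted(cum, 0.2 * f.sum()) + 1  # pixels carrying brightest 20%
    return np.log10(mi[order[:k]].sum() / mi.sum())
```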
Asynchronous adaptive time step in quantitative cellular automata modeling
Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan
2004-01-01
Background The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and upon it, how to solve the heavy time consumption issue in simulation. Results Based on a modified, language based cellular automata system we extended that allows ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time step in simulation that can considerably improve efficiency yet without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. Distributed and adaptive time step is a practical solution in cellular automata environment. PMID:15222901
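A minimal illustration of the asynchronous adaptive idea (not the paper's language-based cellular automata machinery): each cell integrates its own ODE with a locally adapted step, so quiescent cells take a few large steps to the synchronization horizon while active cells take many small ones. Step-doubling error control on explicit Euler is used purely for brevity.

```python
import numpy as np

def advance_cells(states, deriv, t_end, dt0=0.01, tol=1e-3):
    """Advance each cell's ODE state to t_end with its own adaptive step.
    deriv(k, y) returns dy/dt for cell k; accuracy is checked by comparing
    one full Euler step against two half steps (step doubling)."""
    for k in range(len(states)):
        t, y, dt = 0.0, states[k], dt0
        while t < t_end:
            dt = min(dt, t_end - t)
            full = y + dt * deriv(k, y)              # one full step
            half = y + dt / 2 * deriv(k, y)
            two = half + dt / 2 * deriv(k, half)     # two half steps
            err = abs(two - full)
            if err < tol:                            # accept; maybe grow step
                y, t = two, t + dt
                if err < tol / 4:
                    dt *= 2
            else:                                    # reject; shrink step
                dt /= 2
        states[k] = y
    return states

# Cells with different decay rates finish in different numbers of steps:
print(advance_cells(np.ones(5), lambda k, y: -(k + 1) * y, t_end=1.0))
```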
Knotts, Thomas A.
2017-01-01
Molecular simulation has the ability to predict various physical properties that are difficult to obtain experimentally. For example, we implement molecular simulation to predict the critical constants (i.e., critical temperature, critical density, critical pressure, and critical compressibility factor) for large n-alkanes that thermally decompose experimentally (as large as C48). Historically, molecular simulation has been viewed as a tool that is limited to providing qualitative insight. One key reason for this perceived weakness in molecular simulation is the difficulty to quantify the uncertainty in the results. This is because molecular simulations have many sources of uncertainty that propagate and are difficult to quantify. We investigate one of the most important sources of uncertainty, namely, the intermolecular force field parameters. Specifically, we quantify the uncertainty in the Lennard-Jones (LJ) 12-6 parameters for the CH4, CH3, and CH2 united-atom interaction sites. We then demonstrate how the uncertainties in the parameters lead to uncertainties in the saturated liquid density and critical constant values obtained from Gibbs Ensemble Monte Carlo simulation. Our results suggest that the uncertainties attributed to the LJ 12-6 parameters are small enough that quantitatively useful estimates of the saturated liquid density and the critical constants can be obtained from molecular simulation. PMID:28527455
Electron Thermal Transport due to Magnetic Diffusion in the MST RFP
NASA Astrophysics Data System (ADS)
Reusch, J. A.; Anderson, J. K.; den Hartog, D. J.; Forest, C. B.; Kasten, C. P.; Schnack, D. D.; Stephens, H. D.
2011-10-01
Comparison of measurements made in the MST RFP to the results from extensive nonlinear resistive MHD simulations has provided two key observations. First, trapped particles reduce electron thermal diffusion; inclusion of this effect is required for quantitative agreement of simulation with measurement. Second, the structure and evolution of long-wavelength temperature fluctuations measured in MST show remarkable qualitative similarity to fluctuations appearing in a finite-pressure simulation. These simulations were run at parameters matching those of 400 kA discharges in MST (S ~ 4 × 10^6). In a zero-β simulation, the measured χ_e is compared to the thermal diffusion due to parallel losses along diffusing magnetic field lines, χ_st = v_∥ D_mag. Agreement is found only if the reduction in χ_st due to trapped particles is taken into account. In a second simulation, the pressure field was evolved self-consistently assuming Ohmic heating and anisotropic thermal conduction. Fluctuations in the simulated temperature are very similar in character and time evolution to temperature fluctuations measured in MST. This includes m = 1, n = 6 fluctuations that flatten the temperature profile as well as m = 1, n = 5 fluctuations that generate hot island structures near the core shortly after sawtooth crashes. This work was supported by the US DOE and NSF.
Quantitative petri net model of gene regulated metabolic networks in the cell.
Chen, Ming; Hofestädt, Ralf
2011-01-01
A method to exploit hybrid Petri nets (HPN) for quantitatively modeling and simulating gene regulated metabolic networks is demonstrated. A global kinetic modeling strategy and Petri net modeling algorithm are applied to perform the bioprocess functioning and model analysis. With the model, the interrelations between pathway analysis and metabolic control mechanism are outlined. Diagrammatical results of the dynamics of metabolites are simulated and observed by implementing a HPN tool, Visual Object Net ++. An explanation of the observed behavior of the urea cycle is proposed to indicate possibilities for metabolic engineering and medical care. Finally, the perspective of Petri nets on modeling and simulation of metabolic networks is discussed.
Computer simulation of schlieren images of rotationally symmetric plasma systems: a simple method.
Noll, R; Haas, C R; Weikl, B; Herziger, G
1986-03-01
Schlieren techniques are commonly used methods for quantitative analysis of cylindrical or spherical index of refraction profiles. Many schlieren objects, however, are characterized by more complex geometries, so we have investigated the more general case of noncylindrical, rotationally symmetric distributions of index of refraction n(r,z). Assuming straight ray paths in the schlieren object we have calculated 2-D beam deviation profiles. It is shown that experimental schlieren images of the noncylindrical plasma generated by a plasma focus device can be simulated with these deviation profiles. The computer simulation allows a quantitative analysis of these schlieren images, which yields, for example, the plasma parameters, electron density, and electron density gradients.
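The straight-ray assumption mentioned above leads to the standard schlieren deflection integral; for a rotationally symmetric field n(r, z) probed along x, the transverse deflection components are, to first order,

$$\varepsilon_{y}\approx\int \frac{1}{n}\frac{\partial n}{\partial y}\,dx,\qquad \varepsilon_{z}\approx\int \frac{1}{n}\frac{\partial n}{\partial z}\,dx,$$

integrated along the undeviated ray path; evaluating these integrals over a bundle of rays yields the 2-D deviation profiles used to simulate the images. (The exact parameterization in the paper may differ.)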
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang Shuli; Yeh Chiatsung; Budd, William W.
2009-02-15
Sustainability indicators have been widely developed to monitor and assess sustainable development. They are expected to guide political decision-making through their capability to represent states and trends of development. However, using indicators to assess the sustainability of urban strategies and policies has limitations: they neither reflect the systemic interactions among those strategies and policies, nor provide normative indications of the direction in which they should be developed. This paper uses a semi-quantitative systematic modeling tool (Sensitivity Model Tools, SM) to analyze the role of urban development in Taiwan's sustainability. The results indicate that the natural environment in urban areas is one of the most critical components and that urban economic production plays a highly active role in affecting Taiwan's sustainable development. The semi-quantitative simulation model integrates sustainability indicators and urban development policy to provide decision-makers with information about the impacts of their decisions on urban development. The system approach adopted in this paper can be seen as a necessary, but not sufficient, condition for a sustainability assessment. The participatory process in which expert participants provide judgments on the relations between indicator variables is also discussed.
Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan
2008-09-01
In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system is demonstrated for qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function gene, and two reference genes of RRS were amplified simultaneously by multiplex PCR. The multiplex PCR products were then labeled with acridinium ester at the 5'-terminus through an amino modification and analyzed by the proposed CE-CL system. Reproducibilities of analysis times and peak heights for the CE-CL analysis were determined to be better than 0.91% and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and simulated samples. The quantitative results for RRS provided by this new method were confirmed by comparison with standard real-time quantitative PCR (RT-qPCR) using SYBR Green I dye, and the two methods showed good coherence. This approach demonstrates the possibility of accurate qualitative and quantitative detection of GM plants in a single run.
NASA Astrophysics Data System (ADS)
Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.
2004-08-01
The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of the individual and combined effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. To facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with using collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.
NASA Technical Reports Server (NTRS)
Olson, S. L.; Beeson, H.; Haas, J. P.
2003-01-01
The objective of this project is to modify the standard oxygen consumption (cone) calorimeter (described in ASTM E 1354 and NASA STD 6001 Test 2) to provide a reproducible bench-scale test environment that simulates the buoyant or ventilation flow that would be generated by or around a burning surface in a spacecraft or at an extraterrestrial gravity level. This apparatus will allow us to conduct normal-gravity experiments that accurately and quantitatively evaluate a material's flammability characteristics in the real-use environment of spacecraft or extraterrestrial gravitational acceleration. The Equivalent Low Stretch Apparatus (ELSA) uses an inverted cone geometry with the sample burning in a ceiling fire configuration. Prototype unit testing results are presented in this paper. Ignition delay times and regression rates for PMMA are presented over a range of radiant heat flux levels and equivalent stretch rates, which demonstrate the ability of ELSA to simulate key features of microgravity and extraterrestrial fire behavior.
A web-based rapid assessment tool for production publishing solutions
NASA Astrophysics Data System (ADS)
Sun, Tong
2010-02-01
Solution assessment is a critical first step in understanding and measuring the business process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually a very expensive and time-consuming task that requires substantial domain knowledge, collecting and understanding the specific customer operational context, defining validation scenarios, and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report of estimated performance metrics (e.g., throughput, turnaround time, resource utilization) and operational cost. By integrating a digital publishing workflow ontology and an activity-based costing model with a Petri-net-based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side by side within their desired operational contexts, and provides a low-cost and rapid assessment for organizations before committing to any purchase. The tool also benefits solution providers by shortening sales cycles, establishing trustworthy customer relationships, and supplementing professional assessment services with a proven quantitative simulation and estimation technology.
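A minimal sketch of the kind of Petri-net firing loop such a workflow simulation engine builds on, with an activity-based cost accrued per transition. The place and transition names, costs, and firing sequence are invented; a production engine would add timed/stochastic firing, resource pools, and throughput statistics.

```python
from collections import Counter

# Marking: tokens per place. Transitions fire when all input places hold
# tokens; firing moves tokens and accrues an activity-based cost.
places = Counter({"job_queue": 3, "press_idle": 1})
transitions = {
    "print_job": {"in": ["job_queue", "press_idle"], "out": ["press_busy"], "cost": 2.5},
    "finish_job": {"in": ["press_busy"], "out": ["done", "press_idle"], "cost": 0.5},
}

def fire(name, marking, total_cost):
    t = transitions[name]
    if all(marking[p] > 0 for p in t["in"]):   # transition enabled?
        for p in t["in"]:
            marking[p] -= 1
        for p in t["out"]:
            marking[p] += 1
        total_cost += t["cost"]
    return total_cost

cost = 0.0
for step in ["print_job", "finish_job", "print_job", "finish_job", "print_job"]:
    cost = fire(step, places, cost)
print(dict(places), "accumulated cost:", cost)
```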
Full cell simulation and the evaluation of the buffer system on air-cathode microbial fuel cell
NASA Astrophysics Data System (ADS)
Ou, Shiqi; Kashima, Hiroyuki; Aaron, Douglas S.; Regan, John M.; Mench, Matthew M.
2017-04-01
This paper presents a computational model of a single chamber, air-cathode MFC. The model considers losses due to mass transport, as well as biological and electrochemical reactions, in both the anode and cathode half-cells. Computational fluid dynamics and Monod-Nernst analysis are incorporated into the reactions for the anode biofilm and cathode Pt catalyst and biofilm. The integrated model provides a macro-perspective of the interrelation between the anode and cathode during power production, while incorporating microscale contributions of mass transport within the anode and cathode layers. Model considerations include the effects of pH (H+/OH- transport) and electric field-driven migration on concentration overpotential, effects of various buffers and various amounts of buffer on the pH in the whole reactor, and overall impacts on the power output of the MFC. The simulation results fit the experimental polarization and power density curves well. Further, this model provides insight regarding mass transport at varying current density regimes and quantitative delineation of overpotentials at the anode and cathode. Overall, this comprehensive simulation is designed to accurately predict MFC performance based on fundamental fluid and kinetic relations and guide optimization of the MFC system.
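As one hedged illustration of the Monod-Nernst rate law that such anode-biofilm models typically couple to transport: current density saturates in substrate concentration (Monod term) and switches on with the local anode overpotential (Nernst-type term). All parameter values below are placeholders, not those fitted in the paper.

```python
import numpy as np

F, R, T = 96485.0, 8.314, 303.0  # Faraday (C/mol), gas constant (J/mol/K), temperature (K)

def nernst_monod_current(S, eta, j_max=5.0, K_S=0.05):
    """Anode current density (A/m^2) from a Nernst-Monod rate law:
    Monod saturation in substrate concentration S (mol/L) times a
    Nernst-type sigmoid in the local anode overpotential eta (V)."""
    return j_max * (S / (K_S + S)) / (1.0 + np.exp(-F * eta / (R * T)))

print(nernst_monod_current(S=0.01, eta=0.1))  # ~0.82 A/m^2 with these toy values
```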
Skeletal assessment with finite element analysis: relevance, pitfalls and interpretation.
Campbell, Graeme Michael; Glüer, Claus-C
2017-07-01
Finite element models simulate the mechanical response of bone under load, enabling noninvasive assessment of strength. Models generated from quantitative computed tomography (QCT) incorporate the geometry and spatial distribution of bone mineral density (BMD) to simulate physiological and traumatic loads as well as orthopaedic implant behaviour. The present review discusses the current strengths and weaknesses of finite element models for application to skeletal biomechanics. In cadaver studies, finite element models provide better estimations of strength compared to BMD. Data from clinical studies are encouraging; however, the superiority of finite element models over BMD measures for fracture prediction has not been shown conclusively, and may be sex and site dependent. Therapeutic effects on bone strength are larger than for BMD; however, model validation has only been performed on untreated bone. High-resolution modalities and novel image processing methods may enhance the structural representation and predictive ability. Despite extensive use of finite element models to study orthopaedic implant stability, accurate simulation of the bone-implant interface and fracture progression remains a significant challenge. Skeletal finite element models provide noninvasive assessments of strength and implant stability. Improved structural representation and implant surface interaction may enable more accurate models of fragility in the future.
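A common ingredient of QCT-based finite element models is an empirical density-to-modulus power law of the form E = a·ρ^b, assigned element by element. The sketch below uses one published calibration purely for illustration; coefficients vary by anatomic site and study, and this is not necessarily the mapping used in the works reviewed.

```python
import numpy as np

def youngs_modulus_from_bmd(rho_ash, a=10500.0, b=2.29):
    """Map ash density (g/cm^3) to Young's modulus (MPa) via E = a * rho^b.
    The coefficients are one published bone calibration, used here only as
    an example; site- and study-specific values differ."""
    return a * np.power(rho_ash, b)

# Element-wise moduli for a voxel mesh derived from a QCT scan (toy values).
rho = np.array([0.15, 0.40, 0.90, 1.20])   # g/cm^3
print(youngs_modulus_from_bmd(rho))        # MPa, one value per element
```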
In Situ Quantification of [Re(CO)3]+ by Fluorescence Spectroscopy in Simulated Hanford Tank Waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Branch, Shirmir D.; French, Amanda D.; Lines, Amanda M.
A pretreatment protocol is presented that allows for the quantitative conversion and subsequent in situ spectroscopic analysis of [Re(CO)3]+ species in simulated Hanford tank waste. The protocol encompasses adding a simulated waste sample containing the non-emissive [Re(CO)3]+ species to a developer solution that enables the rapid, quantitative conversion of the non-emissive species to a luminescent species which can then be detected spectroscopically. The [Re(CO)3]+ species concentration in an alkaline, simulated Hanford tank waste supernatant can be quantified by the standard addition method. In a test case, the [Re(CO)3]+ species was measured to be at a concentration of 38.9 µM, which was a difference of 2.01% from the actual concentration of 39.7 µM.
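The standard addition method mentioned above reduces to a linear fit of signal versus spiked concentration, whose x-intercept magnitude estimates the original analyte concentration. A minimal sketch with synthetic numbers chosen to land near the reported value (not the paper's data):

```python
import numpy as np

# Hypothetical standard-addition data: luminescence vs added [Re(CO)3]+ (uM).
added = np.array([0.0, 10.0, 20.0, 30.0])    # spike concentrations
signal = np.array([0.42, 0.53, 0.64, 0.75])  # measured intensity (a.u.)

slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope   # |x-intercept| = original concentration
print(f"estimated concentration: {c_unknown:.1f} uM")  # ~38.2 uM here
```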
A flow-simulation model of the tidal Potomac River
Schaffranek, Raymond W.
1987-01-01
A one-dimensional model capable of simulating flow in a network of interconnected channels has been applied to the tidal Potomac River including its major tributaries and embayments between Washington, D.C., and Indian Head, Md. The model can be used to compute water-surface elevations and flow discharges at any of 66 predetermined locations or at any alternative river cross sections definable within the network of channels. In addition, the model can be used to provide tidal-interchange flow volumes and to evaluate tidal excursions and the flushing properties of the riverine system. Comparisons of model-computed results with measured water-surface elevations and discharges demonstrate the validity and accuracy of the model. Tidal-cycle flow volumes computed by the calibrated model have been verified to be within an accuracy of ±10 percent. Quantitative characteristics of the hydrodynamics of the tidal river are identified and discussed. The comprehensive flow data provided by the model can be used to better understand the geochemical, biological, and other processes affecting the river's water quality.
NASA Astrophysics Data System (ADS)
Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang
2018-02-01
Coherent modulation imaging, providing fast convergence and high resolution from a single diffraction pattern, is a promising technique to satisfy the urgent demands for on-line multiple-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the final calculated parameters has not been investigated yet. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of HPLF, a quantitative, statistics-based analysis considering five different error sources was carried out for the first time. We found that detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis indicate potential directions to further improve the final accuracy of parameter diagnostics, which is critically important for formal application in the daily routines of HPLF.
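A schematic of how two of the named error sources, detector background/read noise and quantization, can be injected into an ideal intensity profile for this kind of noise study; the magnitudes and the toy profile are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def detect(intensity, bits=12, read_noise=5.0, background=20.0):
    """Apply two error sources to an ideal signal: additive detector
    background/read noise, then quantization to a finite bit depth.
    All noise magnitudes are illustrative, not measured."""
    noisy = intensity + background + rng.normal(0.0, read_noise, intensity.shape)
    levels = 2 ** bits - 1
    scale = noisy.max() / levels
    return np.round(noisy / scale) * scale - background   # remove known offset

truth = 1000.0 * np.exp(-np.linspace(-2, 2, 256) ** 2)    # toy diffraction profile
for bits in (8, 12):
    err = np.abs(detect(truth, bits=bits) - truth).mean()
    print(bits, "bits -> mean abs error", round(err, 3))  # coarser bits, larger error
```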
Open Marketplace for Simulation Software on the Basis of a Web Platform
NASA Astrophysics Data System (ADS)
Kryukov, A. P.; Demichev, A. P.
2016-02-01
The focus in development of a new generation of middleware is shifting from global grid systems to building convenient and efficient web platforms for remote access to individual computing resources. A further line of their development, suggested in this work, is related not only to the quantitative increase in their number and to the expansion of the scientific, engineering, and manufacturing areas in which they are used, but also to improved technology for remote deployment of application software on the resources interacting with the web platforms. Currently, services for providers of application software in the context of science-oriented web platforms are not sufficiently developed. The new application-software marketplace web platforms proposed in this work should have all the features of existing web platforms for submission of jobs to remote resources, plus specific web services for interaction on market principles between providers and consumers of application packages. The suggested approach will be validated on the example of simulation applications in the field of nonlinear optics.
A Prestressed Cable Network Model of the Adherent Cell Cytoskeleton
Coughlin, Mark F.; Stamenović, Dimitrije
2003-01-01
A prestressed cable network is used to model the deformability of the adherent cell actin cytoskeleton. The overall and microstructural model geometries and cable mechanical properties were assigned values based on observations from living cells and mechanical measurements on isolated actin filaments, respectively. The models were deformed to mimic cell poking (CP), magnetic twisting cytometry (MTC) and magnetic bead microrheometry (MBM) measurements on living adherent cells. The models qualitatively and quantitatively captured the fibroblast cell response to the deformation imposed by CP while exhibiting only some qualitative features of the cell response to MTC and MBM. The model for CP revealed that the tensed peripheral actin filaments provide the key resistance to indentation. The actin filament tension that provides mechanical integrity to the network was estimated at ∼158 pN, and the nonlinear mechanical response during CP originates from filament kinematics. The MTC and MBM simulations revealed that the model is incomplete, however, these simulations show cable tension as a key determinant of the model response. PMID:12547813
Schroeder, Indra
2015-01-01
A main ingredient for the understanding of structure/function correlates of ion channels is the quantitative description of single-channel gating and conductance. However, a wealth of information provided by fast current fluctuations beyond the temporal resolution of the recording system is often ignored, even though it is close to the time window accessible to molecular dynamics simulations. This kind of current fluctuation poses a special technical challenge, because individual opening/closing or blocking/unblocking events cannot be resolved, and the resulting averaging over undetected events decreases the apparent single-channel current. Here, I briefly summarize the history of fast-current-fluctuation analysis and focus on the so-called "beta distributions." This tool exploits the characteristics of fluctuation-induced excess noise on current amplitude histograms to reconstruct the true single-channel current and kinetic parameters. A guideline for the analysis and recent applications demonstrate that construction of theoretical beta distributions by Markov-model simulations offers maximum flexibility compared to analytical solutions. PMID:26368656
Impeding 99Tc(IV) mobility in novel waste forms
Lee, Mal-Soon; Um, Wooyong; Wang, Guohui; Kruger, Albert A.; Lukens, Wayne W.; Rousseau, Roger; Glezakou, Vassiliki-Alexandra
2016-01-01
Technetium (99Tc) is an abundant, long-lived radioactive fission product whose mobility in the subsurface is largely governed by its oxidation state. Tc immobilization is crucial for radioactive waste management and environmental remediation. Tc(IV) incorporation in spinels has been proposed as a novel method to increase Tc retention in glass waste forms during vitrification. However, experiments under high-temperature and oxic conditions show reoxidation of Tc(IV) to volatile pertechnetate, Tc(VII). Here we examine this problem with ab initio molecular dynamics simulations and propose that, at elevated temperatures, doping with first-row transition metals can significantly enhance Tc retention in magnetite, in the order Co>Zn>Ni. Experiments with doped spinels at 700 °C provide quantitative confirmation of the theoretical predictions in the same order. This work highlights the power of modern, state-of-the-art simulations to provide essential insights and generate theory-inspired design criteria for complex materials at elevated temperatures. PMID:27357121
Evaluating the Efficacy of Wavelet Configurations on Turbulent-Flow Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Shaomeng; Gruchalla, Kenny; Potter, Kristin
2015-10-25
I/O is increasingly becoming a significant constraint for simulation codes and visualization tools on modern supercomputers. Data compression is an attractive workaround, and, in particular, wavelets provide a promising solution. However, wavelets can be applied in multiple configurations, and the variations in configuration impact accuracy, storage cost, and execution time. While the variation of these factors over wavelet configurations has been explored in image processing, it is not well understood for visualization and analysis of scientific data. To illuminate this issue, we evaluate multiple wavelet configurations on turbulent-flow data. Our approach is to repeat established analysis routines on uncompressed and lossy-compressed versions of a data set, and then quantitatively compare their outcomes. Our findings show that accuracy varies greatly based on wavelet configuration, while storage cost and execution time vary less. Overall, our study provides new insights for simulation analysts and visualization experts, who need to make tradeoffs between accuracy, storage cost, and execution time.
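A minimal sketch of the evaluate-by-reconstruction approach, assuming the PyWavelets (pywt) package is available: compress a signal by keeping only the largest coefficients under several wavelet configurations, then compare reconstruction error. The turbulence stand-in, retention fraction, and wavelet list are illustrative.

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

def compress_and_score(field, wavelet="bior4.4", level=3, keep=0.05):
    """Wavelet-compress a 1D signal by keeping the largest fraction `keep`
    of coefficients, then report RMS reconstruction error (a proxy for the
    accuracy side of the accuracy/storage tradeoff)."""
    coeffs = pywt.wavedec(field, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(flat), 1.0 - keep)
    flat[np.abs(flat) < thresh] = 0.0            # discard small coefficients
    rec = pywt.waverec(pywt.array_to_coeffs(flat, slices, output_format="wavedec"),
                       wavelet)[: field.size]
    return np.sqrt(np.mean((rec - field) ** 2))

x = np.cumsum(np.random.default_rng(1).normal(size=4096))  # toy turbulent trace
for w in ["haar", "db4", "bior4.4"]:
    print(w, compress_and_score(x, wavelet=w))   # accuracy varies by configuration
```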
ERIC Educational Resources Information Center
Monaghan, James M.; Clement, John
1999-01-01
Presents evidence for students' qualitative and quantitative difficulties with apparently simple one-dimensional relative-motion problems, students' spontaneous visualization of relative-motion problems, the visualizations facilitating solution of these problems, and students' memories of the online computer simulation used as a framework for…
Simulation Modeling of a Facility Layout in Operations Management Classes
ERIC Educational Resources Information Center
Yazici, Hulya Julie
2006-01-01
Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…
Simulation and Advanced Practice Nursing Education
ERIC Educational Resources Information Center
Blue, Dawn I.
2016-01-01
This quantitative study compared changes in level of confidence resulting from participation in simulation or traditional instructional methods for BSN (Bachelor of Science in Nursing) to DNP (Doctor of Nursing Practice) students in a nurse practitioner course when they entered the clinical practicum. Simulation has been used in many disciplines…
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi
2015-01-01
Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Bingjing; Zhao, Jianlin, E-mail: jlzhao@nwpu.edu.cn; Wang, Jun
2013-11-21
We present a method for visually and quantitatively investigating the heat dissipation process of plate-fin heat sinks by using digital holographic interferometry. A series of phase change maps reflecting the temperature distribution and variation trend of the air field surrounding the heat sink during the heat dissipation process are numerically reconstructed based on double-exposure holographic interferometry. According to the phase unwrapping algorithm and the derived relationship between temperature and phase change of the detection beam, the full-field temperature distributions are quantitatively obtained with reasonably high measurement accuracy. The impact of the heat sink's channel width on the heat dissipation performance in the case of natural convection is then analyzed. In addition, a comparison between simulation and experiment results is given to verify the reliability of this method. The experiment results certify the feasibility and validity of the presented method in full-field, dynamic, and quantitative measurement of the air field temperature distribution, which provides a basis for analyzing the heat dissipation performance of plate-fin heat sinks.
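The phase-to-temperature step rests on the interferometric relation Δφ = 2πLΔn/λ combined with a refractive-index/temperature slope for air. A hedged sketch (the path length, wavelength, and dn/dT values are generic assumptions, not the paper's calibration):

```python
import numpy as np

def temperature_from_phase(dphi, wavelength=632.8e-9, L=0.05,
                           T0=293.15, dn_dT=-9.5e-7):
    """Convert an unwrapped interferometric phase change (rad) to an air
    temperature, assuming a path length L (m) through the test section and
    a locally linear refractive-index/temperature slope dn_dT (1/K)."""
    dn = dphi * wavelength / (2.0 * np.pi * L)   # path-averaged index change
    return T0 + dn / dn_dT

wrapped = np.angle(np.exp(1j * np.linspace(0, 12, 200)))  # toy wrapped phase map row
dphi = np.unwrap(wrapped)                                 # phase unwrapping step
print(temperature_from_phase(dphi[-1]))                   # temperature at last pixel
```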
NASA Astrophysics Data System (ADS)
Franci, Luca; Landi, Simone; Verdini, Andrea; Matteini, Lorenzo; Hellinger, Petr
2018-01-01
Properties of the turbulent cascade from fluid to kinetic scales in collisionless plasmas are investigated by means of large-size 3D hybrid (fluid electrons, kinetic protons) particle-in-cell simulations. Initially isotropic Alfvénic fluctuations rapidly develop a strongly anisotropic turbulent cascade, mainly in the direction perpendicular to the ambient magnetic field. The omnidirectional magnetic field spectrum shows a double power-law behavior over almost two decades in wavenumber, with a Kolmogorov-like index at large scales, a spectral break around ion scales, and a steepening at sub-ion scales. Power laws are also observed in the spectra of the ion bulk velocity, density, and electric field, at both magnetohydrodynamic (MHD) and kinetic scales. Despite the complex structure, the omnidirectional spectra of all fields at ion and sub-ion scales are in remarkable quantitative agreement with those of a 2D simulation with similar physical parameters. This provides a partial, a posteriori validation of the 2D approximation at kinetic scales. Conversely, at MHD scales, the spectra of the density and of the velocity (and, consequently, of the electric field) exhibit differences between the 2D and 3D cases. Although they can be partly ascribed to the lower spatial resolution, the main reason is likely the larger importance of compressible effects in the full 3D geometry. Our findings are also in remarkable quantitative agreement with solar wind observations.
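A sketch of how an omnidirectional spectrum is typically extracted from a cubic simulation snapshot: shell-average the 3D power spectrum over wavenumber magnitude |k|. A random field stands in for the hybrid-PIC data here.

```python
import numpy as np

def omnidirectional_spectrum(field):
    """Shell-average the 3D power spectrum |F(k)|^2 over integer |k| bins,
    giving the omnidirectional spectrum E(k) used to locate spectral breaks."""
    n = field.shape[0]                       # assumes a cubic box
    fk = np.fft.fftn(field) / field.size
    power = np.abs(fk) ** 2
    k = np.fft.fftfreq(n) * n                # integer-valued wavenumbers
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2).astype(int).ravel()
    spec = np.bincount(kmag, weights=power.ravel())
    return np.arange(spec.size), spec

rng = np.random.default_rng(2)
bx = rng.normal(size=(32, 32, 32))  # stand-in for one magnetic-field component
k, Ek = omnidirectional_spectrum(bx)
print(Ek[1:6])
```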
NASA Astrophysics Data System (ADS)
Zhong, Fulin; Li, Ting; Pan, Boan; Wang, Pengbo
2017-02-01
Laser acupuncture is an effective photochemical and nonthermal stimulation of traditional acupuncture points with low-intensity laser irradiation, which has the advantages of being painless, sterile, and safe compared to traditional acupuncture. A laser diode (LD) provides single-wavelength and relatively high-power light for phototherapy. Quantitative knowledge of the effect of LD illumination parameters is crucial for the practical operation of laser acupuncture. However, this issue has not been fully investigated, especially since experimental methodologies with animals or humans can hardly address it. For example, in order to protect the viability of cells and tissue and obtain a better therapeutic effect, it is necessary to control the output power within the 5 mW-10 mW range, while the optimal power is still not clear. This study aimed to quantitatively optimize the laser output power, wavelength, and irradiation direction with highly realistic modeling of light transport in acupunctured tissue. A Monte Carlo simulation software for 3D voxelized media and the highest-precision human anatomical model, the Visible Chinese Human (VCH), were employed. Our 3D simulation results showed that the longer the wavelength and the higher the illumination power, the larger the absorption in laser acupuncture; vertical emission of the acupuncture laser results in higher light absorption in both the acupunctured voxel of tissue and the muscle layer. Our 3D light distribution of laser acupuncture within the VCH tissue model has the potential to be used for optimization and real-time guidance in clinical manipulation of laser acupuncture.
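For orientation, a deliberately crude version of the Monte Carlo photon transport underlying such simulations: exponential step lengths, partial weight absorption per step, and rescattering, in a homogeneous slab rather than a voxelized anatomy. The optical properties are generic soft-tissue guesses, not VCH values.

```python
import numpy as np

rng = np.random.default_rng(4)

def absorbed_fraction(mu_a=0.3, mu_s=10.0, n_photons=2000, depth=1.0):
    """Fraction of launched photon weight absorbed in a homogeneous slab
    (cm units): sample exponential free paths, deposit mu_a/mu_t of the
    weight at each interaction, rescatter isotropically (no anisotropy)."""
    mu_t = mu_a + mu_s
    absorbed = 0.0
    for _ in range(n_photons):
        z, w, cos_t = 0.0, 1.0, 1.0           # launch at surface, heading inward
        while True:
            z += cos_t * rng.exponential(1.0 / mu_t)
            if z < 0.0 or z > depth or w < 1e-4:
                break                         # escaped slab or negligible weight
            dep = w * mu_a / mu_t             # absorb part of the weight
            absorbed += dep
            w -= dep
            cos_t = 2.0 * rng.random() - 1.0  # isotropic rescattering
    return absorbed / n_photons

print(absorbed_fraction())
```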
Zhou, Y.; Ojeda-May, P.; Nagaraju, M.; Pu, J.
2016-01-01
Adenosine triphosphate (ATP)-binding cassette (ABC) transporters are ubiquitous ATP-dependent membrane proteins involved in translocations of a wide variety of substrates across cellular membranes. To understand the chemomechanical coupling mechanism as well as functional asymmetry in these systems, a quantitative description of how ABC transporters hydrolyze ATP is needed. Complementary to experimental approaches, computer simulations based on combined quantum mechanical and molecular mechanical (QM/MM) potentials have provided new insights into the catalytic mechanism in ABC transporters. Quantitatively reliable determination of the free energy requirement for enzymatic ATP hydrolysis, however, requires substantial statistical sampling on QM/MM potential. A case study shows that brute force sampling of ab initio QM/MM (AI/MM) potential energy surfaces is computationally impractical for enzyme simulations of ABC transporters. On the other hand, existing semiempirical QM/MM (SE/MM) methods, although affordable for free energy sampling, are unreliable for studying ATP hydrolysis. To close this gap, a multiscale QM/MM approach named reaction path–force matching (RP–FM) has been developed. In RP–FM, specific reaction parameters for a selected SE method are optimized against AI reference data along reaction paths by employing the force matching technique. The feasibility of the method is demonstrated for a proton transfer reaction in the gas phase and in solution. The RP–FM method may offer a general tool for simulating complex enzyme systems such as ABC transporters. PMID:27498639
NASA Astrophysics Data System (ADS)
Bautista, Nazan Uludag
2011-06-01
This study investigated the effectiveness of an Early Childhood Education science methods course that focused exclusively on providing various mastery (i.e., enactive, cognitive content, and cognitive pedagogical) and vicarious experiences (i.e., cognitive self-modeling, symbolic modeling, and simulated modeling) in increasing preservice elementary teachers' self-efficacy beliefs. Forty-four preservice elementary teachers participated in the study. Analysis of the quantitative (STEBI-b) and qualitative (informal surveys) data revealed that personal science teaching efficacy and science teaching outcome expectancy beliefs increased significantly over the semester. Enactive mastery, cognitive pedagogical mastery, symbolic modeling, and cognitive self-modeling were the major sources of self-efficacy. This list was followed by cognitive content mastery and simulated modeling. This study has implications for science teacher educators.
User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package
Shapiro, Jason
2018-05-29
MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.
NASA Astrophysics Data System (ADS)
Jiang, Zhijun; Prokhorenko, Sergei; Prosandeev, Sergey; Nahas, Y.; Wang, D.; Íñiguez, Jorge; Defay, E.; Bellaiche, L.
2017-07-01
Atomistic effective Hamiltonian simulations are used to investigate electrocaloric (EC) effects in the lead-free Ba(Zr0.5Ti0.5)O3 (BZT) relaxor ferroelectric. We find that the EC coefficient varies nonmonotonically with the field at any temperature, presenting a maximum that can be traced back to the behavior of BZT's polar nanoregions. We also introduce a simple Landau-based model that reproduces the EC behavior of BZT as a function of field and temperature, and which is directly applicable to other compounds. Finally, we confirm that, for low temperatures (i.e., in nonergodic conditions), the usual indirect approach to measure the EC response provides an estimate that differs quantitatively from a direct evaluation of the field-induced temperature change.
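The indirect approach referred to above conventionally integrates the Maxwell relation ΔT = -∫ (T/(ρc)) (∂P/∂T)_E dE over the applied field using measured P(T, E) data. A numeric sketch with a toy polarization surface and generic material constants (not BZT's):

```python
import numpy as np

def indirect_ec_deltaT(T, E, P, rho=6.0e3, cp=450.0):
    """Indirect electrocaloric estimate from P(T, E) data via the Maxwell
    relation dT = -(T / (rho * cp)) * (dP/dT)_E dE, integrated 0 -> E_max.
    P is a 2D array sampled on the (T, E) grids passed in (SI units)."""
    dPdT = np.gradient(P, T, axis=0)           # (dP/dT) at each field value
    integrand = -(T[:, None] / (rho * cp)) * dPdT
    return np.trapz(integrand, E, axis=1)      # Delta T(T) for the full field sweep

T = np.linspace(200, 400, 51)                  # K
E = np.linspace(0, 5e7, 101)                   # V/m
P = 0.2 * np.tanh((350 - T)[:, None] / 60) + 1e-9 * E  # toy polarization surface
print(indirect_ec_deltaT(T, E, P)[:3])         # Delta T at the first few temperatures
```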
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masada, Youhei; Sano, Takayoshi, E-mail: ymasada@harbor.kobe-u.ac.jp, E-mail: sano@ile.osaka-u.ac.jp
2014-10-10
The mechanism of large-scale dynamos in rigidly rotating stratified convection is explored by direct numerical simulations (DNS) in Cartesian geometry. A mean-field dynamo model is also constructed using turbulent velocity profiles consistently extracted from the corresponding DNS results. By quantitative comparison between the DNS and our mean-field model, it is demonstrated that the oscillatory α² dynamo wave, excited and sustained in the convection zone, is responsible for large-scale magnetic activities such as cyclic polarity reversal and spatiotemporal migration. The results provide strong evidence that a nonuniformity of the α-effect, which is a natural outcome of rotating stratified convection, can be an important prerequisite for large-scale stellar dynamos, even without the Ω-effect.
Generalizing the dynamic field theory of spatial cognition across real and developmental time scales
Simmering, Vanessa R.; Spencer, John P.; Schutte, Anne R.
2008-01-01
Within cognitive neuroscience, computational models are designed to provide insights into the organization of behavior while adhering to neural principles. These models should provide sufficient specificity to generate novel predictions while maintaining the generality needed to capture behavior across tasks and/or time scales. This paper presents one such model, the Dynamic Field Theory (DFT) of spatial cognition, showing new simulations that provide a demonstration proof that the theory generalizes across developmental changes in performance in four tasks—the Piagetian A-not-B task, a sandbox version of the A-not-B task, a canonical spatial recall task, and a position discrimination task. Model simulations demonstrate that the DFT can accomplish both specificity—generating novel, testable predictions—and generality—spanning multiple tasks across development with a relatively simple developmental hypothesis. Critically, the DFT achieves generality across tasks and time scales with no modification to its basic structure and with a strong commitment to neural principles. The only change necessary to capture development in the model was an increase in the precision of the tuning of receptive fields as well as an increase in the precision of local excitatory interactions among neurons in the model. These small quantitative changes were sufficient to move the model through a set of quantitative and qualitative behavioral changes that span the age range from 8 months to 6 years and into adulthood. We conclude by considering how the DFT is positioned in the literature, the challenges on the horizon for our framework, and how a dynamic field approach can yield new insights into development from a computational cognitive neuroscience perspective. PMID:17716632
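For readers unfamiliar with dynamic field models, a minimal one-dimensional Amari-style field equation of the kind the DFT builds on is sketched below; the real DFT couples several fields and adds the developmental parameter changes described above, and every number here is invented.

```python
import numpy as np

def simulate_dnf(steps=300, n=101, tau=10.0, h=-5.0):
    """Minimal 1-D dynamic neural field (Amari-style):
    tau * du/dt = -u + h + stimulus + conv(w, f(u)),
    with a local-excitation / lateral-inhibition kernel w and sigmoid f.
    All parameter values are invented for illustration."""
    x = np.arange(n, dtype=float)
    u = np.full(n, h)
    w = 8.0 * np.exp(-0.5 * ((x - x[n // 2]) / 3.0) ** 2) - 1.0  # interaction kernel
    stim = 12.0 * np.exp(-0.5 * ((x - 40.0) / 2.0) ** 2)         # spatial cue
    for t in range(steps):
        f = 1.0 / (1.0 + np.exp(-u))               # sigmoidal output
        conv = np.convolve(f, w, mode="same")      # lateral interactions
        inp = stim if t < 100 else 0.0             # cue removed at t = 100
        u += (-u + h + inp + conv) / tau
    return u

u = simulate_dnf()
# If a self-sustained peak survives cue offset, its location is the "memory".
print(int(u.argmax()), float(u.max()))
```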
NASA Astrophysics Data System (ADS)
Govin, A.; Capron, E.; Tzedakis, P. C.; Verheyden, S.; Ghaleb, B.; Hillaire-Marcel, C.; St-Onge, G.; Stoner, J. S.; Bassinot, F.; Bazin, L.; Blunier, T.; Combourieu-Nebout, N.; El Ouahabi, A.; Genty, D.; Gersonde, R.; Jimenez-Amat, P.; Landais, A.; Martrat, B.; Masson-Delmotte, V.; Parrenin, F.; Seidenkrantz, M.-S.; Veres, D.; Waelbroeck, C.; Zahn, R.
2015-12-01
The Last Interglacial (LIG) represents an invaluable case study to investigate the response of components of the Earth system to global warming. However, the scarcity of absolute age constraints in most archives leads to extensive use of various stratigraphic alignments to different reference chronologies. This feature sets limitations to the accuracy of the stratigraphic assignment of the climatic sequence of events across the globe during the LIG. Here, we review the strengths and limitations of the methods that are commonly used to date or develop chronologies in various climatic archives for the time span (∼140-100 ka) encompassing the penultimate deglaciation, the LIG and the glacial inception. Climatic hypotheses underlying record alignment strategies and the interpretation of tracers are explicitly described. Quantitative estimates of the associated absolute and relative age uncertainties are provided. Recommendations are subsequently formulated on how best to define absolute and relative chronologies. Future climato-stratigraphic alignments should provide (1) a clear statement of climate hypotheses involved, (2) a detailed understanding of environmental parameters controlling selected tracers and (3) a careful evaluation of the synchronicity of aligned paleoclimatic records. We underscore the need to (1) systematically report quantitative estimates of relative and absolute age uncertainties, (2) assess the coherence of chronologies when comparing different records, and (3) integrate these uncertainties in paleoclimatic interpretations and comparisons with climate simulations. Finally, we provide a sequence of major climatic events with associated age uncertainties for the period 140-105 ka, which should serve as a new benchmark to disentangle mechanisms of the Earth system's response to orbital forcing and evaluate transient climate simulations.
Out of Africa: the importance of rivers as human migration corridors
NASA Astrophysics Data System (ADS)
Ramirez, J. A.; Coulthard, T. J.; Rogerson, M.; Barton, N.; Bruecher, T.
2013-12-01
The route and timing of Homo sapiens exiting Africa remains uncertain. Corridors leading out of Africa through the Sahara, the Nile Valley, and the Red Sea coast have been proposed as migration routes for anatomically modern humans 80,000-130,000 years ago. During this time climate conditions in the Sahara were wetter than present day, and monsoon rainfall fed rivers that flowed across the desert landscape. The location and timing of these rivers may have supported human migration northward from central Africa to the Mediterranean coast, and onwards to Europe or Asia. Here, we use palaeoclimate rainfall and a hydrological model to spatially simulate and quantitatively test the existence of three major rivers crossing the Sahara from south to north during the time of human migration. We provide evidence that, given realistic underlying climatology, the well-known Sahabi and Kufrah rivers very likely flowed across modern day Libya and reached the coast. More unexpectedly, an additional river crossed the core of the Sahara through Algeria (Irharhar river) and flowed into the Chotts basin. The Irharhar river is unique because it links locations in central Africa experiencing monsoon climates with temperate coastal Mediterranean environments where food and resources were likely abundant. From an ecological perspective, this little-known corridor may prove to be the most parsimonious migration route. Support for the Irharhar as a viable migration corridor is provided by its geographic proximity to Middle Stone Age archaeological artefacts found in North Africa. Our novel approach provides the first quantitative analysis of the likelihood that rivers occurred during the critical period of human migration out of Africa. (Figure: simulated probability of surface water in North Africa during the last interglacial, with locations of Middle Stone Age tools and ornaments.)
NASA Astrophysics Data System (ADS)
Creech, Angus; Früh, Wolf-Gerrit; Maguire, A. Eoghan
2015-05-01
We present here a computational fluid dynamics (CFD) simulation of Lillgrund offshore wind farm, which is located in the Øresund Strait between Sweden and Denmark. The simulation combines a dynamic representation of wind turbines embedded within a large-eddy simulation CFD solver and uses hr-adaptive meshing to increase or decrease mesh resolution where required. This allows the resolution of both large-scale flow structures around the wind farm, and the local flow conditions at individual turbines; consequently, the response of each turbine to local conditions can be modelled, as well as the resulting evolution of the turbine wakes. This paper provides a detailed description of the turbine model which simulates the interaction between the wind, the turbine rotors, and the turbine generators by calculating the forces on the rotor, the body forces on the air, and instantaneous power output. This model was used to investigate a selection of key wind speeds and directions, investigating cases where a row of turbines would be fully aligned with the wind or at specific angles to the wind. Results shown here include presentations of the spin-up of turbines, the observation of eddies moving through the turbine array, meandering turbine wakes, and an extensive wind farm wake several kilometres in length. The key measurement available for cross-validation with operational wind farm data is the power output from the individual turbines, where the effect of unsteady turbine wakes on the performance of downstream turbines was a main point of interest. The results from the simulations were compared to the performance measurements from the real wind farm to provide a firm quantitative validation of this methodology. Having achieved good agreement between the model results and actual wind farm measurements, the potential of the methodology to provide a tool for further investigations of engineering and atmospheric science problems is outlined.
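A toy stand-in for the power side of such a turbine model: an actuator-disc relation driven by the local wind speed, capped at rated power. The radius and rating loosely follow the 2.3 MW turbines installed at Lillgrund, but the power coefficient is a placeholder rather than the manufacturer's curve, and the paper's embedded model computes rotor and body forces rather than this lumped relation.

```python
import numpy as np

def turbine_power(u_local, radius=46.5, rho=1.225, cp=0.45, rated=2.3e6):
    """Instantaneous power (W) from the local wind speed at a rotor, using
    an actuator-disc relation P = 0.5 * rho * A * Cp * U^3, capped at the
    rated power. Cp here is a generic placeholder value."""
    area = np.pi * radius ** 2
    return np.minimum(0.5 * rho * area * cp * u_local ** 3, rated)

for u in [5.0, 8.0, 11.0, 14.0]:          # m/s
    print(u, turbine_power(u) / 1e6, "MW")
```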
Mark A. Finney; Charles W. McHugh; Isaac Grenfell; Karin L. Riley
2010-01-01
Components of a quantitative risk assessment were produced by simulation of burn probabilities and fire behavior variation for 134 fire planning units (FPUs) across the continental U.S. The system uses fire growth simulation of ignitions modeled from relationships between large fire occurrence and the fire danger index Energy Release Component (ERC). Simulations of 10,...
Gaussian Accelerated Molecular Dynamics in NAMD.
Pang, Yui Tik; Miao, Yinglong; Wang, Yi; McCammon, J Andrew
2017-01-10
Gaussian accelerated molecular dynamics (GaMD) is a recently developed enhanced sampling technique that provides efficient free energy calculations of biomolecules. Like the previous accelerated molecular dynamics (aMD), GaMD allows for "unconstrained" enhanced sampling without the need to set predefined collective variables and so is useful for studying complex biomolecular conformational changes such as protein folding and ligand binding. Furthermore, because the boost potential is constructed using a harmonic function that follows Gaussian distribution in GaMD, cumulant expansion to the second order can be applied to recover the original free energy profiles of proteins and other large biomolecules, which solves a long-standing energetic reweighting problem of the previous aMD method. Taken together, GaMD offers major advantages for both unconstrained enhanced sampling and free energy calculations of large biomolecules. Here, we have implemented GaMD in the NAMD package on top of the existing aMD feature and validated it on three model systems: alanine dipeptide, the chignolin fast-folding protein, and the M3 muscarinic G protein-coupled receptor (GPCR). For alanine dipeptide, while conventional molecular dynamics (cMD) simulations performed for 30 ns are poorly converged, GaMD simulations of the same length yield free energy profiles that agree quantitatively with those of 1000 ns cMD simulation. Further GaMD simulations have captured folding of the chignolin and binding of the acetylcholine (ACh) endogenous agonist to the M3 muscarinic receptor. The reweighted free energy profiles are used to characterize the protein folding and ligand binding pathways quantitatively. GaMD implemented in the scalable NAMD is widely applicable to enhanced sampling and free energy calculations of large biomolecules.
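Schematically, the GaMD boost is ΔV = ½k(E - V)² whenever the potential V lies below the threshold E, and the free energy is recovered by reweighting with a second-order cumulant expansion of ln⟨exp(βΔV)⟩. A post-processing sketch of those two formulas (not NAMD's implementation; the boost samples are synthetic):

```python
import numpy as np

def gamd_boost(V, E, k):
    """GaMD boost potential: dV = 0.5 * k * (E - V)^2 when V < E, else 0."""
    return np.where(V < E, 0.5 * k * (E - V) ** 2, 0.0)

def reweight_cumulant2(dV_kcal, T=300.0):
    """Second-order cumulant expansion of ln<exp(beta*dV)>, used to recover
    the unbiased free energy from boost-potential statistics."""
    beta = 1.0 / (0.001987 * T)       # 1/(kB*T) with kB in kcal/mol/K
    return beta * dV_kcal.mean() + 0.5 * beta ** 2 * dV_kcal.var()

print(gamd_boost(np.array([-105.0, -98.0]), E=-100.0, k=0.1))  # boosted vs not

rng = np.random.default_rng(3)
dV = rng.normal(5.0, 1.5, 10000)      # near-Gaussian boost samples (kcal/mol)
print(reweight_cumulant2(dV))         # reweighting correction term
```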
NASA Astrophysics Data System (ADS)
Russo, David; Laufer, Asher; Shapira, Roi H.; Kurtzman, Daniel
2013-02-01
Detailed numerical simulations were used to analyze water flow and transport of nitrate, chloride, and a tracer solute in a 3-D, spatially heterogeneous, variably saturated soil, originating from a citrus orchard irrigated with treated sewage water (TSW) considering realistic features of the soil-water-plant-atmosphere system. Results of this study suggest that under long-term irrigation with TSW, because of nitrate uptake by the tree roots and nitrogen transformations, the vadose zone may provide more capacity for the attenuation of the nitrate load in the groundwater than for the chloride load in the groundwater. Results of the 3-D simulations were used to assess their counterparts based on a simplified, deterministic, 1-D vertical simulation and on limited soil monitoring. Results of the analyses suggest that the information that may be gained from a single sampling point (located close to the area active in water uptake by the tree roots) or from the results of the 1-D simulation is insufficient for a quantitative description of the response of the complicated, 3-D flow system. Both might considerably underestimate the movement and spreading of a pulse of a tracer solute and also the groundwater contamination hazard posed by nitrate and particularly by chloride moving through the vadose zone. This stems mainly from the rain that drove water through the flow system away from the rooted area and could not be represented by the 1-D model or by the single sampling point. It was shown, however, that an additional sampling point, located outside the area active in water uptake, may substantially improve the quantitative description of the response of the complicated, 3-D flow system.
Visualization and simulation techniques for surgical simulators using actual patient's data.
Radetzky, Arne; Nürnberger, Andreas
2002-11-01
Because of the increasing complexity of surgical interventions, research in surgical simulation has become more and more important over the last years. However, the simulation of tissue deformation is still a challenging problem, mainly due to the short response times that are required for real-time interaction. The demands on hardware and software are even greater if not only modeled human anatomy is used but the anatomy of actual patients. This is required if the surgical simulator is to be used as a training medium for expert surgeons rather than students. In this article, suitable visualization and simulation methods for surgical simulation utilizing actual patients' datasets are described. The advantages and disadvantages of direct and indirect volume rendering for visualization are discussed, and a neuro-fuzzy system is described that can be used for the simulation of interactive tissue deformations. The neuro-fuzzy system makes it possible to define the deformation behavior based on a linguistic description of the tissue characteristics or to learn the dynamics by using measured data of real tissue. Furthermore, a simulator for minimally invasive neurosurgical interventions is presented that utilizes the described visualization and simulation methods. The structure of the simulator is described in detail, and the results of a system evaluation by an experienced neurosurgeon, comprising a quantitative comparison between different methods of virtual endoscopy as well as a comparison between real brain images and virtual endoscopies, are given. The evaluation showed that the simulator provides higher realism of visualization and simulation than other currently available simulators.
NASA Astrophysics Data System (ADS)
Mehta, Shalin B.; Sheppard, Colin J. R.
2010-05-01
Various methods that use a large illumination aperture (i.e., partially coherent illumination) have been developed for making transparent (i.e., phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement; coherent illumination has traditionally been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully due to the lack of a general, physically insightful, and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate the five partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge-enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for analysis, calibration, and design of partially coherent imaging methods.
The role of simulation in mixed-methods research: a framework & application to patient safety.
Guise, Jeanne-Marie; Hansen, Matthew; Lambert, William; O'Brien, Kerth
2017-05-04
Research in patient safety is an important area of health services research and is a national priority. It is challenging to investigate rare occurrences, explore potential causes, and account for the complex, dynamic context of healthcare, yet all are required in patient safety research. Simulation technologies have become widely accepted as education and clinical tools, but have yet to become a standard tool for research. We developed a framework for research that integrates accepted patient safety models with mixed-methods research approaches, and we describe the performance of the framework in a working example of a large National Institutes of Health (NIH)-funded R01 investigation. This worked example of the framework in action identifies the strengths and limitations of the qualitative and quantitative research approaches commonly used in health services research. Each approach builds essential layers of knowledge. We describe how the use of simulation ties these layers of knowledge together and adds new and unique dimensions of knowledge. A mixed-methods research approach that includes simulation provides a broad, multi-dimensional approach to health services and patient safety research.
Chai, Chen; Wong, Yiik Diew; Wang, Xuesong
2017-07-01
This paper proposes a simulation-based approach to estimate the safety impact of driver cognitive failures and driving errors. Fuzzy Logic, which involves linguistic terms and uncertainty, is incorporated with a Cellular Automata model to simulate the decision-making process of right-turn filtering movement at signalized intersections. Simulation experiments are conducted to estimate the relationships of cognitive failures and driving errors with safety performance. Simulation results show that different types of cognitive failures have varied relationships with driving errors and safety performance. For right-turn filtering movement, cognitive failures are more likely to result in driving errors with a denser conflicting traffic stream. Moreover, different driving errors are found to have different safety impacts. The study provides a novel approach to linguistically assess cognitions and replicate the decision-making procedures of the individual driver. Compared to crash analysis, the proposed FCA model allows quantitative estimation of particular cognitive failures, and of the impact of cognitions on driving errors and safety performance.
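A minimal flavor of the Fuzzy Logic ingredient: triangular memberships over gap size and driver caution feed a two-rule accept/reject decision for the filtering turn. The breakpoints and rules below are invented, not the calibrated FCA model.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def accept_gap(gap_s, caution):
    """Toy fuzzy rule base for right-turn filtering: accept a gap in the
    conflicting stream when the gap is 'long' AND the driver is not too
    cautious; reject on 'short' gap OR high caution. Illustrative only."""
    long_gap = tri(gap_s, 3.0, 6.0, 12.0)
    short_gap = tri(gap_s, 0.0, 2.0, 4.0)
    cautious = tri(caution, 0.5, 1.0, 1.5)
    accept = min(long_gap, 1.0 - cautious)    # rule 1: long gap AND not cautious
    reject = max(short_gap, cautious * 0.5)   # rule 2: short gap OR caution
    return accept > reject

print(accept_gap(gap_s=5.0, caution=0.3), accept_gap(gap_s=2.5, caution=0.9))
```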
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga; Stephens, Philip; Iijima, Bryron A.
2013-01-01
Modeling and imaging the Earth's ionosphere as well as understanding its structures, inhomogeneities, and disturbances is a key part of NASA's Heliophysics Directorate science roadmap. This invention provides a design tool for scientific missions focused on the ionosphere. It is a scientifically important and technologically challenging task to assess the impact of a new observation system quantitatively on our capability of imaging and modeling the ionosphere. This question is often raised whenever a new satellite system is proposed, a new type of data is emerging, or a new modeling technique is developed. The proposed constellation would be part of a new observation system with more low-Earth orbiters tracking more radio occultation signals broadcast by Global Navigation Satellite System (GNSS) than those offered by the current GPS and COSMIC observation system. A simulation system was developed to fulfill this task. The system is composed of a suite of software that combines the Global Assimilative Ionospheric Model (GAIM) including first-principles and empirical ionospheric models, a multiple- dipole geomagnetic field model, data assimilation modules, observation simulator, visualization software, and orbit design, simulation, and optimization software.
Auberry, Kathy; Wills, Katherine; Shaver, Carrie
2017-01-01
Direct support professionals (DSPs) are increasingly active in medication administration for people with intellectual and developmental disabilities, thus supplementing nursing and family caretakers. Providing workplace training for DSPs is often the duty of nursing personnel. This article presents empirical data and design suggestions for including simulations, debriefing, and written reflective practice during in-service training for DSPs in order to improve DSPs' skills and confidence related to medication administration. Quantitative study results demonstrate that DSPs acknowledge that their skill level and confidence rose significantly after hands-on simulations. The skill-level effect was statistically significant for general medication management, −4.5 (p < 0.001), and for gastrointestinal medication management, −4.4 (p < 0.001). Qualitative findings show a deep desire by DSPs to not just be "pill poppers" but to understand the medical processes, causalities, and consequences of their medication administration. On the basis of our results, the authors make recommendations regarding how to combine DSP workplace simulations and debriefing with written reflective practice in DSP continuing education.
Representing Water Scarcity in Future Agricultural Assessments
NASA Technical Reports Server (NTRS)
Winter, Jonathan M.; Lopez, Jose R.; Ruane, Alexander C.; Young, Charles A.; Scanlon, Bridget R.; Rosenzweig, Cynthia
2017-01-01
Globally, irrigated agriculture is both essential for food production and the largest user of water. A major challenge for hydrologic and agricultural research communities is assessing the sustainability of irrigated croplands under climate variability and change. Simulations of irrigated croplands generally lack key interactions between water supply, water distribution, and agricultural water demand. In this article, we explore the critical interface between water resources and agriculture by motivating, developing, and illustrating the application of an integrated modeling framework to advance simulations of irrigated croplands. We motivate the framework by examining historical dynamics of irrigation water withdrawals in the United States and quantitatively reviewing previous modeling studies of irrigated croplands with a focus on representations of water supply, agricultural water demand, and impacts on crop yields when water demand exceeds water supply. We then describe the integrated modeling framework for simulating irrigated croplands, which links trends and scenarios with water supply, water allocation, and agricultural water demand. Finally, we provide examples of efforts that leverage the framework to improve simulations of irrigated croplands as well as identify opportunities for interventions that increase agricultural productivity, resiliency, and sustainability.
Boulet-Audet, Maxime; Buffeteau, Thierry; Boudreault, Simon; Daugey, Nicolas; Pézolet, Michel
2010-06-24
Due to its unmatched hardness and chemical inertness, diamond offers many advantages over other materials for extreme conditions and routine analysis by attenuated total reflection (ATR) infrared spectroscopy. Its low refractive index can offer up to a 6-fold absorbance increase compared to germanium. Unfortunately, it also results in spectral distortions for strong bands compared to transmission experiments. The aim of this paper is to present a methodological approach to determine quantitatively the degree of spectral distortion in ATR spectra. This approach requires the determination of the optical constants (refractive index and extinction coefficient) of the investigated sample. As a typical example, the optical constants of the fibroin protein of the silkworm Bombyx mori have been determined from polarized ATR spectra obtained using both diamond and germanium internal reflection elements. The positions found for the amide I band by germanium and diamond ATR are, respectively, 6 and 17 cm⁻¹ lower than the true value determined from the k(ν) spectrum, which is calculated to be 1659 cm⁻¹. To determine quantitatively the effect of relevant parameters such as the film thickness and the protein concentration, various spectral simulations have also been performed. Using a thinner film probed by light polarized in the plane of incidence, and diluting the protein sample, can help in obtaining ATR spectra that are closer to their transmittance counterparts. To extend this study to any system, the ATR distortion amplitude has been evaluated using spectral simulations performed for bands of various intensities and widths. From these simulations, a simple empirical relationship has been found to estimate the band shift from the experimental band height and width, which could be of practical use for ATR users. This paper shows that the determination of optical constants provides an efficient way to recover the true spectrum shape and band frequencies from distorted ATR spectra.
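The diamond-versus-germanium contrast above follows directly from the ATR penetration depth d_p = λ / (2π n₁ √(sin²θ − (n₂/n₁)²)): a lower-index crystal probes deeper and therefore distorts strong bands more. A quick check at the amide I wavenumber (the indices and incidence angle are typical textbook values, not the paper's exact configuration):

```python
import numpy as np

def penetration_depth(wavenumber_cm, n_crystal, n_sample=1.5, theta_deg=45.0):
    """Evanescent-wave penetration depth (um) for ATR:
    d_p = lambda / (2*pi*n1*sqrt(sin^2(theta) - (n2/n1)^2))."""
    lam_cm = 1.0 / wavenumber_cm
    theta = np.radians(theta_deg)
    d_cm = lam_cm / (2 * np.pi * n_crystal *
                     np.sqrt(np.sin(theta) ** 2 - (n_sample / n_crystal) ** 2))
    return d_cm * 1e4  # convert cm -> um

for name, n1 in [("diamond", 2.4), ("germanium", 4.0)]:
    print(name, penetration_depth(1659.0, n1), "um")  # diamond probes ~3x deeper
```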
Soranno, Andrea; Holla, Andrea; Dingfelder, Fabian; Nettels, Daniel; Makarov, Dmitrii E.; Schuler, Benjamin
2017-01-01
Internal friction is an important contribution to protein dynamics at all stages along the folding reaction. Even in unfolded and intrinsically disordered proteins, internal friction has a large influence, as demonstrated with several experimental techniques and in simulations. However, these methods probe different facets of internal friction and have been applied to disparate molecular systems, raising questions regarding the compatibility of the results. To obtain an integrated view, we apply here the combination of two complementary experimental techniques, simulations, and theory to the same system: unfolded protein L. We use single-molecule Förster resonance energy transfer (FRET) to measure the global reconfiguration dynamics of the chain, and photoinduced electron transfer (PET), a contact-based method, to quantify the rate of loop formation between two residues. This combination enables us to probe unfolded-state dynamics on different length scales, corresponding to different parts of the intramolecular distance distribution. Both FRET and PET measurements show that internal friction dominates unfolded-state dynamics at low denaturant concentration, and the results are in remarkable agreement with recent large-scale molecular dynamics simulations using a new water model. The simulations indicate that intrachain interactions and dihedral angle rotation correlate with the presence of internal friction, and theoretical models of polymer dynamics provide a framework for interrelating the contribution of internal friction observed in the two types of experiments and in the simulations. The combined results thus provide a coherent and quantitative picture of internal friction in unfolded proteins that could not be attained from the individual techniques. PMID:28223518
Soranno, Andrea; Holla, Andrea; Dingfelder, Fabian; Nettels, Daniel; Makarov, Dmitrii E; Schuler, Benjamin
2017-03-07
Internal friction is an important contribution to protein dynamics at all stages along the folding reaction. Even in unfolded and intrinsically disordered proteins, internal friction has a large influence, as demonstrated with several experimental techniques and in simulations. However, these methods probe different facets of internal friction and have been applied to disparate molecular systems, raising questions regarding the compatibility of the results. To obtain an integrated view, we apply here the combination of two complementary experimental techniques, simulations, and theory to the same system: unfolded protein L. We use single-molecule Förster resonance energy transfer (FRET) to measure the global reconfiguration dynamics of the chain, and photoinduced electron transfer (PET), a contact-based method, to quantify the rate of loop formation between two residues. This combination enables us to probe unfolded-state dynamics on different length scales, corresponding to different parts of the intramolecular distance distribution. Both FRET and PET measurements show that internal friction dominates unfolded-state dynamics at low denaturant concentration, and the results are in remarkable agreement with recent large-scale molecular dynamics simulations using a new water model. The simulations indicate that intrachain interactions and dihedral angle rotation correlate with the presence of internal friction, and theoretical models of polymer dynamics provide a framework for interrelating the contribution of internal friction observed in the two types of experiments and in the simulations. The combined results thus provide a coherent and quantitative picture of internal friction in unfolded proteins that could not be attained from the individual techniques.
Modelling runoff on ceramic tile roofs using the kinematic wave equations
NASA Astrophysics Data System (ADS)
Silveira, Alexandre; Abrantes, João; de Lima, João; Lira, Lincoln
2016-04-01
Rainwater harvesting is an alternative water saving strategy that presents many advantages and can provide solutions to major water resources problems, such as fresh water scarcity, urban stream degradation and flooding. In recent years, these problems have become global challenges, due to climatic change, population growth and increasing urbanisation. Generally, roofs are the first surfaces to come into contact with rainwater; thus, they are the best candidates for rainwater harvesting. In this context, the correct evaluation of roof runoff quantity and quality is essential to effectively design rainwater harvesting systems. Despite this, many studies focus on the qualitative aspects to the detriment of the quantitative ones. Laboratory studies using rainfall simulators have been widely used to investigate rainfall-runoff processes. These studies enable a detailed exploration and systematic replication of a large range of hydrologic conditions, such as rainfall spatial and temporal characteristics, providing a fast way to obtain precise and consistent data that can be used to calibrate and validate numerical models. This study aims to evaluate the performance of a kinematic wave based numerical model in simulating runoff on sloping roofs, by comparing the numerical results with those obtained from laboratory rainfall simulations on a real-scale ceramic tile roof (Lusa tiles). For all studied slopes, simulated discharge hydrographs showed a good fit to the observed ones. Coefficient of determination and Nash-Sutcliffe efficiency values were close to 1.0. In particular, peak discharges, times to peak and peak durations were very well simulated.
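To make the modelling approach concrete, the sketch below integrates the one-dimensional kinematic wave equation for overland flow on a sloping plane with an explicit upwind scheme. All parameter values (slope, Manning roughness, rainfall intensity, roof length) are illustrative assumptions, not the study's calibrated values.

```python
import numpy as np

# Explicit upwind solver for the kinematic wave equation
#   dh/dt + d(alpha * h**m)/dx = r,  with q = alpha * h**m (Manning: m = 5/3)
L = 2.0          # roof length along the slope (m), assumed
S0 = 0.30        # roof slope (-), assumed
n = 0.012        # Manning roughness for ceramic tile, assumed
alpha = np.sqrt(S0) / n
m = 5.0 / 3.0
r = 60 / 3.6e6   # rainfall intensity: 60 mm/h converted to m/s

nx, dt, t_end = 100, 0.005, 120.0
dx = L / nx
h = np.zeros(nx)                                      # water depth (m)
for step in range(int(t_end / dt)):
    q = alpha * h**m                                  # unit discharge (m^2/s)
    dqdx = np.diff(np.concatenate(([0.0], q))) / dx   # upwind difference
    h = np.maximum(h + dt * (r - dqdx), 0.0)
print(f"outlet discharge at t={t_end:.0f}s: {alpha * h[-1]**m:.2e} m^2/s")
```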
Model-assisted development of a laminography inspection system
NASA Astrophysics Data System (ADS)
Grandin, R.; Gray, J.
2012-05-01
Traditional computed tomography (CT) is an effective method of determining the internal structure of an object through non-destructive means; however, inspection of certain objects, such as those with planar geometries or with limited access, requires an alternate approach. One alternative is laminography, which has been the focus of a number of researchers in the past decade for both medical and industrial inspections. Many research efforts rely on geometrically simple analytical models, such as the Shepp-Logan phantom, for the development of their algorithms. Recent work at the Center for Non-Destructive Evaluation makes extensive use of a forward model, XRSIM, to study artifacts arising from the reconstruction method, the effects of complex geometries, and known issues such as high-density features on the laminography reconstruction process. The use of a model provides full knowledge of all aspects of the geometry and provides a means to quantitatively evaluate the impact of methods designed to reduce artifacts that are generated by the reconstruction methods or that are a result of the part geometry. We will illustrate the use of forward simulations to quantitatively assess reconstruction algorithm development and artifact reduction.
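Because a forward model supplies exact ground truth, artifact severity can be scored directly. The sketch below shows one such score, a normalized RMSE against a simulated phantom; the metric and the toy phantom are illustrative assumptions, not the XRSIM workflow itself.

```python
import numpy as np

def artifact_metric(recon, truth):
    """With a forward model the ground truth is fully known, so artifact
    severity can be scored directly, e.g. as normalized RMSE between a
    laminographic reconstruction and the simulated phantom."""
    recon, truth = np.asarray(recon, float), np.asarray(truth, float)
    return np.sqrt(np.mean((recon - truth) ** 2)) / np.ptp(truth)

truth = np.zeros((64, 64)); truth[24:40, 24:40] = 1.0     # simple phantom
recon = truth + 0.05 * np.random.default_rng(5).standard_normal((64, 64))
print(f"normalized RMSE: {artifact_metric(recon, truth):.3f}")
```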
Slade, Jeffrey W.; Adams, Jean V.; Christie, Gavin C.; Cuddy, Douglas W.; Fodale, Michael F.; Heinrich, John W.; Quinlan, Henry R.; Weise, Jerry G.; Weisser, John W.; Young, Robert J.
2003-01-01
Before 1995, Great Lakes streams were selected for lampricide treatment based primarily on qualitative measures of the relative abundance of larval sea lampreys, Petromyzon marinus. New integrated pest management approaches required standardized quantitative measures of sea lamprey. This paper evaluates historical larval assessment techniques and data and describes how new standardized methods for estimating abundance of larval and metamorphosed sea lampreys were developed and implemented. These new methods have been used since 1995 to estimate larval and metamorphosed sea lamprey abundance in about 100 Great Lakes streams annually and to rank them for lampricide treatment. Implementation of these methods has provided a quantitative means of selecting streams for treatment based on treatment cost and estimated production of metamorphosed sea lampreys, and has provided managers with a tool to estimate potential recruitment of sea lampreys to the Great Lakes and the ability to measure the potential consequences of not treating streams, resulting in a more justifiable allocation of resources. The empirical data produced can also be used to simulate the impacts of various control scenarios.
Planner-Based Control of Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott
2005-01-01
The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This will allow for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but more complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.
Quantitative assessment of image motion blur in diffraction images of moving biological cells
NASA Astrophysics Data System (ADS)
Wang, He; Jin, Changrong; Feng, Yuanming; Qi, Dandan; Sa, Yu; Hu, Xin-Hua
2016-02-01
Motion blur (MB) presents a significant challenge for obtaining high-contrast image data from biological cells with a polarization diffraction imaging flow cytometry (p-DIFC) method. A new p-DIFC experimental system has been developed to evaluate MB and its effect on image analysis using a time-delay-integration (TDI) CCD camera. Diffraction images of MCF-7 and K562 cells have been acquired with different speed-mismatch ratios and compared to characterize MB quantitatively. Frequency analysis of the diffraction images shows that the degree of MB can be quantified by bandwidth variations of the diffraction images along the motion direction. The analytical results were confirmed by the p-DIFC image data acquired at different speed-mismatch ratios and used to validate a method of numerical simulation of MB on blur-free diffraction images, which provides a useful tool to examine the blurring effect on diffraction images acquired from the same cell. These results provide insights into the dependence of diffraction images on MB and allow significant improvement in rapid biological cell assays with the p-DIFC method.
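The bandwidth criterion can be illustrated with a short sketch: blur acts as a low-pass filter along the motion direction, so the frequency content of a blurred image collapses toward low frequencies. The 95% power criterion and the synthetic images below are assumptions for illustration, not the paper's exact definition.

```python
import numpy as np

def motion_bandwidth(img, axis=0, frac=0.95):
    """Estimate the spatial-frequency bandwidth of an image along one axis.
    Motion blur low-passes the image along the motion direction, so a
    narrower bandwidth indicates stronger blur. 'frac' is the fraction of
    total spectral power used to define the bandwidth (assumed criterion)."""
    spec = np.abs(np.fft.rfft(img - img.mean(), axis=axis)) ** 2
    profile = spec.sum(axis=1 - axis)            # 1-D power spectrum
    cum = np.cumsum(profile) / profile.sum()
    return np.searchsorted(cum, frac)            # bandwidth in frequency bins

# Compare a sharp random image with a version blurred along axis 0
rng = np.random.default_rng(0)
sharp = rng.random((128, 128))
blurred = np.copy(sharp)
for k in range(1, 6):                            # crude 6-pixel motion blur
    blurred += np.roll(sharp, k, axis=0)
blurred /= 6.0
print(motion_bandwidth(sharp), motion_bandwidth(blurred))  # blurred is smaller
```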
Experimental study and simulation of space charge stimulated discharge
NASA Astrophysics Data System (ADS)
Noskov, M. D.; Malinovski, A. S.; Cooke, C. M.; Wright, K. A.; Schwab, A. J.
2002-11-01
The electrical discharge of volume distributed space charge in poly(methylmethacrylate) (PMMA) has been investigated both experimentally and by computer simulation. The experimental space charge was implanted in dielectric samples by exposure to a monoenergetic electron beam of 3 MeV. Electrical breakdown through the implanted space charge region within the sample was initiated by a local electric field enhancement applied to the sample surface. A stochastic-deterministic dynamic model for electrical discharge was developed and used in a computer simulation of these breakdowns. The model employs stochastic rules to describe the physical growth of the discharge channels, and deterministic laws to describe the electric field, the charge, and energy dynamics within the discharge channels and the dielectric. Simulated spatial-temporal and current characteristics of the expanding discharge structure during physical growth are quantitatively compared with the experimental data to confirm the discharge model. It was found that a single fixed set of physically based dielectric parameter values was adequate to simulate the complete family of experimental space charge discharges in PMMA. It is proposed that such a set of parameters also provides a useful means to quantify the breakdown properties of other dielectrics.
Quantitative modeling of soil genesis processes
NASA Technical Reports Server (NTRS)
Levine, E. R.; Knox, R. G.; Kerber, A. G.
1992-01-01
For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.
Stochastic flux analysis of chemical reaction networks
2013-01-01
Background Chemical reaction networks provide an abstraction scheme for a broad range of models in biology and ecology. The two common means for simulating these networks are the deterministic and the stochastic approaches. The traditional deterministic approach, based on differential equations, enjoys a rich set of analysis techniques, including a treatment of reaction fluxes. However, the discrete stochastic simulations, which provide advantages in some cases, lack a quantitative treatment of network fluxes. Results We describe a method for flux analysis of chemical reaction networks, where flux is given by the flow of species between reactions in stochastic simulations of the network. Extending discrete event simulation algorithms, our method constructs several data structures, and thereby reveals a variety of statistics about resource creation and consumption during the simulation. We use these structures to quantify the causal interdependence and relative importance of the reactions at arbitrary time intervals with respect to the network fluxes. This allows us to construct reduced networks that have the same flux-behavior, and compare these networks, also with respect to their time series. We demonstrate our approach on an extended example based on a published ODE model of the same network, that is, Rho GTP-binding proteins, and on other models from biology and ecology. Conclusions We provide a fully stochastic treatment of flux analysis. As in deterministic analysis, our method delivers the network behavior in terms of species transformations. Moreover, our stochastic analysis can be applied, not only at steady state, but at arbitrary time intervals, and used to identify the flow of specific species between specific reactions. Our case study of Rho GTP-binding proteins reveals the role played by the cyclic reverse fluxes in tuning the behavior of this network. PMID:24314153
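A minimal illustration of flux bookkeeping in a stochastic simulation: each molecule carries the index of the reaction that produced it, so every consumption event contributes to a producer-to-consumer flux count. The toy three-reaction network and rate constants below are invented for the sketch and are unrelated to the Rho GTP-binding protein model.

```python
import random
from collections import deque, Counter

# Toy network: R0: A -> B,  R1: B -> C,  R2: B -> A.
# Each molecule stores the index of the reaction that created it (-1 for the
# initial pool), so consumption events yield reaction-to-reaction fluxes.
reactions = [
    {"rate": 1.0, "consume": ["A"], "produce": ["B"]},  # R0
    {"rate": 0.5, "consume": ["B"], "produce": ["C"]},  # R1
    {"rate": 0.3, "consume": ["B"], "produce": ["A"]},  # R2
]
pool = {"A": deque([-1] * 100), "B": deque(), "C": deque()}
flux = Counter()          # (producer, consumer) -> molecule count
t, t_end = 0.0, 50.0
while t < t_end:
    props = [r["rate"] * min(len(pool[s]) for s in r["consume"]) for r in reactions]
    total = sum(props)
    if total == 0.0:
        break
    t += random.expovariate(total)                       # Gillespie time step
    j = random.choices(range(len(reactions)), weights=props)[0]
    for s in reactions[j]["consume"]:
        flux[(pool[s].popleft(), j)] += 1                # producer -> consumer
    for s in reactions[j]["produce"]:
        pool[s].append(j)
print(dict(flux))   # e.g. fluxes (-1,0), (0,1), (0,2), (2,0)
```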
Damage Evaluation of Concrete Column under Impact Load Using a Piezoelectric-Based EMI Technique.
Fan, Shuli; Zhao, Shaoyu; Qi, Baoxin; Kong, Qingzhao
2018-05-17
One of the major causes of damage to column-supported concrete structures, such as bridges and highways, is collision by moving vehicles, such as cars and ships. It is essential to quantify the collision damage to the column so that appropriate actions can be taken to prevent catastrophic events. A widely used method to assess structural damage is through the root-mean-square deviation (RMSD) damage index established from the collected data; however, the RMSD index does not truly provide quantitative information about the structure. Conversely, the damage volume ratio, which can only be obtained via simulation, provides better detail about the level of damage in a structure. Furthermore, as simulation can also provide the RMSD index corresponding to a particular damage volume ratio, the empirically obtained RMSD index can be related to the structural damage degree through comparison with the numerically obtained RMSD. Thus, this paper presents a novel method in which the impact-induced damage to a structure is simulated in order to obtain the relationship between the damage volume ratio and the RMSD index, and the relationship can be used to predict the true damage degree by comparison with the empirical RMSD index. In this paper, the collision damage of a bridge column by moving vehicles was simulated by using a concrete beam model subjected to continuous impact loadings from a free-falling steel ball. The variation in admittance signals measured by the surface-attached lead zirconate titanate (PZT) patches was used to establish the RMSD index. The results demonstrate that the RMSD index and the damage ratio of concrete have a linear relationship for the particular simulation model.
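The sketch below computes the standard EMI-style RMSD index from healthy and damaged admittance signatures and then inverts a linear RMSD-versus-damage fit of the kind the paper reports; the slope and intercept shown are hypothetical placeholders, not the paper's fitted values.

```python
import numpy as np

def rmsd_index(healthy, damaged):
    """Root-mean-square deviation damage index between two admittance
    signatures (conductance vs. frequency), as commonly defined in EMI-based
    monitoring; the paper's exact normalization may differ."""
    healthy, damaged = np.asarray(healthy), np.asarray(damaged)
    return 100.0 * np.sqrt(np.sum((damaged - healthy) ** 2) / np.sum(healthy ** 2))

# With a simulated linear RMSD-vs-damage-ratio relationship, an empirical
# RMSD maps to a damage volume ratio by inverting the fitted line:
slope, intercept = 0.85, 2.0        # hypothetical fit from simulation
rmsd_measured = 10.5                # hypothetical field measurement (%)
damage_ratio = (rmsd_measured - intercept) / slope
print(f"estimated damage volume ratio: {damage_ratio:.1f}%")
```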
Updates to Multi-Dimensional Flux Reconstruction for Hypersonic Simulations on Tetrahedral Grids
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2010-01-01
The quality of simulated hypersonic stagnation region heating with tetrahedral meshes is investigated by using an updated three-dimensional, upwind reconstruction algorithm for the inviscid flux vector. An earlier implementation of this algorithm provided improved symmetry characteristics on tetrahedral grids compared to conventional reconstruction methods. The original formulation however displayed quantitative differences in heating and shear that were as large as 25% compared to a benchmark, structured-grid solution. The primary cause of this discrepancy is found to be an inherent inconsistency in the formulation of the flux limiter. The inconsistency is removed by employing a Green-Gauss formulation of primitive gradients at nodes to replace the previous Gram-Schmidt algorithm. Current results are now in good agreement with benchmark solutions for two challenge problems: (1) hypersonic flow over a three-dimensional cylindrical section with special attention to the uniformity of the solution in the spanwise direction and (2) hypersonic flow over a three-dimensional sphere. The tetrahedral cells used in the simulation are derived from a structured grid where cell faces are bisected across the diagonal resulting in a consistent pattern of diagonals running in a biased direction across the otherwise symmetric domain. This grid is known to accentuate problems in both shock capturing and stagnation region heating encountered with conventional, quasi-one-dimensional inviscid flux reconstruction algorithms. Therefore the test problems provide a sensitive indicator for algorithmic effects on heating. Additional simulations on a sharp, double cone and the shuttle orbiter are then presented to demonstrate the capabilities of the new algorithm on more geometrically complex flows with tetrahedral grids. These results provide the first indication that pure tetrahedral elements utilizing the updated, three-dimensional, upwind reconstruction algorithm may be used for the simulation of heating and shear in hypersonic flows in upwind, finite volume formulations.
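As a sketch of the gradient construction mentioned above: a Green-Gauss gradient integrates face values of a primitive variable over the faces of a control volume. The cube geometry and linear test field below are illustrative; the paper's node-centered stencil on tetrahedral grids is more involved.

```python
import numpy as np

def green_gauss_gradient(face_values, face_normals, face_areas, volume):
    """Green-Gauss gradient over one control volume:
        grad(phi) ~ (1/V) * sum_f phi_f * A_f * n_f
    face_values: phi averaged to each face; face_normals: outward unit
    normals; face_areas: face areas; volume: control-volume measure.
    A generic sketch, not the paper's specific node-based stencil."""
    contrib = face_values[:, None] * face_normals * face_areas[:, None]
    return contrib.sum(axis=0) / volume

# Unit cube with phi = 2x + 3y - z sampled at face centroids: the exact
# gradient (2, 3, -1) is recovered because phi is linear.
normals = np.array([[1,0,0],[-1,0,0],[0,1,0],[0,-1,0],[0,0,1],[0,0,-1]], float)
centroids = 0.5 + 0.5 * normals
phi = 2*centroids[:, 0] + 3*centroids[:, 1] - centroids[:, 2]
print(green_gauss_gradient(phi, normals, np.ones(6), 1.0))  # -> [ 2.  3. -1.]
```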
Stochastic flux analysis of chemical reaction networks.
Kahramanoğulları, Ozan; Lynch, James F
2013-12-07
Chemical reaction networks provide an abstraction scheme for a broad range of models in biology and ecology. The two common means for simulating these networks are the deterministic and the stochastic approaches. The traditional deterministic approach, based on differential equations, enjoys a rich set of analysis techniques, including a treatment of reaction fluxes. However, the discrete stochastic simulations, which provide advantages in some cases, lack a quantitative treatment of network fluxes. We describe a method for flux analysis of chemical reaction networks, where flux is given by the flow of species between reactions in stochastic simulations of the network. Extending discrete event simulation algorithms, our method constructs several data structures, and thereby reveals a variety of statistics about resource creation and consumption during the simulation. We use these structures to quantify the causal interdependence and relative importance of the reactions at arbitrary time intervals with respect to the network fluxes. This allows us to construct reduced networks that have the same flux-behavior, and compare these networks, also with respect to their time series. We demonstrate our approach on an extended example based on a published ODE model of the same network, that is, Rho GTP-binding proteins, and on other models from biology and ecology. We provide a fully stochastic treatment of flux analysis. As in deterministic analysis, our method delivers the network behavior in terms of species transformations. Moreover, our stochastic analysis can be applied, not only at steady state, but at arbitrary time intervals, and used to identify the flow of specific species between specific reactions. Our case study of Rho GTP-binding proteins reveals the role played by the cyclic reverse fluxes in tuning the behavior of this network.
Sun, Ke; Saadi, Fadl H; Lichterman, Michael F; Hale, William G; Wang, Hsin-Ping; Zhou, Xinghao; Plymale, Noah T; Omelchenko, Stefan T; He, Jr-Hau; Papadantonakis, Kimberly M; Brunschwig, Bruce S; Lewis, Nathan S
2015-03-24
Reactively sputtered nickel oxide (NiOx) films provide transparent, antireflective, electrically conductive, chemically stable coatings that also are highly active electrocatalysts for the oxidation of water to O2(g). These NiOx coatings provide protective layers on a variety of technologically important semiconducting photoanodes, including textured crystalline Si passivated by amorphous silicon, crystalline n-type cadmium telluride, and hydrogenated amorphous silicon. Under anodic operation in 1.0 M aqueous potassium hydroxide (pH 14) in the presence of simulated sunlight, the NiOx films stabilized all of these self-passivating, high-efficiency semiconducting photoelectrodes for >100 h of sustained, quantitative solar-driven oxidation of water to O2(g).
Limit of a nonpreferential attachment multitype network model
NASA Astrophysics Data System (ADS)
Shang, Yilun
2017-02-01
Here, we deal with a multitype network model with nonpreferential attachment growth. The connection between two nodes depends asymmetrically on their types, reflecting the implication of time order in temporal networks. Based upon graph limit theory, we analytically determined the limit of the network model characterized by a kernel, in the sense that the number of copies of any fixed subgraph converges as the network size tends to infinity. The results are confirmed by extensive simulations. Our work thus provides a theoretical framework for quantitatively understanding grown temporal complex networks as a whole.
Universal monopole scaling near transitions from the Coulomb phase.
Powell, Stephen
2012-08-10
Certain frustrated systems, including spin ice and dimer models, exhibit a Coulomb phase at low temperatures, with power-law correlations and fractionalized monopole excitations. Transitions out of this phase, at which the effective gauge theory becomes confining, provide examples of unconventional criticality. This Letter studies the behavior at nonzero monopole density near such transitions, using scaling theory to arrive at universal expressions for the crossover phenomena. For a particular transition in spin ice, quantitative predictions are made by mapping to the XY model and confirmed using Monte Carlo simulations.
Unveiling Mars nightside mesosphere dynamics by IUVS/MAVEN global images of NO nightglow
NASA Astrophysics Data System (ADS)
Stiepen, A.; Jain, S. K.; Schneider, N. M.; Milby, Z.; Deighan, J. I.; Gonzàlez-Galindo, F.; Gérard, J.-C.; Forget, F.; Bougher, S.; Stewart, A. I. F.; Royer, E.; Stevens, M. H.; Evans, J. S.; Chaffin, M. S.; Crismani, M.; McClintock, W. E.; Clarke, J. T.; Holsclaw, G. W.; Montmessin, F.; Lo, D. Y.
2017-09-01
We analyze the morphology of the ultraviolet nightglow in the Martian upper atmosphere through nitric oxide (NO) δ and γ band emissions observed by the Imaging Ultraviolet Spectrograph instrument on the Mars Atmosphere and Volatile EvolutioN spacecraft. The seasonal dynamics of the Martian thermosphere-mesosphere can be constrained based on the distribution of these emissions. We show evidence for local (emission streaks and splotches) and global (longitudinal and seasonal) variability in the brightness of the emission and provide quantitative comparisons to GCM simulations.
MSFC shuttle lightning research
NASA Technical Reports Server (NTRS)
Vaughan, Otha H., Jr.
1993-01-01
The shuttle mesoscale lightning experiment (MLE), flown on earlier shuttle flights, and most recently flown on the following space transportation systems (STS's), STS-31, -32, -35, -37, -38, -40, -41, and -48, has continued to focus on obtaining additional quantitative measurements of lightning characteristics and on creating a database for use in demonstrating observation simulations for future spaceborne lightning mapping systems. These flights are also providing design criteria data for a proposed shuttle MLE-type lightning research instrument called mesoscale lightning observational sensors (MELOS), currently under development at MSFC.
Colloidal Dynamics Simulations of Rheology and Stability of Concentrated Fuel Slurries.
1987-04-10
…van der Waals potential as the adsorbed polymer concentration and Hamaker constant are changed. These calculations provide quantitative evidence for the… The attraction between two equal spheres of diameter d at centre-to-centre separation r, as derived by Hamaker, is U(r) = -(A/6) [ d^2 / (2(r^2 - d^2)) + d^2 / (2r^2) + ln((r^2 - d^2)/r^2) ]. A value of 5.0 x 10^-20 J was used for the Hamaker constant, A. A plot of Eq. (31) is… parameter controlling the strength of the repulsive steric potential. The Hamaker constant A (Eq. (33)) is the natural choice for the attractive…
NASA Astrophysics Data System (ADS)
Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio
2015-07-01
Atherosclerosis is a primary cause of critical ischemic diseases such as myocardial infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, three optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% of lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed down to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
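The SAM classification step admits a compact sketch: each pixel spectrum is scored by its angle to a reference spectrum, and small angles indicate a match. The three-band reference values, the random image cube, and the 0.10 rad threshold below are all invented for illustration.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle mapper (SAM): angle between a measured spectrum and a
    reference spectrum; smaller angles mean a closer spectral match."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Three-band (1150, 1200, 1300 nm) example with made-up reflectance values:
# flag a pixel as lipid-like if its angle to the lipid reference is below
# an assumed threshold.
lipid_ref = np.array([0.42, 0.18, 0.35])        # hypothetical lipid spectrum
cube = np.random.rand(64, 64, 3)                # stand-in multispectral image
angles = np.apply_along_axis(spectral_angle, 2, cube, lipid_ref)
lipid_mask = angles < 0.10                      # threshold in radians (assumed)
print(f"flagged {lipid_mask.mean():.1%} of pixels as lipid-like")
```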
Impact of reconstruction parameters on quantitative I-131 SPECT
NASA Astrophysics Data System (ADS)
van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.
2016-07-01
Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction, and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs and (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR modelling is the most robust and reliable method to reconstruct accurate quantitative iodine-131 SPECT images.
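For reference, the TEW estimate used as one of the compared corrections has a simple closed form: scatter counts in the photopeak window are approximated by a trapezoid spanned by two narrow windows flanking the photopeak. The window widths and count values in the sketch are illustrative, not the study's acquisition settings.

```python
def tew_scatter(counts_lower, counts_upper, w_lower, w_upper, w_peak):
    """Triple-energy-window (TEW) scatter estimate for the photopeak window:
    scatter counts are approximated by the trapezoid spanned by the two
    narrow flanking windows (standard TEW formula)."""
    return (counts_lower / w_lower + counts_upper / w_upper) * w_peak / 2.0

# Illustrative numbers for the 364 keV photopeak of I-131; window widths
# (keV) and counts are assumed, not taken from the study.
scatter = tew_scatter(counts_lower=1200, counts_upper=800,
                      w_lower=6.0, w_upper=6.0, w_peak=58.2)
primary = 25000 - scatter
print(f"scatter estimate: {scatter:.0f} counts; corrected photopeak: {primary:.0f}")
```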
Simulation methods to estimate design power: an overview for applied research.
Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E
2011-06-20
Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
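The article supplies code in R and Stata; the sketch below shows the same simulation logic in Python for a simple two-arm individually randomized design. The effect size, standard deviation, and sample size are invented for illustration.

```python
import numpy as np
from scipy import stats

def simulated_power(n_per_arm, effect, sd, n_sims=2000, alpha=0.05, seed=1):
    """Estimate power for a two-arm randomized design by simulating trials
    and counting how often a two-sample t-test rejects at level 'alpha'."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, sd, n_per_arm)
        treated = rng.normal(effect, sd, n_per_arm)
        if stats.ttest_ind(treated, control).pvalue < alpha:
            rejections += 1
    return rejections / n_sims

# e.g. detect a 0.25 SD gain in length-for-age Z score with 150 children/arm
print(simulated_power(n_per_arm=150, effect=0.25, sd=1.0))
```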
Simulation methods to estimate design power: an overview for applied research
2011-01-01
Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447
A radiation-free mixed-reality training environment and assessment concept for C-arm-based surgery.
Stefan, Philipp; Habert, Séverine; Winkler, Alexander; Lazarovici, Marc; Fürmetz, Julian; Eck, Ulrich; Navab, Nassir
2018-06-25
The discrepancy between continuously decreasing opportunities for clinical training and assessment and the increasing complexity of interventions in surgery has led to the development of different training and assessment options, such as anatomical models, computer-based simulators, and cadaver training. However, trainees, following training and assessment and ultimately performing patient treatment, still face a steep learning curve. To address this problem for C-arm-based surgery, we introduce a realistic radiation-free simulation system that combines patient-based 3D printed anatomy and simulated X-ray imaging using a physical C-arm. To explore the fidelity and usefulness of the proposed mixed-reality system for training and assessment, we conducted a user study with six surgical experts performing a facet joint injection on the simulator. In a technical evaluation, we show that our system simulates X-ray images accurately, with an RMSE of 1.85 mm compared to real X-ray imaging. The participants expressed agreement with the overall realism of the simulation and the usefulness of the system for assessment, and strong agreement with the usefulness of such a mixed-reality system for the training of novices and experts. In a quantitative analysis, we furthermore evaluated the suitability of the system for the assessment of surgical skills and gathered preliminary evidence for validity. The proposed mixed-reality simulation system facilitates a transition to C-arm-based surgery and has the potential to complement or even replace large parts of cadaver training, to provide a safe assessment environment, and to reduce the risk of errors when proceeding to patient treatment. We propose an assessment concept and outline the steps necessary to expand the system into a test instrument that provides reliable and justified assessment scores indicative of surgical proficiency, with sufficient evidence for validity.
2014-01-01
Background Left pulmonary artery sling (LPAS) is a rare but severe congenital anomaly, in which stenoses form in the trachea and/or main bronchi. Multi-detector computed tomography (MDCT) provides useful anatomical images, but does not offer functional information. The objective of the present study is to quantitatively analyze the airflow in the trachea and main bronchi of LPAS subjects through computational fluid dynamics (CFD) simulation. Methods Five subjects (four LPAS patients, one normal control) aged 6-19 months are analyzed. The geometric model of the trachea and the two main bronchi is extracted from the MDCT images. The inlet velocity is determined based on the body weight and the inlet area. Both the geometric model and personalized inflow conditions are imported into the CFD software, ANSYS. The pressure drop, mass flow ratio through the two bronchi, wall pressure, flow velocity and wall shear stress (WSS) are obtained, and compared to the normal control. Results Due to the tracheal and/or bronchial stenosis, the pressure drop for the LPAS patients ranges from 78.9 to 914.5 Pa, much higher than for the normal control (0.7 Pa). The mass flow ratio through the two bronchi does not correlate with the sectional area ratio if the anomalous left pulmonary artery compresses the trachea or bronchi. It is suggested that the C-shaped trachea plays an important role in facilitating the air flow into the left bronchus through the inertia force. For LPAS subjects, the distributions of velocities, wall pressure and WSS are less regular than for the normal control. At the stenotic site, high velocity, low wall pressure and high WSS are observed. Conclusions Using geometric models extracted from CT images and patient-specific inlet boundary conditions, CFD simulation can provide vital quantitative flow information for LPAS. Due to the stenosis, high pressure drops and irregular distributions of velocities, wall pressure and WSS are observed. The C-shaped trachea may facilitate a larger flow of air into the left bronchus under the inertia force, and decrease the ventilation of the right lung. Quantitative and personalized information may help in understanding the mechanism of LPAS and the correlations between stenosis and dyspnea, and facilitate the structural and functional assessment of LPAS. PMID:24957947
A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study
Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott
2016-01-01
Objective The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Background Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. Method A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Results Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. Conclusions A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent. PMID:27096134
A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study.
Ahmed, Rami; Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott
2016-03-16
OBJECTIVE: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. CONCLUSIONS: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent.
Methods for Quantitative Interpretation of Retarding Field Analyzer Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.
2011-03-28
Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best-fit values for important simulation parameters with a chi-square minimization method.
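The fitting step can be sketched generically: a chi-square objective compares measured RFA signals against simulated output as a function of the tunable parameters. The linear placeholder model and noise level below are purely illustrative stand-ins for the postprocessed cloud-simulation output.

```python
import numpy as np
from scipy.optimize import minimize

# Generic chi-square fit of simulated RFA response to measured data; 'model'
# stands in for postprocessed cloud-simulation output as a function of the
# parameters being tuned. Entirely schematic.
def chi_square(params, measured, sigma, model):
    return np.sum(((measured - model(params)) / sigma) ** 2)

model = lambda p: p[0] * np.linspace(0, 1, 20) + p[1]   # placeholder model
measured = model([2.0, 0.5]) + np.random.default_rng(2).normal(0, 0.05, 20)
fit = minimize(chi_square, x0=[1.0, 0.0], args=(measured, 0.05, model))
print(fit.x)   # best-fit parameter values, close to (2.0, 0.5)
```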
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko
2017-07-01
Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful to detect human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies did not provide objective or quantitative descriptions of the results, nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of the sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases for application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information, previously unobtainable, for practical applications of the SPERM HY-LITER™ Express kit. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.
Patel, Nikunjkumar; Wiśniowska, Barbara; Jamei, Masoud; Polak, Sebastian
2017-11-27
A quantitative systems toxicology (QST) model for citalopram was established to simulate, in silico, a 'virtual twin' of a real patient to predict the occurrence of cardiotoxic events previously reported in patients under various clinical conditions. The QST model considers the effects of citalopram and its most notable electrophysiologically active primary (desmethylcitalopram) and secondary (didesmethylcitalopram) metabolites, on cardiac electrophysiology. The in vitro cardiac ion channel current inhibition data was coupled with the biophysically detailed model of human cardiac electrophysiology to investigate the impact of (i) the inhibition of multiple ion currents (I Kr , I Ks , I CaL ); (ii) the inclusion of metabolites in the QST model; and (iii) unbound or total plasma as the operating drug concentration, in predicting clinically observed QT prolongation. The inclusion of multiple ion channel current inhibition and metabolites in the simulation with unbound plasma citalopram concentration provided the lowest prediction error. The predictive performance of the model was verified with three additional therapeutic and supra-therapeutic drug exposure clinical cases. The results indicate that considering only the hERG ion channel inhibition of only the parent drug is potentially misleading, and the inclusion of active metabolite data and the influence of other ion channel currents should be considered to improve the prediction of potential cardiac toxicity. Mechanistic modelling can help bridge the gaps existing in the quantitative translation from preclinical cardiac safety assessment to clinical toxicology. Moreover, this study shows that the QST models, in combination with appropriate drug and systems parameters, can pave the way towards personalised safety assessment.
Stock, Philipp; Monroe, Jacob I; Utzig, Thomas; Smith, David J; Shell, M Scott; Valtiner, Markus
2017-03-28
Interactions between hydrophobic moieties steer ubiquitous processes in aqueous media, including the self-organization of biologic matter. Recent decades have seen tremendous progress in understanding these interactions for macroscopic hydrophobic interfaces. Yet, it is still a challenge to experimentally measure hydrophobic interactions (HIs) at the single-molecule scale and thus to compare with theory. Here, we present a combined experimental-simulation approach to directly measure and quantify the sequence dependence and additivity of HIs in peptide systems at the single-molecule scale. We combine dynamic single-molecule force spectroscopy on model peptides with fully atomistic, both equilibrium and nonequilibrium, molecular dynamics (MD) simulations of the same systems. Specifically, we mutate a flexible (GS)5 peptide scaffold with increasing numbers of hydrophobic leucine monomers and measure the peptides' desorption from hydrophobic self-assembled monolayer surfaces. Based on the analysis of nonequilibrium work trajectories, we measure an interaction free energy that scales linearly with 3.0-3.4 k_BT per leucine. In good agreement, simulations indicate a similar trend with 2.1 k_BT per leucine, while also providing a detailed molecular view into HIs. This approach potentially provides a roadmap for directly extracting qualitative and quantitative single-molecule interactions at solid/liquid interfaces in a wide range of fields, including interactions at biointerfaces and adhesive interactions in industrial applications.
Metrics for comparing dynamic earthquake rupture simulations
Barall, Michael; Harris, Ruth A.
2014-01-01
Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near-fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes' results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code's results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
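One simple metric of the kind such comparisons rely on is a normalized RMS misfit between two codes' time series at matching stations. The sketch below is a generic illustration under that assumption; the paper's actual metrics are more elaborate (handling, for example, time shifts between waveforms).

```python
import numpy as np

def waveform_misfit(a, b):
    """Normalized RMS difference between two codes' time series (e.g., slip
    rate at an on-fault station): 0 means identical results. A plausible
    stand-in for the kind of metric the paper formalizes."""
    a, b = np.asarray(a), np.asarray(b)
    return np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(b ** 2))

t = np.linspace(0, 5, 500)
code_a = np.exp(-t) * np.sin(6 * t)             # one code's synthetic output
code_b = np.exp(-t) * np.sin(6 * t + 0.02)      # slightly shifted result
print(f"misfit: {waveform_misfit(code_a, code_b):.3f}")
```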
Effect of exercise on hemodynamic conditions in the abdominal aorta.
Taylor, C A; Hughes, T J; Zarins, C K
1999-06-01
The beneficial effect of exercise in retarding the progression of cardiovascular disease is hypothesized to be caused, at least in part, by the elimination of adverse hemodynamic conditions, including flow recirculation and low wall shear stress. In vitro and in vivo investigations have provided qualitative and limited quantitative information on flow patterns in the abdominal aorta and on the effect of exercise on the elimination of adverse hemodynamic conditions. We used computational fluid mechanics methods to examine the effects of simulated exercise on hemodynamic conditions in an idealized model of the human abdominal aorta. A three-dimensional computer model of a healthy human abdominal aorta was created to simulate pulsatile aortic blood flow under conditions of rest and graded exercise. Flow velocity patterns and wall shear stress were computed in the lesion-prone infrarenal aorta, and the effects of exercise were determined. A recirculation zone was observed to form along the posterior wall of the aorta immediately distal to the renal vessels under resting conditions. Low time-averaged wall shear stress was present in this location, along the posterior wall opposite the superior mesenteric artery and along the anterior wall between the superior and inferior mesenteric arteries. Shear stress temporal oscillations, as measured with an oscillatory shear index, were elevated in these regions. Under simulated light exercise conditions, a region of low wall shear stress and high oscillatory shear index remained along the posterior wall immediately distal to the renal arteries. Under simulated moderate exercise conditions, all the regions of low wall shear stress and high oscillatory shear index were eliminated. This numerical investigation provided detailed quantitative data on the effect of exercise on hemodynamic conditions in the abdominal aorta. Our results indicated that moderate levels of lower limb exercise are necessary to eliminate the flow reversal and regions of low wall shear stress in the abdominal aorta that exist under resting conditions. The lack of flow reversal and increased wall shear stress during exercise suggest a mechanism by which exercise may promote arterial health, namely the elimination of adverse hemodynamic conditions.
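The oscillatory shear index referred to above has a compact definition, sketched below for a uniformly sampled wall-shear-stress waveform; the two synthetic waveforms merely mimic rest-like (reversing) and exercise-like (unidirectional) conditions.

```python
import numpy as np

def oscillatory_shear_index(tau):
    """Oscillatory shear index over one cardiac cycle (uniform sampling):
        OSI = 0.5 * (1 - |<tau>| / <|tau|>)
    OSI near 0 means unidirectional shear; values approaching 0.5 indicate
    strong flow reversal."""
    tau = np.asarray(tau, float)
    return 0.5 * (1.0 - abs(tau.mean()) / np.abs(tau).mean())

t = np.linspace(0.0, 1.0, 200, endpoint=False)    # one cycle (s)
tau_rest = 0.4 * np.sin(2 * np.pi * t) + 0.1      # reversing shear (rest-like)
tau_exercise = 0.4 * np.sin(2 * np.pi * t) + 1.2  # unidirectional (exercise-like)
print(oscillatory_shear_index(tau_rest), oscillatory_shear_index(tau_exercise))
```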
SU-E-T-656: Quantitative Analysis of Proton Boron Fusion Therapy (PBFT) in Various Conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, D; Jung, J; Shin, H
2015-06-15
Purpose: Three alpha particles are concomitant of the proton-boron interaction, which can be used in radiotherapy applications. We performed simulation studies to determine the effectiveness of proton boron fusion therapy (PBFT) under various conditions. Methods: Boron uptake regions (BURs) of various widths and densities were implemented in the Monte Carlo n-particle extended (MCNPX) simulation code. The effect of proton beam energy was considered for different BURs. Four simulation scenarios were designed to verify the effectiveness of the integrated boost that was observed in the proton-boron reaction. In these simulations, the effect of proton beam energy was determined for different physical conditions, such as size, location, and boron concentration. Results: Proton dose amplification was confirmed for all proton beam energies considered (< 96.62%). Based on the simulation results for different physical conditions, the threshold for the range in which proton dose amplification occurred was estimated as 0.3 cm. An effective proton-boron reaction requires the boron concentration to be equal to or greater than 14.4 mg/g. Conclusion: We established the effects of PBFT under various conditions by using Monte Carlo simulation. The results of our research can be used for providing a PBFT dose database.
Simulated microgravity induces an inflammatory response in the common carotid artery of rats.
Liu, Huan; Wang, Zhong-Chao; Yue, Yuan; Yu, Jin-Wen; Cai, Yue; Bai, Yun-Gang; Zhang, Hai-Jun; Bao, Jun-Xiang; Ren, Xin-Ling; Xie, Man-Jiang; Ma, Jin
2014-08-01
Post-spaceflight orthostatic intolerance is one of the most important adverse effects after exposure to space microgravity, and there are still no effective countermeasures. It has been considered that arterial remodeling may play an important role in the occurrence of post-spaceflight orthostatic intolerance, but the cellular mechanisms remain unknown. In this study, we investigated whether an inflammatory response exists in the common carotid artery of rats exposed to simulated microgravity. For this, Sprague-Dawley rats were subjected to 4 weeks of hindlimb unweighting to simulate microgravity. The expression levels of the adhesion molecules E-selectin and vascular cell adhesion molecule-1 (VCAM-1), and the cytokine monocyte chemoattractant protein-1 (MCP-1) in the common carotid artery of simulated microgravity rats were evaluated by immunohistochemical staining, quantitative RT-PCR, and Western blot analyses. The recruitment of monocytes in the common carotid artery of rats exposed to simulated microgravity was investigated by en face immunofluorescence staining and monocyte binding assays. Our results provided convincing evidence that there is an inflammatory response in the common carotid artery of rats exposed to simulated microgravity. Our work suggests that the inflammatory response may be a novel cellular mechanism that is responsible for the arterial remodeling that occurs during exposure to microgravity.
Analysing magnetism using scanning SQUID microscopy.
Reith, P; Renshaw Wang, X; Hilgenkamp, H
2017-12-01
Scanning superconducting quantum interference device microscopy (SSM) is a scanning probe technique that images local magnetic flux, which allows for mapping of magnetic fields with high field and spatial accuracy. Many studies involving SSM have been published in the last few decades, using SSM to make qualitative statements about magnetism. However, quantitative analysis using SSM has received less attention. In this work, we discuss several aspects of interpreting SSM images and methods to improve quantitative analysis. First, we analyse the spatial resolution and how it depends on several factors. Second, we discuss the analysis of SSM scans and the information obtained from the SSM data. Using simulations, we show how signals evolve as a function of changing scan height, SQUID loop size, magnetization strength, and orientation. We also investigated 2-dimensional autocorrelation analysis to extract information about the size, shape, and symmetry of magnetic features. Finally, we provide an outlook on possible future applications and improvements.
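The 2-dimensional autocorrelation analysis mentioned above can be sketched via the Wiener-Khinchin theorem: the autocorrelation is the inverse Fourier transform of the image power spectrum. The smoothed random flux map below is a synthetic stand-in for SSM data.

```python
import numpy as np

def autocorrelation_2d(flux_map):
    """2-D autocorrelation of an SSM flux image via the Wiener-Khinchin
    theorem; the shape and decay of the central peak reflect the size,
    shape, and symmetry of magnetic features."""
    f = flux_map - flux_map.mean()
    power = np.abs(np.fft.fft2(f)) ** 2
    acf = np.real(np.fft.ifft2(power))
    return np.fft.fftshift(acf) / acf.flat[0]     # normalize so ACF(0) = 1

# Stand-in flux map: smooth random texture mimicking magnetic domains
rng = np.random.default_rng(3)
noise = rng.standard_normal((256, 256))
kx = np.fft.fftfreq(256)[:, None]; ky = np.fft.fftfreq(256)[None, :]
smooth = np.real(np.fft.ifft2(np.fft.fft2(noise) * np.exp(-(kx**2 + ky**2) * 400)))
acf = autocorrelation_2d(smooth)
print(acf.shape, acf[128, 128])    # central value is 1 after normalization
```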
Li, Wenjin
2018-02-28
The transition path ensemble consists of reactive trajectories and possesses all the information necessary for understanding the mechanism and dynamics of important condensed-phase processes. However, a quantitative description of the properties of the transition path ensemble is far from established. Here, with numerical calculations on a model system, the equipartition terms defined in thermal equilibrium were estimated for the first time in the transition path ensemble. It was not surprising to observe that the energy was not equally distributed among all the coordinates. However, the energies distributed on a pair of conjugate coordinates remained equal. Higher energies were observed on several coordinates that are strongly coupled to the reaction coordinate, while the rest were almost equally distributed. In addition, the ensemble-averaged energy on each coordinate as a function of time was also quantified. These quantitative analyses of energy distributions provide new insights into the transition path ensemble.
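For reference, the equipartition terms examined here are of the generalized form below (standard equilibrium statistical mechanics, restated only for context; x_m runs over all coordinates and their conjugate momenta):

```latex
\left\langle x_m \frac{\partial H}{\partial x_n} \right\rangle = \delta_{mn}\, k_B T
```

The reported finding is that in the transition path ensemble the diagonal terms need not all equal k_B T across coordinates, while equality does survive within each conjugate coordinate-momentum pair.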
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Bennett N., E-mail: bennett.walker@fda.hhs.gov; Office of Device Evaluation, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, Maryland 20993; James, Robert H.
Glare, glistenings, optical defects, dysphotopsia, and poor image quality are a few of the known deficiencies of intraocular lenses (IOLs). All of these optical phenomena are related to light scatter. However, the specific direction that light scatters makes a critical difference between debilitating glare and a slightly noticeable decrease in image quality. Consequently, quantifying the magnitude and direction of scattered light is essential to appropriately evaluate the safety and efficacy of IOLs. In this study, we introduce a full-angle scanning light scattering profiler (SLSP) as a novel approach capable of quantitatively evaluating the light scattering from IOLs with a nearly 360° view. The SLSP method can simulate in situ conditions by controlling the parameters of the light source including angle of incidence. This testing strategy will provide a more effective nonclinical approach for the evaluation of IOL light scatter.
Fast quantitative optical detection of heat dissipation by surface plasmon polaritons.
Möller, Thomas B; Ganser, Andreas; Kratt, Martina; Dickreuter, Simon; Waitz, Reimar; Scheer, Elke; Boneberg, Johannes; Leiderer, Paul
2018-06-13
Heat management at the nanoscale is an issue of increasing importance. In optoelectronic devices the transport and decay of plasmons contribute to the dissipation of heat. By comparison of experimental data and simulations we demonstrate that it is possible to gain quantitative information about excitation, propagation and decay of surface plasmon polaritons (SPPs) in a thin gold stripe supported by a silicon membrane. The temperature-dependent optical transmissivity of the membrane is used to determine the temperature distribution around the metal stripe with high spatial and temporal resolution. This method is complementary to techniques where the propagation of SPPs is monitored optically, and provides additional information which is not readily accessible by other means. In particular, we demonstrate that the thermal conductivity of the membrane can also be derived from our analysis. The results presented here show the high potential of this tool for heat management studies in nanoscale devices.
Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan
A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives for the variance of the buffer-subtracted SAXS intensity σ²(q) = [I(q) + const.]/(kq), where I(q) is the scattering intensity as a function of the momentum transfer q; k and const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurement errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors.
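The stated variance model translates directly into code. A minimal Python sketch of the procedure for assigning realistic errors to a simulated profile; the values of k and const. below are illustrative placeholders, not fitted values from the paper:

```python
import numpy as np

def saxs_sigma(q, intensity, k=1.0e4, const=0.05):
    """Standard deviation from the variance model
    sigma^2(q) = [I(q) + const.] / (k*q); k and const. are
    setup-specific fitting parameters (placeholders here)."""
    return np.sqrt((intensity + const) / (k * q))

# Toy simulated profile: a smooth, decaying I(q).
q = np.linspace(0.01, 0.5, 200)          # momentum transfer
intensity = 1.0 / (1.0 + (30.0 * q)**4)  # arbitrary smooth I(q)

sigma = saxs_sigma(q, intensity)
noisy = intensity + np.random.normal(0.0, sigma)  # realistic errors added
```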
Wang, Qinghua; Ri, Shien; Tsuda, Hiroshi; Kodera, Masako; Suguro, Kyoichi; Miyashita, Naoto
2017-09-19
Quantitative detection of defects in atomic structures is of great significance for evaluating product quality and exploring quality improvement processes. In this study, a Fourier transform filtered sampling Moire technique was proposed to visualize and detect defects in atomic arrays over a large field of view. Defect distributions, numbers, and densities could be determined visually and quantitatively from a single atomic structure image at low cost. The effectiveness of the proposed technique was verified through numerical simulations. As an application, the dislocation distributions in a GaN/AlGaN atomic structure in two directions were magnified and displayed in Moire phase maps, and defect locations and densities were detected automatically. The proposed technique can provide valuable references to material scientists and engineers checking the effect of various treatments for defect reduction. © 2017 IOP Publishing Ltd.
Quadrant CFD Analysis of a Mixer-Ejector Nozzle for HSCT Applications
NASA Technical Reports Server (NTRS)
Yoder, Dennis A.; Georgiadis, Nicholas J.; Wolter, John D.
2005-01-01
This study investigates the sidewall effect on flow within the mixing duct downstream of a lobed mixer-ejector nozzle. Simulations which model only one half-chute width of the ejector array are compared with those which model one complete quadrant of the nozzle geometry and with available experimental data. These solutions demonstrate the applicability of the half-chute technique to model the flowfield far away from the sidewall and the necessity of a full-quadrant simulation to predict the formation of a low-energy flow region near the sidewall. The quadrant solutions are further examined to determine the cause of this low-energy region, which reduces the amount of mixing and lowers the thrust of the nozzle. Grid resolution and different grid topologies are also examined. Finally, an assessment of the half-chute and quadrant approaches is made to determine the ability of these simulations to provide qualitative and/or quantitative predictions for this type of complex flowfield.
Patra, Chandra N
2014-11-14
A systematic investigation of spherical electric double layers with electrolytes having size as well as charge asymmetry is carried out using density functional theory and Monte Carlo simulations. The system is considered within the primitive model, where the macroion is a structureless hard spherical colloid, the small ions are charged hard spheres of different sizes, and the solvent is represented as a dielectric continuum. The present theory approximates the hard-sphere part of the one-particle correlation function using a weighted density approach, whereas a perturbation expansion around the uniform fluid is applied to evaluate the ionic contribution. The theory is in quantitative agreement with Monte Carlo simulation for the density and mean electrostatic potential profiles over a wide range of electrolyte concentrations, surface charge densities, valences of small ions, and macroion sizes. The theory provides distinctive evidence of charge and size correlations within the electrode-electrolyte interface in spherical geometry.
NASA Astrophysics Data System (ADS)
Mota, F. L.; Song, Y.; Pereda, J.; Billia, B.; Tourret, D.; Debierre, J.-M.; Trivedi, R.; Karma, A.; Bergeon, N.
2017-08-01
To study the dynamical formation and evolution of cellular and dendritic arrays under diffusive growth conditions, three-dimensional (3D) directional solidification experiments were conducted in microgravity on a model transparent alloy onboard the International Space Station, using the Directional Solidification Insert in the DEvice for the study of Critical LIquids and Crystallization. Selected experiments were repeated on Earth under gravity-driven fluid flow to reveal convection effects. Both radial and axial macrosegregation resulting from convection are observed in ground experiments, and the primary spacings measured in Earth and microgravity experiments differ noticeably. The microgravity experiments provide unique benchmark data for numerical simulations of spatially extended pattern formation under diffusive growth conditions. The results of 3D phase-field simulations highlight the importance of accurately modeling the thermal conditions, which strongly influence the front recoil of the interface and the selection of the primary spacing. The modeling predictions are in good quantitative agreement with the microgravity experiments.
Investigation on the forced response of a radial turbine under aerodynamic excitations
NASA Astrophysics Data System (ADS)
Ma, Chaochen; Huang, Zhi; Qi, Mingxu
2016-04-01
Rotor blades in a radial turbine with nozzle guide vanes typically experience harmonic aerodynamic excitations due to the rotor-stator interaction. Dynamic stresses induced by the harmonic excitations can result in high cycle fatigue (HCF) of the blades. A reliable prediction method for the forced response problem is essential to avoid HCF. In this work, the forced response mechanisms were investigated based on a fluid-structure interaction (FSI) method. Aerodynamic excitations were obtained by three-dimensional unsteady computational fluid dynamics (CFD) simulation with phase-shifted periodic boundary conditions. The first two harmonic pressures were determined as the primary components of the excitation and applied to the finite element (FE) model to conduct the computational structural dynamics (CSD) simulation. The computed results from the harmonic forced response analysis show good agreement with the predictions of Singh's advanced frequency evaluation (SAFE) diagram. Moreover, the mode superposition method used in the FE simulation offers an efficient way to provide quantitative assessments of mode response levels and resonant strength.
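For context, the mode superposition method referenced here expands the structural response in a truncated set of mass-normalized modes, reducing the coupled FE equations to independent modal oscillators (a standard structural dynamics identity, not specific to this paper):

```latex
u(t) = \sum_i \phi_i\, q_i(t), \qquad
\ddot{q}_i + 2 \zeta_i \omega_i \dot{q}_i + \omega_i^2 q_i = \phi_i^{\mathsf{T}} f(t)
```

Each harmonic of the aerodynamic excitation f(t) then appreciably drives only the modes with natural frequencies near that harmonic, which is what makes per-mode response levels cheap to assess.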
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of bruise healing. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying healing processes is possible. Objective fitting enables a direct comparison between simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and the inherent loss of information in that process. This approach enables us to determine the value of hemoglobin mass diffusivity, which is controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
Telerobotics: A simulation facility for university research
NASA Technical Reports Server (NTRS)
Stark, L.; Kim, W.; Tendick, F.; Tyler, M.; Hannaford, B.; Barakat, W.; Bergengruen, O.; Braddi, L.; Eisenberg, J.; Ellis, S.
1987-01-01
An experimental telerobotics (TR) simulation suitable for studying human operator (H.O.) performance is described. Simple manipulator pick-and-place and tracking tasks allowed quantitative comparison of a number of calligraphic display viewing conditions. A number of control modes could be compared in this TR simulation, including displacement, rate, and acceleratory control using position and force joysticks. A homeomorphic controller turned out to be no better than joysticks; the adaptive properties of the H.O. apparently permit quite good control over a variety of controller configurations and control modes. Training by optimal control example seemed helpful in preliminary experiments. An introduced communication delay was found to produce a decrease in performance. In considerable part, this difficulty could be compensated for by preview control information. That neurological control of normal human movement contains a data period of 0.2 second may relate to this robustness of H.O. control to delay. The Ames-Berkeley enhanced perspective display was utilized in conjunction with an experimental helmet-mounted display system (HMD) that provided stereoscopic enhanced views.
NASA Astrophysics Data System (ADS)
Duan, Lian; Makita, Shuichi; Yamanari, Masahiro; Lim, Yiheng; Yasuno, Yoshiaki
2011-08-01
A Monte-Carlo-based phase retardation estimator is developed to correct the systematic error in phase retardation measurement by polarization sensitive optical coherence tomography (PS-OCT). Recent research has revealed that the phase retardation measured by PS-OCT has a distribution that is neither symmetric nor centered at the true value. Hence, a standard mean estimator gives us erroneous estimations of phase retardation, and it degrades the performance of PS-OCT for quantitative assessment. In this paper, the noise property in phase retardation is investigated in detail by Monte-Carlo simulation and experiments. A distribution transform function is designed to eliminate the systematic error by using the result of the Monte-Carlo simulation. This distribution transformation is followed by a mean estimator. This process provides a significantly better estimation of phase retardation than a standard mean estimator. This method is validated both by numerical simulations and experiments. The application of this method to in vitro and in vivo biological samples is also demonstrated.
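A minimal numeric sketch of such a distribution-transform estimator: simulate the biased measurement over a grid of true retardations, tabulate the mean for each, then invert that mapping by interpolation. The folding noise model below is a generic placeholder, not the PS-OCT noise model derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def measured_mean_retardation(true_ret, snr=5.0, n=2000):
    """Toy forward model: noisy retardation folded back into
    [0, pi/2], which biases the mean near the interval edges."""
    samples = np.abs(true_ret + rng.normal(0.0, 1.0 / snr, n))
    return np.minimum(samples, np.pi - samples).mean()

# Tabulate mean(measured) as a function of the true value.
true_grid = np.linspace(0.01, np.pi / 2 - 0.01, 50)
mean_grid = np.array([measured_mean_retardation(t) for t in true_grid])

def corrected_estimate(measured_mean):
    """Distribution-transform estimator: invert the tabulated
    mapping by interpolation to remove the systematic error."""
    return np.interp(measured_mean, mean_grid, true_grid)
```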
SPH simulation of free surface flow over a sharp-crested weir
NASA Astrophysics Data System (ADS)
Ferrari, Angela
2010-03-01
In this paper the numerical simulation of free surface flow over a sharp-crested weir is presented. Since the usual shallow water assumptions are not satisfied in this case, we propose to solve the problem using the full weakly compressible Navier-Stokes equations with the Tait equation of state for water. The numerical method consists of the new meshless Smooth Particle Hydrodynamics (SPH) formulation proposed by Ferrari et al. (2009) [8], which accurately tracks the free surface profile and provides monotone pressure fields. Thus, the unsteady evolution of the complex moving material interface (free surface) can be properly resolved. The simulations, involving about half a million fluid particles, were run in parallel on two of the most powerful High Performance Computing (HPC) facilities in Europe. The results were validated by analysing the pressure field and comparing the free surface profiles obtained with the SPH scheme against experimental measurements available in the literature [18]. A very good quantitative agreement has been obtained.
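The Tait closure that links density to pressure in the weakly compressible formulation is compact enough to sketch; the parameters below are typical SPH choices for water, not the paper's settings:

```python
import numpy as np

def tait_pressure(rho, rho0=1000.0, c0=30.0, gamma=7.0):
    """Tait equation of state: p = B*((rho/rho0)**gamma - 1) with
    B = rho0*c0**2/gamma. c0 is a numerical speed of sound, commonly
    chosen ~10x the maximum flow speed so density fluctuations stay ~1%."""
    B = rho0 * c0**2 / gamma
    return B * ((rho / rho0)**gamma - 1.0)

# Small density fluctuations around rho0 already give large pressures.
print(tait_pressure(np.array([998.0, 1000.0, 1002.0])))
```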
NASA Astrophysics Data System (ADS)
Miller, B. W.; Schuurman, G. W.; Symstad, A.; Fisichelli, N. A.; Frid, L.
2017-12-01
Managing natural resources in this era of anthropogenic climate change is fraught with uncertainties around how ecosystems will respond to management actions and a changing climate. Scenario planning (oftentimes implemented as a qualitative, participatory exercise for exploring multiple possible futures) is a valuable tool for addressing this challenge. However, this approach may face limits in resolving responses of complex systems to altered climate and management conditions, and may not provide the scientific credibility that managers often require to support actions that depart from current practice. Quantitative information on projected climate changes and ecological responses is rapidly growing and evolving, but this information is often not at a scale or in a form that is 'actionable' for resource managers. We describe a project that sought to create usable information for resource managers in the northern Great Plains by combining qualitative and quantitative methods. In particular, researchers, resource managers, and climate adaptation specialists co-produced a simulation model in conjunction with scenario planning workshops to inform natural resource management in southwest South Dakota. Scenario planning for a wide range of resources facilitated open-minded thinking about a set of divergent and challenging, yet relevant and plausible, climate scenarios and management alternatives that could be implemented in the simulation. With stakeholder input throughout the process, we built a simulation of key vegetation types, grazing, exotic plants, fire, and the effects of climate and management on rangeland productivity and composition. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between herd sizes and vegetation composition, and between the short- versus long-term costs of invasive species management. It also identified impactful uncertainties related to the effects of fire and grazing on vegetation. Ultimately, this integrative and iterative approach yielded counter-intuitive and surprising findings, and resulted in a more tractable set of possible futures for resource management planning.
NASA Astrophysics Data System (ADS)
Xu, Zexuan; Hu, Bill
2016-04-01
Dual-permeability karst aquifers, composed of porous media and conduit networks with significantly different hydrological characteristics, are widely distributed around the world. Discrete-continuum numerical models, such as MODFLOW-CFP and CFPv2, have been verified as appropriate approaches for numerical modeling of groundwater flow and solute transport in karst hydrogeology. On the other hand, seawater intrusion associated with contamination of fresh groundwater resources has been observed and investigated in a number of coastal aquifers, especially under conditions of sea level rise. Density-dependent numerical models, including SEAWAT, are able to quantitatively evaluate seawater/freshwater interaction processes. A numerical model of variable-density flow and solute transport - conduit flow process (VDFST-CFP) is developed to provide a better description of seawater intrusion and submarine groundwater discharge in a coastal karst aquifer with conduits. The coupled discrete-continuum VDFST-CFP model applies the Darcy-Weisbach equation to simulate non-laminar groundwater flow in the conduit system, which is conceptualized and discretized as pipes, while the Darcy equation is used in the continuum porous media. Density-dependent groundwater flow and solute transport equations with appropriate density terms in both the conduit and porous media systems are derived and numerically solved using a standard finite difference method with an implicit iteration procedure. Synthetic horizontal and vertical benchmarks are created to validate the newly developed VDFST-CFP model by comparison with other numerical models, such as the variable-density SEAWAT, the coupled constant-density groundwater flow and solute transport MODFLOW/MT3DMS, and the discrete-continuum CFPv2/UMT3D models. The VDFST-CFP model improves the simulation of density-dependent seawater/freshwater mixing processes and exchanges between conduit and matrix. Continuum numerical models greatly overestimate the flow rate under turbulent flow conditions, whereas discrete-continuum models provide more accurate results. Parameter sensitivity analysis indicates that conduit diameter and friction factor, matrix hydraulic conductivity, and porosity are important parameters that significantly affect the variable-density flow and solute transport simulation. The pros and cons of the model assumptions, conceptual simplifications and numerical techniques in VDFST-CFP are discussed. In general, the development of the VDFST-CFP model is a methodological innovation in numerical modeling and can be applied to quantitatively evaluate seawater/freshwater interaction in coastal karst aquifers. Keywords: Discrete-continuum numerical model; Variable density flow and transport; Coastal karst aquifer; Non-laminar flow
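A sketch of the two flow laws the discrete-continuum coupling switches between: linear Darcy flux in the matrix continuum versus the Darcy-Weisbach relation in a pipe-like conduit; the numbers are illustrative only:

```python
import numpy as np

def darcy_flux(K, dh, L):
    """Laminar matrix flow: specific discharge q = -K * dh/L."""
    return -K * dh / L

def darcy_weisbach_velocity(f, D, dh, L, g=9.81):
    """Non-laminar conduit flow: head loss h_f = f*(L/D)*v**2/(2g),
    inverted here for the mean pipe velocity v."""
    return np.sign(-dh) * np.sqrt(2.0 * g * D * abs(dh) / (f * L))

# One metre of head drop over a 100 m flow path:
print(darcy_flux(K=1e-4, dh=-1.0, L=100.0))                      # matrix, m/s
print(darcy_weisbach_velocity(f=0.03, D=0.5, dh=-1.0, L=100.0))  # conduit, m/s
```

The several orders of magnitude between the two velocities is exactly why the continuum-only models cited above misestimate conduit-dominated flow.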
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wen, N; Glide-Hurst, C; Liu, M
Purpose: Quantitative magnetic resonance imaging (MRI) of cranial lesions prior to stereotactic radiosurgery (SRS) may improve treatment planning and provide potential prognostic value. The practicality and logistics of acquiring advanced multiparametric MRI sequences to measure vascular and cellular properties of cerebral tumors are explored on a 1.0 Tesla MR Simulator. Methods: MR simulation was performed immediately following routine CT simulation on a 1T MR Simulator. The MR sequences used were, in the order performed: T2-Weighted Turbo Spin Echo (T2W-TSE), T2 FLAIR, diffusion-weighted imaging (DWI, b = 0, 800, to generate an apparent diffusion coefficient (ADC) map), 3D T1-Weighted Fast Field Echo (T1W-FFE), Dynamic Contrast Enhanced (DCE), and post-gadolinium contrast-enhanced 3D T1W-FFE images. Pre-contrast T1 values were generated by acquiring six different flip angles. The arterial input function was derived from arterial pixels selected manually in the perfusion images. The extended Tofts model was used to generate the permeability maps. Routine MRI scans took about 30 minutes to complete; the additional scans added 12 minutes. Results: To date, seven patients with cerebral tumors have been imaged and tumor physiology characterized. For example, in a glioblastoma patient, the volumes contoured on the T1 Gd images, the ADC map, and the pharmacokinetic (Ktrans) map were 1.9, 1.4, and 1.5 cc, respectively, with strong spatial correlation. The mean ADC value of the entire volume was 1141 μm2/s, while the value in the white matter was 811 μm2/s. The mean value of Ktrans was 0.02 min-1 in the tumor volume and 0.00 in the normal white matter. Conclusion: Our initial results suggest that multiparametric MRI sequences may provide a more quantitative evaluation of vascular and tumor properties. Implementing functional imaging during MR-SIM may be particularly beneficial in assessing tumor extent, differentiating radiation necrosis from tumor recurrence, and establishing reliable biomarkers for treatment response evaluation. The Department of Radiation Oncology at Henry Ford Health System has a research agreement with Varian Medical Systems and Philips Health Care.
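A minimal sketch of the extended Tofts model behind the Ktrans maps, evaluated by discrete convolution; the toy arterial input function and parameter values are placeholders, not patient data:

```python
import numpy as np

def extended_tofts(t, Cp, Ktrans, ve, vp):
    """Extended Tofts model:
    Ct(t) = vp*Cp(t) + Ktrans * int_0^t Cp(tau)*exp(-(Ktrans/ve)*(t-tau)) dtau,
    evaluated by discrete convolution on a uniform time grid."""
    dt = t[1] - t[0]
    kernel = np.exp(-(Ktrans / ve) * t)
    return vp * Cp + Ktrans * np.convolve(Cp, kernel)[: len(t)] * dt

t = np.arange(0.0, 5.0, 1.0 / 60.0)   # minutes, 1 s sampling
Cp = 5.0 * t * np.exp(-t / 0.25)      # toy arterial input function
Ct = extended_tofts(t, Cp, Ktrans=0.02, ve=0.2, vp=0.02)
```

In practice Ktrans, ve, and vp are fitted voxel-by-voxel to the measured DCE enhancement curves, producing the maps described above.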
NASA Astrophysics Data System (ADS)
Shirley, Rachel Elizabeth
Nuclear power plant (NPP) simulators are proliferating in academic research institutions and national laboratories in response to the availability of affordable, digital simulator platforms. Accompanying the new research facilities is a renewed interest in using data collected in NPP simulators for Human Reliability Analysis (HRA) research. An experiment conducted in The Ohio State University (OSU) NPP Simulator Facility develops data collection methods and analytical tools to improve use of simulator data in HRA. In the pilot experiment, student operators respond to design basis accidents in the OSU NPP Simulator Facility. Thirty-three undergraduate and graduate engineering students participated in the research. Following each accident scenario, student operators completed a survey about perceived simulator biases and watched a video of the scenario. During the video, they periodically recorded their perceived strength of significant Performance Shaping Factors (PSFs) such as Stress. This dissertation reviews three aspects of simulator-based research using the data collected in the OSU NPP Simulator Facility: First, a qualitative comparison of student operator performance to computer simulations of expected operator performance generated by the Information Decision Action Crew (IDAC) HRA method. Areas of comparison include procedure steps, timing of operator actions, and PSFs. Second, development of a quantitative model of the simulator bias introduced by the simulator environment. Two types of bias are defined: Environmental Bias and Motivational Bias. This research examines Motivational Bias--that is, the effect of the simulator environment on an operator's motivations, goals, and priorities. A bias causal map is introduced to model motivational bias interactions in the OSU experiment. Data collected in the OSU NPP Simulator Facility are analyzed using Structural Equation Modeling (SEM). Data include crew characteristics, operator surveys, and time to recognize and diagnose the accident in the scenario. These models estimate how the effects of the scenario conditions are mediated by simulator bias, and demonstrate how to quantify the strength of the simulator bias. Third, development of a quantitative model of subjective PSFs based on objective data (plant parameters, alarms, etc.) and PSF values reported by student operators. The objective PSF model is based on the PSF network in the IDAC HRA method. The final model is a mixed effects Bayesian hierarchical linear regression model. The subjective PSF model includes three factors: The Environmental PSF, the simulator Bias, and the Context. The Environmental Bias is mediated by an operator sensitivity coefficient that captures the variation in operator reactions to plant conditions. The data collected in the pilot experiments are not expected to reflect professional NPP operator performance, because the students are still novice operators. However, the models used in this research and the methods developed to analyze them demonstrate how to consider simulator bias in experiment design and how to use simulator data to enhance the technical basis of a complex HRA method. The contributions of the research include a framework for discussing simulator bias, a quantitative method for estimating simulator bias, a method for obtaining operator-reported PSF values, and a quantitative method for incorporating the variability in operator perception into PSF models. 
The research demonstrates applications of Structural Equation Modeling and hierarchical Bayesian linear regression models in HRA. Finally, the research demonstrates the benefits of using student operators as a test platform for HRA research.
Scheper, Carsten; Wensch-Dorendorf, Monika; Yin, Tong; Dressel, Holger; Swalve, Herrmann; König, Sven
2016-06-29
Intensified selection of polled individuals has recently gained importance in predominantly horned dairy cattle breeds as an alternative to routine dehorning. The status quo of the current polled breeding pool of genetically-closely related artificial insemination sires with lower breeding values for performance traits raises questions regarding the effects of intensified selection based on this founder pool. We developed a stochastic simulation framework that combines the stochastic simulation software QMSim and a self-designed R program named QUALsim that acts as an external extension. Two traits were simulated in a dairy cattle population for 25 generations: one quantitative (QMSim) and one qualitative trait with Mendelian inheritance (i.e. polledness, QUALsim). The assignment scheme for qualitative trait genotypes initiated realistic initial breeding situations regarding allele frequencies, true breeding values for the quantitative trait and genetic relatedness. Intensified selection for polled cattle was achieved using an approach that weights estimated breeding values in the animal best linear unbiased prediction model for the quantitative trait depending on genotypes or phenotypes for the polled trait with a user-defined weighting factor. Selection response for the polled trait was highest in the selection scheme based on genotypes. Selection based on phenotypes led to significantly lower allele frequencies for polled. The male selection path played a significantly greater role for a fast dissemination of polled alleles compared to female selection strategies. Fixation of the polled allele implies selection based on polled genotypes among males. In comparison to a base breeding scenario that does not take polledness into account, intensive selection for polled substantially reduced genetic gain for this quantitative trait after 25 generations. Reducing selection intensity for polled males while maintaining strong selection intensity among females, simultaneously decreased losses in genetic gain and achieved a final allele frequency of 0.93 for polled. A fast transition to a completely polled population through intensified selection for polled was in contradiction to the preservation of high genetic gain for the quantitative trait. Selection on male polled genotypes with moderate weighting, and selection on female polled phenotypes with high weighting, could be a suitable compromise regarding all important breeding aspects.
Quantitative Analysis of Hepatitis C NS5A Viral Protein Dynamics on the ER Surface.
Knodel, Markus M; Nägel, Arne; Reiter, Sebastian; Vogel, Andreas; Targett-Adams, Paul; McLauchlan, John; Herrmann, Eva; Wittum, Gabriel
2018-01-08
Exploring biophysical properties of virus-encoded components and their requirement for virus replication is an exciting new area of interdisciplinary virological research. To date, spatial resolution has only rarely been analyzed in computational/biophysical descriptions of virus replication dynamics. However, it is widely acknowledged that intracellular spatial dependence is a crucial component of virus life cycles. The hepatitis C virus-encoded NS5A protein is an endoplasmic reticulum (ER)-anchored viral protein and an essential component of the virus replication machinery. Therefore, we simulate NS5A dynamics on realistic reconstructed, curved ER surfaces by means of surface partial differential equations (sPDE) upon unstructured grids. We match the in silico NS5A diffusion constant such that the NS5A sPDE simulation data reproduce experimental NS5A fluorescence recovery after photobleaching (FRAP) time series data. This parameter estimation yields the NS5A diffusion constant. Such parameters are needed for spatial models of HCV dynamics, which we are developing in parallel but which remain qualitative at this stage. Thus, our present study likely provides the first quantitative biophysical description of the movement of a viral component. Our spatio-temporally resolved ansatz paves new ways for understanding intricate spatially defined processes central to specific aspects of virus life cycles.
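The paper estimates the diffusion constant by matching full sPDE simulations on reconstructed ER surfaces to FRAP data; as a back-of-envelope counterpart only, an Axelrod-type closed-form estimate for a flat circular bleach spot can be sketched (the recovery curve below is synthetic):

```python
import numpy as np

def frap_halftime(t, recovery):
    """Time at which the recovery curve reaches half its final plateau."""
    return np.interp(recovery[-1] / 2.0, recovery, t)

def diffusion_from_frap(w_um, t_half_s, gamma=0.224):
    """Axelrod-type closed-form estimate D ~ gamma * w**2 / t_half
    for a circular bleach spot of radius w (micrometres)."""
    return gamma * w_um**2 / t_half_s

t = np.linspace(0.0, 60.0, 600)             # s
recovery = 0.8 * (1.0 - np.exp(-t / 8.0))   # synthetic normalized recovery
print(diffusion_from_frap(1.0, frap_halftime(t, recovery)))  # um^2/s
```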
Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment
NASA Astrophysics Data System (ADS)
David, S.; Visvikis, D.; Roux, C.; Hatt, M.
2011-09-01
In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis, merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets, the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion, and on clinical datasets it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation, leading to more pertinent measurements. Future work will consist of extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on biological tumor volume definition for radiotherapy applications.
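For context, the semi-quantitative parameter referred to here, the standardized uptake value (SUV), is a simple normalization of measured activity; a minimal sketch with illustrative numbers:

```python
def suv(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Standardized uptake value: tissue activity concentration
    normalized by injected dose per unit body weight
    (1 MBq/kg = 1 kBq/g, i.e. ~1 kBq/mL for unit tissue density)."""
    return tissue_kbq_per_ml / (injected_dose_mbq / body_weight_kg)

print(suv(tissue_kbq_per_ml=12.0, injected_dose_mbq=350.0, body_weight_kg=70.0))
# -> 2.4
```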
Struksnes, Solveig; Engelien, Ragna Ingeborg
2016-01-01
Educational institutions and the practice field have a joint responsibility to facilitate a learning environment for nursing students that provides learning outcomes in accordance with the National Curriculum. Using simulated patient situations ensures a safe learning environment in which mistakes do not put real patients' lives in danger. To compare nursing students' experiences of a skills training situation immediately after the training and after their ten-week clinical placement in nursing homes. Quantitative, cross-sectional and evaluative. Full- and part-time students in their first year of a Bachelor of Nursing degree. The students answered a questionnaire on two occasions, immediately after skills training and after the internship in a nursing home. Being a "patient" and a "nurse" in simulation was experienced as useful for clinical practice. Students with previous experience had a significantly higher perception of mastering the procedure after the internship, while inexperienced fellow students did not report any significant increase in their sense of coping during clinical practice. The findings raise the question of whether aspects of the education institution or the practice field should be improved to facilitate a better learning process for students without previous experience. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kühn, Michael; Vieth-Hillebrand, Andrea; Wilke, Franziska D. H.
2017-04-01
Black shales are a heterogeneous mixture of minerals, organic matter and formation water, and little is actually known about the fluid-rock interactions during hydraulic fracturing and their effects on the composition of flowback and produced water. Geochemical simulations have been performed based on analyses of "real" flowback water samples and artificial stimulation fluids from lab experiments, with the aim of setting up a chemical process model for shale gas reservoirs. Prediction of flowback water compositions for potential or already chosen sites requires validated and parameterized geochemical models. For the software "Geochemist's Workbench" (GWB), databases were adapted and amended based on a literature review. Evaluation of the system was performed by comparison with results from laboratory experiments, and parameterization was done with regard to the field data provided. Finally, reaction path models are applied to obtain quantitative information about the mobility of compounds in specific settings. Our work leads to quantitative estimates of reservoir compounds in the flowback, calibrated against laboratory experiments. Such information is crucial for the assessment of environmental impacts as well as for estimating human- and ecotoxicological effects of flowback waters from a variety of natural gas shales. With comprehensive knowledge about the potential composition and mobility of flowback water, selection of water treatment techniques will become easier.
Mapping QTLs for drought tolerance in a SEA 5 x AND 277 common bean cross with SSRs and SNP markers.
Briñez, Boris; Perseguini, Juliana Morini Küpper Cardoso; Rosa, Juliana Santa; Bassi, Denis; Gonçalves, João Guilherme Ribeiro; Almeida, Caléo; Paulino, Jean Fausto de Carvalho; Blair, Matthew Ward; Chioratto, Alisson Fernando; Carbonell, Sérgio Augusto Morais; Valdisser, Paula Arielle Mendes Ribeiro; Vianello, Rosana Pereira; Benchimol-Reis, Luciana Lasry
2017-01-01
The common bean is characterized by high sensitivity to drought and low productivity. Breeding for drought resistance in this species involves genes of different genetic groups. In this work, we used a SEA 5 x AND 277 cross to map quantitative trait loci associated with drought tolerance in order to assess the factors that determine the magnitude of drought response in common beans. A total of 438 polymorphic markers were used to genotype the F8 mapping population. Phenotyping was done in two greenhouses, one used to simulate drought and the other to simulate irrigated conditions. Fourteen traits associated with drought tolerance were measured to identify the quantitative trait loci (QTLs). The map was constructed with 331 markers that covered all 11 chromosomes and had a total length of 1515 cM. Twenty-two QTLs were discovered for chlorophyll, leaf and stem fresh biomass, leaf biomass dry weight, leaf temperature, number of pods per plant, number of seeds per plant, seed weight, days to flowering, dry pod weight and total yield under well-watered and drought (stress) conditions. All the QTLs detected under drought conditions showed positive effects of the SEA 5 allele. This study provides a better understanding of the genetic inheritance of drought tolerance in common bean.
Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015
Sobkowicz, Pawel
2016-01-01
We present results of an abstract, agent-based model of opinion dynamics simulations based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society, corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions), which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long-term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third-party alternative. We provide quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown that when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be ‘invaded’ by a newcomer third party very quickly, while the second remains immune to such invasion. PMID:27171226
Sadaie, Wakako; Harada, Yoshie; Matsuda, Michiyuki
2014-01-01
Computer-assisted simulation is a promising approach for clarifying complicated signaling networks. However, this approach is currently limited by a deficiency of kinetic parameters determined in living cells. To overcome this problem, we applied fluorescence cross-correlation spectroscopy (FCCS) to measure dissociation constant (Kd) values of signaling molecule complexes in living cells (in vivo Kd). Among the pairs of fluorescent molecules tested, that of monomerized enhanced green fluorescent protein (mEGFP) and HaloTag-tetramethylrhodamine was the most suitable for the measurement of in vivo Kd by FCCS. Using this pair, we determined 22 in vivo Kd values of signaling molecule complexes comprising the epidermal growth factor receptor (EGFR)-Ras-extracellular signal-regulated kinase (ERK) mitogen-activated protein (MAP) kinase pathway. With these parameters, we developed a kinetic simulation model of the EGFR-Ras-ERK MAP kinase pathway and uncovered a potential role played by stoichiometry in Shc binding to EGFR during the peak activations of Ras, MEK, and ERK. Intriguingly, most of the in vivo Kd values determined in this study were higher than the in vitro Kd values reported previously, suggesting the significance of competitive binding inside cells. These in vivo Kd values will provide a sound basis for the quantitative understanding of signal transduction. PMID:24958104
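Once FCCS resolves the bound (complex) concentration, the in vivo Kd follows from elementary mass action; a minimal sketch with illustrative concentrations:

```python
def in_vivo_kd(total_a, total_b, complex_ab):
    """Mass-action dissociation constant Kd = [A]free*[B]free/[AB],
    with free concentrations obtained from totals minus bound."""
    free_a = total_a - complex_ab
    free_b = total_b - complex_ab
    return free_a * free_b / complex_ab

# Illustrative concentrations in nM:
print(in_vivo_kd(total_a=100.0, total_b=80.0, complex_ab=20.0))  # -> 240.0 nM
```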
Ensari, Ipek; Motl, Robert W; Klaren, Rachel E; Fernhall, Bo; Smith, Denise L; Horn, Gavin P
2017-05-01
A standard exercise protocol that allows comparisons across ergonomic studies would be of great value for researchers investigating the physical and physiological strains of firefighting and possible interventions for reducing those demands. We compared the pattern of cardiorespiratory changes from 21 firefighters during simulated firefighting activities using a newly developed firefighting activity station (FAS) and during treadmill walking, both performed within an identical laboratory setting. Data on cardiorespiratory parameters and core temperature were collected continuously using a portable metabolic unit and a wireless ingestible temperature probe. Repeated measures ANOVA indicated distinct patterns of change in cardiorespiratory parameters and heart rate between conditions. The pattern consisted of alternating periods of peaks and nadirs in the FAS that were qualitatively and quantitatively similar to live fire activities, whereas the same parameters increased logarithmically in the treadmill condition. Core temperature increased similarly in both conditions, although more rapidly in the FAS. Practitioner Summary: The firefighting activity station (FAS) yields a pattern of cardiorespiratory responses qualitatively and quantitatively similar to live fire activities, and significantly different from treadmill walking. The FAS can be performed in a laboratory/clinic, providing a potentially standardised protocol for testing interventions to improve health and safety and for informing return-to-duty decisions.
Leveraging Quick Response Code Technology to Facilitate Simulation-Based Leaderboard Competition.
Chang, Todd P; Doughty, Cara B; Mitchell, Diana; Rutledge, Chrystal; Auerbach, Marc A; Frisell, Karin; Jani, Priti; Kessler, David O; Wolfe, Heather; MacKinnon, Ralph J; Dewan, Maya; Pirie, Jonathan; Lemke, Daniel; Khattab, Mona; Tofil, Nancy; Nagamuthu, Chenthila; Walsh, Catharine M
2018-02-01
Leaderboards provide feedback on relative performance and a competitive atmosphere for both self-guided improvement and social comparison. Because simulation can provide substantial quantitative participant feedback, leaderboards can be used, not only locally but also in a multidepartment, multicenter fashion. Quick Response (QR) codes can be integrated to allow participants to access and upload data. We present the development, implementation, and initial evaluation of an online leaderboard employing principles of gamification using points, badges, and leaderboards designed to enhance competition among healthcare providers. This article details the fundamentals behind the development and implementation of a user-friendly, online, multinational leaderboard that employs principles of gamification to enhance competition and integrates a QR code system to promote both self-reporting of performance data and data integrity. An open-ended survey was administered to capture perceptions of leaderboard implementation. Conceptual step-by-step instructions detailing how to apply the QR code system to any leaderboard using simulated or real performance metrics are outlined using an illustrative example of a leaderboard that employed simulated cardiopulmonary resuscitation performance scores to compare participants across 17 hospitals in 4 countries for 16 months. The following three major descriptive categories that captured perceptions of leaderboard implementation emerged from initial evaluation data from 10 sites: (1) competition, (2) longevity, and (3) perceived deficits. A well-designed leaderboard should be user-friendly and encompass best practices in gamification principles while collecting and storing data for research analyses. Easy storage and export of data allow for longitudinal record keeping that can be leveraged both to track compliance and to enable social competition.
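As a sketch of the data-entry side, a per-participant QR code pointing to an upload URL can be generated in a few lines, assuming the third-party Python qrcode package; the URL scheme is hypothetical:

```python
import qrcode  # third-party package: pip install "qrcode[pil]"

# Hypothetical per-participant upload URL encoding site and user IDs.
url = "https://example.org/leaderboard/upload?site=17&user=42"

img = qrcode.make(url)     # returns a PIL image of the QR code
img.save("upload_qr.png")  # print and post next to the simulator
```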
Edwards, Stefan M.; Sørensen, Izel F.; Sarup, Pernille; Mackay, Trudy F. C.; Sørensen, Peter
2016-01-01
Predicting individual quantitative trait phenotypes from high-resolution genomic polymorphism data is important for personalized medicine in humans, plant and animal breeding, and adaptive evolution. However, this is difficult for populations of unrelated individuals when the number of causal variants is low relative to the total number of polymorphisms and causal variants individually have small effects on the traits. We hypothesized that mapping molecular polymorphisms to genomic features such as genes and their gene ontology categories could increase the accuracy of genomic prediction models. We developed a genomic feature best linear unbiased prediction (GFBLUP) model that implements this strategy and applied it to three quantitative traits (startle response, starvation resistance, and chill coma recovery) in the unrelated, sequenced inbred lines of the Drosophila melanogaster Genetic Reference Panel. Our results indicate that subsetting markers based on genomic features increases the predictive ability relative to the standard genomic best linear unbiased prediction (GBLUP) model. Both models use all markers, but GFBLUP allows differential weighting of the individual genetic marker relationships, whereas GBLUP weighs the genetic marker relationships equally. Simulation studies show that it is possible to further increase the accuracy of genomic prediction for complex traits using this model, provided the genomic features are enriched for causal variants. Our GFBLUP model using prior information on genomic features enriched for causal variants can increase the accuracy of genomic predictions in populations of unrelated individuals and provides a formal statistical framework for leveraging and evaluating information across multiple experimental studies to provide novel insights into the genetic architecture of complex traits. PMID:27235308
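A minimal numpy sketch of the GBLUP/GFBLUP contrast on toy data: both predictors use all markers, but GFBLUP lets a genomic-feature subset carry its own weight in the relationship matrix. The single weighted-matrix shortcut and the variance values below are illustrative simplifications of the two-random-effect model in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 1000                 # individuals, markers
Z = rng.standard_normal((n, m))  # toy centered/scaled genotypes
y = Z[:, :50] @ rng.normal(0.0, 0.3, 50) + rng.standard_normal(n)  # causal: first 50

def grm(Zsub):
    """Genomic relationship matrix from a marker subset."""
    return Zsub @ Zsub.T / Zsub.shape[1]

def blup(y, G, h2=0.5):
    """BLUP of genetic values under y = u + e with var(u) = h2*G."""
    V = h2 * G + (1.0 - h2) * np.eye(len(y))
    return h2 * G @ np.linalg.solve(V, y - y.mean())

u_gblup = blup(y, grm(Z))            # all markers weighted equally

feature = np.arange(100)             # feature set enclosing the causal markers
rest = np.setdiff1d(np.arange(m), feature)
G_weighted = 0.7 * grm(Z[:, feature]) + 0.3 * grm(Z[:, rest])
u_gfblup = blup(y, G_weighted)       # feature markers up-weighted
```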
Motion compensation using origin ensembles in awake small animal positron emission tomography
NASA Astrophysics Data System (ADS)
Gillam, John E.; Angelis, Georgios I.; Kyme, Andre Z.; Meikle, Steven R.
2017-02-01
In emission tomographic imaging, the stochastic origin ensembles algorithm provides unique information regarding the detected counts given the measured data. Precision in both voxel and region-wise parameters may be determined for a single data set based on the posterior distribution of the count density, allowing uncertainty estimates to be allocated to quantitative measures. Uncertainty estimates are of particular importance in awake animal neurological and behavioral studies, for which head motion, unique to each acquired data set, perturbs the measured data. Motion compensation can be conducted when rigid head pose is measured during the scan. However, errors in the pose measurements used for compensation can degrade the data and hence quantitative outcomes. In this investigation, motion compensation and detector resolution models were incorporated into the basic origin ensembles algorithm and an efficient approach to computation was developed. The approach was validated against maximum-likelihood expectation maximisation and tested using simulated data. The resultant algorithm was then used to analyse quantitative uncertainty in regional activity estimates arising from changes in pose measurement precision. Finally, the posterior covariance acquired from a single data set was used to describe correlations between regions of interest, providing information about pose measurement precision that may be useful in system analysis and design. The investigation demonstrates the use of origin ensembles as a powerful framework for evaluating the statistical uncertainty of voxel and regional estimates. While rigid motion was considered here in the context of awake animal PET, the extension to arbitrary motion may provide clinical utility where respiratory or cardiac motion perturbs the measured data.
Evaluation of compost blankets for erosion control from disturbed lands.
Bhattarai, Rabin; Kalita, Prasanta K; Yatsu, Shotaro; Howard, Heidi R; Svendsen, Niels G
2011-03-01
Soil erosion due to water and wind results in the loss of valuable topsoil and causes land degradation and environmental quality problems. Site-specific best management practices (BMPs) are needed to curb erosion and control sediment and, in turn, increase land productivity and sustain environmental quality. The aim of this study was to investigate the effectiveness of three types of biodegradable erosion control blankets (fine compost, mulch, and a 50-50 mixture of compost and mulch) for soil erosion control in field and laboratory-scale experiments. Quantitative analysis was conducted by comparing the sediment load in runoff collected from sloped and tilled plots in the field and in the laboratory with the erosion control blankets. The field plots had an average slope of 3.5% and experiments were conducted under natural rainfall conditions, while the laboratory experiments were conducted at 4, 8 and 16% slopes under simulated rainfall conditions. Results from the field experiments indicated that the 50-50 mixture of compost and mulch provides the best erosion control compared to using either the compost or the mulch blanket alone. Laboratory results under simulated rain indicated that both the mulch cover and the 50-50 mixture provided better erosion control than the compost alone. Although these results indicate that the 50-50 mixture and the mulch (in laboratory experiments) are the best measures among the three erosion control blankets, all three types provide very effective erosion control on bare soil surfaces. The results of this study can be used for controlling erosion and sediment from disturbed lands with compost-mulch application. Testing different mixture ratios and types of mulch and compost, and their efficiencies in retaining various soil nutrients, may provide more quantitative data for developing erosion control plans. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gallo, Emanuela Carolina Angela
Width-increased dual-pump enhanced coherent anti-Stokes Raman spectroscopy (WIDECARS) measurements were conducted in a McKenna air-ethylene premixed burner at nominal equivalence ratios between 0.55 and 2.50 to provide simultaneous quantitative measurements of temperature and the concentrations of six major combustion species (C2H4, N2, O2, H2, CO, CO2). The purpose of this test was to investigate the uncertainties in the experimental and spectral modeling methods in preparation for a subsequent scramjet C2H4/air combustion test at the University of Virginia Aerospace Research Laboratory. A broadband Pyrromethene (PM) PM597 and PM650 dye laser mixture and optical cavity were studied and optimized to excite the Raman shifts of all the target species. Two hundred single-shot recorded spectra were processed, theoretically fitted, and then compared to computational models to verify where chemical equilibrium or adiabatic conditions occurred, providing experimental flame location and formation, species concentrations, temperature, and heat loss inputs to computational kinetic models. The Stark effect and the temperature and concentration errors are discussed. Subsequently, WIDECARS measurements of a premixed air-ethylene flame were successfully acquired in a direct-connect small-scale dual-mode scramjet combustor at the University of Virginia Supersonic Combustion Facility (UVaSCF). A nominal Mach 5 flight condition was simulated (stagnation pressure p0 = 300 kPa, temperature T0 = 1200 K, equivalence ratio range ER = 0.3-0.4). The purpose of this test was to provide quantitative measurements of the six major combustion species concentrations and temperature. Point-wise measurements were taken by mapping four two-dimensional orthogonal planes (before, within, and two planes after the cavity flame holder) with respect to the combustor freestream direction. Two hundred single-shot recorded spectra were processed and theoretically fitted. Mean flow and standard deviation are provided for each investigated case. Within the flame limits tested, WIDECARS data were analyzed and compared with CFD simulations and OH-PLIF measurements.
Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier
2016-02-01
Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and the second study, respectively. Our results confirm that overlooking the time-temperature dependency may yield an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
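A sketch of the kind of logistic time-temperature dependency estimated here, which a risk simulation can use to down-weight implausible storage scenarios; the coefficients are illustrative placeholders, not the fitted study values:

```python
import numpy as np

def p_consumed(hours, temp_c, b0=6.0, b_time=-0.05, b_temp=-0.25):
    """Logistic model: logit(p) = b0 + b_time*hours + b_temp*temp_c.
    Coefficients are illustrative, not the fitted study values."""
    logit = b0 + b_time * hours + b_temp * temp_c
    return 1.0 / (1.0 + np.exp(-logit))

# Weight each sampled storage scenario by this probability:
print(p_consumed(hours=24.0, temp_c=4.0))    # fresh, refrigerated -> likely consumed
print(p_consumed(hours=96.0, temp_c=20.0))   # old, warm -> unlikely to be consumed
```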
On the simulation and mitigation of anisoplanatic optical turbulence for long range imaging
NASA Astrophysics Data System (ADS)
Hardie, Russell C.; LeMaster, Daniel A.
2017-05-01
We describe a numerical wave propagation method for simulating long range imaging of an extended scene under anisoplanatic conditions. Our approach computes an array of point spread functions (PSFs) for a 2D grid on the object plane. The PSFs are then used in a spatially varying weighted sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. To validate the simulation, we compare simulated outputs with the theoretical anisoplanatic tilt correlation and differential tilt variance, in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. The simulation tool is also used here to quantitatively evaluate a recently proposed block-matching and Wiener filtering (BMWF) method for turbulence mitigation. In this method, a block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged and processed with a Wiener filter for restoration. A novel aspect of the proposed BMWF method is that the PSF model used for restoration takes into account the level of geometric correction achieved during image registration. This way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. The BMWF method is computationally simple, and yet it has excellent performance in comparison to state-of-the-art benchmark methods.
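A minimal sketch of the spatially varying weighted-sum operation described above; the PSF grid, weight maps, and ideal image are assumed inputs (the actual simulator derives its PSFs from numerical wave propagation):

```python
import numpy as np
from scipy.signal import fftconvolve

def anisoplanatic_blur(ideal, psfs, weights):
    """Spatially varying blur: `psfs` is a list of 2D PSFs sampled on a grid
    over the object plane, `weights` a matching list of per-pixel weight maps
    that sum to one at every pixel; the output blends the convolutions."""
    degraded = np.zeros_like(ideal, dtype=float)
    for psf, w in zip(psfs, weights):
        degraded += w * fftconvolve(ideal, psf, mode="same")
    return degraded
```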
Drug-physiology interaction and its influence on QT prolongation: a mechanistic modeling study.
Wiśniowska, Barbara; Polak, Sebastian
2018-06-01
The current study is an example of drug-disease interaction modeling where a drug induces a condition which can affect the pharmacodynamics of other concomitantly taken drugs. The electrophysiological effects of hypokalemia and heart-rate changes induced by antiasthmatic drugs were simulated with the use of the cardiac safety simulator. A biophysically detailed model of human cardiac physiology, the ten Tusscher ventricular cardiomyocyte cell model, was employed to generate pseudo-ECG signals and QTc intervals for 44 patients from four clinical studies. Simulated and observed mean QTc values with standard deviation (SD) for each reported study point were compared and differences were analyzed with Student's t test (α = 0.05). The simulated results reflected the QTc interval changes measured in patients, as well as their clinically observed interindividual variability. The QTc interval changes were highly correlated with the change in plasma potassium both in clinical studies and in the simulations (Pearson's correlation coefficient > 0.55). The results suggest that the modeling and simulation approach could provide valuable quantitative insight into the cardiological effect of the potassium and heart-rate changes caused by electrophysiologically inactive, non-cardiological drugs. This makes it possible to simulate and predict the joint effect of several risk factors for QT prolongation, e.g., drug-dependent QT prolongation due to ion-channel inhibition and the patient's current physiological condition.
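The reported statistical comparison (Student's t test at alpha = 0.05 and a Pearson correlation against the change in plasma potassium) can be reproduced along these lines; the arrays below are illustrative, not clinical data:

```python
import numpy as np
from scipy import stats

# Illustrative per-study-point means (not clinical data): simulated and
# observed QTc (ms), and the simulated change in plasma potassium (mmol/L).
qtc_sim = np.array([412.0, 428.0, 441.0, 435.0, 420.0])
qtc_obs = np.array([415.0, 431.0, 446.0, 433.0, 425.0])
dk_sim = np.array([-0.1, -0.6, -1.0, -0.8, -0.4])

r, _ = stats.pearsonr(dk_sim, qtc_sim)     # QTc vs potassium change
t, p = stats.ttest_ind(qtc_sim, qtc_obs)   # simulated vs observed means
print(f"Pearson r = {r:.2f}, t-test p = {p:.3f} (alpha = 0.05)")
```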
Wu, Zujian; Pang, Wei; Coghill, George M
2015-01-01
Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can learn the relationships between biochemical reactants qualitatively and can make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can then perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
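A generic sketch of the quantitative stage, refining kinetic rates by simulated annealing; the loss function, step size, and cooling schedule are assumptions rather than the framework's actual settings:

```python
import math
import random

def anneal_rates(rates, loss, steps=5000, t0=1.0, cool=0.999):
    """Simulated-annealing refinement of kinetic rate constants; `loss(r)`
    is assumed to measure the mismatch between simulated and target
    behaviours. Step size and schedule are illustrative choices."""
    cur, cur_loss = list(rates), loss(rates)
    best, best_loss = cur, cur_loss
    temp = t0
    for _ in range(steps):
        cand = [r * math.exp(random.gauss(0.0, 0.05)) for r in cur]  # log step
        cand_loss = loss(cand)
        if cand_loss < cur_loss or random.random() < math.exp((cur_loss - cand_loss) / temp):
            cur, cur_loss = cand, cand_loss
            if cand_loss < best_loss:
                best, best_loss = cand, cand_loss
        temp *= cool
    return best
```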
Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C
2015-04-13
Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is to compare their performance on the end task required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.
Kasiri, Keyvan; Kazemi, Kamran; Dehghani, Mohammad Javad; Helfroush, Mohammad Sadegh
2013-01-01
In this paper, we present a new semi-automatic brain tissue segmentation method based on a hybrid hierarchical approach that combines a brain atlas as a priori information and a least-square support vector machine (LS-SVM). The method consists of three steps. In the first two steps, the skull is removed and the cerebrospinal fluid (CSF) is extracted. These two steps are performed using FMRIB's Automated Segmentation Tool integrated in the FSL software (FSL-FAST), developed at the Oxford Centre for Functional MRI of the Brain (FMRIB). Then, in the third step, the LS-SVM is used to segment grey matter (GM) and white matter (WM). The training samples for the LS-SVM are selected from the registered brain atlas. The voxel intensities and spatial positions are selected as the two feature groups for training and testing. The SVM as a powerful discriminator is able to handle nonlinear classification problems; however, it cannot provide posterior probabilities. Thus, we use a sigmoid function to map the SVM output into probabilities. The proposed method is used to segment CSF, GM and WM from simulated magnetic resonance imaging (MRI) data generated with the BrainWeb MRI simulator and from real data provided by the Internet Brain Segmentation Repository. The semi-automatically segmented brain tissues were evaluated by comparison with the corresponding ground truth. The Dice and Jaccard similarity coefficients, sensitivity and specificity were calculated for the quantitative validation of the results. The quantitative results show that the proposed method segments brain tissues accurately with respect to the corresponding ground truth. PMID:24696800
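The sigmoid mapping of SVM outputs to probabilities and the reported validation metrics are straightforward to sketch; the sigmoid parameters below are placeholders that would normally be fitted:

```python
import numpy as np

def svm_probability(f, A=-1.0, B=0.0):
    """Platt-style sigmoid mapping of a raw (LS-)SVM output f to a posterior
    probability; A and B are placeholders that would normally be fitted."""
    return 1.0 / (1.0 + np.exp(A * f + B))

def dice_jaccard(seg, truth):
    """Dice and Jaccard similarity between two binary segmentation masks."""
    seg, truth = seg.astype(bool), truth.astype(bool)
    inter = np.logical_and(seg, truth).sum()
    dice = 2.0 * inter / (seg.sum() + truth.sum())
    jaccard = inter / np.logical_or(seg, truth).sum()
    return dice, jaccard
```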
Vandyk, Amanda D; Lalonde, Michelle; Merali, Sabrina; Wright, Erica; Bajnok, Irmajean; Davies, Barbara
2018-04-01
Evidence on the use of simulation to teach psychiatry and mental health (including addiction) content is emerging, yet no summary of the implementation processes or associated outcomes exists. The aim of this study was to systematically search and review empirical literature on the use of psychiatry-focused simulation in undergraduate nursing education. Objectives were to (i) assess the methodological quality of existing evidence on the use of simulation to teach mental health content to undergraduate nursing students, (ii) describe the operationalization of the simulations, and (iii) summarize the associated quantitative and qualitative outcomes. We conducted online database (MEDLINE, Embase, ERIC, CINAHL, PsycINFO from January 2004 to October 2015) and grey literature searches. Thirty-two simulation studies were identified describing and evaluating six types of simulations (standardized patients, audio simulations, high-fidelity simulators, virtual world, multimodal, and tabletop). Overall, 2724 participants were included in the studies. Studies reflected a limited number of intervention designs, and outcomes were evaluated with qualitative and quantitative methods incorporating a variety of tools. Results indicated that simulation was effective in reducing student anxiety and improving their knowledge, empathy, communication, and confidence. The summarized qualitative findings all supported the benefit of simulation; however, more research is needed to assess the comparative effectiveness of the types of simulations. Recommendations from the findings include the development of guidelines for educators to deliver each simulation component (briefing, active simulation, debriefing). Finally, consensus around appropriate training of facilitators is needed, as is consistent and agreed upon simulation terminology.
ERIC Educational Resources Information Center
Schnurr, Matthew A.; De Santo, Elizabeth M.; Green, Amanda D.; Taylor, Alanna
2015-01-01
This article investigates the particular mechanisms through which a role-play simulation impacts student perceptions of knowledge acquisition. Longitudinal data were mobilized in the form of quantitative and qualitative surveys to examine whether the simulation succeeded in increasing knowledge around both content and skills. It then delves deeper…
The effects of changing land cover on streamflow simulation in Puerto Rico
A.E. Van Beusekom; L.E. Hay; R.J. Viger; W.A. Gould; J.A. Collazo; A. Henareh Khalyani
2014-01-01
This study quantitatively explores whether land cover changes have a substantive impact on simulated streamflow within the tropical island setting of Puerto Rico. The Precipitation Runoff Modeling System (PRMS) was used to compare streamflow simulations based on five static parameterizations of land cover with those based on dynamically varying parameters derived from...
Physics-based simulations of the impacts forest management practices have on hydrologic response
Adrianne Carr; Keith Loague
2012-01-01
The impacts of logging on near-surface hydrologic response at the catchment and watershed scales were examined quantitatively using numerical simulation. The simulations were conducted with the Integrated Hydrology Model (InHM) for the North Fork of Caspar Creek Experimental Watershed, located near Fort Bragg, California. InHM is a comprehensive physics-based...
Forecasting Lightning Threat using Cloud-resolving Model Simulations
NASA Technical Reports Server (NTRS)
McCaul, E. W., Jr.; Goodman, S. J.; LaCasse, K. M.; Cecil, D. J.
2009-01-01
As numerical forecasts capable of resolving individual convective clouds become more common, it is of interest to see if quantitative forecasts of lightning flash rate density are possible, based on fields computed by the numerical model. Previous observational research has shown robust relationships between observed lightning flash rates and inferred updraft and large precipitation ice fields in the mixed phase regions of storms, and that these relationships might allow simulated fields to serve as proxies for lightning flash rate density. It is shown in this paper that two simple proxy fields do indeed provide reasonable and cost-effective bases for creating time-evolving maps of predicted lightning flash rate density, judging from a series of diverse simulation case study events in North Alabama for which Lightning Mapping Array data provide ground truth. One method is based on the product of upward velocity and the mixing ratio of precipitating ice hydrometeors, modeled as graupel only, in the mixed phase region of storms at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domainwide statistics of the peak values of simulated flash rate proxy fields against domainwide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. A blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Weather Research and Forecast Model simulations of selected North Alabama cases show that this model can distinguish the general character and intensity of most convective events, and that the proposed methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because models tend to have more difficulty in correctly predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single simulations can be in error. Although these model shortcomings presently limit the precision of lightning threat forecasts from individual runs of current generation models, the techniques proposed herein should continue to be applicable as newer and more accurate physically-based model versions, physical parameterizations, initialization techniques and ensembles of cloud-allowing forecasts become available.
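In sketch form, the two proxy fields and the blended threat might be computed as follows; the blending weight is a placeholder, since the paper calibrates each proxy against Lightning Mapping Array peak flash-rate statistics:

```python
import numpy as np

def threat_updraft_graupel(w_15c, q_graupel_15c):
    """Method 1: product of updraft speed (m/s) and graupel mixing ratio
    (kg/kg) at the -15 degC level."""
    return w_15c * q_graupel_15c

def threat_column_ice(rho, q_ice, dz):
    """Method 2: vertically integrated ice hydrometeor content (kg/m^2)
    for each grid column; arrays are ordered (z, y, x)."""
    return np.sum(rho * q_ice * dz, axis=0)

def blended_threat(f1, f2, w1=0.95):
    """Blend of the two calibrated proxies; the weight here is illustrative,
    not the calibration reported in the paper."""
    return w1 * f1 + (1.0 - w1) * f2
```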
Research on the use of data fusion technology to evaluate the state of electromechanical equipment
NASA Astrophysics Data System (ADS)
Lin, Lin
2018-04-01
To address the problems of heterogeneous test-information modes and the coexistence of quantitative and qualitative information in the state evaluation of electromechanical equipment, this paper proposes the use of data fusion technology for such evaluation. The paper describes the state evaluation process for electromechanical equipment in detail, uses Dempster-Shafer (D-S) evidence theory to fuse the decision-level outputs of the individual evaluation methods, and carries out simulation tests. The simulation results show that applying data fusion technology to the state evaluation of electromechanical equipment is feasible and effective. After the decision-level information provided by different evaluation methods is repeatedly fused and the useful information extracted, the fuzziness of the judgement can be reduced and the credibility of the state evaluation improved.
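Decision-level fusion with Dempster's rule of combination, the core operation used above, can be sketched as follows; the two example mass assignments are hypothetical:

```python
def dempster(m1, m2):
    """Dempster's rule of combination for two basic probability assignments;
    keys are frozensets of hypotheses, values are masses summing to 1."""
    combined, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical masses from two evaluation methods over {normal, fault}:
m1 = {frozenset({"normal"}): 0.6, frozenset({"normal", "fault"}): 0.4}
m2 = {frozenset({"normal"}): 0.7, frozenset({"fault"}): 0.2,
      frozenset({"normal", "fault"}): 0.1}
print(dempster(m1, m2))
```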
Scaling and efficiency determine the irreversible evolution of a market
Baldovin, F.; Stella, A. L.
2007-01-01
In setting up a stochastic description of the time evolution of a financial index, the challenge is to devise a model compatible with all stylized facts emerging from the analysis of financial time series and providing a reliable basis for simulating such series. Based on constraints imposed by market efficiency and on an inhomogeneous-time generalization of standard simple scaling, we propose an analytical model which accounts simultaneously for empirical results like the linear decorrelation of successive returns, the power-law dependence on time of the volatility autocorrelation function, and the multiscaling associated with this dependence. In addition, our approach gives a justification and a quantitative assessment of the irreversible character of the index dynamics. This irreversibility enters as a key ingredient in a novel simulation strategy of index evolution which demonstrates the predictive potential of the model.
Linking Well-Tempered Metadynamics Simulations with Experiments
Barducci, Alessandro; Bonomi, Massimiliano; Parrinello, Michele
2010-01-01
Linking experiments with the atomistic resolution provided by molecular dynamics simulations can shed light on the structure and dynamics of protein-disordered states. The sampling limitations of classical molecular dynamics can be overcome using metadynamics, which is based on the introduction of a history-dependent bias on a small number of suitably chosen collective variables. Even if such bias distorts the probability distribution of the other degrees of freedom, the equilibrium Boltzmann distribution can be reconstructed using a recently developed reweighting algorithm. Quantitative comparison with experimental data is thus possible. Here we show the potential of this combined approach by characterizing the conformational ensemble explored by a 13-residue helix-forming peptide by means of a well-tempered metadynamics/parallel tempering approach and comparing the reconstructed nuclear magnetic resonance scalar couplings with experimental data. PMID:20441734
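The reweighting idea amounts to giving each biased frame a Boltzmann factor of the bias it experienced; the sketch below is the simplified final-bias variant, not the paper's full algorithm, which also handles the time dependence of the well-tempered bias:

```python
import numpy as np

kT = 2.494  # kJ/mol at 300 K

def reweighted_average(obs, bias):
    """Boltzmann reweighting of biased frames: frame i gets weight
    exp(V_bias_i / kT), undoing the bias it experienced; this is a
    simplified final-bias variant of the reweighting scheme."""
    w = np.exp((bias - bias.max()) / kT)   # shift for numerical stability
    return np.sum(w * obs) / np.sum(w)
```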
Computer simulation of the probability that endangered whales will interact with oil spills
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, M.; Jayko, K.; Bowles, A.
1987-03-01
A numerical model system was developed to assess quantitatively the probability that endangered bowhead and gray whales will encounter spilled oil in Alaskan waters. Bowhead and gray whale migration and diving-surfacing models, and an oil-spill trajectory model comprise the system. The migration models were developed from conceptual considerations, then calibrated with and tested against observations. The movement of a whale point is governed by a random walk algorithm which stochastically follows a migratory pathway. The oil-spill model, developed under a series of other contracts, accounts for transport and spreading behavior in open water and in the presence of sea ice. Historical wind records and heavy, normal, or light ice cover data sets are selected at random to provide stochastic oil-spill scenarios for whale-oil interaction simulations.
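A toy version of the migration model: a whale point drifts toward successive waypoints of a migratory pathway with stochastic scatter; all waypoints, speeds, and scatter below are invented for illustration:

```python
import numpy as np

def migrate(waypoints, steps=500, speed=4.0, sigma=1.5, seed=7):
    """Random-walk whale track stochastically following a migratory pathway
    given as ordered (x, y) waypoints in km; per-step speed and scatter are
    invented for illustration."""
    rng = np.random.default_rng(seed)
    pos = np.array(waypoints[0], dtype=float)
    track, target = [pos.copy()], 1
    for _ in range(steps):
        if target >= len(waypoints):
            break                                   # pathway completed
        heading = np.asarray(waypoints[target], dtype=float) - pos
        if np.hypot(*heading) < speed:              # waypoint reached
            target += 1
            continue
        heading /= np.hypot(*heading)
        pos += speed * heading + rng.normal(0.0, sigma, 2)  # drift + walk
        track.append(pos.copy())
    return np.array(track)
```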
A sampling algorithm for segregation analysis
Tier, Bruce; Henshall, John
2001-01-01
Methods for detecting Quantitative Trait Loci (QTL) without markers have generally used iterative peeling algorithms for determining genotype probabilities. These algorithms have considerable shortcomings in complex pedigrees. A Markov chain Monte Carlo (MCMC) method which samples the pedigree of the whole population jointly is described. Simultaneous sampling of the pedigree was achieved by sampling descent graphs using the Metropolis-Hastings algorithm. A descent graph describes the inheritance state of each allele and provides pedigrees guaranteed to be consistent with Mendelian sampling. Sampling descent graphs overcomes most, if not all, of the limitations incurred by iterative peeling algorithms. The algorithm was able to find the QTL in most of the simulated populations. However, when the QTL was not modeled, or was not found, its effect was ascribed to the polygenic component. No QTL were detected when they were not simulated. PMID:11742631
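Stripped of the genetics, the sampler is a Metropolis-Hastings chain; the sketch below shows the generic acceptance logic under an assumed symmetric proposal:

```python
import math
import random

def metropolis_hastings(state, propose, log_post, steps=10_000):
    """Generic Metropolis-Hastings sampler with a symmetric proposal; in
    segregation analysis, `state` would be a descent graph consistent with
    Mendelian inheritance and `propose` would perturb inheritance states."""
    lp = log_post(state)
    samples = []
    for _ in range(steps):
        cand = propose(state)
        lp_c = log_post(cand)
        if lp_c >= lp or random.random() < math.exp(lp_c - lp):
            state, lp = cand, lp_c
        samples.append(state)
    return samples
```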
Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.
Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A
2017-02-01
We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphics processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches, including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated experiments and one actual MRI experiment were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and to demonstrate the detrimental effects of the simplified treatment of tissue micro-organization adopted in previous simulators. GPU execution allowed an approximately 200× improvement in computational speed over a standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to quantitatively infer tissue composition and microstructure.
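The generalized multi-pool idea can be illustrated with a two-pool longitudinal Bloch-McConnell system; all rates below are invented and chosen to satisfy the equilibrium balance kab*M0a = kba*M0b:

```python
import numpy as np
from scipy.linalg import expm

# Two exchanging longitudinal pools: free water (a) and macromolecular (b).
R1a, R1b = 1.0 / 1.4, 1.0       # 1/T1 in s^-1 (illustrative values)
kab, kba = 2.0, 18.0            # exchange rates in s^-1 (illustrative)
M0 = np.array([0.9, 0.1])       # equilibrium pool sizes; kab*M0a == kba*M0b

A = np.array([[-R1a - kab, kba],
              [kab, -R1b - kba]])

def mz(t, mz0=np.zeros(2)):
    """Coupled longitudinal recovery after saturation, at time t (s)."""
    return M0 + expm(A * t) @ (mz0 - M0)
```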
NASA Astrophysics Data System (ADS)
Powell, C. J.; Smekal, W.; Werner, W. S. M.
2005-09-01
We describe a new NIST database for the Simulation of Electron Spectra for Surface Analysis (SESSA). This database provides data for the many parameters needed in quantitative Auger electron spectroscopy (AES) and X-ray photoelectron spectroscopy (XPS). In addition, AES and XPS spectra can be simulated for layered samples. The simulated spectra, for layer compositions and thicknesses specified by the user, can be compared with measured spectra. The layer compositions and thicknesses can then be adjusted to find maximum consistency between simulated and measured spectra. In this way, AES and XPS can provide more detailed characterization of multilayer thin-film materials. We report on the use of SESSA for determining the thicknesses of HfO2, ZrO2, HfSiO4, and ZrSiO4 films on Si by angle-resolved XPS. Practical effective attenuation lengths (EALs) have been computed from SESSA as a function of film thickness and photoelectron emission angle (i.e., to simulate the effects of tilting the sample). These EALs have been compared with similar values obtained from the NIST Electron Effective-Attenuation-Length Database (SRD 82). Generally good agreement was found between corresponding EAL values, but there were differences for film thicknesses less than the inelastic mean free path of the photoelectrons in the overlayer film. These differences are due to a simplifying approximation in the algorithm used to compute EALs in SRD 82. SESSA, with realistic cross sections for elastic and inelastic scattering in the film and substrate materials, is believed to provide more accurate EALs than SRD 82 for thin-film thickness measurements, particularly in applications where the film and substrate have different electron-scattering properties.
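In the single-attenuation-length approximation, the ARXPS thickness determination reduces to a standard exponential-attenuation relation; a sketch, assuming the pure-material reference intensities are known:

```python
import numpy as np

def overlayer_thickness(i_over, i_sub, i0_over, i0_sub, eal_nm, angle_deg):
    """Single-EAL estimate of film thickness t (nm) from the measured
    overlayer/substrate intensity ratio in ARXPS:
        i_over/i_sub = (i0_over/i0_sub) * (exp(t/(L cos a)) - 1)
    with a the emission angle from the surface normal and L the EAL."""
    cos_a = np.cos(np.radians(angle_deg))
    r = (i_over / i_sub) / (i0_over / i0_sub)
    return eal_nm * cos_a * np.log(1.0 + r)
```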
ERIC Educational Resources Information Center
Carsey, Thomas M.; Harden, Jeffrey J.
2015-01-01
Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…
Contrast of Backscattered Electron SEM Images of Nanoparticles on Substrates with Complex Structure
Kowoll, Thomas; Müller, Erich; Fritsch-Decker, Susanne; Hettler, Simon; Störmer, Heike; Weiss, Carsten; Gerthsen, Dagmar
2017-01-01
This study is concerned with backscattered electron scanning electron microscopy (BSE SEM) contrast of complex nanoscaled samples which consist of SiO2 nanoparticles (NPs) deposited on indium-tin-oxide covered bulk SiO2 and glassy carbon substrates. BSE SEM contrast of NPs is studied as a function of the primary electron energy and working distance. Contrast inversions are observed which prevent intuitive interpretation of NP contrast in terms of material contrast. Experimental data are quantitatively compared with Monte Carlo (MC) simulations. Quantitative agreement between experimental data and MC simulations is obtained if the transmission characteristics of the annular semiconductor detector are taken into account. MC simulations facilitate the understanding of NP contrast inversions and are helpful for deriving conditions for optimum material and topography contrast. PMID:29109816
PCDD/F and Aromatic Emissions from Simulated Forest and Grassland Fires
Emissions of polychlorinated dibenzodioxin and polychlorinated dibenzofuran (PCDD/F) from simulated grassland and forest fires were quantitatively sampled to derive emission factors in support of PCDD/F inventory development. Grasses from Kentucky and Minnesota; forest shrubs fro...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pardini, Tom; Aquila, Andrew; Boutet, Sebastien
2017-06-15
Numerical simulations of the current and future pulse intensity distributions at selected locations along the Far Experimental Hall, the hard X-ray section of the Linac Coherent Light Source (LCLS), are provided. Estimates are given for the pulse fluence, energy and size in and out of focus, taking into account effects due to the experimentally measured divergence of the X-ray beam, and measured figure errors of all X-ray optics in the beam path. Out-of-focus results are validated by comparison with experimental data. Previous work is expanded on, providing quantitatively correct predictions of the pulse intensity distribution. Numerical estimates in focus are particularly important given that the latter cannot be measured with direct imaging techniques due to detector damage. Finally, novel numerical estimates of improvements to the pulse intensity distribution expected as part of the on-going upgrade of the LCLS X-ray transport system are provided. As a result, we suggest how the new generation of X-ray optics to be installed would outperform the old one, satisfying the tight requirements imposed by X-ray free-electron laser facilities.
Lusch, Bethany; Weholt, Jake; Maia, Pedro D; Kutz, J Nathan
2018-06-01
The accurate diagnosis and assessment of neurodegenerative disease and traumatic brain injuries (TBI) remain open challenges. Both cause cognitive and functional deficits due to focal axonal swellings (FAS), but it is difficult to deliver a prognosis due to our limited ability to assess damaged neurons at a cellular level in vivo. We simulate the effects of neurodegenerative disease and TBI using convolutional neural networks (CNNs) as our model of cognition. We utilize biophysically relevant statistical data on FAS to damage the connections in CNNs in a functionally relevant way. We incorporate energy constraints on the brain by pruning the CNNs to be less over-engineered. Qualitatively, we demonstrate that damage leads to human-like mistakes. Our experiments also provide quantitative assessments of how accuracy is affected by various types and levels of damage. The deficit resulting from a fixed amount of damage greatly depends on which connections are randomly injured, providing intuition for why it is difficult to predict impairments. There is a large degree of subjectivity when it comes to interpreting cognitive deficits from complex systems such as the human brain. However, we provide important insight and a quantitative framework for disorders in which FAS are implicated.
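A minimal sketch of weight-level damage of the kind described, as a simplified stand-in for the biophysically derived FAS statistics used in the paper:

```python
import numpy as np

def damage_connections(w, frac=0.1, atten=0.25, seed=0):
    """Injure a random fraction `frac` of a weight tensor by attenuating the
    affected connections; a simplified stand-in for the FAS-informed damage
    statistics used in the paper."""
    rng = np.random.default_rng(seed)
    w = w.copy()
    mask = rng.random(w.shape) < frac
    w[mask] *= atten
    return w

# Sweeping `frac` over each layer's weights and re-evaluating test accuracy
# yields an accuracy-versus-damage curve for a trained CNN.
```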