Sample records for summary simulation model

  1. Implementing and Evaluating an Innovative Approach to Simulation Training Acquisitions

    DTIC Science & Technology

    2006-01-01

    ...business model, compares it with other approaches for buying simulations and simulation training, reviews economic theories relevant to the model, and ... Points in Common with Other Approaches but Also Some Distinctive Characteristics ... The Economic ... Appropriate? ... Summary of Key Findings from Economic Theory ... In the wake of the failure of the Joint...

  2. A simulations approach for meta-analysis of genetic association studies based on additive genetic model.

    PubMed

    John, Majnu; Lencz, Todd; Malhotra, Anil K; Correll, Christoph U; Zhang, Jian-Ping

    2018-06-01

    Meta-analysis of genetic association studies is being increasingly used to assess phenotypic differences between genotype groups. When the underlying genetic model is assumed to be dominant or recessive, assessing the phenotype differences based on summary statistics, reported for individual studies in a meta-analysis, is a valid strategy. However, when the genetic model is additive, a similar strategy based on summary statistics will lead to biased results. This fact about the additive model is one of the things that we establish in this paper, using simulations. The main goal of this paper is to present an alternate strategy for the additive model based on simulating data for the individual studies. We show that the alternate strategy is far superior to the strategy based on summary statistics.
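The strategy described above can be sketched in a few lines: rebuild individual-level data from each study's reported genotype-group summaries (mean, SD, and count per allele-count group), then estimate the per-allele effect by regressing phenotype on allele count. All numbers and helper names below are hypothetical; this is an illustration of the general idea, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-study summaries: mean, SD and n for the three
# genotype groups carrying 0, 1 or 2 copies of the risk allele.
studies = [
    {"means": [10.0, 10.5, 11.0], "sds": [2.0, 2.0, 2.0], "ns": [120, 80, 20]},
    {"means": [9.8, 10.6, 11.3], "sds": [1.8, 1.9, 2.1], "ns": [200, 150, 40]},
]

def simulate_study(s):
    """Draw individual-level phenotypes consistent with the reported summaries."""
    geno, pheno = [], []
    for g, (m, sd, n) in enumerate(zip(s["means"], s["sds"], s["ns"])):
        geno.append(np.full(n, g, dtype=float))
        pheno.append(rng.normal(m, sd, n))
    return np.concatenate(geno), np.concatenate(pheno)

def additive_effect(geno, pheno):
    """Per-allele (additive) effect: OLS slope of phenotype on allele count."""
    x = geno - geno.mean()
    return float(x @ (pheno - pheno.mean()) / (x @ x))

effects = [additive_effect(*simulate_study(s)) for s in studies]
pooled = float(np.mean(effects))  # unweighted pooling, purely for illustration
print(round(pooled, 2))
```

A real meta-analysis would weight the per-study estimates by their precision; the point here is only that the additive effect is estimated from (simulated) individual-level data rather than from collapsed group summaries.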

  3. High-Fidelity Simulation in Biomedical and Aerospace Engineering

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2005-01-01

    Contents include the following: Introduction / Background. Modeling and Simulation Challenges in Aerospace Engineering. Modeling and Simulation Challenges in Biomedical Engineering. Digital Astronaut. Project Columbia. Summary and Discussion.

  4. Identifying and Quantifying Emergent Behavior Through System of Systems Modeling and Simulation

    DTIC Science & Technology

    2015-09-01

    ...SUMMARY ... METHODOLOGY ... our research. Ptolemy is a simulation and rapid-prototyping environment developed at the University of California, Berkeley ... This chapter describes the many works used as a basis for this research. This research used the principles of Selberg's 2008...

  5. Two Models of Adhesive Debonding of Sylgard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Ralph Robert

    This report begins with a brief summary of the range of modeling methods used to simulate adhesive debonding. Then the mechanical simulation of the blister debonding test, and the thermomechanical simulation of the potted hemisphere problem are described. For both simulations, details of the chosen modeling techniques, and the reasons for choosing them (and rejecting alternate modeling approaches) will be discussed.

  6. Airport Landside. Volume II. The Airport Landside Simulation Model (ALSIM) Description and Users Guide.

    DOT National Transportation Integrated Search

    1982-06-01

    This volume provides a general description of the Airport Landside Simulation Model. A summary of simulated passenger and vehicular processing through the landside is presented. Program operating characteristics and assumptions are documented and a c...

  7. Rainfall Data Simulation

    Treesearch

    T.L. Rogerson

    1980-01-01

    A simple simulation model to predict rainfall for individual storms in central Arkansas is described. Output includes frequency distribution tables for days between storms and for storm size classes; a storm summary by day number (January 1 = 1 and December 31 = 365) and rainfall amount; and an annual storm summary that includes monthly values for rainfall and number...

  8. Why Summary Comorbidity Measures Such As the Charlson Comorbidity Index and Elixhauser Score Work.

    PubMed

    Austin, Steven R; Wong, Yu-Ning; Uzzo, Robert G; Beck, J Robert; Egleston, Brian L

    2015-09-01

    Comorbidity adjustment is an important component of health services research and clinical prognosis. When adjusting for comorbidities in statistical models, researchers can include comorbidities individually or through the use of summary measures such as the Charlson Comorbidity Index or Elixhauser score. We examined the conditions under which individual versus summary measures are most appropriate. We provide an analytic proof of the utility of comorbidity summary measures when used in place of individual comorbidities. We compared the use of the Charlson and Elixhauser scores versus individual comorbidities in prognostic models using a SEER-Medicare data example. We examined the ability of summary comorbidity measures to adjust for confounding using simulations. We devised a mathematical proof that found that the comorbidity summary measures are appropriate prognostic or adjustment mechanisms in survival analyses. Once one knows the comorbidity score, no other information about the comorbidity variables used to create the score is generally needed. Our data example and simulations largely confirmed this finding. Summary comorbidity measures, such as the Charlson Comorbidity Index and Elixhauser scores, are commonly used for clinical prognosis and comorbidity adjustment. We have provided a theoretical justification that validates the use of such scores under many conditions. Our simulations generally confirm the utility of the summary comorbidity measures as substitutes for use of the individual comorbidity variables in health services research. One caveat is that a summary measure may only be as good as the variables used to create it.
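The gist of the analytic result, that once the score is known the individual comorbidity indicators add essentially nothing, can be seen in a toy simulation. Everything here (five binary comorbidities, the weights, the linear outcome) is an invented illustration, not the paper's proof:

```python
import numpy as np

rng = np.random.default_rng(3)

n, p = 2000, 5
X = (rng.random((n, p)) < 0.3).astype(float)   # five binary comorbidities
weights = np.array([1.0, 1.0, 2.0, 3.0, 6.0])  # hypothetical index weights
score = X @ weights                            # summary comorbidity score
y = 0.5 * score + rng.normal(0, 1, n)          # outcome acts only through the score

def ols_fitted(Z, y):
    """Fitted values from OLS with an intercept."""
    Z1 = np.column_stack([np.ones(len(Z)), Z])
    beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)
    return Z1 @ beta

fit_full = ols_fitted(X, y)                # adjust for each comorbidity separately
fit_score = ols_fitted(score[:, None], y)  # adjust for the summary score alone
print(round(float(np.corrcoef(fit_full, fit_score)[0, 1]), 3))
```

When the outcome truly depends on the comorbidities only through the score, the two sets of fitted values are nearly indistinguishable; the caveat noted in the abstract applies when that assumption fails.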

  9. Project 0-1800 : NAFTA impacts on operations : executive summary

    DOT National Transportation Integrated Search

    2001-07-01

    Project 0-1800 pioneered the use of modern micro-simulation models to analyze the complex procedures involved in international border crossing in Texas. Animated models simulate the entire southbound commercial traffic flow in two important internati...

  10. Documentation of the dynamic parameter, water-use, stream and lake flow routing, and two summary output modules and updates to surface-depression storage simulation and initial conditions specification options with the Precipitation-Runoff Modeling System (PRMS)

    USGS Publications Warehouse

    Regan, R. Steve; LaFontaine, Jacob H.

    2017-10-05

    This report documents seven enhancements to the U.S. Geological Survey (USGS) Precipitation-Runoff Modeling System (PRMS) hydrologic simulation code: two time-series input options, two new output options, and three updates of existing capabilities. The enhancements are (1) new dynamic parameter module, (2) new water-use module, (3) new Hydrologic Response Unit (HRU) summary output module, (4) new basin variables summary output module, (5) new stream and lake flow routing module, (6) update to surface-depression storage and flow simulation, and (7) update to the initial-conditions specification. This report relies heavily upon U.S. Geological Survey Techniques and Methods, book 6, chapter B7, which documents PRMS version 4 (PRMS-IV). A brief description of PRMS is included in this report.

  11. Computer Simulation for Emergency Incident Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  12. Phase II, improved work zone design guidelines and enhanced model of traffic delays in work zones : executive summary report.

    DOT National Transportation Integrated Search

    2009-03-01

    This project contains three major parts. In the first part a digital computer simulation model was developed with the aim to model the traffic through a freeway work zone situation. The model was based on the Arena simulation software and used cumula...

  13. Statistical Compression for Climate Model Output

    NASA Astrophysics Data System (ADS)

    Hammerling, D.; Guinness, J.; Soh, Y. J.

    2017-12-01

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
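A toy 1-D version of the compress/decompress idea (not the authors' algorithm, which handles nonstationary global fields): the stored summaries are per-block means and SDs; decompression is either the conditional expectation (oversmoothed) or a conditional simulation that re-injects small-scale noise while preserving each block's stored mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "climate series": one year of daily values, smooth cycle plus noise
t = np.arange(365)
data = 10 + 8 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1.5, 365)

block = 5  # compress 365 daily values into 73 block summaries

def compress(x, block):
    """Summary statistics: per-block mean and within-block SD."""
    xb = x.reshape(-1, block)
    return xb.mean(axis=1), xb.std(axis=1)

def decompress_expectation(means, block):
    """Conditional expectation: best estimate, but too smooth."""
    return np.repeat(means, block)

def decompress_simulation(means, sds, block, rng):
    """Conditional simulation: add noise, re-centred per block so the
    stored block means are reproduced exactly."""
    noise = rng.normal(0, np.repeat(sds, block)).reshape(-1, block)
    noise -= noise.mean(axis=1, keepdims=True)
    return (means[:, None] + noise).ravel()

means, sds = compress(data, block)
smooth = decompress_expectation(means, block)
rough = decompress_simulation(means, sds, block, rng)
print(round(float(data.std()), 2), round(float(smooth.std()), 2),
      round(float(rough.std()), 2))
```

The simulated decompression restores variability that the conditional expectation smooths away, mirroring the "neither too smooth nor too rough" property described above.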

  14. Summary of a Modeling and Simulation Framework for High-Fidelity Weapon Models in Joint Semi-Automated Forces (JSAF) and Other Mission-Simulation Software

    DTIC Science & Technology

    2008-05-01

    ...communicate with other weapon models in a mission-level simulation; (3) introduces the four configuration levels of the M&S framework; and (4) presents a cost ... and Disadvantages ... Cost-Effective M&S Laboratory Plan ... Weapon Model Sample Time and Average TET Displayed on the Target PC ... Design and Cost of an...

  15. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…

  16. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time.

    PubMed

    Dhar, Amrit; Minin, Vladimir N

    2017-05-01

    Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences.

  17. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time

    PubMed Central

    Dhar, Amrit

    2017-01-01

    Abstract Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences. PMID:28177780

  18. Performance Summary of the 2006 Community Multiscale Air Quality (CMAQ) Simulation for the AQMEII Project: North American Application

    EPA Science Inventory

    The CMAQ modeling system has been used to simulate the CONUS using 12-km by 12-km horizontal grid spacing for the entire year of 2006 as part of the Air Quality Model Evaluation International initiative (AQMEII). The operational model performance for O3 and PM2.5<...

  19. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation summary for DMA program.

    DOT National Transportation Integrated Search

    2017-07-04

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Application (DMA) connected vehicle applications and Active Transportation and Demand management (ATDM)...

  20. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : summary report for the Chicago testbed.

    DOT National Transportation Integrated Search

    2017-04-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  1. Selected Urban Simulations and Games. IFF Working Paper WP-4.

    ERIC Educational Resources Information Center

    Nagelberg, Mark; Little, Dennis L.

    Summary descriptions of selected urban simulations and games that have been developed outside the Institute For The Future are presented. The operating characteristics and potential applications of each model are described. These include (1) the history of development, (2) model and player requirements, (3) a description of the environment being…

  2. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation summary for ATDM program.

    DOT National Transportation Integrated Search

    2017-07-04

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Application (DMA) connected vehicle applications and Active Transportation and Demand Management (ATDM...

  3. Research highlights of the global modeling and simulation branch for 1986-1987

    NASA Technical Reports Server (NTRS)

    Baker, Wayman (Editor); Susskind, Joel (Editor); Pfaendtner, James (Editor); Randall, David (Editor); Atlas, Robert (Editor)

    1988-01-01

    This document provides a summary of the research conducted in the Global Modeling and Simulation Branch and highlights the most significant accomplishments in 1986 to 1987. The Branch has been the focal point for global weather and climate prediction research in the Laboratory for Atmospheres through the retrieval and use of satellite data, the development of global models and data assimilation techniques, the simulation of future observing systems, and the performance of atmospheric diagnostic studies.

  4. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition.

    PubMed

    Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2014-06-05

    In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. 
Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies.
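Step 2 of the workflow above, fitting the monotone dose-response curve by isotonic regression, can be sketched with a bare-bones pool-adjacent-violators (PAVA) routine. The dose grid and viability values are hypothetical:

```python
import numpy as np

# Hypothetical viability fractions at increasing doses; an isotonic
# (here: non-increasing) fit pools adjacent violators to enforce monotonicity.
doses = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
viability = np.array([0.98, 1.02, 0.70, 0.75, 0.20])  # noisy, not monotone

def pava_decreasing(y):
    """Pool-adjacent-violators for a non-increasing fit (unit weights)."""
    y = list(-np.asarray(y, dtype=float))  # negate: solve the non-decreasing case
    blocks = [[v, 1] for v in y]           # each block holds [mean, size]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:        # violator: pool the two blocks
            total = blocks[i][0] * blocks[i][1] + blocks[i + 1][0] * blocks[i + 1][1]
            size = blocks[i][1] + blocks[i + 1][1]
            blocks[i] = [total / size, size]
            del blocks[i + 1]
            i = max(i - 1, 0)                      # pooling may create new violators
        else:
            i += 1
    fit = np.concatenate([[b[0]] * b[1] for b in blocks])
    return -fit

fit = pava_decreasing(viability)
print(np.round(fit, 3))
```

Pooling replaces each run of violating points with their mean, yielding the closest non-increasing curve in least squares.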

  5. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition

    PubMed Central

    2014-01-01

    Background: In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results: First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Conclusion: Time independent summary statistics may aid the understanding of drugs’ action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies. PMID:24902483

  6. Mathematical modeling and simulation of the space shuttle imaging radar antennas

    NASA Technical Reports Server (NTRS)

    Campbell, R. W.; Melick, K. E.; Coffey, E. L., III

    1978-01-01

    Simulations of space shuttle synthetic aperture radar antennas under the influence of space environmental conditions were carried out at L, C, and X-band. Mathematical difficulties in modeling large, non-planar array antennas are discussed, and an approximate modeling technique is presented. Results for several antenna error conditions are illustrated in far-field profile patterns, earth surface footprint contours, and summary graphs.

  7. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation summary for the San Diego testbed

    DOT National Transportation Integrated Search

    2017-08-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  8. Investigating the Cosmic Web with Topological Data Analysis

    NASA Astrophysics Data System (ADS)

    Cisewski-Kehe, Jessi; Wu, Mike; Fasy, Brittany; Hellwing, Wojciech; Lovell, Mark; Rinaldo, Alessandro; Wasserman, Larry

    2018-01-01

    Data exhibiting complicated spatial structures are common in many areas of science (e.g. cosmology, biology), but can be difficult to analyze. Persistent homology is a popular approach within the area of Topological Data Analysis that offers a new way to represent, visualize, and interpret complex data by extracting topological features, which can be used to infer properties of the underlying structures. In particular, TDA may be useful for analyzing the large-scale structure (LSS) of the Universe, which is an intricate and spatially complex web of matter. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each point in the 3D data set represents a galaxy or a cluster of galaxies, and topological summaries ("persistence diagrams") can be obtained summarizing the different ordered holes in the data (e.g. connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries would provide a way to make more rigorous comparisons of LSS under different theoretical models. For example, the received cosmological model has cold dark matter (CDM); however, while the case is strong for CDM, there are some observational inconsistencies with this theory. Another possibility is warm dark matter (WDM). It is of interest to see if a CDM Universe and WDM Universe produce LSS that is topologically distinct. We present several possible test statistics for two-sample hypothesis tests using the topological summaries, carry out a simulation study to investigate the suitability of the proposed test statistics using simulated data from a variation of the Voronoi foam model, and finally we apply the proposed inference framework to WDM vs. CDM cosmological simulation data.
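One of the simplest such two-sample comparisons can be sketched as a permutation test on a scalar topological summary per simulated universe. The counts below are invented placeholders; real inputs would be summaries extracted from persistence diagrams:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical scalar topological summaries (e.g. loop counts) from two
# suites of simulated universes; the values are invented for illustration.
cdm = np.array([101.0, 98.0, 105.0, 99.0, 103.0, 100.0, 97.0, 104.0])
wdm = np.array([110.0, 108.0, 113.0, 107.0, 111.0, 109.0, 112.0, 106.0])

def perm_test(a, b, n_perm=5000, rng=rng):
    """Two-sample permutation test on the absolute difference of means."""
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabelling of the two suites
        if abs(pooled[:len(a)].mean() - pooled[len(a):].mean()) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

p = perm_test(cdm, wdm)
print(p < 0.01)
```

With summaries this well separated, the permutation p-value is essentially the smallest attainable; overlapping suites would give a large p-value instead.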

  9. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — summary report for the Chicago testbed. [supporting datasets - Chicago Testbed

    DOT National Transportation Integrated Search

    2017-04-01

    The datasets in this zip file are in support of Intelligent Transportation Systems Joint Program Office (ITS JPO) report FHWA-JPO-16-385, "Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applica...

  10. Approximate Bayesian Computation Using Markov Chain Monte Carlo Simulation: Theory, Concepts, and Applications

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Vrugt, J. A.

    2013-12-01

    The ever increasing pace of computational power, along with continued advances in measurement technologies and improvements in process understanding has stimulated the development of increasingly complex hydrologic models that simulate soil moisture flow, groundwater recharge, surface runoff, root water uptake, and river discharge at increasingly finer spatial and temporal scales. Reconciling these system models with field and remote sensing data is a difficult task, particularly because average measures of model/data similarity inherently lack the power to provide a meaningful comparative evaluation of the consistency in model form and function. The very construction of the likelihood function - as a summary variable of the (usually averaged) properties of the error residuals - dilutes and mixes the available information into an index having little remaining correspondence to specific behaviors of the system (Gupta et al., 2008). The quest for a more powerful method for model evaluation has inspired Vrugt and Sadegh [2013] to introduce "likelihood-free" inference as vehicle for diagnostic model evaluation. This class of methods is also referred to as Approximate Bayesian Computation (ABC) and relaxes the need for an explicit likelihood function in favor of one or multiple different summary statistics rooted in hydrologic theory that together have a much stronger and compelling diagnostic power than some aggregated measure of the size of the error residuals. Here, we will introduce an efficient ABC sampling method that is orders of magnitude faster in exploring the posterior parameter distribution than commonly used rejection and Population Monte Carlo (PMC) samplers. Our methodology uses Markov Chain Monte Carlo simulation with DREAM, and takes advantage of a simple computational trick to resolve discontinuity problems with the application of set-theoretic summary statistics. 
We will also demonstrate a set of summary statistics that are rather insensitive to errors in the forcing data. This enhances prospects of detecting model structural deficiencies.
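A minimal rejection-ABC sketch of the likelihood-free idea described above (the DREAM-based MCMC sampler in the abstract is far more efficient; this toy uses plain rejection with a single invented summary statistic and synthetic data):

```python
import numpy as np

rng = np.random.default_rng(2)

# "Observed" data from a hypothetical model with true parameter theta = 3.0
observed = rng.normal(3.0, 1.0, 200)

def summary(x):
    """Summary statistic standing in for an explicit likelihood (here: the mean)."""
    return x.mean()

def simulate(theta, rng):
    """Forward model: draw a synthetic dataset for a candidate parameter."""
    return rng.normal(theta, 1.0, 200)

# Rejection ABC: keep prior draws whose simulated summary falls within
# tolerance eps of the observed summary.
eps = 0.05
prior = rng.uniform(0, 6, 20000)
accepted = [th for th in prior
            if abs(summary(simulate(th, rng)) - summary(observed)) < eps]
posterior_mean = float(np.mean(accepted))
print(round(posterior_mean, 1))
```

The accepted draws form an approximate posterior sample concentrated near the true parameter; tightening `eps` sharpens the approximation at the cost of fewer acceptances.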

  11. Chaste: A test-driven approach to software development for biological modelling

    NASA Astrophysics Data System (ADS)

    Pitt-Francis, Joe; Pathmanathan, Pras; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Fletcher, Alexander G.; Mirams, Gary R.; Murray, Philip; Osborne, James M.; Walter, Alex; Chapman, S. Jon; Garny, Alan; van Leeuwen, Ingeborg M. M.; Maini, Philip K.; Rodríguez, Blanca; Waters, Sarah L.; Whiteley, Jonathan P.; Byrne, Helen M.; Gavaghan, David J.

    2009-12-01

    Chaste ('Cancer, heart and soft-tissue environment') is a software library and a set of test suites for computational simulations in the domain of biology. Current functionality has arisen from modelling in the fields of cancer, cardiac physiology and soft-tissue mechanics. It is released under the LGPL 2.1 licence. Chaste has been developed using agile programming methods. The project began in 2005 when it was reasoned that the modelling of a variety of physiological phenomena required both a generic mathematical modelling framework, and a generic computational/simulation framework. The Chaste project evolved from the Integrative Biology (IB) e-Science Project, an inter-institutional project aimed at developing a suitable IT infrastructure to support physiome-level computational modelling, with a primary focus on cardiac and cancer modelling. Program summary. Program title: Chaste. Catalogue identifier: AEFD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: LGPL 2.1. No. of lines in distributed program, including test data, etc.: 5 407 321. No. of bytes in distributed program, including test data, etc.: 42 004 554. Distribution format: tar.gz. Programming language: C++. Operating system: Unix. Has the code been vectorised or parallelized?: Yes. Parallelized using MPI. RAM: <90 Megabytes for two of the scenarios described in Section 6 of the manuscript (Monodomain re-entry on a slab or Cylindrical crypt simulation). Up to 16 Gigabytes (distributed across processors) for full resolution bidomain cardiac simulation. Classification: 3. External routines: Boost, CodeSynthesis XSD, CxxTest, HDF5, METIS, MPI, PETSc, Triangle, Xerces. Nature of problem: Chaste may be used for solving coupled ODE and PDE systems arising from modelling biological systems. Use of Chaste in two application areas is described in this paper: cardiac electrophysiology and intestinal crypt dynamics. Solution method: Coupled multi-physics with PDE, ODE and discrete mechanics simulation. Running time: The largest cardiac simulation described in the manuscript takes about 6 hours to run on a single 3 GHz core. See results section (Section 6) of the manuscript for discussion on parallel scaling.

  12. Model-Observation "Data Cubes" for the DOE Atmospheric Radiation Measurement Program's LES ARM Symbiotic Simulation and Observation (LASSO) Workflow

    NASA Astrophysics Data System (ADS)

    Vogelmann, A. M.; Gustafson, W. I., Jr.; Toto, T.; Endo, S.; Cheng, X.; Li, Z.; Xiao, H.

    2015-12-01

    The Department of Energy's Atmospheric Radiation Measurement (ARM) Climate Research Facility's Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) Workflow is currently being designed to provide output from routine LES to complement its extensive observations. The modeling portion of the LASSO workflow, presented by Gustafson et al., will initially focus on shallow convection over the ARM megasite in Oklahoma, USA. This presentation describes how the LES output will be combined with observations to construct multi-dimensional and dynamically consistent "data cubes", aimed at providing the best description of the atmospheric state for use in analyses by the community. The megasite observations are used to constrain large-eddy simulations that provide complete spatial and temporal coverage of observables and, further, provide information on processes that cannot be observed. Statistical comparisons of model output with their observables are used to assess the quality of a given simulated realization and its associated uncertainties. A data cube is a model-observation package that provides: (1) metrics of model-observation statistical summaries to assess the simulations and the ensemble spread; (2) statistical summaries of additional model property output that cannot be or are very difficult to observe; and (3) snapshots of the 4-D simulated fields from the integration period. Searchable metrics are provided that characterize the general atmospheric state to assist users in finding cases of interest, such as categorization of daily weather conditions and their specific attributes. The data cubes will be accompanied by tools designed for easy access to cube contents from within the ARM archive and externally, the ability to compare multiple data streams within an event as well as across events, and the ability to use common grids and time sampling, where appropriate.

  13. Probabilistic Evaluation of Competing Climate Models

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Chatterjee, S.; Heyman, M.; Cressie, N.

    2017-12-01

    A standard paradigm for assessing the quality of climate model simulations is to compare what these models produce for past and present time periods, to observations of the past and present. Many of these comparisons are based on simple summary statistics called metrics. Here, we propose an alternative: evaluation of competing climate models through probabilities derived from tests of the hypothesis that climate-model-simulated and observed time sequences share common climate-scale signals. The probabilities are based on the behavior of summary statistics of climate model output and observational data, over ensembles of pseudo-realizations. These are obtained by partitioning the original time sequences into signal and noise components, and using a parametric bootstrap to create pseudo-realizations of the noise sequences. The statistics we choose come from working in the space of decorrelated and dimension-reduced wavelet coefficients. We compare monthly sequences of CMIP5 model output of average global near-surface temperature anomalies to similar sequences obtained from the well-known HadCRUT4 data set, as an illustration.
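
    The testing scheme described above can be illustrated in miniature (Python; a moving-average split stands in for the paper's wavelet-based decomposition, and all sequences here are synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth(x, w=12):
    # crude "climate-scale signal": centered moving average, standing in
    # for the paper's wavelet-based decomposition
    return np.convolve(x, np.ones(w) / w, mode="same")

def bootstrap_pvalue(obs, sim, n_boot=500):
    # split each sequence into signal and noise, then build the null
    # distribution of the discrepancy from parametric pseudo-realizations
    sig_o, sig_s = smooth(obs), smooth(sim)
    noise_o, noise_s = obs - sig_o, sim - sig_s
    common = (sig_o + sig_s) / 2             # shared signal under the null
    stat = np.mean((sig_o - sig_s) ** 2)     # observed discrepancy
    null = np.empty(n_boot)
    for b in range(n_boot):
        o = common + rng.normal(0.0, noise_o.std(), obs.size)
        s = common + rng.normal(0.0, noise_s.std(), sim.size)
        null[b] = np.mean((smooth(o) - smooth(s)) ** 2)
    return np.mean(null >= stat)             # bootstrap p-value

t = np.linspace(0.0, 10.0, 240)
obs = np.sin(t) + rng.normal(0.0, 0.3, t.size)
sim_ok = np.sin(t) + rng.normal(0.0, 0.3, t.size)   # shares the signal
sim_bad = np.cos(t) + rng.normal(0.0, 0.3, t.size)  # different signal
p_ok = bootstrap_pvalue(obs, sim_ok)
p_bad = bootstrap_pvalue(obs, sim_bad)
print(p_ok, p_bad)
```

    A simulation sharing the observed signal yields an unremarkable p-value, while one with a different signal is rejected.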

  14. Making objective summaries of climate model behavior more accessible

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2016-12-01

    For multiple reasons, a more efficient and systematic evaluation of publicly available climate model simulations is urgently needed. The IPCC, national assessments, and an assortment of other public and policy-driven needs place taxing demands on researchers. While cutting-edge research is essential to meeting these needs, so too are results from well-established analyses, and these should be more efficiently produced, widely accessible, and highly traceable. Furthermore, the number of simulations used by the research community is already large and is expected to increase dramatically with the 6th phase of the Coupled Model Intercomparison Project (CMIP6). To help meet the demands on the research community and synthesize results from the rapidly expanding number and complexity of model simulations, well-established characteristics from all CMIP DECK (Diagnosis, Evaluation and Characterization of Klima) experiments will be routinely produced and made accessible. This presentation highlights the PCMDI Metrics Package (PMP), a capability designed to provide a diverse suite of objective summary statistics across spatial and temporal scales, gauging the agreement between models and observations. In addition to the PMP, ESMValTool is being developed to broadly diagnose CMIP simulations, and a variety of other packages target specialized sets of analysis. The challenges and opportunities of working towards coordinating these community-based capabilities will be discussed.

  15. Modeling Nonstationarity in Space and Time

    PubMed Central

    2017-01-01

    Summary: We propose to model a spatio-temporal random field that has nonstationary covariance structure in both space and time domains by applying the concept of the dimension expansion method in Bornn et al. (2012). Simulations are conducted for both separable and nonseparable space-time covariance models, and the model is also illustrated with a streamflow dataset. Both simulation and data analyses show that modeling nonstationarity in both space and time can improve the predictive performance over stationary covariance models or models that are nonstationary in space but stationary in time. PMID:28134977

  16. Development Of Maneuvering Autopilot For Flight Tests

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Walker, R. A.

    1992-01-01

    Report describes recent efforts to develop automatic control system operating under supervision of pilot and making airplane follow prescribed trajectories during flight tests. Report represents additional progress on this project. Gives background information on technology of control of test-flight trajectories; presents mathematical models of airframe, engine and command-augmentation system; focuses on mathematical modeling of maneuvers; addresses design of autopilots for maneuvers; discusses numerical simulation and evaluation of results of simulation of eight maneuvers under control of simulated autopilot; and presents summary and discussion of future work.

  17. Independent Verification and Validation of the Global Deployment Analysis System (GDAS). Phase 2 Summary

    DTIC Science & Technology

    1991-06-28

    and examined various models as possible alternatives to TRANSMO. None of the candidate models met all CAA’s requirements, so a major TERP recommendation...will simulate the mobilization of U.S. forces, deployment of forces and supplies across an intertheater network, and deployment of forces and... supplies to the combat zone. 1.2 Phase II IV&V Summary Potomac Systems Engineering, Inc. (PSE), is providing IV&V support to CAA during the GDAS development

  18. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.
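
    The inspection-policy trade-off can be mimicked with a toy Monte Carlo model (linear crack growth and illustrative parameters, not the APU study's data): a shorter inspection interval leaves less room for a crack to grow from the rejection size to the failure size between checks.

```python
import numpy as np

rng = np.random.default_rng(5)

def failure_prob(life=8000, interval=2000, reject_size=0.5,
                 fail_size=1.0, n_wheels=20_000):
    # Toy model: cracks initiate at a random hour and grow linearly; a
    # wheel fails if a crack reaches fail_size before an inspection finds
    # it at reject_size. All parameters are illustrative, not APU data.
    failures = 0
    for _ in range(n_wheels):
        t_init = rng.uniform(0, 2 * life)     # crack initiation time (h)
        growth = rng.uniform(1e-4, 5e-4)      # crack growth per hour
        for end in range(interval, life + 1, interval):
            size = max(0.0, end - t_init) * growth
            if size >= fail_size:             # grew past failure undetected
                failures += 1
                break
            if size >= reject_size:           # caught and rejected in time
                break
    return failures / n_wheels

p_short = failure_prob(interval=1000)         # inspect frequently
p_long = failure_prob(interval=4000)          # inspect rarely
print(p_short, p_long)
```

    With these growth rates a 1000-hour interval always catches cracks between the rejection and failure sizes, while a 4000-hour interval lets some grow past both between inspections.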

  19. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  20. Freight Transportation Energy Use : Volume 1. Summary and Baseline Results.

    DOT National Transportation Integrated Search

    1978-07-01

    The overall design of the TSC Freight Energy Model is presented. A hierarchical modeling strategy is used, in which detailed modal simulators estimate the performance characteristics of transportation network elements, and the estimates are input to ...

  1. An Overview of Mesoscale Modeling Software for Energetic Materials Research

    DTIC Science & Technology

    2010-03-01

    12 2.9 Large-scale Atomic/Molecular Massively Parallel Simulator ( LAMMPS ...13 Table 10. LAMMPS summary...extensive reviews, lectures and workshops are available on multiscale modeling of materials applications (76-78). • Multi-phase mixtures of

  2. SMMP v. 3.0—Simulating proteins and protein interactions in Python and Fortran

    NASA Astrophysics Data System (ADS)

    Meinke, Jan H.; Mohanty, Sandipan; Eisenmenger, Frank; Hansmann, Ulrich H. E.

    2008-03-01

    We describe a revised and updated version of the program package SMMP. SMMP is an open-source FORTRAN package for molecular simulation of proteins within the standard geometry model. It is designed as a simple and inexpensive tool for researchers and students to become familiar with protein simulation techniques. SMMP 3.0 sports a revised API increasing its flexibility, an implementation of the Lund force field, multi-molecule simulations, a parallel implementation of the energy function, Python bindings, and more. Program summary: Title of program: SMMP. Catalogue identifier: ADOJ_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADOJ_v3_0.html. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. Programming language used: FORTRAN, Python. No. of lines in distributed program, including test data, etc.: 52 105. No. of bytes in distributed program, including test data, etc.: 599 150. Distribution format: tar.gz. Computer: Platform independent. Operating system: OS independent. RAM: 2 Mbytes. Classification: 3. Does the new version supersede the previous version?: Yes. Nature of problem: Molecular mechanics computations and Monte Carlo simulation of proteins. Solution method: Utilizes ECEPP2/3, FLEX, and Lund potentials. Includes Monte Carlo simulation algorithms for canonical, as well as for generalized ensembles. Reasons for new version: API changes and increased functionality. Summary of revisions: Added Lund potential; parameters used in subroutines are now passed as arguments; multi-molecule simulations; parallelized energy calculation for ECEPP; Python bindings. Restrictions: The consumed CPU time increases with the size of the protein molecule. Running time: Depends on the size of the simulated molecule.

  3. Teleoperator and robotics system analysis

    NASA Technical Reports Server (NTRS)

    Teoh, William

    1987-01-01

    The Orbital Maneuvering Vehicle (OMV) was designed to operate as a remotely controlled space teleoperator. The design and implementation of OMM (a mathematical model of the OMV) are discussed. The State Vector Transformation Module (SVX), an interface between the OMV simulation model and the mobile base (TOM-B) of the flat floor simulation system is described. A summary of testing procedures and conclusions are presented together with the test data obtained.

  4. Design analysis and computer-aided performance evaluation of shuttle orbiter electrical power system. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Studies were conducted to develop appropriate space shuttle electrical power distribution and control (EPDC) subsystem simulation models and to apply the computer simulations to systems analysis of the EPDC. A previously developed software program (SYSTID) was adapted for this purpose. The following objectives were attained: (1) significant enhancement of the SYSTID time domain simulation software, (2) generation of functionally useful shuttle EPDC element models, and (3) illustrative simulation results in the analysis of EPDC performance, under the conditions of fault, current pulse injection due to lightning, and circuit protection sizing and reaction times.

  5. Emissions Models and Other Methods to Produce Emission Inventories

    EPA Pesticide Factsheets

    An emissions inventory is a summary or forecast of the emissions produced by a group of sources in a given time period. Inventories of air pollution from mobile sources are often produced by models such as the MOtor Vehicle Emission Simulator (MOVES).

  6. System Dynamics Modeling for Proactive Intelligence

    DTIC Science & Technology

    2010-01-01

    5  4. Modeling Resources as Part of an Integrated Multi- Methodology System .................. 16  5. Formalizing Pro-Active...Observable Data With and Without Simulation Analysis ............................... 15  Figure 13. Summary of Probe Methodology and Results...Strategy ............................................................................. 22  Figure 22. Overview of Methodology

  7. A study of remote sensing as applied to regional and small watersheds. Volume 1: Summary report

    NASA Technical Reports Server (NTRS)

    Ambaruch, R.

    1974-01-01

    The accuracy of remotely sensed measurements to provide inputs to hydrologic models of watersheds is studied. A series of sensitivity analyses on continuous simulation models of three watersheds determined: (1)Optimal values and permissible tolerances of inputs to achieve accurate simulation of streamflow from the watersheds; (2) Which model inputs can be quantified from remote sensing, directly, indirectly or by inference; and (3) How accurate remotely sensed measurements (from spacecraft or aircraft) must be to provide a basis for quantifying model inputs within permissible tolerances.

  8. 0-6629: Texas-specific drive cycles and idle emissions rates for use with EPA's MOVES model, [project summary].

    DOT National Transportation Integrated Search

    2013-08-01

    The U.S. Environmental Protection Agency's newest emissions model, Motor Vehicle Emission Simulator (MOVES), enables users to use local drive schedules (representative vehicle speed profiles) in order to perform an accurate analysis of emi...

  9. skeleSim: an extensible, general framework for population genetic simulation in R.

    PubMed

    Parobek, Christian M; Archer, Frederick I; DePrenger-Levin, Michelle E; Hoban, Sean M; Liggins, Libby; Strand, Allan E

    2017-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these softwares' complex capabilities, composing code and input files, a daunting bioinformatics barrier and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can 'wrap' around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). © 2016 John Wiley & Sons Ltd.
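
    The kind of simulation-based null distribution described above can be sketched without any external engine (a bare-bones Python Wright-Fisher simulator as a stand-in; skeleSim itself drives fastsimcoal2 and rmetasim from R):

```python
import numpy as np

rng = np.random.default_rng(4)

def drift_het(n_pop=100, n_gen=50, p0=0.5):
    # one forward-time Wright-Fisher replicate at a single biallelic
    # locus: each generation, resample 2N allele copies binomially
    p = p0
    for _ in range(n_gen):
        p = rng.binomial(2 * n_pop, p) / (2 * n_pop)
    return 2 * p * (1 - p)          # expected-heterozygosity summary

# null distribution of the summary statistic under pure drift
null = np.array([drift_het() for _ in range(1000)])
print(round(null.mean(), 3))
```

    An observed heterozygosity can then be compared against this null distribution, exactly as skeleSim does for richer statistics and demographic models.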

  10. skeleSim: an extensible, general framework for population genetic simulation in R

    PubMed Central

    Parobek, Christian M.; Archer, Frederick I.; DePrenger-Levin, Michelle E.; Hoban, Sean M.; Liggins, Libby; Strand, Allan E.

    2016-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these softwares’ complex capabilities, composing code and input files, a daunting bioinformatics barrier, and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics, and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can ‘wrap’ around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). PMID:27736016

  11. Effects of Symbolic Modeling on Children's Interpersonal Aggression.

    ERIC Educational Resources Information Center

    Liebert, Robert M.; Baron, Robert A.

    Does exposure to symbolically modeled aggression (aggression in cartoons, movies, stories and simulated television programs) increase children's willingness to engage in behavior which might actually harm another human being? This paper presents a summary of three recent experiments offering affirmative answers to the question. A fourth experiment…

  12. On the use and misuse of scalar scores of confounders in design and analysis of observational studies.

    PubMed

    Pfeiffer, R M; Riedl, R

    2015-08-15

    We assess the asymptotic bias of estimates of exposure effects conditional on covariates when summary scores of confounders, instead of the confounders themselves, are used to analyze observational data. First, we study regression models for cohort data that are adjusted for summary scores. Second, we derive the asymptotic bias for case-control studies when cases and controls are matched on a summary score, and then analyzed either using conditional logistic regression or unconditional logistic regression adjusted for the summary score. Two scores, the propensity score (PS) and the disease risk score (DRS), are studied in detail. For cohort analysis, when regression models are adjusted for the PS, the estimated conditional treatment effect is unbiased only for linear models, or at the null for non-linear models. Adjustment of cohort data for the DRS yields unbiased estimates only for linear regression; all other estimates of exposure effects are biased. Matching cases and controls on the DRS and analyzing them using conditional logistic regression yields unbiased estimates of exposure effect, whereas adjusting for the DRS in unconditional logistic regression yields biased estimates, even under the null hypothesis of no association. Matching cases and controls on the PS yields unbiased estimates only under the null for both conditional and unconditional logistic regression adjusted for the PS. We study the bias for various confounding scenarios and compare our asymptotic results with those from simulations with limited sample sizes. To create realistic correlations among multiple confounders, we also based simulations on a real dataset. Copyright © 2015 John Wiley & Sons, Ltd.
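
    The linear-model result for the PS can be checked with a small simulation (a hypothetical data-generating process, not the authors' setup): in a linear outcome model, adjusting for the true propensity score recovers the exposure effect, while the unadjusted estimate stays confounded.

```python
import numpy as np

# Hypothetical data-generating process: one confounder x, binary
# exposure a with true propensity score expit(x), linear outcome model.
rng = np.random.default_rng(1)
n = 200_000
beta = 2.0                                   # true exposure effect

x = rng.normal(size=n)
ps = 1.0 / (1.0 + np.exp(-x))                # true propensity score
a = rng.binomial(1, ps)
y = beta * a + x + rng.normal(size=n)

def coef_on_a(*covariates):
    # OLS of y on an intercept, a, and the given covariates;
    # returns the coefficient on a
    design = np.column_stack([np.ones(n), a, *covariates])
    return np.linalg.lstsq(design, y, rcond=None)[0][1]

b_full = coef_on_a(x)    # adjust for the confounder itself
b_ps = coef_on_a(ps)     # adjust for the propensity score
b_none = coef_on_a()     # no adjustment: confounded
print(b_full, b_ps, b_none)
```

    Both adjusted estimates land near the true effect of 2.0; the unadjusted estimate is biased upward because x raises both the exposure probability and the outcome.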

  13. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-07-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper, we use massive asymptotically optimal data compression to reduce the dimensionality of the data space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Second, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parametrized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate DELFI with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10⁴ simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological data sets.
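
    DELFI itself is not reproduced here, but the recipe the abstract describes (forward-simulate, compress the data to one summary per parameter, compare in summary space) can be illustrated with plain rejection ABC on a toy Gaussian model; every name and number below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# "Observed" data from a model we pretend has an intractable likelihood
theta_true = 1.5
data = rng.normal(theta_true, 1.0, size=100)
s_obs = data.mean()                      # compressed summary: one number

def simulate(theta, n=100):
    # forward simulator: our only access to the model
    return rng.normal(theta, 1.0, size=n)

# Rejection ABC: keep prior draws whose simulated summary lands close
# to the observed summary
n_draws = 50_000
prior = rng.uniform(-5.0, 5.0, size=n_draws)
summaries = np.array([simulate(th).mean() for th in prior])
accepted = prior[np.abs(summaries - s_obs) < 0.05]

post_mean = accepted.mean()
print(len(accepted), post_mean)
```

    The accepted draws approximate the posterior; DELFI replaces the rejection step with a learned density in (data, parameter) space, which is what makes it far more simulation-efficient.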

  14. Fully Bayesian tests of neutrality using genealogical summary statistics.

    PubMed

    Drummond, Alexei J; Suchard, Marc A

    2008-10-31

    Many data summary statistics have been developed to detect departures from neutral expectations of evolutionary models. However, questions about the neutrality of the evolution of genetic loci within natural populations remain difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequently, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and model parameters to test the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies and we demonstrate the utility of our method on four real data sets, identifying significant departures from neutrality in human influenza A virus, even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously ignored for limited availability of theory and methods.
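
    Posterior predictive simulation of this kind can be sketched on a deliberately simple, non-genealogical example: a Poisson model checked against overdispersed counts, with the variance-to-mean ratio as the discrepancy summary (illustrative prior and data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy observed data: overdispersed counts a Poisson model cannot fit
data = rng.negative_binomial(2, 0.2, size=200)

def dispersion(x):
    # discrepancy summary: variance-to-mean ratio (about 1 for a Poisson)
    return x.var() / x.mean()

# Conjugate Gamma posterior for the Poisson rate (flat-ish prior)
a_post = 1.0 + data.sum()
b_post = 0.01 + data.size

# Posterior predictive simulation: draw a rate, simulate a replicate
# data set, and compare its summary with the observed one
n_rep = 2000
exceed = 0
for _ in range(n_rep):
    lam = rng.gamma(a_post, 1.0 / b_post)
    rep = rng.poisson(lam, size=data.size)
    exceed += dispersion(rep) >= dispersion(data)
ppp = exceed / n_rep                     # posterior predictive p-value
print(ppp)
```

    A posterior predictive p-value near zero flags the model misfit; because the replicates are drawn from the posterior, parameter uncertainty is accounted for, which is the feature the authors exploit to separate demography from selection.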

  15. Using Landsat to provide potato production estimates to Columbia Basin farmers and processors

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A summary of project activities relative to the estimation of potato yields in the Columbia Basin is given. Oregon State University is using a two-pronged approach to yield estimation, one using simulation models and the other using purely empirical models. The simulation modeling approach has used satellite observations to determine key dates in the development of the crop for each field identified as potatoes. In particular, these include planting dates, emergence dates, and harvest dates. These critical dates are fed into simulation models of crop growth and development to derive yield forecasts. Two empirical modeling approaches are illustrated. One relates tuber yield to estimates of cumulative intercepted solar radiation; the other relates tuber yield to the integral under the GVI curve.

  16. Summary of: Simulating the Value of Concentrating Solar Power with Thermal Energy Storage in a Production Cost Model (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, P.; Hummon, M.

    2013-02-01

    Concentrating solar power (CSP) deployed with thermal energy storage (TES) provides a dispatchable source of renewable energy. The value of CSP with TES, as with other potential generation resources, needs to be established using traditional utility planning tools. Production cost models, which simulate the operation of the grid, are often used to estimate the operational value of different generation mixes. CSP with TES has historically had limited analysis in commercial production simulations. This document describes the implementation of CSP with TES in a commercial production cost model. It also describes the simulation of grid operations with CSP in a test system consisting of two balancing areas located primarily in Colorado.

  17. Summary of Research 1993

    DTIC Science & Technology

    1993-12-31

    Nonlinear development in advanced avionics technology topics. "Simulation for an Autonomous Unmanned Air Vehicle," Master's Thesis, September 1993. ...taps over the lower blade section surface and end walls, a pitot survey probe downstream... "...Passage Flow Model Simulation," Master's Thesis, December 1992. Byrnes, R.B., Kwak, S.H., Nelson, M.L., McGhee, R.B., and Healey, A.J., "Graphical Simulation of Walking Robot Kinematics," Master's Thesis, March 1993.

  18. Summary of investigations of engine response to distorted inlet conditions

    NASA Technical Reports Server (NTRS)

    Biesiadny, T. J.; Braithwaite, W. M.; Soeder, R. H.; Abdelwahab, M.

    1986-01-01

    A survey is presented of experimental and analytical experience of the NASA Lewis Research Center in engine response to inlet temperature and pressure distortions. This includes a description of the hardware and techniques employed, and a summary of the highlights of experimental investigations and analytical modeling. Distortion devices successfully simulated inlet distortion, and knowledge was gained about compression system response to different types of distortion. A list of NASA research references is included.

  19. Research Summary 3-D Computational Fluid Dynamics (CFD) Model Of The Human Respiratory System

    EPA Science Inventory

    The U.S. EPA’s Office of Research and Development (ORD) has developed a 3-D computational fluid dynamics (CFD) model of the human respiratory system that allows for the simulation of particulate based contaminant deposition and clearance, while being adaptable for age, ethnicity,...

  20. Summary appraisals of the Nation's ground-water resources; Lower Mississippi region

    USGS Publications Warehouse

    Terry, J.E.; Hosman, R.L.; Bryant, C.T.

    1979-01-01

    Great advances have been made in hydrologic technology in recent years. Predictive models have been developed that make it possible for the hydrologist to simulate aquifer responses to proposed development or other stresses. These models would be invaluable tools in progressive water-resources planning and management.

  1. A study of application of remote sensing to river forecasting. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A project is described whose goal was to define, implement and evaluate a pilot demonstration test to show the practicability of applying remotely sensed data to operational river forecasting in gaged or previously ungaged watersheds. A secondary objective was to provide NASA with documentation describing the computer programs that comprise the streamflow forecasting simulation model used. A computer-based simulation model was adapted to a streamflow forecasting application and implemented in an IBM System/360 Model 44 computer, operating in a dedicated mode, with operator interactive control through a Model 2250 keyboard/graphic CRT terminal. The test site whose hydrologic behavior was simulated is a small basin (365 square kilometers) designated Town Creek near Geraldine, Alabama.

  2. XFEL OSCILLATOR SIMULATION INCLUDING ANGLE-DEPENDENT CRYSTAL REFLECTIVITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fawley, William; Lindberg, Ryan; Kim, K-J

    The oscillator package within the GINGER FEL simulation code has now been extended to include angle-dependent reflectivity properties of Bragg crystals. Previously, the package was modified to include frequency-dependent reflectivity in order to model x-ray FEL oscillators from start-up from shot noise through to saturation. We present a summary of the algorithms used for modeling the crystal reflectivity and radiation propagation outside the undulator, discussing various numerical issues relevant to the domain of high Fresnel number and efficient Hankel transforms. We give some sample XFEL-O simulation results obtained with the angle-dependent reflectivity model, with particular attention directed to the longitudinal and transverse coherence of the radiation output.

  3. Looking for trees in the forest: summary tree from posterior samples

    PubMed Central

    2013-01-01

    Background: Bayesian phylogenetic analysis generates a set of trees which are often condensed into a single tree representing the whole set. Many methods exist for selecting a representative topology for a set of unrooted trees, few exist for assigning branch lengths to a fixed topology, and even fewer for simultaneously setting the topology and branch lengths. However, there is very little research into locating a good representative for a set of rooted time trees like the ones obtained from a BEAST analysis. Results: We empirically compare new and known methods for generating a summary tree. Some new methods are motivated by mathematical constructions such as tree metrics, while the rest employ tree concepts which work well in practice. These use more of the posterior than existing methods, which discard information not directly mapped to the chosen topology. Using results from a large number of simulations we assess the quality of a summary tree, measuring (a) how well it explains the sequence data under the model and (b) how close it is to the “truth”, i.e., to the tree used to generate the sequences. Conclusions: Our simulations indicate that no single method is “best”. Methods producing good divergence time estimates have poor branch lengths and lower model fit, and vice versa. Using the results presented here, a user can choose the appropriate method based on the purpose of the summary tree. PMID:24093883

  4. Looking for trees in the forest: summary tree from posterior samples.

    PubMed

    Heled, Joseph; Bouckaert, Remco R

    2013-10-04

    Bayesian phylogenetic analysis generates a set of trees which are often condensed into a single tree representing the whole set. Many methods exist for selecting a representative topology for a set of unrooted trees, few exist for assigning branch lengths to a fixed topology, and even fewer for simultaneously setting the topology and branch lengths. However, there is very little research into locating a good representative for a set of rooted time trees like the ones obtained from a BEAST analysis. We empirically compare new and known methods for generating a summary tree. Some new methods are motivated by mathematical constructions such as tree metrics, while the rest employ tree concepts which work well in practice. These use more of the posterior than existing methods, which discard information not directly mapped to the chosen topology. Using results from a large number of simulations we assess the quality of a summary tree, measuring (a) how well it explains the sequence data under the model and (b) how close it is to the "truth", i.e., to the tree used to generate the sequences. Our simulations indicate that no single method is "best". Methods producing good divergence time estimates have poor branch lengths and lower model fit, and vice versa. Using the results presented here, a user can choose the appropriate method based on the purpose of the summary tree.

  5. Advanced ETC/LSS computerized analytical models, CO2 concentration. Volume 1: Summary document

    NASA Technical Reports Server (NTRS)

    Taylor, B. N.; Loscutoff, A. V.

    1972-01-01

    Computer simulations have been prepared for the concepts of CO2 concentration which have the potential for maintaining a CO2 partial pressure of 3.0 mmHg, or less, in a spacecraft environment. The simulations were performed using the G-189A Generalized Environmental Control computer program. In preparing the simulations, new subroutines to model the principal functional components for each concept were prepared and integrated into the existing program. Sample problems were run to demonstrate the methods of simulation and performance characteristics of the individual concepts. Comparison runs for each concept can be made for parametric values of cabin pressure, crew size, cabin air dry and wet bulb temperatures, and mission duration.

  6. Rapid methods for radionuclide contaminant transport in nuclear fuel cycle simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huff, Kathryn

    Here, nuclear fuel cycle and nuclear waste disposal decisions are technologically coupled. However, current nuclear fuel cycle simulators lack dynamic repository performance analysis due to the computational burden of high-fidelity hydrologic contaminant transport models. The Cyder disposal environment and repository module was developed to fill this gap. It implements medium-fidelity hydrologic radionuclide transport models to support assessment appropriate for fuel cycle simulation in the Cyclus fuel cycle simulator. Rapid modeling of hundreds of discrete waste packages in a geologic environment is enabled within this module by a suite of four closed-form models for advective, dispersive, coupled, and idealized contaminant transport: a Degradation Rate model, a Mixed Cell model, a Lumped Parameter model, and a 1-D Permeable Porous Medium model. A summary of the Cyder module, its timestepping algorithm, and the mathematical models implemented within it is presented. Additionally, parametric demonstration simulations performed with Cyder are presented and shown to demonstrate functional agreement with parametric simulations conducted in a standalone hydrologic transport model, the Clay Generic Disposal System Model developed by the Used Fuel Disposition Campaign of the Department of Energy Office of Nuclear Energy.

  7. Rapid methods for radionuclide contaminant transport in nuclear fuel cycle simulation

    DOE PAGES

    Huff, Kathryn

    2017-08-01

    Here, nuclear fuel cycle and nuclear waste disposal decisions are technologically coupled. However, current nuclear fuel cycle simulators lack dynamic repository performance analysis due to the computational burden of high-fidelity hydrologic contaminant transport models. The Cyder disposal environment and repository module was developed to fill this gap. It implements medium-fidelity hydrologic radionuclide transport models to support assessment appropriate for fuel cycle simulation in the Cyclus fuel cycle simulator. Rapid modeling of hundreds of discrete waste packages in a geologic environment is enabled within this module by a suite of four closed-form models for advective, dispersive, coupled, and idealized contaminant transport: a Degradation Rate model, a Mixed Cell model, a Lumped Parameter model, and a 1-D Permeable Porous Medium model. A summary of the Cyder module, its timestepping algorithm, and the mathematical models implemented within it is presented. Additionally, parametric demonstration simulations performed with Cyder are presented and shown to demonstrate functional agreement with parametric simulations conducted in a standalone hydrologic transport model, the Clay Generic Disposal System Model developed by the Used Fuel Disposition Campaign of the Department of Energy Office of Nuclear Energy.

  8. Data Intensive Systems (DIS) Benchmark Performance Summary

    DTIC Science & Technology

    2003-08-01

    models assumed by today’s conventional architectures. Such applications include model-based Automatic Target Recognition (ATR), synthetic aperture...radar (SAR) codes, large scale dynamic databases/battlefield integration, dynamic sensor-based processing, high-speed cryptanalysis, high speed...distributed interactive and data intensive simulations, data-oriented problems characterized by pointer-based and other highly irregular data structures

  9. Near-source air quality in rail yard environments – an overview of recent EPA measurement and modeling findings

    EPA Science Inventory

    This presentation will provide a summary of field measurements conducted in areas surrounding two major rail yards as well as modeling simulations of rail yard emissions dispersion. The Cicero Rail Yard Study (CIRYS) was recently released to the public and includes mobile and ...

  10. 78 FR 70516 - Approval and Promulgation of Implementation Plans; North Carolina: Non-Interference Demonstration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... AGENCY: Environmental Protection Agency (EPA). ACTION: Proposed rule. SUMMARY: EPA is proposing to... Emissions Simulator (MOVES) and NONROAD2008 models which are the most current versions of modeling systems... Area from those areas subject to the 7.8 psi Federal RVP requirements, such action will occur in a...

  11. Cocaine Dependence Treatment Data: Methods for Measurement Error Problems With Predictors Derived From Stationary Stochastic Processes

    PubMed Central

    Guan, Yongtao; Li, Yehua; Sinha, Rajita

    2011-01-01

    In a cocaine dependence treatment study, we use linear and nonlinear regression models to model posttreatment cocaine craving scores and first cocaine relapse time. A subset of the covariates are summary statistics derived from baseline daily cocaine use trajectories, such as baseline cocaine use frequency and average daily use amount. These summary statistics are subject to estimation error and can therefore cause biased estimators for the regression coefficients. Unlike classical measurement error problems, the error we encounter here is heteroscedastic with an unknown distribution, and there are no replicates for the error-prone variables or instrumental variables. We propose two robust methods to correct for the bias: a computationally efficient method-of-moments-based method for linear regression models and a subsampling extrapolation method that is generally applicable to both linear and nonlinear regression models. Simulations and an application to the cocaine dependence treatment data are used to illustrate the efficacy of the proposed methods. Asymptotic theory and variance estimation for the proposed subsampling extrapolation method and some additional simulation results are described in the online supplementary material. PMID:21984854
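    The bias such corrections target can be illustrated in the simplest classical setting: homoscedastic additive error with known variance, where the method-of-moments fix reduces to a single attenuation factor. This is a deliberately simplified sketch of the general idea, not the heteroscedastic, unknown-distribution estimator proposed in the paper; all variable names and values are illustrative.

    ```python
    import random

    random.seed(42)

    n = 20000
    beta_true = 2.0
    sigma_u2 = 0.5  # known measurement-error variance (a simplifying assumption)

    x = [random.gauss(0.0, 1.0) for _ in range(n)]               # true covariate
    w = [xi + random.gauss(0.0, sigma_u2 ** 0.5) for xi in x]    # error-prone version
    y = [beta_true * xi + random.gauss(0.0, 1.0) for xi in x]

    def slope(u, v):
        """OLS slope of v on u via sample covariance over variance."""
        mu, mv = sum(u) / len(u), sum(v) / len(v)
        cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)
        var = sum((a - mu) ** 2 for a in u) / len(u)
        return cov / var

    beta_naive = slope(w, y)              # attenuated toward zero by the error in w
    mean_w = sum(w) / n
    var_w = sum((wi - mean_w) ** 2 for wi in w) / n
    # Method-of-moments correction: rescale by var(W) / (var(W) - sigma_u^2)
    beta_corrected = beta_naive * var_w / (var_w - sigma_u2)

    print(beta_naive, beta_corrected)
    ```

    With var(X) = 1 and error variance 0.5, the naive slope lands near 2 × 1/1.5 ≈ 1.33, while the corrected slope recovers the true value 2.
    
    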

  12. Infectious diseases: Surveillance, genetic modification and simulation

    USGS Publications Warehouse

    Koh, H. L.; Teh, S.Y.; De Angelis, D. L.; Jiang, J.

    2011-01-01

    Infectious diseases such as influenza and dengue have the potential of becoming a worldwide pandemic that may exert immense pressures on existing medical infrastructures. Careful surveillance of these diseases, supported by consistent model simulations, provides a means for tracking the disease evolution. The integrated surveillance and simulation program is essential in devising effective early warning systems and in implementing efficient emergency preparedness and control measures. This paper presents a summary of simulation analysis on influenza A (H1N1) 2009 in Malaysia. This simulation analysis provides insightful lessons regarding how disease surveillance and simulation should be performed in the future. This paper briefly discusses the controversy over the experimental field release of genetically modified (GM) Aedes aegypti mosquito in Malaysia. Model simulations indicate that the proposed release of GM mosquitoes is neither a viable nor a sustainable control strategy. © 2011 WIT Press.

  13. Summary of nozzle-exhaust plume flowfield analyses related to space shuttle applications

    NASA Technical Reports Server (NTRS)

    Penny, M. M.

    1975-01-01

    Exhaust plume shape simulation is studied, with the major effort directed toward computer program development and analytical support of various plume related problems associated with the space shuttle. Program development centered on (1) two-phase nozzle-exhaust plume flows, (2) plume impingement, and (3) support of exhaust plume simulation studies. Several studies were also conducted to provide full-scale data for defining exhaust plume simulation criteria. Model nozzles used in launch vehicle test were analyzed and compared to experimental calibration data.

  14. Utilizing NX Advanced Simulation for NASA's New Mobile Launcher for Ares-l

    NASA Technical Reports Server (NTRS)

    Brown, Christopher

    2010-01-01

    This slide presentation reviews the use of NX to simulate the new Mobile Launcher (ML) for the Ares-I. It includes: a comparison of the sizes of the Saturn 5, the Space Shuttle, the Ares I, and the Ares V, with their heights and payload capabilities; the loads control plan; drawings of the base framing, the underside of the ML, the beam arrangement, and the finished base; and the origin of the 3D CAD data. It also reviews the modeling approach, meshing, the assembly Finite Element Modeling, the model summary, and beam improvements.

  15. Summary of results of January climate simulations with the GISS coarse-mesh model

    NASA Technical Reports Server (NTRS)

    Spar, J.; Cohen, C.; Wu, P.

    1981-01-01

    The large scale climates generated by extended runs of the model are relatively independent of the initial atmospheric conditions, if the first few months of each simulation are discarded. The perpetual January simulations with a specified SST field produced excessive snow accumulation over the continents of the Northern Hemisphere. Mass exchanges between the cold (warm) continents and the warm (cold) adjacent oceans produced significant surface pressure changes over the oceans as well as over the land. The effect of terrain and terrain elevation on the amount of precipitation was examined. The evaporation of continental moisture was calculated to cause large increases in precipitation over the continents.

  16. Electromechanical simulation and test of rotating systems with magnetic bearing or piezoelectric actuator active vibration control

    NASA Technical Reports Server (NTRS)

    Palazzolo, Alan B.; Tang, Punan; Kim, Chaesil; Manchala, Daniel; Barrett, Tim; Kascak, Albert F.; Brown, Gerald; Montague, Gerald; Dirusso, Eliseo; Klusman, Steve

    1994-01-01

    This paper contains a summary of the experience of the authors in the field of electromechanical modeling for rotating machinery - active vibration control. Piezoelectric and magnetic bearing actuator based control are discussed.

  17. July 2012 MOVES Model Review Work Group Meeting Materials

    EPA Pesticide Factsheets

    The Mobile Sources Technical Review Subcommittee (MSTRS) meeting on 31 July 2012 began with a summary of the work group's focus, which included the next version of MOVES (MOtor Vehicle Emission Simulator), and its data sources and analysis methods.

  18. Model-Biased, Data-Driven Adaptive Failure Prediction

    NASA Technical Reports Server (NTRS)

    Leen, Todd K.

    2004-01-01

    This final report, which contains a research summary and a viewgraph presentation, addresses clustering and data simulation techniques for failure prediction. The researchers applied their techniques to both helicopter gearbox anomaly detection and segmentation of Earth Observing System (EOS) satellite imagery.

  19. iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems

    NASA Astrophysics Data System (ADS)

    Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.

    2017-11-01

    iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.

  20. Revised Baseline Budget Projections for Fiscal Years 1999-2008

    DTIC Science & Technology

    1998-03-03

    or for the conclusions that CBO reached. CONTENTS SUMMARY AND INTRODUCTION 1 MODELING THE ECONOMY 3 Production 6 Private Capital and Labor 7... modeling the economy and the federal budget in the long run and to estimating the effects of uncertainty about future population and productivity . MODELING ...conventions of the official national income and product accounts (NIPAs). Given the equations, the model needs two other factors to make a simulation

  1. FORBEEF: A Forage-Livestock System Computer Model Used as a Teaching Aid for Decision Making.

    ERIC Educational Resources Information Center

    Stringer, W. C.; And Others

    1987-01-01

    Describes the development of a computer simulation model of forage-beef production systems, which is intended to incorporate soil, forage, and animal decisions into an enterprise scenario. Produces a summary of forage production and livestock needs. Cites positive assessment of the program's value by participants in inservice training workshops.…

  2. Results and Lessons Learned from Phase 1 of the Air Quality Model Evaluation International Initiative (AQMEII)

    EPA Science Inventory

    A summary of the key findings from the model evaluation studies performed for the Phase 1 annual 2006 North American and European simulations, as well as reflections on experiences gained during Phase 1 that will be important for guiding the implementation of Phase 2 of the Air Q...

  3. NAS (Numerical Aerodynamic Simulation Program) technical summaries, March 1989 - February 1990

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Given here are selected scientific results from the Numerical Aerodynamic Simulation (NAS) Program's third year of operation. During this year, the scientific community was given access to a Cray-2 and a Cray Y-MP supercomputer. Topics covered include flow field analysis of fighter wing configurations, large-scale ocean modeling, the Space Shuttle flow field, advanced computational fluid dynamics (CFD) codes for rotary-wing airloads and performance prediction, turbulence modeling of separated flows, airloads and acoustics of rotorcraft, vortex-induced nonlinearities on submarines, and standing oblique detonation waves.

  4. Propulsion and airframe aerodynamic interactions of supersonic V/STOL configurations. Volume 4: Summary

    NASA Technical Reports Server (NTRS)

    Zilz, D. E.; Wallace, H. W.; Hiley, P. E.

    1985-01-01

    A wind tunnel model of a supersonic V/STOL fighter configuration has been tested to measure the aerodynamic interaction effects which can result from geometrically close-coupled propulsion system/airframe components. The approach was to configure the model to represent two different test techniques. One was a conventional test technique composed of two test modes. In the Flow-Through mode, absolute configuration aerodynamics are measured, including inlet/airframe interactions. In the Jet-Effects mode, incremental nozzle/airframe interactions are measured. The other test technique is a propulsion simulator approach, where a sub-scale, externally powered engine is mounted in the model. This allows proper measurement of inlet/airframe and nozzle/airframe interactions simultaneously. This is Volume 4 of 4: Final Report - Summary.

  5. Transformation of Summary Statistics from Linear Mixed Model Association on All-or-None Traits to Odds Ratio.

    PubMed

    Lloyd-Jones, Luke R; Robinson, Matthew R; Yang, Jian; Visscher, Peter M

    2018-04-01

    Genome-wide association studies (GWAS) have identified thousands of loci that are robustly associated with complex diseases. The use of linear mixed model (LMM) methodology for GWAS is becoming more prevalent due to its ability to control for population structure and cryptic relatedness and to increase power. The odds ratio (OR) is a common measure of the association of a disease with an exposure (e.g., a genetic variant) and is readily available from logistic regression. However, when the LMM is applied to all-or-none traits, it provides estimates of genetic effects on the observed 0-1 scale, a different scale to that in logistic regression. This limits the comparability of results across studies, for example in a meta-analysis, and makes the interpretation of the magnitude of an effect from an LMM GWAS difficult. In this study, we derived transformations from the genetic effects estimated under the LMM to the OR that only rely on summary statistics. To test the proposed transformations, we used real genotypes from two large, publicly available data sets to simulate all-or-none phenotypes for a set of scenarios that differ in underlying model, disease prevalence, and heritability. Furthermore, we applied these transformations to GWAS summary statistics for type 2 diabetes generated from 108,042 individuals in the UK Biobank. In both simulation and real-data application, we observed very high concordance between the transformed OR from the LMM and either the simulated truth or estimates from logistic regression. The transformations derived and validated in this study improve the comparability of results from prospective and already performed LMM GWAS on complex diseases by providing a reliable transformation to a common comparative scale for the genetic effects. Copyright © 2018 by the Genetics Society of America.
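    The scale mismatch can be made concrete with a small deterministic example. For a small effect b on the observed 0-1 scale, a commonly used first-order rule of thumb is log OR ≈ b / (K(1 − K)), with K the disease prevalence. The snippet below illustrates that idea only; it is not the exact transformation derived in the paper, and all numbers are made up.

    ```python
    import math

    f = 0.5        # carrier frequency (illustrative)
    K0 = 0.095     # P(disease | non-carrier)
    b = 0.01       # small effect on the observed 0-1 scale

    p1 = K0 + b                    # P(disease | carrier)
    K = (1 - f) * K0 + f * p1      # overall prevalence (here, 0.1)

    # Exact log odds ratio from the two risk levels
    odds_ratio = (p1 / (1 - p1)) / (K0 / (1 - K0))
    log_or_exact = math.log(odds_ratio)

    # First-order approximation converting the linear-scale effect to log odds
    log_or_approx = b / (K * (1 - K))

    print(log_or_exact, log_or_approx)
    ```

    For this small effect the two values agree to about three decimal places; the approximation degrades as b grows relative to K, which is one reason a more careful transformation is needed in practice.
    
    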

  6. Dynamics Modeling and Simulation of Large Transport Airplanes in Upset Conditions

    NASA Technical Reports Server (NTRS)

    Foster, John V.; Cunningham, Kevin; Fremaux, Charles M.; Shah, Gautam H.; Stewart, Eric C.; Rivers, Robert A.; Wilborn, James E.; Gato, William

    2005-01-01

    As part of NASA's Aviation Safety and Security Program, research has been in progress to develop aerodynamic modeling methods for simulations that accurately predict the flight dynamics characteristics of large transport airplanes in upset conditions. The motivation for this research stems from the recognition that simulation is a vital tool for addressing loss-of-control accidents, including applications to pilot training, accident reconstruction, and advanced control system analysis. The ultimate goal of this effort is to contribute to the reduction of the fatal accident rate due to loss-of-control. Research activities have involved accident analyses, wind tunnel testing, and piloted simulation. Results have shown that significant improvements in simulation fidelity for upset conditions, compared to current training simulations, can be achieved using state-of-the-art wind tunnel testing and aerodynamic modeling methods. This paper provides a summary of research completed to date and includes discussion on key technical results, lessons learned, and future research needs.

  7. Transport Experiments

    NASA Technical Reports Server (NTRS)

    Hall, Timothy M.; Wuebbles, Donald J.; Boering, Kristie A.; Eckman, Richard S.; Lerner, Jean; Plumb, R. Alan; Rind, David H.; Rinsland, Curtis P.; Waugh, Darryn W.; Wei, Chu-Feng

    1999-01-01

    MM II defined a series of experiments to better understand and characterize model transport and to assess the realism of this transport by comparison to observations. Measurements from aircraft, balloon, and satellite, not yet available at the time of MM I [Prather and Remsberg, 1993], provide new and stringent constraints on model transport, and address the limits of our transport modeling abilities. Simulations of the idealized tracers (the age spectrum, propagating boundary conditions, and conserved HSCT-like emissions) probe the relative roles of different model transport mechanisms, while simulations of SF6 and CO2 make the connection to observations. Some of the tracers are related, and transport diagnostics such as the mean age can be derived from more than one of the experiments for comparison to observations. The goals of the transport experiments are: (1) To isolate the effects of transport in models from other processes; (2) To assess model transport for realistic tracers (such as SF6 and CO2) for comparison to observations; (3) To use certain idealized tracers to isolate model mechanisms and relationships to atmospheric chemical perturbations; (4) To identify strengths and weaknesses of the treatment of transport processes in the models; (5) To relate evaluated shortcomings to aspects of model formulation. The following sections are included: Executive Summary, Introduction, Age Spectrum, Observation, Tropical Transport in Models, Global Mean Age in Models, Source-Transport Covariance, HSCT "ANOY" Tracer Distributions, and Summary and Conclusions.

  8. Basic Research in Digital Stochastic Model Algorithmic Control.

    DTIC Science & Technology

    1980-11-01

    IDCOM Description 115 8.2 Basic Control Computation 117 8.3 Gradient Algorithm 119 8.4 Simulation Model 119 8.5 Model Modifications 123 8.6 Summary 124...constraints, and 3) control trajectory computation. 2.1.1 Internal Model of the System The multivariable system to be controlled is represented by a...more flexible and adaptive, since the model, criteria, and sampling rates can be adjusted on-line. This flexibility comes from the use of the impulse

  9. Fundamental statistical relationships between monthly and daily meteorological variables: Temporal downscaling of weather based on a global observational dataset

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp; Kaplan, Jed

    2016-04-01

    Accurate modelling of large-scale vegetation dynamics, hydrology, and other environmental processes requires meteorological forcing on daily timescales. While meteorological data with high temporal resolution is becoming increasingly available, simulations for the future or distant past are limited by lack of data and poor performance of climate models, e.g., in simulating daily precipitation. To overcome these limitations, we may temporally downscale monthly summary data to a daily time step using a weather generator. Parameterization of such statistical models has traditionally been based on a limited number of observations. Recent developments in the archiving, distribution, and analysis of "big data" datasets provide new opportunities for the parameterization of a temporal downscaling model that is applicable over a wide range of climates. Here we parameterize a WGEN-type weather generator using more than 50 million individual daily meteorological observations, from over 10'000 stations covering all continents, based on the Global Historical Climatology Network (GHCN) and Synoptic Cloud Reports (EECRA) databases. Using the resulting "universal" parameterization and driven by monthly summaries, we downscale mean temperature (minimum and maximum), cloud cover, and total precipitation, to daily estimates. We apply a hybrid gamma-generalized Pareto distribution to calculate daily precipitation amounts, which overcomes much of the inability of earlier weather generators to simulate high amounts of daily precipitation. Our globally parameterized weather generator has numerous applications, including vegetation and crop modelling for paleoenvironmental studies.
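    The downscaling step described above can be sketched as a minimal WGEN-style generator: wet/dry occurrence from a two-state Markov chain and gamma-distributed wet-day amounts rescaled to reproduce the monthly total. The transition probabilities and gamma shape below are illustrative placeholders, and this sketch omits the generalized Pareto tail the authors graft on for extreme daily amounts.

    ```python
    import random

    random.seed(1)

    def downscale_month(monthly_total, n_days=31, p_wd=0.3, p_ww=0.6, shape=0.8):
        """Distribute a monthly precipitation total over daily values.

        p_wd: P(wet | previous day dry); p_ww: P(wet | previous day wet).
        Wet-day amounts are gamma-distributed, then rescaled to the total.
        """
        wet = [False] * n_days
        prev = False
        for d in range(n_days):
            p = p_ww if prev else p_wd
            wet[d] = random.random() < p
            prev = wet[d]
        if not any(wet):                      # guarantee a wet day to carry the total
            wet[random.randrange(n_days)] = True

        amounts = [random.gammavariate(shape, 1.0) if w else 0.0 for w in wet]
        scale = monthly_total / sum(amounts)  # rescale so daily values sum to the total
        return [a * scale for a in amounts]

    daily = downscale_month(monthly_total=120.0)
    print(round(sum(daily), 6), sum(1 for a in daily if a > 0))
    ```

    Rescaling trades away the gamma distribution's exact shape in exchange for consistency with the monthly summary, which is the constraint a temporal downscaler must honor.
    
    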

  10. Large Eddy Simulations and Turbulence Modeling for Film Cooling

    NASA Technical Reports Server (NTRS)

    Acharya, Sumanta

    1999-01-01

    The objective of the research is to perform Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) of the film cooling process, and to evaluate and improve advanced forms of the two-equation turbulence models for turbine blade surface flow analysis. The DNS/LES were used to resolve the large eddies within the flow field near the coolant jet location. The work involved code development and application of the developed codes to film cooling problems. Five different codes were developed and utilized to perform this research. This report presents a summary of the development of the codes and their applications to analyze the turbulence properties at locations near coolant injection holes.

  11. Dislocation dynamics: simulation of plastic flow of bcc metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lassila, D H

    This is the final report for the LDRD strategic initiative entitled ''Dislocation Dynamics: Simulation of Plastic Flow of bcc Metals'' (tracking code: 00-SI-011). This report comprises six individual sections. The first is an executive summary of the project and describes the overall project goal, which is to establish an experimentally validated 3D dislocation dynamics simulation. This first section also gives some information on LLNL's multi-scale modeling efforts associated with the plasticity of bcc metals, and the role of this LDRD project in the multiscale modeling program. The last five sections of this report are journal articles that were produced during the course of the FY-2000 efforts.

  12. Assessment of stretched vortex subgrid-scale models for LES of incompressible inhomogeneous turbulent flow

    PubMed Central

    Shetty, Dinesh A.; Frankel, Steven H.

    2013-01-01

    Summary: The physical space version of the stretched vortex subgrid scale model [Phys. Fluids 12, 1810 (2000)] is tested in large eddy simulations (LES) of the turbulent lid driven cubic cavity flow. LES is carried out using a higher order finite-difference method [J. Comput. Phys. 229, 8802 (2010)]. The effects of different vortex orientation models and subgrid turbulence spectrums are assessed through comparisons of the LES predictions against direct numerical simulations (DNS) [Phys. Fluids 12, 1363 (2000)]. Three Reynolds numbers, 12000, 18000, and 22000, are studied. Good agreement with the DNS data for the mean and fluctuating quantities is observed. PMID:24187423

  13. Advanced local area network concepts

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1985-01-01

    Development of a good model of the data traffic requirements for Local Area Networks (LANs) onboard the Space Station is the driving problem in this work. A parameterized workload model is under development. An analysis contract has been started specifically to capture the distributed processing requirements for the Space Station and then to develop a top level model to simulate how various processing scenarios can handle the workload and what data communication patterns result. A summary of the Local Area Network Extendsible Simulator 2 Requirements Specification and excerpts from a grant report on the topological design of fiber optic local area networks with application to Expressnet are given.

  14. EVALUATING RISK-PREDICTION MODELS USING DATA FROM ELECTRONIC HEALTH RECORDS.

    PubMed

    Wang, L E; Shaw, Pamela A; Mathelier, Hansie M; Kimmel, Stephen E; French, Benjamin

    2016-03-01

    The availability of data from electronic health records facilitates the development and evaluation of risk-prediction models, but estimation of prediction accuracy could be limited by outcome misclassification, which can arise if events are not captured. We evaluate the robustness of prediction accuracy summaries, obtained from receiver operating characteristic curves and risk-reclassification methods, if events are not captured (i.e., "false negatives"). We derive estimators for sensitivity and specificity if misclassification is independent of marker values. In simulation studies, we quantify the potential for bias in prediction accuracy summaries if misclassification depends on marker values. We compare the accuracy of alternative prognostic models for 30-day all-cause hospital readmission among 4548 patients discharged from the University of Pennsylvania Health System with a primary diagnosis of heart failure. Simulation studies indicate that if misclassification depends on marker values, then the estimated accuracy improvement is also biased, but the direction of the bias depends on the direction of the association between markers and the probability of misclassification. In our application, 29% of the 1143 readmitted patients were readmitted to a hospital elsewhere in Pennsylvania, which reduced prediction accuracy. Outcome misclassification can result in erroneous conclusions regarding the accuracy of risk-prediction models.
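    The independent-misclassification case analyzed here has a simple structure: missed events join the observed non-event pool, so the observed AUC shrinks linearly toward 0.5, AUC_obs = (1 − w)·AUC + 0.5·w, where w is the fraction of observed non-events that are really events. The simulation below demonstrates that relation with illustrative marker distributions; it is a hedged sketch, not the paper's estimators.

    ```python
    import random

    random.seed(7)

    def auc(pos, neg):
        """Empirical AUC via the Mann-Whitney rank-sum statistic."""
        labeled = [(v, 1) for v in pos] + [(v, 0) for v in neg]
        labeled.sort()
        rank_sum = sum(rank for rank, (_, is_pos) in enumerate(labeled, 1) if is_pos)
        n_pos, n_neg = len(pos), len(neg)
        return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    # Illustrative marker distributions (not from the paper)
    events = [random.gauss(1.0, 1.0) for _ in range(4000)]
    nonevents = [random.gauss(0.0, 1.0) for _ in range(4000)]
    auc_true = auc(events, nonevents)

    # Miss 30% of events independently of the marker; because the draws are
    # i.i.d., slicing off the first 1200 is a marker-independent subset.
    missed, captured = events[:1200], events[1200:]
    obs_nonevents = nonevents + missed       # "false negatives" pollute the non-event pool

    auc_obs = auc(captured, obs_nonevents)
    w = len(missed) / len(obs_nonevents)     # fraction of observed non-events that are events
    predicted = (1 - w) * auc_true + 0.5 * w # linear shrinkage toward 0.5

    print(round(auc_true, 3), round(auc_obs, 3), round(predicted, 3))
    ```

    When misclassification instead depends on the marker, the missed events no longer share the event distribution and this clean mixture formula breaks, which is the biased regime the simulation studies in the paper explore.
    
    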

  15. Forecasting a winner for Malaysian Cup 2013 using soccer simulation model

    NASA Astrophysics Data System (ADS)

    Yusof, Muhammad Mat; Fauzee, Mohd Soffian Omar; Latif, Rozita Abdul

    2014-07-01

    This paper investigates, through soccer simulation, the probability of each team winning the Malaysia Cup 2013. Our methodology is to predict the outcomes of individual matches and then simulate the Malaysia Cup 2013 tournament 5000 times. As match outcomes are always a matter of uncertainty, a statistical model, in particular a double Poisson model, is used to predict the number of goals scored and conceded by each team. Maximum likelihood estimation is used to measure the attacking strength and defensive weakness of each team. Based on our simulation results, LionXII has the highest probability of becoming the winner, followed by Selangor, ATM, JDT and Kelantan. Meanwhile, T-Team, Negeri Sembilan and Felda United have lower probabilities of winning Malaysia Cup 2013. In summary, we find that the probability of each team becoming the winner is small, indicating that the level of competitive balance in Malaysia Cup 2013 is quite high.
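The double Poisson approach described above can be sketched as follows: each team's goal count is drawn from a Poisson distribution whose mean combines that team's attacking strength with the opponent's defensive weakness, and the tournament is replayed many times to estimate title probabilities. The team names below come from the abstract, but the strength parameters are invented for illustration, not the fitted maximum-likelihood values:

```python
import math
import random

# Hypothetical (attack, defensive-weakness) pairs for three teams.
TEAMS = {"LionXII": (1.4, 0.9), "Selangor": (1.1, 1.0), "ATM": (0.8, 1.2)}

def sample_poisson(lam, rng):
    """Knuth's multiplication method for Poisson sampling (lam > 0)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def play_match(home, away, rng):
    """Goals ~ Poisson(own attack x opponent's defensive weakness)."""
    lam_h = TEAMS[home][0] * TEAMS[away][1]
    lam_a = TEAMS[away][0] * TEAMS[home][1]
    return sample_poisson(lam_h, rng), sample_poisson(lam_a, rng)

def simulate_tournament(rng):
    """Double round-robin; 3 points for a win, 1 for a draw."""
    points = {t: 0 for t in TEAMS}
    diff = {t: 0 for t in TEAMS}
    names = list(TEAMS)
    for home in names:
        for away in names:
            if home == away:
                continue
            gh, ga = play_match(home, away, rng)
            diff[home] += gh - ga
            diff[away] += ga - gh
            if gh > ga:
                points[home] += 3
            elif gh < ga:
                points[away] += 3
            else:
                points[home] += 1
                points[away] += 1
    # Remaining ties are broken at random, as a simplification.
    return max(names, key=lambda t: (points[t], diff[t], rng.random()))

rng = random.Random(2013)
wins = {t: 0 for t in TEAMS}
for _ in range(2000):
    wins[simulate_tournament(rng)] += 1
probs = {t: w / 2000 for t, w in wins.items()}
```

With the invented strengths, the strongest attacking side ends up with the largest (but far from certain) title probability, mirroring the competitive-balance point made in the abstract.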

  16. Simulation in paediatric urology and surgery, part 2: An overview of simulation modalities and their applications.

    PubMed

    Nataraja, R M; Webb, N; Lopez, P J

    2018-04-01

    Surgical training has changed radically in the last few decades. The traditional Halstedian model of time-bound apprenticeship has been replaced with competency-based training. In our previous article, we presented an overview of learning theory relevant to clinical teaching: a summary for the busy paediatric surgeon and urologist. We introduced the concepts underpinning current changes in surgical education and training. In this next article, we give an overview of the various modalities of surgical simulation, the educational principles that underlie them, and their potential applications in clinical practice. These modalities include: open surgical models and trainers, laparoscopic bench trainers, virtual reality trainers, simulated patients and role-play, hybrid simulation, scenario-based simulation, distributed simulation, virtual reality, and online simulation. Specific examples of technology that may be used for these modalities are included, but this is not a comprehensive review of all available products. Copyright © 2018 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  17. The skill of summary in clinician-patient communication: a case study.

    PubMed

    Quilligan, Sally; Silverman, Jonathan

    2012-03-01

    To investigate the use and impact of the micro-skill of summary in clinical encounters, a core skill whose use and outcomes have received little empirical investigation. This exploratory study used a mixed-method design. Video recordings of ten consultations between simulated patients and medical students were analysed to identify the types of summary used. Two contrasting cases were then micro-analysed, and follow-up interviews were held with the two students and the simulated patients involved in the consultations, using the video recordings as a trigger. Ninety-nine summaries were identified and grouped into six types: reflective, screening, clarifying, paraphrasing, interim and full. Summary appeared to aid accuracy. However, the patient's perspective was summarised less frequently than the biomedical perspective. When summaries were repeatedly incorrect, they made the simulated patient feel they were not being listened to. The use and effect of summary appear more complex than the medical literature suggests and may have both positive and negative attributes. Further research is needed to investigate whether these preliminary findings are replicated within doctor-patient consultations. When teaching the use of summary, we need to address: type, purpose, accuracy, effect on the patient, and flexible use to suit the patient. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  18. A study of remote sensing as applied to regional and small watersheds. Volume 2: Supporting technical details. [using computerized simulation models

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The Stanford Watershed Model, the Kentucky Watershed Model and OPSET program, and the NASA-IBM system for simulation and analysis of watersheds are described in terms of their applications to the study of remote sensing of water resources. Specific calibration processes and input and output parameters that are instrumental in the simulations are explained for the following kinds of data: (1) hourly precipitation data; (2) daily discharge data; (3) flood hydrographs; (4) temperature and evaporation data; and (5) snowmelt data arrays. The Sensitivity Analysis Task, which provides a method for evaluation of any of the separate simulation runs in the form of performance indices, is also reported. The method is defined and a summary of results is given which indicates the values obtained in the simulation runs performed for Town Creek, Alabama; Alamosa Creek, Colorado; and Pearl River, Louisiana. The results are shown in tabular and plot graph form. For Vol. 1, see N74-27813.

  19. Advanced Multiple Processor Configuration Study. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    This summary of a study on multiple processor configurations includes the objectives, background, approach, and results of research undertaken to provide the Air Force with a generalized model of computer processor combinations for use in the evaluation of proposed flight training simulator computational designs. An analysis of a real-time flight…

  20. Summary Report of the Proceedings of the Annual Model U.N. Seminar (9th, New York, New York, July 8-10, 1988).

    ERIC Educational Resources Information Center

    United Nations Association of the United States of America, New York, NY.

    The purpose of this Model United Nations (UN) seminar for faculty advisors and conference leaders was to provide seminar participants with new and innovative ideas to more effectively simulate the complex UN system. This document summarizes seminar speeches presented by: (1) James Jonah, Assistant U.N. Secretary-General; (2) Frank Pinto,…

  1. Statistical modelling of gaze behaviour as categorical time series: what you should watch to save soccer penalties.

    PubMed

    Button, C; Dicks, M; Haines, R; Barker, R; Davids, K

    2011-08-01

    Previous research on gaze behaviour in sport has typically reported summary fixation statistics, thereby largely ignoring the temporal sequencing of gaze. In the present study on penalty kicking in soccer, our aim was to apply a Markov chain modelling method to eye movement data obtained from goalkeepers. Building on the discrete analysis of gaze employed by Dicks et al. (Atten Percept Psychophys 72(3):706-720, 2010b), we wanted to statistically model the relative probabilities of the goalkeeper's gaze being directed to different locations throughout the penalty taker's approach. Examination of gaze behaviours under in situ and video-simulation task constraints reveals differences in information pickup for perception and action. The probabilities of fixating anatomical locations of the penalty taker were high under simulated movement response conditions. In contrast, when actually required to intercept kicks, the goalkeepers initially favoured watching the penalty taker's head but then rapidly shifted focus directly to the ball for approximately the final second prior to foot-ball contact. The increased spatio-temporal demands of in situ interceptive actions over laboratory-based simulated actions lead to different visual search strategies being used. When eye movement data are modelled as time series, it is possible to discern subtle but important behavioural characteristics that are less apparent with discrete summary statistics alone.
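The Markov chain treatment of gaze as a categorical time series amounts to estimating transition probabilities between fixation locations from consecutive observations. A minimal sketch with an invented toy sequence (the study's actual data were goalkeepers' frame-by-frame eye-movement records):

```python
from collections import Counter, defaultdict

def transition_matrix(sequence):
    """Maximum-likelihood estimate of first-order Markov transition
    probabilities from a single categorical time series."""
    counts = defaultdict(Counter)
    for current, nxt in zip(sequence, sequence[1:]):
        counts[current][nxt] += 1
    return {state: {nxt: n / sum(c.values()) for nxt, n in c.items()}
            for state, c in counts.items()}

# Toy gaze sequence of fixation locations per video frame; purely
# illustrative, not data from the study.
gaze = ["head", "head", "ball", "ball", "ball", "head", "ball", "ball"]
P = transition_matrix(gaze)
# P["ball"]["ball"] is the estimated probability of staying on the
# ball from one frame to the next.
```

Unlike summary fixation percentages, the estimated matrix preserves sequencing information such as how "sticky" each fixation location is.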

  2. Simulation of blood flow in deformable vessels using subject-specific geometry and spatially varying wall properties

    PubMed Central

    Xiong, Guanglei; Figueroa, C. Alberto; Xiao, Nan; Taylor, Charles A.

    2011-01-01

    SUMMARY Simulation of blood flow using image-based models and computational fluid dynamics has found widespread application to quantifying hemodynamic factors relevant to the initiation and progression of cardiovascular diseases and for planning interventions. Methods for creating subject-specific geometric models from medical imaging data have improved substantially in the last decade but for many problems, still require significant user interaction. In addition, while fluid–structure interaction methods are being employed to model blood flow and vessel wall dynamics, tissue properties are often assumed to be uniform. In this paper, we propose a novel workflow for simulating blood flow using subject-specific geometry and spatially varying wall properties. The geometric model construction is based on 3D segmentation and geometric processing. Variable wall properties are assigned to the model based on combining centerline-based and surface-based methods. We finally demonstrate these new methods using an idealized cylindrical model and two subject-specific vascular models with thoracic and cerebral aneurysms. PMID:21765984

  3. Terrestrial ecosystem process model Biome-BGCMuSo v4.0: summary of improvements and new modeling possibilities

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Barcza, Zoltán; Marjanović, Hrvoje; Zorana Ostrogović Sever, Maša; Dobor, Laura; Gelybó, Györgyi; Fodor, Nándor; Pintér, Krisztina; Churkina, Galina; Running, Steven; Thornton, Peter; Bellocchi, Gianni; Haszpra, László; Horváth, Ferenc; Suyker, Andrew; Nagy, Zoltán

    2016-12-01

    The process-based biogeochemical model Biome-BGC was enhanced to improve its ability to simulate carbon, nitrogen, and water cycles of various terrestrial ecosystems under contrasting management activities. Biome-BGC version 4.1.1 was used as a base model. Improvements included addition of new modules such as the multilayer soil module, implementation of processes related to soil moisture and nitrogen balance, soil-moisture-related plant senescence, and phenological development. Vegetation management modules with annually varying options were also implemented to simulate management practices of grasslands (mowing, grazing), croplands (ploughing, fertilizer application, planting, harvesting), and forests (thinning). New carbon and nitrogen pools have been defined to simulate yield and soft stem development of herbaceous ecosystems. The model version containing all developments is referred to as Biome-BGCMuSo (Biome-BGC with multilayer soil module; in this paper, Biome-BGCMuSo v4.0 is documented). Case studies on a managed forest, cropland, and grassland are presented to demonstrate the effect of model developments on the simulation of plant growth as well as on carbon and water balance.

  4. MPPhys—A many-particle simulation package for computational physics education

    NASA Astrophysics Data System (ADS)

    Müller, Thomas

    2014-03-01

    In a first course to classical mechanics elementary physical processes like elastic two-body collisions, the mass-spring model, or the gravitational two-body problem are discussed in detail. The continuation to many-body systems, however, is deferred to graduate courses although the underlying equations of motion are essentially the same and although there is a strong motivation for high-school students in particular because of the use of particle systems in computer games. The missing link between the simple and the more complex problem is a basic introduction to solve the equations of motion numerically which could be illustrated, however, by means of the Euler method. The many-particle physics simulation package MPPhys offers a platform to experiment with simple particle simulations. The aim is to give a principle idea how to implement many-particle simulations and how simulation and visualization can be combined for interactive visual explorations. Catalogue identifier: AERR_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERR_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 111327 No. of bytes in distributed program, including test data, etc.: 608411 Distribution format: tar.gz Programming language: C++, OpenGL, GLSL, OpenCL. Computer: Linux and Windows platforms with OpenGL support. Operating system: Linux and Windows. RAM: Source Code 4.5 MB Complete package 242 MB Classification: 14, 16.9. External routines: OpenGL, OpenCL Nature of problem: Integrate N-body simulations, mass-spring models Solution method: Numerical integration of N-body-simulations, 3D-Rendering via OpenGL. Running time: Problem dependent
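The "missing link" named above, numerically solving the equations of motion with the Euler method, can be illustrated on the mass-spring model mentioned in the abstract. This is a generic sketch of the technique, not code from the MPPhys package:

```python
import math

def euler_spring(x0, v0, k=1.0, m=1.0, dt=0.001, steps=1000):
    """Explicit Euler integration of the mass-spring equation
    m x'' = -k x, rewritten as the first-order system
    x' = v, v' = -(k/m) x."""
    x, v = x0, v0
    for _ in range(steps):
        a = -(k / m) * x          # spring acceleration at current state
        x, v = x + dt * v, v + dt * a
    return x, v

# After t = steps * dt = 1 s with x(0) = 1, v(0) = 0 and k = m = 1,
# the exact solution is x = cos(t), v = -sin(t).
x, v = euler_spring(1.0, 0.0)
```

The same pattern, advancing positions and velocities with the current accelerations, carries over directly to many-particle systems, where the acceleration of each particle sums contributions from all others.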

  5. Rule-based modeling with Virtual Cell

    PubMed Central

    Schaff, James C.; Vasilescu, Dan; Moraru, Ion I.; Loew, Leslie M.; Blinov, Michael L.

    2016-01-01

    Summary: Rule-based modeling is invaluable when the number of possible species and reactions in a model becomes too large to allow convenient manual specification. The popular rule-based software tools BioNetGen and NFSim provide powerful modeling and simulation capabilities at the cost of learning a complex scripting language which is used to specify these models. Here, we introduce a modeling tool that combines new graphical rule-based model specification with existing simulation engines in a seamless way within the familiar Virtual Cell (VCell) modeling environment. A mathematical model can be built integrating explicit reaction networks with reaction rules. In addition to offering a large choice of ODE and stochastic solvers, a model can be simulated using a network-free approach through the NFSim simulation engine. Availability and implementation: Available as VCell (versions 6.0 and later) at the Virtual Cell web site (http://vcell.org/). The application installs and runs on all major platforms and does not require registration for use on the user’s computer. Tutorials are available at the Virtual Cell website and Help is provided within the software. Source code is available at Sourceforge. Contact: vcell_support@uchc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27497444

  6. Genetic Simulation Resources: a website for the registration and discovery of genetic data simulators

    PubMed Central

    Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Clarke, Lauren; Gillanders, Elizabeth; Feuer, Eric J.

    2013-01-01

    Summary: Many simulation methods and programs have been developed to simulate genetic data of the human genome. These data have been widely used, for example, to predict properties of populations retrospectively or prospectively according to mathematically intractable genetic models, and to assist the validation, statistical inference and power analysis of a variety of statistical models. However, owing to the differences in type of genetic data of interest, simulation methods, evolutionary features, input and output formats, terminologies and assumptions for different applications, choosing the right tool for a particular study can be a resource-intensive process that usually involves searching, downloading and testing many different simulation programs. Genetic Simulation Resources (GSR) is a website provided by the National Cancer Institute (NCI) that aims to help researchers compare and choose the appropriate simulation tools for their studies. This website allows authors of simulation software to register their applications and describe them with well-defined attributes, thus allowing site users to search and compare simulators according to specified features. Availability: http://popmodels.cancercontrol.cancer.gov/gsr. Contact: gsr@mail.nih.gov PMID:23435068

  7. Evaluation of a kinetic model for computer simulation of growth and fermentation by Scheffersomyces (Pichia) stipitis fed D-xylose.

    PubMed

    Slininger, P J; Dien, B S; Lomont, J M; Bothast, R J; Ladisch, M R; Okos, M R

    2014-08-01

    Scheffersomyces (formerly Pichia) stipitis is a potential biocatalyst for converting lignocelluloses to ethanol because the yeast natively ferments xylose. An unstructured kinetic model based upon a system of linear differential equations has been formulated that describes growth and ethanol production as functions of ethanol, oxygen, and xylose concentrations for both growth and fermentation stages. The model was validated for various growth conditions including batch, cell recycle, batch with in situ ethanol removal and fed-batch. The model provides a summary of basic physiological yeast properties and is an important tool for simulating and optimizing various culture conditions and evaluating various bioreactor designs for ethanol production. © 2014 Wiley Periodicals, Inc.
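An unstructured kinetic model of this kind can be sketched as a small ODE system stepped forward in time. The sketch below uses a bare Monod growth law with yield-coupled product formation and invented parameters; the paper's fitted model is richer (it also covers oxygen dependence, ethanol inhibition, and separate growth and fermentation stages):

```python
def ferment(X0=0.1, S0=50.0, mu_max=0.3, Ks=2.0,
            Yxs=0.4, Yps=0.45, dt=0.01, hours=48.0):
    """Euler-step a minimal Monod model: biomass X grows on sugar S,
    and product P forms in proportion to substrate consumed.
    Illustrative parameters only, not the paper's fitted values."""
    X, S, P = X0, S0, 0.0
    for _ in range(int(hours / dt)):
        mu = mu_max * S / (Ks + S)      # Monod specific growth rate
        dX = mu * X * dt                # biomass increment this step
        X += dX
        S = max(S - dX / Yxs, 0.0)      # substrate consumed via yield
        P += (Yps / Yxs) * dX           # product per substrate used
    return X, S, P

# With these parameters the sugar is exhausted well before 48 h, so
# final biomass approaches X0 + Yxs * S0 and product Yps * S0.
X, S, P = ferment()
```

Even this stripped-down version shows the role such a model plays in the abstract: given rate and yield parameters, culture conditions (initial sugar, run time, feed policy) can be varied cheaply in simulation before committing to bioreactor experiments.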

  8. On Fitting Generalized Linear Mixed-effects Models for Binary Responses using Different Statistical Packages

    PubMed Central

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.

    2011-01-01

    Summary The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252

  9. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models were applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consist of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  10. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
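The response-surface idea referenced above is to fit a cheap surrogate to a limited number of expensive simulation runs and interpolate solutions from it. A one-dimensional quadratic least-squares fit via the normal equations is the simplest instance; this generic sketch is not the paper's LS-DYNA surrogate, and the sample data are invented:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y ≈ c0 + c1*x + c2*x² by forming the
    normal equations and solving them with Gaussian elimination
    (partial pivoting)."""
    basis = [[x ** k for k in range(3)] for x in xs]
    A = [[sum(row[i] * row[j] for row in basis) for j in range(3)]
         for i in range(3)]
    b = [sum(row[i] * y for row, y in zip(basis, ys)) for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for j in range(col, 3):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 3))) / A[i][i]
    return c

# Pretend each y came from an expensive impact simulation at design
# point x; here they follow y = 2 + 3x + 0.5x² exactly, so the
# surrogate recovers the coefficients.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 + 3 * x + 0.5 * x * x for x in xs]
c0, c1, c2 = fit_quadratic(xs, ys)
```

Once fitted, evaluating the surrogate at new design points costs microseconds rather than hours, which is what makes sensitivity studies over many impact parameters tractable.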

  11. Modeling the Webgraph: How Far We Are

    NASA Astrophysics Data System (ADS)

    Donato, Debora; Laura, Luigi; Leonardi, Stefano; Millozzi, Stefano

    The following sections are included: * Introduction * Preliminaries * WebBase * In-degree and out-degree * PageRank * Bipartite cliques * Strongly connected components * Stochastic models of the webgraph * Models of the webgraph * A multi-layer model * Large scale simulation * Algorithmic techniques for generating and measuring webgraphs * Data representation and multifiles * Generating webgraphs * Traversal with two bits for each node * Semi-external breadth first search * Semi-external depth first search * Computation of the SCCs * Computation of the bow-tie regions * Disjoint bipartite cliques * PageRank * Summary and outlook

  12. The Lake Tahoe Basin Land Use Simulation Model

    USGS Publications Warehouse

    Forney, William M.; Oldham, I. Benson

    2011-01-01

    This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.

  13. Summary measures of agreement and association between many raters' ordinal classifications.

    PubMed

    Mitani, Aya A; Freer, Phoebe E; Nelson, Kerrie P

    2017-10-01

    Interpretation of screening tests such as mammograms usually requires a radiologist's subjective visual assessment of images, often resulting in substantial discrepancies between radiologists' classifications of subjects' test results. In clinical screening studies to assess the strength of agreement between experts, multiple raters are often recruited to assess subjects' test results using an ordinal classification scale. However, using traditional measures of agreement in some studies is challenging because of the presence of many raters, the use of an ordinal classification scale, and unbalanced data. We assess and compare the performances of existing measures of agreement and association, as well as a newly developed model-based measure of agreement, on three large-scale clinical screening studies involving many raters' ordinal classifications. We also conduct a simulation study to demonstrate the key properties of the summary measures. The assessment of agreement and association varied according to the choice of summary measure. Some measures were influenced by the underlying prevalence of disease and raters' marginal distributions and/or were limited in use to balanced data sets where every rater classifies every subject. Our simulation study indicated that popular measures of agreement and association are sensitive to the underlying disease prevalence. Model-based measures provide a flexible approach for calculating agreement and association and are robust to missing and unbalanced data as well as the underlying disease prevalence. Copyright © 2017 Elsevier Inc. All rights reserved.
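The prevalence sensitivity noted above can be demonstrated in the simplest setting, Cohen's kappa for two raters on a binary scale (the study itself concerns many raters and ordinal scales). Two raters with identical, fixed accuracy yield a much lower kappa when disease is rare, even though their rating behaviour is unchanged; all parameters below are invented for illustration:

```python
import random

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters classifying the same subjects."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n   # observed agreement
    cats = set(r1) | set(r2)
    pe = sum((r1.count(k) / n) * (r2.count(k) / n) for k in cats)
    return (po - pe) / (1 - pe)

def simulated_kappa(prevalence, accuracy=0.9, n=20000, seed=7):
    """Two raters who each match the true binary status with the same
    fixed accuracy, applied at a given disease prevalence."""
    rng = random.Random(seed)
    truth = [rng.random() < prevalence for _ in range(n)]
    rate = lambda t: t if rng.random() < accuracy else not t
    return cohen_kappa([rate(t) for t in truth], [rate(t) for t in truth])

k_balanced = simulated_kappa(0.5)   # 50% prevalence
k_rare = simulated_kappa(0.05)      # 5% prevalence, same raters
```

Analytically, inter-rater agreement here is 0.82 regardless of prevalence, but chance agreement rises sharply when one category dominates, which is what drags kappa down at low prevalence.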

  14. Understanding the effects of different HIV transmission models in individual-based microsimulation of HIV epidemic dynamics in people who inject drugs

    PubMed Central

    MONTEIRO, J.F.G.; ESCUDERO, D.J.; WEINREB, C.; FLANIGAN, T.; GALEA, S.; FRIEDMAN, S.R.; MARSHALL, B.D.L.

    2017-01-01

    SUMMARY We investigated how different models of HIV transmission, and assumptions regarding the distribution of unprotected sex and syringe-sharing events (‘risk acts’), affect quantitative understanding of the HIV transmission process in people who inject drugs (PWID). The individual-based model simulated HIV transmission in a dynamic sexual and injecting network representing New York City. We constructed four HIV transmission models: model 1, constant probabilities; model 2, random number of sexual and parenteral acts; model 3, individually assigned viral load; and model 4, two groups of partnerships (low and high risk). Overall, models with less heterogeneity were more sensitive to changes in the numbers of risk acts, producing HIV incidence up to four times higher than that empirically observed. Although all models overestimated HIV incidence, microsimulations with greater heterogeneity in the HIV transmission modelling process produced more robust results and better reproduced empirical epidemic dynamics. PMID:26753627

  15. A computer simulation of Skylab dynamics and attitude control for performance verification and operational support

    NASA Technical Reports Server (NTRS)

    Buchanan, H.; Nixon, D.; Joyce, R.

    1974-01-01

    A simulation of the Skylab attitude and pointing control system (APCS) is outlined and discussed. Implementation is via a large hybrid computer and includes those factors affecting system momentum management, propellant consumption, and overall vehicle performance. The important features of the flight system are discussed; the mathematical models necessary for this treatment are outlined; and the decisions involved in implementation are discussed. A brief summary of the goals and capabilities of this tool is also included.

  16. Multi-scale and multi-physics simulations using the multi-fluid plasma model

    DTIC Science & Technology

    2017-04-25

    The simulation uses 512 second-order elements with initial conditions Bz = 1.0, Te = Ti = 0.01, ui = ue = 0, ne = ni = 1.0 + e^(−10(x−6)²) (Baboolal, Math. and Comp. Sim. 55). SUMMARY: The blended finite element method (BFEM) is presented, combining a DG spatial discretization with explicit Runge-Kutta time stepping (i+, n) and a CG spatial discretization with implicit Crank-Nicolson (e−, fields). DG captures shocks and discontinuities; CG is efficient and robust.

  17. NOAA's State Climate Summaries for the National Climate Assessment: A Sustained Assessment Product

    NASA Astrophysics Data System (ADS)

    Kunkel, K.; Champion, S.; Frankson, R.; Easterling, D. R.; Griffin, J.; Runkle, J. D.; Stevens, L. E.; Stewart, B. C.; Sun, L.; Veasey, S.

    2016-12-01

    A set of State Climate Summaries have been produced for all 50 U.S. states as part of the National Climate Assessment Sustained Assessment and represent a NOAA contribution to this process. Each summary includes information on observed and projected climate change conditions and impacts associated with future greenhouse gas emissions pathways. The summaries focus on the physical climate and coastal issues as a part of NOAA's mission. Core climate data and simulations used to produce these summaries have been previously published, and have been analyzed to represent a targeted synthesis of historical and plausible future climate conditions. As these are intended to be supplemental to major climate assessment development, the scope of the content remains true to a "summary" style document. Each state's Climate Summary includes its climatology and projections of future temperatures and precipitation, which are presented in order to provide a context for the assessment of future impacts. The climatological component focuses on temperature, precipitation, and noteworthy weather events specific to each state and relevant to the climate change discussion. Future climate scenarios are also briefly discussed, using well-known and consistent sets of climate model simulations based on two possible futures of greenhouse gas emissions. These future scenarios present an internally consistent climate picture for every state and are intended to inform the potential impacts of climate change. These 50 State Climate Summaries were produced by NOAA's National Centers for Environmental Information (NCEI) and the North Carolina State University Cooperative Institute for Climate and Satellites - NC (CICS-NC) with additional input provided by climate experts, including the NOAA Regional Climate Centers and State Climatologists. Each summary document also underwent a comprehensive and anonymous peer review. Each summary contains text, figures, and an interactive web presentation. A full suite of the comprehensive analyses and metadata are also available. The audience is targeted as both decision-makers and informed non-scientists. This presentation will discuss the scientific development for the project, demonstrate the suite of information, and provide examples of noteworthy figures from select states.

  18. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    PubMed

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data include behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompass individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data include the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  19. Anchoring quartet-based phylogenetic distances and applications to species tree reconstruction.

    PubMed

    Sayyari, Erfan; Mirarab, Siavash

    2016-11-11

    Inferring species trees from gene trees using coalescent-based summary methods has been the subject of much attention, yet new scalable and accurate methods are needed. We introduce DISTIQUE, a new statistically consistent summary method for inferring species trees from gene trees under the coalescent model. We generalize our results to arbitrary phylogenetic inference problems; we show that two arbitrarily chosen leaves, called anchors, can be used to estimate relative distances between all other pairs of leaves by inferring relevant quartet trees. This results in a family of distance-based tree inference methods, with running times ranging from quadratic to quartic in the number of leaves. We show in simulation studies that DISTIQUE has comparable accuracy to leading coalescent-based summary methods and reduced running times.

  20. Technologies and costs for control of disinfection by-products: Executive summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-11-01

    The document characterizes the feasibility of treatment for disinfection by-products control and estimates the costs for treatment alternatives that can then be used by utilities to meet national regulations. Treatment criteria are developed through the use of a water treatment simulation model for parameters critical to disinfection by-products control.

  1. GDINA and CDM Packages in R

    ERIC Educational Resources Information Center

    Rupp, André A.; van Rijn, Peter W.

    2018-01-01

    We review the GDINA and CDM packages in R for fitting cognitive diagnosis/diagnostic classification models. We first provide a summary of their core capabilities and then use both simulated and real data to compare their functionalities in practice. We found that the most relevant routines in the two packages appear to be more similar than…

  2. 76 FR 58210 - Approval and Promulgation of Implementation Plans and Designation of Areas for Air Quality...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    ... Nonattainment Area to Attainment AGENCY: Environmental Protection Agency (EPA). ACTION: Proposed rule. SUMMARY... four separate but related actions. First, EPA is proposing to approve the December 18, 2009, PM 2.5 redesignation request, including the December 22, 2010, Motor Vehicle Emission Simulator (MOVES) mobile model...

  3. Extensible 3D (X3D) Earth Technical Requirements Workshop Summary Report

    DTIC Science & Technology

    2007-08-01

    world in detail already, but rarely interconnect to one another • Most interesting part of “virtual reality” (VR) is reality – which means physics... Two Web-Enabled Modeling and Simulation (WebSim) symposia have demonstrated that large partnerships can work 9. Server-side 3D graphics • Our

  4. Analysis of the Lenticular Jointed MARSIS Antenna Deployment

    NASA Technical Reports Server (NTRS)

    Mobrem, Mehran; Adams, Douglas S.

    2006-01-01

    This paper summarizes important milestones in a yearlong comprehensive effort which culminated in successful deployments of the MARSIS antenna booms in May and June of 2005. Experimentally measured straight section and hinge properties are incorporated into specialized modeling techniques that are used to simulate the boom lenticular joints. System level models are exercised to understand the boom deployment dynamics and spacecraft level implications. Discussion includes a comparison of ADAMS simulation results to measured flight data taken during the three boom deployments. Important parameters that govern lenticular joint behavior are outlined and a short summary of lessons learned and recommendations is included to better understand future applications of this technology.

  5. The Scientific Method, Diagnostic Bayes, and How to Detect Epistemic Errors

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2015-12-01

    In the past decades, Bayesian methods have found widespread application and use in environmental systems modeling. Bayes' theorem states that the posterior probability, P(H|D̂), of a hypothesis H is proportional to the product of the prior probability, P(H), of this hypothesis and the likelihood, L(H|D̂), of the same hypothesis given the new/incoming observations, D̂. In science and engineering, H often constitutes some numerical simulation model, D = F(x,.), which summarizes, using algebraic, empirical, and differential equations, state variables and fluxes, all our theoretical and/or practical knowledge of the system of interest, and x are the d unknown parameters which are subject to inference using some data, D̂, of the observed system response. The Bayesian approach is intimately related to the scientific method and uses an iterative cycle of hypothesis formulation (model), experimentation and data collection, and theory/hypothesis refinement to elucidate the rules that govern the natural world. Unfortunately, model refinement has proven to be very difficult, in large part because of the poor diagnostic power of residual-based likelihood functions (Gupta et al., 2008). This has inspired Vrugt and Sadegh (2013) to advocate the use of 'likelihood-free' inference using approximate Bayesian computation (ABC). This approach uses one or more summary statistics, S(D̂), of the original data, D̂, designed ideally to be sensitive only to one particular process in the model. Any mismatch between the observed and simulated summary metrics is then easily linked to a specific model component. A recurrent issue with the application of ABC is self-sufficiency of the summary statistics. In theory, S(.) should contain as much information as the original data itself, yet complex systems rarely admit sufficient statistics. 
In this article, we propose to combine the ideas of ABC and regular Bayesian inference to guarantee that no information is lost in diagnostic model evaluation. This hybrid approach, coined diagnostic Bayes, uses the summary metrics as prior distribution and the original data in the likelihood function, or P(x|D̂) ∝ P(x|S(D̂)) L(x|D̂). A case study illustrates the ability of the proposed methodology to diagnose epistemic errors and provide guidance on model refinement.
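
    The ABC step described above can be sketched with a rejection sampler. The model here (an exponential toy with the sample mean as the single summary statistic) is an assumption for illustration, not one of the paper's hydrologic models:

```python
import numpy as np

rng = np.random.default_rng(1)
true_rate = 2.0
data = rng.exponential(1.0 / true_rate, size=100)    # observed data D_hat
s_obs = data.mean()                                   # summary S(D_hat)

# Rejection ABC: draw the rate from a uniform prior, simulate a dataset of
# the same size, keep draws whose simulated summary lands near the observed one.
draws = rng.uniform(0.1, 5.0, size=20_000)            # prior samples of the rate
sims = rng.exponential(1.0 / draws[:, None], size=(draws.size, 100))
accepted = draws[np.abs(sims.mean(axis=1) - s_obs) < 0.02]

print(f"ABC posterior mean ~ {accepted.mean():.2f} (true rate {true_rate})")
```

In the diagnostic-Bayes hybrid, this summary-based posterior would then serve as the prior P(x|S(D̂)), multiplied by the full-data likelihood L(x|D̂) so that no information in D̂ is discarded.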

  6. Inferring brain-computational mechanisms with models of activity measurements

    PubMed Central

    Diedrichsen, Jörn

    2016-01-01

    High-resolution functional imaging is providing increasingly rich measurements of brain activity in animals and humans. A major challenge is to leverage such data to gain insight into the brain's computational mechanisms. The first step is to define candidate brain-computational models (BCMs) that can perform the behavioural task in question. We would then like to infer which of the candidate BCMs best accounts for measured brain-activity data. Here we describe a method that complements each BCM by a measurement model (MM), which simulates the way the brain-activity measurements reflect neuronal activity (e.g. local averaging in functional magnetic resonance imaging (fMRI) voxels or sparse sampling in array recordings). The resulting generative model (BCM-MM) produces simulated measurements. To avoid having to fit the MM to predict each individual measurement channel of the brain-activity data, we compare the measured and predicted data at the level of summary statistics. We describe a novel particular implementation of this approach, called probabilistic representational similarity analysis (pRSA) with MMs, which uses representational dissimilarity matrices (RDMs) as the summary statistics. We validate this method by simulations of fMRI measurements (locally averaging voxels) based on a deep convolutional neural network for visual object recognition. Results indicate that the way the measurements sample the activity patterns strongly affects the apparent representational dissimilarities. However, modelling of the measurement process can account for these effects, and different BCMs remain distinguishable even under substantial noise. The pRSA method enables us to perform Bayesian inference on the set of BCMs and to recognize the data-generating model in each case. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574316
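
    The RDM-as-summary-statistic step can be sketched as follows; the random patterns, voxel averaging width, and dissimilarity measure (1 minus Pearson correlation) are illustrative assumptions, not the paper's network simulations:

```python
import numpy as np

rng = np.random.default_rng(0)
n_conditions, n_units = 8, 120
patterns = rng.normal(size=(n_conditions, n_units))   # "neuronal" activity

def rdm(x):
    """Representational dissimilarity matrix: 1 - correlation between
    the activity patterns of each pair of conditions."""
    return 1.0 - np.corrcoef(x)

# Toy measurement model: each "voxel" locally averages 10 neighbouring units.
voxels = patterns.reshape(n_conditions, 12, 10).mean(axis=2)

rdm_neural, rdm_voxel = rdm(patterns), rdm(voxels)

# Compare the two RDMs via their upper triangles, as is common in RSA.
iu = np.triu_indices(n_conditions, k=1)
print(np.corrcoef(rdm_neural[iu], rdm_voxel[iu])[0, 1])
```

Comparing predicted and measured data at this summary level is what spares the method from fitting a measurement model to every individual channel.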

  7. Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Givi, Peyman; Madnia, C. K.; Steinberger, C. J.; Tsai, A.

    1991-01-01

    This research is involved with the implementations of advanced computational schemes based on large eddy simulations (LES) and direct numerical simulations (DNS) to study the phenomenon of mixing and its coupling with chemical reactions in compressible turbulent flows. In the efforts related to LES, a research program was initiated to extend the present capabilities of this method for the treatment of chemically reacting flows, whereas in the DNS efforts, focus was on detailed investigations of the effects of compressibility, heat release, and nonequilibrium kinetics modeling in high speed reacting flows. The efforts to date were primarily focussed on simulations of simple flows, namely, homogeneous compressible flows and temporally developing high speed mixing layers. A summary of the accomplishments is provided.

  8. Design of experiment for earth rotation and baseline parameter determination from very long baseline interferometry

    NASA Technical Reports Server (NTRS)

    Dermanis, A.

    1977-01-01

    The possibility of recovering earth rotation and network geometry (baseline) parameters is emphasized. The numerically simulated experiments performed are set up in an environment where station coordinates vary with respect to inertial space according to a simulated earth rotation model similar to the actual but unknown rotation of the earth. The basic technique of VLBI and its mathematical model are presented. The parametrization of earth rotation chosen is described and the resulting model is linearized. A simple analysis of the geometry of the observations leads to some useful hints on achieving maximum sensitivity of the observations with respect to the parameters considered. The basic philosophy for the simulation of data and their analysis through standard least squares adjustment techniques is presented. A number of characteristic network designs based on present and candidate station locations are chosen. The results of the simulations for each design are presented together with a summary of the conclusions.
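
    The simulate-and-adjust loop described above reduces, for each design, to generating noisy observations from known parameters and recovering them by a least squares adjustment. A stripped-down sketch (the sinusoidal design matrix is an assumed stand-in for the linearized VLBI observation equations):

```python
import numpy as np

rng = np.random.default_rng(42)
n_obs, sigma = 200, 0.01
x_true = np.array([1.5, -0.7, 0.3])        # "true" parameters of the simulation

# Design matrix of the linearized observation equations (toy: offset + diurnal terms).
t = np.linspace(0.0, 1.0, n_obs)
A = np.column_stack([np.ones(n_obs), np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
obs = A @ x_true + rng.normal(0.0, sigma, n_obs)

# Standard least-squares adjustment via the normal equations,
# with the formal covariance of the estimated parameters.
N = A.T @ A
x_hat = np.linalg.solve(N, A.T @ obs)
cov = sigma**2 * np.linalg.inv(N)

print(x_hat, np.sqrt(np.diag(cov)))
```

Repeating this for each candidate network design, and comparing the formal uncertainties, is the essence of the design study.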

  9. Simulation-based power calculations for planning a two-stage individual participant data meta-analysis.

    PubMed

    Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D

    2018-05-18

    Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect by the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies intervention effect on weight gain). Using our simulation-based approach, a two-stage IPD meta-analysis has < 60% power to detect a reduction of 1 kg weight gain for a 10-unit increase in BMI. Additional IPD from ten other published trials (containing 1761 patients) would improve power to over 80%, but only if a fixed-effect meta-analysis was appropriate. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. 
Simulation-based power calculations could inform the planning and funding of IPD projects, and should be used routinely.
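
    The four-step recipe can be sketched directly. All parameter values below (trial size, outcome standard deviation, interaction slope) are illustrative assumptions, not those of the pregnancy IPD project, and the pooling step uses a fixed-effect inverse-variance two-stage model:

```python
import numpy as np

rng = np.random.default_rng(7)

def one_trial(n, interaction, sd=4.0):
    """Step (iii), first stage: simulate one trial and fit y ~ treat * BMI by OLS,
    returning the treatment-BMI interaction estimate and its standard error."""
    treat = rng.integers(0, 2, n)
    bmi = rng.normal(25.0, 4.0, n)
    y = 10.0 - 1.0 * treat + interaction * treat * (bmi - 25.0) + rng.normal(0.0, sd, n)
    X = np.column_stack([np.ones(n), treat, bmi - 25.0, treat * (bmi - 25.0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[3, 3])
    return beta[3], se

def power(n_trials=14, n_per_trial=85, interaction=-0.1, n_sims=500):
    """Steps (iii)-(iv): pool trial interactions by fixed-effect inverse-variance
    weighting, then estimate power as the proportion of significant results."""
    hits = 0
    for _ in range(n_sims):
        ests, ses = zip(*(one_trial(n_per_trial, interaction) for _ in range(n_trials)))
        w = 1.0 / np.asarray(ses) ** 2
        pooled = np.sum(w * np.asarray(ests)) / w.sum()
        z = pooled * np.sqrt(w.sum())
        hits += abs(z) > 1.96
    return hits / n_sims

print(f"estimated power: {power():.2f}")
```

Swapping the inner model (random effects, adjusted prognostic factors, dichotomised BMI) is how the scenarios contrasted in the abstract would be explored.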

  10. Inferring causal relationships between phenotypes using summary statistics from genome-wide association studies.

    PubMed

    Meng, Xiang-He; Shen, Hui; Chen, Xiang-Ding; Xiao, Hong-Mei; Deng, Hong-Wen

    2018-03-01

    Genome-wide association studies (GWAS) have successfully identified numerous genetic variants associated with diverse complex phenotypes and diseases, and provided tremendous opportunities for further analyses using summary association statistics. Recently, Pickrell et al. developed a robust method for causal inference using independent putative causal SNPs. However, this method may fail to infer the causal relationship between two phenotypes when only a limited number of independent putative causal SNPs are identified. Here, we extended Pickrell's method to make it more applicable to general situations. We extended the causal inference method by replacing the putative causal SNPs with the lead SNPs (the set of the most significant SNPs in each independent locus) and tested the performance of our extended method using both simulation and empirical data. Simulations suggested that when the same number of genetic variants is used, our extended method had a similar distribution of the test statistic under the null model as well as comparable power under the causal model compared with the original method by Pickrell et al. In practice, however, our extended method would generally be more powerful because the number of independent lead SNPs is often larger than the number of independent putative causal SNPs; including more SNPs, on the other hand, would not cause more false positives. By applying our extended method to summary statistics from GWAS for blood metabolites and femoral neck bone mineral density (FN-BMD), we successfully identified ten blood metabolites that may causally influence FN-BMD. We extended a causal inference method for inferring the putative causal relationship between two phenotypes using summary statistics from GWAS, and identified a number of potential causal metabolites for FN-BMD, which may provide novel insights into the pathophysiological mechanisms underlying osteoporosis.
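
    A heavily simplified sketch of the underlying logic (not Pickrell's likelihood-based method): under a causal model, the effects of independent lead SNPs on the downstream phenotype are proportional to their effects on the upstream one, so a regression through the origin of one set of summary effect sizes on the other recovers the causal slope. Effect sizes here are simulated, and estimation error in the SNP effects is ignored:

```python
import numpy as np

rng = np.random.default_rng(3)
n_snps, slope = 50, 0.4                        # slope plays the causal effect
beta_x = rng.normal(0.0, 0.1, n_snps)          # lead-SNP effects on phenotype 1
beta_y = slope * beta_x + rng.normal(0.0, 0.01, n_snps)  # effects on phenotype 2

# Regression through the origin across independent lead SNPs.
est = (beta_x @ beta_y) / (beta_x @ beta_x)
print(est)
```

With more lead SNPs than putative causal SNPs, such slope estimates become more precise, which is the intuition behind the power gain the abstract reports.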

  11. A survey of modelling methods for high-fidelity wind farm simulations using large eddy simulation.

    PubMed

    Breton, S-P; Sumner, J; Sørensen, J N; Hansen, K S; Sarmast, S; Ivanell, S

    2017-04-13

    Large eddy simulations (LES) of wind farms have the capability to provide valuable and detailed information about the dynamics of wind turbine wakes. For this reason, their use within the wind energy research community is on the rise, spurring the development of new models and methods. This review surveys the most common schemes available to model the rotor, atmospheric conditions and terrain effects within current state-of-the-art LES codes, of which an overview is provided. A summary of the experimental research data available for validation of LES codes within the context of single and multiple wake situations is also supplied. Some typical results for wind turbine and wind farm flows are presented to illustrate best practices for carrying out high-fidelity LES of wind farms under various atmospheric and terrain conditions. This article is part of the themed issue 'Wind energy in complex terrains'. © 2017 The Author(s).

  12. A survey of modelling methods for high-fidelity wind farm simulations using large eddy simulation

    PubMed Central

    Sumner, J.; Sørensen, J. N.; Hansen, K. S.; Sarmast, S.; Ivanell, S.

    2017-01-01

    Large eddy simulations (LES) of wind farms have the capability to provide valuable and detailed information about the dynamics of wind turbine wakes. For this reason, their use within the wind energy research community is on the rise, spurring the development of new models and methods. This review surveys the most common schemes available to model the rotor, atmospheric conditions and terrain effects within current state-of-the-art LES codes, of which an overview is provided. A summary of the experimental research data available for validation of LES codes within the context of single and multiple wake situations is also supplied. Some typical results for wind turbine and wind farm flows are presented to illustrate best practices for carrying out high-fidelity LES of wind farms under various atmospheric and terrain conditions. This article is part of the themed issue ‘Wind energy in complex terrains’. PMID:28265021

  13. Project summaries

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis on the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station keeping requirements. An existing code, used in predicting station keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters was used as a basis for the computer simulation. Modifications were made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consists of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.

  14. Lipid Clustering Correlates with Membrane Curvature as Revealed by Molecular Simulations of Complex Lipid Bilayers

    PubMed Central

    Koldsø, Heidi; Shorthouse, David; Hélie, Jean; Sansom, Mark S. P.

    2014-01-01

    Cell membranes are complex multicomponent systems, which are highly heterogeneous in the lipid distribution and composition. To date, most molecular simulations have focussed on relatively simple lipid compositions, helping to inform our understanding of in vitro experimental studies. Here we describe simulations of a complex asymmetric plasma membrane model, which contains seven different lipid species including the glycolipid GM3 in the outer leaflet and the anionic lipid phosphatidylinositol 4,5-bisphosphate (PIP2) in the inner leaflet. Plasma membrane models consisting of 1500 lipids and resembling the in vivo composition were constructed and simulations were run for 5 µs. In these simulations the most striking feature was the formation of nano-clusters of GM3 within the outer leaflet. In simulations of protein interactions within a plasma membrane model, GM3, PIP2, and cholesterol all formed favorable interactions with the model α-helical protein. A larger scale simulation of a model plasma membrane containing 6000 lipid molecules revealed correlations between curvature of the bilayer surface and clustering of lipid molecules. In particular, the concave (when viewed from the extracellular side) regions of the bilayer surface were locally enriched in GM3. In summary, these simulations explore the nanoscale dynamics of model bilayers which mimic the in vivo lipid composition of mammalian plasma membranes, revealing emergent nanoscale membrane organization which may be coupled both to fluctuations in local membrane geometry and to interactions with proteins. PMID:25340788

  15. Lipid clustering correlates with membrane curvature as revealed by molecular simulations of complex lipid bilayers.

    PubMed

    Koldsø, Heidi; Shorthouse, David; Hélie, Jean; Sansom, Mark S P

    2014-10-01

    Cell membranes are complex multicomponent systems, which are highly heterogeneous in the lipid distribution and composition. To date, most molecular simulations have focussed on relatively simple lipid compositions, helping to inform our understanding of in vitro experimental studies. Here we describe simulations of a complex asymmetric plasma membrane model, which contains seven different lipid species including the glycolipid GM3 in the outer leaflet and the anionic lipid phosphatidylinositol 4,5-bisphosphate (PIP2) in the inner leaflet. Plasma membrane models consisting of 1500 lipids and resembling the in vivo composition were constructed and simulations were run for 5 µs. In these simulations the most striking feature was the formation of nano-clusters of GM3 within the outer leaflet. In simulations of protein interactions within a plasma membrane model, GM3, PIP2, and cholesterol all formed favorable interactions with the model α-helical protein. A larger scale simulation of a model plasma membrane containing 6000 lipid molecules revealed correlations between curvature of the bilayer surface and clustering of lipid molecules. In particular, the concave (when viewed from the extracellular side) regions of the bilayer surface were locally enriched in GM3. In summary, these simulations explore the nanoscale dynamics of model bilayers which mimic the in vivo lipid composition of mammalian plasma membranes, revealing emergent nanoscale membrane organization which may be coupled both to fluctuations in local membrane geometry and to interactions with proteins.

  16. Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies

    PubMed Central

    Liu, Zhonghua; Lin, Xihong

    2017-01-01

    Summary We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
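
    The simplest version of the core idea, combining per-phenotype z-scores for one variant while accounting for their correlation, is a multivariate Wald-type statistic (the numbers below are assumed; the paper's proposed tests are mixed-model-based score tests, not this basic chi-square):

```python
import numpy as np

# z-scores of one variant against 3 correlated phenotypes, taken from
# (hypothetical) single-trait GWAS summary statistics.
z = np.array([2.1, 1.8, 2.4])

# Between-phenotype correlation, estimable from genome-wide null summary
# statistics without individual-level data.
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])

# Joint statistic: z' R^{-1} z ~ chi-square with 3 df under the null.
stat = z @ np.linalg.solve(R, z)
print(stat)
```

Because the correlation matrix comes from summary data, the whole test runs without ever touching individual genotypes, which is what makes such approaches practical at GWAS scale.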

  17. Deduction as Stochastic Simulation

    DTIC Science & Technology

    2013-07-01

    different tokens representing entities that it contains. The second parameter constrains the contents of a model, and in particular the different...of premises. In summary, the system manipulates stochastically the size, the contents, and the revisions of models. We now describe in detail each... [figure residue: distribution panels for λ = 4 and λ = 5; caption fragment: "The contents of a mental model (parameter ε)"] The second component

  18. In silico cancer modeling: is it ready for primetime?

    PubMed Central

    Deisboeck, Thomas S; Zhang, Le; Yoon, Jeongah; Costa, Jose

    2011-01-01

    SUMMARY At the dawn of the era of personalized, systems-driven medicine, computational or in silico modeling and the simulation of disease processes is becoming increasingly important for hypothesis generation and data integration in both experiment and clinics alike. Arguably, this is nowhere more visible than in oncology. To illustrate the field’s vast potential as well as its current limitations we briefly review selected works on modeling malignant brain tumors. Implications for clinical practice, including trial design and outcome prediction are also discussed. PMID:18852721

  19. Simulation of streamflows and basin-wide hydrologic variables over several climate-change scenarios, Methow River basin, Washington

    USGS Publications Warehouse

    Voss, Frank D.; Mastin, Mark C.

    2012-01-01

    A database was developed to automate model execution and to provide users with Internet access to voluminous data products ranging from summary figures to model output timeseries. Database-enabled Internet tools were developed to allow users to create interactive graphs of output results based on their analysis needs. For example, users were able to create graphs by selecting time intervals, greenhouse gas emission scenarios, general circulation models, and specific hydrologic variables.

  20. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    PubMed

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.
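
    The forward-kinematics step, driving scanned segments by their joint angles, can be sketched in the plane; the segment lengths and posture below are assumed illustrative values, not measured anthropometry:

```python
import numpy as np

def finger_points(lengths, angles):
    """Chain planar segments: angles are relative joint flexions (radians);
    orientations accumulate along the chain and each joint centre is the
    previous one plus the rotated segment vector."""
    points, theta, p = [np.zeros(2)], 0.0, np.zeros(2)
    for L, a in zip(lengths, angles):
        theta += a                                   # accumulated orientation
        p = p + L * np.array([np.cos(theta), np.sin(theta)])
        points.append(p.copy())
    return np.array(points)

# Hypothetical proximal, middle, distal phalanx lengths (cm) and a flexed posture.
pts = finger_points([4.5, 2.5, 2.0], np.deg2rad([30, 40, 30]))
print(pts[-1])                                       # fingertip position
```

In a subject-specific model, the lengths and centres of rotation would come from the segmented 3D scan rather than from assumed values.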

  1. Numerical simulation of the actuation system for the ALDF's propulsion control valve. [Aircraft Landing Dynamics Facility

    NASA Technical Reports Server (NTRS)

    Korte, John J.

    1990-01-01

    A numerical simulation of the actuation system for the propulsion control valve (PCV) of the NASA Langley Aircraft Landing Dynamics Facility was developed during the preliminary design of the PCV and used throughout the entire project. The simulation is based on a predictive model of the PCV which is used to evaluate and design the actuation system. The PCV controls a 1.7 million-pound thrust water jet used in propelling a 108,000-pound test carriage. The PCV can open and close in 0.300 second and deliver over 9,000 gallons of water per second at pressures up to 3150 psi. The numerical simulation results are used to predict transient performance and valve opening characteristics, specify the hydraulic control system, define transient loadings on components, and evaluate failure modes. The mathematical model used for numerically simulating the mechanical fluid power system is described, and numerical results are demonstrated for a typical opening and closing cycle of the PCV. A summary is then given on how the model is used in the design process.
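
    As a hedged toy (assumed first-order dynamics, not the paper's predictive model of the PCV hydraulics), an opening-time requirement like the 0.300 s figure can be checked by integrating a simple actuator lag:

```python
import numpy as np

# First-order lag driven to full open: dx/dt = (command - x) / tau,
# with tau chosen so the valve is ~99% open within the 0.300 s requirement.
tau, dt, t_end = 0.06, 0.001, 0.4          # seconds
t = np.arange(0.0, t_end, dt)
x = np.zeros_like(t)                        # valve position, 0 = closed, 1 = open
for i in range(1, t.size):
    x[i] = x[i - 1] + dt * (1.0 - x[i - 1]) / tau   # explicit Euler step

open_time = t[np.argmax(x >= 0.99)]         # first time the valve is 99% open
print(f"99% open at t = {open_time:.3f} s")
```

A design simulation would replace this single lag with the coupled hydraulic, mechanical, and fluid-load equations, but the time-stepping structure is the same.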

  2. Modelling and simulation of complex sociotechnical systems: envisioning and analysing work environments

    PubMed Central

    Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter

    2015-01-01

    Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227

  3. A Simulation Study of Methods for Selecting Subgroup-Specific Doses in Phase I Trials

    PubMed Central

    Morita, Satoshi; Thall, Peter F.; Takeda, Kentaro

    2016-01-01

    Summary Patient heterogeneity may complicate dose-finding in phase I clinical trials if the dose-toxicity curves differ between subgroups. Conducting separate trials within subgroups may lead to infeasibly small sample sizes in subgroups having low prevalence. Alternatively, it is not obvious how to conduct a single trial while accounting for heterogeneity. To address this problem, we consider a generalization of the continual reassessment method (O’Quigley, et al., 1990) based on a hierarchical Bayesian dose-toxicity model that borrows strength between subgroups under the assumption that the subgroups are exchangeable. We evaluate a design using this model that includes subgroup-specific dose selection and safety rules. A simulation study is presented that includes comparison of this method to three alternative approaches, based on non-hierarchical models, that make different types of assumptions about within-subgroup dose-toxicity curves. The simulations show that the hierarchical model-based method is recommended in settings where the dose-toxicity curves are exchangeable between subgroups. We present practical guidelines for application, and provide computer programs for trial simulation and conduct. PMID:28111916

  4. Atmosphere Assessment for MARS Science Laboratory Entry, Descent and Landing Operations

    NASA Technical Reports Server (NTRS)

    Cianciolo, Alicia D.; Cantor, Bruce; Barnes, Jeff; Tyler, Daniel, Jr.; Rafkin, Scot; Chen, Allen; Kass, David; Mischna, Michael; Vasavada, Ashwin R.

    2013-01-01

    On August 6, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed on the surface of Mars. The Entry, Descent and Landing (EDL) sequence was designed using atmospheric conditions estimated from mesoscale numerical models. The models, developed by two independent organizations (Oregon State University and the Southwest Research Institute), were validated against observations at Mars from three prior years. In the weeks and days before entry, the MSL "Council of Atmospheres" (CoA), a group of atmospheric scientists and modelers, instrument experts and EDL simulation engineers, evaluated the latest Mars data from orbiting assets including the Mars Reconnaissance Orbiter's Mars Color Imager (MARCI) and Mars Climate Sounder (MCS), as well as Mars Odyssey's Thermal Emission Imaging System (THEMIS). The observations were compared to the mesoscale models developed for EDL performance simulation to determine if a spacecraft parameter update was necessary prior to entry. This paper summarizes the daily atmosphere observations and comparison to the performance simulation atmosphere models. Options to modify the atmosphere model in the simulation to compensate for atmosphere effects are also presented. Finally, a summary of the CoA decisions and recommendations to the MSL project in the days leading up to EDL is provided.

  5. Mathematical Approaches to Understanding and Imaging Atrial Fibrillation: Significance for Mechanisms and Management

    PubMed Central

    Trayanova, Natalia A

    2014-01-01

    Atrial fibrillation (AF) is the most common sustained arrhythmia in humans. The mechanisms that govern AF initiation and persistence are highly complex, of dynamic nature, and involve interactions across multiple temporal and spatial scales in the atria. This article aims to review the mathematical modeling and computer simulation approaches to understanding AF mechanisms and aiding in its management. Various atrial modeling approaches are presented, with descriptions of the methodological basis and advancements in both lower-dimensional and realistic geometry models. A review of the most significant mechanistic insights made by atrial simulations is provided. The article showcases the contributions that atrial modeling and simulation have made not only to our understanding of the pathophysiology of atrial arrhythmias, but also to the development of AF management approaches. A summary of the future developments envisioned for the field of atrial simulation and modeling is also presented. The review contends that computational models of the atria assembled with data from clinical imaging modalities that incorporate electrophysiological and structural remodeling could become a first line of screening for new AF therapies and approaches, new diagnostic developments, and new methods for arrhythmia prevention. PMID:24763468

  6. GLISSANDO: GLauber Initial-State Simulation AND mOre…

    NASA Astrophysics Data System (ADS)

    Broniowski, Wojciech; Rybczyński, Maciej; Bożek, Piotr

    2009-01-01

    We present a Monte Carlo generator for a variety of Glauber-like models (the wounded-nucleon model, binary collisions model, mixed model, model with hot spots). These models describe the early stages of relativistic heavy-ion collisions, in particular the spatial distribution of the transverse energy deposition which ultimately leads to production of particles from the interaction region. The original geometric distribution of sources in the transverse plane can be superimposed with a statistical distribution simulating the dispersion in the generated transverse energy in each individual collision. The program generates inter alia the fixed-axes (standard) and variable-axes (participant) two-dimensional profiles of the density of sources in the transverse plane and their azimuthal Fourier components. These profiles can be used in further analysis of physical phenomena, such as jet quenching, event-by-event hydrodynamics, or analysis of the elliptic flow and its fluctuations. Characteristics of the event (multiplicities, eccentricities, Fourier coefficients, etc.) are stored in a ROOT file and can be analyzed off-line. In particular, event-by-event studies can be carried out in a simple way. A number of ROOT scripts are provided for that purpose. Supplied variants of the code can also be used for proton-nucleus and deuteron-nucleus collisions. Program summary. Program title: GLISSANDO Catalogue identifier: AEBS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 4452 No. of bytes in distributed program, including test data, etc.: 34 766 Distribution format: tar.gz Programming language: C++ Computer: any computer with a C++ compiler and the ROOT environment [R. 
Brun, et al., ROOT Users Guide 5.16, CERN, 2007, http://root.cern.ch].
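A toy wounded-nucleon event along the lines described above fits in a short sketch (uniform hard-sphere nuclei rather than GLISSANDO's Woods-Saxon sampling; all parameters are illustrative):

```python
import math
import random

def sample_nucleus(A, R):
    """Transverse nucleon positions: uniform hard sphere of radius R
    projected onto (x, y). (GLISSANDO samples a Woods-Saxon profile;
    the hard sphere keeps this sketch short.)"""
    pts = []
    while len(pts) < A:
        x, y, z = (random.uniform(-R, R) for _ in range(3))
        if x * x + y * y + z * z <= R * R:
            pts.append((x, y))
    return pts

def glauber_event(A=197, R=6.4, sigma_nn=4.2, b=7.0):
    """One wounded-nucleon event at impact parameter b (fm): a nucleon is
    wounded if a nucleon of the other nucleus lies within sqrt(sigma_nn/pi)."""
    d2 = sigma_nn / math.pi
    na = [(x - b / 2, y) for x, y in sample_nucleus(A, R)]
    nb = [(x + b / 2, y) for x, y in sample_nucleus(A, R)]
    hit_a, hit_b = set(), set()
    for i, (xa, ya) in enumerate(na):
        for j, (xb, yb) in enumerate(nb):
            if (xa - xb) ** 2 + (ya - yb) ** 2 <= d2:
                hit_a.add(i)
                hit_b.add(j)
    return [na[i] for i in hit_a] + [nb[j] for j in hit_b]

def participant_eccentricity(sources):
    """Variable-axes (participant) eccentricity of the source positions."""
    n = len(sources)
    cx = sum(x for x, _ in sources) / n
    cy = sum(y for _, y in sources) / n
    sxx = sum((x - cx) ** 2 for x, _ in sources) / n
    syy = sum((y - cy) ** 2 for _, y in sources) / n
    sxy = sum((x - cx) * (y - cy) for x, y in sources) / n
    return math.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2) / (sxx + syy)

random.seed(1)
event = glauber_event()
print(len(event), round(participant_eccentricity(event), 3))
```

Averaging the eccentricity over many such events, or histogramming it event by event, reproduces the kind of fluctuation analysis the record describes.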

  7. Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses

    PubMed Central

    Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah

    2015-01-01

    Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
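The idea of combining reference panels instead of picking a single "best guess" panel can be caricatured in one dimension (a toy, not the Adapt-Mix algorithm itself; here it is reduced to a grid search over a two-panel mixing weight, with invented per-SNP-pair correlations):

```python
import math

def corr(xs, ys):
    """Pearson correlation between two genotype vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def best_mixture_weight(panel_r, study_r, n_grid=101):
    """Grid-search the weight w in [0, 1] so that the mixed correlation
    w * r_panel1 + (1 - w) * r_panel2 best matches (least squares) the
    correlations observed in the study sample over a set of SNP pairs."""
    best_w, best_err = 0.0, float('inf')
    for i in range(n_grid):
        w = i / (n_grid - 1)
        err = sum((w * r1 + (1.0 - w) * r2 - r) ** 2
                  for (r1, r2), r in zip(panel_r, study_r))
        if err < best_err:
            best_w, best_err = w, err
    return best_w

# Correlations for two SNP pairs in two reference panels vs. the study (invented)
panel_r = [(1.0, 0.0), (0.5, 0.1)]
print(best_mixture_weight(panel_r, [0.3, 0.22]))   # -> 0.3
```

The real method estimates such mixtures locally, so regions with different ancestry in an admixed genome can draw on different panels.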

  8. A brief introduction to computer-intensive methods, with a view towards applications in spatial statistics and stereology.

    PubMed

    Mattfeldt, Torsten

    2011-04-01

    Computer-intensive methods may be defined as data analytical procedures involving a huge number of highly repetitive computations. We mention resampling methods with replacement (bootstrap methods), resampling methods without replacement (randomization tests) and simulation methods. The resampling methods are based on simple and robust principles and are largely free from distributional assumptions. Bootstrap methods may be used to compute confidence intervals for a scalar model parameter and for summary statistics from replicated planar point patterns, and for significance tests. For some simple models of planar point processes, point patterns can be simulated by elementary Monte Carlo methods. The simulation of models with more complex interaction properties usually requires more advanced computing methods. In this context, we mention simulation of Gibbs processes with Markov chain Monte Carlo methods using the Metropolis-Hastings algorithm. An alternative to simulations on the basis of a parametric model consists of stochastic reconstruction methods. The basic ideas behind the methods are briefly reviewed and illustrated by simple worked examples in order to encourage novices in the field to use computer-intensive methods. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.
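The percentile bootstrap mentioned above is easily sketched (a minimal illustration with invented data, not one of the article's worked examples):

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05):
    """Percentile bootstrap: resample the data with replacement, recompute
    the statistic, and take empirical quantiles of the replicates."""
    n = len(data)
    reps = sorted(stat([random.choice(data) for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

def mean(xs):
    return sum(xs) / len(xs)

random.seed(0)
sample = [4.1, 5.3, 2.8, 6.0, 4.7, 5.5, 3.9, 4.4, 5.1, 3.6]
print(bootstrap_ci(sample, mean))   # 95% CI for the mean
```

The same resampling loop works for any scalar statistic, which is what makes the method attractive for the point-pattern summaries the review discusses.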

  9. Chemical & Biological Point Detection Decontamination

    DTIC Science & Technology

    2002-04-01

    high priority in biological defense. Research on multivalent assays is also ongoing. Biased libraries, generated from immunized animals, or unbiased ...2003 TBD decontamination and modeling and simulation I I The Chem-Bio Point Detection Roadmap The summary level updated and expanded Bio Point... Molecular Imprinted Polymer Sensor, Dendrimer-based Antibody Assays, Pyrolysis-GC-ion mobility spectrometry, and surface enhanced Raman spectroscopy. Data

  10. AESS: Accelerated Exact Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Jenkins, David D.; Peterson, Gregory D.

    2011-12-01

    The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results. Program summary. Program title: AESS Catalogue identifier: AEJW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: University of Tennessee copyright agreement No. of lines in distributed program, including test data, etc.: 10 861 No. of bytes in distributed program, including test data, etc.: 394 631 Distribution format: tar.gz Programming language: C for processors, CUDA for NVIDIA GPUs Computer: Developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs. The system targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators. Operating system: Tested under Ubuntu Linux OS and CentOS 5.5 Linux OS Classification: 3, 16.12 Nature of problem: Simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME. 
Solution method: The Accelerated Exact Stochastic Simulation (AESS) tool provides implementations of a wide variety of popular variations on the Gillespie method. Users can select the specific algorithm considered most appropriate. Comparisons between the methods and with other available implementations indicate that AESS provides the fastest known implementation of Gillespie's method for a variety of test models. Users may wish to execute ensembles of simulations to sweep parameters or to obtain better statistical results, so AESS supports acceleration of ensembles of simulation using parallel processing with MPI, SSE vector units on x86 processors, and/or using NVIDIA GPUs with CUDA.
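Gillespie's direct method, the algorithm AESS accelerates, fits in a short sketch (a toy reversible isomerisation with invented rate constants, not the AESS implementation):

```python
import math
import random

def gillespie_ssa(state, reactions, t_end):
    """Gillespie's direct method (SSA). `reactions` is a list of
    (propensity_fn, update_fn) pairs acting on a mutable state dict."""
    t = 0.0
    traj = [(0.0, dict(state))]
    while True:
        props = [prop(state) for prop, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:                # no reaction can fire any more
            break
        # Exponential waiting time to the next reaction
        t += -math.log(1.0 - random.random()) / a0
        if t > t_end:
            break
        r = random.random() * a0     # choose a reaction proportional to propensity
        for a, (_, update) in zip(props, reactions):
            if r < a:
                update(state)
                break
            r -= a
        traj.append((t, dict(state)))
    return traj

# Toy reversible isomerisation A <-> B (rate constants invented for illustration)
reactions = [
    (lambda s: 1.0 * s['A'], lambda s: s.update(A=s['A'] - 1, B=s['B'] + 1)),
    (lambda s: 0.5 * s['B'], lambda s: s.update(A=s['A'] + 1, B=s['B'] - 1)),
]
random.seed(42)
traj = gillespie_ssa({'A': 100, 'B': 0}, reactions, t_end=10.0)
print(traj[-1])   # final (time, state); A + B stays 100 throughout
```

The performance problem AESS addresses is visible even here: every step rescans all propensities, which is exactly what the optimized variants (and GPU ensembles) avoid.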

  11. Summary of hydrologic modeling for the Delaware River Basin using the Water Availability Tool for Environmental Resources (WATER)

    USGS Publications Warehouse

    Williamson, Tanja N.; Lant, Jeremiah G.; Claggett, Peter; Nystrom, Elizabeth A.; Milly, Paul C.D.; Nelson, Hugh L.; Hoffman, Scott A.; Colarullo, Susan J.; Fischer, Jeffrey M.

    2015-11-18

    The Water Availability Tool for Environmental Resources (WATER) is a decision support system for the nontidal part of the Delaware River Basin that provides a consistent and objective method of simulating streamflow under historical, forecasted, and managed conditions. In order to quantify the uncertainty associated with these simulations, however, streamflow and the associated hydroclimatic variables of potential evapotranspiration, actual evapotranspiration, and snow accumulation and snowmelt must be simulated and compared to long-term, daily observations from sites. This report details model development and optimization, statistical evaluation of simulations for 57 basins ranging from 2 to 930 km2 and 11.0 to 99.5 percent forested cover, and how this statistical evaluation of daily streamflow relates to simulating environmental changes and management decisions that are best examined at monthly time steps normalized over multiple decades. The decision support system provides a database of historical spatial and climatic data for simulating streamflow for 2001–11, in addition to land-cover and general circulation model forecasts that focus on 2030 and 2060. WATER integrates geospatial sampling of landscape characteristics, including topographic and soil properties, with a regionally calibrated hillslope-hydrology model, an impervious-surface model, and hydroclimatic models that were parameterized by using three hydrologic response units: forested, agricultural, and developed land cover. This integration enables the regional hydrologic modeling approach used in WATER without requiring site-specific optimization or those stationary conditions inferred when using a statistical model.

  12. A summary of computational experience at GE Aircraft Engines for complex turbulent flows in gas turbines

    NASA Astrophysics Data System (ADS)

    Zerkle, Ronald D.; Prakash, Chander

    1995-03-01

    This viewgraph presentation summarizes some CFD experience at GE Aircraft Engines for flows in the primary gaspath of a gas turbine engine and in turbine blade cooling passages. It is concluded that application of the standard k-epsilon turbulence model with wall functions is not adequate for accurate CFD simulation of aerodynamic performance and heat transfer in the primary gas path of a gas turbine engine. New models are required in the near-wall region which include more physics than wall functions. The two-layer modeling approach appears attractive because of its computational simplicity. In addition, improved CFD simulation of film cooling and turbine blade internal cooling passages will require anisotropic turbulence models. New turbulence models must be practical in order to have a significant impact on the engine design process. A coordinated turbulence modeling effort between NASA centers would be beneficial to the gas turbine industry.

  13. A summary of computational experience at GE Aircraft Engines for complex turbulent flows in gas turbines

    NASA Technical Reports Server (NTRS)

    Zerkle, Ronald D.; Prakash, Chander

    1995-01-01

    This viewgraph presentation summarizes some CFD experience at GE Aircraft Engines for flows in the primary gaspath of a gas turbine engine and in turbine blade cooling passages. It is concluded that application of the standard k-epsilon turbulence model with wall functions is not adequate for accurate CFD simulation of aerodynamic performance and heat transfer in the primary gas path of a gas turbine engine. New models are required in the near-wall region which include more physics than wall functions. The two-layer modeling approach appears attractive because of its computational simplicity. In addition, improved CFD simulation of film cooling and turbine blade internal cooling passages will require anisotropic turbulence models. New turbulence models must be practical in order to have a significant impact on the engine design process. A coordinated turbulence modeling effort between NASA centers would be beneficial to the gas turbine industry.

  14. Numerical aerodynamic simulation facility feasibility study, executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    There were three major issues examined in the feasibility study. First, the ability of the proposed system architecture to support the anticipated workload was evaluated. Second, the throughput of the computational engine (the flow model processor) was studied using real application programs. Third, the availability, reliability, and maintainability of the system were modeled. The evaluations were based on the baseline systems. The results show that the implementation of the Numerical Aerodynamic Simulation Facility, in the form considered, would indeed be a feasible project with an acceptable level of risk. The technology required (both hardware and software) either already exists or, in the case of a few parts, is expected to be announced this year.

  15. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  16. Process modelling and simulation of underground coal gasification: A Review of work done at IIT Bombay

    NASA Astrophysics Data System (ADS)

    Sharma, Surabhi; Mahajani, Sanjay M.

    2017-07-01

    This paper presents a summary of the work performed over the last decade at IIT Bombay by the UCG group. The overall objective is to determine the feasibility of a given coal for underground coal gasification and then determine the capacity of a single pair of wells through modelling and simulation. This would help one design a UCG facility for the desired rate of gas production. The simulator developed in this study seeks inputs on four important aspects: the kinetics of all the reactions under the conditions of interest; heat and mass transfer limitations, if any; the flow patterns inside the cavity; and the thermo-mechanical failure of the coal. Each of these requires detailed laboratory study. Indian lignite from one of the reserves was chosen as a case study.

  17. Coupled fvGCM-GCE Modeling System, TRMM Latent Heating and Cloud Library

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2004-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build an MMF based on the 2D GCE model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF will be developed by the end of 2004 and production runs will be conducted at the beginning of 2005. The purpose of this proposal is to augment the current Goddard MMF and other cloud modeling activities. In this talk, I will present: (1) a summary of the second Cloud Modeling Workshop held at NASA Goddard, (2) a summary of the third TRMM Latent Heating Workshop held in Nara, Japan, (3) a brief discussion of the Goddard research plan for using the Weather Research and Forecasting (WRF) model, and (4) a brief discussion of developing a global cloud simulator based on the GCE model.

  18. The atmospheric effects of stratospheric aircraft. Report of the 1992 Models and Measurements Workshop. Volume 1: Workshop objectives and summary

    NASA Technical Reports Server (NTRS)

    Prather, Michael J. (Editor); Remsburg, Ellis E. (Editor)

    1993-01-01

    This Workshop on Stratospheric Models and Measurements (M&M) marks a significant expansion in the history of model intercomparisons. It provides a foundation for establishing the credibility of stratospheric models used in environmental assessments of chlorofluorocarbons, aircraft emissions, and climate-chemistry interactions. The core of the M&M comparisons involves the selection of observations of the current stratosphere (i.e., within the last 15 years): these data are believed to be accurate and representative of certain aspects of stratospheric chemistry and dynamics that the models should be able to simulate.

  19. A marketing approach to carpool demand analysis. Technical memorandum III. Tradeoff model and policy simulation. Conservation paper. [Commuter survey in 3 major urban areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-07-01

    The memorandum discusses the theoretical basis of the trade-off model and its adaptation, particularly in the simulation procedures used in evaluating specific policies. Two published articles dealing with the development and application of the trade-off model for market research are included as appendices to this memorandum. This model was the primary instrument used in connection with a research effort examining the role of individuals' attitudes and perceptions in deciding whether or not to carpool. The research was based upon a survey of commuters in 3 major urban areas and has resulted in a sizeable new database on respondents' socio-economic and worktrip characteristics, travel perceptions, and travel preferences. The research is contained in the Summary Report, also available through NTIS.

  20. Coupled fvGCM-GCE Modeling System: TRMM Latent Heating and Cloud Library

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build an MMF based on the 2D GCE model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF will be developed by the end of 2004 and production runs will be conducted at the beginning of 2005. The purpose of this proposal is to augment the current Goddard MMF and other cloud modeling activities. In this talk, I will present: (1) a summary of the second Cloud Modeling Workshop held at NASA Goddard, (2) a summary of the third TRMM Latent Heating Workshop held in Nara, Japan, and (3) a brief discussion of developing a global cloud simulator based on the GCE model.

  1. A Comparative Analysis of Kalman Filters Using a Hypervelocity Missile Simulation.

    DTIC Science & Technology

    1981-12-01

    2-29 2.9 Summary ......... ...................... . 2-35 III. Kalman Filter Development ..... ............... ... 3-1 3.1 Introduction...3-2 3.1.4 Assumptions ................ 3-3 3.2 Development of Line-of-Sight Filters ......... ... 3-4 3.2.1 Introduction ....... .............. . 3-4... Development of Inertial Filters ... ......... ... 3-20 3.3.1 Introduction ...... ................ ... 3-20 3.3.2 Filter Model I.I

  2. Big-leaf mahogany Swietenia macrophylla population dynamics and implications for sustainable management

    Treesearch

    James Grogan; R. Matthew Landis; Christopher M. Free; Mark D. Schulze; Marco Lentini; Mark S. Ashton

    2014-01-01

    Summary 1. The impacts of selective harvesting in tropical forests on population recovery and future timber yields by high-value species remain largely unknown for lack of demographic data spanning all phases of life history, from seed to senescence. In this study, we use an individual-based model parameterized using 15 years of annual census data to simulate...

  3. Planetary geomorphology research: FY 1990-1991

    NASA Technical Reports Server (NTRS)

    Malin, M. C.

    1991-01-01

    Progress in the following research areas is discussed: (1) volatile ice sublimation in a simulated Martian polar environment; (2) a global synthesis of Venusian tectonics; (3) a summary of nearly a decade of field studies of eolian processes in cold volcanic deserts; and (4) a model for interpretation of Martian sediment distribution using Viking observations. Some conclusions from the research are presented.

  4. Summary of Research 1998, Department of Mechanical Engineering.

    DTIC Science & Technology

    1999-08-01

    thermoacoustic behavior in strong zero-mean oscillatory flows with potential application to the design of heat exchangers in thermoacoustic engines... important feature in the thermal characterization of microtubes, which are to be used in microheat exchangers. DoD KEY TECHNOLOGY AREA: Modeling and Simulation KEYWORDS: Laminar Duct Flows, Convection and Conduction Heat Transfer, Axial Conduction, Microheat Exchangers DEVELOPMENT AND CALIBRATION

  5. Review of Diagnostics for Water Sources in General Circulation Models (GCMs)

    NASA Technical Reports Server (NTRS)

    Bosilovich, M.

    2003-01-01

    We will describe the uses of passive tracers in GCMs to compute the geographical sources of water for precipitation. We will present a summary of recent research and how this methodology can be applied in climate and climate change studies. We will also discuss the possibility of using passive tracers in conjunction with simulations and observations of stable water isotopes.

  6. Turbulence modeling for hypersonic flight

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.

    1992-01-01

    The objective of the present work is to develop, verify, and incorporate two equation turbulence models which account for the effect of compressibility at high speeds into a three dimensional Reynolds averaged Navier-Stokes code and to provide documented model descriptions and numerical procedures so that they can be implemented into the National Aerospace Plane (NASP) codes. A summary of accomplishments is listed: (1) Four codes have been tested and evaluated against a flat plate boundary layer flow and an external supersonic flow; (2) a code named RANS was chosen because of its speed, accuracy, and versatility; (3) the code was extended from thin boundary layer to full Navier-Stokes; (4) the K-omega two equation turbulence model has been implemented into the base code; (5) a 24 degree laminar compression corner flow has been simulated and compared to other numerical simulations; and (6) work is in progress in writing the numerical method of the base code including the turbulence model.

  7. AFHRL/FT [Air Force Human Resources Laboratory/Flight Training] Capabilities in Undergraduate Pilot Training Simulation Research: Executive Summary.

    ERIC Educational Resources Information Center

    Matheny, W. G.; And Others

    The document presents a summary description of the Air Force Human Resource Laboratory's Flying Training Division (AFHRL/FT) research capabilities for undergraduate pilot training. One of the research devices investigated is the Advanced Simulator for Undergraduate Pilot Training (ASUPT). The equipment includes the ASUPT, the instrumented T-37…

  8. Construct validity of individual and summary performance metrics associated with a computer-based laparoscopic simulator.

    PubMed

    Rivard, Justin D; Vergis, Ashley S; Unger, Bertram J; Hardy, Krista M; Andrew, Chris G; Gillman, Lawrence M; Park, Jason

    2014-06-01

    Computer-based surgical simulators capture a multitude of metrics based on different aspects of performance, such as speed, accuracy, and movement efficiency. However, without rigorous assessment, it may be unclear whether all, some, or none of these metrics actually reflect technical skill, which can compromise educational efforts on these simulators. We assessed the construct validity of individual performance metrics on the LapVR simulator (Immersion Medical, San Jose, CA, USA) and used these data to create task-specific summary metrics. Medical students with no prior laparoscopic experience (novices, N = 12), junior surgical residents with some laparoscopic experience (intermediates, N = 12), and experienced surgeons (experts, N = 11) all completed three repetitions of four LapVR simulator tasks. The tasks included three basic skills (peg transfer, cutting, clipping) and one procedural skill (adhesiolysis). We selected 36 individual metrics on the four tasks that assessed six different aspects of performance, including speed, motion path length, respect for tissue, accuracy, task-specific errors, and successful task completion. Four of seven individual metrics assessed for peg transfer, six of ten metrics for cutting, four of nine metrics for clipping, and three of ten metrics for adhesiolysis discriminated between experience levels. Time and motion path length were significant on all four tasks. We used the validated individual metrics to create summary equations for each task, which successfully distinguished between the different experience levels. Educators should maintain some skepticism when reviewing the plethora of metrics captured by computer-based simulators, as some but not all are valid. We showed the construct validity of a limited number of individual metrics and developed summary metrics for the LapVR. The summary metrics provide a succinct way of assessing skill with a single metric for each task, but require further validation.
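One plausible construction for such task-level summary metrics (a hypothetical sketch, not the authors' validated equations) is a composite of z-scores against a reference cohort, sign-flipped so that higher always means better:

```python
def z_score_summary(metrics, reference, lower_is_better):
    """Composite skill score: average of z-scores relative to a reference
    cohort, sign-flipped so that higher always means better performance.

    metrics:         {name: value} for one trial
    reference:       {name: (mean, sd)}, e.g. from the novice cohort
    lower_is_better: set of metric names where smaller is better
                     (time, path length, errors)
    All names and numbers below are invented for illustration."""
    total = 0.0
    for name, value in metrics.items():
        mean, sd = reference[name]
        z = (value - mean) / sd
        if name in lower_is_better:
            z = -z
        total += z
    return total / len(metrics)

ref = {'time_s': (120.0, 30.0), 'path_mm': (5000.0, 1200.0), 'errors': (4.0, 2.0)}
expert = {'time_s': 60.0, 'path_mm': 3200.0, 'errors': 1.0}
print(round(z_score_summary(expert, ref, {'time_s', 'path_mm', 'errors'}), 3))  # -> 1.667
```

Restricting the composite to the metrics that individually showed construct validity, as the study does, keeps non-discriminating metrics from diluting the score.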

  9. Global Mean Temperature Timeseries Projections from GCMs: The Implications of Rebasing

    NASA Astrophysics Data System (ADS)

    Chapman, S. C.; Stainforth, D. A.; Watkins, N. W.

    2017-12-01

    Global climate models are assessed by comparison with observations through several benchmarks. One highlighted by the Intergovernmental Panel on Climate Change (IPCC) is their ability to reproduce "general features of the global and annual mean surface temperature changes over the historical period" [1,2] and to simulate "a trend in global-mean surface temperature from 1951 to 2012 that agrees with the observed trend" [3]. These aspects of annual mean global mean temperature (GMT) change are presented as one feature demonstrating the relevance of these models for climate projections. Here we consider a formal interpretation of "general features" and discuss the implications of this approach to model assessment and intercomparison for the interpretation of GCM projections. Following the IPCC, we interpret a major element of "general features" as being the slow timescale response to external forcings. (Shorter-timescale behaviour, such as the response to volcanic eruptions, is also an element of "general features" but is not considered here.) Also following the IPCC, we consider only GMT anomalies. The models have absolute temperatures which range over about 3 K, so their timeseries (and the observations) are rebased. We show that rebasing, in combination with general agreement, implies a separation of scales which limits the degree to which sub-global behaviour can feed back on the global response. It also implies a degree of linearity in the GMT slow timescale response. For each individual model these implications only apply over the range of absolute temperatures simulated by the model in historical simulations. Taken together, however, they imply consequences over a wider range of GMTs. [1] IPCC, Fifth Assessment Report, Working Group 1, Technical Summary: Stocker et al. 2013. [2] IPCC, Fifth Assessment Report, Working Group 1, Chapter 9 - "Evaluation of Climate Models": Flato et al. 2013. 
[3] IPCC, Fifth Assessment Report, Working Group 1, Summary for Policy Makers: IPCC, 2013.
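The rebasing described above can be made concrete with a short sketch; the years, trend, reference window, and roughly 3 K offset between the two hypothetical model series below are illustrative assumptions, not data from any model or observation set.

```python
# Rebase absolute global-mean temperature series to anomalies relative
# to a common reference period (all numbers here are illustrative).

def rebase(years, temps, ref_start, ref_end):
    """Return temps expressed as anomalies from the reference-period mean."""
    ref = [t for y, t in zip(years, temps) if ref_start <= y <= ref_end]
    baseline = sum(ref) / len(ref)
    return [t - baseline for t in temps]

years = list(range(1950, 1960))
# Two hypothetical model series whose absolute temperatures differ by ~3 K
model_a = [287.0 + 0.02 * i for i in range(10)]
model_b = [290.0 + 0.02 * i for i in range(10)]

anom_a = rebase(years, model_a, 1950, 1959)
anom_b = rebase(years, model_b, 1950, 1959)
```

After rebasing, the two series coincide despite the 3 K absolute offset, which is why agreement between anomaly series says nothing about agreement in absolute temperature.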

  10. mizer: an R package for multispecies, trait-based and community size spectrum ecological modelling.

    PubMed

    Scott, Finlay; Blanchard, Julia L; Andersen, Ken H

    2014-10-01

    Size spectrum ecological models are representations of a community of individuals which grow and change trophic level. A key emergent feature of these models is the size spectrum: the total abundance of all individuals, which scales negatively with size. The models we focus on are designed to capture fish community dynamics useful for assessing the community impacts of fishing. We present mizer, an R package for implementing dynamic size spectrum ecological models of an entire aquatic community subject to fishing. Multiple fishing gears can be defined and fishing mortality can change through time, making it possible to simulate a range of exploitation strategies and management options. mizer implements three versions of the size spectrum modelling framework: the community model, where individuals are characterized only by their size; the trait-based model, where individuals are further characterized by their asymptotic size; and the multispecies model, where additional trait differences are resolved. A range of plot, community indicator and summary methods are available to inspect the results of the simulations.
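As a loose illustration of the emergent size-spectrum scaling described above (not mizer's actual dynamics, which are in R and far richer), one can generate abundances on logarithmic size bins with a power-law slope and recover that slope by log-log regression; the exponent and bin layout are arbitrary assumptions.

```python
import math

# Hypothetical community size spectrum: abundance N(w) ~ w**(-lam)
lam = 2.0                                            # assumed spectrum exponent
weights = [10 ** (0.1 * k) for k in range(1, 41)]    # log-spaced body sizes
abundance = [w ** (-lam) for w in weights]

# Estimate the slope of log N versus log w by least squares
xs = [math.log(w) for w in weights]
ys = [math.log(n) for n in abundance]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
```

The fitted slope recovers the negative exponent, the "scales negatively with size" property the abstract refers to.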

  11. Line-by-line spectroscopic simulations on graphics processing units

    NASA Astrophysics Data System (ADS)

    Collange, Sylvain; Daumas, Marc; Defour, David

    2008-01-01

    We report here on software that performs line-by-line spectroscopic simulations on gases. Elaborate models (such as narrow band and correlated-K) are accurate and efficient for bands where various components are not simultaneously and significantly active. Line-by-line is probably the most accurate model in the infrared for blends of gases that contain high proportions of H2O and CO2, as was the case for our prototype simulation. Our implementation on graphics processing units sustains a speedup close to 330 on computation-intensive tasks and 12 on memory-intensive tasks compared to implementations on one core of high-end processors. This speedup is due to data parallelism, efficient memory access for specific patterns and some dedicated hardware operators only available in graphics processing units. It is obtained leaving most of processor resources available and it would scale linearly with the number of graphics processing units in parallel machines. Line-by-line simulation coupled with simulation of fluid dynamics was long believed to be economically intractable but our work shows that it could be done with some affordable additional resources compared to what is necessary to perform simulations on fluid dynamics alone. Program summary. Program title: GPU4RE. Catalogue identifier: ADZY_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZY_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 62 776. No. of bytes in distributed program, including test data, etc.: 1 513 247. Distribution format: tar.gz. Programming language: C++. Computer: x86 PC. Operating system: Linux, Microsoft Windows. Compilation requires either gcc/g++ under Linux or Visual C++ 2003/2005 and Cygwin under Windows. 
It has been tested using gcc 4.1.2 under Ubuntu Linux 7.04 and using Visual C++ 2005 with Cygwin 1.5.24 under Windows XP. RAM: 1 gigabyte Classification: 21.2 External routines: OpenGL ( http://www.opengl.org) Nature of problem: Simulating radiative transfer on high-temperature high-pressure gases. Solution method: Line-by-line Monte-Carlo ray-tracing. Unusual features: Parallel computations are moved to the GPU. Additional comments: nVidia GeForce 7000 or ATI Radeon X1000 series graphics processing unit is required. Running time: A few minutes.

  12. A stochastic simulator of birth-death master equations with application to phylodynamics.

    PubMed

    Vaughan, Timothy G; Drummond, Alexei J

    2013-06-01

    In this article, we present a versatile new software tool for the simulation and analysis of stochastic models of population phylodynamics and chemical kinetics. Models are specified via an expressive and human-readable XML format and can be used as the basis for generating either single population histories or large ensembles of such histories. Importantly, phylogenetic trees or networks can be generated alongside the histories they correspond to, enabling investigations into the interplay between genealogies and population dynamics. Summary statistics such as means and variances can be recorded in place of the full ensemble, allowing for a reduction in the amount of memory used--an important consideration for models including large numbers of individual subpopulations or demes. In the case of population size histories, the resulting simulation output is written to disk in the flexible JSON format, which is easily read into numerical analysis environments such as R for visualization or further processing. Simulated phylogenetic trees can be recorded using the standard Newick or NEXUS formats, with extensions to these formats used for non-tree-like inheritance relationships.
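A minimal sketch of the kind of stochastic birth-death simulation such a tool performs (a Gillespie-style direct method with arbitrary rates, not the package's XML-driven engine) might look like this:

```python
import random

def simulate_birth_death(n0, birth, death, t_end, rng):
    """Gillespie simulation of a linear birth-death process.

    Each individual gives birth at rate `birth` and dies at rate `death`.
    Returns the list of (time, population) states visited.
    """
    t, n = 0.0, n0
    history = [(t, n)]
    while n > 0:
        total = (birth + death) * n        # total event rate
        t += rng.expovariate(total)        # exponential waiting time
        if t >= t_end:
            break
        # Birth with probability birth/(birth+death), otherwise death
        n += 1 if rng.random() < birth / (birth + death) else -1
        history.append((t, n))
    return history

rng = random.Random(42)
hist = simulate_birth_death(n0=20, birth=1.0, death=0.5, t_end=5.0, rng=rng)
```

Recording only running summary statistics (mean, variance of the population at fixed times) instead of full histories is what gives the memory saving the abstract mentions.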

  13. A Stochastic Simulator of Birth–Death Master Equations with Application to Phylodynamics

    PubMed Central

    Vaughan, Timothy G.; Drummond, Alexei J.

    2013-01-01

    In this article, we present a versatile new software tool for the simulation and analysis of stochastic models of population phylodynamics and chemical kinetics. Models are specified via an expressive and human-readable XML format and can be used as the basis for generating either single population histories or large ensembles of such histories. Importantly, phylogenetic trees or networks can be generated alongside the histories they correspond to, enabling investigations into the interplay between genealogies and population dynamics. Summary statistics such as means and variances can be recorded in place of the full ensemble, allowing for a reduction in the amount of memory used—an important consideration for models including large numbers of individual subpopulations or demes. In the case of population size histories, the resulting simulation output is written to disk in the flexible JSON format, which is easily read into numerical analysis environments such as R for visualization or further processing. Simulated phylogenetic trees can be recorded using the standard Newick or NEXUS formats, with extensions to these formats used for non-tree-like inheritance relationships. PMID:23505043

  14. University Research in Support of TREAT Modeling and Simulation, FY 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark David

    Idaho National Laboratory is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. In support of this research, INL is working with four universities to explore advanced solution methods that will complement or augment capabilities in MAMMOTH. This report consists of a collection of year-end summaries of research from the universities performed in support of TREAT modeling and simulation. This research was led by Prof. Sedat Goluoglu at the University of Florida, Profs. Jim Morel and Jean Ragusa at Texas A&M University, Profs. Benoit Forget and Kord Smith at Massachusetts Institute of Technology, Prof. Leslie Kerby of Idaho State University and Prof. Barry Ganapol of the University of Arizona. A significant number of students were supported at various levels through the projects and, for some, also as interns at INL.

  15. Multiple phenotype association tests using summary statistics in genome-wide association studies.

    PubMed

    Liu, Zhonghua; Lin, Xihong

    2018-03-01

    In this article, we study joint testing of the associations of a genetic variant with multiple correlated phenotypes, using the summary statistics of individual-phenotype analyses from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual-phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes that account for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple-phenotype testing procedures that jointly test a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a Global Lipids Genetics Consortium GWAS summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.
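The general idea of combining per-phenotype summary statistics while accounting for their correlation can be sketched with the standard quadratic-form statistic T = z' R^(-1) z (this is a common baseline test, not the mixed-model tests proposed in the paper; the z-scores and correlation matrix below are made-up numbers).

```python
def solve(a, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]     # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def multi_phenotype_chi2(z, R):
    """Quadratic form T = z' R^{-1} z for K correlated z-scores.

    Under the null, T is chi-squared with K degrees of freedom when R is
    the true between-phenotype correlation of the summary z-scores.
    """
    return sum(zi * wi for zi, wi in zip(z, solve(R, z)))

# Illustrative only: three phenotypes, moderately correlated statistics
z = [2.1, 1.8, -0.4]
R = [[1.0, 0.5, 0.2],
     [0.5, 1.0, 0.3],
     [0.2, 0.3, 1.0]]
T = multi_phenotype_chi2(z, R)
```

With an identity correlation matrix the statistic reduces to the sum of squared z-scores, which is a quick sanity check on the implementation.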

  16. A molecular dynamics implementation of the 3D Mercedes-Benz water model

    NASA Astrophysics Data System (ADS)

    Hynninen, T.; Dias, C. L.; Mkrtchyan, A.; Heinonen, V.; Karttunen, M.; Foster, A. S.; Ala-Nissila, T.

    2012-02-01

    The three-dimensional Mercedes-Benz model was recently introduced to account for the structural and thermodynamic properties of water. It treats water molecules as point-like particles with four dangling bonds in tetrahedral coordination, representing the H-bonds of water. Its conceptual simplicity renders the model attractive in studies where complex behaviors emerge from H-bond interactions in water, e.g., the hydrophobic effect. A molecular dynamics (MD) implementation of the model is non-trivial and we outline here the mathematical framework of its force field. Useful routines written in modern Fortran are also provided. This open source code is free and can easily be modified to account for different physical contexts. The provided code allows both serial and MPI-parallelized execution. Program summary. Program title: CASHEW (Coarse Approach Simulator for Hydrogen-bonding Effects in Water). Catalogue identifier: AEKM_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKM_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 20 501. No. of bytes in distributed program, including test data, etc.: 551 044. Distribution format: tar.gz. Programming language: Fortran 90. Computer: Program has been tested on desktop workstations and a Cray XT4/XT5 supercomputer. Operating system: Linux, Unix, OS X. Has the code been vectorized or parallelized?: The code has been parallelized using MPI. RAM: Depends on size of system, about 5 MB for 1500 molecules. Classification: 7.7. External routines: A random number generator, Mersenne Twister (http://www.math.sci.hiroshima-u.ac.jp/m-mat/MT/VERSIONS/FORTRAN/mt95.f90), is used. A copy of the code is included in the distribution. Nature of problem: Molecular dynamics simulation of a new geometric water model. 
Solution method: New force-field for water molecules, velocity-Verlet integration, representation of molecules as rigid particles with rotations described using quaternion algebra. Restrictions: Memory and cpu time limit the size of simulations. Additional comments: Software web site: https://gitorious.org/cashew/. Running time: Depends on the size of system. The sample tests provided only take a few seconds.

  17. Piloted Simulation Investigation of a Supersonic Transport Configuration (LaRC.4)

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Martinez, Debbie; Derry, Stephen D.

    1999-01-01

    This report contains a description of the test facilities and software utilized during a joint NASA/aerospace industry study of improved control laws and desired inceptor characteristics for a candidate supersonic transport aircraft design. Details concerning the characteristics of the simulation cockpit, image generator and display systems, and motion platform are described. Depictions of the various display formats are included. The test schedule, session log, and flight cards describing the maneuvers performed are included. A brief summary of highlights of the study is given. Modifications made to the industry-provided simulation model are described. This report is intended to serve as a reference document for industry researchers.

  18. Integration of Computational Geometry, Finite Element, and Multibody System Algorithms for the Development of New Computational Methodology for High-Fidelity Vehicle Systems Modeling and Simulation. ADDENDUM

    DTIC Science & Technology

    2013-11-12

    Name of Contractor: Computational Dynamics Inc. (CDI), 1809... Point of Contact: Dr. Paramsothy Jayakumar, TARDEC, (586) 282-4896. Project Summary: This project aims at addressing and remedying the serious... Shabana, A.A., Jayakumar, P., and Letherwood, M., "Soil Models and Vehicle System Dynamics", Applied Mechanics Reviews, Vol. 65(4), 2013, doi

  19. A Review of Numerical Simulation and Analytical Modeling for Medical Devices Safety in MRI

    PubMed Central

    Kabil, J.; Belguerras, L.; Trattnig, S.; Pasquier, C.; Missoffe, A.

    2016-01-01

    Summary Objectives: To review past and present challenges and ongoing trends in numerical simulation for MRI (Magnetic Resonance Imaging) safety evaluation of medical devices. Methods: A wide literature review on numerical and analytical simulation of simple or complex medical devices in MRI electromagnetic fields shows the evolution through time and a growing concern for MRI safety over the years. Major issues and achievements are described, as well as current trends and perspectives in this research field. Results: Numerical simulation of medical devices is constantly evolving, supported by calculation methods that are now well established. Implants with simple geometry can often be simulated in a computational human model, but one issue remaining today is the experimental validation of these human models. A great concern is to assess RF heating of implants too complex to be traditionally simulated, like pacemaker leads. Thus, ongoing research focuses on alternative hybrid methods, both numerical and experimental, with, for example, a transfer function method. For the static field and gradient fields, analytical models can be used for dimensioning simple implant shapes, but are limited for complex geometries that cannot be studied with simplifying assumptions. Conclusions: Numerical simulation is an essential tool for MRI safety testing of medical devices. The main issues remain the accuracy of simulations compared to real life and the study of complex devices; but as the research field is constantly evolving, some promising ideas are now under investigation to take up these challenges. PMID:27830244

  20. Modeling viscoelasticity through spring–dashpot models in intermittent-contact atomic force microscopy

    PubMed Central

    López-Guerra, Enrique A

    2014-01-01

    Summary We examine different approaches to model viscoelasticity within atomic force microscopy (AFM) simulation. Our study ranges from very simple linear spring–dashpot models to more sophisticated nonlinear systems that are able to reproduce fundamental properties of viscoelastic surfaces, including creep, stress relaxation and the presence of multiple relaxation times. Some of the models examined have been previously used in AFM simulation, but their applicability to different situations has not yet been examined in detail. The behavior of each model is analyzed here in terms of force–distance curves, dissipated energy and any inherent unphysical artifacts. We focus in this paper on single-eigenmode tip–sample impacts, but the models and results can also be useful in the context of multifrequency AFM, in which the tip trajectories are very complex and there is a wider range of sample deformation frequencies (descriptions of tip–sample model behaviors in the context of multifrequency AFM require detailed studies and are beyond the scope of this work). PMID:25551043

  1. Optimized multiple quantum MAS lineshape simulations in solid state NMR

    NASA Astrophysics Data System (ADS)

    Brouwer, William J.; Davis, Michael C.; Mueller, Karl T.

    2009-10-01

    The majority of nuclei available for study in solid state Nuclear Magnetic Resonance have half-integer spin I>1/2, with corresponding electric quadrupole moment. As such, they may couple with a surrounding electric field gradient. This effect introduces anisotropic line broadening to spectra, arising from distinct chemical species within polycrystalline solids. In Multiple Quantum Magic Angle Spinning (MQMAS) experiments, a second frequency dimension is created, devoid of quadrupolar anisotropy. As a result, the center of gravity of peaks in the high resolution dimension is a function of isotropic second order quadrupole and chemical shift alone. However, for complex materials, these parameters take on a stochastic nature due in turn to structural and chemical disorder. Lineshapes may still overlap in the isotropic dimension, complicating the task of assignment and interpretation. A distributed computational approach is presented here which permits simulation of the two-dimensional MQMAS spectrum, generated by random variates from model distributions of isotropic chemical and quadrupole shifts. Owing to the non-convex nature of the residual sum of squares (RSS) function between experimental and simulated spectra, simulated annealing is used to optimize the simulation parameters. In this manner, local chemical environments for disordered materials may be characterized, and via a re-sampling approach, error estimates for parameters produced. Program summary. Program title: mqmasOPT. Catalogue identifier: AEEC_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEC_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 3650. No. 
of bytes in distributed program, including test data, etc.: 73 853. Distribution format: tar.gz. Programming language: C, OCTAVE. Computer: UNIX/Linux. Operating system: UNIX/Linux. Has the code been vectorized or parallelized?: Yes. RAM: Example: (1597 powder angles) × (200 samples) × (81 F2 frequency pts) × (31 F1 frequency pts) = 3.5M, SMP AMD Opteron. Classification: 2.3. External routines: OCTAVE (http://www.gnu.org/software/octave/), GNU Scientific Library (http://www.gnu.org/software/gsl/), OpenMP (http://openmp.org/wp/). Nature of problem: The optimal simulation and modeling of multiple quantum magic angle spinning NMR spectra, for general systems, especially those with mild to significant disorder. The approach outlined and implemented in C and OCTAVE also produces model parameter error estimates. Solution method: A model for each distinct chemical site is first proposed, for the individual contribution of crystallite orientations to the spectrum. This model is averaged over all powder angles [1], as well as the (stochastic) parameters: isotropic chemical shift and quadrupole coupling constant. The latter is accomplished via sampling from a bi-variate Gaussian distribution, using the Box-Muller algorithm to transform Sobol (quasi) random numbers [2]. A simulated annealing optimization is performed, and finally the non-linear jackknife [3] is applied in developing model parameter error estimates. Additional comments: The distribution contains a script, mqmasOpt.m, which runs in the OCTAVE language workspace. Running time: Example: (1597 powder angles) × (200 samples) × (81 F2 frequency pts) × (31 F1 frequency points) = 58.35 seconds, SMP AMD Opteron. References: S.K. Zaremba, Annali di Matematica Pura ed Applicata 73 (1966) 293. H. Niederreiter, Random Number Generation and Quasi-Monte Carlo Methods, SIAM, 1992. T. Fox, D. Hinkley, K. Larntz, Technometrics 22 (1980) 29.
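The Box-Muller step in the solution method can be sketched as follows; ordinary pseudo-random uniforms stand in for the Sobol sequence here, and the univariate case is shown rather than the bi-variate Gaussian sampling used by mqmasOPT.

```python
import math
import random

def box_muller(u1, u2):
    """Map two uniforms on (0, 1] to two independent standard normals."""
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

rng = random.Random(7)
samples = []
for _ in range(20000):
    # 1.0 - random() keeps u1 strictly positive, so log(u1) is defined
    z1, z2 = box_muller(1.0 - rng.random(), rng.random())
    samples.extend([z1, z2])

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Replacing the pseudo-random uniforms with a low-discrepancy Sobol sequence, as the program does, reduces the variance of the resulting spectral averages for a fixed sample count.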

  2. Ab initio folding of proteins using all-atom discrete molecular dynamics

    PubMed Central

    Ding, Feng; Tsao, Douglas; Nie, Huifen; Dokholyan, Nikolay V.

    2008-01-01

    Summary Discrete molecular dynamics (DMD) is a rapid sampling method used in protein folding and aggregation studies. Until now, DMD was used to perform simulations of simplified protein models in conjunction with structure-based force fields. Here, we develop an all-atom protein model and a transferable force field featuring packing, solvation, and environment-dependent hydrogen bond interactions. Using the replica exchange method, we perform folding simulations of six small proteins (20–60 residues) with distinct native structures. In all cases, native or near-native states are reached in simulations. For three small proteins, multiple folding transitions are observed and the computationally-characterized thermodynamics are in quantitative agreement with experiments. The predictive power of all-atom DMD highlights the importance of environment-dependent hydrogen bond interactions in modeling protein folding. The developed approach can be used for accurate and rapid sampling of conformational spaces of proteins and protein-protein complexes, and applied to protein engineering and design of protein-protein interactions. PMID:18611374

  3. Simulation of guided-wave ultrasound propagation in composite laminates: Benchmark comparisons of numerical codes and experiment.

    PubMed

    Leckey, Cara A C; Wheeler, Kevin R; Hafiychuk, Vasyl N; Hafiychuk, Halyna; Timuçin, Doğan A

    2018-03-01

    Ultrasonic wave methods constitute the leading physical mechanism for nondestructive evaluation (NDE) and structural health monitoring (SHM) of solid composite materials, such as carbon fiber reinforced polymer (CFRP) laminates. Computational models of ultrasonic wave excitation, propagation, and scattering in CFRP composites can be extremely valuable in designing practicable NDE and SHM hardware, software, and methodologies that accomplish the desired accuracy, reliability, efficiency, and coverage. The development and application of ultrasonic simulation approaches for composite materials is an active area of research in the field of NDE. This paper presents comparisons of guided wave simulations for CFRP composites implemented using four different simulation codes: the commercial finite element modeling (FEM) packages ABAQUS, ANSYS, and COMSOL, and a custom code executing the Elastodynamic Finite Integration Technique (EFIT). Benchmark comparisons are made between the simulation tools and both experimental laser Doppler vibrometry data and theoretical dispersion curves. A pristine case and a delamination-type case (Teflon insert in the experimental specimen) are studied. A summary is given of the accuracy of the simulation results and the respective computational performance of the four different simulation tools. Published by Elsevier B.V.

  4. Evaluation of NASA's end-to-end data systems using DSDS+

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Davenport, William; Message, Philip

    1994-01-01

    The Data Systems Dynamic Simulator (DSDS+) is a software tool being developed by the authors to evaluate candidate architectures for NASA's end-to-end data systems. Via modeling and simulation, we are able to quickly predict the performance characteristics of each architecture, to evaluate 'what-if' scenarios, and to perform sensitivity analyses. As such, we are using modeling and simulation to help NASA select the optimal system configuration, and to quantify the performance characteristics of this system prior to its delivery. This paper is divided into the following six sections: (1) The role of modeling and simulation in the systems engineering process. In this section, we briefly describe the different types of results obtained by modeling each phase of the systems engineering life cycle, from concept definition through operations and maintenance; (2) Recent applications of DSDS+. In this section, we describe ongoing applications of DSDS+ in support of the Earth Observing System (EOS), and we present some of the simulation results generated for candidate system designs. So far, we have modeled individual EOS subsystems (e.g. the Solid State Recorders used onboard the spacecraft), and we have also developed an integrated model of the EOS end-to-end data processing and data communications systems (from the payloads onboard to the principal investigator facilities on the ground); (3) Overview of DSDS+. In this section we define what a discrete-event model is, and how it works. The discussion is presented relative to the DSDS+ simulation tool that we have developed, including its run-time optimization algorithms, which enable DSDS+ to execute substantially faster than comparable discrete-event simulation tools; (4) Summary. In this section, we summarize our findings and 'lessons learned' during the development and application of DSDS+ to model NASA's data systems; (5) Further Information; and (6) Acknowledgements.
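The discrete-event mechanism described in section (3) can be illustrated with a minimal event-queue kernel; the packet arrival/processing model and all timings below are hypothetical and unrelated to DSDS+ internals.

```python
import heapq

def run(events, t_end):
    """Minimal discrete-event loop: pop the earliest event, execute it,
    let its handler schedule follow-on events, stop past t_end."""
    queue = list(events)             # entries are (time, seq, name, handler)
    heapq.heapify(queue)
    log = []
    while queue:
        t, _, name, handler = heapq.heappop(queue)
        if t > t_end:
            break
        log.append((t, name))
        for new_event in handler(t):  # handlers return new events to schedule
            heapq.heappush(queue, new_event)
    return log

# Hypothetical model: a packet arrives every 2.0 s and is processed 0.5 s later.
seq = [0]                             # unique tie-breaker for heap ordering
def arrival(t):
    seq[0] += 1
    return [(t + 0.5, seq[0] * 2, "process", lambda t2: []),
            (t + 2.0, seq[0] * 2 + 1, "arrival", arrival)]

log = run([(0.0, 0, "arrival", arrival)], t_end=6.0)
```

The future-event list ordered by timestamp is the core data structure of any discrete-event simulator; run-time optimizations of the kind DSDS+ uses typically focus on making this queue and its event dispatch cheap.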

  5. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    PubMed

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, current use is limited, and answers to questions such as what methods to use, and when, remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them, or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix form to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills, let alone money and time, are scarce. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to obtain from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on method comparison and selection.

  6. Numerical studies of various Néel-VBS transitions in SU(N) anti-ferromagnets

    NASA Astrophysics Data System (ADS)

    Kaul, Ribhu K.; Block, Matthew S.

    2015-09-01

    In this manuscript we review recent developments in the numerical simulation of bipartite SU(N) spin models by quantum Monte Carlo (QMC) methods. We provide an account of a large family of newly discovered sign-problem-free spin models which can be simulated in their ground states on large lattices, containing O(10^5) spins, using the stochastic series expansion method with efficient loop algorithms. One of the most important applications so far of these Hamiltonians is to unbiased studies of quantum criticality between Néel and valence bond phases in two dimensions - a summary of this body of work is provided. The article concludes with an overview of the current status of and outlook for future studies of the "designer" Hamiltonians.

  7. Summary of flight tests to determine the spin and controllability characteristics of a remotely piloted, large-scale (3/8) fighter airplane model

    NASA Technical Reports Server (NTRS)

    Holleman, E. C.

    1976-01-01

    An unpowered, large, dynamically scaled airplane model was test flown by remote pilot to investigate the stability and controllability of the configuration at high angles of attack. The configuration proved to be departure/spin resistant; however, spins were obtained by using techniques developed on a flight support simulator. Spin modes at high and medium-high angles of attack were identified, and recovery techniques were investigated. A flight support simulation of the airplane model, mechanized with low speed wind tunnel data over an angle of attack range of ±90 deg and an angle of sideslip range of ±40 deg, provided insight into the effects of altitude, stability, aerodynamic damping, and the operation of the augmented flight control system on spins. Aerodynamic derivatives determined from flight maneuvers were used to correlate model controllability with two proposed departure/spin design criteria.

  8. Darrieus rotor aerodynamics

    NASA Astrophysics Data System (ADS)

    Klimas, P. C.

    1982-05-01

    A summary of the progress of modeling the aerodynamic effects on the blades of a Darrieus wind turbine is presented. Interference is discussed in terms of blade/blade wake interaction and improvements in single and multiple stream tube models, of vortex simulations of blades and their wakes, and a hybrid momentum/vortex code to combine fast computation time with interference-describing capabilities. An empirical model has been developed for treating the properties of dynamic stall such as airfoil geometry, Reynolds number, reduced frequency, angle-of-attack, and Mach number. Pitching circulation has been subjected to simulation as potential flow about a two-dimensional flat plate, along with applications of the concepts of virtual camber and virtual incidence, with a cambered airfoil operating in a rectilinear flowfield. Finally, a need to develop a loading model suitable for nonsymmetrical blade sections is indicated, as well as blade behavior in a dynamic, curvilinear regime.

  9. Summary of FY15 results of benchmark modeling activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arguello, J. Guadalupe

    2015-08-01

    Sandia is a contributing partner in the third phase of a U.S.-German "Joint Project" entitled "Comparison of current constitutive models and simulation procedures on the basis of model calculations of the thermo-mechanical behavior and healing of rock salt." The first goal of the project is to check the ability of numerical modeling tools to correctly describe the relevant deformation phenomena in rock salt under various influences. Achieving this goal will lead to increased confidence in the results of numerical simulations related to the secure storage of radioactive wastes in rock salt, thereby enhancing the acceptance of the results. These results may ultimately be used to make various assertions regarding both the stability analysis of an underground repository in salt, during the operating phase, and the long-term integrity of the geological barrier against the release of harmful substances into the biosphere, in the post-operating phase.

  10. Project FOOTPRINT: Substation modeling and simulations for E1 pulses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Scott D.; Larson, D. J.; Kirkendall, B. A.

    This report consists of a presentation covering: an introduction to CW coupling; an introduction to single-pulse coupling; a description of E1 waveforms; structures in a substation yard, both articulated (part of the substation's defined electrical functionality) and unarticulated (not part of the substation's defined electrical functionality); coupling, both electrical (capacitive) and magnetic (inductive); connectivity to long-line transmission lines; control infrastructure; a summary; and references.

  11. Relative motion of orbiting particles under the influence of perturbing forces. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Eades, J. B., Jr.

    1974-01-01

    The relative motion of orbiting vehicles, under the influence of various perturbing forces, has been studied to determine what influence these inputs, and others, can have. The analytical tasks are described in general terms; the force types considered are outlined, modelled and simulated; and the capabilities of the computer programs which have evolved in support of this work are noted.

  12. TEA CO2 Laser Simulator: A software tool to predict the output pulse characteristics of a TEA CO2 laser

    NASA Astrophysics Data System (ADS)

    Abdul Ghani, B.

    2005-09-01

    "TEA CO2 Laser Simulator" has been designed to simulate the dynamic emission processes of the TEA CO2 laser based on the six-temperature model. The program predicts the behavior of the laser output pulse (power, energy, pulse duration, delay time, FWHM, etc.) depending on the physical and geometrical input parameters (pressure ratio of the gas mixture, reflecting area of the output mirror, media length, losses, filling and decay factors, etc.). Program summary: Title of program: TEA_CO2 Catalogue identifier: ADVW Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVW Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: P.IV DELL PC Setup: Atomic Energy Commission of Syria, Scientific Services Department, Mathematics and Informatics Division Operating system: MS-Windows 9x, 2000, XP Programming language: Delphi 6.0 No. of lines in distributed program, including test data, etc.: 47 315 No. of bytes in distributed program, including test data, etc.: 7 681 109 Distribution format: tar.gz Classification: 15 Laser Physics Nature of the physical problem: "TEA CO2 Laser Simulator" predicts the behavior of the laser output pulse by studying the effect of the physical and geometrical input parameters on the characteristics of the output laser pulse. The laser active medium consists of a CO2-N2-He gas mixture. Method of solution: A six-temperature model for the dynamic emission of the TEA CO2 laser has been adopted in order to predict the parameters of the laser output pulses. The laser electrical pumping was simulated using two approaches: an empirical function (equation (8)) and a differential equation (equation (9)). Typical running time: The running time depends mainly on both the integration interval and the integration step; for a 4 μs period of time and a 0.001 μs integration step (the default values used in the program), the running time is about 4 seconds.
Restrictions on the complexity: Using a very small integration step may halt the program run, owing to the huge number of calculation points and to the limited paging file size of the MS-Windows virtual memory. In such a case, it is recommended to enlarge the paging file to an appropriate size, or to use a larger integration step.
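    The running-time and integration-step trade-off described above is generic to fixed-step integration of coupled rate equations. A minimal sketch, not the six-temperature model itself but a single hypothetical relaxation equation with invented constants, of why too coarse a step fails outright while a finer step merely costs more steps:

```python
import math

def euler(k, T0, T_env, dt, t_end):
    """Fixed-step explicit Euler for the toy relaxation dT/dt = -k * (T - T_env)."""
    T = T0
    for _ in range(int(round(t_end / dt))):
        T += dt * (-k * (T - T_env))
    return T

# All values are illustrative only.
k, T0, T_env, t_end = 50.0, 300.0, 400.0, 1.0
exact = T_env + (T0 - T_env) * math.exp(-k * t_end)
print(abs(euler(k, T0, T_env, 0.001, t_end) - exact) < 1.0)  # fine step: accurate
print(abs(euler(k, T0, T_env, 0.05, t_end)) > 1e3)           # coarse step: blows up
```

    For explicit Euler, stability requires k*dt < 2; the coarse step above violates that bound, so the numerical solution diverges regardless of paging-file size.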

  13. Comments on statistical issues in numerical modeling for underground nuclear test monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W.L.; Anderson, K.K.

    1993-03-01

    The Symposium concluded with prepared summaries by four experts in the disciplines involved. These experts made no mention of statistics and/or the statistical content of the issues. The first author contributed an extemporaneous statement at the Symposium because there are important issues associated with conducting and evaluating numerical modeling that are familiar to statisticians and often treated successfully by them. This note expands upon those extemporaneous remarks. Statistical ideas may be helpful in resolving some numerical modeling issues. Specifically, we comment first on the role of statistical design/analysis in the quantification process to answer the question "what do we know about the numerical modeling of underground nuclear tests?" and second on the peculiar nature of uncertainty analysis for situations involving numerical modeling. The simulations described in the workshop, though associated with topic areas, were basically sets of examples. Each simulation was tuned towards agreeing with either empirical evidence or an expert's opinion of what the empirical evidence would be. While the discussions were reasonable, whether the embellishments were correct or a forced fitting of reality is unclear, which illustrates that "simulation is easy." We also suggest that these examples of simulation are typical and that the questions concerning the legitimacy and the role of knowing the reality are fair, in general, with respect to simulation. The answers will help us understand why "prediction is difficult."

  14. Formal specification and design techniques for wireless sensor and actuator networks.

    PubMed

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to connect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios; this is caused by using imprecise models to analyze, validate and design these systems. Moreover, some simulation platforms do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system.
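    The CPN-based validation idea can be illustrated, in a much-reduced form, with an uncolored Petri-net token game: a transition fires only when every input place holds enough tokens. The places, transitions, and marking below are hypothetical, not taken from the paper's component set:

```python
# Minimal place/transition net: a message send/receive handshake.
# Marking maps place name -> token count.
places = {"msg_ready": 1, "radio_free": 1, "in_flight": 0, "delivered": 0}

# transition name -> (input places with weights, output places with weights)
transitions = {
    "send":    ({"msg_ready": 1, "radio_free": 1}, {"in_flight": 1}),
    "receive": ({"in_flight": 1}, {"delivered": 1, "radio_free": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(places[p] >= k for p, k in pre.items())

def fire(name):
    assert enabled(name), name
    pre, post = transitions[name]
    for p, k in pre.items():
        places[p] -= k
    for p, k in post.items():
        places[p] += k

fire("send")
fire("receive")
print(places["delivered"], places["radio_free"])  # → 1 1
```

    Structural properties such as "the radio is eventually released" can be checked by exploring the reachable markings of such a net; colored nets extend this by attaching data values to tokens.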

  15. Automatic physical inference with information maximizing neural networks

    NASA Astrophysics Data System (ADS)

    Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-04-01

    Compressing large data sets to a manageable number of summaries that are informative about the underlying parameters vastly simplifies both frequentist and Bayesian inference. When only simulations are available, these summaries are typically chosen heuristically, so they may inadvertently miss important information. We introduce a simulation-based machine learning technique that trains artificial neural networks to find nonlinear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). In test cases where the posterior can be derived exactly, likelihood-free inference based on automatically derived IMNN summaries produces nearly exact posteriors, showing that these summaries are good approximations to sufficient statistics. In a series of numerical examples of increasing complexity and astrophysical relevance we show that IMNNs are robustly capable of automatically finding optimal, nonlinear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima. We anticipate that the automatic physical inference method described in this paper will be essential to obtain both accurate and precise cosmological parameter estimates from complex and large astronomical data sets, including those from LSST and Euclid.
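    A toy version of the idea behind the paper's first example (inferring the variance of a Gaussian signal): compare the Fisher information carried by a linear summary against a nonlinear one. This sketch scores two fixed, hand-picked summaries rather than training a network, and the sample sizes, step sizes, and Gaussian approximation are all invented for illustration:

```python
import random

random.seed(0)
n, theta, eps = 2000, 1.0, 0.05
z = [random.gauss(0, 1) for _ in range(n)]     # common random numbers

def data(th):                                  # x ~ N(0, th) via x = sqrt(th) * z
    s = th ** 0.5
    return [s * zi for zi in z]

def fisher(f):
    """Gaussian-approximation Fisher information carried by the summary
    s = mean of f(x_i): (d E[s]/d theta)^2 / Var(s), with the derivative
    estimated by finite differences on common random numbers."""
    mu_hi = sum(f(v) for v in data(theta + eps)) / n
    mu_lo = sum(f(v) for v in data(theta - eps)) / n
    dmu = (mu_hi - mu_lo) / (2 * eps)
    vals = [f(v) for v in data(theta)]
    mean0 = sum(vals) / n
    var_s = sum((v - mean0) ** 2 for v in vals) / (n - 1) / n  # Var of the mean
    return dmu * dmu / var_s

F_linear = fisher(lambda x: x)      # linear summary: the sample mean
F_square = fisher(lambda x: x * x)  # nonlinear summary: mean of squares
print(F_linear < F_square)          # → True
```

    The mean of squares is sufficient for the variance of a zero-mean Gaussian, so its Fisher information approaches that of the full data set, while the sample mean carries almost none; an IMNN searches for such maximally informative functionals automatically.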

  16. The implications of rebasing global mean temperature timeseries for GCM based climate projections

    NASA Astrophysics Data System (ADS)

    Stainforth, David; Chapman, Sandra; Watkins, Nicholas

    2017-04-01

    Global climate and earth system models are assessed by comparison with observations through a number of metrics. The Intergovernmental Panel on Climate Change (IPCC) highlights in particular their ability to reproduce "general features of the global and annual mean surface temperature changes over the historical period" [1,2] and to simulate "a trend in global-mean surface temperature from 1951 to 2012 that agrees with the observed trend" [3]. This focus on annual-mean global mean temperature (hereafter GMT) change is presented as an important element in demonstrating the relevance of these models for climate projections. Any new model or new model version whose historic simulations fail to reproduce the "general features" and 20th-century trends is therefore likely to undergo further tuning. Thus this focus could have implications for model development. Here we consider a formal interpretation of "general features" and discuss the implications of this approach to model assessment and intercomparison for the interpretation of GCM projections. Following the IPCC, we interpret a major element of "general features" as being the slow-timescale response to external forcings. (Shorter-timescale behaviour, such as the response to volcanic eruptions, is also an element of "general features" but is not considered here.) Also following the IPCC, we consider only GMT anomalies, i.e. changes with respect to some period. Since the models have absolute temperatures which range over about 3 K (roughly observed GMT +/- 1.5 K), their timeseries (and the observations) are rebased. We present timeseries of the slow-timescale response of the CMIP5 models rebased to late-20th-century temperatures and to mid-19th-century temperatures. We provide a mathematical interpretation of this approach to model assessment and discuss two consequences. First is a separation of scales which limits the degree to which sub-global behaviour can feed back on the global response.
Second is an implication of linearity in the GMT response (to the extent that the slow-timescale response of the historic simulations is consistent with observations, and given their uncertainties). For each individual model these consequences only apply over the range of absolute temperatures simulated by the model in historic simulations. Taken together, however, they imply consequences over a much wider range of GMTs. The analysis suggests that this aspect of model evaluation risks providing a model-development pressure which acts against a wide exploration of physically plausible responses; in particular against an exploration of potentially globally significant nonlinear responses and feedbacks. [1] IPCC, Fifth Assessment Report, Working Group 1, Technical Summary: Stocker et al. 2013. [2] IPCC, Fifth Assessment Report, Working Group 1, Chapter 9 - "Evaluation of Climate Models": Flato et al. 2013. [3] IPCC, Fifth Assessment Report, Working Group 1, Summary for Policy Makers: IPCC, 2013.
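    The rebasing operation itself is straightforward; a minimal sketch with invented values, expressing a GMT series as anomalies against a chosen reference period:

```python
def rebase(series, years, ref_start, ref_end):
    """Express `series` as anomalies relative to its mean over [ref_start, ref_end]."""
    ref = [v for y, v in zip(years, series) if ref_start <= y <= ref_end]
    base = sum(ref) / len(ref)
    return [v - base for v in series]

# Hypothetical absolute GMT values (degrees C); years are illustrative only.
years = list(range(1850, 1860))
gmt = [13.6, 13.7, 13.5, 13.8, 13.9, 14.0, 14.1, 13.9, 14.2, 14.3]

anoms = rebase(gmt, years, 1850, 1854)
print([round(a, 2) for a in anoms[:3]])
```

    Two models whose absolute temperatures differ by a degree or more can produce near-identical rebased series, which is why the paper examines what this convention hides about the models' absolute states.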

  17. Relationship of CogScreen-AE to flight simulator performance and pilot age.

    PubMed

    Taylor, J L; O'Hara, R; Mumenthaler, M S; Yesavage, J A

    2000-04-01

    We report on the relationship between CogScreen-Aeromedical Edition (AE) factor scores and flight simulator performance in aircraft pilots aged 50-69. Some 100 licensed, civilian aviators (average age 58+/-5.3 yr) performed aviation tasks in a Frasca model 141 flight simulator and completed the CogScreen-AE battery. The aviation performance indices were: a) staying on course; b) dialing in communication frequencies; c) avoiding conflicting traffic; d) monitoring cockpit instruments; e) executing the approach; and f) a summary score, which was the mean of these scores. The CogScreen predictors were based on a factor structure reported by Kay (11), which comprised 28 CogScreen scores. Through principal components analysis of Kay's nine factors, we reduced the number of predictors to five composite CogScreen scores: Speed/Working Memory (WM), Visual Associative Memory, Motor Coordination, Tracking, and Attribute Identification. Speed/WM scores had the highest correlation with the flight summary score, Spearman's rho = 0.57. A stepwise-forward multiple regression analysis indicated that four CogScreen variables could explain 45% of the variance in flight summary scores. Significant predictors, in order of entry, were: Speed/WM, Visual Associative Memory, Motor Coordination, and Tracking (p<0.05). Pilot age was found to significantly improve prediction beyond that achieved by the four cognitive variables. In addition, there was some evidence for specific ability relationships between certain flight component scores and CogScreen scores, such as approach performance and tracking errors. These data support the validity of CogScreen-AE as a cognitive battery that taps skills relevant to piloting.

  18. Simulation of ultra-high energy photon propagation with PRESHOWER 2.0

    NASA Astrophysics Data System (ADS)

    Homola, P.; Engel, R.; Pysz, A.; Wilczyński, H.

    2013-05-01

    In this paper we describe a new release of the PRESHOWER program, a tool for Monte Carlo simulation of the propagation of ultra-high energy photons in the magnetic field of the Earth. The PRESHOWER program calculates magnetic pair production and bremsstrahlung and should be used together with other programs to simulate extensive air showers induced by photons. The main new features of the PRESHOWER code include a much faster algorithm in the procedures simulating gamma conversion and bremsstrahlung, an update of the geomagnetic field model, and a minor correction. The new simulation procedure increases the flexibility of the code so that it can also be applied to other magnetic field configurations such as those encountered, for example, in the vicinity of the sun or neutron stars. Program summary: Program title: PRESHOWER 2.0 Catalog identifier: ADWG_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3968 No. of bytes in distributed program, including test data, etc.: 37198 Distribution format: tar.gz Programming language: C, FORTRAN 77. Computer: Intel-Pentium based PC. Operating system: Linux or Unix. RAM: < 100 kB Classification: 1.1. Does the new version supersede the previous version?: Yes Catalog identifier of previous version: ADWG_v1_0 Journal reference of previous version: Comput. Phys. Comm. 173 (2005) 71 Nature of problem: Simulation of a cascade of particles initiated by a UHE photon in a magnetic field. Solution method: The primary photon is tracked until its conversion into an e+ e- pair. If conversion occurs, each individual particle in the resultant preshower is checked for either bremsstrahlung radiation (electrons) or secondary gamma conversion (photons).
Reasons for new version: The algorithm in the old version was slow and outdated (a significant speed-up is possible); extension of the program to allow simulations also for extraterrestrial magnetic field configurations (e.g. neutron stars) and very long path lengths. Summary of revisions: A veto algorithm was introduced in the gamma conversion and bremsstrahlung tracking procedures. The length of the tracking step is now variable along the track and depends on the probability of the process expected to occur. The new algorithm significantly reduces the number of tracking steps and speeds up the execution of the program. The geomagnetic field model has been updated to IGRF-11, allowing for interpolations up to the year 2015. Numerical Recipes procedures to calculate modified Bessel functions have been replaced with an open-source CERN routine, DBSKA. One minor bug has been fixed. Restrictions: Gamma conversion into particles other than an electron pair is not considered. The spatial structure of the cascade is neglected. Additional comments: The following routines are supplied in the package: IGRF [1, 2], DBSKA [3], ran2 [4]. Running time: 100 preshower events with primary energy 10^20 eV require about 200 s of CPU time on a 2.66 GHz machine; at an energy of 10^21 eV, about 600 s.
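    The veto (thinning) algorithm mentioned in the summary of revisions can be sketched for a generic position-dependent interaction rate: propose steps using a constant over-estimate of the rate, then accept a tentative interaction point with probability rate(x)/rate_max. The exponential rate profile and all numbers here are illustrative, not PRESHOWER's actual conversion probabilities:

```python
import math
import random

random.seed(3)

def rate(x):
    """Hypothetical position-dependent interaction rate along the path."""
    return 0.8 * math.exp(-x)

def sample_conversion(rate_max=0.8, x_end=50.0):
    """Veto algorithm: tentative exponential steps at the over-estimate rate,
    each accepted as a real interaction with probability rate(x) / rate_max."""
    x = 0.0
    while x < x_end:
        x += random.expovariate(rate_max)
        if x < x_end and random.random() < rate(x) / rate_max:
            return x
    return None  # particle escaped without interacting

draws = [sample_conversion() for _ in range(20000)]
p_convert = sum(d is not None for d in draws) / len(draws)
print(round(p_convert, 2))  # analytic survival gives 1 - exp(-0.8) ≈ 0.55
```

    The speed-up comes from taking a tighter, locally adapted over-estimate of the rate so that few tentative steps are vetoed, which is what making the tracking step variable along the track achieves.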

  19. Simulation of Distributed PV Power Output in Oahu Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lave, Matthew Samuel

    2016-08-01

    Distributed solar photovoltaic (PV) power generation in Oahu has grown rapidly since 2008. For applications such as determining the value of energy storage, it is important to have PV power output timeseries. Since these timeseries are not typically measured, here we produce simulated distributed PV power output for Oahu. Simulated power output is based on (a) satellite-derived solar irradiance, (b) PV permit data by neighborhood, and (c) population data by census block. Permit and population data were used to model the locations of distributed PV, and irradiance data were then used to simulate power output. PV power output simulations are presented by sub-neighborhood polygons, by neighborhoods, and for the whole island of Oahu. Summary plots of annual PV energy and a sample week timeseries of power output are shown, and the files containing the entire timeseries are described.

  20. On summary measure analysis of linear trend repeated measures data: performance comparison with two competing methods.

    PubMed

    Vossoughi, Mehrdad; Ayatollahi, S M T; Towhidi, Mina; Ketabchi, Farzaneh

    2012-03-22

    The summary measure approach (SMA) is sometimes the only applicable tool for the analysis of repeated measurements in medical research, especially when the number of measurements is relatively large. This study aimed to describe techniques based on summary measures for the analysis of linear trend repeated measures data and then to compare the performance of the SMA, the linear mixed model (LMM), and the unstructured multivariate approach (UMA). Practical guidelines based on the least squares regression slope and the mean response over time for each subject were provided to test time, group, and interaction effects. Through Monte Carlo simulation studies, the efficacy of the SMA vs. the LMM and the traditional UMA, under different types of covariance structures, was illustrated. All the methods were also employed to analyze two real data examples. Based on the simulation and example results, it was found that the SMA completely dominated the traditional UMA and performed convincingly close to the best-fitting LMM in testing all the effects. However, the LMM was often not robust and led to non-sensible results when the covariance structure for errors was misspecified. The results emphasized discarding the UMA, which often yielded extremely conservative inferences for such data. It was shown that the summary measure approach is simple, safe and powerful, with a generally negligible loss of efficiency compared to the best-fitting LMM. The SMA is recommended as the first choice for reliably analyzing linear trend data with a moderate to large number of measurements and/or small to moderate sample sizes.
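    The slope-based summary measure described above reduces each subject's trajectory to one number before a simple two-group comparison. A sketch on synthetic data; the group trends, noise level, and the Welch-style statistic are all invented for illustration:

```python
import random
import statistics

def ls_slope(times, values):
    """Ordinary least-squares slope of values regressed on times."""
    n = len(times)
    tbar = sum(times) / n
    vbar = sum(values) / n
    num = sum((t - tbar) * (v - vbar) for t, v in zip(times, values))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

random.seed(1)
times = [0, 1, 2, 3, 4]

def subject(trend):
    """One subject's repeated measurements: linear trend plus noise."""
    return [trend * t + random.gauss(0, 0.5) for t in times]

# Two groups of 30 subjects; the treated group has a steeper linear trend.
control = [ls_slope(times, subject(0.2)) for _ in range(30)]
treated = [ls_slope(times, subject(1.0)) for _ in range(30)]

def t_stat(a, b):
    """Welch-style t statistic on the per-subject slopes (the summary measures)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.mean(b) - statistics.mean(a)) / se

print(round(t_stat(control, treated), 2))
```

    Testing the group effect on the per-subject means, rather than the slopes, follows the same pattern; the appeal of the approach is that the second-stage test is an ordinary two-sample comparison regardless of how many repeated measurements each subject contributed.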

  1. Si amorphization by focused ion beam milling: Point defect model with dynamic BCA simulation and experimental validation.

    PubMed

    Huang, J; Loeffler, M; Muehle, U; Moeller, W; Mulders, J J L; Kwakman, L F Tz; Van Dorp, W F; Zschech, E

    2018-01-01

    A Ga focused ion beam (FIB) is often used in transmission electron microscopy (TEM) sample preparation. In the case of a crystalline Si sample, an amorphous near-surface layer is formed by the FIB process. In order to optimize the FIB recipe by minimizing the amorphization, it is important to predict the amorphous layer thickness from simulation. Molecular Dynamics (MD) simulation has been used to describe the amorphization; however, it is limited by computational power for a realistic FIB process simulation. On the other hand, Binary Collision Approximation (BCA) simulation can be, and has been, used to simulate the ion-solid interaction process at a realistic scale. In this study, a Point Defect Density approach is introduced into a dynamic BCA simulation that accounts for dynamic ion-solid interactions. We used this method to predict the c-Si amorphization caused by FIB milling of Si. To validate the method, dedicated TEM studies were performed. They show that the amorphous layer thickness predicted by the numerical simulation is consistent with the experimental data. In summary, the thickness of the near-surface amorphized Si layer caused by FIB milling can be well predicted using the Point Defect Density approach within the dynamic BCA model. Copyright © 2017 Elsevier B.V. All rights reserved.
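    A heavily simplified picture of the point-defect-density idea: approximate the deposited damage as a Gaussian depth profile and call the material amorphous wherever the defect density exceeds a critical value. Every parameter below is hypothetical, not a fitted Ga-on-Si value, and this is not the dynamic BCA calculation itself:

```python
import math

def defect_density(depth_nm, dose, Rd=20.0, dRd=8.0):
    """Toy Gaussian depth profile of deposited point defects
    (Rd = mean damage depth, dRd = straggle; units are arbitrary)."""
    g = math.exp(-0.5 * ((depth_nm - Rd) / dRd) ** 2)
    return dose * g / (dRd * math.sqrt(2 * math.pi))

def amorphous_thickness(dose, n_crit=0.02, step=0.1, max_depth=100.0):
    """Thickness of the depth region whose defect density exceeds n_crit."""
    depths = [i * step for i in range(int(max_depth / step))]
    over = [d for d in depths if defect_density(d, dose) >= n_crit]
    return (max(over) - min(over) + step) if over else 0.0

for dose in (1.0, 5.0, 20.0):
    print(round(amorphous_thickness(dose), 1))
```

    The predicted layer thickness grows with dose (and, through Rd, with beam energy and incidence angle), which is the qualitative behavior the dynamic BCA model quantifies and the TEM measurements validate.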

  2. NASA Environmentally Responsible Aviation Hybrid Wing Body Flow-Through Nacelle Wind Tunnel CFD

    NASA Technical Reports Server (NTRS)

    Schuh, Michael J.; Garcia, Joseph A.; Carter, Melissa B.; Deere, Karen A.; Stremel, Paul M.; Tompkins, Daniel M.

    2016-01-01

    Wind tunnel tests of a 5.75% scale model of the Boeing Hybrid Wing Body (HWB) configuration were conducted in the NASA Langley Research Center (LaRC) 14'x22' and NASA Ames Research Center (ARC) 40'x80' low speed wind tunnels as part of the NASA Environmentally Responsible Aviation (ERA) Project. Computational fluid dynamics (CFD) simulations of the flow-through nacelle (FTN) configuration of this model were performed before and after the testing. This paper presents a summary of the experimental and CFD results for the model in the cruise and landing configurations.

  3. NASA Environmentally Responsible Aviation Hybrid Wing Body Flow-Through Nacelle Wind Tunnel CFD

    NASA Technical Reports Server (NTRS)

    Schuh, Michael J.; Garcia, Joseph A.; Carter, Melissa B.; Deere, Karen A.; Tompkins, Daniel M.; Stremel, Paul M.

    2016-01-01

    Wind tunnel tests of a 5.75% scale model of the Boeing Hybrid Wing Body (HWB) configuration were conducted in the NASA Langley Research Center (LaRC) 14'x22' and NASA Ames Research Center (ARC) 40'x80' low speed wind tunnels as part of the NASA Environmentally Responsible Aviation (ERA) Project. Computational fluid dynamics (CFD) simulations of the flow-through nacelle (FTN) configuration of this model were performed before and after the testing. This paper presents a summary of the experimental and CFD results for the model in the cruise and landing configurations.

  4. Wind Tunnel Measured Effects on a Twin-Engine Short-Haul Transport Caused by Simulated Ice Accretions: Data Report

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew; Potapczuk, Mark; Ratvasky, Thomas; Laflin, Brenda Gile

    1997-01-01

    The purpose of this report is to release the data from the NASA Langley/Lewis 14 by 22 foot wind tunnel test that examined icing effects on a 1/8 scale twin-engine short-haul jet transport model. Presented in this document are summary data from the major configurations tested. The entire test database in addition to ice shape and model measurements is available as a data supplement in CD-ROM form. Data measured and presented are: wing pressure distributions, model force and moment, and wing surface flow visualization.

  5. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Time-dependent summary receiver operating characteristics for meta-analysis of prognostic studies.

    PubMed

    Hattori, Satoshi; Zhou, Xiao-Hua

    2016-11-20

    Prognostic studies are widely conducted to examine whether biomarkers are associated with patient's prognoses and play important roles in medical decisions. Because findings from one prognostic study may be very limited, meta-analyses may be useful to obtain sound evidence. However, prognostic studies are often analyzed by relying on a study-specific cut-off value, which can lead to difficulty in applying the standard meta-analysis techniques. In this paper, we propose two methods to estimate a time-dependent version of the summary receiver operating characteristics curve for meta-analyses of prognostic studies with a right-censored time-to-event outcome. We introduce a bivariate normal model for the pair of time-dependent sensitivity and specificity and propose a method to form inferences based on summary statistics reported in published papers. This method provides a valid inference asymptotically. In addition, we consider a bivariate binomial model. To draw inferences from this bivariate binomial model, we introduce a multiple imputation method. The multiple imputation is found to be approximately proper multiple imputation, and thus the standard Rubin's variance formula is justified from a Bayesian view point. Our simulation study and application to a real dataset revealed that both methods work well with a moderate or large number of studies and the bivariate binomial model coupled with the multiple imputation outperforms the bivariate normal model with a small number of studies. Copyright © 2016 John Wiley & Sons, Ltd.
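    For contrast with the time-dependent methods proposed in the paper, the classic (non-time-dependent) Moses-Littenberg construction of a summary ROC from study-level sensitivity/specificity pairs is easy to sketch; the study values below are invented:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

# Hypothetical per-study (TPR, FPR) pairs, as would arise when studies
# dichotomize a prognostic biomarker at different cut-off values.
studies = [(0.90, 0.30), (0.80, 0.20), (0.70, 0.10), (0.85, 0.25), (0.75, 0.15)]

D = [logit(tpr) - logit(fpr) for tpr, fpr in studies]  # diagnostic log-odds ratio
S = [logit(tpr) + logit(fpr) for tpr, fpr in studies]  # proxy for the cut-off

# Ordinary least squares of D on S: D = a + b * S.
n = len(studies)
sbar, dbar = sum(S) / n, sum(D) / n
b = sum((s - sbar) * (d - dbar) for s, d in zip(S, D)) / sum((s - sbar) ** 2 for s in S)
a = dbar - b * sbar

def sroc_tpr(fpr):
    """Summary ROC: TPR implied by the fitted (a, b) at a given FPR.
    Solves logit(tpr) - lf = a + b * (logit(tpr) + lf) for logit(tpr)."""
    lf = logit(fpr)
    lt = (a + lf * (1 + b)) / (1 - b)
    return 1 / (1 + math.exp(-lt))

print(round(sroc_tpr(0.2), 2))  # → 0.82
```

    The paper's bivariate normal and bivariate binomial models improve on this construction by modeling between-study correlation and, crucially, by letting sensitivity and specificity depend on the censored follow-up time.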

  7. Real-world hydrologic assessment of a fully-distributed hydrological model in a parallel computing environment

    NASA Astrophysics Data System (ADS)

    Vivoni, Enrique R.; Mascaro, Giuseppe; Mniszewski, Susan; Fasel, Patricia; Springer, Everett P.; Ivanov, Valeriy Y.; Bras, Rafael L.

    2011-10-01

    Summary: A major challenge in the use of fully-distributed hydrologic models has been the lack of computational capabilities for high-resolution, long-term simulations in large river basins. In this study, we present the parallel model implementation and real-world hydrologic assessment of the Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator (tRIBS). Our parallelization approach is based on the decomposition of a complex watershed using the channel network as a directed graph. The resulting sub-basin partitioning divides effort among processors and handles hydrologic exchanges across boundaries. Through numerical experiments in a set of nested basins, we quantify parallel performance relative to serial runs for a range of processors, simulation complexities and lengths, and sub-basin partitioning methods, while accounting for inter-run variability on a parallel computing system. In contrast to serial simulations, the parallel model speed-up depends on the variability of hydrologic processes. Load balancing significantly improves parallel speed-up with proportionally faster runs as simulation complexity (domain resolution and channel network extent) increases. The best strategy for large river basins is to combine a balanced partitioning with an extended channel network, with potential savings through a lower TIN resolution. Based on these advances, a wider range of applications for fully-distributed hydrologic models are now possible. This is illustrated through a set of ensemble forecasts that account for precipitation uncertainty derived from a statistical downscaling model.
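    The decomposition idea (treat the channel network as a directed graph toward the outlet, cut it into sub-basins, then balance them across processors) can be sketched with a toy tree; the network, target size, and greedy balancing rule below are illustrative, not tRIBS's actual partitioner:

```python
import heapq
from collections import defaultdict

# Hypothetical channel network: child node -> downstream parent; 0 is the outlet.
parent = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2, 6: 2, 7: 3, 8: 3, 9: 5, 10: 5,
          11: 7, 12: 7, 13: 9, 14: 9}

children = defaultdict(list)
for c, p in parent.items():
    children[p].append(c)

def cut(node, target, basins):
    """Post-order walk: detach a sub-basin once it grows past `target` elements."""
    kept = 1
    for c in children[node]:
        kept += cut(c, target, basins)
    if kept >= target and node != 0:
        basins.append((node, kept))
        return 0          # detached subtree contributes nothing downstream
    return kept

basins = []
basins.append((0, cut(0, target=4, basins=basins)))  # remainder at the outlet

def balance(basins, n_cpu):
    """Greedy largest-first assignment of sub-basins to the least-loaded CPU."""
    loads = [(0, i, []) for i in range(n_cpu)]
    heapq.heapify(loads)
    for node, size in sorted(basins, key=lambda b: -b[1]):
        load, i, assigned = heapq.heappop(loads)
        heapq.heappush(loads, (load + size, i, assigned + [node]))
    return sorted(loads)

for load, cpu, nodes in balance(basins, n_cpu=2):
    print(cpu, load, nodes)
```

    In the real model the "size" of a sub-basin would reflect its computational cost (element count, channel extent), and inter-processor communication is needed only where a detached sub-basin drains into its downstream neighbor.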

  8. SARDA HITL Simulations: System Performance Results

    NASA Technical Reports Server (NTRS)

    Gupta, Gautam

    2012-01-01

    This presentation gives an overview of the 2012 SARDA human-in-the-loop simulation and presents a summary of system performance results from the simulation, including delay, throughput, and fuel consumption.

  9. Phase I of the Near Term Hybrid Passenger Vehicle Development Program. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-10-01

    The results of Phase I of the Near-Term Hybrid Vehicle Program are summarized. This phase of the program was a study leading to the preliminary design of a 5-passenger hybrid vehicle utilizing two energy sources (electricity and gasoline/diesel fuel) to minimize petroleum usage on a fleet basis. This report presents the following: an overall summary of the Phase I activity; summaries of the individual tasks; a summary of the hybrid vehicle design; a summary of the alternative design options; a summary of the computer simulations; a summary of the economic analysis; a summary of the maintenance and reliability considerations; a summary of the design for crash safety; and a bibliography.

  10. A comparison between Poisson and zero-inflated Poisson regression models with an application to number of black spots in Corriedale sheep

    PubMed Central

    Naya, Hugo; Urioste, Jorge I; Chang, Yu-Mei; Rodrigues-Motta, Mariana; Kremer, Roberto; Gianola, Daniel

    2008-01-01

    Dark spots in the fleece area are often associated with dark fibres in wool, which limits its competitiveness with other textile fibres. Field data from a sheep experiment in Uruguay revealed an excess number of zeros for dark spots. We compared the performance of four Poisson and zero-inflated Poisson (ZIP) models under four simulation scenarios. All models performed reasonably well under the same scenario for which the data were simulated. The deviance information criterion favoured a Poisson model with residual, while the ZIP model with a residual gave estimates closer to their true values under all simulation scenarios. Both Poisson and ZIP models with an error term at the regression level performed better than their counterparts without such an error. Field data from Corriedale sheep were analysed with Poisson and ZIP models with residuals. Parameter estimates were similar for both models. Although the posterior distribution of the sire variance was skewed due to a small number of rams in the dataset, the median of this variance suggested a scope for genetic selection. The main environmental factor was the age of the sheep at shearing. In summary, age related processes seem to drive the number of dark spots in this breed of sheep. PMID:18558072
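    The excess-zeros motivation can be made concrete by comparing the zero probability under a plain Poisson model and under a ZIP model; the rate and inflation probability below are arbitrary illustrative values:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson: with probability pi an excess (structural) zero,
    otherwise a draw from Poisson(lam)."""
    p = (1 - pi) * poisson_pmf(k, lam)
    if k == 0:
        p += pi
    return p

lam, pi = 2.0, 0.3   # hypothetical spot rate and zero-inflation probability
print(round(poisson_pmf(0, lam), 3))  # → 0.135
print(round(zip_pmf(0, lam, pi), 3))  # → 0.395
```

    When the observed zero fraction far exceeds what the fitted Poisson rate implies, as in the sheep data above, the ZIP mixture absorbs the surplus through pi while leaving the count process for the non-zero counts intact.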

  11. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
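
A minimal sketch of such a variance-based process sensitivity index, using a pick-freeze (Sobol-style) estimator over joint (model choice, parameter) scenarios; the two recharge forms, two conductivity forms, and their probabilities are hypothetical stand-ins, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Two competing recharge models (hypothetical), each driven by a uniform
# random parameter u; model index 0 or 1 selects the form.
def recharge(model_idx, u):
    return np.where(model_idx == 0, 2.0 * u, 3.0 * u**2)

# Two competing geology (hydraulic conductivity) models, also hypothetical.
def conductivity(model_idx, v):
    return np.where(model_idx == 0, 1.0 + v, np.exp(v))

# A process "scenario" = (model choice, parameter); model probabilities
# would come from model averaging in the paper's framework.
m_r, u = rng.choice(2, N, p=[0.5, 0.5]), rng.random(N)
m_g, v = rng.choice(2, N, p=[0.5, 0.5]), rng.random(N)
y = recharge(m_r, u) * conductivity(m_g, v)  # toy system output

# Pick-freeze estimate of the recharge process index: freeze the recharge
# scenario, resample the geology scenario, and normalize the covariance
# of the two runs by the total output variance.
m_g2, v2 = rng.choice(2, N, p=[0.5, 0.5]), rng.random(N)
y2 = recharge(m_r, u) * conductivity(m_g2, v2)
ps_recharge = np.cov(y, y2)[0, 1] / y.var()
print(f"process sensitivity index (recharge) ~ {ps_recharge:.2f}")
```

The index captures variance contributed by both the choice of recharge model and its parameter, which is the key difference from a purely parametric Sobol index.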

  13. Population variability in animal health: Influence on dose-exposure-response relationships: Part II: Modelling and simulation.

    PubMed

    Martinez, Marilyn N; Gehring, Ronette; Mochel, Jonathan P; Pade, Devendra; Pelligand, Ludovic

    2018-05-28

    During the 2017 Biennial meeting, the American Academy of Veterinary Pharmacology and Therapeutics hosted a 1-day session on the influence of population variability on dose-exposure-response relationships. In Part I, we highlighted some of the sources of population variability. Part II provides a summary of discussions on modelling and simulation tools that utilize existing pharmacokinetic data; that can integrate drug physicochemical characteristics with species physiological characteristics and dosing information; or that combine observed, predicted, and in vitro information to explore and describe sources of variability that may influence the safe and effective use of veterinary pharmaceuticals. © 2018 John Wiley & Sons Ltd. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  14. A statistical model investigating the prevalence of tuberculosis in New York City using counting processes with two change-points

    PubMed Central

    ACHCAR, J. A.; MARTINEZ, E. Z.; RUFFINO-NETTO, A.; PAULINO, C. D.; SOARES, P.

    2008-01-01

    SUMMARY We considered a Bayesian analysis for the prevalence of tuberculosis cases in New York City from 1970 to 2000. This counting dataset presented two change-points during this period. We modelled this counting dataset considering non-homogeneous Poisson processes in the presence of the two change-points. A Bayesian analysis of the data is carried out using Markov chain Monte Carlo methods. Simulated Gibbs samples for the parameters of interest were obtained using the WinBUGS software. PMID:18346287
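
A non-homogeneous Poisson process with two change-points, as used to model the counts, can be simulated by Lewis-Shedler thinning; the piecewise-constant rates and change-point years below are illustrative, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

# Piecewise-constant intensity with two change-points (illustrative):
# rate 50/yr before tau1, 120/yr between tau1 and tau2, 40/yr after tau2.
tau1, tau2 = 1978.0, 1992.0
def intensity(t):
    return np.where(t < tau1, 50.0, np.where(t < tau2, 120.0, 40.0))

# Thinning (Lewis-Shedler): simulate a homogeneous process at the maximum
# rate over [t0, T], then keep each candidate point with probability
# intensity(t) / max_rate.
t0, T, max_rate = 1970.0, 2000.0, 120.0
n_cand = rng.poisson(max_rate * (T - t0))
cand = rng.uniform(t0, T, n_cand)
events = np.sort(cand[rng.random(n_cand) < intensity(cand) / max_rate])
print(len(events), "simulated case dates")
```

In the Bayesian analysis, the change-points and segment rates would be treated as unknown parameters and sampled (e.g. by Gibbs steps) rather than fixed as here.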

  15. Approaches to Process Performance Modeling: A Summary from the SEI Series of Workshops on CMMI High Maturity Measurement and Analysis

    DTIC Science & Technology

    2010-01-01

    Excerpts from the report's front matter: "Soporte de Modelos Sistémicos: Aplicación al Sector de Desarrollo de Software de Argentina" (Systemic Model Support: Application to the Software Development Sector of Argentina), PhD thesis, Universidad Tecnológica Nacional-Facultad...; ...with New Results; 2.3 Other Simulation Approaches; Conceptual Planning, Execution, and Operation of Combat Fire Support Effectiveness; Figure 29: Functional Structure of Multiple Regression Model; Figure 30: TSP Quality Plan One; Figure 31: TSP Quality Plan Two

  16. Formal Specification and Design Techniques for Wireless Sensor and Actuator Networks

    PubMed Central

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to interconnect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system. PMID:22344203

  17. Checking distributional assumptions for pharmacokinetic summary statistics based on simulations with compartmental models.

    PubMed

    Shen, Meiyu; Russek-Cohen, Estelle; Slud, Eric V

    2016-08-12

    Bioequivalence (BE) studies are an essential part of the evaluation of generic drugs. The most common in vivo BE study design is the two-period two-treatment crossover design. AUC (area under the concentration-time curve) and Cmax (maximum concentration) are obtained from the observed concentration-time profiles for each subject from each treatment under each sequence. In the BE evaluation of pharmacokinetic crossover studies, the normality of the univariate response variable, e.g. log(AUC) or log(Cmax), is often assumed in the literature without much evidence. Therefore, we investigate the distributional assumption of the normality of response variables, log(AUC) and log(Cmax), by simulating concentration-time profiles from two-stage pharmacokinetic models (commonly used in pharmacokinetic research) for a wide range of pharmacokinetic parameters and measurement error structures. Our simulations show that, under reasonable distributional assumptions on the pharmacokinetic parameters, log(AUC) has heavy tails and log(Cmax) is skewed. Sensitivity analyses are conducted to investigate how the distribution of the standardized log(AUC) (or the standardized log(Cmax)) for a large number of simulated subjects deviates from normality if distributions of errors in the pharmacokinetic model for plasma concentrations deviate from normality and if the plasma concentration can be described by different compartmental models.
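
The simulation strategy described (generating concentration-time profiles from a compartmental model, then deriving AUC and Cmax per subject) can be sketched as follows; the one-compartment oral model and all parameter values are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_subj = 5_000
t = np.linspace(0, 48, 481)  # sampling grid, hours

# One-compartment oral model with lognormal between-subject variability
# (dose, ka, ke, V values are illustrative):
#   C(t) = D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))
dose = 100.0
ka = rng.lognormal(np.log(1.0), 0.3, n_subj)    # absorption rate, 1/h
ke = rng.lognormal(np.log(0.1), 0.3, n_subj)    # elimination rate, 1/h
V = rng.lognormal(np.log(30.0), 0.2, n_subj)    # volume, L

C = (dose * ka / (V * (ka - ke)))[:, None] * (
    np.exp(-ke[:, None] * t) - np.exp(-ka[:, None] * t)
)

# Per-subject summary statistics: trapezoidal AUC and observed Cmax.
auc = ((C[:, 1:] + C[:, :-1]) * np.diff(t) / 2).sum(axis=1)
cmax = C.max(axis=1)
log_auc, log_cmax = np.log(auc), np.log(cmax)

def skewness(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z**3)

print(f"skewness of log(AUC)  = {skewness(log_auc):+.2f}")
print(f"skewness of log(Cmax) = {skewness(log_cmax):+.2f}")
```

Repeating this over ranges of parameter and measurement-error assumptions is how departures from normality of log(AUC) and log(Cmax) can be probed.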

  18. Summary of photovoltaic system performance models

    NASA Technical Reports Server (NTRS)

    Smith, J. H.; Reiter, L. J.

    1984-01-01

    A detailed overview of photovoltaics (PV) performance modeling capabilities developed for analyzing PV system and component design and policy issues is provided. A set of 10 performance models are selected which span a representative range of capabilities from generalized first order calculations to highly specialized electrical network simulations. A set of performance modeling topics and characteristics is defined and used to examine some of the major issues associated with photovoltaic performance modeling. Each of the models is described in the context of these topics and characteristics to assess its purpose, approach, and level of detail. The issues are discussed in terms of the range of model capabilities available and summarized in tabular form for quick reference. The models are grouped into categories to illustrate their purposes and perspectives.

  19. Feasibility of dynamic models of the interaction of potential oil spills with bowhead and gray whales in the Bering, Chukchi, and Beaufort Seas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, M.; Bowles, A.E.; Anderson, E.L.

    1984-08-01

    Feasibility and design considerations for developing computer models of migratory bowhead and gray whales and linking such models to oil spill models for application in Alaskan Outer Continental Shelf areas were evaluated. All relevant bowhead and gray whale distributional and migration data were summarized and presented at monthly intervals. The data were, for the most part, deemed sufficient to prepare whale migration simulation models. A variety of whale migration conceptual models were devised and ranked by means of a scaling-weighted protocol. Existing oil spill trajectory and fate models, as well as conceptual models, were similarly ranked.

  20. Gas-Grain Simulation Facility: Fundamental studies of particle formation and interactions. Volume 1: Executive summary and overview

    NASA Technical Reports Server (NTRS)

    Fogleman, Guy (Editor); Huntington, Judith L. (Editor); Schwartz, Deborah E. (Editor); Fonda, Mark L. (Editor)

    1989-01-01

    An overview of the Gas-Grain Simulation Facility (GGSF) project and its current status is provided. The proceedings of the Gas-Grain Simulation Facility Experiments Workshop are recorded. The goal of the workshop was to define experiments for the GGSF--a small particle microgravity research facility. The workshop addressed the opportunity for performing, in Earth orbit, a wide variety of experiments that involve single small particles (grains) or clouds of particles. The first volume includes the executive summary, overview, scientific justification, history, and planned development of the Facility.

  1. Effects of Simulated Surface Effect Ship Motions on Crew Habitability. Phase II. Volume 1. Summary Report and Comments

    DTIC Science & Technology

    1981-04-01

    one 24-hour exposure to that condition may be regarded as the most complete and unbiased for determining some effects of a type of simulated SES...eliminated entirely. The ability to predict in advance the resultant effects of motion exposure thus seems to depend on the existence of a given... Effects of Simulated Surface Effect Ship Motions on Crew Habitability, Phase II, Volume 1: Summary Report

  2. Influence of spatial discretization, underground water storage and glacier melt on a physically-based hydrological model of the Upper Durance River basin

    NASA Astrophysics Data System (ADS)

    Lafaysse, M.; Hingray, B.; Etchevers, P.; Martin, E.; Obled, C.

    2011-06-01

    Summary: The SAFRAN-ISBA-MODCOU hydrological model (Habets et al., 2008) presents severe limitations for alpine catchments. Here we propose possible model adaptations. For the catchment discretization, Relatively Homogeneous Hydrological Units (RHHUs) are used instead of the classical 8 km square grid. They are defined from the delineation of hydrological subbasins, elevation bands, and aspect classes. Glacierized and non-glacierized areas are also treated separately. In addition, new modules are included in the model for the simulation of glacier melt, and retention of underground water. The improvement resulting from each model modification is analysed for the Upper Durance basin. RHHUs allow the model to better account for the high spatial variability of the hydrological processes (e.g. snow cover). The timing and the intensity of the spring snowmelt floods are significantly improved owing to the representation of water retention by aquifers. Despite the relatively small area covered by glaciers, accounting for glacier melt is necessary for simulating the late summer low flows. The modified model is robust over a long simulation period and it produces a good reproduction of the intra and interannual variability of discharge, which is a necessary condition for its application in a modified climate context.

  3. Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)

    NASA Astrophysics Data System (ADS)

    Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.

    2013-12-01

    We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
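
Tail dependence between two series can be estimated empirically as the conditional exceedance probability chi(u) = P(Y > q_y(u) | X > q_x(u)) at a high quantile level u; the two synthetic "observed vs. model" series below are illustrative, not NARCCAP output.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Synthetic daily precipitation-like series (gamma-distributed):
# one model series sharing the large values of the "observed" series z,
# and one fully independent model series.
z = rng.gamma(2.0, 1.0, n)
model_dep = z + 0.3 * rng.gamma(2.0, 1.0, n)
model_ind = rng.gamma(2.0, 1.0, n)

def chi_hat(x, y, u=0.99):
    """Empirical tail-dependence estimate P(Y > q_y | X > q_x) at level u.

    For asymptotically independent series chi_hat tends toward 1 - u;
    values well above that indicate the largest events co-occur.
    """
    qx, qy = np.quantile(x, u), np.quantile(y, u)
    exceed_x = x > qx
    return np.mean(y[exceed_x] > qy)

print(f"chi (dependent pair)   ~ {chi_hat(z, model_dep):.2f}")
print(f"chi (independent pair) ~ {chi_hat(z, model_ind):.2f}")
```

Applied to a climate model series and an observational record, a chi estimate near 1 - u says the model's most extreme days do not line up with observed extremes even if its return values look right.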

  4. Building an Open-source Simulation Platform of Acoustic Radiation Force-based Breast Elastography

    PubMed Central

    Wang, Yu; Peng, Bo; Jiang, Jingfeng

    2017-01-01

    Ultrasound-based elastography including strain elastography (SE), acoustic radiation force Impulse (ARFI) imaging, point shear wave elastography (pSWE) and supersonic shear imaging (SSI) have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. “ground truth”) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity – one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data, were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% compared to the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. 
The elastic contrast values and visual observation show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. In summary, our initial results were consistent with our expectations and what have been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway to perform many elastographic simulations in a transparent manner, thereby promoting collaborative developments. PMID:28075330

  5. Building an open-source simulation platform of acoustic radiation force-based breast elastography

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Peng, Bo; Jiang, Jingfeng

    2017-03-01

    Ultrasound-based elastography including strain elastography, acoustic radiation force impulse (ARFI) imaging, point shear wave elastography and supersonic shear imaging (SSI) have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. ‘ground truth’) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity—one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data, were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% compared to the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. The elastic contrast values and visual observation show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. 
In summary, our initial results were consistent with our expectations and what have been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway to perform many elastographic simulations in a transparent manner, thereby promoting collaborative developments.

  6. Capturing atmospheric effects on 3D millimeter wave radar propagation patterns

    NASA Astrophysics Data System (ADS)

    Cook, Richard D.; Fiorino, Steven T.; Keefer, Kevin J.; Stringer, Jeremy

    2016-05-01

    Traditional radar propagation modeling is done using a path transmittance with little to no input for weather and atmospheric conditions. As radar advances into the millimeter wave (MMW) regime, atmospheric effects such as attenuation and refraction become more pronounced than at traditional radar wavelengths. The DoD High Energy Laser Joint Technology Office's High Energy Laser End-to-End Operational Simulation (HELEEOS), in combination with the Laser Environmental Effects Definition and Reference (LEEDR) code, has shown great promise simulating atmospheric effects on laser propagation. Indeed, the LEEDR radiative transfer code has been validated from the UV through the RF. Our research attempts to apply these models to characterize the far field radar pattern in three dimensions as a signal propagates from an antenna towards a point in space. Furthermore, we do so using realistic three-dimensional atmospheric profiles. The results from these simulations are compared to those from traditional radar propagation software packages. In summary, a fast running method has been investigated which can be incorporated into computational models to enhance understanding and prediction of MMW propagation through various atmospheric and weather conditions.

  7. On Super-Resolution and the MUSIC Algorithm,

    DTIC Science & Technology

    1985-05-01

    ON SUPER-RESOLUTION AND THE MUSIC ALGORITHM. Author: G D de Villiers. Date: May 1985. Summary: Simulation results for phased array signal processing using the MUSIC algorithm are presented. The model used is more realistic than previous ones and it gives an indication as to how the algorithm would perform...

  8. Computational Evaluation of the Strict Master and Random Template Models of Endogenous Retrovirus Evolution

    PubMed Central

    Nascimento, Fabrícia F.; Rodrigo, Allen G.

    2016-01-01

    Transposable elements (TEs) are DNA sequences that are able to replicate and move within and between host genomes. Their mechanism of replication is also shared with endogenous retroviruses (ERVs), which are also a type of TE that represent an ancient retroviral infection within animal genomes. Two models have been proposed to explain TE proliferation in host genomes: the strict master model (SMM), and the random template (or transposon) model (TM). In SMM only a single copy of a given TE lineage is able to replicate, and all other genomic copies of TEs are derived from that master copy. In TM, any element of a given family is able to replicate in the host genome. In this paper, we simulated ERV phylogenetic trees under variations of SMM and TM. To test whether current phylogenetic programs can recover the simulated ERV phylogenies, DNA sequence alignments were simulated and maximum likelihood trees were reconstructed and compared to the simulated phylogenies. Results indicate that visual inspection of phylogenetic trees alone can be misleading. However, if a set of statistical summaries is calculated, we are able to distinguish between models with high accuracy by using a data mining algorithm that we introduce here. We also demonstrate the use of our data mining algorithm with empirical data for the porcine endogenous retrovirus (PERV), an ERV that is able to replicate in human and pig cells in vitro. PMID:27649303

  9. The Soil Model Development and Intercomparison Panel (SoilMIP) of the International Soil Modeling Consortium (ISMC)

    NASA Astrophysics Data System (ADS)

    Vanderborght, Jan; Priesack, Eckart

    2017-04-01

    The Soil Model Development and Intercomparison Panel (SoilMIP) is an initiative of the International Soil Modeling Consortium. Its mission is to foster the further development of soil models that can predict soil functions and their changes (i) due to soil use and land management and (ii) due to external impacts of climate change and pollution. Since soil functions and soil threats are diverse but linked with each other, the overall aim is to develop holistic models that represent the key functions of the soil system and the links between them. These models should be scaled up and integrated in terrestrial system models that describe the feedbacks between processes in the soil and the other terrestrial compartments. We propose and illustrate a few steps that could be taken to achieve these goals. A first step is the development of scenarios that compare simulations by models that predict the same or different soil services. Scenarios can be considered at three different levels of comparisons: scenarios that compare the numerics (accuracy but also speed) of models, scenarios that compare the effect of differences in process descriptions, and scenarios that compare simulations with experimental data. A second step involves the derivation of metrics or summary statistics that effectively compare model simulations and disentangle parameterization from model concept differences. These metrics can be used to evaluate how more complex model simulations can be represented by simpler models using an appropriate parameterization. A third step relates to the parameterization of models. Application of simulation models implies that appropriate model parameters have to be defined for a range of environmental conditions and locations. Spatial modelling approaches are used to derive parameter distributions. 
Considering that soils and their properties emerge from the interaction between physical, chemical and biological processes, the combination of spatial models with process models would lead to consistent parameter distributions and correlations and could potentially represent self-organizing processes in soils and landscapes.

  10. Influence of mesh structure on 2D full shallow water equations and SCS Curve Number simulation of rainfall/runoff events

    NASA Astrophysics Data System (ADS)

    Caviedes-Voullième, Daniel; García-Navarro, Pilar; Murillo, Javier

    2012-07-01

    Summary: Hydrological simulation of rain-runoff processes is often performed with lumped models which rely on calibration to generate storm hydrographs and study catchment response to rain. In this paper, a distributed, physically-based numerical model is used for runoff simulation in a mountain catchment. This approach offers two advantages. The first is that by using shallow-water equations for runoff flow, there is less freedom to calibrate routing parameters (as compared to, for example, synthetic hydrograph methods). The second is that spatial distributions of water depth and velocity can be obtained. Furthermore, interactions among the various hydrological processes can be modeled in a physically-based approach which may depend on transient and spatially distributed factors. On the other hand, the undertaken numerical approach relies on accurate terrain representation and mesh selection, which also affects significantly the computational cost of the simulations. Hence, we investigate the response of a gauged catchment with this distributed approach. The methodology consists of analyzing the effects that the mesh has on the simulations by using a range of meshes. Next, friction is applied to the model and the response to variations and interaction with the mesh is studied. Finally, a first approach with the well-known SCS Curve Number method is studied to evaluate its behavior when coupled with a shallow-water model for runoff flow. The results show that mesh selection is of great importance, since it may affect the results in a magnitude as large as physical factors, such as friction. Furthermore, results proved to be less sensitive to roughness spatial distribution than to mesh properties. Finally, the results indicate that SCS-CN may not be suitable for simulating hydrological processes together with a shallow-water model.
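
The standard SCS Curve Number relation referred to above can be written directly; the example storm and curve number are illustrative, and the conventional lambda = 0.2 initial-abstraction ratio is assumed (US customary units).

```python
def scs_runoff_depth(P, CN, lam=0.2):
    """SCS Curve Number runoff depth (inches) for storm rainfall P (inches).

    S  : potential maximum retention, S = 1000/CN - 10
    Ia : initial abstraction, conventionally Ia = lam * S with lam = 0.2
    Q  : runoff, Q = (P - Ia)**2 / (P - Ia + S) when P > Ia, else 0
    """
    S = 1000.0 / CN - 10.0
    Ia = lam * S
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# e.g. a 3-inch storm on a catchment with CN = 80:
print(f"{scs_runoff_depth(3.0, 80):.2f} in")  # -> 1.25 in
```

In the coupled setup the paper evaluates, a depth like this would be distributed over the catchment and routed by the shallow-water model rather than lumped into a synthetic hydrograph.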

  11. Development of a biosphere hydrological model considering vegetation dynamics and its evaluation at basin scale under climate change

    NASA Astrophysics Data System (ADS)

    Li, Qiaoling; Ishidaira, Hiroshi

    2012-01-01

    Summary: The biosphere and hydrosphere are intrinsically coupled. The scientific question is: if there is a substantial change in one component, such as vegetation cover, how will the other components, such as transpiration and runoff generation, respond, especially under climate change conditions? Stand-alone hydrological models have a detailed description of hydrological processes but do not sufficiently parameterize vegetation as a dynamic component. Dynamic global vegetation models (DGVMs) are able to simulate transient structural changes in major vegetation types but do not simulate runoff generation reliably. Therefore, both hydrological models and DGVMs have their limitations as well as advantages for addressing this question. In this study a biosphere hydrological model (LPJH) is developed by coupling a prominent DGVM (the Lund-Potsdam-Jena model, referred to as LPJ) with a stand-alone hydrological model (HYMOD), with the objective of analyzing the role of vegetation in the hydrological processes at basin scale and evaluating the impact of vegetation change on the hydrological processes under climate change. The application and validation of the LPJH model to four basins representing a variety of climate and vegetation conditions shows that the performance of LPJH is much better than that of the original LPJ and is similar to that of stand-alone hydrological models for monthly and daily runoff simulation at the basin scale. It is argued that the LPJH model gives a more reasonable hydrological simulation since it considers both the spatial variability of soil moisture and vegetation dynamics, which makes the runoff generation mechanism more reliable. As an example, it is shown that changing atmospheric CO2 content alone would result in runoff increases in humid basins and decreases in arid basins. These changes are mainly attributable to changes in transpiration driven by vegetation dynamics, which are not simulated in stand-alone hydrological models.
Therefore LPJH potentially provides a powerful tool for simulating vegetation response to climate changes in the biosphere hydrological cycle.

  12. Numerical simulation of turbulent jet noise, part 2

    NASA Technical Reports Server (NTRS)

    Metcalfe, R. W.; Orszag, S. A.

    1976-01-01

    Results on the numerical simulation of jet flow fields were used to study the radiated sound field, and in addition, to extend and test the capabilities of the turbulent jet simulation codes. The principal result of the investigation was the computation of the radiated sound field from a turbulent jet. In addition, the computer codes were extended to account for the effects of compressibility and eddy viscosity, and the treatment of the nonlinear terms of the Navier-Stokes equations was modified so that they can be computed in a semi-implicit way. A summary of the flow model and a description of the numerical methods used for its solution are presented. Calculations of the radiated sound field are reported. In addition, the extensions that were made to the fundamental dynamical codes are described. Finally, the current state-of-the-art for computer simulation of turbulent jet noise is summarized.

  13. V&V Of CFD Modeling Of The Argonne Bubble Experiment: FY15 Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, Nathaniel C.; Wardle, Kent E.; Bailey, James L.

    2015-09-30

    In support of the development of accelerator-driven production of the fission product Mo-99, computational fluid dynamics (CFD) simulations of an electron-beam irradiated, experimental-scale bubble chamber have been conducted in order to aid in interpretation of existing experimental results, provide additional insights into the physical phenomena, and develop predictive thermal hydraulic capabilities that can be applied to full-scale target solution vessels. Toward that end, a custom hybrid Eulerian-Eulerian-Lagrangian multiphase solver was developed, and simulations have been performed on high-resolution meshes. Good agreement between experiments and simulations has been achieved, especially with respect to the prediction of the maximum temperature of the uranyl sulfate solution in the experimental vessel. These positive results suggest that the simulation methodology that has been developed will prove to be suitable to assist in the development of full-scale production hardware.

  14. Resource management and operations in central North Dakota: Climate change scenario planning workshop summary November 12-13, 2015, Bismarck, ND

    USGS Publications Warehouse

    Fisichelli, Nicholas A.; Schuurman, Gregor; Symstad, Amy J.; Ray, Andrea; Friedman, Jonathan M.; Miller, Brian; Rowland, Erika

    2016-01-01

    The Scaling Climate Change Adaptation in the Northern Great Plains through Regional Climate Summaries and Local Qualitative-Quantitative Scenario Planning Workshops project synthesizes climate data into 3-5 distinct but plausible climate summaries for the northern Great Plains region; crafts quantitative summaries of these climate futures for two focal areas; and applies these local summaries by developing climate-resource-management scenarios through participatory workshops and, where possible, simulation models. The two focal areas are central North Dakota and southwest South Dakota (Figure 1). The primary objective of this project is to help resource managers and scientists in a focal area use scenario planning to make management and planning decisions based on assessments of critical future uncertainties. This report summarizes project work for public and tribal lands in the central North Dakota focal area, with an emphasis on Knife River Indian Villages National Historic Site. The report explains scenario planning as an adaptation tool in general, then describes how it was applied to the central North Dakota focal area in three phases. Priority resource management and climate uncertainties were identified in the orientation phase. Local climate summaries for relevant, divergent, and challenging climate scenarios were developed in the second phase. In the final phase, a two-day scenario planning workshop held November 12-13, 2015 in Bismarck, ND, featured scenario development and implications, testing management decisions, and methods for operationalizing scenario planning outcomes.

  15. Resource management and operations in southwest South Dakota: Climate change scenario planning workshop summary January 20-21, 2016, Rapid City, SD

    USGS Publications Warehouse

    Fisichelli, Nicholas A.; Schuurman, Gregor W.; Symstad, Amy J.; Ray, Andrea; Miller, Brian; Cross, Molly; Rowland, Erika

    2016-01-01

    The Scaling Climate Change Adaptation in the Northern Great Plains through Regional Climate Summaries and Local Qualitative-Quantitative Scenario Planning Workshops project synthesizes climate data into 3-5 distinct but plausible climate summaries for the northern Great Plains region; crafts quantitative summaries of these climate futures for two focal areas; and applies these local summaries by developing climate-resource-management scenarios through participatory workshops and, where possible, simulation models. The two focal areas are central North Dakota and southwest South Dakota (Figure 1). The primary objective of this project is to help resource managers and scientists in a focal area use scenario planning to make management and planning decisions based on assessments of critical future uncertainties. This report summarizes project work for public and tribal lands in the southwest South Dakota grasslands focal area, with an emphasis on Badlands National Park and Buffalo Gap National Grassland. The report explains scenario planning as an adaptation tool in general, then describes how it was applied to the focal area in three phases. Priority resource management and climate uncertainties were identified in the orientation phase. Local climate summaries for relevant, divergent, and challenging climate scenarios were developed in the second phase. In the final phase, a two-day scenario planning workshop held January 20-21, 2016 in Rapid City, South Dakota, featured scenario development and implications, testing management decisions, and methods for operationalizing scenario planning outcomes.

  16. METAGUI. A VMD interface for analyzing metadynamics and molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Biarnés, Xevi; Pietrucci, Fabio; Marinelli, Fabrizio; Laio, Alessandro

    2012-01-01

    We present a new computational tool, METAGUI, which extends the VMD program with a graphical user interface for constructing a thermodynamic and kinetic model of a given process simulated by large-scale molecular dynamics. The tool is specially designed for analyzing metadynamics-based simulations. The huge number of diverse structures generated during such a simulation is partitioned into a set of microstates (i.e. structures with similar values of the collective variables). Their relative free energies are then computed by a weighted-histogram procedure, and the most relevant free energy wells are identified by diagonalization of the rate matrix followed by a committor analysis. This procedure leads to a convenient representation of the metastable states and long-time kinetics of the system, which can be compared with experimental data. The tool allows the user to switch seamlessly between a collective-variable-space representation of microstates and their atomic structure representation, which greatly facilitates the set-up and analysis of molecular dynamics simulations. METAGUI is based on the output format of the PLUMED plugin, making it compatible with a number of different molecular dynamics packages such as AMBER, NAMD, GROMACS and several others. The METAGUI source files can be downloaded from the PLUMED web site (http://www.plumed-code.org).

    Program summary
    Program title: METAGUI
    Catalogue identifier: AEKH_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKH_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 117,545
    No. of bytes in distributed program, including test data, etc.: 8,516,203
    Distribution format: tar.gz
    Programming language: Tk/Tcl, Fortran
    Computer: Any computer with a VMD installation and capable of running an executable produced by a gfortran compiler
    Operating system: Linux, Unix
    RAM: 1,073,741,824 bytes
    Classification: 23
    External routines: A VMD installation (http://www.ks.uiuc.edu/Research/vmd/)
    Nature of problem: Extract thermodynamic data and build a kinetic model of a given process simulated by metadynamics or molecular dynamics, and provide this information in a dual representation that allows navigating and exploring the molecular structures corresponding to each point along the multi-dimensional free energy hypersurface.
    Solution method: A graphical user interface linked to VMD that clusters the simulation trajectories in the space of a set of collective variables, assigns each frame to a given microstate, determines the free energy of each microstate by a weighted-histogram analysis method, and identifies the most relevant free energy wells (kinetic basins) by diagonalization of the rate matrix followed by a committor analysis.
    Restrictions: Input file formats compatible with PLUMED and all the MD engines supported by PLUMED and VMD.
    Running time: A few minutes.
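
    The two numerical steps named in the abstract — free energies from microstate populations and metastable basins from the slow eigenvectors of a transition operator — can be illustrated generically. The sketch below is not METAGUI code: the toy trajectory, the unweighted histogram (METAGUI reweights metadynamics frames), and the use of an empirical transition matrix in place of a true rate matrix are all simplifying assumptions.

```python
import numpy as np

kT = 2.494  # kJ/mol at ~300 K

# assume each trajectory frame has already been assigned to a microstate 0..n-1
assignments = np.array([0, 0, 1, 1, 1, 2, 2, 1, 0, 0, 0, 1, 2, 2, 2, 2])
n = assignments.max() + 1

# 1) free energy of each microstate from its population (plain histogram here;
#    a metadynamics run would first be reweighted, e.g. by a WHAM procedure)
counts = np.bincount(assignments, minlength=n).astype(float)
F = -kT * np.log(counts / counts.sum())
F -= F.min()                      # put the deepest well at zero

# 2) row-normalized transition matrix from observed jumps; its dominant
#    eigenvector is the stationary distribution, and the next-slowest
#    eigenvectors separate the metastable basins
Tcount = np.zeros((n, n))
for a, b in zip(assignments[:-1], assignments[1:]):
    Tcount[a, b] += 1.0
T = Tcount / Tcount.sum(axis=1, keepdims=True)
w, vecs = np.linalg.eig(T.T)
order = np.argsort(-w.real)       # eigenvalue 1 first, then slow modes
stationary = vecs[:, order[0]].real
stationary /= stationary.sum()
```

    A committor analysis, as in METAGUI, would then classify the remaining microstates by their probability of reaching one basin before another.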

  17. Microbial processes in marine ecosystem models: state of the art and future prospective

    NASA Astrophysics Data System (ADS)

    Polimene, L.; Butenschon, M.; Blackford, J.; Allen, I.

    2012-12-01

    Heterotrophic bacteria play a key role in marine biogeochemistry as the main consumers of dissolved organic matter (DOM) and the main producers of carbon dioxide (CO2) through respiration. Quantifying the carbon and energy fluxes within bacteria (i.e. production, respiration, overflow metabolism, etc.) is therefore crucial for the assessment of the global ocean carbon and nutrient cycles. Consequently, the description of bacterial dynamics in ecosystem models is a key (although challenging) issue which cannot be overlooked if we want to simulate the marine environment properly. We present an overview of the microbial processes described in the European Regional Seas Ecosystem Model (ERSEM), a state-of-the-art biogeochemical model resolving carbon and nutrient cycles (N, P, Si and Fe) within the lower trophic levels (up to mesozooplankton) of the marine ecosystem. The description of the theoretical assumptions and philosophy underpinning the ERSEM bacteria sub-model will be followed by the presentation of case studies highlighting the relevance of resolving microbial processes in the simulation of ecosystem dynamics at a local scale. Recent results concerning the implementation of ERSEM on a global ocean domain will also be presented. This latter exercise includes a comparison between simulations carried out with the full bacteria sub-model and simulations carried out with an implicit parameterization of bacterial activity. The results strongly underline the importance of explicitly resolved bacteria in the simulation of global carbon fluxes. Finally, a summary of future developments, along with issues still open on the topic, will be presented and discussed.

  18. An investigation into pilot and system response to critical in-flight events. Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Griffin, W. C.

    1981-01-01

    Materials relating to the study of pilot and system response to critical in-flight events (CIFE) are given. An annotated bibliography and a trip summary outline are presented, as are knowledge surveys with accompanying answer keys. Performance profiles of pilots and performance data from the simulations of CIFEs are given. The paper-and-pencil testing materials are reproduced. Conditions for the use of the additive model are discussed. A master summary of data for the destination diversion scenario is given. An interview with an aircraft mechanic demonstrates the feasibility of system problem diagnosis from a verbal description of symptoms and shows the information-seeking and problem-solving logic used by an expert to narrow the list of probable causes of aircraft failure.

  19. Three-Dimensional Computer Model of the Right Atrium Including the Sinoatrial and Atrioventricular Nodes Predicts Classical Nodal Behaviours

    PubMed Central

    Li, Jue; Inada, Shin; Schneider, Jurgen E.; Zhang, Henggui; Dobrzynski, Halina; Boyett, Mark R.

    2014-01-01

    The aim of the study was to develop a three-dimensional (3D) anatomically-detailed model of the rabbit right atrium containing the sinoatrial and atrioventricular nodes to study the electrophysiology of the nodes. A model was generated based on 3D images of a rabbit heart (atria and part of ventricles), obtained using high-resolution magnetic resonance imaging. Segmentation was carried out semi-manually. A 3D right atrium array model (∼3.16 million elements), including eighteen objects, was constructed. For description of cellular electrophysiology, the Rogers-modified FitzHugh-Nagumo model was further modified to allow control of the major characteristics of the action potential with relatively low computational resource requirements. Model parameters were chosen to simulate the action potentials in the sinoatrial node, atrial muscle, inferior nodal extension and penetrating bundle. The block zone was simulated as passive tissue. The sinoatrial node, crista terminalis, main branch and roof bundle were considered as anisotropic. We have simulated normal and abnormal electrophysiology of the two nodes. In accordance with experimental findings: (i) during sinus rhythm, conduction occurs down the interatrial septum and into the atrioventricular node via the fast pathway (conduction down the crista terminalis and into the atrioventricular node via the slow pathway is slower); (ii) during atrial fibrillation, the sinoatrial node is protected from overdrive by its long refractory period; and (iii) during atrial fibrillation, the atrioventricular node reduces the frequency of action potentials reaching the ventricles. The model is able to simulate ventricular echo beats. In summary, a 3D anatomical model of the right atrium containing the cardiac conduction system is able to simulate a wide range of classical nodal behaviours. PMID:25380074
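
    The Rogers-modified FitzHugh-Nagumo formulation mentioned in the abstract is commonly written as dv/dt = c1·v(v−a)(1−v) − c2·v·r + I and dr/dt = b(v − d·r). The single-cell sketch below uses this textbook form with illustrative parameter and stimulus values, not the region-specific values fitted in the atrium model.

```python
import numpy as np

def rogers_fhn(T=500.0, dt=0.1, a=0.13, b=0.013, c1=0.26, c2=0.1, d=1.0):
    """Forward-Euler integration of one Rogers-modified FitzHugh-Nagumo cell:
    dv/dt = c1*v*(v-a)*(1-v) - c2*v*r + I,  dr/dt = b*(v - d*r).
    v: normalized membrane potential, r: recovery variable."""
    n = int(T / dt)
    v = np.zeros(n)
    r = np.zeros(n)
    for t in range(n - 1):
        I = 0.1 if t * dt < 5.0 else 0.0      # brief suprathreshold stimulus
        v[t + 1] = v[t] + dt * (c1 * v[t] * (v[t] - a) * (1.0 - v[t])
                                - c2 * v[t] * r[t] + I)
        r[t + 1] = r[t] + dt * b * (v[t] - d * r[t])
    return v, r
```

    Varying a, b, c1, c2 and d per tissue region is how such a model can reproduce the distinct action potential shapes of the sinoatrial node, atrial muscle and nodal extensions at low computational cost; a tissue-scale simulation would additionally couple neighbouring cells diffusively.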

  20. Transportation Analysis and Simulation System Requirements

    DOT National Transportation Integrated Search

    1973-04-01

    This document provides: (a) a brief summary of overall project (PPA OS223) accomplishments during FY 72; and (b) a detailed summary of the following two major FY 72 activities: (1) analysis of TSC's computation resources and their utilization; (2) Pr...

  1. Assessment of IT solutions used in the Hungarian income tax microsimulation system

    NASA Astrophysics Data System (ADS)

    Molnar, I.; Hardhienata, S.

    2017-01-01

    This paper focuses on the use of information technology (IT) in diverse microsimulation studies and presents state-of-the-art solutions in the traditional application field of personal income tax simulation. The aim of the paper is to promote solutions which can improve the efficiency and quality of microsimulation model implementation, assess their applicability, and help shift attention from microsimulation model implementation and data analysis towards experiment design and model use. First, the authors briefly discuss the relevant characteristics of the microsimulation application field and the managerial decision-making problem. After examining the salient problems, advanced IT solutions, such as meta-databases and service-oriented architecture, are presented. The authors show how selected technologies can be applied to support data-driven, behavior-driven, and even agent-based personal income tax microsimulation model development. Finally, examples are presented and references made to the Hungarian Income Tax Simulator (HITS) models and their results. The paper concludes with a summary of the IT assessment and application-related remarks by the authors dedicated to an Indonesian Income Tax Microsimulation Model.
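
    The data-driven core of any static income tax microsimulation — applying a statutory tax schedule to weighted microdata and aggregating — can be sketched in a few lines. The incomes, grossing-up weights, and both bracket schedules below are invented for the example and bear no relation to Hungarian or Indonesian law.

```python
import numpy as np

def tax_due(income, brackets):
    """Progressive tax on one income given [(threshold, marginal_rate), ...]
    sorted by threshold; the last bracket extends to infinity."""
    tax = 0.0
    for i, (thr, rate) in enumerate(brackets):
        upper = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if income > thr:
            tax += (min(income, upper) - thr) * rate
    return tax

rng = np.random.default_rng(0)
incomes = rng.lognormal(mean=8.0, sigma=0.6, size=10_000)  # synthetic microdata
weights = np.full(incomes.size, 450.0)                     # grossing-up weights

baseline = [(0.0, 0.15)]                                   # flat 15% schedule
reform = [(0.0, 0.10), (3000.0, 0.20), (8000.0, 0.35)]     # three brackets

# population-level revenue under each schedule
rev_base = np.sum(weights * [tax_due(y, baseline) for y in incomes])
rev_reform = np.sum(weights * [tax_due(y, reform) for y in incomes])
```

    A behavior-driven model would additionally let `incomes` respond to the schedule (labor supply, evasion); an agent-based variant would simulate those responses per record.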

  2. Realistic modeling of neurons and networks: towards brain simulation

    PubMed Central

    D’Angelo, Egidio; Solinas, Sergio; Garrido, Jesus; Casellato, Claudia; Pedrocchi, Alessandra; Mapelli, Jonathan; Gandolfi, Daniela; Prestori, Francesca

    Summary: Realistic modeling is a new advanced methodology for investigating brain functions. Realistic modeling is based on a detailed biophysical description of neurons and synapses, which can be integrated into microcircuits. The latter can, in turn, be further integrated to form large-scale brain networks and eventually to reconstruct complex brain systems. Here we provide a review of the realistic simulation strategy and use the cerebellar network as an example. This network has been carefully investigated at molecular and cellular level and has been the object of intense theoretical investigation. The cerebellum is thought to lie at the core of the forward controller operations of the brain and to implement timing and sensory prediction functions. The cerebellum is well described and provides a challenging field in which one of the most advanced realistic microcircuit models has been generated. We illustrate how these models can be elaborated and embedded into robotic control systems to gain insight into how the cellular properties of cerebellar neurons emerge in integrated behaviors. Realistic network modeling opens up new perspectives for the investigation of brain pathologies and for the neurorobotic field. PMID:24139652

  3. Paired and interacting galaxies: Conference summary

    NASA Technical Reports Server (NTRS)

    Norman, Colin A.

    1990-01-01

    The author gives a summary of the conference proceedings. The conference began with the presentation of the basic data sets on pairs, groups, and interacting galaxies with the latter being further discussed with respect to both global properties and properties of the galactic nuclei. Then followed the theory, modelling and interpretation using analytic techniques, simulations and general modelling for spirals and ellipticals, starbursts and active galactic nuclei. Before the conference the author wrote down the three questions concerning pairs, groups and interacting galaxies that he hoped would be answered at the meeting: (1) How do they form, including the role of initial conditions, the importance of subclustering, the evolution of groups to compact groups, and the fate of compact groups; (2) How do they evolve, including issues such as relevant timescales, the role of halos and the problem of overmerging, the triggering and enhancement of star formation and activity in the galactic nuclei, and the relative importance of dwarf versus giant encounters; and (3) Are they important, including the frequency of pairs and interactions, whether merging and interactions are very important aspects of the life of a normal galaxy at formation, during its evolution, in forming bars, shells, rings, bulges, etc., and in the formation and evolution of active galaxies? Where possible he focuses on these three central issues in the summary.

  4. Panel summary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutjahr, A.L.; Kincaid, C.T.; Mercer, J.W.

    1987-04-01

    The objective of this report is to summarize the various modeling approaches that were used to simulate solute transport in a variably saturated medium. In particular, the technical strengths and weaknesses of each approach are discussed, and conclusions and recommendations for future studies are made. Five models are considered: (1) one-dimensional analytical and semianalytical solutions of the classical deterministic convection-dispersion equation (van Genuchten, Parker, and Kool, this report); (2) one-dimensional simulation using a continuous-time Markov process (Knighton and Wagenet, this report); (3) one-dimensional simulation using the time domain method and the frequency domain method (Duffy and Al-Hassan, this report); (4) a one-dimensional numerical approach that combines a solution of the classical deterministic convection-dispersion equation with a chemical equilibrium speciation model (Cederberg, this report); and (5) a three-dimensional numerical solution of the classical deterministic convection-dispersion equation (Huyakorn, Jones, Parker, Wadsworth, and White, this report). As part of the discussion, the input data and modeling results are summarized. The models were used in a data analysis mode, as opposed to a predictive mode. Thus, the following discussion will concentrate on the data analysis aspects of model use. Also, all the approaches were similar in that they were based on a convection-dispersion model of solute transport. Each discussion addresses the modeling approaches in the order listed above.
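
    The analytical route of approach (1) can be illustrated with the classical Ogata-Banks solution of the one-dimensional convection-dispersion equation for a constant-concentration inlet, C(x,t)/C0 = ½[erfc((x−vt)/(2√(Dt))) + exp(vx/D)·erfc((x+vt)/(2√(Dt)))]. The function below is a generic sketch of that textbook solution, not the van Genuchten-Parker-Kool code discussed in the report.

```python
import math

def ogata_banks(x, t, v, D, C0=1.0):
    """Ogata-Banks (1961) solution of the 1-D convection-dispersion equation
    for a constant-concentration boundary. x: distance, t: time (> 0),
    v: pore-water velocity, D: dispersion coefficient, C0: inlet concentration."""
    a = (x - v * t) / (2.0 * math.sqrt(D * t))
    b = (x + v * t) / (2.0 * math.sqrt(D * t))
    # the second term is a small correction; exp() can overflow when v*x/D
    # is very large, in which case the term is negligible anyway
    try:
        corr = math.exp(v * x / D) * math.erfc(b)
    except OverflowError:
        corr = 0.0
    return 0.5 * C0 * (math.erfc(a) + corr)
```

    Behind the advancing front (x well below v·t) the solution approaches C0; far ahead of it, concentration is essentially zero.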

  5. Pronounced differences between observed and CMIP5-simulated multidecadal climate variability in the twentieth century

    NASA Astrophysics Data System (ADS)

    Kravtsov, Sergey

    2017-06-01

    Identification and dynamical attribution of multidecadal climate undulations to either variations in external forcings or to internal sources is one of the most important topics of modern climate science, especially in conjunction with the issue of human-induced global warming. Here we utilize ensembles of twentieth century climate simulations to isolate the forced signal and residual internal variability in a network of observed and modeled climate indices. The observed internal variability so estimated exhibits a pronounced multidecadal mode with a distinctive spatiotemporal signature, which is altogether absent in model simulations. This single mode explains a major fraction of model-data differences over the entire climate index network considered; it may reflect either biases in the models' forced response or models' lack of requisite internal dynamics, or a combination of both.

    Plain Language Summary: Global and regional warming trends over the course of the twentieth century have been nonuniform, with decadal and longer periods of faster or slower warming, or even cooling. Here we show that state-of-the-art global models used to predict climate fail to adequately reproduce such multidecadal climate variations. In particular, the models underestimate the magnitude of the observed variability and misrepresent its spatial pattern.
Therefore, our ability to interpret the observed climate change using these models is limited.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AIPC.1786e0018D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AIPC.1786e0018D"><span>Development of the ARISTOTLE webware for cloud-based rarefied gas flow modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Deschenes, Timothy R.; Grot, Jonathan; Cline, Jason A.</p> <p>2016-11-01</p> <p>Rarefied gas dynamics are important for a wide variety of applications. An improvement in the ability of general users to predict these gas flows will enable optimization of current, and discovery of future processes. Despite this potential, most rarefied simulation software is designed by and for experts in the community. This has resulted in low adoption of the methods outside of the immediate RGD community. This paper outlines an ongoing effort to create a rarefied gas dynamics simulation tool that can be used by a general audience. The tool leverages a direct simulation Monte Carlo (DSMC) library that is available to the entire community and a web-based simulation process that will enable all users to take advantage of high performance computing capabilities. First, the DSMC library and simulation architecture are described. Then the DSMC library is used to predict a number of representative transient gas flows that are applicable to the rarefied gas dynamics community. The paper closes with a summary and future direction.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA101846','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA101846"><span>Theater-Level Gaming and Analysis Workshop for Force Planning. Volume II. 
Summary, Discussion of Issues and Requirements for Research. September 27- 29, 1977, Held at Xerox International Center for Training and Management Development, Leesburg, Virginia</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1981-05-01</p> <p>be allocated to targets on the battlefield and in the rear area. The speaker describes the VECTOR I/NUCLEAR model, a combination of the UNICORN target...outlined. UNICORN is compatible with VECTOR 1 in level of detail. It is an expected value damage model and uses linear programming to optimize the...and a growing appreciation for the power of simulation in addressing large, complex problems, it was only a few short years before these games had</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA617421','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA617421"><span>A Unified Access Model for Interconnecting Heterogeneous Wireless Networks</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2015-05-01</p> <p>Defined Networking, OpenFlow, WiFi, LTE 16. SECURITY CLASSIFICATION OF: 17. LIMITATION OF ABSTRACT UU 18. NUMBER OF PAGES 18 19a. NAME OF...Machine Configurations with WiFi and LTE 4 2.3 Three Virtual Machine Configurations with WiFi and LTE 5 3. Results and Discussion 5 4. Summary and...WiFi and long-term evolution ( LTE ), and created a communication pathway between them via a central controller node. 
Our simulation serves as a</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20090005963','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20090005963"><span>NASA Standard for Models and Simulations: Credibility Assessment Scale</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Mosier, Gary E.; Steele, Martin J.; Woods, Jody</p> <p>2009-01-01</p> <p>As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007 along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or just the Standard, occurred in July 2008. Bertch, Zang and Steeleiv provided a summary review of the development process of this standard up through the start of the review by the EMB. A thorough recount of the entire development process, major issues, key decisions, and all review processes are available in Ref. v. This is the second of a pair of papers providing a summary of the final version of the Standard. Its focus is the Credibility Assessment Scale, a key feature of the Standard, including an example of its application to a real-world M&S problem for the James Webb Space Telescope. The companion paper summarizes the overall philosophy of the Standard and an overview of the requirements. 
Verbatim quotes from the Standard are integrated into the text of this paper, and are indicated by quotation marks.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4099133','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4099133"><span>Reducing the Complexity of an Agent-Based Local Heroin Market Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Heard, Daniel; Bobashev, Georgiy V.; Morris, Robert J.</p> <p>2014-01-01</p> <p>This project explores techniques for reducing the complexity of an agent-based model (ABM). The analysis involved a model developed from the ethnographic research of Dr. Lee Hoffer in the Larimer area heroin market, which involved drug users, drug sellers, homeless individuals and police. The authors used statistical techniques to create a reduced version of the original model which maintained simulation fidelity while reducing computational complexity. This involved identifying key summary quantities of individual customer behavior as well as overall market activity and replacing some agents with probability distributions and regressions. The model was then extended to allow external market interventions in the form of police busts. Extensions of this research perspective, as well as its strengths and limitations, are discussed. 
PMID:25025132</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22676278','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22676278"><span>Standardisation of digital human models.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Paul, Gunther; Wischniewski, Sascha</p> <p>2012-01-01</p> <p>Digital human models (DHM) have evolved as useful tools for ergonomic workplace design and product development, and found in various industries and education. DHM systems which dominate the market were developed for specific purposes and differ significantly, which is not only reflected in non-compatible results of DHM simulations, but also provoking misunderstanding of how DHM simulations relate to real world problems. While DHM developers are restricted by uncertainty about the user need and lack of model data related standards, users are confined to one specific product and cannot exchange results, or upgrade to another DHM system, as their previous results would be rendered worthless. Furthermore, origin and validity of anthropometric and biomechanical data is not transparent to the user. The lack of standardisation in DHM systems has become a major roadblock in further system development, affecting all stakeholders in the DHM industry. Evidently, a framework for standardising digital human models is necessary to overcome current obstructions. Practitioner Summary: This short communication addresses a standardisation issue for digital human models, which has been addressed at the International Ergonomics Association Technical Committee for Human Simulation and Virtual Environments. 
It is the outcome of a workshop at the DHM 2011 symposium in Lyon, which concluded steps towards DHM standardisation that need to be taken.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20110004200','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20110004200"><span>A Computational Approach for Probabilistic Analysis of LS-DYNA Water Impact Simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.</p> <p>2010-01-01</p> <p>NASA s development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. Because of the computational cost, these tools are often used to evaluate specific conditions and rarely used for statistical analysis. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. For this problem, response surface models are used to predict the system time responses to a water landing as a function of capsule speed, direction, attitude, water speed, and water direction. Furthermore, these models can also be used to ascertain the adequacy of the design in terms of probability measures. 
This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20150008468&hterms=time+keeper&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D20%26Ntt%3Dtime%2Bkeeper','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20150008468&hterms=time+keeper&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D20%26Ntt%3Dtime%2Bkeeper"><span>Neutralizer Hollow Cathode Simulations and Comparisons with Ground Test Data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Mikellides, Ioannis G.; Snyder, John S.; Goebel, Dan M.; Katz, Ira; Herman, Daniel A.</p> <p>2009-01-01</p> <p>The fidelity of electric propulsion physics-based models depends largely on the validity of their predictions over a range of operating conditions and geometries. In general, increased complexity of the physics requires more extensive comparisons with laboratory data to identify the region(s) that lie outside the validity of the model assumptions and to quantify the uncertainties within its range of application. This paper presents numerical simulations of neutralizer hollow cathodes at various operating conditions and orifice sizes. The simulations were performed using a two-dimensional axisymmetric model that solves numerically a relatively extensive system of conservation laws for the partially ionized gas in these devices. A summary of the comparisons between simulation results and Langmuir probe measurements is provided. 
The model has also been employed to provide insight into recent ground test observations of the neutralizer cathode in NEXT. It is found that a likely cause of the observed keeper voltage drop is cathode orifice erosion. However, because this change is small, approximately 0.5 V (less than 5% of the beginning-of-life value) over 10 khrs, and in light of the large uncertainties in the cathode material sputtering yield at low ion energies, other causes cannot be excluded. Preliminary simulations to understand transition to plume mode suggest that in the range of 3-5 sccm the existing 2-D model reproduces fairly well the rise of the keeper voltage in the NEXT neutralizer as observed in the laboratory. At lower flow rates the simulation produces oscillations in the keeper current and voltage that require prohibitively small time-steps to resolve with the existing algorithms.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4876308','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4876308"><span>Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Erdemir, Ahmet</p> <p>2016-01-01</p> <p>Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. 
However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor-intensive reproduction of model development steps can be avoided. Interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age groups and pathological states. 
PMID:26444849</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26444849','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26444849"><span>Open Knee: Open Source Modeling and Simulation in Knee Biomechanics.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Erdemir, Ahmet</p> <p>2016-02-01</p> <p>Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical functions of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor-intensive reproduction of model development steps can be avoided. Interested parties can immediately utilize readily available models for scientific discovery and clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes the detailed anatomical representation of the joint's major tissue structures and their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. 
A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age groups and pathological states.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19770012832','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19770012832"><span>Capabilities and applications of the Program to Optimize Simulated Trajectories (POST). Program summary document</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Brauer, G. L.; Cornick, D. E.; Stevenson, R.</p> <p>1977-01-01</p> <p>The capabilities and applications of the three-degree-of-freedom (3DOF) version and the six-degree-of-freedom (6DOF) version of the Program to Optimize Simulated Trajectories (POST) are summarized. The document supplements the detailed program manuals by providing additional information that motivates and clarifies basic capabilities, input procedures, applications and computer requirements of these programs. The information will enable prospective users to evaluate the programs, and to determine if they are applicable to their problems. Enough information is given to enable managerial personnel to evaluate the capabilities of the programs; the document also describes the POST structure, formulation, input and output procedures, sample cases, and computer requirements. 
The report also provides answers to basic questions concerning planet and vehicle modeling, simulation accuracy, optimization capabilities, and general input rules. Several sample cases are presented.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20080006497','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20080006497"><span>Statistical Analyses of Satellite Cloud Object Data from CERES. Part III; Comparison with Cloud-Resolving Model Simulations of Tropical Convective Clouds</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Luo, Yali; Xu, Kuan-Man; Wielicki, Bruce A.; Wong, Takmeng; Eitzen, Zachary A.</p> <p>2007-01-01</p> <p>The present study evaluates the ability of a cloud-resolving model (CRM) to simulate the physical properties of tropical deep convective cloud objects identified from a Clouds and the Earth's Radiant Energy System (CERES) data product. The emphasis of this study is the comparisons among the small-, medium- and large-size categories of cloud objects observed during March 1998 and between the large-size categories of cloud objects observed during March 1998 (strong El Niño) and March 2000 (weak La Niña). Results from the CRM simulations are analyzed in a way that is consistent with the CERES retrieval algorithm and they are averaged to match the scale of the CERES satellite footprints. Cloud physical properties are analyzed in terms of their summary histograms for each category. It is found that there is a general agreement in the overall shapes of all cloud physical properties between the simulated and observed distributions. Each cloud physical property produced by the CRM also exhibits different degrees of disagreement with observations over different ranges of the property. 
The simulated cloud tops are generally too high and cloud top temperatures are too low except for the large-size category of March 1998. The probability densities of the simulated top-of-the-atmosphere (TOA) albedos for all four categories are underestimated for high albedos, while those of cloud optical depth are overestimated at the lowest bin. These disagreements are mainly related to uncertainties in the cloud microphysics parameterization and in inputs to the radiation calculation such as cloud ice effective size. Summary histograms of cloud optical depth and TOA albedo from the CRM simulations of the large-size category of cloud objects do not differ significantly between the March 1998 and 2000 periods, consistent with the CERES observations. However, the CRM is unable to reproduce the significant differences in the observed cloud top height while it overestimates the differences in the observed outgoing longwave radiation and cloud top temperature between the two periods. Comparisons between the CRM results and the observations for most parameters in March 1998 consistently show that both the simulations and observations have larger differences between the large- and small-size categories than between the large- and medium-size, or between the medium- and small-size categories. However, the simulated cloud properties do not change as much with size as observed. 
These disagreements are likely related to the spatial averaging of the forcing data and the mismatch in time and in space between the numerical weather prediction model from which the forcing data are produced and the CERES observed cloud systems.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017PhDT........62G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017PhDT........62G"><span>Numerical Coupling and Simulation of Point-Mass System with the Turbulent Fluid Flow</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gao, Zheng</p> <p></p> <p>A computational framework that combines the Eulerian description of the turbulence field with a Lagrangian point-mass ensemble is proposed in this dissertation. Depending on the Reynolds number, the turbulence field is simulated using Direct Numerical Simulation (DNS) or an eddy viscosity model. Meanwhile, the particle system, such as a spring-mass system or cloud droplets, is modeled using an ordinary differential system, which is stiff and hence poses a challenge to the stability of the entire system. This computational framework is applied to the numerical study of parachute deceleration and cloud microphysics. These two distinct problems can be uniformly modeled with Partial Differential Equations (PDEs) and Ordinary Differential Equations (ODEs), and numerically solved in the same framework. For the parachute simulation, a novel porosity model is proposed to simulate the porous effects of the parachute canopy. This model is easy to implement with the projection method and is able to reproduce Darcy's law observed in the experiment. 
Moreover, the impacts of using different versions of the k-epsilon turbulence model in the parachute simulation were investigated; the study concludes that the standard and Re-Normalisation Group (RNG) models may overestimate the turbulence effects when the Reynolds number is small, while the Realizable model performs consistently at both large and small Reynolds numbers. For another application, cloud microphysics, the cloud entrainment-mixing problem is studied in the same numerical framework. Three sets of DNS are carried out with both decaying and forced turbulence. The numerical result suggests a new way to parameterize the cloud mixing degree using the dynamical measures. The numerical experiments also verify the negative relationship between the droplet number concentration and the vorticity field. The results imply that gravity has less impact on forced turbulence than on decaying turbulence. In summary, the proposed framework can be used to solve a physics problem that involves a turbulence field and a point-mass system, and therefore has broad applicability.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70191919','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70191919"><span>State-and-transition simulation models: a framework for forecasting landscape change</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Daniel, Colin; Frid, Leonardo; Sleeter, Benjamin M.; Fortin, Marie-Josée</p> <p>2016-01-01</p> <p>Summary: A wide range of spatially explicit simulation models have been developed to forecast landscape dynamics, including models for projecting changes in both vegetation and land use. 
While these models have generally been developed as separate applications, each with a separate purpose and audience, they share many common features. We present a general framework, called a state-and-transition simulation model (STSM), which captures a number of these common features, accompanied by a software product, called ST-Sim, to build and run such models. The STSM method divides a landscape into a set of discrete spatial units and simulates the discrete state of each cell forward as a discrete-time inhomogeneous stochastic process. The method differs from a spatially interacting Markov chain in several important ways, including the ability to add discrete counters such as age and time-since-transition as state variables, to specify one-step transition rates as either probabilities or target areas, and to represent multiple types of transitions between pairs of states. We demonstrate the STSM method using a model of land-use/land-cover (LULC) change for the state of Hawai'i, USA. Processes represented in this example include expansion/contraction of agricultural lands, urbanization, wildfire, shrub encroachment into grassland and harvest of tree plantations; the model also projects shifts in moisture zones due to climate change. Key model output includes projections of the future spatial and temporal distribution of LULC classes and moisture zones across the landscape over the next 50 years. State-and-transition simulation models can be applied to a wide range of landscapes, including questions of both land-use change and vegetation dynamics. Because the method is inherently stochastic, it is well suited for characterizing uncertainty in model projections. 
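The core simulation loop of an STSM as described here (discrete cells, per-state transition probabilities, a time-since-transition counter) can be sketched minimally. The states and probabilities below are hypothetical placeholders, not values from the Hawai'i model.

```python
import numpy as np

rng = np.random.default_rng(1)

STATES = ["grassland", "shrubland", "forest"]
# Hypothetical annual transition probabilities (each row sums to 1)
P = {
    "grassland": {"grassland": 0.90, "shrubland": 0.09, "forest": 0.01},
    "shrubland": {"grassland": 0.05, "shrubland": 0.85, "forest": 0.10},
    "forest":    {"grassland": 0.02, "shrubland": 0.03, "forest": 0.95},
}

n_cells, n_years = 1000, 50
state = np.array(["grassland"] * n_cells)   # one discrete state per cell
age = np.zeros(n_cells, dtype=int)          # time-since-transition counter

for _ in range(n_years):
    for i in range(n_cells):
        probs = [P[str(state[i])][s] for s in STATES]
        new = rng.choice(STATES, p=probs)
        # reset the counter on a transition, otherwise advance it
        age[i] = age[i] + 1 if new == state[i] else 0
        state[i] = new

# Key output: projected distribution of states after the simulation horizon
dist = {s: int((state == s).sum()) for s in STATES}
```

Because each replicate run is stochastic, repeating the loop with different seeds yields a distribution of outcomes, which is how such models characterize projection uncertainty.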
When combined with the ST-Sim software, STSMs offer a simple yet powerful means for developing a wide range of models of landscape dynamics.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19920019574','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19920019574"><span>Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS) for the computational analyses of high speed reacting flows</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Givi, Peyman; Madnia, Cyrus K.; Steinberger, C. J.; Frankel, S. H.</p> <p>1992-01-01</p> <p>The principal objective is to extend the boundaries within which large eddy simulations (LES) and direct numerical simulations (DNS) can be applied in computational analyses of high speed reacting flows. A summary of work accomplished during the last six months is presented.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_11");'>11</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li class="active"><span>13</span></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_13 --> <div id="page_14" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li class="active"><span>14</span></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> 
<li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="261"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19870014609','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19870014609"><span>Simulation and analysis of a geopotential research mission</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Schutz, B. E.</p> <p>1987-01-01</p> <p>Computer simulations were performed for a Geopotential Research Mission (GRM) to enable the study of the gravitational sensitivity of the range rate measurements between the two satellites and to provide a set of simulated measurements to assist in the evaluation of techniques developed for the determination of the gravity field. The simulations were conducted with two satellites in near-circular, frozen orbits at 160 km altitude, separated by 300 km. High-precision numerical integration of the polar orbits was used with a gravitational field complete to degree and order 360. The set of simulated data for a mission duration of about 32 days was generated on a Cray X-MP computer. 
The results presented cover the most recent simulation, S8703, and include a summary of the numerical integration of the simulated trajectories, a summary of the requirements to compute nominal reference trajectories to meet the initial orbit determination requirements for the recovery of the geopotential, an analysis of the nature of the one-way integrated Doppler measurements associated with the simulation, and a discussion of the data set to be made available.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5161269','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5161269"><span>Challenges in Species Tree Estimation Under the Multispecies Coalescent Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Xu, Bo; Yang, Ziheng</p> <p>2016-01-01</p> <p>The multispecies coalescent (MSC) model has emerged as a powerful framework for inferring species phylogenies while accounting for ancestral polymorphism and gene tree-species tree conflict. A number of methods have been developed in the past few years to estimate the species tree under the MSC. The full likelihood methods (including maximum likelihood and Bayesian inference) average over the unknown gene trees and accommodate their uncertainties properly but involve intensive computation. The approximate or summary coalescent methods are computationally fast and are applicable to genomic datasets with thousands of loci, but do not make an efficient use of information in the multilocus data. Most of them take the two-step approach of reconstructing the gene trees for multiple loci by phylogenetic methods and then treating the estimated gene trees as observed data, without accounting for their uncertainties appropriately. 
In this article we review the statistical nature of the species tree estimation problem under the MSC, and explore the conceptual issues and challenges of species tree estimation by focusing mainly on simple cases of three or four closely related species. We use mathematical analysis and computer simulation to demonstrate that large differences in statistical performance may exist between the two classes of methods. We illustrate that several counterintuitive behaviors may occur with the summary methods but they are due to inefficient use of information in the data by summary methods and vanish when the data are analyzed using full-likelihood methods. These include (i) unidentifiability of parameters in the model, (ii) inconsistency in the so-called anomaly zone, (iii) singularity on the likelihood surface, and (iv) deterioration of performance upon addition of more data. We discuss the challenges and strategies of species tree inference for distantly related species when the molecular clock is violated, and highlight the need for improving the computational efficiency and model realism of the likelihood methods as well as the statistical efficiency of the summary methods. PMID:27927902</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/22525532-approximate-bayesian-computation-forward-modeling-cosmology','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/22525532-approximate-bayesian-computation-forward-modeling-cosmology"><span>Approximate Bayesian computation for forward modeling in cosmology</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Akeret, Joël; Refregier, Alexandre; Amara, Adam</p> <p></p> <p>Bayesian inference is often used in cosmology and astrophysics to derive constraints on model parameters from observations. 
This approach relies on the ability to compute the likelihood of the data given a choice of model parameters. In many practical situations, the likelihood function may, however, be unavailable or intractable due to non-Gaussian errors, non-linear measurement processes, or complex data formats such as catalogs and maps. In these cases, the simulation of mock data sets can often be made through forward modeling. We discuss how Approximate Bayesian Computation (ABC) can be used in these cases to derive an approximation to the posterior constraints using simulated data sets. This technique relies on the sampling of the parameter set, a distance metric to quantify the difference between the observation and the simulations, and summary statistics to compress the information in the data. We first review the principles of ABC and discuss its implementation using a Population Monte-Carlo (PMC) algorithm and the Mahalanobis distance metric. We test the performance of the implementation using a Gaussian toy model. We then apply the ABC technique to the practical case of the calibration of image simulations for wide field cosmological surveys. We find that the ABC analysis is able to provide reliable parameter constraints for this problem and is therefore a promising technique for other applications in cosmology and astrophysics. 
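The generic ABC recipe summarized here (draw parameters from a prior, forward-model mock data, compare summary statistics under a distance metric, keep draws within a threshold) can be sketched with a toy rejection sampler. This is a deliberate simplification of the paper's PMC algorithm, and all numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# "Observed" data with an unknown mean we want to constrain
true_mu = 3.0
observed = rng.normal(true_mu, 1.0, size=200)
s_obs = observed.mean()               # summary statistic of the observation

def forward_model(mu):
    """Mock data set generated by forward modeling for one parameter draw."""
    return rng.normal(mu, 1.0, size=200)

epsilon = 0.1                         # acceptance threshold on the distance
accepted = []
for _ in range(20000):
    mu = rng.uniform(-10.0, 10.0)     # draw from a (hypothetical) flat prior
    s_sim = forward_model(mu).mean()  # summary statistic of the simulation
    if abs(s_sim - s_obs) < epsilon:  # 1-D distance metric on the summaries
        accepted.append(mu)

# The accepted draws approximate the posterior; their mean estimates mu
posterior_mean = float(np.mean(accepted))
```

A PMC scheme improves on this by iteratively shrinking epsilon and reweighting the accepted population rather than sampling the prior blindly, but the ingredients (prior, forward model, summaries, distance) are the same.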
Our implementation of the ABC PMC method is made available via a public code release.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3864982','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3864982"><span>Bayesian spatial transformation models with applications in neuroimaging data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Miranda, Michelle F.; Zhu, Hongtu; Ibrahim, Joseph G.</p> <p>2013-01-01</p> <p>Summary: The aim of this paper is to develop a class of spatial transformation models (STM) to spatially model the varying association between imaging measures in a three-dimensional (3D) volume (or 2D surface) and a set of covariates. Our STMs include a varying Box-Cox transformation model for dealing with the issue of non-Gaussian distributed imaging data and a Gaussian Markov Random Field model for incorporating spatial smoothness of the imaging data. Posterior computation proceeds via an efficient Markov chain Monte Carlo algorithm. Simulations and real data analysis demonstrate that the STM significantly outperforms the voxel-wise linear model with Gaussian noise in recovering meaningful geometric patterns. Our STM is able to reveal important brain regions with morphological changes in children with attention deficit hyperactivity disorder. 
PMID:24128143</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1227256','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1227256"><span>2015 Los Alamos Space Weather Summer School Research Reports</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Cowee, Misa; Chen, Yuxi; Desai, Ravindra</p> <p></p> <p>The fifth Los Alamos Space Weather Summer School was held June 1st - July 24th, 2015, at Los Alamos National Laboratory (LANL). With renewed support from the Institute of Geophysics, Planetary Physics, and Signatures (IGPPS) and additional support from the National Aeronautics and Space Administration (NASA) and the Department of Energy (DOE) Office of Science, we hosted a new class of five students from various U.S. and foreign research institutions. The summer school curriculum includes a series of structured lectures as well as mentored research and practicum opportunities. Lectures on general and specialized topics in the field of space weather were given by a number of researchers affiliated with LANL. Students were given the opportunity to engage in research projects through a mentored practicum experience. Each student works with one or more LANL-affiliated mentors to execute a collaborative research project, typically linked with a larger ongoing research effort at LANL and/or the student’s PhD thesis research. This model provides a valuable learning experience for the student while developing the opportunity for future collaboration. This report includes a summary of the research efforts fostered and facilitated by the Space Weather Summer School. These reports should be viewed as work-in-progress as the short session typically only offers sufficient time for preliminary results. 
At the close of the summer school session, students present a summary of their research efforts. Titles of the papers included in this report are as follows: Full particle-in-cell (PIC) simulation of whistler wave generation, Hybrid simulations of the right-hand ion cyclotron anisotropy instability in a sub-Alfvénic plasma flow, A statistical ensemble for solar wind measurements, Observations and models of substorm injection dispersion patterns, Heavy ion effects on Kelvin-Helmholtz instability: hybrid study, Simulating plasmaspheric electron densities with a two-component electric field model, Ion and electron heating by whistler turbulence: parametric studies via particle-in-cell simulation, and The statistics of relativistic electron pitch angle distribution in the Earth’s radiation belt based on the Van Allen Probes measurements.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24795787','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24795787"><span>QUANTIFYING ALTERNATIVE SPLICING FROM PAIRED-END RNA-SEQUENCING DATA.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Rossell, David; Stephan-Otto Attolini, Camille; Kroiss, Manuel; Stöcker, Almond</p> <p>2014-03-01</p> <p>RNA-sequencing has revolutionized biomedical research and, in particular, our ability to study gene alternative splicing. The problem has important implications for human health, as alternative splicing may be involved in malfunctions at the cellular level and multiple diseases. However, the high-dimensional nature of the data and the existence of experimental biases pose serious data analysis challenges. We find that the standard data summaries used to study alternative splicing are severely limited, as they ignore a substantial amount of valuable information. 
Current data analysis methods are based on such summaries and are hence sub-optimal. Further, they have limited flexibility in accounting for technical biases. We propose novel data summaries and a Bayesian modeling framework that overcome these limitations and determine biases in a non-parametric, highly flexible manner. These summaries adapt naturally to the rapid improvements in sequencing technology. We provide efficient point estimates and uncertainty assessments. The approach allows studying alternative splicing patterns for individual samples and can also be the basis for downstream analyses. We found a severalfold improvement in estimation mean square error compared with popular approaches in simulations, and substantially higher consistency between replicates in experimental data. Our findings indicate the need for adjusting the routine summarization and analysis of alternative splicing RNA-seq studies. We provide a software implementation in the R package casper.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://rosap.ntl.bts.gov/view/dot/26201','DOTNTL'); return false;" href="https://rosap.ntl.bts.gov/view/dot/26201"><span>A qualitative analysis of bus simulator training on transit incidents : a case study in Florida. [Summary].</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntlsearch.bts.gov/tris/index.do">DOT National Transportation Integrated Search</a></p> <p></p> <p>2013-01-01</p> <p>The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. 
Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19890014414','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19890014414"><span>Digital control of the Kuiper Airborne Observatory telescope</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Mccormack, Ann C.; Snyder, Philip K.</p> <p>1989-01-01</p> <p>The feasibility of using a digital controller to stabilize a telescope mounted in an airplane is investigated. The telescope is a 30 in. infrared telescope mounted aboard a NASA C-141 aircraft known as the Kuiper Airborne Observatory. Current efforts to refurbish the 14-year-old compensation system have led to considering a digital controller. A typical digital controller is modeled and added into the telescope system model. This model is simulated on a computer to generate the Bode plots and time responses which determine system stability and performance parameters. Important aspects of digital control system hardware are discussed. A summary of the findings shows that a digital control system would result in satisfactory telescope performance.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA126682','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA126682"><span>Descriptive Summaries of the Research Development Test & Evaluation. Army Appropriation FY 1984. Supporting Data FY 1984 Budget Estimate Submitted to Congress--February 1983. 
Volume I.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1983-02-01</p> <p>successfully modeled to enhance future computer design simulations; (2) a new methodology for conducting dynamic analysis of vehicle mechanics was...to preliminary design methodology for tilt rotors, advancing blade concepts configuration helicopters, and compound helicopters in conjunction with...feasibility of low-level personnel parachutes has been demonstrated. A study was begun to design a free-fall water container. An experimental program to</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19790016802','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19790016802"><span>Optimization of MLS receivers for multipath environments</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Mcalpine, G. A.; Highfill, J. H., III</p> <p>1979-01-01</p> <p>The angle tracking problems in microwave landing system receivers along with a receiver design capable of optimal performance in the multipath environments found in air terminal areas were studied. Included were various theoretical and evaluative studies like: (1) signal model development; (2) derivation of optimal receiver structures; and (3) development and use of computer simulations for receiver algorithm evaluation. The development of an experimental receiver for flight testing is presented. 
An overview of the work and summary of principal results and conclusions are reported.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011plri.book.....E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011plri.book.....E"><span>Planetary Rings</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Esposito, Larry W.</p> <p>2011-07-01</p> <p>Preface; 1. Introduction: the allure of ringed planets; 2. Studies of planetary rings 1610-2004; 3. Diversity of planetary rings; 4. Individual ring particles and their collisions; 5. Large-scale ring evolution; 6. Moons confine and sculpt rings; 7. Explaining ring phenomena; 8. N-Body simulations; 9. Stochastic models; 10. Age and evolution of rings; 11. Saturn's mysterious F ring; 12. Neptune's partial rings; 13. Jupiter's ring-moon system after Galileo; 14. Ring photometry; 15. Dusty rings; 16. Cassini observations; 17. Summary: the big questions; Glossary; References; Index.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA244182','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA244182"><span>Formalization and Validation of an SADT Specification Through Executable Simulation in VHDL</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1991-12-01</p> <p>be found in (39, 40, 41). One recent summary of the SADT methodology was written by Marca and McGowan in 1988 (32). SADT is a methodology to provide...that is required. Also, the presence of "all" inputs and controls may not be needed for the activity to proceed. Marca and McGowan (32) describe a...diagrams which describe a complete system. 
Marca and McGowan define an SADT Model as: "a collection of carefully coordinated descriptions, starting from a</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA591639','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA591639"><span>Integration of Computational Geometry, Finite Element, and Multibody System Algorithms for the Development of New Computational Methodology for High-Fidelity Vehicle Systems Modeling and Simulation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2013-04-11</p> <p>vehicle dynamics. Technical representative: Dr. Paramsothy Jayakumar, TARDEC; contractor: Computational Dynamics Inc. Project Summary: This project aims at addressing and...applications. This literature review is being summarized and incorporated into the paper. The commentary provided by Dr. Jayakumar was addressed and</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1389911','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1389911"><span>Fuego/Scefire MPMD Coupling L2 Milestone Executive Summary</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Pierce, Flint; Tencer, John; Pautz, Shawn D.</p> <p>2017-09-01</p> <p>This milestone campaign was focused on coupling Sandia physics codes SIERRA low Mach module Fuego and RAMSES Boltzmann transport code Sceptre (Scefire). 
Fuego enables simulation of low Mach, turbulent, reacting, particle-laden flows on unstructured meshes using CVFEM for abnormal thermal environments throughout SNL and the larger national security community. Sceptre provides simulation for photon, neutron, and charged particle transport on unstructured meshes using Discontinuous Galerkin for radiation effects calculations at SNL and elsewhere. Coupling these “best of breed” codes enables efficient modeling of thermal/fluid environments with radiation transport, including fires (pool, propellant, composite) as well as those with directed radiant fluxes. We seek to improve the experience of Fuego users who require radiation transport capabilities in two ways. The first is performance. We achieve this through leveraging additional computational resources for Scefire, reducing calculation times while leaving unaffected resources for fluid physics. This approach is new to Fuego, which previously utilized the same resources for both fluid and radiation solutions. The second improvement enables new radiation capabilities, including spectral (banded) radiation, beam boundary sources, and alternate radiation solvers (i.e. Pn). This summary provides an overview of these achievements.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012CoPhC.183.2629H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012CoPhC.183.2629H"><span>BADGER v1.0: A Fortran equation of state library</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Heltemes, T. A.; Moses, G. A.</p> <p>2012-12-01</p> <p>The BADGER equation of state library was developed to enable inertial confinement fusion plasma codes to more accurately model plasmas in the high-density, low-temperature regime. 
The code has the capability to calculate 1- and 2-T plasmas using the Thomas-Fermi model and an individual electron accounting model. Ion equation of state data can be calculated using an ideal gas model or via a quotidian equation of state with scaled binding energies. Electron equation of state data can be calculated via the ideal gas model or with an adaptation of the screened hydrogenic model with ℓ-splitting. The ionization and equation of state calculations can be done in local thermodynamic equilibrium or in a non-LTE mode using a variant of the Busquet equivalent temperature method. The code was written as a stand-alone Fortran library for ease of implementation by external codes. EOS results for aluminum are presented that show good agreement with the SESAME library, and ionization calculations show good agreement with the FLYCHK code. Program summary. Program title: BADGERLIB v1.0 Catalogue identifier: AEND_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEND_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 41 480 No. of bytes in distributed program, including test data, etc.: 2 904 451 Distribution format: tar.gz Programming language: Fortran 90. Computer: 32- or 64-bit PC, or Mac. Operating system: Windows, Linux, MacOS X. RAM: 249.496 kB plus 195.630 kB per isotope record in memory Classification: 19.1, 19.7. Nature of problem: Equation of State (EOS) calculations are necessary for the accurate simulation of high energy density plasmas. Historically, most EOS codes used in these simulations have relied on an ideal gas model. This model is inadequate for low-temperature, high-density plasma conditions; the gaseous and liquid phases; and the solid phase. The BADGER code was developed to give more realistic EOS data in these regimes. 
Solution method: BADGER has multiple, user-selectable models to treat the ions, average-atom ionization state and electrons. Ion models are ideal gas and quotidian equation of state (QEOS); ionization models are Thomas-Fermi and the individual accounting method (IEM) formulation of the screened hydrogenic model (SHM) with ℓ-splitting; electron models are ideal gas and a Helmholtz free energy minimization method derived from the SHM. The default equation of state and ionization models are appropriate for plasmas in local thermodynamic equilibrium (LTE). The code can calculate non-LTE equation of state (EOS) and ionization data using a simplified form of the Busquet equivalent-temperature method. Restrictions: Physical data are only provided for elements Z=1 to Z=86. Multiple solid phases are not currently supported. Liquid, gas and plasma phases are combined into a generalized "fluid" phase. Unusual features: BADGER divorces the calculation of average-atom ionization from the electron equation of state model, allowing the user to select ionization and electron EOS models that are most appropriate to the simulation. The included ion ideal gas model uses ground-state nuclear spin data to differentiate between isotopes of a given element. 
Running time: The example provided takes only a few seconds to run.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1374298','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1374298"><span>Initial CGE Model Results Summary Exogenous and Endogenous Variables Tests</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Edwards, Brian Keith; Boero, Riccardo; Rivera, Michael Kelly</p> <p></p> <p>The following discussion presents initial results of tests of the most recent version of the National Infrastructure Simulation and Analysis Center Dynamic Computable General Equilibrium (CGE) model developed by Los Alamos National Laboratory (LANL). The intent is to test and assess the model’s behavioral properties. The test evaluated whether the predicted impacts are reasonable from a qualitative perspective. The issue is whether the predicted change, be it an increase or decrease in other model variables, is consistent with prior economic intuition and expectations about the predicted change. One of the purposes of this effort is to determine whether model changes are needed in order to improve its behavior qualitatively and quantitatively.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/FR-2013-11-04/pdf/2013-26270.pdf','FEDREG'); return false;" href="https://www.gpo.gov/fdsys/pkg/FR-2013-11-04/pdf/2013-26270.pdf"><span>78 FR 66099 - Petition for Exemption; Summary of Petition Received</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=FR">Federal Register 2010, 2011, 2012, 2013, 2014</a></p> <p></p> <p>2013-11-04</p> <p>..., Office of Rulemaking. 
Petition for Exemption Docket No.: FAA-2013-0818. Petitioner: ELITE Simulation Solutions. Section of 14 CFR Affected: Sec. 61.65(i) Description of Relief Sought: ELITE Simulation...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017MS%26E..257a2008T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017MS%26E..257a2008T"><span>6 DOF articulated-arm robot and mobile platform: Dynamic modelling as Multibody System and its validation via Experimental Modal Analysis.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Toledo Fuentes, A.; Kipfmueller, M.; José Prieto, M. A.</p> <p>2017-10-01</p> <p>Mobile manipulators are becoming a key instrument to increase the flexibility in industrial processes. Some of their requirements include handling of objects with different weights and sizes and their “fast” transportation, without jeopardizing production workers and machines. The compensation of forces affecting the system dynamics is therefore needed to avoid unwanted oscillations and tilting by sudden accelerations and decelerations. One general solution may be the implementation of external positioning elements to actively stabilize the system. To accomplish the approach, the dynamic behavior of a robotic arm and a mobile platform was investigated to develop the stabilization mechanism using multibody simulations. The methodology used was divided into two phases for each subsystem: their natural frequencies and modal shapes were obtained using experimental modal analyses. Then, based on these experimental results, multibody simulation models (MBS) were set up and their dynamical parameters adjusted. Their modal shapes together with their obtained natural frequencies allowed a quantitative and qualitative analysis. 
In summary, the MBS models were successfully validated with the real subsystems, with a maximum error of 15%. These models will serve as the basis for future steps in the design of the external actuators and their control strategy using a co-simulation tool.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=simulation&id=EJ1100967','ERIC'); return false;" href="https://eric.ed.gov/?q=simulation&id=EJ1100967"><span>Learning with STEM Simulations in the Classroom: Findings and Trends from a Meta-Analysis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>D'Angelo, Cynthia M.; Rutstein, Daisy; Harris, Christopher J.</p> <p>2016-01-01</p> <p>This article presents a summary of the findings of a systematic review and meta-analysis of the literature on computer-based interactive simulations for K-12 science, technology, engineering, and mathematics (STEM) learning topics. For achievement outcomes, simulations had a moderate to strong effect on student learning. Overall, simulations have…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1994STIN...9526171S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1994STIN...9526171S"><span>Intermediate-sized natural gas fueled carbonate fuel cell power plants</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sudhoff, Frederick A.; Fleming, Donald K.</p> <p>1994-04-01</p> <p>This executive summary of the report describes the accomplishments of the joint US Department of Energy's (DOE) Morgantown Energy Technology Center (METC) and M-C POWER Corporation's Cooperative Research and Development Agreement (CRADA) No. 
93-013. This study addresses the intermediate power plant size between 2 megawatts (MW) and 200 MW. A 25 MW natural-gas-fueled carbonate fuel cell power plant was chosen for this purpose. In keeping with recent designs, the fuel cell will operate under approximately three atmospheres of pressure. An expander/alternator is utilized to expand exhaust gas to atmospheric conditions and generate additional power. A steam-bottoming cycle is not included in this study because it is not believed to be cost effective for this system size. This study also compares the simplicity and accuracy of a spreadsheet-based simulation with that of a full Advanced System for Process Engineering (ASPEN) simulation. The simple spreadsheet simulation can be run entirely on a personal computer. This model can be made available to all users and is particularly advantageous to the small business user.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li class="active"><span>14</span></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_14 --> <div id="page_15" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li class="active"><span>15</span></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return 
showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="281"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.7462W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.7462W"><span>Scales of variability of black carbon plumes and their dependence on resolution of ECHAM6-HAM</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Weigum, Natalie; Stier, Philip; Schutgens, Nick; Kipling, Zak</p> <p>2015-04-01</p> <p>Prediction of the aerosol effect on climate depends on the ability of three-dimensional numerical models to accurately estimate aerosol properties. However, a limitation of traditional grid-based models is their inability to resolve variability on scales smaller than a grid box. Past research has shown that significant aerosol variability exists on scales smaller than these grid-boxes, which can lead to discrepancies between observations and aerosol models. The aim of this study is to understand how a global climate model's (GCM) inability to resolve sub-grid scale variability affects simulations of important aerosol features. This problem is addressed by comparing observed black carbon (BC) plume scales from the HIPPO aircraft campaign to those simulated by ECHAM-HAM GCM, and testing how model resolution affects these scales. This study additionally investigates how model resolution affects BC variability in remote and near-source regions. These issues are examined using three different approaches: comparison of observed and simulated along-flight-track plume scales, two-dimensional autocorrelation analysis, and 3-dimensional plume analysis. 
We find that the degree to which GCMs resolve variability can have a significant impact on the scales of BC plumes, and it is important for models to capture the scales of aerosol plume structures, which account for a large degree of aerosol variability. In this presentation, we will provide further results from the three analysis techniques along with a summary of the implications of these results for future aerosol model development.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://rosap.ntl.bts.gov/view/dot/9214','DOTNTL'); return false;" href="https://rosap.ntl.bts.gov/view/dot/9214"><span>Human Factors Experiments for Data Link : Extended Summary for Interim Reports 1 Through 4</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntlsearch.bts.gov/tris/index.do">DOT National Transportation Integrated Search</a></p> <p></p> <p>1974-04-01</p> <p>The report provides an extended summary of Interim Reports numbers 1 through 4, dealing with Human Factors Experiments for Data Link. The material summarized includes a description of two experiments run on the GAT-1 simulator at TSC using one-man cr...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA173875','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA173875"><span>Flight Simulation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1986-09-01</p> <p>TECHNICAL EVALUATION REPORT OF THE SYMPOSIUM ON "FLIGHT SIMULATION" A. M. Cook. NASA-Ames Research Center 1. INTRODUCTION This report evaluates the 67th...John C. Ousterberry* NASA Ames Research Center Moffett Field, California 94035, U.S.A. SUMMARY Early AGARD papers on manned flight simulation...and development simulators. VISUAL AND MOTION CUEING IN HELICOPTER SIMULATION Richard S. 
Bray NASA Ames Research Center Moffett Field, California</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/20234313','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/20234313"><span>Simulation-based assessment in anesthesiology: requirements for practical implementation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Boulet, John R; Murray, David J</p> <p>2010-04-01</p> <p>Simulations have taken a central role in the education and assessment of medical students, residents, and practicing physicians. The introduction of simulation-based assessments in anesthesiology, especially those used to establish various competencies, has demanded fairly rigorous studies concerning the psychometric properties of the scores. Most important, major efforts have been directed at identifying, and addressing, potential threats to the validity of simulation-based assessment scores. As a result, organizations that wish to incorporate simulation-based assessments into their evaluation practices can access information regarding effective test development practices, the selection of appropriate metrics, the minimization of measurement errors, and test score validation processes. The purpose of this article is to provide a broad overview of the use of simulation for measuring physician skills and competencies. For simulations used in anesthesiology, studies that describe advances in scenario development, the development of scoring rubrics, and the validation of assessment results are synthesized. Based on the summary of relevant research, psychometric requirements for practical implementation of simulation-based assessments in anesthesiology are forwarded. 
As technology expands and simulation-based education and evaluation take on a larger role in patient safety initiatives, the groundbreaking work conducted to date can serve as a model for those individuals and organizations that are responsible for developing, scoring, or validating simulation-based education and assessment programs in anesthesiology.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/447181-novel-use-geochemical-models-evaluating-treatment-trains-aqueous-radioactive-waste-streams','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/447181-novel-use-geochemical-models-evaluating-treatment-trains-aqueous-radioactive-waste-streams"><span>Novel use of geochemical models in evaluating treatment trains for aqueous radioactive waste streams</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Abitz, R.J.</p> <p>1996-12-31</p> <p>Thermodynamic geochemical models have been applied to assess the relative effectiveness of a variety of reagents added to aqueous waste streams for the removal of radioactive elements. Two aqueous waste streams were examined: effluent derived from the processing of uranium ore and irradiated uranium fuel rods. Simulations of the treatment train were performed to estimate the mass of reagents needed per kilogram of solution, identify pH regions corresponding to solubility minima, and predict the identity and quantity of precipitated solids. Results generated by the simulations include figures that chart the chemical evolution of the waste stream as reagents are added and summary tables that list mass balances for all reagents and radioactive elements of concern. 
Model results were used to set initial reagent levels for the treatment trains, minimizing the number of bench-scale tests required to bring the treatment train up to full-scale operation. Additionally, presentation of modeling results at public meetings helps to establish good faith between the federal government, industry, concerned citizens, and media groups. 18 refs., 3 figs., 1 tab.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2004SPIE.5423....1D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2004SPIE.5423....1D"><span>Prospects for composability of models and simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Davis, Paul K.; Anderson, Robert B.</p> <p>2004-08-01</p> <p>This paper is the summary of a recent RAND study done at the request of the U.S. Defense Modeling and Simulation Office (DMSO). Commissioned in recognition that the last decade's efforts by DoD to achieve model "composability" have had only limited success (e.g., HLA-mediated exercises), and that fundamental problems remain, the study surveyed the underlying problems that make composability difficult. It then went on to recommend a series of improvement measures for DMSO and other DoD offices to consider. One strong recommendation was that DoD back away from an earlier tendency toward overselling composability, moving instead to a more particularized approach in which composability is sought within domains where it makes most sense substantively. Another recommendation was that DoD needs to recognize the shortcomings of standard software-engineering paradigms when dealing with "models" rather than pure software. Beyond this, the study had concrete recommendations dealing with science and technology, the base of human capital, management, and infrastructure. 
Many recommendations involved the need to align more closely with cutting-edge technology and emerging standards in the private sector.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20040085690','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20040085690"><span>Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Young, Gerald W.; Clemons, Curtis B.</p> <p>2004-01-01</p> <p>The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. To these ends, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. 
A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012JHyd..460..103M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012JHyd..460..103M"><span>Estimation of dew yield from radiative condensers by means of an energy balance model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Maestre-Valero, J. F.; Ragab, R.; Martínez-Alvarez, V.; Baille, A.</p> <p>2012-08-01</p> <p>Summary: This paper presents an energy balance modelling approach to predict the nightly water yield and the surface temperature (Tf) of two passive radiative dew condensers (RDCs) tilted 30° from horizontal. One was fitted with a white hydrophilic polyethylene foil recommended for dew harvest and the other with a black polyethylene foil widely used in horticulture. The model was validated in south-eastern Spain by comparing the simulation outputs with field measurements of Tf and dew yield. The results indicate that the model is robust and accurate in reproducing the behaviour of the two RDCs, especially for Tf, whose estimates were very close to the observations. The results were somewhat less precise for dew yield, with a larger scatter around the 1:1 relationship. A sensitivity analysis showed that the simulated dew yield was highly sensitive to changes in relative humidity and downward longwave radiation. 
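The kind of nightly energy-balance bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not the paper's model: the latent-heat constant is standard, but the convective coefficient, the linearised convective term, and the constant-rate night integration are assumptions made for the sketch.

```python
# Illustrative nightly dew-yield estimate from a simplified energy balance.
# Coefficient values below are assumptions for the sketch, not the paper's.
L_V = 2.45e6        # latent heat of vaporisation, J/kg

def dew_rate(radiative_deficit, t_air, t_foil, h_conv=5.0):
    """Condensation rate in kg/m^2/s for one time step.

    radiative_deficit : net longwave loss of the foil, W/m^2 (positive = cooling)
    h_conv            : convective exchange coefficient, W/m^2/K (assumed value)
    """
    # Energy available for condensation = radiative loss minus convective gain
    available = radiative_deficit - h_conv * (t_air - t_foil)
    return max(0.0, available) / L_V

# Integrate over a 10-hour night at a constant rate:
# 1 kg/m^2 of condensed water corresponds to a 1 mm dew layer.
nightly_yield_mm = dew_rate(60.0, 12.0, 10.0) * 10 * 3600
```

With these (assumed) numbers the sketch yields roughly 0.7 mm per night, the right order of magnitude for passive condensers; in the paper the foil temperature itself is a model output rather than an input, so the real computation iterates the balance in time.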
The proposed approach provides a useful tool to water managers for quantifying the amount of dew that could be harvested as a valuable water resource in arid, semiarid and water stressed regions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19880011521','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19880011521"><span>Predictive monitoring research: Summary of the PREMON system</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Doyle, Richard J.; Sellers, Suzanne M.; Atkinson, David J.</p> <p>1987-01-01</p> <p>Traditional approaches to monitoring are proving inadequate in the face of two important issues: the dynamic adjustment of expectations about sensor values when the behavior of the device is too complex to enumerate beforehand, and the selective but effective interpretation of sensor readings when the number of sensors becomes overwhelming. This system addresses these issues by building an explicit model of a device and applying common-sense theories of physics to model causality in the device. 
The resulting causal simulation of the device supports planning decisions about how to efficiently yet reliably utilize a limited number of sensors to verify correct operation of the device.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=Telecom&pg=3&id=EJ273221','ERIC'); return false;" href="https://eric.ed.gov/?q=Telecom&pg=3&id=EJ273221"><span>Telecom Link--A Competitive Simulated Design Exercise.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Freeman, J.; Allen, J.</p> <p>1982-01-01</p> <p>Telecom link is a structured design exercise concerned with building a telecommunications link between London and Amsterdam. Designed for A-level physics, the simulation requires a minimum of 10 hours. Aims of the exercise, design specifications and technical aspects, and summaries of four possible technologies used in the simulation are…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016PhR...656....1R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016PhR...656....1R"><span>Collision partner selection schemes in DSMC: From micro/nano flows to hypersonic flows</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Roohi, Ehsan; Stefanov, Stefan</p> <p>2016-10-01</p> <p>The motivation of this review paper is to present a detailed summary of different collision models developed in the framework of the direct simulation Monte Carlo (DSMC) method. The emphasis is put on a newly developed collision model, i.e., the Simplified Bernoulli trial (SBT), which permits efficient low-memory simulation of rarefied gas flows. 
The paper starts with a brief review of the governing equations of the rarefied gas dynamics including Boltzmann and Kac master equations and reiterates that the linear Kac equation reduces to a non-linear Boltzmann equation under the assumption of molecular chaos. An introduction to the DSMC method is provided, and principles of collision algorithms in the DSMC are discussed. A distinction is made between those collision models that are based on classical kinetic theory (time counter, no time counter (NTC), and nearest neighbor (NN)) and the other class that could be derived mathematically from the Kac master equation (pseudo-Poisson process, ballot box, majorant frequency, null collision, Bernoulli trials scheme and its variants). To provide a deeper insight, the derivation of both collision models, either from the principles of the kinetic theory or the Kac master equation, is provided with sufficient details. Some discussions on the importance of subcells in the DSMC collision procedure are also provided and different types of subcells are presented. 
The paper then focuses on the simplified version of the Bernoulli trials algorithm (SBT) and presents a detailed summary of validation of the SBT family collision schemes (SBT on transient adaptive subcells: SBT-TAS, and intelligent SBT: ISBT) in a broad spectrum of rarefied gas-flow test cases, ranging from low-speed internal micro- and nanoflows to external hypersonic flows. The validation emphasizes, first, the accuracy of these new collision models and, second, that the SBT family of schemes, compared to other conventional and recent collision models, requires a smaller number of particles per cell to obtain sufficiently accurate solutions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMIN23C0102W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMIN23C0102W"><span>A Climate Statistics Tool and Data Repository</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wang, J.; Kotamarthi, V. R.; Kuiper, J. A.; Orr, A.</p> <p>2017-12-01</p> <p>Researchers at Argonne National Laboratory and collaborating organizations have generated regional scale, dynamically downscaled climate model output using Weather Research and Forecasting (WRF) version 3.3.1 at a 12km horizontal spatial resolution over much of North America. The WRF model is driven by boundary conditions obtained from three independent global scale climate models and two different future greenhouse gas emission scenarios, named representative concentration pathways (RCPs). The repository of results has a temporal resolution of three hours for all the simulations, includes more than 50 variables, is stored in Network Common Data Form (NetCDF) files, and the data volume is nearly 600Tb. 
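Computing a typical planning statistic over a gridded repository like this — for example, a per-grid-cell count of days above a temperature threshold — reduces to a reduction along the time axis. The sketch below is hypothetical (the repository's actual variable names, units, and tooling are not shown here); it assumes a (days, y, x) array of daily maximum temperature in Kelvin.

```python
import numpy as np

# Hypothetical sketch: count days with Tmax > 90 °F per grid cell, given a
# (days, y, x) array of daily maximum temperature in Kelvin. The array layout
# and units are assumptions for illustration, not the project's actual schema.

def days_over_90f(tmax_k):
    threshold_k = (90.0 - 32.0) * 5.0 / 9.0 + 273.15  # 90 °F expressed in Kelvin
    return (tmax_k > threshold_k).sum(axis=0)          # reduce along the time axis

# Toy 3-day, 2x2 grid: one cell exceeds 90 °F on two days
tmax = np.full((3, 2, 2), 300.0)        # ~80.3 °F everywhere
tmax[0, 0, 0] = 306.0                   # hot day at cell (0, 0)
tmax[2, 0, 0] = 307.0                   # another hot day at the same cell
counts = days_over_90f(tmax)            # counts[0, 0] == 2, all others 0
```

In practice the array would be read from the NetCDF files rather than built in memory, but the reduction itself is the same.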
A condensed 800Gb set of NetCDF files was made for selected variables most useful for climate-related planning, including daily precipitation, relative humidity, solar radiation, maximum temperature, minimum temperature, and wind. The WRF model simulations are conducted for three 10-year time periods (1995-2004, 2045-2054, and 2085-2094), and two future scenarios (RCP4.5 and RCP8.5). An open-source tool was coded using Python 2.7.8 and ESRI ArcGIS 10.3.1 programming libraries to parse the NetCDF files, compute summary statistics, and output results as GIS layers. Eight sets of summary statistics were generated as examples for the contiguous U.S. states and much of Alaska, including number of days over 90°F, number of days with a heat index over 90°F, heat waves, monthly and annual precipitation, drought, extreme precipitation, multi-model averages, and model bias. This paper will provide an overview of the project to generate the main and condensed data repositories, describe the Python tool and how to use it, present the GIS results of the computed examples, and discuss some of the ways they can be used for planning. The condensed climate data, Python tool, computed GIS results, and documentation of the work are shared on the Internet.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1406830','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1406830"><span>Transactive Control of Commercial Building HVAC Systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Corbin, Charles D.; Makhmalbaf, Atefe; Huang, Sen</p> <p></p> <p>This document details the development and testing of market-based transactive controls for building heating, ventilating and air conditioning (HVAC) systems. 
These controls are intended to serve the purposes of reducing electricity use through conservation, reducing peak building electric demand, and providing demand flexibility to assist with power system operations. This report is the summary of the first year of work conducted under Phase 1 of the Clean Energy and Transactive Campus Project. The methods and techniques described here were first investigated in simulation, and then subsequently deployed to a physical testbed on the Pacific Northwest National Laboratory (PNNL) campus for validation. In this report, we describe the models and control algorithms we have developed, testing of the control algorithms in simulation, and deployment to a physical testbed. Results from physical experiments support previous simulation findings, and provide insights for further improvement.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.4246M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.4246M"><span>Simulation of a Severe Autumn/Winter Drought in Eastern China by Regional Atmospheric Modeling System (RAMS)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Meng, Chunchun; Ma, Yaoming</p> <p>2016-04-01</p> <p>Compared with European Centre for Medium-Range Weather Forecasts (ERA-interim) Reanalysis data and Global Summary Of Day (GSOD) observation data, the outcomes from RAMS of the 2008/2009 severe autumn/winter drought in eastern China are analyzed in this study. The reanalysis data showed that most parts of north China were controlled by a northwest wind accompanied by cold air, while the warm, moist air from the south was too weak to meet the cold air, forming a circulation unfavorable for the formation of precipitation over eastern China. 
RAMS reproduces this atmospheric circulation very well, as it does the rainfall and air temperature over China and over the drought region. The simulated time series of precipitation and temperature also agree well with observations: the square of the correlation coefficient between simulations and observations exceeded 0.8. Although RAMS reproduces this drought fairly accurately, further work is needed to achieve a more realistic simulation. KEY WORDS: RAMS; severe drought; numerical simulation; atmospheric circulation; precipitation and air temperature</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/6870885-nuclear-effects-model-embedded-stochastically-simulation-nemesis-summary-report-technical-paper','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/6870885-nuclear-effects-model-embedded-stochastically-simulation-nemesis-summary-report-technical-paper"><span>Nuclear-effects model embedded stochastically in simulation (NEMESIS) summary report. Technical paper</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Youngren, M.A.</p> <p>1989-11-01</p> <p>An analytic probability model of tactical nuclear warfare in the theater is presented in this paper. The model addresses major problems associated with representing nuclear warfare in the theater. Current theater representations of a potential nuclear battlefield are developed in the context of low-resolution, theater-level models or scenarios. These models or scenarios provide insufficient resolution in time and space for modeling a nuclear exchange. 
The model presented in this paper handles the spatial uncertainty in potentially targeted unit locations by proposing two-dimensional multivariate probability models for the actual and perceived locations of units subordinate to the major (division-level) units represented in theater scenarios. The temporal uncertainty in the activities of interest represented in our theater-level Force Evaluation Model (FORCEM) is handled through probability models of the acquisition and movement of potential nuclear target units.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28603412','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28603412"><span>In vitro dose comparison of Respimat® inhaler with dry powder inhalers for COPD maintenance therapy.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ciciliani, Anna-Maria; Langguth, Peter; Wachtel, Herbert</p> <p>2017-01-01</p> <p>Combining in vitro mouth-throat deposition measurements, cascade impactor data and computational fluid dynamics (CFD) simulations, four different inhalers were compared which are indicated for chronic obstructive pulmonary disease (COPD) treatment. The Respimat inhaler, the Breezhaler, the Genuair, and the Ellipta were coupled to the idealized Alberta throat model. The modeled dose to the lung (mDTL) was collected downstream of the Alberta throat model using either a filter or a next generation impactor (NGI). Idealized breathing patterns from COPD patient groups - moderate and very severe COPD - were applied. Theoretical lung deposition patterns were assessed by an individual path model. For the Respimat the mDTL was found to be 59% (SD 5%) for the moderate COPD breathing pattern and 67% (SD 5%) for very severe COPD breathing pattern. The percentages refer to nominal dose (ND) in vitro. 
This is in the range of 44%-63% in vivo in COPD patients who display large individual variability. Breezhaler showed a mDTL of 43% (SD 2%) for moderate disease simulation and 51% (SD 2%) for very severe simulation. The corresponding results for Genuair are mDTL of 32% (SD 2%) for moderate and 42% (SD 1%) for very severe disease. Ellipta vilanterol particles showed a mDTL of 49% (SD 3%) for moderate and 55% (SD 2%) for very severe disease simulation, and Ellipta fluticasone particles showed a mDTL of 33% (SD 3%) and 41% (SD 2%), respectively for the two breathing patterns. Based on the throat output and average flows of the different inhalers, CFD simulations were performed. Laminar and turbulent steady flow calculations indicated that deposition occurs mainly in the small airways. In summary, Respimat showed the lowest amount of particles depositing in the mouth-throat model and the highest amount reaching all regions of the simulation lung model.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5457178','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5457178"><span>In vitro dose comparison of Respimat® inhaler with dry powder inhalers for COPD maintenance therapy</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Ciciliani, Anna-Maria; Langguth, Peter; Wachtel, Herbert</p> <p>2017-01-01</p> <p>Background Combining in vitro mouth–throat deposition measurements, cascade impactor data and computational fluid dynamics (CFD) simulations, four different inhalers were compared which are indicated for chronic obstructive pulmonary disease (COPD) treatment. Methods The Respimat inhaler, the Breezhaler, the Genuair, and the Ellipta were coupled to the idealized Alberta throat model. 
The modeled dose to the lung (mDTL) was collected downstream of the Alberta throat model using either a filter or a next generation impactor (NGI). Idealized breathing patterns from COPD patient groups – moderate and very severe COPD – were applied. Theoretical lung deposition patterns were assessed by an individual path model. Results and conclusion For the Respimat the mDTL was found to be 59% (SD 5%) for the moderate COPD breathing pattern and 67% (SD 5%) for very severe COPD breathing pattern. The percentages refer to nominal dose (ND) in vitro. This is in the range of 44%–63% in vivo in COPD patients who display large individual variability. Breezhaler showed a mDTL of 43% (SD 2%) for moderate disease simulation and 51% (SD 2%) for very severe simulation. The corresponding results for Genuair are mDTL of 32% (SD 2%) for moderate and 42% (SD 1%) for very severe disease. Ellipta vilanterol particles showed a mDTL of 49% (SD 3%) for moderate and 55% (SD 2%) for very severe disease simulation, and Ellipta fluticasone particles showed a mDTL of 33% (SD 3%) and 41% (SD 2%), respectively for the two breathing patterns. Based on the throat output and average flows of the different inhalers, CFD simulations were performed. Laminar and turbulent steady flow calculations indicated that deposition occurs mainly in the small airways. In summary, Respimat showed the lowest amount of particles depositing in the mouth–throat model and the highest amount reaching all regions of the simulation lung model. 
PMID:28603412</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3830866','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3830866"><span>Selecting Summary Statistics in Approximate Bayesian Computation for Calibrating Stochastic Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Burr, Tom</p> <p>2013-01-01</p> <p>Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the “go-to” option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example. 
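The dependence of ABC on the user-chosen summary statistic is easy to see in a minimal rejection-ABC sketch. The example below is illustrative only (a toy Normal-mean calibration, not the paper's mitochondrial DNA model): parameters drawn from the prior are kept when the simulated summary statistic lands close to the observed one.

```python
import numpy as np

# Minimal rejection-ABC sketch (illustrative toy problem, not the paper's
# pipeline): calibrate the mean of a Normal(mu, 1) model using the sample
# mean as the user-chosen summary statistic.
rng = np.random.default_rng(0)

observed = rng.normal(3.0, 1.0, size=200)      # "data" with true mu = 3
s_obs = observed.mean()                         # observed summary statistic

accepted = []
for _ in range(5000):
    mu = rng.uniform(-10, 10)                   # draw a candidate from the prior
    sim = rng.normal(mu, 1.0, size=200)         # run the stochastic model
    if abs(sim.mean() - s_obs) < 0.1:           # keep candidates with close summaries
        accepted.append(mu)

posterior_mean = np.mean(accepted)              # approximate posterior mean, near 3
```

Here the sample mean is a sufficient statistic for the Normal mean, so the approximation is good; choosing an uninformative summary (say, the sample minimum) would degrade the approximate posterior, which is exactly the sensitivity the paper investigates.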
PMID:24288668</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24288668','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24288668"><span>Selecting summary statistics in approximate Bayesian computation for calibrating stochastic models.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Burr, Tom; Skurikhin, Alexei</p> <p>2013-01-01</p> <p>Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the "go-to" option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. 
A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4994914','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4994914"><span>Constrained Maximum Likelihood Estimation for Model Calibration Using Summary-level Information from External Big Data Sources</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Chatterjee, Nilanjan; Chen, Yi-Hau; Maas, Paige; Carroll, Raymond J.</p> <p>2016-01-01</p> <p>Information from various public and private data sources of extremely large sample sizes are now increasingly available for research purposes. Statistical methods are needed for utilizing information from such big data sources while analyzing data from individual studies that may collect more detailed information required for addressing specific hypotheses of interest. In this article, we consider the problem of building regression models based on individual-level data from an “internal” study while utilizing summary-level information, such as information on parameters for reduced models, from an “external” big data source. We identify a set of very general constraints that link internal and external models. These constraints are used to develop a framework for semiparametric maximum likelihood inference that allows the distribution of covariates to be estimated using either the internal sample or an external reference sample. We develop extensions for handling complex stratified sampling designs, such as case-control sampling, for the internal study. Asymptotic theory and variance estimators are developed for each case. 
We use simulation studies and a real data application to assess the performance of the proposed methods in contrast to the generalized regression (GR) calibration methodology that is popular in the sample survey literature. PMID:27570323</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li class="active"><span>15</span></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_15 --> <div id="page_16" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li class="active"><span>16</span></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return showDiv("page_18");'>18</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="301"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013CoPhC.184.1729B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013CoPhC.184.1729B"><span>CalcHEP 3.4 for collider physics within and beyond the Standard Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Belyaev, Alexander; Christensen, Neil D.; Pukhov, Alexander</p> 
<p>2013-07-01</p> <p>We present version 3.4 of the CalcHEP software package which is designed for effective evaluation and simulation of high energy physics collider processes at parton level. The main features of CalcHEP are the computation of Feynman diagrams, integration over multi-particle phase space and event simulation at parton level. Its principal attractions are that it has: (a) an easy startup and usage even for those who are not familiar with CalcHEP and programming; (b) a friendly and convenient graphical user interface (GUI); (c) the option for the user to easily modify a model or introduce a new model by either using the graphical interface or by using an external package with the possibility of cross checking the results in different gauges; (d) a batch interface which allows the user to perform very complicated and tedious calculations connecting production and decay modes for processes with many particles in the final state. With this feature set, CalcHEP can efficiently perform calculations with a high level of automation from a theory in the form of a Lagrangian down to phenomenology in the form of cross sections, parton level event simulation and various kinematical distributions. In this paper we report on the new features of CalcHEP 3.4 which improve the power of our package to be an effective tool for the study of modern collider phenomenology. Program summary: Program title: CalcHEP Catalogue identifier: AEOV_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOV_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 78535 No. of bytes in distributed program, including test data, etc.: 818061 Distribution format: tar.gz Programming language: C. Computer: PC, MAC, Unix Workstations. Operating system: Unix. 
RAM: Depends on process under study Classification: 4.4, 5. External routines: X11 Nature of problem: Implement new models of particle interactions. Generate Feynman diagrams for a physical process in any implemented theoretical model. Integrate phase space for Feynman diagrams to obtain cross sections or particle widths taking into account kinematical cuts. Simulate collisions at modern colliders and generate respective unweighted events. Mix events for different subprocesses and connect them with the decays of unstable particles. Solution method: Symbolic calculations; squared Feynman diagram approach; Vegas Monte Carlo algorithm. Restrictions: Up to 2→4 production (1→5 decay) processes are realistic on typical computers. Higher multiplicities sometimes possible for specific 2→5 and 2→6 processes. Unusual features: Graphical user interface, symbolic algebra calculation of squared matrix element, parallelization on a PBS cluster. Running time: Depends strongly on the process. For a typical 2→2 process it takes seconds. For 2→3 processes the typical running time is of the order of minutes. For higher multiplicities it could take much longer.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012CoPhC.183.1793S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012CoPhC.183.1793S"><span>LAMMPS framework for dynamic bonding and an application modeling DNA</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Svaneborg, Carsten</p> <p>2012-08-01</p> <p>We have extended the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) to support directional bonds and dynamic bonding. The framework supports stochastic formation of new bonds, breakage of existing bonds, and conversion between bond types. 
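The bookkeeping behind stochastic bond formation with a functionality cap can be sketched generically. This is plain Python in the spirit of the framework described, not the actual LAMMPS extension API; the bond cap, formation probability, and bead count are assumptions for the sketch.

```python
import random

# Generic sketch of stochastic bonding with a functionality cap
# (illustrative only; not the LAMMPS extension's actual interface).
random.seed(1)

MAX_BONDS = 2                  # maximal functionality per bead (assumed)
P_FORM = 0.5                   # formation probability per attempt (assumed)

bonds = {i: set() for i in range(6)}   # adjacency: bead -> set of bonded beads

def try_bond(i, j):
    """Stochastically form bond i-j if both beads still have free valence."""
    if len(bonds[i]) < MAX_BONDS and len(bonds[j]) < MAX_BONDS:
        if random.random() < P_FORM:
            bonds[i].add(j)
            bonds[j].add(i)
            return True
    return False

def break_bond(i, j):
    """Remove an existing bond (angular/dihedral cleanup would go here)."""
    bonds[i].discard(j)
    bonds[j].discard(i)

for i, j in [(0, 1), (1, 2), (1, 3), (2, 3)]:
    try_bond(i, j)
# Whatever the random outcomes, no bead ever exceeds MAX_BONDS neighbours.
```

In the real framework the same cap logic runs inside the parallel simulation loop, and breaking a bond also removes any angular and dihedral interactions that involve it.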
Bond formation can be controlled to limit the maximal functionality of a bead with respect to various bond types. Concomitant with the bond dynamics, angular and dihedral interactions are dynamically introduced between newly connected triplets and quartets of beads, where the interaction type is determined from the local pattern of bead and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework. Catalogue identifier: AEME_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEME_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence No. of lines in distributed program, including test data, etc.: 2 243 491 No. of bytes in distributed program, including test data, etc.: 771 Distribution format: tar.gz Programming language: C++ Computer: Single and multiple core servers Operating system: Linux/Unix/Windows Has the code been vectorized or parallelized?: Yes. The code has been parallelized by the use of MPI directives. RAM: 1 Gb Classification: 16.11, 16.12 Nature of problem: Simulating coarse-grain models capable of chemistry e.g. DNA hybridization dynamics. Solution method: Extending LAMMPS to handle dynamic bonding and directional bonds. Unusual features: Allows bonds to be created and broken while angular and dihedral interactions are kept consistent. Additional comments: The distribution file for this program is approximately 36 Mbytes and therefore is not delivered directly when download or E-mail is requested. Instead an html file giving details of how the program can be obtained is sent. Running time: Hours to days. 
The examples provided in the distribution take just seconds to run.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5592947','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5592947"><span>Accelerating Wright–Fisher Forward Simulations on the Graphics Processing Unit</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Lawrie, David S.</p> <p>2017-01-01</p> <p>Forward Wright–Fisher simulations are powerful in their ability to model complex demography and selection scenarios, but suffer from slow execution on the Central Processor Unit (CPU), thus limiting their usefulness. However, the single-locus Wright–Fisher forward algorithm is exceedingly parallelizable, with many steps that are so-called “embarrassingly parallel,” consisting of a vast number of individual computations that are all independent of each other and thus capable of being performed concurrently. The rise of modern Graphics Processing Units (GPUs) and programming languages designed to leverage the inherent parallel nature of these processors have allowed researchers to dramatically speed up many programs that have such high arithmetic intensity and intrinsic concurrency. The presented GPU Optimized Wright–Fisher simulation, or “GO Fish” for short, can be used to simulate arbitrary selection and demographic scenarios while running over 250-fold faster than its serial counterpart on the CPU. Even modest GPU hardware can achieve an impressive speedup of over two orders of magnitude. 
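The single-generation update that makes this algorithm so parallelizable can be sketched in a few lines. The NumPy version below is an illustrative, generic single-locus Wright–Fisher step (genic selection followed by binomial drift), vectorized over many independent loci; it is not code from GO Fish itself, and all names and parameter values are invented for illustration:

```python
import numpy as np

def wf_generation(freqs, N, s=0.0, rng=None):
    """One Wright-Fisher generation for many independent loci at once.

    freqs : array of allele frequencies, one entry per locus
    N     : diploid population size (2N gene copies are resampled)
    s     : selection coefficient (genic selection, illustrative)
    """
    rng = np.random.default_rng() if rng is None else rng
    # Deterministic change from selection, then binomial sampling (drift).
    p = freqs * (1 + s) / (1 + s * freqs)
    return rng.binomial(2 * N, p) / (2 * N)

# Every locus is updated independently of every other -- the
# "embarrassingly parallel" structure that maps naturally onto GPU threads.
f = np.full(100_000, 0.5)
for _ in range(100):
    f = wf_generation(f, N=1000, s=0.01)
```

Each locus touches only its own frequency, so a GPU implementation can assign one thread per locus with no synchronization inside a generation.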
With simulations so accelerated, one can not only do quick parametric bootstrapping of previously estimated parameters, but also use simulated results to calculate the likelihoods and summary statistics of demographic and selection models against real polymorphism data, all without restricting the demographic and selection scenarios that can be modeled or requiring approximations to the single-locus forward algorithm for efficiency. Further, as many of the parallel programming techniques used in this simulation can be applied to other computationally intensive algorithms important in population genetics, GO Fish serves as an exciting template for future research into accelerating computation in evolution. GO Fish is part of the Parallel PopGen Package available at: http://dl42.github.io/ParallelPopGen/. PMID:28768689</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19780013662','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19780013662"><span>Use of the Marshall Space Flight Center solar simulator in collector performance evaluation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Humphries, W. R.</p> <p>1978-01-01</p> <p>Actual measured values from simulator checkout tests are detailed. Problems encountered during initial startup are discussed and solutions described. Techniques utilized to evaluate collector performance from simulator test data are given. Performance data generated in the simulator are compared to equivalent data generated during natural outdoor testing. 
Finally, a summary of collector performance parameters generated to date as a result of simulator testing are given.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013BGD....1013097H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013BGD....1013097H"><span>Technical Note: Approximate Bayesian parameterization of a complex tropical forest model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.</p> <p>2013-08-01</p> <p>Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. 
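The simulation-based likelihood approximation described in this abstract can be made concrete with a small sketch. The code below follows the generic "synthetic likelihood" recipe (simulate replicates at a parameter value, fit a Gaussian to the summary statistics, score the observed summaries); the function names and the toy model are invented for illustration and are not the FORMIND interface:

```python
import numpy as np

def synthetic_loglik(simulate, theta, s_obs, n_rep=200, rng=None):
    """Parametric (Gaussian) likelihood approximation from stochastic runs:
    simulate n_rep replicates at theta, fit a normal to each summary
    statistic, and score the observed summaries. Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    sims = np.array([simulate(theta, rng) for _ in range(n_rep)])
    mu, sd = sims.mean(axis=0), sims.std(axis=0, ddof=1) + 1e-12
    # Independent-Gaussian log-density of the observed summaries.
    return -0.5 * np.sum(((s_obs - mu) / sd) ** 2 + np.log(2 * np.pi * sd**2))

# Toy "model": the summaries are noisy observations of the parameter itself,
# so the approximate likelihood should peak near the true value.
toy = lambda theta, rng: theta + rng.normal(0.0, 1.0, size=2)
rng = np.random.default_rng(42)
ll_true = synthetic_loglik(toy, 3.0, np.array([3.1, 2.9]), rng=rng)
ll_off  = synthetic_loglik(toy, 8.0, np.array([3.1, 2.9]), rng=rng)
assert ll_true > ll_off
```

Dropping such a function in place of an analytical likelihood is what allows a conventional MCMC to sample the posterior of a stochastic simulator.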
We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences of this approach to Approximate Bayesian Computing (ABC), another commonly used method to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can successfully be applied to process-based models of high complexity. The methodology is particularly suited to heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models in ecology and evolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20050185206','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20050185206"><span>Development of the Runway Incursion Advisory and Alerting System (RIAAS): Research Summary</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Jones, Denise R. 
(Technical Monitor); Cassell, Rick</p> <p>2005-01-01</p> <p>This report summarizes research conducted on an aircraft based Runway Incursion Advisory and Alerting System (RIAAS) developed under a cooperative agreement between Rannoch Corporation and the NASA Langley Research Center. A summary of RIAAS is presented along with results from simulation and flight testing, safety benefits, and key technical issues.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/15899310','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/15899310"><span>NONMEMory: a run management tool for NONMEM.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Wilkins, Justin J</p> <p>2005-06-01</p> <p>NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. 
NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19980203040','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19980203040"><span>Atmospheric Probe Model: Construction and Wind Tunnel Tests</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Vogel, Jerald M.</p> <p>1998-01-01</p> <p>The material contained in this document represents a summary of the results of a low speed wind tunnel test program to determine the performance of an atmospheric probe at low speed. The probe configuration tested consists of a 2/3 scale model constructed from a combination of hard maple wood and aluminum stock. The model design includes approximately 130 surface static pressure taps. Additional hardware incorporated in the baseline model provides a mechanism for simulating external and internal trailing edge split flaps for probe flow control. Test matrix parameters include probe side slip angle, external/internal split flap deflection angle, and trip strip applications. 
Test output database includes surface pressure distributions on both inner and outer annular wings and probe center line velocity distributions from forward probe to aft probe locations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3035647','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3035647"><span>A General Model of Synaptic Transmission and Short-Term Plasticity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Pan, Bin; Zucker, Robert S.</p> <p>2011-01-01</p> <p>SUMMARY Some synapses transmit strongly to action potentials (APs), but weaken with repeated activation; others transmit feebly at first, but strengthen with sustained activity. We measured synchronous and asynchronous transmitter release at “phasic” crayfish neuromuscular junctions (NMJs) showing depression and at facilitating “tonic” junctions, and define the kinetics of depression and facilitation. We offer a comprehensive model of presynaptic processes, encompassing mobilization of reserve vesicles, priming of docked vesicles, their association with Ca2+ channels, and refractoriness of release sites, while accounting for data on presynaptic buffers governing Ca2+ diffusion. Model simulations reproduce many experimentally defined aspects of transmission and plasticity at these synapses. Their similarity to vertebrate central synapses suggests that the model might be of general relevance to synaptic transmission. 
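As a rough illustration of how depression and facilitation kinetics trade off, the sketch below implements a generic Tsodyks–Markram-style two-variable update. It is far simpler than the comprehensive vesicle-pool model the paper proposes, and all parameter values are invented for illustration:

```python
import math

def stp_train(spike_times, U=0.2, tau_rec=0.5, tau_fac=0.3):
    """Minimal two-variable short-term plasticity sketch (not the paper's
    full presynaptic model): x tracks available resources (depression),
    u tracks release probability (facilitation). Returns relative
    amplitudes of the responses to each spike in the train."""
    x, u, last = 1.0, 0.0, None
    amps = []
    for t in spike_times:
        if last is not None:
            dt = t - last
            x = 1 - (1 - x) * math.exp(-dt / tau_rec)   # resources recover
            u = U + (u - U) * math.exp(-dt / tau_fac)   # facilitation decays toward U
        u = u + U * (1 - u)      # facilitation jump at the spike
        amps.append(u * x)       # released fraction ~ response amplitude
        x -= u * x               # deplete resources
        last = t
    return amps

# A 20 Hz train: with small U facilitation dominates ("tonic"-like
# strengthening); with large U the same rules give depression ("phasic"-like).
train = [i * 0.05 for i in range(5)]
tonic = stp_train(train, U=0.1)
phasic = stp_train(train, U=0.7)
```

The point of the sketch is that one mechanism set, differing only in baseline release probability, reproduces both facilitating and depressing phenotypes, which mirrors the tonic/phasic contrast studied in the paper.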
PMID:19477155</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70032426','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70032426"><span>GSTARS computer models and their applications, Part II: Applications</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Simoes, F.J.M.; Yang, C.T.</p> <p>2008-01-01</p> <p>In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here in more detail. ©
2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012CoPhC.183..125N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012CoPhC.183..125N"><span>An open-source library for the numerical modeling of mass-transfer in solid oxide fuel cells</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Novaresio, Valerio; García-Camprubí, María; Izquierdo, Salvador; Asinari, Pietro; Fueyo, Norberto</p> <p>2012-01-01</p> <p>The generation of direct current electricity using solid oxide fuel cells (SOFCs) involves several interplaying transport phenomena. Their simulation is crucial for the design and optimization of reliable and competitive equipment, and for the eventual market deployment of this technology. An open-source library for the computational modeling of mass-transport phenomena in SOFCs is presented in this article. It includes several multicomponent mass-transport models ( i.e. Fickian, Stefan-Maxwell and Dusty Gas Model), which can be applied both within porous media and in porosity-free domains, and several diffusivity models for gases. The library has been developed for its use with OpenFOAM ®, a widespread open-source code for fluid and continuum mechanics. The library can be used to model any fluid flow configuration involving multicomponent transport phenomena and it is validated in this paper against the analytical solution of one-dimensional test cases. In addition, it is applied for the simulation of a real SOFC and further validated using experimental data. 
Program summary Program title: multiSpeciesTransportModels Catalogue identifier: AEKB_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKB_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License No. of lines in distributed program, including test data, etc.: 18 140 No. of bytes in distributed program, including test data, etc.: 64 285 Distribution format: tar.gz Programming language: C++ Computer: Any x86 (the instructions reported in the paper consider only the 64 bit case for the sake of simplicity) Operating system: Generic Linux (the instructions reported in the paper consider only the open-source Ubuntu distribution for the sake of simplicity) Classification: 12 External routines: OpenFOAM® (version 1.6-ext) (http://www.extend-project.de) Nature of problem: This software provides a library of models for the simulation of the steady state mass and momentum transport in a multi-species gas mixture, possibly in a porous medium. The software is particularly designed to be used as the mass-transport library for the modeling of solid oxide fuel cells (SOFC). When supplemented with other sub-models, such as thermal and charge-transport ones, it allows the prediction of the cell polarization curve and hence the cell performance. Solution method: Standard finite volume method (FVM) is used for solving all the conservation equations. The pressure-velocity coupling is solved using the SIMPLE algorithm (possibly adding a porous drag term if required). The mass transport can be calculated using different alternative models, namely Fick, Maxwell-Stefan or dusty gas model. The code adopts a segregated method to solve the resulting linear system of equations. The different regions of the SOFC, namely gas channels, electrodes and electrolyte, are solved independently, and coupled through boundary conditions. 
Restrictions: When extremely large species fluxes are considered, the current implementation of the Neumann and Robin boundary conditions does not prevent negative values of molar and/or mass fractions, which ultimately leads to numerical instability. However, this never happened in the documented runs. These boundary conditions could eventually be reformulated to become more robust. Running time: From seconds to hours depending on the mesh size and number of species. For example, on a 64 bit machine with Intel Core Duo T8300 and 3 GBytes of RAM, the provided test run requires less than 1 second.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009PhDT.......206D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009PhDT.......206D"><span>An approach for modelling snowcover ablation and snowmelt runoff in cold region environments</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dornes, Pablo Fernando</p> <p></p> <p>Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determination of the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work is an attempt to apply a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data while retaining integrity in the process representations. The modelling strategy is based on the incorporation of both detailed process understanding and inputs along with information gained from observations of basin-wide streamflow phenomena; essentially a combination of deductive and inductive approaches. 
The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models, a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation. Application of the same modelling approach at a larger scale using the same landscape based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with minimal calibration. Verification of this approach in an arctic basin illustrated that landscape based parameters are a feasible regionalisation framework for distributed and physically based models. 
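For contrast with the distributed, physically based approach advocated here, the simplest aggregated alternative is a temperature-index (degree-day) melt routine, the kind of lumped scheme that cannot represent differential melt rates across landscape units. A minimal sketch, with invented parameter values, is:

```python
def degree_day_melt(swe, temps, ddf=3.5, t_base=0.0):
    """Temperature-index (degree-day) snowmelt sketch: daily melt is
    proportional to degrees above a base temperature, capped by the
    remaining snow water equivalent (SWE). ddf is in mm / (degC * day);
    all parameter values here are illustrative, not calibrated."""
    melt_series = []
    for t in temps:
        melt = min(swe, ddf * max(t - t_base, 0.0))  # cannot melt more than SWE
        swe -= melt
        melt_series.append(melt)
    return melt_series, swe

# One week of daily mean temperatures (degC) acting on 20 mm of SWE.
melt, remaining = degree_day_melt(20.0, [-2.0, 1.0, 3.0, 4.0, 2.0, 0.5, 5.0])
```

Because a single index and a single SWE store stand in for the whole basin, such a routine reproduces bulk ablation but not the complex snowmelt runoff dynamics that the distributed simulations capture.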
In summary, the proposed modelling philosophy, based on the combination of an inductive and deductive reasoning, is a suitable strategy for reliable predictions of snowcover ablation and snowmelt runoff in cold regions and complex environments.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014CoPhC.185.2350J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014CoPhC.185.2350J"><span>Sailfish: A flexible multi-GPU implementation of the lattice Boltzmann method</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Januszewski, M.; Kostur, M.</p> <p>2014-09-01</p> <p>We present Sailfish, an open source fluid simulation package implementing the lattice Boltzmann method (LBM) on modern Graphics Processing Units (GPUs) using CUDA/OpenCL. We take a novel approach to GPU code implementation and use run-time code generation techniques and a high level programming language (Python) to achieve state of the art performance, while allowing easy experimentation with different LBM models and tuning for various types of hardware. We discuss the general design principles of the code, scaling to multiple GPUs in a distributed environment, as well as the GPU implementation and optimization of many different LBM models, both single component (BGK, MRT, ELBM) and multicomponent (Shan-Chen, free energy). The paper also presents results of performance benchmarks spanning the last three NVIDIA GPU generations (Tesla, Fermi, Kepler), which we hope will be useful for researchers working with this type of hardware and similar codes. Catalogue identifier: AETA_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AETA_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU Lesser General Public License, version 3 No. 
of lines in distributed program, including test data, etc.: 225864 No. of bytes in distributed program, including test data, etc.: 46861049 Distribution format: tar.gz Programming language: Python, CUDA C, OpenCL. Computer: Any with an OpenCL or CUDA-compliant GPU. Operating system: No limits (tested on Linux and Mac OS X). RAM: Hundreds of megabytes to tens of gigabytes for typical cases. Classification: 12, 6.5. External routines: PyCUDA/PyOpenCL, Numpy, Mako, ZeroMQ (for multi-GPU simulations), scipy, sympy Nature of problem: GPU-accelerated simulation of single- and multi-component fluid flows. Solution method: A wide range of relaxation models (LBGK, MRT, regularized LB, ELBM, Shan-Chen, free energy, free surface) and boundary conditions within the lattice Boltzmann method framework. Simulations can be run in single or double precision using one or more GPUs. Restrictions: The lattice Boltzmann method works for low Mach number flows only. Unusual features: The actual numerical calculations run exclusively on GPUs. The numerical code is built dynamically at run-time in CUDA C or OpenCL, using templates and symbolic formulas. The high-level control of the simulation is maintained by a Python process. Additional comments: The distribution file for this program is over 45 Mbytes and therefore is not delivered directly when Download or Email is requested. Instead an html file giving details of how the program can be obtained is sent. 
Running time: Problem-dependent, typically minutes (for small cases or short simulations) to hours (large cases or long simulations).</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4222655','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4222655"><span>Inference of Epidemiological Dynamics Based on Simulated Phylogenies Using Birth-Death and Coalescent Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Boskova, Veronika; Bonhoeffer, Sebastian; Stadler, Tanja</p> <p>2014-01-01</p> <p>Quantifying epidemiological dynamics is crucial for understanding and forecasting the spread of an epidemic. The coalescent and the birth-death model are used interchangeably to infer epidemiological parameters from the genealogical relationships of the pathogen population under study, which in turn are inferred from the pathogen genetic sequencing data. To compare the performance of these widely applied models, we performed a simulation study. We simulated phylogenetic trees under the constant rate birth-death model and the coalescent model with a deterministic exponentially growing infected population. For each tree, we re-estimated the epidemiological parameters using both a birth-death and a coalescent based method, implemented as an MCMC procedure in BEAST v2.0. In our analyses that estimate the growth rate of an epidemic based on simulated birth-death trees, the point estimates such as the maximum a posteriori/maximum likelihood estimates are not very different. However, the estimates of uncertainty are very different. The birth-death model had a higher coverage than the coalescent model, i.e. contained the true value in the highest posterior density (HPD) interval more often (2–13% vs. 31–75% error). 
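The stochastic population fluctuations hypothesized to bias the coalescent are easy to reproduce. The sketch below is a generic Gillespie-style simulation of the constant-rate birth-death model (not the BEAST implementation): replicate outbreaks with the same positive growth rate vary widely in size, and a large fraction die out entirely, in contrast to the deterministic exponential growth the coalescent assumes.

```python
import random

def birth_death(birth, death, i0=1, t_max=10.0, rng=None):
    """Gillespie simulation of a constant-rate birth-death epidemic.
    Returns the number of infecteds at t_max (0 if the outbreak died out).
    Illustrative sketch of stochastic population-size trajectories."""
    rng = rng or random.Random()
    n, t = i0, 0.0
    while n > 0:
        rate = n * (birth + death)            # total event rate
        t += rng.expovariate(rate)            # time to next event
        if t >= t_max:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
    return n

rng = random.Random(1)
# Many replicate outbreaks with growth rate r = birth - death = 0.5:
finals = [birth_death(1.0, 0.5, t_max=5.0, rng=rng) for _ in range(500)]
extinct = sum(f == 0 for f in finals) / len(finals)
# Branching-process theory gives an ultimate extinction probability of
# death/birth = 0.5 here, so roughly half the outbreaks die out despite
# a positive deterministic growth rate.
```

This variability in early-outbreak trajectories is precisely what a deterministic exponentially growing population model cannot represent.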
The coverage of the coalescent decreases with decreasing basic reproductive ratio and increasing sampling probability of infecteds. We hypothesize that the biases in the coalescent are due to the assumption of deterministic rather than stochastic population size changes. Both methods performed reasonably well when analyzing trees simulated under the coalescent. The methods can also identify other key epidemiological parameters as long as one of the parameters is fixed to its true value. In summary, when using genetic data to estimate epidemic dynamics, our results suggest that the birth-death method will be less sensitive to population fluctuations of early outbreaks than the coalescent method that assumes a deterministic exponentially growing infected population. PMID:25375100</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19840043466&hterms=Grid&qs=N%3D0%26Ntk%3DAll%26Ntx%3Dmode%2Bmatchall%26Ntt%3DGrid%2B2.0','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19840043466&hterms=Grid&qs=N%3D0%26Ntk%3DAll%26Ntx%3Dmode%2Bmatchall%26Ntt%3DGrid%2B2.0"><span>A nested-grid limited-area model for short term weather forecasting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Wong, V. C.; Zack, J. W.; Kaplan, M. L.; Coats, G. D.</p> <p>1983-01-01</p> <p>The present investigation is concerned with a mesoscale atmospheric simulation system (MASS), incorporating the sigma-coordinate primitive equations. The present version of this model (MASS 3.0) has 14 vertical layers, with the upper boundary at 100 mb. There are 128 x 96 grid points in each layer. The earlier version of this model (MASS 2.0) has been described by Kaplan et al. (1982). 
The current investigation provides a summary of major revisions to that version and a description of the parameterization schemes which are presently included in the model. The planetary boundary layer (PBL) is considered, taking into account aspects of generalized similarity theory and free convection, the surface energy budget, the surface moisture budget, and prognostic equations for the depth h of the PBL. A cloud model is discussed, giving attention to stable precipitation, and cumulus convection.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24772029','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24772029"><span>Summary on several key techniques in 3D geological modeling.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Mei, Gang</p> <p>2014-01-01</p> <p>Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. 
In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27591082','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27591082"><span>HAPRAP: a haplotype-based iterative method for statistical fine mapping using GWAS summary statistics.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zheng, Jie; Rodriguez, Santiago; Laurin, Charles; Baird, Denis; Trela-Larsen, Lea; Erzurumluoglu, Mesut A; Zheng, Yi; White, Jon; Giambartolomei, Claudia; Zabaneh, Delilah; Morris, Richard; Kumari, Meena; Casas, Juan P; Hingorani, Aroon D; Evans, David M; Gaunt, Tom R; Day, Ian N M</p> <p>2017-01-01</p> <p>Fine mapping is a widely used approach for identifying the causal variant(s) at disease-associated loci. Standard methods (e.g. multiple regression) require individual level genotypes. Recent fine mapping methods using summary-level data require the pairwise correlation coefficients ([Formula: see text]) of the variants. However, haplotypes rather than pairwise [Formula: see text], are the true biological representation of linkage disequilibrium (LD) among multiple loci. In this article, we present an empirical iterative method, HAPlotype Regional Association analysis Program (HAPRAP), that enables fine mapping using summary statistics and haplotype information from an individual-level reference panel. Simulations with individual-level genotypes show that the results of HAPRAP and multiple regression are highly consistent. In simulation with summary-level data, we demonstrate that HAPRAP is less sensitive to poor LD estimates. 
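The gap between pairwise LD summaries and full haplotype information is easy to make concrete. The sketch below computes the pairwise r² used by standard summary-statistic fine mapping from a phased 0/1 haplotype panel; this is a generic textbook computation, not HAPRAP code, and the toy panel is invented:

```python
def pairwise_r2(haplotypes, i, j):
    """LD r^2 between sites i and j of phased 0/1 haplotype strings.
    Pairwise r^2 is the summary used by standard fine-mapping methods;
    the haplotype frequencies themselves carry strictly more information
    about multi-locus LD than any collection of pairwise r^2 values."""
    n = len(haplotypes)
    pA = sum(h[i] == "1" for h in haplotypes) / n
    pB = sum(h[j] == "1" for h in haplotypes) / n
    pAB = sum(h[i] == "1" and h[j] == "1" for h in haplotypes) / n
    D = pAB - pA * pB  # coefficient of linkage disequilibrium
    return D * D / (pA * (1 - pA) * pB * (1 - pB))

# A toy phased reference panel of six haplotypes over two SNPs:
panel = ["11", "11", "10", "01", "00", "00"]
r2 = pairwise_r2(panel, 0, 1)  # = 1/9 for this panel
```

Collapsing a reference panel to the matrix of such pairwise values discards phase patterns across three or more loci, which is the information a haplotype-based method can exploit.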
In a parametric simulation using Genetic Investigation of ANthropometric Traits height data, HAPRAP performs well with a small training sample size (N < 2000) while other methods become suboptimal. Moreover, HAPRAP's performance is not affected substantially by single nucleotide polymorphisms (SNPs) with low minor allele frequencies. We applied the method to existing quantitative trait and binary outcome meta-analyses (human height, QTc interval and gallbladder disease); all previously reported association signals were replicated and two additional variants were independently associated with human height. Due to the growing availability of summary level data, the value of HAPRAP is likely to increase markedly for future analyses (e.g. functional prediction and identification of instruments for Mendelian randomization). The HAPRAP package and documentation are available at http://apps.biocompute.org.uk/haprap/. Contact: jie.zheng@bristol.ac.uk or tom.gaunt@bristol.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.usgs.gov/sir/2015/5182/sir20155182.pdf','USGSPUBS'); return false;" href="https://pubs.usgs.gov/sir/2015/5182/sir20155182.pdf"><span>Summary of U.S. Geological Survey studies conducted in cooperation with the Citizen Potawatomi Nation, central Oklahoma, 2011–14</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Andrews, William J.; Becker, Carol J.; Ryter, Derek W.; Smith, S. Jerrod</p> <p>2016-01-19</p> <p>Numerical groundwater-flow models were created to characterize flow systems in aquifers underlying this study area and areas of particular interest within the study area. 
Those models were used to estimate sustainable groundwater yields from parts of the North Canadian River alluvial aquifer, characterize groundwater/surface-water interactions, and estimate the effects of a 10-year simulated drought on streamflows and water levels in alluvial and bedrock aquifers. Pumping of wells at the Iron Horse Industrial Park was estimated to cause negligible infiltration of water from the adjoining North Canadian River. A 10-year simulated drought of 50 percent of normal recharge was tested for the period 1990–2000. For this period, the total amount of groundwater in storage was estimated to decrease by 8.6 percent in the North Canadian River alluvial aquifer and approximately 0.2 percent in the Central Oklahoma aquifer, and groundwater flow to streams was estimated to decrease by 28–37 percent. This volume of groundwater loss showed that the Central Oklahoma aquifer is a bedrock aquifer that has relatively low rates of recharge from the land surface. The simulated drought decreased simulated streamflow, composed of base flow, in the North Canadian River at Shawnee, Okla., which did not recover to predrought conditions until the relatively wet year of 2007 after the simulated drought period.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/18654689','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/18654689"><span>Interpretation of diffusion coefficients in nanostructured materials from random walk numerical simulation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Anta, Juan A; Mora-Seró, Iván; Dittrich, Thomas; Bisquert, Juan</p> <p>2008-08-14</p> <p>We make use of the numerical simulation random walk (RWNS) method to compute the "jump" diffusion coefficient of electrons in nanostructured materials via mean-square displacement. 
First, a summary of analytical results is given that relates the diffusion coefficient obtained from RWNS to those in the multiple-trapping (MT) and hopping models. Simulations are performed in a three-dimensional lattice of trap sites with energies distributed according to an exponential distribution and with a step-function distribution centered at the Fermi level. It is observed that once the stationary state is reached, the ensemble of particles follows Fermi-Dirac statistics with a well-defined Fermi level. In this stationary situation the diffusion coefficient obeys the theoretical predictions so that RWNS effectively reproduces the MT model. Mobilities can also be computed when an electrical bias is applied, and they are observed to comply with the Einstein relation when compared with steady-state diffusion coefficients. The evolution of the system towards the stationary situation is also studied. When the diffusion coefficients are monitored along simulation time, a transition from anomalous to trap-limited transport is observed. The nature of this transition is discussed in terms of the evolution of the electron distribution and the Fermi level. All these results will facilitate the use of RW simulation and related methods to interpret steady-state as well as transient experimental techniques.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19720021013','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19720021013"><span>G and C boost and abort study summary, exhibit B</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Backman, H. D.</p> <p>1972-01-01</p> <p>A six degree of freedom simulation of rigid vehicles was developed to study space shuttle vehicle boost-abort guidance and control techniques. The simulation was described in detail as an all digital program and as a hybrid program.
Only the digital simulation was implemented. The equations verified in the digital simulation were adapted for use in the hybrid simulation. Study results were obtained from four abort cases using the digital program.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li class="active"><span>16</span></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return showDiv("page_18");'>18</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_16 --> <div id="page_17" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li class="active"><span>17</span></li> <li><a href="#" onclick='return showDiv("page_18");'>18</a></li> <li><a href="#" onclick='return showDiv("page_19");'>19</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="321"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFM.H23M1051V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFM.H23M1051V"><span>The Iterative Research Cycle: Process-Based Model Evaluation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vrugt, J. 
A.</p> <p>2014-12-01</p> <p>The ever increasing pace of computational power, along with continued advances in measurement technologies and improvements in process understanding has stimulated the development of increasingly complex physics based models that simulate a myriad of processes at different spatial and temporal scales. Reconciling these high-order system models with perpetually larger volumes of field data is becoming more and more difficult, particularly because classical likelihood-based fitting methods lack the power to detect and pinpoint deficiencies in the model structure. In this talk I will give an overview of our latest research on process-based model calibration and evaluation. This approach, rooted in Bayesian theory, uses summary metrics of the calibration data rather than the data itself to help detect which component(s) of the model is (are) malfunctioning and in need of improvement. A few case studies involving hydrologic and geophysical models will be used to demonstrate the proposed methodology.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5070834','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5070834"><span>Spatiotemporal Variation in Distance Dependent Animal Movement Contacts: One Size Doesn’t Fit All</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Brommesson, Peter; Wennergren, Uno; Lindström, Tom</p> <p>2016-01-01</p> <p>The structure of contacts that mediate transmission has a pronounced effect on the outbreak dynamics of infectious disease and simulation models are powerful tools to inform policy decisions. Most simulation models of livestock disease spread rely to some degree on predictions of animal movement between holdings. 
Typically, movements are more common between nearby farms than between those located far away from each other. Here, we assessed spatiotemporal variation in such distance dependence of animal movement contacts from an epidemiological perspective. We evaluated and compared nine statistical models, applied to Swedish movement data from 2008. The models differed in at what level (if at all) they accounted for regional and/or seasonal heterogeneities in the distance dependence of the contacts. Using a kernel approach to describe how probability of contacts between farms changes with distance, we developed a hierarchical Bayesian framework and estimated parameters by using Markov Chain Monte Carlo techniques. We evaluated models by three different approaches of model selection. First, we used Deviance Information Criterion to evaluate their performance relative to each other. Secondly, we estimated the log predictive posterior distribution; this was also used to evaluate their relative performance. Thirdly, we performed posterior predictive checks by simulating movements with each of the parameterized models and evaluated their ability to recapture relevant summary statistics. Independent of selection criteria, we found that accounting for regional heterogeneity improved model accuracy. We also found that accounting for seasonal heterogeneity was beneficial, in terms of model accuracy, according to two of three methods used for model selection. Our results have important implications for livestock disease spread models where movement is an important risk factor for between-farm transmission. We argue that modelers should refrain from using methods to simulate animal movements that assume the same pattern across all regions and seasons without explicitly testing for spatiotemporal variation.
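The kernel approach described above, contact probability decaying with distance between farms, with parameters that may differ by region and season, can be sketched as follows. The exponential form and the `sigma` values are illustrative assumptions, not the model fitted to the Swedish data:

```python
import numpy as np

def contact_probs(distances, sigma):
    """Probability that a movement from a source farm goes to each candidate
    destination, decaying with distance under an assumed exponential kernel
    with spatial scale `sigma`; probabilities are normalized to sum to 1."""
    w = np.exp(-np.asarray(distances, dtype=float) / sigma)
    return w / w.sum()

# A shorter spatial scale concentrates contacts on nearby farms.
d = np.array([1.0, 10.0, 100.0])       # distances to three candidate farms
local = contact_probs(d, sigma=5.0)    # short-range kernel
diffuse = contact_probs(d, sigma=50.0) # long-range kernel
```

Letting `sigma` (or the kernel shape itself) vary by region and season is exactly the heterogeneity the model-selection exercise in the abstract tests for.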
PMID:27760155</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1042383','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1042383"><span>AGR-1 Thermocouple Data Analysis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Jeff Einerson</p> <p>2012-05-01</p> <p>This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties.
A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters and accounting for possible changes in both physical and thermal conditions and in instrument performance.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19780002187','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19780002187"><span>Proceedings of the Spacecraft Charging Technology Conference: Executive Summary</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Pike, C. P.; Whipple, E. C., Jr.; Stevens, N. J.; Minges, M. L.; Lehn, W. L.; Bunn, M. H.</p> <p>1977-01-01</p> <p>Aerospace environments are reviewed in reference to spacecraft charging. Modelling, a theoretical scheme which can be used to describe the structure of the sheath around the spacecraft and to calculate the charging currents within, is discussed.
Materials characterization is considered for experimental determination of the behavior of typical spacecraft materials when exposed to simulated geomagnetic substorm conditions. Materials development is also examined for controlling and minimizing spacecraft charging or at least for distributing the charge in an equipotential manner, using electrical conductive surfaces for materials exposed to space environment.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19730016177','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19730016177"><span>Modal analysis for Liapunov stability of rotating elastic bodies. Ph.D. Thesis. Final Report</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Colin, A. D.</p> <p>1973-01-01</p> <p>This study consisted of four parallel efforts: (1) modal analyses of elastic continua for Liapunov stability analysis of flexible spacecraft; (2) development of general purpose simulation equations for arbitrary spacecraft; (3) evaluation of alternative mathematical models for elastic components of spacecraft; and (4) examination of the influence of vehicle flexibility on spacecraft attitude control system performance. 
A complete record is given of achievements under tasks (1) and (3), in the form of technical appendices, and a summary description of progress under tasks two and four.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19990024948','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19990024948"><span>Studies of Tenuous Planetary Atmospheres</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Combi, Michael R.</p> <p>1998-01-01</p> <p>The final report includes an overall project overview as well as scientific background summaries of dust and sodium in comets, and tenuous atmospheres of Jupiter's natural satellites. Progress and continuing work related to dust coma and tenuous atmospheric studies are presented. Also included are published articles written during the course of the report period. These are entitled: (1) On Europa's Magnetospheric Interaction: An MHD Simulation; (2) Dust-Gas Interrelations in Comets: Observations and Theory; and (3) Io's Plasma Environment During the Galileo Flyby: Global Three Dimensional MHD Modeling with Adaptive Mesh Refinement.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2001BAMS...82.2357B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2001BAMS...82.2357B"><span>The Community Climate System Model.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Blackmon, Maurice; Boville, Byron; Bryan, Frank; Dickinson, Robert; Gent, Peter; Kiehl, Jeffrey; Moritz, Richard; Randall, David; Shukla, Jagadish; Solomon, Susan; Bonan, Gordon; Doney, Scott; Fung, Inez; Hack, James; Hunke, Elizabeth; Hurrell, James; Kutzbach, John; Meehl, Jerry; Otto-Bliesner, Bette; 
Saravanan, R.; Schneider, Edwin K.; Sloan, Lisa; Spall, Michael; Taylor, Karl; Tribbia, Joseph; Washington, Warren</p> <p>2001-11-01</p> <p>The Community Climate System Model (CCSM) has been created to represent the principal components of the climate system and their interactions. Development and applications of the model are carried out by the U.S. climate research community, thus taking advantage of both wide intellectual participation and computing capabilities beyond those available to most individual U.S. institutions. This article outlines the history of the CCSM, its current capabilities, and plans for its future development and applications, with the goal of providing a summary useful to present and future users. The initial version of the CCSM included atmosphere and ocean general circulation models, a land surface model that was grafted onto the atmosphere model, a sea-ice model, and a flux coupler that facilitates information exchanges among the component models with their differing grids. This version of the model produced a successful 300-yr simulation of the current climate without artificial flux adjustments. The model was then used to perform a coupled simulation in which the atmospheric CO2 concentration increased by 1% per year. In this version of the coupled model, the ocean salinity and deep-ocean temperature slowly drifted away from observed values. A subsequent correction to the roughness length used for sea ice significantly reduced these errors. An updated version of the CCSM was used to perform three simulations of the twentieth century's climate, and several projections of the climate of the twenty-first century. The CCSM's simulation of the tropical ocean circulation has been significantly improved by reducing the background vertical diffusivity and incorporating an anisotropic horizontal viscosity tensor. The meridional resolution of the ocean model was also refined near the equator.
These changes have resulted in a greatly improved simulation of both the Pacific equatorial undercurrent and the surface countercurrents. The interannual variability of the sea surface temperature in the central and eastern tropical Pacific is also more realistic in simulations with the updated model. Scientific challenges to be addressed with future versions of the CCSM include realistic simulation of the whole atmosphere, including the middle and upper atmosphere, as well as the troposphere; simulation of changes in the chemical composition of the atmosphere through the incorporation of an integrated chemistry model; inclusion of global, prognostic biogeochemical components for land, ocean, and atmosphere; simulations of past climates, including times of extensive continental glaciation as well as times with little or no ice; studies of natural climate variability on seasonal-to-centennial timescales; and investigations of anthropogenic climate change. In order to make such studies possible, work is under way to improve all components of the model. Plans call for a new version of the CCSM to be released in 2002. Planned studies with the CCSM will require much more computer power than is currently available.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1993STIN...9414351H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1993STIN...9414351H"><span>BOAST 2 for the IBM 3090 and RISC 6000</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hebert, P.; Bourgoyne, A. T., Jr.; Tyler, J.</p> <p>1993-05-01</p> <p>BOAST 2 simulates isothermal, darcy flow in three dimensions. It assumes that reservoir liquids can be described in three fluid phases (oil, gas, and water) of constant composition, with physical properties that depend on pressure, only. 
These reservoir fluid approximations are acceptable for a large percentage of the world's oil and gas reservoirs. Consequently, BOAST 2 has a wide range of applicability. BOAST 2 can simulate oil and/or gas recovery by fluid expansion, displacement, gravity drainage, and capillary imbibition mechanisms. Typical field production problems that BOAST 2 can handle include primary depletion studies, pressure maintenance by water and/or gas injection, and evaluation of secondary recovery waterflooding and displacement operations. Technically, BOAST 2 is a finite-difference, implicit-pressure, explicit-saturation (IMPES) numerical simulator. It applies both direct and iterative solution techniques for solving systems of algebraic equations. The well model allows specification of rate or pressure constraints on well performance, and the user is free to add or to recomplete wells during the simulation. In addition, the user can define multiple rock and PVT regions and can choose from three aquifer models. BOAST 2 also provides flexible initialization, a bubble-point tracking scheme, automatic time-step control, and a material balance check on solution stability.
The user controls output, which includes a run summary and line-printer plots of fieldwide performance.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1073505','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1073505"><span>North Pacific Mesoscale Coupled Air-Ocean Simulations Compared with Observations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Koracin, Darko; Cerovecki, Ivana; Vellore, Ramesh</p> <p>2013-04-11</p> <p>Executive summary: The main objective of the study was to investigate atmospheric and ocean interaction processes in the western Pacific and, in particular, effects of significant ocean heat loss in the Kuroshio and Kuroshio Extension regions on the lower and upper atmosphere. It is yet to be determined how significant these processes are on climate scales. The understanding of these processes also led us to develop the methodology of coupling the Weather Research and Forecasting model with the Parallel Ocean Program model for western Pacific regional weather and climate simulations. We tested NCAR-developed research software Coupler 7 for coupling of the WRF and POP models and assessed its usability for regional-scale applications. We completed test simulations using the Coupler 7 framework, but implemented a standard WRF model code with options for both one- and two-way mode coupling. This type of coupling will allow us to seamlessly incorporate new WRF updates and versions in the future. We also performed a long-term WRF simulation (15 years) covering the entire North Pacific as well as high-resolution simulations of a case study which included extreme ocean heat losses in the Kuroshio and Kuroshio Extension regions.
Since the extreme ocean heat loss occurs during winter cold air outbreaks (CAO), we simulated and analyzed a case study of a severe CAO event in January 2000 in detail. We found that the ocean heat loss induced by CAOs is amplified by additional advection from mesocyclones forming on the southern part of the Japan Sea. Large scale synoptic patterns with anomalously strong anticyclone over Siberia and Mongolia, deep Aleutian Low, and the Pacific subtropical ridge are a crucial setup for the CAO. It was found that the onset of the CAO is related to the breaking of atmospheric Rossby waves and vertical transport of vorticity that facilitates meridional advection. The study also indicates that intrinsic parameterization of the surface fluxes within the WRF model needs more evaluation and analysis.« less</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20160007698','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20160007698"><span>An Efficient Ray-Tracing Method for Determining Terrain Intercepts in EDL Simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Shidner, Jeremy D.</p> <p>2016-01-01</p> <p>The calculation of a ray's intercept from an arbitrary point in space to a prescribed surface is a common task in computer simulations. The arbitrary point often represents an object that is moving according to the simulation, while the prescribed surface is fixed in a defined frame. For detailed simulations, this surface becomes complex, taking the form of real-world objects such as mountains, craters or valleys which require more advanced methods to accurately calculate a ray's intercept location. Incorporation of these complex surfaces has commonly been implemented in graphics systems that utilize highly optimized graphics processing units to analyze such features. 
This paper proposes a simplified method that does not require computationally intensive graphics solutions, but rather an optimized ray-tracing method for an assumed terrain dataset. This approach was developed for the Mars Science Laboratory mission which landed on the complex terrain of Gale Crater. First, this paper begins with a discussion of the simulation used to implement the model and the applicability of finding surface intercepts with respect to atmosphere modeling, altitude determination, radar modeling, and contact forces influencing vehicle dynamics. Next, the derivation and assumptions of the intercept finding method are presented. Key assumptions are noted making the routines specific to only certain types of surface data sets that are equidistantly spaced in longitude and latitude. The derivation of the method relies on ray-tracing, requiring discussion on the formulation of the ray with respect to the terrain datasets. Further discussion includes techniques for ray initialization in order to optimize the intercept search. Then, the model implementation for various new applications in the simulation are demonstrated. Finally, a validation of the accuracy is presented along with the corresponding data sets used in the validation. A performance summary of the method will be shown using the analysis from the Mars Science Laboratory's terminal descent sensing model. 
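The march-then-refine flavor of intercept search described above can be sketched as follows. This is an illustrative toy, not the MSL flight code: `terrain` here is an arbitrary height function standing in for the equidistantly spaced longitude/latitude dataset the paper assumes, and the fixed step plus bisection refinement is an assumed simplification of the optimized search:

```python
import numpy as np

def ray_terrain_intercept(origin, direction, terrain, step=0.1, max_t=100.0):
    """March along a ray from `origin` in `direction` until it first passes
    below the terrain surface, then bisect the last step to refine the hit.
    terrain: f(x, y) -> surface height. Returns the hit point or None."""
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    t = 0.0
    while t <= max_t:
        p = o + t * d
        if p[2] <= terrain(p[0], p[1]):
            # Crossed the surface during the last step: bisect for precision.
            lo, hi = max(t - step, 0.0), t
            for _ in range(30):
                mid = 0.5 * (lo + hi)
                q = o + mid * d
                if q[2] <= terrain(q[0], q[1]):
                    hi = mid
                else:
                    lo = mid
            return o + hi * d
        t += step
    return None  # ray cleared the terrain within max_t

# Flat terrain at z = 0; a ray descending at 45 degrees from (0, 0, 1)
# should strike the ground near (1, 0, 0).
hit = ray_terrain_intercept([0, 0, 1], [1, 0, -1], lambda x, y: 0.0)
```

A gridded dataset enters this picture by making `terrain(x, y)` an interpolation of the stored heights; initializing `t` from a coarse bound, as the paper discusses for ray initialization, is what keeps the search cheap.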
Alternate uses will also be shown for determining horizon maps and orbiter set times.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://rosap.ntl.bts.gov/view/dot/22389','DOTNTL'); return false;" href="https://rosap.ntl.bts.gov/view/dot/22389"><span>Simulating household travel study data in metropolitan areas : technical summary.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntlsearch.bts.gov/tris/index.do">DOT National Transportation Integrated Search</a></p> <p></p> <p>2001-05-01</p> <p>The objectives of this study are: 1. To develop and validate a methodology for MPOs to synthesize household travel survey data using local sociodemographic characteristics in conjunction with a national source of simulated travel data. 2. To evalu...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26796316','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26796316"><span>A modified Wright-Fisher model that incorporates Ne: A variant of the standard model with increased biological realism and reduced computational complexity.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zhao, Lei; Gossmann, Toni I; Waxman, David</p> <p>2016-03-21</p> <p>The Wright-Fisher model is an important model in evolutionary biology and population genetics. It has been applied in numerous analyses of finite populations with discrete generations. It is recognised that real populations can behave, in some key aspects, as though their size that is not the census size, N, but rather a smaller size, namely the effective population size, Ne. However, in the Wright-Fisher model, there is no distinction between the effective and census population sizes. Equivalently, we can say that in this model, Ne coincides with N. 
The Wright-Fisher model therefore lacks an important aspect of biological realism. Here, we present a method that allows Ne to be directly incorporated into the Wright-Fisher model. The modified model involves matrices whose size is determined by Ne. Thus apart from increased biological realism, the modified model also has reduced computational complexity, particularly so when Ne⪡N. For complex problems, it may be hard or impossible to numerically analyse the most commonly-used approximation of the Wright-Fisher model that incorporates Ne, namely the diffusion approximation. An alternative approach is simulation. However, the simulations need to be sufficiently detailed that they yield an effective size that is different to the census size. Simulations may also be time consuming and have attendant statistical errors. The method presented in this work may then be the only alternative to simulations, when Ne differs from N. We illustrate the straightforward application of the method to some problems involving allele fixation and the determination of the equilibrium site frequency spectrum. We then apply the method to the problem of fixation when three alleles are segregating in a population. This latter problem is significantly more complex than a two allele problem and since the diffusion equation cannot be numerically solved, the only other way Ne can be incorporated into the analysis is by simulation. We have achieved good accuracy in all cases considered. In summary, the present work extends the realism and tractability of an important model of evolutionary biology and population genetics. Copyright © 2016 Elsevier Ltd. 
All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22960215','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22960215"><span>A novel approach for choosing summary statistics in approximate Bayesian computation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Aeschbacher, Simon; Beaumont, Mark A; Futschik, Andreas</p> <p>2012-11-01</p> <p>The choice of summary statistics is a crucial step in approximate Bayesian computation (ABC). Since statistics are often not sufficient, this choice involves a trade-off between loss of information and reduction of dimensionality. The latter may increase the efficiency of ABC. Here, we propose an approach for choosing summary statistics based on boosting, a technique from the machine-learning literature. We consider different types of boosting and compare them to partial least-squares regression as an alternative. To mitigate the lack of sufficiency, we also propose an approach for choosing summary statistics locally, in the putative neighborhood of the true parameter value. We study a demographic model motivated by the reintroduction of Alpine ibex (Capra ibex) into the Swiss Alps. The parameters of interest are the mean and standard deviation across microsatellites of the scaled ancestral mutation rate (θ(anc) = 4N(e)u) and the proportion of males obtaining access to matings per breeding season (ω). By simulation, we assess the properties of the posterior distribution obtained with the various methods. According to our criteria, ABC with summary statistics chosen locally via boosting with the L(2)-loss performs best. 
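Set aside the statistic-choice question that is the paper's actual contribution, and the underlying rejection-ABC loop is short. A bare-bones sketch with an assumed toy model (a normal mean under a flat prior, not the paper's ibex demographic model):

```python
import numpy as np

def abc_rejection(observed, simulate, summary, prior_draw, n_sims=20000, keep=0.01):
    """Rejection ABC: draw parameters from the prior, simulate data, and keep
    the `keep` fraction of draws whose simulated summary statistic lands
    closest to the observed one. Returns the accepted parameter sample."""
    s_obs = summary(observed)
    thetas = np.array([prior_draw() for _ in range(n_sims)])
    dists = np.array([abs(summary(simulate(t)) - s_obs) for t in thetas])
    cutoff = np.quantile(dists, keep)
    return thetas[dists <= cutoff]

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=50)                 # "observed" data, true mean 2
post = abc_rejection(
    data,
    simulate=lambda t: rng.normal(t, 1.0, size=50),  # toy model: N(theta, 1)
    summary=np.mean,                                 # sufficient here by construction
    prior_draw=lambda: rng.uniform(-10, 10),         # flat prior on theta
)
```

In the paper's setting the scalar `summary` is where the interesting work happens: it would be replaced by a boosted combination of candidate statistics, chosen either globally or locally in the putative neighborhood of the true parameter value.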
Applying that method to the ibex data, we estimate θanc ≈ 1.288 and find that most of the variation across loci of the ancestral mutation rate u is between 7.7 × 10⁻⁴ and 3.5 × 10⁻³ per locus per generation. The proportion of males with access to matings is estimated as ω ≈ 0.21, which is in good agreement with recent independent estimates.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3522150','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3522150"><span>A Novel Approach for Choosing Summary Statistics in Approximate Bayesian Computation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Aeschbacher, Simon; Beaumont, Mark A.; Futschik, Andreas</p> <p>2012-01-01</p> <p>The choice of summary statistics is a crucial step in approximate Bayesian computation (ABC). Since statistics are often not sufficient, this choice involves a trade-off between loss of information and reduction of dimensionality. The latter may increase the efficiency of ABC. Here, we propose an approach for choosing summary statistics based on boosting, a technique from the machine-learning literature. We consider different types of boosting and compare them to partial least-squares regression as an alternative. To mitigate the lack of sufficiency, we also propose an approach for choosing summary statistics locally, in the putative neighborhood of the true parameter value. We study a demographic model motivated by the reintroduction of Alpine ibex (Capra ibex) into the Swiss Alps. The parameters of interest are the mean and standard deviation across microsatellites of the scaled ancestral mutation rate (θanc = 4Neu) and the proportion of males obtaining access to matings per breeding season (ω). 
By simulation, we assess the properties of the posterior distribution obtained with the various methods. According to our criteria, ABC with summary statistics chosen locally via boosting with the L2-loss performs best. Applying that method to the ibex data, we estimate θ^anc≈1.288 and find that most of the variation across loci of the ancestral mutation rate u is between 7.7 × 10−4 and 3.5 × 10−3 per locus per generation. The proportion of males with access to matings is estimated as ω^≈0.21, which is in good agreement with recent independent estimates. PMID:22960215</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ClDy...50.4231T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ClDy...50.4231T"><span>Sensitivity of the weather research and forecasting model to parameterization schemes for regional climate of Nile River Basin</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tariku, Tebikachew Betru; Gan, Thian Yew</p> <p>2018-06-01</p> <p>Regional climate models (RCMs) have been used to simulate rainfall at relatively high spatial and temporal resolutions useful for sustainable water resources planning, design and management. In this study, the sensitivity of the RCM, weather research and forecasting (WRF), in modeling the regional climate of the Nile River Basin (NRB) was investigated using 31 combinations of different physical parameterization schemes which include cumulus (Cu), microphysics (MP), planetary boundary layer (PBL), land-surface model (LSM) and radiation (Ra) schemes. Using the European Centre for Medium-Range Weather Forecast (ECMWF) ERA-Interim reanalysis data as initial and lateral boundary conditions, WRF was configured to model the climate of NRB at a resolution of 36 km with 30 vertical levels. 
The 1999-2001 simulations using WRF were compared with satellite data combined with ground observation and the NCEP reanalysis data for 2 m surface air temperature (T2), rainfall, short- and longwave downward radiation at the surface (SWRAD, LWRAD). Overall, WRF simulated more accurate T2 and LWRAD (with correlation coefficients >0.8 and low root-mean-square error) than SWRAD and rainfall for the NRB. Further, the simulation of rainfall is more sensitive to PBL, Cu and MP schemes than other schemes of WRF. For example, WRF simulated less biased rainfall with Kain-Fritsch combined with MYJ than with YSU as the PBL scheme. The simulation of T2 is more sensitive to LSM and Ra than to Cu, PBL and MP schemes selected, SWRAD is more sensitive to MP and Ra than to Cu, LSM and PBL schemes, and LWRAD is more sensitive to LSM, Ra and PBL than to Cu and MP schemes. In summary, the following combination of schemes simulated the most representative regional climate of NRB: WSM3 microphysics, KF cumulus, MYJ PBL, RRTM longwave radiation and Dudhia shortwave radiation schemes, and Noah LSM. 
The above configuration of WRF coupled to the Noah LSM has also been shown to simulate representative regional climate of NRB over 1980-2001, which includes a combination of wet and dry years of the NRB.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ClDy..tmp..525T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ClDy..tmp..525T"><span>Sensitivity of the weather research and forecasting model to parameterization schemes for regional climate of Nile River Basin</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tariku, Tebikachew Betru; Gan, Thian Yew</p> <p>2017-08-01</p> <p>Regional climate models (RCMs) have been used to simulate rainfall at relatively high spatial and temporal resolutions useful for sustainable water resources planning, design and management. In this study, the sensitivity of the RCM, weather research and forecasting (WRF), in modeling the regional climate of the Nile River Basin (NRB) was investigated using 31 combinations of different physical parameterization schemes which include cumulus (Cu), microphysics (MP), planetary boundary layer (PBL), land-surface model (LSM) and radiation (Ra) schemes. Using the European Centre for Medium-Range Weather Forecast (ECMWF) ERA-Interim reanalysis data as initial and lateral boundary conditions, WRF was configured to model the climate of NRB at a resolution of 36 km with 30 vertical levels. The 1999-2001 simulations using WRF were compared with satellite data combined with ground observation and the NCEP reanalysis data for 2 m surface air temperature (T2), rainfall, short- and longwave downward radiation at the surface (SWRAD, LWRAD). Overall, WRF simulated more accurate T2 and LWRAD (with correlation coefficients >0.8 and low root-mean-square error) than SWRAD and rainfall for the NRB. 
Further, the simulation of rainfall is more sensitive to PBL, Cu and MP schemes than other schemes of WRF. For example, WRF simulated less biased rainfall with Kain-Fritsch combined with MYJ than with YSU as the PBL scheme. The simulation of T2 is more sensitive to LSM and Ra than to Cu, PBL and MP schemes selected, SWRAD is more sensitive to MP and Ra than to Cu, LSM and PBL schemes, and LWRAD is more sensitive to LSM, Ra and PBL than to Cu and MP schemes. In summary, the following combination of schemes simulated the most representative regional climate of NRB: WSM3 microphysics, KF cumulus, MYJ PBL, RRTM longwave radiation and Dudhia shortwave radiation schemes, and Noah LSM. The above configuration of WRF coupled to the Noah LSM has also been shown to simulate representative regional climate of NRB over 1980-2001, which includes a combination of wet and dry years of the NRB.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3123842','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3123842"><span>Adaptive Control Model Reveals Systematic Feedback and Key Molecules in Metabolic Pathway Regulation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Moffitt, Richard A.; Merrill, Alfred H.; Wang, May D.</p> <p>2011-01-01</p> <p>Robust behavior in metabolic pathways resembles stabilized performance in systems under autonomous control. This suggests we can apply control theory to study existing regulation in these cellular networks. Here, we use model-reference adaptive control (MRAC) to investigate the dynamics of de novo sphingolipid synthesis regulation in a combined theoretical and experimental case study. 
The effects of serine palmitoyltransferase over-expression on this pathway are studied in vitro using human embryonic kidney cells. We report two key results from comparing numerical simulations with observed data. First, MRAC simulations of pathway dynamics are comparable to simulations from a standard model using mass action kinetics. The root-sum-square (RSS) between data and simulations in both cases differs by less than 5%. Second, MRAC simulations suggest systematic pathway regulation in terms of adaptive feedback from individual molecules. In response to increased metabolite levels available for de novo sphingolipid synthesis, feedback from molecules along the main artery of the pathway is regulated more frequently and with greater amplitude than from other molecules along the branches. These biological insights are consistent with current knowledge while being sufficiently new that they may guide future research in sphingolipid biology. In summary, we report a novel approach to study regulation in cellular networks by applying control theory in the context of robust metabolic pathways. We do this to uncover potential insight into the dynamics of regulation and the reverse engineering of cellular networks for systems biology. This new modeling approach and the implementation routines designed for this case study may be extended to other systems. Supplementary Material is available at www.liebertonline.com/cmb. 
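For readers unfamiliar with MRAC, the adaptation principle the study builds on can be shown with a minimal first-order example using the classic MIT-rule gradient update. The plant, reference model, and gains below are illustrative assumptions, not the paper's sphingolipid pathway model:

```python
def simulate_mrac(a=1.0, b=2.0, am=4.0, gamma=0.5, r=1.0, dt=1e-3, steps=60000):
    # Plant:            x'  = -a*x + b*u        (a, b treated as unknown)
    # Reference model:  xm' = -am*(xm - r)      (desired closed-loop behavior)
    # Control law:      u = theta*r, with the gain theta adapted online.
    x = xm = theta = 0.0
    for _ in range(steps):
        e = x - xm                     # tracking error
        theta -= gamma * e * r * dt    # MIT-rule gradient update
        u = theta * r
        x += (-a * x + b * u) * dt     # forward-Euler integration
        xm += (-am * (xm - r)) * dt
    return x, xm, theta

x, xm, theta = simulate_mrac()
# theta converges toward a/b = 0.5, so the plant tracks the reference (x ≈ r).
```

The adaptive gain plays the same conceptual role as the per-molecule feedback terms in the abstract: it is adjusted continuously so that observed behavior tracks a reference trajectory.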
PMID:21314456</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19930008782','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19930008782"><span>Automation of closed environments in space for human comfort and safety</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p></p> <p>1992-01-01</p> <p>This report culminates the work accomplished during a three year design project on the automation of an Environmental Control and Life Support System (ECLSS) suitable for space travel and colonization. The system would provide a comfortable living environment in space that is fully functional with limited human supervision. A completely automated ECLSS would increase astronaut productivity while contributing to their safety and comfort. The first section of this report, section 1.0, briefly explains the project, its goals, and the scheduling used by the team in meeting these goals. Section 2.0 presents an in-depth look at each of the component subsystems. Each subsection describes the mathematical modeling and computer simulation used to represent that portion of the system. The individual models have been integrated into a complete computer simulation of the CO2 removal process. In section 3.0, the two simulation control schemes are described. The classical control approach uses traditional methods to control the mechanical equipment. The expert control system uses fuzzy logic and artificial intelligence to control the system. By integrating the two control systems with the mathematical computer simulation, the effectiveness of the two schemes can be compared. The results are then used as proof of concept in considering new control schemes for the entire ECLSS. Section 4.0 covers the results and trends observed when the model was subjected to different test situations. 
These results provide insight into the operating procedures of the model and the different control schemes. The appendix, section 5.0, contains summaries of lectures presented during the past year, homework assignments, and the completed source code used for the computer simulation and control system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3789515','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3789515"><span>Invasion emerges from cancer cell adaptation to competitive microenvironments: Quantitative predictions from multiscale mathematical models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Rejniak, Katarzyna A.; Gerlee, Philip</p> <p>2013-01-01</p> <p>Summary In this review we summarize our recent efforts using mathematical modeling and computation to simulate cancer invasion, with a special emphasis on the tumor microenvironment. We consider cancer progression as a complex multiscale process and approach it with three single-cell based mathematical models that examine the interactions between tumor microenvironment and cancer cells at several scales. The models exploit distinct mathematical and computational techniques, yet they share core elements and can be compared and/or related to each other. The overall aim of using mathematical models is to uncover the fundamental mechanisms that lend cancer progression its direction towards invasion and metastasis. The models effectively simulate various modes of cancer cell adaptation to the microenvironment in a growing tumor. All three point to a general mechanism underlying cancer invasion: competition for adaptation between distinct cancer cell phenotypes, driven by a tumor microenvironment with scarce resources. 
These theoretical predictions pose an intriguing experimental challenge: test the hypothesis that invasion is an emergent property of cancer cell populations adapting to selective microenvironment pressure, rather than the culmination of cancer progression that produces cells with the “invasive phenotype”. In broader terms, we propose that fundamental insights into cancer can be achieved by experimentation interacting with theoretical frameworks provided by computational and mathematical modeling. PMID:18524624</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25558702','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25558702"><span>Assessment of surgical discharge summaries and evaluation of a new quality improvement model.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Stein, Ran; Neufeld, David; Shwartz, Ivan; Erez, Ilan; Haas, Ilana; Magen, Ada; Glassberg, Elon; Shmulevsky, Pavel; Paran, Haim</p> <p>2014-11-01</p> <p>Discharge summaries after hospitalization provide the most reliable description and implications of the hospitalization. A concise discharge summary is crucial for maintaining continuity of care through the transition from inpatient to ambulatory care. Discharge summaries often lack information and are imprecise. Errors and insufficient recommendations regarding changes in the medical regimen may harm the patient's health and may result in readmission. To evaluate a quality improvement model and training program for writing postoperative discharge summaries for three surgical procedures. Medical records and surgical discharge summaries were reviewed and scored. Essential points for communication between surgeons and family physicians were included in automated forms. Staff was briefed twice regarding required summary contents with an interim evaluation. 
Changes in quality were evaluated. Summaries from 61 cholecystectomies, 42 hernioplasties and 45 colectomies were reviewed. The average quality score of all discharge summaries increased from 72.1 to 78.3 after the first intervention (P < 0.0005) and to 81.0 following the second intervention. As the discharge summary's quality improved, its length decreased significantly. Discharge summaries lack important information and are too long. Developing a model for discharge summaries and instructing surgical staff regarding their contents resulted in measurable improvement. Frequent interventions and supervision are needed to maintain the quality of the surgical discharge summary.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_17 --> <div id="page_18" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="341"> <li> <p><a target="_blank" rel="noopener noreferrer"
onclick="trackOutboundLink('http://pubs.usgs.gov/of/2016/1212/ofr20161212.pdf','USGSPUBS'); return false;" href="http://pubs.usgs.gov/of/2016/1212/ofr20161212.pdf"><span>The U.S. Geological Survey Monthly Water Balance Model Futures Portal</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Bock, Andrew R.; Hay, Lauren E.; Markstrom, Steven L.; Emmerich, Christopher; Talbert, Marian</p> <p>2017-05-03</p> <p>The U.S. Geological Survey Monthly Water Balance Model Futures Portal (https://my.usgs.gov/mows/) is a user-friendly interface that summarizes monthly historical and simulated future conditions for seven hydrologic and meteorological variables (actual evapotranspiration, potential evapotranspiration, precipitation, runoff, snow water equivalent, atmospheric temperature, and streamflow) at locations across the conterminous United States (CONUS).The estimates of these hydrologic and meteorological variables were derived using a Monthly Water Balance Model (MWBM), a modular system that simulates monthly estimates of components of the hydrologic cycle using monthly precipitation and atmospheric temperature inputs. Precipitation and atmospheric temperature from 222 climate datasets spanning historical conditions (1952 through 2005) and simulated future conditions (2020 through 2099) were summarized for hydrographic features and used to drive the MWBM for the CONUS. The MWBM input and output variables were organized into an open-access database. An Open Geospatial Consortium, Inc., Web Feature Service allows the querying and identification of hydrographic features across the CONUS. 
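A monthly water balance of this general kind is a bucket scheme driven by monthly precipitation and temperature inputs. The step function below is a generic sketch with assumed parameter values (snow threshold, melt fraction, PET proxy, soil capacity); it is not the USGS MWBM code:

```python
def monthly_step(precip_mm, temp_c, soil_mm, snow_mm, soil_cap=150.0, melt_frac=0.5):
    # Partition monthly precipitation into rain or snow by a 0 °C threshold
    # (an assumed simplification; the MWBM uses calibrated parameters).
    rain, new_snow = (precip_mm, 0.0) if temp_c > 0.0 else (0.0, precip_mm)
    snow_mm += new_snow
    melt = melt_frac * snow_mm if temp_c > 0.0 else 0.0
    snow_mm -= melt
    pet = max(0.0, 10.0 * temp_c)        # crude temperature-based PET proxy
    water = soil_mm + rain + melt
    aet = min(pet, water)                # actual ET limited by available water
    water -= aet
    runoff = max(0.0, water - soil_cap)  # soil-bucket overflow becomes runoff
    soil_mm = water - runoff
    return {"soil": soil_mm, "snow": snow_mm, "runoff": runoff, "aet": aet, "pet": pet}
```

Chaining this step over a precipitation and temperature series yields the kinds of monthly hydrologic components (AET, PET, runoff, snow water equivalent) that the portal summarizes; every flux conserves the water input, which is a useful invariant to test.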
To connect the Web Feature Service to the open-access database, a user interface—the Monthly Water Balance Model Futures Portal—was developed to allow the dynamic generation of summary files and plots based on plot type, geographic location, specific climate datasets, period of record, MWBM variable, and other options. Both the plots and the data files are made available to the user for download.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28032590','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28032590"><span>Predicting the impact of combined therapies on myeloma cell growth using a hybrid multi-scale agent-based model.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ji, Zhiwei; Su, Jing; Wu, Dan; Peng, Huiming; Zhao, Weiling; Nlong Zhao, Brian; Zhou, Xiaobo</p> <p>2017-01-31</p> <p>Multiple myeloma is a malignant, still incurable plasma cell disorder. This is due to refractory disease relapse, immune impairment, and development of multi-drug resistance. The growth of malignant plasma cells is dependent on the bone marrow (BM) microenvironment and evasion of the host's anti-tumor immune response. Hence, we hypothesized that targeting the tumor-stromal cell interaction and the endogenous immune system in the BM will potentially improve the response of multiple myeloma (MM). Therefore, we proposed a computational simulation of myeloma development in the complex microenvironment, which includes immune cell components and bone marrow stromal cells, and predicted the effects of combined treatment with multiple drugs on myeloma cell growth. We constructed a hybrid multi-scale agent-based model (HABM) that combines an ODE system and an agent-based model (ABM). 
The ODE system was used for modeling the dynamic changes of intracellular signal transduction and the ABM for modeling the cell-cell interactions between stromal cells, tumor, and immune components in the BM. This model simulated myeloma growth in the bone marrow microenvironment and revealed the important role of the immune system in this process. The predicted outcomes were consistent with the experimental observations from previous studies. Moreover, we applied this model to predict the treatment effects of three key therapeutic drugs used for MM, and found that the combination of these three drugs potentially suppresses the growth of myeloma cells and reactivates the immune response. In summary, the proposed model may serve as a novel computational platform for simulating the formation of MM and evaluating the treatment response of MM to multiple drugs.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=cambridge+AND+study+AND+guide&pg=3&id=ED044930','ERIC'); return false;" href="https://eric.ed.gov/?q=cambridge+AND+study+AND+guide&pg=3&id=ED044930"><span>The Guide to Simulation Games for Education and Training. Appendix: A Basic Reference Shelf on Simulation and Gaming by Paul A. Twelker.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Zuckerman, David W.; Horn, Robert E.</p> <p></p> <p>Simulation games are classed in this guide by subject area: business, domestic politics, economics, ecology, education, geography, history, international relations, psychology, skill development, sociology, social studies, and urban affairs. 
A summary description (of roles, objectives, decisions, and purposes), cost, producer, playing data (age…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2006ITNS...53.2563C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2006ITNS...53.2563C"><span>Development of a Monte Carlo Simulation for APD-Based PET Detectors Using a Continuous Scintillating Crystal</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Clowes, P.; Mccallum, S.; Welch, A.</p> <p>2006-10-01</p> <p>We are currently developing a multilayer avalanche photodiode (APD)-based detector for use in positron emission tomography (PET), which utilizes thin continuous crystals. In this paper, we developed a Monte Carlo-based simulation to aid in the design of such detectors. We measured the performance of a detector comprising a single thin continuous crystal (3.1 mm × 9.5 mm × 9.5 mm) of lutetium yttrium ortho-silicate (LYSO) and an APD array of (4×4) elements; each element 1.6 mm² and on a 2.3 mm pitch. We showed that a spatial resolution of better than 2.12 mm is achievable throughout the crystal provided that we adopt a Statistics Based Positioning (SBP) Algorithm. We then used Monte Carlo simulation to model the behavior of the detector. The accuracy of the Monte Carlo simulation was verified by comparing measured and simulated parent datasets (PDS) for the SBP algorithm. These datasets consisted of data for point sources at 49 positions uniformly distributed over the detector area. We also calculated the noise in the detector circuit and verified this value by measurement. The noise value was included in the simulation. We show that the performance of the simulation closely matches the measured performance. 
The simulations were extended to investigate the effect of different noise levels on positioning accuracy. This paper showed that if modest improvements could be made in the circuit noise, then positioning accuracy would be greatly improved. In summary, we have developed a model that can be used to simulate the performance of a variety of APD-based continuous crystal PET detectors.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27242622','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27242622"><span>Evidence for a Global Sampling Process in Extraction of Summary Statistics of Item Sizes in a Set.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Tokita, Midori; Ueda, Sachiyo; Ishiguchi, Akira</p> <p>2016-01-01</p> <p>Several studies have shown that our visual system may construct a "summary statistical representation" over groups of visual objects. Although there is a general understanding that human observers can accurately represent sets of a variety of features, many questions on how summary statistics, such as an average, are computed remain unanswered. This study investigated sampling properties of visual information used by human observers to extract two types of summary statistics of item sets, average and variance. We presented three models of ideal observers to extract the summary statistics: a global sampling model without sampling noise, a global sampling model with sampling noise, and a limited sampling model. We compared the performance of an ideal observer of each model with that of human observers using statistical efficiency analysis. Results suggest that summary statistics of items in a set may be computed without representing individual items, which makes it possible to discard the limited sampling account. 
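Two of the ideal observers described above (a limited-sampling observer that averages a small attended subset, and a global-sampling observer that pools all items but adds internal noise) can be compared directly in simulation. The set size, subsample size, and noise level below are illustrative assumptions, not the study's fitted values:

```python
import random

def limited_sample_estimate(items, k, rng):
    # Average of k randomly attended items (limited sampling account).
    return sum(rng.sample(items, k)) / k

def global_sample_estimate(items, noise_sd, rng):
    # Average of all items, corrupted by late/internal noise (global sampling).
    return sum(items) / len(items) + rng.gauss(0.0, noise_sd)

rng = random.Random(0)
items = [rng.uniform(1.0, 2.0) for _ in range(8)]   # one display of 8 sizes
true_mean = sum(items) / len(items)
trials = 5000
lim_err = sum(abs(limited_sample_estimate(items, 2, rng) - true_mean)
              for _ in range(trials)) / trials
glob_err = sum(abs(global_sample_estimate(items, 0.05, rng) - true_mean)
               for _ in range(trials)) / trials
print(lim_err > glob_err)  # pooling all items beats a small noisy subsample
```

Statistical-efficiency analyses of the kind the paper uses compare human error against such ideal-observer errors; here the global observer with mild noise is clearly the more precise of the two.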
Moreover, the extraction of summary statistics may not necessarily require the representation of individual objects with focused attention when the sets of items are larger than 4.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4970668','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4970668"><span>Computational Simulation of the Activation Cycle of Gα Subunit in the G Protein Cycle Using an Elastic Network Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Kim, Min Hyeok; Kim, Young Jin; Kim, Hee Ryung; Jeon, Tae-Joon; Choi, Jae Boong; Chung, Ka Young; Kim, Moon Ki</p> <p>2016-01-01</p> <p>Agonist-activated G protein-coupled receptors (GPCRs) interact with GDP-bound G protein heterotrimers (Gαβγ) promoting GDP/GTP exchange, which results in dissociation of Gα from the receptor and Gβγ. The GTPase activity of Gα hydrolyzes GTP to GDP, and the GDP-bound Gα interacts with Gβγ, forming a GDP-bound G protein heterotrimer. The G protein cycle is allosterically modulated by conformational changes of the Gα subunit. Although biochemical and biophysical methods have elucidated the structure and dynamics of Gα, the precise conformational mechanisms underlying the G protein cycle are not fully understood yet. Simulation methods could help to provide additional details to gain further insight into G protein signal transduction mechanisms. In this study, using the available X-ray crystal structures of Gα, we simulated the entire G protein cycle and described not only the steric features of the Gα structure, but also conformational changes at each step. Each reference structure in the G protein cycle was modeled as an elastic network model and subjected to normal mode analysis. 
Our simulation data suggests that activated receptors trigger conformational changes of the Gα subunit that are thermodynamically favorable for opening of the nucleotide-binding pocket and GDP release. Furthermore, the effects of GTP binding and hydrolysis on mobility changes of the C and N termini and switch regions are elucidated. In summary, our simulation results enabled us to provide detailed descriptions of the structural and dynamic features of the G protein cycle. PMID:27483005</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018MSSP...99..306B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018MSSP...99..306B"><span>Model selection and parameter estimation in structural dynamics using approximate Bayesian computation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ben Abdessalem, Anis; Dervilis, Nikolaos; Wagg, David; Worden, Keith</p> <p>2018-01-01</p> <p>This paper will introduce the use of the approximate Bayesian computation (ABC) algorithm for model selection and parameter estimation in structural dynamics. ABC is a likelihood-free method typically used when the likelihood function is either intractable or cannot be approached in a closed form. To circumvent the evaluation of the likelihood function, simulation from a forward model is at the core of the ABC algorithm. The algorithm offers the possibility to use different metrics and summary statistics representative of the data to carry out Bayesian inference. The efficacy of the algorithm in structural dynamics is demonstrated through three different illustrative examples of nonlinear system identification: cubic and cubic-quintic models, the Bouc-Wen model and the Duffing oscillator. 
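The likelihood-free rejection loop at the core of ABC (simulate from the forward model, compute summary statistics, accept parameters whose simulated summaries fall within a tolerance of the observed ones) can be sketched on a toy estimation problem. The Gaussian forward model, uniform prior, and tolerance below are illustrative assumptions, not the paper's nonlinear benchmarks:

```python
import random

def forward_model(theta, n, rng):
    # Toy forward model: n noisy observations around an unknown level theta.
    return [theta + rng.gauss(0.0, 1.0) for _ in range(n)]

def summary(data):
    # A single summary statistic: the sample mean.
    return sum(data) / len(data)

def abc_rejection(observed, prior, n_sims, tol, rng):
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_sims):
        theta = prior(rng)                                  # draw from prior
        s_sim = summary(forward_model(theta, len(observed), rng))
        if abs(s_sim - s_obs) < tol:                        # summary distance
            accepted.append(theta)
    return accepted

rng = random.Random(1)
observed = forward_model(3.0, 50, rng)        # synthetic data, true theta = 3
posterior = abc_rejection(observed, lambda r: r.uniform(0.0, 6.0), 20000, 0.2, rng)
estimate = sum(posterior) / len(posterior)    # posterior mean, close to 3
```

Swapping in a different metric or summary statistic only changes the acceptance test, which is the flexibility the paper exploits for model selection and parameter estimation.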
The obtained results suggest that ABC is a promising alternative to deal with model selection and parameter estimation issues, specifically for systems with complex behaviours.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA282279','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA282279"><span>A Virtual Reality-Based Simulation of Abdominal Surgery</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1994-06-30</p> <p>SHORT TITLE: A Virtual Reality-Based Simulation of Abdominal Surgery REPORTING PERIOD: October 31, 1993-June 30, 1994 The...Report - A Virtual Reality-Based Simulation Of Abdominal Surgery Page 2 June 21, 1994 TECHNICAL REPORT SUMMARY Virtual Reality is a marriage between...applications of this technology. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/AD1002544','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/AD1002544"><span>Demonstration of Inexact Computing Implemented in the JPEG Compression Algorithm using Probabilistic Boolean Logic applied to CMOS Components</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2015-12-24</p> <p>Signal to Noise Ratio SPICE Simulation Program with Integrated Circuit Emphasis TIFF Tagged Image File Format USC University of Southern California xvii...sources can create errors in digital circuits. These effects can be simulated using Simulation Program with Integrated Circuit Emphasis (SPICE) or...compute summary statistics. 
4.1 Circuit Simulations Noisy analog circuits can be simulated in SPICE or Cadence SpectreTM software via noisy voltage</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19900004913','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19900004913"><span>Definition of avionics concepts for a heavy lift cargo vehicle. Volume 1: Executive summary</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p></p> <p>1989-01-01</p> <p>A cost effective, multiuser simulation, test, and demonstration facility to support the development of avionics systems for future space vehicles is examined. The technology needs and requirements of future Heavy Lift Cargo Vehicles (HLCVs) are analyzed and serve as the basis for sizing of the avionics facility, although the lab is not limited in use to support of HLCVs. Volume 1 provides a summary of the vehicle avionics trade studies, the avionics lab objectives, a summary of the lab's functional requirements and design, physical facility considerations, and cost estimates.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23278391','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23278391"><span>Sequential sentinel SNP Regional Association Plots (SSS-RAP): an approach for testing independence of SNP association signals using meta-analysis data.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zheng, Jie; Gaunt, Tom R; Day, Ian N M</p> <p>2013-01-01</p> <p>Genome-Wide Association Studies (GWAS) frequently incorporate meta-analysis within their framework. 
However, conditional analysis of individual-level data, which is an established approach for fine mapping of causal sites, is often precluded where only group-level summary data are available for analysis. Here, we present a numerical and graphical approach, "sequential sentinel SNP regional association plot" (SSS-RAP), which estimates regression coefficients (beta) with their standard errors using the meta-analysis summary results directly. Under an additive model, typical for genes with small effect, the effect for a sentinel SNP can be transformed to the predicted effect for a possibly dependent SNP through a 2×2 two-SNP haplotype table. The approach assumes Hardy-Weinberg equilibrium for test SNPs. SSS-RAP is available as a Web-tool (http://apps.biocompute.org.uk/sssrap/sssrap.cgi). To develop and illustrate SSS-RAP we analyzed lipid and ECG trait data from the British Women's Heart and Health Study (BWHHS), evaluated a meta-analysis for an ECG trait and presented several simulations. We compared results with existing approaches such as model selection methods and conditional analysis. Generally, findings were consistent. SSS-RAP represents a tool for testing independence of SNP association signals using meta-analysis data, and is also a convenient approach based on biological principles for fine mapping in group-level summary data. 
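The additive-model transformation sketched above — predicting a dependent SNP's effect from a sentinel SNP's effect via two-SNP haplotype frequencies under Hardy-Weinberg equilibrium — reduces to a standard linkage-disequilibrium identity. This is a hypothetical illustration of that algebra, not the SSS-RAP tool itself, and the effect size, allele frequency, and D value are made-up numbers:

```python
def predicted_beta(beta_sentinel, p_test, d_ld):
    """Marginal effect induced at a correlated test SNP by a causal sentinel SNP.

    Under additive genotype coding and Hardy-Weinberg equilibrium:
      cov(G_sentinel, G_test) = 2D, where D is the two-SNP haplotype LD coefficient
      var(G_test)             = 2 * p_test * (1 - p_test)
    so beta_test = beta_sentinel * cov / var.
    """
    return beta_sentinel * (2.0 * d_ld) / (2.0 * p_test * (1.0 - p_test))

# Hypothetical numbers: sentinel effect 0.5, test-allele frequency 0.4, D = 0.1.
b = predicted_beta(0.5, 0.4, 0.1)
print(round(b, 4))  # → 0.2083
```

When D = 0 (no LD) the predicted effect vanishes, and when the two SNPs are perfectly correlated it reproduces the sentinel effect rescaled to the test SNP's genotype variance, which is the sense in which a sentinel effect can be "transformed" using only summary-level quantities.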
© 2012 Blackwell Publishing Ltd/University College London.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28853200','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28853200"><span>The 2017 Academic Emergency Medicine Consensus Conference: Catalyzing System Change Through Healthcare Simulation: Systems, Competency, and Outcomes.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Bond, William F; Hui, Joshua; Fernandez, Rosemarie</p> <p>2018-02-01</p> <p>Over the past decade, emergency medicine (EM) took a lead role in healthcare simulation in part due to its demands for successful interprofessional and multidisciplinary collaboration, along with educational needs in a diverse array of cognitive and procedural skills. Simulation-based methodologies have the capacity to support training and research platforms that model micro-, meso-, and macrosystems of healthcare. To fully capitalize on the potential of simulation-based research to improve emergency healthcare delivery will require the application of rigorous methods from engineering, social science, and basic science disciplines. The Academic Emergency Medicine (AEM) Consensus Conference "Catalyzing System Change Through Healthcare Simulation: Systems, Competency, and Outcome" was conceived to foster discussion among experts in EM, engineering, and social sciences, focusing on key barriers and opportunities in simulation-based research. This executive summary describes the overall rationale for the conference, conference planning, and consensus-building approaches and outlines the focus of the eight breakout sessions. The consensus outcomes from each breakout session are summarized in proceedings papers published in this issue of Academic Emergency Medicine. 
Each paper provides an overview of methodologic and knowledge gaps in simulation research and identifies future research targets aimed at improving the safety and quality of healthcare. © 2017 by the Society for Academic Emergency Medicine.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5172849','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5172849"><span>Simulation for Operational Readiness in a New Freestanding Emergency Department</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Kerner, Robert L.; Gallo, Kathleen; Cassara, Michael; D'Angelo, John; Egan, Anthony; Simmons, John Galbraith</p> <p>2016-01-01</p> <p>Summary Statement Simulation in multiple contexts over the course of a 10-week period served as a core learning strategy to orient experienced clinicians before opening a large new urban freestanding emergency department. To ensure technical and procedural skills of all team members, who would provide care without on-site recourse to specialty backup, we designed a comprehensive interprofessional curriculum to verify and regularize a wide range of competencies and best practices for all clinicians. Formulated under the rubric of systems integration, simulation activities aimed to instill a shared culture of patient safety among the entire cohort of 43 experienced emergency physicians, physician assistants, nurses, and patient technicians, most newly hired to the health system, who had never before worked together. Methods throughout the preoperational term included predominantly hands-on skills review, high-fidelity simulation, and simulation with standardized patients. We also used simulation during instruction in disaster preparedness, sexual assault forensics, and community outreach. 
Our program culminated with 2 days of in-situ simulation deployed in simultaneous and overlapping timeframes to challenge system response capabilities, resilience, and flexibility; this work revealed latent safety threats, lapses in communication, issues of intake procedure and patient flow, and the persistence of inapt or inapplicable mental models in responding to clinical emergencies. PMID:27607095</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016PRPER..12a0117C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016PRPER..12a0117C"><span>Learning from avatars: Learning assistants practice physics pedagogy in a classroom simulator</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chini, Jacquelyn J.; Straub, Carrie L.; Thomas, Kevin H.</p> <p>2016-06-01</p> <p>[This paper is part of the Focused Collection on Preparing and Supporting University Physics Educators.] Undergraduate students are increasingly being used to support course transformations that incorporate research-based instructional strategies. While such students are typically selected based on strong content knowledge and possible interest in teaching, they often do not have previous pedagogical training. The current training models make use of real students or classmates role playing as students as the test subjects. We present a new environment for facilitating the practice of physics pedagogy skills, a highly immersive mixed-reality classroom simulator, and assess its effectiveness for undergraduate physics learning assistants (LAs). LAs prepared, taught, and reflected on a lesson about motion graphs for five highly interactive computer generated student avatars in the mixed-reality classroom simulator. 
To assess the effectiveness of the simulator for this population, we analyzed the pedagogical skills LAs intended to practice and exhibited during their lessons and explored LAs' descriptions of their experiences with the simulator. Our results indicate that the classroom simulator created a safe, effective environment for LAs to practice a variety of skills, such as questioning styles and wait time. Additionally, our analysis revealed areas for improvement in our preparation of LAs and use of the simulator. We conclude with a summary of research questions this environment could facilitate.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1364502','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1364502"><span>FY 2016 Status Report on the Modeling of the M8 Calibration Series using MAMMOTH</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Baker, Benjamin Allen; Ortensi, Javier; DeHart, Mark David</p> <p>2016-09-01</p> <p>This report provides a summary of the progress made towards validating the multi-physics reactor analysis application MAMMOTH using data from measurements performed at the Transient Reactor Test facility, TREAT. The work completed consists of a series of comparisons of TREAT element types (standard and control rod assemblies) in small geometries as well as slotted mini-cores to reference Monte Carlo simulations to ascertain the accuracy of cross section preparation techniques. After the successful completion of these smaller problems, a full core model of the half slotted core used in the M8 Calibration series was assembled. Full core MAMMOTH simulations were comparedmore » to Serpent reference calculations to assess the cross section preparation process for this larger configuration. 
As part of the validation process, the M8 Calibration series included a steady-state wire irradiation experiment and coupling factors for the experiment region. The shape of the power distribution obtained from the MAMMOTH simulation shows excellent agreement with the experiment. Larger differences were encountered in the calculation of the coupling factors, but there is also great uncertainty about how the experimental values were obtained. Future work will focus on resolving some of these differences.« less</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1062654','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1062654"><span>Southern Regional Center for Lightweight Innovative Design</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Wang, Paul T.</p> <p></p> <p>The Southern Regional Center for Lightweight Innovative Design (SRCLID) has developed an experimentally validated cradle-to-grave modeling and simulation effort to optimize automotive components in order to decrease weight and cost, yet increase performance and safety in crash scenarios. 
In summary, the three major objectives of this project are accomplished: To develop experimentally validated cradle-to-grave modeling and simulation tools to optimize automotive and truck components for lightweighting materials (aluminum, steel, and Mg alloys and polymer-based composites) with consideration of uncertainty to decrease weight and cost, yet increase the performance and safety in impact scenarios; To develop multiscale computational models that quantifymore » microstructure-property relations by evaluating various length scales, from the atomic through component levels, for each step of the manufacturing process for vehicles; and To develop an integrated K-12 educational program to educate students on lightweighting designs and impact scenarios. In this final report, we divided the content into two parts: the first part contains the development of building blocks for the project, including materials and process models, process-structure-property (PSP) relationship, and experimental validation capabilities; the second part presents the demonstration task for Mg front-end work associated with USAMP projects.« less</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19810009476','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19810009476"><span>VCE early acoustic test results of General Electric's high-radius ratio coannular plug nozzle</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Knott, P. R.; Brausch, J. F.; Bhutiani, P. K.; Majjigi, R. K.; Doyle, V. L.</p> <p>1980-01-01</p> <p>Results of variable cycle engine (VCE) early acoustic engine and model scale tests are presented. 
A summary of an extensive series of far field acoustic, advanced acoustic, and exhaust plume velocity measurements with a laser velocimeter of inverted velocity and temperature profile, high radius ratio coannular plug nozzles on a YJ101 VCE static engine test vehicle is reviewed. Select model scale simulated flight acoustic measurements for an unsuppressed and a mechanically suppressed coannular plug nozzle are also discussed. The engine acoustic nozzle tests verify previous model scale noise reduction measurements. The engine measurements show 4 to 6 PNdB aft quadrant jet noise reduction and up to 7 PNdB forward quadrant shock noise reduction relative to a fully mixed conical nozzle at the same specific thrust and mixed pressure ratio. The influences of outer nozzle radius ratio, inner stream velocity ratio, and area ratio are discussed. Also, laser velocimeter measurements of mean velocity and turbulent velocity of the YJ101 engine are illustrated. Select model scale static and simulated flight acoustic measurements are shown which corroborate that coannular suppression is maintained in forward speed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1374508','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1374508"><span>RELAP-7 Closure Correlations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Zou, Ling; Berry, R. A.; Martineau, R. C.</p> <p></p> <p>The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). 
The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s and TRACE’s capabilities and extends their analysis capabilities for all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.« less</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4194187','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4194187"><span>“Plateau”-related summary statistics are uninformative for comparing working memory models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>van den Berg, Ronald; Ma, Wei Ji</p> <p>2014-01-01</p> <p>Performance on visual working memory tasks decreases as more items need to be remembered. Over the past decade, a debate has unfolded between proponents of slot models and slotless models of this phenomenon. 
Zhang and Luck (2008) and Anderson, Vogel, and Awh (2011) noticed that as more items need to be remembered, “memory noise” seems to first increase and then reach a “stable plateau.” They argued that three summary statistics characterizing this plateau are consistent with slot models, but not with slotless models. Here, we assess the validity of their methods. We generated synthetic data both from a leading slot model and from a recent slotless model and quantified model evidence using log Bayes factors. We found that the summary statistics provided, at most, 0.15% of the expected model evidence in the raw data. In a model recovery analysis, a total of more than a million trials were required to achieve 99% correct recovery when models were compared on the basis of summary statistics, whereas fewer than 1,000 trials were sufficient when raw data were used. At realistic numbers of trials, plateau-related summary statistics are completely unreliable for model comparison. Applying the same analyses to subject data from Anderson et al. (2011), we found that the evidence in the summary statistics was, at most, 0.12% of the evidence in the raw data and far too weak to warrant any conclusions. These findings call into question claims about working memory that are based on summary statistics. PMID:24719235</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25279263','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25279263"><span>Fast and accurate estimation of the covariance between pairwise maximum likelihood distances.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Gil, Manuel</p> <p>2014-01-01</p> <p>Pairwise evolutionary distances are a model-based summary statistic for a set of molecular sequences. They represent the leaf-to-leaf path lengths of the underlying phylogenetic tree. 
Estimates of pairwise distances with overlapping paths covary because of shared mutation events. It is desirable to take this covariance structure into account to increase precision in any process that compares or combines distances. This paper introduces a fast estimator for the covariance of two pairwise maximum likelihood distances, estimated under general Markov models. The estimator is based on a conjecture (going back to Nei & Jin, 1989) which links the covariance to path lengths. It is proven here under a simple symmetric substitution model. A simulation shows that the estimator outperforms previously published ones in terms of the mean squared error.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_18 --> <div id="page_19" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="361"> <li> <p><a target="_blank" rel="noopener noreferrer" 
onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4179615','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4179615"><span>Fast and accurate estimation of the covariance between pairwise maximum likelihood distances</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2014-01-01</p> <p>Pairwise evolutionary distances are a model-based summary statistic for a set of molecular sequences. They represent the leaf-to-leaf path lengths of the underlying phylogenetic tree. Estimates of pairwise distances with overlapping paths covary because of shared mutation events. It is desirable to take this covariance structure into account to increase precision in any process that compares or combines distances. This paper introduces a fast estimator for the covariance of two pairwise maximum likelihood distances, estimated under general Markov models. The estimator is based on a conjecture (going back to Nei & Jin, 1989) which links the covariance to path lengths. It is proven here under a simple symmetric substitution model. A simulation shows that the estimator outperforms previously published ones in terms of the mean squared error. 
PMID:25279263</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015SPIE.9322E..13M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015SPIE.9322E..13M"><span>Analysis of structural patterns in the brain with the complex network approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Maksimenko, Vladimir A.; Makarov, Vladimir V.; Kharchenko, Alexander A.; Pavlov, Alexey N.; Khramova, Marina V.; Koronovskii, Alexey A.; Hramov, Alexander E.</p> <p>2015-03-01</p> <p>In this paper we study mechanisms of the phase synchronization in a model network of Van der Pol oscillators and in the neural network of the brain by consideration of macroscopic parameters of these networks. As the macroscopic characteristics of the model network we consider a summary signal produced by oscillators. Similar to the model simulations, we study EEG signals reflecting the macroscopic dynamics of neural network. We show that the appearance of the phase synchronization leads to an increased peak in the wavelet spectrum related to the dynamics of synchronized oscillators. The observed correlation between the phase relations of individual elements and the macroscopic characteristics of the whole network provides a way to detect phase synchronization in the neural networks in the cases of normal and pathological activity.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA513965','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA513965"><span>Investigation of Shock Wave Attenuation in Porous Materials</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2009-12-01</p> <p>Foam ...... 38 Table 4. 
Summary of Material Characteristics of Polyurethane Foams ............ 40 Table 5. Summary of Experiment Results...polyurethane foam , he performed a simple symmetric impact simulation to investigate the material properties and wave propagation characteristics of the...describes the characteristics of the two foam materials studied in this research, namely the aluminum metal foam and rigid polyurethane foam , which</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/FR-2010-10-22/pdf/2010-26527.pdf','FEDREG'); return false;" href="https://www.gpo.gov/fdsys/pkg/FR-2010-10-22/pdf/2010-26527.pdf"><span>75 FR 65385 - Agency Information Collection Activities: Proposed Collection; Comment Request</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=FR">Federal Register 2010, 2011, 2012, 2013, 2014</a></p> <p></p> <p>2010-10-22</p> <p>... Earthquake Engineering Simulation (NEES). SUMMARY: In compliance with the requirement of section 3506(c)(2)(A... of the Network for Earthquake Engineering Simulation. Type of Information Collection Request: New... inform decision making regarding the future of NSF support for earthquake engineering research...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29101662','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29101662"><span>Computerized summary scoring: crowdsourcing-based latent semantic analysis.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Li, Haiying; Cai, Zhiqiang; Graesser, Arthur C</p> <p>2017-11-03</p> <p>In this study we developed and evaluated a crowdsourcing-based latent semantic analysis (LSA) approach to computerized summary scoring (CSS). 
LSA is a frequently used mathematical component in CSS, where LSA similarity represents the extent to which the to-be-graded target summary is similar to a model summary or a set of exemplar summaries. Researchers have proposed different formulations of the model summary in previous studies, such as pregraded summaries, expert-generated summaries, or source texts. The former two methods, however, require substantial human time, effort, and costs in order to either grade or generate summaries. Using source texts does not require human effort, but it also does not predict human summary scores well. With human summary scores as the gold standard, in this study we evaluated the crowdsourcing LSA method by comparing it with seven other LSA methods that used sets of summaries from different sources (either experts or crowdsourced) of differing quality, along with source texts. Results showed that crowdsourcing LSA predicted human summary scores as well as expert-good and crowdsourcing-good summaries, and better than the other methods. A series of analyses with different numbers of crowdsourcing summaries demonstrated that the number (from 10 to 100) did not significantly affect performance. These findings imply that crowdsourcing LSA is a promising approach to CSS, because it saves human effort in generating the model summary while still yielding comparable performance. 
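The role LSA similarity plays in computerized summary scoring — measuring how close a target summary sits to exemplar summaries in a low-rank semantic space — can be sketched with a truncated SVD over term counts. The two-sentence "exemplar" corpus, plain term frequencies, and rank k = 2 below are toy assumptions, not the authors' pipeline:

```python
import numpy as np

# Hypothetical mini-corpus: "good" exemplar summaries plus one target summary to score.
exemplars = [
    "the model predicts rainfall from temperature and humidity",
    "rainfall is predicted by the model using temperature and humidity",
]
target = "the model uses temperature and humidity to predict rainfall"

def vectorize(doc, vocab):
    # Plain term-frequency vector over a shared vocabulary.
    counts = np.zeros(len(vocab))
    for w in doc.split():
        counts[vocab[w]] += 1
    return counts

docs = exemplars + [target]
vocab = {w: i for i, w in enumerate(sorted({w for d in docs for w in d.split()}))}
X = np.array([vectorize(d, vocab) for d in docs])   # documents x terms

# Truncated SVD projects each document into a low-rank latent semantic space.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
Z = U[:, :k] * S[:k]                                # latent coordinates per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Score = similarity of the target to the centroid of the exemplar summaries.
centroid = Z[:len(exemplars)].mean(axis=0)
score = cosine(Z[-1], centroid)
print(round(score, 3))
```

In a crowdsourcing variant, the exemplar list would simply be replaced by a set of crowdsourced summaries; the scoring machinery is unchanged, which is why the source of the model summaries can be swapped without re-engineering the method.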
This approach to small-scale CSS provides a practical solution for instructors in courses, and also advances research on automated assessments in which student responses are expected to semantically converge on subject matter content.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3520554','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3520554"><span>In Vitro Measurement of Tissue Integrity during Saccular Aneurysm Embolizations for Simulator-Based Training</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Tercero, C.; Ikeda, S.; Ooe, K.; Fukuda, T.; Arai, F.; Negoro, M.; Takahashi, I.; Kwon, G.</p> <p>2012-01-01</p> <p>Summary In the domain of endovascular neurosurgery, the measurement of tissue integrity is needed for simulator-based training and for the development of new intravascular instruments and treatment techniques. In vitro evaluation of tissue manipulation can be achieved using photoelastic stress analysis and vasculature modeling with photoelastic materials. In this research we constructed two types of vasculature models of saccular aneurysms for differentiation of embolization techniques according to the respect for tissue integrity measurements based on the stress within the blood vessel model wall. In an aneurysm model with 5 mm dome diameter, embolization using MicroPlex 10 (Complex 1D, with 4 mm diameter loops), a maximum area of 3.97 mm2 with stress above 1 kPa was measured. This area increased to 5.50 mm2 when the dome was touched deliberately with the release mechanism of the coil, and to 4.87 mm2 for an embolization using Micrusphere, (Spherical 18 Platinum Coil). 
In a similar way trans-cell stent-assisted coil embolization was also compared to human blood pressure simulation using a model of a wide-necked saccular aneurysm with 7 mm diameter. The area with stress above 1kPa was below 1 mm2 for the pressure simulation and maximized at 3.79 mm2 during the trans-cell insertion of the micro-catheter and at 8.92 mm2 during the embolization. The presented results show that this measurement system is useful for identifying techniques compromising tissue integrity, comparing and studying coils and embolization techniques for a specific vasculature morphology and comparing their natural stress variations such as that produced by blood pressure. PMID:23217635</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2003NIMPB.205..215S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2003NIMPB.205..215S"><span>Particle-in-cell code library for numerical simulation of the ECR source plasma</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.</p> <p>2003-05-01</p> <p>The project ;Numerical simulation and optimization of ion accumulation and production in multicharged ion sources; is funded by the International Science and Technology Center (ISTC). 
A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19730003100','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19730003100"><span>Multidisciplinary research leading to utilization of extraterrestrial resources</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p></p> <p>1972-01-01</p> <p>Progress of the research accomplished during fiscal year 1972 is reported. The summaries presented include: (1) background analysis and coordination, (2) surface properties of rock in simulated lunar environment, (3) rock failure processes, strength and elastic properties in simulated lunar environment, (4) thermal fragmentation, and thermophysical and optical properties in simulated lunar environment, and (5) use of explosives on the moon.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA504498','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA504498"><span>A Summary of Proceedings for the Advanced Deployable Day/Night Simulation Symposium</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2009-07-01</p> <p>initiated to design , develop, and deliver transportable visual simulations that jointly provide night-vision and high-resolution daylight capability. The...Deployable Day/Night Simulation (ADDNS) Technology Demonstration Project was initiated to design , develop, and deliver transportable visual...was Dr. Richard Wildes (York University); Mr. Vitaly Zholudev (Department of Computer Science, York University), Mr. 
X. Zhu (Neptec Design Group), and</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1614589W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1614589W"><span>Effects of input uncertainty on cross-scale crop modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Waha, Katharina; Huth, Neil; Carberry, Peter</p> <p>2014-05-01</p> <p>The quality of data on climate, soils and agricultural management in the tropics is in general low or data is scarce leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and choice of model parameters, are the key factors for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time-series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? 
The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input data from very little to very detailed information, and compare the models' abilities to represent the spatial variability and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams, which are a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well by both models (R=0.6-0.8) but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill to reproduce the observed spatial variability. Soil data is less important for the skill of a crop model to reproduce the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than from input data settings, and APSIM is more sensitive to input data than LPJmL. 
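The Taylor-diagram comparison referenced above reduces each simulation–observation pair to three summary statistics: the correlation, the standard-deviation ratio, and the centered RMS difference (Taylor, 2001). A minimal sketch of those statistics, using hypothetical yield series rather than the study's data:

```python
import numpy as np

def taylor_stats(sim, obs):
    """Correlation, std-dev ratio, and centered RMS difference between a
    simulated and an observed series (the quantities a Taylor diagram plots)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    sigma_ratio = sim.std() / obs.std()
    crmsd = np.sqrt(np.mean(((sim - sim.mean()) - (obs - obs.mean())) ** 2))
    return r, sigma_ratio, crmsd

# Hypothetical maize yields (t/ha), for illustration only
obs = [1.1, 0.8, 1.4, 0.9, 1.2]
sim = [1.0, 0.9, 1.3, 1.0, 1.1]
r, ratio, crmsd = taylor_stats(sim, obs)
```

A simulation that damps the observed variability, as LPJmL does relative to APSIM here, shows up as a standard-deviation ratio below one even when the correlation is high.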
Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity in maize cropping systems.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4184260','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4184260"><span>Fast and accurate imputation of summary statistics enhances evidence of functional enrichment</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P.; Patterson, Nick; Price, Alkes L.</p> <p>2014-01-01</p> <p>Motivation: Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. Results: In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1–5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. 
As an empirical demonstration, we apply our method to seven case–control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Availability and implementation: Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu Supplementary information: Supplementary materials are available at Bioinformatics online. 
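The Gaussian imputation described above predicts an untyped variant's association z-score as an LD-weighted linear combination of the typed variants' z-scores, with the reference LD matrix regularized to account for the panel's finite sample size. A schematic numpy sketch, with toy LD values and a hypothetical ridge term `lam` (not the published software's exact implementation):

```python
import numpy as np

def impute_z(z_typed, ld_tt, ld_it, lam=0.1):
    """Impute an untyped variant's z-score from typed z-scores and
    reference-panel LD; lam regularizes the typed-typed LD matrix."""
    ld_tt = np.asarray(ld_tt, float)
    ld_it = np.asarray(ld_it, float)
    w = np.linalg.solve(ld_tt + lam * np.eye(len(ld_tt)), ld_it)
    z_imp = w @ np.asarray(z_typed, float)
    r2 = w @ ld_it  # rough imputation-quality measure
    return z_imp, r2

# Toy example: two typed SNPs in LD with one untyped SNP
ld_tt = [[1.0, 0.3], [0.3, 1.0]]  # LD among typed SNPs
ld_it = [0.8, 0.4]                # LD of untyped SNP with typed SNPs
z_imp, r2 = impute_z([4.0, 2.5], ld_tt, ld_it)
```
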
PMID:24990607</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2879599','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2879599"><span>On the Estimation of Disease Prevalence by Latent Class Models for Screening Studies Using Two Screening Tests with Categorical Disease Status Verified in Test Positives Only</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Chu, Haitao; Zhou, Yijie; Cole, Stephen R.; Ibrahim, Joseph G.</p> <p>2010-01-01</p> <p>Summary To evaluate the probabilities of a disease state, ideally all subjects in a study should be diagnosed by a definitive diagnostic or gold standard test. However, since definitive diagnostic tests are often invasive and expensive, it is generally unethical to apply them to subjects whose screening tests are negative. In this article, we consider latent class models for screening studies with two imperfect binary diagnostic tests and a definitive categorical disease status measured only for those with at least one positive screening test. Specifically, we discuss a conditional independent and three homogeneous conditional dependent latent class models and assess the impact of misspecification of the dependence structure on the estimation of disease category probabilities using frequentist and Bayesian approaches. Interestingly, the three homogeneous dependent models can provide identical goodness-of-fit but substantively different estimates for a given study. 
However, the parametric form of the assumed dependence structure itself is not “testable” from the data, and thus the dependence structure modeling considered here can only be viewed as a sensitivity analysis concerning a more complicated non-identifiable model potentially involving heterogeneous dependence structure. Furthermore, we discuss Bayesian model averaging together with its limitations as an alternative way to partially address this particularly challenging problem. The methods are applied to two cancer screening studies, and simulations are conducted to evaluate the performance of these methods. In summary, further research is needed to reduce the impact of model misspecification on the estimation of disease prevalence in such settings. PMID:20191614</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA595432','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA595432"><span>Comparison of ALE and SPH Methods for Simulating Mine Blast Effects on Structures</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2010-12-01</p> <p>Comparison of ALE and SPH methods for simulating mine blast effects on structures Geneviève Toussaint Amal Bouamoul DRDC Valcartier Defence R&D...Canada – Valcartier Technical Report DRDC Valcartier TR 2010-326 December 2010 Comparison of ALE and SPH methods for simulating mine blast...Valcartier TR 2010-326 iii Executive summary Comparison of ALE and SPH methods for simulating mine blast effects on structures</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20120016029','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20120016029"><span>Data Comm Flight Deck Human-in-the-Loop Simulation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" 
href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Lozito, Sandra; Martin, Lynne Hazel; Sharma, Shivanjli; Kaneshige, John T.; Dulchinos, Victoria</p> <p>2012-01-01</p> <p>This presentation discusses an upcoming simulation for data comm in the terminal area. The purpose of the presentation is to provide the REDAC committee with a summary of some of the work in Data Comm that is being sponsored by the FAA. The focus of the simulation is upon flight crew human performance variables, such as crew procedures, timing and errors. The simulation is scheduled to be conducted in Sept 2012.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA114628','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA114628"><span>Improving the Effectiveness and Acquisition Management of Selected Weapon Systems: A Summary of Major Issues and Recommended Actions.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1982-05-14</p> <p>need for effective training--a situation which will be impaired until the AH-64 combat mission simulator, now under development, becomes available in...antisubmarine warfare system includes the capability to detect, classify, localize, and destroy the enemy. This capability includes multimillion dollar...to simulate combat situations will simulate only air-to-air activity. 
Air-to-ground and electronic counter-countermeasures simulations were deleted</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013JHyd..487...39G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013JHyd..487...39G"><span>Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando</p> <p>2013-04-01</p> <p>Summary: In this work, a fully-continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by a simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model producing a discharge time series that is directly given as an input to a bi-dimensional hydraulic model. The main advantage of the proposed approach is to avoid the use of the design hyetograph and the design hydrograph that constitute the main source of subjective analysis and uncertainty for standard methods. The proposed procedure is optimized for small and ungauged watersheds where empirical models are commonly applied. 
Results of a simple real case study confirm that this experimental fully-continuous framework may pave the way for the implementation of a less subjective and potentially automated procedure for flood hazard mapping.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1079638','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1079638"><span>Using Weather Data and Climate Model Output in Economic Analyses of Climate Change</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Auffhammer, M.; Hsiang, S. M.; Schlenker, W.</p> <p>2013-06-28</p> <p>Economists are increasingly using weather data and climate model output in analyses of the economic impacts of climate change. This article introduces a set of weather data sets and climate models that are frequently used, discusses the most common mistakes economists make in using these products, and identifies ways to avoid these pitfalls. We first provide an introduction to weather data, including a summary of the types of datasets available, and then discuss five common pitfalls that empirical researchers should be aware of when using historical weather data as explanatory variables in econometric applications. 
We then provide a brief overview of climate models and discuss two common and significant errors often made by economists when climate model output is used to simulate the future impacts of climate change on an economic outcome of interest.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3948591','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3948591"><span>Summary on Several Key Techniques in 3D Geological Modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2014-01-01</p> <p>Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized. 
PMID:24772029</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/CFR-2018-title12-vol3/pdf/CFR-2018-title12-vol3-part226-appH.pdf','CFR'); return false;" href="https://www.gpo.gov/fdsys/pkg/CFR-2018-title12-vol3/pdf/CFR-2018-title12-vol3-part226-appH.pdf"><span>12 CFR Appendix H to Part 226 - Closed-End Model Forms and Clauses</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collectionCfr.action?selectedYearFrom=2010&page.go=Go">Code of Federal Regulations, 2010 CFR</a></p> <p></p> <p>2018-01-01</p> <p>... Payment Summary Model Clause (§ 226.18(s)) H-4(F)—Adjustable-Rate Mortgage or Step-Rate Mortgage Interest Rate and Payment Summary Model Clause (§ 226.18(s)) H-4(G)—Mortgage with Negative Amortization Interest Rate and Payment Summary Model Clause (§ 226.18(s)) H-4(H)—Fixed-Rate Mortgage with Interest-Only...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5358897','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5358897"><span>Inferring epidemiological parameters from phylogenies using regression-ABC: A comparative study</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Gascuel, Olivier</p> <p>2017-01-01</p> <p>Inferring epidemiological parameters such as the R0 from time-scaled phylogenies is a timely challenge. Most current approaches rely on likelihood functions, which raise specific issues that range from computing these functions to finding their maxima numerically. 
Here, we present a new regression-based Approximate Bayesian Computation (ABC) approach, which we base on a large variety of summary statistics intended to capture the information contained in the phylogeny and its corresponding lineage-through-time plot. The regression step involves the Least Absolute Shrinkage and Selection Operator (LASSO) method, which is a robust machine learning technique. It allows us to readily deal with the large number of summary statistics, while avoiding resorting to Markov Chain Monte Carlo (MCMC) techniques. To compare our approach to existing ones, we simulated target trees under a variety of epidemiological models and settings, and inferred parameters of interest using the same priors. We found that, for large phylogenies, the accuracy of our regression-ABC is comparable to that of likelihood-based approaches involving birth-death processes implemented in BEAST2. Our approach even outperformed these when inferring the host population size with a Susceptible-Infected-Removed epidemiological model. It also clearly outperformed a recent kernel-ABC approach when assuming a Susceptible-Infected epidemiological model with two host types. Lastly, by re-analyzing data from the early stages of the recent Ebola epidemic in Sierra Leone, we showed that regression-ABC provides more realistic estimates for the duration parameters (latency and infectiousness) than the likelihood-based method. Overall, ABC based on a large variety of summary statistics and a regression method able to perform variable selection and avoid overfitting is a promising approach to analyze large phylogenies. 
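The regression-ABC pipeline sketched in this abstract — simulate under the prior, keep the parameter draws whose summary statistics fall closest to the observed ones, then regression-adjust the retained draws — can be illustrated on a toy one-parameter problem. The simulator and summaries below are stand-ins (the paper simulates phylogenies and uses a LASSO regression step rather than the plain least-squares adjustment shown here):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_summaries(theta, n=100):
    """Stand-in simulator: the paper would simulate an epidemic tree and
    return phylogeny summaries; here we draw a sample and return two
    toy summaries (mean and log-variance)."""
    x = rng.normal(theta, 1.0, n)
    return np.array([x.mean(), np.log(x.var())])

def regression_abc(s_obs, prior_draws, keep=200):
    """Rejection ABC with a linear regression adjustment."""
    sims = np.array([simulate_summaries(t) for t in prior_draws])
    d = np.linalg.norm(sims - s_obs, axis=1)       # distance to observed summaries
    idx = np.argsort(d)[:keep]                     # accept the closest draws
    S, theta = sims[idx], prior_draws[idx]
    X = np.column_stack([np.ones(keep), S - s_obs])
    beta, *_ = np.linalg.lstsq(X, theta, rcond=None)
    return theta - (S - s_obs) @ beta[1:]          # adjusted posterior sample

true_theta = 2.0
s_obs = simulate_summaries(true_theta)
prior = rng.uniform(-5, 5, 2000)
post = regression_abc(s_obs, prior)
```

The regression step is what lets the method cope with many summary statistics at once; with LASSO in place of least squares it also performs the variable selection the abstract highlights.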
PMID:28263987</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_19 --> <div id="page_20" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="381"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24719235','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24719235"><span>"Plateau"-related summary statistics are uninformative for comparing working memory models.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>van den Berg, Ronald; Ma, Wei Ji</p> <p>2014-10-01</p> <p>Performance on visual working memory tasks decreases as more items need to be remembered. 
Over the past decade, a debate has unfolded between proponents of slot models and slotless models of this phenomenon (Ma, Husain, & Bays, Nature Neuroscience 17, 347-356, 2014). Zhang and Luck (Nature 453(7192), 233-235, 2008) and Anderson, Vogel, and Awh (Attention, Perception, & Psychophysics 74(5), 891-910, 2011) noticed that as more items need to be remembered, "memory noise" seems to first increase and then reach a "stable plateau." They argued that three summary statistics characterizing this plateau are consistent with slot models, but not with slotless models. Here, we assess the validity of their methods. We generated synthetic data both from a leading slot model and from a recent slotless model and quantified model evidence using log Bayes factors. We found that the summary statistics provided at most 0.15 % of the expected model evidence in the raw data. In a model recovery analysis, a total of more than a million trials were required to achieve 99 % correct recovery when models were compared on the basis of summary statistics, whereas fewer than 1,000 trials were sufficient when raw data were used. Therefore, at realistic numbers of trials, plateau-related summary statistics are highly unreliable for model comparison. Applying the same analyses to subject data from Anderson et al. (Attention, Perception, & Psychophysics 74(5), 891-910, 2011), we found that the evidence in the summary statistics was at most 0.12 % of the evidence in the raw data and far too weak to warrant any conclusions. The evidence in the raw data, in fact, strongly favored the slotless model. 
These findings call into question claims about working memory that are based on summary statistics.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27070767','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27070767"><span>The Carbon-Water Interface: Modeling Challenges and Opportunities for the Water-Energy Nexus.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Striolo, Alberto; Michaelides, Angelos; Joly, Laurent</p> <p>2016-06-07</p> <p>Providing clean water and sufficient affordable energy to all without compromising the environment is a key priority in the scientific community. Many recent studies have focused on carbon-based devices in the hope of addressing this grand challenge, justifying and motivating detailed studies of water in contact with carbonaceous materials. Such studies are becoming increasingly important because of the miniaturization of newly proposed devices, with ubiquitous nanopores, large surface-to-volume ratio, and many, perhaps most of the water molecules in contact with a carbon-based surface. In this brief review, we discuss some recent advances obtained via simulations and experiments in the development of carbon-based materials for applications in water desalination. We suggest possible ways forward, with particular emphasis on the synergistic combination of experiments and simulations, with simulations now sometimes offering sufficient accuracy to provide fundamental insights. 
We also point the interested reader to recent works that complement our short summary on the state of the art of this important and fascinating field.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1291182','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1291182"><span>Proof-of-Concept Study for Uncertainty Quantification and Sensitivity Analysis using the BRL Shaped-Charge Example</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Hughes, Justin Matthew</p> <p></p> <p>These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL Shaped-Charge Geometry in PAGOSA, mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) Parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute force Monte Carlo (MC) random sampling method. 
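The brute-force Monte Carlo step described above — sample the uncertain inputs, push each draw through the model, and report the central 95% data range about the median of the output — can be sketched as follows. The surrogate function and the input distributions are invented stand-ins for illustration, not the PAGOSA surrogate or the study's JWL parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_uncertainty(model, sample_params, n=10000):
    """Brute-force Monte Carlo UQ: sample inputs, evaluate the model,
    and report the median plus the central 95% range of the output."""
    out = np.array([model(p) for p in sample_params(rng, n)])
    lo, med, hi = np.percentile(out, [2.5, 50.0, 97.5])
    return med, (lo, hi)

# Hypothetical linear surrogate: jet-tip velocity vs. two inputs
def surrogate(p):
    det_vel, density = p
    return 0.9 * det_vel - 2.0 * density  # illustrative only

def sample_params(rng, n):
    det_vel = rng.normal(7.8, 0.1, n)     # km/s, assumed spread
    density = rng.normal(1.63, 0.02, n)   # g/cm^3, assumed spread
    return np.column_stack([det_vel, density])

med, (lo, hi) = mc_uncertainty(surrogate, sample_params)
```
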
Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method, where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA202493','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA202493"><span>Guidance and Control Systems Simulation and Validation Techniques</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1988-07-01</p> <p>AGARDograph No. 273 GUIDANCE AND CONTROL SYSTEMS SIMULATION AND VALIDATION TECHNIQUES Edited by Dr William P. Albritton, Jr AMTEC Corporation 213 Ridgelawn...AND DEVELOPMENT PROCESS FOR TACTICAL GUIDED WEAPONS by Dr W. P. Albritton, Jr AMTEC Corporation 213 Ridgelawn Drive Athens, AL 35611, USA Summary A brief</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24532731','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24532731"><span>Neandertal admixture in Eurasia confirmed by maximum-likelihood analysis of three genomes.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Lohse, Konrad; Frantz, Laurent A F</p> <p>2014-04-01</p> <p>Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. 
Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4-7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3982695','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3982695"><span>Neandertal Admixture in Eurasia Confirmed by Maximum-Likelihood Analysis of Three Genomes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Lohse, Konrad; Frantz, Laurent A. F.</p> <p>2014-01-01</p> <p>Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. 
Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4−7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination. PMID:24532731</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013CoPhC.184.2840B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013CoPhC.184.2840B"><span>An atomistic geometrical model of the B-DNA configuration for DNA-radiation interaction simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bernal, M. A.; Sikansi, D.; Cavalcante, F.; Incerti, S.; Champion, C.; Ivanchenko, V.; Francis, Z.</p> <p>2013-12-01</p> <p>In this paper, an atomistic geometrical model for the B-DNA configuration is explained. This model accounts for five organization levels of the DNA, up to the 30 nm chromatin fiber. However, fragments of this fiber can be used to construct the whole genome. The algorithm developed in this work is capable to determine which is the closest atom with respect to an arbitrary point in space. 
It can be used in any application in which a DNA geometrical model is needed, for instance, in investigations related to the effects of ionizing radiation on the human genetic material. Successful consistency checks were carried out to test the proposed model. Catalogue identifier: AEPZ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEPZ_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1245 No. of bytes in distributed program, including test data, etc.: 6574 Distribution format: tar.gz Programming language: FORTRAN. Computer: Any. Operating system: Multi-platform. RAM: 2 GB Classification: 3. Nature of problem: The Monte Carlo method is used to simulate the interaction of ionizing radiation with the human genetic material in order to determine DNA damage yields per unit absorbed dose. To accomplish this task, an algorithm to determine if a given energy deposition lies within a given target is needed. This target can be an atom or any other structure of the genetic material. Solution method: This is a stand-alone subroutine describing an atomic-resolution geometrical model of the B-DNA configuration. It is able to determine the closest atom to an arbitrary point in space. This model accounts for five organization levels of the human genetic material, from the nucleotide pair up to the 30 nm chromatin fiber. This subroutine carries out a series of coordinate transformations to find the closest atom containing an arbitrary point in space. Atom sizes are set according to the corresponding van der Waals radii. Restrictions: The geometrical model presented here does not include the chromosome organization level, but it could easily be built up from fragments of the 30 nm chromatin fiber.
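The closest-atom query at the heart of this program summary can be sketched in a few lines. The following is an illustrative Python stand-in, not the distributed FORTRAN subroutine; the radii table and the two-atom geometry are hypothetical examples.

```python
# Illustrative sketch of a closest-atom query with a van der Waals "hit" test.
# Not the CPC FORTRAN subroutine; coordinates and radii below are examples.
import math

VDW_RADII = {"H": 1.20, "C": 1.70, "N": 1.55, "O": 1.52, "P": 1.80}  # angstroms

def closest_atom(point, atoms):
    """Return (index, distance) of the atom nearest to `point`.

    `atoms` is a list of (element, (x, y, z)) tuples.
    """
    best_i, best_d = -1, float("inf")
    for i, (_elem, xyz) in enumerate(atoms):
        d = math.dist(point, xyz)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

def hit(point, atoms):
    """True if `point` lies inside the van der Waals sphere of its nearest atom."""
    i, d = closest_atom(point, atoms)
    return d <= VDW_RADII[atoms[i][0]]

atoms = [("C", (0.0, 0.0, 0.0)), ("O", (4.0, 0.0, 0.0))]
print(hit((0.5, 0.0, 0.0), atoms))  # inside the carbon sphere -> True
print(hit((2.1, 0.0, 0.0), atoms))  # in the gap between the spheres -> False
```

An energy deposition would be scored against the genetic material exactly when `hit` returns True; a production code would replace the linear scan with the hierarchical coordinate transformations the summary describes.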
Unusual features: To our knowledge, this is the first open-source atomic-resolution DNA geometrical model developed for DNA-radiation interaction Monte Carlo simulations. In our tests, the current model took into account the explicit positions of about 56×10⁶ atoms, although the user may increase this number as needed. Running time: This subroutine can process about 2 million points within a few minutes on a typical modern computer.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4907397','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4907397"><span>BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H.A.; Hlavacek, William S.; Posner, Richard G.</p> <p>2016-01-01</p> <p>Summary: Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. Availability and implementation: BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary information: Supplementary data are available at Bioinformatics online.
Contact: bionetgen.help@gmail.com PMID:26556387</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5744449','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5744449"><span>Grid Block Design Based on Monte Carlo Simulated Dosimetry, the Linear Quadratic and Hug–Kellerer Radiobiological Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Gholami, Somayeh; Nedaie, Hassan Ali; Longo, Francesco; Ay, Mohammad Reza; Dini, Sharifeh A.; Meigooni, Ali S.</p> <p>2017-01-01</p> <p>Purpose: The clinical efficacy of Grid therapy has been examined by several investigators. In this project, the hole diameter and hole spacing in Grid blocks were examined to determine the optimum parameters that give a therapeutic advantage. Methods: The evaluations were performed using Monte Carlo (MC) simulation and commonly used radiobiological models. The Geant4 MC code was used to simulate the dose distributions for 25 different Grid blocks with different hole diameters and center-to-center spacing. The therapeutic parameters of these blocks, namely, the therapeutic ratio (TR) and geometrical sparing factor (GSF) were calculated using two different radiobiological models, including the linear quadratic and Hug–Kellerer models. In addition, the ratio of the open to blocked area (ROTBA) is also used as a geometrical parameter for each block design. Comparisons of the TR, GSF, and ROTBA for all of the blocks were used to derive the parameters for an optimum Grid block with the maximum TR, minimum GSF, and optimal ROTBA. A sample of the optimum Grid block was fabricated at our institution. 
Dosimetric characteristics of this Grid block were measured using an ionization chamber in water phantom, Gafchromic film, and thermoluminescent dosimeters in Solid Water™ phantom materials. Results: The results of these investigations indicated that Grid blocks with hole diameters between 1.00 and 1.25 cm and spacing of 1.7 or 1.8 cm have optimal therapeutic parameters (TR > 1.3 and GSF~0.90). The measured dosimetric characteristics of the optimum Grid blocks including dose profiles, percentage depth dose, dose output factor (cGy/MU), and valley-to-peak ratio were in good agreement (±5%) with the simulated data. Conclusion: In summary, using MC-based dosimetry, two radiobiological models, and previously published clinical data, we have introduced a method to design a Grid block with optimum therapeutic response. The simulated data were reproduced by experimental data. PMID:29296035</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017GeoRL..44.6124B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017GeoRL..44.6124B"><span>Uncertain soil moisture feedbacks in model projections of Sahel precipitation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Berg, Alexis; Lintner, Benjamin R.; Findell, Kirsten; Giannini, Alessandra</p> <p>2017-06-01</p> <p>Given the uncertainties in climate model projections of Sahel precipitation, at the northern edge of the West African Monsoon, understanding the factors governing projected precipitation changes in this semiarid region is crucial. 
This study investigates how long-term soil moisture changes projected under climate change may feed back on projected changes of Sahel rainfall, using simulations with and without soil moisture change from five climate models participating in the Global Land Atmosphere Coupling Experiment-Coupled Model Intercomparison Project phase 5 experiment. In four out of five models analyzed, soil moisture feedbacks significantly influence the projected West African precipitation response to warming; however, the sign of these feedbacks differs across the models. These results demonstrate that reducing uncertainties across model projections of the West African Monsoon requires, among other factors, improved mechanistic understanding and constraint of simulated land-atmosphere feedbacks, even at the large spatial scales considered here. Plain Language Summary: Climate model projections of Sahel rainfall remain notoriously uncertain; understanding the physical processes responsible for this uncertainty is thus crucial. Our study focuses on analyzing the feedbacks of soil moisture changes on model projections of the West African Monsoon under global warming. Soil moisture-atmosphere interactions have been shown in prior studies to play an important role in this region, but the potential feedbacks of long-term soil moisture changes on projected precipitation changes have not been investigated specifically. To isolate these feedbacks, we use targeted simulations from five climate models, with and without soil moisture change. Importantly, we find that climate models exhibit soil moisture-precipitation feedbacks of different sign in this region: in some models soil moisture changes amplify precipitation changes (positive feedback), in others they dampen them (negative feedback). The impact of those feedbacks is in some cases of comparable amplitude to the projected precipitation changes themselves.
In other words, we show, over a subset of climate models, how land-atmosphere interactions may be a cause of uncertainty in model projections of precipitation; we emphasize the need to evaluate these processes carefully in current and next-generation climate model simulations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014BGeo...11.1261H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014BGeo...11.1261H"><span>Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.</p> <p>2014-02-01</p> <p>Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. 
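As a toy illustration of a likelihood generated directly from stochastic simulations (a one-parameter Gaussian model with invented values, not FORMIND), the sketch below fits a parametric normal approximation to simulated summary statistics and uses it inside a Metropolis sampler:

```python
# Toy sketch of a simulation-based ("synthetic") likelihood in a Metropolis
# sampler: for each proposed parameter, simulate the stochastic model several
# times, fit a Gaussian to the simulated summary statistic, and evaluate the
# observed statistic under it. DATA_MEAN and all settings are hypothetical.
import math
import random
import statistics

random.seed(0)
DATA_MEAN = 3.0  # observed summary statistic (invented for this sketch)

def simulate_summary(theta, n=50):
    """Stochastic model: n noisy observations around theta; return their mean."""
    return statistics.fmean(random.gauss(theta, 1.0) for _ in range(n))

def synthetic_loglik(theta, n_sims=30):
    sims = [simulate_summary(theta) for _ in range(n_sims)]
    mu, sd = statistics.fmean(sims), statistics.stdev(sims)
    return -0.5 * ((DATA_MEAN - mu) / sd) ** 2 - math.log(sd)

def metropolis(n_steps=500, theta=0.0, step=0.5):
    ll = synthetic_loglik(theta)
    chain = []
    for _ in range(n_steps):
        prop = theta + random.gauss(0.0, step)
        ll_prop = synthetic_loglik(prop)
        if math.log(random.random()) < ll_prop - ll:  # flat prior assumed
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain

chain = metropolis()
print(round(statistics.fmean(chain[100:]), 2))  # posterior mean near 3.0
```

Replacing the Gaussian summary here with forest inventory statistics and the simulator with a stochastic forest model gives the general shape of the approach described above.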
We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19920008743','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19920008743"><span>An investigation of air transportation technology at the Massachusetts Institute of Technology, 1990-1991</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Simpson, Robert W.</p> <p>1991-01-01</p> <p>Brief summaries are given of research activities at the Massachusetts Institute of Technology (MIT) under the sponsorship of the FAA/NASA Joint University Program. 
Topics covered include hazard assessment and cockpit presentation issues for microburst alerting systems; the situational awareness effect of automated air traffic control (ATC) datalink clearance amendments; a graphical simulation system for adaptive, automated approach spacing; an expert system for temporal planning with application to runway configuration management; deterministic multi-zone ice accretion modeling; alert generation and cockpit presentation for an integrated microburst alerting system; and passive infrared ice detection for helicopter applications.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19740024186','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19740024186"><span>Asymmetrical booster ascent guidance and control system design study. Volume 1: Summary. [space shuttle development</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Williams, F. E.; Lemon, R. S.; Jaggers, R. F.; Wilson, J. L.</p> <p>1974-01-01</p> <p>Dynamics and control, stability, and guidance analyses are summarized for the asymmetrical booster ascent guidance and control system design studies, performed in conjunction with space shuttle planning. 
The mathematical models developed for use in rigid body and flexible body versions of the NASA JSC space shuttle functional simulator are briefly discussed, along with information on the following: (1) space shuttle stability analysis using equations of motion for both pitch and lateral axes; (2) the computer program used to obtain stability margin; and (3) the guidance equations developed for the space shuttle powered flight phases.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19950020723','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19950020723"><span>The lift-fan aircraft: Lessons learned</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Deckert, Wallace H.</p> <p>1995-01-01</p> <p>This report summarizes the highlights and results of a workshop held at NASA Ames Research Center in October 1992. The objective of the workshop was a thorough review of the lessons learned from past research on lift fans, and lift-fan aircraft, models, designs, and components. The scope included conceptual design studies, wind tunnel investigations, propulsion systems components, piloted simulation, flight of aircraft such as the SV-5A and SV-5B and a recent lift-fan aircraft development project. 
The report includes a brief summary of five technical presentations that addressed the subject “The Lift-Fan Aircraft: Lessons Learned.”</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3653619','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3653619"><span>Neuronal Morphology goes Digital: A Research Hub for Cellular and System Neuroscience</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Parekh, Ruchi; Ascoli, Giorgio A.</p> <p>2013-01-01</p> <p>Summary: The importance of neuronal morphology in brain function has been recognized for over a century. The broad applicability of “digital reconstructions” of neuron morphology across neuroscience sub-disciplines has stimulated the rapid development of numerous synergistic tools for data acquisition, anatomical analysis, three-dimensional rendering, electrophysiological simulation, growth models, and data sharing. Here we discuss the processes of histological labeling, microscopic imaging, and semi-automated tracing. Moreover, we provide an annotated compilation of currently available resources in this rich research “ecosystem” as a central reference for experimental and computational neuroscience. PMID:23522039</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA595727','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA595727"><span>Internal Blast in a Compartment-type Vessel. Part 1: Finite Element Modeling Investigation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2012-11-01</p> <p>Experimental data should also be used to validate the model.
DRDC Valcartier TM 2012-222 iii Executive summary Internal...to adequately simulate the explosion blast, including the shock and its reflections, inside a large structure, and the need to remedy...term, the development and validation of a ‘raytracer’ and its coupling with LS-DYNA will be studied to model the shock in the compartment</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.fs.usda.gov/treesearch/pubs/14411','TREESEARCH'); return false;" href="https://www.fs.usda.gov/treesearch/pubs/14411"><span>Simulating Silvicultural Treatments Using FIA Data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.fs.usda.gov/treesearch/">Treesearch</a></p> <p>Christopher W. Woodall; Carl E. Fiedler</p> <p>2005-01-01</p> <p>Potential uses of the Forest Inventory and Analysis Database (FIADB) extend far beyond descriptions and summaries of current forest resources. Silvicultural treatments, although typically conducted at the stand level, may be simulated using the FIADB for predicting future forest conditions and resources at broader scales.
In this study, silvicultural prescription...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://files.eric.ed.gov/fulltext/ED289002.pdf','ERIC'); return false;" href="http://files.eric.ed.gov/fulltext/ED289002.pdf"><span>Aircrew Training Devices: Utility and Utilization of Advanced Instructional Features (Phase IV--Summary Report).</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Polzella, Donald J.; And Others</p> <p></p> <p>Modern aircrew training devices (ATDs) are equipped with sophisticated hardware and software capabilities, known as advanced instructional features (AIFs), that permit a simulator instructor to prepare briefings, manage training, vary task difficulty/fidelity, monitor performance, and provide feedback for flight simulation training missions. The…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/FR-2012-04-20/pdf/2012-9556.pdf','FEDREG'); return false;" href="https://www.gpo.gov/fdsys/pkg/FR-2012-04-20/pdf/2012-9556.pdf"><span>77 FR 23668 - GPS Satellite Simulator Working Group Notice of Meeting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=FR">Federal Register 2010, 2011, 2012, 2013, 2014</a></p> <p></p> <p>2012-04-20</p> <p>... DEPARTMENT OF DEFENSE Department of the Air Force GPS Satellite Simulator Working Group Notice of Meeting AGENCY: The United States Air Force. ACTION: Meeting Notice. SUMMARY: This meeting notice is to..., telephone number, address and security clearance information. Wayne T. 
Urubio, 2nd Lieutenant, USAF, SMC/GPE...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://rosap.ntl.bts.gov/view/dot/36200','DOTNTL'); return false;" href="https://rosap.ntl.bts.gov/view/dot/36200"><span>Advanced vehicle technology simulation and research outreach to STEM programs : research report summary</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntlsearch.bts.gov/tris/index.do">DOT National Transportation Integrated Search</a></p> <p></p> <p>2017-05-30</p> <p>The University of Iowa (UI) and the leaders of the MyCarDoesWhat campaign partnered with the National Advanced Driving Simulator (NADS) miniSim and the UI Mobile Museum to build an interactive exhibit as part of the overall museum for visitors to exp...</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_20 --> <div id="page_21" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12">
<ol class="result-class" start="401"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://rosap.ntl.bts.gov/view/dot/23568','DOTNTL'); return false;" href="https://rosap.ntl.bts.gov/view/dot/23568"><span>Finite element simulation of structural performance on flexible pavements with stabilized base/treated subbase materials under accelerated loading : tech summary.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntlsearch.bts.gov/tris/index.do">DOT National Transportation Integrated Search</a></p> <p></p> <p>2011-12-01</p> <p>Accelerated pavement testing (APT) has been increasingly used by state highway agencies in recent years for evaluating pavement design and performance through applying a simulative heavy vehicular load to the pavement section under controlled fiel...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4715654','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4715654"><span>A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B</p> <p>2015-01-01</p> <p>Summary: We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO.
In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.usgs.gov/wri/1983/4251/report.pdf','USGSPUBS'); return false;" href="https://pubs.usgs.gov/wri/1983/4251/report.pdf"><span>A modification of the finite-difference model for simulation of two dimensional ground-water flow to include surface-ground water relationships</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Ozbilgin, M.M.; Dickerman, D.C.</p> <p>1984-01-01</p> <p>The two-dimensional finite-difference model for simulation of groundwater flow was modified to enable simulation of surface-water/groundwater interactions during periods of low streamflow. Changes were made to the program code in order to calculate surface-water heads for, and flow either to or from, contiguous surface-water bodies; and to allow for more convenient data input. Methods of data input and output were modified and entries (RSORT and HDRIVER) were added to the COEF and CHECKI subroutines to calculate surface-water heads. A new subroutine CALC was added to the program which initiates surface-water calculations. If CALC is not specified as a simulation option, the program runs the original version. The subroutines which solve the ground-water flow equations were not changed. Recharge, evapotranspiration, surface-water inflow, number of wells, pumping rate, and pumping duration can be varied for any time period. The Manning formula was used to relate stream depth and discharge in surface-water streams. 
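The Manning relation mentioned above is Q = (1/n) A R^(2/3) S^(1/2) in SI units. A minimal sketch for a hypothetical rectangular channel follows; the width, roughness n, and slope are illustrative values, not numbers from the USGS report, and the inverse (depth from discharge) is recovered by bisection since the relation is monotone in depth:

```python
# Minimal sketch of the Manning relation Q = (1/n) * A * R**(2/3) * sqrt(S)
# (SI units) for a rectangular channel. Channel dimensions and roughness
# below are hypothetical, not values from the report.

def manning_discharge(depth, width=10.0, n=0.035, slope=0.001):
    """Discharge (m^3/s) for a rectangular channel at the given flow depth (m)."""
    area = width * depth                   # flow area A
    radius = area / (width + 2.0 * depth)  # hydraulic radius R = A / wetted perimeter
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5

def depth_from_discharge(q, lo=1e-6, hi=50.0, tol=1e-9, **kw):
    """Invert the monotone depth -> discharge relation by bisection."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if manning_discharge(mid, **kw) < q:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

q = manning_discharge(1.2)
print(round(q, 3), round(depth_from_discharge(q), 6))
```

This depth-discharge coupling is what lets a stage value stand in for streamflow in the surface-water mass balance.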
Interactions between surface water and ground water are represented by the leakage term in the ground-water flow and surface-water mass balance equations. Documentation includes a flow chart, data deck instructions, input data, output summary, and program listing. Numerical results from the modified program are in good agreement with published analytical results. (USGS)</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1987emc..symp.....M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1987emc..symp.....M"><span>A research program to assess the impact of the electromagnetic pulse on electric power systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>McConnell, B. W.; Barnes, P. R.</p> <p></p> <p>A strong electromagnetic pulse (EMP) with an electric-field component on the order of tens of kilovolts per meter is produced by a nuclear detonation in or above the atmosphere. This paper presents an overview and a summary of the results to date of a program formulated to address the research and development of technologies and systems required to assess and reduce the impact of EMP on electric power systems. The technologies and systems being considered include simulation models, methods of assessment, definition of required experiments and data, development of protective hardware, and the creation or revision of operating and control procedures. Results to date include the development of relatively simple unclassified EMP environment models, the development of methods for extending EMP coupling models to the large transmission and distribution network associated with the electric power system, and the performance of a parametric study of HEMP induced surges using an appropriate EMP environment. 
An experiment to investigate the effect of corona on the coupling of EMP to conductors has been defined and has been performed in an EMP simulator. Experiments to determine the response of key components to simulated EMP surges and an investigation of the impact of steep-front, short-duration impulses on a selected number of the insulation systems used in electric power systems apparatus are being performed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1349754','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1349754"><span>Comparisons of RELAP5-3D Analyses to Experimental Data from the Natural Convection Shutdown Heat Removal Test Facility</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Bucknor, Matthew; Hu, Rui; Lisowski, Darius</p> <p>2016-04-17</p> <p>The Reactor Cavity Cooling System (RCCS) is an important passive safety system being incorporated into the overall safety strategy for high temperature advanced reactor concepts such as the High Temperature Gas-Cooled Reactors (HTGR). The Natural Convection Shutdown Heat Removal Test Facility (NSTF) at Argonne National Laboratory (Argonne) reflects a 1/2-scale model of the primary features of one conceptual air-cooled RCCS design. The project conducts ex-vessel, passive heat removal experiments in support of Department of Energy Office of Nuclear Energy’s Advanced Reactor Technology (ART) program, while also generating data for code validation purposes. While experiments are being conducted at the NSTF to evaluate the feasibility of the passive RCCS, parallel modeling and simulation efforts are ongoing to support the design, fabrication, and operation of these natural convection systems.
Both system-level and high-fidelity computational fluid dynamics (CFD) analyses were performed to gain a complete understanding of the complex flow and heat transfer phenomena in natural convection systems. This paper provides a summary of the RELAP5-3D NSTF model development efforts and provides comparisons between simulation results and experimental data from the NSTF. Overall, the simulation results compared favorably to the experimental data; however, further analyses need to be conducted to investigate any identified differences.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009JHyd..367..138J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009JHyd..367..138J"><span>A root zone modelling approach to estimating groundwater recharge from irrigated areas</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Jiménez-Martínez, J.; Skaggs, T. H.; van Genuchten, M. Th.; Candela, L.</p> <p>2009-03-01</p> <p>Summary: In irrigated semi-arid and arid regions, accurate knowledge of groundwater recharge is important for the sustainable management of scarce water resources. The Campo de Cartagena area of southeast Spain is a semi-arid region where irrigation return flow accounts for a substantial portion of recharge. In this study we estimated irrigation return flow using a root zone modelling approach in which irrigation, evapotranspiration, and soil moisture dynamics for specific crops and irrigation regimes were simulated with the HYDRUS-1D software package. The model was calibrated using field data collected in an experimental plot. Good agreement was achieved between the HYDRUS-1D simulations and field measurements made under melon and lettuce crops. The simulations indicated that water use by the crops was below potential levels despite regular irrigation.
The fraction of applied water (irrigation plus precipitation) going to recharge ranged from 22% for a summer melon crop to 68% for a fall lettuce crop. In total, we estimate that irrigation of annual fruits and vegetables produces 26 hm³ y⁻¹ of groundwater recharge to the top unconfined aquifer. This estimate does not include important irrigated perennial crops in the region, such as artichoke and citrus. Overall, the results suggest a greater amount of irrigation return flow in the Campo de Cartagena region than was previously estimated.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010SPIE.7694E..15C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010SPIE.7694E..15C"><span>The layered sensing operations center: a modeling and simulation approach to developing complex ISR networks</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Curtis, Christopher; Lenzo, Matthew; McClure, Matthew; Preiss, Bruce</p> <p>2010-04-01</p> <p>In order to anticipate the constantly changing landscape of global warfare, the United States Air Force must acquire new capabilities in the field of Intelligence, Surveillance, and Reconnaissance (ISR). To meet this challenge, the Air Force Research Laboratory (AFRL) is developing a unifying construct of "Layered Sensing" which will provide military decision-makers at all levels with the timely, actionable, and trusted information necessary for complete battlespace awareness. Layered Sensing is characterized by the appropriate combination of sensors and platforms (including those for persistent sensing), infrastructure, and exploitation capabilities to enable this synergistic awareness. To achieve the Layered Sensing vision, AFRL is pursuing a Modeling & Simulation (M&S) strategy through the Layered Sensing Operations Center (LSOC). 
An experimental ISR system-of-systems test-bed, the LSOC integrates DoD standard simulation tools with commercial, off-the-shelf video game technology for rapid scenario development and visualization. These tools will help facilitate sensor management performance characterization, system development, and operator behavioral analysis. Flexible and cost-effective, the LSOC will implement a non-proprietary, open-architecture framework with well-defined interfaces. This framework will incentivize the transition of current ISR performance models to service-oriented software design for maximum re-use and consistency. This paper will present the LSOC's development and implementation thus far as well as a summary of lessons learned and future plans for the LSOC.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://onlinelibrary.wiley.com/doi/10.1111/j.2041-210X.2011.00113.x/abstract','USGSPUBS'); return false;" href="http://onlinelibrary.wiley.com/doi/10.1111/j.2041-210X.2011.00113.x/abstract"><span>Accounting for non-independent detection when estimating abundance of organisms with a Bayesian approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Martin, Julien; Royle, J. Andrew; MacKenzie, Darryl I.; Edwards, Holly H.; Kery, Marc; Gardner, Beth</p> <p>2011-01-01</p> <p>Summary 1. Binomial mixture models use repeated count data to estimate abundance. They are becoming increasingly popular because they provide a simple and cost-effective way to account for imperfect detection. However, these models assume that individuals are detected independently of each other. This assumption may often be violated in the field. For instance, manatees (Trichechus manatus latirostris) may surface in turbid water (i.e. become available for detection during aerial surveys) in a correlated manner (i.e. in groups). 
However, correlated behaviour, affecting the non-independence of individual detections, may also be relevant in other systems (e.g. correlated patterns of singing in birds and amphibians). 2. We extend binomial mixture models to account for correlated behaviour and therefore to account for non-independent detection of individuals. We simulated correlated behaviour using beta-binomial random variables. Our approach can be used to simultaneously estimate abundance, detection probability and a correlation parameter. 3. Fitting binomial mixture models to data that followed a beta-binomial distribution resulted in an overestimation of abundance even for moderate levels of correlation. In contrast, the beta-binomial mixture model performed considerably better in our simulation scenarios. We also present a goodness-of-fit procedure to evaluate the fit of beta-binomial mixture models. 4. We illustrate our approach by fitting both binomial and beta-binomial mixture models to aerial survey data of manatees in Florida. We found that the binomial mixture model did not fit the data, whereas there was no evidence of lack of fit for the beta-binomial mixture model. This example helps illustrate the importance of using simulations and assessing goodness-of-fit when analysing ecological data with N-mixture models. Indeed, both the simulations and the goodness-of-fit procedure highlighted the limitations of the standard binomial mixture model for aerial manatee surveys. 5. Overestimation of abundance by binomial mixture models owing to non-independent detections is problematic for ecological studies, but also for conservation. For example, in the case of endangered species, it could lead to inappropriate management decisions, such as downlisting. 
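The beta-binomial detection mechanism described above can be sketched in a few lines. The sketch below is illustrative only: the `simulate_counts` helper, the parameter values, and the parameterization of the Beta distribution are assumptions, not taken from the study. A detection probability shared by all individuals in a survey induces the within-survey correlation.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_counts(n_sites, n_visits, lam, p_mean, rho, rng):
    """Simulate repeated counts with correlated (non-independent) detection.

    A detection probability is drawn per site-visit from a Beta
    distribution with mean p_mean and variance rho * p_mean * (1 - p_mean),
    and is shared by all individuals surveyed together; small rho
    approaches the standard binomial N-mixture model.
    """
    a = p_mean * (1.0 / rho - 1.0)
    b = (1.0 - p_mean) * (1.0 / rho - 1.0)
    N = rng.poisson(lam, size=n_sites)            # latent abundance per site
    p = rng.beta(a, b, size=(n_sites, n_visits))  # shared detection prob per survey
    return rng.binomial(N[:, None], p)            # observed counts

counts = simulate_counts(n_sites=200, n_visits=5, lam=20, p_mean=0.4, rho=0.3, rng=rng)
print(counts.mean(), counts.var())  # counts are overdispersed relative to binomial
```

Fitting a standard binomial N-mixture model to counts generated this way is the scenario the simulations show to overestimate abundance.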
These issues will be increasingly relevant as more ecologists apply flexible N-mixture models to ecological data.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5848496','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5848496"><span>Computational Approaches to Chemical Hazard Assessment</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Luechtefeld, Thomas; Hartung, Thomas</p> <p>2018-01-01</p> <p>Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration Evaluation Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. 
PMID:29101769</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5079144','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5079144"><span>Modelling of Batch Lactic Acid Fermentation in
the Presence of Anionic Clay</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Jinescu, Cosmin; Aruş, Vasilica Alisa; Nistor, Ileana Denisa</p> <p>2014-01-01</p> <p>Summary Batch fermentation of milk inoculated with lactic acid bacteria was conducted in the presence of hydrotalcite-type anionic clay under static and ultrasonic conditions. An experimental study of the effect of fermentation temperature (t=38–43 °C), clay/milk ratio (R=1–7.5 g/L) and ultrasonic field (ν=0 and 35 kHz) on process dynamics was performed. A mathematical model was selected to describe the fermentation process kinetics and its parameters were estimated based on experimental data. A good agreement between the experimental and simulated results was achieved. Consequently, the model can be employed to predict the dynamics of batch lactic acid fermentation with values of process variables in the studied ranges. A statistical analysis of the data based on a 2³ factorial experiment was performed in order to express experimental and model-regressed process responses depending on t, R and ν factors. 
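A 2³ factorial layout of the kind described (factors t, R and ν, each at a low and a high level) can be illustrated with a short sketch. The response values below are invented for illustration and do not come from the study.

```python
from itertools import product

import numpy as np

# Low/high levels for the three factors studied (from the abstract)
factors = {
    "t (degC)": (38, 43),  # fermentation temperature
    "R (g/L)": (1, 7.5),   # clay/milk ratio
    "nu (kHz)": (0, 35),   # ultrasonic field
}

# Full 2^3 design: 8 runs with coded -1/+1 levels
design = np.array(list(product((-1, 1), repeat=3)))

# Hypothetical responses y for the 8 runs (illustration only)
y = np.array([5.1, 5.9, 4.8, 6.2, 5.5, 6.8, 5.0, 6.9])

# Main effect of a factor = mean response at +1 minus mean response at -1
effects = {
    name: y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    for j, name in enumerate(factors)
}
print(effects)
```

With all 8 factor combinations run, the main effects (and, with the same data, the interactions) can be estimated by simple contrasts, which is what makes the 2³ design attractive for screening process variables.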
PMID:27904318</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3061242','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3061242"><span>Improved Doubly Robust Estimation when Data are Monotonely Coarsened, with Application to Longitudinal Studies with Dropout</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Tsiatis, Anastasios A.; Davidian, Marie; Cao, Weihua</p> <p>2010-01-01</p> <p>Summary A routine challenge is that of making inference on parameters in a statistical model of interest from longitudinal data subject to drop out, which are a special case of the more general setting of monotonely coarsened data. Considerable recent attention has focused on doubly robust estimators, which in this context involve positing models for both the missingness (more generally, coarsening) mechanism and aspects of the distribution of the full data, that have the appealing property of yielding consistent inferences if only one of these models is correctly specified. Doubly robust estimators have been criticized for potentially disastrous performance when both of these models are even only mildly misspecified. We propose a doubly robust estimator applicable in general monotone coarsening problems that achieves comparable or improved performance relative to existing doubly robust methods, which we demonstrate via simulation studies and by application to data from an AIDS clinical trial. 
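For readers unfamiliar with double robustness, a minimal sketch of the standard augmented inverse-probability-weighted (AIPW) estimator of a mean under missing-at-random dropout is given below. This illustrates the general idea only, not the improved estimator proposed in the paper; all data are simulated, and for simplicity the true propensity is used in place of a fitted missingness model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with dropout at a single occasion:
# covariate X, outcome Y observed only when R == 1 (true E[Y] = 2.0)
n = 10_000
X = rng.normal(size=n)
Y = 2.0 + 1.5 * X + rng.normal(size=n)
pi = 1.0 / (1.0 + np.exp(-(0.5 + X)))  # true missingness model P(R = 1 | X)
R = rng.binomial(1, pi)

# Working outcome regression m(X), fit on complete cases only
slope, intercept = np.polyfit(X[R == 1], Y[R == 1], 1)
m = intercept + slope * X

# AIPW (doubly robust) estimator of E[Y]: consistent if either the
# propensity model or the outcome regression is correctly specified
mu_dr = np.mean(R * Y / pi - (R - pi) / pi * m)
mu_cc = Y[R == 1].mean()  # naive complete-case mean, biased upward here

print(mu_dr, mu_cc)
```

Because dropout depends on X, the complete-case mean is biased, while the AIPW estimate recovers the true mean; the paper's contribution concerns how to fit the working models so that performance degrades gracefully when both are misspecified.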
PMID:20731640</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/19897817','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/19897817"><span>Getting more from accuracy and response time data: methods for fitting the linear ballistic accumulator.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Donkin, Chris; Averell, Lee; Brown, Scott; Heathcote, Andrew</p> <p>2009-11-01</p> <p>Cognitive models of the decision process provide greater insight into response time and accuracy than do standard ANOVA techniques. However, such models can be mathematically and computationally difficult to apply. We provide instructions and computer code for three methods for estimating the parameters of the linear ballistic accumulator (LBA), a new and computationally tractable model of decisions between two or more choices. These methods-a Microsoft Excel worksheet, scripts for the statistical program R, and code for implementation of the LBA into the Bayesian sampling software WinBUGS-vary in their flexibility and user accessibility. We also provide scripts in R that produce a graphical summary of the data and model predictions. In a simulation study, we explored the effect of sample size on parameter recovery for each method. 
The materials discussed in this article may be downloaded as a supplement from http://brm.psychonomic-journals.org/content/supplemental.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70014130','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70014130"><span>Chemical reactions simulated by ground-water-quality models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Grove, David B.; Stollenwerk, Kenneth G.</p> <p>1987-01-01</p> <p>Recent literature concerning the modeling of chemical reactions during transport in ground water is examined with emphasis on sorption reactions. The theory of transport and reactions in porous media has been well documented. Numerous equations have been developed from this theory, to provide both continuous and sequential or multistep models, with the water phase considered for both mobile and immobile phases. Chemical reactions can be either equilibrium or non-equilibrium, and can be quantified in linear or non-linear mathematical forms. Non-equilibrium reactions can be separated into kinetic and diffusional rate-limiting mechanisms. Solutions to the equations are available by either analytical expressions or numerical techniques. Saturated and unsaturated batch, column, and field studies are discussed with one-dimensional, laboratory-column experiments predominating. 
A summary table is presented that references the various kinds of models studied and their applications in predicting chemical concentrations in ground waters.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4512881','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4512881"><span>Modeling the spatiotemporal dynamics of light and heat propagation for in vivo optogenetics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Stujenske, Joseph M.; Spellman, Timothy; Gordon, Joshua A.</p> <p>2015-01-01</p> <p>Summary Despite the increasing use of optogenetics in vivo, the effects of direct light exposure to brain tissue are understudied. Of particular concern is the potential for heat induced by prolonged optical stimulation. We demonstrate that high intensity light, delivered through an optical fiber, is capable of elevating firing rate locally, even in the absence of opsin expression. Predicting the severity and spatial extent of any temperature increase during optogenetic stimulation is therefore of considerable importance. Here we describe a realistic model that simulates light and heat propagation during optogenetic experiments. We validated the model by comparing predicted and measured temperature changes in vivo. We further demonstrate the utility of this model by comparing predictions for various wavelengths of light and fiber sizes, as well as testing methods for reducing heat effects on neural targets in vivo. 
PMID:26166563</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20010060338&hterms=knowledge+scientist&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D60%26Ntt%3Dknowledge%2Bscientist','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20010060338&hterms=knowledge+scientist&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D60%26Ntt%3Dknowledge%2Bscientist"><span>SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)</p> <p>1994-01-01</p> <p>The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. 
The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26327487','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26327487"><span>Replication and validation of higher order models demonstrated that a summary score for the EORTC QLQ-C30 is robust.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Giesinger, Johannes M; Kieffer, Jacobien M; Fayers, Peter M; Groenvold, Mogens; Petersen, Morten Aa; Scott, Neil W; Sprangers, Mirjam A G; Velikova, Galina; Aaronson, Neil K</p> <p>2016-01-01</p> <p>To further evaluate the higher order measurement structure of the European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Questionnaire Core 30 (QLQ-C30), with the aim of generating a summary score. Using pretreatment QLQ-C30 data (N = 3,282), we conducted confirmatory factor analyses to test seven previously evaluated higher order models. We compared the summary score(s) derived from the best performing higher order model with the original QLQ-C30 scale scores, using tumor stage, performance status, and change over time (N = 244) as grouping variables. 
Although all models showed acceptable fit, we continued in the interest of parsimony with known-groups validity and responsiveness analyses using a summary score derived from the single higher order factor model. The validity and responsiveness of this QLQ-C30 summary score was equal to, and in many cases superior to, the original underlying QLQ-C30 scale scores. Our results provide empirical support for a measurement model for the QLQ-C30 yielding a single summary score. The availability of this summary score can avoid problems with potential type I errors that arise because of multiple testing when making comparisons based on the 15 outcomes generated by this questionnaire and may reduce sample size requirements for health-related quality of life studies using the QLQ-C30 questionnaire when an overall summary score is a relevant primary outcome. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29218884','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29218884"><span>How powerful are summary-based methods for identifying expression-trait associations under different genetic architectures?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Veturi, Yogasudha; Ritchie, Marylyn D</p> <p>2018-01-01</p> <p>Transcriptome-wide association studies (TWAS) have recently been employed as an approach that can draw upon the advantages of genome-wide association studies (GWAS) and gene expression studies to identify genes associated with complex traits. Unlike standard GWAS, summary level data suffices for TWAS and offers improved statistical power. 
Two popular TWAS methods include either (a) imputing the cis genetic component of gene expression from smaller sized studies (using multi-SNP prediction or MP) into much larger effective sample sizes afforded by GWAS - TWAS-MP or (b) using summary-based Mendelian randomization - TWAS-SMR. Although these methods have been effective at detecting functional variants, it remains unclear how extensive variability in the genetic architecture of complex traits and diseases impacts TWAS results. Our goal was to investigate the different scenarios under which these methods yielded enough power to detect significant expression-trait associations. In this study, we conducted extensive simulations based on 6000 randomly chosen, unrelated Caucasian males from Geisinger's MyCode population to compare the power to detect cis expression-trait associations (within 500 kb of a gene) using the above-described approaches. To test TWAS across varying genetic backgrounds we simulated gene expression and phenotype using different quantitative trait loci per gene and cis-expression /trait heritability under genetic models that differentiate the effect of causality from that of pleiotropy. For each gene, on a training set ranging from 100 to 1000 individuals, we either (a) estimated regression coefficients with gene expression as the response using five different methods: LASSO, elastic net, Bayesian LASSO, Bayesian spike-slab, and Bayesian ridge regression or (b) performed eQTL analysis. We then sampled with replacement 50,000, 150,000, and 300,000 individuals respectively from the testing set of the remaining 5000 individuals and conducted GWAS on each set. Subsequently, we integrated the GWAS summary statistics derived from the testing set with the weights (or eQTLs) derived from the training set to identify expression-trait associations using (a) TWAS-MP (b) TWAS-SMR (c) eQTL-based GWAS, or (d) standalone GWAS. 
Finally, we examined the power to detect functionally relevant genes using the different approaches under the considered simulation scenarios. In general, we observed great similarities among TWAS-MP methods although the Bayesian methods resulted in improved power in comparison to LASSO and elastic net as the trait architecture grew more complex while training sample sizes and expression heritability remained small. Finally, we observed high power under causality but very low to moderate power under pleiotropy.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA416863','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA416863"><span>Test Results of Air-Permeable Charcoal Impregnated Suits to Challenge by Chemical and Biological Warfare Agents and Simulants. Executive Summary and Summary Report</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2003-05-01</p> <p>authorizing documents. REPORT DOCUMENTATION PAGE Form Approved OMB No. 0704-0188 Public reporting burden for this collection of information is...Health, San Antonio, TX; Dr. Annetta P. Watson, Life Sciences Division, Oak Ridge National Laboratories, Oak Ridge, TN; Leo F. Saubier, Battelle Memorial...37 F - Giat NBC SWAT Suit ... 43 G - Giat</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA588390','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA588390"><span>Interacting with Multi-Robot Systems Using BML</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2013-06-01</p> <p>Pullen, U. Schade, J. Simonsen & R. Gomez-Veiga, NATO MSG-048 C-BML Final Report Summary. 
2010 Fall Simulation Interoperability Workshop (10F-SIW-039...NATO MSG-085. 2012 Spring Simulation Interoperability Workshop (12S-SIW-045), Orlando, FL, March 2012. [3] T. Remmersmann, U. Schade, L. Khimeche, B. Grautreau & R. El Abdouni Khayari, Lessons Recognized: How to Combine BML and MSDL. 2012 Spring Simulation Interoperability Workshop (12S-SIW-012</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2206091','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2206091"><span>Testing for Archaic Hominin Admixture on the X Chromosome: Model Likelihoods for the Modern Human RRM2P4 Region From Summaries of Genealogical Topology Under the Structured Coalescent</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Cox, Murray P.; Mendez, Fernando L.; Karafet, Tatiana M.; Pilkington, Maya Metni; Kingan, Sarah B.; Destro-Bisol, Giovanni; Strassmann, Beverly I.; Hammer, Michael F.</p> <p>2008-01-01</p> <p>A 2.4-kb stretch within the RRM2P4 region of the X chromosome, previously sequenced in a sample of 41 globally distributed humans, displayed both an ancient time to the most recent common ancestor (e.g., a TMRCA of ∼2 million years) and a basal clade composed entirely of Asian sequences. This pattern was interpreted to reflect a history of introgressive hybridization from archaic hominins (most likely Asian Homo erectus) into the anatomically modern human genome. Here, we address this hypothesis by resequencing the 2.4-kb RRM2P4 region in 131 African and 122 non-African individuals and by extending the length of sequence in a window of 16.5 kb encompassing the RRM2P4 pseudogene in a subset of 90 individuals. 
We find that both the ancient TMRCA and the skew in non-African representation in one of the basal clades are essentially limited to the central 2.4-kb region. We define a new summary statistic called the minimum clade proportion (pmc), which quantifies the proportion of individuals from a specified geographic region in each of the two basal clades of a binary gene tree, and then employ coalescent simulations to assess the likelihood of the observed central RRM2P4 genealogy under two alternative views of human evolutionary history: recent African replacement (RAR) and archaic admixture (AA). A molecular-clock-based TMRCA estimate of 2.33 million years is a statistical outlier under the RAR model; however, the large variance associated with this estimate makes it difficult to distinguish the predictions of the human origins models tested here. The pmc summary statistic, which has improved power with larger samples of chromosomes, yields values that are significantly unlikely under the RAR model and fit expectations better under a range of archaic admixture scenarios. 
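As a rough illustration of the kind of statistic described, the sketch below computes a minimum-clade-proportion-style summary from labelled basal clades. The function name and the exact definition (the smaller of the two basal-clade proportions for the chosen region) are assumptions based on the abstract's wording, not the authors' implementation.

```python
def minimum_clade_proportion(clade_a, clade_b, region):
    """Sketch of a pmc-style statistic: the smaller of the two basal-clade
    proportions of individuals sampled from `region` (assumed definition)."""
    prop_a = sum(1 for x in clade_a if x == region) / len(clade_a)
    prop_b = sum(1 for x in clade_b if x == region) / len(clade_b)
    return min(prop_a, prop_b)

# A basal clade containing no chromosomes from the region gives pmc = 0
pmc = minimum_clade_proportion(["Asia", "Asia", "Africa"], ["Africa", "Africa"], "Asia")
print(pmc)  # 0.0
```

Under a recent-African-replacement history, sampled regions are expected to be mixed across both basal clades, so extreme values of such a statistic flag genealogies that fit admixture scenarios better.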
PMID:18202385</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_19");'>19</a></li> <li><a href="#" onclick='return showDiv("page_20");'>20</a></li> <li class="active"><span>21</span></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_21 --> <div id="page_22" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_20");'>20</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li class="active"><span>22</span></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_24");'>24</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="421"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25475880','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25475880"><span>Cluster randomised crossover trials with binary data and unbalanced cluster sizes: application to studies of near-universal interventions in intensive care.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Forbes, Andrew B; Akram, Muhammad; Pilcher, David; Cooper, Jamie; Bellomo, Rinaldo</p> <p>2015-02-01</p> <p>Cluster randomised crossover trials have been utilised in recent years in the health and social sciences. 
Methods for analysis have been proposed; however, for binary outcomes, these have received little assessment of their appropriateness. In addition, methods for determination of sample size are currently limited to balanced cluster sizes both between clusters and between periods within clusters. This article aims to extend this work to unbalanced situations and to evaluate the properties of a variety of methods for analysis of binary data, with a particular focus on the setting of potential trials of near-universal interventions in intensive care to reduce in-hospital mortality. We derive a formula for sample size estimation for unbalanced cluster sizes, and apply it to the intensive care setting to demonstrate the utility of the cluster crossover design. We conduct a numerical simulation of the design in the intensive care setting and for more general configurations, and we assess the performance of three cluster summary estimators and an individual-data estimator based on binomial-identity-link regression. For settings similar to the intensive care scenario involving large cluster sizes and small intra-cluster correlations, the sample size formulae developed and analysis methods investigated are found to be appropriate, with the unweighted cluster summary method performing well relative to the more optimal but more complex inverse-variance weighted method. More generally, we find that the unweighted and cluster-size-weighted summary methods perform well, with the relative efficiency of each largely determined systematically from the study design parameters. Performance of individual-data regression is adequate with small cluster sizes but becomes inefficient for large, unbalanced cluster sizes. When outcome prevalences are 6% or less and the within-cluster-within-period correlation is 0.05 or larger, all methods display sub-nominal confidence interval coverage, with the less prevalent the outcome the worse the coverage. 
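The unweighted cluster summary estimator evaluated above can be sketched as follows. All parameter values and the data-generating mechanism are invented for illustration and are not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-period cluster crossover trial in ICUs: every cluster
# receives both arms, one per period, with unbalanced cluster-period sizes.
n_clusters = 20
sizes = rng.integers(100, 400, size=(n_clusters, 2))  # patients per cluster-period
effect = -0.02                                        # true absolute risk difference
# Cluster-specific baseline mortality induces intra-cluster correlation
base = np.clip(0.10 + rng.normal(0.0, 0.01, n_clusters), 0.01, 0.99)

order = rng.binomial(1, 0.5, n_clusters)  # period in which each cluster is treated
deaths = np.empty((n_clusters, 2))
for period in (0, 1):
    p = base + effect * (order == period)
    deaths[:, period] = rng.binomial(sizes[:, period], p)

# Unweighted cluster summary estimator: average, over clusters, the
# within-cluster (intervention minus control) difference in proportions
props = deaths / sizes
diff = np.where(order == 0, props[:, 0] - props[:, 1], props[:, 1] - props[:, 0])
print(diff.mean())  # estimate of the risk difference
```

Because each cluster serves as its own control, the cluster-level baseline cancels within the paired difference, which is why these simple summaries perform well for large clusters with small intra-cluster correlations.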
As with all simulation studies, conclusions are limited to the configurations studied. We confined attention to detecting intervention effects on an absolute risk scale using marginal models and did not explore properties of binary random effects models. Cluster crossover designs with binary outcomes can be analysed using simple cluster summary methods, and sample size in unbalanced cluster size settings can be determined using relatively straightforward formulae. However, caution needs to be applied in situations with low prevalence outcomes and moderate to high intra-cluster correlations. © The Author(s) 2014.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018PhR...730....1T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018PhR...730....1T"><span>Dark matter self-interactions and small scale structure</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tulin, Sean; Yu, Hai-Bo</p> <p>2018-02-01</p> <p>We review theories of dark matter (DM) beyond the collisionless paradigm, known as self-interacting dark matter (SIDM), and their observable implications for astrophysical structure in the Universe. Self-interactions are motivated, in part, due to the potential to explain long-standing (and more recent) small scale structure observations that are in tension with collisionless cold DM (CDM) predictions. Simple particle physics models for SIDM can provide a universal explanation for these observations across a wide range of mass scales spanning dwarf galaxies, low and high surface brightness spiral galaxies, and clusters of galaxies. At the same time, SIDM leaves intact the success of ΛCDM cosmology on large scales. 
This report covers the following topics: (1) small scale structure issues, including the core-cusp problem, the diversity problem for rotation curves, the missing satellites problem, and the too-big-to-fail problem, as well as recent progress in hydrodynamical simulations of galaxy formation; (2) N-body simulations for SIDM, including implications for density profiles, halo shapes, substructure, and the interplay between baryons and self-interactions; (3) semi-analytic Jeans-based methods that provide a complementary approach for connecting particle models with observations; (4) merging systems, such as cluster mergers (e.g., the Bullet Cluster) and minor infalls, along with recent simulation results for mergers; (5) particle physics models, including light mediator models and composite DM models; and (6) complementary probes for SIDM, including indirect and direct detection experiments, particle collider searches, and cosmological observations. We provide a summary and critical look at all current constraints on DM self-interactions and an outline for future directions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28562397','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28562397"><span>Simulation Research in Gastrointestinal and Urologic Care-Challenges and Opportunities: Summary of a National Institute of Diabetes and Digestive and Kidney Diseases and National Institute of Biomedical Imaging and Bioengineering Workshop.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Aggarwal, Rajesh; Brown, Kimberly M; de Groen, Piet C; Gallagher, Anthony G; Henriksen, Kerm; Kavoussi, Louis R; Peng, Grace C Y; Ritter, E Matthew; Silverman, Elliott; Wang, Kenneth K; Andersen, Dana K</p> <p>2018-01-01</p> <p>A workshop on "Simulation Research in Gastrointestinal and Urologic Care: 
Challenges and Opportunities" was held at the National Institutes of Health in June 2016. The purpose of the workshop was to examine the extent to which simulation approaches have been used by skilled proceduralists (not trainees) caring for patients with gastrointestinal and urologic diseases. The current status of research findings on the use and effectiveness of simulation applications was reviewed, and numerous knowledge gaps and research needs were identified by the faculty and the attendees. The paradigm of "deliberate practice," rather than mere repetition, and the value of coaching by experts were stressed by those who have adopted simulation in music and sports. Models that are most useful for the adoption of simulation by expert clinicians have yet to be fully validated. Initial studies on the impact of simulation on safety and error reduction have demonstrated its value in the training domain, but the role of simulation as a strategy for increased procedural safety remains uncertain in the world of the expert practitioner. 
Although the basic requirements for experienced physicians to acquire new skills have been explored, the widespread availability of such resources is an unrealized goal, and there is a need for well-designed outcome studies to establish the role of simulation in improving the quality of health care.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20150003469','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20150003469"><span>Summary of CPAS EDU Testing Analysis Results</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Romero, Leah M.; Bledsoe, Kristin J.; Davidson, John.; Engert, Meagan E.; Fraire, Usbaldo, Jr.; Galaviz, Fernando S.; Galvin, Patrick J.; Ray, Eric S.; Varela, Jose</p> <p>2015-01-01</p> <p>The Orion program's Capsule Parachute Assembly System (CPAS) project is currently conducting its third generation of testing, the Engineering Development Unit (EDU) series. This series utilizes two test articles, a dart-shaped Parachute Compartment Drop Test Vehicle (PCDTV) and capsule-shaped Parachute Test Vehicle (PTV), both of which include a full size, flight-like parachute system and require a pallet delivery system for aircraft extraction. To date, 15 tests have been completed, including six with PCDTVs and nine with PTVs. Two of the PTV tests included the Forward Bay Cover (FBC) provided by Lockheed Martin. Advancements in modeling techniques applicable to parachute fly-out, vehicle rate of descent, torque, and load train, also occurred during the EDU testing series. An upgrade from a composite to an independent parachute simulation allowed parachute modeling at a higher level of fidelity than during previous generations. 
The complexity of separating the test vehicles from their pallet delivery systems necessitated the use of the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulator for modeling mated vehicle aircraft extraction and separation. This paper gives an overview of each EDU test and summarizes the development of CPAS analysis tools and techniques during EDU testing.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20040088563&hterms=biology&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D80%26Ntt%3Dbiology','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20040088563&hterms=biology&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D80%26Ntt%3Dbiology"><span>Monte Carlo track structure for radiation biology and space applications</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Nikjoo, H.; Uehara, S.; Khvostunov, I. G.; Cucinotta, F. A.; Wilson, W. E.; Goodhead, D. T.</p> <p>2001-01-01</p> <p>Over the past two decades event by event Monte Carlo track structure codes have increasingly been used for biophysical modelling and radiotherapy. The advent of these codes has helped to shed light on many aspects of microdosimetry and the mechanism of damage by ionising radiation in the cell. These codes have continuously been modified to include new improved cross sections and computational techniques. This paper provides a summary of input data for ionizations, excitations and elastic scattering cross sections for event by event Monte Carlo track structure simulations for electrons and ions in the form of parametric equations, which makes it easy to reproduce the data. Stopping power and radial distribution of dose are presented for ions and compared with experimental data. 
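The core loop of an event-by-event track structure code of the kind summarised above is simple to sketch: sample an exponential free path from the total interaction cross section, then pick the interaction channel in proportion to its partial cross section. The cross-section values below are hypothetical placeholders, not the parametrisations from the paper.

```python
# One transport step of an event-by-event Monte Carlo track structure code
# (sketch with invented cross sections; real codes use energy-dependent
# parametrised cross sections for each channel).
import numpy as np

rng = np.random.default_rng(1)

n_density = 3.34e22          # molecules per cm^3 (liquid water)
sigma = {                    # partial cross sections in cm^2 (placeholders)
    "ionisation": 1.2e-16,
    "excitation": 0.5e-16,
    "elastic":    2.0e-16,
}
sigma_total = sum(sigma.values())
mfp = 1.0 / (n_density * sigma_total)      # mean free path in cm

def transport_step(rng):
    """One free flight plus one interaction-type draw."""
    step = rng.exponential(mfp)            # path length to next event
    u = rng.random() * sigma_total
    acc = 0.0
    for channel, s in sigma.items():
        acc += s
        if u <= acc:
            return step, channel
    return step, channel                   # numerical edge case

steps = [transport_step(rng) for _ in range(10_000)]
mean_path = np.mean([s for s, _ in steps])
frac_ion = np.mean([c == "ionisation" for _, c in steps])
print(f"mean free path ~ {mean_path:.2e} cm, ionisation fraction ~ {frac_ion:.2f}")
```

A full code would also sample energy loss and scattering angle at each event and follow secondary electrons; this sketch shows only the free-flight and channel-selection step.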
A model is described for simulation of full slowing down of proton tracks in water in the range 1 keV to 1 MeV. Modelling and calculations are presented for the response of a TEPC proportional counter irradiated with 5 MeV alpha-particles. Distributions are presented for the wall and wall-less counters. The data show a contribution of indirect effects to the lineal energy distribution of the wall-counter responses even at such a low ion energy.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=availability+AND+heuristics&pg=3&id=ED268979','ERIC'); return false;" href="https://eric.ed.gov/?q=availability+AND+heuristics&pg=3&id=ED268979"><span>Computer-Assisted Scheduling of Army Unit Training: An Application of Simulated Annealing.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Hart, Roland J.; Goehring, Dwight J.</p> <p></p> <p>This report of an ongoing research project intended to provide computer assistance to Army units for the scheduling of training focuses on the feasibility of simulated annealing, a heuristic approach for solving scheduling problems. Following an executive summary and brief introduction, the document is divided into three sections. 
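The simulated annealing heuristic named in this record can be sketched on a toy scheduling problem: assign events to time slots while minimising pairwise conflicts, accepting worsening moves with a temperature-controlled probability. The events, slots, conflict pairs and cooling schedule below are invented for illustration; the report's actual training-scheduling model is far richer.

```python
# Toy simulated annealing for conflict-free scheduling (hypothetical data).
import math
import random

random.seed(42)

n_events, n_slots = 12, 4
# conflicting event pairs that must not share a slot (invented)
conflicts = {(0, 1), (1, 2), (3, 4), (5, 6), (7, 8), (9, 10), (10, 11)}

def cost(schedule):
    # number of conflicting pairs placed in the same time slot
    return sum(schedule[a] == schedule[b] for a, b in conflicts)

schedule = [random.randrange(n_slots) for _ in range(n_events)]
temp = 2.0
while temp > 0.01:
    current = cost(schedule)
    event = random.randrange(n_events)          # propose moving one event
    old_slot = schedule[event]
    schedule[event] = random.randrange(n_slots)
    delta = cost(schedule) - current
    # accept improvements always; accept worsenings with Boltzmann probability
    if delta > 0 and random.random() >= math.exp(-delta / temp):
        schedule[event] = old_slot              # reject the move
    temp *= 0.999                               # geometric cooling

print("conflicts remaining:", cost(schedule))
```

The acceptance rule is what distinguishes annealing from greedy search: early on, high temperature lets the schedule escape local minima; as the temperature falls, the search becomes effectively greedy.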
First, the Army…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/FR-2010-03-23/pdf/2010-6381.pdf','FEDREG'); return false;" href="https://www.gpo.gov/fdsys/pkg/FR-2010-03-23/pdf/2010-6381.pdf"><span>75 FR 13674 - Wassenaar Arrangement 2008 Plenary Agreements Implementation: Categories 1, 2, 3, 4, 5 Parts I...</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=FR">Federal Register 2010, 2011, 2012, 2013, 2014</a></p> <p></p> <p>2010-03-23</p> <p>...'', biological agents ``adapted for use in war'', chemical warfare agents, 'simulants' or ``riot control agents... AGENCY: Bureau of Industry and Security, Commerce. ACTION: Final rule; correcting amendment. SUMMARY: The.... 'Simulant': A substance or material that is used in place of toxic agent (chemical or biological) in...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.fs.usda.gov/treesearch/pubs/28057','TREESEARCH'); return false;" href="https://www.fs.usda.gov/treesearch/pubs/28057"><span>Summary of Simulated Field Trip Session</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.fs.usda.gov/treesearch/">Treesearch</a></p> <p></p> <p>1992-01-01</p> <p>The Simulated Field Trips offered resource managers an opportunity to "show" Symposium attendees their resource areas. The emphasis was on recreational activities in the wildland-urban interface and on management techniques for these areas. The six presentations were in the form of slide shows and videotapes. 
The session was moderated by Robert Laidlaw of the...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=sampling+AND+distribution&pg=4&id=ED458282','ERIC'); return false;" href="https://eric.ed.gov/?q=sampling+AND+distribution&pg=4&id=ED458282"><span>Improving Statistics Education through Simulations: The Case of the Sampling Distribution.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Earley, Mark A.</p> <p></p> <p>This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19900010797','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19900010797"><span>Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Overview and summary</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p></p> <p>1989-01-01</p> <p>NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned Marshall Space Flight Center (MSFC) Payload Training Complex (PTC) required to meet this need will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. 
The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs. This study was performed from August 1988 to October 1989. Thus, the results are based on the SSFP August 1989 baseline, i.e., the pre-Langley configuration/budget review (C/BR) baseline. Some terms, e.g., combined trainer, are being redefined. An overview of the study activities and a summary of study results are given here.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ascl.soft04014C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ascl.soft04014C"><span>IMNN: Information Maximizing Neural Networks</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.</p> <p>2018-04-01</p> <p>This software trains artificial neural networks to find non-linear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). Although compressing large data sets vastly simplifies both frequentist and Bayesian inference, important information may be inadvertently missed. Likelihood-free inference based on automatically derived IMNN summaries produces summaries that are good approximations to sufficient statistics. 
IMNNs are robustly capable of automatically finding optimal, non-linear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29193306','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29193306"><span>Artificial neural networks for stiffness estimation in magnetic resonance elastography.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Murphy, Matthew C; Manduca, Armando; Trzasko, Joshua D; Glaser, Kevin J; Huston, John; Ehman, Richard L</p> <p>2018-07-01</p> <p>To investigate the feasibility of using artificial neural networks to estimate stiffness from MR elastography (MRE) data. Artificial neural networks were fit using model-based training patterns to estimate stiffness from images of displacement using a patch size of ∼1 cm in each dimension. These neural network inversions (NNIs) were then evaluated in a set of simulation experiments designed to investigate the effects of wave interference and noise on NNI accuracy. NNI was also tested in vivo, comparing NNI results against currently used methods. In 4 simulation experiments, NNI performed as well or better than direct inversion (DI) for predicting the known stiffness of the data. 
Summary NNI results were also shown to be significantly correlated with DI results in the liver (R² = 0.974) and in the brain (R² = 0.915), and also correlated with established biological effects including fibrosis stage in the liver and age in the brain. Finally, repeatability error was lower in the brain using NNI compared to DI, and voxel-wise modeling using NNI stiffness maps detected larger effects than using DI maps with similar levels of smoothing. Artificial neural networks represent a new approach to inversion of MRE data. Summary results from NNI and DI are highly correlated and both are capable of detecting biologically relevant signals. Preliminary evidence suggests that NNI stiffness estimates may be more resistant to noise than an algebraic DI approach. Taken together, these results merit future investigation into NNIs to improve the estimation of stiffness in small regions. Magn Reson Med 80:351-360, 2018. © 2017 International Society for Magnetic Resonance in Medicine.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009CoPhC.180.1382L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009CoPhC.180.1382L"><span>Coding coarse grained polymer model for LAMMPS and its application to polymer crystallization</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Luo, Chuanfu; Sommer, Jens-Uwe</p> <p>2009-08-01</p> <p>We present a patch code for LAMMPS to implement a coarse grained (CG) model of poly(vinyl alcohol) (PVA). LAMMPS is a powerful molecular dynamics (MD) simulator developed at Sandia National Laboratories. Our patch code implements a tabulated angular potential and a Lennard-Jones-9-6 (LJ96) style interaction for PVA. 
Benefiting from the excellent parallel efficiency of LAMMPS, our patch code is suitable for large-scale simulations. This CG-PVA code is used to study polymer crystallization, which is a long-standing unsolved problem in polymer physics. By using parallel computing, cooling and heating processes for long chains are simulated. The results show that chain-folded structures resembling the lamellae of polymer crystals are formed during the cooling process. The evolution of the static structure factor during the crystallization transition indicates that long-range density order appears before local crystalline packing. This is consistent with some experimental observations by small/wide angle X-ray scattering (SAXS/WAXS). During the heating process, it is found that the crystalline regions are still growing until they are fully melted, which can be confirmed by the evolution both of the static structure factor and the average stem length formed by the chains. This two-stage behavior indicates that melting of polymer crystals is far from thermodynamic equilibrium. Our results concur with various experiments. It is the first time that such growth/reorganization behavior has been clearly observed by MD simulations. Our code can easily be used to model other types of polymers by providing a file containing the tabulated angle potential data and a set of appropriate parameters. Program summary: Program title: lammps-cgpva Catalogue identifier: AEDE_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDE_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU's GPL No. of lines in distributed program, including test data, etc.: 940 798 No. of bytes in distributed program, including test data, etc.: 12 536 245 Distribution format: tar.gz Programming language: C++/MPI Computer: Tested on Intel-x86 and AMD64 architectures. Should run on any architecture providing a C++ compiler Operating system: Tested under Linux. 
Any other OS with a C++ compiler and MPI library should suffice Has the code been vectorized or parallelized?: Yes RAM: Depends on system size and how many CPUs are used Classification: 7.7 External routines: LAMMPS ( http://lammps.sandia.gov/), FFTW ( http://www.fftw.org/) Nature of problem: Implementing special tabular angle potentials and Lennard-Jones-9-6 style interactions of a coarse grained polymer model for LAMMPS code. Solution method: Cubic spline interpolation of input tabulated angle potential data. Restrictions: The code is based on a former version of LAMMPS. Unusual features: Any special angular potential can be used if it can be tabulated. Running time: Seconds to weeks, depending on system size, speed of CPU and how many CPUs are used. The test run provided with the package takes about 5 minutes on 4 AMD's opteron (2.6 GHz) CPUs. References: D. Reith, H. Meyer, F. Müller-Plathe, Macromolecules 34 (2001) 2335-2345. H. Meyer, F. Müller-Plathe, J. Chem. Phys. 115 (2001) 7807. H. Meyer, F. 
Müller-Plathe, Macromolecules 35 (2002) 1241-1252.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20170003718&hterms=climatology&qs=N%3D0%26Ntk%3DAll%26Ntx%3Dmode%2Bmatchall%26Ntt%3Dclimatology','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20170003718&hterms=climatology&qs=N%3D0%26Ntk%3DAll%26Ntx%3Dmode%2Bmatchall%26Ntt%3Dclimatology"><span>Climatology of the Aerosol Optical Depth by Components from the Multi-Angle Imaging Spectroradiometer (MISR) and Chemistry Transport Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Lee, Huikyo; Kalashnikova, Olga V.; Suzuki, Kentaroh; Braverman, Amy; Garay, Michael J.; Kahn, Ralph A.</p> <p>2016-01-01</p> <p>The Multi-angle Imaging Spectroradiometer (MISR) Joint Aerosol (JOINT_AS) Level 3 product has provided a global, descriptive summary of MISR Level 2 aerosol optical depth (AOD) and aerosol type information for each month over 16+ years since March 2000. Using Version 1 of JOINT_AS, which is based on the operational (Version 22) MISR Level 2 aerosol product, this study analyzes, for the first time, characteristics of observed and simulated distributions of AOD for three broad classes of aerosols: spherical nonabsorbing, spherical absorbing, and nonspherical - near or downwind of their major source regions. The statistical moments (means, standard deviations, and skew-nesses) and distributions of AOD by components derived from the JOINT_AS are compared with results from two chemistry transport models (CTMs), the Goddard Chemistry Aerosol Radiation and Transport (GOCART) and SPectral RadIatioN-TrAnSport (SPRINTARS). Overall, the AOD distributions retrieved from MISR and modeled by GOCART and SPRINTARS agree with each other in a qualitative sense. 
Marginal distributions of AOD for each aerosol type in both MISR and the models show considerable positive skewness, which indicates the importance of including extreme AOD events when comparing satellite retrievals with models. The MISR JOINT_AS product will greatly facilitate comparisons between satellite observations and model simulations of aerosols by type.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28743218','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28743218"><span>Development of a lumbar EMG-based coactivation index for the assessment of complex dynamic tasks.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Le, Peter; Aurand, Alexander; Walter, Benjamin A; Best, Thomas M; Khan, Safdar N; Mendel, Ehud; Marras, William S</p> <p>2018-03-01</p> <p>The objective of this study was to develop and test an EMG-based coactivation index and compare it to a coactivation index defined by a biologically assisted lumbar spine model to differentiate between tasks. The purpose was to provide a universal approach to assess coactivation of a multi-muscle system when a computational model is not accessible. The EMG-based index developed utilised anthropometric-defined muscle characteristics driven by torso kinematics and EMG. Muscles were classified as agonists/antagonists based upon 'simulated' moments of the muscles relative to the total 'simulated' moment. Different tasks were used to test the range of the index including lifting, pushing and Valsalva. Results showed that the EMG-based index was comparable to the index defined by a biologically assisted model (r² = 0.78). 
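The agonist/antagonist bookkeeping behind such a coactivation index can be sketched simply: each muscle contributes a "simulated" moment, muscles whose moment opposes the net moment are antagonists, and the index is the antagonist share of total muscular effort. The muscle moments below are invented for illustration and are not values from the study.

```python
# Sketch of a moment-sign coactivation index (hypothetical muscle moments).
import numpy as np

# per-muscle "simulated" extension moments in N*m (+ = extensor torque)
moments = np.array([45.0, 38.0, -12.0, -9.0, 20.0, -4.0])

net = moments.sum()                              # total simulated moment
antagonist = np.sign(moments) != np.sign(net)    # opposes the net moment
coactivation = np.abs(moments[antagonist]).sum() / np.abs(moments).sum()
print(f"coactivation index: {coactivation:.3f}")
```

In the study's approach the per-muscle moments would come from EMG-driven, anthropometry-scaled muscle force estimates rather than fixed numbers.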
Overall, the EMG-based index provides a universal, usable method to assess the neuromuscular effort associated with coactivation for complex dynamic tasks when the benefit of a biomechanical model is not available. Practitioner Summary: A universal coactivation index for the lumbar spine was developed to assess complex dynamic tasks. This method was validated relative to a model-based index for use when a high-end computational model is not available. Its simplicity allows for fewer inputs and usability for assessment of task ergonomics and rehabilitation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28612356','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28612356"><span>Assessing the influence of rater and subject characteristics on measures of agreement for ordinal ratings.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Nelson, Kerrie P; Mitani, Aya A; Edwards, Don</p> <p>2017-09-10</p> <p>Widespread inconsistencies are commonly observed between physicians' ordinal classifications in screening tests results such as mammography. These discrepancies have motivated large-scale agreement studies where many raters contribute ratings. The primary goal of these studies is to identify factors related to physicians and patients' test results, which may lead to stronger consistency between raters' classifications. While ordered categorical scales are frequently used to classify screening test results, very few statistical approaches exist to model agreement between multiple raters. Here we develop a flexible and comprehensive approach to assess the influence of rater and subject characteristics on agreement between multiple raters' ordinal classifications in large-scale agreement studies. Our approach is based upon the class of generalized linear mixed models. 
Novel summary model-based measures are proposed to assess agreement between all, or a subgroup of raters, such as experienced physicians. Hypothesis tests are described to formally identify factors such as physicians' level of experience that play an important role in improving consistency of ratings between raters. We demonstrate how unique characteristics of individual raters can be assessed via conditional modes generated during the modeling process. Simulation studies are presented to demonstrate the performance of the proposed methods and summary measure of agreement. The methods are applied to a large-scale mammography agreement study to investigate the effects of rater and patient characteristics on the strength of agreement between radiologists. Copyright © 2017 John Wiley & Sons, Ltd.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011CoPhC.182.2350D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011CoPhC.182.2350D"><span>ms2: A molecular simulation tool for thermodynamic properties</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Deublein, Stephan; Eckl, Bernhard; Stoll, Jürgen; Lishchuk, Sergey V.; Guevara-Carrion, Gabriela; Glass, Colin W.; Merker, Thorsten; Bernreuther, Martin; Hasse, Hans; Vrabec, Jadran</p> <p>2011-11-01</p> <p>This work presents the molecular simulation program ms2 that is designed for the calculation of thermodynamic properties of bulk fluids in equilibrium consisting of small electro-neutral molecules. ms2 features the two main molecular simulation techniques, molecular dynamics (MD) and Monte-Carlo. It supports the calculation of vapor-liquid equilibria of pure fluids and multi-component mixtures described by rigid molecular models on the basis of the grand equilibrium method. 
Furthermore, it is capable of sampling various classical ensembles and yields numerous thermodynamic properties. To evaluate the chemical potential, Widom's test molecule method and gradual insertion are implemented. Transport properties are determined by equilibrium MD simulations following the Green-Kubo formalism. ms2 is designed to meet the requirements of academia and industry, particularly achieving short response times and straightforward handling. It is written in Fortran90 and optimized for a fast execution on a broad range of computer architectures, spanning from single processor PCs over PC-clusters and vector computers to high-end parallel machines. The standard Message Passing Interface (MPI) is used for parallelization and ms2 is therefore easily portable to different computing platforms. Feature tools facilitate the interaction with the code and the interpretation of input and output files. The accuracy and reliability of ms2 has been shown for a large variety of fluids in preceding work. Program summary: Program title: ms2 Catalogue identifier: AEJF_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJF_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Special Licence supplied by the authors No. of lines in distributed program, including test data, etc.: 82 794 No. of bytes in distributed program, including test data, etc.: 793 705 Distribution format: tar.gz Programming language: Fortran90 Computer: The simulation tool ms2 is usable on a wide variety of platforms, from single processor machines over PC-clusters and vector computers to vector-parallel architectures. (Tested with Fortran compilers: gfortran, Intel, PathScale, Portland Group and Sun Studio.) Operating system: Unix/Linux, Windows Has the code been vectorized or parallelized?: Yes. Message Passing Interface (MPI) protocol. Scalability: 
Excellent scalability up to 16 processors for molecular dynamics and >512 processors for Monte-Carlo simulations. RAM:ms2 runs on single processors with 512 MB RAM. The memory demand rises with increasing number of processors used per node and increasing number of molecules. Classification: 7.7, 7.9, 12 External routines: Message Passing Interface (MPI) Nature of problem: Calculation of application oriented thermodynamic properties for rigid electro-neutral molecules: vapor-liquid equilibria, thermal and caloric data as well as transport properties of pure fluids and multi-component mixtures. Solution method: Molecular dynamics, Monte-Carlo, various classical ensembles, grand equilibrium method, Green-Kubo formalism. Restrictions: No. The system size is user-defined. Typical problems addressed by ms2 can be solved by simulating systems containing typically 2000 molecules or less. Unusual features: Feature tools are available for creating input files, analyzing simulation results and visualizing molecular trajectories. Additional comments: Sample makefiles for multiple operation platforms are provided. Documentation is provided with the installation package and is available at http://www.ms-2.de. Running time: The running time of ms2 depends on the problem set, the system size and the number of processes used in the simulation. 
Running four processes on a "Nehalem" processor, simulations calculating VLE data take between two and twelve hours, and those calculating transport properties between six and 24 hours.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25845315','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25845315"><span>Pharmacometric Models for Characterizing the Pharmacokinetics of Orally Inhaled Drugs.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Borghardt, Jens Markus; Weber, Benjamin; Staab, Alexander; Kloft, Charlotte</p> <p>2015-07-01</p> <p>During the last decades, the importance of modeling and simulation in clinical drug development, with the goal to qualitatively and quantitatively assess and understand mechanisms of pharmacokinetic processes, has strongly increased. However, this increase could not equally be observed for orally inhaled drugs. The objectives of this review are to understand the reasons for this gap and to demonstrate the opportunities that mathematical modeling of pharmacokinetics of orally inhaled drugs offers. To achieve these objectives, this review (i) discusses pulmonary physiological processes and their impact on the pharmacokinetics after drug inhalation, (ii) provides a comprehensive overview of published pharmacokinetic models, (iii) categorizes these models into physiologically based pharmacokinetic (PBPK) and (clinical data-derived) empirical models, (iv) explores their (mechanistic) plausibility, and (v) addresses critical aspects of different pharmacometric approaches pertinent to drug inhalation. In summary, pulmonary deposition, dissolution, and absorption are highly complex processes and may represent the major challenge for modeling and simulation of PK after oral drug inhalation. 
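The simplest member of the "clinical data-derived empirical" category the review describes is a single lung depot emptying first-order into a one-compartment disposition model, which has a closed-form (Bateman) solution. This is a deliberately minimal sketch; all parameter values below are hypothetical and are not drawn from the review.

```python
# One lung depot -> one-compartment disposition, closed-form solution
# (empirical inhalation PK sketch; all parameters are hypothetical).
import math

dose_lung = 100.0   # ug deposited in the lung
ka = 1.5            # 1/h, first-order pulmonary absorption
ke = 0.3            # 1/h, first-order elimination
V = 30.0            # L, apparent volume of distribution

def conc(t):
    """Bateman equation: depot -> central -> eliminated (requires ka != ke)."""
    return (dose_lung * ka / (V * (ka - ke))) * (math.exp(-ke * t) - math.exp(-ka * t))

tmax = math.log(ka / ke) / (ka - ke)     # analytic time of peak concentration
print(f"tmax = {tmax:.2f} h, Cmax = {conc(tmax):.2f} ug/L")
```

A more realistic empirical model would add a second, swallowed-fraction depot absorbed via the gut, and a PBPK model would replace the depots with mechanistic deposition, dissolution, and mucociliary clearance terms.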
Challenges in relating systemic pharmacokinetics to pulmonary efficacy may be another factor contributing to the limited number of existing pharmacokinetic models for orally inhaled drugs. Investigations comprising in vitro experiments, clinical studies, and more sophisticated mathematical approaches are considered necessary for elucidating these highly complex pulmonary processes. With this additional knowledge, the PBPK approach might become more attractive. Currently, (semi-)mechanistic modeling offers an alternative for generating and investigating hypotheses and for a more mechanistic understanding of the pulmonary and systemic pharmacokinetics after oral drug inhalation, including the impact of pulmonary diseases.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1409789-review-mechanical-behavior-metal-ceramic-interfaces-nanolayered-compositesexperiments-modeling','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1409789-review-mechanical-behavior-metal-ceramic-interfaces-nanolayered-compositesexperiments-modeling"><span>Review: mechanical behavior of metal/ceramic interfaces in nanolayered composites—experiments and modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Li, Nan; Liu, Xiang-Yang</p> <p></p> <p>In this study, recent experimental and modeling studies in nanolayered metal/ceramic composites are reviewed, with a focus on the mechanical behavior of metal/nitride interfaces. The experimental and modeling studies of the slip systems in bulk TiN are reviewed first. 
Then, the experimental studies of interfaces, including the co-deformation mechanism probed by micropillar compression tests, in situ TEM straining tests capturing the dynamic process of co-deformation, thickness-dependent fracture behavior, and the interrelationship among interfacial bonding, microstructure, and mechanical response, are reviewed for the specific material systems of Al/TiN and Cu/TiN multilayers at the nanoscale. The modeling studies reviewed cover first-principles density functional theory-based modeling, atomistic molecular dynamics simulations, and mesoscale modeling of nanolayered composites using discrete dislocation dynamics. The phase transformation between zinc-blende and wurtzite AlN phases in Al/AlN multilayers at the nanoscale is also reviewed. Finally, a summary and a perspective on possible research directions and challenges are given.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2751647','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2751647"><span>A Joint Model for Longitudinal Measurements and Survival Data in the Presence of Multiple Failure Types</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Elashoff, Robert M.; Li, Gang; Li, Ning</p> <p>2009-01-01</p> <p>Summary In this article we study a joint model for longitudinal measurements and competing risks survival data. Our joint model provides a flexible approach to handle possible nonignorable missing data in the longitudinal measurements due to dropout. It is also an extension of previous joint models with a single failure type, offering a possible way to model informatively censored events as a competing risk. 
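The structure of such a joint model, a longitudinal submodel and competing-risks hazards linked by shared latent random effects, can be made concrete by simulating from it. The sketch below is illustrative only: every parameter value is arbitrary, and constant (exponential) cause-specific hazards are used as a simplification of the frailty submodel.

```python
import numpy as np

def simulate_joint_model(n=500, sigma_b=1.0, sigma_e=0.5,
                         beta=(1.0, 0.3), base_haz=(0.05, 0.02),
                         assoc=(0.8, -0.4), seed=0):
    """Simulate longitudinal + competing-risks data linked by a shared
    random intercept b_i (all parameter values are arbitrary)."""
    rng = np.random.default_rng(seed)
    b = rng.normal(0.0, sigma_b, n)          # shared latent random effects
    visits = np.arange(5.0)                  # scheduled measurement times
    # longitudinal submodel: y_ij = beta0 + beta1 * t_j + b_i + eps_ij
    y = beta[0] + beta[1] * visits + b[:, None] + \
        rng.normal(0.0, sigma_e, (n, visits.size))
    # cause-specific hazards: lambda_k = base_haz[k] * exp(assoc[k] * b_i)
    t1 = rng.exponential(1.0 / (base_haz[0] * np.exp(assoc[0] * b)))
    t2 = rng.exponential(1.0 / (base_haz[1] * np.exp(assoc[1] * b)))
    event_time = np.minimum(t1, t2)          # the first failure is observed
    cause = np.where(t1 <= t2, 1, 2)         # observed failure type
    return y, event_time, cause
```

Because the same b_i enters both submodels, subjects with unusual longitudinal trajectories also have shifted failure risks, which is exactly the dependence an EM-based joint fit has to account for.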
Our model consists of a linear mixed effects submodel for the longitudinal outcome and a proportional cause-specific hazards frailty submodel (Prentice et al., 1978, Biometrics 34, 541-554) for the competing risks survival data, linked together by latent random effects. We propose to obtain the maximum likelihood estimates of the parameters by an expectation maximization (EM) algorithm and estimate their standard errors using a profile likelihood method. The developed method works well in our simulation studies and is applied to a scleroderma lung disease clinical trial. PMID:18162112</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_22 --> <div id="page_23" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="441"> <li> <p><a target="_blank" rel="noopener noreferrer" 
onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1409789-review-mechanical-behavior-metal-ceramic-interfaces-nanolayered-compositesexperiments-modeling','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1409789-review-mechanical-behavior-metal-ceramic-interfaces-nanolayered-compositesexperiments-modeling"><span>Review: mechanical behavior of metal/ceramic interfaces in nanolayered composites—experiments and modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Li, Nan; Liu, Xiang-Yang</p> <p>2017-11-03</p> <p>In this study, recent experimental and modeling studies in nanolayered metal/ceramic composites are reviewed, with a focus on the mechanical behavior of metal/nitride interfaces. The experimental and modeling studies of the slip systems in bulk TiN are reviewed first. Then, the experimental studies of interfaces, including the co-deformation mechanism probed by micropillar compression tests, in situ TEM straining tests capturing the dynamic process of co-deformation, thickness-dependent fracture behavior, and the interrelationship among interfacial bonding, microstructure, and mechanical response, are reviewed for the specific material systems of Al/TiN and Cu/TiN multilayers at the nanoscale. The modeling studies reviewed cover first-principles density functional theory-based modeling, atomistic molecular dynamics simulations, and mesoscale modeling of nanolayered composites using discrete dislocation dynamics. The phase transformation between zinc-blende and wurtzite AlN phases in Al/AlN multilayers at the nanoscale is also reviewed. 
Finally, a summary and a perspective on possible research directions and challenges are given.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4744123','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4744123"><span>Bayesian Approach for Flexible Modeling of Semicompeting Risks Data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Han, Baoguang; Yu, Menggang; Dignam, James J.; Rathouz, Paul J.</p> <p>2016-01-01</p> <p>Summary Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. 
PMID:25274445</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5629010','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5629010"><span>Stoichiometric modelling of assimilatory and dissimilatory biomass utilisation in a microbial community</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Hunt, Kristopher A.; Jennings, Ryan deM.; Inskeep, William P.; Carlson, Ross P.</p> <p>2017-01-01</p> <p>Summary Assimilatory and dissimilatory utilisation of autotroph biomass by heterotrophs is a fundamental mechanism for the transfer of nutrients and energy across trophic levels. Metagenome data from a tractable, thermoacidophilic microbial community in Yellowstone National Park were used to build an in silico model to study heterotrophic utilisation of autotroph biomass using elementary flux mode analysis and flux balance analysis. Assimilatory and dissimilatory biomass utilisation was investigated using 29 forms of biomass-derived dissolved organic carbon (DOC) including individual monomer pools, individual macromolecular pools and aggregate biomass. The simulations identified ecologically competitive strategies for utilizing DOC under conditions of varying electron donor, electron acceptor or enzyme limitation. The simulated growth environment affected which form of DOC was the most competitive use of nutrients; for instance, oxygen limitation favoured utilisation of less reduced and fermentable DOC, while carbon-limited environments favoured more reduced DOC. Additionally, metabolism was studied considering two encompassing metabolic strategies: simultaneous versus sequential use of DOC. 
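Flux balance analysis, one of the methods named in this record, chooses reaction fluxes v that maximize a linear objective subject to steady-state mass balance S·v = 0 and flux bounds. A toy network (not the paper's Yellowstone community model) can be solved with a generic LP solver:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A; A -> B via two alternative reactions; B -> biomass.
# Rows are metabolites (A, B); columns are reactions (uptake, r1, r2, biomass).
S = np.array([
    [1.0, -1.0, -1.0,  0.0],   # metabolite A balance
    [0.0,  1.0,  1.0, -1.0],   # metabolite B balance
])
bounds = [(0, 10), (0, 5), (0, 5), (0, None)]   # flux bounds per reaction
c = np.array([0.0, 0.0, 0.0, -1.0])             # linprog minimizes, so negate biomass

# steady state: S @ v = 0, maximize biomass flux
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
biomass_flux = res.x[-1]
print(biomass_flux)
```

Changing the bounds mimics the limitations studied in the paper: tightening the uptake bound plays the role of a carbon-limited environment, while extra exchange reactions and bounds would encode electron-acceptor limitation.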
Results of this study bound the transfer of nutrients and energy through microbial food webs, providing a quantitative foundation relevant to most microbial ecosystems. PMID:27387069</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20160006330','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20160006330"><span>NASA Hybrid Wing Aircraft Aeroacoustic Test Documentation Report</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Heath, Stephanie L.; Brooks, Thomas F.; Hutcheson, Florence V.; Doty, Michael J.; Bahr, Christopher J.; Hoad, Danny; Becker, Lawrence; Humphreys, William M.; Burley, Casey L.; Stead, Dan;</p> <p>2016-01-01</p> <p>This report summarizes results of the Hybrid Wing Body (HWB) N2A-EXTE model aeroacoustic test. The N2A-EXTE model was tested in the NASA Langley 14- by 22-Foot Subsonic Tunnel (14x22 Tunnel) from September 12, 2012 until January 28, 2013 and was designated as test T598. This document contains the following main sections: Section 1 - Introduction, Section 2 - Main Personnel, Section 3 - Test Equipment, Section 4 - Data Acquisition Systems, Section 5 - Instrumentation and Calibration, Section 6 - Test Matrix, Section 7 - Data Processing, and Section 8 - Summary. 
Due to the amount of material to be documented, this HWB test documentation report does not cover analysis of acquired data, which is to be presented separately by the principal investigators. Also, no attempt was made to include preliminary risk reduction tests (such as Broadband Engine Noise Simulator and Compact Jet Engine Simulator characterization tests, shielding measurement technique studies, and speaker calibration method studies), which were performed in support of this HWB test. Separate reports containing these preliminary tests are referenced where applicable.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/290897','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/290897"><span>Performance evaluation of rotating pump jet mixing of radioactive wastes in Hanford Tanks 241-AP-102 and -104</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Onishi, Y.; Recknagle, K.P.</p> <p></p> <p>The purpose of this study was to confirm the adequacy of a single mixer pump to fully mix the wastes that will be stored in Tanks 241-AP-102 and -104. These Hanford double-shell tanks (DSTs) will be used as staging tanks to receive low-activity wastes from other Hanford storage tanks and, in turn, will supply the wastes to private waste vitrification facilities for eventual solidification. The TEMPEST computer code was applied to Tanks AP-102 and -104 to simulate waste mixing generated by the 60-ft/s rotating jets and to determine the effectiveness of the single rotating pump to mix the waste. TEMPEST simulates flow and mass/heat transport and chemical reactions (equilibrium and kinetic reactions) coupled together. Section 2 describes the pump jet mixing conditions the authors evaluated, the modeling cases, and their parameters. 
Section 3 reports model applications and assessment results. The summary and conclusions are presented in Section 4, and cited references are listed in Section 5.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3778385','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3778385"><span>3D nano-structures for laser nano-manipulation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Seniutinas, Gediminas; Gervinskas, Gediminas; Brasselet, Etienne; Juodkazis, Saulius</p> <p>2013-01-01</p> <p>Summary The resputtering of gold films from nano-holes defined in a sacrificial PMMA mask, which was made by electron beam lithography, was carried out with a dry plasma etching tool in order to form well-like structures with a high aspect ratio (height/width ≈ 3–4) at the rims of the nano-holes. The extraordinary transmission through the patterns of such nano-wells was investigated experimentally and numerically. By doing numerical simulations of 50-nm and 100-nm diameter polystyrene beads in water and air, we show the potential of such patterns for self-induced back-action (SIBA) trapping. The best trapping conditions were found to be a trapping force of 2 pN/W/μm² (numerical result) exerted on a 50-nm diameter bead in water. The simulations were based on the analytical Lorentz force model. 
PMID:24062979</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20010115231&hterms=workstation+design&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D50%26Ntt%3Dworkstation%2Bdesign','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20010115231&hterms=workstation+design&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D50%26Ntt%3Dworkstation%2Bdesign"><span>Extended Operating Configuration 2 (EOC-2) Design Document</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Barkai, David; Blaylock, Bruce T. (Technical Monitor)</p> <p>1994-01-01</p> <p>This document describes the design and plan of the Extended Operating Configuration 2 (EOC-2) for the Numerical Aerodynamic Simulation division (NAS). It covers the changes in the computing environment for the period of '93-'94. During this period the computation capability at NAS will have quadrupled. The first section summarizes this paper: the NAS mission is to provide, by the year 2000, a computing system capable of simulating an entire aerospace vehicle in a few hours. This will require 100 GigaFlops sustained performance. The second section contains information about the NAS user community and the computational model used for projecting future requirements. In the third section, the overall requirements are presented, followed by a summary of the target EOC-2 system. 
The following sections cover, in more detail, each major component that will have undergone change during EOC-2: the high speed processor, mass storage, workstations, and networks.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20080031118','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20080031118"><span>Weld Residual Stress and Distortion Analysis of the ARES I-X Upper Stage Simulator (USS)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Raju, Ivatury; Dawicke, David; Cheston, Derrick; Phillips, Dawn</p> <p>2008-01-01</p> <p>An independent assessment was conducted to determine the critical initial flaw size (CIFS) for the flange-to-skin weld in the Ares I-X Upper Stage Simulator (USS). The Ares system of space launch vehicles is the US National Aeronautics and Space Administration s plan for replacement of the aging space shuttle. The new Ares space launch system is somewhat of a combination of the space shuttle system and the Saturn launch vehicles used prior to the shuttle. Here, a series of weld analyses are performed to determine the residual stresses in a critical region of the USS. Weld residual stresses both increase constraint and mean stress thereby having an important effect on fatigue and fracture life. 
While the main focus of this paper is a discussion of the weld modeling procedures and results for the USS, a short summary of the CIFS assessment is provided.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19760004043&hterms=missing+data&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D70%26Ntt%3Dmissing%2Bdata','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19760004043&hterms=missing+data&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D70%26Ntt%3Dmissing%2Bdata"><span>An experimental summary of plasma arc exposures of space shuttle high-temperature reusable surface insulation tile array with a single missing tile (conducted at the Ames Research Center)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Galanter, S. A.</p> <p>1975-01-01</p> <p>A space shuttle high temperature reusable surface insulation (HRSI) tile array with a single missing or lost tile was exposed to a hot gas simulated reentry environment to investigate the heating conditions in and around the vicinity of the missing HRSI tile. Heat flux and pressure data for the lost tile condition were obtained by the use of a water cooled lost tile calibration model. The maximum aluminum substrate temperature obtained during the simulated reentry was 128 C (263 F). The lost tile calibration data indicated a maximum heat flux in the lost tile cavity region of 63 percent of the upstream reference value. This test was conducted at the Ames Research Center in the 20 MW semielliptical thermal protection system (TPS) pilot plasma arc test facility.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/12320919','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/12320919"><span>[Population development and economic growth. 
A simulation analysis for Switzerland].</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Schmidt, C; Straubhaar, T</p> <p>1996-01-01</p> <p>"A simulation exercise of a general equilibrium model for Switzerland makes clear that the macroeconomic impacts of aging populations are not very strong. There is no need for urgent policy actions to avoid severe negative economic consequences....However, the aging of population affects negatively the net income of the active labor force. An increasing share of their gross salaries goes to the retirement system to finance the pension payments of a growing number of pensioners. Attempts to moderate the elderly dependency ratio would lower this burden for the active labor force. Options are an increase of the female participation rate, an increase of the labor participation rate of the elderly--[which] also means a higher retirement age--and an increasing flow of immigrants. But socioeconomic problems might probably generate practical limits on the extent to which immigration can be increased." (SUMMARY IN ENG AND FRE) excerpt</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19900013672','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19900013672"><span>An assessment of multibody simulation tools for articulated spacecraft</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Man, Guy K.; Sirlin, Samuel W.</p> <p>1989-01-01</p> <p>A survey of multibody simulation codes was conducted in the spring of 1988, to obtain an assessment of the state of the art in multibody simulation codes from the users of the codes. This survey covers the most often used articulated multibody simulation codes in the spacecraft and robotics community. 
There was no attempt to perform a complete survey of all available multibody codes in all disciplines. Furthermore, this is not an exhaustive evaluation of even robotics and spacecraft multibody simulation codes, as the survey was designed to capture feedback on issues most important to the users of simulation codes. We must keep in mind that the information received was limited and the technical background of the respondents varied greatly. Therefore, only the most often cited observations from the questionnaire are reported here. In this survey, no single code was found to have both many users and no reported limitations. The first section is a report on multibody code applications. Following applications is a discussion of execution time, which is the most troublesome issue for flexible multibody codes. The representation of component flexible bodies, which affects both simulation setup time and execution time, is presented next. Following component data preparation, two sections address the accessibility or usability of a code, evaluated by considering its user interface design and examining the overall integrated simulation environment. A summary of user efforts at code verification is reported, before a tabular summary of the questionnaire responses. Finally, some conclusions are drawn.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20050060661','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20050060661"><span>Conceptual Design and Dynamics Testing and Modeling of a Mars Tumbleweed Rover</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Calhoun, Philip C.; Harris, Steven B.; Raiszadeh, Behzad; Zaleski, Kristina D.</p> <p>2005-01-01</p> <p>The NASA Langley Research Center has been developing a novel concept for a Mars planetary rover called the Mars Tumbleweed. 
This concept utilizes the wind to propel the rover along the Mars surface, giving it the potential to cover vast distances not possible with current Mars rover technology. This vehicle, in its deployed configuration, must be large and lightweight to provide the ratio of drag force to rolling resistance necessary to initiate motion from rest on the Mars surface. One Tumbleweed design concept that satisfies these considerations is called the Eggbeater-Dandelion. This paper describes the basic design considerations and a proposed dynamics model of the concept for use in simulation studies. It includes a summary of rolling/bouncing dynamics tests that used videogrammetry to better understand, characterize, and validate the dynamics model assumptions, especially the effective rolling resistance in bouncing/rolling dynamic conditions. The dynamics test used cameras to capture the motion of 32 targets affixed to a test article's outer structure. Proper placement of the cameras and alignment of their respective fields of view provided adequate image resolution of multiple targets along the trajectory as the test article proceeded down the ramp. Image processing of the frames from multiple cameras was used to determine the target positions. Position data from a set of these test runs was compared with results of a three-dimensional, flexible dynamics model. Model input parameters were adjusted to match the test data for the runs conducted. The process presented herein provided the means to characterize the dynamics and validate the simulation of the Eggbeater-Dandelion concept. 
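The drag-to-rolling-resistance requirement mentioned in this abstract reduces to a simple force balance for a sphere at rest: motion begins roughly when 0.5·ρ·C_d·A·v² exceeds C_rr·m·g. The sketch below is a back-of-the-envelope illustration with assumed Mars constants and hypothetical parameter values, not the paper's flexible dynamics model.

```python
import math

# Illustrative Mars constants (assumed values, not taken from the paper)
MARS_AIR_DENSITY = 0.020   # kg/m^3, near-surface
MARS_GRAVITY = 3.71        # m/s^2

def wind_speed_to_start_rolling(mass, radius, c_d, c_rr,
                                rho=MARS_AIR_DENSITY, g=MARS_GRAVITY):
    """Minimum wind speed (m/s) at which drag on a spherical rover
    exceeds rolling resistance, so motion from rest can begin."""
    frontal_area = math.pi * radius ** 2
    return math.sqrt(2.0 * c_rr * mass * g / (rho * c_d * frontal_area))
```

For a fixed drag coefficient, the threshold wind speed falls as the radius grows and rises with mass, which is why the abstract calls for a large, lightweight vehicle.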
The simulation model was used to demonstrate full-scale Tumbleweed motion from a stationary condition on a flat-sloped terrain using representative Mars environment parameters.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1817610K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1817610K"><span>Regional assessment of the hydropower potential of rivers in West Africa</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kling, Harald; Stanzel, Philipp; Fuchs, Martin</p> <p>2016-04-01</p> <p>The 15 countries of the Economic Community of West African States (ECOWAS) face a constant shortage of energy supply, which limits sustained economic growth. Currently there are about 50 operational hydropower plants and about 40 more are under construction or refurbishment. The potential for future hydropower development - especially for small-scale plants in rural areas - is assumed to be large, but exact data are missing. This study supports the energy initiatives of the "ECOWAS Centre for Renewable Energy and Energy Efficiency" (ECREEE) by assessing the hydropower potential of all rivers in West Africa. For more than 500,000 river reaches the hydropower potential was computed from channel slope and mean annual discharge. In large areas there is a lack of discharge observations. Therefore, an annual water balance model was used to simulate discharge. The model domain covers 5 million km², including e.g. the Niger, Volta, and Senegal River basins. The model was calibrated against observed data from 410 gauges, using precipitation and potential evapotranspiration data as inputs. Historic variations of observed annual discharge between 1950 and 2010 are simulated well by the model. 
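Computing a reach's hydropower potential from channel slope and mean annual discharge, as described above, amounts to P = ρ·g·Q·ΔH with the head ΔH approximated as slope times reach length. A minimal sketch follows; the study's exact formulation is not given in this record, so treat this as an assumption:

```python
RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2

def reach_potential_mw(discharge_m3s, slope, reach_length_m):
    """Theoretical hydropower potential of one river reach, in MW.
    The head is approximated as channel slope * reach length."""
    head_m = slope * reach_length_m
    return RHO_WATER * G * discharge_m3s * head_m / 1.0e6
```

For example, a 1 km reach with a slope of 0.001 carrying 100 m³/s yields just under 1 MW of theoretical potential; summing such values over all reaches gives the basin-scale totals the study up-scales to national summaries.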
As hydropower plants are investments with a lifetime of several decades, we also assessed possible changes in future discharge due to climate change. To this end, the water balance model was driven with bias-corrected climate projections of 15 Regional Climate Models for two emission scenarios of the CORDEX-Africa ensemble. The simulation results for the river network were up-scaled to sub-areas and national summaries. This information gives a regional quantification of the hydropower potential, expected climate change impacts, as well as a regional classification for general suitability (or non-suitability) of hydropower plant size - from small-scale to large projects.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA205318','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA205318"><span>IDA (Institute for Defense Analyses) GAMMA-Ray Laser Annual Summary Report (1986). Investigation of the Feasibility of Developing a Laser Using Nuclear Transitions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1988-12-01</p> <p>Excerpt from the report's list of figures: a computer simulation for a small value of r; Figure 5, a typical pulse shape for r = 8192; Figure 6, pulse duration as a function of r from the statistical simulations, assuming a spontaneous lifetime of 1 s; the scaling factor from the statistical simulations; Figure 10, basic pulse characteristics and associated Bloch vector angles.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19800010284','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19800010284"><span>A ground based phase control system for the solar power satellite. 
Executive summary, volume 1, phase 3</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Chie, C. M.</p> <p>1980-01-01</p> <p>The Solar Power Satellite (SPS) concept and the reference phase control system investigated in earlier efforts are reviewed. A summary overview of the analysis and selection of the pilot signal and power transponder design is presented along with the SOLARSIM program development and the simulated SPS phase control performance. Evaluations of the ground based phase control system as an alternate phase control concept are summarized.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1035733','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1035733"><span>Validation of New Process Models for Large Injection-Molded Long-Fiber Thermoplastic Composite Structures</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin</p> <p>2012-02-23</p> <p>This report describes the work conducted under CRADA No. PNNL/304 between Battelle PNNL and Autodesk, whose objective is to validate the new process models developed under the previous CRADA for large injection-molded LFT composite structures. To this end, the ARD-RSC and fiber length attrition models implemented in the 2013 research version of Moldflow were used to simulate the injection molding of 600-mm x 600-mm x 3-mm plaques from 40% glass/polypropylene (Dow Chemical DLGF9411.00) and 40% glass/polyamide 6,6 (DuPont Zytel 75LG40HSL BK031) materials. The injection molding was performed by Injection Technologies, Inc. 
in Windsor, Ontario (under a subcontract from Oak Ridge National Laboratory, ORNL), using the mold offered by the Automotive Composite Consortium (ACC). Two fill speeds under the same back pressure were used to produce plaques under slow-fill and fast-fill conditions. Also, two gating options were used to achieve the desired flow patterns: flows in edge-gated plaques and in center-gated plaques. After molding, ORNL performed measurements of fiber orientation and length distributions for process model validation. The structure of this report is as follows. After the Introduction (Section 1), Section 2 provides a summary of the ARD-RSC and fiber length attrition models. A summary of the model implementations in the latest research version of Moldflow is given in Section 3. Section 4 provides the key processing conditions and parameters for molding of the ACC plaques. The validations of the ARD-RSC and fiber length attrition models are presented and discussed in Section 5. Conclusions are drawn in Section 6.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24990607','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24990607"><span>Fast and accurate imputation of summary statistics enhances evidence of functional enrichment.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P; Patterson, Nick; Price, Alkes L</p> <p>2014-10-15</p> <p>Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analyses. Existing hidden Markov model (HMM)-based imputation approaches require individual-level genotypes.
Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1-5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case-control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of [Formula: see text] association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. 
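The conditional-Gaussian step behind this kind of summary-statistics imputation can be sketched in a few lines. This is a generic illustration of the idea, not the authors' released software; the 3-SNP LD matrix, the z-scores, and the ridge term `lam` (standing in for the paper's finite-reference-panel adjustment) are all made-up toy values.

```python
import numpy as np

def impute_z(ld, z_typed, typed, untyped, lam=0.1):
    """Impute z-scores at untyped SNPs from typed ones.

    Under a zero-mean Gaussian model with covariance given by the LD
    (correlation) matrix Sigma, z ~ N(0, Sigma), so
        E[z_u | z_t] = Sigma_ut (Sigma_tt + lam * I)^{-1} z_t.
    The ridge term lam guards against a noisily estimated LD matrix.
    """
    s_tt = ld[np.ix_(typed, typed)] + lam * np.eye(len(typed))
    s_ut = ld[np.ix_(untyped, typed)]
    w = s_ut @ np.linalg.inv(s_tt)   # imputation weights
    z_u = w @ z_typed                # imputed z-scores
    # rough per-SNP imputation quality (expected r^2 with the true z)
    r2 = np.diag(w @ ld[np.ix_(typed, untyped)])
    return z_u, r2

# toy 3-SNP LD matrix: SNPs 0 and 2 are typed, SNP 1 is untyped
ld = np.array([[1.0, 0.8, 0.3],
               [0.8, 1.0, 0.4],
               [0.3, 0.4, 1.0]])
z_u, r2 = impute_z(ld, z_typed=np.array([4.1, 1.2]),
                   typed=[0, 2], untyped=[1])
```

Here `r2` plays the role of an imputation-quality score: an untyped SNP in strong LD with typed neighbors is recovered with high expected accuracy, which is why most of the effective sample size survives imputation.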
Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu. Supplementary materials are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA433175','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA433175"><span>Advanced Computer Simulations of Military Incinerators</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2004-12-01</p> <p>Reaction Engineering International (REI) has developed advanced computer simulation tools for analyzing chemical demilitarization incinerators. The...Manager, 2003a: Summary of Engineering Design Study Projectile Washout System (PWS) Testing. Assembled Chemical Weapons Alternatives (ACWA), Final... Engineering Design Studies for Demilitarization of Assembled Chemical Weapons at Pueblo Chemical Depot. O'Shea, L. et al, 2003: RIM 57 – Monitoring in</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20090024798','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20090024798"><span>Force Measurement on the GLAST Delta II Flight</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Gordon, Scott; Kaufman, Daniel</p> <p>2009-01-01</p> <p>This viewgraph presentation reviews the interface force measurement at spacecraft separation of GLAST Delta II.
The contents include: 1) Flight Force Measurement (FFM) Background; 2) Team Members; 3) GLAST Mission Overview; 4) Methodology Development; 5) Ground Test Validation; 6) Flight Data; 7) Coupled Loads Simulation (VCLA & Reconstruction); 8) Basedrive Simulation; 9) Findings; and 10) Summary and Conclusions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=data+AND+sets&pg=2&id=EJ824692','ERIC'); return false;" href="https://eric.ed.gov/?q=data+AND+sets&pg=2&id=EJ824692"><span>Creating Realistic Data Sets with Specified Properties via Simulation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Goldman, Robert N.; McKenzie, John D. Jr.</p> <p>2009-01-01</p> <p>We explain how to simulate both univariate and bivariate raw data sets having specified values for common summary statistics. The first example illustrates how to "construct" a data set having prescribed values for the mean and the standard deviation--for a one-sample t test with a specified outcome. 
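The "construct" idea can be sketched directly: draw arbitrary values, z-score them, then shift and rescale so the sample mean and standard deviation match the targets exactly. A minimal sketch; the sample size and target values below are arbitrary illustrative choices, not taken from the article.

```python
import numpy as np

def make_sample(n, mean, sd, seed=None):
    """Return n values whose sample mean and sample SD match exactly.

    Start from arbitrary random draws, z-score them (mean 0, sample SD 1),
    then shift and scale to the requested summary statistics.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    z = (x - x.mean()) / x.std(ddof=1)   # exact mean 0, sample SD 1
    return mean + sd * z

# e.g. a sample of 25 "IQ scores" with mean 100 and SD 15
data = make_sample(25, mean=100.0, sd=15.0, seed=42)
print(round(data.mean(), 6), round(data.std(ddof=1), 6))  # 100.0 15.0
```

Because the standardization is exact (up to floating-point error), any downstream summary-based analysis, such as a one-sample t test against a null mean, produces a fully predictable statistic.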
The second shows how to create a bivariate data…</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_23 --> <div id="page_24" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="461"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA069802','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA069802"><span>Summary of Seismic Discrimination and Explosion Yield Determination Research</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1978-11-01</p> <p>List-of-figures excerpt: comparison of measured and numerically simulated displacements; Figure 21, comparison of experimental and numerically simulated source functions expressed as RVP transforms; Figure 22, comparison of measured and predicted displacements for Test 1; Figure 23, comparison of measured and predicted displacements for the cratering shot (Test 8); Figure 24, the vertical displacement from the complete two…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19770018036','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19770018036"><span>Automatic vehicle monitoring systems study. Report of phase O. Volume 1: Executive summary</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p></p> <p>1977-01-01</p> <p>A set of planning guidelines is presented to help law enforcement agencies and vehicle fleet operators decide which automatic vehicle monitoring (AVM) system could best meet their performance requirements. Improvements in emergency response times and the resultant cost benefits obtainable with various operational and planned AVM systems may be synthesized and simulated by means of special computer programs for model city parameters applicable to small, medium, and large urban areas. Design characteristics of various AVM systems and their implementation requirements are illustrated and costs estimated for the vehicles, the fixed sites, and the base equipment. Vehicle location accuracies for different RF links and polling intervals are analyzed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19790009184','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19790009184"><span>Adsorption bed models used in simulation of atmospheric control systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Davis, S.
H.</p> <p>1978-01-01</p> <p>Two separate techniques were used to obtain important basic data for the adsorption of seven liquid and eight gaseous trace contaminants. A volumetric system used in previous HSC studies was modified to determine the HSC capacity of all the contaminants. A second study of six of the liquids was performed in a gas chromatograph. The results of these two studies are reported in two parts. First, a brief summary of the chromatographic results is given. Second, a thesis is given which reports in some detail the results of the volumetric studies. Comparisons of the data common to both studies are also included.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017JAMES...9.1450H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017JAMES...9.1450H"><span>Convective aggregation in realistic convective-scale simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Holloway, Christopher E.</p> <p>2017-06-01</p> <p>To investigate the real-world relevance of idealized-model convective self-aggregation, five 15-day cases of real organized convection in the tropics are simulated. These include multiple simulations of each case to test sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. These simulations are compared to self-aggregation seen in the same model configured to run in idealized radiative-convective equilibrium. Analysis of the budget of the spatial variance of column-integrated frozen moist static energy shows that control runs have significant positive contributions to organization from radiation and negative contributions from surface fluxes and transport, similar to idealized runs once they become aggregated.
Despite identical lateral boundary conditions for all experiments in each case, systematic differences in mean column water vapor (CWV), CWV distribution shape, and CWV autocorrelation length scale are found between the different sensitivity runs, particularly for those without interactive radiation, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations (although the organization of precipitation shows less sensitivity to interactive radiation). The magnitudes and signs of these systematic differences are consistent with a rough equilibrium between (1) equalization due to advection from the lateral boundaries and (2) disaggregation due to the absence of interactive radiation, implying disaggregation rates comparable to those in idealized runs with aggregated initial conditions and noninteractive radiation. This points to a plausible similarity in the way that radiation feedbacks maintain aggregated convection in both idealized simulations and the real world.<abstract type="synopsis"><title type="main">Plain Language SummaryUnderstanding the processes that lead to the organization of tropical rainstorms is an important challenge for weather forecasters and climate scientists. Over the last 20 years, idealized models of the tropical atmosphere have shown that tropical rainstorms can spontaneously clump together. These studies have linked this spontaneous organization to processes related to the interaction between the rainstorms, atmospheric water vapor, clouds, radiation, surface evaporation, and circulations. The present study shows that there are some similarities in how organization of rainfall in more realistic computer model simulations interacts with these processes (particularly radiation). 
This provides some evidence that the work in the idealized model studies is relevant to the organization of tropical rainstorms in the real world.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4560692','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4560692"><span>Measures of Agreement Between Many Raters for Ordinal Classifications</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Nelson, Kerrie P.; Edwards, Don</p> <p>2015-01-01</p> <p>Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1 - 5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. 
Simulation studies are conducted to demonstrate the validity of the approach, with comparisons to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1227806','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1227806"><span>Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Arguello, Bryan; Gearhart, Jared Lee; Jones, Katherine A.</p> <p>2015-09-01</p> <p>The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. It relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation-based reliability model, to assess the performance of these designs.
TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014CG.....67..100I','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014CG.....67..100I"><span>Mathematical algorithm development and parametric studies with the GEOFRAC three-dimensional stochastic model of natural rock fracture systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ivanova, Violeta M.; Sousa, Rita; Murrihy, Brian; Einstein, Herbert H.</p> <p>2014-06-01</p> <p>This paper presents results from research conducted at MIT during 2010-2012 on modeling of natural rock fracture systems with the GEOFRAC three-dimensional stochastic model. Following a background summary of discrete fracture network models and a brief introduction of GEOFRAC, the paper provides a thorough description of the newly developed mathematical and computer algorithms for fracture intensity, aperture, and intersection representation, which have been implemented in MATLAB. The new methods optimize, in particular, the representation of fracture intensity in terms of cumulative fracture area per unit volume, P32, via the Poisson-Voronoi Tessellation of planes into polygonal fracture shapes. In addition, fracture apertures can now be represented probabilistically or deterministically, whereas the newly implemented intersection algorithms allow for computing discrete pathways of interconnected fractures.
In conclusion, results from a statistical parametric study, which was conducted with the enhanced GEOFRAC model and the new MATLAB-based Monte Carlo simulation program FRACSIM, demonstrate how fracture intensity, size, and orientations influence fracture connectivity.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4805115','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4805115"><span>Folding cooperativity in a 3-stranded β-sheet model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Roe, Daniel R.; Hornak, Viktor</p> <p>2015-01-01</p> <p>Summary The thermodynamic behavior of a previously designed three-stranded β-sheet was studied via several µs of standard and replica exchange molecular dynamics simulations. The system is shown to populate at least four thermodynamic minima, including 2 partially folded states in which only a single hairpin is formed. Simulated melting curves show different profiles for the C and N-terminal hairpins, consistent with differences in secondary structure content in published NMR and CD/FTIR measurements, which probed different regions of the chain. Individual β-hairpins that comprise the 3-stranded β-sheet are observed to form cooperatively. Partial folding cooperativity between the component hairpins is observed, and good agreement between calculated and experimental values quantifying this cooperativity is obtained when similar analysis techniques are used. However, the structural detail in the ensemble of conformations sampled in the simulations permits a more direct analysis of this cooperatively than has been performed based on experimental data. 
The results indicate the actual folding cooperativity perpendicular to strand direction is significantly larger than the lower bound obtained previously. PMID:16095612</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27783232','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27783232"><span>Determination of the effects of water adsorption on the sensitivity and detonation performance of the explosive JOB-9003 by molecular dynamics simulation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Hang, GuiYun; Yu, WenLi; Wang, Tao; Li, Zhen</p> <p>2016-11-01</p> <p>In order to determine the adsorption mechanism of water on the crystal surfaces of the explosive JOB-9003 and the effect of this adsorption on the sensitivity and detonation performance of this explosive, a model of the crystal of JOB-9003 was created in the software package Materials Studio (MS). The adsorption process was simulated, and molecular dynamics simulation was performed with the COMPASS force field in the NPT ensemble to calculate the sensitivity and detonation performance of the explosive. The results show that the maximum trigger bond length decreases whereas the interaction energy of the trigger bond and the cohesive energy density increase after adsorption, indicating that the sensitivity of JOB-9003 decreases. The results for the detonation performance show that the detonation pressure, detonation velocity, and detonation heat decrease upon the adsorption of water, thus illustrating that the detonation performance of JOB-9003 is degraded. 
In summary, the adsorption of water has a positive effect on the sensitivity and safety of the explosive JOB-9003 but a negative effect on its detonation performance.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1236591','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1236591"><span>M3FT-15OR0202212: SUBMIT SUMMARY REPORT ON THERMODYNAMIC EXPERIMENT AND MODELING</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>McMurray, Jake W.; Brese, Robert G.; Silva, Chinthaka M.</p> <p>2015-09-01</p> <p>Modeling the behavior of nuclear fuel with a physics-based approach uses thermodynamics for key inputs such as chemical potentials and thermal properties for phase transformation, microstructure evolution, and continuum transport simulations. Many of the lanthanide (Ln) elements and Y are high-yield fission products. The U-Y-O and U-Ln-O ternaries are therefore key subsystems of multi-component high-burnup fuel. These elements dissolve in the dominant urania fluorite phase affecting many of its properties. This work reports on an effort to assess the thermodynamics of the U-Pr-O and U-Y-O systems using the CALPHAD (CALculation of PHase Diagrams) method. 
The models developed within this framework can be combined and extended to include additional actinides and fission products, allowing calculation of the phase equilibria and the thermochemical and material properties of multicomponent fuel with burnup.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19830049412&hterms=Stanford&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D50%26Ntt%3DStanford','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19830049412&hterms=Stanford&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D50%26Ntt%3DStanford"><span>Conference on Complex Turbulent Flows: Comparison of Computation and Experiment, Stanford University, Stanford, CA, September 14-18, 1981, Proceedings. Volume 2 - Taxonomies, reporters' summaries, evaluation, and conclusions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Kline, S. J. (Editor); Cantwell, B. J. (Editor); Lilley, G. M.</p> <p>1982-01-01</p> <p>Computational techniques for simulating turbulent flows were explored, together with the results of experimental investigations. Particular attention was devoted to the possibility of defining a universal closure model applicable to all turbulence situations; however, it was concluded that zonal models, describing localized structures, were the most promising techniques to date. The taxonomy of turbulent flows was summarized, as were algebraic, differential, integral, and partial differential methods for numerical depiction of turbulent flows. Numerous comparisons of theoretically predicted and experimentally obtained data were presented for wall pressure distributions, velocity profiles, turbulent kinetic energy profiles, Reynolds shear stress profiles, and flows around transonic airfoils.
Simplifying techniques for reducing the computational time needed to model complex flowfields were surveyed, together with the industrial requirements and applications of computational fluid dynamics techniques.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5255049','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5255049"><span>A quantile regression model for failure-time data with time-dependent covariates</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Gorfine, Malka; Goldberg, Yair; Ritov, Ya’acov</p> <p>2017-01-01</p> <p>Summary Since survival data occur over time, often important covariates that we wish to consider also change over time. Such covariates are referred to as time-dependent covariates. Quantile regression offers flexible modeling of survival data by allowing the covariates to vary with quantiles. This article provides a novel quantile regression model accommodating time-dependent covariates, for analyzing survival data subject to right censoring. Our simple estimation technique assumes the existence of instrumental variables. In addition, we present a doubly-robust estimator in the sense of Robins and Rotnitzky (1992, Recovery of information and adjustment for dependent censoring using surrogate markers. In: Jewell, N. P., Dietz, K. and Farewell, V. T. (editors), AIDS Epidemiology. Boston: Birkhäuser, pp. 297–331). The asymptotic properties of the estimators are rigorously studied. Finite-sample properties are demonstrated by a simulation study. The utility of the proposed methodology is demonstrated using the Stanford heart transplant dataset.
PMID:27485534</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4222404','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4222404"><span>Probing viscoelastic surfaces with bimodal tapping-mode atomic force microscopy: Underlying physics and observables for a standard linear solid model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2014-01-01</p> <p>Summary This paper presents computational simulations of single-mode and bimodal atomic force microscopy (AFM) with particular focus on the viscoelastic interactions occurring during tip–sample impact. The surface is modeled by using a standard linear solid model, which is the simplest system that can reproduce creep compliance and stress relaxation, which are fundamental behaviors exhibited by viscoelastic surfaces. The relaxation of the surface in combination with the complexities of bimodal tip–sample impacts gives rise to unique dynamic behaviors that have important consequences with regards to the acquisition of quantitative relationships between the sample properties and the AFM observables. The physics of the tip–sample interactions and its effect on the observables are illustrated and discussed, and a brief research outlook on viscoelasticity measurement with intermittent-contact AFM is provided. 
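The standard linear solid used in this AFM study is the simplest model exhibiting both stress relaxation and creep. Its relaxation modulus can be written in closed form; a minimal sketch under the usual spring-dashpot parameterization (the parameter values are arbitrary illustrative choices, not taken from the paper):

```python
import numpy as np

def sls_relaxation(t, e1, e2, eta):
    """Stress-relaxation modulus of a standard linear solid (Zener model).

    One spring (e1) in parallel with a Maxwell arm (spring e2 in series
    with a dashpot eta). With relaxation time tau = eta / e2:
        E(t) = e1 + e2 * exp(-t / tau)
    so E(0) = e1 + e2 (instantaneous) and E(inf) = e1 (equilibrium).
    """
    tau = eta / e2
    return e1 + e2 * np.exp(-t / tau)

t = np.linspace(0.0, 5.0, 501)
E = sls_relaxation(t, e1=1.0, e2=4.0, eta=2.0)   # tau = 0.5
```

The instantaneous modulus `e1 + e2` relaxes toward the equilibrium value `e1` with time constant `eta / e2`; creep compliance behaves analogously with a (longer) retardation time, which is what makes the tip-sample response depend on how fast the tap probes the surface.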
PMID:25383277</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29789126','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29789126"><span>A comprehensive review on self-healing of asphalt materials: Mechanism, model, characterization and enhancement.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Sun, Daquan; Sun, Guoqiang; Zhu, Xingyi; Guarin, Alvaro; Li, Bin; Dai, Ziwei; Ling, Jianming</p> <p>2018-06-01</p> <p>Self-healing has great potential to extend the service life of asphalt pavement, and this capability has been regarded as an important strategy when designing a sustainable infrastructure. This review presents a comprehensive summary of the state-of-the-art investigations concerning the self-healing mechanism, model, characterization and enhancement, ranging from asphalt to asphalt pavement. Firstly, the self-healing phenomenon as a general concept in asphalt materials is analyzed, including its definition and the differences between self-healing and some viscoelastic responses. Additionally, the development of self-healing in asphalt pavement design is introduced. Next, four kinds of possible self-healing mechanisms and the corresponding models are presented. It is pointed out that the continuum thermodynamic model, which considers the whole process from damage initiation to healing recovery, can be a promising field of study. Further, a set of multiscale self-healing characterization methods, from microscale to macroscale as well as the computational simulation scale, are summarized. Among these, computational simulation shows great potential for simulating the self-healing behavior of asphalt materials at the mechanical and molecular levels.
Moreover, the factors influencing self-healing capability are discussed, but the action mechanisms of some factors remain unclear and need to be investigated. Finally, two extrinsic self-healing technologies, induction heating and capsule healing, are recommended as preventive maintenance applications in asphalt pavement. In future, more effective energy-based healing systems or novel material-based healing systems are expected to be developed towards designing sustainable long-life asphalt pavement. Copyright © 2018 Elsevier B.V. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1813677F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1813677F"><span>Response of different regional online coupled models to aerosol-radiation interactions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Forkel, Renate; Balzarini, Alessandra; Brunner, Dominik; Baró, Rocio; Curci, Gabriele; Hirtl, Marcus; Honzak, Luka; Jiménez-Guerrero, Pedro; Jorba, Oriol; Pérez, Juan L.; Pirovano, Guido; San José, Roberto; Schröder, Wolfram; Tuccella, Paolo; Werhahn, Johannes; Wolke, Ralf; Žabkar, Rahela</p> <p>2016-04-01</p> <p>The importance of aerosol-meteorology interactions and their representation in online coupled regional atmospheric chemistry-meteorology models was investigated in COST Action ES1004 (EuMetChem, http://eumetchem.info/). Case study results from different models (COSMO-Muscat, COSMO-ART, and different configurations of WRF-Chem), which were applied for Europe as a coordinated exercise for the year 2010, are analyzed with respect to inter-model variability and the response of the different models to direct and indirect aerosol-radiation interactions. 
The main focus was on two episodes: the Russian heat wave and wildfires episode in July/August 2010, and a period in October 2010 with enhanced cloud cover and rain that included an episode of Saharan dust transport to Europe. In terms of physical plausibility, the decrease in downward solar radiation and daytime temperature due to the direct aerosol effect is robust for all model configurations. The same holds for the pronounced decrease in cloud water content and increase in solar radiation for cloudy conditions and very low aerosol concentrations that was found for WRF-Chem when aerosol-cloud interactions were considered. However, when the differences were tested for statistical significance, no significant differences in mean solar radiation and mean temperature between the baseline case and the simulations including the direct and indirect effect from simulated aerosol concentrations were found over Europe for the October episode. Also for the fire episode, differences in mean temperature and radiation between the simulations with and without the direct aerosol effect were not significant for the major part of the modelling domain. Only for the region with high fire emissions in Russia were the differences in mean solar radiation and temperature due to the direct effect found to be significant during the second half of the fire episode, and then only at a significance level of 0.1. The few observational data indicate that the inclusion of aerosol radiative effects improves simulated temperatures in this area. In summary, the direct aerosol effect leads to lower temperatures and PBL heights for all seasons, whereas the impact of the aerosol indirect effect on temperature and pollutant concentrations over Northern Europe was found to depend strongly on the season. It cannot be generalized whether the inclusion of aerosol radiative effects and aerosol-cloud interactions based on simulated aerosol concentrations improves the simulation results. 
Furthermore, assumptions about how aerosol optical properties are calculated, i.e., about the aerosol's mixing state, have a strong effect on simulated aerosol optical depth and the aerosol effect on incoming solar radiation and temperature. The inter-model variation of the response of different online coupled models suggests that further work comparing the methodologies and parameterizations used to represent the direct and indirect aerosol effect in these models is still necessary.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70157327','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70157327"><span>Summary of hydrologic testing of the Floridan aquifer system at Fort Stewart, coastal Georgia, 2009-2010</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Gonthier, Gerald J.</p> <p>2011-01-01</p> <p>Two test wells were completed at Fort Stewart, coastal Georgia, to investigate the potential for using the Lower Floridan aquifer as a source of water to satisfy anticipated, increased water needs. The U.S. Geological Survey, in cooperation with the U.S. Department of the Army, completed hydrologic testing of the Floridan aquifer system at the study site, including flowmeter surveys, slug tests, and 24- and 72-hour aquifer tests by mid-March 2010. Analytical approaches and model simulation were applied to aquifer-test results to provide estimates of transmissivity and hydraulic conductivity of the multilayered Floridan aquifer system. Data from a 24-hour aquifer test of the Upper Floridan aquifer were evaluated by using the straight-line Cooper-Jacob analytical method. Data from a 72-hour aquifer test of the Lower Floridan aquifer were simulated by using axisymmetric model simulations. 
Results of aquifer testing indicated that the Upper Floridan aquifer has a transmissivity of 100,000 feet squared per day, and the Lower Floridan aquifer has a transmissivity of 7,000 feet squared per day. The specific storage of the Floridan aquifer system obtained from model calibration was 3E-06 ft⁻¹. Additionally, during a 72-hour aquifer test of the Lower Floridan aquifer, a drawdown response was observed in two Upper Floridan aquifer wells, one of which was more than 1 mile away from the pumped well.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70025238','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70025238"><span>GCIP water and energy budget synthesis (WEBS)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Roads, J.; Lawford, R.; Bainto, E.; Berbery, E.; Chen, S.; Fekete, B.; Gallo, K.; Grundstein, A.; Higgins, W.; Kanamitsu, M.; Krajewski, W.; Lakshmi, V.; Leathers, D.; Lettenmaier, D.; Luo, L.; Maurer, E.; Meyers, T.; Miller, D.; Mitchell, Ken; Mote, T.; Pinker, R.; Reichler, T.; Robinson, D.; Robock, A.; Smith, J.; Srinivasan, G.; Verdin, K.; Vinnikov, K.; Vonder, Haar T.; Vorosmarty, C.; Williams, S.; Yarosh, E.</p> <p>2003-01-01</p> <p>As part of the World Climate Research Program's (WCRP's) Global Energy and Water-Cycle Experiment (GEWEX) Continental-scale International Project (GCIP), a preliminary water and energy budget synthesis (WEBS) was developed for the period 1996-1999 from the "best available" observations and models. Besides this summary paper, a companion CD-ROM with more extensive discussion, figures, tables, and raw data is available to the interested researcher from the GEWEX project office, the GAPP project office, or the first author. 
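The Cooper-Jacob straight-line method used in the Fort Stewart aquifer-test summary above estimates transmissivity from the pumping rate Q and the drawdown change per log10 cycle of time, T = 2.3·Q/(4·π·Δs); a minimal sketch with hypothetical numbers (not the values from the actual tests):

```python
import math

def cooper_jacob_transmissivity(q, ds_per_log_cycle):
    """Cooper-Jacob straight-line estimate of transmissivity:
    T = 2.3 * Q / (4 * pi * delta_s), where delta_s is the drawdown
    change over one log10 cycle of time (units must be consistent)."""
    return 2.3 * q / (4.0 * math.pi * ds_per_log_cycle)

# Hypothetical example: Q = 96,000 cubic feet per day, delta_s = 1.5 ft per log cycle
t_est = cooper_jacob_transmissivity(96000.0, 1.5)  # in feet squared per day
```

A steeper semilog drawdown slope (larger Δs) implies a lower transmissivity, which is why the Lower Floridan value reported above is much smaller than the Upper Floridan value.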
An updated online version of the CD-ROM is also available at http://ecpc.ucsd.edu/gcip/webs.htm/. Observations cannot adequately characterize or "close" budgets since too many fundamental processes are missing. Models that properly represent the many complicated atmospheric and near-surface interactions are also required. This preliminary synthesis therefore included a representative global general circulation model, regional climate model, and a macroscale hydrologic model as well as a global reanalysis and a regional analysis. By the qualitative agreement among the models and available observations, it did appear that we now qualitatively understand water and energy budgets of the Mississippi River Basin. However, there is still much quantitative uncertainty. In that regard, there did appear to be a clear advantage to using a regional analysis over a global analysis or a regional simulation over a global simulation to describe the Mississippi River Basin water and energy budgets. There also appeared to be some advantage to using a macroscale hydrologic model for at least the surface water budgets. Copyright 2003 by the American Geophysical Union.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010JHyd..381..101K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010JHyd..381..101K"><span>Modeling daily discharge responses of a large karstic aquifer using soft computing methods: Artificial neural network and neuro-fuzzy</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kurtulus, Bedri; Razack, Moumtaz</p> <p>2010-02-01</p> <p>Summary: This paper compares two methods for modeling karst aquifers, which are heterogeneous, highly non-linear, and hierarchical systems. 
There is a clear need to model these systems given the crucial role they play in water supply in many countries. In recent years, the main components of soft computing (fuzzy logic (FL) and artificial neural networks (ANNs)) have come to prevail in the modeling of complex non-linear systems in different scientific and technological disciplines. In this study, Artificial Neural Network and Adaptive Neuro-Fuzzy Inference System (ANFIS) methods were used for the prediction of daily discharge of karstic aquifers and their capability was compared. The approach was applied to 7 years of daily data of the La Rochefoucauld karst system in south-western France. In order to predict the karst daily discharges, single-input (rainfall, piezometric level) vs. multiple-input (rainfall and piezometric level) series were used. In addition to these inputs, all models used measured or simulated discharges from the previous days with a specified delay. The models were designed in a Matlab™ environment. An automatic procedure was used to select the best calibrated models. Daily discharge predictions were then performed using the calibrated models. Comparing predicted and observed hydrographs indicates that both models (ANN and ANFIS) provide close predictions of the karst daily discharges. The summary statistics of both series (observed and predicted daily discharges) are comparable. The performance of both models is improved when the number of inputs is increased from one to two. The root mean square error between the observed and predicted series reaches a minimum for two-input models. However, the ANFIS model demonstrates a better performance than the ANN model in predicting peak flow. 
The ANFIS approach demonstrates a better generalization capability and slightly higher performance than the ANN, especially for peak discharges.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.usgs.gov/of/1980/0421/report.pdf','USGSPUBS'); return false;" href="https://pubs.usgs.gov/of/1980/0421/report.pdf"><span>A computer program for simulating geohydrologic systems in three dimensions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Posson, D.R.; Hearne, G.A.; Tracy, J.V.; Frenzel, P.F.</p> <p>1980-01-01</p> <p>This document is directed toward individuals who wish to use a computer program to simulate ground-water flow in three dimensions. The strongly implicit procedure (SIP) numerical method is used to solve the set of simultaneous equations. New data processing techniques and program input and output options are emphasized. The aquifer system to be modeled may be heterogeneous and anisotropic, and may include both artesian and water-table conditions. Systems which consist of well-defined alternating layers of highly permeable and poorly permeable material may be represented by a sequence of equations for two-dimensional flow in each of the highly permeable units. Boundaries where head or flux is user-specified may be irregularly shaped. The program also allows the user to represent streams as limited-source boundaries when the streamflow is small in relation to the hydraulic stress on the system. The data-processing techniques relating to 'cube' input and output, to swapping of layers, to restarting of simulation, to free-format NAMELIST input, to the details of each subroutine's logic, and to the overlay program structure are discussed. The program is capable of processing large models that might overflow computer memories with conventional programs. 
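The strongly implicit procedure (SIP) named above solves the finite-difference flow equations iteratively; as a much simpler stand-in for the same idea (plain Jacobi iteration on a steady-state, uniform 2D grid with specified-head boundaries, not the SIP algorithm itself), the update can be sketched as:

```python
import copy

def solve_heads_jacobi(h, fixed, iters=200):
    """Steady-state heads on a uniform grid by Jacobi iteration
    (a simplified stand-in for the SIP solver described in the report).
    h: 2D list of initial heads; fixed: same-shape booleans marking
    specified-head (Dirichlet) nodes that are never updated."""
    ny, nx = len(h), len(h[0])
    for _ in range(iters):
        new = copy.deepcopy(h)
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                if not fixed[i][j]:
                    # Each free interior head relaxes toward the mean of its neighbors
                    new[i][j] = 0.25 * (h[i-1][j] + h[i+1][j] + h[i][j-1] + h[i][j+1])
        h = new
    return h

# Toy grid: all boundary heads fixed at 5.0, one free interior node
heads = [[5.0, 5.0, 5.0], [5.0, 0.0, 5.0], [5.0, 5.0, 5.0]]
mask = [[True, True, True], [True, False, True], [True, True, True]]
solved = solve_heads_jacobi(heads, mask)
```

SIP converges far faster than this naive scheme on large grids, which is exactly why the report's program could handle models that would otherwise be impractical.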
Detailed instructions for selecting program options, for initializing the data arrays, for defining 'cube' output lists and maps, and for plotting hydrographs of calculated and observed heads and/or drawdowns are provided. Output may be restricted to those nodes of particular interest, thereby reducing the volume of printout for modelers, which may be critical when working at remote terminals. 'Cube' input commands allow the modeler to set aquifer parameters and initialize the model with very few input records. Appendixes provide instructions to compile the program, definitions and cross-references for program variables, a summary of the FLECS structured FORTRAN programming language, listings of the FLECS and FORTRAN source code, and samples of input and output for example simulations. (USGS)</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3081249','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3081249"><span>Interprofessional collaboration: three best practice models of interprofessional education</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Bridges, Diane R.; Davidson, Richard A.; Odegard, Peggy Soule; Maki, Ian V.; Tomkowiak, John</p> <p>2011-01-01</p> <p>Interprofessional education is a collaborative approach to develop healthcare students as future interprofessional team members and a recommendation suggested by the Institute of Medicine. Complex medical issues can be best addressed by interprofessional teams. Training future healthcare providers to work in such teams will help facilitate this model resulting in improved healthcare outcomes for patients. 
In this paper, three universities, the Rosalind Franklin University of Medicine and Science, the University of Florida and the University of Washington describe their training curricula models of collaborative and interprofessional education. The models represent a didactic program, a community-based experience and an interprofessional-simulation experience. The didactic program emphasizes interprofessional team building skills, knowledge of professions, patient centered care, service learning, the impact of culture on healthcare delivery and an interprofessional clinical component. The community-based experience demonstrates how interprofessional collaborations provide service to patients and how the environment and availability of resources impact one's health status. The interprofessional-simulation experience describes clinical team skills training in both formative and summative simulations used to develop skills in communication and leadership. One common theme leading to a successful experience among these three interprofessional models included helping students to understand their own professional identity while gaining an understanding of other professionals' roles on the health care team. Commitment from departments and colleges, diverse calendar agreements, curricular mapping, mentor and faculty training, a sense of community, adequate physical space, technology, and community relationships were all identified as critical resources for a successful program. Summary recommendations for best practices included the need for administrative support, interprofessional programmatic infrastructure, committed faculty, and the recognition of student participation as key components to success for anyone developing an IPE centered program. 
PMID:21519399</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li class="active"><span>24</span></li> <li><a href="#" onclick='return showDiv("page_25");'>25</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_24 --> <div id="page_25" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_24");'>24</a></li> <li class="active"><span>25</span></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="481"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/21519399','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/21519399"><span>Interprofessional collaboration: three best practice models of interprofessional education.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Bridges, Diane R; Davidson, Richard A; Odegard, Peggy Soule; Maki, Ian V; Tomkowiak, John</p> <p>2011-04-08</p> <p>Interprofessional education is a collaborative approach to develop healthcare students as future interprofessional team members and a recommendation suggested by the Institute of Medicine. 
Complex medical issues can be best addressed by interprofessional teams. Training future healthcare providers to work in such teams will help facilitate this model resulting in improved healthcare outcomes for patients. In this paper, three universities, the Rosalind Franklin University of Medicine and Science, the University of Florida and the University of Washington describe their training curricula models of collaborative and interprofessional education. The models represent a didactic program, a community-based experience and an interprofessional-simulation experience. The didactic program emphasizes interprofessional team building skills, knowledge of professions, patient centered care, service learning, the impact of culture on healthcare delivery and an interprofessional clinical component. The community-based experience demonstrates how interprofessional collaborations provide service to patients and how the environment and availability of resources impact one's health status. The interprofessional-simulation experience describes clinical team skills training in both formative and summative simulations used to develop skills in communication and leadership. One common theme leading to a successful experience among these three interprofessional models included helping students to understand their own professional identity while gaining an understanding of other professionals' roles on the health care team. Commitment from departments and colleges, diverse calendar agreements, curricular mapping, mentor and faculty training, a sense of community, adequate physical space, technology, and community relationships were all identified as critical resources for a successful program. 
Summary recommendations for best practices included the need for administrative support, interprofessional programmatic infrastructure, committed faculty, and the recognition of student participation as key components to success for anyone developing an IPE centered program.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2006AGUFMGC41B1059H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2006AGUFMGC41B1059H"><span>Windstorms and Insured Loss in the UK: Modelling the Present and the Future</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hewston, R.; Dorling, S.; Viner, D.</p> <p>2006-12-01</p> <p>Worldwide, the costs of catastrophic weather events have increased dramatically in recent years, with average annual insured losses rising from a negligible level in 1950 to over $10bn in 2005 (Munich Re 2006). When losses from non-catastrophic weather related events are included this figure is doubled. A similar trend is exhibited in the UK with claims totalling over £6bn for the period 1998-2003, more than twice the value for the previous five years (Dlugolecki 2004). More than 70% of this loss is associated with storms. Extratropical cyclones are the main source of wind damage in the UK. In this research, a windstorm model is constructed to simulate patterns of insured loss associated with wind damage in the UK. Observed daily maximum wind gust speeds and a variety of socioeconomic datasets are utilised in a GIS generated model, which is verified against actual domestic property insurance claims data from two major insurance providers. 
The increased frequency and intensity of extreme events which are anticipated to accompany climate change in the UK will have a direct effect on general insurance, with the greatest impact expected to be on property insurance (Dlugolecki 2004). A range of experiments will be run using Regional Climate Model output data, in conjunction with the windstorm model, to simulate possible future losses resulting from climate change, assuming no alteration to the vulnerability of the building stock. Losses for the periods 2020-2050 and 2070-2100 will be simulated under the various IPCC emissions scenarios. Munich Re (2006). Annual Review: Natural Catastrophes 2005. Munich, Munich Re: 52. Dlugolecki, A. (2004). A Changing Climate for Insurance - A summary report for Chief Executives and Policymakers, Association of British Insurers
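A windstorm loss model of the kind described above maps gust speed to a damage ratio and applies it to insured values; the sketch below uses a hypothetical quadratic vulnerability curve (the threshold and scale parameters are invented, not the authors' calibration):

```python
def simple_storm_loss(gust_ms, insured_value, threshold=20.0, scale=40.0):
    """Toy vulnerability curve (hypothetical parameters): no damage below
    a gust threshold; above it the damage ratio grows quadratically with
    the exceedance and is capped at 100% of the insured value."""
    exceedance = max(0.0, gust_ms - threshold)
    damage_ratio = min(1.0, (exceedance / scale) ** 2)
    return damage_ratio * insured_value

# Hypothetical three-property portfolio: (peak gust in m/s, insured value in GBP)
portfolio = [(25.0, 200_000.0), (18.0, 150_000.0), (45.0, 300_000.0)]
total_loss = sum(simple_storm_loss(g, v) for g, v in portfolio)
```

Running a climate-change scenario then amounts to replacing the observed gusts with Regional Climate Model output and re-summing the portfolio, as the abstract proposes.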
Program summary: Title of the program: DPEMC, version 2.4 Catalogue identifier: ADVF Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVF Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: any computer with the FORTRAN 77 compiler under the UNIX or Linux operating systems Operating system: UNIX; Linux Programming language used: FORTRAN 77 High speed storage required: <25 MB No. of lines in distributed program, including test data, etc.: 71 399 No. of bytes in distributed program, including test data, etc.: 639 950 Distribution format: tar.gz Nature of the physical problem: Proton diffraction at hadron colliders can manifest itself in many forms, and a variety of models exist that attempt to describe it [A. Bialas, P.V. Landshoff, Phys. Lett. B 256 (1991) 540; A. Bialas, W. Szeremeta, Phys. Lett. B 296 (1992) 191; A. Bialas, R.A. Janik, Z. Phys. C 62 (1994) 487; M. Boonekamp, R. Peschanski, C. Royon, Phys. Rev. Lett. 87 (2001) 251806; Nucl. Phys. B 669 (2003) 277; R. Enberg, G. Ingelman, A. Kissavos, N. Timneanu, Phys. Rev. Lett. 89 (2002) 081801; R. Enberg, G. Ingelman, L. Motyka, Phys. Lett. B 524 (2002) 273; R. Enberg, G. Ingelman, N. Timneanu, Phys. Rev. D 67 (2003) 011301; B. Cox, J. Forshaw, Comput. Phys. Comm. 144 (2002) 104; B. Cox, J. Forshaw, B. Heinemann, Phys. Lett. B 540 (2002) 26; V. Khoze, A. Martin, M. Ryskin, Phys. Lett. B 401 (1997) 330; Eur. Phys. J. C 14 (2000) 525; Eur. Phys. J. C 19 (2001) 477; Erratum, Eur. Phys. J. C 20 (2001) 599; Eur. Phys. J. C 23 (2002) 311]. This program implements some of the more significant ones, enabling the simulation of central particle production through color singlet exchange between interacting protons or antiprotons. Method of solution: The Monte Carlo method is used to simulate all elementary 2→2 and 2→1 processes available in HERWIG. 
The color singlet exchanges in DPEMC are implemented as functions that reweight the photon flux already present in HERWIG. Restriction on the complexity of the problem: Since the program relies extensively on HERWIG, the limitations are the same as in [G. Marchesini, B.R. Webber, G. Abbiendi, I.G. Knowles, M.H. Seymour, L. Stanco, Comput. Phys. Comm. 67 (1992) 465; G. Corcella, I.G. Knowles, G. Marchesini, S. Moretti, K. Odagiri, P. Richardson, M. Seymour, B. Webber, JHEP 0101 (2001) 010]. Typical running time: Approximate times on an 800 MHz Pentium III: 5-20 min per 10 000 unweighted events, depending on the process under consideration.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70035726','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70035726"><span>Use of hydrologic and hydrodynamic modeling for ecosystem restoration</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Obeysekera, J.; Kuebler, L.; Ahmed, S.; Chang, M.-L.; Engel, V.; Langevin, C.; Swain, E.; Wan, Y.</p> <p>2011-01-01</p> <p>Planning and implementation of unprecedented projects for restoring the greater Everglades ecosystem are underway and the hydrologic and hydrodynamic modeling of restoration alternatives has become essential for success of restoration efforts. In view of the complex nature of the South Florida water resources system, regional-scale (system-wide) hydrologic models have been developed and used extensively for the development of the Comprehensive Everglades Restoration Plan. In addition, numerous subregional-scale hydrologic and hydrodynamic models have been developed and are being used for evaluating project-scale water management plans associated with urban, agricultural, and inland coastal ecosystems. 
The authors provide a comprehensive summary of models of all scales, as well as the next generation models under development to meet the future needs of ecosystem restoration efforts in South Florida. The multiagency efforts to develop and apply models have allowed the agencies to understand the complex hydrologic interactions, quantify appropriate performance measures, and use new technologies in simulation algorithms, software development, and GIS/database techniques to meet the future modeling needs of the ecosystem restoration programs. Copyright © 2011 Taylor & Francis Group, LLC.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5575530','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5575530"><span>Measuring the statistical validity of summary meta‐analysis and meta‐regression results for use in clinical practice</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Riley, Richard D.</p> <p>2017-01-01</p> <p>An important question for clinicians appraising a meta‐analysis is: are the findings likely to be valid in their own practice—does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity—where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple (‘leave‐one‐out’) cross‐validation technique, we demonstrate how we may test meta‐analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta‐analyses and demonstrate the link between statistical validity and homogeneity. 
Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta‐analysis and a tailored meta‐regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within‐study variance, between‐study variance, study sample size, and the number of studies in the meta‐analysis. Finally, we apply Vn to two published meta‐analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta‐analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28620945</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28620945','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28620945"><span>Measuring the statistical validity of summary meta-analysis and meta-regression results for use in clinical practice.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Willis, Brian H; Riley, Richard D</p> <p>2017-09-20</p> <p>An important question for clinicians appraising a meta-analysis is: are the findings likely to be valid in their own practice-does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity-where the parameter being estimated equals the corresponding parameter for a new independent study. 
Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. 
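The leave-one-out idea behind Vn can be illustrated with a fixed-effect pooled estimate: each study is compared with the estimate pooled from the remaining studies, standardized by the combined variance (an illustration of the cross-validation concept only, not the authors' Vn statistic or its distribution):

```python
import math

def pooled_fixed_effect(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate and its variance."""
    weights = [1.0 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return est, 1.0 / sum(weights)

def leave_one_out_z(effects, variances):
    """For each study i, pool the remaining studies and standardize the
    difference between study i and the leave-one-out pooled estimate.
    (Illustrative only; not the authors' Vn.)"""
    zs = []
    for i in range(len(effects)):
        rest_e = effects[:i] + effects[i + 1:]
        rest_v = variances[:i] + variances[i + 1:]
        est, var = pooled_fixed_effect(rest_e, rest_v)
        zs.append((effects[i] - est) / math.sqrt(variances[i] + var))
    return zs
```

Large standardized differences flag studies whose results the remaining evidence would not have predicted, which is the intuition the paper links to heterogeneity and statistical validity.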
Statistics in Medicine published by John Wiley & Sons Ltd.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/12450354','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/12450354"><span>Simulating closed- and open-loop voluntary movement: a nonlinear control-systems approach.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Davidson, Paul R; Jones, Richard D; Andreae, John H; Sirisena, Harsha R</p> <p>2002-11-01</p> <p>In many recent human motor control models, including feedback-error learning and adaptive model theory (AMT), feedback control is used to correct errors while an inverse model is simultaneously tuned to provide accurate feedforward control. This popular and appealing hypothesis, based on a combination of psychophysical observations and engineering considerations, predicts that once the tuning of the inverse model is complete the role of feedback control is limited to the correction of disturbances. This hypothesis was tested by looking at the open-loop behavior of the human motor system during adaptation. An experiment was carried out involving 20 normal adult subjects who learned a novel visuomotor relationship on a pursuit tracking task with a steering wheel for input. During learning, the response cursor was periodically blanked, removing all feedback about the external system (i.e., about the relationship between hand motion and response cursor motion). Open-loop behavior was not consistent with a progressive transfer from closed- to open-loop control. Our recently developed computational model of the brain--a novel nonlinear implementation of AMT--was able to reproduce the observed closed- and open-loop results. In contrast, other control-systems models exhibited only minimal feedback control following adaptation, leading to incorrect open-loop behavior. 
This is because our model continues to use feedback to control slow movements after adaptation is complete. This behavior enhances the internal stability of the inverse model. In summary, our computational model is currently the only motor control model able to accurately simulate the closed- and open-loop characteristics of the experimental response trajectories.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70034819','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70034819"><span>Effects of uncertain topographic input data on two-dimensional flow modeling in a gravel-bed river</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Legleiter, C.J.; Kyriakidis, P.C.; McDonald, R.R.; Nelson, J.M.</p> <p>2011-01-01</p> <p>Many applications in river research and management rely upon two-dimensional (2D) numerical models to characterize flow fields, assess habitat conditions, and evaluate channel stability. Predictions from such models are potentially highly uncertain due to the uncertainty associated with the topographic data provided as input. This study used a spatial stochastic simulation strategy to examine the effects of topographic uncertainty on flow modeling. Many equally likely bed elevation realizations for a simple meander bend were generated and propagated through a typical 2D model to produce distributions of water-surface elevation, depth, velocity, and boundary shear stress at each node of the model's computational grid. Ensemble summary statistics were used to characterize the uncertainty associated with these predictions and to examine the spatial structure of this uncertainty in relation to channel morphology. 
Simulations conditioned to different data configurations indicated that model predictions became increasingly uncertain as the spacing between surveyed cross sections increased. Model sensitivity to topographic uncertainty was greater for base flow conditions than for a higher, subbankfull flow (75% of bankfull discharge). The degree of sensitivity also varied spatially throughout the bend, with the greatest uncertainty occurring over the point bar where the flow field was influenced by topographic steering effects. Uncertain topography can therefore introduce significant uncertainty to analyses of habitat suitability and bed mobility based on flow model output. In the presence of such uncertainty, the results of these studies are most appropriately represented in probabilistic terms using distributions of model predictions derived from a series of topographic realizations. Copyright 2011 by the American Geophysical Union.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20000120213','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20000120213"><span>Numerical Propulsion System Simulation (NPSS) 1999 Industry Review</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Lytle, John; Follen, Greg; Naiman, Cynthia; Evans, Austin</p> <p>2000-01-01</p> <p>The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. 
In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. In addition, the paper contains a summary of the feedback received from industry partners in the development effort and the actions taken over the past year to respond to that feedback. The NPSS development was supported in FY99 by the High Performance Computing and Communications Program.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009AIPC.1086...76S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009AIPC.1086...76S"><span>Summary Report of Working Group 2: Computation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Stoltz, P. H.; Tsung, R. S.</p> <p>2009-01-01</p> <p>The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. 
Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, a many-order-of-magnitude speedup, and details of porting the VPIC code to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eightfold speedup of science calculations, including back-scatter in laser-plasma interaction. 
Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages with external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3385986','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3385986"><span>Impact of Sampling Schemes on Demographic Inference: An Empirical Study in Two Species with Different Mating Systems and Demographic Histories</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>St. Onge, K. R.; Palmé, A. E.; Wright, S. I.; Lascoux, M.</p> <p>2012-01-01</p> <p>Most species have at least some level of genetic structure. Recent simulation studies have shown that it is important to consider population structure when sampling individuals to infer past population history. The relevance of the results of these computer simulations for empirical studies, however, remains unclear. In the present study, we use DNA sequence datasets collected from two closely related species with very different histories, the selfing species Capsella rubella and its outcrossing relative C. grandiflora, to assess the impact of different sampling strategies on summary statistics and the inference of historical demography. Sampling strategy did not strongly influence the mean values of Tajima’s D in either species, but it had some impact on the variance. The general conclusions about demographic history were comparable across sampling schemes even when resampled data were analyzed with approximate Bayesian computation (ABC). We used simulations to explore the effects of sampling scheme under different demographic models. 
We conclude that when sequences from modest numbers of loci (<60) are analyzed, the sampling strategy is generally of limited importance. The same is true under intermediate or high levels of gene flow (4Nm > 2–10) in models in which global expansion is combined with either local expansion or hierarchical population structure. Although we observe a less severe effect of sampling than predicted under some earlier simulation models, our results should not be seen as an encouragement to neglect this issue. In general, a good coverage of the natural range, both within and between populations, will be needed to obtain a reliable reconstruction of a species’s demographic history, and in fact, the effect of sampling scheme on polymorphism patterns may itself provide important information about demographic history. PMID:22870403</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/21255312-summary-report-working-group-computation','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/21255312-summary-report-working-group-computation"><span>Summary Report of Working Group 2: Computation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Stoltz, P. H.; Tsung, R. S.</p> <p>2009-01-22</p> <p>The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. 
The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, a many-order-of-magnitude speedup, and details of porting the VPIC code to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eightfold speedup of science calculations, including back-scatter in laser-plasma interaction. 
Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages with external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2673664','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2673664"><span>Validation of CT dose-reduction simulation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Massoumzadeh, Parinaz; Don, Steven; Hildebolt, Charles F.; Bae, Kyongtae T.; Whiting, Bruce R.</p> <p>2009-01-01</p> <p>The objective of this research was to develop and validate a custom computed tomography dose-reduction simulation technique for producing images that have an appearance consistent with the same scan performed at a lower mAs (with fixed kVp, rotation time, and collimation). Synthetic noise is added to projection (sinogram) data, incorporating a stochastic noise model that includes energy-integrating detectors, tube-current modulation, bowtie beam filtering, and electronic system noise. Experimental methods were developed to determine the parameters required for each component of the noise model. As a validation, the outputs of the simulations were compared to measurements with cadavers in the image domain and with phantoms in both the sinogram and image domain, using an unbiased root-mean-square relative error metric to quantify agreement in noise processes. Four-alternative forced-choice (4AFC) observer studies were conducted to confirm the realistic appearance of simulated noise, and the effects of various system model components on visual noise were studied. 
The “just noticeable difference (JND)” in noise levels was analyzed to determine the sensitivity of observers to changes in noise level. Individual detector measurements were shown to be normally distributed (p>0.54), justifying the use of a Gaussian random noise generator for simulations. Phantom tests showed the ability to match original and simulated noise variance in the sinogram domain to within 5.6%±1.6% (standard deviation), which was then propagated into the image domain with errors less than 4.1%±1.6%. Cadaver measurements indicated that image noise was matched to within 2.6%±2.0%. More importantly, the 4AFC observer studies indicated that the simulated images were realistic, i.e., no detectable difference between simulated and original images (p=0.86) was observed. JND studies indicated that observers’ sensitivity to change in noise levels corresponded to a 25% difference in dose, which is far larger than the noise accuracy achieved by simulation. In summary, the dose-reduction simulation tool demonstrated excellent accuracy in providing realistic images. The methodology promises to be a useful tool for researchers and radiologists to explore dose reduction protocols in an effort to produce diagnostic images with radiation dose “as low as reasonably achievable.” PMID:19235386</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA470282','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA470282"><span>Code Validation Studies of High-Enthalpy Flows</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2006-12-01</p> <p>stage of future hypersonic vehicles. The development and design of such vehicles is aided by the use of experimentation and numerical simulation... numerical predictions and experimental measurements. 3. 
Summary of Previous Work We have studied extensively hypersonic double-cone flows with and in...the experimental measurements and the numerical predictions. When we accounted for that effect in numerical simulations, and also augmented the</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=computer+AND+medicine&pg=5&id=EJ1020586','ERIC'); return false;" href="https://eric.ed.gov/?q=computer+AND+medicine&pg=5&id=EJ1020586"><span>The Potential of Simulated Environments in Teacher Education: Current and Future Possibilities</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Dieker, Lisa A.; Rodriguez, Jacqueline A.; Lignugaris/Kraft, Benjamin; Hynes, Michael C.; Hughes, Charles E.</p> <p>2014-01-01</p> <p>The future of virtual environments is evident in many fields but is just emerging in the field of teacher education. In this article, the authors provide a summary of the evolution of simulation in the field of teacher education and three factors that need to be considered as these environments further develop. The authors provide a specific…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19840012205','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19840012205"><span>Estimation of discontinuous coefficients in parabolic systems: Applications to reservoir simulation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Lamm, P. D.</p> <p>1984-01-01</p> <p>Spline based techniques for estimating spatially varying parameters that appear in parabolic distributed systems (typical of those found in reservoir simulation problems) are presented. 
The problem of determining discontinuous coefficients, that is, estimating both the functional shape and the points of discontinuity of such parameters, is discussed. Convergence results and a summary of the numerical performance of the resulting algorithms are given.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://files.eric.ed.gov/fulltext/ED324839.pdf','ERIC'); return false;" href="http://files.eric.ed.gov/fulltext/ED324839.pdf"><span>Using Simulation Technology to Promote Social Competence of Handicapped Students. Final Report. Executive Summary.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Appell, Louise S.; And Others</p> <p></p> <p>The purpose of this project was to design and develop simulation materials (utilizing vocational situations) to promote social competence in mildly/moderately handicapped young adults. The final product, a set of materials titled "Social Skills on the Job," includes a videotape of 15 lessons, a computer software package, and a teacher's guide, and was marketed to a commercial…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017CP....493..200M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017CP....493..200M"><span>Ab initio quantum direct dynamics simulations of ultrafast photochemistry with Multiconfigurational Ehrenfest approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Makhov, Dmitry V.; Symonds, Christopher; Fernandez-Alberti, Sebastian; Shalashilin, Dmitrii V.</p> <p>2017-08-01</p> <p>The Multiconfigurational Ehrenfest (MCE) method is a quantum dynamics technique which allows treatment of a large number of quantum nuclear degrees of freedom. 
This paper presents a review of MCE and its recent applications, providing a summary of the formalism, including its ab initio direct dynamics versions, together with a summary of recent results. Firstly, we describe the Multiconfigurational Ehrenfest version 2 (MCEv2) method and its applicability to direct dynamics, and report new calculations which show that the approach converges to the exact result in model systems with tens of degrees of freedom. Secondly, we review previous "on the fly" ab initio Multiple Cloning (AIMC-MCE) dynamics results obtained for systems of a similar size, in which the calculations treat every electron and every nucleus of a polyatomic molecule on a fully quantum basis. We also review the Time Dependent Diabatic Basis (TDDB) version of the technique and give an example of its application. We summarise the details of the sampling techniques and interpolations used for the calculation of the matrix elements, which make our approach efficient. Future directions of work are outlined.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/6078617-impacts-acidic-deposition-context-case-studies-forest-soils-southeastern-us','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/6078617-impacts-acidic-deposition-context-case-studies-forest-soils-southeastern-us"><span>Impacts of acidic deposition: context and case studies of forest soils in the southeastern US</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Binkley, D.; Driscoll, C.T.; Allen, H.L.</p> <p>1988-12-01</p> <p>The authors designed their assessment to include both the basic foundation needed by non-experts and the detailed information needed by experts. Their assessment includes background information on acidic deposition (Chap. 
1), an in-depth discussion of the nature of soil acidity and ecosystem H+ budgets (Chap. 2), and a summary of rates of deposition in the Southeastern U.S. (Chap. 3). A discussion of the nature of forest soils in the region (Chap. 4) is followed by an overview of previous assessments of soil sensitivity to acidification (Chap. 5). The potential impacts of acidic deposition on forest nutrition are described in the context of the degree of current nutrient limitation on forest productivity (Chap. 6). The results of simulations with the MAGIC model provided evaluations of the likely sensitivity of a variety of soils representative of forest soils in the South (Chap. 7), as well as a test of soil sensitivity criteria. The authors' synthesis and recommendations for research (Chap. 8) also serve as an executive summary.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008AGUFM.S33A1929Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008AGUFM.S33A1929Z"><span>Simulation studies on the differences between spontaneous and triggered seismicity and on foreshock probabilities</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zhuang, J.; Vere-Jones, D.; Ogata, Y.; Christophersen, A.; Savage, M. K.; Jackson, D. D.</p> <p>2008-12-01</p> <p>In this study we investigate the foreshock probabilities calculated from earthquake catalogs from Japan, Southern California and New Zealand. Unlike conventional studies on foreshocks, we use a probability-based declustering method to separate each catalog into stochastic versions of family trees, such that each event is classified as either having been triggered by a preceding event, or being a spontaneous event. 
The probabilities are determined from parameters that provide the best fit to the real catalog using a space-time epidemic-type aftershock sequence (ETAS) model. The model assumes that background and triggered earthquakes have the same magnitude-dependent triggering capability. A foreshock here is defined as a spontaneous event that has one or more larger descendants, and a triggered foreshock is a triggered event that has one or more larger descendants. The proportion of foreshocks in spontaneous events of each catalog is found to be lower than the proportion of triggered foreshocks in triggered events. One possibility is that this is due to different triggering productivity in spontaneous versus triggered events, i.e., a triggered event triggers more children than a spontaneous event of the same magnitude. To understand what causes the above differences between spontaneous and triggered events, we apply the same procedures to several synthetic catalogs simulated by using different models. The first simulation is done by using the ETAS model with parameters and spontaneous rate fitted from the JMA catalog. The second synthetic catalog is simulated by using an adjusted ETAS model that takes into account the triggering effect of events below the magnitude threshold. That is, we simulated the catalog with a low magnitude threshold using the original ETAS model, and then we removed the events smaller than a higher magnitude threshold. The third model for simulation assumes that different triggering behaviors exist between spontaneous and triggered events. We repeat the fitting and reconstruction procedures for all these simulated catalogs. The reconstruction results for the first synthetic catalog do not show the difference between spontaneous and triggered events or the differences in foreshock probabilities. On the other hand, results from the synthetic catalogs simulated with the second and third models clearly reconstruct such differences. 
In summary, our results imply that one cause of these differences may be the neglect of triggering by events smaller than the cut-off magnitude, or magnitude errors. For the purpose of forecasting seismicity, we can use a clustering model in which spontaneous events trigger child events in a different way from triggered events, to avoid over-predicting earthquake risks from foreshocks. To understand the physical implications of this study, further careful studies are needed to compare the real seismicity with the adjusted ETAS model, which takes the triggering effect of events below the cut-off magnitude into account.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_25 --> </div><!-- container --> </body> </html>